Manulife HackITOut 2020 – Co-op Experience
by Yathartha Panigrahy | December 16, 2020
I don’t think anyone truly expected or understood just how big an impact COVID-19 would have on the world. But this isn’t another essay about just how much everyone has been affected. Rather, this is me reminiscing about my time at the HackITOut event that I was a part of from October 14–16, 2020.
If you want the TL;DR: it was awesome!
A brief introduction
As someone who loves going to Hackathons, I was very worried that COVID-19 would pretty much shut the door on any such event for the foreseeable future. However, I have now been a part of two such events during my time as a Software Engineering co-op student at Manulife. And I can truly say that if anyone knows how to host virtual Hackathons, it’s Manulife!
The first event I was a part of was in April, during my first term at Manulife. We participants informally referred to it as a Slackathon. We were a group of approximately 10 co-ops given a couple of problem statements to solve with regard to the Slack workflows used at Manulife. It was an enriching experience with lots to learn, especially about the Slack API. This was my first taste of a hackathon in its virtual form. The word “zoom” has taken on a completely new meaning because of 2020. A brief statistic: Zoom had approximately 10 million daily meeting participants in December 2019. By April 2020, that number had ballooned to 300 million.
So it made complete sense for Manulife to use Zoom for such an event. That first “Slackathon” was a great trial run of both Zoom and the entire concept of a virtual Hackathon, but we were working in a rather limited environment where the total number of participants didn’t exceed 20. The overwhelming success of the event convinced Manulife that virtual Hackathons could definitely work on a much greater scale.
HackITOut 2020 — The Data one!
So if there are enough participants and there’s definite interest, what’s to say Manulife can’t host a much bigger internal virtual Hackathon? And that’s just what we did. Manulife is no stranger to Hackathons; there have been multiple internal Hackathons before with tremendous success. However, this one was going to be virtual and much bigger, so we were definitely breaking into new territory.
There were going to be close to 80 participants, nearly four times the size of the virtual Slackathon we had in April. And wow, did Manulife pull it off successfully.
There were 12 teams and 3 problem statements, all of which had been shared with us a month prior. There were also 7–8 co-ops participating, with full-time employees making up the rest. We also had a couple of training sessions the week before the Hackathon. These were very important for someone like me, a software engineer with very little hands-on experience with Azure and its data engineering and machine learning capabilities.
Everybody loves hoodies. They are snug and easy to wear and with everyone staying inside, hoodies have become the norm. Hackathons are notorious for handing out t-shirts and hoodies and this one was no different. This had to be by far one of the comfiest hoodies I have received! The sanitizer and candy were also a pleasant surprise. One being much tastier than the other.
The day before the Hackathon we had a social team-building call. A few other co-ops had come up with some rather intriguing tasks which were called Brains, Beauty and Brawn. We had interesting riddles and the opportunity to show off our creativity by using Zoom’s filters! I was part of Team #1 and it was great to interact with all my teammates over these fun challenges.
This photo definitely checks the beauty box, if I do say so myself!
The Hacking phase
We kicked off the Hackathon with some further detail on the problem statements and precisely what goals we needed to hit as teams in order to proceed further.
There were five levels in total.
With each level, we had to increase the amount of information we could extract from our given datasets. Which reminds me, I need to tell you the problem statement my team was solving.
Manulife wants to be able to predict or determine how people think and feel about Manulife relative to its competitors. Using data sources such as social media APIs, this challenge aims to ingest that data into Azure, use machine learning to understand and process it, and visualize it for user consumption. Since machine learning will be leveraged, this also gives us the opportunity to look at trends and make predictions on people’s disposition towards Manulife so we can better understand how to enhance our public image.
As a software engineer, I felt this was exactly the machine learning “thing” I’d been hearing about all these years. I hadn’t actually dabbled with Azure or anything specific to data wrangling, which is why I wanted to participate in the Hackathon in the first place. Luckily, I had a Data Engineer on my team who had a definite idea of where to get started.
We were provided with some CSV files containing social media content about various insurance companies and what people were saying about them. After importing the data from the provided container into our very own container using Azure Data Factory, our team set about cleaning up this data.
Let me tell you, data cleanup is not the easiest thing in the world. It requires you to take a quick look through the dataset (which is huge) and make decisions about what information is necessary. Sounds straightforward, right? Say you have a hundred different columns, each filled with a different kind of information about different people. Choosing even the 10–15 columns that would help solve your problem at hand is not an easy task. Alongside this, we had to run various scripts on Azure Databricks to actually “clean” this data up. This is the basic gist of what Big Data processing looks like.
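To give a feel for what that cleanup involves, here is a minimal sketch in plain Python. The column names and sample rows are made up for illustration (our real work happened in Databricks notebooks on much larger files), but the idea is the same: keep only the columns you decided matter, drop rows with no usable content, and drop duplicates.

```python
import csv
import io

# Hypothetical raw export: many columns, only a few of which matter.
raw = io.StringIO(
    "id,platform,author,text,likes,shares,language\n"
    "1,twitter,alice,Great service from my insurer!,10,2,en\n"
    "2,twitter,bob,,0,0,en\n"
    "1,twitter,alice,Great service from my insurer!,10,2,en\n"
)

KEEP = ["platform", "text"]  # the handful of columns chosen for the problem

def clean(reader):
    seen = set()
    for row in csv.DictReader(reader):
        slim = {k: row[k].strip() for k in KEEP}
        if not slim["text"]:          # drop rows with no usable content
            continue
        key = tuple(slim.values())
        if key in seen:               # drop exact duplicates
            continue
        seen.add(key)
        yield slim

rows = list(clean(raw))
print(rows)  # one cleaned, de-duplicated row remains
```

On Databricks the same filtering would be expressed over a Spark DataFrame rather than a generator, but the decisions (which columns, which rows) are identical.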
Lines of Code
After what seemed like hours, multiple hurdles at various points, and lots of help from the coaches, we were able to clean up our data into something we could actually make use of. This pretty much sealed Bronze for our team, as we did a basic automated run on Azure Machine Learning to create our model.
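The real model came out of Azure Machine Learning’s automated run, but conceptually its job was simple: take a piece of social media text and label its sentiment. As a toy stand-in (the word lists and example posts below are entirely made up), a naive lexicon scorer shows the shape of the input and output:

```python
# Toy stand-in for the sentiment model: score text against small word lists.
# This is NOT what AutoML trained; it only illustrates the input/output shape.
POSITIVE = {"great", "love", "fast", "helpful", "easy"}
NEGATIVE = {"slow", "denied", "confusing", "worst", "frustrating"}

def sentiment(text: str) -> str:
    words = {w.strip(".,!?").lower() for w in text.split()}
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

posts = [
    "Love how easy the claims process was!",
    "Worst experience, my claim was denied.",
    "Filed my claim on Tuesday.",
]
labels = [sentiment(p) for p in posts]
print(labels)  # ['positive', 'negative', 'neutral']
```

The trained model replaces the word lists with learned weights, but downstream (dashboards, trend lines) only ever sees these per-post labels.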
Over the next day and a half, all the teams used PowerBI to visualize the data our machine learning models were providing. Humans are largely visual learners, and a dashboard is the perfect intersection of efficiency and information. That is why all the teams were tasked with utilizing the power of PowerBI (there’s definitely a pun in there that I’m missing) to create some rather beautiful dashboards filled to the brim with information.
PowerBI Dashboard View 1
PowerBI Dashboard View 2
All in all, each team tried to improve exactly what its machine learning model was churning out. We tried to clean up the data further to see if that would increase our accuracy. The teams that got further ahead on the award criteria were able to utilize Event Hubs to deal with live data as opposed to just data from cleaned-up CSV files. As teams progressed through each level, a new nuance was added that would keep them busy and improve their final solutions. The end goal was not a report of all our findings; rather, it was some type of “data product” that could serve Manulife in the future. And the closer each team got to a refined product, the better their chances of demoing to some of our senior leadership team!
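The jump from CSVs to Event Hubs is essentially a jump from batch to streaming: instead of re-reading a file, events arrive one at a time and your tallies update as they come in. A conceptual sketch (the event payloads and `stream` generator here are invented; the real thing would use the `azure-eventhub` client’s receive loop):

```python
from collections import Counter

# Conceptual stand-in for an Event Hubs consumer loop: events arrive one
# at a time and running tallies update without re-reading any CSV file.
def stream():
    yield {"company": "Manulife", "sentiment": "positive"}
    yield {"company": "CompetitorA", "sentiment": "negative"}
    yield {"company": "Manulife", "sentiment": "positive"}

tallies: dict[str, Counter] = {}

def on_event(event):
    tallies.setdefault(event["company"], Counter())[event["sentiment"]] += 1

# With azure-eventhub, on_event would be registered as a receive callback;
# here we just drain the fake stream.
for event in stream():
    on_event(event)

print(tallies)
```

The appeal for the higher levels was exactly this: the dashboard reads from tallies that are always current, rather than from a snapshot cleaned up the night before.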
I quickly realized that although I may not have had the most understanding of, or exposure to, all things Azure, I could Google things and ask my teammates and coaches in order to make the most of it. I can definitely say I learnt a lot just from being exposed to these Azure tools and these sorts of datasets.
Conclusion & Kudos!
5 teams were selected out of the 12 to demo to members of the Manulife Canadian Technology Division’s senior leadership team. Unfortunately, our team was not selected. However, we did have the opportunity to demo on a smaller scale to one member of the team, who joined our ‘breakout room’ on Zoom. We were asked to walk through our solution and explain how we approached the tasks.
The teams demoed and were entered into a raffle for gift cards; the better your team performed, the more chances you had of winning something. Our team was able to snag a prize as well, for the most popular post on Slack with THAT picture from the beauty section! Of course, it was all my idea and I’d love to take credit for it. After prizes were awarded, at 4 PM on Friday, October 16th, we concluded what had to be the most interesting Hackathon I had ever been to!
I think in times when people are striving for some sense of normalcy, being able to host a hackathon, albeit virtual, speaks to the merit of Manulife and its employees. We were not deterred, and we were able to expand and include many more participants. Zoom and Microsoft Teams served as the perfect backdrop for the Hackathon. Breakout rooms have to be the breakout feature for me (once again, there has to be a pun in there). They made everything so much more convenient. Coaches could drop into the individual breakout rooms where each team was hacking away, and we would all drop back into the main room for the kickoffs that required everyone’s presence. And Teams was great for the final demo. Azure has so many capabilities that I didn’t know about at the start of the hackathon, but by the end, I knew I’d definitely be exploring more of it.
On the whole, it was a very interesting Hackathon. I don’t know if I’ll attend another one like this. But I definitely know I’ll never forget this one.
I would like to thank a few people without whom I definitely wouldn’t have been able to take part in either the Slackathon or the Data HackITOut. Big thanks to Gennifer Berton and Nicholas Blurton-Jones for being two of the coolest managers around and letting me take part, and especially Gen for organising both events. I’d like to thank Liliana Paroski, a fellow co-op who devoted so much of her time to the Data HackITOut. Then the coaches, without whom we definitely wouldn’t have learnt nearly as much; thanks to Adam Wisniewski for helping our team a lot. The entire TCT team that I work with: only they know just how much effort they put into the event, and without them it just wouldn’t have happened. And lastly, thanks to Gary Feierabend, Louis Jacob and Hugh Cumming for enabling Manulife to host events like these!
Till next time HackITOut! 👋