In 2018, the average cost for a corporate data breach reached almost $4 million.
Let’s take a look at a few of these attacks to learn what went wrong and how you can secure your business against such risks.
1 MyFitnessPal
MyFitnessPal is a typical fitness application. It allows users to log cardio and strength exercises, connects with more than 50 devices and other apps, tracks steps, counts calories, and so on. Released in 2009, MyFitnessPal quickly gained popularity — it was chosen as the number one health and fitness app four years in a row. But everything changed in February 2018.
The MyFitnessPal data breach was probably one of the most publicized in the healthcare industry. Hackers accessed the personal data of almost 150 million users, stealing their names, hashed passwords, IP addresses, and email addresses. Fortunately, the criminals couldn’t get to users’ credit card and social security numbers, as this data was collected and stored separately.
Under Armour, the company that acquired MyFitnessPal in 2015, became aware of the data breach at the end of March 2018. Four days later, users started to receive notifications and emails urging them to change their passwords and offering recommendations on how to safeguard their accounts. In February 2019, the stolen personal details appeared on the dark web.
Other apps owned by Under Armour were not affected, but the company still lost 4.6% of its market value because of the data breach. However, the company and the app survived. MyFitnessPal still has a lot of users and pretty high ratings in the app stores (4.5 on Google Play).
What to learn from this case:
- MyFitnessPal should have been equipped with two-factor authentication. For a mobile application, we would recommend using biometric authentication or at least push notifications (see the sketch after this list for a minimal two-factor flow).
- Reliable encryption is a must for companies that are serious about privacy and security.
- For the majority of passwords, Under Armour used the Bcrypt hashing function, which is a reliable mechanism. But for the remaining passwords, the company used the much weaker SHA-1. Using Bcrypt for all passwords could have reduced the scope of the breach (the same sketch after this list shows Bcrypt hashing in practice).
- Collecting and storing the most important data separately is a great practice — it kept credit card data safe. Otherwise, Under Armour could have faced a much more serious loss in market value.
- If a breach happens, it’s essential to notify users as fast as possible — keeping silent will simply destroy your company’s reputation. Under Armour did well here.
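To make the first and third lessons concrete, here’s a minimal Python sketch of a login flow that combines Bcrypt password hashing with a TOTP second factor. The bcrypt and pyotp library calls are real, but the user record and function names are illustrative; this is not MyFitnessPal’s actual implementation.

```python
# Illustrative login flow: bcrypt-hashed password plus a TOTP second factor.
# Library calls (bcrypt, pyotp) are real; the user store is hypothetical.
import bcrypt
import pyotp

def register(password: str) -> dict:
    """At sign-up: hash the password and issue a TOTP secret."""
    return {
        "password_hash": bcrypt.hashpw(password.encode("utf-8"), bcrypt.gensalt()),
        # The secret would be shared with the user's authenticator app.
        "totp_secret": pyotp.random_base32(),
    }

def login(user: dict, password: str, otp_code: str) -> bool:
    """At login: verify the password first, then the one-time code."""
    if not bcrypt.checkpw(password.encode("utf-8"), user["password_hash"]):
        return False
    return pyotp.TOTP(user["totp_secret"]).verify(otp_code)

user = register("correct horse battery staple")
code = pyotp.TOTP(user["totp_secret"]).now()
print(login(user, "correct horse battery staple", code))  # True
```

The point of Bcrypt here is that it’s deliberately slow and salted, so even a stolen database of hashes is expensive to crack; fast digests like SHA-1 offer no such protection.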
2 PumpUp
PumpUp positions itself as the world’s most positive fitness community. It offers users numerous workouts and programs, an opportunity to learn more about fitness and get support from other members, and other features. After the app was released in 2012, it became rather popular.
The PumpUp data breach took place in May 2018, when the personal data of more than 6 million users became publicly accessible. The compromised data included users’ locations, email addresses, genders, and dates of birth, full-resolution profile photos, workout data, health information (for instance, weight and height), device data, and private messages. In certain cases, even credit card data was exposed.
The incident happened because the core backend server hosted on the Amazon Cloud was left without a password for an indefinite amount of time. Anyone could see the private content of the app’s users.
The exposed server wasn’t even found by the company — it was discovered by security researcher Oliver Hough, who then contacted ZDNet, a business technology news website, to investigate the case. ZDNet spent a week trying to get in touch with PumpUp but received no reply. In the end, however, the server was secured.
Since there were no comments from PumpUp after the breach, we can’t tell exactly how much money they lost. But their reputation was definitely affected.
What to learn from this case:
- To avoid this problem, PumpUp should at least have protected the server with a password. Ideally, this would have been combined with two-factor authentication to keep users’ data safe; note that Microsoft suggests avoiding call- and SMS-based authentication.
- It seems that the company didn’t run any security tests — regular security scanning would have helped them notice the exposed server much earlier. We recommend performing such tests on a regular basis (see the sketch after this list).
- Another mistake PumpUp made was ignoring both ZDNet’s inquiries and the incident itself. If a breach happens, a company should stay in touch to show users that it cares.
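As a small illustration of the kind of test mentioned above, here’s a minimal Python sketch of a recurring check that private endpoints actually refuse unauthenticated requests. The endpoint URLs are hypothetical; in practice, a check like this would run from CI or a scheduled job, alongside broader scans for exposed services.

```python
# Minimal recurring check: private endpoints must reject anonymous requests.
# The endpoint URLs below are hypothetical placeholders.
import requests

PRIVATE_ENDPOINTS = [
    "https://api.example.com/v1/users/me",
    "https://api.example.com/v1/messages",
]

def requires_auth(url: str) -> bool:
    """A request with no credentials should come back as 401 or 403."""
    response = requests.get(url, timeout=10)
    return response.status_code in (401, 403)

for url in PRIVATE_ENDPOINTS:
    print(("OK      " if requires_auth(url) else "EXPOSED ") + url)
```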
3 Strava
Strava is a fitness app for tracking running, cycling, swimming, and other activities. It allows users to map and record their routes, analyze their activities, participate in challenges, etc. The app was released in 2009, and since then it has been installed more than 10 million times on Android OS alone (according to Google Play; no data on iOS downloads is available).
The story of the Strava failure began in November 2017, when the company released a global heat map showing running routes for all users who opted to make their data publicly available. To create the map, Strava used GPS data from smartphones and fitness tracking devices on 1 billion activities. This data was collected from 2015 to 2017. Over 27 million users tracked their routes during this time, and due to confusing privacy settings, some of them didn’t even know that they were sharing sensitive data.
At first, the map looked like a triumph for Strava. But in January 2018, Nathan Ruser, an Australian student, noticed that by analyzing the map, it was possible to determine the whereabouts of military bases and other sensitive locations.
Strava and its map got a lot of criticism. In response, the company didn’t delete the map, but rather changed it significantly.
First of all, the data isn’t available to everyone anymore — to zoom in and see street-level detail, users now have to log in with their Strava account.
Second, the map is now updated monthly, which means that if a user changes their privacy settings and doesn’t want to provide data for the heat map anymore, their data won’t be included in the next month’s map.
Third, roads and paths with little activity aren’t shown on the map until multiple users (not just runners, for example) have recorded different activities on them.
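To illustrate the idea behind that third change, here’s a minimal Python sketch of threshold-based aggregation. It is not Strava’s actual pipeline, and the grid size and user threshold are made up; the point is that a location is published only after enough distinct users have been seen there.

```python
# Bucket GPS points into grid cells and publish only cells visited by at
# least `min_users` distinct users. Grid size and threshold are illustrative.
from collections import defaultdict

def build_heatmap(points, cell_size=0.001, min_users=3):
    """points: iterable of (user_id, lat, lon) tuples."""
    cells = defaultdict(set)
    for user_id, lat, lon in points:
        cell = (round(lat / cell_size), round(lon / cell_size))
        cells[cell].add(user_id)
    # Cells with too few distinct users would reveal individual routes.
    return {c: len(users) for c, users in cells.items() if len(users) >= min_users}

# A lone runner's route never makes it into the published map.
points = [("alice", 48.8580, 2.2945)] + [
    (user, 48.8600, 2.3370) for user in ("bob", "carol", "dave")
]
print(build_heatmap(points))  # only the cell with three users survives
```

The real lesson is the thresholding itself: aggregate data is only anonymous when enough people contribute to each published bucket.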
To develop the heat map, Strava had to collect, analyze, and put together loads of data, which took money and a lot of time. Then the company had to update the map significantly, which meant unexpected additional expenses.
What to learn from this case:
- In the case of Strava, there were no hackers or other criminals — the company gave out important information on its own. There was no social engineering either, as no fraud was involved. Strava simply didn’t pay enough attention to the potential outcome, and that was its main mistake: it didn’t anticipate the consequences. Explaining the importance of security and privacy to the entire team and training staff on a regular basis probably couldn’t have prevented this incident entirely. But if the Strava staff had thought about the possible implications, they would have noticed that something was wrong during the map development phase.
- Privacy settings should not be confusing. Users must be able to set everything up easily and quickly. If privacy settings had been clearer, most users would have been able to prevent their private data from being published.
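On that last point, defaults matter as much as clarity. Here’s a minimal sketch of privacy-by-default settings; the field names are illustrative and don’t come from any of the apps above.

```python
# Privacy by default: every sensitive setting starts off, and sharing
# requires a single explicit opt-in. Field names are illustrative.
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    share_location: bool = False       # off until the user opts in
    include_in_heatmap: bool = False   # off until the user opts in
    public_profile: bool = False

settings = PrivacySettings()
assert not settings.include_in_heatmap  # nothing is shared by default
```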
The Bottom Line
To protect your healthcare app from security mistakes and failures, you have to pay attention to more than encryption and multi-factor authentication. As the Strava case shows, it’s also crucial to plan updates and new releases very carefully.
Follow these simple rules: run security tests and staff training on a regular basis, secure your app with multi-factor authentication and encryption, keep privacy settings simple, and analyze all potential outcomes.
And, obviously, if something does go wrong, stay in touch with your users.