
Most Expensive Healthcare App Security Fails in 2018–2019

MyFitnessPal, PumpUp, and Strava all failed to keep their users' data safe. Find out why, and what you can learn from these cases to make your app more secure.

December 1, 2020

In 2018, the average cost of a corporate data breach reached almost $4 million.

Let’s take a look at a few high-profile attacks to learn what went wrong and how to protect your business from similar risks.

1 MyFitnessPal

MyFitnessPal is a typical fitness application. It allows users to log cardio and strength exercises, connects with more than 50 devices and other apps, tracks steps, counts calories, and so on. Released in 2009, MyFitnessPal quickly gained popularity — it was chosen as the number one health and fitness app four years in a row. But everything changed in February 2018.

The MyFitnessPal data breach was probably one of the most publicized in the healthcare industry. Hackers accessed the personal data of almost 150 million users, stealing their names, hashed passwords, IP addresses, and email addresses. Fortunately, the criminals couldn’t get to users’ credit card and social security numbers, as this data was collected and stored separately. 
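One reason the damage was contained is that passwords were stored as hashes and the most sensitive data lived in a separate system. As a rough illustration (not MyFitnessPal’s actual code), here is a minimal Python sketch of salted password hashing using the third-party bcrypt package:

```python
# A minimal sketch of storing only salted password hashes, using the
# third-party `bcrypt` package (pip install bcrypt). This is a generic
# illustration, not MyFitnessPal's actual implementation.
import bcrypt

def hash_password(plain_password: str) -> bytes:
    # gensalt() embeds a per-password salt and a work factor in the hash
    return bcrypt.hashpw(plain_password.encode("utf-8"), bcrypt.gensalt())

def verify_password(plain_password: str, stored_hash: bytes) -> bool:
    # checkpw re-hashes the candidate using the salt stored in stored_hash
    return bcrypt.checkpw(plain_password.encode("utf-8"), stored_hash)

if __name__ == "__main__":
    stored = hash_password("correct horse battery staple")
    print(verify_password("correct horse battery staple", stored))  # True
    print(verify_password("wrong guess", stored))                   # False
```

Even if an attacker steals the database, salted hashes with a high work factor are far harder to crack than plain text or fast, unsalted hashes.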

Under Armour, the company that acquired MyFitnessPal in 2015, became aware of the data breach at the end of March 2018. Four days later, users started to receive notifications and emails urging them to change their passwords and offering recommendations on how to safeguard their accounts. In February 2019, the stolen personal details appeared on the dark web.

Other apps owned by Under Armour were not affected, but the company still lost 4.6% of its market value because of the data breach. However, the company and the app survived. MyFitnessPal still has a lot of users and pretty high ratings in the app stores (4.5 on Google Play).

What to learn from this case:

Hash passwords and store the most sensitive data, such as payment details and social security numbers, separately from everything else, so that a single breach doesn’t expose it all. And if a breach does happen, notify users quickly and tell them exactly how to protect their accounts.

2 PumpUp

PumpUp positions itself as the world’s most positive fitness community. It offers users numerous workouts and programs, an opportunity to learn more about fitness and get support from other members, and other features. After the app was released in 2012, it became rather popular.

The PumpUp data breach took place in May 2018, when the personal data of more than 6 million users was exposed. The compromised data included users’ locations, email addresses, genders, dates of birth, full-resolution profile photos, workout data, health information (such as weight and height), device data, and private messages. In some cases, even credit card data was exposed.

The incident happened because the core backend server, hosted on Amazon’s cloud, was left without a password for an unknown period of time. Anyone who found the server could see the private content of the app’s users.
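The fix for this class of mistake is conceptually simple: no backend endpoint should answer without a credential. Below is a hedged sketch of a deny-by-default check in a small Python/Flask API; the app, header name, and VALID_API_KEYS set are made up for illustration and have nothing to do with PumpUp’s real stack:

```python
# A minimal "deny by default" sketch using Flask (pip install flask).
# The route and VALID_API_KEYS set are hypothetical; the point is that
# every request must present a credential before any handler runs.
from flask import Flask, abort, request

app = Flask(__name__)
VALID_API_KEYS = {"replace-with-a-real-secret"}  # load from a secrets store in practice

@app.before_request
def require_api_key():
    key = request.headers.get("X-Api-Key")
    if key not in VALID_API_KEYS:
        abort(401)  # reject unauthenticated requests before they reach any route

@app.route("/profile/<user_id>")
def profile(user_id):
    return {"user_id": user_id, "status": "only reachable with a valid key"}

if __name__ == "__main__":
    app.run()
```

A check like this, combined with routine audits of which ports and services are reachable from the internet, is what keeps a forgotten server from becoming a public one.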

The company didn’t even find the exposed server itself; it was discovered by security researcher Oliver Hough, who contacted ZDNet, a business technology news website, to investigate the case. ZDNet spent a week trying to get in touch with PumpUp but received no reply. In the end, though, the server was secured.

Since PumpUp never commented on the breach, we can’t tell exactly how much money the company lost, but its reputation was definitely affected.

What to learn from this case:

Never expose a backend server to the internet without authentication, no matter how temporary the setup seems. Monitor your infrastructure so that you discover misconfigurations before outside researchers do, and if researchers or journalists reach out, respond quickly; silence only makes the damage worse.

3 Strava

Strava is a fitness app for tracking running, cycling, swimming, and other activities. It allows users to map and record their routes, analyze their activities, participate in challenges, etc. The app was released in 2009, and since then it has been installed more than 10 million times on Android OS alone (according to Google Play; no data on iOS downloads is available).

The story of the Strava failure began in November 2017, when the company released a global heat map showing running routes for all users who opted to make their data publicly available. To create the map, Strava used GPS data from smartphones and fitness tracking devices on 1 billion activities. This data was collected from 2015 to 2017. Over 27 million users tracked their routes during this time, and due to confusing privacy settings, some of them didn’t even know that they were sharing sensitive data. 

The heat map was Strava’s brainchild. But in January 2018, Nathan Ruser, an Australian student, noticed that by analyzing the map it was possible to determine the whereabouts of military bases and other sensitive locations.

Strava and its map got a lot of criticism. In response, the company didn’t delete the map, but rather changed it significantly. 

First, the data isn’t available to everyone anymore: to zoom in and see street-level detail, users now have to log in with their Strava account.

Second, the map is now updated monthly, which means that if a user changes their privacy settings and doesn’t want to provide data for the heat map anymore, their data won’t be included in the next month’s map. 

Third, roads and paths with little activity aren’t shown on the map until multiple users have recorded several types of activities there (not only running, for example).
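To make these changes concrete, here is a hedged Python sketch of the kind of filtering they imply; the Activity structure, thresholds, and opt-out set are invented for illustration and are not Strava’s actual pipeline:

```python
# A hedged sketch of heat-map filtering: drop data from users who opted
# out, and hide road segments until several distinct users and activity
# types have been seen there. All names and thresholds are made up.
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Activity:
    user_id: int
    sport: str          # e.g. "run", "ride"
    segment_ids: list   # road/path segments the activity passed through

MIN_USERS_PER_SEGMENT = 3
MIN_SPORTS_PER_SEGMENT = 2

def build_heatmap(activities, opted_out_users):
    users_per_segment = defaultdict(set)
    sports_per_segment = defaultdict(set)
    counts = defaultdict(int)

    for act in activities:
        if act.user_id in opted_out_users:
            continue  # monthly rebuild: opted-out users simply disappear
        for seg in act.segment_ids:
            users_per_segment[seg].add(act.user_id)
            sports_per_segment[seg].add(act.sport)
            counts[seg] += 1

    # keep only segments with enough distinct users and activity types
    return {
        seg: n for seg, n in counts.items()
        if len(users_per_segment[seg]) >= MIN_USERS_PER_SEGMENT
        and len(sports_per_segment[seg]) >= MIN_SPORTS_PER_SEGMENT
    }

if __name__ == "__main__":
    acts = [
        Activity(1, "run", ["seg-a"]),
        Activity(2, "ride", ["seg-a"]),
        Activity(3, "run", ["seg-a"]),
        Activity(4, "run", ["seg-b"]),  # seg-b has only one user and stays hidden
    ]
    print(build_heatmap(acts, opted_out_users={4}))  # {'seg-a': 3}
```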

To develop the heat map, Strava had to collect, analyze, and put together loads of data, which took money and a lot of time. Then the company had to update the map significantly, which meant unexpected additional expenses. 

What to learn from this case:

Confusing privacy settings are a security problem in themselves: users can’t protect data they don’t realize they’re sharing. Before releasing a feature built on aggregated user data, think through how that data could be misused, and make opting out simple and genuinely effective.

The Bottom Line

To protect your healthcare app from security failures, you have to pay attention to more than encryption and multi-factor authentication. As the Strava case shows, it’s also crucial to plan updates and new releases very carefully.

Follow these simple rules: run security tests and staff training on a regular basis, secure your app with multi-factor authentication and encryption, keep privacy settings simple, and analyze all potential outcomes.
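As one concrete example of the multi-factor piece, here is a minimal sketch of verifying a time-based one-time password (TOTP) as a second factor with the third-party pyotp package; the secret handling and account names are simplified placeholders:

```python
# A minimal TOTP second-factor sketch using `pyotp` (pip install pyotp).
# Secret storage and the user flow are simplified for illustration.
import pyotp

# Generated once per user during MFA enrollment and stored server-side;
# the same secret is loaded into the user's authenticator app.
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)

print("Provisioning URI for an authenticator app:",
      totp.provisioning_uri(name="user@example.com", issuer_name="ExampleHealthApp"))

# At login, after the password check, verify the 6-digit code the user types in.
code_from_user = totp.now()  # in a real flow this comes from the user's device
print("Second factor accepted:", totp.verify(code_from_user))
```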

And, obviously, if something does go wrong, stay in touch with your users.

