One of the largest life insurance providers in North America, John Hancock, will no longer offer policies that do not include digital fitness tracking.
On one hand, policyholders can earn discounts and rewards, such as gift cards, for hitting exercise targets; on the other, privacy advocates warn that insurers could use tracking data to punish customers who fail to meet them.
Are companies increasingly gambling with sensitive user data or are schemes like this one a natural progression of our digital lives? We asked a handful of privacy professionals and advocates of the personal data economy for their thoughts.
Here’s what they said…
Studies have shown that simply giving individuals access to their health data leads to healthier living and reduced healthcare spending, so digital fitness tracking may well have the same effect.
Sharing data in return for a reward offers a benefit to both sides, so it is also welcome – although less welcome is any suggestion that those not hitting targets will be punished. Insight into our personal data is a powerful force: the more we are able to see, the more we can learn about our behaviour and routines, and then use this knowledge to make better-informed decisions about our lives.
But this data should be held by, and visible to, us – it’s ours; we created it, and thus we should own it. That data, building over time, is of most use to us, and so it should remain accessible to us.
Companies don’t need to hold this data. Storage is a huge expense, and they give themselves security headaches by doing so, because they have to ensure it is secure and GDPR-compliant.
Insurance companies are in the business of maths: they sell probabilities of things NOT happening. Getting access to your private data, such as your activity levels, travel movements, and the speed at which you drive your car, is pure gold for them, because they will end up making more money knowing what might NOT happen to you – and they can deny you services if they think they’ll need to pay out.
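The "business of maths" point can be made concrete with a toy sketch. The figures and the simple pricing rule below are invented for illustration (real actuarial models are far more involved): a premium is priced from the estimated probability of a payout, so any data that sharpens that estimate – activity levels, driving speed – is directly worth money to the insurer.

```python
# Toy illustration of expected-value premium pricing for a one-year
# term policy. All numbers are invented for the example.

def fair_premium(p_claim: float, payout: float, loading: float = 0.2) -> float:
    """Expected payout plus a loading for costs and profit."""
    return p_claim * payout * (1 + loading)

# Two otherwise identical applicants whose tracker data suggests
# different claim probabilities:
sedentary = fair_premium(p_claim=0.004, payout=500_000)  # roughly 2400
active    = fair_premium(p_claim=0.002, payout=500_000)  # roughly 1200

print(sedentary, active)
```

Halving the estimated claim probability halves the premium the insurer needs to charge – which is exactly why behavioural data is so valuable to them, and why the incentive to penalise "bad" data is built in.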
But this is a “poisoned fruit” type of situation: it looks great, but it can cause serious damage once eaten. First, all companies can be hacked, and identity theft and identity cloning are the new types of crime, in which a bad actor steals someone’s private digital trail and uses it to commit crimes. All centralized databases exist in only two states: hacked, or about to be hacked. Second, the data can be used by bad actors to track people for their own ends. For example, terrorists can hire professional hackers to get very detailed intel on their victims.
Finally, there’s the human factor. This data can be exploited, and some private data is sensitive. For example, by correlating physical data with GPS (and this is what most trackers do), one could prove that someone is cheating on their partner. We have to be absolutely clear that by providing data feeds to a third party, we are at the same time providing “collateral data” or “derived data” as well. This means we are essentially handing a third party the tools to make assumptions and decisions about who you are and what you do – yet, as consumers, we have absolutely no control over how they will use this data.
Requiring the use of fitness trackers in order to qualify for life insurance raises all sorts of red flags. Not least among them, who has access to this data outside of the customer and life insurance company? Presumably, Apple and Fitbit also have access to the data generated from their devices, which adds a third party to the mix. We can also assume that the government can access these records, at least with a court order.
Car insurance companies use a similar model, giving customers tracking devices that plug into their vehicle dashboards so the company can track driving habits and adjust premiums accordingly – but the time frame for using one of these devices is usually only a few months, not a lifetime.
The thought of an insurance company looking over your shoulder every day to make sure you’ve got your steps in seems very dystopian. Where do we draw the line? In theory, companies and the government could force everyone to exercise and live longer, but would it make us happier? Happiness does not figure into life insurance companies’ bottom line.
Furthermore, fitness trackers can be tricked. If more companies start going this route, we’ll no doubt start seeing any number of hacks and workarounds to emulate steps and other metrics measured by fitness trackers. Equally, what about people who engage in exercise that isn’t measured by fitness trackers? They measure steps, but do they measure how far a user has biked, climbed, or swum? Whether or not the data is used to discriminate against less healthy people, the whole idea seems flimsy.
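The "trackers can be tricked" claim follows from how most pedometers count steps. The sketch below is a deliberately naive, assumed algorithm (not any vendor's actual firmware): threshold-based peak detection on accelerometer magnitude. Any rhythmic shaking that produces walking-like peaks registers as steps, while smooth motion such as cycling or a swimming stroke may register nothing.

```python
import math

def count_steps(accel_xyz, threshold=11.0):
    """Count peaks in accelerometer magnitude above a threshold.

    accel_xyz: list of (x, y, z) readings in m/s^2 (gravity included).
    A 'step' is a sample whose magnitude exceeds the threshold and is
    a local maximum relative to its neighbours - a simplified version
    of what many pedometers do.
    """
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in accel_xyz]
    steps = 0
    for i in range(1, len(mags) - 1):
        if mags[i] > threshold and mags[i] >= mags[i - 1] and mags[i] > mags[i + 1]:
            steps += 1
    return steps

# Holding still: magnitude stays near gravity (~9.8 m/s^2), no steps.
still = [(0.0, 0.0, 9.8)] * 10

# Rhythmically shaking the device produces the same peaks walking would:
shaken = [(0.0, 0.0, 9.8), (0.0, 0.0, 13.0)] * 5

print(count_steps(still), count_steps(shaken))
```

The shaken trace scores steps without a single one being walked, which is the whole weakness: the device measures acceleration spikes, not exercise.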
With the advent of fitness trackers, we are seeing a growing number of concerns. In the UK, there is already speculation that, following the latest developments by Apple, the already stretched NHS will not be able to cope with the volume of automatic calls from devices set up to call a doctor in the event of an incident. While not directly related to the insurance issue, this shows that fitness trackers are developing and, hopefully, becoming increasingly reliable and consistent – yet you still see differences in even simple measurements depending on the technology used.
A more concerning issue is what happens to the data collected. With large volumes of data come the perils and potential biases of AI. Without stretching the imagination too far, it is easy to see machines working out patterns, diagnosing future health issues and, from a healthcare perspective, starting pre-emptive treatments.
Whilst this would certainly be a major development for humanity, with the wrong bias applied it is easy to see premiums being increased or policy renewals being refused on the basis of predictive data, business cash flows and treatment costs. It would also become increasingly important to check the small print of the policy, watching for requirements such as daily data uploads or minimum exercise levels.
Even with anonymised data, the only positive outcome of mass collection of this kind is if it is applied to identifying and providing preventative treatments.