How often do you read privacy notices before agreeing to them? If the answer is "not often", you are not alone. One study found that the average user would need 76 days per year to read the privacy notices for every service they use.
That is just one reason it is not always best to leave it to individuals to protect their own data privacy.
Several cognitive biases also come into play. One is hyperbolic discounting: the tendency to weight the present moment far more heavily than future consequences. For example, we happily share pictures of a raucous party on Instagram without considering how that might affect future employment prospects. People are also often willing to give away their data in exchange for using a service straight away.
Information collected by your car can be used to create useful new services - but privacy must also be protected.
In today's connected world, there is perhaps more social pressure than ever to engage with certain services whatever the privacy demands. This means consumers often do not have a meaningful choice.
Then there is loss aversion, where the pain of losing something is roughly twice as powerful as the pleasure of gaining it. Once people 'own' their privacy by default, studies show they demand around twice as much to give it up as they would have paid to acquire it in the first place. Looked at purely financially, privacy-by-default does not pay off for companies. This is one of the reasons we have seen increasing regulation in this area recently.
You can tell a lot about a person from their location data, especially when you add time to the equation. There is a reputational risk with some information. For example, the patterns and variations in someone's movements can reveal they regularly go somewhere they would prefer others not to know.
What is even more dangerous is that location data can present a safety risk. If it is possible to find out where you will be at any given moment from looking at patterns, then malicious actors could use that for harmful ends.
Peer pressure can make it difficult to resist giving away some kinds of privacy.
What makes location data so sensitive is what also makes it so valuable for many services in an increasingly autonomous world. We want to strike the balance between preserving the value of the data and protecting the privacy of individuals.
"We often talk about protecting data privacy, but what that means is protecting the humans behind it." - Aleksandra Kovacevic, Senior Privacy Manager, HERE
Thankfully, there are many innovations in privacy technology and methodology which can help. It is possible to minimize the data to ingest only what is really necessary, stripping all those attributes that can lead uniquely to the individual. There are also developments in privacy engineering that can make privacy more robust. One recent such trend is federated learning. This is where instead of sending raw data from your phone to the cloud or to some central unit, it is processed at the 'edge', on your device.
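To illustrate the principle behind edge processing, here is a toy sketch in Python with invented data. Each device reduces its raw trace to a minimal summary on-device and shares only that summary; the raw data never leaves the phone. Real federated learning exchanges model updates rather than simple averages, so treat this as a sketch of the idea, not an implementation.

```python
# Toy sketch: raw traces stay on each device; only a minimal
# aggregate (mean, count) is shared with the central server.

def local_update(raw_trace):
    """Runs on-device: reduce raw data to a minimal summary."""
    return sum(raw_trace) / len(raw_trace), len(raw_trace)

def federated_average(device_summaries):
    """Runs centrally: combine summaries weighted by sample count."""
    total = sum(mean * n for mean, n in device_summaries)
    count = sum(n for _, n in device_summaries)
    return total / count

# Hypothetical per-device measurements (never sent anywhere as-is).
devices = [[10.0, 12.0], [8.0], [9.0, 11.0, 10.0]]
summaries = [local_update(trace) for trace in devices]
global_mean = federated_average(summaries)
```

The server learns the population-level statistic it needs without ever seeing an individual measurement.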
More and more businesses are becoming data-driven. Many want to use location intelligence about repeated patterns for mobility planning, or other beneficial services. Decisions based on data are also seen as being more objective.
At the same time, data is collected from individuals. We often talk about protecting data privacy, but what that means is protecting the humans behind it.
There have been several high-profile misuses of data in recent years, such as the Cambridge Analytica scandal. Awareness of privacy and what happens to our data is growing as a result, and regulation is becoming stricter, forcing companies to act.
A study in 2020 by IBM Security found the average cost per data breach that year was US$3.86 million.
The more data we ingest, the riskier it is – and we are collecting and ingesting more and more data. This means strong controls that go beyond compliance must be in place to make sure that slips do not happen.
A lot of companies took notice once GDPR – one of the strictest privacy laws so far – came into force. They set up assurance teams to handle privacy in a reactive manner, when a product is about to be launched.
There are two problems with this approach. Firstly, it treats privacy as an afterthought. You can imagine the tremendous pressure on privacy officers not to block a product when lost profits are at stake. Secondly, to balance the utility and privacy of data most effectively, privacy must be considered from the beginning and infused into the core of the product, instead of just given a stamp of approval at the end.
Location data can be sensitive and challenging.
Privacy can ultimately become one of the differentiating features. The tremendous success of Signal, for example, is not because users can send videos more effectively than they can with WhatsApp – they cannot. It is purely because they are sure that privacy is protected with Signal.
We cannot rely entirely on our privacy officers or focus narrowly on compliance at the expense of privacy engineering. We need to empower everyone in the organization, increasing awareness and putting best practices in place. Product managers are the 'owners' of the product and must demand and own privacy for that product.
At HERE, we implement privacy-by-design practices from the ideas stage throughout the entire product lifecycle. For example, we aim to ingest the minimum amount of data that we need to provide our services. Our retention – how long we keep data in use – is balanced as well. We inform users clearly and simply how we are using their data.
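Data minimization at ingestion can be as simple as an allow-list. The sketch below is a generic illustration (the field names are invented, not HERE's schema): only the attributes a service actually needs are kept, so direct identifiers never enter the pipeline.

```python
# Hypothetical sketch of data minimization at ingestion: keep only
# the fields a service needs and drop directly identifying ones.
# Field names are invented for illustration.

ALLOWED_FIELDS = {"lat", "lon", "timestamp"}  # what the service needs

def minimize(record):
    """Drop every attribute not on the allow-list before ingestion."""
    return {k: v for k, v in record.items() if k in ALLOWED_FIELDS}

raw = {"lat": 52.53, "lon": 13.38, "timestamp": 1714000000,
       "device_id": "abc-123", "owner_email": "user@example.com"}
clean = minimize(raw)  # identifiers never enter the pipeline
```

An allow-list is safer than a block-list here: a new identifying field added upstream is dropped by default instead of leaking through.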
It is crucial to think about privacy over the entire lifecycle of the product. As long as the product exists, the data exist – and therefore, privacy might be at risk. A lot of changes can happen in that time, which we are only able to catch if we are monitoring that risk. Changes, enhancements or even maintenance to the product can all affect privacy.
You should also make sure it is just as easy for individuals to withdraw their consent as it is to give it in the first place. This is HERE's approach with products such as HERE WeGo, for example.
Beyond internal privacy-by-design and privacy engineering practices, we provide an anonymization tool called HERE Anonymizer. It can be used by data providers and other entities to anonymize location data in a compliant and cost-effective way.
This advanced tool carefully balances the utility of the location data for targeted and selected uses with the need for privacy. It has a strong privacy assessment tool that enables the user to measure the balance between anonymization and utility.
It can process location data at scale while complying with privacy regulations such as GDPR and CCPA.
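To make the anonymization-versus-utility trade-off concrete, here is a generic sketch of one common technique: spatial generalization with k-anonymity-style suppression. This is not the HERE Anonymizer API; the coordinates and parameters are invented. A coarser grid and a higher k protect more individuals but tell you less about where people actually are.

```python
# Illustrative sketch: snap points to a coarse grid, then suppress
# any cell observed fewer than k times. Not the HERE Anonymizer API.
from collections import Counter

def generalize(points, cell_size, k):
    """Grid-snap (lat, lon) points and drop cells with < k points."""
    cells = Counter(
        (round(lat / cell_size) * cell_size,
         round(lon / cell_size) * cell_size)
        for lat, lon in points
    )
    return {cell: n for cell, n in cells.items() if n >= k}

points = [(52.5301, 13.3847), (52.5307, 13.3841),
          (52.5299, 13.3846), (48.1374, 11.5755)]  # one lone outlier
safe = generalize(points, cell_size=0.01, k=2)
# The three clustered points survive as one cell; the single
# outlier's cell has only one observation and is suppressed.
```

Tuning `cell_size` and `k` is exactly the balance a privacy assessment has to measure: larger values mean stronger anonymization, smaller values mean higher utility.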
We also provide a consent management tool that gives users fine-grained control over their data and transparency about the purposes for which it is requested.
HERE Consent Management is blockchain-based, which provides additional decentralization and therefore additional trust.
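The essential behavior of purpose-scoped consent can be sketched in a few lines. The class and method names below are invented for illustration (this is not the HERE Consent Management API); the points to notice are that consent is recorded per purpose, that the default is deny, and that withdrawing is exactly as easy as granting.

```python
# Minimal sketch of purpose-scoped consent (invented names, not the
# HERE Consent Management API).

class ConsentStore:
    def __init__(self):
        self._grants = {}  # (user, purpose) -> bool

    def grant(self, user, purpose):
        self._grants[(user, purpose)] = True

    def withdraw(self, user, purpose):
        # Withdrawal is a single call, symmetric with granting.
        self._grants[(user, purpose)] = False

    def allowed(self, user, purpose):
        # Default-deny: no record means no consent.
        return self._grants.get((user, purpose), False)

store = ConsentStore()
store.grant("alice", "traffic_analytics")
before = store.allowed("alice", "traffic_analytics")  # True
other = store.allowed("alice", "advertising")         # False, never granted
store.withdraw("alice", "traffic_analytics")
after = store.allowed("alice", "traffic_analytics")   # False again
```

Every data access then checks `allowed(user, purpose)` rather than a blanket opt-in, which is what makes the control fine-grained.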
Companies need to go beyond pure compliance and act responsibly, protecting data privacy for the individual. As awareness of privacy risks grows, businesses are asking less what the business value of privacy is and are looking more at the opportunity to offer privacy as a differentiating value. Transforming the company mindset so it becomes a privacy-first business can put you in the best position to gain competitive advantage.