Four key principles for companies to use data targeting in a positive way.
By Dr David Stillwell, University Lecturer in Big Data Analytics & Quantitative Social Science at Cambridge Judge Business School and Academic Director of the Psychometrics Centre
BBC Click, the broadcaster’s tech programme, aired its 1,000th episode earlier this month. To celebrate, it created a “Choose your own adventure” TV show, in which you – the viewer – make choices about what to watch next. Do you want to watch a segment about self-driving vehicles or one about robotics innovations in Africa? Instead of everyone watching the same programme, the TV is personalised to you.
But there’s a twist. While you watch the TV, the TV watches you. The choices you make are recorded and used to profile you, including and especially the seemingly inconsequential ones.
For example, before the programme even starts you’re asked to choose between “I’m ready” and “LET’S DO THIS!!!!” The former, more restrained language is typical of introverts; the latter is more likely to be the exclamation of an extrovert. One little choice does not build a full profile, but after many choices a consistent picture starts to emerge.
The programme takes profiling to its logical conclusion. Once your personality has been profiled, you are shown an advert for a fictitious Smart Glasses product – but the specific advert you see is tailored to your profile. If you’re extroverted, the advert might show a thrilling music festival crowded with thousands of people. If you’re introverted, it might show a quiet evening at home with your loved ones.
The BBC Click personality profiling, which I advised the programme makers on, mirrors my own research. I found that Facebook Likes can predict personality and other intimate traits. Even a few Likes per week add up to a consistent picture of a user after years of social media use.
This type of profiling predates the digital era: credit cards and supermarket loyalty cards, now some 70 years old, have long collected huge amounts of data for marketers. But in the age of the Alexa smart speaker and ubiquitous social media, this type of profiling happens all the time – so unlike cutting up a credit card, it’s now virtually impossible to opt out of having data like this recorded.
Regulators are currently grappling with what to do about such targeting, if anything. But the challenge boils down to this: whether targeting is positive or negative depends on how it is used.
People quite like personalised recommendations of what to read next on their Kindle or what to listen to next on Spotify – and we even found that most people prefer personalised adverts to generic ones. But data can also be used against us in ways that make us uncomfortable: we don’t understand why we see the adverts we do, which many people find creepy.
So I suggest four key principles to help companies use personal data in a way that makes people comfortable, as I outlined in presentations at the EU Parliament and to the EU’s Data Protection Supervisor earlier this year.
- Be transparent about what data is being used and for what purposes. People want to know not just what data is being collected from them, but what it’s going to be used for.
- Explain why the data is relevant for those purposes. In 2016, when Admiral car insurance proposed to use Facebook data to set insurance premiums, people didn’t understand what their social media profile had to do with their driving behaviour.
- Put users in control by asking them to opt in and giving them the option to opt out if they change their mind. This needs to be as easy to use as the simple and sleek apps created by tech companies.
- Use data in a way that benefits not just the company but also the people who provided the data. Ultimately, executives should not ask “how can we use data to make more money?” but instead “how can we use data to make a better product?”
Clear articulation of some key principles – coupled with public pressure for compliance – will go a long way to ensuring that data need not be a bad four-letter word.