Vital statistics

How has enterprise adapted to changing attitudes on data protection? JASON WALSH investigates
(Image: Stockfresh)

15 July 2019

Everyone now knows that not only do we all leave a digital trail behind us, but also that many a business has been built on data collection in recent years. The question then is: with new legislation in place, not to mention a clear cultural shift underway, can businesses make use of data while retaining the trust of consumers?

Analysts Gartner certainly seem to think so: among the firm’s ‘Top 10 Strategic Technology Trends for 2019’ was privacy.

“Consumers have a growing awareness of the value of their personal information, and they are increasingly concerned with how it’s being used by public and private entities. Enterprises that don’t pay attention are at risk of consumer backlash,” the report stated.

This has not fallen from the sky, of course; Gartner is measuring consumer sentiment — and it is not alone. Indeed, a June 2019 EU Eurobarometer survey on data protection demonstrated that privacy is not just an issue in boardrooms and think tanks.

The survey of 27,000 Europeans found that 73% have heard of at least one of their six digital privacy rights.

The highest levels of awareness among citizens are recorded for the right to access their own data (65%), the right to correct the data if they are wrong (61%), the right to object to receiving direct marketing (59%) and the right to have their own data deleted (57%).

On the other hand, there is a long way to go on transparency.

Regaining control

Speaking about the survey, Věra Jourová, EU commissioner for justice, stated: “Helping Europeans regain control over their personal data is one of our biggest priorities. But, of the 60% [of] Europeans who read their privacy statements, only 13% read them fully. This is because the statements are too long or too difficult to understand. I once again urge all online companies to provide privacy statements that are concise, transparent and easily understandable by all users. I also encourage all Europeans to use their data protection rights and to optimise their privacy settings”.

Speaking to TechPro, Dave Lewis of the School of Computer Science & Statistics at Trinity College Dublin says that privacy is more a question for the political sphere than something that can simply be left to business or technologists. This has started to happen, at least at a European level, but it has not yet moved TDs.

“It becomes a political issue: who owns the fundamental resource of our age? It needs to move up the political chain, but very few politicians would be campaigning on data rights,” he said.

Given that Ireland’s economy is dependent on foreign direct investment (FDI), notably in technology and, these days, data-centric operations, there is a concern that these businesses would not welcome political interference. Lewis says this is not necessarily the case.

“Alexa transcribing what people said was in the terms and conditions. What people did not necessarily understand was that this would be done by humans.” – Dave Lewis, Trinity College Dublin

“Are we in a good position to be talking to those FDI companies that we are reliant on to say ‘this is in your long term interests’?”

Privacy, though, is so fundamental, he says, that it needs to be asserted in terms of inalienable rights. “If you regulate something the political wind can always change, but if you give people more of a fundamental right to your data ownership that’s different,” he says.

This has started to happen with the introduction of the EU’s general data protection regulation (GDPR).

Opaque policies

For Lewis, an associate professor of computer science and associate director at the Science Foundation Ireland-funded Adapt centre for research in digital content and media innovation, one thing that will have to change is privacy policies. They are often, he says, much too opaque.

“Alexa transcribing what people said was in the terms and conditions. What people did not necessarily understand was that this would be done by humans,” he said.

“There’s more work to be done to put the institutions in place. The Data Protection Commissioner’s office has been around for a while, but it really has only had the powers for a very short time,” he said.

“You do have some rights under [the] GDPR but part of the problem is that you exert your rights individually and some companies are thinking ‘we can afford a certain number of people not using our services’. There is a lot of room there for collective action, but we’re not living in an age where people are really doing that.”

Some sectors that Adapt has been working with are taking an active approach.

“Medical data, drugs trials and things like that – they’re starting to experiment with having a patient group with whom you consult and even train them to understand the issues… public spirited organisations like hospitals [but] with companies it’s a slightly different story,” said Lewis. “The companies need to play ball – but there’s a growing realisation that they’re starting to lose the trust of their customers.”

Lewis contrasts this individual assertion of rights with the collective bargaining power of trade unions.

“Even focussing on the individual means we’re not focussing on what we need to do collectively,” he said. “It’s a bit like recognising trade unions versus a consumer choice approach.”

New business model

Some businesses are taking privacy seriously, though, moving beyond GDPR compliance toward offering privacy as a service.

“We’re only starting to see the emergence of privacy-enhanced technology but if you look at the start-ups you see a lot of people working on this,” said Aoife Sexton, chief privacy officer at Truata, a business that works to keep its customers on the right side of the law when it comes to privacy.

“That’s one side of it. On the other side of it, we’re starting to see some [large] companies differentiating with privacy. You only need to look at Tim Cook and Apple to see that.”

Sexton’s professional background is as a technology lawyer.

“People are looking more at data accuracy than just collecting vast amounts of data and also demonstrating a specific legal basis to do it and describing that to consumers.” – Aoife Sexton, Truata

“A number of years ago I got interested in privacy,” she said. The data land grab is now over, she says, but businesses need to demonstrate a commitment to privacy, not just make declarations. “It’s one thing to say it, but how do you demonstrate it?”

Truata’s goal is to make data anonymous, and thus compliant, while still allowing it to be used.

“If we can anonymise data so that it’s no longer personal but I can see trends and certain things that I can learn, why wouldn’t I do that?”
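The idea Sexton describes — stripping data of anything personal while keeping the trends — can be illustrated with a minimal sketch. To be clear, this is not Truata’s actual method; the field names, age bands and suppression threshold below are all hypothetical, in the spirit of k-anonymity-style aggregation.

```python
# Illustrative sketch only: anonymise records so trends survive but
# individuals do not. Fields and thresholds here are hypothetical,
# not Truata's actual technique.
from collections import Counter

K = 3  # suppress any group smaller than K people


def age_band(age):
    """Generalise an exact age into a 10-year band, e.g. 34 -> '30-39'."""
    low = (age // 10) * 10
    return f"{low}-{low + 9}"


def anonymised_trends(records):
    """Drop direct identifiers (names), generalise quasi-identifiers
    (exact age -> band), and report only groups of at least K people."""
    groups = Counter((age_band(r["age"]), r["county"]) for r in records)
    return {group: n for group, n in groups.items() if n >= K}


records = [
    {"name": "A", "age": 34, "county": "Dublin"},
    {"name": "B", "age": 36, "county": "Dublin"},
    {"name": "C", "age": 31, "county": "Dublin"},
    {"name": "D", "age": 52, "county": "Cork"},  # lone record: suppressed
]
print(anonymised_trends(records))  # {('30-39', 'Dublin'): 3}
```

The trend (three thirty-somethings in Dublin) survives; the single identifiable Cork record does not, which is the trade-off Sexton’s “why wouldn’t I do that?” points at.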

This means that data collection need not be an overreach; it is also an opportunity to ensure that the data collected is actually useful.

“Historically it was a land grab. Certainly, the last 10 years it was ‘data is the new oil’ [and] ‘let’s collect as much as we can, we really don’t know what for, but we’ll collect it and work out what to do with it later’.

“GDPR has changed that. People are looking more at data accuracy than just collecting vast amounts of data and also demonstrating a specific legal basis to do it and describing that to consumers.”

Data protection duties

Naturally, the Office of the Data Protection Commissioner (DPC) has a lot to say on the issue. Speaking to TechPro, Graham Doyle, deputy commissioner and spokesman for the office, says that the shift in the ground on data and privacy is demonstrable, not mere speculation.

“There’s definitely been a change,” he said. “There’s research out there that shows there’s been a change, both nationally and across the EU.”

This does not just apply to tech companies and major multinationals, either, he says: the GDPR is doing its job right across industry, including in the small and medium enterprise (SME) sector.

“We conducted surveys looking at SMEs and the public sector. With the SMEs in particular, we ran a survey 12 months out from the GDPR and it showed really low levels of awareness. Then [when] we did a later survey we saw a significant increase.

“That’s one barometer,” he said.

When it comes to data, the data is in, he says. One metric is that the DPC is, as is now required in law, being made aware of the appointment of data protection officers in organisations.

“We’ve had 1,300 data protection officer notifications since [the] GDPR [became law],” said Doyle.

“Another measurement that can be used is in terms of breach notifications: we’ve seen over a doubling of breach notifications since [the] GDPR came in, which, again, is significant because it shows that companies are aware of their responsibilities.”

“…we’ve seen over a doubling of breach notifications since [the] GDPR came in.” – Graham Doyle, Office of the Data Protection Commissioner

The other side of the equation also demonstrates change: the public is more aware than ever that not only does it have rights where privacy is concerned, but also that it has a means of exerting those rights.

“On the consumer side, we’ve seen well over a doubling of complaints. We were at around 2,500 [complaints] in 2017 – and that was the highest we ever had. Since [the] GDPR it’s been over 7,000.”

“Our helpdesk has received over 50,000 contacts. These are people looking for advice on data protection issues,” he said.

Big Data is big news, of course. In the years running up to the implementation of the GDPR we were made aware of just how important information and communications technology, particularly communications, is in the real world.

When Barack Obama was elected president of the United States of America, the news media breathlessly reported that his was the first victory brought about by social media. Specifically, the claim was that by organising online, his campaign team was able to harness both the enthusiasm and the cash of ordinary voters. This pocket change, in aggregate, allowed Obama to surf into the citadel of power by rallying his base to create a wave of popular participation.

American politicians may lie in bed at night twisting and turning as they think of Russian Twitter bots invading polling booths, but Internet technology has had a much more insidious effect on politics: the micro-targeting of voters.

Usage and intent

Disinformation is nothing new, and lies have been facilitated by technology since before the days when vellum constituted the bleeding edge of information distribution, but what is new is the ability to micro-target information, dubious or not, to individuals and small groups who may be considered receptive to it.

In fact, the contribution of ‘crowdfunding’ was negligible; like those of all US presidents in recent decades, Obama’s campaign was bankrolled by business and by billionaires. Nonetheless, what is interesting is that when his successor was elected the story was the same, but the slant was rather different: Donald Trump, the same newspapers reported, leapt into the highest office in the world by using social media to propagate lies that rallied his base.

The idea that what we might call ‘para-political’ or ‘epi-economic’ information can tell us about how people intend to behave is not new. In fact, it is the basis for the central idea of Marxism: politics flows from the expression of economic interests. Somewhat more recently, consider that Margaret Thatcher’s 1979 UK general election victory was not predicted by any traditional opinion poll, but was predicted by one that, instead of asking people for whom they intended to vote, inquired as to whether or not they owned a home, aspired to own one, owned a car and so on. Still, the fact that we are all online, some of us all the time, means that this information can now be collected and processed at a granular level that was inconceivable in previous eras.

This, some have argued, is how the 2016 Brexit referendum was won. According to Sexton, the fallout from Brexit came as a shock to many voters.

This is an issue that can be summed up in just two words, whispered quietly so that Alexa, that spy in the heart of the home, can’t hear: Cambridge Analytica.

“Every week we see examples in the papers of privacy breaches, where someone is found to be using data in ways that were not explained,” said Sexton.

“Since Cambridge Analytica, which really was a watershed moment, people are thinking about it,” she said.

One upshot of this, and it is inarguably also an upside, is that the average person no longer thinks that on the Internet nobody knows that they are a dog. In fact, many internet users downright dislike the companies with whom they are doing business.

“Consumers are quite distrustful of companies, and I don’t think it’s an overstatement to say that there is a crisis of trust out there,” she said.

Trinity’s Dave Lewis makes a similar point.

“People are more aware of it, both in terms of legislation and in a more general sense. GDPR is in everyone’s faces now, for example, but also in terms of stories about data breaches and, of course, Cambridge Analytica,” he said.

Data collection power

Elise Bronsart, the journalist who broke the Cambridge Analytica story while reporting for the pan-European investigative TV news programme Vox Pop on Arte in France and Germany, says that the incursion of data into the political sphere risks reducing one of the most fundamental human activities to something as banal as a choice of consumer goods.

“I’d say data collection is just a tool, but a very powerful one. The more you know [about] someone [or] a group of people, the more efficiently you can influence them. Brands [and] advertising see data as a goldmine – they keep on offering free things in exchange [for] our data. But politics is not like selling pizza – or shouldn’t be.”

When Cambridge Analytica hit the headlines, it soon became apparent that Brexiteers were not its only client.

“Also, you can do amazing things with data […] but I think politics should stay away from data collection; it leads to clientelism: [where you] adjust your discourse to what you think people want to hear, in order to get elected. It leads to [hiding] the truth: for example, [at] Cambridge Analytica they hid the wall issue with Trump [from] Latinos living near the Mexican border.”

Nonetheless, Bronsart says that it is important to remember that people are not automatons.

“Luckily we don’t [make up our] minds just on Facebook, but still. So yes, I think all in all, it’s a danger to democracy when used for political purpose[s], [but] it’s more a question of principle; I don’t think Cambridge Analytica ‘made’ the success of Trump,” she said.

The fallout has had an impact well outside the political realm, though.

Timandra Harkness, a science and technology educator and broadcaster who is a regular on BBC Radio 4, says that privacy simply cannot be ignored anymore.

“It is more of an issue than ever,” she said. “Do we think about it all the time? No, but it does tend to be there in the background of our minds.”

Harkness, author of the book Big Data: Does Size Matter?, says that businesses are well aware of this.

“Data is not an independent commodity. Oil is oil; it doesn’t matter where it comes from. Data, though, is completely context dependent” – Timandra Harkness

“Business is the same. In fact, businesses are, as a rule, much more aware of it because business is in the business of collecting data about their customers and suppliers, so if they want to avoid falling behind they collect data. Even if it’s not tradable it’s useful for predicting future trends,” she said.

Harkness says the oil metaphor only goes so far and that for data to be useful it has to be understood in its own terms: “Data is not an independent commodity. Oil is oil; it doesn’t matter where it comes from. Data, though, is completely context dependent. It’s not transferrable.

“It’s the droppings of human activity and relationships; data is the measure of human relationships and relationships with your customers,” she said.

But can privacy be a business model in itself? It isn’t right now, she says, but it could be.

“I think there is a business model to be found in privacy. I foresee that people will pay for services: with Apple you pay more for the product, but get more privacy – though that is overstated; Apple does collect data, it just doesn’t share it as freely as some companies do.”

There is a risk, though: making privacy something only for those who can afford it.

“Is privacy a luxury good? Do we just go without it if we can’t afford it?” she said. “The other thing is, the technology companies still believe their own hype about data, and it would be a bold company that rejected the model.”
