Big Data not without class

29 April 2014

I am indebted to Slashdot for drawing my attention to a speech made by President Obama’s senior counsellor John Podesta concerning the possible inadvertent discriminatory effects of Big Data.

Amid all the hype around the valuable insights and information Big Data will deliver for governments, public sector organisations and private corporations, it was interesting to hear of concerns about how Big Data technology could reinforce discrimination and inequities in those areas where it is deemed to have the most potential impact.

In his remarks, delivered to the UC Berkeley School of Information on 1 April, Podesta pointed to the example of an app called Street Bump, released by the city of Boston, which used smartphone sensors to detect potholes and report them to the department of public works.

“But what happened after Street Bump was first rolled out in Boston should give us pause,” he warned the audience. “Because poor people and the elderly were less likely to carry smartphones, let alone download the app, the app wound up systematically directing city services to wealthier neighbourhoods.”

This raises an interesting point about Big Data, not so much in terms of the technology itself as in terms of how the data is collected. There has been so much talk about using smartphones, wearable gadgets and the like to gather data that can be turned into valuable insights with the aid of Big Data techniques that we have tended to forget that people need to own those devices to generate the data in the first place.
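The Street Bump skew is a textbook case of selection bias: the sample of reports reflects who owns a reporting device, not where the potholes are. A minimal sketch, using entirely hypothetical ownership and reporting rates (none of these figures come from Boston's data), shows how equal numbers of potholes can produce very unequal report counts, and how dividing by an estimated reporting rate, roughly what "tweaking the app to account for under-reporting" amounts to, pulls the estimates back together:

```python
import random

random.seed(42)

# Hypothetical scenario: two neighbourhoods with the SAME number of
# potholes, but different rates of smartphone ownership among the
# residents who might report them. All rates are illustrative assumptions.
POTHOLES = 100                                 # true potholes in each area
REPORT_PROB = 0.8                              # chance an owner reports one
ownership = {"wealthy": 0.9, "poorer": 0.3}    # assumed ownership rates

reports = {}
for area, own_rate in ownership.items():
    # A pothole is reported only if a passer-by owns a phone AND bothers
    # to report it, so the raw counts track ownership, not potholes.
    reports[area] = sum(
        1 for _ in range(POTHOLES)
        if random.random() < own_rate * REPORT_PROB
    )

print("raw reports:", reports)

# One crude correction: divide raw counts by the estimated reporting
# rate per area, re-weighting under-represented neighbourhoods upward.
corrected = {
    area: reports[area] / (ownership[area] * REPORT_PROB)
    for area in reports
}
print("corrected estimates:", corrected)
```

The raw counts would send crews disproportionately to the wealthy area; the corrected estimates land near the true figure of 100 for both. The catch, which is Podesta's point, is that the correction only works if the organisation knows the ownership rates and thinks to apply it.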

Podesta reported that the city of Boston realised the discriminatory nature of the app and tweaked it to account for under-reporting “so that everyone would have equal access to city services”. Hmmm. “The lesson here is that we need to pay careful attention to what unexpected outcomes the use of Big Data might lead to, and how to remedy any unintended discrimination or inequality that may result,” he warned.

Public service
Hmmm again. While the example Podesta gave highlighted the potential pitfalls of Big Data, it's important to note it was not a commercial application. The city authority deploying the app tweaked it to try to take account of possible discriminatory outcomes, but would a commercial organisation feel the need to do likewise? Would it even have the understanding to realise its results were being skewed by the proportion of particular age groups using its app, and by the fact that people who couldn't afford a smartphone weren't using it at all?

Podesta acknowledged the need for a “serious conversation” about civil rights and discrimination in the Big Data context. Big Data analysis of information shared voluntarily on social networks showed “how easy it can be to infer information about race, ethnicity, religion, gender, age, and sexual orientation, among other personal details”, he stated. “But it’s easy to imagine how Big Data technology, if used to cross legal lines we have been careful to set, could end up reinforcing existing inequities in housing, credit, employment, health and education.”

There’s no doubt this is an issue that has tended to be neglected. To date, most of the concern around Big Data technology has focused instead on privacy and confidentiality. No one disputes that these are areas where we need protections and standards to reassure people about how their information is being used. But it’s worth pointing out that these concerns arise for people generating information in the first instance, not for those who are digitally disenfranchised through age or poverty (or both).

And if the data doesn’t produce outcomes that reflect the experience of everybody, young and old, rich and poor, it isn’t really that big after all.

TechCentral.ie