
Facebook takes more ethical approach to user research

Facebook's news feed design for 2014 doesn't bring in wholesale changes. Image: Facebook

3 October 2014

Back in July, Facebook revealed that it had experimented with users' News Feeds, showing some people more positive posts and others more negative ones. The resulting outrage prompted the social network to change its research policy, but its self-imposed restrictions may not go far enough to appease critics.

The network said Thursday that it researches all kinds of things and isn't going to stop, but it plans to be more responsible. As part of that effort, Facebook is making a series of changes, such as enhanced reviews of research projects that study emotions or affect specific groups of people. That means an experiment like the week-long study on positive and negative Facebook posts would have gone through extra scrutiny, though the network didn't specify which kinds of research would trigger that extra review.

Facebook has also created an internal review panel, with members representing different departments within the company, like engineering, privacy, and legal. New employees will learn about ethical research in the company’s six-week training programme.

Room for improvement
It's good that Facebook is reevaluating its processes, as it recently did with its real-name policy, but the new framework is vague. People weren't all that shocked that Facebook studies their data for research purposes – they were angry that the network manipulated their experience to achieve a specific result.

In a Thursday blog post, Facebook chief technology officer Mike Schroepfer said the backlash forced the network to take a hard look at its policy.

“We were unprepared for the reaction the paper received when it was published and have taken to heart the comments and criticism,” he wrote. “It is clear now that there are things we should have done differently. For example, we should have considered other non-experimental ways to do this research. The research would also have benefited from more extensive review by a wider and more senior group of people. Last, in releasing the study, we failed to communicate clearly why and how we did it.”

But nowhere in its new framework did Facebook mention allowing people to opt in to these experiments, which is essential when you’re manipulating a user’s experience. The network should make participation in research optional – a privacy setting you can change like any other.

These first steps toward transparency are a good start, but Facebook needs to be completely open about what it’s studying and why to reassure people that it’s not just toying with their emotions.

PCWorld
