Security Eye

The connected, the bad and the ugly

Facebook believes information is power but claims to have little when it comes to educating users, says Billy MacInnes

15 December 2021

The IT industry likes to throw around a catchy phrase or two when it’s trying to entice people into buying its products and services. Take “information is power”, for example. The phrase is clearly not confined to the IT industry, but there’s no disputing that technology plays a growing role in the collection, provision and dissemination of information.

Then there’s the other old standby, “data is the new oil”, although you have to question why anybody would want to equate what they believe to be their most valuable asset with something seen as dirty, polluting and environmentally dangerous. You might think the increasingly negative perception of the old oil would deter people from calling anything “the new oil”, but apparently not.

I was reminded of these superficially attractive phrases while watching a recent Axios interview with Andrew Bosworth, Facebook’s vice president of augmented and virtual reality.

Bosworth, who is the incoming CTO of Meta (the new name for Facebook’s parent company), is the author of the infamous 2016 internal memo entitled The Ugly, which was leaked to BuzzFeed News in 2018.

The memo was unequivocal about Facebook’s practices and ethos: “The ugly truth is that we believe in connecting people so deeply that anything that allows us to connect more people more often is *de facto* good.”

It went on to state: “So we connect more people. That can be bad if they make it negative. Maybe it costs someone a life by exposing someone to bullies. Maybe someone dies in a terrorist attack coordinated on our tools.”

After the leak, Bosworth claimed he didn’t actually believe what he wrote in the memo at the time, and that its purpose “was to bring to the surface issues I felt deserved more discussion with the broader company”.

Perhaps he was doing something similar in the Axios interview when he was quizzed about the role Facebook’s platform has played in spreading Covid misinformation and vaccine hesitancy.

Broadly, Bosworth’s argument appears to be that misinformation is a societal problem. “People want that information,” he told Axios. “I don’t believe that the answer is ‘I will deny these people the information they seek and I will enforce my will upon them.’ At some point the onus is, and should be in any meaningful democracy, on the individual.”

Information is power, after all. But what is information? Bosworth claims that’s up to us to decide. “Individual humans are the ones who choose to believe or not believe a thing,” he argues. “They are the ones who choose to share or not share a thing. I don’t feel comfortable at all saying they don’t get to have a voice because I don’t agree with what they said. I don’t like what they said.”

Information not always informative

To me, this seems a bizarre interpretation of information. For example, what if schoolchildren decide not to believe their history or biology teachers because they prefer to believe something posted on Facebook that contradicts most of what they are being taught?

Bosworth goes on to state: “Our ability to know what is misinformation is itself in question and I think reasonably so. I’m very uncomfortable with the idea that we possess enough fundamental rightness even in our most scientific centres of study to exercise that kind of power on a citizen, another human, and what they want to say and who they want to listen to.”

Two points to make about that. Firstly, our ability to know what is misinformation is greatly diminished by the sheer volume of misinformation being propagated and disseminated via social media platforms, which threatens to overwhelm genuine information. In fact, drowning out genuine information often seems to be the sole purpose of pumping so much misinformation out over those platforms.

Secondly, equating the validity of “our most scientific centres of study” as a source of expert information with an article your workmate’s cousin posted on Facebook from a site claiming to have inside knowledge of the “shamdemic” is pretty much misinformation in and of itself. 

He ends: “Instead, we have ‘what do people want to hear?’ which is really the best way to approximate the algorithm.” I’m not sure anyone can seriously argue that telling people what they want to hear is better than telling them what the “most scientific centres of study” say. For example, if you told someone with cancer that they didn’t have cancer because that’s what they wanted to hear, the inconvenient truth is that they’d still have cancer and they would probably end up dying of it because they wouldn’t know to get it treated.

Which brings us back to the infamous memo that Bosworth didn’t believe in as he wrote it. Remember those words: “So we connect more people. That can be bad if they make it negative. Maybe it costs someone a life by exposing someone to bullies. Maybe someone dies in a terrorist attack coordinated on our tools.” Perish the thought, but maybe some people die in a pandemic because they are exposed to a campaign of misinformation about the virus and vaccinations “coordinated on our tools”?

And maybe all those people suggesting that ‘data is the new oil’ could ponder the awful similarities between dying seabirds covered in a sticky, black, viscous pollutant and people drowning in a flood of misleading ‘data’, struggling to breathe in an ICU.
