That’s pretty unusual.
On Friday, Facebook published a long statement on the film The Social Dilemma in the form of a PDF on its website.
As long as I’ve followed the social media juggernaut, I haven’t seen it respond this way to any particular film or any particular claim.
It’s a big step in a new counter-offensive, and it makes me think the movie must be having an impact on people who use the service and wonder about the real value of Facebook.
I first wrote about The Social Dilemma a little over a week ago and made a simple request. I asked people to look at the app they use the most and delete it for a while. It’s a test to see how much you rely on, or are even addicted to, that app.
I removed Instagram from my phone, and I still feel the urge to browse the feed or post a photo. (On the other hand, I’m starting to have some success resisting it.)
Facebook’s response targets several claims from the film, but the one that stands out the most has to do with misinformation. The company claims it doesn’t make any money off fake news or conspiracy theories.
It says, “The idea that we allow misinformation to fester on our platform, or that we somehow benefit from this content, is wrong.”
However, the details in the document deserve a closer look.
How do we even define fake news?
I see posts on Facebook questioning the severity of the pandemic and whether it’s important to wear a mask. Some would argue that this is fake news and obviously a false claim, while others insist that wearing masks should be optional.
What really benefits Facebook is not so much outright lies and misinformation as opinions on highly controversial topics. If I post today that I’m not wearing a mask, that runs against federal guidelines, but it isn’t really fake news. It’s my opinion, and it probably wouldn’t be flagged. (I could even try to test that theory.) Facebook benefits from posts like these because we all see ads alongside them.
Of course, some users push these limits. Some of the posts I saw over the weekend wishing President Trump would die of coronavirus clearly cross a line. They sound a lot like a death wish to me. But what about the people who merely drop hints? What about milder jabs? Facebook has admitted that it doesn’t have perfect AI routines for spotting misinformation.
The company benefits from debates and opinions. Opinions only occasionally turn into actual misinformation or outright lies.
The official counter-argument also addresses social media addiction, but in a way that seems remarkably thin. Facebook has a team of experts to advise it, it changed the feed, it de-emphasized viral videos. “The change resulted in a 50 million hours per day decrease in time spent on Facebook. That’s not the kind of thing you do when you’re just trying to get people to use your services more.”
That reads as defensive. Tell us how much time people are spending on Facebook and what else you’re doing to curb the addiction, and we might be interested.
Overall, the counter-argument is a bit flat.
To adapt a Shakespeare quote:
“The social network doth protest too much, methinks.”