Mark Zuckerberg on Facebook’s future: the spirit is willing, but the algorithm is weak
Mark Zuckerberg published a 5800-word letter on the future of Facebook. It’s thoughtful, well-reasoned, articulate, and full of truth. It’s also full of wistful promises, which I don’t trust no matter who offers them.
Mark Zuckerberg’s letter, Building Global Community, starts like this:
On our journey to connect the world, we often discuss products we’re building and updates on our business. Today I want to focus on the most important question of all: are we building the world we all want?
There is no one tool or platform that connects the world into a global community, but Facebook is the closest thing we have. As politicians draw back and become more insular — Brexit, President Trump’s trade and immigration policies — Mark Zuckerberg’s Facebook is thinking globally. It’s a reach. But it’s a reach worth pursuing.
Zuckerberg’s letter addresses five questions:
- How do we help people build supportive communities that strengthen traditional institutions in a world where membership in these institutions is declining?
- How do we help people build a safe community that prevents harm, helps during crises and rebuilds afterwards in a world where anyone across the world can affect us?
- How do we help people build an informed community that exposes us to new ideas and builds common understanding in a world where every person has a voice?
- How do we help people build a civically-engaged community in a world where participation in voting sometimes includes less than half our population?
- How do we help people build an inclusive community that reflects our collective values and common humanity from local to global levels, spanning cultures, nations and regions in a world with few examples of global communities?
Today I’ll focus on Facebook’s quest to deliver an informed community, one that doesn’t reinforce our prejudices or immerse us in falsehood.
Can Facebook deliver truth?
Here are some of Zuckerberg’s ambitions, with my analysis:
The two most discussed concerns this past year were about diversity of viewpoints we see (filter bubbles) and accuracy of information (fake news). I worry about these and we have studied them extensively, but I also worry there are even more powerful effects we must mitigate around sensationalism and polarization leading to a loss of common understanding.
Analysis: Zuckerberg correctly cites filter bubbles and fake news as the big problems which may metastasize, as he suggests, into sensational, polarized conversation. He’s moved well beyond the statement minimizing the fake news problem that won the Bullshitty Award for most outrageous public statement by a corporate spokesperson. But Facebook moves carefully when shifting its algorithm, because such changes create not just public criticism, but shifts in member satisfaction and, potentially, revenue.
But our goal must be to help people see a more complete picture, not just alternate perspectives. We must be careful how we do this. Research shows that some of the most obvious ideas, like showing people an article from the opposite perspective, actually deepen polarization by framing other perspectives as foreign. A more effective approach is to show a range of perspectives, let people see where their views are on a spectrum and come to a conclusion on what they think is right. Over time, our community will identify which sources provide a complete range of perspectives so that content will naturally surface more.
Analysis: “Research shows?” I infer that this means Facebook has tried this on a subset of its members and doesn’t want to explain further. I’d like to know more about how “our community will identify which sources provide a complete range of perspectives so that content will naturally surface more.” This implies a vague and hopeful change in the algorithm at some indeterminate time in the future. That’s directionally correct, but no more dependable than a wistful promise to “be better.”
Accuracy of information is very important. We know there is misinformation and even outright hoax content on Facebook, and we take this very seriously. We’ve made progress fighting hoaxes the way we fight spam, but we have more work to do. We are proceeding carefully because there is not always a clear line between hoaxes, satire and opinion. In a free society, it’s important that people have the power to share their opinion, even if others think they’re wrong. Our approach will focus less on banning misinformation, and more on surfacing additional perspectives and information, including that fact checkers dispute an item’s accuracy.
Analysis: “Accuracy of information is very important” is a meaningless platitude. It is within Facebook’s power to identify fake news, satire, and other forms of misleading content by leveraging the knowledge of its users. The steps Facebook is currently taking are too timid. Again, the spirit is willing but the algorithm is weak.
Fortunately, there are clear steps we can take to correct these effects [of polarization and sensationalism]. For example, we noticed some people share stories based on sensational headlines without ever reading the story. In general, if you become less likely to share a story after reading it, that’s a good sign the headline was sensational. If you’re more likely to share a story after reading it, that’s often a sign of good in-depth content. We recently started reducing sensationalism in News Feed by taking this into account for pieces of content, and going forward signals like this will identify sensational publishers as well. There are many steps like this we have taken and will keep taking to reduce sensationalism and help build a more informed community.
Analysis: This is one of the few specific passages in this section of the letter — it describes a change in the algorithm that demotes sensational content, reducing how often it gets shared. It’s amazing that Facebook is actually willing to reduce sharing of sensational content, because that implies a commitment to long-term accuracy over the short-term benefits of more sharing. But the vague “many steps like this we have taken and will keep taking” basically says “trust us, we’re working in your best interest.” That’s not very reassuring.
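To make the signal Zuckerberg describes concrete, here is a rough sketch of how a “share after reading” score might be computed. Everything here — the data shape, the function, the thresholds — is my invention for illustration; Facebook has not published its actual implementation.

```python
# Hypothetical sketch of the "share after reading" signal Zuckerberg
# describes. Data structures and scoring are invented for illustration;
# Facebook's actual implementation is not public.

def sensationalism_signal(impressions):
    """Given interaction records for one story, compare how often
    people share it without opening it versus after reading it.
    Returns a score in [-1, 1]: positive means readers who opened
    the story were MORE likely to share it (a sign of depth);
    negative means shares came mostly from the headline alone
    (a sign of sensationalism)."""
    opened = [i for i in impressions if i["opened"]]
    unopened = [i for i in impressions if not i["opened"]]
    if not opened or not unopened:
        return 0.0  # not enough data to compare the two groups
    share_rate_after_reading = sum(i["shared"] for i in opened) / len(opened)
    share_rate_headline_only = sum(i["shared"] for i in unopened) / len(unopened)
    return share_rate_after_reading - share_rate_headline_only

# Example: a story shared mostly on the strength of its headline.
clickbait = (
    [{"opened": True, "shared": False}] * 90
    + [{"opened": True, "shared": True}] * 10
    + [{"opened": False, "shared": True}] * 40
    + [{"opened": False, "shared": False}] * 60
)
print(sensationalism_signal(clickbait))  # negative → likely sensational
```

A publisher-level version, as the letter hints, would aggregate this score across all of a publisher’s stories over time.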
Research suggests the best solutions for improving discourse may come from getting to know each other as whole people instead of just opinions — something Facebook may be uniquely suited to do. If we connect with people about what we have in common — sports teams, TV shows, interests — it is easier to have dialogue about what we disagree on. When we do this well, we give billions of people the ability to share new perspectives while mitigating the unwanted effects that come with any new medium.
Analysis: Again with the “Research suggests.” This is just Pollyanna optimism. If I connect with a fellow Red Sox fan who loves Donald Trump and posts fake news stories, that’s not going to help me understand the world better.
We’re at Facebook’s mercy. It’s not a good place to be.
Mark Zuckerberg appears to be a benevolent despot, but a despot nonetheless.
I’m pleased that he is taking the big questions about Facebook and the future of global community seriously. And we can all see his evolution from denying the problems of filter bubbles and fake news, to grappling with them.
Never forget, though, that Facebook serves two masters. It serves its members and attempts to bring them as much of what they need as possible. And it serves its shareholders, as a corporation, attempting to generate as much profit as possible. If a change improves our lives and reduces Facebook’s long-term profit potential, Facebook is not going to make that change.
We should all be suspicious of these passages full of vague promises for the future and phrases like “research suggests.” They boil down to “you can trust us to do the right thing, even if we can’t reveal to you what we are doing.” I don’t trust Facebook, and I want to know how the algorithm is changing.
I’m impatient for real progress here. Aren’t you?
My biggest problem with all this is that Facebook cannot be inclusive, supportive, and informing without dictating a moral code to define it all. If they don’t do that (and they won’t), it’s all pretty much meaningless. It reminds me of another innocuous phrase about becoming a kinder, gentler nation. There will always be a faction that mucks it all up by being self-serving jerks.
Zuckerberg: “Research shows that some of the most obvious ideas, like showing people an article from the opposite perspective, actually deepen polarization by framing other perspectives as foreign.”
Without Bullshit: “‘Research shows?’ I infer that this means Facebook has tried this on a subset of its members and doesn’t want to explain further.”
I believe Zuckerberg is referring to what is known as the Backfire Effect, which has been researched. A good summary with references to various studies is here:
The second reference to “research shows,” regarding a greater likelihood to “improve discourse may come from getting to know each other as whole people instead of just opinions” may refer to a study published in Science magazine, and later popularized on This American Life and Science Friday. That study showed that familiarity between a questioner and an interviewee leads to a more lasting change of opinion on controversial topics. However, that study was later debunked, as reported by The New Yorker.
I give Zuckerberg a pass on the research citations. It doesn’t change the thrust of your argument, though, that we are at the mercy of Facebook’s internal decisions.
If you are right, Zuckerberg should have cited his sources. “Research shows” lacks credibility if you can’t check the research because you don’t know the sources.
My problem with Facebook is that it’s full of boring subjects and boring people, and completely devoid of anything interesting; its existence is pointless. Account closed.