You know, I’ve always complained about the use of “broke” when applied to things like democracy. How simple, right? But over the past few days I’ve not been in my normal nuanced mood. I’ve said, in fifteen different ways over the past year, that our stream-based model of social media was making us dumber. But maybe I’ve said it too subtly?
In any case, this is a note that dozens of thinkpieces on how Facebook broke democracy have come out over the past few days. And they’re good, and it’s refreshing to see people finally looking at the systemic bias of Facebook towards conspiracy sites and inflammatory political comment, and calling for Facebook to admit that they are, after all, a media company, and that it is time they started taking responsibility for little things like ending life on this planet as we know it. In retrospect the world would have been a lot better off if Zuckerberg had stuck to his original idea of a “Hot or Not” knock-off for campus coeds. It really would have been.
But no, Facebook instead decided to move into news distribution with the same algorithms and structure it used to share the “Charlie Bit My Finger” video with a billion people. Google provided the ad revenue model for conspiracy clickbait sites, and now your hairdresser cousin is 99% sure that Hillary Clinton may have personally murdered up to five FBI agents. They saw it on Facebook, after all.
I’m working on a book that will combine a history of the web with some cognitive science and UX criticism to explain how all this came to be, but I want to flag one thing I’ve noticed in recent treatments of the issue. People are over-obsessed with the news feed algorithm.
The algorithm matters, because the algorithm amplifies a lot of bad things. But my sense, looking at the structure of Facebook, is that the computer portion of the algorithm is a bit player here. Rather, it’s how the whole system functions together.
Let me just give you one example. Here’s a card from Facebook:

[Screenshot of a Facebook link card: bold blue poster name, prominent headline, article summary, and like/comment/share buttons]
Now forget about the algorithm that brought this here and focus instead on the card. Every design decision on this card is optimized to keep you on Facebook. So, for example, my name is bold and blue and prominent. The headline is also prominent, and Facebook pulls the description from the page so that the Facebook reader can read a summary of an article without going to the article.
The question of where this is from? The source of the news? It’s there in the least-looked-at part of the card, in a gray so thin and light that I don’t know that you could get it any lighter without having an ADA case on your hands.
Nothing on the card encourages you to click it or go elsewhere. Your options are Facebook-centric options. You can like it. You can comment. You can share it! Facebook has deliberately not called attention to how you click this to read it, because Facebook’s goal is that you read this headline and this summary and then either move on or spend time creating Facebook content — likes, comments, emojis, or shares. They have engineered a card, using the smartest data scientists in the world, that encourages you to read a headline and a description and never-ever click through to check the source or see the full story.
In the 1960s these folks would have worked for NASA. In the 1970s, maybe the NIH. Today they work for Facebook making sure you never leave the site to actually read the things that you share and react to. So instead of getting to the moon, we get to wherever the hell it is that we are now.
Because they succeed. Data science combined with design works. People on Facebook share material they don’t read all the time. And that’s the point. That’s how you produce revenue off of third party content without undermining your own dominant position.
And honestly, that’s just step one. If I had time to go into it now, I’d explain how the whole sharing pattern — the stream-based model rather than what I call the garden-based model — prevents an iterative process of knowledge construction, instead reducing all knowledge to a stream of short headlines and summaries, of which your mind must form an intuitive sense. But I digress into nuance…
The point is there will be a lot of talk about algorithms over the next few weeks, and that’s good. But it is not that Facebook is an enlightenment engine loaded with a wrong scrap of code. Rather, Facebook’s entire model ends up being designed to produce stream-based reading behaviors, which produce money for Facebook and New World Order conspiracies for your cousin. Sometimes in equal amounts. Happy days!
I’ll add one last thing here — maybe as a standard reminder of why I am talking about Facebook on this education blog. I talk about it because as educational technologists and instructional designers the great challenge of our age is to graduate students who can either thrive in existing information environments or design better ones. We give our students four years of practice doing library research and yet do not educate them about the environment in which they will gain much of their civic and personal knowledge. We must critique these environments at a level deeper than “Facebook is a corporation and therefore bad.” We must explain them at a level deeper than “Watch me crowdsource an answer here.” We need a comprehensive approach here, or our fact-tethered existence is going to continue to float away, unmoored from data, facts, or comprehension of consequences.