Facebook Broke Democracy, but the Fix Is Harder Than People Realize

You know, I’ve always complained about the use of “broke” when applied to things like democracy. How simple, right? But over the past few days I haven’t been in my usual nuanced mood. I’ve said, in fifteen different ways over the past year, that our stream-based model of social media was making us dumber. But maybe I’ve said it too subtly.

In any case, this is a note that dozens of thinkpieces on how Facebook broke democracy have come out over the past few days. And they’re good, and it’s refreshing to see people finally looking at Facebook’s systemic bias toward conspiracy sites and inflammatory political commentary, and calling for Facebook to admit that they are, after all, a media company, and that it is time they started taking responsibility for little things like ending life on this planet as we know it. In retrospect the world would have been a lot better off if Zuckerberg had stuck to his original idea of a “Hot or Not” knock-off for campus coeds. It really would have been.

But no, Facebook instead decided to move into news distribution with the same algorithms and structure it used to share the “Charlie Bit My Finger” video with a billion people. Google provided the ad revenue model for conspiracy clickbait sites, and now your hairdresser cousin is 99% sure that Hillary Clinton may have personally murdered up to five FBI agents. They saw it on Facebook, after all.

I’m working on a book that will combine a history of the web with some cognitive science and UX criticism to explain how all this came to be, but I want to flag one thing I’ve noticed in recent treatments of the issue: people are over-obsessed with the news feed algorithm.

The algorithm matters, because the algorithm amplifies a lot of bad things. But my sense, looking at the structure of Facebook, is that the computer portion of the algorithm is a bit player here. Rather, it’s how the whole system functions together.

Let me just give you one example. Here’s a card from Facebook:


Now forget about the algorithm that brought this here and focus instead on the card. Every design decision on this card is optimized to keep you on Facebook. So, for example, my name is bold and blue and prominent. The headline is also prominent, and Facebook pulls the description from the page so that the Facebook reader can read a summary of an article without going to the article.

The question of where this is from? The source of the news? It’s there in the least-looked-at part of the card, in a gray so thin and light that I don’t know that you could get it any lighter without having an ADA case on your hands.


Nothing on the card encourages you to click it or go elsewhere. Your options are Facebook-centric options. You can like it. You can comment. You can share it! Facebook has deliberately not called attention to how you click through to read the piece, because Facebook’s goal is that you read this headline and this summary and then either move on or spend time creating Facebook content: likes, comments, emojis, or shares. They have engineered a card, using the smartest data scientists in the world, that encourages you to read a headline and a description and never, ever click through to check the source or see the full story.

In the 1960s these folks would have worked for NASA. In the 1970s, maybe the NIH. Today they work for Facebook making sure you never leave the site to actually read the things that you share and react to. So instead of getting to the moon, we get to wherever the hell it is that we are now.

Because they succeed. Data science combined with design works. People on Facebook share material they don’t read all the time. And that’s the point. That’s how you produce revenue off of third party content without undermining your own dominant position.

And honestly, that’s just step one. If I had time to go into it now, I’d explain how the whole sharing pattern (the stream-based model rather than what I call the garden-based model) prevents an iterative process of knowledge construction, instead reducing all knowledge to a stream of short headlines and summaries from which your mind must form an intuitive sense. But I digress into nuance…

The point is there will be a lot of talk about algorithms over the next few weeks, and that’s good. But the problem is not that Facebook is an enlightenment engine loaded with one wrong scrap of code. Rather, Facebook’s entire model is designed to produce stream-based reading behaviors, which produce money for Facebook and New World Order conspiracies for your cousin. Sometimes in equal amounts. Happy days!

I’ll add one last thing here, maybe as a standard reminder of why I am talking about Facebook on this education blog. I talk about it because, as educational technologists and instructional designers, the great challenge of our age is to graduate students who can either thrive in existing information environments or design better ones. We give our students four years’ practice doing library research and yet do not educate them about the environment in which they will gain much of their civic and personal knowledge. We must critique these environments at a level deeper than “Facebook is a corporation and therefore bad.” We must explain them at a level deeper than “Watch me crowdsource an answer here.” We need a comprehensive approach here, or our fact-tethered existence is going to continue to float away, unmoored from data, facts, or comprehension of consequences.

Notes on How Social Media Broke Our Democracy

I could not sleep at all last night. So I organized the notes I’ve been taking over the last year on the problem of doing politics in advertising-funded, stream-based systems.

I know this election was about so much more than that (so much more), and our problems are so much deeper. But I remain convinced that even if social media is not the fire or the fuel of Breitbartian racism, it is in fact the oxygen that helps it thrive and spread.

There are 537 pages of notes in this PDF, and it may not be immediately clear what each has to do with the book, but in my head at least they all relate. They are worth a read.


I’m Writing a Book On the Disinfotopia of Current Social Media

I’ve decided to write a book before the year is out. I’ve decided this because I think the most pressing current issue for Open Pedagogy practitioners in the U.S. right now is how we address a social media environment that seems to be bringing out many of our worst demons, and I think several years of research on this issue have given me some insights here. In fact, I would go so far as to say that the events of the past year have only served to validate the presentations I have given on The Stream and its problems over the last couple of years. Add to that the more than 532 pages of notes and pull quotes I have on the subject, burning a hole in my pocket.

I simply can’t imagine a more important task for us as instructional designers, teachers, and technologists than dealing with this issue, and I think the best way to do that is a deep dive.

I plan to start tonight, right after I cook the kids dinner. I hope to finish it over holiday break. I hope that if I post this intention here and people respond well to the idea, it will push me to pump out pages rather than endlessly edit or delay. (My curse is that I *love* research, and so the research phase never ends.)

Here’s a description:

Lost in the Stream will be the first book to tell the story of how social media destroyed American political culture as seen through the 2016 primary and election.

It lays blame for that destruction on the omnipresence of the “Stream” in our lives: the never-ending pull-to-refresh parade of news and outrage we have come to take for granted in our social media environments. It argues that the Stream, as exemplified by the personal feeds of Facebook, Twitter, and other sites, is creating a post-truth society, and that the 2016 election — the first “Twitter election” — is a harbinger of worse things to come if we do not address major flaws in the structure of our current social media environments.

There are a number of books that have tackled similar issues, pointing to insidious effects of the commercialization of the web, erosion of face-to-face culture, or the impact of the web on memory and concentration. This book acknowledges these impacts, but looks very specifically at how the “event-ification” of the web contributes to our current malaise. It presents the world of the web as it has become for many in the U.S. — polarizing, conspiracy-obsessed, brutally and unapologetically mob-like — and shows how The Stream is at the root of these problems. It suggests that older, pre-stream models of the web, from pioneers such as Vannevar Bush, Doug Engelbart, and Alan Kay, might hold the key to returning the web to its place as an augment of human intellect rather than a detriment to it.

Chapter List

Prologue: From Techno-utopianism to the Current Dystopia

Section One / Welcome to Disinfotopia

Chapter One: Islands in the Stream (That Is What We Are)
Chapter Two: The Facebook Conspiracy Factory
Chapter Three: Twitter’s Mob Justice
Chapter Four: Who Moved My Cognitive Surplus?
Chapter Five: Weaponized Transparency and the End of Organization(s)

Section Two / Old Models and New Hope

Chapter Six: As We May Think Again
Chapter Seven: Talking with Models
Chapter Eight: Reclaiming Calm
Chapter Nine: Can Education Save the Web?

Conclusion: The Web We Need