Facebook Broke Democracy, but the Fix Is Harder Than People Realize.

You know, I’ve always complained about the use of “broke” when applied to things like democracy. How simple, right? But over the past few days I have not been in my normal nuanced mood. I’ve said, in fifteen different ways over the past year, that our stream-based model of social media was making us dumber. But maybe I’ve said it too subtly?

In any case, this is a note that dozens of thinkpieces on how Facebook broke democracy have come out over the past few days. And they’re good, and it’s refreshing to see people finally looking at the systemic bias of Facebook toward conspiracy sites and inflammatory political commentary, and calling for Facebook to admit that they are, after all, a media company, and that it is time they started taking responsibility for little things like ending life on this planet as we know it. In retrospect the world would have been a lot better off if Zuckerberg had stuck to his original idea of a “Hot or Not” knock-off for campus coeds. It really would have been.

But no, Facebook instead decided to move into news distribution with the same algorithms and structure it used to share the “Charlie bit my finger” video with a billion people. Google provided the ad revenue model for conspiracy clickbait sites, and now your hairdresser cousin is 99% sure that Hillary Clinton may have personally murdered up to five FBI agents. They saw it on Facebook, after all.

I’m working on a book that will combine a history of the web with some cognitive science and UX criticism to explain how all this came to be, but for now I want to flag one thing I’ve noticed in recent treatments of the issue: people are over-obsessed with the news feed algorithm.

The algorithm matters, because the algorithm amplifies a lot of bad things. But my sense, looking at the structure of Facebook, is that the computational portion of the algorithm is a bit player here. Rather, what matters is how the whole system functions together.

Let me just give you one example. Here’s a card from Facebook:

[Image: a Facebook news feed card]

Now forget about the algorithm that brought this here and focus instead on the card. Every design decision on this card is optimized to keep you on Facebook. So, for example, my name is bold and blue and prominent. The headline is also prominent, and Facebook pulls the description from the page so that the Facebook reader can read a summary of an article without going to the article.
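As an aside on mechanics: the headline and summary on a card like this are typically pulled from the shared page’s Open Graph meta tags (og:title, og:description) rather than from anything the sharer writes. Here is a minimal sketch of that extraction in Python, using illustrative markup rather than a real article:

```python
from html.parser import HTMLParser

class OpenGraphParser(HTMLParser):
    """Collect og:* <meta> tags, the kind a link-preview card is built from."""
    def __init__(self):
        super().__init__()
        self.og = {}

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        a = dict(attrs)
        prop = a.get("property", "")
        if prop.startswith("og:") and "content" in a:
            self.og[prop] = a["content"]

# Illustrative page markup, not a real article
html = """
<html><head>
<meta property="og:title" content="Example Headline" />
<meta property="og:description" content="A one-line summary shown on the card." />
<meta property="og:site_name" content="example.com" />
</head><body></body></html>
"""

parser = OpenGraphParser()
parser.feed(html)
print(parser.og["og:title"])        # -> Example Headline
print(parser.og["og:description"])  # -> A one-line summary shown on the card.
```

The publisher decides what goes in those tags, which is part of why a card can look authoritative regardless of what sits behind the link.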

The question of where this is from? The source of the news? It’s there in the least-looked-at part of the card, in a gray so thin and light that I don’t know how you could get it any lighter without having an ADA case on your hands.

[Image: close-up of the card’s source line]

Nothing on the card encourages you to click it or go elsewhere. Your options are Facebook-centric options. You can like it. You can comment. You can share it! Facebook has deliberately not called attention to how you click through to read it, because Facebook’s goal is that you read this headline and this summary and then either move on or spend time creating Facebook content — likes, comments, emojis, or shares. They have engineered a card, using the smartest data scientists in the world, that encourages you to read a headline and a description and never, ever click through to check the source or see the full story.

In the 1960s these folks would have worked for NASA. In the 1970s, maybe the NIH. Today they work for Facebook making sure you never leave the site to actually read the things that you share and react to. So instead of getting to the moon, we get to wherever the hell it is that we are now.

Because they succeed. Data science combined with design works. People on Facebook share material they don’t read all the time. And that’s the point. That’s how you produce revenue off of third-party content without undermining your own dominant position.

And honestly, that’s just step one. If I had time to go into it now, I’d explain how the whole sharing pattern — the stream-based model rather than what I call the garden-based model — prevents an iterative process of knowledge construction, instead reducing all knowledge to a stream of short headlines and summaries, from which your mind must form an intuitive sense. But I digress into nuance…

The point is there will be a lot of talk about algorithms over the next few weeks, and that’s good. But it is not that Facebook is an enlightenment engine loaded with a wrong scrap of code. Rather, Facebook’s entire model is designed to produce stream-based reading behaviors, which produce money for Facebook and New World Order conspiracies for your cousin. Sometimes in equal amounts. Happy days!

I’ll add one last thing here, as a standard reminder of why I am talking about Facebook on this education blog. I talk about it because as educational technologists and instructional designers the great challenge of our age is to graduate students who can either thrive in existing information environments or design better ones. We give our students four years’ practice doing library research and yet do not educate them about the environment in which they will gain much of their civic and personal knowledge. We must critique these environments at a level deeper than “Facebook is a corporation and therefore bad.” We must explain them at a level deeper than “Watch me crowdsource an answer here.” We need a comprehensive approach here, or our fact-tethered existence is going to continue to float away, unmoored from data, facts, or comprehension of consequences.


13 thoughts on “Facebook Broke Democracy, but the Fix Is Harder Than People Realize.”

  1. You could do a whole chapter on how comedy sites like the Onion have hacked this to create almost believable stories that people incorrectly click-and-share all the time. It’s confirmation bias as comedy. Of course, there are examples of state-run news agencies doing the same – scare-mongering-as-comedy.

    Remember when textbooks were walls of text? The solution was to … add pictures, add sidebars, break into smaller pieces (chapters have sections, etc.). This pre-dated social media but was the first evidence of applying the idea that shorter attention spans guide our educational content design. Learning can be fun!

  2. Pertinent as always. Of course we educators have to talk about FB with its 1.4bn active monthly users (Dec 2015).

    Taking a long cold look at FB and Twitter has to be at the core of any up-to-date media studies curriculum. Unfortunately, media studies is seen as a lightweight subject. It can be taught that way; however, it can also throw a more powerful light on how society really works than pretty much any other subject I can think of.

  3. Thanks so much for this Mike. I have been puzzling over this for some time – even have a paper to bore you with. I have been greatly influenced by a book from my ex-colleague http://benlight.me/disconnecting-with-social-networking-sites/ that looks at Disconnective Practice.
    I am currently writing an abstract for #OER17 that looks at critical literacy in (feminist) materialist perspective – happy to share private doc. if you are interested.
    Loving what you say about not being allowed to leave the site.
    Tonight I found that the web site of a campaign I had been trying to support was a Facebook page https://www.facebook.com/stopfundinghate/about/ #WTF is that about – and I quite like the schmaltzy John Lewis ad https://youtu.be/sr6lr_VRsEo
    As they say on Facebook – it’s complicated!

  4. > In the 1960s these folks would have worked for NASA. In the 1970s, maybe the NIH. Today they work for Facebook making sure you never leave the site to actually read the things that you share and react to.

    Mike, I really respect your thought and writings. What you describe is not the goal. It’s a result of suboptimal proxies for the value generated. Check out Tristan Harris and Joe Edelman (timewellspent.io) if you haven’t already. It’ll take time to align behind better value proxies, but it’ll happen.

    Assume we align behind perfect (probabilistic) proxies for value. Is the deconstruction of locality then purely good? Or does it still put other properties at risk? What are those risks?


  6. Walled Gardens v Internet. Bubbles v Literacy. Cards, embeds, and other approaches can reflect the web, but greed and competition incentivize this type of walled-garden bastardization of a good idea. As I’m building on your work, I’m wondering if there are browser plugins we can develop to pull back the curtain on these, and inject some literacy.

  7. Pingback: The “They Had Their Minds Made Up Anyway” Excuse | Hapgood

  8. Pingback: The flaws to Facebook as a major news delivery platform – ENG 193 and DMS 193 Class Blog

  9. Pingback: Facebook extremism and fake news: How Facebook is training us to be conspiracy theorists — Quartz

  10. Pingback: The Bottom Line: Fake News and Facebook – The Watchtower

  11. Pingback: Truthy Lies and Surreal Truths: A Plea - Hybrid Pedagogy
