The “They Had Their Minds Made Up Anyway” Excuse

BGR graciously linked my post from the weekend, which showed that fake news stories trending on Facebook can dwarf traditional news in terms of shares. (Thanks, BGR!)

However, they end with this paragraph, which I’d like to reply to:

On a related note, it stands to reason that most individuals prone to believing a hyperbolic news story that skews to an extreme partisan position likely already have their minds made up. Arguably, Facebook in this instance isn’t so much influencing the voting patterns of Americans as it is bringing a prime manifestation of confirmation bias to the surface.

As regular readers know, my core expertise is not in Facebook data analysis but in how learning environments (and particularly online learning environments) affect the way users think, act, and learn. A long time ago I was an online political organizer, but my day job for many, many years has been the investigation and design of net-enabled learning experiences.

The BGR conclusion is a common one, and it intuitively meshes with our naive understanding of how the mind perceives truth. I see something I agree with and I reshare it — it doesn’t change my mind, because of course I already believed it when I reshared it. However, from a psychological perspective there are two major problems with this.

Saying Is Believing

The first problem is that saying is believing. This is an old and well-studied phenomenon, though perhaps understudied in social media. So when you see a post that says “Clintons suspected in murder-suicide” and you retweet or repost it, it’s not a neutral transaction. You, the reposter, don’t end that transaction unchanged. You may initially post it because, after all, “Whoa if true.” But the reposting shifts your orientation to the facts. You move from being a person reading information to someone arguing a side of an issue, and once you are on a side of the issue, no amount of facts or argument is going to budge you. This may have been built into the evolution of our reason itself. In this way, going through the process of stream curation is at heart a radicalizing process.

I’ve said many times that all social software trains you to be something. Early Facebook trained you to remember birthdays and share photos, and to some extent this trained you to be a better person, or in any case the sort of person you desired to be.

The process Facebook currently encourages, on the other hand — looking at short cards of news stories and deciding immediately whether to support or oppose them — trains people to be extremists. It takes a moment of ambivalence or nuance and, by design, pushes the reader deeper into support for whatever theory or argument they are staring at. When you consider that Facebook trains people this way for hours each day, that should scare the living daylights out of you.

It’s worth noting as well that on social media we’re more likely to share inflammatory posts than non-inflammatory ones, which means each Facebook session is a process by which we double down on the most radical beliefs in our feed.

In general, social media developers use design to foster behaviors that are useful to the community. But what is being fostered by this strange process that we put ourselves through each day?

Think about this from the perspective of a Martian come to Earth, watching people reach for their phones in the morning and scroll past shared headlines, deciding in seconds whether to re-share, comment on, or like each one. The question the Martian would ask is “What sort of training software are they using, and what does it train people to do?”

And the problem is that — unlike previous social sites — Facebook doesn’t know, because from Facebook’s perspective it has two goals, and neither is about the quality of the community or the well-being of its members. The first goal is to keep you creating Facebook content in the form of shares, likes, and comments. Any value you get out of it as a person is not a direct Facebook concern, except as it impacts those goals. And so Facebook is designed to make you share without reading and like without thinking, because that is how Facebook makes its money and builds lock-in: by having you create social content (and personal marketing data) it can use.

The second Facebook goal is to keep you on the site at all costs, since this is where they can serve you ads. And this leads to another problem we can talk about more fully in another post. Your average news story — something from the New York Times on a history of the Alt-Right, for example — won’t get clicked, because Facebook has built their environment to resist people clicking external links. Marketers figured this out and realized that to get you to click they had to up the ante. So they produced conspiracy sites that have carefully designed, fictional stories that are inflammatory enough that you *will* click.

In other words, the conspiracy clickbait sites appeared as a reaction to a Facebook interface that resisted external linking. And this is why fake news does better on Facebook than real news.

To be as clear as I possibly can — by setting up this dynamic, Facebook simultaneously set up the perfect conspiracy replication machine and incentivized the creation of a new breed of conspiracy clickbait sites.

There will be a lot of talk in the coming days about this or that change Facebook is looking at. But look at these two issues to get the real story:

  • Do they promote deep reading over interaction?
  • Do they encourage you to leave the site, even when the link is not inflammatory?

Next time you’re on Facebook you’ll notice there are three buttons at the bottom of each piece of content, outlining the actions you can take on that content. None of them says “read”.

Facebook Makes Conspiracies Familiar, and Familiarity Equals Truth

It would be terrifying enough if this existential problem — that Facebook is training people to be extremists and conspiracists through a process of re-sharing — were the worst of it. But it’s not. The bigger problem is the far larger number of people who see the headlines and do not reshare them.

Why is this a problem? Because for the most part, our brains equate “truth” with “things we’ve seen said a lot by people we trust”. The literature in this area is vast — it’s one of the reasons, for example, that so many people believe that global warming is a hoax.

If you think about it, it’s not a bad heuristic for some things. If someone tells you that there’s better farmland over the mountain, and then another person tells you that, and then another person, then your mind starts seeing this as more likely to be true than not, especially if the folks telling you this are folks you trust. If someone tells you that there’s gold mines under every house in the town, but they can’t be touched because of federal laws — well that’s not something you hear a lot, so that’s obviously false.

You say — no, that’s not it at all! The gold mines are ridiculous, that’s why I don’t believe in them! The good farmland is logical! I’m a logical being, dammit!

I’m sorry, you’re not, at least on most days. If enough people told you about the gold mines, and every paper in town made passing reference to the gold mines, you would consider people who didn’t believe in the hidden gold mines to be ridiculous. Want examples of this? Look at the entire sweep of history.

Or look at Lisa Fazio’s work on the illusory truth effect. The effect of familiarity is so strong that a multiple-choice question containing a statement you know is wrong — “The Atlantic Ocean is the largest ocean in the world, True or False?” — can actually increase your doubt that you are right, and even convince you of the false fact. Why? Because it counts as exposure to the idea that other people believe the lie. This is why Betteridge headlines are so dangerously unethical. If I publish a headline once a week asking “Is There Gold Buried Under Sunnydale?” above an article that says probably not, eventually, when someone else asks you that question, your mind will go — you know, I think I’ve heard something like that.

All it takes is repetition. That’s it.

Don’t believe me? Go check out how many people believe Obama is a Muslim. Or ask yourself a question like “Does Mike Pence support gay conversion therapy?” without looking up anything and then ask yourself how it is that you “know” that one way or the other. If you know it, you know it because it seems familiar. A lot of people you trust have said it. That’s it. That’s all you got.

If Betteridge headlines are borderline unethical, then Facebook is well over that line. The headlines that float by you on Facebook for one to two hours a day, dozens at a time, create a sense of familiarity with ideas that are not only wrong but hateful and dangerous. This is further compounded by the fact that what is highlighted on Facebook’s cards is not the source of the article, which is so small and gray as to be effectively invisible, but the friendly smiling face of someone you trust.

[Image: a Facebook share card in which the sharer’s name and photo are prominent while the article’s source appears in tiny gray type.]

The article that intimated Clinton may have murdered an FBI agent and his wife and burned them in their home to conceal the evidence was shared by over half a million people. If we assume a share rate of 20-to-1 that means that over 10 million people were exposed to the idea that this actually happened. If we assume that there is a lot of overlap in social networks, we can assume that they were exposed to it repeatedly, and it came from many people they trusted.
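The arithmetic above can be sketched as a quick back-of-the-envelope calculation. The roughly 500,000 shares come from the post itself; the 20-to-1 views-per-share ratio is the post’s stated assumption, not a measured figure:

```python
# Back-of-the-envelope exposure estimate for one viral fake-news story.
# Assumptions (stated in the post, not measured data):
#   shares          -- roughly 500,000 people shared the story
#   views_per_share -- each share is seen by ~20 people (a 20-to-1 ratio)

shares = 500_000
views_per_share = 20

exposures = shares * views_per_share
print(f"Estimated exposures: {exposures:,}")  # prints "Estimated exposures: 10,000,000"
```

With heavy overlap between friend networks, many of those estimated 10 million exposures would be repeat exposures coming from multiple trusted contacts — exactly the kind of repetition the familiarity effect feeds on.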

And that’s just one headline. People expose themselves to Facebook multiple times a day, every single day, seeing headlines making all sorts of crazy claims, and file them in their famil-o-meter for future reference. We’ve never seen anything on this scale. Not even close.

If Facebook was a tool for confirmation bias, that would kind of suck. It would. But that is not the claim. The claim is that Facebook is quite literally training us to be conspiracy theorists. And given the history of what happens when conspiracy theory and white supremacy mix, that should scare the hell out of you.

I’m petrified. Mark Zuckerberg should be too.

55 thoughts on “The “They Had Their Minds Made Up Anyway” Excuse”

  1. some good points here Mike;
    i am wondering if there is an element of a “moral panic” here in your description? maybe historians of publishing can point to similar cases? you state we have not seen something of this scale, arguably true considering reach of today’s tech, but we must have seen similar events if the illusory truth effect is robust?
    i think this is a related point – am just in the process of watching the documentary 13th up to the part of the severe fall out for black people of the screening of The Birth of a Nation, horrendous.
    ta
    mura

    • There might be an element of moral panic, but I think there are good reasons not to see it that way. I’ll write more at a later time.

  2. Do you think there’s any relationship between the reputable news sites going behind paywalls and the rise of fake news/facebook as news platform?

  34. I always check the source of an article on FB before I read it. Daily Kos, Occupy Democrats, Slate, and USA Today, for instance, I know will have a certain slant I need to be aware of, because many are purely political sites meant to spin facts or fiction to persuade for their cause. I also find that people share without reading the article because of its source. It is a minefield that FB users should be skeptical of, and one in which they should check sources outside of FB before spreading lies.

  40. Great insights Mike. I’m curious what you think of Twitter when you look at it from the same lens you’ve discussed above. In your opinion, is Twitter training us in a similar manner?
    Thanks.

  43. What you have said echoes a bit the post-structuralist idea that language does not reflect the material reality of the world but rather reshapes and distorts it, and that what we see is merely a representation of the world. Truth is not something owned or controlled but a structure which society invents.
