Hapgood

Mike Caulfield's latest web incarnation. Networked Learning, Open Education, and Online Digital Literacy


The “They Had Their Minds Made Up Anyway” Excuse

BGR graciously linked my post from the weekend, a post showing that fake news stories trending on Facebook can dwarf traditional news in terms of shares. (Thanks, BGR!)

However, they end with this paragraph, which I’d like to reply to:

On a related note, it stands to reason that most individuals prone to believing a hyperbolic news story that skews to an extreme partisan position likely already have their minds made up. Arguably, Facebook in this instance isn’t so much influencing the voting patterns of Americans as it is bringing a prime manifestation of confirmation bias to the surface.

As regular readers know, my core expertise is not in data analysis of Facebook, but in how learning environments (and particularly online learning environments) affect the way users think, act, and learn. A long time ago I was an online political organizer, but my day job for many, many years has been the investigation and design of net-enabled learning experiences.

The BGR conclusion is a common one, and it intuitively meshes with our naive understanding of how the mind perceives truth. I see something I agree with and I reshare it — it doesn’t change my mind, because of course I already believed it when I reshared it. However, from a psychological perspective there are two major problems with this.

Saying Is Believing

The first problem is that saying is believing. This is an old and well-studied phenomenon, though perhaps understudied in social media. So when you see a post that says “Clintons suspected in murder-suicide” and you retweet or repost it, it’s not a neutral transaction. You, the reposter, don’t end that transaction unchanged. You may initially post it because, after all, “Whoa if true.” But the reposting shifts your orientation to the facts. You move from being a person reading information to someone arguing a side of an issue, and once you are on a side of the issue, no amount of facts or argument is going to budge you. This may have been built into the evolution of our reason itself. In this way, going through the process of stream curation is at heart a radicalizing process.

I’ve said many times that all social software trains you to be something. Early Facebook trained you to remember birthdays and share photos, and to some extent this trained you to be a better person, or in any case the sort of person you desired to be.

The process that Facebook currently encourages, on the other hand, of showing you short cards of news stories and forcing you to immediately decide whether or not to support them, trains people to be extremists. It takes a moment of ambivalence or nuance, and by design pushes the reader to go deeper into their support for whatever theory or argument they are staring at. When you consider that people are being trained in this way by Facebook for hours each day, that should scare the living daylights out of you.

It’s worthwhile to note as well that the nature of social media is that we’re more likely to share inflammatory posts than non-inflammatory ones, which means that each Facebook session is a process by which we double down on the most radical beliefs in our feed.

In general, social media developers use design to foster behaviors that are useful to the community. But what is being fostered by this strange process that we put ourselves through each day?

Think about this from the perspective of a Martian come to Earth, watching people reach for their phones in the morning and scroll past shared headlines, deciding in seconds for each one whether to re-share it, comment on it, or like it. The question the Martian would ask is “What sort of training software are they using, and what does it train people to do?”

And the problem is that — unlike previous social sites — Facebook doesn’t know, because from Facebook’s perspective they have two goals, and neither is about the quality of the community or the well-being of its members. The first goal is to keep you creating Facebook content in the form of shares, likes, and comments. Any value you get out of it as a person is not a direct Facebook concern, except as it impacts those goals. And so Facebook is designed to make you share without reading, and like without thinking, because that is how Facebook makes its money and builds lock-in: by having you create social content (and personal marketing data) it can use.

The second Facebook goal is to keep you on the site at all costs, since this is where they can serve you ads. And this leads to another problem we can talk about more fully in another post. Your average news story — something from the New York Times on a history of the Alt-Right, for example — won’t get clicked, because Facebook has built their environment to resist people clicking external links. Marketers figured this out and realized that to get you to click they had to up the ante. So they produced conspiracy sites that have carefully designed, fictional stories that are inflammatory enough that you *will* click.

In other words, the conspiracy clickbait sites appeared as a reaction to a Facebook interface that resisted external linking. And this is why fake news does better on Facebook than real news.

To be as clear as I possibly can — by setting up this dynamic, Facebook simultaneously set up the perfect conspiracy replication machine and incentivized the creation of a new breed of conspiracy clickbait sites.

There will be a lot of talk in the coming days about this or that change Facebook is looking at. But look at these two issues to get the real story:

  • Do they promote deep reading over interaction?
  • Do they encourage you to leave the site, even when the link is not inflammatory?

Next time you’re on Facebook you’ll notice there are three buttons at the bottom of each piece of content, outlining the actions you can take on that content. None of them says “read”.

Facebook Makes Conspiracies Familiar, and Familiarity Equals Truth

It would be terrifying enough if this existential problem — that Facebook was training people to be extremists and conspiracists through a process of re-sharing — was the worst bit. But it’s not. The larger problem is the far larger number of people who see the headlines and do not reshare them.

Why is this a problem? Because for the most part, our brains equate “truth” with “things we’ve seen said a lot by people we trust”. The literature in this area is vast — it’s one of the reasons, for example, that so many people believe that global warming is a hoax.

If you think about it, it’s not a bad heuristic for some things. If someone tells you that there’s better farmland over the mountain, and then another person tells you that, and then another person, then your mind starts seeing this as more likely to be true than not, especially if the folks telling you this are folks you trust. If someone tells you that there’s gold mines under every house in the town, but they can’t be touched because of federal laws — well that’s not something you hear a lot, so that’s obviously false.

You say — no, that’s not it at all! The gold mines are ridiculous, that’s why I don’t believe in them! The good farmland is logical! I’m a logical being, dammit!

I’m sorry, you’re not, at least on most days. If enough people told you about the gold mines, and every paper in the town made passing reference to the gold mines, you would consider people who didn’t believe in the hidden gold mines to be ridiculous. Want examples of this? Look at the entire sweep of history.

Or look at Lisa Fazio’s work on the illusory truth effect. The effect of familiarity is so strong that giving you a multiple-choice question about something you know is wrong (“The Atlantic Ocean is the largest ocean in the world, true or false?”) can actually increase your doubt that you are right, and even convince you of the false fact. Why? Because it counts as exposure to the idea that other people believe the lie. This is why Betteridge Headlines are so dangerously unethical. If I publish headlines once a week asking “Is There Gold Buried Under Sunnydale?” with an article that says probably not, eventually, when you are asked that question by someone else, your mind will go — you know, I think I’ve heard something like that.

All it takes is repetition. That’s it.

Don’t believe me? Go check out how many people believe Obama is a Muslim. Or ask yourself a question like “Does Mike Pence support gay conversion therapy?” without looking up anything and then ask yourself how it is that you “know” that one way or the other. If you know it, you know it because it seems familiar. A lot of people you trust have said it. That’s it. That’s all you got.

If Betteridge Headlines are borderline unethical, then Facebook is well over that line. The headlines that float by you on Facebook for one to two hours a day, dozens at a time, create a sense of familiarity with ideas that are not only wrong, but hateful and dangerous. This is further compounded by the fact that what is highlighted on the cards Facebook shows you is not the source of the article (which is so small and gray as to be effectively invisible) but the friendly smiling face of someone you trust.

The article that intimated Clinton may have murdered an FBI agent and his wife and burned them in their home to conceal the evidence was shared by over half a million people. If we assume a share rate of 20-to-1, that means that over 10 million people were exposed to the idea that this actually happened. And if we assume that there is a lot of overlap in social networks, many of those people were exposed to it repeatedly, and it came from many people they trusted.
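A rough back-of-the-envelope version of that arithmetic, sketched in Python (the half-million shares and the 20-to-1 viewers-per-share ratio are the assumptions from the paragraph above, not measured figures):

    # Illustrative exposure estimate for a single viral story.
    shares = 500_000          # reported shares of the story (a lower bound)
    viewers_per_share = 20    # assumed number of people who see each share
    exposures = shares * viewers_per_share
    print(f"{exposures:,} estimated exposures")   # 10,000,000 estimated exposures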

And that’s just one headline. People exposed themselves to Facebook multiple times a day, every single day, seeing headlines making all sorts of crazy claims and filing them in their famil-o-meter for future reference. We’ve never seen something on this scale. Not even close.

If Facebook was a tool for confirmation bias, that would kind of suck. It would. But that is not the claim. The claim is that Facebook is quite literally training us to be conspiracy theorists. And given the history of what happens when conspiracy theory and white supremacy mix, that should scare the hell out of you.

I’m petrified. Mark Zuckerberg should be too.



Responses to “The “They Had Their Minds Made Up Anyway” Excuse”

  1. some good points here Mike;
    i am wondering if there is an element of a “moral panic” here in your description? maybe historians of publishing can point to similar cases? you state we have not seen something of this scale, arguably true considering reach of today’s tech, but we must have seen similar events if the illusory truth effect is robust?
    i think this is a related point – am just in the process of watching the documentary 13th up to the part of the severe fall out for black people of the screening of The Birth of a Nation, horrendous.
    ta
    mura

    1. There might be an element of moral panic, but I think there’s good reasons to not see it that way. I’ll write more at a later time.

  2. Do you think there’s any relationship between the reputable news sites going behind paywalls and the rise of fake news/facebook as news platform?

  3. I always check the source of an article on FB before I read it. Daily Kos, Occupy Democrats, Slate, USA Today, for instance, I know will have a certain slant that I need to be aware of, because many are purely political sites meant to spin facts or fiction in order to persuade for their cause. I also find that people share without reading the article they are sharing, because of its source. It is a minefield that FB users should be skeptical of, and one in which they should check sources outside of FB before spreading lies.

  4. Great insights, Mike. I’m curious what you think of Twitter when you look at it through the same lens you’ve discussed above. In your opinion, is Twitter training us in a similar manner?
    Thanks.

  5. What you have said echoes a bit the idea from post-structuralist theory that language does not reflect the material reality of the world but rather reshapes and distorts it, and that what we see is merely a representation of the world. Truth is not something owned or controlled but a structure which society invents.
