Introducing SIFT, a Four Moves Acronym

The Four Moves have undergone some tweaking since I first introduced them in early 2017. The language has shifted, been refined. We’ve come to see that lateral reading is more of a principle underlying at least two of the moves (maybe three). We’ve removed a reference to “go upstream” which was a bit geeky. All in all, though, the moves have remained constant, partially because so many people have found them useful.

Today, we’re introducing an acronym that can be used to remember the moves: SIFT.

  • (S)TOP
  • (I)nvestigate the Source
  • (F)ind better coverage
  • (T)race claims, quotes, and media back to the original context

If you’ve followed the moves as they have developed over the past two years, these won’t surprise you, but there are a couple changes to the wording and the order.

The most notable is that we’ve combined our habit (originally “check your emotions”) with the move (“circle back”), because these turn out to be the same thing. Basically: stop reading, stop reacting, figure out what you need to know, and reapproach. In the beginning, this means not reading before you orient yourself. When researching, it means that if you are getting sucked into an increasingly confusing maze of pages, STOP AND BACK UP.

The other moves are the same as the most recent iteration, with the change that “Find better coverage” replaces “other coverage” to emphasize the idea you are looking for other coverage, but ideally coverage that is slightly better on at least one dimension. What those dimensions are may be contextual, but often students have some half-decent intuitions here that can be refined over time.

We’ve also broadened out “Find the original” to its replacement, which stresses that the point is not just finding the original for its own sake, but finding the original context. The original may be better — original reporting from the NYT or a fact-checked Atlantic article. But it could be worse — a claim that is sourced to a junk journal, or that simply began as an unsubstantiated tweet. In the case of photos or videos, the original context is often mitigating, as media or quotes are frequently presented with a false, inflammatory frame.

But the main introduction here is the acronym, a direct answer to CRAAP. (“Don’t CRAAP, SIFT?”).

Final note — some people might look at the acronym and think — “Isn’t this just more CRAAP? Another checklist?”

I deal with this extensively on this blog and in the textbook, but the problem with CRAAP has never been the acronym. In fact, the history of CRAAP as a web infolit device begins eight years (at least) before the acronym. The difference has always been the difference between a narrow list of things to do (SIFT) and a broad list of things to consider and rate (CRAAP). I’ve detailed at length why that makes such a difference in terms of cognitive load and other factors, so I won’t repeat it here. But my point is that a bad methodology got a lot of lift with a clever acronym that served as a convenient shorthand and a student mnemonic — it’s probably time the better methodology gets an acronym as well.

The Curation/Search Radicalization Spiral

Sam prides himself on questioning conventional wisdom and subjecting claims to intellectual scrutiny. For kids today, that means Googling stuff. One might think these searches would turn up a variety of perspectives, including at least a few compelling counterarguments. One would be wrong. The Google searches flooded his developing brain with endless bias-confirming “proof” to back up whichever specious alt-right standard was being hoisted that week. Each set of results acted like fertilizer sprinkled on weeds: A forest of distortion flourished.

From “What Happened After My 13-Year-Old Son Joined the Alt-Right”

I have one or two quibbles with the recent article in the Washingtonian about a 13-year-old’s slide into the alt-right by way of meme-world, but the article as a whole is quite useful and, for parents at least, very moving. I recommend everyone read it, parents in particular.

Let’s get the quibbles out of the way first. I think the article is a bit too enamored with Nagle’s Kill All Normies, and that maybe leaks into the narrative as well, with the inciting incident (wrongly accused of sexual harassment) perhaps playing too dominant a role. I would say it’s a bit too sympathetic, except of course it’s the woman’s son and the kid is thirteen. So I think we can let it slide. (Don’t read Nagle, though. Read Becca Lewis and Joan Donovan instead).

Where the article does excel, though, is in the way it gets across the process of grooming that these communities use. People tend to think of grooming in the context of sexual predators or spies — the slow process of finding disaffected people and using their disaffection to warp their mind bit by bit. But we’ve long known that this is how online radicalization works as well, from ISIS to neo-Nazis.

The quote I’ve chosen at the top of this article talks about confirmation bias, and I’ll come to the ways that is right in a second. But let me first say what “confirmation bias” gets wrong about our radicalization problem. (Trigger warning: I will be drawing a short parallel that touches on sexual predation).

No foreign power looking to recruit a spy goes up and says, hey, will you spy for us? And sexual predators do not begin grooming by asking for sex. Instead, in each case, there is a slow process of getting the target acclimated, bit by bit, to ideas they once thought repulsive. The grooming is achieved by hiding the destination of the grooming until the target is already deep in the alternate reality.

This is an important point, because it’s actually working against confirmation bias. Confirmation bias would take a non-Nazi, and work to keep them a non-Nazi. Confirmation bias, were all the cards on the table at the beginning of the grooming, would be protective. You’d Google, find out you were reading Nazi literature and think um, maybe I’ll read something else.

So what’s going on with these Google searches?

The Google searches flooded his developing brain with endless bias-confirming “proof” to back up whichever specious alt-right standard was being hoisted that week.

A few things are likely happening. The first is curation. The Reddit group was likely feeding her son a constant stream of outrages of men being ill-treated by feminists. An ad that denigrates male aggressiveness in sports. The story of a woman falsely accusing a man of rape. Statistics showing the wage gap is a myth. A feminist saying outrageous things. Probably some fake stuff, à la #EndFathersDay, thrown in for good measure. When these things are all put together in a stream, it can seem like there is a vast conspiracy to suppress the real truth. How come they never taught you this stuff, right?

Now, this is where we’d think being inquisitive would help. Get out and Google it, right? And for someone skilled at finding the right information on the web, that strategy might work. But the curation and the language used produce loaded searches that just pull one deeper into the narrative that the curation scaffolded.

What do I mean? Well, take the infamous Dylann Roof search “black on white crime” which he indicated was his first step into the radicalization that led to him slaughtering black worshipers in a church basement in an attempt to incite a “race war”. In the beginning, he put “black on white crime” into Google, and this is what happened next:

But more importantly this prompted me to type in the words “black on White crime” into Google, and I have never been the same since that day. The first website I came to was the Council of Conservative Citizens. There were pages upon pages of these brutal black on White murders. I was in disbelief. At this moment I realized that something was very wrong. How could the news be blowing up the Trayvon Martin case while hundreds of these black on White murders got ignored?

As I’ve talked about previously, “black on white crime” is a data void. It is not a term used by social scientists or reputable news organizations, which is why the white nationalist site Council of Conservative Citizens came up in those results. That site has since gone away, but it was, essentially, a running catalog of cases in which black men had murdered (usually) white women. In other words, it’s yet another curation, even more radical and toxic than the one that got you there. And then the process begins again.

So this is what the spiral looks like:

The curation/search radicalization spiral. Curated reality prompts loaded searches which introduce the user to even more radical sites with more radical curations, which produce further searches. As the user moves outward, they become more and more unmoored from reality.

You can read Roof describe the process here:

From this point I researched deeper and found out what was happening in Europe. I saw that the same things were happening in England and France, and in all the other Western European countries. Again I found myself in disbelief. As an American we are taught to accept living in the melting pot, and black and other minorities have just as much right to be here as we do, since we are all immigrants. But Europe is the homeland of White people, and in many ways the situation is even worse there. From here I found out about the Jewish problem and other issues facing our race, and I can say today that I am completely racially aware.

From Roof’s “manifesto”.

The thing to remember about this algorithmic-human grooming hybrid is that the gradualness of it — the step-by-step nature of it — is a feature for the groomers, not a bug. I imagine if the first page Roof had encountered on this — the CCC page — had sported a Nazi flag and a big banner saying “Kill All Jews” he’d have hit the back button, and maybe the world might be different. (Maybe). But the curation/search spiral brings you to that point step by step. In the center of the spiral you probably still have enough good sense to not read stuff by Nazis, at least knowingly. By the time you get to the edges, not so much.

Digital Literacy Interventions

There is so much that needs to be addressed here, in terms of platforms, schooling, awareness of the danger of various ideologies. In terms of underlying patriarchal and white supremacist culture, and the systems that serve to replicate and enlarge its influence. When I talk about digital literacy interventions, I do not mean to minimize this work. It’s massive.

But digital literacy is my piece of it. What do digital literacy interventions look like here?

There are multiple entry points here, corresponding to the parts of the spiral:


Curation

Students need a basic understanding of how curations can warp reality. I don’t think the “filter bubble” is the right frame for this, since it implies that curations confirm existing beliefs and that stepping outside the curation is a net good. In reality, curations don’t protect us from opposing views; they often bring us to more radical views. Thinking about what you want from a curation, in terms bigger than “both sides,” is important. (Spoiler: what you want is context, and the people best suited to bring context are people in a position to know, via expertise, professional skill, or lived experience.) What applies to human curation applies to algorithmic curation and recommendation as well. Students should be able to look at a YouTube recommendation list and articulate what the underlying principle of curation seems to be.

Loaded Search

Students need to be aware of how search terms shape results. I talked about this a bit in my textbook a few years back — how searching something like “9/11 hoax” presupposes a certain type of result. If I were rewriting that book now, I’d massively expand that chapter and the examples around it. Like much of digital infolit, the key here is that students know how to “zoom out” to a broader, more neutral term, using diction likely associated with the things they would want to read.

Loaded Results

Even the most loaded search term usually delivers a page with at least one good result. Teaching students to scan search engine result pages with an eye toward what sort of information is behind each of those links can help; when clicking, students often zero in too much on result relevance and not enough on result genre and quality. Students can also be taught to use somewhat curated searches — News-only searches, Scholar, Images.

New Site

Here, lateral reading is key. Before engaging with a new site, students should find out what the site they are reading is. What’s its agenda? Its record of accuracy? Again, remember that grooming happens bit by bit, and one of its main mechanisms is hiding its true nature from the target. If students realize early in the process that they are drifting into radical and toxic territory, they can choose to proceed with the right frame of reference, or perhaps avoid those sources altogether.

Digital Infolit Can Help

Digital literacy, source-checking, and lateral reading are not replacements for action that needs to happen elsewhere. Sites like Reddit must consider what cultures they are supporting, and how their platform’s affordances may be exacerbating ill effects. The roots of white supremacy must be addressed. Full digital literacy should address issues of how economics, platform incentives, tribalism, and supremacist/sexist/colonial structures shape online discourse and production.

But the incremental nature of grooming on the internet does not just rely on ill-feeling or latent racism; it makes use of a series of misconceptions most people have about how to find and think about information on the web. The machinery of radicalization is massive, but small mistakes in search and site selection behavior help grease its wheels. Addressing those mistakes directly with students can help increase the difficulty of such radicalization for groomers — from neo-Nazis to ISIS — and given the relatively small cost of providing such training, is an intervention we should be pursuing.