The first use of the term “conspiracy theory” is much earlier — and more interesting — than historians have thought.

Was reading the new Oxford collection on conspiracy theory (quite an impressive collection, can be bought here) and noted that one of the articles dated the term conspiracy theory back to the 1870s. It’s not central to the author’s argument, but it’s not trivial either. The author sees the term as coming out of crime, and then migrating to politics:

Such considerations encourage a more systematic approach. By consulting databases that have digitized American newspapers from the nineteenth century, it is possible to gain an appreciation of how theory as a term made inroads into the discourse of crime. Thus, a search of the database America’s Historical Newspapers identifies the following dates as the earliest mention for conspiracy theory and other terms built on the template (crime x + theory):


murder theory (1867)
suicide theory (1871)
conspiracy theory (1874)
blackmail theory (1874)
abduction theory (1875)


From “Conspiracy Theory: The Nineteenth-Century Prehistory of a Twentieth-Century Concept,” in Conspiracy Theories and the People Who Believe Them (p. 62). Oxford University Press. Kindle Edition.


This dating is fairly common — scholarly accounts I’ve read have had similar or even later dates for the first occurrence [see Wikipedia, for example, for the OED date of 1909(!!), as well as some other potential dates].

However, unless I’m missing something, most of this is wrong. The first mention found in newspapers is in 1863, not the 1870s, and it does not come out of court terminology, but rather politics. In fact it is used much as it would be today, to derisively refer to a set of allegedly less educated people who see a secret plot when a simple non-conspiratorial narrative has much more explanatory power.

One note here for people who haven’t delved much into the literature on this — conspiracy theory as a popular term doesn’t really take off until the late 1950s, and the scholarly concern about conspiracy theories (under a variety of names) gets its first significant lift in the 1930s and 40s in discussions about totalitarianism and populism. And of course conspiracy theories — under different names — have existed since the dawn of time. When we talk about early mentions we’re talking about the evolution of the term — the idea that you could have a theory that hinged on conspiracy, and how that would be regarded back then. This in turn plugs into a debate about how much our perception of such theories has shifted over time.

The Conspiracy of the British Elites Against the Union

So now we come to the first mention of “conspiracy theory” I’ve found, apparently missed by the OED and others. It will take a little explaining to set up. But it’s particularly surprising others have not found it, since it’s from an exchange in the New York Times.

It’s from 1863, and it’s a response to a letter that had run the Sunday before. That letter had dealt with the question of why England — whose papers and elites had spent so much time attacking the United States over the institution of slavery in the 1850s — was now taking the side of the South in the Civil War.

The answer, the writer says, is obvious. America had been exerting influence on English institutions, and this had threatened the aristocracy, which feared loss of power. So they had embarked on a plan. They would support whatever side was weaker, in the hope that America would be destroyed. Once America was destroyed, then the governing classes could point to the failure of the U.S. as proof that democratic reforms don’t work, and reclaim power. In order to do this they would support the South verbally, but, importantly, not intervene on their behalf since the point is to avoid any decisive action that would hasten the conclusion of the war. Their best play was to draw out the conflict.

The ultimate endgame? Creating the “most terrible financial explosion ever seen in a civilized country” — all to benefit the small class of English aristocrats!

New York Times, January 4, 1863. Page 2.


I am not a Civil War historian, so I can’t say if any of this was true. But in form it’s not terribly different from the conspiracy theories promoted by Sanders surrogates about the DNC, or theories tossed around the right-wing blogosphere from time to time. A small set of elites is afraid of the success of brilliant progressive/conservative ideas, and so what do they do — they sabotage them, just to say, hey, I told you so.

Again — is it true? Who knows. But the form of the argument is very congruent with what we think of as conspiracy theorizing nowadays.

So the next week another person replies in the correspondence section of the NYT. And he says, look, you don’t need to invent this whole bizarre plot. England supported abolition when it was cheap for them to do so. Now it’s looking like it might get expensive if their cotton is cut off, so they are muddling through this. Their lack of intervention is not due to a desire to let the war do maximum damage or cause financial collapse, but based on the fact that they have other foreign entanglements that are much more consequential at the moment and can’t afford a new one.


Here’s the text of the portion that mentions “conspiracy theory”:

Now, when we look for the cause of this, any man who has made European politics his study at home, or, being abroad has known merely so much of them as one cannot help knowing, from daily perusal of the French and English papers, sees fast enough that since 1849 (to go no further back) England has had quite enough to do in Europe and Asia, without going out of her way to meddle with America. It was a physical and moral impossibility that she could be carrying on a gigantic conspiracy against us. But our masses, having only a rough general knowledge of foreign affairs, and not unnaturally somewhat exaggerating the space which we occupy in the world’s eye, do not appreciate the complications which rendered such a conspiracy impossible. They only look at the sudden right-about-face movement of the English Press and public, which is most readily accounted for on the conspiracy theory.

New York Times, January 11, 1863.

You’ll note here that conspiracy theory — more than a decade before the other examples historians often note — is used much as we would use it now. It’s a put-down, an assertion that the complexity someone else sees is a result of ignorance or worse.

You’ll note too something that is almost too delicious: the first use of conspiracy theory is about a conspiracy said to involve the press. The first reference to conspiracy theory we have on record is, in part, a “The press is so unfair because they’re in the bag for the elite cabal” conspiracy. Fake news, man.

I can’t help but feel that this citation raises some questions — a few at least — around some of the cultural history I’ve read on the use of the term conspiracy theory. But it’s Christmas Eve, and I’m not really interested in those conversations right now — I just wanted to correct something I’ve been seeing people get wrong.

To my knowledge I’m the first to trace the etymology back this far, but if I’m not, you can let me know in the comments. If I’m not the first, it’s worthwhile anyway to get this up so that people stop making this mistake.

The Homeostatic Fallacy and Misinformation Literacy

I wrote a thing for Nieman’s year-end journalism predictions yesterday that I’m quite excited about. Hopefully it will be out soon. (Update: it’s here.)

In the article I finally publish a term I’ve been throwing around in some private conversations — the “homeostatic fallacy”.

Homeostasis is a fundamental concept in biology. The typical example is human body temperature. For humans, 98.6 degrees tends to be a very desirable temperature. And so what you find is that no matter where you put a human body — in the arctic or the tropics — it finds ways to keep that temperature constant. In hot environments you sweat, and the evaporation helps cool you. Your blood vessels dilate, to bring more heat to the skin where it can be dissipated. In cold environments your blood vessels constrict.

There are a lot of people out there who think that misinformation and information literacy act on the mind much as hot and cold environments act on the body. In other words, both misinformation (let’s think of it as cold) and information literacy (let’s think of it as heat) have relatively little effect, because the mind has certain homeostatic mechanisms that protect against identity threat and provide resistance to new ideas. And there’s a certain truth to this. Knowing the facts about nuclear power won’t suddenly make you a supporter, and learning the history of Reconstruction won’t turn you woke.

And as such, we hear a lot of critiques of media literacy of the sort that you can’t change people’s minds by giving them better information. Homeostatic arguments sometimes go even further, claiming bad information is likely not leading to bad decisions anyway.

Often this is posed as a counter to a naive Cartesian view of belief, one that treats our decision-making processes as scientific. But weirdly, both the homeostatic view of misinformation and the Cartesian one suffer from the same transactional blindspot. On a case-by-case basis — show a person a fact, check if it changes them — homeostatic mechanisms often prevail. Your mind has certain psychological set-points and will often fight to keep them constant, even in the face of disinformation or massive scientific evidence.

But the fallacy is that the set-point itself will not change.

People often use temperature as an example of a homeostatic set-point, but I think another example is far more educational. Consider weight.

Your body has a natural weight it gravitates towards, and the power of that set-point shouldn’t be underestimated. Think about this astonishing fact — one pound of weight is about 3,500 excess calories. To keep your weight within a pound over a year, your average intake would have to stay within about 10 calories a day of maintenance levels. Few people account for calories at this level of precision, and yet many retain a stable weight year after year. That’s the power of homeostasis.
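For the curious, here’s a minimal sketch of that back-of-the-envelope arithmetic. The 3,500-calories-per-pound figure is the usual rule of thumb, and an approximation at that:

```python
# Rough arithmetic behind the "about 10 calories a day" claim (rule-of-thumb figures only).
KCAL_PER_POUND = 3500   # approximate excess calories per pound of body weight
DAYS_PER_YEAR = 365

# Daily surplus (or deficit) that would add up to one pound over a year
daily_margin = KCAL_PER_POUND / DAYS_PER_YEAR
print(f"Daily margin: {daily_margin:.1f} kcal")  # ~9.6 kcal/day, i.e. about 10 calories
```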

How do we gain weight, then? There’s quite a lot of debate on this, actually. But, more or less, temporary behavior and adverse events, in sufficient quantities, not only increase our weight, but change our set-points. And now the homeostatic mechanisms that kept us trim work against us, to keep the pounds on.

In short time frames, a sort of psychological homeostasis is protective. We see bad info or good info and nothing changes. We share a corrosive meme and we’re still the same person. Just as a single cookie at the Christmas party will have a net zero effect on your weight through the magic of homeostasis, a single bit of disinfo isn’t going to change you.

But the goal of disinformation isn’t really about these individual transactions. The goal of disinformation is to, over time, change our psychological set-points. To the researcher looking at individuals at specific points in time, the homeostasis looks protective — fire up Mechanical Turk, see what people believe, give them information or disinformation, see what changes. What you’ll find is nothing changes — set-points are remarkably resilient.

But underneath that, from year to year, is drift. And it’s the drift that matters.

Recognition Is Futile (and also dangerous)

I often talk about the dangers of teaching students to “recognize” fake news. Here’s a good example from today of why recognition is a lousy strategy that can lead to bad results: a tweet proposing that the President of Nigeria has been replaced with a clone.

Here is how Peter Adams’s excellent newsletter The Sift describes the conspiracy theory:

Speaking on Sunday to Nigerians living in Poland, where he is attending a U.N. climate conference, Nigeria’s President Muhammadu Buhari denied viral rumors — amplified by his political opponents — that he had died and that a look-alike from Sudan had taken his place.

The rumors first emerged in 2017 amid Buhari’s lengthy, unexplained disappearances from public life. They have been strengthened by misinformation and conspiracy theories shared on social media — including by Mazi Nnamdi Kanu, a political activist and leader of a separatist group.

But wait, are those the same person? Look closely…

What do you think? 

See, here’s the thing, and I really wish we’d put this at the center of what we do. If I tell you these are the same person, you’ll say, “But of course, it’s so obvious!” and point to some details. On the other hand, if I say actually I played a trick on you and swapped a picture out for another one, you’ll say… “But of course, it’s so obvious!” and point to other details.

Why is that? Two reasons. First, the informational field here is dense. A photograph has literally hundreds of things I can drill down on, and given an initial orientation towards the photo I may often find whatever I need to bolster that initial orientation. Second, there’s no stopping rule for declaring whether these are the same person or not. There’s no point, for example, where you can look at three precise things and declare, with high certainty, that this is confirmed to be him. So instead of bringing you to more certainty, the longer you look at it the more likely you are to find some strange differences, and the more likely you are to become confused.

Most things aren’t quite as informationally dense as photographs, but the same problems come into play. The more features you have to look at, the easier you are to manipulate. Don’t believe me? Here’s a cruel trick you can play on your faculty. Show them this page:

Tell them — hey, this is a suspicious medical site, how do we know? Unless they know it’s the site of one of the most prestigious journals in medicine, they’ll tell you all the reasons why it’s obvious this is a junk site:

  • It’s a .com, not a .org. Probably an imposter.
  • There’s literally an “Our Company” link at the top. Like, hello!?! This is not a journal!
  • “Beating sugar taxes” in that headline doesn’t sound very scholarly
  • And look at these pictures, they look like something from a clickbait site, stock photos, etc.
  • There’s a popup ad that looks like an ad for some sort of pharmaceutical promotion in the bottom, what a scam!

Or start by telling them the opposite — that it’s one of the top five medical journals published in English — and again they’ll come up with all the reasons why it was so obvious from the start that this was quality:

  • Good clean style
  • Lots of focus on medical practice
  • Scholarly titles on some articles
  • Mentions of the CDC
  • Research tab and Authors tab look legit.

In a dense informational field you think you’re being Sherlock, and feel pretty smart. But the more data you look at the more you’ll get confused.

This is well known by people outside online media literacy, of course. Ever go to buy a car? The psychology of dealerships is to put complex configurations of car options in front of you — this one is $1,600 more but it has the power windows and the satellite radio is standard, it has seat warmers and leather interiors, this one is less but doesn’t have ABS and doesn’t come in white, at least on the lot. All of this stuff is fed to you deliberately to overload your cognition, so that by the time you sit down and hear about their special warranties you’re putty in their hands.

If you want to survive on the internet or in a car dealership, you have to radically reduce what you look at. If seat warmers and satellite radio were not things you came in looking for — ignore them entirely. Their value to you is zero. Reduce the information you look at dramatically, and prefer things that are resolvable to comparable criteria (is this journal well cited compared to other journals?) to things that aren’t (which page looks more professional?). Choose a couple of crucial things, and stop there if those things suggest a clear answer.

Our four moves approach, of course, deals with just this issue, and you can read about it here.

Empower Teachers First

Someone asked me today whether I could share any insights about OER creation. I have a few thoughts about that, but the one I always come back to is that you have to empower teachers first.

You know that thing on planes where it’s like “In case of sudden decompression, put on your own oxygen mask first. Once it’s securely fastened, help those around you put on theirs?”

That’s OER. If the teacher gets their mask on they are going to save the damn plane, and if they don’t you’re all screwed. 

There are other models of course — every mega-MOOC wanted to do a direct-to-student play back in 2011. The OpenCourseWare movement largely ignored teacher-facing resources for most of its history. In both cases, the lack of focus on reuse by teachers resulted in impact patterns that followed the “Matthew Effect,” with most gains going to the students who came in with privilege, knowledge, and access. Those who already had knowledge and opportunity gained more opportunity, but those who didn’t never got a foot on that first rung.

The way to help at-risk students and the way to create more diversity in professions lacking it is not to create more and better self-study materials. It is to find teachers that are already teaching the populations you want to attract or help and relentlessly focus on helping them be better and more effective teachers. 

——

I’ve written on this, alone and with Amy Collier, a bunch of times over the years. Here are some posts on it. Some of this is dated but the larger points still hold:

Why I Am Concentrating On Open Teaching Resources (2010)

Openness as a Privilege Multiplier (2011)

Why We Shouldn’t Talk MOOCs as Meritocracies (2012)

Rethinking Online Community in MOOCs Used for Blended Learning (2013, with Amy Collier)

The Tensions of Open Pedagogy

New article out in EDUCAUSE Review that outlines a possible open pedagogy framework. Here’s the key graphic:

4 ovals connected by 4 diamonds in a circle. Center says 'Open Pedagogy'. Circle 1: Access & Equity; Diamond 1: Participation; Circle 2: Community & Connection; Diamond 2: Curiosity; Circle 3: Opportunity & Risk; Diamond 3: Responsibility; Circle 4: Agency & Ownership; Diamond 4: Empathy.

As long-time readers of this blog know, I may be the misinformation literacy person right now, but I came here by way of thinking about open pedagogy and its intersection with digital literacy and democracy. It’s literally my 20-year passion in this space, what excited me about edtech back in 1996 and got me on this road.

The one thing I like about the language above is that it’s possible to start to tease out the tensions of open pedagogy from the model. And it’s the tensions that you have to start with, otherwise you just get religion.

As an example, Maha Bali has noted that access and equity often are in tension with agency and ownership. Self-hosted publishing systems, for example, which provide strong levels of ownership and agency, often throw up technical and economic barriers for students. There is a tension there that never resolves, but has to be seen as a set of competing ideals, reconcilable only in the scope of a given local context. 

You find a similar tension between agency and community. Long ago, when we did Keene State’s media fluency outcomes, we separated out participatory modes (about personal engagement and empowerment) and collaborative modes (about working together). And there is a tension there — to act collectively is to give up some power and ownership for the sake of connection and communal action. Things like federation provide new (and exciting!) ways of dealing with these concerns, but do not eliminate the need for trade-offs.

Even Community and Equity can be at loggerheads. When we have students engage in real-world environments like Wikipedia, we are giving them the ability to have real impact, but it can come at the cost of putting them into environments that are not equitable.

In terms of exploring tensions, risk and responsibility — defined as a commitment to interrogate tools and practices — is the one that doesn’t quite fit for me. To me this is a broader element that actually touches on the other realms — Agency, Community, Equity — because these tensions are exactly what an interrogation of tools should look at. So I see this as more of a meta orientation, connecting the others. If I were to choose a fourth to replace it, I might look at impact — the way in which students are given opportunities to make non-pedagogical impacts on things that matter to them — and then have interrogation cut across all the dimensions.

Overall, though, very happy to see this and hope we keep talking about it.

Cynicism, Not Gullibility, Will Kill Our Humanity

I’ve mentioned before that students come into the misinfo classes we teach more or less not trusting things. Here’s student trust in four stories they should have low trust in:

Level one there is low trust. And that’s where the students are.

Those are the dubious prompts, so that would be good if that were the end of the story. But of course they don’t trust anything fully (or distrust anything fully). Here are some stories they should trust, in blue, next to the dubious ones:

That’s right — nationally reported, well-sourced stories get equal trust levels to child sex-trafficking conspiracies broadly debunked by all reputable press and all levels of government. A level that equates to “There might be something there but I don’t really trust it.”

Why does this matter? It matters because of this:


Photo by Reuters photographer Kim Kyung-Hoon

This is a photograph of what’s going on at the border in America’s name. Maybe you agree with it, but actually most Americans don’t. That is, assuming they think the photo is trustworthy. 

But of course, a lot of people are out there saying the photo is not trustworthy:

The vast majority of students don’t engage in this level of conspiracy theorizing. But exposure to this stuff over time, combined with no instruction on how to sort trustworthy information from untrustworthy information in a networked environment, leads to that undifferentiated suspicion we see towards everything.

Gullibility can do real damage, as we see with some people who consume a daily diet of toxic fakery. But for the median citizen — and especially our youth — I am far more worried about the corrosive effects of cynicism.

Major advances in the U.S. — from civil rights to the social safety net — have been driven by the public being confronted with the consequences of their inaction or action and having to reckon with that. But if everything is worthy of low trust at best, you never need to confront the impacts of policy or politics or personal action. Uncertainty — hey, I did hear about that, but who knows what’s true anyway — acts as a cultural novocaine, allowing one to persist in inaction, even as evidence mounts of effects that that same individual might find repugnant — if, you know, it turns out to be really true. Like, really, really true. And who knows?

The depressing thing is that there are methods that can help with this undifferentiated cynicism but we aren’t rolling them out to students. Remember the graph above showing the undifferentiated cynicism? Here it is again, the dubious prompts in red, the trustworthy in blue, for a variety of misinformation and disinformation across the political spectrum:

Not good. But here’s how students do after our “four moves” style instruction:

For reference, a three for most of these trustworthy prompts is a good answer, so this is an incredible amount of differentiation. That’s after just four hours. More details here.

Most of what we give students today doesn’t get them anywhere near this, of course. Some of it probably makes them worse at this stuff — so many of the free-text comments from students who got this stuff wrong used language that was straight out of an information literacy session they had had at some point (more on that later).

But we could give students this, if we wanted to. And if we want to retain our capacity to be unsettled — the driver of so much of the politics that has improved this country and others — giving them this is essential.