Hapgood

Mike Caulfield's latest web incarnation. Networked Learning, Open Education, and Online Digital Literacy


Some 2018 Predictions

I wrote a prediction piece a couple of weeks ago for Nieman Lab, and it was general and media-literacy focused. But here are some more mundane, somewhat U.S.-centric predictions:

  • Social media overrun by AI. AI’s main influence in the coming year will not be in driving cars but in driving state-sponsored social bots and corporate astroturfing. The ability of AI to manufacture real looking discourse and news will render “representativeness” heuristics for misinformation obsolete.
  • Revenge porn in politics. Revenge porn — using both real and fake photos — has been used in multiple international elections. We saw a hint of what can happen here with the Joe Barton incident, and the media failed that test. But prepare for much more strategic use, along the lines of what we saw recently in Rwanda. Prepare for at least one U.S. 2018 race to have a revenge porn “scandal”, maybe more. Also, look to AI — with its ability to modify video and images — to play a possible role. (Also, can we come up with a different name for this trend? Revenge porn is a specific form of malinformation/misinformation, but “porn” is the wrong word, since titillation is merely the mechanism; the intent is personal destruction and the silencing of opponents.)
  • Hacked voter registration rolls. Some state voter registration systems will be hacked in an effort to create chaos and long lines at the polls, depressing the vote and calling the results of elections into question. Social media bots and armies will play both sides of this issue — with the mismatches portrayed as voter fraud on one side and suppression on the other.
  • Rise of the civil servant email dump. The press was willingly complicit in the Wikileaks email dump, covering and sifting through stolen personal, non-state emails to amplify a Russian disinfo campaign. Until we come up with a “right to privacy” for state officials and workers, this will be a profitable vein. Imagine dumping the personal emails of a state-level elections official where they talk mean about one of the candidates, or share an anti-Trump meme with their spouse. Or an anti-Democratic meme. Apply this to a judge faced with adjudicating a case where the government is a plaintiff, or anyone who threatens your agenda.
  • Big Oil trolling. A major petrochemical company will be revealed to be using troll armies to create confusion around climate change, perhaps with some state help.
  • Creation of a pro-government social media army focused domestically. My most out-there prediction. President Trump will announce the creation of a “Fake News Commission” to investigate both journalists and social media. One finding of the commission will be that the U.S. needs to emulate other countries and create an army of social media users to seek out anti-government “fake news” and “correct” it. (There is some precedent for the U.S. doing this in other countries, but we have never done it to our own population at any serious scale.)
  • Weaponization of #MeToo. The #MeToo movement is an important and long overdue movement. But because of its disparate impact on the left, it will be weaponized and used to try to fracture the left. This will be one of the big uses of bots and trolling in the election, second perhaps only to flame-fanning around voter fraud and voter suppression. Look for the rise of the #MeToo concern troll, the supposed “lifelong Democrat” who was “ready to vote for X” but these rumors “have disgusted” them. They’ll have names ending with eight random digits. Expect AI and microtargeting to play a role here, making it difficult for candidates to respond, especially to rumors they don’t know exist.

I don’t know what to do about this last problem. It’s knotty. I’m not the person to write about how we prepare as a society for this, but I wish someone would.

Larger Trends

Looking at these predictions, which I typed just now, stream-of-consciousness style, I’m struck by the two social vulnerabilities they exploit.

Industrialization of conversation. We have not come to terms with how the digitalization of conversation allows for its industrialization, and how its industrialization allows for manipulation that is more massive and immediate than anything we’ve previously seen in the conversational space. We need to develop tools and norms to protect conversation from industrialization. And we desperately need to stop conceptualizing the discourse space on the web as a bunch of individual actors expressing an emergent will.

The erosion of the right to privacy. The modern expectation of privacy, if I recall correctly, was a result of both urbanization and literate culture. In a small tribe you don’t have much privacy, but on the other hand everyone knows you and has the context to evaluate new information about you. In a rural setting, most of what you do is relatively undiscoverable by others, unless someone involved talks. But in towns and cities people’s actions are both discoverable and shorn of context, and written communication is similarly stripped of its setting and intended interlocutors, so new norms and laws were needed.

(Economics played a part here of course as well — consider the way that works of Rembrandt and others portray homes of the rising middle class as quiet and meticulously organized private spaces apart from the bustle of the street.)

Internet giants such as Google and Facebook have labelled privacy as a historical anomaly. And it’s true that the modern conception of privacy seems to emerge with the development of the modern middle class. But there are some things to note here.

The first is that privacy is necessitated by a move to literate culture. Verbal communication is by its very nature private, available only to those present at a specific time and place. Written communication makes possible the broad dissemination of messages intended for different audiences and contexts, and so a notion of informational privacy has to develop. The mailman doesn’t get to read my mail, and has never had the right to, from the invention of mailmen. This is because the notion of mail gives rise to new notions of mail privacy. Mail doesn’t make privacy obsolete — the norms and the tech co-develop.

Looked at this way, the move from literate to digital culture should not reduce the amount of privacy available to people, but increase the realms where the concept is applied. Our literate and productively bureaucratic culture could not have developed without the expansion of privacy norms around written communication. My dad worked for Digital Equipment Corporation for many years, an IBM-like mid-20th-century creation that functioned on memos and notes and written analysis. Had it been legal and acceptable for anyone to sell internal memos to other companies, or to publish employee assessments in the local paper, very little work could have been done on paper, and organizations like IBM and Digital simply would have been impossible to run. They would have collapsed. The invention of the written memo occasions new norms about the privacy of the written memo.

The move to digital communication should, likewise, prompt new and more restrictive norms around privacy. Otherwise our digitally enabled culture will collapse, completely unworkable. But what’s different this time is that the business model of our modern system involves the mailman reading our mail, so powerful interests have spent a lot of time arguing that rather than prompting new notions of privacy, technology undoes privacy. This is unbelievably stupid and technocentric. The invention of sexting, for example, doesn’t “undo” privacy — it argues for an expansion of the concept. The use of email doesn’t mean that everyone’s annual reviews will now be a matter of public record, or that we now all have a right to read the personal work squabbles of others — it means we need to develop new norms and laws and security expectations about email.

I’m sure that the powers that be in Silicon Valley believe in “the end of privacy”, just like they believe in technocratic meritocracy. The most attractive thing for any programmer to believe is that new technologies will render the messiness of social relations obsolete. But this idea, that privacy is antiquated, will lead to institutional and organizational collapse on a massive scale, which is why a transparency organization like Wikileaks is the favorite tool of dictators.

Additionally, unless privacy concerns are addressed, we will end up reversing the advances of literate culture, which allowed broad participation in discourse and decision-making. Keep in mind that while people become increasingly wary of speaking frankly in email, text, and chat rooms because of the lack of technical security and ephemerality, people with face-to-face access to power will be able to speak freely. It’s easy to mock bureaucratic culture with its emails, and memos, and endless reply-to-alls. But when the only way to influence the direction of the company reverts to being seated in a chair across from the CEO, we will miss it.

That’s a long point one on privacy. But let me add point two. The invention of an intensely internal and personal private life is one of the great gifts of modernity to humanity. I love Shakespeare, but read one of Shakespeare’s soliloquies next to Wordsworth’s “Tintern Abbey”. Reflecting on returning to a place he has not seen for a long time, Wordsworth writes:

 These beauteous forms,
Through a long absence, have not been to me
As is a landscape to a blind man’s eye:
But oft, in lonely rooms, and ‘mid the din
Of towns and cities, I have owed to them,
In hours of weariness, sensations sweet,
Felt in the blood, and felt along the heart;
And passing even into my purer mind
With tranquil restoration:—feelings too
Of unremembered pleasure: such, perhaps,
As have no slight or trivial influence
On that best portion of a good man’s life,
His little, nameless, unremembered, acts
Of kindness and of love.

People focus on Wordsworth’s treatment of nature, which is remarkable in itself. But the most striking thing to me about the poem has always been how recognizable the psychology is compared to Shakespeare. And a piece of that is the way in which the narrator’s personal and private life provides sustenance even as the “din of towns and cities” drains him. The way in which even his mental experience of those “lonely rooms” is intensely and unapologetically personal. The obsession with the psychological reality of mental imagery. You see this same development in the novels of the Brontës and Jane Austen, and in portraiture. Matisse’s Woman Reading, for example, sits with her back to us, her room small and unkempt, but she is transported into a different world via the book she reads.

I’ve spent too many words on this post already, so I’ll pursue this another time. It’s difficult to explain. But the notion of privacy is more than just social and organizational lubricant — over the past 500 years or so we’ve built it deep into the notion of what it means to be human, and removing it dehumanizes us.

Warmly,

Mike



6 responses to “Some 2018 Predictions”

  1. Very nice shift to Wordsworth.

    The “pro-government social media army focused domestically” really struck me. That sounds like a digital militia.

  2. […] up with them, and I wonder why we expect to avoid that. So I found Mike Caulfield’s “some predictions” post section on “the erosion of privacy” very timely “the mailman […]

  3. […] recently published a post of “Predictions for 2018” that in turn make me predict that his topic for 2018 will be clickbait content generated by machine […]

  4. […] Preparing for 2018– Not that we’re much into predictions, but if you’re going to be living dangerously it might help you be a little better prepared, and Mike Caulfield has some interesting thoughts on the subject. […]

  5. […] does it? I would say I hope 2018 is better than 2017, but I don’t see much hope for that. Mike Caulfield’s predictions seem pretty on the nose for me, and that doesn’t look like a fun place. But I’ll keep blogging […]

  6. “Under the right conditions, we can know things that we don’t know we know, and we can sometimes predict events or attract what we are thinking,” said Dr. Beitman, a former chairman of the psychiatry department at the University of Missouri. Here are some of the most remarkable coincidences from “The Simpsons,” and how they can, or can’t, be explained.
