I’ve been playing around with cognitive disfluency in slide design for my class lately, trying to solve a conundrum.

The problem is this — we know from research that reading materials that introduce “desirable difficulties” (such as presenting information in a difficult-to-read font) are recalled better than reading materials with a cleaner, more fluent presentation. This has been referred to as the “Comic Sans Effect”, after the notoriously hard-to-read font that is also apparently one of the more memorable. But the research shows that anything that disturbs fluency can have positive effects on recall — printing pages with a low toner cartridge, or producing deliberately bad photocopies.

(There are a lot of caveats to this research, which I’ll deal with later — particularly around the issue of whether we are testing “difficulty” or “novelty” — but also it is a relatively new finding and it’s unclear how it transfers to something like slide design…)

The problem is there’s a natural tension between your need as a presenter to have your slides represent you as a professional, and your desire to introduce desirable difficulties into slide-reading. The slidesets linked below represent my attempt to strike that balance. They are heavily influenced by mid-90s graphic design and perhaps also by Leigh Blackall’s presentation style from five or six years ago (Leigh’s slides in that 2006 Networked Learning presentation seared themselves into my brain forever, a perfect example of this working well). 

Anyway, here are some attempts by me to do this. Viva disfluency!

Association & Causation

Observable/Unobservable, Inference, and Claims

From A. N. Whitehead’s An Introduction to Mathematics, a brilliant early reflection on what we now see as a System 1/System 2 problem: “It is a profoundly erroneous truism, repeated by all copy-books and by eminent people when they are making speeches, that we should cultivate the habit of thinking of what we are doing. The precise opposite is the case. Civilization advances by extending the number of important operations which we can perform without thinking about them.”

From Obama’s 2012 campaign blog. I am glad to see them standing up for what they accomplished here instead of running from it. This is truly something that the administration should be proud of.

guardian:

Photograph: Brian J. Clark/AP

Two women share historic kiss at US Navy ship’s return
For the first time since the repeal of ‘don’t ask, don’t tell,’ a same-sex couple takes part in a traditional public embrace.

President Obama signed the repeal of “Don’t Ask, Don’t Tell” into law one year ago today.

utnereader:

The New York City crime rate famously plummeted in the mid-1990s under the watch of police chief William Bratton, who introduced a computerized mapping system called CompStat to help cops track crime hot spots. He later took the system to Los Angeles, where once again crime plunged. CompStat is now used nationwide, reports Miller-McCune, and Bratton is a law-enforcement superhero.

Bratton’s latest innovation, called predictive policing, represents the next step in computerized law enforcement—moving beyond charting past felonies to forecasting future offenses. “The only way for us to continue to have crime reduction is to start anticipating where crime is going to occur,” says L.A. police lieutenant Sean Malinowski, with whom Bratton conceived the sophisticated data analysis program.

I guess I’d be really interested to know how knowing where crime tends to happen (CompStat) is different from knowing where crime will occur (New Unnamed Thing). Is it an inference thing, where we put up a new bank and predict that, based on our experience, bank robberies will go up in the neighborhood?

I actually believe this might work; I just want more detailed information on it. The final paragraph suggests the counter-intuitive example of street lights *increasing* crime because muggers “need them to do their work”. But there’s a lot of confounding associated with streetlight placement — population density, socioeconomic status, level of foot traffic — are those accounted for?

I don’t think you can call it remediation anymore when 1/3 of your students require it. At some point the problem is not the students or the high schools, but that we’ve built a higher education system based on false assumptions about who our students are and what they have when they get here.

Our failure isn’t that the students need to be remediated. Our failure is that our misaligned priorities require us to call it “remediation”.

Minutes after I read Winer’s “everyone should run their own webserver” piece, I get this. 

I think sometimes we’re crazy people, driving old cars we repair ourselves, telling people how easy and cheap it is to maintain that 1993 Diesel BMW with the Fryolator oil-burning mod and the homemade solar charger. I’m actually really sick of running my own server, and if people like me are, I think the chances of a broad personal cyberstructure movement, failing some sort of subsidization of it, are somewhere between nothing and 0%, no matter what its social benefits might be.