A short explanation from a terminal smasher (or, Blackboard as an access control company)

YAAY! I am also going to smash all my corporate-made computers and hand-build my own. It’s NOT about the vehicle – it’s how you use it…

— Lee (no last name provided) dismissing EDUPUNK in a comment on the Chronicle article

As a person who has been involved in quite a bit of social activism, and done way too many interviews where I felt the resulting article was to the side of the real point, I have to say the fact the Chronicle has now covered EDUPUNK is incredibly significant, no matter what the slant. This term is literally less than a week old, and it is already disturbing people. That’s very, very good. And the fact the article links to our little blogring here means that the article can say whatever it wants (and kudos to the Chronicle reporter for being linky here). People interested in the concept can wander over into our conversation and get a level of analysis deeper than the “let’s smash Blackboard” dismissal.

And since they may be wandering over here, let’s clarify that.

First — on why this ends up being about Blackboard, even though we all just want to move on… well, is there another LMS of the size and influence? Of course not. Any discussion of LMS use in America is going to focus primarily on Blackboard, and if people don’t like that, they have only Blackboard and their government to blame. EDUPUNK didn’t grant them the patent, and EDUPUNK didn’t crush competing options through lawsuits and buyouts.

But let’s get to the main point, the thing that becomes clear once we get past the EDUPUNK-as-terminal-smashers rhetoric.

The movement is primarily creative, not destructive. It just looks like destruction to those who haven’t seen creativity in a while.

They didn’t get that about punk either, as Iggy himself tried to explain in a CBC interview.

Look, the movement is not anti-corporation. Google makes a profit, as does WordPress, as does pbwiki, as does Twitter (well, ok, not technically *profit* there, but still).

What the movement is about is this — while Blackboard was busy trying to leverage their foothold in the University to get into the business of dining hall management, video surveillance, and door access control, this little thing called Web 2.0 happened. And suddenly the technology Blackboard had for learning began to look — well, old. Junky. Very 1999.

So while Bb spent their efforts trying to become the single sign-on point for your institution, professors, frustrated with the kludginess of the actual *learning* part of Bb’s suite, started looking elsewhere for solutions.

Their first discovery was that they could do everything they were doing in Blackboard for free, and much more easily.

But the second discovery was the kicker. These Web 2.0 tools they adopted encouraged them to share their stuff with the world, instead of locking it away in a password protected course. And suddenly, they got a taste of open education. And it didn’t stop there. The tools they adopted had a true web DNA, and played well with other tools in a loosely coupled mode. So suddenly, they got a taste of what it was like to build your own custom learning environment.

And so on. They started to experience the creativity that the web can unleash, and experienced for the first time that connectivist thrill people had been going on about.

And it was then, with their courses out on the net for all to see, having developed WordPress pages that mashed together video with slideshares with twitter updates and del.icio.us feeds, having witnessed students commenting on posts right next to people from across the world, having seen authors of books responding to their students’ reader-response essays, directly —

It was then that it hit these people. Blackboard was never a learning tool.

It was an access control system.

That detour into running your dining hall cards and your security cameras? It wasn’t a detour. It was the core business, extended into other realms. To Blackboard, it’s the same business. You pay your money, you get to get in and get the food.

And I think a lot of people realized at that point that Blackboard did not have (and never did have) the slightest idea what the web was about. Once you see how access control, and not learning, is at the heart of what they do, reading their promotional material becomes amusing. Even the better advocates for Web 2.0 over at Bb can’t escape the pull of the force. Here’s their promotion of their new Web 2.0 collaboration tool, Scholar:

That’s really what Scholar is all about. The whole idea is to enable academic resource storing and sharing among people with the common focus of education…a “validated network”, if you will. All Scholar users are instructors, students or staff from educational institutions and therefore you can consider most of the resources on Scholar “vetted”. It certainly saves me time and effort in a lot of the research I do everyday.

You see, it’s Web 2.0 — with access control! And this is from one of the more astute people over there. But she can’t fight it. It’s in the DNA of the company.

Look, here’s the deal in a nutshell. If you believe there’s not much difference between the business model and mission of your Dining Commons, and the business model and mission of your university or college, by all means, give your vendor the keys. Let their feature set determine what you do in your classroom. Get excited about all the people you can keep out of your academic endeavor. Tie your roster and your building access into the same central database.

Seriously, go ahead. From a student service perspective, it may be exactly what you want. Go with God.

Just don’t confuse that with education. Keep your education EDUPUNK.

We’ve seen the future. And we’re not going to put it back in the box.

Bowling Alone, the Local Internet, and XBox Live

From Putnam, 1995:

The most whimsical yet discomfiting bit of evidence of social disengagement in contemporary America that I have discovered is this: more Americans are bowling today than ever before, but bowling in organized leagues has plummeted in the last decade or so. Between 1980 and 1993 the total number of bowlers in America increased by 10 percent, while league bowling decreased by 40 percent … The broader social significance … lies in the social interaction and even occasionally civic conversations over beer and pizza that solo bowlers forgo. Whether or not bowling beats balloting in the eyes of most Americans, bowling teams illustrate yet another vanishing form of social capital.

Edupunk* that I am, I occasionally feel the weight of the world on my shoulders. I have this theory, shared by many, that the Local Internet can start to rebuild the local communities that national TV destroyed over the past 60 years.

And then I think I’ve got to join the struggle and use my knowledge to help fix this. And that’s what I feel I’ve been doing, in one way or another, for quite a few years now.

But occasionally I happen on something that is just *happening*, happening naturally and organically. Something that feels less like rolling rocks uphill and more like bodysurfing the revolution.

XBox Live is such a thing. Until recently I had no idea what this was. I mean, one hears the talk about Second Life, and about World of Warcraft — those efforts and their relation to culture have been dissected quite a bit by academia — themes of value, life building, construction of self and problem solving have been analyzed in these arenas.

But what no one ever mentions is XBox Live and Bowling Alone.

Put simply, here is what I discovered on XBox Live. I started out several weeks ago playing other people nationally that I didn’t know. Random people. I got bored. It was too CompuServe chat room circa 1984.

So I started to badger my friends with XBoxes to get online so I could play them. We got five local people together and broke into teams, playing at three different houses and having one of the most amusing nights in recent memory. We’re now looking at forming a league team together and taking on other teams, or challenging others locally.

And what I found out in the process is that to varying degrees this is how many people are using XBox Live. The kid across the street plays his school friends on XBox Live. People friend people they know or invite them to join games they are in. Leagues form, teams are forged. The patter in the games is a combination of tactic talk (“He’s on the balcony — try going up the back stairs while I distract him with a grenade…”) and the regular social patter one hears during any group game.

There is nothing revolutionary about this to anybody doing it. Yet look closely and you’ll see that this is the Local Internet, the process of the Internet rebuilding what the Broadcast Era destroyed.

At the end of the original essay Bowling Alone, Putnam delves into technology’s relation to the erosion of social capital:

In the language of economics, electronic technology enables individual tastes to be satisfied more fully, but at the cost of the positive social externalities associated with more primitive forms of entertainment. The same logic applies to the replacement of vaudeville by the movies and now of movies by the VCR. The new “virtual reality” helmets that we will soon don to be entertained in total isolation are merely the latest extension of this trend. Is technology thus driving a wedge between our individual interests and our collective interests?

The answer to that question is no, not necessarily. In fact, writing in January of 1995, Putnam may have been marking the high-water mark of technology’s power of separation and isolation, the last point where someone could talk about a virtual reality helmet without realizing that it gets awful lonely in that VR world if it’s not populated by people you know.

* I have chosen to spell “edupunk”, when used as a reference to a person, in lower-case. EDUPUNK the movement is, of course, in all caps, à la Jim Groom.

Edupunk

Jim Groom brings a new term into being in a recent post — edupunk.

There are a couple of reasons I find the term useful, but the most important is that it captures the cultural revulsion many of us feel at the appropriation of the Learning 2.0 movement by corporations such as Blackboard. Learning 2.0, like punk, is a DIY movement. Like punk, it favors technical accessibility over grand design.

And to people like us, Learning 2.0, if it is to remain relevant, must not be relegated to the dustbin of “features” or “products”. It’s neither a product nor a process, but a way of approaching things, of which products are only one of the results.

Yet all the 2.0 formulations — Classroom 2.0, Learning 2.0, and even Web 2.0 itself — work against this very notion that what we are chasing here is not product, but style. What does the 2.0 version number symbolize if not a shrink-wrapped box or set of features?

What began as a clever pun has outlived its usefulness to us. We’ve known that for a while, but as companies begin to reduce the social web to a set of ingredients in their products, we have to go further than asking whether product X allows trackbacks or not.

“Edupunk” gets us there — with its implication of technical accessibility, a DIY ethic, quick and dirty over grand design, and a suspicion of corporate appropriation, it hits a lot of the right notes.

The wrong notes it hits are mainly historical — because of course punk had surprisingly little social impact — and it’s worth remembering that the same attitudes that kept it pure relegated it to being a tribal phenomenon rather than a broad cultural movement. Punk culture valued its exile from the mainstream. We want to change the world.

That inevitably leads to a lot of compromises. But when we stray too far into the world of enterprise software, three-month timelines and eight-page budgets — when we have to concede the assessment system will likely be a centralized corporate affair — on days like that, maybe edupunk is the cassette we throw into the tape deck on the way home, the tape that reminds us loudly of who we are, in three chords or less.

It’s obviously late here … thoughts?

Why Halo 3 is more educational than “intellectual” games

I always *felt* like I should be a gamer. After all, I built educational games for a good portion of my career — first for children (reading readiness software), then for Columbia University and Cable & Wireless, where the name of the game was social simulations — choose-your-own-adventure style scenarios where you interacted with professional environments — and if you made the wrong decision you could bring your team/company/state/country down with you.

So I tried to play recreational computer games. I really tried. Since I like to solve puzzles I kept buying PC-based “adventure” games. And since I’m not a violent sort I steered away from the gore.

But every game I played seemed like the same game. And that game was “Try to figure out what the game designer thought an appropriate action would be in this context.”

I’m sure you know this game. It starts with you watching a film intro, and then some objective is voiced. Maybe you have to get to Room 306 or something. Maybe you have to find the crystal ionizer.

So you walk around a room, and the first steps come easy. Wow, there’s a note there! What does it say?

But then you try to exit the room for 30 minutes without success. Why won’t that door open? Am I at the wrong door?

And then the answer, stupid me, would be that the card under the coffee cup was actually a key card for the door. It goes (don’t you know) in the slot you saw on the floor on the other side of the room.

That’s thirty minutes of my life I’m not going to get back. And it’s thirty minutes of trying to guess what an “appropriate” solution is.

Worse, it’s thirty minutes of trying to figure out what an “appropriately creative” solution is. And that’s just maddening.

So I gave up on games for a while. Until one week I decided to borrow my brother’s XBox and see what the hullabaloo about Halo 3 was. And from the moment I started playing it, I realized I had it backwards on games.

Whatever your feeling about the subject matter, the battle games are the educational games. Why? Because as you run through scenarios dying repeatedly, you are forced to look at the thing, not from the perspective of WWGDD (What would game designers do?) but from the perspective of systems analysis. Have you chosen the correct weapons to make it through the hall? Would a short-range weapon with a bigger kick be more appropriate? Are you dying because you are trying to take out too many of the enemy before proceeding — or maybe you need to dash through *more* quickly? Is the risk of making the dash to the weapons cabinet worth the payoff here? What’s the optimum route through the level?

You have resources and potential paths. You can combine them in ways the game designer might not expect. There are multiple working paths to any achievement. You play co-op mode with others, and you develop team strategies (“You go this way with the gravity hammer and I’ll snipe with the 50 cal…”). And every time you die (which, if you are me, is a *lot*), you evaluate that crucial question Seth Godin refers to as the question of “The Dip”: Is my set of tactics sound, but requiring more polish in execution? Or is my approach fundamentally flawed?

And, again, you do this all by studying the way the system operates instead of playing a senseless game of WWGDD.

You may find the content disturbing. Personally, as silly as it may sound, I can’t play games where I’m shooting realistic humans in a current war. I have to shoot aliens, or people so far back in history that I’m removed from the geopolitical implications.

It’s an odd line, but somehow it works for me.

But strip away concerns about the violence, and the process of playing Halo or Gears of War is more educational, and will teach you more about analyzing problems, than any “intellectual” game on the market. There’s an honesty to these games, and within tight constraints, an emergent element. No, it’s not Spore or Civilization IV. And you can’t build your own weapons or design your own level (it turns out you can design your own level, see comments). You can’t mash up elements from other games into Halo.

But you can study a system that operates in a discoverable way, and develop an approach that makes the best use of tools and available cover. You can develop a strategy that it’s just possible no one has discovered before. That beats trying to figure out what cleverly hidden object you need to open a door any day of the week.

Made the jump to Linux

Vista on my budget laptop was the straw that broke the camel’s back. (Really, there must be a better and more current saying for that, but I’m pre-coffee today.)

I was losing an untold amount of time clicking links and x’s and waiting for Vista to do something (anything!). I was watching as Vista crashed — loading Flash. And most disturbingly I was spending a half hour a night shutting down my system, navigating a complex maze of hung processes and nonsense prompts (and yes, even “shutdown -f” did not solve it). I ended up leaving my laptop on all night, because I simply did not have time to shut it down. This, of course, further exacerbated the slowness.

So I tried to downgrade to XP, but it was a disaster. I mean, it didn’t even recognize my CD drive. I actually installed XP onto the laptop from the CD, and once it was installed, it didn’t recognize the very drive I had installed it from.

I know I was installing a 2002 version of XP onto a 2007 laptop. But still, that’s capital-C crazy.

Plus, to get the pleasure of a broken installation I had to run around the house for four hours trying to find the product key for the legal disk I bought four years ago. And I was not a pleasant person during those four hours.

Things like that can really damage a marriage. So I finally decided, since the XP installer (contrary to my instructions) had already cleared my hard drive, to try and install Linux. I’d tried this before on a laptop four years ago. I steeled myself for the worst.

I got Mandriva, because my research showed its installation was the best out-of-the-box fit with my hardware. I popped it in.


And in 30 minutes my system was working.

And not just working — flying. I mean, I’m writing this post and you know what I did? I started my laptop (up in 60 seconds). I clicked the Firefox icon, and Firefox started in under 5 seconds. I sent it to my WordPress page — instantaneous!

That’s less than 90 seconds to get here and start doing something productive. With Vista, given my laptop’s specs, I couldn’t get here in under 10 minutes.

It’s hard to fully explain what that means to a person who steals what moments they can to blog — but I think some of my readers will get it.

The fact I waited so long to do this can only mean I was suffering from some sort of battered OS user syndrome. Life with my new OS isn’t perfect — the wifi card ain’t working, so I may have to get something external. But I’m spending so much more time doing things rather than watching my system think. I’m loving it.

More as this develops…

Ning Death Syndrome (a.k.a. the Dead Shark Problem)

I’m a big fan of Ning, and lately I’ve been gearing up to launch an Alumni site in it. The first email invites will go out tomorrow.

Well, not exactly the first invites. And therein lies a story.

See, before I launched this, I tried a little experiment and invited a few of my alumni friends to a prototype site. The site grew by leaps and bounds until it reached 31 members, most of them not invited by me. Many invited by people not invited by me.

There were postings, reconnections, forums. For that period of time people were addicted, clearly stopping by the site obsessively. From February 13 to March 20, it was *the* place to be.

Then suddenly — not so much. I mean *really* not so much. Everybody disappeared, almost overnight.

There are a number of reasons, I think. One is that the initial activity was heavily about reconnecting, and once new people stopped coming in, the site died. Another is that at thirty-one members, the site was just too small. The people who post the majority of content in things like these seem to number about one or two in a hundred — at 31 people, the flow of content was too unstable. (At Blue Hampshire we got well over a thousand members, and 600 readers a day, but the site is still dependent on 12 or so regulars who post.)

I also think that a lot of the time these things are set up to produce explosive activity at launch, and once the dust settles, even at a large scale, you’re left with your regulars. So some amount of contraction is expected.

Still, I can’t help thinking of that Annie Hall quote about the shark (first 10 seconds of this trailer):

Do online social networks have to keep moving forward or die? It’s definitely something we’ll be looking at as we launch the alumni site. There’s nothing more unattractive than a dead shark.

If a Columnist Calls a Tail a Leg…

There was yet another Andrew Keen-inspired article last week bemoaning the age of “wikiality” — an age of the supposed gullibility of us internet sorts. It begins with shocking news — people are getting quotes wrong, and Web 2.0 is at fault:

Truth: Can You Handle It?
Better Yet: Do You Know It When You See It?
By Monica Hesse
Washington Post Staff Writer
Sunday, April 27, 2008; Page M01

How many legs does a dog have, if you call the tail a leg? Four.
Calling a tail a leg doesn’t make it a leg.

Abraham Lincoln *

[*Note: Lincoln never said this. He liked a similar, more long-winded anecdote about a cow, but the dog version? Nope. Still, the quote is credited to Abe on some 11,000 different Web pages, including quote resources Brainy Quote and World of Quotes.

Though not technically “true,” the quote makes a nice start to this article about truth, being topical and brief, so if we want to go with truth-by-consensus (very popular now), we can go ahead and just say that he said it.]

Hesse then explains the crisis:

Andrew Keen describes it as “the cult of the amateur” in his same-named book. Stephen Colbert called it “wikiality” — meaning, “a reality where, if enough people agree with a notion, it must be true.”

Information specialists call it the death of information literacy.

What’s really amusing about the Hesse article is that her initial example – the Lincoln quote – is a case where the web was more correct — and the web could have shown her that. The web has well-known conventions for dealing with authority and making truth more verifiable, and when these conventions are embraced rather than rejected, one gets better results.

Follow along while we compare what it takes to verify truth on the web, and what it takes in the “old world”…

Score one for wiki-world

Hesse seems to be claiming that the web (and its tendency to magnify casual opinion over scholarship) was responsible for this quote being wrong. But was the quote actually wrong? That seemed an important point — and nothing in the article seemed to prove the “Brainy Quote” version false — nothing, that is, beyond her simple assertion.

I decided to use the web to find older, more authoritative references to the “false” quote. It was easy once I realized that I should include the phrase “said Lincoln” to filter out simple non-contextualized quotes, such as one finds in quote lists. In fact, once I figured that out, an extremely early instance was on the first page of results [Note: my posting this article appears to have altered that result set]. It appears in a work called Lincoln’s Own Stories published in 1912:

Once when a deputation visited him and urged emancipation before he was ready, he argued that he could not enforce it, and, to illustrate, asked them: “How many legs will a sheep have if you call the tail a leg?” They answered, “Five.” “You are mistaken,” said Lincoln, “for calling a tail a leg don’t make it so”; and that exhibited the fallacy of their position more than twenty syllogisms.

It took less than fifteen minutes to prove the Hesse article wrong: far from being a false product of the wild web, the quote has an extremely good provenance. There’s the small matter of it mentioning a sheep rather than a dog, but it matches the “wiki” quotes far better than the “long-winded anecdote” about a cow that Hesse favors.

Incidentally, the web can even show you how the “sheep” may have become a “dog”: Christopher Morley uses the modified Lincoln quote in Parnassus on Wheels in 1917, citing a dog. Wikipedia shows us he was an editor of several editions of Bartlett’s Quotations, which probably explains why the quote appears in his editions of Bartlett’s in the dog variation (no full text online, but see cites here).

That doesn’t seem to me a problem of authority. And it certainly has nothing to do with Web 2.0.

Score zero for the world of “authority”

Then, I decided to try it the other way round — could I prove the Hesse version of the Lincoln quote was from an even more trustworthy source?

Here’s where it gets ridiculous — the article that is bemoaning that people simply believe what they read provides no source for its version of the quote. So whereas you, the reader of this blog, can click the link “Lincoln’s Own Stories” to verify my assertion, to verify something in traditional media requires launching a federal investigation.

To try to find the source for her quote, I started from the fact that it involved a cow and probably contained the core phrase “calling a tail a leg”. Google Web search turned up nothing of use. Google Scholar turned up nothing, and neither did Google Book Search. Figuring the author probably read this in a book (or saw it in a documentary), I tried Amazon’s full-text search. Bingo.

The keywords I had chosen occurred in the biography “Lincoln” by David Herbert Donald. But the book was not providing a useful context snippet to Amazon. So I went down to the library, and got the book out. I looked up “emancipation” in the index — far too many pages listed. Looked up cow, and of course found nothing. Ugh.

My lunch break was slipping away. In a moment of insight, I went to the terminal in the library and pulled up Amazon.com. I did “search inside the book” again. While the snippet didn’t appear, it gave me the page number: page 396. I turned to the page — Aha! There was the source of the Hesse version. It talked of a long-winded anecdote about a Western case involving a cow.

Which raises the question: why do the defenders of “truth” want to make it so hard to verify their sources?

I won’t belabor this much longer. The source of the quote in the Lincoln biography is an obscure quarterly from 1950, the nearest available copy of which is in Worcester, about an hour and a half away. I thought of getting the article through Interlibrary Loan, but realized from the title “A Conference with Abraham Lincoln: From the Diary of Nathan Brown” that even if I got the journal, the article relied on a diary that would not be accessible to me.

So the Hesse version appears to be based on a single, non-primary source which references a journal article the author didn’t read, and the journal article references a diary that neither the author of the WaPo article nor the author of the biography has ever seen.

It’s a big circle of trust, none of it linkable. And yet the web people, who insist on verifiable, linked sources, are somehow the intellectually sloppy ones.

A final check

Still, given my source was from 1912, and the unverifiable source was likely contemporary with Lincoln, I could only prove that the quote being bemoaned as a product of “wikiality” had a good history and was more verifiable. I couldn’t prove that it was more likely to be accurate. So I called in a favor. I used to be a search interface programmer for the amazing Readex “Early American Newspapers” project, the project to create a searchable full-text database of this nation’s periodicals from pre-revolutionary times until 1876. So I emailed a person I know who still programs there. I asked them if they could punch “Lincoln” and “calling a tail a leg” into the product and send me back the first results.

Sixty seconds later I had my answer — Web: 1, Books: 0.

What Lincoln said to the party visiting him — well, it was reported in the Chicago Tribune at the time.

And it’s not a “long-winded anecdote about a cow”, but rather, it’s much closer to that quote that appears in all those crazy wikis.

Headline: Lincoln’s Own Construction of His Proclamation;
Article Type:News/Opinion
Paper: Macon Telegraph, published as Macon Daily Telegraph;
Date: 10-23-1862; Issue: 841; Page: [3];

LINCOLN’S OWN CONSTRUCTION OF HIS PROCLAMATION — A little while anterior to Lincoln’s interview with the clerical committee (says the Chicago Tribune) a couple of other abolition fanatics found their way to the President and pressed upon him the emancipation scheme, and this was his reply: You remember the slave who asked his master — if I should call a sheep’s tail a leg, how many legs would it have? “Five.” “No, only four, for my calling a tail a leg would not make it so.”

(Incidentally, the Readex Collection of Early American Newspapers is the most exciting thing going on in historical databases today — if your institution doesn’t have a license to it, you’re not serious about American History. Go check it out…)

I realize this is a Macon paper (hardly a disinterested party) quoting the Chicago Tribune (as was the custom in early papers). But there are plenty of other hits from other papers in the list as well — I’m staying on the clear side of fair use here, but they are there to be discovered by any user of Readex.

Suffice it to say, however, that the quote, and Hesse’s problem with it, are far more telling than she anticipated.

The subtitle of her article asks if you’ll know truth “when you see it”.

It’s a good question, but Hesse has the battery wired backwards.

The answer, from any web literate scholar, is if you make it easy for me to check it, maybe I will know it when I see it. The web does that in spades, which allows us, ironically, to repair the errors that the Washington Post generates.