Loosely coupled assessment

Here’s the thing: it’s 2000 all over again. Eportfolio is the new LMS.

Watching a recent vendor presentation, I thought “I can’t believe this is happening again.”

That single phrase. In a loop. In my head.

Because remember — this happened once before. The LMS vendors came in with an assessment and management tool, and told us it was an elearning solution. At the time, I was on the other side of the equation, with a company trying to sell award-winning goal-based scenario software to colleges who were saying, “But we already HAVE an elearning solution. It’s called Blackboard. Or WebCT. Or whatever.”

And so Blackboard, an assessment and management tool, determined the pedagogy of colleges for eight or so years. Because teachers wanted to import rosters, we put students in a closed box and told them it was elearning.

When it wasn’t. The truth is the kids were doing more elearning on MySpace than in Blackboard.

How do we avoid it again? How do we avoid imposing something that is just pedagogically WRONG on a new set of students because we need to meet some institutional assessment needs?

There’s only one way — loosely coupled assessment.

If we are going to talk assessment, we are going to have to segregate it. Your assessment tool should ONLY assess.

We don’t need to talk more about student needs with vendors that supply assessment tools. We need to talk to them less about student needs. It’s not their business.

Literally: it is not their business.

In fact, we should remove student needs entirely from the equation.

The students know they can get far better solutions to their problems for free elsewhere. They don’t need an eportfolio system to post their resumes on.

So enough of letting assessment vendors tell us what facilities we will be forced to use in their walled garden, and expecting us to be excited about it. Enough with assessment vendors selling us “environments”. What we should be doing is describing the environment that might exist — students using WordPress, Blogger, S3, GDrive, email, messaging, etc. And then we should ask if they have a tool that can evaluate that. How will their tool interface with the learning environment we’ve constructed?
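To make that concrete, here’s a minimal sketch of what “a tool that can evaluate that” might look like from our side. To be clear, none of this is a real vendor API; the feed URLs, the toy rubric, and the feedparser dependency are stand-ins I’ve made up for illustration. The only point is that the assessment layer reads the feeds students already publish, applies whatever rubric the instructor hands it, and does nothing else.

```python
# Sketch of loosely coupled assessment: the tool ONLY assesses.
# It pulls student work from feeds the students already publish
# (WordPress, Blogger, wherever) and applies an instructor-supplied rubric.
# Feed URLs and the rubric below are hypothetical examples.

import feedparser  # third-party: pip install feedparser

# Hypothetical roster: each student registers the feed of whatever tool they use.
STUDENT_FEEDS = {
    "alice": "https://alice.example.com/feed/",                   # WordPress
    "bob": "https://bob-notes.blogspot.com/feeds/posts/default",  # Blogger
}


def collect_work(feeds):
    """Pull recent posts from each student's own space -- no walled garden."""
    work = {}
    for student, url in feeds.items():
        parsed = feedparser.parse(url)
        work[student] = [
            {
                "title": entry.get("title", ""),
                "link": entry.get("link", ""),
                "text": entry.get("summary", ""),
            }
            for entry in parsed.entries
        ]
    return work


def assess(work, rubric):
    """Apply the rubric to each piece of work; nothing more."""
    return {
        student: [rubric(item) for item in items]
        for student, items in work.items()
    }


if __name__ == "__main__":
    # Toy rubric: did the student post something substantial this week?
    substantial = lambda item: len(item["text"].split()) >= 250
    print(assess(collect_work(STUDENT_FEEDS), substantial))
```

Swap the toy rubric for a real one and the point stands: the assessment piece never dictates where or how the students actually work.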

Anything else is insanity.

EPortaro demonstration today (any help?)

I’m attending an EPortaro demonstration in about 50 minutes.

If you read this blog at all, you can probably guess what I think about such eportfolio solutions. It’s 2000 all over again, with vendors coming in to save us from the big, bad internet.

Still, my opinion is probably the minority one on my committee, so there’s a good chance we are going to pick SOME eportfolio vendor, despite my efforts.

If you know anything good or bad about EPortaro in particular, please email me at mcaulfield at keene edu. Or leave a comment.

Thanks.

Where will the wave come from?

I love talking theory, but it’s even nicer to see practical notes from people implementing solutions. From a recent post over here, some WordPress MU as class-space experimentation:

Teachers are finding WordPress MU easy to use and I’m very happy to see that. Currently, Teacher Assistants are recording students as they read their writings in class using Audacity. We are using inexpensive mics with noise canceling, and I have to say, I’m impressed with how well they work. It’s not easy to cut out the ambient noise in a working first grade classroom.

That’s right. A first grade classroom.

I’ve been a frequent critic of primary and secondary education, and that’s unlikely to stop. But I’ve been impressed in the past year with how much faster things seem to be moving down there than up at the university level.

It’s not just scattered notes like the one above. The percentage of thought leaders in the Learning 2.0 space who are focused on K-12 is extraordinary.

Why? One would think that if you can run a blog and a wiki with first graders, surely this should be cake in a university classroom.

More as a way to start this conversation, here are a few hypotheses:

1. K-12 (and particularly K-6) does not have the subject problem — there is no issue that writing belongs in one discipline, video in another, and history or math is separate from each. Holistic approaches aren’t thwarted by an org-chart that divvies up the student.

2. K-12 is behind on the LMS wave, and having not been infiltrated by LMS vendors, they are more able to think out of the box, rather than in terms of what new LMS modules are available.

3. There are just more teachers than university professors, which creates the critical mass needed to get a movement going.

4. They don’t have a developed IT department or large IT budget — and hence are able to experiment more with an ad-hoc bricolage of tools, especially free ones: i.e. technology decisions are not treated as budget decisions.

Those ideas are all possibly wrong — but I’d love to hear other takes on this phenomenon. Unless higher education gets its act together, it is quite likely the college freshmen of tomorrow will be entering a far LESS enlightened tech environment than the one at the high school from which they came.

Electronic Textbooks and CommentPress

Via bavatuesdays, I learn of CommentPress.

Obviously there are other non-WP group annotation tools. What’s really striking to me here, however, is how powerful the fit is between the CommentPress approach to text and the best bits of traditional literary exegesis.

So great is the fit, as a matter of fact, that I half wonder if CommentPress could become the first step toward faculty blogging — rather than the other way around…

Goal-based scenario/simulation vs. learning 2.0

The most invigorating job I ever had was working for CognitiveArts programming learning “simulations”. Founded by Roger Schank, CogArts was truly a company with a mission — to revolutionize education through technology rather than simply extend the current system. And we pushed the envelope in every way we could. I worked with a large team of programmers whose goal was to make the ultimate Choose-your-own-adventure multimedia learning experiences.

The core idea was simple: people learn by doing, so learning should simulate doing in a low-risk environment. Schank’s favorite talking point was this: “Which would you rather your airplane pilot have — 90 hours in the flight simulator, or 90 hours of book study?”

Simulations would generally lead a person through a “goal-based scenario”: perhaps as a Governor’s economic advisor they had to make decisions for a hurricane torn state on things like price controls and rationing, and observe the effects of the action. Perhaps they had to negotiate a house price as part of Harvard Business School Publishing’s Negotiation class.

The key to the system was failure-based learning paired with just-in-time instruction. Students would be encouraged to develop expectations about what would happen as a result of their actions. When they failed, they would be provided with context-sensitive instruction, and encouraged to try again. A number of studies had shown that by providing the bulk of the instruction after failure, you could get significantly higher retention of information.
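For the curious, here is roughly what that loop looks like in miniature. This is my own after-the-fact illustration, not the actual CogArts engine: the scenario content is invented, and the point is only the control flow, in which instruction stays out of the way until the learner’s expectation breaks, then arrives tied to the specific failure and is followed by a retry.

```python
# Illustrative sketch of failure-based learning with just-in-time instruction.
# The scenario content is made up; only the fail-then-teach loop matters.

def run_scenario(decision_points, get_choice):
    """Walk a learner through a goal-based scenario, teaching on failure."""
    for point in decision_points:
        while True:
            choice = get_choice(point["prompt"], point["options"])
            if choice == point["correct"]:
                break  # expectation met; move on with minimal interruption
            # Failure is where the teaching happens: deliver instruction
            # tied to this exact mistake, then let the learner try again.
            print(point["instruction"].get(
                choice, "That isn't one of the options. Try again."))


if __name__ == "__main__":
    scenario = [{
        "prompt": ("A hurricane has hit and gas is scarce. As the governor's "
                   "economic advisor, what do you recommend?"),
        "options": ["price_controls", "let_prices_rise"],
        "correct": "let_prices_rise",
        "instruction": {
            "price_controls": ("Capped prices keep demand high and give "
                               "suppliers no reason to rush gas in. Watch "
                               "the lines form, then try again."),
        },
    }]
    run_scenario(scenario, lambda prompt, options: input(f"{prompt} {options} > "))
```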

The system was later copied (often poorly) by other corporate training companies, and is now a pretty standard offering of most custom elearning vendors (although I would argue that the desire of many vendors to push such modules into a one-size-fits-all assessment harness profoundly degraded the experience — at CogArts we built an LMS that was precisely tailored to the needs of our scenarios).

This autodidactic gaming approach to elearning seems miles away from the PLE and the Inverted LMS (I still haven’t quite resolved if those are the same thing yet — please excuse my transitional use of both terms). The Inverted LMS is inherently social and collaborative; the CogArts model was solitary and self-taught. Indeed, if there was one flaw with what we did at Cognitive Arts, it was probably that in the move from CD-based non-networked learning to web-based instruction we were not radical enough in our rethinking of the social element of education.

Despite that, I’d argue that simulations are very close to the PLE/Inverted LMS in theory. Why?

Because both focus on learning by doing. Where real-life failure carries high risk, simulations make a lot of sense. And where the definition of success in a field or task is very narrowly defined, simulations shine. The flight simulator, one of the first computer applications ever built, still remains the model here.

But the web has introduced us to plenty of low-risk ways to engage in disciplines. And that’s where the new approach comes in.

An example? At CogArts, one of the apps I admired most was the “Is it a Rembrandt?” simulation, which provided students with detailed pictures that could be faked paintings or undiscovered Rembrandts. The students, through learning about Rembrandt’s style, had to make the call. Experts were there to give them the just in time instruction should they fail — explaining this or that about brush strokes or subject matter.

I’d still pay good money to use that sim — I think it remains a wonderful way to learn, and one that appeals to our gaming culture. Put software like that in a current high school, and you’re going to blow the doors off education. In a good way.

But what is striking nowadays with the web is how it supplies plenty of real low-risk problems for students to engage in. The Rembrandt simulation was built during a mid-90s rash of discoveries that certain Rembrandts were fakes. Ten years later if such a thing happened, there’d be a good chance you could get hi-res photos of detail from the fakes, if you asked nicely.

So what happens then? You gather your students, you put up a wiki and a series of student blogs, you roll your sleeves up, and you get your class analyzing the paintings. Google becomes your just-in-time learning application, which is cool, because that’s what your JIT solution will end up being in real life. Success or failure is determined, as in life, somewhat fuzzily, by the reaction of the experts: if you can get them to engage with your work at all, that’s a high level of success; if they actually start agreeing with you or noting things as valuable insight, even better.

I miss both producing and playing with the Schank software, just because of how much fun it was, and if I could buy those titles shrink-wrapped from the local Staples today, I’d spend my own money to buy a title a week. Heck, I may go home tonight and play the Cable & Wireless simulation, which I still have a disc of somewhere. In a perfect world the government would fund more of these sorts of simulations.

But the brilliance of the internet is how much it matches, for a certain subset of problems, the perfect learning environment CogArts was simulating in its courseware. As with the simulations, on the internet you can try out ideas without much risk, you can get information from Google on a just-in-time basis, and you can talk to experts about the validity of your decisions. And, yes, it’s a lot fuzzier, and I certainly don’t want my pilot to have put in 90 hours of BLOGGING, but for certain types of learning (and possibly for most learning), it’s a preferred method of engagement.

You’re already out there

This just in:

July 25, 2007 (Computerworld) — Millions of documents, both government and private, containing sensitive and sometimes classified information are floating about freely on file sharing networks after being inadvertently exposed by individuals downloading P2P software on systems that held the data, members of a House committee were told yesterday.

Among the documents exposed: The Pentagon’s entire secret backbone network infrastructure diagram, complete with IP addresses and password change scripts; contractor data on radio frequency manipulation to beat Improvised Explosive Devices (IED) in Iraq; physical terrorism threat assessments for three major U.S cities; information on five separate Department of Defense information security system audits.

I’ve been wondering for a while whether Google Docs would be a good choice for the college’s Content Management System. It’s a wonderfully simple way to share docs, it’s cross-platform, remotely accessible, and free, as in beer.

One of the objections I’ve heard while kicking this idea around is that it might be dangerous to have our documents “out there”, in the Wild, Wild Web.

I’d actually argue the reverse — anything on an employee’s machine is already out there: in our P2P world, the web is us.

Google Docs (or something similar) actually has the advantage that it dissuades users from downloading materials to their hard drives, the major source of leaked materials. Combined with the right cache settings and some SSL action, it could be, I think, a far more secure environment than what we have now.

The one catch? We need the SSL; Google Docs seems not to have it. But perhaps it’s coming?

In Which I Meet Our (Other) Allies

So, I’ve just stumbled into a gold mine. Via an inbound link from Stephen Downes, I’ve discovered that much of what I’ve been calling an inverted LMS has been called elsewhere a PLE (personal learning environment):

Helen Barrett receives an email from Mike Caulfield describing an Inverted LMS, which turns out to be the PLE, independently discovered. More here. She also gets a note from a graduate student, who writes, “I’m trending towards the view that the system we will end up with will use RSS to expose content, tags to organize it, and open ID to selectively share content with certain people.” Yes, as people look at the potential of online technology, they begin reaching similar conclusions. Independently, autonomously.

And it’s true! There is much overlap. But just as I’m about to object that the Inverted LMS goes further than the PLE, I find this post via connections to Downes: Leigh Blackall’s Die LMS Die! You Too PLE! And stuff like this warms the cockles of my heart. All the cockles. Every single one:

Question to the PLE: Why do we need a PLE when we already have the Internet? The Internet is my PLE, ePortfolio, VLE what ever. Thanks to blogger, bloglines, flickr, delicious, wikispaces, ourmedia, creative commons, and what ever comes next in this new Internet age, I have a strong online ID and very extensive and personalised learning environment. Actually I think the PLE idea is better envisioned by the futurist concept known as the Evolving Personalised Information Construct (EPIC). I think we already have EPIC, so why do we need the PLE?

OK — apart from the fact that this was written over a year and a half before, and that it spells personalized with an “s” — isn’t it really Enterprise Learning Systems Considered Harmful to Learning?

The gift keeps on giving: there’s apparently a del.icio.us tag for PLEs. I know because my article was tagged by someone under it. And among those articles are ones that deal with these questions of how loose the PLE should be, à la Blackall.

(Why so few American representatives, I wonder? It’s all Canada, England, New Zealand, and Australia…)

I don’t think any of these ideas are new, really; it’s more that they’ve been refined during the long dark reign of the LMS. Looking at the network of people I’ve stumbled into I can see that they’ve been pushing these ideas outside the mainstream for some time too.

But I can’t help but feel that something is starting to happen here, when so many unrelated people are coming to the same conclusion. The very power of blogs to do what we see here — to organize people and refine ideas, to propel thought forward, to get things done — is what has revealed the LMS model to be such a cruel joke. So it’s not surprising, perhaps, that as blogging becomes ubiquitous these ideas, once considered digital utopianism, now can be expressed in very real and practical terms.

And even where the ideas are old, they now relate to a trailing-edge frame of reference — or soon will.

Over the next couple of days I’ll sift through my newly found goodies, and share what I find. I have a feeling it will be pretty extraordinary.

It’s not just Experts vs. Amateurs. It’s Experts vs. Experts in Something Else.

So there’s not much subtlety in a recent comment on Jon Udell’s call to experiment with local weather data and look for trends. After reading Jon’s piece on using Many Eyes to determine local trends, Brendan Lane Larson, a Weather Informaticist, writes:

Your vague “we” combined with the demonstration of the Many Eyes site trivializes the process of evidence exploration and collaborative interpretation (community of practice? peer review?) with an American 1960s hippy-like grandiose dream of democratization of visualized data that doesn’t need to be democratized in the first place.

You may recognize this meme — it’s the experts vs. the amateurs story that’s being peddled by Andrew Keen and others. And like most people involved in Web 2.0 endeavors, I have some sympathy for that view, but despise the frame.

And reading Jon’s article and looking at that bar graph, one thing occurred to me that we consistently fail to mention in this discussion.

It’s not just Experts vs. Amateurs. It’s very often Experts vs. Experts in Something Else.

What do I mean? Look at this bar graph:

It’s temperature in December. And I know nothing about weather patterns.

But of course it doesn’t look like a temperature graph to me. It looks like the crazy graphs of server requests and thread lengths I’d pore over daily about two years ago when the server was crashing every couple of hours.

That’s probably not helpful insight in this instance, for a couple of reasons. First of all, I’m far from expert in analyzing server and network traffic. Second of all, temperature most likely doesn’t operate by rules similar to those of network traffic.

But what if I were an expert in analyzing network traffic? And what if, just possibly, I had models and methods unknown to the climatology crowd that could be helpful? Or what if I were a sociologist, or an options trader, or in any of a thousand other professions that have developed ways to crunch data?

Could we admit that maybe, just maybe, experts in these other fields might bring a breakthrough to the table?

Think I’m wrong? The Wright Brothers were bicycle builders. The pair that cracked the structure of DNA weren’t biologists or chemists, at least not originally: they were an ex-physicist and a former ornithology student. Douglas Engelbart, inventor of the computer mouse and a pioneer of hypertext (among other things), was heavily influenced by the work of linguist Benjamin Lee Whorf. Mandelbrot revolutionized market analysis without any economics background.

The list goes on. The world is not quite as bottled up into climatologists and computer scientists as one might like to think. Certainly we must establish distinctions when it comes to authority, and we must encourage peer-review and cultivate healthy communities of practice. But if we do all that while maintaining our xenophobia, we are ignoring an important lesson. One that history has taught us again and again.

Send Bloggers

One of the absolutely consistent features of website development (at least in my neck of the woods) is that storytelling problems are miscast as technology platform problems.

Here’s a typical example. I’m currently working with a department to move them to a third party vendor, and in demonstrating a sample site one of the possible vendors shows a Hall of Fame gallery that features a video of an interview with a Hall of Fame coach. And it’s cut together nicely, with some original footage and Ken Burns style photo scans.

Cue oohs and ahhs. This is something they’ve wanted to do for some time.

But of course, the vendor didn’t produce the video. They merely put it up, something we could do on the current site if we had the video.

We don’t have a technology gap; we have a skills gap. We tell stories about our Hall of Fame coaches already. But we don’t have the internal capability to tell them in the way people now wish to hear them.

What’s changed from several years ago is that where we once needed writers, we now need full-service media producers, and where we once could control the spin of a story, we now need to lay it out there a little more naked. It’s not enough to simply hire writers anymore. You need to hire a writer who can put together the sports video, take the photographs, and post it all up on the web. Someone who knows how to both get the who-what-when-where-why and do a simple video edit. Someone who gets the new culture of transparency and can write in the new web idiom.

In short, although I suppose every person in that vendor presentation would be shocked to hear it, you need to hire bloggers. For in the end, what would make more of a difference? Presenting the sports site on the new vendor platform? Or hiring a sports blogger instead of the average intern?

Which would build more of a following and have a greater effect on recruitment? Generate more alumni excitement? Have a greater bottom line effect?

My guess is that in the majority of cases, if you pit advanced technology against a compelling blogger, the blogger will win. And organizations that begin to hire with that in mind will benefit.

Marc Andreessen Supports the Inverted LMS (sort of)

This is fascinating, to me at least. Marc (are we allowed to call him Marca?) came late to blogging, but he’s clearly making up for lost time and talking to the right people.

But what I noted in his recent post was how much his view of the larger web (via Sifry) matches exactly what we’ve been talking about over here vis-a-vis the Inverted LMS (or really the Inverted CMS idea applied to education). Marc writes:

The first time I met Dave Sifry, over three years ago, he told me that conversations on the Internet would eventually all revolve around every individual having a blog, each individual posting her own thoughts on her own blog, and blogs cross-linking through mechanisms like trackbacks and blog search engines (such as Dave’s Technorati).

The advantage of this new world, said Dave, is that each individual (anonymous or not) would be publicly responsible for their own content and in charge of their own space — substantially reducing the risk of spam and trolls — and the communication would flow through the links. There would still be the risk of link spam, but at least this new world would make people more responsible for their own content, and that would tend to uplevel the discourse.

I think Dave is exactly right, and the implications of this new world are very interesting.

The rest of the post is worth reading too — it’s more of a head-nodder, mostly reiterating stuff that ALL bloggers learn very quickly, but it’s great to have it all in one place. And it has the neat advantage that you can send it to the non-believers with a note that says “From the guy that co-founded Netscape.”

I’m saying, it doesn’t hurt.