Yesterday-ish, from Justin Reich:
I was also somewhat surprised to learn that in many systems, it is actually quite difficult to get a raw dump of all of the data from a student or class. Many systems don’t have an easy “export to .csv file” option that would let teachers or administrators play around on their own. That’s a terrible omission that most systems could fix quickly.
A couple of years ago, working on an LMS evaluation, I kept getting asked what reporting features each potential platform had. Can this platform generate type-of-report-X? About eight years ago, working on an ePortfolio evaluation, the same question came up: where are the reports? Does this have report Y?
I’d always point out that we didn’t want reports, we wanted data exports and data APIs that allowed us to generate our own reports, reports that we could change as we developed new questions and theories, or launched new initiatives in need of tracking. The data solutions most likely to have real impact (with no offense to Reich’s Law of Doing Stuff) are the ones that come from grassroots tinkering. Data that is exportable in common formats can be processed with common tools, and solutions built in those common tools can be broadly shared. CSV-based reports developed and adopted by Framingham State can be adopted by Keene State or WSU overnight. A solution one of your physics faculty develops can be quickly applied across all entry-level courses.
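To make the "grassroots tinkering" point concrete, here is a minimal sketch of the kind of homegrown report a raw export makes possible. It assumes a hypothetical gradebook CSV with columns student_id, assignment, and score; none of those names come from any particular platform, and a real export would need its own column mapping.

```python
# Minimal sketch: per-assignment averages from a hypothetical LMS CSV export.
# Assumed columns: student_id, assignment, score (illustrative only).
import csv
from collections import defaultdict

def assignment_averages(path):
    """Return {assignment: average score} computed from a CSV export."""
    totals = defaultdict(float)
    counts = defaultdict(int)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            totals[row["assignment"]] += float(row["score"])
            counts[row["assignment"]] += 1
    return {name: totals[name] / counts[name] for name in totals}

if __name__ == "__main__":
    # "gradebook_export.csv" is a stand-in for whatever the platform lets you download.
    for assignment, avg in sorted(assignment_averages("gradebook_export.csv").items()):
        print(f"{assignment}: {avg:.1f}")
```

The point isn’t this particular report; it’s that a twenty-line script against a plain CSV can answer a question no canned report anticipated, and it can be handed to another campus as easily as the file itself.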
What you want is not “reports” but sensible, easy, and relatively unfettered access to data. And if you don’t have someone on your campus who can make sense of such data, then you need to either hire that person or give up on the idea that a canned set of reports is going to help you. When fields are mature, canned and polished reigns. But when they are nascent (as the field of analytics is), hackability is a necessity.