Like many people, I watched with trepidation as Google demoed its new Android system AI calling up a hair stylist and making an appointment: was it ethical not to disclose that the caller was an AI?
But now that the smoke has cleared, I'm realizing something a bit more disturbing. After years of Big Data and personal-analytics hype, the advance Google demonstrated is an application of 1970s AI work that requires none of that.
Setting up a haircut appointment is a social script. It has a sequence of things that happen, usually in a predictable order. Recognizing the importance of social scripts to the computational understanding of communication was a big part of what Schank and Abelson brought to AI in the 1970s.
Scripts were important both for computers navigating standard social situations and for understanding stories about those situations. When I studied linguistics, one of my favorite little facts was that you could often discover socially legible scripts by noticing how stories were elided. For instance, if I say “So I go to a restaurant, and the server gives me the bill…” no one stops me and says “Wait, you got a bill before you ate anything? And who is this server person?” The understanding in storytelling is that I can evoke a script and then start at the part of the story that deviates from the script. That’s how core they are to our thinking and discourse, and Schank and Abelson made the case in the 1970s that mapping out these scripts would be core to computer understanding as well.
While less physical than dining, booking a haircut over the phone is a script too. It follows a particular sequence and has slots where the unique bits go. In general, we establish whether I need a particular stylist, then drill down to a date and time. Importantly, it works because I’ve learned the script: I know the things the hair stylist will ask, and I have the answers the stylist requires. I know I need to provide a date, a time, and a stylist, and I might need to supply a rough time-of-day preference: mornings, afternoons, end of day, before work. On the other hand, I know the stylist is not going to ask whether I’d rather have a chair nearer the window or the bathroom, or what type of music I prefer in the salon.
Here’s the thing: the very nature of social scripts is that they allow people with no knowledge of one another to negotiate transactions successfully. Preferences figure into that, but each party can usually enumerate them easily, because that’s part of the script.
Because of this, I don’t really need personal analytics to discover that I like my cappuccinos extra dry. I have years of experience walking through scripts where I’ve learned to specify that, and the script has a very specific spot where that goes. The script has taught me how to concisely enumerate my preferences in ways useful to baristas.
In fact, analytics in these situations end up being a lesser reflection of the explicit inputs into the script. For example, Google might search my flight-booking data and find that I like window seats towards the front, that I prefer Alaska Airlines, and that I like layovers with a bit of buffer in them. But the patterns in the flights I end up with aren’t some mysterious secret sauce discovered by analytics; they’re the product of me specifically asking for nine things when I book flights. Nine things I can easily rattle off, because I’ve been doing the “booking a flight” script for years.
So here’s the question about the “haircut” demo: if the nature of a social script is that you *don’t* need deep knowledge or background for it to work, then what is all the talk about personal data being Google’s prime AI asset? What’s all the machine-learning hype?
After years of sucking up all our data, Google’s big AI advance is… Script Theory. Which requires none of it. Maybe we should be talking about that.