This post is a part of the GroupLens Iron Blogging effort, so take that for what you will.
OS X 10.12 is rumored to have Siri integrated into it. That’s exciting to me.
The natural language piece of Siri is phenomenal, and it’s something I’ve been wanting on my Mac for a long time. On my phone I use it for reminders, I use it for lists, I use it to turn on music, and I use it for timers while cooking. I want this for my computer. I want to be able to instruct my computer to ‘remind me about this at 4pm’.
Except, I work in an office, at a desk, with a lot of other people. Maybe that’s uncommon, but probably not (what with the Cool Open Office Spaces that exist in many companies). What I don’t want is for voice to be the only way to interact with this language-understanding assistant.
As HCI and social computing push into machine-learning-backed assistants, and these assistants begin to permeate our lives, it feels important to figure out how they fit into our normal social interactions. More concretely, it feels important to make sure these assistants aren’t breaking social norms like “talk to people, not objects”.
Consider this a call for introverted AI assistants, who really would prefer you not call attention to them unless the social situation allows for it.