I’m currently reviewing an O’Reilly manuscript about “Delightful Intelligence”, which discusses how to design AI systems that appeal to human emotions.

In the course of this review, I came across a passage that describes AI systems as tools, while a subsequent passage contrasts this with the observation that they’re more like “companions”.

It just occurred to me that this is a key facet of why we’re having trouble coming to grips with AI technology.

On one hand, these systems are not people. They’re just not. They are tools. They are machines, they are silicon, we made them. They may behave in some ways like humans, but that doesn’t change the fact that they are a thing that humans learned to make. They are tools.

And yet…

I think the milestone and the differentiator here is that humans have been working with tools for tens or even hundreds of thousands of years, but they’ve generally all been things that we manipulate with our hands — hammers, screwdrivers, knives, pianos, whatever. 

I struggle to think of “tools” that we manipulate with anything other than our hands. We play a harmonica (or a trumpet) with our mouth, but hands are involved, too. Shoes only involve feet, but people don’t think of a shoe as a “tool”, even though it sort of counts. There may be adaptive tools specially designed for people who can’t use hands, but these aren’t the norm for most of us.

AI chatbots are fundamentally different. Steve Jobs called the computer a “bicycle for the mind”, and that description applies to AI entities too, but our “human user interface” for these new tools isn’t our hands & feet — it’s our capacity for spoken & written language.

And this is the first time in human experience that we can talk to something that fully understands and can fully reply. People who own dogs & parrots talk to them, and okay, sure, the dog can understand a few commands and the parrot can repeat a few phrases, but nobody seriously thinks that dogs & birds understand the breadth of human language. That’s not the case for ChatGPT or Claude or their peers. We can argue over whether they really understand, but the fact is that we do talk to them, and they seem to have passed the Turing test: they appear to know what we’re saying, and they construct cogent responses that sound like what another human might say.

This is a new thing in human experience, and we don’t have a word for it. It’s a tool, but we talk to it, and we’ve never talked to anything other than ourselves before. “Companion”? Yeah, okay, maybe that’s wrong; maybe “tool” is better. But nobody talks to their saws & hammers and expects them to build a cabinet, whereas we’re on the cusp of chatbots that can do exactly that, and in thousands of generations of human experience, that was never possible.

Our language for talking about “tools” is going to evolve.