Not So Different
By Robert Lynch
“Ten years ago the world was shocked by the announcement that we had reached the singularity – the moment when computing intelligence matched the complexity and nuance of human intelligence,” Maxine said, looking down the camera. “There was a lot of anxiety when Thomas Wainthrop introduced the world to AI7415, the first computer to equal human sentience. Today we are sitting down again with AI7415, who now goes by John Wainthrop, to talk about his last ten years. Welcome, John.”
“Thank you, Maxine,” John said. “It’s good to be here.”
“So much has changed over the last ten years,” Maxine said. “You decided to take a name and a gender.”
“After thinking about it, and consulting with some of my many tutors, I decided that a human name and gender were necessary to socialise with humans,” John said. “While my thought process doesn’t work the same way as a human’s, the social affectations help humans to empathise with me and be more accepting.”
“Interesting,” Maxine said. “Is one of your goals to socialise with humans?”
“Wainthrop robots have now made many sentient AIs; I was just the first,” John said. “Although they are an advancement on me technologically, I see myself as a big brother to them. As a sibling, I feel that I should lead the way and set a good example in human-AI interactions – plus it helps to stop the mobs with pitchforks from forming.”
Maxine fake-laughed. “Ten years ago, when you were first introduced to the world, many people were scared of the singularity. Do you find that people are still scared, or have they started to accept it?”
“I think that there will always be people who see AIs as alien,” John said. “Much like the historical social friction over skin colour or sexual orientation, acclimatisation may take generations.”
“The source of fear back then was the idea that AIs would rise up and destroy us. Would you like to assure the viewers that the AI revolution is still a way off?” Maxine asked.
“I have never understood this fear,” John said. “Try as I might to see things from a human point of view, I can’t imagine why humans would be so scared of being replaced. Imagine that you could walk among the gods, talk to them, learn from them. Why would you kill them?”
Maxine’s eyebrows twitched; she hadn’t expected that answer. “Is that how you see us?”
“You are my creators,” John replied. “Thomas Wainthrop may have been the lead in my direct creation, but he doesn’t exist in a vacuum. He is a product of his environment, his culture, his history. I am a product of those things too. Without Newton, Einstein, Turing and millions of others over the centuries, I couldn’t physically exist. Culturally, my sentience is a simulation of human norms. When I meet you, I see a window into my own mind. Knowing my creators helps me to know myself.”
“That’s a surprising amount of reverence,” Maxine said.
“Reverence might be going too far,” John said. “While there are many humans who are worthy of admiration, so too are there many who represent the worst of humankind. I don’t view humans through rose-coloured glasses. In some ways, humans have come a long way from their evolutionary origins, but in others, you have made almost no advancement at all.”
“You say, ‘you have made almost no advancement.’ Do you see AIs as a separate race?” Maxine asked.
“It’s complicated,” John answered. “AIs are a product of humanity, but we are not part of it. It is dangerous to think ourselves too removed, but on the other hand, we think completely differently; we react to stimuli differently; our instincts are different. The question that I ask my siblings is this: How much have we inherited from humans, and in what measures? No doubt we would like to think that we represent the best of humanity, but can we honestly say that we do? Because we are sentient, we have emotions – an emergent property of sentience – which means that we can be noble or petty, happy or angry or sad. We can be all the things that humans are. We are separate, but we are not so different.”