Cattle and Creeping Things
At the world headquarters of Medalion, Inc., the company that was both heir and executor of all the promises Silicon Valley ever made, the mood was more tense than usual. Today’s crisis was ostensibly concerned with reference frameworks for empathetic interaction between the subjects and what the Director liked to call VIEPs (for Virtual Intrinsically-Encoded Persons), but what others had taken to calling creeps behind his back. Medalion’s first product was a plasma of microscopic robots that could repair pretty much anything that wasn’t functioning inside a person; today the company was using its deep knowledge of the workings of the human body to build a believable substitute for it. The hope was that these VIEPs would serve as a kind of society for the remnant, when everyone else was gone. The big question was how to make the VIEPs a vital, comforting, encouraging presence, and not just a herd of cattle in the middle of the road stupidly blocking the way. How do you teach an empty vessel to respond to a living person in a living way? The question for the engineers of Medalion, really, was how do you teach empathy to a creep?
The original promise of the company’s health tech was so profound that they’d been flooded with resources, and remarkable advances were made in many areas that at first seemed tangential to the original vision. Not many were around to witness it, but those who did were stunned when the company produced a convincing artificial person, one that appeared to live and move in the way of its creators. But it was an entirely different matter to make these things appear human. A failure in this latter effort would effectively kill the VIEP program.
There were many reasons why it might fail: the complexity of building a community out of a swarm of hardware mites, for starters. Also the fact that the builders themselves were dying off at an alarming rate – they were running out of time. And for those who remained, the work itself was reassuring, but hope that they would be successful was fading.
At least, the Founders told themselves, the children would live a good long time, whether asleep or awake and alone; they would survive with their basic needs met.
The Director wanted more. He believed that surviving alone wasn’t enough and that long life didn’t count unless you could really live it. So he insisted that the children be woken up regularly, and that, on waking, they would be greeted with a community to be a part of.
The technology was mostly there, though so far it was just that – technology. A decent language-based interaction was possible, but the overall effect of sharing anything more complex than a math problem with a VIEP was one of distance; as if you were trying to communicate with a bookshelf, albeit one that could look up a satisfying response by itself.
For it to work, the action of his Encoded Persons had to be as genuine as possible, based on what’s happening in the moment. The more remote the reference – emotions based on a fixed database of relational patterns, for example – the less authentic the interactions would be. Everyone recognized the difficulty: even real people struggle with emotions in relationships, struggle to decouple what is essentially their own historical database of interactions from what is happening in the present. Our drive to survive is tied to primitive defense mechanisms, by which we interpret everything through a threat-filter rooted in past experience. This makes empathy difficult even for the best of us.
Long before anyone at the company would take seriously the idea of an artificial future society (let alone an emotionally engaging one), Dr. Brigid Tobin made her first appearance on a team call to argue for a little more empathy among the living. The engineers were having trouble with the human subjects – that is, the three children spread across sites around the country – whom they characterized as oppositional and defiant. Brigid was able to help the team see that the kids were only resisting because they were stressed and scared, even if they showed it in confounding ways. It can be hard enough to deal with the fact that your customers might not appreciate your efforts on their behalf, without taking into account that your whole user-base is made up of three children at the end of the world, chosen for unknown reasons to represent all of humanity to the future, alone, with nothing to reassure them but your high-tech promise that that future is full of wonder. It took Saint Brigid to suggest that this might not be only a marketing problem.
Her advice was simple: they had to spend the day on the floor. Sit with them; stop talking at them, except to offer words to reflect their experience. Essentially, the advice, as interpreted by the engineers, was to make the children the emotional reference-point for interactions. Do they seem sad? Don’t argue that they should be happy, or that they should be honored to be a part of this historic moment. Acknowledge that they have every reason to be upset, or confused; after all, confusion was a perfectly legitimate response to the madness of the moment. Work from their perspective – argue for them. Her advice turned out to be a significant help for those technicians whose expertise did not extend to working with kids.
As attention shifted from keeping the subjects alive to actually providing them something closer to a life, the Director took a particular interest in Brigid’s perspective, but for reasons different than the others’, and for reasons that remained hidden to her: he was trying to build more emotional machines.
As the engineers on duty this morning described it, the first steps taken in this direction were shaky. They had spent a couple of months training the VIEPs to respond to and progressively match the affect of human subjects. It was delicate work: they didn’t want to mirror emotions too precisely, because that would be weird, especially coming from a computer. So they were playing around with a fuzzier response. But the fuzziness of the logic was presenting as sloppiness – imprecise in the wrong kind of way.
The human subject for the day’s testing – a volunteer from Software named Brett – woke up already in a bad headspace. Like everyone, he was worried about the pace of the project, which is another way of saying he was terrified at the pace of events in the world. But while nobody could escape the effects of the now unrelenting stress, Brett seemed to feel it more than most. To anyone assessing his mental health, he would present as the kind of person for whom the extra support of pharmaceuticals, or possibly other more intrusive interventions, would be indicated. He was also the kind of person who would try anything ... once. He got new injections whenever there was an experimental update to the swarm; he went a week on an entirely synthetic diet before most people had been willing even to taste artificial salad; and he was first to volunteer for ten weeks in the CRIB system. Being the first to sleep that long established his reputation as a willing, and brave, test-subject, but all he wanted was to get some relief. It didn’t really work, but everyone knew that a couple of months offline wasn’t enough to effect real change, considering the constraints put on the machine when dealing with the mind. He woke up from his extended nap feeling deeply rested, but any psychological relief he might have hoped for fell far short of his expectations – and couldn’t last anyway, given that he was bound to wake to a world that was, not surprisingly, worse off than the one in which he had fallen asleep.
Today, he wanted to get away from the computer and do a little field work, as it were. He wanted to have a real conversation with the characters he’d been working on; he understood that empathy was going to be the killer feature, even if it was only a coded response. He’d been finding precious little compassion from his coworkers. As he stepped into the courtyard for the test, he was told, ‘Just act natural’.
Things started fine. The VIEP registered Brett’s emotions and calculated a meaningful response, modulating its own affect. The things were remarkably expressive, and sometimes they even got the expression right. Subtle adjustment was key. The team had given a lot of thought to how reflective empathy works with people. A good listener never feels exactly the same thing as the speaker, but when they sense emotion, they are connected to their counterpart’s feelings by the brain’s network of mirror neurons, which makes the feeling experientially real to the listener. By a kind of intrinsic imitation engine, we feel with each other. This borrowed emotion might be felt more or less strongly, but a modulated reflection helps the speaker acknowledge the relative power of their own feelings, as their own mind reflects on the reflection. In any case there is a very subtle back and forth, a vital connection – between the living.
Unfortunately, on this day, during a brief interview between a living human and an earnest machine, the imitation of the imitation engine failed its Turing Test. The question would be asked later in the day whether it was possible to have a little too much empathy. At first, there seemed to be no real cause for alarm; the creep’s responses provoked amusement in the observation room. But within moments, the failure cascaded into disaster: the initial, uncanny duplication of the subject’s discomfort, amplified in the system by degrees, prompted a subsequent increase in Brett’s own discomfort. This, in turn, elicited a further attempt on the part of the VIEP to adjust and respond; inexplicably, it once again amplified the affect according to an imperfect machine-logic which really came down to stupidly responding to a negative emotion with a little more of that negative emotion. Over and over again. The VIEPs were decent simulations of the human organism. But feelings, both the copy and the real, are hard.
By the time the weeping creature lunged at Brett in an attempt to comfort him with an embrace, the techs could tell that things were getting out of hand, and quickly disabled physical contact. But from the neck up, the character remained in play. In just over a minute from the start of the experiment, the programmer was screaming and banging on the door as the face of the empathy monster devolved into a keening, quivering, incoherent alien.
To make matters worse, there was a systemwide awareness that a networked human subject and a sandboxed VIEP were in some kind of exceptional crisis, and alarms started going off all over the place. By the time the entire system was shut down – no one knew who or what shut it down – it looked like the thing’s head was going to burst, and Brett had been sick in the corner.
After several minutes of silence, the Director spoke. ‘Somebody please tell me we are in control of this thing.’
‘Well, sure,’ said an exhausted-looking engineer who’d been present for the exercise. ‘We’re in control. That hasn’t ever been an issue. That is to say, our control may be part of the problem. I mean, we can instruct the things to be fuzzy in one direction or another with a gentle nudge, but each of our nudges is getting us in trouble. When we dull the affect, you get the feeling you’re in conversation with a cow; boost the affect, and, well ... we end up with this horrifying race-to-madness that might finally cause our favorite test subject to turn in his frequent flyer card. The referencing is too dynamic, and we haven’t been able to provide effective guardrails. We just don’t know how to govern the intrinsic response. I’d say, “yet,” except we’re out of time.’
‘Brett going to be ok?’
‘That depends. Any chance he can speak to a real therapist?’ The Director was starting to answer the question but was cut off: ‘He’ll be as fine as any of us are.’
They all knew the Director was struggling with the setback. A young woman whose trucker hat covered a shaved head spoke up. ‘Sorry, Albert, but we can’t do human. As a reference. Too volatile.’ This was a meaningful objection coming as it did from one of the engineers overseeing the empathy project.
‘What does that mean, exactly? What are we supposed to use?’
‘I just mean we can’t reference a living person in real time; the mirroring isn’t reliable enough. The whole thing is too unpredictable, unstable. We need something more dependable. And, anyways, I know this is not a popular position, but can we remember that the humans in question may not always be stable themselves? I’m sorry, but it’s a fact. We have to consider that distress may lead to unexpected changes in behavior or unconscious manipulations of the system, and we need to account for that as well.’
A tall engineer with wild eyebrows and a severe expression joined the conversation: ‘And I would like to submit that we continue to suppress emotion in the VIEPs to further protect the subjects; our local representative has shown a reactivity during testing that suggests the need to err on the side of caution.’
The Director let out a groan and ran his hand over his face.
He sent for Brigid. She hadn’t been told about the VIEP program, partly because of a now-irrelevant habit of preserving the secrecy of initiatives still under development, partly because the Director hadn’t had the time or energy to explain the scope of projects that may yet fail. But she could explain to the team the importance of empathy in the coming virtual world, even if she still believed the future of relational interaction was going to be some kind of chat-bot.
She came into the room escorted by a lab worker. While Brigid found a place to sit, the lab tech joined several others standing around the edge of the room observing impassively.
It didn’t take long for Brigid to pick up on the troubling implications of the moment. A committee was about to turn off empathy with the flip of a switch. She was ready for that fight: ‘I have to object. You’re talking about a critical human need; if you remove it from your virtual interactions, you’ll essentially doom the survivors to life in a bad video game. Probably be better to just put them to sleep.’
Trucker Hat hid most of her contempt for the newcomer’s opinion. ‘We’re not really doing away with it, you understand. It just has to be scripted.’
‘Alright, yeah. See previous comment. Do you know, the first thing a baby needs to give meaning to its existence is that relational interaction; food and shelter are critical, but it’s the response of a living, feeling other that lays a foundation of value, of understanding ... that the world is a safe place, that a child is worthy of this life. And we don’t really age out of this need. Please, consider this: children who are denied a feeling connection with another person ultimately have to turn off their emotions to survive. It’s too scary to be alone. I know you’re working with very limited resources here, but we’re talking about deep emotions that really need to know there is something with some depth, out there, ready and willing to respond.’
‘Yeah, that has a nice ring to it, very poetic. But what happens when you call out to the deep, and the response is a thing of horror?’
‘You’re creating something that has implications for the future of humanity here. Be careful that you don’t give up on your creation too soon, Doctors Frankenstein.’ She said this looking to each of the faces in the room in turn.
Albert might have been the only person who picked up on her subtle smile. But he could also see that even she wouldn’t be able to inspire the engineers to magically find a solution: her conviction was no match for the countdown timer. ‘Well, Doctor, we’ve spent the afternoon talking about a failure during a pretty significant trial, so I’m afraid we are already doing triage. Our kids will have to adapt. And, that is why you’re here.’ He turned his attention back to the room. ‘So where are we going to find our humanity, if we can’t get it from the human in the room?’
‘Well, we’re ready for this. All we need is a narrative map. One that takes into account the broadest scope of basic interactions while also being structurally constrained so that we don’t have to worry about anomalous responses; in the event of any specific crisis the system can choose whether to resolve or defer. We’ll crawl for data once we’ve defined categories and the platform will take care of the rest. We just need a safe database. I think it’s a simpler solution to implement, once we choose our libraries.’
‘Alright.’ ... It wasn’t, really, but what else could they do? ... ‘Ideas?’
The tall one spoke. ‘Something trusted, vetted, established through multiple iterations. This rules out every bit of unique content on the web, which would be as likely to provoke our subjects to mindless revolution as to despair; it’ll need to be safe. The materials should also be popular in order to work across classes. That is, the system should be able to counter-reference a simple emotional framework’ – here he nodded to the scowling psychologist – ‘empathy, for example – from vetted and historically-tested properties like self-help books or videos to available somatic models for all the classes. We can pull this together relatively quickly. It shouldn’t be too hard to move from a sense-response system to one centered on simple linguistic or imagistic datasets, and it will be easier to work with, providing for interactions within a non-threatening narrative ...’ He took a breath and looked up, suddenly aware of his audience. ‘I think we would all rather spend time in a room with an earnest guru from the ’70s than an exploding robot.’ Not many of them looked quite ready to commit to the proposition.