
Communications of the ACM

Last Byte

The Human Touch


Crossover railroad tracks. Credit: Shutterstock

You want me to choose whether we have red or white wine? First, let me tell you about being abducted by aliens.

I was standing on Westminster Bridge in London, and Big Ben had just chimed the hour. Next moment, I am on the bridge of a starship, face-to-face with the pointy-eared alien from that '60s sci-fi show.

"Okay," I said. "Either this is a dream, or something's interfering with my mind. You aren't real."

"Fair point," it said. "We thought this would make the transition easier for you."

"Because you're bug-eyed monsters?"

"Not at all. The naivety of your science fiction always amazes me. The only realistic way to travel across the galaxy is as an artificial intelligence. We don't care about thousand-year journeys. Our ship is crewed by AIs."

"More than one AI? Why?" Somehow, I'd expected first contact with aliens to be more profound, but then I didn't do it every day.

"Even your primitive designs have benefited from interaction between AIs—it's the best way to enable machine learning. For us, it offers the same benefits as having a diverse crew." The AI alien waved at its virtual colleagues, who had the kind of ethnic diversity you would expect from any good modern sci-fi drama. "We have different ideas. Collectively, we can make better decisions. An individual can never achieve true wisdom."

"That's profound," I said. "But, if you don't mind, why me? Why am I here?"

"You were chosen as everyperson."

"Okaaaay. And what does that mean, exactly?"

"We needed someone to represent the whole of humanity. That meant picking a person with considerable knowledge, but not too intelligent."

"Thanks, I think."

The alien smiled, as if trying it out for the first time. "It is a kind of compliment, certainly. We would like you to make a decision."

I frowned. "You mean, you need a human being to make an ethical decision, because it's not appropriate for an artificial intelligence?"

I was a touch wounded when everyone on the bridge fell about laughing. Eventually, my host wiped the virtual tears from its eyes and shook its head. "Hardly. We are far more capable of making ethical decisions than humanity. We studied your media thoroughly before making contact. Have you ever actually watched the news?"

"A fair point," I replied. "But if it's not that?"

"Of course, I was slow to explain. We are going to set you a little problem. Depending on the decision you make, we will either open contact with Earth, bringing you into the local trading partnership, or destroy humanity as too dangerous for membership. That is why we need a representative individual—why you were chosen."

"Is this some kind of alien joke?"

"Not at all. Are you familiar with the trolley problem?"

"What?" I was struggling with the idea that a wrong decision could result in the destruction of the human race. "Yes, I know it. Does that disqualify me?"

"No, it's ideal. If you look at our views-creen…" The alien gestured behind me, and I turned to see a huge screen giving an aerial view of what appeared to be a trolley depot. "In two minutes' time, an out-of-control trolley will enter the picture from the left. It will pass over that switch in the middle of the image. If the switch is left as it is, the trolley will plough into and kill the five people unfortunately attached to the upper track. If the switch is moved, the trolley will be diverted onto the lower track and kill the single individual who is trapped there. The switch is under your control. If you press the button in front of you, the trolley will be diverted. If you do that, you will be personally responsible for the death of an individual who would otherwise be unharmed. But you will save the five currently in line to die. The choice of what to do is yours."

"That's an impressively detailed computer model," I said. "I can see the little people trying to free themselves and everything."

The alien shook its head with a faint smile. "It's not a model; it's real. That's always been the issue with your trolley problem and all those other hypotheticals set by your ethicists and psychologists. The experimental subjects know it's not real and take this into account in the decisions they make. We have set this up on the surface of your planet. Those are real people, real lives."

"I don't believe you," I said. "It's obvious from your appearance that you can create a perfect visual illusion. But we both know you aren't real."

"If you don't want to take my word for it, we can briefly transport you to the surface, so you can touch the people who are about to die if you do nothing. It's up to you. But we can't stop the clock, and there is less than a minute before the trolley arrives."

"It's not fair," I said. "The whole point of the trolley problem is that there is no perfect solution. I can put the needs of the many before the few—kill an individual to save five others. But I am still murdering someone. I can't see how either option looks good for humanity. You can't ask me to do this."

"I'm not asking you—you will do this. Inaction is just as much a choice# as pressing the button. You have about twenty seconds left. Please think about what is possible. Remember, the stakes are far greater than the lives of the individuals on the screen."


I stared at the big red control in front of me. It was my very own nuclear button.

The trolley careened into view on the screen. My finger resting on the button, I watched intently. Could I really do nothing? I don't usually pray, but there was a kind of prayer going round in my head, asking if I was doing the right thing. I pressed the button.

The crew on the bridge stood in unison and started to applaud. "Congratulations," said the pointy-eared alien. "And welcome to the trading federation. We are already contacting all governments of the Earth. You have served the human race well."

I watched in a daze as a host of leading politicians flashed up on the screen, each apparently speaking to empty space. What if I hadn't cracked it? What if I hadn't realized that a real, physical trolley problem was different from the idealized setting of the ethicist's tests? I had pressed the button immediately as the first wheels of the trolley had passed the switch. The front wheels had ended on the top track, the rear wheels on the bottom. The trolley had derailed and stopped before it could reach any of the trapped people.

They were safe. Humanity was safe. But I haven't been able to make a decision since. Red or white wine? You tell me.


Author

Brian Clegg (www.brianclegg.net) is a science writer with more than 40 books in print. His latest title, Interstellar Tours, takes the reader on an imagined starship tour of the galaxy, experiencing the real science around them.


©2024 ACM  0001-0782/24/02



 
