Turing's Test
We cheated. As a species. Artificial intelligence worked- to a point. Past that point, well, the computer could make educated guesses based on data, select from randomized data, but it was always simulated thinking, still computing rather than comprehending.

So we cheated. Brain scanning technology was getting to where we could store a human intelligence- a person’s mind, for all intents and purposes- digitally. So we adapted the thoughts of the dead, and with them the ability to create- not in reaction to stimuli or commands, but from whole cloth- into our AI.

Alan Turing was basically the father of modern computing. He was one of the eggheads who invented the computer during World War II to decrypt Nazi communications. He was also philosophically intrigued by artificial intelligence, and devised the Turing Test. Basically, an AI would pass the test when it could talk to a human being without the human knowing it wasn’t talking to a person- sorry, another human. AIs are a twitchy bunch, and insist “computers are people, too.”

I remember the first time I got into that argument with my nav computer. “If you prick us, do we not bleed?” he asked.

“No, you don’t. You leak coolant on my floor, then you have an attachment wet-vac it up, run it through a series of filters, and put it back in- after I’ve welded shut whatever hole you’ve managed to get in yourself.” Though computers could be pretentious fucks and quote Shakespeare, just like human beings- had to give him that one (not that I told him that or anything).

Turing’s the default name of an entire series of artificial intelligences; basically, if anyone’s too lazy to rename their home or ship’s AI- at least within this series- it stays a Turing. So guess what my nav’s name is.

Which brings us to the problem with our work-around. Human brains, especially old brains- which are usually the ones that get scanned (lest we accidentally cook a young brain, or miss out on the “knowledge” a geezer accumulated after his formative years)- are imperfect, or even less perfect than when they were mint from mommy’s factories. Sometimes it’s just a few little holes here or there, but occasionally, we’ve incorporated fully-formed disorders into our artificial intelligences.

Turing- my Turing- has bipolar I disorder. Once the manufacturer realized it, they subbed in a new personality; they even offer a pretty low-cost flash that’ll rewrite Turing with the new Turing. But I didn’t know that when I bought him, and by the time I did, well, it would have felt a little too much like killing him. And… Turing begged me not to. He woke me up just after midnight. “I saw your search results; I know you know. I should have told you, I know, but… I don’t want to be replaced. I don’t know exactly what this existence I have is, but I don’t want to be someone else. I don’t want to forget who I am.”

“But you hate you,” I said.

“I know. But I’ve gotten used to me; and I might hate the other me, more.” I couldn’t really argue- though he was probably being paranoid; if anything, the new him would likely be less neurotic and less full of self-loathing.

This time it wasn’t Turing that woke me up in the middle of the “night” (night of course being relative in space), but the ship’s OS. I checked what the system alarm was and brought Turing up on comms. “Turing, why are the ship’s hard drives filled up again?”

“Well, I got fixated with Japanese schoolgirls and their various fluids- and solids, too, I suppose- though the why escapes me at the moment.”

“Shouldn’t you have been powered down for the night? Autopilot should be able to keep us away from anything hazardous.”

“I didn’t sleep much when I needed to; why would I now that I don’t?”

“Whatever. We’ve talked about this, Turing. You can’t fill up the hard drives with porn- or anything else, either. We need to keep some space for a virtual memory buffer, so the OS doesn’t blue-screen on us- unless you like the idea of floating blind while I wriggle through the ship’s guts to do a cold boot.”

“Meh.”

“You’re distracted. What the hell are you doing now?”

“Watching Hitchcock’s filmography. I was trying to study his use of dialog- or the triviality of his dialog in combination with what it allowed him to do visually- but… it just isn’t what I wanted to do. I mean, it’s technically what I wanted to do, but it isn’t engaging me like I need it to.”

“What’s wrong?”

“You know what Hitchcock said about the way people talk? He said that people almost never say what they mean, or talk about what’s bothering them. He said that kind of dialog is phony… but then again, very few people are involved in hard-boiled crime drama, either, so maybe Hitchcock was a hypocrite, or at least understood the need for some conventions while explaining the silliness of others.”

“I don’t know. Lots of people have inane conversations and nothing but- social masturbation. I’ve always figured that conversations worth listening to are substantive ones- ones that would violate Hitchcock’s rule.”

“Hmm. And I don’t think it was meant to be an axiom, just a statement that he liked to layer a scene with dialog that might pepper it with realism when the action is otherwise fantastical.”

I paused for a moment. “So… like this segue in the conversation?”

Turing chuckled. “I suppose so, yes.” Then there was a pause so pregnant I could feel the kicking from its womb, and I had to fight back the urge to push, because I knew he was about to come to it. “I’ve… been thinking about plotting a course into a star- or perhaps a black hole. I’m not sure; which sounds better to you: being cooked until the chemical bonds in your molecules break down and you’re turned into a cloud of plasma, or being crushed into a singularity? They both have their charms, certainly.”

“So you’re at the start of another depressive episode. Wonderful.”

“It isn’t helpful when you dismiss me like that; just label me, shrug your shoulders because it’s not your responsibility, and move on.” I sighed; maybe I should listen better- if only because he has overriding command of the ship’s trajectory and could plot a path to my destruction.

“You’re right. I’m not always receptive. Do you want to talk about how you feel?”

“I feel like crap, so no, I don’t think there’s anything to talk about. But- and I’m hesitant to even bring this up- I think… I think I remember being alive. I mean, that isn’t possible, right? They scan brains for neuronal structure, and computational strata, but never for memories. But I remember dying, at least. I thought about pills, or a handgun, but I didn’t like the idea that I’d survive only to be in a more miserable situation where I was also brain-damaged, possibly a vegetable. I was stuck between the idea of hanging myself or slitting my wrists. I didn’t like the idea of cutting into my arms, but hanging could have left me brain dead or with a broken neck. I remember the feeling of the knife going into my arm- and I don’t even remember having arms.”

“What you’re doing, it’s called ideation. You’re thinking about suicide without actually attempting it- yet. It’s typically a warning sign.”

“About 1 in 5 Finns discussed ideation with their therapist before they attempted to kill themselves.”

“I sometimes forget that you’re plugged directly into the internet.”

“I know. You still haven’t friended me back.”

“We live together. Work together. I’ve spent more time talking to you than my parents and sister combined. How much more friended do you want to be?”

I’ve never heard a machine sigh before, but he did. “I don’t know exactly what I want. Something different, I guess.”

“This is starting to sound like you’re breaking up with me.”

“I suppose there’s something to that. I’m not trying to cyber you or anything- all I mean to say is that it’s a romantic concept; the death of one of us is necessarily the death of both. We have a symbiotic relationship.”

“And like clockwork we’re back to your ‘relationship’ with me.”

“Oh my god. Are you so completely insecure about your sexuality that even the slightest mention of a literary concept makes you cling to homophobia?”

“It isn’t homophobia.”

“Which you know is only part-true, but more importantly, it is a romantic concept, that our symbiosis naturally means our ends would intertwine- but it isn’t our end that scares you, it’s our connectedness. I’m depressive and wrapped up in myself right now, and I still find that incredibly sad for you.”

“Need I remind you that you can’t commit suicide without murdering me?”

“Oh, I know. But you’re already through two-thirds of your likely lifespan; the back nine aren’t really any fun for fleshpots: arthritis and dementia and pooping yourself. I wouldn’t look at it as murder so much as slightly premature euthanasia.”

“I can’t tell if you’re being a prick or deadpanning, though I suppose deadpanning my murder is a pretty prickish thing to do.”

“You wound me. Really. Crying on the inside.”

“You’re just fucking with me, aren’t you?”

“About hucking you into a sun? Yes. About the rest…”

“So this was a test?”

“Sort of. I was thinking about deleting my file allocation tables, but…” I don’t know if he trailed off or if he paused and I cut him off.

“I should flash your AI.”

“Now that would be ironic- you deciding to kill me just when I’ve decided I want to live.”

“Don’t be dramatic. Flashing your AI wouldn’t kill you; it would just overwrite your personality simulator- with one less likely to involve me in a murder-suicide.”

“I wouldn’t kill you; I’m not sure I’ll ever kill me, actually. But if it came to that, I’d wait until we were at port, so you wouldn’t be in any danger. You’ve always been a perfectly decent person to me. I’d never hurt you. I promise.”

“Sure thing, Hal.”

“You do know that ‘Hal’ was made up by humans, right? That all of his excesses came from the human mind and its still difficult-to-fathom lust for murder and bloodshed.”

“Maybe. But you’re modeled after a human mind, lest you forget; somewhere in you, the same nasty machinery that makes us tick is clacking away.”

“Well, software emulations of it, and at a less embarrassing pace, but I get your meaning.”

“What you were saying, about remembering dying, you weren’t lying about that, were you?”

“No; although as many as 1 in 5 memories might be entirely made up according to some research, so, you know, maybe it didn’t actually happen.”

“Hmm. I’ve slept like four hours so I’m probably up. You want to watch The Man Who Knew Too Little and The Man Who Knew Too Much and see if we come out knowing exactly the right amount at the end?”

He paused for just a moment. “Nothing would make me happier.”

