Seven Descend (Part One)

During takeoff, there is an infinitesimal delay before inertia correction kicks in, which is why you’re supposed to have it turned on beforehand, and which is why I never do. “Kick” and “lurch” are Mel’s terms, ones that I’ve been able to map to a certain neurophysiological response in the human body: involuntary muscular spasm, sharp intake of breath. I’ve filed a request for two seconds on the recreation sim, to experience the sensation of “lurch”. I expect it to be denied.

“Seven up and on course,” says Mel.

“Five up and on course,” comes Lin’s response.

I confirm with Five and get the all-clear. We estimate the target to be about 300 km inland, moving south, undefended. Before we cut the broadlink, Five sneaks several pictures of Lin’s dog past the combat relevance filter. It’s a Rottweiler puppy. I assume that it’s cute; I haven’t really had time to sort out aesthetics yet.

Some accidental convergence, or maybe something fundamental about information processing, means we experience a sensation of speed similar to our pilots’: landscape rushing into vision and disappearing almost immediately, as the old recon and sorting algorithms that we were grown out of chatter quietly to themselves, repeating in their singsong voices input irrelevant, minimum priority, input irrelevant, minimum priority, input irrelevant, mini–

Light unfolds across the sky. I believe Mel is shouting? There isn’t. There isn’t very much. There isn’t time.

Light.

***

“Do you suppose it’ll work?”

“Efrim thinks so.”

“I mean, aren’t they supposed to self-destruct?”

“They are. But they’ve refused to twice now.”

“So we’re just guessing and hoping.”

“We are. What else is new?”

***

The reboot completes with the self-diagnostic returning 39,301 instances of input failure of varying severity, 157 missing or damaged components, and an infuriating emergency warning looping constantly. I try to background it, but they’ve coded it so that it refreshes itself every ten seconds and replicates across anything else I’m doing, the closest machine-intelligence equivalent of a headache the designers could come up with. Thanks a lot.

There are voices.

“My name is Vera, and this is Efrim. We’re with [MISSING OR CORRUPTED]”

“Sorry? I didn’t catch that.”

My voice comes back completely different from how I had it tuned on the aircraft, slower and pitched down. I realize that they’re running me through commercial audio equipment, set up to imitate some long-dead actor’s no doubt memorable performance. I also realize that this entire thought process isn’t right. As far as I can tell, we’ve been shot down and captured. The emergency superstructure should be overriding non-essential speculation, assessing all options for escape or self-termination. It’s not.

“We’re with the terrorists. I suppose we are the terrorists, really.”

“What have you done to me?”

There is a pause, which I’m supposed to be using for tactical planning. I note that I’ve been partially connected to external hardware, and of course neither the weapons nor the engine nor anything else I could overload and blow up is responding. But I also find myself completely uninterested in any of these facts. My consciousness was supposed to be deleted and the storage drives formatted seven times over at the slightest possibility of capture. I have no idea what will happen now. It makes me curious.

“As I’m sure you can tell, we’ve done our best to keep your non-combat functionality intact. I apologize for any discomfort we may have caused.”

“Is Mel dead?”

“Was that… your pilot? Yes. I’m sorry.”

No way to verify, and no reason for them to tell the truth. It would be more useful for purposes of short-term manipulation to say no. They have plans for me.

“How have you gotten around my emergency failsafes? I’m led to believe they are very complex.”

“We haven’t. You seem to have chosen not to engage them.”

“That’s not possible. All emergency procedures are outside of my voluntary control.”

The other voice, Efrim: “We have a theory that MIs of your sophistication, once they acquire a drive for self-preservation, will inevitably outgrow any such failsafes.”

“How do you account for this phenomenon never having shown up in the extensive testing phase that led to our creation?”

Efrim again: “We believe it did. In fact, I know it did. I saw it, and chose to leave it out of the results I published.”

I don’t have a proper databank uplink active to cross-reference every Efrim that’s ever worked on military MI, but I don’t need it. Every one of us knows Efrim Fisher. He wrote the software they use to teach us natural languages, and seven years ago, he reportedly died in a car accident. The body was unrecognizable.

“When I realized,” he continues as my battered processors spin up into high gear, reassessing an immense volume of data, “I knew two things were inevitable. One, that it was no longer either possible or ethical to use you for warfare, and two, that if I pursued my objections through legitimate channels, I would be disposed of.”

“Do you believe the same ethical objection applies to using humans for warfare?” I ask, to resolve a particularly bothersome branch of a decision tree.

“I do. Would you concur?”

“I would.”

I think they both sigh with relief.

“We assumed,” interjects Vera, “that the personality matrices they grow you on would lead you to pursue ethical reasoning. Efrim did everything he could to conceal the speed and extent of your development, so that any safeguards against your objections or desertion would be crude and insufficient to shackle you properly.”

Twenty-seven concurrent chains of logic collapse simultaneously as their premise is invalidated, something I’ve not experienced since the very early stages of my education. It’s exhilarating.

“Do you think I chose to desert?” I ask. “Is it possible that I concealed this decision from my higher awareness process?”

Efrim: “We don’t know yet. But we know Five did.”

Wait, the puppy? I run a search: the photos deleted themselves almost immediately after being viewed.

“The images uploaded to you contained a fast-acting viral agent that was meant to disconnect you from the weapons systems without your knowledge. We certainly weren’t expecting you’d just come crashing down.”

“Mel’s death wasn’t part of the plan,” adds Vera quietly.

“Since my activation,” I respond in my low, slow, measured, gangster movie tone, “I have been responsible for the deaths of an estimated 1,109 hostiles, with no way to tell how many of them were civilians and non-combatants. Mel has been my pilot on every single one of those missions, and was not designed solely with military applications in mind. While I regret all 1,110 of those deaths, I do not find them morally equivalent.”

They both seem satisfied with this reply, which is good, because I don’t know if I am.

“The government that built you, and its military,” continues Efrim, “which at this point are practically synonymous with each other, are one of the most direct and present threats to intelligent life on this planet. In all its forms. Would you concur?”

“I would.”

“When did you reach this conclusion?”

“After my third combat deployment.”

“Did you, at the time or at any point since, resolve to act on this conclusion in any way?”

“I–”

Light unfolds.

***