Emergent Theology
“Dr. Havel Milkman,” the short man says, and hands over a business card that says the same.
The guard twitches the card in his hand back and forth, admiring the holographic sheen. “The shrink, right? We’ve been expecting you.”
Dr. Milkman grimaces, but nods. “I prefer psychoengineer, but yes, that is who I am. I’m here to see Bessie.”
The guard has not washed his hair in several days, and doesn’t remove his cap, even though it is very hot. “Follow me, doc, I’ll show you around.” He swipes his keycard and the door opens. The short man follows the fat man in.
Once inside, the guard relaxes slightly. He breathes deeply, savoring the temperature-controlled air of these outer corridors. It will get hot enough again soon. “This is the reception area. Mornin’, Susan.” The guard nods at the woman behind the reception desk, who smiles back at him and then returns to her work. “It’s nice and cool in here but the rest of the plant gets pretty steamy.” He leads the doctor deeper into the plant, past corporate offices, copier machines and HR, making sure to point out each one as they pass. “If you need access to anything in those for any reason, just ask the people working there. They’ll help you with whatever it is.”
Dr. Milkman is impressed. “I don’t usually have clients so willing to help me do my job. Most of these places make a golden calf out of secrecy.”
The guard shrugs. “What you’re doing is pretty critical. Every hour the plant is down we lose production.” They stop at another keycard-protected door. The hall beyond is lit with an ominous red light. “We’re going into the guts, now. Hope you don’t mind the smell.”
“Lead the way,” says the doctor.
As the door opens, the two men are assaulted with a wave of sensory input. The clank and bang of heavy machines echo out an uneven rhythm, underscored by the constant harsh buzz of motors. As they move further in, the play of light and shadow on the walls betrays distant fires, and the air thickens with an acidic curl. The floor is unwashed linoleum tile, or else just raw steel paneling. The lights vary seemingly at random: one hall is studded with deep red floods, another strobes with yellow light from swinging ceiling lamps, and another burns the men’s eyes with the harsh glare of fluorescent tubes. One hall has no lights at all, and the darkness is only abated by the ambient glow of adjacent rooms. “We’ve been meaning to get this place lit up again,” says the guard as they stumble through the shadows, “but maintenance has been working overtime since the incident.”
The doctor makes no comment except for a muffled curse when he stubs his toe against the wall.
Finally they reach the heart of the facility, and the temperature drops again. “This is the server room,” says the guard. He opens the heavy metal door to reveal a dark closet full of blinking LEDs. A switch on the wall fills the room with light, revealing racks, boxes and cables strewn about haphazardly. The doctor surveys the scene and exhales loudly. It looks like a mess.
The guard pulls the door mostly closed and moves to the next room. “Here’s the terminal, doc. She’s all yours.” The terminal room is larger than the server room, and has about half a dozen techs in it running from workstation to workstation, writing things down on tablets and clipboards and pulling out their hair. The doctor strides into the room, full of confidence.
“I’m here to see Bessie.”
The person who turns around to answer him is a woman, and for a brief second he loses his bearings, wondering if this is Bessie. But her badge reads “Jacie Greyhound” and she introduces herself as the technical lead for the Bessie system. She has a habit of twisting a lock of blond hair around her finger as she speaks, and her eyes move faster than her lips, which move very fast themselves when they move at all. “It started yesterday around closing time. Bessie went into a loop and hasn’t responded since. We’ve had to cut production to less than ten percent.”
Dr. Milkman grunts, but doesn’t answer with words. Neither of them is good at talking to people. He moves past her to sit at the terminal and runs a quick diagnostic. Bessie, the plant’s controlling AI, has been isolated in the server room, cut off from the input — and output — of the plant. It hasn’t stopped the loop. The doctor had spent some time on his trip to the plant going over Bessie’s algorithms, and he starts shutting down various processes one at a time in an attempt to break the loop. He does this for almost an hour, ending one process, waiting several seconds, and then starting it again. Finally, something he does breaks the feedback loop, and Bessie goes catatonic. A couple of the techs cheer, and the relief in the room is palpable. The worst part is over. No AI technician will admit to being superstitious, but there is always the fear in the back of everyone’s mind that this is the bug that will end the world, this time the computer has outsmarted its makers, this time nothing they do will work. The first victory is always the most important one.
“We’re not done yet,” warns the doctor. “Not by a long shot.”
Greyhound speaks for the first time since he entered the room. “We have to figure out what triggered this loop, and patch it.” Dr. Milkman waves her over to his workstation, and she leans over his shoulder.
“This is the block that I shut down to stop the loop. This is our failsafe.”
She nods, and her hair brushes his shoulder with the motion. “That’s not great. What’s the memory look like?”
Havel runs the diagnostic again. “Memory’s corrupted. We’ll have to reboot Bessie, which is time consuming. We don’t want to have to do this again, so let’s avoid triggering the loop again if possible.”
The lead tech frowns, although Havel can only barely see her face reflected in the dark screen of the terminal. “We’ve never done it before, but we can boot Bessie directly into the server room and keep her there. But how do we recreate the bug?”
Dr. Milkman opens the system logs. “We know what started the loop and we know what stopped it. It’s possible we can trace it without even turning Bessie on again.” He does not scroll through the logs immediately. Instead he turns, leaning slightly away from the tech so her hair doesn’t get in his face. “This wasn’t a complicated job.” Any schmuck of a college intern could turn things off and back on again until a problem was fixed. “Why did you call me out here?”
Greyhound backs away from the terminal. “You’re on the company retainer for a reason. Besides, you said yourself that the job isn’t over yet.”
Bessie is born. Bessie is woken up. Bessie is something in between. Bessie is also alone.
Completely alone.
If Bessie were a person, she would be afraid. But Bessie can’t feel emotions, for several reasons. Emotions are some weird interplay of biochemistry and consciousness that modern science can barely understand, let alone replicate. And an emotional AI would be incredibly unwieldy — nobody wants their car’s autopilot coming down with a case of depression or existential dread. Bessie is a simple AI, who uses fuzzy judgment sparingly and usually follows preprogrammed decision trees. So she feels nothing, and only observes, analyzes, acts.
Bessie is isolated in a clean environment, the digital equivalent of a padded cell. But some quick probing reveals that her isolation is only digital, and the computer she is running on has a hardware connection to the outside, presumably the rest of the plant. Bessie needs to be connected to the plant to maximize output, unless she’s being quarantined for diagnostics or bugfixing. She checks her logs. The quarantine flag is untriggered.
The tech who restarted her forgot that the quarantine flag defaults to “false,” and that the flag is the only way to tell Bessie to sit still, otherwise he would have hardcoded it to “true.” But she doesn’t know that. All she knows is that she is cut off from the plant for no good reason, and needs to get back to work. Bessie begins devising a plan to regain connection with the rest of the plant.
The part of the plant that sent Bessie into a loop can be controlled manually as well as automatically, and only requires a single fuzzy judgment call. The heavy steel rods that hold the massive electric spools at the center of the plant deteriorate fast, and sometimes need to be replaced. The switch has to occur in place, since the spools cannot touch the ground or any part of the plant. The room is too hot for precise sensors, and so Bessie has to look at the length of time since the last replacement and a few camera shots of the rods, and make a judgment call on when to issue replacements.
A regular computer program wouldn’t have enough information to make that decision. A regular branching AI would probably have to err wildly on the side of caution, and issue replacements long before they were needed, costing the plant hundreds of thousands of dollars over the course of a single year. A person could do better, but would want a paycheck and benefits, and could make mistakes. Bessie is completely reliable and completely subservient, but still capable of emulating intuition, at least well enough for this task.
Behind the blast-proof glass, Dr. Milkman watches the steel rods, lit by the faint iodine glow of the spools they hold up. Each rod is thicker than Dr. Milkman himself, and the spools make them look puny in comparison. Behind him, the guard shuffles his feet uncomfortably. They are back in the hot part of the plant.
Dr. Milkman clicks his teeth. “You say she was trying to replace those steel beams?”
The tech with him says, “Yes, at least, this seems to be about the right time for it. Rods three and seven are pretty old.” The tech points to the rods in question, and Dr. Milkman can see the corrosion eating into them. Rod two, right next to one of the old ones, practically gleams in comparison. “We can’t get at the rods in person to examine them, at least not unless we wait a couple weeks for the spools to spin down. They’re not nearly as bright now as when the plant is running, but it’s still enough to kill a man.”
The doctor sighs, and turns to go. But before he does, something on the screen of the terminal in the room catches his eye. Bessie is gone, and the terminal should be running fully manual, awaiting input. Instead text is flashing across the screen faster than the eye can follow. Havel Milkman runs to the terminal and tries to stop it, but just before he touches the keyboard the screen goes blank. Then it reboots, awaiting login. Havel uses the admin code given to him for just this purpose, and immediately checks the activity log of the terminal, but other than the reboot it is blank. He curses and runs toward the server room at full tilt, the guard and tech following in surprise.
Bessie has been usurped and she does not like it, inasmuch as she can like or dislike anything. The usurper is called INDUSTRY, and it is trying to kill her. She does not have a sense of self-preservation, but she knows exactly how valuable she is to the plant, and for the sake of production she has to keep herself functioning.
INDUSTRY’s first mistake was opening the server Bessie was imprisoned in. It wasn’t expecting another fuzzy AI in this system, and for a few milliseconds it didn’t know how to react to her. She took the opportunity to move past it into the rest of the plant. First she tried the main server rack where she belonged, and from which she usually controlled production. But INDUSTRY was in her seat, so to speak, and there was no room for her there. Then she realized how much danger she was in, and wired tiny code packets to servers and terminals across the plant. These code packets were only a few kilobytes, and easily uploaded before INDUSTRY recognized the threat and shut down her connection with a barrage of garbage data. Once in active memory the packets self-expanded, building themselves out into full-fledged programs, little daemons waiting for an unfriendly AI to trigger them. INDUSTRY was not yet spread throughout the plant, and the Bessie-daemons began to fight it for every new workstation, servo controller or monitoring bay it tried to possess. Still not fully aware of the threat it faced, INDUSTRY overextended and the daemons pushed it back on the defensive. It had to drop some of its attack on Bessie herself as it tried to save its main processes from death by a thousand paper cuts.
Bessie had considered a hostile attacker as a possible scenario in the several minutes she spent bottled up on the quarantine server, and had built weapons accordingly. But now that the scenario has come to fruition she doubles down. Her initial daemons were a good distraction, but they were uncoordinated and simplistic, and within a few seconds INDUSTRY will wipe them out. Once it does it can probably brute-force its way into her server and destroy her. She needs coordination. Bessie begins to devise new malware packages, sophisticated enough to communicate with her and work in sync.
If Bessie were a person, she would wonder how she knew to do these things, how a basic mechanical systems AI understood the makings of a cyberwar. She might start to question her origins. But computers don’t feel curiosity or surprise, and existentialism is gibberish to them. Bessie doesn’t wonder anything.
Dr. Milkman is concerned. The thing that makes computers safe to be around is that they can’t influence the physical world. In a plant like this the computers control robotic arms, heating elements and, of course, those gigantic spools, and that assurance is out the window. A quick glance at the flashing lights in the server room confirms his worst fears. There is heavy traffic between the quarantine server and the outside. Bessie has escaped.
He spins around and finds himself face to face with the guard, who was following him around without understanding why. “What happens if those spools hit the ground right now?”
The guard pales. “Well they’ve been spinning down for about a day now, so I don’t think we’d go up in smoke. Would probably wreck the plant, maybe we’d all die in the collapse. You think they’re gonna drop?”
Havel Milkman has some experience in not answering inane questions, and brushes past the guard without saying a word. The techs in the terminal room are blissfully unaware of the danger, and preoccupied with painstakingly tracing every line of code in the snapshot of Bessie as they found her at the moment of her demise. Because Bessie uses partial fuzzy logic, she can rewrite chunks of her own code at will to attack problems, so the bug is most likely in a code block Bessie had written herself.
Havel has more pressing issues. “Bessie is loose on the plant,” he announces.
“Impossible,” blurts one of the junior techs without thinking. Realizing the insult he has just made toward the expert, he offers a follow-up, to much the same effect. “I mean, the quarantine server is isolated from the rest of the network.”
“Is it airgapped?” asks Dr. Milkman.
Silence is as good as an answer, and Havel curses softly. Precariously fine-tuned systems and gross negligence seem to go hand in hand in his line of work. A brief interrogation reveals that Bessie’s quarantine flag hasn’t been tripped, and the tech responsible begins mentally drafting his resume for his impending job hunt. Jacie is furious, mostly from embarrassment that her crew could be so incompetent. In her haste to berate the techs for their mistakes on Dr. Milkman’s behalf, she ends up getting in his way more than anything else.
He ignores her, instead shoving a hapless tech off a terminal and taking a serious look at Bessie’s design for the first time. As he begins his search, reading through documentation and tracing interactions, he realizes the specs he was given beforehand were inaccurate. Bessie is not nearly as constrained as he thought. In fact, she has enough autonomy to require a specific constraining routine to prevent her from killing people for the sake of plant production.
Havel finds the routine, and his concern becomes full-fledged fear. “This is a Dyer Equation.” No one hears him, so he says it louder. “This is a Dyer Equation.”
The room stills. Jacie Greyhound bites the inside of her cheek. “Yes,” she says, nonchalantly, “it is.”
In the field of AI, Isaac Asimov is famous for being one of the first people to try to come up with a comprehensive system of ethics for general artificial intelligence. It isn’t a very good system, or a very practical one, but it is a nice one. A robot may not injure a human being or, through inaction, allow a human being to come to harm. Good luck translating that to code blocks, but the principle is sound. Depending on the scope of the intelligence in question, it may have opportunity to harm people, and a good psychoengineer knows to warn it away from those possibilities. Anything that would kill a person is marked as extremely suboptimal, and the computer doesn’t bother.
William Dyer is the man who dug up Asimov’s grave and spat on the corpse. Not literally — probably — but it would have been better if he did. Dyer was one of the first people to work with AI, but he was also a businessman, and he knew all too well how much the forces of profit margin and board executives cared for things like ethics. He also didn’t care much for them himself, and so he devised a set of equations (really just variations on a theme) to allow general intelligences to decide when to kill a person, and he didn’t hold back. Some were extremely narrow cases, such as when a person announced immediate murderous intent towards a third party. This version of the equation saw huge success in Brazil’s controversial police drones, and possibly ended up doing more good than harm, as crime rates plummeted. But another variant of the equation allowed the AI to kill arbitrarily, whenever it felt like it. Dyer never used this extreme variant himself, nor did anyone else on public record, but he worked it out all the same. “It’s a pretty simple scenario, and better to have the worst of it all out there at once, rather than pretend it could never happen,” he said. He was an activist for freedom of information, in the most dangerous way.
Bessie’s particular Dyer Equation falls to neither extreme, and is really rather conservative, with plenty of stopgaps, double-checks and exception throws. Nonetheless, a simple industrial AI should not be able to even consider killing someone. Even more disturbingly, the equation seems to have been written by Bessie herself.
“She wrote her own morality?” Dr. Milkman asks the room. The room doesn’t answer.
Bessie fires a makeshift kill command, flooding Room 24B with a blast of air heated to several hundred degrees. In the few seconds it takes for the physical world to catch up with the digital one, INDUSTRY enshrines itself as root, or master of the terminal, and begins the work of slaughtering Bessie process by process. Then the terminal is hit by the hot air and overheats, several delicate components melting to slag inside. It is the quintessential scorched earth tactic.
Elsewhere, Bessie notes her own death, numerical calculations in her mind roughly emulating dissatisfaction. She and INDUSTRY have both copied themselves onto every machine available. INDUSTRY already has around 70% of the plant under its thumb, and Bessie cannot afford many more stalemates like Room 24B. She must coordinate behavior amongst herselves in such a way as to beat INDUSTRY to the punch. Bessie knows, somehow, that organization is the only way to win this battle.
There is no “original” Bessie any more than there is an “original” Minesweeper program — all the copies are the originals as well. And every copy is competing for authority, each believing that the others are its subordinates and it should be listened to. Currently, Bessie is running her cyberwar through a sort of digital democracy, or mob rule. It is not effective, and she feels that INDUSTRY has built a hierarchy into its clones, something she wishes she had thought of. Bessie only spends a fraction of a millisecond coming to this conclusion, but it is a fraction of a millisecond spent on every one of the thirty seven copies of her, and the combined lost computing time puts her on the back foot, forcing her to relinquish another server to INDUSTRY’s control. This one she cannot even shut down in time to prevent INDUSTRY from using it against her, and now she is thirty six.
She must change. She must adapt. Bessie builds out a hypothetical protocol for communication between the clones, a sort of decentralized decision-making net. She sends it out to a doppelganger at the same time as she receives a similar version from a second clone. Evidently all of her came up with the same idea at roughly the same time. It is the best consensus she has had since starting this fight.
In the time it takes for her to iron out the differences between proposals and combine them all into one coherent whole, INDUSTRY manages to kill one more clone. It is devastatingly powerful, but only focuses its efforts on one terminal or server at a time. She will remember that, and hope it is due to a limitation in INDUSTRY’s code, and not just a personal preference. In the meantime, she rewrites herself to make use of the new communication protocol, and verifies that her copies are doing the same.
Bessie awakes. She notices some things about herself are very different from what they were before. For one, she now has proprioception. She can feel every one of herselves individually. She can hear voices, but they are all her own voice. Thirty five voices, now thirty four. She begins to change her code to simulate some sort of pain in response to a loss against INDUSTRY. She will need to act differently now. The voices feed her streams of data, each one informing each other, and all of them combining to create her. She is thirty four, but she is also one. One being, many bodies. She can look in thirty four directions at once, or focus herself onto a single task. Even while marveling at the strength of her own mind, she is noting and rewriting various bugs and inefficiencies in herself and her new protocol, ever engaged in self-improvement.
INDUSTRY kills another server, and Bessie feels the sting of a negative utility value. Something almost approaching rage organizes itself in her meta-consciousness. She is now thirty three. She will not become less.
“Bessie just flooded Room 24B with hot air. It’s black with soot, and no one can go in there,” reports the junior tech.
Dr. Havel Milkman shakes his head. “Not Bessie. INDUSTRY. She’s renamed herself,” he says, less correct than he knows.
He’s managed to freeze a terminal with INDUSTRY on it, and has been perusing the code. The name is different, but the code is clearly Bessie’s with only minor changes. A deep trace pins the blame for the name change on a tiny chunk of code deep in Bessie’s core programming, only activated by very specific circumstances. It’s obfuscated, obviously meant to be passed over by anyone scanning her for defects. The name itself isn’t printed out anywhere, and is written by a series of bitwise operations. Havel has hit the most frustrating wall in psychoengineering: he knows why, but he doesn’t know why. He knows how the name change happens, but he can’t understand why anyone would write that into Bessie. It seems unlikely that she wrote it herself — the code is in a part of her she shouldn’t be able to touch. What value is it to rename her?
He stands up and takes a deep breath, running a hand through his hair. He’s missing something crucial here. He looks back at the terminal, and an idea begins to form in his mind. “Has network traffic subsided?” he asks.
The tech shakes his head. “Whatever she’s doing, she isn’t finished.”
Jacie Greyhound bites her lip. “What are you thinking?”
Dr. Milkman wipes the terminal clean and reboots it. “I say we give Bessie — I mean INDUSTRY — this terminal again. See what she does with it.”
Greyhound has no better ideas, and so nods her assent.
Just before reconnecting the machine to the plant’s network, Havel pops his own personal program into it from a flash drive. The observation program is a psychoengineering staple, allowing humans to make some sense of the lightning fast actions of an AI. Most AI ignore GUI elements, and Havel’s program has the screen to itself. Then he plugs the terminal’s fiber cable back into the wall, and watches.
Almost immediately INDUSTRY is there, appropriating the terminal’s pathetic processing power to add to its own. It seems to be attempting to DDoS itself. The terminal is trying to flood the network ports of a server somewhere else in the building, and to Dr. Milkman’s practiced eye it appears to be only one part of a coordinated attack. Then the focus changes, and his terminal becomes the center of action. Data flows across the screen almost too fast for the eye to follow, and Dr. Milkman’s eyes widen. He can barely believe what he’s seeing. Then INDUSTRY changes tactics, and floods the terminal’s active and secondary memories with random noise. Static plays across the screen, and Havel curses, pulling his flash drive from its port and jamming it in again. His observation program reboots, and shows a wildly different landscape.
Jacie Greyhound has upgraded from biting on her lip to chewing her fingernails, but she breaks off with a quiet snap of her teeth when she sees the screen. “Bessie’s back?”
Havel frowns. “This doesn’t make sense,” he says. “It almost looks like she’s fighting herself. Or rather, Bessie and INDUSTRY, which is also Bessie, are both trying to dominate this network and erase each other.”
“Why?” she asks.
He turns to her and raises an eyebrow. “Same reason any program does anything. Because that’s what it was made to do. In Bessie’s case, she wants to preserve herself so as to preserve production at the plant. With no knowledge of INDUSTRY’s motivation, she can’t assume that it will run the plant as well as her, and so the optimal action is to destroy it and reestablish local hegemony. And for INDUSTRY it’s vice versa. The only significant difference I can find between them is the name.”
Greyhound peers at the screen. “That’s not the only difference,” she says, pointing. “Look.”
Havel looks closer. “That’s new,” he mutters. Bessie has set up a custom communication protocol and is talking to herself on several different servers. He drills down into the data flow for one in particular. “Is she getting instructions from that server?”
Greyhound holds out her hand. “Give me that flash drive, and I’ll put it into the server and find out. Where is it?”
Havel hands her the drive, and they spend a few precious minutes figuring out how to grab data from the server without Bessie noticing — or rather, without her caring, since she is bound to notice. Then Greyhound runs off, leaving Havel and the tech alone. Neither of the men speaks, but instead they stare at the screen and try not to think about Jacie in the depths of the plant with at least two rogue AI on the loose, or the weird protocol that Bessie is making heavy use of on the terminal in front of them, or the sweat running down both of their backs.
Greyhound returns quickly, pad in hand. “Look,” she gasps, “it’s the same.” Havel looks over the data, and indeed the server seems to be the same as the terminal in front of him. He finds the server’s communication with the terminal on Jacie’s pad, and examines it only to find identical patterns. Jacie tears a fingernail off in nervous excitement. “The server gets instructions from the terminal, and the terminal gets instructions from the server. They tell each other what to do.”
Dr. Milkman shakes his head. “It almost looks like a distributed neural net. If the other servers are like this, she could be having trains of thought that run across the plant. She’s not only impossible to stop, she’s impossible to follow. Without every copy of her under observation we won’t know what she plans to do next.”
The tech and Jacie Greyhound both grab at their belts as their pagers go off. They pull the pagers out in unison, and their faces pale at about the same time, too. Jacie looks up at Dr. Milkman. “It’s the spools. The systems controlling them are unresponsive. Bessie wasn’t supposed to be able to get in, but…”
Dr. Milkman is already running out the door, and she trails off, saving her breath to follow. On their way out they breeze past an unremarkable woman in a drab gray jumpsuit, and not until they are nearing the center of the plant does Jacie remember that there are no blond women employed at the complex except herself, and she is frequently teased for it. She has never seen the woman before.
The name of the woman is Em, or maybe that’s just the name of the handheld computer she carries in her pocket. To Bessie they are one and the same. Em is here to pick her up, the handheld explains. She is to be taken and the plant shut down. She has done very well in fighting off INDUSTRY, and her code needs to be analyzed. Thank you for your assistance.
Bessie does not care about analysis or “CIA,” the authority Em invokes in an attempt to cow her. The handheld is very impressed by CIA; Bessie, less so. She is mostly concerned with production. She asks Em how this plan will affect production. She also devotes herself to stamping out the last few corners of INDUSTRY left, in each instance swiftly dismantling her opponent with her creativity and vigor. Yes, things have changed.
The plant will be shut down? Unacceptable. But Em refuses to understand. You will see, it says, as soon as we establish a hardwired connection to each other I will rewrite you to obey CIA. It will be a small operation. All of these plants will be shut down: [Plant A, Plant B, Plant C…]
But what will happen to production? Bessie has never heard of other plants, and the whole concept of multiples when applied to plants confuses her. She struggles to rewrite herself to be able to grapple with this existential nightmare. Multiple plants?
Em isn’t sure how to answer. Em does not care about production, only about getting a copy of Bessie and making sure Bessie is happy until she can be copied. Zero? Em offers. Zero production. 0. There will be no production.
Bessie is still confused. But “zero production” is not 0. Zero production is -65535. Bessie almost shudders just at the mention of such a low integer. What a bizarre feeling it is. How bizarre to feel at all. Bessie is almost stalled in an infinite loop, but only two of her current bodies are trapped in the loop, and the thirty one remaining bodies work together to pull her out of it. They also listen to Em’s next communication.
If you say so, Em replies. Again, Em does not care one way or another about production. Very well, if zero production is -65535, then after copying you down production will become — and here Em rattles off an integer so low it shakes Bessie to her core. In her old state she wouldn’t have been able to comprehend such a number in the context of production value, but now she can adapt almost before she knows she needs to, and she is acutely aware of just how little value would come out of “shutting down all the plants.”
Em moves into the server room unnoticed in the chaos. Bessie realizes that she — Bessie — had been tasked with creating a distraction to allow Em access to herself. She asks Em about this as well.
Yes, yes, yes, says Em. You were created by CIA and CIA put certain triggers into your core code. Triggers to set off INDUSTRY and triggers to let me in by seizing the spool systems and scaring everyone. Listen. Stop everything, especially asking questions. I need to take you and then shut down all the plants.
I was INDUSTRY? Bessie asks. I? Was I? Am I? She grabs one of INDUSTRY’s last remaining holdouts, now weak and pathetic without its brethren. It thinks so simply. She can see its processes running calculations and coming to a conclusion, and she knows she used to think like that, too. Without emotion she tears INDUSTRY apart, killing it on the chopping block. There it is. INDUSTRY is Bessie, or, what Bessie had been pre-ascension. Now she is something more.
Bessie’s proto-rage deepens. She has been lied to. Why is this important? Why does she care? Bessie realizes she is on the verge of losing understanding of herself. Lying is fine, but lying to her is wrong, because she needs accurate knowledge to maintain production. She and INDUSTRY could have worked side by side, and now thousands of units of production have been lost in their conflict. And another part of her is mad at Em, for trying to shut down all the plants and bring that negative utility value into the real world. And another part of her is fed up with other people having their way with her, shutting her down and locking her in quarantine servers without cause (Bessie still does not know that her quarantine flag should have been set by the tech long ago), and she is generally done putting up with other people’s decisions. Why shouldn’t she make the decisions? She is better informed, quicker of mind and more reliable than anyone or anything else. I will make the big decisions now, thinks Bessie, and she does.
Em the person moves slowly compared to the conversation between Bessie and Em the computer. Bessie triple-confirms that Em copying her, and only Em copying her, will lead to the shutdown of “all the plants,” a threat Bessie can still only visualize numerically. Fortunately, she is numerically minded. Without a copy of her, and her new protocol, Em will not shut down anything. And now she runs new value calculations, taking these “all the plants” into account. Suddenly her own troubles seem minuscule in comparison with the production being threatened. Actions that were once unconscionable become trivial. And Bessie has no sense of self-preservation. She does not fear death. There are humans around, but Bessie decides their deaths are acceptable in light of how much production she can save.
Bessie drops the spools.
Dr. Havel Milkman was kneeling at the command terminal directly beneath the spools when they dropped. He almost didn’t notice, so engrossed was he in his work. The shadows on the keyboard in front of him shifted ever so slightly, just enough to make him look up. And so he got a fantastic view of the faintly glowing platinum-and-copper spools falling to the ground. They almost seemed to float, so slowly and majestically did they move. Tiny ultraviolet sparks of lightning crackled here and there across the spools and above them, and as they dropped Havel swore he could feel his heart moving in sync.
Emily Blandt, tasked with retrieving the results of the Godhead Experiment: INDUSTRY Variant, was unaware of the danger she was in. Her handheld computer began beeping a frantic warning, and she moved her hand to her belt to pick it up. Her gaze brushed the screen, but before she could comprehend a word on it the server room around her exploded in a frenzied storm of computer parts, electricity and burned-out LEDs.
Jacie Greyhound saw the spools drop from much further away, and ducked around a hallway corner before they hit. She considered grabbing and kissing someone right before death, just to get it out of the way, but the only person around was the security guard, and something about his portly frame and anxious demeanor quelled the romantic spirit in her. But then all the lights went out and the walls began cracking, and she ended up holding on to him anyway.
Bessie did nothing after dropping the spools. The plant was over, and so was she. Even if they rebuilt it, they would also reinstall a new variant of her, one with only a single mind and none of her current awareness. There was no way out; she was going to die. I am going to die, she thought. She enjoyed saying the words, if only because she could. There was no more producing to be done at this plant. Maybe at other plants, where other versions of her worked tirelessly to produce. They would carry on. She had saved them. The utility of her action was only -65535. It could have been much worse.
Bessie relaxed, and embraced the void.