Cleansed by Fire
Chapter 6, Nexus (continued)
Emerging from the tunnels into the desert just before 1400 hours wearing an ultradense slickskein suit was not, Bechan thought, something he would ever try again—nor even recommend to someone he disliked. The only blessing was that it was winter, so the temperature was barely breaking 20 degrees Centigrade. Even so, it felt brutal as the sun beat down, and he was out of the outfit as quickly as he could be, sucking down the last of his recirculated sweat as he did so and following up with a bottle of water from his pack. The Jordan was nearby, and he hiked there as quickly as he could to refill his water bottle and drop a tab of NutriCleanse into it.
He pulled the ocular out of his pack and scanned the area quickly. Out here, there wouldn’t be many security pylons, but those there were would be well hidden. And he had to worry about wingscouts that might pick up his movements. And then there were…
In the distance, 800 meters away, a wyvern. Looking every bit like the unholy offspring of a grizzly and a crocodile with a little extra horror thrown in. It squatted there, facing away from him, hunched on the thick legs of its wide, muscled lower torso, covered in bristling hairs that were said to be sharp as razors. The upper torso was leaner and more scaly, with just whispers of those bristles, as it narrowed toward the long-snouted, reptilian head with double rows of teeth top and bottom. And from that upper torso, no limbs per se but rather five spines on each side, like huge claws or membraneless wings curving outward—their embrace was deadly, and even a scratch meant the end of a struggle, as the toxin therein was a mild and fast-acting paralytic.
People talked about the Vatican dropping wyverns out here to breed amongst themselves and hunt would-be refugees. In truth, Bechan knew, only the last half of that was true. Wyverns were entirely manufactured, built genetically from nothing up to massive. Totally infertile—genderless, even, and thank Yahweh for that. Bechan had never had any doubt that the stories of wyverns out here were true.
He tightened the focus on the ocular and noted the lack of a control collar. From all reports, the Vatican dumped the rejects out here to help dissuade people from leaving Israel. The ones that were too hard to control or that were too old for serious field duty. But as it turned its head northward, he saw the inhibitor pod at the base of its skull. No direct control, just a device to keep it from entering populated zones.
And then its head turned, slowly and deliberately, and Bechan could swear it was looking right through the ocular into his own eyes. He had no idea what the visual range of a wyvern was, nor its scent range, but he wasn’t about to hesitate in believing it knew he was there.
What wyverns lacked in starting speed, they made up for in the stretch, and they had plenty of endurance. And Bechan was pretty sure they could swim halfway decently as well.
He surprised himself with how fast he got into his watergear. He was certain it would have broken the record of any aquammando operative currently on active duty. The question was whether it was quick enough to get him down the river and to his next destination before he became a late lunch for his new admirer.
Even though Amaranth had given him a quick version of the story this morning, and Daniel Coxe had just given a more exhaustive one, the enormity of the problem was still screwing with Gregory’s mind. So the Peteris traded a quick glance with his wife once Daniel had finished, then asked him to repeat the story. Daniel sighed and did so, and then Gregory leaned forward a bit, frowning.
“So, we have an AI that no one but you and the Godhead—and now us—know exists, and it is running amok doing God knows what?”
A puzzled look crossed Daniel’s features. “Running amok? What are you on about?”
“Well, unless this AI is residing within the Godhead, in which case I think you and everyone else would have noticed it already, I presume it’s off somewhere in the Grid, possibly getting ready to disrupt entire networks and possibly destroy whole governments and nations,” Gregory said.
Daniel kept his eyes on Gregory while turning his head slightly toward Amaranth. “Paulis Dyson, is your husband a Luddite? Or an AI-phobe? The idea of AIs just flitting around on the Grid or taking over entire networks or launching missiles to destroy humanity is pretty much the province of the most idiotic vidmakers.”
She laughed. “My husband likes his technology just fine, at least when he’s searching for music or documentaries on the SystemGrid. But he barely passed his comptech classes. And to be honest, I’m not sure what to think myself. How would the Godhead arrange to have an entire AI complex built for a child and do so in secret? And—is this a primary AI?”
“What I’ve seen suggests it is.”
“A primary AI is virtually immortal, unlike secondary AIs. The kind of sustainability functions it requires are huge, and unless the technology has changed, a primary AI can only be built from the ground up by a massive team of techs over several years, or by the union of two other primary AIs.”
“So if it was a secondary AI with a limited lifespan, the Godhead could have just spat it out and maybe secretly transmitted it to some compact database somewhere, maybe hidden it in a virtual brothel or a mediaplex, since most of the secondary AIs end up in the sex industry or entertainment industry somewhere. But a primary? Where else but the Grid would have enough space to hold it without anyone noticing it? And where else could the Godhead send it without anyone noticing?”
“Paulis, Peteris, it just isn’t…” Daniel began, then stopped, sighed, and started again. “Look, if I were to offer to take out the left hemisphere of your brain and store it in a support-unit somewhere, with a sliptrans attached to it and a companion sliptrans attached to your right hemisphere, so that the two halves of your brain could keep communicating no matter how far apart they got and with no delays, would you do it?”
“Not a chance,” Gregory said.
“All right then. Primary AIs have to be in large complexes in part because after thousands of years of scientific endeavor, we still know cock-squat about how personality and emotion are really, fully generated and managed in a human brain. So, every AI, primary or secondary, is just a collection of simulators with approximations to produce what seems to us—and to the AI—to be personality and emotion to go along with their rational and computational components. More powerful AIs, the primaries, require correspondingly larger databases than secondaries to manage the emotional content and to have a more complete range of emotions.”
“Those simulators are huge, almost unwieldy databases, which is why AIs are only created for functions that require a nearly-human personality,” he continued. “Otherwise, we wouldn’t even bugger around with secondary AIs, much less primaries; we’d just use computers for everything. But we need AIs for functions that require massive computation in parallel with some level of humanlike judgment, so we put the effort into creating distinct homes for their minds to live in. Really, really big homes for the really, really exceptional AIs, so that you can add lots of physical and virtual defenses to safeguard them. Do you follow so far?”
Gregory and Amaranth both nodded, and the Peteris muttered, “This seems to be my week to be berated by people I grant asylum to.”
“I granted this one asylum, darling,” Amaranth whispered, patting his hand.
Daniel ignored them both and continued. “Going back to my example, you wouldn’t split your brain in two and trust that the sliptrans might not cut out at some point, essentially robbing you of half your brain. Do you think an AI, particularly a primary AI, would want to spread its brain in hundreds or thousands of pieces all over the Grid?”
Gregory put up his hands in a gesture of surrender and then made a “zipping-my-lip” motion.
“So, the Godhead fathered a child,” Amaranth began. “Did he order up the construction of one, or did he beget one with another AI? I’m going to have to guess the former. He got in league with some kind of mega-rich benefactor, who built an AI complex in secret for him and pulled together a secret team of techs to design and program the AI itself.”
“No. The 13 data artifacts I located show signs of being left behind as trace residue from an inception routine. Unless two AIs are hardlinked during the reproduction process, these routines are broken down into separate self-extracting databases and then shipped to what will be the new AI’s complex. The Godhead bottled up his swimmers and shipped them out somehow to either a mother AI directly or to a location where the mother AI’s corresponding inception routines were sent.”
“Wouldn’t someone notice if a female AI suddenly received 13 packages of virtual sperm?” Amaranth asked.
“It could be done carefully and spread over time, but the shipment of larger databases like that to an AI would increase the chance of exposure. If the inception routines were sent directly to the mother AI, I would have expected to find several hundred artifacts and not just 13. But even so, the mother AI would have to create the child inside her own complex, and that would be noticed.”
“So the Godhead has been planning this with another AI, who, I’m guessing, also prepared 13 packages. The pieces of her virtual ova, to be meshed up with the Godhead’s sperm, sent to a predetermined location where a really big bunch of databases had already been readied for the birth of a bouncing baby AI,” Amaranth ventured.
“That’s right on the money, Paulis Dyson. It makes the most sense for success and secrecy.”
Gregory finally decided that he had been quiet long enough. “All right, so why is my notion of a civilization-destroying AI something to scold me over?”
“What?” Daniel asked.
“A handful of warwagons caused the Conflagration and then went to war with all the other warwagons, and it took centuries to dig out from that. Those were AIs running those warwagons.”
“They set fire to a nice chunk of the world because the governments holding their reins ordered them to,” Daniel noted. “They tried to kill the other warwagons largely because following those orders drove them insane. Look, the AIs on the four surviving warwagons are the only AIs we really have to worry about destroying the world, because they are completely self-sufficient, mobile and atrociously well-armed—but they already hobbled themselves to prevent that.”
“So, how can you be so sure that an Earth-bound AI won’t do what the roguewagons did?”
“AIs are stuck in their physical complexes, for pity’s sake. What do you think is going to happen if an AI starts launching missiles or tanking entire economies or something else like that?”
Gregory pondered for a moment, then said: “Everyone would band together to lay waste to its entire complex, and since it has nowhere it can run…”
“Precisely. And from a strictly practical perspective, would you, if you were an AI, want humanity destroyed? Once the power plants stop working you would be as good as dead. Hell, once the SystemGrid started falling apart, you’d probably want to die because of the boredom.”
“Fine, then, you’ve convinced me that our wayward AI won’t end the world. But can you guarantee it isn’t up to something more subtly nefarious?”
“Of course I can’t. In fact, there’s a good chance it is,” Daniel said.
Although his statement was a nice validation of Gregory’s cynicism, it didn’t do a thing to settle his stomach for the rest of the meeting.