“C’therax! Over here!” Sp’Rark called. C’therax looked and saw Sp’Rark and Dr. Vyryn by the force field gate. He nodded and ran over to them.
The force field opened. “Dr. Vyryn! It is so good to see you, old friend,” a lanky elderly Martian called out. Dr. Vyryn and the newcomer embraced.
“Sp’Rark, C’therax, this is Dr. Zettabyte. We were old university chaps,” Dr. Vyryn said.
“Aye, those were the days…the galaxy was a bit brighter then,” said Dr. Zettabyte.
“You said you had news?” Sp’Rark said. C’therax choked back a laugh. Sp’Rark never wasted a moment.
Dr. Zettabyte’s face fell and he nodded gravely. He motioned them towards another force field gate and soon the group found themselves in a giant server room.
“This is the central mainframe for High Gloria,” Dr. Zettabyte said. C’therax’s mouth dropped. The server room was a football field long and three stories high. Giant metal pillars studded with blinking lights rose up from the floor. Giant wires ran along the ceiling. Through the windows, twenty-story-tall transmitters could be seen.
“We’ll be needing quotes for the story. You don’t mind if I ask you about the history of High Gloria before we get to the recent hacking attempt?” Sp’Rark said as she pulled out a voice recorder.
Zettabyte nodded. “High Gloria is the most influential build of the operating system Concordia, which was built exclusively to calculate moral outcomes,” he said. “The founder, Dr. Pyrethon, believed that having a program understand moral thinking could help facilitate discussion between people of different beliefs. People, it was argued, would be forced to stop mischaracterizing other peoples’ moral systems; the thought chain behind moral conclusions could be mapped. People would therefore be forced to explain exactly which part of the logic chain they disagreed with. The computer could also point out moral outcomes that were impacted by a single belief changing.”
“And he was met with a ton of resistance?” C’therax asked, though he knew the answer.
“He was almost killed. People thought a machine that could calculate moral outcomes could start a Second War of the Machines. He was saved from death by several faster-than-light starship makers. They argued that in faster-than-light travel, moral situations could come up that needed to be solved at light speed. For example, if you have a faster-than-light drive at full throttle and your engine becomes destabilized, you have several options, all of which lead to portions of your crew dying. Throw the tachyons one way, and one portion of the ship explodes. Throw them another way, and a different part of the ship explodes. Do you save the most people? Do you prefer the lives of officers and other vital personnel over others? Is it worth saving civilian lives if they won’t have the knowledge to drive what remains of the ship home? If enemy prisoners are on board, are you obligated to treat the lives of the enemy as equal to your own men’s? A ship captain at high speed wouldn’t be able to calculate the best moral play before the explosion. If instead there were a computer that could override in specific situations, a better result would occur.”
Sarah will be interested in this, C’therax thought, remembering a human comrade of his.
Dr. Zettabyte continued, “The software also became useful for dealing with new cultures. If you didn’t have an expert in a culture on board, Concordia-based systems could explain how members of that culture could be expected to act in different circumstances. To prevent a possible Second War of the Machines, Concordia systems are barred from most machine learning; all ethical systems have to be directly coded by a living being. They also have multiple override systems as a precaution.”
“And so what went wrong yesterday?” Sp’Rark asked.
Dr. Zettabyte seemed slightly annoyed. “Nothing went wrong—yet anyway—but our systems were under attack for several hours. The hackers must have been brilliant and well-funded, because they almost got through without us noticing. We had to shut down our servers for several hours to prevent them from coming through.”
“And why would anyone want to hack a Concordia-based computer or High Gloria in particular?” C’therax asked.
Dr. Zettabyte looked left and right, then whispered, “The technology used by the hackers was too expensive to come from anyone other than a nation-state.”
“But why would a nation want to hack an ethics computer? It doesn’t have any state secrets, or economic information,” Sp’Rark asked.
C’therax grimaced. “C’rululian generals have to use this system to report on the ethical ramifications of all missions that involve collateral damage. They may want a shift in what is permissible,” he said.
“But wouldn’t people notice if suddenly the computer was recommending massive acts of violence?” Dr. Vyryn asked.
“If they tweaked the machine just a little, they could get away with some missions whose ethical legitimacy is very murky. Or make ethical failures seem less bad. On the other hand, maybe they want the hack to be discovered, so they could argue that if the ethics computer can be hacked, it shouldn’t be used and only human judgement ought to be brought to bear,” C’therax said.
“The Regime has been very vocal in their belief that ‘our military’s hands are tied.’ High Gloria is one of the most restrictive builds of Concordia; it places a very high moral weight on avoiding collateral damage,” Dr. Zettabyte said.
“If they are willing to risk hacking one of the most secure computer systems in the world, it means they expect a war to come soon, and they don’t want any pesky things like morals to get in their way,” C’therax said.
There was an uncomfortable silence.
“Let’s go. We have a story to write. And we will be needing the free press more than ever before,” Sp’Rark said.
To Be Continued…
Made as a Camp NaNoWriMo word sprint.