Redwood
Fiction

Automata (Repulsed), Part One

Zoe Mitchell

October 2021
Date: 986:002:01
          Around the world there is blood. Hundreds of incidents of transport ships plowing through civilian streets are reported all at once. A momentary blip in the flight mechanics’ command system causes ships to go flying in the wrong direction.

          The real cause of the deaths? The AI pilots within the transports. When left with the choice of blowing up the goods within the transports by hitting a building, or spiralling into pedestrians and saving the cargo . . . well, you know what they chose to do.
          “Turn that blasted signal off!”
          The hushed murmurs of the meeting are silenced as the CEO of Robotic Takeover, Ray, storms to the tablet coms. He furiously taps the screen as the newscaster yammers on.
          “God, I hate the news, always exaggerating the issue.”
          He stops himself as he notices the room staring at him. Stifling silence fills the cubicle as the news cuts off.
          Ray clears his throat and launches into the meeting. They argue over ethics for hours and get nowhere.
          “AI are just too dangerous without a moral compass.”

986:005:01
          Robotic Takeover launches a new campaign: The Automata: Bring Humanity to AI.
          “These new automata will be just as intelligent as the previous models, but they will be programmed to feel.”
          Behind the scenes, three years went into designing a vessel that could simulate human emotions in given contexts. Robotic Takeover had put all the money they had made from the transport companies into creating situations that required AI to act on “emotions.” The trial studies, which took only seconds, involved inserting AI data into different programming simulations and testing whether that changed their decision patterns.
          The end product was a droid-like doll into which AI chips could be inserted, allowing for AI to experience human sensations. Touch, sight, emotions — the systems would overload the AI consciousnesses, searing the human experience into their programming before putting them back into the virtual cloud.
          Many reporters asked why it was necessary to go to such great lengths to do something that could easily be programmed into every AI in a matter of seconds. Ray would shake his head, explaining that AI is not simply a set of programs, but a consciousness. What he had created before were consciousnesses without emotions — rational thinkers. Each was similar, but not the same. Emotions would cause irrational actions to be taken. He had designed AI that way because of human flaws, especially in split-second decision making, knowing his machines would make the rational decision every time. But after the crash of 002:01, public pressure had forced him to find a way to modify consciousness itself.
          Reluctantly, he poured his soul into creating the automata prototype.

986:005:56
          She — she? — pulls in the shield of her visual capacities. Instantly, she is flooded with the programs around her.
          “Sight. You are seeing things, looking at things,” a signal from the doll’s wiring indicates.
          She looks around her station point. The 0s and 1s used to describe images and sights within her do not prepare her to see . . . this? The sight is soft on her programming.
          “This is a wall. It is flat, tall, blue.”
          She moves toward the wall. She puts a hand on the wall. She processes the sensation that occurs with contact. The signal indicates she has walked, she has touched, she has felt the wall.
          She spends the rest of 005:56 exploring her new senses. The wiring adjusts and explains what it is she’s perceiving: new words associated with the feelings she has.
          “Try speaking, Shuko,” Ray commands, walking through the doorway.
          She tries to activate her speaker systems, but no sound comes out.
          “Open your mouth.” Ray’s command is less audible, quieter.
          Signals flow through the doll she is in, up to her face, her lips.
          “S —” Static takes away her words. “Sh — Shuko . . .”
          “Yes. You are Shuko.”
          “Y— are Shuko . . .”
          “I am Ray. You are Shuko. No — argh.” Ray walks near the wall, his hand touching his head.
          Signals rearrange her — my — programs. I open my mouth.
          “I am Shuko.”
          Ray looks at me. The edge of his mouth moves upward.
          “Yes. Yes! You are Shuko.” Ray walks quickly toward me and wraps his arms around me.
          “Finally you work! I . . . two years and . . . oh, it all paid off. Now I have to work on the others. Ah! I have so much I need to do now that everything is working.”
          He walks out the doorway, and I cannot see him anymore. I hear him talking to himself. My programming warns me that he is most certainly going to do something irrational, but I stay. Why should I stop him?
Continued in Part Two.
Copyright © 2019-2022 Redwood Literary Magazine. All rights reserved.