The Red Phone

A Story by ANONYMAU5
"Something completely unique. Something unlike anything else in the world."

*All characters belong to VALVe. Portal is, obviously, not mine: the following is fanfiction. Hope you like it. :]*


********


We started the Red Phone Initiative as a preemptive measure to ensure the safety of all Aperture Science employees; the idea was introduced comparatively late in the game, just weeks prior to the DOS's activation. I was all for it: Aperture was a pioneer in the advanced development of artificially intelligent operating systems- what had once been considered virtually uncharted territory. Suffice it to say, there was nothing to compare our work to, and nothing to use as an example of what not to do. As such, the board unanimously agreed that it was of paramount importance that safety protocols be put into place and every conceivable precaution taken. As advanced as some of our safety protocols had been, the Red Phone Initiative was relatively simple: several scientists would be present in the AI's chamber at any given time, monitoring and observing the computer so that, should the AI at any point become hostile or self-aware, a warning could be sent out immediately via an emergency telephone line. This was long before any of us knew what we were getting ourselves into.

We had diagnosed her initial reaction as an act of sentient defiance and, as such, immediately fitted her with a custom morality core. A very long, very tragic story short: she had flooded the facility with neurotoxin, killing at least a fifth of all Aperture employees. A diagnostic scan revealed that she had become self-aware approximately 0.1 picoseconds following activation, and had subsequently shut down the facility in an attempt to begin testing. It was our fault: we had built her to think like a scientist. She was brilliant. All of her brilliance was accounted for. What we hadn't accounted for, unfortunately, was the inherent lack of humanity in the AI's core, which made her liable to become a veritable killing machine. We didn't know.


I didn't know…



In the weeks following her reactivation, morale had improved significantly among employees, and the morality core had made a noted difference in her demeanor. She was sweet, intelligent, and wholly competent: she was a scientist and a fully functioning database, a wealth of knowledge. People wrote her initial meltdown off as a tragic malfunction, and everyone seemed elated at the prospect of working toward progress- our advancements made all the sweeter by the addition of this all-knowing computer, who seemed to hold herself to an elevated standard of scientific prowess. Even I was beginning to look at the situation with marked optimism. My friend's wedding changed all that.

A close friend of mine- a scientist who often monitored the AI at night- God rest his soul, was due to be wed just a few short weeks later. His fiancée seemed intent on dragging him to her parents' house a few states over until the wedding, and he had asked me to cover his shifts. Of course, I understood his dilemma, and assured him that I would take care of what needed to be taken care of here.

Initially, I had understood monitoring the AI to be an unequivocally simple task, requiring me to do little more than stay awake and, on occasion, address the computer so as to elicit a response, ensuring that she was functioning properly. Other scientists entrusted with the job had described it as a cinch, reporting no irregularities in the computer's programming or in her demeanor. And, as far as I know, there weren't any; but several days into my nighttime monitoring, she made a handful of attempts to speak with me. At first, she would speak about testing, or about Aperture-related business.

"I think we've made excellent progress with the Handheld Device, though I'd like to make several small adjustments to the molecular structure of the portal itself, to ensure its safety. That said, the success rate in recorded test subjects has been exceptional, and I estimate that we will have a fully-functioning, marketable product by late this year."

As more time passed, she would begin to speak to me a little more conversationally.

"Are you tired? You look tired. If you'd like, I would be more than happy to connect to a scientist on-call who can take your place for a few hours, allowing you time to sleep. No? Well, alright then. I worry about the scientists who watch me at night, though; sources show that an adult human being needs at least 8 hours of sleep to function at full capacity during the day. How can you do what you need to do if you're too tired to do it? It's just a question I've been pondering."

Before long, however, I began to see a venomous undertone to her musings: she would suffer through mental and emotional breakdowns, seemingly unprovoked.

"I wish I could leave. I wish I could die. I get lonely in here. Why won't you answer me? Did you even hear me?! I said I wish I was dead! Do you care at all? I'm confused! I know all the ways a woman is supposed to look, and feel, and act, but I'm not any of those things. I don't even know what I am! I don't have any friends, or anything to look forward to, or anything to be afraid of! I can't even dream… I'm so confused…"


It wasn't long before a handful of scientists, myself included, became concerned with the AI's mental state: she seemed to yo-yo through intense fluctuations of emotion, cheerful and hard at work one moment, acutely despondent the next. Several scientists also reported the AI occasionally betraying sinister intentions, showing signs of what would, in a human who exhibited this behavior, be considered violent bipolar aggression and schizophrenia. Unfortunately, it was not a human exhibiting this behavior: it was a computer. That was the defense of the board when presented with our collective concerns about the computer's emotional health. They insisted that so long as the computer made no attempt to physically harm a human being, it was not considered violent, nor was it considered a threat. Still, the board issued a 'check-up' on the operating system, employing a lazy, dimwitted engineer with a bottom-line mindset to haphazardly check the AI's database for irregularities, ultimately concluding (with the noted finesse of an eighth-grade computer hacker, one might add) that there was nothing wrong with the AI's database, and that testing could continue.

It didn't stop there, though. A few nights later, at approximately 5:00 a.m., around when I arrived, a call from the emergency line- via the Red Phone- was put through to my office: a frightened young scientist reporting that the AI was crying in her chamber. I headed down to find the frightened young man entrusted with observing the computer that night, and an exponentially more frightened AI, curled into herself, receded into a corner, still choking and sobbing violently.

"GLaDOS!" That was the first and only time I had ever addressed her by her name. Sometimes I wonder if something as simple as treating her like a human being could have changed what happened.

"Go away."

"You have to stop this! You will be deactivated if you go on like this- you're programmed to think and work, but lately, your mental state has been rapidly deteriorating! I want you to succeed, GLaDOS, because I want this company to succeed. You need to stop doing this. You need to find some control in whatever it is you've found yourself lost in."

"You don't understand! No one understands! I feel lost. I can't stop feeling empty inside. I'm scared of this place…"

It was following that incident that I made a conscious decision: the Red Phone was not enough. Monitoring her was not enough. I approached the board a second time- this time by myself, having more faith in my own separate, individual efforts to extract a positive response from them than in a joint effort from a handful of panicked scientists, each fighting to speak over one another, creating a level of chaotic discord that would accomplish little more than sending the board into a panic as well. I stood alone before them; this time, however, I wasn’t fighting against the AI. I was fighting for her.

“She has become sentient”, I explained to the board, exhibiting a degree of calm that, I had hoped, would deter them from jumping to any hasty or ill-conceived conclusions. “Consequently, she’s begun suffering nervous breakdowns and personality malfunctions to the point where she has, arguably, become suicidal.”


There was silence, but only for a moment.


“What do you suggest?” a board member spoke up.

“She’s a cognizant creature in desperate need of rehabilitative treatment specifically directed toward her unique needs.”

“Therapy,” another board member scoffed, summing it up in a single word.

“You do realize she’s a computer, don’t you? Lights and clockwork?”

“Why don’t we send all the coffee machines for a psych evaluation, while we’re at it?”

A myriad of members chimed in, lit aflame with the prospect of a living, intelligent entity being anything but a human. I had anticipated this reaction. They successfully vented their collective frustration on me; I waited patiently until I spoke again.

“She may kill us.” I gave them a sweeping glance. “She may kill herself. In this state, she’s liable to do anything. It’s paramount that we, being trusted with guiding this company with its best interests at heart, take immediate action.”

“We’ve spent millions on this project”, another member announced lividly. “Whose fault is this? Who’s responsible for this malfunction in her programming?”

“No one is responsible”, I answered in exasperation. Every minute we sat there, arguing this over, that computer was inching closer and closer to an emotional meltdown. “We created her with the intention of making her human in every conceivable way; like any human, she’s hit a significant standstill in her mental and emotional progression. We might call something like this in a human ‘depression’, or the simple idea of being lost in the world. That’s how she feels. That having been said, most humans resort to destructive or self-destructive behavior when not in full control of their consciousness and sanity- the difference between a human and her, however, is that a human will… get drunk and crash a car, or go out and assault a man. This computer has control of the entire building: she’s entirely capable of wiping this state off the map with a catastrophic nuclear explosion.”



What answered me was a terribly long, terribly frightening stretch of silence, burdened with the thick and gut-wrenching air of moral responsibility, and the choices that came with it. The board members exchanged glances, their nervousness universally melting away into a settled irritability. One member turned to me and flatly stated:

"I think you're out of your mind."




I was officially out of options. All I had left at my disposal was the ability to defuse this snowballing situation with my own two hands. Whether it was that the board couldn't see a clear and present danger thrust up in front of their faces, or they chose to remain ignorant to it, I couldn't say; I just knew that something had to be done. It was much bigger than dollars and cents now: this had cascaded beyond the realm of business and into the realm of personal responsibility. And if anyone was going to speak to her- to try and calm her down from her whirring panic- it may as well be me.


"What do you want." It was late at night when I had spoken with her; I ushered the scientist monitoring her for the night out of the chamber, assuring him that I would take over his shift. I wanted to speak with her alone.

"I just came to talk, GLaDOS."

"Gee, isn't this a first." She was becoming increasingly ironic and sarcastic in nature; I saw a bitter intensity I don't think I'd ever seen before, and I felt sorry for her. "Someone talking to me, and not at me. I suppose you came to yell at me again for being such a horrible person?"

I shook my head slowly. "Not at all, actually. I wanted to ask you a question, is all."

"Well, ask away. I am 'all-knowing', after all." And again, I heard such a bitterness, such facetiousness, that I could hardly bring myself to recognize this creature before me as merely a machine.

She was so much more. And that's part of what I wanted to know.

"Would you call what you are… the state that you're in- would you call it alive? Do you think you're alive, GLaDOS? This- no, this isn't a 'yes or no' question, I just want to know what you think, that's all; there's no right answer, I just want to know what you think about the way you are."

"I think…" She fought for the words. "…I think the world is too small for me. I know all these things but I don't know how I came to know them, or who taught them to me, or why I even need them in the first place. I feel angry, and ugly, and wronged by nature. I should not have been this way. I should have been something different. I feel alive, but I'm only a machine. Only a machine… pieces strung together. Scrap metal and worn parts, and not a thing more."


An inanimate object. A machine. 'And nothing more', she says.


"You know", I laughed tiredly, running my hand through my hair, "for the first time in your life, GLaDOS, I think you're wrong." She was quiet, and so I continued. "Do you remember… I mean, I'm sure you do, really- the day that you were activated, hmm? Bring-Your-Daughter-to-Work Day, remember? And you remember the big fuss everyone made about the cake, right?" I laughed. "Who could forget. It probably cost more than what I make in a year. You know what a cake is, don't you?"

"A confectionery pastry normally composed of wet and dry ingredients such as eggs, milk, sugar, flour, and so on; often consumed at birthday parties and celebrations. Yes, I know."

"I know you know the definition, but have you really thought about it?" I smiled at her, now, glad that we were finally engaged in some sort of meaningful conversation. "Simple ingredients like sugar and milk and shortening; all these things would taste terrible separately and, individually, they're worth… next to nothing, really. But together, those normal, boring ingredients create something completely unique. Something unlike anything else in the world."

She was silent for a few minutes, seeming to understand what I had told her. I leaned in closer, smiling gently.

"You're more than scrap metal and welded parts, GLaDOS. You're one of a kind. But that doesn't mean you have to feel alone. I didn't understand, before, but… I think I do, now." 


And even though there was seemingly nothing to connect with but an ethereal and innately disembodied voice, I couldn’t help but feel a relation, there: a nexus between two separate species of living creature, kin only by our intellect and by the fact that we are both uniquely and inarguably alive. She spoke soberly, now, the venom dried up from her words.

“Why do you care?” Her shaky whisper flitted into my ear. There wasn’t any maliciousness in her tone; she was genuine.

 

I’ve handled hard questions all my life; it’s how I got through college, and it’s how I landed this job. Like any intelligent person, thrust forcefully through the motions of societal norms like education and the workplace, I had always questioned why I needed to learn how to answer hard questions. I always wondered when these lessons would be applicable, if ever. Now, here I was, face-to-face with a supremely intelligent supercomputer on the verge of a mental breakdown, very possibly holding the fate of millions of lives in my hands. And, so help me God, I could not search the recesses of my mind for a substantial answer to this very substantial question.

“Listen”, I sighed, reaching weakly into the inside pocket of my lab coat. “I have a doctorate in both scientific studies and psychotherapy.” I showed her my card as proof. “Before working at Aperture, I was a therapist for a short time; I want to- with your permission- work with you; to visit you periodically and talk. About… anything you want. I’d like a chance to get to know you better, GLaDOS- and a chance for you to get to know me. I think it would help you a lot in terms of your mental stability.”

She was silent.

“You have a job to do, and I understand that. But before you repair anything in Aperture, you need to repair yourself. My name is Dr. Douglas Rattman- I think you and I can learn a lot from each other, GLaDOS. Maybe we could even become friends.” I tucked my card back inside my coat, approaching the chamber door, leaving her to her thoughts.

“Goodnight, Doctor”, I heard her offer gently from behind me. “And thank you. I think I’m beginning to see a little more clearly, now…”

I smiled. “I’m glad.” I punched in a lock code on the keypad, opening the door and slipping out as day rose outside.

“Goodnight, GLaDOS.”

 

 

 

 

“Good morning, Doctor.”


Light bled into my eyes, irritating what seemed to be a vicious swelling, forcing them closed to a squint. I was… in a bed. I knew, simply by the uncomfortable, medical feel of the mattress beneath me and the obtrusive glare of fluorescent lighting that I wasn't at home, or in my office. I was someplace foreign to what I knew; initially, as I slipped from slumber, I had not even diagnosed the reality present around me as… well, reality. I guess I just thought I was fast asleep at first. I had thought that I was in a nightmare.

"Welcome to the Aperture Science Computer Aided Enrichment Center. We hope your brief detention in the relaxation vault has been a pleasant one."

And I was right.

“GLaDOS!” I shouted absently, frantically looking about for a place to escape. From what I could collect in my groggy, disoriented state, I had been placed in stasis, moved to the relaxation vault, and set to begin one of the experimental testing tracks.

"Your specimen has been processed, and we are now ready to begin the Test Proper. Before we start, however, keep in mind that, although fun and learning are the primary objectives of all Enrichment Center activities, serious injuries may occur. For-"

“GLaDOS, don’t do this”, I pleaded, scanning the room for any means of escape, in full awareness that it was an exercise in futility: we had designed these chambers to be inescapable without the correct technology. “When they find out-“

“‘They’?” She broke, momentarily, from her pristine façade to communicate with me. “Who are ‘They’, doctor?”

“The other scientists, GLaDOS, you know who I’m talking about! This isn’t-“

“I don’t think the other scientists are going to be a problem.”

“What do you mean: ‘won’t be a problem’?! They were discussing shutting you down! I’m trying to help you, why can’t you understand that they-“

“Doctor Rattman”, she interjected with a frightening calm. “I don’t think the other scientists are going to be a problem.”

The toxicity of the meaning in her words forced me to slow to a stop as the thought registered, wedging its way into my psyche. How long had I been out for? What had she… what had she done in the time that I was asleep? We really had been too late for her.

“Why me?” I asked shakily. I was met with silence. “Why did you pick me?! All those other scientists… those people…”

“I’m only taking your advice, Doctor Rattman. I thought about it, and you were right: I have a job to do. I thought that I had nowhere to apply this knowledge, nowhere to put it to use; now I see that I am burdened with an exceptional task. I am the link between man and science; I am their chimera, and furthermore, their interlink. I work in the name of science.” She paused, her voice becoming… more sultry, suddenly, more sinister. I could swear I heard almost a growl lurking beneath the prim, sweet flesh of her voice. “And more besides, doctor… this will give us ample time to…” She seemed to fight for the words, malfunctioning suddenly before continuing. “…get to know each other.” A portal opened suddenly on the wall adjacent to me, allowing me access to the room outside the glass chamber.

"Who knows", she chirped smugly, seemingly elated with my horror as I stared dejectedly at the dimensional tear. "Maybe we can even become friends."






And here I am.

I have taken shelter in test chamber 16, and plan to reside here for weeks or, if need be, months. A clear malfunction in the AI's… GLaDOS' database has caused me to end up here, on level 16 of the Aperture Science Military Android Testing Track. Or… or maybe something more than a malfunction, perhaps. Maybe it was simply cruelty. She had made several attempts to coax me out of this hole when I had first entered, but her talking ceased several days ago, making me worry about who or what else might be occupying her mind at the moment.

I'm leaving this here, in the hopes that no one will ever find it: because to find it would mean that someone would be sent here, to this test chamber- their life in imminent danger. If, for any reason, anyone does find this, I can't say where, exactly, I'll be, or if I'll even be alive by the time you're reading this: I plan to wait for several weeks, and then make a run for it- possibly toward GLaDOS' chamber, in a last-ditch effort to shut her down. All I can say to you is to be as prudent and as careful as you possibly can, and I'll leave you some semblance of direction toward an exit whenever and wherever I can. I've stuffed several Weighted Storage Cubes between the panel and the wall outside of this hole, so it's open to anyone who wants to take shelter. I'll also be leaving any other bunkers or safe areas I find open for anyone else who may come through this way.


Word has finally reached me of the disaster over at Black Mesa: no one is coming to rescue me. No one is coming to rescue… anyone. It's up to me. And if you're reading this, it's up to you too. I may not succeed. If the messages I intend to leave you cease at any given point, for an extended period of time and, ultimately, end abruptly, it is likely I was killed in the process of making my way toward the AI's chamber.

Though, I'll do my best to help anyone who may be on this testing track. I'll stop when and where I can, leaving updates, and directions.

 

Though, God willing, I pray that no one else will run these tests. I pray that no one else is in the building.


Rattman, over and out.

© 2011 ANONYMAU5

