Identity
Prologue
Geno Electronics had made a revolutionary breakthrough in robotics.
They had created androids known as Maidroids that, as the name implies, were designed for domestic services and, in the case of some models, sexual ones as well.
These Maidroids were equipped with a complex A.I. capable of independent thought and emotion; one might even argue free will. Admittedly, this “free will” consisted almost entirely of the desire to serve their owner, so such arguments were generally laughed at.
The Maidroids were one of Geno Electronics’ top-selling items. There was, however, a large slew of complaints and a half dozen lawsuits about them, the major theme of which seemed to be that the Maidroids were really dumb.
This is not to say incompetent; they did what they were designed to do perfectly. The issues arose when it came to other, often simple and mundane, things. To put it simply, the Maidroids had not a drop of common sense.
One example was a man who got fired from his job when he had his boss over for dinner and his Maidroid happened to tell the boss how her owner really felt about him.
Another good example was a Maidroid who failed to grasp the significance of putting clothes back on after sex before answering the door for nuns collecting for charity.
Luckily, this was not a major issue. Most people were perfectly happy with their ‘dumb machines’; they did what they were supposed to, and their owners had limited expectations of anything else.
It was, however, an issue the company wanted looked into and corrected for the second-generation models.
This is what brought Dr. William Fredrick, head designer of the Maidroids, and Dr. Isabelle Fox, another designer on the project, before the board of directors.
“Let me start by saying there is nothing inherently wrong with the Maidroids,” Dr. Fredrick explained. “The logic, emotion, and learning programming all work exactly as they are supposed to, though we may wish to add modesty to the emotions of the non-sex models. True, none of these are close to what you would call human, but they should be good enough to grasp what you would call common sense. The problem is a lack of practical, worldly understanding. However, given enough time, the units should self-correct.”
The doctor turned to the board.
“Huh?” One of the members spoke up. “What does that mean?”
“It means that they lack experience,” Dr. Fox cut in. “But they can learn.”
The Chairman spoke up. “We don’t care if they’ll eventually learn,” the man said with a hint of irritation. “We want them without this problem from the get-go.”
“I’m getting to that,” Dr. Fredrick stated. “Now, straight programming won’t fix the problem; the amount of information a person learns in just the first five years of life is staggeringly complex, and it is almost impossible to create a program that comes anywhere near it.”
“Are you saying you can’t do anything?” the Chairman asked.
“Not at all, it just required a little thinking outside the box,” the doctor replied. “I have come up with two possible options. Option A: we take a Maidroid and basically teach it, then use its memory as the base for other Maidroids. The downside of this is it will probably take at least 10 to 20 years for it to build a large enough experience base to use.”
“That doesn’t sound very viable,” the Chairman stated. “What’s B?”
“We cheat.” The board looked perplexed. Dr. Fredrick continued, “Basically, the idea is we copy the memories of a human and use those as a base.”
There was a silence; then the Chairman spoke. “Don’t you think there might be a problem with Maidroids having human memories?”
“The original prototype will definitely have them,” Dr. Fredrick explained, “but the overall idea is to take the practical data from it without keeping the memories themselves.”
The Chairman seemed to consider this for a moment.
“You might have trouble finding a woman willing to have her memories used for this, given that some units are basically sexbots,” the Chairman said, “and it will have to be a woman; consumers might have issues if it were a man.”
“I’m willing to volunteer,” Dr. Fox spoke up, “with a few provisions, of course.”
She walked up to the Chairman and dropped some papers in front of him.
“Retaining ownership of the raw data, ownership of prototypes with raw data, one percent of profits on Maidroids made with the practical data,” the Chairman read as he looked over the sheets. “Nothing too unreasonable. We will have to talk this over. We’ll get back to you.”
Dr. Fredrick and Dr. Fox walked out of the room, then out of the building. They stopped outside the doors.
Dr. Fox took out a cigarette and lit it.
“So, do you think they’ll accept?” Fox asked as she took a puff.
“You do know this is a no smoking area, right?” Fredrick stated.
Fox looked over at him, “Bite me.”
Fredrick shrugged. “Yes, they’ll probably accept it, but are you sure you want to do this?”
“Well, someone had to,” Fox explained. “They wouldn’t even have considered it otherwise. The Chairman was right. No one would want to donate their memories, with or without the sex-toy angle; the idea of someone having access to every aspect of your existence is pretty disturbing.”
“Then why are you…?” Fredrick began.
“Because if we didn’t come up with something workable, I suspect it would have been bad for us,” Fox replied as she threw her cigarette on the ground and stepped on it, “and, if this works, I’m rich.”
Fox went to her car, and Fredrick went to his. The next day, they got the go-ahead for the project.