it. He almost
smiled.
“That is a very interesting question,”
Kilgore said. “And well stated, I might add.”
Anita blushed. This thing, this inanimate
object, had made her blush. What a strange world, Sidney
thought.
“Denlas-Kaptek made an arrangement with the
university. I was enrolled in the medical program, and in exchange
Brown University was to be given full access to me during the course
of my study, a prominent place in the promotional literature for
the robotic physician marketing campaign, and of course monetary
compensation. With regard to the secrecy of the program, any
students enrolled in courses I was currently scheduled to take
were required to sign confidentiality agreements. I did not
see these agreements myself, but I understand them to be very
thorough and quite punishing if violated.”
“And during this period of schooling, you
chose your field of practice?” Sidney asked. His voice carried a
note of doubt. He wondered if Kilgore would pick up on
it.
“You have doubts regarding my ability to
choose a field of practice?” asked Kilgore.
“Well, perhaps a few,” he answered. Sidney
was impressed. He wondered how the robot would interpret
skepticism. Not that it had feelings to hurt. Not in this model. So
that shouldn’t be an issue.
Why, then, was something nervous tugging at
the back of his mind?
“Because I am not human?” the robot
asked.
“In a word, yes.”
“Is there a statute that requires all
doctors to be human?”
“Not that I am aware of.”
“Then why would you assume I would not be
able to make that type of selection and continue my education and
subsequent residency to become a fully licensed medical
doctor?”
“Perhaps because of my own poor
understanding of how the programming at so advanced a level works.
Perhaps,” he added before he thought better of it, “because of my
own prejudices.” That one could have backfired. But again, only if
emotions played a part, and with this model they didn’t.
“Perhaps because you’re nothing but a
machine,” said Anita, looking at her pad, still scribbling.
It was offhand, a cavalier remark, which
Anita cast out like an idiot savant fisherman who throws a line to
a shark with a toy rod and catches it. Sidney and Kilgore were both
silent. Both were staring at her. It took a moment for her to
realize the conversation had stalled and she looked up.
“What?”
“That was an interesting statement,” said
Kilgore. “One might even argue that it bordered on rude.”
“But you don’t have an emotive processor, so
you don’t process rude. Or do you?”
“I think it is perhaps too simple to say
that I do or do not process rude. I do not process it in the same
way a human might. I do not feel angry when someone is rude to me,
calls me a name, or insults me. But I do have a comprehensive
understanding of what rude is, both in speech and in behavior. I do
recognize it. Therefore, one could say I do process it.”
“Okay, so you recognize it. So what?” It was
as if she were being deliberately antagonistic. What’s up with
this girl? Sidney thought.
Again the robot was silent and merely stared
at Anita. She stared back, waiting for a response. Sidney was far
less comfortable and wondered what would happen if a robot decided
it was in fact angry.
He needed to take back control of the
conversation.
“It interests me that you consciously
consider yourself a doctor.”
“I am a doctor.”
“Technically, yes.”
“Technically?”
“What I mean to say is that—at least as far
as I understand from my preliminary reading on the subject—the
other medical robot prototypes do not consider themselves doctors.
They are skilled in the administration of treatments for many major
types of illnesses, but they are not doctors. They have no degrees,
per se.”
“Please do not forget that I did attend an
institute of medical learning.”
“And that same institution made the
conscious