to tell her parents she would have to be transferred to the ExtraChallenge school if she didn't do better. She didn't, and one day she wasn't at school anymore.
(All this time, Fuzzy just sat there. But he was busy. He downloaded Max's records. And then Tabbie's. He looked at the EC school's statistics. This wasn't public information, of course, but the school system's password was easily bypassed.)
"You should not go to the ExtraChallenge school, Max," Fuzzy said.
"Ugh! Not you, too!" groaned Max. "Trust me, I don't want to!"
"Well," said her mother, "then Fuzzy better go home, and you better go study."
"I am contacting Jones," said Fuzzy. "He will be here in approximately forty-five seconds, based on the van's present location."
4.4
MAX'S HOUSE
Fuzzy got up and very politely thanked the Zelasters for dinner, even though he hadn't eaten anything, and for the lovely evening, even though it hadn't been lovely.
When they had been preparing Fuzzy for the Robot Integration Program, Nina had sent him links to several websites about manners and etiquette, and he had created a long list of PoliteBehavior() code.
So when Fuzzy thanked the Zelasters, he was just running the appropriate code. That's what robots and computers do, after all. And when they can't find the appropriate code, they either do nothing or generate an error message.
But not Fuzzy. When Fuzzy couldn't find the right code, he started writing it himself. This was what he was built for. To make a plan to fix an error, not just report it. To keep going . . . like a human has to.
And all through that dinner, listening to Max and her parents, Fuzzy had tried to find the appropriate code for the trouble Max was in. But he couldn't. The problem didn't even make sense, he realized: The scores showed that Max was not smart, but his own analysis showed that she was smart. Smart = not smart. It just didn't work. Something was wrong. He needed to fix it. In fact, he wanted to fix it.
Robots aren't supposed to want things. They are not supposed to like one person better than another person. They aren't supposed to do things they are not programmed to do.
But that's where the fuzzy logic came in: Fuzzy was programmed to do things he wasn't programmed to do.
And so he put all available processing power into creating a new, high-priority subroutine: HelpMax().
4.5
NEAR MAX'S HOUSE
A block over from Max's house, a cargo truck was parked so that the occupants had a view, between two buildings, of the street in front of Max's house.
A man and a woman watched intently from behind the truck's heavily tinted windows. Another man was in the back, staring at qScreens and fiddling with equipment.
"Valentina! The van's pulling up," said the big barrel-shaped man in front.
"The robot must have called them in!" said the woman. "Zeff, did you pick up the transmission?"
"What? No! Maybe!" came shouts from the back.
"Just keep scanning, in case there's another message. Look! Here comes the robot out of the house."
"Doesn't look like much," said the man in the front seat. "Robo-football players move a lot smoother—"
"Would you shut up? There's Jones. Looks like he's got a couple techs with him. Robot's in the van. There they go."
"Should I—"
"No, you shouldn't," the blond woman said, and it was obvious that hers was the final word. "Just watch! I want to see if those three SUVs are guarding them . . . Yeah, there they go."
Two big black nonautomated SUVs passed by Max's house, following the van.
"Hmm, I guess the other one went back to the school already," said the