Year's Best Scifi 8 Part 38


I did have a third excuse. It was time I dealt with that one, too.

I called Francine.

"Are you free for lunch?" I asked. She hesitated; there was always work she could be doing. "To discuss the Cauchy-Riemann equations?" I suggested.

She smiled. It was our code, when the request was a special one. "All right. One o'clock?" I nodded. "I'll see you then."

Francine was 20 minutes late, but that was less of a wait than I was used to. She'd been appointed deputy head of the mathematics department 18 months before, and she still had some teaching duties as well as all the new administrative work. Over the last eight years, I'd had a dozen short-term contracts with various bodies (government departments, corporations, NGOs) before finally ending up as a very lowly member of the physics department at our alma mater. I did envy her the prestige and security of her job, but I'd been happy with most of the work I'd done, even if it had been too scattered between disciplines to contribute to anything like a traditional career path.



I'd bought Francine a plate of cheese-and-salad sandwiches, and she attacked them hungrily as soon as she sat down. I said, "I've got ten minutes at the most, haven't I?"

She covered her mouth with her hand and replied defensively, "It could have waited until tonight, couldn't it?"

"Sometimes I can't put things off. I have to act while I still have the courage."

At this ominous prelude she chewed more slowly. "You did the second stage of Olivia's experiment this morning, didn't you?"

"Yeah." I'd discussed the whole procedure with her before I volunteered.

"So I take it you didn't lose consciousness, when your neurons became marginally more cla.s.sical than usual?" She sipped chocolate milk through a straw.

"No. Apparently no one ever loses anything. That's not official yet, but-"

Francine nodded, unsurprised. We shared the same position on the Penrose theory; there was no need to discuss it again now.

I said, "I want to know if you're going to have the operation."

She continued drinking for a few more seconds, then released the straw and wiped her upper lip with her thumb, unnecessarily. "You want me to make up my mind about that, here and now?"

"No." The damage to her uterus from the miscarriage could be repaired; we'd been discussing the possibility for almost five years. We'd both had comprehensive chelation therapy to remove any trace of U-238. We could have children in the usual way with a reasonable degree of safety, if that was what we wanted. "But if you've already decided, I want you to tell me now."

Francine looked wounded. "That's unfair."

"What is? Implying that you might not have told me, the instant you decided?"

"No. Implying that it's all in my hands."

I said, "I'm not was.h.i.+ng my hands of the decision. You know how I feel. But you know I'd back you all the way, if you said you wanted to carry a child." I believed I would have. Maybe it was a form of doublethink, but I couldn't treat the birth of one more ordinary child as some kind of atrocity, and refuse to be a part of it.

"Fine. But what will you do if I don't?" She examined my face calmly. I think she already knew, but she wanted me to spell it out.

"We could always adopt," I observed casually.

"Yes, we could do that." She smiled slightly; she knew that made me lose my ability to bluff, even faster than when she stared me down.

I stopped pretending that there was any mystery left; she'd seen right through me from the start. I said, "I just don't want to do this, then discover that it makes you feel that you've been cheated out of what you really wanted."

"It wouldn't," she insisted. "It wouldn't rule out anything. We could still have a natural child as well."

"Not as easily." This would not be like merely having workaholic parents, or an ordinary brother or sister to compete with for attention.

"You only want to do this if I can promise you that it's the only child we'd ever have?" Francine shook her head. "I'm not going to promise that. I don't intend having the operation any time soon, but I'm not going to swear that I won't change my mind. Nor am I going to swear that if we do this it will make no difference to what happens later. It will be a factor. How could it not be? But it won't be enough to rule anything in or out." I looked away, across the rows of tables, at all the students wrapped up in their own concerns. She was right; I was being unreasonable. I'd wanted this to be a choice with no possible downside, a way of making the best of our situation, but no one could guarantee that. It would be a gamble, like everything else.

I turned back to Francine.

"All right; I'll stop trying to pin you down. What I want to do right now is go ahead and build the Qusp. And when it's finished, if we're certain we can trust it...I want us to raise a child with it. I want us to raise an AI."

2029.

I met Francine at the airport, and we drove across Sao Paulo through curtains of wild, lashing rain. I was amazed that her plane hadn't been diverted; a tropical storm had just hit the coast, halfway between us and Rio.

"So much for giving you a tour of the city," I lamented. Through the windscreen, our actual surroundings were all but invisible; the bright overlay we both perceived, surreally colored and detailed, made the experience rather like perusing a 3D map while trapped in a car wash.

Francine was pensive, or tired from the flight. I found it hard to think of San Francisco as remote when the time difference was so small, and even when I'd made the journey north to visit her, it had been nothing compared to all the ocean-spanning marathons I'd sat through in the past.

We both had an early night. The next morning, Francine accompanied me to my cluttered workroom in the basement of the university's engineering department. I'd been chasing grants and collaborators around the world, like a child on a treasure hunt, slowly piecing together a device that few of my colleagues believed was worth creating for its own sake. Fortunately, I'd managed to find pretexts (or even genuine spin-offs) for almost every stage of the work. Quantum computing, per se, had become bogged down in recent years, stymied by both a shortage of practical algorithms and a limit to the complexity of superpositions that could be sustained. The Qusp had nudged the technological envelope in some promising directions, without making any truly exorbitant demands; the states it juggled were relatively simple, and they only needed to be kept isolated for milliseconds at a time.

I introduced Carlos, Maria and Jun, but then they made themselves scarce as I showed Francine around. We still had a demonstration of the "balanced decoupling" principle set up on a bench, for the tour by one of our corporate donors the week before. What caused an imperfectly shielded quantum computer to decohere was the fact that each possible state of the device affected its environment slightly differently. The shielding itself could always be improved, but Carlos's group had perfected a way to buy a little more protection by sheer deviousness. In the demonstration rig, the flow of energy through the device remained absolutely constant whatever state it was in, because any drop in power consumption by the main set of quantum gates was compensated for by a rise in a set of balancing gates, and vice versa.

This gave the environment one less clue by which to discern internal differences in the processor, and to tear any superposition apart into mutually disconnected branches.

Francine knew all the theory backward, but she'd never seen this hardware in action. When I invited her to twiddle the controls, she took to the rig like a child with a game console.

"You really should have joined the team," I said.

"Maybe I did," she countered. "In another branch."

She'd moved from UNSW to Berkeley two years before, not long after I'd moved from Delft to Sao Paulo; it was the closest suitable position she could find. At the time, I'd resented the fact that she'd refused to compromise and work remotely; with only five hours' difference, teaching at Berkeley from Sao Paulo would not have been impossible. In the end, though, I'd accepted the fact that she'd wanted to keep on testing me, testing both of us. If we weren't strong enough to stay together through the trials of a prolonged physical separation, or if I was not sufficiently committed to the project to endure whatever sacrifices it entailed, she did not want us proceeding to the next stage.

I led her to the corner bench, where a nondescript gray box half a meter across sat, apparently inert.

I gestured to it, and our retinal overlays transformed its appearance, "revealing" a maze with a transparent lid embedded in the top of the device. In one chamber of the maze, a slightly cartoonish mouse sat motionless. Not quite dead, not quite sleeping.

"This is the famous Zelda?" Francine asked.

"Yes." Zelda was a neural network, a stripped-down, stylized mouse brain. There were newer, fancier versions available, much closer to the real thing, but the ten-year-old, public domain Zelda had been good enough for our purposes.

Three other chambers held cheese. "Right now, she has no experience of the maze," I explained. "So let's start her up and watch her explore." I gestured, and Zelda began scampering around, trying out different passages, deftly reversing each time she hit a cul-de-sac. "Her brain is running on a Qusp, but the maze is implemented on an ordinary classical computer, so in terms of coherence issues, it's really no different from a physical maze."

"Which means that each time she takes in information, she gets entangled with the outside world,"

Francine suggested.

"Absolutely. But she always holds off doing that until the Qusp has completed its current computational step, and every qubit contains a definite zero or a definite one. She's never in two minds when she lets the world in, so the entanglement process doesn't split her into separate branches."

Francine continued to watch, in silence. Zelda finally found one of the chambers containing a reward; when she'd eaten it, a hand scooped her up and returned her to her starting point, then replaced the cheese.

"Here are 10,000 previous trials, superimposed." I replayed the data. It looked as if a single mouse was running through the maze, moving just as we'd seen her move when I'd begun the latest experiment.

Restored each time to exactly the same starting condition, and confronted with exactly the same environment, Zelda, like any computer program with no truly random influences, had simply repeated herself. All 10,000 trials had yielded identical results.

To a casual observer, unaware of the context, this would have been a singularly unimpressive performance. Faced with exactly one situation, Zelda the virtual mouse did exactly one thing. So what? If you'd been able to wind back a flesh-and-blood mouse's memory with the same degree of precision, wouldn't it have repeated itself too?

Francine said, "Can you cut off the s.h.i.+elding? And the balanced decoupling?"

"Yep." I obliged her, and initiated a new trial.

Zelda took a different path this time, exploring the maze by a different route. Though the initial condition of the neural net was identical, the switching processes taking place within the Qusp were now opened up to the environment constantly, and superpositions of several different eigenstates (states in which the Qusp's qubits possessed definite binary values, which in turn led to Zelda making definite choices) were becoming entangled with the outside world. According to the Copenhagen interpretation of quantum mechanics, this interaction was randomly "collapsing" the superpositions into single eigenstates; Zelda was still doing just one thing at a time, but her behavior had ceased to be deterministic.

According to the MWI, the interaction was transforming the environment (Francine and me included) into a superposition with components that were coupled to each eigenstate; Zelda was actually running the maze in many different ways simultaneously, and other versions of us were seeing her take all those other routes.

Which scenario was correct?

I said, "I'll reconfigure everything now, to wrap the whole setup in a Delft cage." A "Delft cage" was jargon for the situation I'd first read about 17 years before: instead of opening up the Qusp to the environment, I'd connect it to a second quantum computer, and let that play the role of the outside world.

We could no longer watch Zelda moving about in real time, but after the trial was completed, it was possible to test the combined system of both computers against the hypothesis that it was in a pure quantum state in which Zelda had run the maze along hundreds of different routes, all at once. I displayed a representation of the conjectured state, built up by superimposing all the paths she'd taken in 10,000 unshielded trials. The test result flashed up: CONSISTENT.

"One measurement proves nothing," Francine pointed out.

"No." I repeated the trial. Again, the hypothesis was not refuted. If Zelda had actually run the maze along just one path, the probability of the computers' joint state pa.s.sing this imperfect test was about one percent. For pa.s.sing it twice, the odds were about one in 10,000.

I repeated it a third time, then a fourth.

Francine said, "That's enough." She actually looked queasy. The image of the hundreds of blurred mouse trails on the display was not a literal photograph of anything, but if the old Delft experiment had been enough to give me a visceral sense of the reality of the multiverse, perhaps this demonstration had finally done the same for her.

"Can I show you one more thing?" I asked.

"Keep the Delft cage, but restore the Qusp's s.h.i.+elding?"

"Right."

I did it. The Qusp was now fully protected once more whenever it was not in an eigenstate, but this time, it was the second quantum computer, not the outside world, to which it was intermittently exposed.

If Zelda split into multiple branches again, then she'd only take that fake environment with her, and we'd still have our hands on all the evidence.

Tested against the hypothesis that no split had occurred, the verdict was: CONSISTENT. CONSISTENT. CONSISTENT.

We went out to dinner with the whole of the team, but Francine pleaded a headache and left early. She insisted that I stay and finish the meal, and I didn't argue; she was not the kind of person who expected you to assume that she was being politely selfless, while secretly hoping to be contradicted.

After Francine had left, Maria turned to me. "So you two are really going ahead with the Frankenchild?" She'd been teasing me about this for as long as I'd known her, but apparently she hadn't been game to raise the subject in Francine's presence.

"We still have to talk about it." I felt uncomfortable myself, now, discussing the topic the moment Francine was absent. Confessing my ambition when I applied to join the team was one thing; it would have been dishonest to keep my collaborators in the dark about my ultimate intentions. Now that the enabling technology was more or less completed, though, the issue seemed far more personal.

Carlos said breezily, "Why not? There are so many others now. Sophie. Linus. Theo. Probably a hundred we don't even know about. It's not as if Ben's child won't have playmates."

Adai (Autonomously Developing Artificial Intelligences) had been appearing in a blaze of controversy every few months for the last four years. A Swiss researcher, Isabelle Schib, had taken the old models of morphogenesis that had led to software like Zelda, refined the technique by several orders of magnitude, and applied it to human genetic data. Wedded to sophisticated prosthetic bodies, Isabelle's creations inhabited the physical world and learned from their experience, just like any other child.

Jun shook his head reprovingly. "I wouldn't raise a child with no legal rights. What happens when you die? For all you know, it could end up as someone's property."

I'd been over this with Francine. "I can't believe that in ten or twenty years' time there won't be citizenship laws, somewhere in the world."

Jun snorted. "Twenty years! How long did it take the U.S. to emanc.i.p.ate their slaves?"

Carlos interjected, "Who's going to create an adai just to use it as a slave? If you want something biddable, write ordinary software. If you need consciousness, humans are cheaper."

Maria said, "It won't come down to economics. It's the nature of the things that will determine how they're treated."

"You mean the xenophobia they'll face?" I suggested.

Maria shrugged. "You make it sound like racism, but we aren't talking about human beings. Once you have software with goals of its own, free to do whatever it likes, where will it end? The first generation makes the next one better, faster, smarter; the second generation even more so. Before we know it, we're like ants to them." Carlos groaned. "Not that h.o.a.ry old fallacy! If you really believe that stating the a.n.a.logy 'ants are to humans, as humans are to x' is proof that it's possible to solve for x, then I'll meet you where the south pole is like the equator."

I said, "The Qusp runs no faster than an organic brain; we need to keep the switching rate low, because that makes the s.h.i.+elding requirements less stringent. It might be possible to nudge those parameters, eventually, but there's no reason in the world why an adai would be better equipped to do that than you or I would. As for making their own offspring smarter...even if Schib's group has been perfectly successful, they will have merely translated human neural development from one substrate to another. They won't have 'improved' on the process at all-whatever that might mean. So if the adai have any advantage over us, it will be no more than the advantage shared by flesh-and-blood children: cultural transmission of one more generation's worth of experience."

Maria frowned, but she had no immediate comeback.

Jun said dryly, "Plus immortality."

"Well, yes, there is that." I conceded.

Francine was awake when I arrived home.

"Have you still got a headache?" I whispered.

"No."

I undressed and climbed into bed beside her.

She said, "You know what I miss the most? When we're f.u.c.king on-line?"

"This had better not be complicated; I'm out of practice."

"Kissing."

I kissed her, slowly and tenderly, and she melted beneath me. "Three more months," I promised, "and I'll move up to Berkeley."

"To be my kept man."

"I prefer the term 'unpaid but highly valued caregiver.' " Francine stiffened. I said, "We can talk about that later." I started kissing her again, but she turned her face away.

"I'm afraid," she said.
