Sirhan shivers. Aineko is staring up at him, unblinking. For a moment, he feels at gut level that he is in the presence of an alien god: It's the simple truth, isn't it? But – "Okay, I concede the point," Sirhan says after a moment in which he spawns a blizzard of panicky cognitive ghosts, fractional personalities each tasked with the examination of a different facet of the same problem. "You're smarter than I am. I'm just a boringly augmented human being, but you've got a flashy new theory of mind that lets you work around creatures like me the way I can think my way around a real cat." He crosses his arms defensively. "You do not normally rub this in. It's not in your interests to do so, is it? You prefer to hide your manipulative capabilities under an affable exterior, to play with us. So you're revealing all this for a reason." There's a note of bitterness in his voice now. Glancing round, Sirhan summons up a chair – and, as an afterthought, a cat basket. "Have a seat. Why now, Aineko? What makes you think you can take my eigenson?"
"I didn't say I was going to take him, I said I'd come for him." Aineko's tail lashes from side to side in agitation. "I don't deal in primate politics, Sirhan: I'm not a monkey-boy. But I knew you'd react badly because the way your species socializes" – a dozen metaghosts reconverge in Sirhan's mind, drowning Aineko's voice in an inner cacophony – "would enter into the situation, and it seemed preferable to trigger your territorial/reproductive threat display early, rather than risk it exploding in my face during a more delicate situation."
Sirhan waves a hand vaguely at the cat: "Please wait." He's trying to integrate his false memories – the output from the ghosts, their thinking finished – and his eyes narrow suspiciously. "It must be bad. You don't normally get confrontational – you script your interactions with humans ahead of time, so that you maneuver them into doing what you want them to do and thinking it was their idea all along." He tenses. "What is it about Manni that brought you here? What do you want with him? He's just a kid."
"You're confusing Manni with Manfred." Aineko sends a glyph of a smile to Sirhan: "That's your first mistake, even though they're clones in different subjective states. Think what he's like when he's grown up."
"But he isn't grown-up!" Sirhan complains. "He hasn't been grown-up for —"
"– Years, Sirhan. That's the problem. I need to talk to your grandfather, really, not your son, and not the goddamn stateless ghost in the temple of history, I need a Manfred with a sense of continuity. He's got something that I need, and I promise you I'm not going away until I get it. Do you understand?"
"Yes." Sirhan wonders if his voice sounds as hollow as the feeling in his chest. "But he's our kid, Aineko. We're human. You know what that means to us?"
"Second childhood." Aineko stands up, stretches, then curls up in the cat basket. "That's the trouble with hacking you naked apes for long life, you keep needing a flush and reset job – and then you lose continuity. That's not my problem, Sirhan. I got a signal from the far edge of the router network, a ghost that claims to be family. Says they finally made it out to the big beyond, out past the Böotes supercluster, found something concrete and important that's worth my while to visit. But I want to make sure it's not like the Wunch before I answer. I'm not letting that into my mind, even with a sandbox. Do you understand that? I need to instantiate a real-live adult Manfred with all his memories, one who hasn't been a part of me, and get him to vouch for the sapient data packet. It takes a conscious being to authenticate that kind of messenger. Unfortunately, the history temple is annoyingly resistant to unauthorized extraction – I can't just go in and steal a copy of him – and I don't want to use my own model of Manfred: It knows too much. So —"
"What's it promising?" Sirhan asks tensely.
Aineko looks at him through slitted eyes, a purring buzz at the base of his throat: "Everything."
"There are different kinds of death," the woman called Pamela tells Manni, her bone-dry voice a whisper in the darkness. Manni tries to move, but he seems to be trapped in a confined space; for a moment, he begins to panic, but then he works it out. "First and most importantly, death is just the absence of life – oh, and for human beings, the absence of consciousness, too, but not just the absence of consciousness, the absence of the capacity for consciousness." The darkness is close and disorienting and Manni isn't sure which way up he is – nothing seems to work. Even Pamela's voice is a directionless ambiance, coming from all around him.
"Simple old-fashioned death, the kind that predated the singularity, used to be the inevitable halting state for all life-forms. Fairy tales about afterlives notwithstanding." A dry chuckle: "I used to try to believe a different one before breakfast every day, you know, just in case Pascal's wager was right – exploring the phase-space of all possible resurrections, you know? But I think at this point we can agree that Dawkins was right. Human consciousness is vulnerable to certain types of transmissible memetic virus, and religions that promise life beyond death are a particularly pernicious example because they exploit our natural aversion to halting states."
Manni tries to say, I'm not dead, but his throat doesn't seem to be working. And now that he thinks about it, he doesn't seem to be breathing, either.
"Now, consciousness. That's a fun thing, isn't it? Product of an arms race between predators and prey. If you watch a cat creeping up on a mouse, you'll be able to impute to the cat intentions that are most easily explained by the cat having a theory of mind concerning the mouse – an internal simulation of the mouse's likely behavior when it notices the predator. Which way to run, for example. And the cat will use its theory of mind to optimize its attack strategy. Meanwhile, prey species that are complex enough to have a theory of mind are at a defensive advantage if they can anticipate a predator's actions. Eventually this very mammalian arms race gave us a species of social ape that used its theory of mind to facilitate signaling – so the tribe could work collectively – and then reflexively, to simulate the individual's own inner states. Put the two things together, signaling and introspective simulation, and you've got human-level consciousness, with language thrown in as a bonus – signaling that transmits information about internal states, not just crude signals such as 'predator here' or 'food there.'"
Get me out of this! Manni feels panic biting into him with liquid-helium-lubricated teeth. "G-e-t —" For a miracle the words actually come out, although he can't tell quite how he's uttering them, his throat being quite as frozen as his innerspeech. Everything's off-lined, all systems down.
"So," Pamela continues remorselessly, "we come to the posthuman. Not just our own neural wetware, mapped out to the subcellular level and executed in an emulation environment on a honking great big computer, like this: That's not posthuman, that's a travesty. I'm talking about beings who are fundamentally better consciousness engines than us merely human types, augmented or otherwise. They're not just better at cooperation – witness Economics 2.0 for a classic demonstration of that – but better at simulation. A posthuman can build an internal model of a human-level intelligence that is, well, as cognitively strong as the original. You or I may think we know what makes other people tick, but we're quite often wrong, whereas real posthumans can actually simulate us, inner states and all, and get it right. And this is especially true of a posthuman that's been given full access to our memory prostheses for a period of years, back before we realized they were going to transcend on us. Isn't that the case, Manni?"