
“I thought you knew they were the same thing, Bella! Sometimes I forget how stupid humans are.”
[Contains spoilers for Blindsight (Oct 2006), Nightflyers (1980), and The Thousandfold Thought (Jan 2006)]
So, I was reading Blindsight the other day, and guess what it reminded me of?
Neuropath?
No – I mean yes, that too – but there’s a much earlier book with much more direct similarities.
Twilight?
No! Nightflyers!
The TV show?
The George R. R. Martin novel that the show was based on. Consider: A group of around ten people set out on a spaceship to make contact with an alien ship about which almost nothing is known. Along the way, tensions rise, things go horribly wrong between the crew, people are killed, and everything just gets creepier and creepier. Ultimately, they get close enough to discover the real, shocking truth behind the nature of the aliens, and it all ends with inter-crew murders and a crash that leaves only a single survivor. Sound familiar?
Okay….
And it’s not just that. It’s the individual characters too! There’s a captain who was created artificially, a disputably-malicious ship AI backing him that controls corpses of dead crew members to do its will, a woman with physical enhancements…
Wait up! Okay, so you’re comparing Sarasti and Eris; one’s a vampire and one’s a human, but it’s true they were both made through gene-engineering, and they both like to hang out alone in their quarters avoiding the rest of the crew, interacting only remotely, so I see your point. But comparing the quantum AI to Eris’s mother only makes sense in that they both live in a computer. Eris’s mother, however, was once a person and the quantum AI never was – and besides that, Eris’s mother is directly hostile to the crew.
And the quantum AI wasn’t?
Fine. The Bates and Jhirl similarities are also obvious enough, I guess. But what about the rest of the crew? What about, say, Szpindel?
He’s Lommie Thorne. Both of them have exported parts of their existence into computers and machines, while retaining human bodies.
Susan James?
Ah, that one’s a bit complicated, because she’s not really one person, is she? So, I’d say Susan is d’Branin, since they’re both focused on trying to communicate with the aliens. But we can go further than that. Nightflyers has two linguists, Dannel and Lindran, right? But if you count all of Susan’s secondary personalities, Michelle and Sascha and Cruncher, the crew in Blindsight really has four linguists!
I suppose. How about Cunningham?
Christopheris, I guess, since they’re both biologists.
And Siri? Considering he’s the main character, and there’s not a lot of Nightflyers characters left to choose from…
Okay, bear with me here: Thale.
Um, no.
Yes! They both read minds! You can call it faces or surfaces or whatever, but at the end of the day, reading thoughts is reading thoughts.
Yeah, about that…are you sure it’s not just mood affiliation? And what about the fact that Thale dies at the beginning, but Siri survives until the end?
Come on, man. If we did it by death order, Thale would have to be Szpindel.
Fine. But what about Northwind?
Eh. I ran out of Blindsight characters before I got to her.
So much for your perfect analogy, then.
I never said it was perfect, just that there were certain distinct similarities between the two novels.
So, you’re saying Peter Watts…copied?
Not exactly. But you have to admit there’s something going on. Compare the dates.
Perhaps. And you’re mentioning this because you think it sheds light on the message and meaning of Blindsight?
Not really. Actually, I think making comparison with The Thousandfold Thought is more useful in that regard.
I see your point. Watts’s implied argument about the nature of humanity and consciousness shares some similarities with Bakker’s. And, as in The Thousandfold Thought, some events key to that argument require, hmm, literal explanation.
Indeed.
Cnaiür, fine, but Sarasti? What was the motive?
Good question. You have to realize, it’s a very different scenario. Sarasti is much closer in level to Kellhus, maybe even above him, so he can’t be modeled the same way as a level five like Cnaiür. No, you have to assume everything Sarasti does is for a purpose. That is, to understand his motive, we should take Eliezer Yudkowsky’s advice, and assume the actual effected result was what was intended.
And what was the result?
Transformation.
What do you mean?
Consider: Siri starts the book as a sociopath who doesn’t really feel emotions. Everything points to it – from his whole relationship with his girlfriend, to the moment he gets assigned to this probable-death mission and all he feels is a bit of disappointment that he didn’t get to screw with the government more. I found one of the scenes on the ship near the beginning quite funny:

Pot, kettle, and all that – especially when you consider the flashback that follows:


Yeah, looking back, that discussion wasn’t really about Sarasti at all, was it?
Indeed. And then, we know Siri gets his emotions back, because the first thing he does when he leaves his cabin is argue with the other crewmembers, which he never did before because he didn’t care enough.
What I don’t see though, is why then? Siri had been living this way for decades – why wouldn’t his emotions have come back ages ago?
There was no external event to cause it. People don’t just change for no reason – it takes something more: surgery, violence, near-death experiences. Something about drastic events themselves carries the power to remap cognitive patterns. Think of the Zen koan about the master who cuts off the acolyte’s finger, and the acolyte is enlightened. The Buddhists are trying to get away from emotion and consciousness while Siri is going back the other direction, but the pathway is the same.
So when Siri was enlightened, instead of losing his connection with the world and exiting Samsara, he regained his emotions and reentered the cycle instead.
Not exactly regained. More like reconnected.
What?
Peter Watts chose the title of the book for a reason. Go look at it again. What is blindsight? Seeing, but not realizing you can see. Siri does feel emotions, he just doesn’t realize consciously that he is feeling them. Afterwards, he becomes reconnected to himself, and he can feel them again.
That didn’t happen for Conphas.
Because Conphas was born without certain emotions in the first place – there’s nothing for him to reconnect to. Kellhus isn’t wrong when he calls his condition a “defect carried from the womb” – after all, it wouldn’t hurt if it weren’t true. Although, just because Conphas couldn’t learn that particular lesson doesn’t mean there was nothing he should have taken away from the experience. Epistemic caution, say. Actually, Siri too might have done well learning some epistemic caution.
Yeah, like the scene where Sarasti calls him back:

I get that they’re stuck on a ship together, but hell man, I wouldn’t have.
It’s hardly an unusual mistake, though. Even Kellhus could have used some epistemic caution, really. That scene where Aurang possesses Esmenet…still laughing…not to mention almost getting his neck snapped by Cnaiür.
Okay, but if Siri is Conphas in this metaphor, that makes Sarasti…
Cnaiür, yes. The comparison is limited, though not entirely unwarranted. Both have limits on their consciousness that drive them to act on instinct, both have room for dual-thread processing precisely because of that lack of consciousness, and both of these things make them telepath-resistant.
But that doesn’t shed any light on Sarasti’s ultimate reasons. The characters are too different.
Oh, but it does. After all, there’s another comparison between Sarasti and Cnaiür – both were sent.
I suppose, since Kellhus sent Cnaiür, and Sarasti was sent by…
We don’t know precisely, but there must be someone back on Earth who started this whole mission, and who had some sort of goal in mind. Sarasti and the AI, we can be sure, are acting under orders. We only know what Siri knows, and Siri hasn’t seen those orders. But we can infer.
But do we know Sarasti is acting at all? How do we know that it isn’t just the AI the whole time, using Sarasti as a piece of hardware to interact with the crew?
We don’t. Personally, that would actually be my guess.
I’m not too sure. After all, sometimes Sarasti acts quite human for a vampire, much less a quantum-AI.
Hmm? When?
Consider this exchange:

Actually, that bit of dialog seems downright out of character for Sarasti, now that I think about it.
Of course it does, since you’re not interpreting it right.
What do you mean?
You’re not interpreting it literally enough. Consider: conventionally, when people say, “I don’t know what I’m doing,” they mean one of a few things. Either the speaker lacks experience with what they’re doing – which here is too obvious to even need saying, since it’s first contact, and by definition none of them has experience with it. Or the speaker is unsure how to proceed, or unsure whether the actions they’re taking are the correct ones – which is also obviously not the case, since Sarasti is following whatever initial orders he was sent with and clearly proceeding without hesitation, in a manner consistent with his underlying operating principles. No, Sarasti means the words literally: he’s a vampire, and he doesn’t know what he’s doing because he has no consciousness with which to observe his own actions. This is what Siri sees when he finally looks for real – Sarasti isn’t even a sociopath; he’s not a person at all. He’s just like the aliens.
But then, why does he say “forgive me”? Why would a vampire, or a vampire-quantumcomputer-hivemindprogrammed combo entity, want forgiveness from some random human?
It wouldn’t, of course. This is advice to Siri. Sarasti is saying: it’s pointless to be angry at me over what happened, because I’m not even conscious. He might even mean it in the past tense, since in vampire-speak all tenses look the same.
Ah. Now that I look, I can’t see how I missed that.
Because you’re thinking about how it sounds to you, and not how it might fit in with the speaker’s purpose. Watts even talks about this. Words come imbued with the speaker’s intent and implications, not the listener’s. Even Bates makes this sort of mistake about Sarasti.
When?
This part:

Her claim that Sarasti’s actions were without purpose is the equivalent of Achamian, upon being ambushed by the Scarlet Spires in the Sareötic Library, crying, “Think on this, Eleäzaras!” He fails the infamous crayon-box test (not for the first time), unable to model the situation from Eleäzaras’s perspective. It is, after all, Achamian for whom the ambush is a surprise, Achamian who needs to think on it; Eleäzaras has clearly spent days, if not weeks, plotting it, with more than ample time to weigh the costs and benefits. The same goes for Bates telling Sarasti his actions had no purpose – just because you don’t know the purpose doesn’t mean there is none; it just means you haven’t thought about it enough yet.
But comparing Sarasti to Cnaiür isn’t quite right then, because Cnaiür really does do things for almost no reason at all. Sarasti is more like Kellhus, since he works with hidden purposes.
Really? I think Siri is more like Kellhus.
What?
Like I pointed out before, Siri reads minds, like a telepath – or a Dûnyain. Besides that, Kellhus is more similar to Siri than even Conphas when it comes to emotions, since Conphas experiences most emotions with only a few like shame missing, whereas Kellhus experiences almost no emotion at all, which is much closer to Siri.
No…I still don’t think that comparison works. Mind-reading and psychopathy aside, Kellhus was at the top of his world hierarchy, in control of everything, while Siri seems kind of, well, low on the ladder.
Well, that’s more because of differences between the worlds, than differences between the characters.
What do you mean?
Consider – the world of The Thousandfold Thought is low-technology. Bene Gesserit eugenics is the closest they can get to gene optimization, and besides that, they have no computers. Kellhus is at the top because there is nothing better there to outcompete him. But Siri’s world is high-tech, post-singularity, with gene editing and AIs running around everywhere. The level cap in Siri’s world is much higher, so to speak. So although Siri and Kellhus are around the same level, Kellhus is at his world’s level cap, while Siri isn’t even close. On Siri’s Earth, Kellhus wouldn’t be faring nearly so well as he did in Eärwa – although he might still have done a bit better than Siri, if merely due to differences in IQ and training.
Fine. But we’ve made so many comparisons – and do any of them even help us understand Watts’s purpose?
Yes! We see Siri and Kellhus face the same tradeoffs between abilities and feelings, and between ability and humanity. As soon as Siri reconnects his emotions, he becomes much less able to read others. Why? Because there’s no more room. There’s only so much room in a consciousness, according to Watts, and experiencing one’s own emotions and thoughts leaves that much less room for modeling others. There’s a reason Dûnyain have to shut off their own personal consciousness to enter the probability trance. For Kellhus too, the more he becomes invested in relationships with the other characters, the less he is able to take optimal actions to reach his goals. If connection with others is what makes us human, and emotion is the necessary glue for those connections, then both contribute to a tradeoff between humanity and efficiency. This tradeoff is what Watts is trying to show us.
But what is the answer, then? Do we choose humanity, or do we choose efficiency?
It depends on your goals. That’s what makes it a choice.
But there must be some way to have both, right?
What you’re looking for is the unity of passion with calculation, to be both greater than human and less at the same time. Such an answer is not obvious – rather, it is the sort of thing that must be sought.
But not found?
Depends on how far you’re able to look.
And did you look?
What do you think?
Well, then – TELL ME. WHAT DO YOU SEE?
