Ray Kurzweil, virtual sex, and the horror of our future vision

This one is depressing and distressing to write, but I feel I need to do so because there is a lesson to be learned.

Decades ago, Ray Kurzweil received lavish praise for his accuracy in predicting the future (an accuracy I mostly dispute), media attention for his books and his more outlandish claims, and — for all that — surprisingly little traction with the general public. Somehow, despite Kurzweil’s prodigious welcome in major media, no one seemed to really pay attention; perhaps they just enjoyed looking at the show, as if it were a circus. Nowadays, the very things he spent his time advocating are everyday conversation: intelligent machines, brain-computer interfaces, pervasive computing, etc.

Often, Kurzweil’s predictions almost land where he wanted them, and to a greater or lesser extent this permits a little retconning to make it sound as if he made an accurate claim: like when he decided that smartphones were the fulfillment of his promise that clothing would be Internet-enabled. Most of the time, the retcon is trivial, and I don’t generally care whether he’s right or wrong. After all, people who predict the future are almost always wrong. But there are real dangers to note in how he and others miss the mark.

Now I want to stop for a moment and state that I think Ray Kurzweil is an interesting author. And he’s clearly a brilliant innovator. The one time I met him, I also found him very nice. So that’s a lot of good stuff. I hope that he is, in fact, generally a nice person. I have no guarantees of that; I just hope it’s the case. As it happens, I find his faith in the digital future religious and thus speculative (though not necessarily wrong), and I think his predictions are usually further from reality than he wants. I do not currently believe Ray Kurzweil is a bad person or a fool, but I think his predictions come with real risk, especially if we see them through his rose-colored glasses.

A recent example has me deeply concerned. In The Age of Spiritual Machines (1999), Kurzweil traces an interesting rhetorical move: join this future because machines are cool and their development inevitable → if that doesn’t suffice, join this future because the sex will be great → if that doesn’t suffice, join this future because religion will be great. It’s actually a rather surprising pathway.

So right now you’re probably pretty curious about what will make sex and religion great again. Let’s forget religion for now (even though it’s what I talk about in books like Apocalyptic AI and Futureproofing Humanity). Instead, Salt-N-Pepa style, let’s talk about sex.

Kurzweil states that conventional reality and virtual reality will blur into one another, a blurring made possible by augmented reality glasses/contacts, brain-computer interfaces, and eventually nanobots that allow us to see the physical world as something other than it is. In this world, Kurzweil says, we can use virtual/augmented reality to improve our sex lives by seeing someone other than the person in front of us. In The Age of Spiritual Machines, he projects into the future, claiming:

  • “Virtual touch has already been introduced, but the all-enveloping, highly realistic, visual-auditory-tactile virtual environment will not be perfect until the second decade of the twenty-first century. At this point, virtual sex becomes a viable competitor to the real thing…Today, lovers may fantasize their partners to be someone else, but users of virtual sex communication will not need as much imagination. You will be able to change the physical appearance and other characteristics of both yourself and your partner. You can make your lover look and feel like your favorite star without your partner’s permission or knowledge.” (147)

I am not worried about whether he accurately described the technologies of the 2010s. I am worried about what he wants from those technologies. Just to be fully clear, I’ll quote him again:

  • With neural implants, by the fourth decade of the twenty-first century, “you will be able to have almost any kind of experience with just about anyone, real or imagined, at any time.” (148)

And in the book he published shortly afterward, The Singularity Is Near (2005), he writes:

  • “Consider that you can be with your favorite entertainment star…Imagination is nice, but the real thing — or, rather, the virtual thing — is so much more, well, real.” (318)

That’s enough for our purposes, though there are other dubious passages that could be raised. When I used to teach The Age of Spiritual Machines in a class on religion and psychology, we were pretty universally disturbed by how Kurzweil’s vision would destroy a sense of trust and care between people. After all, there’s something decidedly distressing about your partner wanting to see someone else when intimate with you. Much worse is his claim that you’d be entitled to do so without their permission or knowledge. Imagine also, if you will, what it would feel like for someone (an entertainment star, a classmate, or anyone at all) to have another person describe to them how that star/classmate/neighbor/whatever was made sexually available to the describer’s digital control.

And now, here we are, with Grok — in particular — known for undressing women and girls. But it’s not just Grok. Such things can be done with Stability AI’s tools and with other AI technologies. We live in an age of digitally fabricated revenge porn and of unwilling participants in deepfake videos. This horror, which is a source of profound distress for many people, is the missed landing of Kurzweil’s vision for AI and sex.

This should warn everyone that our predictions may seem thrilling to some people when the technologies are purely imaginary, but those imagined technologies run the risk of becoming terrifying to many, many people when they are made real. We need to do better. If we are to keep developing these technologies, we must do so in a way that reduces the harms they can create. I am not content with the promises and predictions of Kurzweil’s successors. Sam Altman, Elon Musk, take your pick. It all looks like a disaster because they haven’t learned a fundamental lesson.

All AI ethics is human ethics.

Please, let us do better. Let us resist horror and provide shelter.