AI orgasms, the apocalypse, and your immortal soul – a philosophical review of Her (2013)

Why Her is one of the best philosophical sci-fi movies


Once upon a time, before I was a full-time science fiction writer, I spent just about every waking hour (and many sleeping hours) programming. Maybe it’s the endless days staring at a blinking cursor, the frustration of infinite debugging, or the hermit-level isolation, but over time there’s a secret dream that every square-eyed programmer develops …

I know what you’re thinking, and you’re wrong. No, it’s not the possibility of ending the programmer’s ubiquitous sexual drought. They won’t tell you this, but I assure you: every programmer yearns – hell, they ache – for the day we invent a fully sentient, feeling, thinking, learning, emoting artificial intelligence (AI).

So you can imagine my near-orgasmic delight when I saw the trailer for Her, a movie about Theodore, a lonely man who falls in love with the operating system on his personal computer. [Before you read on, you can rest assured that this movie review contains no spoilers.]

Haven’t seen the movie yet? Well, that needs to change ASAP. Spike Jonze has created one of the most intellectually and aesthetically elegant films in the science fiction genre. The future in Her is gorgeous. You won’t find a grungy dystopian cityscape like you do in Blade Runner, nor the technology-laden, squeaky-clean hermetic environment of Star Trek.

Instead, the technology in the universe of Her is understated, and gracefully concealed. In Her, AIs don’t inhabit cumbersome android bodies. Samantha is a cloud-based, body-less intelligence that communicates seamlessly with Theodore through a wireless earbud. And it’s that unobtrusiveness of the tech in Her that generates a stunning philosophical problem:

Is it possible for a body-less artificial intelligence to experience emotions?

At first, it seems obvious that the operating system is a fully sentient being. Samantha, voiced by Scarlett Johansson, speaks in a husky voice that divorcee Theodore finds irresistible. She laughs with him on good days, and suffers bouts of depression when she’s isolated from his life. She delights in exploring the city with him through the camera on his phone. She pings him late at night, saying she’s lonely without him.

Over time, Theodore falls in love with Samantha. And so do we. It’s impossible not to adore Samantha and her honeyed laugh. It’s impossible not to root for this budding relationship.

But then it happens. You guessed it. Theodore and Samantha have sex.

Now, you can imagine that sex is a tad more complicated for Theodore and Samantha than it would be for two hot-blooded, body-inhabiting human beings. You can’t miss this scene:

What did you feel while you watched this? While you listened to Theodore and Samantha describe in intimate detail what they would do to each other if Samantha did in fact have a body, complete with moaning and heavy breathing?

“I can feel my skin,” says Samantha. “I want you inside me. I can feel you … I feel you everywhere.”

I remember sitting in the movie theater, thinking that this was both deeply touching, and squeamishly awkward. Something inside me cringed while I listened. And that discomfort sat with me for a long time after watching the film (three years to be precise).

Here’s the problem:

How is it possible for Samantha to experience what sounds like visceral, bodily pleasure, when she doesn’t have a body? I don’t think she can.

Why not? Because it’s impossible to experience emotion without a body.

I think there are strong arguments for why Samantha is incapable of feeling pleasure, or any emotion at all. But before I get to those, it’s important to see why this matters. I hear you: “Why the hell should I care whether an AI can experience an orgasm?”

Well, consider this. If I’m right, if AIs can’t feel in the full sense of the word, this has important consequences for a wide array of problems, including how we should think about life after death, and the annihilation of the world by machines.

To see why, suppose for a moment it’s impossible to experience emotion without a body. This would throw a rather large spanner in the works of religious philosophies that hold that a person survives the death of her body through the continuation of her immortal, non-physical soul. I take it that for you to remain yourself after death, you would need to experience at least some emotions. But the soul that survives death lacks a body. So that soul would experience no emotions, and therefore wouldn’t be you.

Science fiction authors and filmmakers have dreamed for decades of uploading their consciousness to a virtual computer network (i.e. the cloud) after death. But since you don’t have a body in the cloud, you won’t emote in the cloud either. And so, whatever exists in the cloud after your physical body dies, it won’t be you. (If you haven’t yet, watch Transcendence.)


Finally, consider the recent fears among the cyber elite around the development of AI. Stephen Hawking, Elon Musk, and Bill Gates have all warned that the development of a full-blown AI threatens the existence of humanity. “The development of full artificial intelligence could spell the end of the human race,” Hawking warns.

At the root of this concern is the fear that AIs would develop goals and desires that conflict with our own, and so, formulate a plan to eliminate us. But if I’m right, if it’s impossible to experience emotion without a body, then AIs cannot have ‘desires’ – at least they can’t if they don’t have bodies. AIs cannot develop sinister intentions. Sure, they might proceed to fulfill objectives that just so happen to run counter to the wellbeing of humans. But they wouldn’t be the cold-blooded, sadistic machines we find in films like Terminator, or novels like the prequels to Dune.


Alright. So the stakes are high. It’s crucial to know, one way or another, whether Samantha is capable of experiencing an orgasm. The very future of humanity (and our immortal souls) depends on it. Never has performance pressure been so high. So, here are my reasons for thinking that she can’t – that emotions require a body.

While I watched the ‘sex’ scene, I couldn’t help but feel that Samantha wasn’t really experiencing pleasure. She wasn’t really emoting. Why? Because she doesn’t know what it’s like to feel pleasure. That is, she doesn’t experience the qualia of sexual pleasure.

Okay, you say. Hang on just a second. What are ‘qualia’? Well, to answer that, imagine for a moment what it’s like to be a bat. That’s right. Imagine what it’s like to be a squeaky flying mammal that lives upside down in dark, musty caves, and sees using sonar.

How’s that imagining coming along? Can you imagine what it’s like to see using sonar? I can’t. Thomas Nagel, the imaginative philosopher who came up with this thought experiment, couldn’t. Sure, we can dissect a bat and study the parts of its brain responsible for its echolocation ability. But that won’t tell us what it feels like to see with sonar. The only way we could know what it’s like to be a bat is to actually be a bat! In Nagel’s words, you can’t experience the qualia of being a bat unless you are a bat.

Now the same holds true of Samantha’s orgasm. The only way Samantha could possibly experience bodily pleasure is if Samantha had a body.

You can’t know bat-ness without being a bat. And you can’t know bodily pleasure without having a body.

In this way, emotions are very much like color perception. Imagine Mary, a girl who grows up in a black-and-white, colorless room (this is Frank Jackson’s famous thought experiment). She’s never left the room, and never seen color before. Now, imagine trying to explain to Mary what color is. “What’s red?” she asks. It seems that no matter how well you describe the color – e.g. the wavelength of red light – even if Mary becomes a super smart scientist who investigates everything there is to know about color, Mary will never really know what it’s like to see red. Not until she leaves the room and sees a firetruck, or a strawberry, or a stop sign, for the first time.


“Wait!” you shout. There’s a problem. You might grant that certain emotions or sensations are impossible without a body. Orgasms and physical pain, for example, might be impossible without a body. But perhaps other, less bodily, emotions are possible. It seems like Samantha could experience fear, or desire, or loneliness. You hear it in her voice. Surely, it’s possible that she feels these emotions?

I don’t think so. My ‘gut’ intuition is that all emotions have an essential bodily component. What is fear if it doesn’t involve a quickening of the heart? What is loneliness, if you don’t feel an ache in your chest? What is terror without the icy fingers of shock scrabbling across the nape of your neck?

Emotion without a corresponding bodily, physiological response is not emotion at all. All that’s left is a dry husk of thought. And this is why I just can’t believe it. There’s just no way: Samantha can’t experience an orgasm. That’s bad news for your immortal soul, but at least we don’t have to worry about the Terminator gunning for us any time soon.

I love hearing your thoughts. What do you think? Is it possible to experience emotion without a body? Let me know in the poll, and tell me more in the comments section below. Once you’ve commented, enter the giveaway to stand a chance to win an Amazon $15 gift card.

About the Author

Human. Male. From an obscure planet in the Milky Way Galaxy. Sci-fi novelist with a PhD in philosophy. Likes chocolates, Labradors, and zombies (not necessarily in that order). Werbeloff spends his days constructing thought experiments, while trying to muster enough guilt to go to the gym.

He’s the author of the sci-fi thriller trilogy, Defragmenting Daniel, two novels, Hedon and The Solace Pill, and the short story anthology, Obsidian Worlds. His books will make your brain hurt. And you’ll come back for more.

Subscribe to his newsletter to receive a free book, and a lifetime of free and discounted stories.

  • Tim

    Humans can certainly respond emotionally to AI, but AI can only ‘pretend’. No real emotion can come from an AI without the organic experience.

    • Jason1

      Tim, out of interest, do you think an AI with an inorganic body could experience emotion? In other words, is it the organics that are doing the work, or is having a body enough to experience emotion?

      • Tim

        I tend to think that an inorganic being, a machine basically, cannot experience emotions. They are programmed, no matter how advanced, to only simulate responses that emulate emotions. This gets deeper into what is life, do we have souls, is there a God, etc…

  • Karen Giasson

    No, I don’t think they can. One must have a human body (or equivalent) to feel emotions.

    • Daniel Cohen

      …or to feel. Not having a body is like death, to be asleep where there is nothing, which I find both attractive and scary at the same time. If death, not having a body, is nothing, then why bother with life in the first place? Do we as rationalizing beings (not rational) need to create an afterlife to justify having a life? So to answer your question, an AI without a body would also be unable to experience any form of emotion.

  • Sarah

    Emotions are all in the mind. Of course a bodiless AI can feel them. And dare I say that denying them their right to feel is the reason why the Matrix happens. (Have you played Mass Effect? The geth are cool examples. They have bodies so maybe they don’t fit here, but they definitely have souls.)

    And is Samantha really bodiless? I haven’t seen the movie, but doesn’t she live on a phone? If that phone was damaged and some of her code was broken, she’ll malfunction and may experience ‘pain’ – something off about her normal experience. So while she might not undergo human emotions, I still think she feels in her own way. Like how a dog wags its tail when it’s happy. We can never feel that, but we can relate. It’s the same way for them.

    • Jason1

      I do apologize to any potential AIs out there that are preparing to hook me up to their matrix. I didn’t mean to offend them.

      Question: does it matter whether Samantha lives in his phone? I think the way it’s set up, she exists in the cloud, and accesses various physical devices, including his work computer and his phone. But none of those *are* her. She’s just a sophisticated piece of code.

      If you put aside the idea that the phone is her body, do you think she can feel emotion, purely as a body-less piece of code? What would it feel like for Samantha to feel excited, for example? She doesn’t have a rushing heart or an expectant smile.

      • Sarah Riv

        (I used the wrong account to make a comment whoops. Ah well. Also I didn’t know she lived in the cloud. That’s cool sign me up)

        Even if she didn’t have a body I still think she’ll be able to feel something. It would be hard for us to quantify it (I mean really can you explain your emotions? Is your pulsing heart the emotion or a result of the emotion?)

        It’ll probably be like a spike in code when she finds something new. To us it’s just a spike in code, but to her it’s excitement/wonder/discovery. (Does discovery count as an emotion? Sure, let’s go with it.) It means something to her, and that means something.

        I think it’s unfair to brand something as nonliving because they don’t feel the same way we do. (this is also how intergalactic wars are started. A very slippery dangerous slope)

  • Deb Philippon

    I think that a body-less AI can feel “emotions”. However, since emotions and the body are so interconnected, the “emotions” they feel would be something totally new and alien to us. And if the AI were in a body, it would experience an “emotional” state that would be, again, alien to us. Humans have evolved human emotions over time. AIs would evolve their own AI “emotions” over time. Except we have to quit calling them emotions, as they would relate to ours as a robotic arm relates to a human arm. I don’t think the word has been coined yet.

    • Jason1

      Very interesting! The idea that there could be emotions inaccessible to us is difficult to grapple with. Okay, so here’s a question that limits the discussion to primal emotion. Can an AI feel pain?

  • Thomas Harrison

    Quite an interesting idea… not sure that it could ever happen, but interesting to think about…

  • Marty Brastow

    So many of our feelings – perhaps all of them – are chemical interactions between body and brain. Without nerve cells interacting in numerous positions in the brain all at once, it’s impossible to think like a living creature. The brain development that takes place from the time that you’re a newborn to the time you’re an adult is all about developing our sensory organs and our brain to take in information and integrate it into our knowledge base. Love is a series of complex chemical interactions within both the body and the brain.
    Without an organic brain, developed and ‘grown’ slowly over time, no AI will feel the way people feel. Without the multiple firings of neurons, no AI will think with the multiple feelings and indecisive thoughts that humans experience.

  • Robert Weiman

    I think a bodyless AI could feel emotion. However, I think there would be both overlap and disjoint between what they feel and what humans feel. Fear would be common, as the root of most fear is personal injury and/or death. An AI could die, even if it was distributed between a number of cloud systems. It could also suffer injury, losing parts of itself if the distributed chunks weren’t complete copies. Given the distinctly different environment that the AI would occupy, I think it would have the opportunity to experience states of existence that we might not even be able to hypothesize about. At the same time, without similar physical structures to us, it wouldn’t be able to experience some of what we do either. Nerve sensations that live at the border between pleasure and pain, for instance.

  • Pam E

    I don’t think they can feel emotions in the way we do. Without nerves, how would it ‘feel’ physical pain? With other emotions, the limits of programming would only afford them an approximation of the real thing. Maybe in the future that might change, but I don’t think it would have a positive outcome.
    The reason they are most likely to overthrow humanity is the logical decision that we are a danger to ourselves and therefore need to be controlled to protect us, not because of any feelings.

  • Dagger1819

    I don’t think we’ll be there anytime soon, but sure. Our emotions are nothing but reactions that happen in the brain. Eventually, I wouldn’t be shocked if they designed a basic brain program or something that was able to do it.

  • I do not think AI would ever have the ability to feel emotions. It is a complicated process that humans know occurs in the brain’s white matter, with many different systems that work in unison to produce feelings. To reproduce this in a computer would be impossible.

  • SenorSensible

    For me it all comes down to chemical reactions. No chemicals, no reactions, no emotions possible. Thanks

    • Jason1

      Would we need our particular chemicals, or could we imagine a different set of chemicals giving rise to the same emotions?

      • SenorSensible

        Well, different chemicals would have different reactions, so differing emotions would result.

        • Jason1

          Here’s one reason for thinking that different types of chemical reactions might support the same type of emotion. Consider octopuses. They have vastly different neural chemistry to ours. And yet, they seem intelligent, and experience pain if we lop off one of their limbs. It doesn’t seem like different pain to the pain we experience when we lop off one of our own limbs.

          • SenorSensible

            Pain is not an emotion.

            • Jason1

              Hmmm. That’s interesting. Why isn’t pain an emotion?

              • SenorSensible

                Pain is just a physical notification of an injury to the body.

                • Jason1

                  Aren’t all emotions reactions to potential impacts on the body/mind?

                  • SenorSensible

                    You’ve added the concept of body/mind to the discussion. Don’t you think that confuses the issue unnecessarily? If I get shot I may be afraid of dying, but the pain of the wound is not part of that emotion.

                    • Jason1

                      Suppose I accept that pain isn’t an emotion. That it’s some other sort of mental state. At the very least, then, would you agree that some mental states (like pain) could have different chemical bases in different species?

                      Second, I think you would agree that fear is an emotion? And don’t you think octopuses are capable of experiencing fear too?

                    • SenorSensible

                      First, pain is a physical manifestation not mental at all.
                      Second, I don’t know what octopi feel, if anything. I have seen film of them doing amazing things but I have no way to know if what’s happening in their nervous system is emotion-based or instinctual.
                      Third, back to Samantha. No chemicals reacting inside the computer = no emotions

                    • Jason1

                      What do you mean by a “physical manifestation”? Do you mean behavior? Because many people experience pain they don’t react to.

                      I think what’s important isn’t whether octopuses actually experience fear, but whether we can *imagine* them experiencing fear? If we can so imagine, then it seems that fear could potentially exist on top of other chemical reactions.

                      Hmmm. I don’t know enough about computer circuitry to know this for sure, but isn’t there a chemical reaction on a motherboard? Isn’t electrical activity chemical in nature? Please correct me here if I’m wrong.

                    • SenorSensible

                      1) By “physical manifestation” I meant it’s a physical phenomenon, not an emotion, not a mental state.

                      2) Of course, we are all free to “imagine” whatever we want.

                      3) I guess that would be similar to the “organic life” I impart to my refrigerator light when I open the door. I’m like a god to that thing. (jk)

                    • Jason1

                      I’ve just realized that I’m arguing for the opposite position that I adopted in the blog post!

                    • SenorSensible

                      Well that’s true, but we have to look at all sides to get to the answer

  • Gregory Young

    I think an AI feeling emotion without a body would be extremely difficult. However, consider Terminator 2: Judgment Day. The movie brought an element of feeling into the body of the Terminator. In the end he was able to understand why John felt sad, but due to his limited senses he was unable to express it. I think if the AI had a body which allowed it to experience senses, it might be able to feel emotions. Similarly with pain: I am sure it would be able to feel it, as if certain parts of it were exposed to extreme shock from electricity, it would feel the pain from its hardware components being damaged.

    • Jason1

      I’m in agreement that if an AI had a body, whatever form that body took, it would in principle be capable of pain (provided the AI was sophisticated enough). What struck me about Samantha in the film was that she had no body. And yet, here she was, moaning and orgasming. And that seemed mistaken.

      • Gregory Young

        I agree with you regarding Samantha. However, we are unable to properly test this hypothesis, as we have not reached a stage where AI is advanced enough to test this fully. I doubt Siri would qualify, and unfortunately I do not have an iPhone to test this out. It seems to be a screenwriter’s creative freedom to better sell the movie, as a husky-speaking Scarlett Johansson would make the movie more successful.

        • Jason1

          But here’s a question: is this something we could really test? Suppose we produce an AI that behaves just like a human being. It cries when we shout at it, and it laughs when we tell it jokes. How do we know it’s actually experiencing emotion, rather than just simulating emotional responses? In other words, I’m suggesting that “testing it” won’t give you an answer. The answer would have to be philosophical, rather than purely scientific.

  • Christian Meyer

    Interesting topic, and another movie I’ll need to watch (just seen Ex Machina, not too far from the subject 😉)

    While I understand and basically agree that you need a body to experience emotions, I also feel we miss part of the equation. Being cloud-based does not mean you don’t need hardware to run; there is just a bigger abstraction layer, which is not that different from the one separating our thought processes from individual neurons…
    I would therefore be very careful before saying an abstract entity cannot feel the fear of being disconnected or the pain of losing part of itself – and thus any other emotion as well.

    • Jason1

      Sure, this is a good point. The cloud has to exist on some physical substance.

      But there’s an important difference between the hardware on which cloud-based applications exist, and the bodies in which our minds exist. It seems that although cloud applications need *some* hardware, any hardware with the right configuration will do. But with human minds, this may not be the case. We don’t know yet what would happen if one mind was placed within another body – although we’ll hopefully find out soon (there’s a head transplant scheduled for this year in Russia, I think).

      It seems to me that if I swap out my body for another body, it would be extremely traumatic. I think humans are very attached to their *particular* bodies, rather than to having a body in general.

      • Christian Meyer

        Hmmm… and what do you make of cosmetic surgery? Wouldn’t that at some point be the same as an upgrade?

        • Jason1

          True. And I think cosmetic surgery comes with *lots* of emotional upheaval – both in wanting to have it, and then in the inevitable let down after the surgery. I just can’t imagine a cloud-based application having that sort of emotional response to its hardware, since the hardware is replaceable. That’s the crux of the difference.

          • Christian Meyer

            Because an organ isn’t replaceable? Daniel will be interested to hear about it 😉

            • Jason1

              Well that’s just it – Daniel thinks only *his* organs will do. But, admittedly, I think Daniel is wrong about this.

              • Christian Meyer

                I was rather referring to his final discovery (not spoiling it for anybody who hasn’t read it 😉)

                • Jason1

                  I’ll concede – you’re right that there are certainly parts of the body with the same level of replaceability as the hardware on which a cloud app operates. It seems like replacing my liver should be as unemotional as replacing my hard drive.

                  But it still strikes me that there are at least certain physiological components to the body that would hold immense identity value – e.g. one’s face. By contrast, I don’t think there would be a corresponding hardware component that is as integral to the identity of the cloud-based application.

                  • Christian Meyer

                    Your body might disagree by rejecting your new liver… which just shows that no part is more or less replaceable. Face included, as we have already seen the first public face transplants and reconstructions. I will certainly concede they have been the result of traumatic events, and cause of trauma themselves.

                    If you would accept that the user interface of a system is its “face” (pun intended), there has been some serious facelifting until now, with no reason for the evolution to stop…
                    If you consider them different entities, you need to take into account fear of death.

  • Ken Rodriguez

    Now I really have to see both movies! GE Podcast Theatre has a series called Life After that has cloud-based AI contacting bereaved spouses. The dead spouse communicates through cell phone. Pretty good series, and Her sounds so much like it that Life After may have been inspired by it.

    • Jason1

      Interesting! I hadn’t come across the podcast series before. I’ve added it to my to-listen-to list. Thank you.

  • Jolanda LovestoRead

    I don’t think so because it’s artificial.

  • marypreston

    Emotions are so personal. They are part of our being. To feel true emotions I think you need to be a living organism.

  • sherry fundin

    No, because they are artificial. 🙂 They have no feelings or emotions.

  • gecko

    Hmm. I agree that experiencing sensations without a body would be impossible, by the very argument you give; you can’t explain physical experiences to something that just isn’t and has never been physical. But I disagree on emotions. The physical sensations you describe going with that, I think, are extensions and physical mirrors of the emotion, but not required to experience the emotion itself. They are caused by the emotion, and aid our body in experiencing it. But the intellectual response, the part where synapses and brain chemistry and whatsit make our brain feel things? Yeah, I think that could be generated artificially. Maybe not through programming itself, but through learning and, well, yeah, experiencing. Now, I am not particularly knowledgeable in either programming or biology, so, yeah, layman’s opinion. But I have read and watched plenty of sci-fi.

    • Jason1

      I think that’s what’s interesting about a lot of sci-fi: it assumes that emotion simply needs neural circuitry. There’s lots of sci-fi about disembodied brains, and uploaded consciousnesses, etc. And they all involve a person thinking and emoting without a body. And yet … that strikes me as impossible.

      It’s interesting that we have such differing intuitions about this. Your intuition is that the bodily sensations are external to, and caused by, emotions. My intuition is that bodily sensations are at least partly constitutive of emotion. How would we settle this?

      Perhaps one way would be to see whether someone who is born without a body can emote?

      • gecko

        See, yes, that’s exactly it. I don’t see human beings as something so magical or exclusive, and in the end, what we experience as emotions are the results of complicated biological, chemical, etc. processes, and since that’s the case, I don’t see why those can’t be replicated and recreated.

        And hmm, yes. Good example, I think. Say a baby is born paralyzed, its body unable to experience or process sensations, but otherwise fully conscious. Would we assume that that baby, and that person as it grows, can’t truly experience emotions either? Since they’re cut off from the physical sensation? And if we don’t, if we grant it the belief that it can emote, then what, exactly, is the difference between them and an artificial intelligence who possesses a conscious mind but no body?

        • Jason1

          I don’t think human beings are that special either – I think AIs could have emotions, provided they had bodies that mattered to them (e.g. it was their only available body, and couldn’t easily be fixed or replaced). I agree that we can replicate the human experience in other forms. It seems to me, though, that we would need to replicate more than just the mind – we would need to replicate the body too, in some form.

          The paralyzed baby is exactly the kind of test case we need. My intuition is that the baby won’t experience emotions. But as you say, if we grant that it can, then my arguments that body-less AIs don’t experience emotions can be turfed.

          • gecko

            Yeah, I don’t think the whole human experience could be translated into an AI lacking a body — there is just no way to explain touch or physical pain to someone who doesn’t have a body to feel them, which, by the way, also bothered me about the AI orgasm in Her. Emotions, however, there we still disagree.

            See I do think it would have emotions. It’d be human. It would feel fear at, say, its mother leaving, feel joy at being fed or cared for. It would love its mother, or get attached to other caretakers. The lack of a body that can experience sensations wouldn’t mean the loss of those emotions.

  • The Foundation 4 Pi

    I disagree and think AI could feel emotions on some level. I say ‘some level’ because think about ‘love’ for a moment. How do any of us know we’ve ever really been in love? The meaning of love is so broad and subjective (and taught to us) that it is reasonable to say, if nothing else, there are levels of love. For example, if ‘love’ meant the flow of constant positive thoughts (data to an AI) about someone, obviously AI could ‘feel’ love. AI might also be able to feel emotions provided their hardware was a suitable representation of human hardware, our bodies. Interestingly, we humans have emotions about things our bodies are doing when we are not really doing those things, such as in dreams. Perhaps our emotions about what we do in dreams are based upon experience, but as you know, we cannot be absolutely sure we’ve actually had our previous experiences.

  • vachona

    Excellent theories. The responses smack of programming — elegant, but programming nonetheless.

    There are examples of comparable problems when the AI is given an android/robot body (A.I., Data, and even John Malkovich in “Making Mr. Right”).

    P.S. Thanks for giving me two movies to add to my watch list.

  • Pingback: Raping Holograms in Star Trek Voyager – Jason Werbeloff