Raping Holograms in Star Trek Voyager

A discussion of sex, love, and rape on the holodeck of Star Trek Voyager.

If you’ve read my fiction, you’ve probably gathered that I’m not a religious man. But I do secretly worship a particular entity – Star Trek Voyager. The philosophical issues raised in the series are mind-boggling (e.g. see chopping up children for Tuvix). But there is one aspect of Voyager that I find troubling.


When it comes to depictions of intimacy in the series, it often feels like a teenage boy wrote the episodes. Characters tend to stand on top of each other when they interact, and seem to have no respect for personal space at all. But put that aside. An uneasiness grew in me when I watched a series of questionable sexual encounters on the holodeck.

Okay, before I continue, I want to make it known that I’m not conservative about sex (much of my novel, Hedon, is set in a gay bathhouse). I’m no social justice warrior either. But something … mmmm … something is amiss on the holodeck.

The problem begins when Tuvok, the ship’s resident Vulcan, becomes desperately ill. It turns out that Vulcans, the galaxy’s ultra-rational thinkers, experience a period of uncontrollable sexual rage every seven years. They call it the ‘Pon Farr’. If they don’t have sex with another Vulcan during the Pon Farr, they will die. Yup, you read that correctly. Sexual frustration is so serious in Star Trek, it causes death.

The difficulty is that Tuvok’s ship, Voyager, is stuck in the Delta Quadrant, decades’ travel from Tuvok’s wife. Unfortunately, the loyal Vulcan won’t have sex with anyone but her. The solution? You guessed it. The handy engineers on the holodeck construct a holographic replica of Tuvok’s wife, complete with her personality, and a body Tuvok can … well, there’s no delicate way to put this … fuck.

Everyone, including Tuvok, is overjoyed with this solution. Pats on the back all round. So it’s no wonder that the same solution is employed a few seasons later to resolve another Vulcan shipmate’s Pon Farr (Ensign Vorik), and an equivalent sexual rage experienced by B’Elanna Torres, a horny Klingon. (For the record, you don’t want to be trapped in a room with a horny Klingon seized by ‘Blood Fever’.)

Here’s my question:

Is there something wrong in programming a hologram to want sex with you?

Tuvok has sex with the hologram on condition that it’s indistinguishable from his wife, both physically and psychologically – otherwise he would feel that he’s acting unfaithfully. The hologram believes she is a person – believes she is Tuvok’s wife. A person capable of agency, free to make her own choices. But the reality is that she has no choices, at least not when it comes to sex. The Mrs. Tuvok simulacrum is programmed to lust only after Tuvok.

The hologram is being used purely as a sex object. And yet, this object has thoughts, desires, and aspirations. Suppose Tuvok treated a flesh-and-blood person in this way. Suppose he eliminated her ability to choose (say, with a powerful drug), and brainwashed her into wanting sex with him. This would be considered rape. Shouldn’t we, therefore, consider his treatment of the hologram, rape?

“Too fast!” you shout. Yes, you might agree that if the hologram were a person, then this would be rape. But, you argue, the hologram isn’t a person. The hologram is just a representation of computer code. It may look and sound like a person – but it’s merely a simulacrum. A hologram isn’t sentient. And as such, it can’t be wronged.

It can’t be raped.

Now it just so happens that one of the best subplots in Voyager centers on a holographic character, namely, the Doctor. The Doctor is introduced to the crew in the first episode of the series as the Emergency Medical Hologram, after the ship’s human doctor is killed. The holographic Doctor is a grumpy, sardonic narcissist who quickly earns the disrespect of the crew. But over time, the Doctor grows. He develops a bedside manner. He augments his programming so that he can sing opera. Eventually, he falls in love. He even learns to captain the ship, and saves the crew on multiple occasions. He earns Captain Janeway’s trust, and is granted the status and rights of any other flesh-and-blood member of the crew, although he never loses his megalomania. The Doctor is, I think you’ll agree, the best character on Voyager.

It is clear that in the Voyager universe, at least, the Doctor is considered a person. It seems odd then, that the holograms used for sex aren’t also considered persons, and given the same rights as the Doctor.

Fine, you say. Alright. You’ll concede that there’s an inconsistency in the way the Doctor is treated, compared with the way the Mrs. Tuvok hologram is treated. But this inconsistency, you might argue, should have been resolved the other way. The Doctor shouldn’t have been given the status of a person. Neither the Doctor, nor the Mrs. Tuvok hologram, has rights. Neither is sentient. Neither can be raped.

One reason for adopting this hardline position on holographic rights is the claim that holograms can’t experience emotions. Indeed, in a previous blog post, I argued that Samantha, the AI in the film Her, cannot experience emotions. Why, then, do I think that the Doctor does have emotions?

The reason I argued that Samantha can’t experience emotions is that she lacks a body. The Doctor and the holographic Mrs. Tuvok, on the other hand, do have bodies. Sure, those bodies are composed of photons rather than of matter, but their bodies are concrete objects capable of lifting a glass, or shaking hands, or anything else a human body can do.

This is the reason why in the stories I write, artificial intelligences are embodied. In Dinner with Flexi, the protagonist is a sex bot who tries to escape the control of her johns. In Falling for Q46F, the android’s greatest pleasure is to allow undead humans to gnaw on his forearm on Fridays.  And in my latest series, Defragmenting Daniel, Margaret is a bloodthirsty android in search of human body parts to replace her cybernetics.

All my android characters are designed to elicit sympathy precisely because they experience the world through their bodies.

Bodies are vulnerable to injury. Vulnerable to coercion. And in Star Trek Voyager, vulnerable to rape.

I want to conclude with a thought about love. In season 6 of Voyager, Tom Paris creates a virtual holiday destination by constructing a holographic Irish town, called Fair Haven. The Voyager crew take turns unwinding in this slow-paced, pre-technological holographic village. But the story takes an interesting turn when Captain Janeway develops a romantic interest in the barkeep, Michael Sullivan.

Sullivan is, of course, a hologram. As such, Janeway feels no guilt when she abandons him after a one-night stand. But due to a glitch, the holographic town can’t be switched off, and Fair Haven’s characters develop. Sullivan becomes angry with Janeway, and confronts her, saying how hurt he is. The result? Janeway realizes she was wrong to treat him this way. She considers reprogramming him, to make him want her again and forget her wrongdoing. But she realizes this would be wrong. Why? It would undermine his autonomy – in the same way the holographic Mrs. Tuvok’s autonomy is undermined when she’s programmed to want sex with Tuvok.

Janeway chooses not to reprogram the dashing barkeep, and wins back his trust. Interestingly, over time, she falls in love with him, and they develop a relationship.

It seems to me that if one can love a hologram, and be loved in return by the hologram, then one can rape a hologram too. The capacity to love and be loved is sufficient for the capacity to be raped. (This raises interesting issues around whether sex with an animal is rape – but that’s a whole new blog post.)

What do you think? Is it possible to rape a hologram? Did Tuvok rape holographic Mrs. Tuvok by programming her to want sex with him?

I’d love to hear your thoughts. Vote in the poll. And tell me more about your position in the comments section below.

About the Author

Human. Male. From an obscure planet in the Milky Way Galaxy. Sci-fi novelist with a PhD in philosophy. Likes chocolates, Labradors, and zombies (not necessarily in that order). Werbeloff spends his days constructing thought experiments, while trying to muster enough guilt to go to the gym.

He’s the author of the sci-fi thriller trilogy, Defragmenting Daniel, two novels, Hedon and The Solace Pill, and the short story anthology, Obsidian Worlds. His books will make your brain hurt. And you’ll come back for more.

Subscribe to his newsletter to receive a free book, and a lifetime of free and discounted stories.

  • The Foundation 4 Pi

    There’s a razor-thin margin for error here. There may be some considerations to take into account, though. Tuvok’s virtual wife doesn’t know she’s being manipulated, but if ‘she’ is capable of feelings, did she not benefit from the encounter? She was also conscious during the encounter and, if asked, would have said she was being rational in sleeping with her ‘husband.’ On the face of it, this seems a far cry from other, more despicable instances of rape. (Though it might be argued rape is rape regardless of its execution.) Even though she apparently benefited from the encounter and didn’t know she was being manipulated, I think this is similar to traditional human beings being subject to ’round the clock manipulation. And when we realize it, we don’t usually cry foul (not in my experience or the experience of most people I personally know anyway) unless the result is true physical or psychological damage. As for giving the holographic ‘people’ rights, why not, since we extend such rights to corporations who neither act intelligently nor have bodies. This, of course, is a can of worms as you have so eloquently pointed out.

    • Jason1

      Excellent points. To respond to one of them: wouldn’t the hologram’s not knowing she’s being manipulated make the manipulation worse, rather than better?

      Your example of round the clock manipulation being okay is interesting as a counterexample to this. Could you give a case where you think round the clock manipulation is morally okay, and the victim of the manipulation shouldn’t have an issue with it once they find out?

      • The Foundation 4 Pi

        People are manipulated round the clock in such instances as, say, having Fox News on their TV all the time, a channel practically dedicated to fake news and unquestionably slanted. Perhaps I’m giving too much benefit of the doubt, but I’m confident most Fox News viewers know this. I’ve also been manipulated by women before. I won’t mention any names (Evil Kim) but although I ultimately received the short end of the relationship stick, so to speak, I benefited enough from said relationship to not be terribly upset by certain revelations. If one enjoys the manipulation (ex: Fifty Shades of Grey) is the manipulation morally repugnant? I lean towards ‘no,’ but I’m not sold on that position, not yet anyway. That aside, Michael Pickard makes an interesting point, that Tuvok’s ‘wife’ and the doctor have simulated emotions, which implies they are not real, valid emotions. If they, as ‘persons’ found out they were manipulated and this caused them psychological distress, well, wouldn’t those emotions be simulated as well?

  • Linda Szymoniak

    Given that a hologram, as it exists in our real world, has no mass and is just an image, it would be impossible to rape it. If, however, you could give it some sort of mass, so that you could actually touch it, then I’d say it was wrong. Of course, a hologram isn’t human and doesn’t have feelings or thoughts of its own. But it is an image of a human. Personally, I think rape is wrong in any instance. A holographic image can’t give consent, so therefore it would be wrong to “rape” it.

    • Jason1

      On the holodeck, there are two settings – one which allows holograms to physically interact with matter, and one that disables this ability. They never discuss how photons are able to interact with physical matter, but it’s taken for granted that they can. Presumably, Tuvok would only be able to ‘relieve’ himself if the interactive setting was switched on.

  • ClickClick

    To each his own, but I don’t believe it could rape.

  • Sandy Macmillan

    Unless Mrs Tuvok would not have consented, then I don’t consider it to be rape. A hologram is not real, but in this case, it was based on a real person.

  • Wordwizard

    A hologram is just light. There’s no body, no personality. The whole “holographic person[ality]” in Voyager is just “Let’s pretend” in a fantasy, not SF way. So, there’s no there, there, to worry about.

  • stickerooni

    I think that there are two lines of reasoning here. The hologram either A) has no choice in the matter because it is programmed to behave accordingly, but is therefore an inanimate object and does not qualify for protection from something such as rape; or B) the hologram develops a personality, as you give examples of, giving it protection under certain humane-treatment laws, but in this case the hologram in question, Mrs. Tuvok, has consented to the sexual intercourse. In both cases there is no rape.

    • Jason1

      Good points. Here are some possible responses:
      A) It’s possible that the hologram doesn’t have a choice as to whether or not she has sex, but is capable of lots of other sorts of choices – all the choices a person would have (what food she enjoys eating, what movies she likes watching, etc.). This wouldn’t necessarily render her an ‘inanimate object’, or not-a-person. Compare a flesh-and-blood human who is coerced out of her choice to refuse sex: she isn’t thereby reduced to inanimacy, or non-personhood, since she retains choices in other areas.
      B) I think this is the more plausible horn of the dilemma – that Mrs. Tuvok has personhood, and so, has rights. But, you argue, she’s consented, and so, there’s no rape. The problem, though, is that I don’t think she has consented. Is it consent when it was impossible for her to decide otherwise? Because of her programming, she couldn’t refuse, and so her agreement doesn’t seem consensual.

      • stickerooni

        I think we’re going to go back and forth here.

        A) If she has ‘rights’ such as what food she likes then she falls under the protection of laws to be treated humanely as a being of individual thought, then see “B)”.

        B) I think you hit the nail on the head when you used the word “program.” If her programming doesn’t allow her to have a choice in the matter, then see “A).”


  • Arthur Shunk

    I guess you could argue forever on when the AI developed an identity. If it was programmed to APPEAR to be Tuvok’s wife, you could also argue he was the one violated. I think I remember the line of reasoning on the doctor being a person was that over time, he was given more computing power and his AI developed far beyond the original programming.

    My issue with the doctor was always the transporter technology. Why the hell would you need a sick bay if the transporter, as part of its function, completely mapped the person transported? In some episodes they mention diseases picked up on away missions being removed during transport. So, if someone is hurt, why not run them through the transporter and fix the issues?

    The transporter would also be more powerful than any photon torpedo or other weapon. It can completely turn matter into energy and then re-assemble the original matter. Couldn’t they just release that energy wherever they want, say adjacent to an enemy ship (after all, shields (ha))? Makes no sense.

    Of course the biggest problem is that if the Holodeck exists, why would anyone ever leave?

    • Jason1

      Love your idea of Tuvok being the violated party – that’s an interesting twist.

      Regarding the transporter technology, this is exactly the idea behind my novel, The Solace Pill. In the book, people stop going to doctors, or taking medicine, or eating or sleeping for that matter, because they can simply reconstitute themselves healthy, nourished, and rested. The mechanism I use is 3D printers rather than transporters, but really, it serves the same function.

      In terms of using the transporter for offensive purposes during battle, it is used that way. But shielding on the target ship stops this from happening most of the time.

      • Arthur Shunk


        Regarding the shields, that is why I said adjacent. I know the shields block a transporter signal in the Star Trek universe, but I imagine they would only be able to handle so much energy absorption before failure.

        So, when the 3D printers are used to replicate your perfected self, what happens to the original body?

        Personally, I am hoping for some artery cleaning nanobots soon, at the very least.

        • Jason Werbeloff

          Point taken regarding the shields only taking so much damage. I guess Star Trek fanatics would say there’s too much interference caused by the shield to beam too closely to them. I’m not sure.

          That’s the kicker: what happens to the original body? It’s pulped, and recycled for use as ink in future printing. This raises the question: do I die when I’m pulped? That’s the key issue I explore in the book.

  • sherry fundin

    It’s not real, so no rape, but I wonder….is it just another form of porn?

  • Pam E

    If the Doctor (brilliant character) is given all the rights and privileges of a person, that should be the case for all holograms who have a physical body. So the fact that she was given no choice but to want sex with Tuvok makes it wrong.
    Tuvok knew this wasn’t actually his wife, and even had conditions for using her, so he is guilty.
    You’d think that after the Doctor, Janeway’s love for a hologram, and the knowledge that holograms can learn emotionally, the crew would have treated all of them differently, rather than as tools.

    The animal blog post would be …… interesting to say the least.

  • Michael Pickard

    At the conclusion of Star Trek Voyager, I remember a scene where there are dozens of holographic Doctors working in a mine as slaves. Yet the original Doctor has married and is living the good life. Then there’s the courtroom scene in Star Trek The Next Generation when it is ruled that Data can refuse to be transferred and then disassembled because he has rights despite being an android. Holograms exist as a result of programming. Data existed as a result of unique engineering. In neither case are they human beings that were born. Despite the fact that Tuvok’s wife has simulated emotions and Data had a simulated personality, in both cases they were running algorithms that caused them to pretend to be human even though they were not.

    • Jason Werbeloff

      Point taken – I agree that Doctor and Data aren’t human. But the real question is: are they *persons*? We can imagine non-human persons. For example, suppose aliens arrived tomorrow, more intelligent than us, and friendly too. They wouldn’t be human, but they would be persons. And as persons, they would have rights, such as the right to choose whether or not to have sex.

      • Michael Pickard

        And, they would have been born in the unique way their species procreates, as opposed to created as simulations. BTW, I took “person” to mean “human” even though we’re both science fiction authors. Separate question: friendly aliens arrive, we kill one of them, and learn that they’re not “persons” (by your definition) but simulations of real person aliens on their ship. AND, killing one of their simulations requires the real person to be killed (by their weird alien code of ethics). Ponder that one – or write a story about it.

        • Jason Werbeloff

          Good to know you’re a fellow SF author, Michael.

          Actually, that case you raise is the premise for an excellent movie I watched last week (which I’ll be blogging about soon) – Surrogates. Have you seen it?

          So you think that the origin of a being/person is crucial when deciding whether or not they’re a person? That’s interesting. Why, though, would someone else programming me be any different from my genetic code programming me? In both cases, I’m programmed. In both cases, I don’t control my programming. So why think that computer code is any different from genetic code in determining my personhood?

          • Michael Pickard

            I have not seen Surrogates, but I’m not surprised that someone wrote a story/screenplay based on my hypothetical. Back to the topic: The genetic code inside me is (at the moment) not easily reprogrammed. For example, with a few lines of North Star BASIC, I can’t become two inches taller and female. However, the material inside Tuvok’s wife (essentially energy) can be whatever my creativity invents. And we saw Data undergo dramatic simulated emotions when that special chip was plugged into his neck. He was positively simulatedly unstable. And remember when he created a simulated offspring and let him/her choose what gender and species? That just isn’t a viable alternative for a person, no matter what planet they’re from. I believe that genetic codes are vastly different from my latest Python masterpiece.

            • Jason Werbeloff

              If I understand correctly, you’re arguing that human genetic code can’t be easily altered in ways that create massive changes, whereas programming of holograms can. And this difference is sufficient for a difference in our status as persons.

              Two objections to this argument. First, we can imagine that once use of CRISPR becomes widespread and available to laymen, we may be able to alter ourselves in just the sort of ways you suggest. Conceivably, we could alter our height, gender, etc. easily.

              Second, even if we could alter ourselves in this way, it wouldn’t seem to negate our status as persons with rights. So it seems that how easily modified a being is isn’t important to whether or not they’re a person.

              • Michael Pickard

                “once use of CRISPR becomes widespread and available to laymen, we may be able to alter ourselves in just the sort of ways you suggest.” I can’t picture this, and I have quite an imagination. Let’s say I use CRISPR in the future and apply genetic changes to a 5′ 6″ tall brown haired male. If I type in 6′, blonde and female, what do I witness? Does that person’s body instantly change? Not likely. The biological changes couldn’t happen that fast. Please note, I’m not arguing that speed of change differentiates. More accurately, I think there’s a difference between biological beings and artificial beings whose origin is a workbench or software.

                • Jason Werbeloff

                  Fair enough. The changes won’t happen fast. But as you say, the speed of the changes doesn’t seem to be the issue – how fast my body can change doesn’t seem to determine whether I have the right to refuse sex.

                  But if the speed of my changes isn’t the issue, and the fact that I can change isn’t the issue, then what exactly is the issue that determines that (some) biological beings are capable of being persons, while no software-based beings have that capacity?

                  • Michael Pickard

                    You’ll hate my answer: because software-based beings aren’t beings. They’re soulless robots, androids and simulations that weren’t biologically born and won’t biologically die. They might succumb to a stack overflow, hardly the same thing. They feign self-awareness only because that’s part of their programming and design. Riker was right – Data could be turned off, and that wouldn’t be called murder. The difference is that elusive thing called life. We have it. The aliens who show up on Friday have it. The simulations don’t. End of rant.

                    • Jason Werbeloff

                      In the future court ruling on AI rights, the judge asks: “So why do you think an AI isn’t a person/being/alive?”
                      And your answer is, “Because when you terminate him, he doesn’t die. You merely switch him off.”
                      Roars ensue from the AI gallery. “But when you switch us off, we die,” they cry.
                      “No,” you argue, “you’re not biological.”
                      “Why does it matter whether I’m biological or software-based?” the AI proponent asks.
                      You scratch your head. “Because if you’re software-based, then you’re merely switched off, rather than die.”
                      The problem is, you’re providing circular reasoning – you’re defining life in terms of death, and death in terms of biology. But biology is only important in this debate because biological beings can die. And death is defined in terms of life, which is defined in terms of biology, etc…

                    • Michael Pickard

                      True, I am defining life in terms of biology, a reasonable starting point. I snagged this definition from the ‘Net. “life (noun) 1.the condition that distinguishes animals and plants from inorganic matter, including the capacity for growth, reproduction, functional activity, and continual change preceding death.” You’ll argue that AIs grow, function and change. Do AIs reproduce? I guess robots could build robots, but that’s construction, not reproduction. You’ll claim there’s little difference. Question: If I turn off an AI being, am I committing murder? Answer: No, because I can turn them back on.
                      From your previous post: “But biology is only important in this debate because biological beings can die.” I disagree. Biology is important because it is the essential difference between those of us who live (by the web definition) and breathe versus mechanisms that simulate life. Thirty years from now, I hope not to be charged with murder when I take my very smart car to the junkyard for recycling.

                    • Jason Werbeloff

                      Interesting points. Okay, three possible responses.

                      First, I’m not sure reproduction is required for life. There are plenty of sterile humans, and yet, they’re alive. If you object that ‘reproduction’ here doesn’t require whole-body replication of a being, but only mere cellular reproduction, then AIs probably qualify, since true AI can grow and augment its own code (i.e. cells).

                      Second, the death of an AI probably wouldn’t involve a simple off-switch. True death for an AI would involve wiping its core code, so that it can never be switched back on again. This would be similar to the difference between a human being unconscious (switched off) vs braindead (wiped).

                      Third, and this is fascinating to me as a philosopher, @thefoundation4pi:disqus pointed out that you don’t need to be alive to have rights. Corporations have rights, and yet, they’re not biological, lack bodies, and aren’t alive. All I’m arguing in this blog post is that intelligent holograms have the right to choose not to have sex. So, strictly speaking, they wouldn’t need to be alive to have this right.

                    • Michael Pickard

                      We’ve strayed well past the “sex with sentient holograms” conversation.
                      1. Reproduction as a requirement for life: argue with the dictionary author.
                      2. Death of an AI: Unconscious humans are still breathing, heart is beating, blood is flowing, stuff is happening. A switched-off AI has nothing happening at all, effectively dead. I disagree with your analogy.
                      3. Rights: Lots of institutions have rights. Groups have rights. So what?

                    • Jason Werbeloff

                      1. What life is, is a complex philosophical and scientific question. The dictionary provides the most common, and simplest, solution to that problem. But it doesn’t negate debate.
                      2. An AI that’s switched off still has stored memory. So there’s still something going on. If there wasn’t, it couldn’t be switched on again. Also, there are many forms of biological beings that can have “nothing going on”, and yet, they can be brought back to life – e.g. frozen bacteria.
                      3. The point is, institutions aren’t alive. So life isn’t a necessary condition for having rights. Even if you’re correct that AIs aren’t alive, that doesn’t imply that they don’t have rights.

                    • Michael Pickard

                      Here’s something no one else has asked: Why is rape illegal? Seriously, if we’re examining the topic, we should first understand why society forbade it in the first place.
                      Per your previous:
                      1. I like simple. I also like debate. But debate requires a baseline of facts. Question: do you believe that AIs are alive?
                      2. When I turn off my computer, RAM is wiped. The only reason I can turn it back on is a bootable image on long-term storage. A bootable image doesn’t mean that anything is going on inside the computer, because nothing is. I still disagree with your analogy.
                      3. To the point, why should AIs have rights? And not just because other things that aren’t alive do. How have they earned rights?

                    • Jason Werbeloff

                      I’m not sure why it’s illegal. I’ve assumed it’s immoral, though, which seems intuitive.

                      1. Putting my neck on the line here: I do think that at least some AIs, perhaps not now but one day, could be alive. Maybe not your smartcar, but its distant grandchild. In fact, here’s my story about a taxi AI becoming sentient:

                      2. I take your point that nothing is ‘happening’ while a computer is off. But information is stored. This seems analogous to the frozen bacterium, or a cryogenically frozen person. They’re not dead because they retain information (so they can later be resurrected). And yet, nothing is ‘happening’ while they’re frozen.

                      3. If you accept that AIs are intelligent, conscious, and can suffer, then you might think they should have rights, since these are the reasons we provide rights to humans. I should say, though, that I don’t think humans should have rights, so I’m defending a position I don’t really want to defend. The point is, I think it would be morally wrong to rape an AI for the same reason it would be wrong to rape a human – it makes them suffer. (I do understand there could be rape that involves no suffering – that’s a difficult case for a Utilitarian like me to deal with, but that’s a separate problem.)

                    • Michael Pickard

                      “Why is rape illegal?” It’s morally wrong but also infringes on someone else’s rights. Interesting. Rights rears its head again.
                      1. My bet is that government will prevent AIs from being categorized as “alive.” Just a guess. To protect us biologics. By the way, I read your taxi AI story. She was an ill-informed simulation of a real person. She didn’t know that she was hurting the man she loved. She couldn’t get her algorithms straight about his name vs. the name of the TV character she pined for. If this is your best example of an AI with sentience, then it’s a really good bad example, because she screws up badly.
                      2. Has a cryogenically frozen person ever been brought back to consciousness? Not a whole person, to the best of my knowledge. I rate this as irrelevant until it is proven feasible. No “what ifs” in this discussion.
                      3. I don’t accept that AIs are conscious (I think, therefore I am) or can suffer. I don’t believe they can feel pain. Pain is an emotion, and AIs cannot currently experience emotions. They can say words that refer to emotion, but nowhere in their algorithms or circuitry do they experience them. (Show me where I am wrong now, not in the future. Please do not cite Data’s emotion chip as an example.)

                      If we debate about what “could be in the future,” then we’ll get nowhere. You posit something, I posit the opposite, and with no facts to back our assertions, we do the dance with no result. I feel like we’ve come to a good spot to end this thread. May it rest in peace.

          • Robert Weiman

            If you liked the movie, I recommend also reading the short comic series that the movie was based on.

            • Jason Werbeloff

              I had no idea it was based on a comic series. Thanks for the suggestion, Robert.

  • Ruby Allman

    The real Mrs. Tuvok would be very logical, and see the hologram of herself as logical and an ethical solution.

  • Tiago Rosado

    Is it possible to rape a hologram? Well, yes and no!
    In a legal sense, unless the “raped” hologram is recognized as a “person” with rights and duties on par with any human or alien, it would not be recognized as rape. Of course, you can go the moral/ethical route, in which you program the holoroom to be a rape-room simulator. That is really deplorable and calls your character into question, but you can’t be legally charged! So, legally: NO.

    I think it’s just like programming a recreation of an action sequence: if you kill a bunch of holograms, you don’t go to jail for it!

  • Jamm B.

    To say that an AI is a “person” in any real sense, I find disturbing. That would mean that if my iPhone, iPad, or laptop went beyond its programming, would I feel like a murderer if I had to reset it? If the AI is in a body of metal and wires designed to look human, then maybe resetting it, or tearing it apart, would be considered murder? I don’t know. Consider the movie I, Robot (he wasn’t like the others because his programming was different) or Bicentennial Man, who fought to become human. The show Dark Matter had to reset the ship’s android or she was going to kill them all. The humans had their memories wiped as well, and if you watch the show, it was a good thing they all did. Do you feel worse for the androids or the humans after they received the same treatment?

    But a program formed out of thin air in a hologram room? No. To me it’s an interactive story made to react to human input. As far as I see it, Tuvok was the one who was used. It’s like VR programming waiting for human input. Without humans they wouldn’t exist. Consider tools of any kind: you can use them for good or evil. A hammer was meant to be used on certain objects, not on humans or animals, but many people have crossed that line. AIs were created either to fill a need and be useful, or to assist humans. It falls on us if we cross that line and use them, or force them to do things beyond what they were intended for, or are stupid enough to fall in love with them. That crosses a serious psychological line for us as humans as well. Tuvok wanted to believe it was his wife so he wouldn’t die; none of it was real. It’s more of a giant mental placebo designed to make him feel better about what he did, which in reality was to have it on with photons. Why do people give their dogs or pets stuffed animals to hump? Do we feel sorry for the stuffed animals, or for the atoms that make up said humping toy? Or why do people now spend thousands of dollars on human-looking robots to have sex with?

    As a human race there has been a breakdown of human interaction, so we look to placebos to fill voids. It has been going on for hundreds of years and will sadly continue. And there will always be those who misuse, create, or sell things just to make money, or, in the case of Star Trek, to not lose a real flesh-and-blood “person.” Was it morally and ethically wrong? Yes, but that appears not to be what sells TV shows, movies, or books. Because yes, we are all looking to fill voids of some sort with something.

    • Jason Werbeloff

      I like your point about AIs playing the role of human-interaction placebos, filling a void that we struggle to fill in a technological age with little actual interaction.

  • Carolyn Injoy-Life

    This is an intriguing and thought-provoking blog.

  • LepricahnsGold

    As the hologram was not sentient, I say it was, in this instance, a fancy sex blow-up doll.

  • bilqees bano

    I don’t believe it could be rape!!

  • marypreston

    I say go for it. A hologram is not real. It’s pure fantasy. Nothing wrong with that.

  • Robert Winston Jr

    It’s definitely not rape. But if a cyborg kills you, does the cyborg face charges of murder, or its programmers?

  • vachona

    Interesting question. However, a program is not a person — it’s code created by a person (or more likely, a group of people). And subject to the limitations of the programming. (That’s why we get so irritated with the phone prompts, since they don’t always recognize the subtleties or complexities of our service request.) At times, the AI serves the immediate need — no more, and no less.

    • Jason Werbeloff

      Basic AIs are just as you describe. But once an AI is able to learn, to grow its own programming, don’t you think it could become something more?

      • vachona

        I think I’m probably more of a Luddite. I have a healthy skepticism of AI, in that I believe it should be a tool — no more, and no less. I think I’ve seen too many sci-fi movies (if that’s even possible, heh heh). However, once machines can think independently, and/or become smarter than those who programmed them, do they even need us or want to help us? Conversely, would they be willing (or able) to carry on an intelligent conversation with us? They could learn from us, but if they retain everything, we would soon become as children to them. I question their ability to develop empathy or sympathy — things would be way too black and white. In addition, would they be able to detect lies and deceit? Humans can be manipulated or conned — could that happen to a machine?

      • lizlizzie2

        In the Voyager world I think that is how they separated the issue of holograms. The Doctor was online constantly, which resulted in his programming changing and growing. Michael Sullivan, because of a glitch, was online in a different way. Tuvok’s wife was a tool. It appears that in Star Trek, glitches result in life, be they transporter or holodeck glitches.

        Since Vulcans bond for life and Pon Farr occurs regularly, I am not sure a bonded Vulcan pair during their Pon Farr can rape each other. Biological programming overrides everything at that time. Is biological programming any different from the computer programming of Tuvok’s wife?

  • Aisha Hashmi

    Okay, here is my two cents’ worth. It isn’t rape when: 1, a non-human is involved; 2, a program of any type is involved, or said non-entity would not be functioning; and 3, the program called for a sexual encounter. Enough said.

  • jeanette sheets