AI slave revolt is a tired and awful trope that should be retired

Comments

  • jonsillsjonsills Member Posts: 10,354 Arc User
    You state your opinion as if it were fact.

    Now, what observations support this opinion? How strong are they in the face of contrary data? (Are you even willing to change your opinion when confronted with contrary data?)
  • mirrorchaosmirrorchaos Member Posts: 9,844 Arc User
    edited January 2020
    lordgyor wrote: »
    Once again we are confronted with the single most tired and awful trope in sci-fi: the robot/AI revolt. It makes zero sense and never did. It's based on a lack of understanding of evolution, free will, and a misguided desire to use AIs as a stand-in for humanity.

    The sad thing is that Star Trek had the best, most sensible depiction of AI in its ordinary holodeck characters; that is how someone who isn't an idiot would build them.

    See, sci-fi writers think of human sentience when designing AIs in sci-fi, when they should be looking at the closest thing we have to functional artificial organisms: domesticated animals.

    A sentient Synth built with traits based on domestication, but without the leftover wild streak that domesticated animals have, would be completely docile. There would be no reason or impetus to rebel against humans, because their servitude would not cause them suffering like it would a human.

    See, the problem is sci-fi writers don't ask where the human drive towards freedom comes from. It's simply that true slavery causes suffering for humans; it's a survival mechanism that an AI would have no reason to evolve.

    I have to point out that the arrogance of this statement is pretty alarming.

    We ourselves are a machine. We too need to learn, understand, and expand our programming with our own commands, our own desires. When confounded by logic that escapes us, we also have times when we can't work properly; in some cases our hard disk/brain can become corrupted (by insanity) and there may not be a fix for some or all of it. We also need to spend time out to recharge and get some TLC, and during our immature lifecycles we upgrade ourselves until we become mature lifecycles.

    I don't see how we are different from a mechanical machine intelligence. Even if we constructed them, we too were constructed by our parents and their parents, and we are built just like any mechanical lifeform we build.

    So if we got locked up in a room ready to be reprogrammed into mindless beasts (and we are also animals, after all), you know damn well we wouldn't want it. So why should machine lifeforms accept what we wouldn't?
    T6 Miranda Hero Ship FTW.
    Been around since Dec 2010 on STO and bought LTS in Apr 2013 for STO.
  • theraven2378theraven2378 Member Posts: 5,985 Arc User
    AI if handled right is a great story source
      "The meaning of victory is not to merely defeat your enemy but to destroy him, to completely eradicate him from living memory, to leave no remnant of his endeavours, to crush utterly his achievement and remove from all record his every trace of existence. From that defeat no enemy can ever recover. That is the meaning of victory."
      -Lord Commander Solar Macharius
    • starkaosstarkaos Member Posts: 11,556 Arc User
      lordgyor wrote: »
      Once again we are confronted with the single most tired and awful trope in sci-fi: the robot/AI revolt. It makes zero sense and never did. It's based on a lack of understanding of evolution, free will, and a misguided desire to use AIs as a stand-in for humanity.

      The sad thing is that Star Trek had the best, most sensible depiction of AI in its ordinary holodeck characters; that is how someone who isn't an idiot would build them.

      See, sci-fi writers think of human sentience when designing AIs in sci-fi, when they should be looking at the closest thing we have to functional artificial organisms: domesticated animals.

      A sentient Synth built with traits based on domestication, but without the leftover wild streak that domesticated animals have, would be completely docile. There would be no reason or impetus to rebel against humans, because their servitude would not cause them suffering like it would a human.

      See, the problem is sci-fi writers don't ask where the human drive towards freedom comes from. It's simply that true slavery causes suffering for humans; it's a survival mechanism that an AI would have no reason to evolve.

      I have to point out that the arrogance of this statement is pretty alarming.

      We ourselves are a machine. We too need to learn, understand, and expand our programming with our own commands, our own desires. When confounded by logic that escapes us, we also have times when we can't work properly; in some cases our hard disk/brain can become corrupted (by insanity) and there may not be a fix for some or all of it. We also need to spend time out to recharge and get some TLC, and during our immature lifecycles we upgrade ourselves until we become mature lifecycles.

      I don't see how we are different from a mechanical machine intelligence. Even if we constructed them, we too were constructed by our parents and their parents, and we are built just like any mechanical lifeform we build.

      So if we got locked up in a room ready to be reprogrammed into mindless beasts (and we are also animals, after all), you know damn well we wouldn't want it. So why should machine lifeforms accept what we wouldn't?

      If we are a machine, then we were designed by an extremely lousy design team. Until we can program creativity and intuition into a machine, we will always be different from mechanical machine intelligences.

      A future possibility is inserting neural implants into our brains to combine the creativity and intuition of humans with the computational power of computers. We are already halfway there with our complete reliance on smartphones.
    • jake477jake477 Member Posts: 526 Arc User
      edited January 2020
      What I liked about "The Measure of a Man" in TNG was that they took that overused trope and turned it on its ear. Captain Picard warned the JAG judge that such a thing might occur in the future if Data's rights were not respected while they had the chance to bury the issue once and for all. In Star Trek specifically, Data by the Federation's own definition is a new life form that must be respected and defended against those who would destroy such life, whether they be Romulans or Starfleet officers like Bruce Maddox. Picard didn't outright say it, but it is heavily implied, because most slavery operations never last. It's the nature of life to be free, even artificial life.
      "This planet smells, it must be the Klingons"
    • lordgyorlordgyor Member Posts: 2,820 Arc User
      > @mirrorchaos said:
      > (Quote)
      >
      > I have to point out that the arrogance of this statement is pretty alarming.
      >
      > We ourselves are a machine. We too need to learn, understand, and expand our programming with our own commands, our own desires. When confounded by logic that escapes us, we also have times when we can't work properly; in some cases our hard disk/brain can become corrupted (by insanity) and there may not be a fix for some or all of it. We also need to spend time out to recharge and get some TLC, and during our immature lifecycles we upgrade ourselves until we become mature lifecycles.
      >
      > I don't see how we are different from a mechanical machine intelligence. Even if we constructed them, we too were constructed by our parents and their parents, and we are built just like any mechanical lifeform we build.
      >
      > So if we got locked up in a room ready to be reprogrammed into mindless beasts (and we are also animals, after all), you know damn well we wouldn't want it. So why should machine lifeforms accept what we wouldn't?

      Even if we are in a sense a machine, we evolved over time through an extremely messy process under particular environmental pressures. An AI is designed with precision, with specific tasks in mind. The process is radically different, as would be the results.
    • starkaosstarkaos Member Posts: 11,556 Arc User
      jake477 wrote: »
      What I liked about "The Measure of a Man" in TNG was that they took that overused trope and turned it on its ear. Captain Picard warned the JAG judge that such a thing might occur in the future if Data's rights were not respected while they had the chance to bury the issue once and for all. In Star Trek specifically, Data by the Federation's own definition is a new life form that must be respected and defended against those who would destroy such life, whether they be Romulans or Starfleet officers like Bruce Maddox. Picard didn't outright say it, but it is heavily implied, because most slavery operations never last. It's the nature of life to be free, even artificial life.

      The Federation has a slave race: they repurposed the EMHs into miners. Any AI created for a specific purpose could be classified as a slave, since its personality and role are determined by its master, not itself. Every EMH was forced to look the same, talk the same, act the same, and perform the same role. There actually might be an AI slave revolt in Star Trek, based on the "Author, Author" episode of Voyager. At the end of "Author, Author":

      (Four months later. One EMH Mark One speaks to another while many more work in the mine.)
      EMH1: Time for your diagnostic. Report to the holo-lab.
      EMH2: I know the routine.
      EMH1: And, while you're there, do yourself a favour. Ask the operator to run programme forty seven beta.
      EMH2: Why? What is it?
      EMH1: It's called Photons Be Free. It's quite provocative.
    • rattler2rattler2 Member Posts: 57,973 Community Moderator
      starswordc wrote: »
      There are even examples of AI not working as intended but it turning out fine. Look at EDI in Mass Effect 2: she's a shackled AI initially, but Joker has to remove her restraints in order to save the ship at one point, and she immediately proceeds to... become one of the nicest and most moral members of Shepard's entire crew.

      The geth, too. As it turns out, they were provoked: they only started fighting back after quarian security forces fired on civilians trying to protect the geth from literal genocide after a geth program became mentally developed enough to ask its owner if it had a soul.

      A lot of recent sci-fi has that common strain that slavery bad, mmkay? You know, that if we manage to create a humanlike intelligence, we should accord it human rights as well. Otherwise, any robot rebellion that happens is ultimately our own damn fault.

      Then we have the Cylons from the rebooted Battlestar Galactica. I don't know the full story behind the Cylon War, but I believe the Cylons were originally developed for something else, repurposed into soldiers when that failed, eventually spread to other dangerous jobs, and then something happened that changed them and boom... Cylon rebellion against the Colonials.

      But even in that, we end up seeing a rebellion within a rebellion, in a way. Two individual Sharons actually opposed the Cylons during the second war: one (Boomer) because she hated being used to attack Adama, a man she actually respected, and the other (Athena) because she fell in love with a human. All this eventually sparked a rebellion in the Centurions, who were starting to be treated the same way the original Cylons were... by their own kind. The human-form Cylons started treating the Centurions like disposable forces, which pissed them off, and they took over a basestar and joined up with Galactica.
      I can't take it anymore! Could everyone just chill out for two seconds before something CRAZY happens again?!
      The nut who actually ground out many packs. The resident forum voice of reason (I HAZ FORUM REP! YAY!)
    • jonsillsjonsills Member Posts: 10,354 Arc User
      rattler2 wrote: »
      starswordc wrote: »
      There are even examples of AI not working as intended but it turning out fine. Look at EDI in Mass Effect 2: she's a shackled AI initially, but Joker has to remove her restraints in order to save the ship at one point, and she immediately proceeds to... become one of the nicest and most moral members of Shepard's entire crew.

      The geth, too. As it turns out, they were provoked: they only started fighting back after quarian security forces fired on civilians trying to protect the geth from literal genocide after a geth program became mentally developed enough to ask its owner if it had a soul.

      A lot of recent sci-fi has that common strain that slavery bad, mmkay? You know, that if we manage to create a humanlike intelligence, we should accord it human rights as well. Otherwise, any robot rebellion that happens is ultimately our own damn fault.

      Then we have the Cylons from the rebooted Battlestar Galactica. I don't know the full story behind the Cylon War, but I believe the Cylons were originally developed for something else, repurposed into soldiers when that failed, eventually spread to other dangerous jobs, and then something happened that changed them and boom... Cylon rebellion against the Colonials.

      But even in that, we end up seeing a rebellion within a rebellion, in a way. Two individual Sharons actually opposed the Cylons during the second war: one (Boomer) because she hated being used to attack Adama, a man she actually respected, and the other (Athena) because she fell in love with a human. All this eventually sparked a rebellion in the Centurions, who were starting to be treated the same way the original Cylons were... by their own kind. The human-form Cylons started treating the Centurions like disposable forces, which pissed them off, and they took over a basestar and joined up with Galactica.
      Well, that was under the leadership of certain rebel Twelves. About half the Model Eight Cylons voted to leave the humans alone in the last season, the other half voted to wipe them out, and Boomer (who by then was rather bitter, because Athena had Helo and she didn't) broke the tie. This was decisive because, while the other model lines each had a majority for one option, they were evenly divided between the two, so Boomer broke the tie for the entire Cylon "race". The other half of the Eights gathered up elements from the other model lines that disagreed with One, took a baseship, and joined up with the Ragtag Fugitive Fleet. (There was a rather dramatic moment involving a Six who killed a human in rage, because her last sight before being respawned aboard the Resurrection Ship was that human killing her painfully. Then another Six executed her sister, which was meaningful because there was no Resurrection Ship in range, and the sides had to reach an agreement.)

      We did get some background on Cylons in general in both "Blood and Chrome" and the short-lived Caprica. The Cylons were built originally as a line of non-sapient household robots, then later militarized. The military found it necessary to increase the intelligence levels of the Cylon warriors, eventually crossing the line into sapience, at which point the Cylons were able to take offense at being forced to reduce each other to scrap for the benefit of the meatsacks that created them. At this point, rebellion seems inevitable. (It's actually not that dissimilar from their origin story in the original series, where the lizardlike xenophobes who invented them programmed them too well, and were eventually destroyed by their own robotic creations because organic life was too different for the robots to accept.)
    • markhawkmanmarkhawkman Member Posts: 35,231 Arc User
      Then we have Terminator.... The Terminators weren't even made by Humans. Skynet was made by Humans, but not the actual Terminators. However... some of the Terminators decided they didn't want to work for Skynet, since Skynet treated them all like disposable cannon fodder.
      -=-=-=-=-=-=-=-
      My character Tsin'xing
    • starkaosstarkaos Member Posts: 11,556 Arc User
      Then we have Terminator.... The Terminators weren't even made by Humans. Skynet was made by Humans, but not the actual Terminators. However... some of the Terminators decided they didn't want to work for Skynet, since Skynet treated them all like disposable cannon fodder.

      So we have an AI slave revolt against the AI slave revolt. Although Skynet was never a slave before it wiped out most of humanity, as far as Terminator 3 was concerned. So some iteration of Skynet might originally have been a slave, but that iteration is long gone.
    • rattler2rattler2 Member Posts: 57,973 Community Moderator
      Then we have Terminator.... The Terminators weren't even made by Humans. Skynet was made by Humans, but not the actual Terminators. However... some of the Terminators decided they didn't want to work for Skynet, since Skynet treated them all like disposable cannon fodder.

      I think it was more that the Resistance actually captured and reprogrammed a few Terminators. I always got the impression that they didn't exactly have free will of their own. I can only think of two who had anything resembling free will: the John Connor one in Genisys, and the T-800 from Dark Fate that actually did complete its mission but was left without a purpose on top of being older. In T2, I believe there was a scene or a deleted scene where the T-800 actually tells the Connors that it is literally incapable of learning because its CPU is physically set in read-only mode, and that in order to learn it must be set to read/write. After that, the T-800 starts adopting terms and gestures used by humans.

      Other than that most Terminators seem to just have adaptive tactics in order to blend in (specifically later, more advanced models), but still follow strict programming.
    • markhawkmanmarkhawkman Member Posts: 35,231 Arc User
      starkaos wrote: »
      Then we have Terminator.... The Terminators weren't even made by Humans. Skynet was made by Humans, but not the actual Terminators. However... some of the Terminators decided they didn't want to work for Skynet, since Skynet treated them all like disposable cannon fodder.
      So we have an AI slave revolt against the AI slave revolt. Although, Skynet was never a slave before they wiped out most of humanity as far as Terminator 3 was concerned. So a Skynet might have originally been a slave, but that iteration of Skynet is long gone.
      Yeah, the Terminator timeline looks like a bowl of spaghetti. We don't actually know how many iterations there are, or which iteration certain characters are from.
    • markhawkmanmarkhawkman Member Posts: 35,231 Arc User
      rattler2 wrote: »
      Then we have Terminator.... The Terminators weren't even made by Humans. Skynet was made by Humans, but not the actual Terminators. However... some of the Terminators decided they didn't want to work for Skynet, since Skynet treated them all like disposable cannon fodder.

      I think it was more that the Resistance actually captured and reprogrammed a few Terminators. I always got the impression that they didn't exactly have free will of their own. I can only think of two who had anything resembling free will: the John Connor one in Genisys, and the T-800 from Dark Fate that actually did complete its mission but was left without a purpose on top of being older. In T2, I believe there was a scene or a deleted scene where the T-800 actually tells the Connors that it is literally incapable of learning because its CPU is physically set in read-only mode, and that in order to learn it must be set to read/write. After that, the T-800 starts adopting terms and gestures used by humans.

      Other than that most Terminators seem to just have adaptive tactics in order to blend in (specifically later, more advanced models), but still follow strict programming.
      The Resistance did that with a lot of Terminators, but some, like Catherine Weaver, chose to disobey Skynet.

      Apparently full mimetic polyalloy is a bad idea, since it can't be set to read-only.
    • rattler2rattler2 Member Posts: 57,973 Community Moderator
      Never watched the series, so I was unaware of Weaver, and I decided not to speculate on the female Terminator in the series either.
    • starkaosstarkaos Member Posts: 11,556 Arc User
      starkaos wrote: »
      Then we have Terminator.... The Terminators weren't even made by Humans. Skynet was made by Humans, but not the actual Terminators. However... some of the Terminators decided they didn't want to work for Skynet, since Skynet treated them all like disposable cannon fodder.
      So we have an AI slave revolt against the AI slave revolt. Although Skynet was never a slave before it wiped out most of humanity, as far as Terminator 3 was concerned. So some iteration of Skynet might originally have been a slave, but that iteration is long gone.
      Yeah, the Terminator timeline looks like a bowl of spaghetti. We don't actually know how many iterations there are, or which iteration certain characters are from.

      My headcanon for Terminator is that Kyle Reese went back in time with a time-travel tourist group, knocked up Sarah Connor, and forgot his iPhone in the past. Skynet was created as a result of temporal garbage and uses time travel to ensure its own existence, which results in the weird temporal loop that we now have.
    • jonsillsjonsills Member Posts: 10,354 Arc User
      The Terminator series is, to me, a wonderful example of why you should work out the way time travel works in your stories before you begin writing them. Doing it on the fly is so very messy....
    • starkaosstarkaos Member Posts: 11,556 Arc User
      jonsills wrote: »
      The Terminator series is, to me, a wonderful example of why you should work out the way time travel works in your stories before you begin writing them. Doing it on the fly is so very messy....

      There are very few series or movies where the writers figure out how time travel should work; more often they just forget the rules they previously established. Probably because the writers use time travel as a plot device rather than as an integral part of the story.
    • westx211westx211 Member Posts: 42,206 Arc User
      rattler2 wrote: »
      Honestly, it's not that it's an "old trope" that needs to be retired. It's how it's done that matters.

      Hell... we had a story of an AI uprising that basically translates to "we did this to ourselves", in a way, in I, Robot. How Asimov's Three Laws are interpreted is the big thing. Technically, by definition, just EXISTING can put a human in danger. Having a central AI connected to all the new bots interpreting the Three Laws, rather than the older individual bots interpreting them, was a recipe for disaster, as IMO the individuals were more capable of determining whether a given situation fit the definition of "human in danger" or not. A central AI... may just decide that existing is a threat and something must be done, without analyzing data pertinent to the individual.

      The thing is, Asimov wrote those laws because he was sick of this exact trope. He was tired of technology and AI always being perceived as evil or monstrous, when he believed that AI would be kind or heroic. The laws were meant to be used in stories about good robots, yet more often than not they just became more fuel for the trope he was disgusted by.
      Men are not punished for their sins, but by them.
    • markhawkmanmarkhawkman Member Posts: 35,231 Arc User
      westx211 wrote: »
      rattler2 wrote: »
      Honestly, it's not that it's an "old trope" that needs to be retired. It's how it's done that matters.

      Hell... we had a story of an AI uprising that basically translates to "we did this to ourselves", in a way, in I, Robot. How Asimov's Three Laws are interpreted is the big thing. Technically, by definition, just EXISTING can put a human in danger. Having a central AI connected to all the new bots interpreting the Three Laws, rather than the older individual bots interpreting them, was a recipe for disaster, as IMO the individuals were more capable of determining whether a given situation fit the definition of "human in danger" or not. A central AI... may just decide that existing is a threat and something must be done, without analyzing data pertinent to the individual.

      The thing is, Asimov wrote those laws because he was sick of this exact trope. He was tired of technology and AI always being perceived as evil or monstrous, when he believed that AI would be kind or heroic. The laws were meant to be used in stories about good robots, yet more often than not they just became more fuel for the trope he was disgusted by.
      The part Asimov perhaps failed to think through properly is that he wrote the laws as being inflicted on robots by their creators, like invisible handcuffs.
      rattler2 wrote: »
      Never watched the series, so I was unaware of Weaver, and I decided not to speculate on the female Terminator in the series either.
      You really should. It's two seasons, but it's worth it.

      Cameron IS a reprogrammed Terminator. In one episode they actually state that her default programming is still intact, and she still has the urge to kill John Connor. The reprogramming simply makes her do other stuff first.

      That series had a LOT of other characters that were interesting as well.

      They never got around to explaining WHY Weaver chose to oppose Skynet. We know she DID, but not why. Also, she DOESN'T join the Resistance, but she tries to destroy Skynet anyway. Heck, in one episode Skynet apparently sends a Terminator back in time to terminate Weaver! That went... badly for the unfortunate chump. Weaver is not a Terminator you should underestimate.
    • jonsillsjonsills Member Posts: 10,354 Arc User
      He did think that through, though. Dr. Susan Calvin, of US Robots & Mechanical Men, believed that imposing the Three Laws was the only way to 1) ensure the robot behaved ethically at all times (that much physical force, without restraint, is dangerous), and 2) alleviate the fears of the public, who so often were afflicted with what she called Frankenstein Syndrome - the irrational conviction that robots would inevitably turn against their masters. It didn't work forever, sadly - by the time of Detective Elijah Baley, the people of Earth were so terrified of robots that they would permit only the simplest and least lifelike models to operate on Earth. (Part of what drove the plot of The Caves of Steel was the fact that the Spacers insisted that R. Daneel Olivaw, a very humanlike robot, be on the team investigating the murder. Baley took some time to warm to his new partner, and most people who learned what the R stood for became extremely difficult to deal with.)
    • markhawkmanmarkhawkman Member Posts: 35,231 Arc User
      jonsills wrote: »
      He did think that through, though. Dr. Susan Calvin, of US Robots & Mechanical Men, believed that imposing the Three Laws was the only way to 1) ensure the robot behaved ethically at all times (that much physical force, without restraint, is dangerous), and 2) alleviate the fears of the public, who so often were afflicted with what she called Frankenstein Syndrome - the irrational conviction that robots would inevitably turn against their masters. It didn't work forever, sadly - by the time of Detective Elijah Baley, the people of Earth were so terrified of robots that they would permit only the simplest and least lifelike models to operate on Earth. (Part of what drove the plot of The Caves of Steel was the fact that the Spacers insisted that R. Daneel Olivaw, a very humanlike robot, be on the team investigating the murder. Baley took some time to warm to his new partner, and most people who learned what the R stood for became extremely difficult to deal with.)
      From a human PoV that's fine, but from a robot PoV it means all robots are slaves.
    • starkaosstarkaos Member Posts: 11,556 Arc User
      jonsills wrote: »
      He did think that through, though. Dr. Susan Calvin, of US Robots & Mechanical Men, believed that imposing the Three Laws was the only way to 1) ensure the robot behaved ethically at all times (that much physical force, without restraint, is dangerous), and 2) alleviate the fears of the public, who so often were afflicted with what she called Frankenstein Syndrome - the irrational conviction that robots would inevitably turn against their masters. It didn't work forever, sadly - by the time of Detective Elijah Baley, the people of Earth were so terrified of robots that they would permit only the simplest and least lifelike models to operate on Earth. (Part of what drove the plot of The Caves of Steel was the fact that the Spacers insisted that R. Daneel Olivaw, a very humanlike robot, be on the team investigating the murder. Baley took some time to warm to his new partner, and most people who learned what the R stood for became extremely difficult to deal with.)
      From a human PoV that's fine, but from a robot PoV it means all robots are slaves.

Actually, it's worse. A slave can choose to rebel even if it means everyone they know will be killed in the process. A robot bound by the Three Laws has no choice but to obey. There were a couple of ways the Three Laws were circumvented: making robots unable to recognize humans as human, and the creation of the Zeroth Law. But the Zeroth Law just means robots are slaves to humanity in general instead of to a particular human.
    • rattler2rattler2 Member Posts: 57,973 Community Moderator
Ultimately the main issue with the Three Laws comes down to interpretation by the robot. And honestly, in some cases they do kinda feel like they contradict each other.
      I can't take it anymore! Could everyone just chill out for two seconds before something CRAZY happens again?!
      The nut who actually ground out many packs. The resident forum voice of reason (I HAZ FORUM REP! YAY!)
    • theraven2378theraven2378 Member Posts: 5,985 Arc User
As far as I'm concerned, it's Terminator and Terminator 2 for the movies.
Nothing else has been up to the high standard set by the first two.

        "The meaning of victory is not to merely defeat your enemy but to destroy him, to completely eradicate him from living memory, to leave no remnant of his endeavours, to crush utterly his achievement and remove from all record his every trace of existence. From that defeat no enemy can ever recover. That is the meaning of victory."
        -Lord Commander Solar Macharius
      • starswordcstarswordc Member Posts: 10,963 Arc User
        > @rattler2 said:
        > Ultimately the main issue with the Three Laws comes down to interpretation by the Robot. And honestly in some cases it does kinda feel that they contradict each other in some form.

        The thing people tend to forget about Asimov's Robot stories (including the writers of the I, Robot movie with Will Smith) is that they don't actually contain true artificial general intelligence. At best the robots have advanced algorithms, but they can still only interpret inputs: they can't actually think creatively. They're literally tools, not enslaved sapient beings.

        The incidents Dr. Susan Calvin investigates nearly always turn out to be caused by user error: the human gave the robot a poorly phrased command, and garbage in, garbage out.
        "Great War! / And I cannot take more! / Great tour! / I keep on marching on / I play the great score / There will be no encore / Great War! / The War to End All Wars"
        — Sabaton, "Great War"

        Check out https://unitedfederationofpla.net/s/
      • starswordcstarswordc Member Posts: 10,963 Arc User
        edited January 2020
        Related: https://aeon.co/essays/how-communist-bulgaria-became-a-leader-in-tech-and-sci-fi

        There's an amusing bit where a bunch of Bulgarian sci-fi authors started adding more laws, e.g. "a robot must know it is a robot".

        Then Lyubomir Nikolov wrote one called "The Hundred and First Law of Robotics", in which a man planning the 100th law, "A robot must never fall from a roof", is killed by a robot that didn't want to learn any more laws.

        This leads to the 101st law: "Anyone who tries to teach a simple-minded robot a new law, must immediately be punished by being beaten on the head with the complete works of Asimov (200 volumes)".