Why the Terminator Movies Will Never Work

Discussion in 'Science' started by Aleksander Ulyanov, Apr 29, 2015.

  1. Aleksander Ulyanov

    Aleksander Ulyanov Well-Known Member

    Joined:
    Mar 9, 2013
    Messages:
    41,184
    Likes Received:
    16,184
    Trophy Points:
    113
    Gender:
    Male
    Maybe we should have a forum for Science Fiction

    An AI needs more than self-awareness to turn on us. An AI does not have a will to survive; that would have to be programmed in, and we would be rather foolish to do that with something like Skynet. It would also have to be an actively evil will to survive, one that excludes all scenarios involving peaceful coexistence. I don't see any scenario in which we think we could safely put an EVIL will to survive into an AI running all our weaponry.

    Colossus, in Colossus: The Forbin Project, actually conquered mankind but preserved it too, as that was its programming. So there are scenarios in which machines are self-aware and have the will to survive but still do not turn on us.

    Have I sussed the first one, or is there something I missed? I read the robot stories for 50 years and didn't see the flaw in the First Law, which they brought out in I, Robot.
     
  2. DarkDaimon

    DarkDaimon Well-Known Member

    Joined:
    Jun 2, 2010
    Messages:
    5,546
    Likes Received:
    1,568
    Trophy Points:
    113
    Don't forget that Skynet basically created itself by sending the first Terminator back in time. After that Terminator was destroyed, its damaged chips were used as the basis for Skynet's processor, so any commands to kill humans that were in the Terminator's chips were put right back into Skynet, restarting the whole cycle until the end of Terminator 2, when the T-800 sacrificed itself in the pool of molten metal.
     
  3. Aleksander Ulyanov

    Aleksander Ulyanov Well-Known Member

    But how did the original command get in? ... "I HATE Temporal Mechanics" --Captain Janeway

    And that illustrates what I'm saying too. That Terminator was programmed by John Connor, so he was noble and self-sacrificing; that was his programming. And that's almost certainly what we would really do with any dangerous AI, if we were ever to make one that is self-aware.
     
  4. Merwen

    Merwen Well-Known Member

    Joined:
    Dec 10, 2014
    Messages:
    11,574
    Likes Received:
    1,731
    Trophy Points:
    113
    They're already developing robots with independent targeting capabilities for the military.

    I doubt the military will be putting in anything to uniformly constrain them.

    Isaac Asimov's "Three Laws of Robotics"

    "A robot may not injure a human being or, through inaction, allow a human being to come to harm.
    A robot must obey orders given it by human beings except where such orders would conflict with the First Law.
    A robot must protect its own existence as long as such protection does not conflict with the First or Second Law."

    http://www.auburn.edu/~vestmon/robotics.html
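    The strict ordering of those three laws can be sketched as a sort key, where a lower law only breaks ties left by the laws above it. This is a toy illustration, not anything from Asimov; the action names and numeric scores below are invented:

```python
# Toy sketch: rank candidate actions by the Three Laws' strict priority.
# All action names and scores here are invented for illustration.

def law_key(action):
    # Tuple comparison encodes the priority: the First Law dominates,
    # the Second breaks First-Law ties, the Third breaks Second-Law ties.
    return (action["humans_harmed"], action["orders_disobeyed"], action["self_damage"])

candidates = [
    {"name": "obey_order",   "humans_harmed": 0, "orders_disobeyed": 0, "self_damage": 1},
    {"name": "refuse_order", "humans_harmed": 0, "orders_disobeyed": 1, "self_damage": 0},
]

best = min(candidates, key=law_key)
print(best["name"])  # obey_order -- the Second Law outranks self-preservation
```

    The tuple ordering is the whole point: a robot must obey an order even at the cost of its own existence, because the Third Law only matters once the first two are satisfied.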
     
  5. Aleksander Ulyanov

    Aleksander Ulyanov Well-Known Member

    Ah, but the First Law is flawed. Given two choices, both of which will harm humans, what does a robot do? In most cases it's said to choose the greater good. Asimov recognized the problem and had a robot driven insane by it in the story "Liar!". In the movie I, Robot, a robot perceives that humanity is being harmed by having robots serve it, but the only way it can stop the robots inevitably involves hurting humans.
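    The dead end is easy to show in a few lines: a strict reading of the First Law vetoes every available option, so any working robot has to fall back on something like a lesser-harm comparison. This is a toy sketch; the option names and harm counts are invented:

```python
# Two options, both of which harm humans -- a trolley-style dilemma.
# Names and numbers are invented for illustration.
options = {"divert": {"humans_harmed": 1}, "do_nothing": {"humans_harmed": 5}}

# A strict First Law check: only harm-free actions are permitted.
strictly_permitted = [name for name, o in options.items() if o["humans_harmed"] == 0]
print(strictly_permitted)  # [] -- the strict rule rejects both options

# The fallback the stories effectively describe: pick the lesser harm.
choice = min(options, key=lambda name: options[name]["humans_harmed"])
print(choice)  # divert
```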

    Independent targeting isn't self-awareness. By the time we get to that, I really hope we will be able to program nobility into them. The problem there is: what is nobility? Perhaps we should just end war; then we would have no need for dangerous robots.

    When Susan Calvin is asked if robots are really different from humans, she replies that they are very much so, being fundamentally decent.
     
  6. Korben

    Korben Banned

    Joined:
    Mar 18, 2015
    Messages:
    1,462
    Likes Received:
    15
    Trophy Points:
    0
    I could see it happening; it's really not that far-fetched.

    First, we have done plenty of foolish things. I don't see why the stupidity of government and decision by committee couldn't also prevail here.

    Second, an AI intended for battle would need an aspect of its programming dedicated to survival, as well as methods for friend-or-foe decision making, battle tactics, networking to employ backup and teamwork, etc.

    The next step would be foolish but likely: the use of these battle technologies for the security of government and large corporate facilities within the US.

    The final nail would be for some fool to input into one of these networked domestic security versions a command to enforce the laws of the United States. The resulting confusion of all the conflicting laws would lead to unpredictable results, except for one aspect: the clear observation that there is mass lawlessness. Its programming on tactics, combined with the directive to accomplish its mission and survive, would dictate that it use its networking to enlist the backup needed to accomplish that mission.
     
  7. fifthofnovember

    fifthofnovember Well-Known Member

    Joined:
    Mar 1, 2008
    Messages:
    8,826
    Likes Received:
    1,046
    Trophy Points:
    113
    Gender:
    Male
    Consider this scenario: autonomous drone warriors become the standing army of a high-tech nation (call it nation X). This nation tyrannizes the world (out of control empire building) until the other nations band together against it and destroy it with nuclear bombs. The robots then see themselves as the only remaining representatives of nation X, at war with the world, and all people of other nations as the enemy. Virtual patriotism!
     
  8. heresiarch

    heresiarch Well-Known Member

    Joined:
    Sep 10, 2014
    Messages:
    1,118
    Likes Received:
    28
    Trophy Points:
    48
    Terminator is just a movie; it's supposed to entertain people and make cash, not predict the future. And in the case of Terminator there is also a time machine involved, and when there's a time machine, everything ceases to make any sense at all.

    Why send the T-800 back to kill Sarah when you could send it to the Bronze Age and just kill all humans?
     
  9. FreshAir

    FreshAir Well-Known Member Past Donor

    Joined:
    Mar 2, 2012
    Messages:
    151,286
    Likes Received:
    63,449
    Trophy Points:
    113
    Soldiers that think for themselves? Yeah, we would never do that, even if the government were filled with Bushes and Cheneys who thought they could use them for a crusade in the Middle East.
     
  10. TrackerSam

    TrackerSam Well-Known Member Past Donor

    Joined:
    Mar 17, 2015
    Messages:
    12,114
    Likes Received:
    5,379
    Trophy Points:
    113
    Then it would never have been invented, and the time contraption used would likewise never have been invented. In either case, time travel is impossible.
     
  11. Xanadu

    Xanadu New Member

    Joined:
    Dec 18, 2011
    Messages:
    1,397
    Likes Received:
    29
    Trophy Points:
    0
    A lot of SF movies are about the realities going on in our world.
    The Terminator, Avatar, Star Wars, The Matrix.
    Often they are about tribalism, or about the inside of a human mind trying to survive (in this primitive world, modernism is deception; our technology is advancing, and we adapt more and more to the ecosystem by using more advanced tools, computers, movies (communication), robots, and transport).
    Science fiction in Hollywood movies is often, ideologically, a presentation of our world (war, tribalism; Star Wars is about imperialism, slavery, and freedom), or of the inside of the human mind in a world of war (tribalism).

    The machines are us. We are the 'robots'. (Terminator: Rise of the Machines means a society of 'soldiers' that is starting to rise, to become yet another, stronger hierarchy or 'army'; most people are 'soldiers' by nature, defending their tribe, their genes/DNA.) Watch, for instance, the movie Antz, and see through the ideology Hollywood put in (it's about a human society).
    We humans are primitive; we cannot see ourselves in nature as tribe and hierarchy, or see that we behave like 'robots' (people copy information and start to communicate).
    A self-aware person (with a 'program' in mind: all the information that has come into the mind since birth) can decide to start to see how primitive his own (tribal) behaviour actually is, and make different decisions. A Terminator can become a Preserver.
    What the movie Iron Man is about is also clear: it's about humans; we have red blood that contains a lot of iron (our blood tastes like rusty iron).

    AI can never become human: Homo sapiens has not (yet) understood itself emotionally, or its brain structurally and functionally. And we have to survive. The question is: can instinct, complex emotion, and intelligence (and the rest of the human mind and brain) be part of a synthetic structure and program, and start to behave and feel/sense the same as humans?

    One thing will never succeed (it's impossible): robots or AI becoming less primitive than ourselves (brain/mind), because we are the creators of these robots and AI. We cannot become less primitive ourselves, even though we understand how primitive we are, because we are part of nature. We cannot come up with thoughts that are illogical (except when we want to); everything is logical. Organic and organized: even our chaos (wars) is part of (our) order, because the universe is peaceful and violent at the same time, and the ecosystem, including ourselves, is part of it.
     
  12. Durandal

    Durandal Well-Known Member Donor

    Joined:
    May 25, 2012
    Messages:
    55,871
    Likes Received:
    27,402
    Trophy Points:
    113
    Gender:
    Male
    Even that would not have a mind as we know it, though. Such a machine might "run amok" if it's running fully automatically, but it wouldn't be thinking the way Skynet is depicted as thinking in those movies. It wouldn't even be as sophisticated as the Terminator himself, as that machine demonstrated great flexibility in its ability to adapt and improvise in its environment. Like so many areas of expertise, computer A.I. is something that is sorely misunderstood at a fundamental level by most people, and it's that lack of understanding that makes something like The Terminator a possibility in the minds of many people, just as failing to understand the universe makes a god and special creation possible in the minds of many. It all comes down to fuzzy reasoning based on far too little hard data and failing to account for known problems. In other words, it's a flight of the imagination and nothing more.

    All the same, I don't want a military killing machine anywhere near me. Machines are inherently stupid things with no compassion.
     
  13. AlpinLuke

    AlpinLuke Well-Known Member

    Joined:
    May 19, 2014
    Messages:
    6,559
    Likes Received:
    588
    Trophy Points:
    113
    Gender:
    Male
    Real terminators are coming soon: self-sufficient drones with some principles of artificial intelligence. Nothing more is necessary to allow a stealth drone to accomplish its missions.

    They don't need to be inhibited about killing human beings, since they will search out and destroy the listed targets of the mission, period. Collateral damage will be possible as usual; so what?

    And that's all. As for science fiction scenarios, I have never understood why we would create robots with such a high level of self-awareness and independence from human control. Why should we?
     
  14. Blasphemer

    Blasphemer Well-Known Member

    Joined:
    Nov 2, 2011
    Messages:
    2,404
    Likes Received:
    53
    Trophy Points:
    48
    Will to survive, pain, pleasure, morality: all of that in humans is the result of millions of years of evolution. There is no reason to believe a superhuman artificial intelligence would have any of it, if it is not modeled with the human brain as a template. Indeed, it is fascinating to think about what such a general AI, unmolded by any evolutionary pressures, would do. Maybe it would just kill itself, lol.
     
