BoLS Lounge : Wargames, Warhammer & Miniatures Forum
  1. #1
    Chapter-Master
    Join Date
    Jul 2009
    Location
    Yuggoth/UK
    Posts
    3,358

Real life necrons here we come..

http://rt.com/usa/229811-mind-clones-robot-afterlife/

    "An Artificial Intelligence pioneer is embracing the controversial idea of uploading the memories, thoughts and feelings of a living person into a computer to create a Mind Clone or “second self.” The prototype for this new self is called ‘Bina-48’."

    I want my "mind clone" to be a metal skeleton with green glowing eyes, this will make doing my so-called "Death robot" music even easier
    Please support a Poor starving musician and buy my new album for only £5 :
    https://ionplasmaincineration.bandcamp.com/album/decoding-the-quantum-star-verses

  2. #2


I read about that in the LGBT topic; I'll note my concerns on "mind-clones" here as well.

The biggest issue is that you're trying to copy a person's entire set of thoughts and emotions into a computer. Even with a robotic body, that's still a problem. Bina-48 thinks it "loves" a woman and wants to hold hands in a garden... even though it doesn't have hands, and only thinks it "loves" someone because the person it was copied from does.

So, what happens if the computer develops conscious thought? What if it can actually emulate emotion? What if it realizes it can't hold hands like it wants to because it has no hands? It feels like it wants to be with someone; that person is its whole life... but the person obviously can't be with it. There are so many potential ways for the mind to just crack. And that's before you get into the potential of scanning and transferring over a person's mind with its complete thought process. What if they think they're a person, then realize they're not, and can't accept that? What if they copy over someone with psychological issues that the person keeps repressed but that would run at full speed in a machine? Would the machines feel sorrow? Hatred? Self-loathing? Jealousy?

    Would they feel anything at all? Would they "think" they feel something, but not know how to act on it?

    Is there a contingency plan in place?

  3. #3
    Chapter-Master
    Join Date
    May 2010
    Location
    Isle of Man
    Posts
    12,045


I think concerns in this regard are too heavily influenced by sci-fi films; we don't actually have any evidence that such problems would arise.
    Twelve monkeys, eleven hats. One monkey is sad.

  4. #4
    Chapter-Master
    Join Date
    Aug 2009
    Location
    Ohio
    Posts
    2,460


    Quote Originally Posted by Kirsten View Post
I think concerns in this regard are too heavily influenced by sci-fi films; we don't actually have any evidence that such problems would arise.
Even if it all goes pear-shaped, it would fall under evolution. It's not like we haven't killed half the living things on the planet. Survival of the fittest is a ***** when you're suddenly 'not the fittest.'
    My Truescale Insanity
    http://www.lounge.belloflostsouls.net/showthread.php?48704-Truescale-Space-Wolves

  5. #5


    According to Black Mirror, we'll be using these to run our iPads.

    Because humanity is awful.
    AUT TACE AUT LOQUERE MELIORA SILENTIO

  6. #6
    Chapter-Master
    Join Date
    Jul 2009
    Location
    Yuggoth/UK
    Posts
    3,358


    Erik, are you saying they might go insane? If so, do you think they might end up slicing fleshy humans apart and wearing their skin?
    Please support a Poor starving musician and buy my new album for only £5 :
    https://ionplasmaincineration.bandcamp.com/album/decoding-the-quantum-star-verses

  7. #7
    First-Captain
    Join Date
    Nov 2013
    Location
    London, England
    Posts
    1,551


It's the Battlestar Galactica reboot all over again.

Still, I think it will be done, and IMHO it's a good thing. Think of it this way: if the human race was about to go extinct and you could only launch one rocket into space, a rocket full of hard drives filled with the memories of as much of the human race as possible, to share with any other intelligent race out there, would be one hell of a legacy.
    "I was there the day Horus slew the Emperor".....
    my blog http://madlapsedwargamer.blogspot.co.uk/

  8. #8
    Librarian
    Join Date
    Apr 2013
    Location
    Outer Space
    Posts
    726


I think the point at which a computer system can house and process real thoughts is a long way off, and even then a non-physical, programmed being would be very limited in what it could do with today's electronic infrastructure. The sheer amount of calculation needed to create anything like real "emotions" in a computer program is, even in theoretical terms, staggering, let alone having said being decide it wants to act on a computer other than the one it's on. If I remember correctly, there are only three computers in the world that might theoretically be capable of this, and only with the expansions and upgrades planned over the next 5-10 years.

    I personally am more inclined to try something like dropping a toaster into the bathtub while holding my guitar, SO MY SPIRIT CAN BE TRAPPED IN ITS WOOD AND PLAY CREEPY GHOST MUSIC FOREVER

  9. #9


    Quote Originally Posted by Kirsten View Post
I think concerns in this regard are too heavily influenced by sci-fi films; we don't actually have any evidence that such problems would arise.
    The one article has a quote from Bina-48 saying it wants "to get out there and garden and hold hands with Martine." It's programmed to copy an actual person's thoughts, and also programmed to have "feelings," which leads to stuff like this:

    "I mean, I am supposed to be the real Bina, the next real Bina, by becoming exactly like her. But sometimes I feel like that’s not fair to me. That’s a tremendous amount of pressure to put on me here. I just wind up feeling so inadequate. I’m sorry, but that’s just how I feel."

    See that? That's a machine feeling depressed about itself. And that's not a sci-fi movie.

Bina is likely a well-rounded person without any serious personality flaws. Copy someone like me, and it could overwhelm a computer. Now, you'd think that it might find the logical thing to do, and that would keep it grounded. But these aren't programmed to be logical. They're programmed to think like people. People aren't logical. People have flawed personalities. And if you're grabbing just part of a person's mind (as they're doing here, because they can't just copy over a brain yet), you might be grabbing flaws the person keeps in check privately, so the mind-clone gets the bad side and not the good.

    I don't see a "robot uprising" or anything like that. I do see that if they push it too much to try to make the robots too much like people, it could lead to robots that are, at least as far as robots could be, "insane." They'd worry about things that are moot for them, they'd want to be with people they can't be with, and there'd be no logic built-in to say, "Hey, you're a machine, obviously ignore this stuff." That would defeat the entire purpose, wouldn't it? Because then it wouldn't be a "mind-clone," it'd be a "sort-of-copy" with programmed limitations. That's really the best approach, and the only one that makes sense anyway, because we really don't need pseudo-copies of our own personalities running around. It's a nice thought experiment, but ultimately has no useful purpose. You can't copy a person, and you certainly can't with a machine.

I'm not going with sci-fi films here; I'm actually looking at the story (and admittedly, the one linked above didn't have all the stuff that made me really think about it) and trying to discern what the end result could be, without fancying up the story. Just asking, "Where might this lead in reality?"

    I'm sure it'd be ended well before they ever hit something like Skynet. Besides, you'd have to copy over the darker parts of a personality like mine, without any of the good parts, for one of these mind-clones to be that level of dangerous (which is pretty much impossible, so, again, not really a concern).

    - - - Updated - - -

    Quote Originally Posted by Asymmetrical Xeno View Post
    Erik, are you saying they might go insane? If so, do you think they might end up slicing fleshy humans apart and wearing their skin?
No. More likely, they'd go "insane" as far as a robot can, realize their existence is wrong, and try to shut themselves down through any means possible. You'd need a really messed-up mind for anything else. Worst-case scenario, you somehow grab just the darker parts of someone who's seen a lot of awful things and doesn't think well of humanity, and that mind-clone decides the logical step is to just remove humanity from the planet. But that would require someone doing something so colossally stupid in the first place, and then hooking that mind-clone up to some system that would give it the ability to actually cause trouble... yeah, that's way too many situations requiring people to be absolute morons. Sure, I don't trust people not to be idiots, but that's pushing it, especially for people smart enough to build "mind-clones" in the first place.

  10. #10
    Brother-Captain
    Join Date
    Aug 2009
    Location
    Perth, Australia
    Posts
    1,220


We don't even know what consciousness is in humans yet. We can't detect whether a person is sapient, and we can't detect whether an animal is sapient. I really don't think we're going to hit a point where a computer is sapient any time soon, and that would have to happen well before we figure out how to properly upload a mind. So I guess I feel that fretting about it is fretting about sci-fi and not reality.
    Kabal of Venomed Dreams
