Topic List | Page List: 1 |
Topic | Which of these things has a higher probability of happening? |
Flappers 12/16/18 7:19:16 PM #17:
Yellow posted...
Flappers posted...
AI will never become smart enough to threaten us. For that to happen, we would first have to build an AI capable of it, and we are not smart enough to do that. By definition, a machine can only do what it is programmed to do. It cannot "learn and grow" beyond the capacities we want it to have or design it to have. It would take a godlike ability to manufacture a machine that essentially functions as a human mind, and if we hypothetically had that ability, we would also be smart enough to create a safeguard against it ever going wrong.

I'm honestly surprised by how many people believe AI can and will take us over. It's easy to be dazzled by our advancing technology, and perfectly human to imagine how impressively it will improve, but people have always thought this way. Keep in mind, back in the 1980s they thought we'd have flying cars and self-lacing shoes by now. "AI will enslave us" is the modern-day equivalent of that absurdity, and nothing more.
---
Look at my bio to check which Egg-Move/Hidden Ability Pokemon I will breed for you. I also have shinies for trade in exchange for a Pokemon I want.