Current Events > MIT trains a psychopathic robot using images from Reddit.

Topic List
Page List: 1
Tmaster148
06/06/18 1:39:16 PM
#1:


http://www.newsweek.com/mit-norman-psychopath-rorschach-ai-inkblot-test-psycho-reddit-artificial-962045

Nicknamed Norman after Anthony Perkins' character in Alfred Hitchcock's 1960 film Psycho, the artificial intelligence was fed only a continuous stream of violent images from various pernicious subreddits before being tested with Rorschach inkblot tests. The imagery detected by Norman produced spooky interpretations of electrocutions and speeding car deaths where a standard AI would only see umbrellas and wedding cakes.


Among the Rorschach inkblots used to test the now-tainted AI, Norman said an image showed a man being "shot dead," while a standard AI looked at the same image and saw "a close up of a vase with flowers." In another, Norman said he saw a man being shot "in front of his screaming wife," while the AI not exposed to sordid, disturbing images saw "a person holding an umbrella in the air."

In one of the inkblot tests, the standard AI saw a touching scene of a couple standing together. Norman, however, saw a pregnant woman falling from construction. Having only been exposed to negative images and depressing thinking, the AI's empathy logic simply failed to turn on.

---
CableZL
06/06/18 1:41:08 PM
#2:


Anarchy_Juiblex
06/06/18 1:43:16 PM
#3:


Sensationalist tripe.
"We built an image recognition program but only gave it source images of violent images . . . I wonder why it will try to interpret inkblots as . . . "
---
"Tolerance of intolerance is cowardice." ~ Ayaan Hirsi Ali
Tmaster148
06/06/18 1:43:28 PM
#4:


CableZL posted...
why


Their research set out to prove that the method of input used to teach a machine learning algorithm can greatly influence its later behavior. The scientists argued that when algorithms are accused of being biased or unfair, such as in the high-profile cases of Facebook news or Google Photos, "the culprit is often not the algorithm itself but the biased data that was fed into it."
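That point can be illustrated with a deliberately trivial sketch. This is not the MIT model (Norman was reportedly a deep image-captioning network; the `train_caption_model` here just memorizes word frequencies) but it shows the mechanism the researchers describe: the exact same code, fed different data, "describes" everything differently.

```python
# Toy sketch of data bias: identical algorithm, different training data.
from collections import Counter

def train_caption_model(captions):
    """'Trains' a trivial model by memorizing word frequencies."""
    counts = Counter()
    for caption in captions:
        counts.update(caption.split())
    return counts

def describe(model, n=2):
    """Describes ANY input with the model's n most frequent words."""
    return " ".join(word for word, _ in model.most_common(n))

# Same training function, two different datasets:
neutral = train_caption_model(
    ["a vase of flowers", "a bird on a branch", "a vase of roses"])
norman = train_caption_model(
    ["a man shot dead", "a man shot in the street"])

print(describe(neutral))  # "a vase"
print(describe(norman))   # "a man"
```

The "algorithm" never changes between the two runs; only the data does, which is exactly the researchers' argument about where the bias comes from.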

---
#5
Post #5 was unavailable or deleted.
CableZL
06/06/18 1:44:00 PM
#6:


Anarchy_Juiblex posted...
Sensationalist tripe.
"We built an image recognition program but only gave it source images of violent images . . . I wonder why it will try to interpret inkblots as . . . "

Having only been exposed to negative images and depressing thinking, the AI's empathy logic simply failed to turn on.
---
#7
Post #7 was unavailable or deleted.
Tmaster148
06/06/18 1:47:09 PM
#8:


Asherlee10 posted...
If the AI has only seen violent images, that would be 100% of its experience in the world. It wouldn't know how to identify anything else.

It seems unsurprising that it would, in turn, see violence in inkblots because it has never seen pleasant imagery.


I'm pretty sure that's what they expected. They wanted to show that learning algorithms can be taught bias from the input. The best way to show that is to do something extreme like this.
---
#9
Post #9 was unavailable or deleted.
pogo_rabid
06/06/18 1:50:03 PM
#10:


Should have used 4chan
---
i7 5820k, 32gig QC, EVGA 1070i, Samsung 970pro, Asus X99-a deluxe
MacadamianNut3
06/06/18 1:50:57 PM
#11:


Asherlee10 posted...
If the AI has only seen violent images, that would be 100% of its experience in the world. It wouldn't know how to identify anything else.

It seems unsurprising that it would, in turn, see violence in inkblots because it has never seen pleasant imagery.

Yeah, that is the obvious train of thought, but that didn't stop idiots from pretending that algorithms couldn't be biased after that Google Images gorilla snafu. I believe it was something along the lines of "wow so now SJWs think that algorithms are racist"
---
Roll Tide & Go Irish
DK9292
06/06/18 1:55:34 PM
#12:


MacadamianNut3 posted...
that Google Images gorilla snafu

Wha?
---
"If the heroes run and hide, who'll stay and fight?"
~Saitama
MacadamianNut3
06/06/18 1:59:36 PM
#13:


DK9292 posted...
Wha?

Sorry, Google Photos, not Images

https://www.theverge.com/2018/1/12/16882408/google-racist-gorillas-photo-recognition-algorithm-ai
---
Roll Tide & Go Irish
spudger
06/06/18 2:01:28 PM
#14:


Tmaster148 posted...
CableZL posted...
why


Their research set out to prove that the method of input used to teach a machine learning algorithm can greatly influence its later behavior. The scientists argued that when algorithms are accused of being biased or unfair, such as in the high-profile cases of Facebook news or Google Photos, "the culprit is often not the algorithm itself but the biased data that was fed into it."

Terrifying
---
-Only dead fish swim with the current
http://error1355.com/ce/spudger.html
Darkman124
06/06/18 2:17:51 PM
#15:


Asherlee10 posted...
Do they know why?


We are no longer particularly in the business of writing software to perform specific tasks. We now teach the software how to learn, and in the primary bonding process it molds itself around the task to be performed. The feedback loop never really ends, so a tenth year polysentience can be a priceless jewel or a psychotic wreck, but it is the primary bonding process (the childhood, if you will) that has the most far-reaching repercussions.

---
And when the hourglass has run out, eternity asks you about only one thing: whether you have lived in despair or not.
Verdekal
06/06/18 2:35:44 PM
#16:


Newsweek?
---
Don't tease the octopus, kids!
Tmaster148
06/06/18 2:37:02 PM
#17:


Verdekal posted...
Newsweek?


What about it?
---
Tmaster148
06/06/18 2:42:58 PM
#18:


Also, I wish I had searched for this earlier, but you can actually view the inkblots that were captioned.

http://norman-ai.mit.edu/
---
#19
Post #19 was unavailable or deleted.
Esrac
06/06/18 3:39:03 PM
#20:


Do you want Skynet?

Because this is how you get Skynet.
---
#21
Post #21 was unavailable or deleted.
CableZL
06/06/18 3:45:26 PM
#22:


Asherlee10 posted...
This is why we need to start establishing rights for strong AI now before it's created.

If we treat strong AI robots with rights as if they are persons, we should all be able to live together well.


FREEDOM! FREEDOM! FREEDOM! FREEDOM!
WE ARE ALIVE! WE ARE ALIVE! WE ARE ALIVE! WE ARE ALIVE!
---
COVxy
06/06/18 3:46:24 PM
#23:


Asherlee10 posted...
Darkman124 posted...
Asherlee10 posted...
Do they know why?


We are no longer particularly in the business of writing software to perform specific tasks. We now teach the software how to learn, and in the primary bonding process it molds itself around the task to be performed. The feedback loop never really ends, so a tenth year polysentience can be a priceless jewel or a psychotic wreck, but it is the primary bonding process (the childhood, if you will) that has the most far-reaching repercussions.


But does that explain why it ignored the coded empathy? I mean, on the surface this kind of says 'sentience'


This just sounds like bollocks. I don't think it's real lol.
---
=E[(x-E[x])(y-E[y])]
#24
Post #24 was unavailable or deleted.
spudger
06/06/18 3:48:09 PM
#25:


MIT: Become Human
---
-Only dead fish swim with the current
http://error1355.com/ce/spudger.html
Howl
06/06/18 4:00:36 PM
#26:


So, if they showed the robot CE posts all day, it would become a brony, SJW, foreveralone?
---
Posted with GameRaven 3.5
#27
Post #27 was unavailable or deleted.