Current Events > Twitter algorithm is proven to be RACIST.

au_gold
09/20/20 2:03:36 AM
#1:


https://twitter.com/bascule/status/1307440596668182528?s=21

Can you believe it?

---
Let me talk to your mother. Get your mother please.
Esrac
09/20/20 2:48:20 AM
#2:


Pick for what?
nfearurspecimn
09/20/20 2:54:42 AM
#3:


oh weird. it isn't about racism though.

---
Wake up. You have to wake up.
currently a preta (hungry ghost)
hockeybub89
09/20/20 2:55:54 AM
#4:


I don't know what this means
nfearurspecimn
09/20/20 2:57:46 AM
#5:


hockeybub89 posted...
I don't know what this means
this is the actual picture. it's a long picture with obama very far away from mitch
https://pbs.twimg.com/media/EiT2SfuU8AA8gyJ?format=jpg&name=4096x4096

---
Wake up. You have to wake up.
currently a preta (hungry ghost)
Kazi1212
09/20/20 2:57:49 AM
#6:


got em

---
My mind is an open book, do you find it entertaining? At least be original in your responses then.
"Don't call me a God, call me a Sadist"
Choco
09/20/20 7:17:26 PM
#7:


Esrac posted...
Pick for what?
hockeybub89 posted...
I don't know what this means
just click on the things guys >_>

---
SSJKirby
09/20/20 7:22:17 PM
#8:


I don't think Mitch should smile

---
Not changing this signature until Beyond Good and Evil 2 is in my hand.
August 25th, 2010.
Tyranthraxus
09/20/20 7:23:59 PM
#9:


Oh man I don't say this often but it's totally worth opening the whole thread and going through all the replies.

Twitter also responded to it, btw. Like this got an official response lol

---
It says right here in Matthew 16:4 "Jesus doth not need a giant Mecha."
https://imgur.com/dQgC4kv
s0nicfan
09/20/20 7:24:04 PM
#10:


hockeybub89 posted...
I don't know what this means

The user has created a whole bunch of pictures with Mitch at the very top and Obama at the very bottom with a lot of white space in between, and he's testing which part of the image Twitter chooses to center for the preview crop.

The argument, which is total nonsense, is that the algorithm is somehow racist because in most circumstances it centers on Mitch's face. But I suspect that you'll see many more people start to make claims like this as machine learning becomes more prevalent and people without even entry-level knowledge of how those algorithms work try to apply societal concepts to them.
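(For anyone trying to picture the setup: a minimal sketch, assuming Pillow, of how such a test image could be put together. File names, sizes, and the gap are made-up placeholders, not taken from the actual tweets.)

# Sketch: two portraits separated by a tall band of white space, so the
# preview crop is forced to pick one face or the other.
from PIL import Image

def make_test_image(top_path, bottom_path, gap=2500, width=800):
    # Resize both portraits to the same square size so only position differs.
    top = Image.open(top_path).resize((width, width))
    bottom = Image.open(bottom_path).resize((width, width))
    # Tall white canvas: one face at the very top, one at the very bottom.
    canvas = Image.new("RGB", (width, top.height + gap + bottom.height), "white")
    canvas.paste(top, (0, 0))
    canvas.paste(bottom, (0, top.height + gap))
    return canvas

# Swap the arguments to test both orderings, as the tweets in the thread do.
make_test_image("mitch.jpg", "obama.jpg").save("test_a.png")
make_test_image("obama.jpg", "mitch.jpg").save("test_b.png")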

---
"History Is Much Like An Endless Waltz. The Three Beats Of War, Peace And Revolution Continue On Forever." - Gundam Wing: Endless Waltz
Anteaterking
09/20/20 8:26:14 PM
#11:


s0nicfan posted...
The argument, which is total nonsense, is that the algorithm is somehow racist because in most circumstances it centers on Mitch's face. But I suspect that you'll see many more people start to make claims like this as machine learning becomes more prevalent and people without even entry-level knowledge of how those algorithms work try to apply societal concepts to them.

What's your basis for it being nonsense? I actually work with the mathematics of machine learning and "the output of a machine learning algorithm can be racist" is not a contentious claim at all.


---
Smashingpmkns
09/20/20 8:27:50 PM
#12:


https://twitter.com/that_gai_gai/status/1307717590706352129

Actually kind of interesting.
COVxy
09/20/20 8:30:21 PM
#13:


Anteaterking posted...
What's your basis for it being nonsense? I actually work with the mathematics of machine learning and "the output of a machine learning algorithm can be racist" is not a contentious claim at all.

Seriously, super not contentious.

---
=E[(x-E[x])(y-E[y])]
iPhone_7
09/20/20 8:31:20 PM
#14:


What's with the long images?

---
antfair
09/20/20 8:32:18 PM
#15:


iPhone_7 posted...
What's with the long images?
Twitter tries to center faces in image thumbnails; the length puts distance between the two faces to force the system to pick one or the other.
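(Twitter has described this cropping as saliency-based rather than strict face detection; the mechanic antfair describes comes down to roughly the following. This is a simplified sketch with the saliency model stubbed out, not Twitter's actual code.)

import numpy as np

def crop_around_salient_point(image, saliency, crop_h, crop_w):
    # image: (H, W, 3) array; saliency: (H, W) array of "interestingness" scores.
    y, x = np.unravel_index(np.argmax(saliency), saliency.shape)
    # Center a fixed-size crop window on the highest-scoring pixel, clamped to bounds.
    top = int(np.clip(y - crop_h // 2, 0, image.shape[0] - crop_h))
    left = int(np.clip(x - crop_w // 2, 0, image.shape[1] - crop_w))
    return image[top:top + crop_h, left:left + crop_w]

# In a very tall image with one face at each end, whichever face the saliency
# model scores higher "wins" the preview -- which is exactly what the long images probe.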

---
What is this, a fair for ants?
BeyondWalls
09/20/20 8:32:46 PM
#16:


Smashingpmkns posted...
https://twitter.com/that_gai_gai/status/1307717590706352129

Actually kind of interesting.
Is it though?

---
END OF LINE
Lost_All_Senses
09/20/20 8:34:58 PM
#17:


I went through all of them. But I dunno. It's suspect, but I'm not confident enough to call it. It might just be because I don't completely understand the technology people are testing too lol.

Basically, I'm saying I have nothing to contribute. Good day.

---
Name checks out
"Try to talk and they ain't listening, but they'll point it out when you get ignorant" - Dreezy
#18
Post #18 was unavailable or deleted.
s0nicfan
09/20/20 8:36:44 PM
#19:


Anteaterking posted...
What's your basis for it being nonsense? I actually work with the mathematics of machine learning and "the output of a machine learning algorithm can be racist" is not a contentious claim at all.

The fuck it isn't. If you work with ML then you'd understand that the machine learning algorithm isn't making decisions, but performing basic bucketed classifications based on training data given a set of features which, for imagery, are almost always the raw pixels. Claiming any decision it makes is "racist" is fundamentally attributing an issue of incomplete data to societal problems.

Like, find me a single ML expert who can actually show objectively that the hidden layers for an algorithm like this were trained on data such that "skin color" is actually derived as a feature during back propagation. They don't exist, because one of the fundamental problems with ML is explainability. You can infer it through testing like above, but that isn't racism. It's a byproduct of the data set fed into the algorithm during training.
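(The "infer it through testing" point can be made concrete: a rough sketch of a black-box occlusion probe, where model is a stand-in for whatever scoring function is being tested; none of this is Twitter's code.)

import numpy as np

def occlusion_map(image, model, patch=32, stride=32, fill=127):
    # model: any function mapping an image to a single score (treated as a black box).
    h, w = image.shape[:2]
    baseline = model(image)
    ys = range(0, h - patch + 1, stride)
    xs = range(0, w - patch + 1, stride)
    heat = np.zeros((len(ys), len(xs)))
    for i, y in enumerate(ys):
        for j, x in enumerate(xs):
            occluded = image.copy()
            occluded[y:y + patch, x:x + patch] = fill   # cover one region with grey
            heat[i, j] = baseline - model(occluded)     # bigger drop = more influence
    return heat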

---
"History Is Much Like An Endless Waltz. The Three Beats Of War, Peace And Revolution Continue On Forever." - Gundam Wing: Endless Waltz
Tyranthraxus
09/20/20 8:51:22 PM
#20:


s0nicfan posted...
The fuck it isn't. If you work with ML then you'd understand that the machine learning algorithm isn't making decisions, but performing basic bucketed classifications based on training data given a set of features which, for imagery, are almost always the raw pixels. Claiming any decision it makes is "racist" is fundamentally attributing an issue of incomplete data to societal problems.

Like, find me a single ML expert who can actually show objectively that the hidden layers for an algorithm like this were trained on data such that "skin color" is actually derived as a feature during back propagation. They don't exist, because one of the fundamental problems with ML is explainability. You can infer it through testing like above, but that isn't racism. It's a byproduct of the data set fed into the algorithm during training.



---
It says right here in Matthew 16:4 "Jesus doth not need a giant Mecha."
https://imgur.com/dQgC4kv
Bad_Mojo
09/20/20 8:54:56 PM
#21:


The only question I have is based on this -

s0nicfan posted...


The user has created a whole bunch of pictures with Mitch at the very top and Obama at the very bottom with a lot of white space in between, and he's testing which part of the image Twitter chooses to center for the preview crop.

What happens if you put Mitch on the bottom and Obama on the top? Is it just taking the top one?

---
Tyranthraxus
09/20/20 8:55:44 PM
#22:


Bad_Mojo posted...
The only question I have is based on this -

What happens if you put Mitch on the bottom and Obama on the top? Is it just taking the top one?

The very first tweet did exactly that. Look at the pictures individually, don't just look at the preview.

---
It says right here in Matthew 16:4 "Jesus doth not need a giant Mecha."
https://imgur.com/dQgC4kv
Bad_Mojo
09/20/20 8:56:50 PM
#23:


Tyranthraxus posted...
The very first tweet did exactly that. Look at the pictures individually, don't just look at the preview.

Got it. Thanks

Edit - Yeah, I don't have a f'n clue how to read Twitter. I don't even know how to see the comments.

---
au_gold
09/20/20 9:01:51 PM
#24:


Tyranthraxus posted...
Twitter also responded to it, btw. Like this got an official response lol
What did Twitter say?

---
Let me talk to your mother. Get your mother please.
Tyranthraxus
09/20/20 9:03:06 PM
#25:


au_gold posted...
What did Twitter say?
https://twitter.com/TwitterComms/status/1307739940424359936?s=19

---
It says right here in Matthew 16:4 "Jesus doth not need a giant Mecha."
https://imgur.com/dQgC4kv
Anteaterking
09/20/20 9:10:12 PM
#26:


s0nicfan posted...
The f*** it isn't. If you work with ML then you'd understand that the machine learning algorithm isn't making decisions, but performing basic bucketed classifications based on training data given a set of features which, for imagery, are almost always the raw pixels. Claiming any decision it makes is "racist" is fundamentally attributing an issue of incomplete data to societal problems.

Like, find me a single ML expert who can actually show objectively that the hidden layers for an algorithm like this were trained on data such that "skin color" is actually derived as a feature during back propagation. They don't exist, because one of the fundamental problems with ML is explainability. You can infer it through testing like above, but that isn't racism. It's a byproduct of the data set fed into the algorithm during training.

Bro, this doesn't contradict what I said at all. You're just trying to pass the buck to "the data" even though any machine learning algorithm is made up of both the underlying algorithm (e.g. a generative adversarial network) and the actual parameters trained off of the data that was provided.

Also, I've worked with computer vision *specifically*, and using contrast between adjacent pixels to find differently shaded portions of the face and draw out the facial features that should be present is one of the easiest things for a neural net to figure out, ESPECIALLY if the training set is all white faces; those features aren't as strongly identified on faces with darker complexions.

If I use the output of my machine learning algorithm to do things that end up being racially biased because of how I trained my model (even if I wasn't intentionally doing so), I can't irresponsibly shrug and say "Oh well it's a mathematical model so it can't be racist".

Also just FYI, it's not as hard to determine "obvious" features from a neural net as you think. There is a problem with how abstract complicated models can get, but there are plenty of methods (principal component analysis, some algorithms for feature extraction for random forests, etc.) to determine some of the strongest factors.
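(A minimal sketch of the kind of analysis Anteaterking is pointing at, using scikit-learn on synthetic placeholder data; nothing here comes from Twitter's systems.)

import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier

# Placeholder data: 500 samples with 20 features; the labels depend on features 3 and 7.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20))
y = (X[:, 3] + 0.5 * X[:, 7] > 0).astype(int)

# PCA: how much variance the strongest directions in the inputs explain.
pca = PCA(n_components=5).fit(X)
print("variance explained:", pca.explained_variance_ratio_)

# Random forest importances: which inputs the trained model leans on hardest.
forest = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print("strongest features:", np.argsort(forest.feature_importances_)[::-1][:5])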


---
Laserion
09/20/20 9:15:55 PM
#27:


I'm guessing face recognition software is counting on eyes, nostrils and mouth to be darker than the rest of the face area, and that is easier if the face is lighter colored. On darker faces, it has more trouble making out the eyes/nostrils/mouth.
Perhaps the algorithm could be made to check the normal image and then check a gamma-modified version to find all faces before choosing.
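(A rough sketch of that two-pass idea, with the detector left as a stand-in; this is just Laserion's suggestion written out, not anything Twitter actually does.)

import numpy as np

def gamma_adjust(image, gamma=0.5):
    # uint8 image in, uint8 image out; gamma < 1 brightens shadows, which can help
    # when eyes/nostrils/mouth are not much darker than the surrounding skin.
    scaled = (image / 255.0) ** gamma
    return (scaled * 255).astype(np.uint8)

def detect_all_faces(image, detect_faces):
    # detect_faces is a stand-in for whatever face/saliency detector is in use.
    faces = list(detect_faces(image))
    faces += list(detect_faces(gamma_adjust(image)))  # second pass on the brightened copy
    return faces
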
---
There is no "would of", "should of" or "could of".
There is "would've", "should've" and "could've".
ButteryMales
09/20/20 9:19:58 PM
#28:


Remember when Microsoft's social media A.I. became a Nazi?
Bad_Mojo
09/20/20 9:23:31 PM
#29:


ButteryMales posted...
Remember when Microsoft's social media A.I. became a Nazi?

I don't. Is that because the AI was getting data from the users interacting with it and the humans were feeding it horrible things? I don't know how AI works, but I have seen that one YouTube channel that makes AI play games. CodeBullet, maybe?

---
Tyranthraxus
09/20/20 9:25:49 PM
#30:


Bad_Mojo posted...
I don't. Is that because the AI was getting data from the users interacting with it and the humans were feeding it horrible things? I don't know how AI works, but I have seen that one YouTube channel that makes AI play games. CodeBullet, maybe?

Yes in the Microsoft case people tweeted racist shit to it and it started copying them.

---
It says right here in Matthew 16:4 "Jesus doth not need a giant Mecha."
https://imgur.com/dQgC4kv
emblem boy
09/20/20 9:27:14 PM
#31:


emblem boy
09/20/20 9:30:37 PM
#32:


s0nicfan posted...
Anteaterking posted...
What's your basis for it being nonsense? I actually work with the mathematics of machine learning and "the output of a machine learning algorithm can be racist" is not a contentious claim at all.

The fuck it isn't. If you work with ML then you'd understand that the machine learning algorithm isn't making decisions, but performing basic bucketed classifications based on training data given a set of features which, for imagery, are almost always the raw pixels. Claiming any decision it makes is "racist" is fundamentally attributing an issue of incomplete data to societal problems.

Like, find me a single ML expert who can actually show objectively that the hidden layers for an algorithm like this were trained on data such that "skin color" is actually derived as a feature during back propagation. They don't exist, because one of the fundamental problems with ML is explainability. You can infer it through testing like above, but that isn't racism. It's a byproduct of the data set fed into the algorithm during training.


It's one of those things we should expect large companies that have the expertise to be able to anticipate and work on. It doesn't have to be called racist or whatever. I don't really care how it's labeled, but it's an error that should get visibility as we delve deeper into ML. We know biases can exist unintentionally and we should hold shit to a higher standard.

The Twitter post from above probably states my view of it a bit better.
---
Pitter-patter, let's get at 'er
Bad_Mojo
09/21/20 5:28:22 PM
#33:


I very much doubt it's anything other than a mistype, but why capitalize 'Black' and not 'white'? Just found it interesting.



What a shit story, though.

https://www.yahoo.com/news/nebraska-bar-owner-jake-gardner-012637139.html

Face the music for your crime.


---
DarthAragorn
09/21/20 5:30:51 PM
#34:


Bad_Mojo posted...
I very much doubt it's anything other than a mistype, but why capitalize 'Black' and not 'white'? Just found it interesting.


What a shit story, though.

https://www.yahoo.com/news/nebraska-bar-owner-jake-gardner-012637139.html

Face the music for your crime.


No, for some reason it's a new thing to capitalize black, intentionally
MixedRaceBaby
09/21/20 5:31:32 PM
#35:


*twitter computer picks lighter face to center on*
"Ok but how is this racist"
*picks it again*
"Still how is it racist"
*twitter picks the lighter of two black people*
"im literally colorblind guys im MLK i dont see race. how is it racist"
*twitter burns cross*
"I STILL DONT SEE IT GUYS"

---
For the mixed race babies!
Smashingpmkns
09/21/20 5:32:33 PM
#36:


Bad_Mojo
09/21/20 5:33:26 PM
#37:


DarthAragorn posted...
No, for some reason it's a new thing to capitalize black, intentionally

I'll give them the benefit of the doubt because I do it all the time, but I do it with a lot of other words as well. And it's random, lol.

---
s0nicfan
09/21/20 5:49:20 PM
#38:


https://www.bbc.com/news/technology-54234822
Twitter's chief design officer, Dantley Davis, found editing out Mr Madland's facial hair and glasses seemed to correct the problem - "because of the contrast with his skin".

Zehan Wang, a research engineering lead and co-founder of the neural networks company Magic Pony, which has been acquired by Twitter, said tests on the algorithm in 2017, using pairs of faces belonging to different ethnicities, had found "no significant bias between ethnicities (or genders)" - but Twitter would now review that study.

"There are many questions that will need time to dig into," he said.

"More details will be shared after internal teams have had a chance to look at it."

This is why you don't cry racism at stuff like this.

As for anteater's excellent post above, 2 comments:
  1. PCA will tell you the relative weights of the input features, but not the meaning behind the hidden layers, which was my criticism. You can show clusters of pixels are more important in parts of the image vs others, but you can't prove that layer 2 of X is featurized into "race" or even "skin color."
  2. I'm not trying to pass the buck or claim data is infallible, but surely you see how absolutely destructive the opposite is, where any algorithm that returns unexpected or undesirable results is "racist." Just look at this story... People immediately jumped to race when early indicators are that the issue is facial hair contrast, which might be just as bad in other cases and has nothing to do with "Twitter algorithms favor white people."



---
"History Is Much Like An Endless Waltz. The Three Beats Of War, Peace And Revolution Continue On Forever." - Gundam Wing: Endless Waltz
Smashingpmkns
09/21/20 5:53:09 PM
#39:


Ah yes. Barack Obama's facial hair.
---
AceMos
09/21/20 9:02:01 PM
#40:


https://twitter.com/sina_rawayama/status/1307506452786016257

more research done on this further showing it ALWAYS picks the white guy

---
3 things 1. i am female 2. i havea msucle probelm its hard for me to typ well 3.*does her janpuu dance*
Darmik
09/21/20 9:04:22 PM
#41:


If anything it seems like the challenge is to find a scenario where Twitter picks the black guy.

---
Kind Regards,
Darmik
Antifar
09/21/20 9:57:13 PM
#42:


https://twitter.com/aftertheboop/status/1308091057863888896
NightRender
09/21/20 10:07:48 PM
#44:


Antifar posted...
https://twitter.com/aftertheboop/status/1308091057863888896



---
Dedicated to D - 4/15/05