Poll of the Day > I keep hearing facial recognition technology is racist and bad

Yellow
06/18/20 8:47:57 PM
#1:


Look... If it can't recognize black people, the answer is normalizing the brightness and contrast of your training data, not yelling at the computer.
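
Something like this is all I'm talking about; a rough sketch of per-image normalization (plain NumPy/PIL, the file names are made up):

# Rough sketch: per-image brightness/contrast normalization before training.
# Assumes grayscale face crops; the file paths are hypothetical.
import numpy as np
from PIL import Image

def normalize_face(path):
    img = np.asarray(Image.open(path).convert("L"), dtype=np.float32)
    # Zero mean, unit variance per image, so lighting and exposure differences
    # don't dominate what the network learns from the face itself.
    return (img - img.mean()) / (img.std() + 1e-6)

faces = [normalize_face(p) for p in ["face_001.jpg", "face_002.jpg"]]  # made-up paths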

Lay off my beautiful neural networks.

Seriously wtf. I want to hear actual arguments in favor of the idea that they're racist. If they can't recognize black people... that's racist against white people, because no one wants to be recognized by a random NN against their will.

https://www.bloomberg.com/news/articles/2020-06-12/trump-retweets-call-for-microsoft-ban-from-federal-contracts-kbc7pj17

Oh... Donald Trump is in favor of it. Sometimes politicians I like get TDS, but I can't blame them, because they're like 60+ years old and there's no way they could understand this.

---
zebatov
06/18/20 8:58:07 PM
#2:


Obviously you have no idea that there's no rock-bottom for the ones saying that. Literally anything can be racist to them.

---
C was right.
rjsilverthorn
06/18/20 8:59:52 PM
#3:


"The majority of commercial facial-recognition systems exhibit bias, according to a study from a federal agency released on Thursday, underscoring questions about a technology increasingly used by police departments and federal agencies to identify suspected criminals.
The systems falsely identified African-American and Asian faces 10 times to 100 times more than Caucasian faces, the National Institute of Standards and Technology reported on Thursday. Among a database of photos used by law enforcement agencies in the United States, the highest error rates came in identifying Native Americans, the study found.
The technology also had more difficulty identifying women than men. And it falsely identified older adults up to 10 times more than middle-aged adults."

https://www.nytimes.com/2019/12/19/technology/facial-recognition-bias.html

https://www.nytimes.com/2019/07/08/us/detroit-facial-recognition-cameras.html

https://scholar.google.com/scholar?q=facial+recognition+racial+bias&hl=en&as_sdt=0&as_vis=1&oi=scholart
Lokarin
06/18/20 9:02:25 PM
#4:


That just means the tech is shitty.

Remove the errors; Now what's wrong with it?

---
"Salt cures Everything!"
My YouTube: https://www.youtube.com/user/Nirakolov/videos
rjsilverthorn
06/18/20 9:06:45 PM
#5:


Lokarin posted...
That just means the tech is shitty.

Remove the errors; Now what's wrong with it?

I tend to agree; these are issues that can be resolved by using a larger and more diverse image pool when training.

Like a lot of new technology, there is a fear of it being abused, but that is honestly a risk with almost all technology.
TigerTycoon
06/18/20 9:09:47 PM
#6:


I would say that, in general, a surveillance society where the government can find anyone, anywhere, at any time is pretty messed up, not specifically because it can identify people of certain races.

FYI, recognition software, as far as I'm aware, is not focused on color but on facial structure. Different races do, however, have characteristic patterns of facial structure.
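
For example, the popular open-source face_recognition package matches on a learned 128-number encoding of the face plus a distance threshold, not a direct pixel or color comparison. A sketch (image files made up):

# Sketch of a typical matching pipeline using the open-source face_recognition package.
# Encode each face as a 128-dimensional vector, then compare by distance.
import face_recognition

known = face_recognition.face_encodings(
    face_recognition.load_image_file("known_person.jpg"))[0]   # hypothetical file
probe = face_recognition.face_encodings(
    face_recognition.load_image_file("camera_frame.jpg"))[0]   # hypothetical file

distance = face_recognition.face_distance([known], probe)[0]
print("same person?", distance < 0.6)  # 0.6 is the library's usual default tolerance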
---
YOU COULDN'T AFFORD IT!
Yellow
06/18/20 9:27:38 PM
#7:


rjsilverthorn posted...
The systems falsely identified African-American and Asian faces 10 times to 100 times more than Caucasian faces
Sounds like they're not diversifying and normalizing their training data correctly. They probably obtained their training data from a completely anonymized source that had more young white male samples. Here is a case where you actually want 100% diversity among ages, sexes, and races for the sake of properly learning all facial structures.

For shame, MS. Amateur hour. I would say scrape Facebook for faces and record their ages and race, but that would be illegal. You would have to write a second NN that identifies the sex, race, and age of your training data so you could weight it properly for your initial facial recognition software.
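
Roughly what I mean by weighting, as a sketch; predict_group() stands in for that hypothetical second NN, it is not a real library call:

from collections import Counter

# Sketch: inverse-frequency sample weights from predicted demographic groups.
# predict_group() is a stand-in for the hypothetical second NN described above.
def sample_weights(images, predict_group):
    groups = [predict_group(img) for img in images]  # e.g. ("female", "black", "20s")
    counts = Counter(groups)
    # Rare groups get proportionally larger weights, so the face model sees them
    # as often as the over-represented ones during training.
    return [len(images) / (len(counts) * counts[g]) for g in groups]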

---
Yellow
06/18/20 9:41:31 PM
#8:


This is a fundamental problem with anonymized data sets. People aren't OK with having their demographics tied to their anonymized data, which makes it very hard to control for biases: as far as the algorithm is concerned, there is no such thing as a white or black person if you don't tell it, especially when your pictures of white or black people don't come with a "white" or "black" tag. The algorithm just goes, "I see this person 10% of the time, so there's no reason I have to prioritize my skill in recognizing this particular face."

Tl;dr: you can't re-weight anonymized data to correct its biases, and this oversight gets a political spotlight as a moral issue rather than a logistical one. They could solve it by getting less anonymous data. That's harder and generally less legal.

I hope someone understood how I put it. :/ Kind of hard to explain.
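
Toy numbers, completely made up, to show what I mean:

# Made-up numbers: 90% of training faces come from group A, 10% from group B.
# The model is great on A and coin-flip bad on B, yet the overall score looks fine.
share_a, share_b = 0.9, 0.1
acc_a, acc_b = 0.99, 0.50
overall = share_a * acc_a + share_b * acc_b
print(f"overall accuracy: {overall:.2%}")  # 94.10%, looks perfectly healthy
# Without a group tag on each image, this single number is all the training
# process ever optimizes, so nothing pushes it to fix group B.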

---
Yellow
06/18/20 9:48:30 PM
#9:


https://drive.google.com/file/d/0B7EVK8r0v71pblRyaVFSWGxPY0U/view

Well, if you're using the less anonymous CelebFaces Attributes Dataset (CelebA), you have the following tags to work with:

5_o_Clock_Shadow Arched_Eyebrows Attractive Bags_Under_Eyes Bald Bangs Big_Lips Big_Nose Black_Hair Blond_Hair Blurry Brown_Hair Bushy_Eyebrows Chubby Double_Chin Eyeglasses Goatee Gray_Hair Heavy_Makeup High_Cheekbones Male Mouth_Slightly_Open Mustache Narrow_Eyes No_Beard Oval_Face Pale_Skin Pointy_Nose Receding_Hairline Rosy_Cheeks Sideburns Smiling Straight_Hair Wavy_Hair Wearing_Earrings Wearing_Hat Wearing_Lipstick Wearing_Necklace Wearing_Necktie Young

Of course, then your algorithm favors recognition of celebrities, which is another problem. :)
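
If you wanted to see how skewed those tags actually are, something like this would tally them (assuming the usual list_attr_celeba.txt layout: a count line, a header line of attribute names, then one row of -1/1 flags per image):

from collections import Counter

# Tally how often each CelebA attribute is flagged present (1) vs absent (-1).
# Assumes the standard list_attr_celeba.txt layout; the path is hypothetical.
counts = Counter()
total = 0
with open("list_attr_celeba.txt") as f:
    f.readline()                  # first line: number of images
    names = f.readline().split()  # second line: the 40 attribute names
    for line in f:
        flags = line.split()[1:]  # drop the image filename
        total += 1
        counts.update(n for n, v in zip(names, flags) if v == "1")

for name in ("Pale_Skin", "Male", "Young"):
    print(name, f"{counts[name] / total:.1%}")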

---
TigerTycoon
06/18/20 9:48:30 PM
#10:


Yellow posted...
Kind of hard to explain.

Algorithms have no inherent bias (unless specifically programmed to); however, if not programmed with any biases, they will interpret patterns as fact, which people can then interpret as bias.
---
YOU COULDN'T AFFORD IT!
Yellow
06/18/20 9:52:24 PM
#11:


TigerTycoon posted...
Algorithms have no inherent bias (unless specifically programmed to); however, if not programmed with any biases, they will interpret patterns as fact, which people can then interpret as bias.
If your input data has a bias, there will be a bias in your algorithm, though.

Say 90% of your images are trains and 10% are people: your algorithm will start forgetting what people look like and just get better at identifying trains. That's the nature of a NN; it forgets less common/important details. It is a very limited technology atm.
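
In PyTorch, for example, you would push back on that 90/10 split with class weights in the loss. A sketch with made-up numbers:

import torch
import torch.nn as nn

# Sketch: counteracting a 90/10 trains/people split with a class-weighted loss.
# The weights are just the inverse of each class's share of the data.
class_share = torch.tensor([0.9, 0.1])      # trains, people
loss_fn = nn.CrossEntropyLoss(weight=1.0 / class_share)

logits = torch.randn(8, 2)                  # stand-in batch of model outputs
labels = torch.randint(0, 2, (8,))          # stand-in labels
loss = loss_fn(logits, labels)              # mistakes on "people" now cost 9x more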

---
Mead
06/18/20 9:53:37 PM
#12:


Yeah I don't really understand it either

---
The Betrayer
rjsilverthorn
06/18/20 9:54:03 PM
#13:


Yellow posted...
I would say scrape Facebook for faces and record their ages and race, but that would be illegal.

Yeah, someone was doing something very similar to that https://mashable.com/article/ibm-flickr-images-training-facial-recognition-system/
Krazy_Kirby
06/18/20 10:01:34 PM
#14:


zebatov posted...
Obviously you have no idea that there's no rock-bottom for the ones saying that. Literally anything can be racist to them.


that's racist
---
Yellow
06/18/20 10:26:25 PM
#15:


rjsilverthorn posted...
Yeah, someone was doing something very similar to that https://mashable.com/article/ibm-flickr-images-training-facial-recognition-system/
I like it when I go on a rant about a hypothetical solution only to be validated like this

In January, IBM revealed its new "Diversity in Faces" dataset with the goal to make facial recognition systems fairer and better at identifying a diverse range of faces. AI algorithms have had difficulty in the past recognising women and people of colour.
And it's... probably illegal on some basis.

If NNs could categorize data better, they would be able to refine less vital skills. They would learn by themselves what a black person and a white person look like.

I wish I had more time to work on a custom learning algorithm, and I'm not going to lie, it's really hard to do something like that and focus on anything irl.

---
Zeus
06/19/20 5:35:48 AM
#16:


Wow, how amazing, a topic about facial recognition just days after John Oliver does a piece on the subject on LWT. What are the odds? >_>

Yellow posted...
Look... If it can't recognize black people, the answer is normalizing the brightness and contrast of your training data, not yelling at the computer.

Lay off my beautiful neural networks.

Seriously wtf. I want to hear actual arguments in favor of the idea that they're racist. If they can't recognize black people... that's racist against white people, because no one wants to be recognized by a random NN against their will.

When people only have one drum they can beat, they pound it with all of their might. Everything is "racist" because they can't think of a new criticism that will automatically rile people up.

---
(\/)(\/)|-|
There are precious few at ease / With moral ambiguities / So we act as though they don't exist.
adjl
06/19/20 9:36:38 AM
#17:


Yellow posted...
If they can't recognize black people... that's racist against white people, because no one wants to be recognized by a random NN against their will.

The issue isn't that they can't recognize black people, it's that they have more difficulty differentiating between individual black faces and are therefore more likely to flag an innocent black person as the target and result in a mistaken arrest. Given that black people already face an inordinate amount of "you resemble a suspect so I'm going to accost you" regardless of whether or not they've done anything, the notion of implementing automated systems that perpetuate (if not worsen) that problem instead of fixing it is naturally going to be considered a problem.

Fundamentally, the concept of using the technology is sound, it's just got limitations that are born of the same racist biases that currently pose a problem for manual facial recognition. There's room to be better, and saying that it's good enough and doesn't need to be improved because its shortcomings are only really a problem for non-white people is indeed racist.
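
To put illustrative numbers on that (made up, not the NIST figures, just the shape of the problem):

# Illustrative only: how a higher false match rate turns into more wrongful flags.
# The rates and gallery size are made up, not taken from the NIST study.
gallery_size = 1_000_000             # faces a probe image is compared against
fmr_white, fmr_black = 1e-5, 1e-4    # a "10x worse" false match rate
print("expected false matches, white probe:", round(gallery_size * fmr_white))  # 10
print("expected false matches, black probe:", round(gallery_size * fmr_black))  # 100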

---
This is my signature. It exists to keep people from skipping the last line of my posts.
streamofthesky
06/19/20 10:23:39 AM
#18:


Put aside the issue of whether it's racist or not.
It's fucking creepy and disturbing, and I don't want the government and all sorts of companies and individuals tracking me by facial recognition.
Clearview AI in particular just scrapes photos of you online, which includes ones taken of you / with you in the background by someone else and posted online without your awareness or consent. So no, there is no such thing as "opting out" of this brave new world.

Zeus posted...
Wow, how amazing, a topic about facial recognition just days after John Oliver does a piece on the subject on LWT. What are the odds? >_>
Aren't you against big government?
Shouldn't the use of this technology freak you the hell out?
Since you referenced the John Oliver segment, what are your thoughts on how police nabbed the faces of Baltimore protestors and used it to harass them later on? Are you ok with that? If so, will you suddenly find it appalling when a Democrat is in office and it's used on people at an NRA rally?
OniRonin
06/19/20 11:32:23 AM
#19:


neural networks are a tool for laundering the creator's bias through an 'algorithm' to masquerade as factual. this is why every facial recognition software ever made has horrible biases

---
god is dumb
#NotAllGamers #YesAllLandlods
DrYuya
06/19/20 1:29:57 PM
#20:


zebatov posted...
Obviously you have no idea that there's no rock-bottom for the ones saying that. Literally anything can be racist to them.


Yeah that's what I still don't get about far left ideas and/or BLM nonsense.

If
  1. being their definition of racist is tantamount to being an ultimate sinner who deserves to be homeless and jobless on the street and it fills them with so much hate they can't even begin to understand or care about how actually racist the other person really is
  2. every little thing is racist
  3. every little celebrity's or political figure's tweet/comment is always racist for sure and makes them certified racist regardless of when/how/any context they made the tweet/comment with...


Then it almost seems just by virtue of the sheer ridiculous levels of racism everywhere and for everyone that float about all the time...that it shouldn't be as much of a shell shocker to them each and every time (they perceive) it happens. You'd just have to get tired...being that angry and finding that much "legit" racism.

So my standing theory has always been they are just bored and will get over the phase eventually.

---
It's time to kick ass and chew bubblegum, and I'm all out of ass but still have plenty of bubblegum to chew at my leisure.
Mead
06/19/20 1:45:27 PM
#21:


DrYuya posted...
Yeah that's what I still don't get about far left ideas and/or BLM nonsense.

If
1. being their definition of racist is tantamount to being an ultimate sinner who deserves to be homeless and jobless on the street and it fills them with so much hate they can't even begin to understand or care about how actually racist the other person really is
2. every little thing is racist
3. every little celebrity's or political figure's tweet/comment is always racist for sure and makes them certified racist regardless of when/how/any context they made the tweet/comment with...

Then it almost seems just by virtue of the sheer ridiculous levels of racism everywhere and for everyone that float about all the time...that it shouldn't be as much of a shell shocker to them each and every time (they perceive) it happens. You'd just have to get tired...being that angry and finding that much "legit" racism.

So my standing theory has always been they are just bored and will get over the phase eventually.

probably not a good idea to make broad judgements based on the most extreme and vocal users of social media; most people, whether on the left or the right, are pretty even-keeled

but you do you

---
The Betrayer
Zeus
06/19/20 2:18:45 PM
#22:


adjl posted...
The issue isn't that they can't recognize black people, it's that they have more difficulty differentiating between individual black faces and are therefore more likely to flag an innocent black person as the target and result in a mistaken arrest.

An issue solved as simply as just having an officer actually look at a photo before arresting a suspect, something they generally do anyway.

streamofthesky posted...
Aren't you against big government?
Shouldn't the use of this technology freak you the hell out?

I'm not in favor of the technology for general use (particularly setting up cameras everywhere to gather data, which is part of a practice I already dislike). It shouldn't be something that's always going, like the Eye of Sauron (to use the same tired metaphor Oliver did)

streamofthesky posted...
what are your thoughts on how police nabbed the faces of Baltimore protestors and used it to harass them later on?

That's where I differ somewhat. The riots are active criminal situations (or a direct precursor to criminal situations) where the police are stretched thin. While I object to mass surveillance in day-to-day life, this is kind of a martial law exception so it's harder to take issue. I'm not such an ivory tower idealist to not recognize the necessity of certain actions to restore order in a lawless situation. Which isn't to say that I support something like keeping a long-term, active database of protestors, just that certain activities should be used to arrest suspects for criminal conduct once order has been restored (ie, that riots shouldn't be allowed to be free-for-alls where anybody who does something can get away with it)

streamofthesky posted...
If so, will you suddenly find it appalling when a Democrat is in office and it's used on people at an NRA rally?

Has there ever been an NRA rally that descended into looting and rioting? Because I can't think of any; can you? If NRA rallies were resulting in burned-down buildings, robbed businesses, etc., and the technology could be used to flag the culprits, it would certainly seem like an acceptable use, don't you agree? Obviously I'm not suggesting that it be used solely to hassle people who committed no crime.

That said, the NRA is already super-easy to track because you have like a dozen existing databases so it's not like that'd make much difference.

---
(\/)(\/)|-|
There are precious few at ease / With moral ambiguities / So we act as though they don't exist.