I came across this snippet; in and of itself, it is insignificant and unremarkable.
>Police are using software to predict crime. Is it a ‘holy grail’ or biased against minorities?
Being in the industry I am in, certain things cross in front of my eyes that make me stop and think for a moment.
Certain visual tech has very often ended up biased toward white people. I see it, but I don't think it is primarily racist: there are a bunch of us in the target disposable-income demographic, and the companies want to make money. Yeah, it sucks. I work on problems that bother me; anyone this particular matter bothers is free to invest their own resources... Anywho, I suppose visuals aren't the only trick in the bag that could be leveraged.
Here's a Look at How Color Film Was Originally Biased Toward White People
HP Face Tracking Webcams Don't Recognize Black People
iPhone X Racist, Apple Refunds Device Can't Tell Chinese People Apart
The examples above only touch on the image-processing facet of technology. The broader concern, that technical solutions in general end up applied unevenly across people with different physical characteristics or cultural patterns, is a very real issue. Right now the impact is minimal, but as always, that is subject to change.
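To make that concern a little more concrete, here is a minimal sketch in Python of the kind of per-group error-rate check that surfaces this sort of uneven behavior. Everything in it is hypothetical: the group labels, the records, and the imagined detector are placeholders I made up for illustration, not real measurements from any product mentioned above.

```python
# Minimal sketch (hypothetical data, hypothetical detector): checking whether
# a detector's miss rate differs across demographic groups.
from collections import defaultdict

# Each record: (group_label, ground_truth_face_present, detector_said_face).
# These rows are made-up placeholders, not real measurements.
records = [
    ("group_a", True, True),
    ("group_a", True, True),
    ("group_a", True, False),
    ("group_b", True, False),
    ("group_b", True, False),
    ("group_b", True, True),
]

misses = defaultdict(int)   # faces present but not detected, per group
totals = defaultdict(int)   # faces present, per group

for group, face_present, detected in records:
    if face_present:
        totals[group] += 1
        if not detected:
            misses[group] += 1

for group in sorted(totals):
    rate = misses[group] / totals[group]
    print(f"{group}: miss rate = {rate:.2f}")
```

If the miss rates diverge sharply between groups, the tool works better for some people than for others, which is exactly the uneven application described above.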