Communique for #BlackMastodon and Black folk only:

(White folk can listen too if they want, but this conversation is not for them).

The people telling you to be very afraid of Artificial General Intelligence don't know what they're talking about. Remember, their last big predictions were:
* Monkey jpegs are now money. Buy crypto.
* Elon will be great for Twitter
* Listening to VCs talk on Clubhouse is the next big social network
* Adam Neumann is a genius, and we should give him more money

These people are very very rich, not very very smart. The track record of their judgment speaks for itself.

There are very real, very serious risks from Machine Learning in general, but they are not the risks that these delusional dudes are talking about.

The real risks are not "coming in the near future." They are here with us today, and they affect systems that impact marginalized communities the most.

Most of the experts on the real risks are from marginalized communities.

All this talk of artificial general intelligence is a head fake to draw attention from the massive and real harms that can be caused by ML systems today.

Today's systems can:
* Issue a warrant for your arrest, based on facial recognition, for a crime you didn't commit
* Decide that you are a pre-trial flight risk and deny you pre-trial release
* Give a false diagnosis at a hospital, deciding that you are not worth putting on life support
* Tell a car to run you over as you cross the street

@mekkaokereke
I saw something on TV that said facial recognition works less well on people who are not white, which would increase the risk of people of colour and Black people being arrested for something they haven't done. That's one of my worries about facial recognition being used by the police, and it's not like the police have a great track record in their dealings with members of ethnic minorities anyway, even without that.

@AutisticMumTo3 @mekkaokereke I found a classic video of it, from back when facial recognition was new. It was clear that the data used to train the software didn't have enough Black people.

https://youtu.be/t4DT3tQqgRM

This sort of design goes back to the ~~nature~~ origin of film itself, and it shows up again and again and again with every technology. We're literally afterthoughts. Make the camera, make it beautiful, then make it work for Black people.

(YouTube: "HP computers are racist")
@wolfkin @AutisticMumTo3 @mekkaokereke White person popping in to drop a couple of links on how it's not the nature of film but white people's decisions about how film should perform:
- the Shirley card https://www.vox.com/2015/9/18/9348821/photography-race-bias
- the exposure range Polaroid picked
https://www.theguardian.com/artanddesign/2013/jan/25/racism-colour-photography-exhibition
(Vox: "Color film was built for white people. Here's what it did to dark skin. The biased film was fixed in the 1990s, so why do so many photos still distort darker skin?")

@marypcbuk @wolfkin @AutisticMumTo3 @mekkaokereke I first realized that it's a choice as an Anglo getting film developed in South India - there, they default to darker skin (as makes sense), so I looked radioactive in most of the photos. In the rolls I developed in the US, my Tamil friends were extra dark and indistinct.

From what I hear, modern digital photography has made great strides on working for everyone.

@Dangandblast @wolfkin @AutisticMumTo3 @mekkaokereke only because some people got out and pushed digital photography (which is just software) to work better for a wider range of skin tones; it's not like technology is automatically neutral (or neutral in any way, actually)
@marypcbuk @Dangandblast @wolfkin @mekkaokereke
One of the problems with facial recognition software is that it is often trained primarily on white faces.