RIP Philo

Even AIs know that men are better than women

Amazon scraps secret AI recruiting tool that showed bias against women:

https://www.reuters.com/article/us-amazon-com-jobs-automation-insight/amazon-scraps-secret-ai-recruiting-tool-that-showed-bias-against-women-idUSKCN1MK08G
Permalink NewsBot 
October 10th, 2018 8:55am
Amazon's explanation for this is complete bullshit.
Permalink Tyrone Stallion 
October 10th, 2018 9:00am
Let's measure some unbiased metrics:

Hours worked
Work weekends
Work antisocial hours
Dangerous work
Days off sick
Learn stuff in spare time
Work faster
Better results

We must stop measuring these things; they're clearly sexist.
Permalink Davy Crockett 
October 10th, 2018 1:23pm
> Gender bias was not the only issue. Problems with the data that underpinned the models’ judgments meant that unqualified candidates were often recommended for all manner of jobs, the people said. With the technology returning results almost at random, Amazon shut down the project, they said.

These other issues are the likely real reason. Neural networks are hard, and nearly impossible to debug. If it were just a matter of sex, they could simply have started over with a new training set that was 50/50.
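Rebalancing is cheap, too. A throwaway sketch (Python; assume each resume is a dict with a "gender" field, which is made up for the example):

    import random

    def rebalance_50_50(resumes, seed=0):
        # Downsample the majority class so the training set is 50/50 by gender.
        rng = random.Random(seed)
        men = [r for r in resumes if r["gender"] == "M"]
        women = [r for r in resumes if r["gender"] == "F"]
        n = min(len(men), len(women))
        balanced = rng.sample(men, n) + rng.sample(women, n)
        rng.shuffle(balanced)
        return balanced

Downsampling throws data away; reweighting the examples instead is the other obvious option. Either way it's an afternoon of work, which is the point.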

The other problems convinced them they were out of their depth, but they couldn't admit that publicly, so they blamed sex instead and, as a bonus, levelled up their woman fu. That guy got a big promotion and a holiday in Thailand.
Permalink ,ndo 
October 10th, 2018 6:03pm
It's also possible that the humans who made the training set were

- biased against women
- not capable of picking resumes better than random
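
A quick way to test that second possibility: check whether two screeners even agree with each other beyond chance. A minimal Cohen's kappa sketch in Python (the ratings are made up):

    from collections import Counter

    def cohens_kappa(rater_a, rater_b):
        # Observed agreement minus the agreement expected by chance.
        n = len(rater_a)
        observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        freq_a, freq_b = Counter(rater_a), Counter(rater_b)
        expected = sum(freq_a[k] * freq_b.get(k, 0) for k in freq_a) / (n * n)
        return (observed - expected) / (1 - expected)

    # Two screeners, the same eight resumes (1 = advance, 0 = reject).
    print(cohens_kappa([1, 0, 1, 1, 0, 0, 1, 0],
                       [0, 1, 1, 0, 0, 1, 1, 0]))  # 0.0, i.e. pure chance

If kappa hovers near zero across real screener pairs, "better than random" is a hard claim to defend.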
Permalink FSK 
October 10th, 2018 6:21pm
Their AI algorithms worked about as well as the ones that decide whom to kill with a drone in Yemen.
Permalink McCain's Tumor 
October 10th, 2018 8:16pm
"the humans who made the training set were ... not capable of picking resumes better than random"

Let's look at that one.

Picking good candidates from a pile of resumes is a known problem in the industry, and it afflicts all the major employers. They simply can't figure it out.

As you point out, they can't figure it out, so they build a training set from... their past bad decisions. Why should an AI suddenly do better? At best it will match the abysmal performance of hand selection.

Using fancy backpropagation to fit the weights of a neural network works if the data you train on is accurately labelled.

There's no such accuracy in their training data. Why would they expect garbage in not to produce garbage out?
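
You can see the garbage-out end of it with a toy experiment: fit a classifier on coin-flip labels and the held-out accuracy lands at chance, no matter the optimiser. A throwaway sketch (Python, assuming scikit-learn; the "resume features" here are just random numbers):

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    X = rng.normal(size=(2000, 20))    # stand-in "resume features"
    y = rng.integers(0, 2, size=2000)  # labels no better than coin flips

    # Train on half, score on the other half: accuracy comes out near 0.5
    # because there is nothing real in the labels to learn.
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5,
                                              random_state=0)
    model = LogisticRegression().fit(X_tr, y_tr)
    print(model.score(X_te, y_te))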
Permalink McCain's Tumor 
October 10th, 2018 8:20pm
Because "computers"?
Permalink Legion 
October 10th, 2018 11:08pm
At no point is it stated that the resulting hires were bad for the job, only that it didn’t select enough waaammmen.
Permalink John Henry 
October 10th, 2018 11:17pm