- cross-posted to:
- lemmyshitpost@lemmy.world
Reminds me of an early application of AI where scientists were training a model to tell the difference between a wolf and a dog. It performed really well on the training data, but it wasn’t working correctly in actual application. So they got the model to produce a heatmap of which pixels it was relying on most to decide whether a canine was a dog or a wolf, and they discovered that it wasn’t even looking at the animal; it was looking at the surrounding environment. If there was snow on the ground, it said “wolf”, otherwise it said “dog”.
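The heatmap trick described above can be sketched as occlusion-based saliency: slide a neutral patch over the image and see where the classifier's confidence drops most. A minimal toy sketch, where the `snow_score` "classifier" and the synthetic image are both made up purely for illustration:

```python
import numpy as np

def snow_score(img):
    # Toy stand-in for a trained classifier: "wolf" confidence
    # rises with the fraction of bright (snow-like) pixels.
    return (img > 0.8).mean()

def occlusion_heatmap(img, score_fn, patch=4):
    """Occlude each patch with neutral gray; a large score drop
    means the model was 'looking' at that region."""
    base = score_fn(img)
    h, w = img.shape
    heat = np.zeros((h // patch, w // patch))
    for i in range(0, h, patch):
        for j in range(0, w, patch):
            occluded = img.copy()
            occluded[i:i + patch, j:j + patch] = 0.5  # gray patch
            heat[i // patch, j // patch] = base - score_fn(occluded)
    return heat

# Synthetic 16x16 "photo": dark top half, snowy bottom half.
img = np.full((16, 16), 0.3)
img[8:, :] = 0.9  # snow

heat = occlusion_heatmap(img, snow_score)
print(heat.round(3))  # hot cells are all in the snowy bottom rows
```

The heatmap lights up only over the snow, exactly the failure mode in the anecdote: the "evidence" for wolf is the background, not the animal.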
Early chess engines that used AI were trained on games of GMs, and the engines would go out of their way to sacrifice the queen, because when GMs do it, it usually comes with a victory.
That’s funny because if I was trying to tell the difference between a wolf and a dog I would look for ‘is it in the woods?’ and ‘how big is it relative to what’s around it?’.
What about telling the difference between a wolf and grandmother?
Look for a bonnet. Wolves don’t wear bonnets.
Yeah, that’s a grandmother, so what?
The idea of AI-automated job interviews sickens me. How little of a fuck do you have to give about applicants that you can’t even be bothered to have a single person interview them??
One of my favorite examples is when a company from India (I think?) trained their model to regulate subway gates. The system was supposed to analyze footage and open more gates when there were more people, and vice versa. It worked well until one holiday when there were no people, but all gates were open. They eventually discovered that the system was looking at the clock visible on the video, rather than the number of people.
That reminds me of the time, quite a few years ago, Amazon tried to automate resume screening. They trained a machine learning model with anonymized resumes and whether the candidate was hired. Then they looked at what the AI was looking at. The model had trained itself on how to reject women.
I really hate that we are calling this wave of technology “AI”, because it isn’t. It is “Machine Learning” sure, but it is just brute force pattern recognition v2.0.
Both the desired outcomes you define and the data you train it on have a LOT of built-in biases.
It’s a cool technology I guess, but it’s being overused and misused across the board by every company with FOMO, hoping to get some profit edge on the competition. How about we have AI replace the bullshit CEO and VP positions instead of trying to replace fast food drive-through workers and Internet content?
I guess that’s nothing new for humans… One human invents the spear for fishing and the rest use them to hit each other over the head.