• ShaunaTheDead@fedia.io · 5 months ago

    Reminds me of an early application of AI where scientists were training a model to tell the difference between a wolf and a dog. It got really good at it on the training data, but it wasn't working correctly in actual application. So they had the model produce a heatmap of which pixels it weighted most when deciding whether a canine was a dog or a wolf, and they discovered that it wasn't even looking at the animal; it was looking at the surrounding environment. If there was snow on the ground, it said "wolf"; otherwise it said "dog".
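
    The debugging trick described here (a per-pixel heatmap of what the model relies on) can be sketched with a toy occlusion-saliency loop. Everything below is a hypothetical stand-in, not the actual study: the "classifier" is a fake that secretly keys on bright (snowy) background pixels, exactly the failure mode in the anecdote.

    ```python
    import numpy as np

    # Toy "classifier": scores how wolf-like an image is. Hypothetical
    # stand-in for the trained model -- it secretly keys on bright (snowy)
    # background pixels and ignores the animal in the center entirely.
    def wolf_score(img):
        background = img.copy()
        background[4:12, 4:12] = 0      # drop the central "animal" region
        return background.mean()        # bright border (snow) -> high score

    # Occlusion saliency: zero out each patch in turn and record how much
    # the score drops. Big drops mark the pixels the model actually uses.
    def occlusion_saliency(img, patch=4):
        base = wolf_score(img)
        sal = np.zeros_like(img)
        for i in range(0, img.shape[0], patch):
            for j in range(0, img.shape[1], patch):
                occluded = img.copy()
                occluded[i:i + patch, j:j + patch] = 0
                sal[i:i + patch, j:j + patch] = base - wolf_score(occluded)
        return sal

    # 16x16 image: bright "snowy" background, darker animal in the middle.
    img = np.full((16, 16), 0.9)
    img[4:12, 4:12] = 0.2

    sal = occlusion_saliency(img)
    # Saliency is high on the background and zero on the animal region,
    # revealing that the model never looks at the animal at all.
    print(sal[0, 0] > sal[8, 8])
    ```

    Running this prints `True`: occluding a background patch changes the score, occluding the animal changes nothing, which is exactly the kind of evidence the researchers in the story would have seen.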

  • Th4tGuyII@fedia.io · 5 months ago

    The idea of AI-automated job interviews sickens me. How little of a fuck do you have to give about applicants that you can't even be bothered to have a single person interview them??

  • cheddar@programming.dev · 5 months ago

    One of my favorite examples is when a company from India (I think?) trained a model to regulate subway gates. The system was supposed to analyze footage and open more gates when there were more people, and vice versa. It worked well until one holiday when there were no people, yet all the gates were open. They eventually discovered that the system had been reading the clock visible in the footage rather than counting people.

  • TAG@lemmy.world · 5 months ago

    That reminds me of the time, quite a few years ago, when Amazon tried to automate resume screening. They trained a machine learning model on anonymized resumes and whether each candidate was hired, then inspected what the model was actually looking at. It had effectively taught itself how to reject women.

  • Colonel Panic@lemm.ee · 5 months ago

    I really hate that we are calling this wave of technology "AI", because it isn't. Sure, it's "machine learning", but really it's just brute-force pattern recognition v2.0.

    Both the desired outcomes you define and the data you train it on carry a LOT of built-in biases.

    It's a cool technology, I guess, but it's being overused and misused across the board by every company with FOMO, each hoping to gain some profit edge over the competition. How about we have AI replace the bullshit CEO and VP positions instead of trying to replace fast-food drive-through workers and Internet content?

    I guess that's nothing new for humans… One human invents the spear for fishing, and the rest use it to hit each other over the head.