• GardenVarietyAnxiety@lemmy.world
    · 2 months ago

    This is being done by PEOPLE. PEOPLE are using AI to do this.

    I’m not defending AI, but we need to focus on the operator as much as the tool.

    • kibiz0r@midwest.social
      · 2 months ago

      Technology is not neutral.

      Especially for a tool that’s specifically marketed for people to delegate decision-making to it, we need to seriously question the person-tool separation.

      That alleged separation is what lets gig-economy apps abuse their workers in ways no flesh-and-blood boss would get away with, what enables RealPage’s decentralized price-fixing cartel, and what underwrites any number of instances of “math-washing” used to justify discrimination.

      The entire big tech ethos is basically to do horrible shit in such tiny increments that there is no single instance to meaningfully prosecute. (Edit: As always, Mike Judge is relevant: https://youtu.be/yZjCQ3T5yXo)

      We need to take this seriously. Language is perhaps the single most important invention of our species, and we’re at risk of the social equivalent of Kessler Syndrome. And for what? So we can write “thank you” notes quicker?

    • zib@lemmy.world
      · 2 months ago

      You bring up a good point. In addition to regulating the tool, we should also punish the people who maliciously abuse it.

      • GardenVarietyAnxiety@lemmy.world
        · 2 months ago

        Regulate it because it’s being abused, and hold the abusers accountable, yeah.

        I always see the names of the models being turned into boogeymen, but the only people’s names we ever see are those behind the big, seemingly untouchable ones.

        “Look at this scary model” vs “Look at this person being a dick”

        We’re being told what to be afraid of, not who is responsible for it, because fear sells and there’s nothing we can actually do with it.

        Just my perception, of course.

    • Saleh@feddit.org
      · 2 months ago

      I mean, the tool is also being made by people. And there are people who pointed out that a tool which is great at spitting out plausible-sounding text with no factual grounding could be badly abused for spreading misinformation. There have been ethics boards among the people who make these tools who took those concerns on board and raised them within their companies, subsequently getting ousted for putting ethical concerns before short-term profits.

      The question is, how much is it just a tool, and how much is it intrinsically linked with the unethical, greedy people pushing it onto the world?

      E.g. a Cybertruck is also just a car, and one could say the truck itself is not to blame. But it is the very embodiment of the problems of the people involved.

      • merc@sh.itjust.works
        · 2 months ago

        subsequently getting ousted for putting ethical concerns before short term profits.

        The irony is that there are no profits. The companies selling generative AI are losing such vast sums of money that it’s difficult to wrap your head around them.

        What they’re focused on isn’t short-term profits, it’s being the biggest, most dominant firm whenever AI does eventually become profitable, which might take decades.