"Like so many applications of AI, this new power is likely to be a double-edged sword: It may help people identify the locations of old snapshots from relatives, or allow field biologists to conduct rapid surveys of entire regions for invasive plant species, to name but a few of many likely beneficial applications.

“But it also could be used to expose information about individuals that they never intended to share,” says Jay Stanley, a senior policy analyst at the American Civil Liberties Union who studies technology. Stanley worries that similar technology, which he feels will almost certainly become widely available, could be used for government surveillance, corporate tracking or even stalking.

  • afraid_of_zombies@lemmy.world · 9 months ago

    Yes, and people like me have continued to point out that this problem stems from a bad view of the expectation of privacy.

    A non-famous person has a reasonable expectation of privacy on public property. If you take a photo and a non-famous person’s face is in it, you should have written consent for that specific photo, or blur the face out. If Disney can own an image of a mouse for 95 fucking years, I can own my own image.

    Don’t take pictures of people or their property without consent. Just because technology allows you to be a disgusting creep doesn’t mean you should. If you want jerk off material just use the internet like the rest of us.

    • DessertStorms@kbin.social · 9 months ago

      “If you want jerk off material just use the internet like the rest of us.”

      The kind of thing this can be used for is about ten stages past jerking off, and into stalker territory. So a person already using the internet for jerking off can now pinpoint exactly where the person they’re jerking off to lives, potentially turn up at their house, and escalate from there. This is beyond just creepy (and exploitative, in the case of corporations using the info); it’s potentially putting lives at risk.

        • DessertStorms@kbin.social · 9 months ago

          I never asked you to do anything? Just pointing out that things are much more serious than your comment makes out. I also don’t see how what you said is a problem we can solve now and is OK to focus on, but what I added somehow isn’t…

  • AutoTL;DR@lemmings.world (bot) · 9 months ago

    This is the best summary I could come up with:


    The project, known as Predicting Image Geolocations (or PIGEON, for short), was designed by three Stanford graduate students in order to identify locations on Google Street View.

    But it also could be used to expose information about individuals that they never intended to share, says Jay Stanley, a senior policy analyst at the American Civil Liberties Union who studies technology.

    It’s a neural network program that can learn about visual images just by reading text about them, and it’s built by OpenAI, the same company that makes ChatGPT.
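    To make that one-line description concrete, here is a minimal sketch of the underlying technique (CLIP-style zero-shot scoring): one image is scored against candidate text prompts using OpenAI’s publicly released CLIP weights via the Hugging Face transformers library. This illustrates the general idea only, not PIGEON’s actual pipeline; the file name and prompts below are hypothetical.

    from PIL import Image
    from transformers import CLIPModel, CLIPProcessor

    # OpenAI's public CLIP checkpoint: the "neural network program that can
    # learn about visual images just by reading text about them".
    model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
    processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

    image = Image.open("street_view.jpg")  # hypothetical input photo
    prompts = [  # hypothetical candidate locations
        "a Street View photo taken in Idaho, United States",
        "a Street View photo taken on the South Island of New Zealand",
    ]

    inputs = processor(text=prompts, images=image,
                       return_tensors="pt", padding=True)
    outputs = model(**inputs)

    # logits_per_image holds image-text similarity scores; softmax turns
    # them into a probability distribution over the candidate prompts.
    probs = outputs.logits_per_image.softmax(dim=-1)
    for prompt, p in zip(prompts, probs[0].tolist()):
        print(f"{p:.2%}  {prompt}")

    The real system reportedly builds further on this kind of model with additional training on Street View imagery, but the zero-shot image-versus-text scoring above is the core trick the summary is pointing at.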

    Rainbolt is a legend in geoguessing circles — he recently geolocated a photo of a random tree in Illinois, just for kicks — but he met his match with PIGEON.

    And it guessed that a picture of the Snake River Canyon in Idaho was of the Kawarau Gorge in New Zealand (in fairness, the two landscapes look remarkably similar).

    They’ve written a paper on their technique, which they co-authored along with their professor, Chelsea Finn — but they’ve held back from making their full model publicly available, precisely because of these concerns, they say.


    The original article contains 1,049 words, the summary contains 181 words. Saved 83%. I’m a bot and I’m open source!