• Andy@slrpnk.net · 3 months ago

      I don’t think it’s secret. A lot of OpenAI’s business strategy is to warn of the danger of their own project as a means of hyping it.

      OpenAI, despite having produced a pretty novel product, doesn’t really have a sound business model. LLMs are actually expensive to run. The energy and processing are not cheap, and it’s really not clear that they produce something of value. It’s a cool party trick, but a lot of the use cases just aren’t cost effective at this point. That makes their innovation hard to commercialize. So OpenAI promotes itself the way online clickbait games do.

      You know the ones that are like, ‘WARNING: This game is so sexy it is ADDICTIVE! Do NOT play our game if you don’t want to CUM TOO HARD!’

      That’s OpenAI’s marketing strategy.

      • JackbyDev@programming.dev · 3 months ago

        Creator of AI: “This shit is super dangerous. We need to be regulated. Please regulate us so no one else can construct an AI!”

        It’s just trying to squash competition.

    • Socialist Mormon Satanist@lemmy.world · 3 months ago

      Have you read some of the comments here and on Reddit?! Dude, it IS true already.

      Even in this thread I got downvoted for digging at people who have AI girlfriends. lmao

      • Chozo@fedia.io · 3 months ago

        Even in this thread I got downvoted for digging at people who have AI girlfriends. lmao

        Probably because you’re being mean and picking on people who clearly have some mental health issues. What, were you expecting a round of applause for punching down?

            • Socialist Mormon Satanist@lemmy.world · edited · 3 months ago

              So it sounds like you need to get downvoted more than me.

              I didn’t accuse them of having mental health issues. That’s all you, brother.

              But I DO think that having an AI girlfriend is pathetic. And saying that made a bunch of people mad.

              And it hasn’t changed my mind at all. lol

              • Chozo@fedia.io · 3 months ago

                I didn’t say you accused them of having mental health issues. I said that you targeted a group of mentally unwell people to insult them. Those aren’t the same thing.

                But I DO think that having an ai girlfriend is pathetic. And saying that made a bunch of people mad.

                What’s pathetic is getting so worked up about it and starting multiple arguments in the thread over it. At least the folks with AI partners aren’t being assholes to everyone about it; what’s your excuse?

                EDIT: Never mind, I just noticed your account’s not even half a day old. Evading a ban, I take it?

                Don’t answer, because I don’t care.

  • Socialist Mormon Satanist@lemmy.world · 3 months ago

    Loser redditors are already making ai be their fake “girlfriends.”

    As in legit, they see it as their girlfriend. So yes, people have already become emotionally reliant.

    What the fuck is up with you guys not being able to have a girlfriend in real life?! lmao

  • MagicShel@programming.dev · 3 months ago

    Having used it a couple of times… I’m not seeing it. It’s the same ChatGPT that talks about everything like it’s a lecture. I haven’t tried to jailbreak it into hot girlfriend AI or anything, but it seems like that would feel a lot more awkward in voice chat than text.