[I literally had this thought in the shower this morning so please don’t gatekeep me lol.]

If AI were something everyone wanted or needed, it wouldn’t be constantly shoved in your face by every product. People would just use it.

Imagine if printers were new and every piece of software was like “Hey, I can put this on paper for you” every time you typed a word. That would be insane. Printing is a need, and when you need to print, you just print.

  • utopiah@lemmy.world · 7 days ago

    Warning: I think AI in its current hype form, meaning commercial GenAI and LLMs, is absolute bullshit. The results are just bad, the resources required are absolutely ridiculous, and maybe worse than those two combined (which are already reason enough to reject it en masse), it is structured to create dependencies on very few actors.

    Yet… (you saw that coming!) just because 99.99% of it is bad doesn’t mean the average consumer suddenly leverages the remaining 0.01% properly.

    What they (OpenAI, Claude, M$, NVIDIA, Google, Meta, etc.) are looking for is product/market fit. They do have a product (arguably) and a market (millions if not billions of users of their other products), with even a minuscule fraction of those people trying their new AI-based tools… and yet nobody actually knows what the “killer app” truly is.

    They are investing everything they don’t spend on actual R&D or infrastructure in finding out … what it’s actually for. They have no clue.

  • Andy@slrpnk.net · 6 days ago

    I really love your analogy. I’m imagining early 90s Windows and AOL bombarding folks with pop ups that say ‘want to take this with you? Print it!’ and ‘Did you know you can print anytime you like with our new dedicated keyboard print button?’ and ‘Try our new cassette music player, now printer-powered to give you the best sound you’ve ever heard!’

  • yeehaw@lemmy.ca · 7 days ago

    It’s crammed in everywhere for awareness, so shareholders know about it.

    That’s my take.

    Because right now, the general populace thinks AI is some unicorn magic that will fix all the things.

  • untorquer@lemmy.world · 7 days ago

    I have found one use for it: getting information from behind login/paywalls.

    It still feels gross to use AI at all though. It’s like putting my hand in toilet water.

    The market flooding is a classic Silicon Valley strategy of free today, charge tomorrow, except this time they’re over-invested: financially, in the global supply of GPUs, and in land with viable power infrastructure.

  • NigelFrobisher@aussie.zone · 7 days ago

    It’s the only thing holding the US economy afloat now. Do you want to fight your neighbours for the last piece of hardtack?

  • jannaultheal@lemmy.world · 7 days ago

    Not sure where you’re going with that analogy. The vast majority of word processors do have a button that lets you print the document.

    • thermal_shock@lemmy.world · 7 days ago

      He’s saying that if printing were shoved in your face as much as AI is, everyone would be skeptical of it too. AI is a bit much nowadays; I fucking hate hearing about it. I’m in IT.

  • wewbull@feddit.uk · 9 days ago

    I think that it’s an astute observation. AI wouldn’t need to be hyped by those running AI companies if the value was self-evident. Personally I’ve yet to see any use beyond an advanced version of Clippy.

    • Karyoplasma@discuss.tchncs.de · 9 days ago

      I use it to romanize Farsi song lyrics. I cannot read the script, and ChatGPT can. The downside is that you have to do it a few lines at a time or else it starts hallucinating about halfway through. There is no other tool that does this reliably; the one I used before, from the University of Tehran, seems to have stopped working.

      • biofaust@lemmy.world · 8 days ago

        Did the same yesterday with some Russian songs and was told by my Russian date that it was an excellent result.

        • Karyoplasma@discuss.tchncs.de · 7 days ago

          Yeah, Russian is quite a bit easier to romanize, so it should work even better. For Cyrillic, you can just replace each character with its romanized variant, but this doesn’t work for Farsi because they usually leave out non-initial vowels, so if you took the same approach you’d get something unreadable lol
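
          A rough sketch of that per-character replacement approach in Python (the mapping table here is partial and illustrative, not any official romanization standard):

          ```python
          # Per-character Cyrillic-to-Latin replacement, as described above.
          # The table is deliberately incomplete; real standards (BGN/PCGN,
          # ISO 9, etc.) differ in details like ё, й, and the hard/soft signs.
          CYRILLIC_TO_LATIN = {
              "а": "a", "б": "b", "в": "v", "г": "g", "д": "d", "е": "e",
              "ж": "zh", "з": "z", "и": "i", "й": "y", "к": "k", "л": "l",
              "м": "m", "н": "n", "о": "o", "п": "p", "р": "r", "с": "s",
              "т": "t", "у": "u", "ф": "f", "х": "kh", "ц": "ts", "ч": "ch",
              "ш": "sh", "щ": "shch", "ъ": "", "ы": "y", "ь": "", "э": "e",
              "ю": "yu", "я": "ya",
          }

          def romanize(text: str) -> str:
              # Swap each Cyrillic letter for a Latin equivalent; leave anything
              # not in the table (spaces, punctuation, Latin letters) untouched.
              return "".join(CYRILLIC_TO_LATIN.get(ch.lower(), ch) for ch in text)

          print(romanize("привет"))  # -> "privet"
          ```

          Farsi breaks this because, as noted above, most vowels simply aren’t written, so there’s nothing in the text to map them from.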

      • sigezayaq@startrek.website · 8 days ago

        I use it to learn a niche language. There’s not a lot of learning materials online for that language, but somehow ChatGPT knows it well enough to be able to explain grammar rules to me and check my writing.

      • chellomere@lemmy.world · 8 days ago

        Interesting use case. Sometimes you can find romanizations on lyricstranslate, but this is kinda hit and miss.

    • village604@adultswim.fan · 9 days ago

      That’s just not true at all. Plenty of products are hyped where the value is self-evident; it’s just advertising.

      People have to know about your product to use it.

      • wewbull@feddit.uk · 9 days ago

        There’s a difference between hype and advertising.

        For one, advertising is regulated.

      • RedstoneValley@sh.itjust.works · 8 days ago

        It’s not “just advertising”. It’s trying to force AI into absolutely everything. It’s trying to force people to use it, without giving a shit whether customers even want the product. This is way, way worse than “just advertising”.

      • The_v@lemmy.world · 9 days ago

        There’s a vast difference between advertising a good product that is useful and hyping trash.

        Good products at a reasonable price usually require a brief introduction but quickly snowball into customer based word-of-mouth sales.

        Hype is used to push an inferior or marginally useful product at a higher price.

        Remember, advertising is expensive. The money to pay for it has to come from somewhere. The more they push a product, the higher the margin the company/investors expect to make on its sales.

        This is why if I see more than one or two ads for a product it goes on my mental checklist of shit not to buy.

      • snooggums@piefed.world · 8 days ago

        Shoving AI into everything and forcing people to interact with it, even when dismissing all the fucking prompts, is not advertising.

        • Tollana1234567@lemmy.today · 8 days ago

          It means these companies are losing money keeping the AI datacenters open, so they need some way to recoup some of the money they spent: by shoveling it into the products they sell, or by selling it to a sucker who is willing to implement AI everywhere. As the subs discussed, it’s going to be retail that ends up with the useless AI.

      • brbposting@sh.itjust.works · 8 days ago

        You’re right that the use cases are very real. Double-checking (just kidding, I’d never check in the first place) privacy policies (then actually reading(!) a couple of lines out of the original 1000 pages)… surfacing search results even when you forgot the specific verbiage used in an article or your document…

        Do you also see some ham-fisted attempts at shoehorning language models into places where they (current gen) don’t add much value?

  • Zachariah@lemmy.world · 9 days ago

    My top reasons I have no interest in AI:

    • if it was great, it wouldn’t be pushed on us (like 3D TVs were)
    • there is no accountability, so how can it be trusted without human verification, which then means AI wasn’t needed
    • environmental impact
    • privacy/security degradation
    • Blue_Morpho@lemmy.world · 8 days ago

      it wouldn’t be pushed on us

      The Internet was pushed on everyone. AOL and all other ISPs would mail CDs to everyone completely unsolicited. You’d buy a new PC and there would be a link to AOL on the desktop.

      how can it be trusted without human verification

      You use Google despite no human verification. Yahoo used to function based on human curated lists.

      environmental impact

      I did the math and posted it on Lemmy. The environmental footprint of AI is big, but actually less than the cost to develop a new 3D game (of which hundreds come out every year). Using AI takes about the same energy as playing a 3D game.

      I see people pointing fingers at data centers the same way car riders look at the large cloud of diesel smoke coming out of a bus and assume buses are a big pollution source. There are 100M active Fortnite players. An average gaming PC uses 400 W. That means Fortnite players alone use 40,000,000,000 watts.
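
      Spelled out, that back-of-the-envelope figure is just headcount times wattage (a sketch under the stated, and contested, assumptions: every one of the 100M players on a ~400 W PC, all playing at once):

      ```python
      # Back-of-the-envelope estimate using the figures cited above.
      players = 100_000_000      # active Fortnite players (assumed all concurrent)
      watts_per_pc = 400         # assumed average gaming-PC draw
      total_watts = players * watts_per_pc
      print(f"{total_watts:,} W")  # 40,000,000,000 W, i.e. 40 GW
      ```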

      It’s a problem because it’s as if everyone is now playing 3D games all the time instead of only in their off time.

      • eatCasserole@lemmy.world · 8 days ago

        40,000,000,000 watts

        This doesn’t add up though. Fortnite’s player base is only about 10% PC, and the system requirements are pretty modest. It’ll even run on Intel integrated graphics, according to the minimum requirements from Epic.

        There’s even a modest chunk (~6%) on the Nintendo Switch, which, according to Nintendo, draws about 7 watts when playing a game in TV mode.

        • skulblaka@sh.itjust.works · 8 days ago

          Not to mention, the true resource cost of an AI model comes from training. Sure, it costs about as much processing and power as a video game to prompt a trained model; I can believe that. However, it takes many thousands of times as much power and processing to train one, and we aren’t even close to halfway through training any general LLM to the point of being actually useful.

          • Blue_Morpho@lemmy.world · 8 days ago

            I referenced training above. Training cost is less than developer costs. Thousands of artists on high-end PCs in office space use more energy than a data center, but no one notices because people are spread out across offices.

        • Blue_Morpho@lemmy.world · 8 days ago

          I didn’t realize Fortnite was played mainly on other platforms!

          Fortnite’s player base is only about 10% PC,

          PlayStation: 42.2%
          Xbox: 28.8%
          Nintendo Switch: 12%
          PC: 11%
          Mobile (iOS, Android): 6%

          https://millionmilestech.com/fortnite-user/#%3A~%3Atext=continue+reading+below.-%2CFortnite+Player+Count%2C(as+of+October+2023).

          PS5, Xbox are both 200+ watts.

          So assuming Mobile and Nintendo Switch power use is 0, and all PCs only use 200 watts, that’s still 8,000,000,000 watts. For 1 game.

      • Zachariah@lemmy.world · 8 days ago

        it wouldn’t be pushed on us

        The Internet was pushed on everyone.

        Sure, companies were excited to promote it, but it was primarily adopted because a very large number of people were excited about it.

        how can it be trusted without human verification

        You use Google despite no human verification. Yahoo used to function based on human curated lists.

        I use DuckDuckGo to find sources, not answers. I won’t use them again if they’re trash. They’re accountable for their content.

        Human curated lists are still very helpful. In a sense, that was the value of Reddit.

        environmental impact

        I did the math and posted it on Lemmy.

        I’ll take your word for it.

        • Blue_Morpho@lemmy.world · 8 days ago

          a very large amount of people being excited about it.

          A very large number of people are excited by AI. People were excited by pet rocks, too.

          I use DuckDuckGo to find sources, not answers.

          DuckDuckGo is Bing with privacy. When you get a Google AI summary, it lists links to the sources.

          • Zachariah@lemmy.world · 8 days ago

            The push-to-excitement ratio was different for the early internet than it is for AI.

            Using those sources would verify the Google summary. For me, it is an unnecessary step. I can just go read the sources directly and skip the summary since I’ll need to read them anyway to verify the summary.

      • Modern_medicine_isnt@lemmy.world · 8 days ago

        Sounds like you forgot to consider the energy cost of developing each AI model. Developing and maintaining a model is vastly more energy-intensive than 3D game dev. Keep in mind that you can ship a 3D game and ramp down GPU use once development ends, but an AI model has to be constantly updated, mostly by completely retraining. Also, no one was clamoring to build massive data centers just to develop one game. Yet they are for one model.

          • atomicbocks@sh.itjust.works · 8 days ago

            Fair point. Though I would then argue it’s the World Wide Web that was being pushed by AOL in the same way that it’s LLMs that are being pushed today.

      • akacastor@lemmy.world · 7 days ago

        The Internet was pushed on everyone. AOL and all other ISPs would mail CDs to everyone completely unsolicited. You’d buy a new PC and there would be a link to AOL on the desktop.

        Are you 15? If so, you might read this and believe the above is true. Those of us elderly folks who lived through the 80s and 90s laugh at this AI shill propaganda.

        They “would mail CDs to everyone completely unsolicited” - yeah, that was called advertising, because there was huge consumer demand and a race to be the company that met that demand. AOL sent CDs (incredibly inexpensive to manufacture) as advertising, hoping consumers would choose AOL instead of the competition by making AOL the easiest choice: consumers already had the required software (software distribution was a challenge in that time before the internet was ubiquitous).

        The dot-com boom was not a new technology being pushed onto consumers; it was the opposite: a new technology existed and consumers were embracing it, and many companies speculated on how to gain ownership of markets as they shifted online. (The bust that followed was fueled by over-ambitious speculation about scale and timeframes.)

        Anyway, AOL mailing CDs was late in the era, it was much better when they were mailing floppy disks we could reuse.

        • Blue_Morpho@lemmy.world · 6 days ago

          Those of us elderly folks who lived through the 80s and 90s laugh at this AI shill propaganda.

          Dude. I’m not only old, but I worked for Vint Cerf and later was president of one of the companies mass-mailing CDs to everyone. I ran so many commercials on TV that I had a customer call up and say, “please stop!” Sports stadiums were named after ISPs. Roads were renamed after ISPs. It was a massive advertising push because people were buying. The only thing that has outstripped internet adoption rates is AI adoption rates, which is why there’s an even bigger advertising push.

          I have tried AI but don’t generally use it. I don’t use Facebook either. But I’m not going to pretend people don’t use Facebook because I don’t like it and don’t use it.

  • Grandwolf319@sh.itjust.works · 9 days ago

    If AI truly was the next frontier, we wouldn’t be staring at the start of another depression (or a bad recession). There would be a revolution of innovations and most people’s lives would improve.

    • Blue_Morpho@lemmy.world · 9 days ago

      The idea that technological improvements would improve everyone’s life is based on the premise that capitalists wouldn’t keep the productivity gains for themselves.

      AI does offer some efficiency improvements. But the workers won’t get that money.

  • Underwaterbob@sh.itjust.works · 8 days ago

    Long ago, I’d make a Google search for something, and be able to see the answer in the previews of my search results, so I’d never have to actually click on the links.

    Then, websites adapted by burying answers further down the page so you couldn’t see them in the previews and you’d have to give them traffic.

    Now, AI just fucking summarizes every result into an answer that has a ~70% chance of being correct, no one gets traffic anymore, and the results are less reliable than ever.

    Make it stop!

  • Krudler@lemmy.world · 8 days ago

    AI has become a self-enfeeblement tool.

    I am aware that most people are not analytically minded, and I know most people don’t lust for knowledge. I also know that people generally don’t want their wrong ideas corrected by a person, because it provokes negative feelings of self-worth, but they’re happy being told self-satisfying lies by AI.

    To me it is the ultimate gamble with one’s own thought autonomy, and an abandonment of truth in favor of false comfort.

    • Iced Raktajino@startrek.website (OP) · 8 days ago

      To me it is the ultimate gamble with one’s own thought autonomy, and an abandonment of truth in favor of false comfort.

      So, like church? lol

      No wonder there’s so much worrying overlap between religion and AI.

  • bridgeenjoyer@sh.itjust.works · 8 days ago

    Had the exact same thought. If it was revolutionary and innovative we would be praising it and actual tech people would love it.

    Guess who actually loves it? Authoritarians and corporations. Yay.

    • jkercher@programming.dev · 8 days ago

      Similar thought… If it was so revolutionary and innovative, I wouldn’t have access to it. The AI companies would be keeping it to themselves. From a software perspective, they would be releasing their own operating systems and browsers and whatnot.

  • Baggie@lemmy.zip · 8 days ago

    LLMs are a really cool toy, I would lose my shit over them if they weren’t a catalyst for the whole of western society having an oopsie economic crash moment.

  • python@lemmy.world · 8 days ago

    I’ve been wondering about a similar thing recently: if AI is this big, life-changing thing, why were there so few rumblings among tech-savvy people before it became “mainstream”? Sure, machine learning was somewhat talked about, but very little of it seemed to relate to LLM-style machine learning. With basically every other technological innovation, the nerds tended to have it years before everyone else, so why was it so different with AI?

    • Rekorse@sh.itjust.works · 8 days ago

      Because AI is a solution to a problem individuals don’t have. Over the last 20 years we have collected and compiled an absurd amount of data on everyone. So much that the biggest problem is how to make that data useful by analyzing and searching it. AI is the tool that completes the other half of data collection: analysis. It was never meant for normal people, and it’s not being funded by average people either.

      Sam Altman is also a fucking idiot yes-man who could talk himself into literally any position. If this was meant to help society, the AI products wouldn’t be assisting people with killing themselves so that they can collect data on suicide.

    • fezcamel@lemmy.zip · 8 days ago

      And additionally, I’ve never seen an actual tech-savvy nerd who supports its implementation, especially in these draconian ways.

    • MajorasMaskForever@lemmy.world · 8 days ago

      Realistically, computational power

      The more number-crunching units and memory you throw at the problem, the easier it is and the more useful the final model is. The math and theoretical computer science behind LLMs has been known for decades; it’s just that the resource investment required to make something even mediocre was too much for any business type to be willing to sign off on. My fellow nerds and I had the technology and largely dismissed it as worthless, or a set of pipe dreams.

      But then number-crunching units and memory became cheap enough that a couple of investors were willing to take the risk, and you get a model like the first ChatGPT. It talks close enough to a human that it catches business types’ attention as a revolutionary new thing, and without the technical background to know they were being lied to, the venture capital machine cranks out the shit show we have today.

    • vin@lemmynsfw.com · 8 days ago

      The sizes are different. Before “AI” went mainstream, those in machine learning were very excited about word2vec and reinforcement learning, for example. And it was known that there would be improvement with larger neural networks, but I’m not sure anyone knew for certain how well ChatGPT would work. Given the costs of training and inference for LLMs, I doubt you could see nerds doing it themselves. Also, previously you didn’t have big tech firms. Not the current behemoths, anyway.