- cross-posted to:
- fuck_ai@lemmy.world
Good, end this AI bullshit. It has few upsides and a metric fuckton of downsides for the common man.
This is exactly what social media companies have been doing for a while (it’s free, yes): they use your data to train their algorithms to squeeze more money out of people. They get a tangible, monetary benefit from our collective data. These AI companies want to train their AI on our hard work and then profit from it. How is this not seen as theft, or, even if they aren’t doing it just yet, how is it not seen as an attempt at theft?
How come people (other than the tech savvy) are unable to see how they are being exploited? These companies are not currently working towards any UBI bills or policies in governments that I am aware of. Since they want to take our work and use it to make themselves and their investors rich, why do they think they are justified in using people’s work? It just seems so slimy.
“The plagiarism machine will break without more things to plagiarize.”
Over in the US, that would be framed as giving China the advantage in AI development. Won’t happen.
No, actually they’ve just finally admitted that they can’t improve them any further, because there’s not enough training data in existence to squeeze any more diminishing returns out of.
Apparently they’re trying to get Deepseek banned again. Really doesn’t like competition, this guy.
looks good
Oops, oh well. I very much hope it’s over, asshole.
I am good with that.
Perhaps this is just a problem with the way the model works: always requiring new data, unable to ponder and expand upon the data it already has or make new connections about the ideas that influenced the author… LLMs are a smoke-and-mirrors show, not a real intelligence.
They do seem fundamentally limited somehow. For all the bazillion watts they consume, they’re a cheap imitation at best compared to the mere 20 watts of a human brain.
over it is then. Buh bye!
Business that stole everyone’s information to train a model complains that businesses can steal information to train models.
Yeah I’ll pour one out for folks who promised to open-source their model and then backed out the moment the money appeared… Wankers.
It’s so wild how laws just have no idea what to do with you if you add one layer of proxy. “Nooo, I’m not stealing and plagiarizing, it’s the AI doing it!”
National security, my ass. More like his window to show off more dumb “achievements” while getting richer depends on it, and nothing else.
If I’m using “AI” to generate subtitles for the “community”, it’s okay if I have a large “datastore” of “licensable media” stored locally to work off of, right?