3am
Great, now we’ll have separate “california-model” ai-models, like cars.
Flight of the bumblebee.
Don’t blame emo for what can be explained by YAGNI.
Technically correct.
I’m sorry it troubles your mind so much that you had to make a post about it.
Personally witnessed or it didn’t happen.
“Unrelatedly”… concession prices double… again…
Multiple endings! Refund, pay-up, audit, and no-knock raid!
Does it subtly move under battery power?
This reminds me of my old phone. I downloaded a podcast on it that had a shock-opener and for some reason was always “the next thing” the sound/music player wanted to play. So many times, by accidental touch inputs or clicking the headphone button, or the like, my phone would randomly scream: "WHO DOESN’T LIKE TO PEE IN THE SINK!?!?!”
Fluxcapaciwich
Both practically and theoretically, it might be impossible. It basically comes down to trusting trust. https://www.youtube.com/watch?v=SJ7lOus1FzQ
Even if AI never actually takes someone’s job, it’s clear that the hype surrounding it can displace workers, and its use in screening candidates may prevent you from finding another.
I hate to make it even weirder, but getting… erm… “unborn”… is not quite the same as being killed. Methinks it would be more like a scifi movie where an alien force absorbs everyone on Earth.
Technically true, proportionally.
I’m not talking about one-offs and the assessment noise floor, more like: “ChatGPT broke the Turing test” (as is claimed). It used to be something we tried to attain, and now we don’t even bother trying to make GPT seem human… we actually train these models to say otherwise lest people forget. We figuratively pole-vaulted over the Turing test and are now on the other side of it, as if it were a point on a timeline instead of an academic procedure.
The natural general hype is not new… I even see it in 1970s sci-fi. It’s like once something pierced the long-thought-impossible Turing test, decades of hype pressure suddenly and freely flowed.
There is also an unnatural hype: that with one breakthrough will come another, and that the next one might yield a technocratic singularity for the first mover: money, market dominance, and control.
Which brings us to the tertiary effect (closer to your question)… companies are so quickly and blindly eating so many billions of dollars of first-mover costs that the corporate copium wants to believe there will be a return (or at least cost defrayal)… so you get a bunch of shitty AI products, and pressure towards them.
Are you saying that you DON’T walk that close to other pedestrians?
They show a phone on life support, so maybe they dumped it from RAM?