Horrifying privacy implications aside, AI has really become the new cryptocurrency.
Don’t get me wrong, both technologies are interesting, but it’s tiring to see both be forced into applications that functioned just fine without them.
But what about my Web 3.0 AI cryptocurrency in the metaverse?
“An app that lets you make nfts from images created by a camera in the metaverse”
Isn’t that already a thing? That surely has to be a thing already.
It’s arguably worse, since it seems to be more pervasive than crypto and NFTs were at their peak.
Crypto never really hit the mainstream, and even NFTs were still fringe. Whereas AI and AI accelerators are packed into basically every new phone and (Intel) processor.
There are way more use cases for the average person than crypto, so that’s only natural. There’s also a trust issue with crypto that doesn’t exist with AI, as well as the risk of losing your money when things go wrong.
That being said, I don’t approve of this, nor of adding it randomly to products where it clearly has little use. If people want generative software, they can just choose to install it.
Why call out Intel? Pretty sure AMD and Nvidia are both putting dedicated AI hardware in all of their new and upcoming product lines. From what I understand, they’re even generally doing it better than Intel. Hell, Qualcomm is advertising the AI performance of its new chips, and so is Apple. I don’t think there’s anyone in the chip world that isn’t hopping on the AI train.
Because I was only aware of Intel (and Apple) doing it on computers, whereas most major flagship mobile devices have those accelerators now.
GPUs were excluded, since they’re not as universal as processors are. A dedicated video card is still by and large considered an enthusiast part.
Fair enough. I was just asking because the choice of company surprised me. AMD is putting “AI Engines” in their new CPUs (a separate silicon design from their GPUs), and while Nvidia largely sells GPUs, which are less universal, they’ve had dedicated AI hardware (tensor cores) in their offerings for the past three generations. If anything, Intel is barely keeping up with its competition in this area. (For the record, I see vanishingly little value in the focus on AI as a consumer, so this isn’t really a ding on Intel in my books; it’s more an observation from a market-forces perspective.)