If your use for AI is just rubber duck debugging, an actual rubber duck is significantly less environmentally destructive, and will still be around after OpenAI burn through all their seed capital and can no longer convince investors to keep throwing money into their trash fire.
A lot of assumptions in that post, but I agree in general. I’ve been working in software for too long, and I know Copilot, or whatever Microsoft calls it now, won’t be around in the long run.
It’s a neat tool for now. It also isn’t complaining all the time, which I like.
I didn’t know there was a term for that. Forget debugging: in my algorithms class this is how I figured a lot of stuff out the first time, too. I actually don’t know how else you’re supposed to do it. You imagine running through a loop, work out what specific tasks must be accomplished, and then code those tasks.
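A rough sketch of what that narrate-then-code process might look like, using a classic algorithms-class problem (maximum subarray sum) as a hypothetical example; the comments are the "rubber duck" narration, and the code is just those narrated tasks written down:

```python
def max_subarray_sum(nums):
    """Largest sum of any contiguous run in nums (Kadane's algorithm)."""
    best = current = nums[0]
    for n in nums[1:]:
        # "At each element, I either extend the current run with n,
        #  or start a fresh run at n if the old run was dragging me down..."
        current = max(n, current + n)
        # "...and I keep track of the best total I've seen so far."
        best = max(best, current)
    return best

print(max_subarray_sum([-2, 1, -3, 4, -1, 2, 1, -5, 4]))  # prints 6
```

Saying each iteration's job out loud before writing it is the whole trick; the code almost transcribes itself.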
Rubber duck programming
Which is precisely what I use AI for.
It doesn’t know how to do anything, but it can help me find flaws.
Imagine then doing this in a ChatGPT prompt! Everything will go so much faster, even if you don’t press “send”!