Using ChatGPT as a sort of "expert system" and asking it specific questions makes much more sense than asking open-ended questions like "write me a program to do ...". The state of the art is simply not good enough for it to produce decent code.
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
That's exactly the point many of those "devs' jobs are in danger" articles miss...
It might be a tool for doing some tasks faster and more comfortably, but it is not a replacement for a trained human brain.
If something has a solution... why worry about it? If it has no solution... what reason is there to worry about it?
Help me understand what I'm saying, and I'll explain it better to you
Rating helpful answers is nice, but saying thanks can be even nicer.
Life should not be a journey to the grave with the intention of arriving safely in a pretty and well-preserved body, but rather to skid in broadside in a cloud of smoke, thoroughly used up, totally worn out, and loudly proclaiming "Wow! What a Ride!" - Hunter S. Thompson - RIP
Having seen what it produces posted as "solutions" in Q&A, its code doesn't always (or indeed often) compile, let alone run and do the actual job.
I suspect this is a result of the source code it was trained on: if you process a large amount of student homework as actual code, then what you produce will be student homework.
And to get really useful stuff out, you have to specify exactly what you need pretty accurately in the first place - which is effectively the code you wanted to write anyway!
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
"Common sense is so rare these days, it should be classified as a super power" - Random T-shirt
AntiTwitter: @DalekDave is now a follower!
I definitely agree with your point about the need for accurate specifications. The ultimate example of such specification being a futile exercise would be the assembler code I often have to write for embedding on custom hardware.