ChatGPT has a style over substance trick that seems to dupe people into thinking it's smart, researchers found
(www.businessinsider.com)
If you need a correct answer, you're doing it wrong!
I'm joking, of course, but there's a seed of truth: I've found ChatGPT's wrong or incomplete answers to be incredibly helpful as a starting point. Sometimes it suggests a Python module I didn't even know about that does half my work for me. Other times the answer is mostly nonsense, but the one line I actually need is correct (or close enough for me to understand).
Nobody should be copying code off Stack Overflow without understanding it, either.