When you use a hammer or a drill, do you expect it to sometimes miss the nail or fail to drive the screw?
If ChatGPT is a tool for knowledge transfer/extraction, then it can't be allowed to hallucinate, lie to you, be wrong, or make stuff up.
If it's a tool for potentially discovering some knowledge that may be true and almost always needs to be verified, either by a compiler or by a follow-up "find me a reference/discussion" Google search, to make sure it's accurate, then sure. But I don't think that's what it's primarily being advertised as.
Beyond the obvious fact that you can accidentally hit your thumb with a hammer or strip the head off a screw with a screwdriver, I’d very much like to hear about any tool for collecting knowledge that is perfect.
Web searches will for sure give you wrong answers. Even professors or other experts in a field will be wrong sometimes. Heck, even Einstein got some things wrong.
Your goalpost is in the wrong spot. Tools don’t need to be and probably never can be perfect. But that doesn’t mean they’re not useful.
If you hit yourself with a hammer, that’s your fault because you did something wrong. Comparatively, you can pose a completely correct and unambiguous question to ChatGPT and still get a wrong answer. I don’t like this implicit shifting of blame to the user when it’s the tool that is flawed.
The design of the hammer makes it easy to miss the nail, and you need some skill in order to use it effectively. A nail gun is an example of a better tool for driving nails, since it’s faster and allows for greater accuracy.
Similarly, you can ask ChatGPT for an answer and it might get something wrong. It takes some skill to know how to interpret and verify the response. If a user takes the response as truth without question, it’s partially the user’s fault.
But it's not the user's fault if the question is correct and unambiguous. To continue with the hammer analogy, that's like landing a perfect hit on a nail and the hammer's head somehow falling off and ricocheting into the user's eye.
I just don’t see what point you are trying to make here. Yes, ChatGPT can give the wrong answer given a correct input, but that doesn’t mean it’s not a useful tool.
Think about how GPS can give a bad route, especially if there is construction or snow on the road.
Or how keyboard autocorrect sometimes changes what you wrote into something silly and wrong, even if you originally spelled the word correctly.
Or how OCR and speech-to-text software sometimes makes mistakes.
Or how Google Translate uses unnatural or incorrect word choices sometimes.
Are these not useful tools even though they get things wrong?
> Or how keyboard autocorrect sometimes changes what you wrote into something
> silly and wrong, even if you originally spelled the word correctly.
> Or how OCR and speech-to-text software sometimes makes mistakes.
> Or how Google Translate uses unnatural or incorrect word choices sometimes.
When you point this out, no one will jump in to defend them. If you say the same about ChatGPT, your inbox will suddenly be full of people telling you you're just using it wrong (see this reply chain, for example).
> When you use a hammer or a drill, do you expect it to sometimes miss the nail or fail to drive the screw?
Not sure about drills, but this absolutely happens with drivers if you fumble mating the bit to the screw head, or if you miss the stud, or if you overtighten, or if you don't sometimes pre-drill, or if you strip the head, or if you don't correctly gauge underlying material composition, or thickness, or if you...