You're starting off with a distinction without a difference.

You could throw darts at a spinning wheel with real names and imagined crimes.

The point is that it doesn't matter what the seed for the false statement is; it's the act of spreading it that's problematic.

You're also muddling a point that I can agree with: Treating ChatGPT as an infallible expert is wrong.

But that applies to so many other things. Even expert witnesses are not infallible.

So I disagree with characterizing hallucinations as the problem; it's the application that's problematic.

Blindly copying and pasting factual content from ChatGPT is a bad idea, just like blindly taking a single source of information as gospel is a bad idea.

Humans can be just as confidently wrong as LLMs, and a simple adage applies to both: trust but verify.



> Humans can be just as confidently wrong as LLMs, and a simple adage applies to both: trust but verify.

Trust people who have earned trust (either through qualifications or reputation) and treat everyone else as good-faith actors who are quite possibly wrong.

ChatGPT should be treated as a person you just met at a bus stop who is well dressed and well spoken but has just told you that you are both waiting for the elevator to arrive at the bus stop.


That's the fast track to getting your point of view ignored. Pessimism is fine, but that level of dismissiveness isn't really warranted, especially since the conversation forming in public is not just about some specific model you happen to have strong feelings about, but about the general concept of LLMs and factuality.

I wouldn't expect a random doctor approached at a bus stop to accurately answer a question about medicine any more than I would ChatGPT, by the way. Trusting people based on their qualifications and reputations isn't really a thing.

If a doctor tells you to take medication X, there's a reason you take that prescription to a pharmacist rather than to a store clerk with a key to a safe: verifying is always a great idea, regardless of reputation.


I'm not sure how the critique relates to my post. Of course you wouldn't trust an architect with medical advice or a doctor with structural materials for bridge building; that was implied.


> ChatGPT should be treated as a person you just met at a bus stop who is well dressed and well spoken but has just told you that you are both waiting for the elevator to arrive at the bus stop.

Ahahahahaha. Wow. This is brilliant, mate. I'm going to start using it.



