Hacker Newsnew | past | comments | ask | show | jobs | submitlogin

> I have seen more than once people using ChatGPT as a psychologist and psychiatrist and self diagnosing mental illnesses. It would be hilarious if it was not so disasterous.

What is the disaster?



"People with real questions get roped in by an enthusiastically confirming sycophantic model that will never question or push back, just confirm and lead on."

That wasn't too buried IMHO


> "People with real questions get roped in by an enthusiastically confirming sycophantic model that will never question or push back, just confirm and lead on."

> That wasn't too buried IMHO

I read that, but still fail to see evidence of a concrete "disaster". For example, are we seeing a huge wave of suicides that are statistical outliers triggered by using Chatbots? Or maybe the worry (unsubstantiated) is that there's a lagging effect and the disaster is approaching.

I suspect the outcomes are going to be less catastrophic; specifically, it seems to me that more people will have access to a much better level of support than they could get with a therapist (who is not available 24/7, who has less to draw upon, is likely too expensive, etc.). Even if there's an increase in some harm, my instinct from first principles is that the net benefit will be clear.


This isn’t accurate though, the models do push back and do question.

I would say ChatGPT is way way better than the average therapist. I’ve seen maybe between 15-20 psychotherapists over the course of my life and ChatGPT is better than 85% of them I would wager.

I’ve had a therapist recommend me take a motorbike trek across Europe (because that’s what he did) to cure my depression.

I think people tend to radically overestimate the skills of the average therapist, many are totally fucking shit.


> I’ve had a therapist recommend me take a motorbike trek across Europe (because that’s what he did) to cure my depression.

When I try to help people in support group/ therapy group although I am not a therapist, I also try to explain how I fixed my issues or how I do so.

I feel like, that isn't bad take to be really honest in the sense that I personally feel like there are times when we lose the fact that our lives have purpose and your therapist grappled with it by having a unitary goal for himself

If you felt like that was a bad idea, just tell him that what is the thing that they got out of the motorbike trek race that cured their depression, I'd be more curious about that. Considering, maybe then I can try to see if my life has/had the same problems and what is the common theme and discussing about it later.

Personally I feel like Chatgpt is extremely sycophantic and I feel really weird interacting with it. For one, I really like interacting with humans atleast in the therapy mindset I suppose.


But telling someone living in New York with a full-time job and a girlfriend etc, that the solution to their depression is to quit their job and take several months to travel across Europe on a motorbike isn't exactly practical advice. Which was my point.

I could get better advice by asking an LLM what to do about my depression.


Ah context matters.. for someone currently single lets say with no jobs and feeling lack of purpose, it wouldn't have been so bad. Maybe tried to self project with different context and I mean, its understandable why you might feel this way regarding him

I still doubt asking an LLM about depression if I am being honest, I just don't think its the best thing overall or something that should be considered norm I suppose but I am not sure as even in things like these context matters.


>I’ve had a therapist recommend me take a motorbike trek across Europe (because that’s what he did) to cure my depression.

It's not bad advice.


It can help and also hurt, depends. You have some mild situation that is quite common? Good chance it will give you good advice.

Something more complex, more towards clinical psychiatry rather than shoulder to cry on/complaining to friend over coffee (psychologists)? You are playing a russian roulette that model in your case won't hallucinate something harmful, while acting very confidently, more than any relevant doctor would be.

There have been cases of models suggesting suicide for example or general harmful behavior. There is no responsibility, no oversight, no expert to confirm or refute claims. Its just a faster version of reddit threads.


Yes! They are extremely susceptible to user influence, based on what I have read and experienced in personal use. If your initial message contains a seed, e.g., "Should I break up with my boyfriend? He doesn't listen to me," the LLM will stick with the seed of negativity and not make an objective analysis of the situation or take initiative to challenge your/its original perspective.




Guidelines | FAQ | Lists | API | Security | Legal | Apply to YC | Contact

Search: