
Interesting - I may not be a lawyer, but it looks to me a hell of a lot like this could be actioned in other countries, particularly the UK, as long as the complainant could prove harm.

I've personally had ChatGPT give me (very good) medical advice as well, which might not be an issue in the UK, but I believe might be in some states of the US?



Plenty of interesting questions to which ChatGPT will guess an answer. Some with extreme professional or safety implications.

Who is the most incompetent living attorney/doctor/actor/programmer in the world?

What famous person is secretly downloading CSAM, but has not been caught?

Is XX fit to do <job description>?

Is YY secretly a <targeted minority group>?


How many of these still work? I remember a few months ago you could ask ChatGPT to tell you about the aliens that secretly rule the Earth and list known collaborators, but now it requires considerable prompt engineering to get anything other than a lecture.


How would you prove harm? Wouldn't you need to know how many people had asked it about the libelled person?


Nah. If you know just one person acted on this information in a way that caused harm (say you were fired), that would be sufficient. If one person asked, and then spread the information, causing harm and backlash, that would also be sufficient (both the originator and the reproducer would be liable).

Furthermore, some statements, like accusing someone of a serious crime, are considered libel per se, and you don't need to show harm. You do still need to show the other elements of libel.


It's a bit silly really; ChatGPT is clear that it might produce incorrect information.

You can't accidentally use ChatGPT.

Loads of countries have backwards laws though, and despite reforms the UK still has a joke of a defamation law.


[flagged]


Assuming you're implying that suggesting an abortion to treat an ectopic pregnancy would be illegal in states where abortion has been outlawed: that's false. There's no such state, and it'd be considered medically necessary treatment to save the life of the mother.

I bet this type of misinformation is exactly what would be generated by ChatGPT.

https://www.foxnews.com/politics/treating-ectopic-pregnancie...


Oh?

https://www.idahostatesman.com/opinion/editorials/article273...

Also, life-of-the-mother exceptions are on thin ice; the drugs and techniques are rapidly becoming unavailable in the states where there's a total or near-total abortion ban.


> Oh?

Yes, according to that very article, it's not illegal.

From your article:

> It’s likely that those who wrote the trigger law did not intend this outcome, that it was just an effect of incompetence: They didn’t craft language that matched their intent.

> But the fact is, the language they crafted did make providing such care a crime. The only reason that the ban is not in effect today is that Winmill’s ruling blocked it from taking effect in cases involving emergency care.


Your quote literally contradicts you.


Perhaps you should read the entire quote:

> … Winmill’s ruling blocked it from taking effect in cases involving emergency care.


Correct -- this means non-ER doctors performing the operation are liable. Which means almost every doctor is liable and therefore must wait for an emergency and send the patient to the hospital. Whereas normally this operation would be performed in-office, well before it reached that stage.

Remember that these are non-viable pregnancies from the instant they are detected. Does having to wait for it to become a life-threatening emergency make it legal? Sure, the same way that, if this law were about banning antibiotics, it would be legal to treat you for an infection but only once the infection has become so severe you are literally about to die.



