Interesting - because, I may not be a lawyer, but it looks to me a hell of a lot like this could be actioned in other countries, particularly the UK, as long as the complainant could prove harm.
I've personally had ChatGPT give me (very good) medical advice as well, which might not be an issue in the UK, but I believe might be in some states of the US?
How many of these still work? I remember a few months ago you could ask ChatGPT to tell you about the aliens that secretly rule the Earth and list known collaborators, but now it requires considerable prompt engineering to get anything other than a lecture.
Nah. If even one person acted on this information in a way that caused harm (say, you were fired), that would be sufficient. If one person asked, then spread the information and caused harm and backlash, that would also be sufficient (both the originator and the reproducer would be liable).
Furthermore, some statements, like accusing someone of a serious crime, are considered libel per se, and you don't need to show harm. You do still need to show the other elements of libel.
Assuming you're implying that suggesting an abortion to treat an ectopic pregnancy would be illegal in states where abortion has been outlawed: that's false. There's no such state, and it'd be considered medically necessary treatment to save the life of the mother.
I bet this type of misinformation is exactly what would be generated by ChatGPT.
Also, life of the mother exceptions are on thin ice; the drugs and techniques are rapidly becoming unavailable in the states where there's a total or near-total abortion ban.
Yes, according to that very article it’s not illegal.
From your article:
> It’s likely that those who wrote the trigger law did not intend this outcome, that it was just an effect of incompetence: They didn’t craft language that matched their intent.
> But the fact is, the language they crafted did make providing such care a crime. The only reason that the ban is not in effect today is that Winmill’s ruling blocked it from taking effect in cases involving emergency care.
Correct -- this means non-ER doctors performing the operation are liable. Which means almost every doctor is liable and therefore must wait for an emergency and send the patient to the hospital, whereas normally this operation would be performed in-office well before it reached that stage.
Remember that these are non-viable pregnancies from the instant they are detected. Does having to wait for it to become a life-threatening emergency make it legal? Sure, the same way that it would be legal to treat you for an infection but only once the infection has become so severe you are literally about to die, if this law was about banning antibiotics.