> The AI research company updated its usage policies on Oct. 29 to clarify that users of ChatGPT can’t use the service for “tailored advice that requires a license, such as legal or medical advice, without appropriate involvement by a licensed professional.”
I think your use case would probably be fine (although I am not a licensed professional, hahaha). I think they mostly want a licensed professional in the "driver's seat," or at least there to eat the responsibility.

It is sort of funny, in this context, that coding agents became a thing. I guess the responsibility-sink doesn't matter there, because writing code doesn't generate any responsibility as a byproduct.