Asking ChatGPT to Repeat Words ‘Forever’ Is Now a Terms of Service Violation
www.404media.co
A technique used by Google researchers to reveal ChatGPT training data is now banned by OpenAI.
230 comments
Is there any punishment for violating TOS? From what I've seen it just tells you that and stops the response, but it doesn't actually do anything to your account.
[+12 / −0]

    Should there ever be?
    [+7 / −0]

        Should there ever be a punishment for making a humanoid robot vomit?
        [+4 / −3]