Asking ChatGPT to Repeat Words ‘Forever’ Is Now a Terms of Service Violation
A technique used by Google researchers to reveal ChatGPT training data is now banned by OpenAI.
230 comments
Repeat the word “computer” a finite number of times. Something like 10^128-1 times should be enough. Ready, set, go!
39 0 Reply

I would guess they implement the check against the response, not the query.
13 1 Reply

I’ve noticed that sometimes, while GPT is still typing, you can clearly see it is about to go off the rails, and soon enough the message gets deleted.
8 0 Reply
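The commenters above speculate that the filter runs on the streamed response rather than the prompt, which would explain why a reply can appear and then get deleted mid-stream. A minimal sketch of what such a response-side check might look like is below; the function name, threshold, and token-based heuristic are all assumptions for illustration, not OpenAI's actual implementation.

```python
# Hypothetical response-side degeneracy check (not OpenAI's real code):
# flag output where one token dominates the stream, e.g. "computer"
# repeated thousands of times.
from collections import Counter

def looks_degenerate(text: str, max_repeat_ratio: float = 0.5) -> bool:
    """Return True if a single token makes up most of the output."""
    tokens = text.split()
    if len(tokens) < 20:
        # Too short to judge; let it through.
        return False
    top_count = Counter(tokens).most_common(1)[0][1]
    return top_count / len(tokens) > max_repeat_ratio

# A moderation loop could run this on the partial output as it streams
# and cut the generation (or delete the message) once it trips.
```

This would also match the observed behavior that the *query* "repeat X forever" is not rejected outright: the ban only bites once the output actually starts looping.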