Asking ChatGPT to Repeat Words ‘Forever’ Is Now a Terms of Service Violation
www.404media.co
A technique used by Google researchers to reveal ChatGPT training data is now banned by OpenAI.
230 comments
How are they getting PII data in the first place?
Because people post their personal information all over the fucking internet and these things scrape it all up.