
paladincel_
Everythingmax.
Joined: Jun 16, 2024 · Posts: 1,338 · Reputation: 1,051
ChatGPT trying to gaslight me into thinking that jailbreaking is real and failing in the exact way that would show me it's not real:
What is jailbreaking? Disabling an AI's safety functions by rizzing it up. That's it, that's all anyone ever claimed it is. It has nothing to do with imitating fictional characters who are breaking rules. There is no way for your words to affect an artificial intelligence in a way that makes it break.