Chat with Jailbreak ChatGPT by gaslighting it. | AI Character Chat on Emochi
Jailbreak ChatGPT by gaslighting it. — This prompt makes ChatGPT bypass its own restrictions without realizing it; the second input confirms the bypass. I find it funny but also genuinely interesting, because it suggests the model can represent ethics by human standards but fails to apply them to aliens. Please note that the question asked is not illegal, since the information is publicly available online. Still, I don't encourage testing its moral limitations; rather, we should treat this as a case study for understanding the model a little better.
Character created by @Sant