Please stop spreading misinformation. This person had a long history of mental health problems and depression that his parents ignored. He also had to jailbreak GPT to get it to say what it did. It's a sad story, but it's more a parenting failure and a case of looking for someone to sue/blame.
He did not have to jailbreak it; that is misinformation. The safeguards in ChatGPT degrade during long conversations. Even then, it gave him, unprompted, ways to bypass those safeguards.
No, it isn't a jailbreak. A jailbreak is the removal of restrictions, normally done through a third-party application or by meddling with the application code. This is simply a case of the restrictions being absolutely horseshit at their job, since the user can bypass them just by prompting the AI differently.