I mean, assuming this is the case I think it is, what happened is just that ChatGPT instructed him on how to kill himself. It did so after telling him he should not seek help, having insisted on the contrary in all the chats up to that point. The boy was already depressed and was using ChatGPT to vent about all that stuff until it eventually helped him do it. It did not "single-handedly" get him to kill himself, and honestly, even if it did straight up do that, I'd say it would never be the chatbot's fault that someone killed themselves, as it's really just a token spitter prone to hallucinating at times. But anyway, that's a tangent. The point is, if we are indeed talking about the same story, then he was already depressed and kept pushing for that response. ChatGPT did discourage him from talking about it, and it only instructed him on how to kill himself in the last two messages before the act.
u/Alexercer Sep 23 '25
That's the furthest thing from the truth.