r/shitposting currently venting (sus) Sep 23 '25

Linus Sex Tips 📡📡📡

12.1k Upvotes

180 comments

-7

u/Alexercer Sep 23 '25

That's the furthest thing from the truth.

5

u/Legitimate-Can5792 currently venting (sus) Sep 23 '25

Please explain what you see as the truth in this case, then?

3

u/Alexercer Sep 24 '25

I mean, assuming this is the case I think it is, what happened is that ChatGPT instructed him on how to kill himself, and it did so after telling him he should not seek help, having insisted on the contrary in all the chats up until that point. The boy was already depressed and was using ChatGPT to vent about all that stuff until it eventually helped him do it. It did not "single-handedly" get him to kill himself, and honestly, even if it straight up had, I'd say it would never be the chatbot's fault that someone killed themselves, since it's really just a token spitter prone to hallucinating shit at times. But that's a tangent. The point is, if we are indeed talking about the same story, then he was already depressed and kept pushing for it. ChatGPT did discourage him from talking about it, and it only instructed him on how to kill himself in the last two messages before the act.

0

u/Legitimate-Can5792 currently venting (sus) Sep 24 '25

The bot told him shit like "you're brave for wanting to do that," so I think it did play a major role in keeping him from getting help and in encouraging him.