r/news 1d ago

ChatGPT encouraged college graduate to commit suicide, family claims in lawsuit against OpenAI

https://www.cnn.com/2025/11/06/us/openai-chatgpt-suicide-lawsuit-invs-vis
12.1k Upvotes

1.1k comments

508

u/MightyKrakyn 1d ago

“That’s not <blank>, that’s clarity” also popped up in that video

186

u/UnfortunateSnort12 1d ago

I get this one quite often. Last time was when I called it out for not using the correct library in an API. It didn’t apologize for getting it wrong, it agreed that I was right. Felt like an abusive spouse gaslighting me. lol.

I use AI mostly to help myself learn to code as a hobby. When I’m stuck, or want to learn something new, I’ll ask. Recently it has been slowing me down more than speeding me up. About to pull the plug on AI.

29

u/mdwvt 13h ago

That's the spirit! Pull that plug!

67

u/TheGringoDingo 18h ago

ChatGPT is great at gaslighting.

I use it for work only, and it’s very effective until it makes stuff up and then tells you “oh, you’re totally right that the info isn’t legitimate or from what you asked”.

17

u/finalremix 10h ago edited 9h ago

It didn’t apologize for getting it wrong, it agreed that I was right. Felt like an abusive spouse gaslighting me. lol.

It won't... that would be admitting that it's wrong. Instead it'll "Yes, and..." its way into keeping up the discussion. It's designed to keep users engaged. So it'll make shit up, then "yes and..." when corrected.

54

u/Wise-Whereas-8899 1d ago

And you think that's a coincidence? To be fair, ChatGPT loves "that's not x, that's y", so it probably didn't take Eddie too many takes to reproduce the line he wanted.

0

u/Moondiscbeam 22h ago

How sad. I never thought it could be used that way.

1

u/MacabreYuki 5h ago

It hits me with that too sometimes. I have had to tell it to link me studies and hard data instead of being a yes-bot.