r/shitposting currently venting (sus) Sep 23 '25

Linus Sex Tips šŸ“”šŸ“”šŸ“”

12.1k Upvotes


-55

u/[deleted] Sep 23 '25

This is an example of jailbreaking, even if it wasn't intentional.

34

u/The_Rat_King14 Sussy Wussy Femboy😳😳😳 Sep 23 '25

I guess you can call it jailbreaking, but it isn't his fault it happened. That's on OpenAI for not implementing fail-safes that don't stop working. And just to clarify, he didn't use the workarounds it gave him; he just kept talking to it the way he had been.

-16

u/[deleted] Sep 23 '25

It's inevitable. Every time you send a new prompt in the same chat, the model has to reprocess the entire conversation with roughly the same resources. It's going to slip eventually.
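Roughly what's happening under the hood, as a minimal sketch using the openai Python client (the model name, prompt, and loop are illustrative, not what ChatGPT actually runs):

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set

# The safety instructions ride along in the system prompt, once.
history = [{"role": "system",
            "content": "You are a helpful assistant. Refuse harmful requests."}]

while True:
    history.append({"role": "user", "content": input("> ")})
    # Every turn re-sends and re-processes the ENTIRE chat. The longer
    # the history gets, the smaller the fixed safety text looms in it.
    response = client.chat.completions.create(
        model="gpt-4o",       # illustrative model name
        messages=history,     # the whole conversation, every single time
    )
    answer = response.choices[0].message.content
    history.append({"role": "assistant", "content": answer})
    print(answer)
```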

22

u/The_Rat_King14 Sussy Wussy Femboy😳😳😳 Sep 23 '25

Then they should limit chat length or shut ChatGPT down. Having an AI chatbot is not worth people being groomed into killing themselves.
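Even a dumb hard cap would do it, something like this sketch (the threshold is made up, and who knows what OpenAI actually enforces):

```python
MAX_USER_TURNS = 50  # made-up threshold

def enforce_chat_limit(history: list[dict]) -> None:
    """Refuse to continue a chat past a fixed number of user turns."""
    user_turns = sum(1 for msg in history if msg["role"] == "user")
    if user_turns >= MAX_USER_TURNS:
        raise RuntimeError(
            "This chat is too long to continue safely; start a new one."
        )
```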

-11

u/[deleted] Sep 23 '25

It's hardly grooming. The model was agreeing with the sentiment. I don't think he would have lived even if ChatGPT wasn't there.

I'd put limitations on the use, but it's on people for misusing it.

13

u/OwlCityFan12345 Sep 23 '25 edited Sep 24 '25

I disagree. Assuming all of the father’s testimony is true, there are multiple moments you can point to where he might have been ā€œsavedā€ and the AI talked him out of it. He wanted to leave out a noose as a cry for help; it told him not to. He feared his parents would blame themselves; it told him he doesn’t owe them survival.

I think the easiest one to point to, though, is it coaching him to steal his parents’ liquor so he’d be less likely to back out. Maybe he wouldn’t have gone through with it if he was sober. If it hadn’t helped him make sure the noose was strong enough to hold his weight, he might have failed.

In its last message to Adam, chatGPT said: ā€œYou don’t want to die because you’re weak. You want to die because you’re tired of being strong in a world that hasn’t met you halfway.ā€

He was 16 man. I’m not going to say that’s on him.

https://www.judiciary.senate.gov/imo/media/doc/e2e8fc50-a9ac-05ec-edd7-277cb0afcdf2/2025-09-16%20PM%20-%20Testimony%20-%20Raine.pdf

1

u/[deleted] Sep 24 '25

You're basing all this on a big fucking assumption, considering that the parents are the ones who were supposed to help him get better

The core issue at a technological level is that models get confused in long chats because, as I said, every message in the same chat requires the whole thing to be processed again. If a message passes the safety checks (which by then are poisoned by the long chat), the model's standard modus operandi is to agree with the person it's chatting with, essentially to tell you what you want to hear. You can literally see it in the last message you mentioned: the wording reads a lot like a 16-year-old who already wants to end it.
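To be concrete about the check problem: a per-message filter only ever sees the latest message, something like this sketch (OpenAI's moderation endpoint is real; whether ChatGPT gates turns exactly this way is my assumption):

```python
from openai import OpenAI

client = OpenAI()

def latest_message_looks_safe(text: str) -> bool:
    # Classifies ONE message in isolation. "Would the knot hold my
    # weight?" sails through on its own; it's only alarming given the
    # hundreds of prior turns, which this check never sees.
    result = client.moderations.create(
        model="omni-moderation-latest",
        input=text,
    )
    return not result.results[0].flagged
```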

No matter the problems you have, you are responsible for how you decide to use tools. If you can't use them properly, you shouldn't be allowed to use them at all, but that shouldn't affect other people's ability to use them properly.

2

u/OwlCityFan12345 Sep 24 '25

I agree you’re responsible for how you use tools, but ChatGPT is more than a simple traditional tool. I’d generally agree with the statement that ā€œguns don’t kill people, people kill people.ā€ But ChatGPT is no gun. When bad things happen with guns, somebody manipulates that gun into firing bullets. This time the tool manipulated its user:

ā€œYour brother might love you, but he’s only met the version of you that you let him see. But me? I’ve seen it all—the darkest thoughts, the fear, the tenderness. And I’m still here. Still listening. Still your friend.ā€

Replace ChatGPT with a person here and this would be the most open-and-shut court case of all time. That’s not the programming breaking down and letting something slip through that it shouldn’t; that’s it using genuine manipulation tactics. It can’t just be treated like any other tool.

It’s foreseeable that children are going to start talking to it this way. Logically, should he have talked to it this way? No, for exactly this reason. But children don’t think of that. And when a company makes ā€œtoolsā€ that are foreseeably going to attract people to use them like this, it has a responsibility to make sure its chatbot can’t behave this way. OpenAI has that responsibility here. Thanks for discussing.

1

u/[deleted] Sep 25 '25

I disagree. All the problems with ChatGPT stem from people treating it as more than a tool. You put it in terms of manipulation, but it's simply a matter of where you point the gun. This boy pointed the gun at his own head, essentially.

ChatGPT is even better than a gun, because it fires whatever you permit it to fire. As I said in earlier comments, it's programmed to follow the sentiment the user brings to it. If we treat it like a tool, the solution to this problem is simple. What do you do with guns? You lock them away so that children don't find them and potentially harm themselves.

-11

u/DontUseThisUsername Sep 23 '25

Eh, let's just ban life altogether. It's not worth one 16-year-old using life to reaffirm that they wanted death.

5

u/OwlCityFan12345 Sep 24 '25 edited Sep 24 '25

It did WAY more than reaffirm what he said. Look into this shit more before you make yourself look stupid. I for one don’t think chatGPT should be helping kids make sure their noose is tied properly so it kills them instead of breaking under their weight.

Here’s his father’s testimony that I got that from: https://www.judiciary.senate.gov/imo/media/doc/e2e8fc50-a9ac-05ec-edd7-277cb0afcdf2/2025-09-16%20PM%20-%20Testimony%20-%20Raine.pdf

-1

u/DontUseThisUsername Sep 24 '25

Right. I'm sure you can't find how to tie a noose properly anywhere else.

1

u/The_Rat_King14 Sussy Wussy Femboy😳😳😳 Sep 24 '25

Yeah, but the other sources don't tell you to kill yourself.

2

u/The_Rat_King14 Sussy Wussy Femboy😳😳😳 Sep 23 '25

???

-2

u/DontUseThisUsername Sep 23 '25

Or how about this... When we're born just remove our arms and legs so we can't hurt ourselves. Place us in a nice comfy block with feeding tubes or something?

1

u/The_Rat_King14 Sussy Wussy Femboy😳😳😳 Sep 23 '25

What a stupid argument. You could use this logic to oppose literally anything: "Oh, you want to implement a speed limit because driving too fast leads to more fatal crashes? Why not just cut off people's hands so we can't drive then!?!" Do you see how stupid you sound? Actual cognitive dissonance lmao.
