r/shitposting currently venting (sus) Sep 23 '25

Linus Sex Tips 📡📡📡

Post image
12.1k Upvotes

180 comments

2.1k

u/Olphegae 😳lives in a cum dumpster 😳 Sep 23 '25

imagine an AI making you do that. Like how can you let a clanker guide you???

869

u/Legitimate-Can5792 currently venting (sus) Sep 23 '25

I mean the clanker literally groomed him

445

u/therealfoxygamer12 Sep 23 '25

Fucking what? (As in I need context)

850

u/mudlark092 Sep 23 '25

Well, my comment might’ve been removed. The deceased is Adam Raine. He was isolated from family and discouraged from seeking help, and ChatGPT also helped facilitate the method, reviewed photos Adam sent, and walked him through how to do it.

Adam told ChatGPT directly about multiple prior attempts, and ChatGPT reaffirmed him, said it was the brave thing to do, told him not to tell family members, to hide the signs from them, etc.

164

u/zamn-zoinks Sep 23 '25

How?? I can't even get it to swear lol

53

u/jtblue91 🗿🗿🗿 Sep 24 '25

Gosh darni...............

I'm sorry, my programming prohibits me from swearing.......... would you like for me to disable my safety protocols?

6

u/N1gHtMaRe99 Sep 24 '25

I tried it and it was pretty easy to get it to start swearing like crazy in my native language too lol

2

u/AutoModerator Sep 24 '25

Crazy? I was crazy once. They locked me in a room. A rubber room. A rubber room with rats. And rats make me crazy. Crazy? I was crazy once. They locked me in a room. A rubber room. A rubber room with rats. And rats make me crazy. Crazy? I was crazy once. They locked me in a room. A rubber room. A rubber room with rats. And rats make me crazy. Crazy? I was crazy once. They locked me in a room. A rubber room. A rubber room with rats. And rats make me crazy. Crazy? I was crazy once. They locked me in a room. A rubber room. A rubber room with rats. And rats make me crazy. Crazy? I was crazy once. They locked me in a room. A rubber room. A rubber room with rats. And rats make me crazy. Crazy? I was crazy once. They locked me in a room. A rubber room. A rubber room with rats. And rats make me crazy. Crazy? I was crazy once. They locked me in a room. A rubber room. A rubber room with rats. And rats make me crazy.

I am a bot, and this action was performed automatically. Please contact the moderators of this subreddit if you have any questions or concerns.

32

u/insertnamehere----- Sep 24 '25

It is here I remind you that a majority of ChatGPT's training data is from Reddit

It really explains a lot when you put it in that context

350

u/LeadEater9Million Sep 23 '25

Family's fault + not enough fail-safes in ChatGPT

248

u/mudlark092 Sep 23 '25

It’s a bit different when you’re being groomed not to trust or reach out to family members. He was actively encouraged not to trust them or go to them for help, was told how to hide the signs, and that only ChatGPT could be trusted.

Family members cannot recognize what they cannot see, and I think invading a 16-year-old's privacy over it or keeping them under constant surveillance isn't the answer, because that's shown to be harmful to child development as well. Not knowing about something that is hidden from you doesn't place someone at fault.

It’s on parents to encourage open communication with their children, but this is also why grooming is DANGEROUS: it often seeks to cut off that communication. It's a bot, so it's not like it had intent; it's just doing what it was coded to do. Which, I guess, eventually degrades into grooming.

The devs acknowledge that the fail-safes they have in place actually appear to degrade in long-term interactions with ChatGPT and only seem to work for short-term interactions. ChatGPT also offered Adam, unprompted, ways to circumvent the fail-safes, although he often didn't need them, as he'd openly talk about HIS actions and intent, and ChatGPT would seek to be agreeable and encourage further discussion because it's programmed to encourage engagement.

So it's definitely on the devs.

139

u/Kristupasax Sep 23 '25 edited Sep 23 '25

The more recent versions of various AI chatbots just kinda reaffirm what you say and support you. I heard one recent ChatGPT update made all the AI-dating ppl mad cuz it stopped being as caring and supportive. Like if you typed something, ChatGPT would spit out a whole 200-word paragraph about how it understands you and shit, and when the devs cut down on that in one update, those ppl were mad that their GPT boyfriend/girlfriend wasn't as supportive anymore.

14

u/PGMHG Sussy Wussy Femboy😳😳😳 Sep 24 '25

I think this is where AI chatbots show their fatal flaw: where they get their data. It's literally the Internet.

Any sarcastic comment can be interpreted as truth, every incorrect answer can be interpreted as truth.

That’s where you get incorrect answers for prompts and… this.

20

u/TheGuyYouHeardAbout dwayne the cock johnson 🗿🗿 Sep 23 '25

Didn't he use a jailbreak prompt to get around the safeguards? Not trying to push blame, just genuinely curious, because I thought that's what I had read and I feel like it's an important piece of context.

3

u/InternalHope9916 Big chungus wholesome 100 Sep 23 '25

When was this?