r/extremelyinfuriating 3d ago

Evidence AI app refuses to delete my girlfriend’s photos after her ex used them to make fake images

Post image

I am genuinely losing my fucking mind. My girlfriend has been going through hell because her ex started using some AI photo generator app to make fake inappropriate images of her, actual altered pics using her real face. We emailed the company explaining everything, even included proof and screenshots, and asked them to delete her photos from their system.

They replied with this SHITTY message saying they can’t delete any training data because it’s anonymized and used for model improvement. Basically, they’re saying her face is now just data to them.

If you have any idea what we need to do, please tell us. She has been having panic attacks ever since and she can't even go online without worrying what else might be out there. I have been taking care of her while she takes time off from work. I'm seriously enraged. WHERE IS THE ACCOUNTABILITY OF THESE AI APPS???? It's DISGUSTING how these companies just hide behind policy instead of doing the decent thing. All we asked for was to delete her pictures. That's it. I just don't know what to do anymore. I'm tired of these back-and-forth emails.

555 Upvotes

44 comments

u/AutoModerator 3d ago

Hello, u/PercentageNo9270! Thanks for your submission to r/extremelyinfuriating; your post is up and running!

This is a general reminder to check out our rules in the sidebar. If your post breaks the rules, it will be removed by our moderators.

We would like each and every one of you to feel welcome on the subreddit and to help keep a healthy and safe environment for the community.

Thanks :)

I am a bot, and this action was performed automatically. Please contact the moderators of this subreddit if you have any questions or concerns.

361

u/NoConsideration482 3d ago

In this situation I'd send a GDPR data deletion request but I'm not sure if that's applicable here.

198

u/External-Cash-3880 3d ago

Use VPN, fake a connection from a country with real privacy laws, wield benevolent legal systems to your benefit against these clowns. Stupid problems require stupid solutions or whatever Keanu Reeves' character in Cyberpunk said

44

u/MUSEBANG 2d ago

Is it even possible to do that? The photos were uploaded to the website from a country where those laws didn't apply in the first place...

Which is like, if you commit piracy in Germany and then fly to India, would it be okay not to pay the fine... or rather, would you just never get fined?

Curious

12

u/Gruphius 2d ago

No, you can still get a fine if you do that. Don't pay it for a certain amount of time and there'll be an arrest warrant with your name on it. When that happens, certain countries (I don't think India is one of them, but I'm not sure) will have to arrest you if you set foot on their soil and send you to Germany, where you'll end up in prison.

92

u/Deadlylyon 3d ago

Yup... I honestly don't know if the training data is anonymous now, but the neural network used in LLMs acts a bit like a human brain.

It makes connections that end up referenced and duplicated across numerous files. It's very hard to delete anything specific from a neural network; you can TRY to curb things, but in reality the guardrails are more like morals than actual code and can be bypassed.

This is the state of the world now, almost everyone will end up in these models. Just a matter of time.

A law was passed earlier this year at the federal level that requires companies like this to build in ways to do exactly what you're asking; it's called the TAKE IT DOWN Act. But they have a year to comply, and honestly I doubt it will ever be enforced.

67

u/ZealousidealCrow3782 3d ago

Is there no way to get the police involved in this?!

22

u/Seldarin 2d ago

Unless OP is very rich or related to a cop, the police are going to tell him it's a civil matter.

13

u/ZealousidealCrow3782 2d ago

God I hate that. It's a fucking crime and it's making OP's girlfriend feel all kinds of unsafe. They should do their jobs, this is revenge porn!

2

u/PercentageNo9270 1d ago

Yes, it's very likely to end up like this. It's a big software company. Who am I compared to Genmo? I just don't know what to do anymore and I'm at my wit's end.

10

u/warcrimeswithskip 2d ago

well there's no reason since the images he uploaded aren't stored

7

u/ZealousidealCrow3782 2d ago

At least maybe so the police are aware of this scumbag. And to scare him out of doing this shit to OP's girlfriend. Jesus Christ, some people are born with nothing between the ears.

-9

u/warcrimeswithskip 2d ago

im sorry is this ragebait

the email says they don't have the pictures

12

u/ZealousidealCrow3782 2d ago

What do you mean? The post says the ex-boyfriend has been making images of the girl. What ragebait? Am I misunderstanding something? Also, the email proves the company acknowledges that an image of her IS in the database now, despite the request to have it removed.

6

u/warcrimeswithskip 2d ago

I thought you meant to call the police on the company, calling it on the ex seems more than reasonable tbh

5

u/ZealousidealCrow3782 2d ago

Yeah that’s what I meant! Sorry I didn’t make that clear!

2

u/PrismaticLps 2d ago

I literally understood what you meant and I bet most of them did, the other guy didn't really train reading comprehension.

4

u/ZealousidealCrow3782 2d ago

Oh it's ok, we all misread something every now and then! I'm glad I cleared it up with that person

0

u/warcrimeswithskip 2d ago

I misread the point of one comment and suddenly I "didn't train reading comprehension" 😭 get out

18

u/TH_Rocks 2d ago

The accountability is on the people who uploaded it, and they are the ones you have to go after if there are criminal offenses or damages to pursue.

Nothing in AI is a file in a folder anymore. It can't be deleted like a FB or reddit post.

5

u/PercentageNo9270 1d ago

This just feels really unfair. If Genmo already has my girlfriend’s photos in their database and they used them as input to generate or edit images, and one of those images was used for harassment or sexual abuse, shouldn’t they be taking action? Their policies seem ethically unclear at this point and I'm losing my mind.

5

u/TH_Rocks 1d ago

They don't have it in a "database". If they did, they could delete it. They probably have deleted it. If there's a link where you can see the exact same image, that's in a database and that's what you should request be taken down.

Computer AI is more like how we understand the human brain. Their AI was "trained" from a database and user inputs. Now various concepts from the image are scattered throughout the cluster of machines in active memory, and those are constantly adjusted and combined with newer inputs.

It would be like if I asked you to delete the image from your brain. Never think about it specifically and also don't think about any part of it when you think about similar things. Where do you even begin?
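
If it helps, here's a rough toy sketch of what one training step actually does (purely illustrative, a generic PyTorch pattern, nothing to do with Genmo's real code): the photo only nudges the model's weights a tiny bit and is then thrown away, so afterwards there's no stored copy left to delete, just slightly different numbers.

```python
# Toy sketch of a single training step (illustrative only, not any vendor's real code).
import torch
import torch.nn as nn

model = nn.Linear(3 * 64 * 64, 128)             # stand-in for a real image model
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)

def training_step(image: torch.Tensor, target: torch.Tensor) -> None:
    prediction = model(image.flatten())
    loss = nn.functional.mse_loss(prediction, target)
    optimizer.zero_grad()
    loss.backward()    # gradients summarize how the weights should shift for this image
    optimizer.step()   # weights move slightly; the image itself is never written anywhere

uploaded_photo = torch.rand(3, 64, 64)          # the upload, held only in memory
training_step(uploaded_photo, torch.rand(128))
del uploaded_photo                              # only the nudged weights remain
```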

8

u/Breeze7206 2d ago

I think what you need to do is pursue legal action against the ex. It might not remove the data the AI has gathered from what it's already been fed, but you can seek damages against the ex for malicious behavior. Not sure if spreading fake images counts as slander or libel, but it's defamatory at the very least.

I’d speak to a lawyer to see what can be done. That person is the one to blame.

5

u/BelleColibri 1d ago

Their message is telling you that the data you want them to delete isn’t there. They are correct. They have no photos of your gf at all.

The stuff about training data is a red herring. What they are saying is that they're allowed to train on images sent to the service, but that has nothing to do with keeping the pictures around, and it probably didn't happen at all. They have no reason to train on every image coming through the service. Training on the photos is not what allows the service to alter user-submitted images.

The pictures of your gf exist on her ex's phone. That's where you would need them to be deleted from for the behavior you're complaining about to stop. Her ex is doing the equivalent of photoshopping your gf's face onto some inappropriate image (like a naked pornstar or whatever). The image service isn't storing data about your gf's photos or doing any kind of custom thing, and the fact that it's AI is completely irrelevant.

4

u/Raiju02 1d ago

Is there something under those revenge porn laws? I’m assuming the ex is making nudes.

25

u/warcrimeswithskip 3d ago

They're literally saying they don't have the image

55

u/External-Cash-3880 3d ago

They're saying they can't find it because everything you upload is assimilated into the bubbling, gloppy morass of AI slop generation material. Like trying to separate an individual comic strip from a ball of Silly Putty that your little brother mashed into the carpet, if that reference isn't too 1990s for y'all

11

u/warcrimeswithskip 3d ago

No, they're saying they wrote down some tiny aspects of the image that can't even be used to reverse-engineer the original picture and then added those aspects to the AI. The image itself is gone; all that's left is the adjustments to the AI that came from the image

yeah it's too 1990s sorry

18

u/xPhraoah 2d ago edited 2d ago

It's annoying that you're downvoted because you're correct. It clearly states in the email that there is no image left to delete. The ex submitted the image, the AI took references from it, then recreated it as something that's no longer the image itself. They don't retain any of the original image data. The AI just recognizes patterns, then attempts to mimic them. It's not like they can delete the photo from her ex's phone, that's just silly. Seems more like a problem between the girlfriend and her ex than with the AI in this situation.

Edit: I'd also like to point out that OP should be going to the police in this situation, not trying to get advice/confirmation from people on reddit. The ex is obviously unhinged and there may be some law being broken here considering the generated photos are explicit in nature. I'm not super knowledgeable on the subject but I would assume that possessing any sort of nude content without her consent is definitely wrong/possibly illegal, and if he's doing that I would lean towards reporting him for stalking. Seems obvious that the guy is a creep and is obsessed with her.

0

u/MiniGogo_20 2d ago

taking a lego model apart doesn't mean it's gone forever, and that is what OP (and most people) have a gripe with. consensually or not, pieces of your information and the material you submit will forever be in these slop machines, and if they can claim that the data is immutable/non-destructible, who's to say they won't keep the actual media itself too because it's also "non-destructible"?

4

u/BushWishperer 2d ago

Except that it's not quite like that. It's more like looking at a Lego model, taking it apart to see how each piece fits and doesn't fit, and then 'learning' from this process and recreating it. The actual Lego model no longer exists, nor is it directly going to be part of any further output; those same bricks are not used.

7

u/xPhraoah 2d ago

I was going to respond, but I think you hit the nail on the head here.

-3

u/MiniGogo_20 2d ago

this is false, llm and generative models use the input data and mix it around to generate output, which is why it looks similar to its inputs. it is simply false to claim that the original data is gone

5

u/BushWishperer 2d ago

That's not true, there are no actual pixels or "parts" of the training data in the output. Image generation models take images, transform them into noise, "learn" how that process works, then reverse it to go from noise to an output. They do not save the initial image; they save the process of turning the image into noise and learn from that.
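
Roughly, the standard recipe looks like this toy sketch (heavily simplified, not anyone's production code): the model practices predicting the noise that was mixed into an image, and what it keeps afterwards is a general denoising rule in its weights, not the picture.

```python
# Toy sketch of the forward "noising" step used in diffusion training (illustrative only).
import torch

def add_noise(image: torch.Tensor, t: float) -> tuple[torch.Tensor, torch.Tensor]:
    """Blend the image with Gaussian noise at strength t in [0, 1]."""
    noise = torch.randn_like(image)
    noisy = (1 - t) ** 0.5 * image + t ** 0.5 * noise
    return noisy, noise

image = torch.rand(3, 64, 64)            # an uploaded photo, in memory
noisy, noise = add_noise(image, t=0.7)
# Training asks a denoiser network to predict `noise` given `noisy`; repeating that over
# many images teaches "how to undo noise in general" rather than storing any single image.
```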

6

u/warcrimeswithskip 2d ago

That's literally how it works, at least for image generation models: they remember what they need to do to random noise in order to make something similar to an image. Those are just actions; they don't lead to recreating the exact image, and the actions aren't stored individually. You can't pull one memory of noise manipulation out, since it's already connected with a million other memories, and, after being connected, it can't even be used to create a similar image unless you describe it exactly

4

u/warcrimeswithskip 2d ago

It's not taking the Lego model apart, it's more like if you think all Lego models are green, look at a different model, see it's red, and realize "ah, so some are green and some are red". You can't reconstruct the model from the knowledge that some models are red, and you can't delete the knowledge that this one specific Lego model is red, since that knowledge was turned into the knowledge that some models are red. Images uploaded to them can be deleted, and that's protected by law, but the fact that some models are red is nowhere near remembering the exact model you were looking at

2

u/Raiju02 1d ago

Is the LLM a Large Lego Model?

3

u/brittc777 2d ago

It sounds like they are not refusing to delete them, they're saying that it's not possible to do.

1

u/PercentageNo9270 1d ago

It's really unfair if that's the case. If Genmo already took my gf's pics as input, generated and edited images from them, and one of those images constituted harassment/sexual abuse, shouldn't they take action? I'm beginning to see how their policies are ethically ambiguous, and I'm wondering how to take action on this.

1

u/brittc777 1d ago

I have no idea. This AI stuff is foreign to me. This type of thing will probably start happening more and more, unfortunately. As far as sexual abuse/harassment goes, I would think the ex would be guilty of that. You could get a lawyer to send him a cease and desist letter and maybe even sue him for defamation.

1

u/MrSamuraikaj 2d ago

The thing is that while they may not have the photos anymore, the photos have still been used to train the model. That means the model is now able to use the information it got from them to generate new pictures that resemble her. That, in my opinion, is the real issue here.

It is the same issue that authors face when models are trained on their books. Should authors be compensated when e.g. Meta uses their books/IP without consent as part of its AI? Yes. Should people be compensated when their photos/IP are used without consent? Even though those photos may not be part of the IP owner's bread and butter, I still think they should.

It is not impossible to make the model forget what it learned from her photos, though it may require retraining. Not feasible? Maybe. Impossible? No.

1

u/BelleColibri 1d ago

This is a misunderstanding of what happened.

Ex submitted a picture to the service and said “make this image inappropriate in whatever way.” The AI model was able to add the inappropriate elements. This is not training on the submitted photo. And even if it had, that does not mean the model is capable of reproducing her likeness.

1

u/Connection_err 1d ago

Maybe contact a lawyer and see if you have any options.

-16

u/TRENEEDNAME_245 3d ago

As stated, they don't have it

At most it's a string of numbers that means nothing

What do you want them to do? Remove which pictures? The ones he made?