The police investigation remains open. The photo of one of the minors included a fly, the logo of Clothoff, the application presumably used to create the images. Clothoff promotes its service with the slogan: “Undress anybody with our free service!”

  • RagnarokOnline@reddthat.com · 11 months ago

    I don’t want to bandwagon against you, but I do think it’s important that people who agree with your viewpoint have a chance to understand that the situation is a violation of privacy.

    The kid’s reputation is likely damaged. You have an underage girl who is already dealing with the confusion and hierarchy of high school. Then (A) someone generates semi-accurate photos of what her naked body looks like and (B) distributes them to others.

    Issue (A) is bad because it’s essentially CSAM and also because it attempts to access a view of someone that the subject likely hasn’t permitted the generator to have. This is a privacy violation, and the ethics around it are questionable at best.

    Issue (B) is that the generator didn’t stop at the violations of issue (A), but has now shared that material with other people who know the subject, without the subject’s consent and likely without her knowing who received it. This means she now has to perpetually wonder whether every person she interacts with (friends, teachers, other parents, her own parents) has seen lewd pictures of her. Hopefully you can see how this could disturb a young woman.

    Now apply a different situation to it. Suppose you took a test at school or at work that makes you look dumb (like, laughably dumb; enough to make you feel self-conscious). Even if you don’t think it’s a fair test, this test exists. Now, assume that someone shared this test with your friends, co-workers, and even your parents without you knowing exactly who received it. And instead of everyone saying “it’s just a dumb test; it doesn’t mean anything”, they decide it means something about you. Every hour or so, you walk by or interact with someone who chuckles or cracks a joke at your expense. Your community won’t let you move on from this test.

    Before your test was released, you could blend in. Now, you’re the person everyone is looking at and judging. Think of that added anxiety on top of everything else you have to deal with.

    • lambalicious@lemmy.sdf.org · 11 months ago

      > Issue (A) is bad because it’s essentially CSAM and also because it attempts to access a view of someone that the subject likely hasn’t permitted the generator to have. This is a privacy violation, and the ethics around it are questionable at best.

      That part is not a privacy violation, the same way someone drawing on a canvas their own impression of what the inside of a bank vault looks like does not constitute trespassing or a violation of the bank’s privacy. Unless the AI in question used actual nudes of her as a basis, but then we wouldn’t need the extra AI step for this to be a problem, right? Otherwise, I’m rather sure the actual privacy violation starts at (B).

      Of course, none of that makes it less of a problem, but it does feel to me like it undermines a potential angle for fighting against this.