A bipartisan group of US senators introduced a bill Tuesday targeting the spread of nonconsensual, sexualized images generated by artificial intelligence. The measure comes in direct response to the proliferation of pornographic AI-generated images of Taylor Swift on X, formerly Twitter, in recent days.

The measure would allow victims depicted in nude or sexually explicit “digital forgeries” to seek a civil penalty against “individuals who produced or possessed the forgery with intent to distribute it” or anyone who received the material knowing it was not made with consent. Dick Durbin, the US Senate majority whip, and senators Lindsey Graham, Amy Klobuchar and Josh Hawley are behind the bill, known as the Disrupt Explicit Forged Images and Non-Consensual Edits Act of 2024, or the “Defiance Act.”


  • MagicShel@programming.dev · 9 months ago

    Generating sexual images of minors is already illegal. And anyone modestly technical can generate these images on their own computer (a sketch follows at the end of this comment), so you can’t realistically go after people for creating or possessing them (except where the subjects look underage), only for distribution.

    This is unfortunately theater and will do basically nothing. How does a person even know whether an image is a deepfake? Or consensual? Hell, what counts as too close a likeness? Some of those images didn’t look that much like her, and at least one wasn’t even realistic.

    I’m not saying it’s cool that people are doing this, just that enforcing this law is going to be a mess. You wind up with weird standards, like how on Instagram you can show your labia, but only through sheer material. Are deepfakes fine if you run them through an oil painting filter?
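
    To put “modestly technical” in perspective, here’s a minimal sketch of local text-to-image generation, assuming the Hugging Face diffusers library and a consumer GPU; the checkpoint name is just a stand-in for any public Stable Diffusion model:

    ```python
    import torch
    from diffusers import StableDiffusionPipeline

    # Pull a public checkpoint once; after the download everything runs locally.
    pipe = StableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5",  # example public checkpoint
        torch_dtype=torch.float16,
    )
    pipe = pipe.to("cuda")

    # One prompt in, one image out: no server, no moderation layer in between.
    image = pipe("portrait photo of a person in a park").images[0]
    image.save("generated.png")
    ```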

    • yamanii@lemmy.world · 9 months ago

      Are deepfakes fine if you run them through an oil painting filter?

      Probably, since nobody could mistake an oil painting for the real person; at that point it’s not a deepfake anymore. (A sketch of such a filter is below.)
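
      For what it’s worth, the “oil painting filter” in question is essentially a one-liner. A sketch assuming opencv-contrib-python, which ships cv2.xphoto.oilPainting:

      ```python
      import cv2

      img = cv2.imread("input.png")
      # size: neighborhood sampled per pixel; dynRatio: color quantization step.
      # Larger values give a coarser, more painterly result.
      painted = cv2.xphoto.oilPainting(img, 7, 1)
      cv2.imwrite("painted.png", painted)
      ```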

      • MagicShel@programming.dev · 9 months ago

        I have about a 99% success rate at identifying AI-generated full-body images of people. People need to learn to look more closely. They look just as fake as the oil paintings.

        • gapbetweenus@feddit.de · 9 months ago

          They look just as fake as the oil paintings.

          You can get photorealism or even hyperrealism with oil. And with AI you just need a bit of post-processing.

          • MagicShel@programming.dev · 9 months ago

            I think that’s relevant when the defense of oil paintings is that you can tell they aren’t real. The line can’t be “you can’t tell they’re fake” because… well… you can identify AI artwork 99% of the time, and the other 1% is basically when the pose is arranged exactly so as to conceal the telltale signs and the background is so simple it gives nothing away. (A rough sketch of automating that check is below.)
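
            If you wanted to automate that eyeballing, the obvious route is an off-the-shelf image classifier. A hedged sketch using the transformers image-classification pipeline, with a hypothetical detector checkpoint (several community-trained detectors exist on the Hugging Face Hub; none of them are anywhere near 99% reliable):

            ```python
            from transformers import pipeline

            # "example-org/ai-image-detector" is a placeholder, not a real checkpoint.
            detector = pipeline(
                "image-classification",
                model="example-org/ai-image-detector",
            )

            # Prints whatever labels the checkpoint defines, with confidence scores.
            for pred in detector("suspect.png"):
                print(pred["label"], round(pred["score"], 3))
            ```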