A bipartisan group of US senators introduced a bill Tuesday that would criminalize the spread of nonconsensual, sexualized images generated by artificial intelligence. The measure comes in direct response to the proliferation of pornographic AI-made images of Taylor Swift on X, formerly Twitter, in recent days.

The measure would allow victims depicted in nude or sexually explicit “digital forgeries” to seek a civil penalty against “individuals who produced or possessed the forgery with intent to distribute it” or anyone who received the material knowing it was not made with consent. Dick Durbin, the US Senate majority whip, and senators Lindsey Graham, Amy Klobuchar and Josh Hawley are behind the bill, known as the Disrupt Explicit Forged Images and Non-Consensual Edits Act of 2024, or the “Defiance Act.”

  • gila@lemm.ee · 9 months ago

    There is a money trail when it’s legal. You get blatant advertising for services where you pay to upload your own photos and have deepfakes made from them, on all kinds of sites (ahem, Pornhub). That’s a level of access that can’t be ignored, especially when it’s a US-based company providing the service and taking payment via Visa/Mastercard, etc. Relegate it to the underground where it belongs.

    • Serinus@lemmy.world · 9 months ago

      I’d be more okay if the law were profit-based, because that’s much easier to enforce.

      I don’t like laws that are nearly impossible to enforce unless they’re absolutely necessary. I don’t think this one is absolutely necessary.

      • gila@lemm.ee · 9 months ago (edited)

        I don’t think general enforcement against deepfake porn consumption is a practical application of this proposed law in civil court. The practical applications are shutting down US-based deepfake porn sites and their advertising. As far as possessors go, consider cases of non-celebrities being deepfaked by their IRL acquaintances. In a scenario where the victim is aware of the deepfake such that they’re able to bring the matter of possession to court, don’t you agree it’s tantamount to sexual harassment? All I’m seeing there is the law catching up to cover disruptive tech with established legal principle.