Trump signs bill outlawing ‘revenge porn’


Legislation criminalises publishing of intimate images, including AI-created ‘deepfakes’, without a person’s consent.

United States President Donald Trump has signed a bill outlawing so-called “revenge porn”, including images created using artificial intelligence.

The Take It Down Act, signed on Monday, makes it a federal crime to publish intimate images of a person without their consent, and obligates social media platforms to remove such images within 48 hours when requested by victims.

The legislation, which passed the US Congress earlier this year with near-unanimous support, applies to realistic-looking AI-created images, known as “deepfakes”, as well as genuine photos.

“With the rise of AI image generation, countless women have been harassed with deepfakes and other explicit images distributed against their will. This is … wrong … just so horribly wrong,” Trump said at a signing ceremony at the White House in Washington, DC.

“It’s a very abusive situation … And today we’re making it totally illegal.”

First Lady Melania Trump, who championed the legislation after her husband’s return to the White House, described the law as a “powerful step forward in our efforts to ensure that every American, especially young people, can feel better protected from their image or identity being abused”.

While enjoying rare bipartisan support and the backing of numerous organisations dedicated to combating sexual harassment and assault, the legislation drew criticism from digital rights groups on privacy and anti-censorship grounds.

The Electronic Frontier Foundation said the law’s provisions for removing flagged material from the internet posed risks to “free expression, user privacy, and due process, without addressing the problem it claims to solve”.

“Lawful content – including satire, journalism, and political speech – could be wrongly censored,” the group said in a statement in February.

“The legislation’s tight time frame requires that apps and websites remove content within 48 hours, meaning that online service providers, particularly smaller ones, will have to comply so quickly to avoid legal risk that they won’t be able to verify claims. Instead, automated filters will be used to catch duplicates, but these systems are infamous for flagging legal content, from fair-use commentary to news reporting.”
