U.K. May Outlaw ‘Deepfakes’ Porn; New Detection Software Funded

A new law under consideration in the United Kingdom may establish criminal penalties for creating “deepfakes” porn, a new type of video that uses an artificial intelligence algorithm to produce authentic-seeming hardcore footage that appears to star celebrity actresses.

The “deepfakes” AI algorithm surfaced last year on the internet forum Reddit, taking forged porn to a new level of sophistication. While videos that superimpose celebrity faces over existing porn scenes have been around for years, the deepfakes algorithm allows the superimposed celebrity face to make lifelike expressions and react in realistic ways, adding to the illusion that celebrities such as Scarlett Johansson, Katy Perry, Gal Gadot (as seen in the image above) and many others actually participated in porn videos.

New York State recently proposed a law that would ban deepfakes porn, and now the entire U.K. may adopt legislation criminalizing the videos as a form of sexual abuse.

The current bill soon to be introduced in Parliament by the British government is aimed only at banning “upskirt” images: still or video recordings taken surreptitiously by positioning a camera to “see” underneath the dress or skirt of an unsuspecting woman. But according to a report Thursday by The Guardian, a top British expert in pornography law is pushing to include a deepfakes ban in the bill as well.

“It would be easy to extend the bill so that it covers images which have been altered too, and clearly criminalize a practice that victims say they find incredibly distressing,” Durham University law professor Clare McGlynn told the paper.

Two Labour Party members of Parliament are already exploring how the bill could be amended to include the deepfakes ban, The Guardian reported.

In the United States, efforts to root out deepfake videos from large online platforms such as Reddit, PornHub and others—which have banned the AI porn videos but have reportedly not succeeded in completely eradicating them from their sites—may have taken a step forward. The imaging software company TruePic announced this week that it has raised $8 million in startup capital for the company’s effort to produce software that would automatically detect deepfake porn posted online, according to a report on the technology news site TechCrunch.

TruePic already makes an app that allows users to shoot photos containing a digital watermark, making it possible to detect whether the photos are later digitally altered.

But how well any deepfake detection software will actually work remains to be seen. The imaging platform Gfycat said in February that it was using an AI technology known as Project Angora to dig for deepfakes on its site. But according to a report this week by the VICE-owned tech site Motherboard, numerous deepfake videos remain on the Gfycat site.

The legality of deepfake videos under U.S. law is also murky. Not only do they use the faces of celebrities (and non-celebrities) without permission, they also misappropriate the underlying porn scenes, using the bodies of actual porn performers, likewise without permission, and seemingly violating the copyright of the original producers.

But according to a report in Wired Magazine, the videos likely do not constitute a privacy invasion, because a person’s privacy cannot be invaded by depicting something that never actually happened. Additionally, First Amendment protections for spoofs and satire—the same protections that allow the popular porn “parody” genre to exist—may also protect deepfake porn.

Photo via YouTube screen capture using superimposed image of actress Gal Gadot

Originally published at: https://avn.com/business/articles/legal/uk-may-outlaw-deepfakes-porn-new-detection-software-funded-782182.html

