Thursday, July 11, 2019

Yet another awful, future-destroying app

“Deepfake” has been in the news of late, what with concern on the part of some that there will be wholesale manipulation of video, audio and images during the upcoming (make that already upon us) election cycle. The technology has gotten so good at creating real-looking but entirely bogus clips that it will make the social media fakery and Russian bot activity we suffered from (make that continue to suffer from) during 2016 look like amateur hour.

It’s not difficult to imagine how this will play out. A low-information, easily gulled electorate will be doused with vids of Barack Obama trash talking the Democratic nominee. Etc. (As for fakery on the other side, well, what would the content be exactly? Trump trash talking a Gold Star family? Making fun of a handicapped journalist? Claiming that a woman who accused him of rape isn’t “his type”? Cozying up to tyrants? This time around at least, there doesn’t seem to be any reason to doctor anything up. And while I’m not big on bothsiderism these days, I don’t suppose that anyone would have to use technology to make zany Democratic candidate Marianne Williamson look like a wack job.)

In any case, fact checkers will be working overtime next/this time around.

Of course, artificial intelligence won’t just be drafted into use to put words in the mouths of politicians.

Though much has been made of the technology’s threat to national security, it has also been harnessed to make a torrent of fake porn, including widely circulated videos of celebrities such as Gal Gadot and Scarlett Johansson. Although sites including Reddit, Twitter and Pornhub have tried to ban pornographic deepfakes, they have had limited success. The technology is cheap and easily accessible, and the opportunities for use are limitless. (Source: WaPo)

Not that I don’t have some sympathy for Gal Gadot and Scarlett Johansson here. This is pretty awful stuff. But, at least they have Reddit and Pornhub on their side. (Pornhub? WTF?) Frankly, I’m more concerned with a fake video of one of the Democratic women revealing her secret plan to castrate all white men. Or something.

Anyway:

An app developer who created an algorithm that can digitally undress women in photos has pulled the plug on the software after high traffic and a viral backlash convinced him that the world is not ready for it.

DeepNude used artificial intelligence to create the “deepfake” images, presenting realistic approximations of what a woman — it was not designed to work on men — might look like without her clothes.

I don’t think I have to worry that anyone will want to use this app on me. Still, the potential for this to be used on “civilians”, i.e., women whose images aren’t more or less in the public domain, makes it pretty awfuller stuff than the porning of movie stars.

The coder, Alberto, who came up with DeepNude – an app with a freeware version that incorporated a watermark so that people would know the image was fake, and a premium $50 version that had an easily crop-out-able stamp in the corner saying the image is “FAKE” – says that he is:

“not a voyeur,” merely a technology enthusiast who was driven to create the app out of “fun and enthusiasm.”

Well, “fun and enthusiasm” and also looking for “an economic return from this algorithm”.

Alberto based his app on UC Berkeley open-source code.

DeepNude’s creator said he mulled the ethics of his software but ultimately decided the same results could be accomplished through any number of photo-editing programs.

“If someone has bad intentions, having DeepNude doesn’t change much. . . . If I don’t do it, someone else will do it in a year,” Alberto said.

“Bad intentions”? Come on, Alberto. What good intentions would there be out there?

Sure, on Twitter – where after a few days, @deepnude had 21.9K followers – the thought was that the app would be used for “user’s entertainment.” But at quite a cost, my friend.

As for “someone else will do it in a year,” in this, Alberto is 100% correct-o-mundo.

But at least Alberto did the right thing and closed up shop. Not, of course, before his app was written about on Vice’s Motherboard. Which accounts, I guess, for all those followers. Download attempts eventually crashed the server it was on, but DeepNude got out there before Alberto had his crisis of conscience. And since it’s “out there,” anyone who wants it can have at it.

“We don’t want to make money this way,” the tweet read. “Surely some copies of DeepNude will be shared on the web, but we don’t want to be the ones who sell it.”

I looked through a few of the Twitter comments, and they were split between “this is a sexist outrage” and “I’ve been waiting my whole life for this.” Oy.

DeepNude itself had this final word:

“The world is not yet ready for DeepNude,” it said.

But California is getting ready for it. Legislators there are looking at legislation that would outlaw deepfake porn.

Between the return of old-style fascism throughout the world, and the techno shit shows that are coming on harder and faster, this is really not the future I want to live in.

Maybe Marianne Williamson is onto something when she talks about “harnessing love.”

Sigh…
