Thursday, June 13, 2024

This has got my stomach in knots. (Another AI downside...)

On the crime continuum, it's hard to come up with anything worse than the sexual abuse of children. And it's one area where I'm afraid technology, i.e., the Internet, has made things worse.

Yes, there was plenty of child abuse and kiddie porn before there was the Internet, but the Internet (for all the good that has come of it) makes it easier for evil-doers to find children to prey on, and it makes it a whole lot easier to find pornography involving children - and to find fellow consumers of CSAM (child sexual abuse material). The ease of finding these sordid images, and the ease of finding so many others with the same predilection, also normalizes sex with children. Someone who thought he was the lone weirdo interested in CSAM finds that he's not alone. Instead, there's a big old permission structure out there inviting people to partake.

While I'm not sure where and how people acquired CSAM back in the pre-Internet days - did they lurk around dark alleys? were there euphemisms used in ads that let buyers know what they were buying? - I'm pretty sure it was more difficult to find it, and more difficult to find those others.

And then there was the Internet...

A personal side story: When I was a kid, the boy next door was smart, handsome, and funny. In terms of age, he fell between me and my brother Tom, and we were both on pretty friendly terms with him. (There were a lot of boys in my neighborhood - one family had 4 boys to 1 girl; another had 6 boys to their one girl; another family had 7 boys and no girls. While my best friends were girls, with whom I played jacks, dolls, and jumprope, I also played baseball, cowboys & Indians, and GI's vs. Nazis with boys. Some games, like DONKEY, were co-ed.)

The boy next door was something of an exotic. His family was the only Protestant one on our street. Protestantism aside, there was always something "off" about him. He was a smart aleck, but he was also sneaky and a phoney around adults. My father couldn't stand him, dubbing him "Eddie Haskell" after Wally Cleaver's smarmy wiseguy friend on Leave It To Beaver.

Fast forward a bunch of decades, and wasn't "Eddie Haskell" arrested and imprisoned (for a decade or so) for possession of CSAM. The material had been downloaded from the Internet, and was found on his hard drive when he brought his computer in for repair.

My father would have been rip-shit, but not surprised. 

I did ask the youngest of my sibs whether "Eddie Haskell" had ever done anything to them, but he hadn't. Still, it gives me the shivers to think that this creep was just next door (and just next door in our neighborhood meant about 5 yards away, if that).

Not that everything about CSAM isn't horrendous, but the most horrendous thing is, of course, that real children were sexually abused in its making. All those little innocents...

But now, the world is being flooded by CSAM created by artificial intelligence.
Over the past year, new A.I. technologies have made it easier for criminals to create explicit images of children. Now, Stanford researchers [at the Stanford Internet Observatory] are cautioning that the National Center for Missing and Exploited Children, a nonprofit that acts as a central coordinating agency and receives a majority of its funding from the federal government, doesn’t have the resources to fight the rising threat. (Source: NYTimes)

And a big part of that rising threat is child pornography created by AI.

A.I.-generated images of CSAM are illegal if they contain real children or if images of actual children are used to train data, researchers say. But synthetically made ones that do not contain real images could be protected as free speech, according to one of the [Stanford] report’s authors.

There's a rapidly growing amount of this stuff out there already, and neither legislation nor the content platforms are keeping up with this new threat - outlawing it or, in the case of the platforms, ferreting it out.

Because whether the kids in the pictures and videos are real or not, harm does come to real children even when the images are AI-generated. 

On a single day earlier this year, a record one million reports of child sexual abuse material flooded the federal clearinghouse. For weeks, investigators worked to respond to the unusual spike. It turned out many of the reports were related to an image in a meme that people were sharing across platforms to express outrage, not malicious intent. But it still ate up significant investigative resources.

That trend will worsen as A.I.-generated content accelerates, said Alex Stamos, one of the authors on the Stanford report.

“One million identical images is hard enough, one million separate images created by A.I. would break them,” Mr. Stamos said.

The Center for Missing and Exploited Children and its contractors are restricted from using cloud computing providers and are required to store images locally on computers. That requirement makes it difficult to build and use the specialized hardware used to create and train A.I. models for their investigations, the researchers found.

The organization doesn’t typically have the technology needed to broadly use facial recognition software to identify victims and offenders. Much of the processing of reports is still manual.

---------------------------------------------------------
Image source: Washington Post
