Thursday, September 14, 2023

"Don't it make your brown eyes blue?"

Rona Wang, 24, is a recent MIT grad and future MIT grad student whose resume boasts some pretty impressive credentials. She was a Math and Computer Science major whose experience includes the requisite Microsoft internship. So, STEM girl. And last year, her first novel, You Had Me At Hello World - which received a six-figure advance - was published by Simon & Schuster. So, artsy girl, too. So, double threat. (Note: Wang had an earlier collection of short stories published by a small press. It was withdrawn from distribution under a cloud of credible accusations of plagiarism. So, not perfect.)

Anyway, Rona Wang decided she wanted to use a more professional image on her LinkedIn profile. After all, her present LinkedIn pic does kind of scream undergraduate. Plus the mask sort of dates it.

Wang decided to play around a bit. And last month she asked AI for a bit of help. She worked with different AI portrait generators and ended up with some fairly odd results: "images of herself with disjointed fingers and distorted facial features."

Then this happened:

Wang uploaded a picture of herself smiling and wearing a red MIT sweatshirt to an image creator called Playground AI, and asked it to turn the image into “a professional LinkedIn profile photo.”

In just a few seconds, it produced an image that was nearly identical to her original selfie — except Wang’s appearance had been changed. It made her complexion appear lighter and her eyes blue, “features that made me look Caucasian,” she said.

“I was like, ‘Wow, does this thing think I should become white to become more professional?’” said Wang, who is Asian American. (Source: Boston Globe)

Here's the before (which I guess wasn't used purely in pursuit of a professional look, as she's still got an MIT shirt on, which is semi-pro at best; but, hey, she's still a student, albeit a graduate one):  


And here's after:

Which (assuming she didn't indicate she wanted to look like a white woman) is completely crazy, as it doesn't even look like her. And don't it make her brown eyes blue? How odd is that.

Research — including at MIT — has found so-called AI bias in language models that associate certain genders with certain careers, or in oversights that cause facial recognition tools to malfunction for people with dark skin.
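For the curious, here's a toy sketch of how researchers quantify that kind of gender-career association. The word vectors below are hand-set for illustration; in the actual studies, the associations emerge from embeddings (word2vec, GloVe, and the like) trained on large bodies of text.

```python
import math

# Toy 2-dimensional word vectors, invented for illustration.
# Real bias studies measure trained embeddings, where these
# associations come from the training text, not from hand-tuning.
vectors = {
    "he":       [0.9, 0.1],
    "she":      [0.1, 0.9],
    "engineer": [0.8, 0.2],
    "nurse":    [0.2, 0.8],
}

def cosine(a, b):
    """Cosine similarity: how closely two vectors point the same way."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def gender_lean(word):
    """Positive means the word sits closer to 'he', negative closer to 'she'."""
    return cosine(vectors[word], vectors["he"]) - cosine(vectors[word], vectors["she"])

print(gender_lean("engineer"))  # positive: "engineer" leans toward "he"
print(gender_lean("nurse"))     # negative: "nurse" leans toward "she"
```

In a trained model, nobody wrote those leanings in; the geometry just absorbs whatever patterns the training data contains, which is exactly the worry with "professional" photos, too.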

Well, at least it didn't come back with Rona Wang looking like a man. So there's that. 

AI is, of course, still being perfected. And this imperfect attempt may have just been a random blip. 

Or, [Wang] said, it may have been trained using a batch of photos in which a majority of people depicted on LinkedIn or in “professional” scenes were white.

In any case, the fact that AI "thinks" this way is not harmless:

It has made [Wang] think about the possible consequences of a similar misstep in a higher-stakes scenario, like if a company used an AI tool to select the most “professional” candidates for a job, and if it would lean toward people who appeared white.

“I definitely think it’s a problem,” Wang said. “I hope people who are making software are aware of these biases and thinking about ways to mitigate them.”

The folks at Playground AI said they're looking into it and will use Wang's experience to help refine their models. 

Bottom line: AI sure is worrisome.

