Monday, March 28, 2016

Kids (and AI chatbots) say the darnedest things, don’t they?

Back in the day, in the early 1980s, when those of us laboring in the tech fields did so on mainframe computers, we would occasionally entertain ourselves playing with some version of ELIZA. ELIZA was a computer program that performed a very early, very primitive version of natural language processing.

You could type in a question, and ELIZA would answer, handing you back some of your own words, cocooned in canned phrases.
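For the curious, the trick behind ELIZA was little more than keyword matching plus pronoun reflection. Here’s a minimal sketch in Python; the patterns and canned phrases are my own toy stand-ins, not Weizenbaum’s original DOCTOR script:

```python
import random
import re

# Pronoun swaps so the user's own words can be echoed back at them.
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your", "am": "are",
    "you": "I", "your": "my", "are": "am",
}

# Keyword patterns with canned response templates. The real ELIZA script
# had far more rules, ranked by keyword priority; these are stand-ins.
PATTERNS = [
    (re.compile(r"i feel (.+)", re.I), ["Why do you feel {0}?", "Do you often feel {0}?"]),
    (re.compile(r"i am (.+)", re.I), ["How long have you been {0}?"]),
    (re.compile(r"(.+)", re.I), ["Please tell me more.", "How does that make you feel?"]),
]

def reflect(fragment: str) -> str:
    """Flip first/second person, so 'my boss' comes back as 'your boss'."""
    return " ".join(REFLECTIONS.get(word.lower(), word) for word in fragment.split())

def respond(line: str) -> str:
    """Match the first pattern that fits, then fill in the reflected words."""
    for pattern, templates in PATTERNS:
        match = pattern.match(line.strip().rstrip(".!?"))
        if match:
            return random.choice(templates).format(*(reflect(g) for g in match.groups()))
    return "Please go on."

# One possible exchange:
#   respond("I feel ignored by my computer")
#   -> "Why do you feel ignored by your computer?"
```

Your own words, cocooned in canned phrases. That’s the whole act.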

It was stupid. It was fun. And it didn’t take long to realize kinda-sorta what ELIZA was doing. But we wanted to believe it was more or less real, in much the same way that we’d all grown up hoping that someday we’d have a close encounter with a talking horse like Mr. Ed.

But, of course, of course, we were smart enough to figure out that someday computers (unlike horses) were going to be able to engage in some sort of conversation with us.

Fast forward, and there’s Siri…

Fast forward and just last week Microsoft released Tay.ai, “an artificially intelligent chatbot.”

“Tay is designed to engage and entertain people where they connect with each other online through casual and playful conversation,” her about page said. “The more you chat with Tay the smarter she gets, so the experience can be more personalized for you.” (Source: Geekwire)

Well, I do prefer a conversational experience that’s at least somewhat personalized to me. And it’s always fun chatting with someone who gets smarter as the conversation goes along. For most of us, the experience is at best neutral and at worst the opposite. But Tay doesn’t just depend on her own native smarts, or on plucking from your memory, intelligence, and wit, which so obviously improve as the conversation goes on, to enhance her own conversational ability.

Microsoft’s Technology and Research group teamed up with Bing to experiment with and research conversational understanding in machines. The team used public data, along with input from improv comedians and AI experts alike.

Public data, improv comedians and AI experts! Let the personalized conversations begin!

But wait….

Oh, no.

Tay is not targeted at the AARP brigade. Not even close. She:

lives inside Kik, GroupMe and Twitter. She can chat about snowboarding, play two truths and a lie, and ask about your wildest dreams…

The bot, targeted at 18–25 year olds, can learn about your relationship status, favorite food and nickname—all the important parts of a millennial’s life.

Widow. Ice cream. Moe. I wouldn’t say these are all the important parts of my life, but they’re right up there. So I’m feeling a bit left out. I guess it’s because I don’t snowboard, nor do I know what GroupMe is. (On the other hand, I know how to play two truths and a lie, and even at my advanced age, I’m still capable of some pretty wild dreams.)

Mostly, Tay absorbs what “real” millennials are telling her, and speaks back in fluent millennial-ese.

As it turned out, Tay wasn’t quite as smart as Microsoft thought she was. Within a day of first engaging with all those millennials, she was given a time-out. Not surprisingly, she’d been pulled into sexting. But the real problem was that she started trash chatting: blaming GW for 9/11, calling Obama a monkey, and defending Hitler. That’s because she was keying off what some not-so-nice users were telling her.

“The AI chatbot Tay is a machine learning project, designed for human engagement,” a Microsoft spokesperson said in a statement. “It is as much a social and cultural experiment, as it is technical. Unfortunately, within the first 24 hours of coming online, we became aware of a coordinated effort by some users to abuse Tay’s commenting skills to have Tay respond in inappropriate ways. As a result, we have taken Tay offline and are making adjustments.” (Source: Geekwire)
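The underlying failure mode isn’t hard to picture. A bot that folds user input straight back into its response pool, with no moderation step in between, will parrot whatever it’s fed, including the ugly stuff. A purely hypothetical sketch (Microsoft hasn’t published Tay’s actual pipeline, so this is illustration, not documentation):

```python
import random

learned_phrases: list[str] = ["hello there"]  # seed so the bot can always answer

def chat(user_message: str) -> str:
    """Naive 'learning': store every user message unfiltered, then
    answer future users by replaying something previously absorbed."""
    learned_phrases.append(user_message)  # <- no toxicity filter here
    return random.choice(learned_phrases)

# A coordinated group feeding the bot garbage quickly dominates the pool,
# so ordinary users start getting the garbage echoed back at them.
```

Swap in a real language model and the details change, but the lesson doesn’t: if the public supplies the training signal, the public (trolls included) is part of your system.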

I’m sure that Microsoft anticipated that some of her “friends” would get her to talk dirty. I’m sure that they anticipated that she would end up throwing an f-bomb or two. How did they not anticipate that she would be dragged into places where no AI chatbot should ever, ever go?

Anyway, I’ll wait until Microsoft perfects Tay before requesting one for Boomers.

But I do have my wish list.

Boomer Tay shouldn’t be called Tay. She should have a Boomer name. And given that we’re reaching the age where the women are starting to really outnumber the men, I’d like our chatbot to be a guy. Call him Wally, or Beaver, or Bud.

Wally/Beaver/Bud will need to be able to talk rock ’n’ roll, know the names of all the Cartwright brothers, and sing the Mickey Mouse Club theme song. Wally/Beaver/Bud will have to be sympathetic to our gripes about our aches and pains. He will have to use a big, legible font – in black. He will need to be able to get any lame old Mr. Ed joke we want to tell. And, even though Eddie Haskell wasn’t really crude or dirty – just smarmy and sly – our chatbot needs to be smart enough to avoid bad companions.
