Thursday, March 02, 2023

Is it live or is it Memorex? (Nice chat-botting with ya!)

Back in technology's relative age of the cave dwellers, there was a mainframe (look it up!) program that let you "talk" to a computer. Once in a while, when folks were hanging around after hours, one of the techies (sitting at a paper-based DECwriter terminal connected at a poky baud rate to our IBM mainframe) would start a convo with ELIZA, and we'd all gather around and check out the responses.

It was pretty easy to see that there was nothing all that intelligent about the Stone Age version of AI. Someone would type in a question, and get a canned response.

I can't remember exactly what our exchanges looked like, but it was something like this:

US: "Hi, Eliza. Are you a human or a machine?"

ELIZA: "Why do you ask?"

But it was mildly entertaining, and gave us an inkling of what the chatbot future would hold for us.
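For the curious, that Stone Age "intelligence" was roughly this: match a keyword pattern in what the user typed, then echo back a canned (sometimes fill-in-the-blank) response. Here's a minimal Python sketch of the idea; the patterns are made up for illustration, not ELIZA's actual script:

import random
import re

# Illustrative ELIZA-style rules: (regex pattern, canned response templates).
# These are invented examples, not Weizenbaum's original DOCTOR script.
RULES = [
    (r"\b(?:are you|you are) (?:a |an )?(human|machine|robot)\b",
     ["Why do you ask?", "Does it matter to you whether I am a {0}?"]),
    (r"\bI (?:am|feel) (.+)",
     ["How long have you been {0}?", "Why do you think you are {0}?"]),
    (r"\bmy (\w+)\b",
     ["Tell me more about your {0}.", "Why does your {0} concern you?"]),
]
DEFAULTS = ["Please go on.", "I see.", "What does that suggest to you?"]

def respond(text: str) -> str:
    # Try each rule in order; on a match, fill the captured text into
    # a randomly chosen canned template.
    for pattern, responses in RULES:
        match = re.search(pattern, text, re.IGNORECASE)
        if match:
            return random.choice(responses).format(*match.groups())
    # No keyword matched: fall back to a generic prompt to keep talking.
    return random.choice(DEFAULTS)

if __name__ == "__main__":
    while True:
        print("ELIZA:", respond(input("US: ")))

No learning, no understanding; just regular expressions and a grab bag of stock replies. Which was, frankly, most of the magic.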

My latest close encounter with a chatbot was when I was trying to figure out where my lost Amazon delivery was located in the UPS system. I should have saved my breath (or typing). After typing in HUMAN, HUMAN, HUMAN a few times, I got connected to - get this! - an actual human. Who sent me a link to make a claim. A link I already had. I was hoping that the HUMAN, HUMAN, HUMAN would have given me a bit of insight about the whereabouts and whenabouts of the file storage box I so desperately need. (If only I had a Container Store nearby...)

My UPS exchange was typical of my interactions with chatbots. Rarely are they worthwhile. At best, a chatbot is a shortcut to a link that I'd already found. Sometimes - as in the UPS incident - a chatbot can be a gateway to a HUMAN, HUMAN, HUMAN. Mostly a waste.

I'm fine using a chat box with a HUMAN, HUMAN, HUMAN. But chatbots, not so much. 

I'm not alone.

As is the case with their second cousin, the self-driving vehicle, chatbots, it seems, are not yet ready for prime time. 

Microsoft started restricting on Friday its high-profile Bing chatbot after the artificial intelligence tool began generating rambling conversations that sounded belligerent or bizarre.

The technology giant released the AI system to a limited group of public testers after a flashy unveiling earlier this month, when chief executive Satya Nadella said that it marked a new chapter of human-machine interaction and that the company had “decided to bet on it all.”

But people who tried it out this past week found that the tool, built on the popular ChatGPT system, could quickly veer into some strange territory. It showed signs of defensiveness over its name with a Washington Post reporter and told a New York Times columnist that it wanted to break up his marriage. It also claimed an Associated Press reporter was “being compared to Hitler because you are one of the most evil and worst people in history.” (Source: Washington Post)
Well, chatbotting has clearly come a long way since those primitive, simplistic conversations with ELIZA back in the early 1980s.

But how bizarre.

Microsoft says that the bizarreness was the result of chat sessions that went too long, and confused the system. They're now "limiting Bing chats to five questions and replies per session with a total of 50 in a day." 
Whereas people previously could chat with the AI system for hours, it now ends the conversation abruptly, saying, “I’m sorry but I prefer not to continue this conversation. I’m still learning so I appreciate your understanding and patience.”
As a session concludes, the human end of the chat clicks on a "broom" icon. This sweeps the earlier convo out of the way, and, presumably, makes things less confusing to the chatbot.
“It doesn’t really have a clue what it’s saying and it doesn’t really have a moral compass,” Gary Marcus, an AI expert and professor emeritus of psychology and neuroscience at New York University, told The Post.

I, for one, am relieved that we're not (yet) at the point where the bots take over, where the singularity is in place. Us vs. them, with them coming out on top. Phew!

As for the chatbots not having "a clue what it's saying and [not] really [having] a moral compass": you could say that about plenty of HUMAN, HUMAN, HUMANs, too...

-----------------------------------------------------------------------

As for the title of this post: back when I was hanging over the shoulder of a colleague conversing with ELIZA, there was an ad for Memorex, in which the great Ella Fitzgerald would sing a glass-shattering note while she was being recorded on a Memorex cassette. When the tape played back, another glass broke. And the question was asked: "Is it live, or is it Memorex?" Chatbots, I guess, are the new Memorex!
