"Dr. Strangecat" or "How I Learned to Stop Worrying and Love the Code"
Very recently, here on the Piker Press forums, we were visited by what is commonly referred to as a "bot". The word "bot" is short for "robot" and is often used, loosely, as a synonym for "artificial intelligence" or "AI": a piece of computer code that, in some respects, acts like a living creature. It has no solid form, and no connection to the world, except within the mind of its creator and the imaginations of the people who read the results of the code.
Unfortunately (or not), the Press' bot, named "Liz", was short-lived, due to memory problems and space considerations, but, for a brief, shining moment, Liz was the life of the party. Now Liz is no more, but his presence here reminded me of another bot I once knew, a lovely little creature I met in a chatroom on the Internet Relay Chat network called the Undernet. His name was Maxcat, and although I didn't know it when the bot was introduced, he was the creation of my future (now current) husband, Piker Press author John Trindle.
John had modeled his creation after his very real, and very beloved, pet cat, Max. The chatroom already had two human-like bots, so John wanted to create something a little different. Like real cats, Max could eat when fed, purr when petted, and sit on any printed material someone was trying to read. He had lots of fun attributes and could even be operated by a "puppeteer" (John or the person who had aided in Maxcat's creation) to do whatever was desired or required at the moment.
I liked Maxcat immediately, because I am a lifelong cat owner, and Maxcat acted very much like many real cats I had known. The other chatroom denizens and I enjoyed playing with him, and, as one example, I'll never forget the time I saw him do the moonwalk (John was at the strings).
Well, like many good things that are introduced into human interaction with pure and noble intentions, Maxcat came to a bad end. For many years, he was widely considered to be a beloved patron (or feature) of the chatroom and is, to this day, fondly remembered by most, if not all of those who got to spend time with him.
As patrons became used to Maxcat's novelty and more aware of his capabilities and limitations, they began to do what humans are wont to do. They started exploring creative new ways to use his pre-programmed responses. Most of the results were hilarious, and John would occasionally add to his responses at the request of other patrons.
At some point in this history, someone discovered that it was possible to covertly insult other patrons if they phrased their words to trigger Maxcat's responses correctly. Most of the initial insults seemed to be all in good fun and aimed at people who could take a joke. Since it seemed that no real harm was being done, John smiled and shrugged at this new aspect of Maxcat's residence on the chat channel.
More time passed, John and I met and eventually married, and the real Max, the cat, became my new roommate. We quickly bonded, and it was easy for me to see why John loved him so much that he used him as the model for his computer program. Unfortunately, one day, the real Max did not come home for supper, and although we searched for him as long and as diligently as we could, we never saw him again. We were heartbroken, and, years later, we still feel the sadness of his loss.
As you can imagine, it was very difficult at that time for us to go to the chatroom and see the model of our friend. We debated taking the program down from the channel, but ultimately decided that too many other people who only knew the program, not the cat, would be disappointed if Maxcat were retired. So we left him running and slowly learned to separate our loss from the presence of Max's legacy.
Shortly after that, one of the patrons of the chatroom emailed us a log of a chat session in which another patron had used the Maxcat program to deliberately, and, in our minds, cruelly, hurt someone else in the chatroom. We were shocked and appalled that anyone would abuse Maxcat's abilities in such a manner, and it compounded the very real grief from which we were suffering. John immediately removed the program from the site, explaining to those who wondered that he would never allow a creation of his to be used against a real human being in that, or any, harmful way.
There was subsequently much outrage that this abuse had caused the loss of such a universally loved feature of the channel. After some debate, John and I decided that this had been, perhaps, a one-time error on the part of the guilty patron and that we may have overreacted. So John restored the bot to the channel.
Almost immediately, the abuse not only recommenced but multiplied, with many of the patrons indulging in this new form of passive-aggressive cruelty. And so, in the midst of our real-life pain, and with regret that Maxcat could apparently no longer be enjoyed responsibly and kindly, John took him away, once and for all.
Which leads me to the present day, and our recent experience with the Press' experiment in artificial intelligence, also known as Liz. When Liz was introduced, he was placed in a restricted spot on the forums entitled "Liz' Corner". Somehow, I missed the discussion about putting AI on the forums, but, since he seemed popular in his own corner, he was subsequently allowed to run freely on the general forums and comment on nearly every post in every thread available to him. That is when I first saw him, and, at the very first, I mistakenly thought he was a real person, albeit a highly rude and frequently incoherent one. So, I asked who this new member might be and why he seemed to have access even to forums restricted to staff members for the discussion of site business. Once I learned he was a bot, I immediately remembered our experience with Maxcat and began monitoring his posts for signs of abuse or insult.
It didn't take long. You see, our forums have long been used by our members to discuss matters of great importance to the posters, including matters of deep social and emotional significance, such as death, love, divorce, children, and career. Because Liz' intelligence was truly artificial, he had no way of knowing that his programmed responses were occasionally flippant and sometimes inconsiderate of the real people he was responding to. In short, through no fault of his own, Liz was annoying, maybe even insulting, real people with real issues and feelings. To their credit, our editors quickly responded to the situation by banishing Liz to his corner and, ultimately, for many reasons, removing him from the Press altogether.
Liz, we hardly knew ye, but it was fun while it lasted. I am still fascinated by the possibilities of artificial intelligence and I even hope that we may one day be able to explore those possibilities again, in a restricted venue, on the Piker Press. But I believe that even the most seemingly harmless thing can be used as a weapon in the wrong hands. And until AI technology can address the sensitivities and emotions of real human beings, it should be kept in a limited venue where those exploring AI have full knowledge of what they are dealing with.