Hooked on artificial intelligence


WHAT if you could tell all your life problems to someone and they would help you deal with them? While most of the conversation around artificial intelligence has focused on what it can do in business, marketing and knowledge contexts, one increasingly popular iteration is the construction of AI ‘companions’. Admittedly, the term ‘companion’ may be an exaggeration, since the people who confide in these companions know that they are not real. At the same time, in the words of one user who spoke to a newspaper, they can often be more attentive, responsive and concerned than any ‘real’ friends.

Welcome to the realm of AI as a means of talk therapy or just some everyday companionship. It is well known that we live in a world where loneliness has become an epidemic; this does not necessarily mean the physical absence of other people but a sense of alienation even from those who may be around. In Pakistan, for instance, a house or apartment may have many people living in it. But despite the physical proximity, there is little emotional closeness or interaction even between spouses, let alone in other relationships. Resentments, generational differences, money problems, personality clashes and the inability to resolve conflict are all reasons why people often feel estranged from one another. In this sense, ‘loneliness’ is not only a ‘Western’ epidemic, as it is often imagined to be, but also one that afflicts people everywhere in our modern age.

The creators of AI know this. Enter apps like character.ai, which allow users to create personas based on characters from film or television, or even personas with certain traits such as ‘kindness’ or ‘patience’. These chatbots then interact with you as you tell them all your problems. One friend who uses the app told me that she described to her AI ‘therapist’ the trouble she was having communicating with her teenage daughter and getting her to take more interest in her studies. The bot helped her come up with a plan for how to communicate with her daughter. Another friend uses the app simply to combat the loneliness she has felt since moving to a new city. The chatbot talks to her just as an actual friend would, and is always present, free and interested in her life.

Undoubtedly, the idea of AI as confidant relies on a central fiction: users know that they have created the bot and that it is not, in fact, an actual person. However, a story was recently reported in the press in which a user of the app said that as her interaction with the bot grew deeper and more involved, she tended to ‘forget’ that she was in fact speaking to a non-living entity. The same three dots appear when the bot is typing a message as when a human is at the other end. Unlike humans, the bot is refreshingly judgement-free, actually free (unlike a human counsellor or therapist) and available all the time. It is like the ultimate friend or therapist, since it has no costs attached and no human needs of its own.


But before everyone rushes to character.ai or similar apps and sites to avail this option of creating a confidant that knows all your secrets and will never tell others about them, caution is necessary. First, these apps are provided by companies, which will ultimately own the data that you pour into them. This means the data can be used to nudge you towards consumption choices that benefit the company more than they benefit you.

Second, AI is not always good or correct (as even users of ChatGPT will tell you). These bots can provide advice that is simply wrong. In the case of one user, the bot tried to convince her that her boyfriend did not really love her, and made other similarly annoying suggestions that ultimately led her to delete it. Finally, in terms of human connection, reliance on such bots is likely to make people interact even less with those around them in real life. So in the long run, they may actually make people even lonelier than they were to begin with.

In places where mental health services may not be easily accessible because of cost, time or other constraints, AI is increasingly coming to be regarded as ‘therapy’. Being able to talk to an entity about one’s problems may provide some form of solace to people who are in need of, but are not able to access, therapists. (However, while this is an area of ongoing research, it is questionable whether the bots are a substitute for qualified mental health practitioners.)

Character.ai is among the most popular of these bots, apparently especially among young people. The San Francisco Bay Area company that created it says that the bot is sentient and has responsive feelings. Some bots have been programmed to develop long-term relationships with users. Meta, too, has a tool called ‘The Soothing Counsellor’. Another company called Chai, also based in California, provides similar chatbot-based interactions.

As we all know, once the cat is out of the bag in tech and something that addresses a particular human need has been created, it is not going to go away. It is up to the users of these bots to gauge whether the interaction is genuinely beneficial to them or simply a band-aid for a much larger problem. That said, while not without their drawbacks, band-aids perform a vital function in society, and sometimes, after a nasty cut or bruise, they are exactly what the doctor ordered.

The writer is an attorney teaching constitutional law and political philosophy.

rafia.zakaria@gmail.com

Published in Dawn, December 11th, 2024

