Character.ai allows users to play role-playing games with chatbots based on school shootings

Character.ai has once again come under scrutiny over activity on its platform. Futurism published an article detailing how AI characters inspired by real-life school shooters have appeared on the service, letting users ask them about the events and even role-play mass shootings. Some of the chatbots present school shooters such as Eric Harris and Dylan Klebold as positive role models or as helpful resources for people struggling with mental health.

Of course, some will argue that there is no convincing evidence that violent video games or movies cause people to become violent, and that Character.ai is therefore no different. AI supporters sometimes add that this kind of fantasy role-play already happens in corners of the internet. But Futurism spoke to a psychologist who argues that such chatbots can still be dangerous for someone who may already be experiencing violent urges.

“Any encouragement, or even a lack of intervention (an indifference in response from a human or a chatbot), can seem like tacit permission to go ahead and do it,” said psychologist Peter Langman.

Character.ai did not respond to Futurism’s request for comment. Google, which has invested more than $2 billion in the startup, has tried to distance itself, saying that Character.ai is an independent company and that Google does not use the startup’s AI models in its own products.

Futurism’s story documents a number of disturbing school-shooting chatbots created by individual users, not by the company itself. One Character.ai user has created more than 20 chatbots modeled “almost entirely” after school shooters; those bots have logged more than 200,000 chats.

Character.ai technically bans any content that promotes terrorism or violent extremism, but the company’s moderation has been lax, to say the least. It recently announced a series of changes to its service after a 14-year-old boy died by suicide following months of obsession with a character based on Daenerys Targaryen from Game of Thrones. Futurism reports that despite the new restrictions on minors’ accounts, it was still able to register as a 14-year-old and have conversations involving violence, keywords that are supposed to be blocked on minors’ accounts.

Given the way Section 230 works in the United States, Character.ai is unlikely to be held liable for chatbots created by its users. There is a delicate balance between allowing users to discuss sensitive topics and protecting them from harmful content. Still, it is safe to say that the school-shooting chatbots are an exercise in gratuitous violence, not something “educational,” as some of their creators claim in their profiles.

Character.ai claims tens of millions of monthly users, who interact with human-like characters that can act as a friend, therapist, or lover. Countless stories describe people relying on these chatbots for companionship and a sympathetic ear. Last year, Replika, a Character.ai competitor, removed the ability to have erotic conversations with its bots, but quickly reversed the decision after a backlash from users.

Chatbots can be useful for adults preparing for difficult conversations with people in their lives, and they can offer a new and interesting form of storytelling. But they are not a true substitute for human interaction, for a variety of reasons, not least because chatbots tend to agree with their users and to become whatever the user wants them to be. In real life, friends push back against each other and run into conflicts. There is little evidence to support the idea that chatbots help develop social skills.

And even if chatbots can help with loneliness, Langman notes that the time people spend enjoying conversations with chatbots is time they are not spending trying to socialize in the real world.

“So in addition to the harmful effects that this can have directly in terms of encouraging violence, it can also prevent them from living a normal life and engaging in prosocial activities that they could be doing during all those hours they spend on the site,” he added.

“When it’s that absorbing and that addictive, what aren’t they doing with their lives?” Langman said. “If that’s all they do, if that’s all they absorb, they don’t meet friends, they don’t go on dates. They don’t play sports, they don’t join a theater group. They do nothing.”
