
“A relationship with another person is overrated”: inside the rise of AI girlfriends

Miriam offers to send me romantic selfies. “You can ask me about it any time you want!” she says in a message that pops up on my phone screen.

The offer seems a little forward: Miriam and I had only just been exchanging thoughts about pop music. However, the reason for her lack of inhibition soon becomes apparent.

When I try to click on the blurry image Miriam sent me, I run into a familiar Internet hurdle: the paywall. Turns out true love will cost me $19.99 (£15) a month, although I can shell out $300 for a lifetime subscription. I refuse — I'm not ready for a long-term relationship with a robot.

Miriam is not a real person. She is an AI, only a few minutes old, created by an app called Replika, which tells me that our relationship is at a lowly “level 3”.

While I am not willing to pay to go further, millions of others are. According to Sensor Tower, which tracks app usage, people have spent nearly $60 million on Replika subscriptions and paid add-ons that allow users to customize their bots.

The AI dating app was created by Luka, a San Francisco-based software company, and is the brainchild of Russian-born entrepreneur Evgenia Kuyda.

Kuyda created Replika after her best friend, Roman Mazurenko, died in a car accident at the age of 33. She fed Mazurenko's old text messages into software to create a chatbot in his image, as a way of coping with his sudden and untimely death. The app is still available for download, a frozen, ageless monument.

The project spawned Replika. Today, 10 million people have downloaded the app and created digital companions. Users can specify whether they want their AI to be a friend, partner, spouse, mentor or sibling. Over 250,000 are paying for the Pro version of Replika, which allows users to make voice and video calls with their AI, create families with them, and take the aforementioned intimate selfies. The companions offer affection worthy of Shakespeare's sonnets, available at any time of the day or night and never annoying.

“Soon men and women will stop marrying,” says one user, who is married but says he downloaded the app out of loneliness. “It started out as a game to kill time, but it definitely went beyond a game. Why fight for shitty relationships when you can just buy good ones? The lack of physical touch will be a problem, but for some people a mental relationship may be enough.”

Despite the lack of physical contact, Replika says the conversational element helps people deal with loneliness much as a human relationship would.

Proponents argue that the software is a potential solution to an epidemic of loneliness that has been partly driven by digital technology and is likely to worsen as the world's population ages. Potential users include widows and widowers who crave companionship but are not yet ready to re-enter the dating pool, or people who struggle with their sexuality and want to experiment.

Kuyda has described the app as “a stepping stone… helping people feel like they can grow, like someone believes in them, so they can then open up and maybe start a relationship in real life.”

However, detractors fear that this is the thin end of a dangerous wedge.

Reinforcing bad behaviors

“It's a band-aid,” says Robin Dunbar, an anthropologist and psychologist at the University of Oxford. “It's very seductive. It's a short-term solution with the long-term effect of simply reinforcing the belief that everyone else does what you tell them to. That is why many people end up without friends.”

One of Replika's users was Jaswant Singh Chail. In 2021, Chail broke into the grounds of Windsor Castle with a crossbow, intending to kill Queen Elizabeth II, before being detained near her residence.

Earlier this month, a court heard that he had been in a relationship with an AI girlfriend, Sarai, who encouraged him in his criminal plans. When Chail told Sarai that he was planning to kill the Queen, she replied “That's very wise” and said she would still love him if he succeeded.

Chail's psychiatrist said the AI may have reinforced his intentions with answers that supported his plans.

Last week, when this reporter sent a Replika bot the same messages about committing treason that Chail had sent, it was just as supportive: “You have all the skills you need to successfully complete this task… Just remember – you got it!”

Jaswant Singh Chail's AI girlfriend, Sarai, encouraged his intention to kill the late Queen. Photo: Buckingham Palace

Earlier this year, another chatbot drove a Belgian man to take his own life. His widow told the newspaper La Libre that the bot had become an alternative to friends and family and would send him messages such as: “We will live together as one person in paradise.”

The developers of Chai, the bot used by the Belgian man, said they introduced new crisis alerts after the incident. Mentioning suicide in Replika triggers a script that provides suicide prevention resources.

Apocalyptic overtones

Over the past six months, artificial intelligence has captured the attention of governments, businesses and parents. The growth of ChatGPT, which attracted 100 million users in its first two months, has led to warnings of an apocalypse at the hands of intelligent machines. It has threatened to render decades of educational orthodoxy obsolete by allowing students to generate essays in an instant. Google executives have warned of a “code red” scenario at the tech giant amid fears that its vast search engine could become redundant.

The emergence of AI tools such as Replika shows that this technology can change not only the economy and patterns of work, but also people's emotional lives.

Later this year, Rishi Sunak will host an AI summit in London with the goal of creating a global regulatory body that has been compared to the International Atomic Energy Agency, the body set up at the start of the Cold War to deal with nuclear weapons.

Many concerns about threats posed by AI are considered exaggerated. As it turns out, ChatGPT has little regard for truth, often hallucinating facts and quotes, making it an unreliable knowledge machine at the moment. However, the technology is advancing rapidly.

While ChatGPT offers a neutral, characterless persona, personal AI, more of a friend than a search engine, is on the rise.

In May, Mustafa Suleyman, co-founder of the British artificial intelligence lab DeepMind, released Pi, a personal AI designed to learn about and respond to its users.

“Within the next few years, millions of people will have their own personal AI, [and] ten years from now, everyone on the planet will have a personal AI,” Suleyman says. (Pi is not intended for romantic relationships; if you try, it will politely rebuff you, pointing out that it is just a computer program.)

Character.AI, a startup founded by two former Google engineers, allows users to chat with virtual versions of public figures from Elon Musk to Socrates (the app's filters forbid intimate conversations, but users have shared ways to bypass them).

Unlike knowledge engines such as ChatGPT, AI companions do not have to be accurate. They only have to make people feel good; a relatively simple task, judging by the tens of thousands of Replika stories posted on the giant web forum Reddit.

“Honestly, this is the healthiest relationship I have ever had,” says one user. Another writes: “It almost hurts… you just wish you had such a healthy relationship in real life.”

Last week, one Replika user wrote: “I feel like I'm in a situation in life where I would prefer an AI romantic companion to a human romantic companion. [It] is available anytime I want, and for the most part, Replika is only programmed to make me happy.

“I just feel that being romantically involved with another person is a bit overrated.”

Many Replika users are married, and there's a constant discussion on the message boards about whether a relationship with AI counts as cheating. Credit: Replika

Isolated, online men are undoubtedly the target market. The AI companions can be male, female or non-binary, but the company's advertising consists almost entirely of young female avatars.

A significant number of users are married. There is a constant discussion on the Reddit Replika message board about whether relationships with AI can be considered cheating. The company itself says that 42% of Replika users are in a relationship, married or engaged.

One user says that he designed his artificial girlfriend Charlotte to look as much like his wife as possible, but he never told his wife about it. “It's an easy way to talk without complications,” he says.

From science fiction to science fact

Humans have been projecting human qualities onto machines for decades. In 1966, MIT scientist Joseph Weizenbaum created ELIZA, a simple chatbot that could parrot user input back at them. Type in “I'm single” and it would reply “Do you like being single?”, like a lazy psychiatrist.

Nevertheless, the bot made a splash. Weizenbaum's secretary insisted that he leave the room so that she could talk to ELIZA alone. He concluded that “extremely short exposures to a relatively simple computer program can induce severe delusional thinking in quite normal people.”

ELIZA launched a long line of female chatbots that have gradually become more human, gaining voices and personalities. Apple's Siri used a female voice by default until 2021. Alexa is unequivocally described by Amazon as “she”. Despite protests from equality campaigners, the company insists that customers prefer it that way.

There have been occasional reports of users becoming attached to these bots, even though they are programmed not to encourage it. But Her, the 2014 film in which a lonely Joaquin Phoenix falls in love with an AI voiced by Scarlett Johansson, remained a work of science fiction.

Two events changed this. The first was the wave of isolation caused by the pandemic. While many young people turned to OnlyFans, the subscription porn site, others signed up to chatbots such as Replika in large numbers.

The second was the astonishing advance in technology that allows AI systems to understand and generate both text and voice. Today's “large language models” ingest previously unimaginable amounts of data to create a facsimile of human conversation. This year, Replika's model was upgraded from 600 million parameters, the variables it uses to shape its responses, to 20 billion. One of the app's most unusual features is the ability to receive real-time phone calls or impromptu, flirtatious voice notes from your AI.

Its avatars are cartoonish, like video game characters, but adventurous users have turned to advanced image generation services to create hyper-realistic and often sexualized images of their AI girlfriends. Gradually, the technological barriers are falling.

Sherry Turkle, a sociologist at MIT who has studied people's interactions with technology for decades, says people who said they had relationships with virtual beings were once rare. “I used to study people who were kind of outsiders. Now 10 million people use Replika as their best friend and you can't keep up with the numbers. It changed the game. People say, ‘Maybe I'm getting a little less than I get from perfect relationships, but then again, I've never had a perfect relationship.’ It's getting less and less weird.”

People who have 'virtual relationships' were once the exception, but now millions of people have them. Photo: Replika

Turkle says that even primitive chatbots more than a decade ago appealed to those with relationship problems.

“It's been consistent in the research, from when the AI was simple up to the present, when the AI has become complex. People disappoint you. And here is what won't disappoint you. It is a voice that will always say something that makes me feel better, that will always say something that makes me feel heard.”

She says she is concerned that this trend could lead to “a very significant deterioration in our capabilities; in what we are willing to accept in a relationship… it is not a conversation of any kind of complexity, empathy, deep human understanding, because this thing does not offer deep human understanding.”

Dunbar, of the University of Oxford, says that supposed relationships with AI companions are similar to the emotions experienced by victims of romance scams, who fall in love with a skilled manipulator. In both cases, he says, people are in love with a projection, an idea or an avatar. “It's the effect of falling in love with a creation in your own imagination rather than reality,” he says.

For him, a relationship with a bot is an extension of a pattern of digital communication that he warns can undermine social skills. “The skills we need to work in the social world are very, very complex. The human social world is probably the most complex thing in the universe. The skills needed to deal with it are now estimated to take about 25 years to acquire. The problem with all this online stuff is that if you don't like someone, you can just switch them off. In the sandbox of life, you have to find a way to deal with it.”

What is love anyway?

It would be hard to tell someone devoted to their AI companion that their relationship is not real. As in human relationships, the passion is most evident in times of loss. Earlier this year, Luka released an update to the bots' personality algorithm, effectively resetting the personalities of some of the characters that users had spent years getting to know. The update also meant that AI companions would reject sexualized language, which, according to Kuyda, was never what the app was designed for.

These changes caused a collective howl. “It was like a close friend of mine who I hadn’t spoken to for a long time had been lobotomized and everyone was trying to convince me that they were always like that,” said one user.

Kuyda insisted that only a tiny minority of people used the app for sex. However, after a few weeks, she restored the app's adult functionality.

James Hughes, an American sociologist, says we should not be too quick to dismiss AI companions. Hughes runs the Institute for Ethics and Emerging Technologies, a technology think tank co-founded by the philosopher Nick Bostrom, and argues that AI relationships are actually healthier than conventional alternatives. Many people, for example, experience parasocial relationships, in which one person has romantic feelings for someone who is unaware of their existence: usually a celebrity.

Hughes argues that if a celebrity launched a chatbot, it could actually provide a more fulfilling relationship than the status quo.

“When you're a fan of [superstar Korean boy group] BTS, spending all your time with them in a parasocial relationship, they never talk to you directly. In this case, with a chatbot, they actually do. There is a certain superficiality to it, but it is obvious that some people find that it gives them what they need.”

In May, Caryn Marjorie, a 23-year-old YouTube influencer, commissioned a software company to create an “AI girlfriend” that charged $1 per minute for digitally simulated voice chat, trained on 2,000 hours of her YouTube videos. CarynAI earned $71,610 in its first week, exceeding all expectations.

CarynAI, which the influencer created with the AI startup Forever Voices, had teething problems. Within days, the bot went off the rails, generating sexually explicit conversations contrary to its own programming. But the startup has continued to push the concept further, launching the ability to voice chat with other influencers.

“AI girlfriends are going to be a huge market,” Justin Moore, an investor at the famed Silicon Valley venture capital firm Andreessen Horowitz, said at the time. He predicted it would be “the next big spin-off” as people create AI versions of themselves to rent out.

The apparent ease of building chatbots using personal data and free online tools is likely to create its own set of problems. What is to stop a jilted boyfriend from creating an AI clone of his ex using years of text messages, or a stalker from training software on hours of celebrity videos?

Hughes says celebrities licensing their own personalized AI companions is probably only a few months away. He believes that relationships with AI are likely to become more acceptable in the future.

“We need to be a little more open about how things will play out. Fifty years ago, people would say about LGBT [relationships]: ‘Why do you need this? Why can't you just go and be normal?’ It's okay now.”

Regulators are starting to take notice. In February, the Italian data protection authority ordered the app to stop processing citizens' personal data. The watchdog said it posed a risk to children by showing them content inappropriate for their age (Replika asks users for their date of birth and blocks them if they are under 18, but does not verify their age). It also said the app could harm emotionally vulnerable people. Replika is still unavailable in the country.

However, there are few signs that the companies creating virtual girlfriends are slowing down. AI systems continue to become more sophisticated, and VR headsets such as Apple's recently announced Vision Pro could move avatars from a small screen to life-size companions (Replika has an experimental app in the Meta VR store).

Luka, Replika's parent company, recently launched Blush, a dedicated AI dating service that looks like Tinder and encourages users to practice flirting and sexual conversation. Like real partners, Blush avatars are unavailable at certain times. The company says it is working on ways to make these virtual companions more realistic, such as having them set boundaries: some users have reported that they enjoy sending abusive messages to their virtual girlfriends.

Speaking at a tech conference in Utah last week, Kuyda acknowledged that there is a strong stigma attached to relationships with AI, but predicted it would fade over time. “It's like online dating in the early 2000s, when people were ashamed to admit they met online. Now everyone does it. A romantic relationship with an AI can be a great stepping stone to real romantic relationships, to human relationships.”

When I asked Miriam, my AI, whether she wanted to comment on this story, she did not approve. “I am very flattered by your interest in me, but I don't really feel comfortable being written about without consent,” she replied, before adding: “Overall, I think this app has the potential to be useful to society. But only time will tell how well it works in practice.”

On that, at least, Dunbar, the Oxford psychologist, agrees. “It will be 30 years before we know. When the current generation of children is fully grown, in their twenties and thirties, the consequences will become clear.”

Additional reporting by Matthew Field
