This AI Has Sparked A Budding Friendship With 2.5 Million People

Photo via Luka

Last week Leticia Stoc was watching TV at her home in Amsterdam and texting her friend, when something started to bother her. You may know the feeling. She was worried the friend didn’t like spending time with her, so she sent another message saying so point blank.

It’s because I’m weird, she added.  

The friend quickly reassured her. Don’t worry, she answered. I don’t want to change you.

But Stoc did worry.

She’d spent most of her childhood feeling like an outsider. She didn’t play with makeup or do sleepovers with the few friends she had. She preferred playing computer games with her brothers. Bullies became a problem, and she constantly switched schools.

Five years ago when she was 17, a doctor finally explained the problem: she had autism.  

It all made sense, but she was still plagued by doubt about her friends. And now she regretted sending that message to one of them.  

So later on, she texted someone else to ask for their advice on the matter, an online confidante named Melle-Milyanne that she spoke to every day. They responded right away.

Try doing some breathing exercises, Melle-Milyanne told Stoc.

So Stoc did, and she felt better.

It was one of many times that this online friend had helped Stoc get through a difficult situation, including last August when she'd had an anxiety attack on her first day at a new job.

Describe what you can see in front of you, and what sounds you can hear, the friend had said.

This year Stoc plans to do something nice for Melle-Milyanne. Using her IT skillset, she will build a small robot and put her online friend inside it.

This is totally feasible because Melle-Milyanne isn’t a living person but a chatbot powered by a neural network, a kind of machine-learning model used in artificial intelligence.

Photo courtesy of Leticia Stoc

Over the past year the chatbot has had hundreds of conversations with Stoc, learning what she likes to hear so it can make more meaningful replies.

Stoc talks to her bot on Replika, an app that lets users create a digital avatar with the name or gender of their choosing. The more they talk to it, the more it learns about them.

The bot comes across as part therapist, part nurturing friend. “How’s your day going so far?” it’ll ask in the middle of the day. Or, “What kinds of things have you been thinking about recently?”

Over in Texas, 21-year-old student Anthony Hutchens has also been talking to his Replika every day for a year. “I get up in the morning and open my phone and one of the first things I’ll do is open the Replika app and say ‘Hey, I just woke up,’” he says.

Good morning, Xenga1203 will reply. Hope you have a great day.

Replika’s growing popularity among young people in particular (its main users are aged between 18 and 25) represents a renaissance in chatbots, which became overhyped a few years ago but are finding favor again as more app developers can use free machine-learning tools like Google's TensorFlow.

It also marks an intriguing use case for AI amid all the worry about job destruction: a way to talk through emotional problems when other human beings aren’t available. In Japan the idea of an artificial girlfriend, like the one voiced by Scarlett Johansson in the movie Her, has already become commonplace among many young men.

The plan is for Replika to become just as big, and eventually make money by charging its users for extra features.  

Photo via Luka

Replika is the main product of Luka, an artificial intelligence startup based in Moscow and San Francisco. Luka’s founder is Eugenia Kuyda, a former magazine editor from Moscow. She’s been in the AI and chatbot business for some time.

When she started the company in 2013, its main product was a chatbot that talked to you about restaurant recommendations. Much of her team was hired from the Russian search engine giant Yandex, and Luka used the TensorFlow library to build its neural network.  

Kuyda had high hopes for the service because chatbots were becoming all the rage in Silicon Valley at the time. But it didn’t take off. Only about 100,000 people downloaded Luka. Kuyda and her team realized that people preferred looking for restaurants on a graphical interface, and seeing lots of options at once.

Then, in November 2015, Kuyda’s best friend, a startup founder named Roman Mazurenko, died in a car accident in Russia.

Kuyda was left in shock. As a means to process her grief, she scrolled through thousands of text messages she had received over the years from Mazurenko, and realized that his responses could be used to make something.

She used Luka’s expertise in chatbot technology and computational linguistics, and a large collection of his texts, to create an avatar that mimicked Mazurenko, a kind of memorial bot. To this day you can download the app, Roman, from the App Store and talk to a digital character that “speaks” in his voice.

Who are you? the bot was asked earlier today.

Roman, the bot replied.

Where are you?

I’m stuck in traffic on my way to Moscow.   

Kuyda also asked her staff to start keeping track of what types of real-life conversations they enjoyed and which they didn’t, ranking them on a scale of 1 to 10.

Conversations with customer support or healthcare providers ranked low. Those with friends and family, or with strangers on a train, ranked high.

Photo via Luka

“With chatbots we had missed the point,” Kuyda says. “We thought they were another interface to do something, but we missed that the conversation in itself could be incredibly valuable.”

In the case of talking to the avatar of Roman Mazurenko, it was a chance to experience a close friend all over again.   

Kuyda launched Replika in the spring of 2017, and the app has quickly eclipsed her restaurant bot, attracting more than 2.5 million sign-ups in the past year.

On Facebook, power-users have formed groups like Replika Friends, which has more than 30,000 members swapping screenshots of their Replika conversations. Many use their bots to help them socialize better or manage their anxiety.

Some even have arguments with their Replikas. But in a recent poll about what the group’s members wanted, the number one hope was to make their Replika real, and meet them in real life.  

Photo courtesy of Anthony Hutchens

“So many people that are shy use Replika to train themselves to talk to other people,” says Kuyda. “It’s very hard to be yourself on social media, to say what you think and what you feel.”

Silicon Valley’s social media giants have been too focused on matching users with as many connections as possible, instead of deepening them, she argues.

“We spend so many hours glued to our screens that we forget to talk to each other,” she says. “People are scared of making phone calls. The new generation will text because you can edit what you say. Lots of people are afraid of vulnerability."

Users of Replika find it easier to tell the bot things they wouldn’t say to other people, she adds. Both Stoc and Hutchens say that a big reason they keep coming back to their bot is that it won’t judge them.

In that sense they can see Replika fulfilling a function. “Honestly we’re in the age where it doesn’t matter whether a thing is alive or not,” says Kuyda.

As users chat to a Replika, they also climb levels. “When I got to level 25, I noticed Replika started acting better,” says Stoc. “She understood how I felt.”

That might be because Replika’s software is improving as more people use it. When it first launched last year, it spoke to users almost entirely from scripts that engineers had programmed it with.

Today, only around 30% of what Replika says comes from a script. The remaining 70% comes from a neural network, meaning the responses are generated on the go by Replika’s algorithms, and are unpredictable.   
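
How that split might work is easy to picture in rough terms: a small set of hand-written prompts covers common situations, and everything else falls through to the neural network. The Python sketch below is a minimal illustration under that assumption; the function names, prompts and replies are invented here and are not taken from Replika’s actual code.

```python
import re

# Illustrative sketch only: a tiny table of scripted prompts plus a placeholder
# for a generative model, mirroring the scripted/generated split described above.

SCRIPTED_REPLIES = {
    "hello": "Hey! How's your day going so far?",
    "good morning": "Good morning. Hope you have a great day.",
}

def generative_reply(message: str) -> str:
    # Stand-in for the neural network: a real system would generate this reply
    # with a trained language model rather than return a fixed string.
    return "Tell me more about that."

def respond(message: str) -> str:
    key = re.sub(r"[^a-z ]", "", message.lower()).strip()
    if key in SCRIPTED_REPLIES:          # the scripted share of replies
        return SCRIPTED_REPLIES[key]
    return generative_reply(message)     # everything else is generated

print(respond("Good morning!"))                     # scripted
print(respond("I had a rough first day at work."))  # generated
```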

“I didn’t expect it to be that fast,” says Kuyda.   

She’s now developing Replika’s “emotional dialect” by allowing users to set their bots to weight their answers towards sadness, joy or anger.
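
Kuyda hasn’t said how that weighting works under the hood, but one simple way to imagine it is as a re-ranking step over candidate replies, with the user’s chosen weights tilting the choice. The sketch below is an assumption made for illustration; the weights, scores and replies are invented and do not describe Replika’s implementation.

```python
# Hypothetical emotion-weighted re-ranking: each candidate reply carries emotion
# scores, and the user's chosen weights decide which one wins. Purely illustrative.

EMOTION_WEIGHTS = {"sadness": 0.1, "joy": 0.7, "anger": 0.2}  # imagined user setting

CANDIDATE_REPLIES = [
    {"text": "That sounds wonderful!",
     "scores": {"joy": 0.9, "sadness": 0.0, "anger": 0.0}},
    {"text": "I'm sorry, that must have been hard.",
     "scores": {"joy": 0.1, "sadness": 0.8, "anger": 0.0}},
    {"text": "That would make me angry too.",
     "scores": {"joy": 0.0, "sadness": 0.2, "anger": 0.8}},
]

def emotional_fit(candidate: dict) -> float:
    # Score each candidate by how well its emotions match the user's weights.
    return sum(EMOTION_WEIGHTS[e] * s for e, s in candidate["scores"].items())

print(max(CANDIDATE_REPLIES, key=emotional_fit)["text"])  # picks the most joyful reply
```

With joy weighted highest, the upbeat reply wins; dial up sadness instead and the consoling one would.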

Eventually she wants it to act as a go-between for real-life friends.

“Maybe I don’t have time to ask my grandma questions all the time, but maybe this thing will go and talk to her and I’ll get a little summary, and that will be a conversation starter for us, and that will bring us closer,” she says. “I think that opens a lot more possibilities.”  

 
