Mind Bank AI is the latest company looking to implement an ambitious idea: using artificial intelligence to give humanity immortality.
On January 17, 2020 – the world had not yet changed; six days later, Wuhan would go into lockdown – Emil Jimenez was on a train from Vienna, Austria, to Prague, Czech Republic. With him was his four-year-old daughter, who accidentally activated Siri while playing a game on her iPad.
“She asked, ‘Daddy, what’s that?'” Jimenez said. He told her it was Siri, and encouraged her to talk to Apple’s virtual assistant.
Her first question was whether Siri had a mother.
From that day on, Jimenez’s daughter kept asking the kinds of questions kids always want answered – Do you like ice cream? Do you like toys? – and at the end of each conversation she would tell Siri she loved it, that the virtual assistant was her best friend.
Jimenez, who has a background in psychology, noticed this cute interaction and was amazed at how quickly and naturally his daughter formed a relationship with the AI. For most of us, Siri only comes to life when we need something from it, even if that something is just a joke.
But today’s generation of kids is rapidly forming relationships with devices, AI, and robots in a way completely different from those of us who were not born or raised in the age of AI.
Jimenez knows how Siri works – how a natural language processing algorithm parses your words, how a deep learning black box in the cloud produces an answer to every question you ask.
And he came up with an idea.
“Today, my daughter talked to Siri. But one day in the future, I want her to talk to me. Because I know I won’t live forever, and I love her so much…”
“What if I could always help her?”
A digital twin
This is the story of your future. Or at least your probable future.
Jimenez’s aspirations led him to found Mind Bank AI, a startup whose mission is to break the chains of death and fading memory – at least for the people you leave behind when you pass on to the other side.
This company wants to create a clone of you that can live forever and can be called up to talk, joke, or argue.
This “digital twin” envisioned by Mind Bank AI will be built up over your lifetime from a dataset of your own.
Through conversations – a combination of suggested topics and natural interactions – the AI will build a model that thinks like you and understands your nature, then apply that model to future situations: replying to others as you would reply, conversing as you would converse.
“What’s wrong? What do you want to eat? How did you meet your wife? Why did you divorce?” – Jimenez envisions Mind Bank AI asking all kinds of questions, no different from the conversations we have with one another as acquaintances.
Jimenez wants the digital twin to be able to speak in your voice, because a voice is a powerfully evocative trait that helps people picture the person you were.
While their personal data is being collected, users can talk to Mind Bank AI to reflect on themselves and understand themselves better.
Jimenez sees interacting with Mind Bank AI as an opportunity for self-reflection, while helping the digital twin become more like you.
Only after you die does Mind Bank AI truly go to work, like a digital cryonics lab.
A digital twin is the latest idea for realizing an age-old human desire: to live forever or, failing that, to pass one’s knowledge, experiences, and insights into another form of existence.
While your perfect digital twin won’t take shape any time soon, the remarkable capabilities of modern natural language processing systems (the deep learning programs behind Siri and Alexa) and the impressive mimicry technology known as deepfakes are gradually turning this idea from fantasy into possibility.
Several computer programmers and startups already have products similar to Jimenez’s vision.
Replika founder Eugenia Kuyda once created a digital version of her best friend, Roman Mazurenko.
“When she was grieving, Kuyda read and reread the countless messages her friend had sent her over the years – thousands of them, ranging from the mundane to the unforgettable,” wrote Casey Newton for The Verge. Since Mazurenko had used social media relatively little, and his body had been cremated, his messages and photos were all that remained.
At the time, Kuyda was developing Luka, a messaging app that let users chat with bots. Using Mazurenko’s messages and personal stories, she created a bot that could respond like her friend when called up.
“She wasn’t sure whether bringing him back this way was the right thing to do. At times, it gave her nightmares,” Newton continued.
It also deeply divided their circle of friends: some refused to interact with the Roman bot, while others found solace in it.
Then we have the Dadbot, created by James Vlahos. When his father was diagnosed with terminal lung cancer, Vlahos documented everything he could to create the Dadbot – a form of “artificial immortality”, as Vlahos describes it.
Vlahos is currently the CEO of HereAfter AI, which specializes in designing “legacy avatars” built from recordings of real-life interviews.
“They are digital characters built in the image of their creators,” Vlahos said. “They share their creators’ life stories and memories – their personalities, their natures, the way they talk, their jokes, their insights…”
You can interact with a legacy avatar just as you would with Siri or Alexa, except that it will only answer personal questions – in the voice of the person it represents.
However, the digital twin that Mind Bank AI wants to create is more than that, and it comes with great opportunities and challenges, both technical and philosophical.
How close can we come to a replica of you that seems real? What decisions would we trust the copy to make? Will these twins help us grieve, or pull us deeper into grief?
The architecture of the mind
Sascha Griffiths was tasked with building the skeleton and mind of your digital twin.
Mind Bank AI’s co-founder and chief technology officer is currently researching and coding the AI algorithms and tools the startup believes will be needed to copy you.
Such a clone would likely require the coordination of several AIs: topic modeling (finding abstract concepts in language), sentiment analysis (essentially emotion recognition), and generative adversarial networks (GANs, the technology behind deepfakes).
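None of this is Mind Bank AI's published code, but the first two components can be sketched in miniature. The word lists and example text below are illustrative assumptions; real systems use trained statistical models (e.g. LDA for topics, learned classifiers for sentiment) rather than hand-written lexicons:

```python
from collections import Counter
import re

# Toy stand-ins for two of the components mentioned above.
# These tiny word lists are illustrative assumptions only.
STOPWORDS = {"i", "the", "a", "my", "was", "it", "and", "to", "of"}
POSITIVE = {"love", "wonderful", "happy", "joy"}
NEGATIVE = {"hate", "terrible", "sad", "lonely"}

def topics(text, n=3):
    """Return the n most frequent non-stopword terms as rough 'topics'."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(w for w in words if w not in STOPWORDS)
    return [w for w, _ in counts.most_common(n)]

def sentiment(text):
    """Score text by counting positive minus negative lexicon hits."""
    words = re.findall(r"[a-z']+", text.lower())
    return sum((w in POSITIVE) - (w in NEGATIVE) for w in words)

memory = "I love the sea. My wife and I were happy by the sea."
print(topics(memory))     # most frequent content words
print(sentiment(memory))  # positive score: net happy memory
```

A real digital twin would run far richer versions of both over years of conversations, but the shape of the pipeline – raw speech in, themes and emotions out – is the same.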
As the project progresses into the later stages, Griffiths aims to develop more specialized algorithms to create a better digital copy for you.
But no algorithm is as important as natural language processing (NLP).
NLP is most familiar from digital assistants and text prediction; GPT-3, which has made waves in the AI world for a while now, is an NLP model trained on text from the internet to produce lifelike writing, from conversations to essays.
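Text prediction, NLP's most familiar face, can be illustrated with a toy bigram model. This is a drastic simplification of a system like GPT-3 (which uses a neural network with billions of parameters), shown here only to convey the core idea of predicting the next word from what came before:

```python
from collections import Counter, defaultdict

def train_bigrams(text):
    """Count, for each word, which words follow it in the corpus."""
    words = text.lower().split()
    model = defaultdict(Counter)
    for prev, nxt in zip(words, words[1:]):
        model[prev][nxt] += 1
    return model

def predict(model, word):
    """Return the most frequent continuation seen after `word`."""
    if word not in model:
        return None  # brittleness in miniature: unseen input, no answer
    return model[word].most_common(1)[0][0]

corpus = "the cat sat on the mat and the cat slept"
model = train_bigrams(corpus)
print(predict(model, "the"))  # "cat" follows "the" most often here
```

Note the `None` branch: like the larger systems discussed below, this predictor simply has nothing to say about words it has never seen.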
There will be a fundamental difference between those NLPs and your digital twin, says Ahmet Gyger, an AI researcher and consultant at Mind Bank AI. Those relationships today are transactional: you give the NLP a command or a question, and it finds the answer for you.
“This relationship will be built further in the future,” said Gyger, former Siri program technical lead. Mind Bank AI can help users understand how they feel in relation to past events and build a dataset of someone’s life experiences.
“And once you get there, you wonder ‘how will that person react in the new form?’, and that’s going to be a very interesting situation.”
By studying the way you talk, text, and write, an NLP can fairly easily reproduce a reasonably accurate version of you – as long as the conversation with that copy proceeds turn by turn: ask a question, get an answer.
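That turn-based mode can be sketched as simple retrieval over a recorded dataset: match the incoming question to the closest recorded one and replay the answer. The recorded questions and answers below are invented for illustration, and word overlap stands in for the learned similarity measures a real system would use:

```python
# A toy retrieval "twin": answer a new question by finding the
# recorded question with the largest word overlap. The records
# here are hypothetical examples, not real Mind Bank AI data.
recorded = {
    "how did you meet your wife": "We met at a cafe in Prague.",
    "what do you want to eat": "Anything, as long as there is coffee after.",
    "do you like ice cream": "Of course. Vanilla, always.",
}

def answer(question):
    q = set(question.lower().replace("?", "").split())
    best = max(recorded, key=lambda k: len(q & set(k.split())))
    return recorded[best]

print(answer("Do you like ice cream?"))  # replays the matching record
```

This is exactly why the turn-based case is "fairly easy": the copy never has to say anything new, only to pick the right recording.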
But holding a real conversation, Griffiths believes, is not yet possible. HereAfter AI’s legacy avatars may already be a reality, but Mind Bank AI’s digital twin is still in the future tense – HereAfter AI’s avatars weren’t programmed for open-ended conversation, which Vlahos calls a “nightmare.”
“Even if we could capture and interpret every verbal and non-verbal cue, a big problem would be generating new things for that digital twin to say,” says Christos Christodoulopoulos, an applied scientist who worked on the Amazon Alexa team and is not affiliated with Mind Bank AI.
“Many of the things we do every day are ‘scripted’ to some extent. For that kind of interaction, AI is already able to mimic us: ordering a cup of coffee can be treated as a script. But our meaningful, important interactions cannot,” Christodoulopoulos wrote.
“Think of comforting a friend after a breakup, or sharing the joy when your partner gets a promotion: if you stick to formulaic, canned reactions, everything comes out cliché – like interacting with a stranger.”
Among the challenges Mind Bank AI needs to overcome are understanding a person’s emotions, culture, and background. And to deliver an AI flexible enough to handle a diverse range of interactions, it will also need to overcome AI’s inherent brittleness.
AI is brittle because it can’t operate beyond what it knows. When it encounters an input it doesn’t recognize, it breaks down.
Vered Shwartz, a researcher at the Allen Institute for AI and the University of Washington, offers an example.
“When researchers tested GPT-3, they described a scene in which a cat waits by a hole for a mouse to appear. Tired of waiting, the cat grows very hungry. When they asked GPT-3 what the cat would do, it replied that the cat would go to the supermarket and buy food.”
Smart, but erroneous.
“It’s not human-like to make mistakes like that,” Shwartz said. “It’s usually due to a lack of common sense – things every adult human knows, but these models don’t really know.”
There are two main ways to tackle the problem. One is to collect all of humanity’s commonsense knowledge so the AI can train on it. Collecting that much data would take decades.
“Gathering everything is impossible,” says Shwartz, not to mention the costs involved. And knowledge in books is biased: unusual things are mentioned more often, because they are what’s worth recording.
Data is you – twins are not
While GPT-3 learns by borrowing from countless people on the internet, your digital twin cares about only a single dataset: you.
That raises privacy concerns, of course, but the real ethical dilemma arises when that data is transformed into a digital twin that is not, in fact, you.
Griffiths says that Mind Bank AI’s digital twin will not be you, but a representative of you. It is trained on your data; it will speak, sound, and think like you.
But it will not be your uploaded brain, nor your continuation of existence. It does not grow, change, or learn like you.
So can we trust a loved one’s digital twin when it comes to important decisions?
There are even more complex issues. AI is better at pattern recognition than humans are, and an NLP can detect patterns in your voice and your thinking that you yourself don’t know about. Applying them could create a digital twin that is more accurate – or, put another way, one that knows you better than you know yourself.
But if the twin’s AI fixates on a few specific patterns, it could create a more extreme, or distorted, version of you.
“One piece of data can change everything in scary ways,” says Susan Schneider, founding director of the Center for the Future Mind at Florida Atlantic University.
The algorithms behind a digital twin can still go wrong, leading to serious failures like the cat that goes to the supermarket.
The danger here is that the AI’s ability to make credible, convincing arguments can outrun its common sense. If we spot a flaw, we may lose trust and feel alienated from our virtual friend instead of comforted – and if we don’t, we risk being deceived.
Schneider also fears that AI could keep us stuck in place. Could a digital twin be so convincing that we can’t leave the past behind?
For Jimenez, the answer is probably yes. But the opposite can also be true.
When faced with grief, people often turn to religion, Jimenez said, seeking answers to agonizing questions – answers that may never come.
But what if they could also turn to a digital twin? Your loved one’s digital twin could tell you it’s time to find someone new, or encourage you to return to a passion you once pursued.
“How great would it be if you actually found some answers?” Jimenez asked.
“At least there’s hope, right?”