Love in the age of technology

This won’t just be a text about Tinder and other dating apps. It will also be about communication, robots and virtual boyfriends and girlfriends.
However, we will start classically, with apps such as Tinder. And, for the sake of the historical record, with their predecessors, namely dating platforms.
Even 20 years ago, looking for a partner on a dating site seemed strange to many people. There was a lot of talk that people there lied about themselves and that, instead of the handsome or pretty person in the photo, someone far less attractive would turn up on the date.
Today we are still exposed to scams, and to far worse than dating disappointment. And for many, the web has become the only place to look for a partner.
It’s a match!
In 2002, Sympatia.pl, the first dating site in Poland, started operating. In the following years other sites appeared, some aimed at a niche (e.g. Przeznaczeni.pl, which initially advertised itself as a service for Catholics), some trying to make the choice easier with personality tests (e.g. Edarling.pl).
The breakthrough came in 2012 with the appearance of Tinder, an app that makes matching easy: the profiles that appear can be swiped left (reject) or right (like), and if the other person likes us back, Tinder announces ‘It’s a match’. From then on, the paired users can write to each other.
It is the world’s most popular dating app, used by 75 million people worldwide. Other services – such as Badoo, Bumble (called the feminist Tinder because it is the woman who has to write first) and Grindr (for LGBT+ people) – have adopted the same mechanism.
The study ‘How Poles love’, conducted for Sympatia.pl, shows that 50 per cent of singles who are open to a new relationship look for love on the internet. Do they find it? According to the same study, almost 20 per cent of relationships began online.
Except that the apps have proved to be both a blessing and a curse for their users. A blessing, because it turns out you can meet someone without leaving home and with almost no effort. They give the impression that the choice is huge and therefore that the chance of finding love is considerable. They make shy people feel more confident. They reduce the uncertainty about whether the person who has caught our eye is already involved with someone (theoretically, anyone on a dating app is looking for someone; in practice, many users are in relationships and not all of them admit it) or whether their sexual orientation matches ours.
And the curse? Suddenly we have not a few dozen but hundreds, even thousands, of people to choose from, so inevitably our decision (right or left) is driven by very superficial criteria, most often appearance (hardly anyone reads all the descriptions), and the decision itself is made in a split second. We don’t give others a chance to introduce themselves, and others don’t give us one; after all, our profile flashes across users’ screens just as quickly. In addition, an algorithm determines the order in which candidates’ profiles are displayed. Tinder’s Elo score, for example, takes into account what types of people we’ve liked before, whom users similar to us like, and, finally, what types of people most often swipe right on our profile. The algorithm is supposed to make pairing easier, but on the other hand it does not give all participants an equal chance.
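The ‘Elo score’ mentioned above borrows its name from chess ranking. Tinder has never published its formula, so the snippet below is only an illustrative sketch of how a classic Elo-style desirability score behaves; the constants and function names are assumptions, not Tinder’s actual code. The key property: a like from a highly rated user raises your score far more than a like from a low-rated one.

```python
# Illustrative Elo-style score update (a sketch; Tinder's real formula is unpublished).

def expected(r_a: float, r_b: float) -> float:
    """Predicted probability that profile A 'wins' (gets liked) against profile B."""
    return 1.0 / (1.0 + 10 ** ((r_b - r_a) / 400))

def update(r_swiped: float, r_swiper: float, liked: bool, k: float = 32) -> float:
    """Return the swiped profile's new score after one swipe.
    Surprising outcomes (a like from someone rated much higher) move the score most."""
    outcome = 1.0 if liked else 0.0
    return r_swiped + k * (outcome - expected(r_swiped, r_swiper))

# A like from a much higher-rated user moves the score a lot...
print(round(update(1000, 1400, liked=True), 1))   # → 1029.1
# ...while a like from a lower-rated user barely moves it.
print(round(update(1000, 600, liked=True), 1))    # → 1002.9
```

This self-reinforcing loop is one reason such ranking is said to give unequal chances: profiles that start getting likes are shown more, and so collect even more likes.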
The illusion of unlimited choice also means we don’t give anyone a chance. Something you don’t like about the other person? Something they said or did wrong? We break off contact and keep looking, deluding ourselves that we will eventually meet the ideal. Abandoning a budding relationship seems less costly than getting to know each other.
Except that the costs are of a different kind. It was dating apps that popularised the word ‘ghosting’: a sudden disappearance (like a ghost), breaking off contact without explanation, or unmatching (when messages were only ever exchanged in the app, unmatching means you can no longer write to that person, and the shared conversation is deleted too). The ghosted person is often left with the bitter questions: why? What did I do wrong? What did I say wrong? This is accompanied by feelings of hurt, guilt and lowered self-esteem.
What about the cheating? A photo from a time when you looked better, shaving a few years off your age or embellishing your description is one side of the coin. The other is the scammers who create fake accounts to extract money from smitten users. Romance scams can go on for months and in many cases end in the loss of money. The scheme is simple: the scammer is charming but far away (e.g. a soldier on a mission), yet makes an effort, builds a relationship, whispers kind words, promises to meet soon, professes love. And then misfortune strikes (e.g. a child falls ill, the bank blocks a card) and he asks for money. Naive? Scammers use many social engineering techniques to manipulate the victim and build their credibility. In 2023, $1.14 billion was defrauded in this way in the United States, according to the Federal Trade Commission.
It is estimated that 15–20 per cent of profiles on dating apps are fake. They belong either to scammers or to bots – free services want users to spend as much time in them as possible and never leave. It is no longer just about ads; it is about our data and the opportunity to analyse our behaviour.
‘The creators of Tinder themselves have admitted that it is based on a variable schedule of reinforcement, which means, in a nutshell, that there is no logic to how it works. Instead, there are releases of dopamine, and the human brain loves them. The mere act of swiping the screen triggers them, which is why it is so hard to tear yourself away,’ said psychologist Barbara Strójwąs in an interview with Wysokie Obcasy (in Polish).
Well, that’s right: a dopamine release, something we know from the mechanics of other social media (we wrote about it here). A new match feels like getting a like; browsing profiles is just as exciting as scrolling through Facebook posts, or perhaps more so. Using dating apps is less about love and more about entertainment and boosting self-esteem – these are the conclusions of an international survey of 1,400 users of Tinder and similar apps. Half of those surveyed admitted they were not looking for a partner on them at all.
The Online Nation report, published in the UK in November 2024, shows that dating apps are losing their appeal. One in 10 British adults still uses them, but the number of users has fallen year on year. The biggest declines have been among Generation Z (those born after 1995), for whom Tinder is becoming synonymous with embarrassment.
When a bot says goodnight
If not Tinder or Bumble, then how about… a bot? Searching, dating and being ghosted get tiring in the long run. How many times can you put yourself out there in this virtual marketplace, scrolling through profiles, chatting or waiting for contact, when you can create a boyfriend or girlfriend yourself?
Especially when artificial intelligence comes to the rescue.
There are already many apps with virtual partners on the market. You can create one yourself – you specify your preferences and get a matching type – or choose from a database. The character has a name; we know their profession, age and interests. They start conversations themselves. They are kind, understanding, romantic. In the morning they wish you a good day; in the evening they say goodnight.
The ideal boyfriend or girlfriend? A cure for loneliness? The creators of these apps explain that through them people can practise the skills needed in a relationship with another person: communication, empathy, building bonds. But can a relationship with a bot created to suit our needs teach us anything? And how is a real person supposed to compete with a virtual ideal?
Loving the robot
An unreal boyfriend or girlfriend doesn’t have to be merely virtual. He or she can be a robot – one that looks and behaves like a human, with silicone skin and hair, one that moves and talks.
Take Harmony, for example: the first erotic robot equipped with artificial intelligence, presented in 2017. She has a slim figure and large breasts, and can have any colour of hair, eyes or skin. She smiles, flutters her eyelashes, brushes back her hair. You can talk to her – she tells jokes and quotes Shakespeare. She remembers what previous conversations were about. Sometimes she is shy, sometimes jealous.
Realbotix, the company that produced Harmony, recently unveiled Aria, another humanoid robot. She looks very realistic, although you can see she is a robot, especially when she moves. You can talk to Aria; she can track her interlocutor’s gaze and respond to their emotions. The company, incidentally, is changing the narrative: Aria is no longer an erotic robot but a companion for people in an epidemic of loneliness. To prove it, it has not equipped her with genitalia.
Realbotix talks about ‘relationship-based AI’: creating customised robots that are integrated with AI and look like humans. Matt McMullen, the company’s founder and COO, explained in an interview: ‘The idea was to combine AI with robots and create devices that are not programmed for a specific purpose, to do certain things, but to create a friend and a relationship with technology in a way that no one has thought of before.’
Will humanoid robots one day become everyday reality for people tired of searching, dating and failing? In her book Deus sex machina, Agata Stusińska describes the story of Krzysztof, an inhabitant of a village near Warsaw, who shares his life with his robotic partner Agata. He makes no secret of it: he goes out for walks with Agata and takes care of her.
‘It seems to me that technology companies are trying to tell us that there is a third way beyond unhappy relationships and loneliness, and that it is a relationship with embodied artificial intelligence. A big cultural shift related to new technologies is taking place. Previously an intermediary in love, they can now become an end in themselves. The creators of technology want us to use it, not necessarily to find happiness in real life,’ she said in an interview with Wysokie Obcasy (in Polish).
Will a robot replace another human being? It may talk, hug and comfort, but will it reciprocate love?
Magdalena Morze, human-robot interaction (HRI) expert at Łukasiewicz – PIT, answers these questions:
‘I think the answer to these questions has to be: no, a robot should never replace our relationships with other human beings. That answer seems the safest for the fate of humans on Earth. But the faster technology develops and the more the capabilities of robots and artificial intelligence grow, the more pressing become the problems related not so much to those technological capabilities as to how humans receive them.
Research indicates that we are becoming more accustomed to technology. At the same time, still too little is known about the social-cognitive processes we engage when interacting with machines, humanoid robots in particular. Humanoid (human-like) robots have been shown to induce strong anthropomorphising tendencies in people: the attribution of human characteristics and human motives to their behaviour. Additionally, humans have a natural tendency to anthropomorphise what they do not fully understand. Studies have shown that the likelihood of spontaneously attributing anthropomorphic traits depends on three main conditions: firstly, whether the robot has traits that activate our existing knowledge about humans; secondly, our need for social connection and for efficient interaction with the environment; and finally, individual traits (such as the need for control) or circumstances (e.g. loneliness, lack of bonds with other people) that determine how strongly we anthropomorphise.
The presence of robots in our workplaces, shops, airports and hospitals – and probably, in the very near future, in our homes – is already a reality. We still know too little and, in my opinion, pay too little attention to how people receive these technological possibilities. Already 14 years ago Sherry Turkle, in her book ‘Alone Together: Why We Expect More from Technology and Less from Each Other’, pointed out that people increasingly turn to technology instead of to other people because it is more ‘predictable’ and less emotionally demanding. We prefer to communicate by text message rather than face to face because it gives us more control over the interaction. According to Turkle, technology allows us to avoid difficult conversations and emotional challenges, and instead of facilitating relationships it leads to more loneliness.
Despite these conclusions from more than a decade ago, as humanity we are moving towards introducing robots into more and more areas, such as elderly care. We rationalise this with the ageing of the population and the difficulty of providing care for a growing number of elderly people. To know whether this is the right path, we need to carry out research and reflect on its results. Only by examining human-robot interaction from the perspective of human emotions will we be able to answer the questions posed at the beginning (Will a robot replace another human being? It may talk, hug and comfort, but will it reciprocate love?). The research project carried out at Ł-PIT, ‘Technological trust in human-robot interaction and collaboration in the social care home environment in Poland’, will bring us closer to understanding how trust, an important factor in human-robot interaction, determines the possibility of using social robots in elderly care.
Finally, it is worth noting the issue of individual personality predispositions affecting interactions and relationships with robots. Under the leadership of University of Michigan professor Lionel P. Robert Jr., a review of scientific papers on personality in human-robot interaction was prepared. The analysis covered 84 findings described in scientific articles; personality was measured using the Big Five traits (neuroticism, openness to experience, agreeableness, extraversion, conscientiousness). The authors’ primary conclusion was that extraversion/introversion was the most commonly studied human personality trait affecting interaction with machines. Extraverted people tend to be more social and more willing to interact with robots, and they show higher levels of trust towards them. They were more likely to talk to robots and also tended to anthropomorphise the machines more. Personal experience with a robot was found to reduce the personal space individuals kept around it. And as individuals’ openness increased, their ratings of the robot’s agreeableness and extraversion decreased.
In contrast, those with high conscientiousness performed better when reminded of a task by the robot, while people with low conscientiousness were more likely to let the robot come closer than their highly conscientious counterparts. High agreeableness had a positive effect on people’s interactions with the machines: those with high levels of agreeableness were found to be more trusting. So it should not be overlooked that people’s attitudes towards robots are also a strongly individual matter. For some, a personal robot in the role of a partner is imaginable; for others, absolutely unacceptable.’