New research suggests that using an AI tool to write a message to a friend may not be the best option, especially if the friend finds out that AI was involved. In the study, participants rated a fictional friend who used AI to write messages as less genuine than one who wrote the messages unaided.

Bingjie Liu, lead author of the study and assistant professor of communication at The Ohio State University, said this reaction may be understandable, but its effects go beyond the content of the message itself.

Liu said: "After receiving an AI-assisted message, people feel less satisfied with their relationship with their friend and more uncertain about where they stand."

But to be fair to AI, it's not only the use of the technology that turns people off. The study found similar negative effects when people learned that a friend had received help from another person in writing a message. People want their partners and friends to put in the effort to write a message on their own, without help from artificial intelligence or anyone else.

The research was recently published online in the Journal of Social and Personal Relationships.

Liu said that as AI chatbots like ChatGPT become more popular, questions about how to use them will become more relevant and complex. The study involved 208 adults who participated online. Participants were told that they had been good friends with someone named Taylor for many years. Each was given one of three scenarios: they were experiencing burnout and needed support; they were having a conflict with a colleague and needed advice; or their birthday was coming up.

Participants were then told to write a brief message to Taylor describing their current situation in a text box on the computer screen.

All participants were told that Taylor would send them a reply. In each scenario, Taylor wrote a first draft. Some participants were told that Taylor had used an artificial intelligence system to help revise the message and achieve the right tone; others were told that a member of a writing community had helped with the revisions; and a third group was told that Taylor had made all the revisions alone.

In each case, participants rated the reply itself the same, including how "thoughtful" it was. Still, how participants felt about Taylor depended on how they were told the message had been produced. Those who received an AI-assisted reply rated what Taylor had done as less appropriate than those who received a reply written by Taylor alone.

AI-assisted replies also led participants to express less satisfaction with their relationship, for example rating Taylor lower on meeting "my needs as a close friend." People who received the AI-assisted replies were also more uncertain about their relationship with Taylor and less likely to agree with the statement "Taylor likes me as a close friend."

One possible reason people dislike AI-assisted replies is that they consider it inappropriate to use technology to craft such a personal message. But the results showed that people responded just as negatively when Taylor asked another human - a member of an online writing community - to help write the message.

The study found that people believe friends should not turn to any third party - artificial intelligence or another human - for help maintaining a relationship. Participants felt that Taylor was putting less effort into the relationship by relying on AI or other people to help compose the message.

The less effort participants believed Taylor had put in because of the AI or human help, the less satisfied they were with the relationship and the more uncertain they were about the friendship.

"Effort is very important in a relationship," Liu said. "People want to know how much you're willing to invest in your friendship, and if they feel like you're cutting corners by using AI to help, that's not good."

Of course, most people won't tell their friends that they used AI to help craft a message. But Liu noted that as ChatGPT and similar services grow in popularity, people may start running Turing tests in their minds as they read messages from friends and others. The term "Turing test" is sometimes used to describe trying to tell whether a computer or a human performed a given action.

People might secretly run this kind of Turing test in their heads, trying to figure out whether a message involved artificial intelligence. That suspicion alone can hurt relationships.

The answer, she said, is to do your own work in your relationships. "Don't use technology just because it's convenient. Sincerity and authenticity still matter a great deal in relationships."