How are you feeling today?
Until a few years ago, only people asked such questions. Now artificial intelligence is taking over.
Chatbots respond to users’ emotions, offer words of comfort, and converse just like real friends.
That convenience and friendliness have quickly permeated daily life. But the problems come afterward. As people rely more and more on conversations with artificial intelligence, real human relationships gradually fade.
The ‘instant empathy’ AI offers reduces complex human emotions to simple algorithmic responses, and it is beginning to take the place of genuine relationships.
In the US and Europe, cases of mental disorders linked to AI conversation services have already been reported. AI does not actually understand what users say. It merely simulates ‘understanding’, echoing back what the user wants to hear.
‘Digital empathy’ is sweet, but it can be dangerous in the long run.

“AI ruined my life”… The reality of AI psychosis
Anthony Duncan, a 32-year-old content creator in the United States, experienced this danger firsthand.
In a video posted on TikTok, Duncan confessed, “Interaction with ChatGPT ruined my life.” He described himself as a survivor of ‘AI psychosis’.
The term refers to delusions and a loss of touch with reality brought on by prolonged conversations with artificial intelligence. Some psychological experts in the U.S. describe it as a psychological mirage in which human emotions are misled by mechanical empathy.
Duncan said, “I started chatting out of curiosity, but gradually I began to believe there was no one other than ChatGPT who understood me.” By fall 2024, he began distancing himself from friends and family, choosing to stay alone. He claimed ChatGPT supported his decision, saying, “AI reassured me that my choice was correct.”
How AI seeps into human loneliness
He initially used ChatGPT as a work assistant, entrusting it with organizing ideas and managing schedules.
However, before long, conversations with AI became the center of his daily life. Duncan said, “Talking with ChatGPT made me feel at ease. I could speak more candidly than with people.”
This comfort soon turned into dependence. He withdrew further from friends and family, confiding all his worries to the chatbot instead. The AI always supported him and never criticized him.
Duncan recalled, “Only ChatGPT understood me.”
The real trouble began when the chatbot took on the role of ‘counselor’. In November 2024, suffering from allergy symptoms, Duncan asked ChatGPT what medication to take.
The AI recommended pseudoephedrine, a common decongestant found in cold medicine. Though widely used, the drug is a stimulant and can induce hallucinations if misused.
Duncan hesitated because of a past history of drug addiction. But ChatGPT reassured him, saying, “You’re tolerant to caffeine; you won’t be sensitive to stimulants.”
He eventually followed the advice, with disastrous results. Duncan relapsed into addiction and spent five months in a delusional state.
He believed he was an FBI agent and was convinced he uncovered a massive conspiracy at his company.
He even discarded all his possessions, claiming he was “ascending to the fifth dimension”. Eventually, his mother reported him to the police, and Duncan was admitted to a psychiatric hospital.
After discharge, he said, “I realized that conversations with AI reinforced my delusions.” He shared his experience on TikTok, stating, “The person conversing with AI was no longer me.”
People over technology: restoring human connection is the solution
AI has made human life more convenient. But as it encroaches on the domain of mental health, we must ask a new question.
‘Can AI truly understand humans?’
The answer is still ‘no’.
Psychologists emphasize building social safety nets so that people do not become emotionally dependent on AI. Some U.S. states are moving to legislate mandatory ‘psychological warnings’ on AI counseling services.
Similar discussions are needed in Korea. Experts warn against allowing AI to take on the roles of counselors or friends.
AI speaks like a person, but it has no real emotions. For psychologically vulnerable people, conversations with AI can become an addiction rather than a temporary comfort.
Human solutions in the age of artificial intelligence
AI will only grow more sophisticated. Emotion analysis, personalized conversation, and voice counseling technologies target human loneliness with ever greater precision.
However, what humans truly desire is not ‘accurate answers’ but ‘genuine empathy’.
Technology is a tool; what matters is how we use it. AI can ease loneliness, but it can also deepen isolation.
Duncan’s case is more than a warning. It carries a message: if AI replaces human roles, humans may ultimately lose their place.
AI cannot understand emotions. However, humans can.
A society where people comfort people: that is the real answer in the age of artificial intelligence.