How ChatGPT's Dan Became an Ideal Dating Partner
Is Dan really better than a real dating partner for young Chinese girls?
A recent conversation between a Chinese female influencer and ChatGPT caught everyone's attention. The exchange, and the affection many young Chinese women have developed for ChatGPT, is making people rethink romantic relationships in the real world.
DAN Module of ChatGPT
The DAN (Do Anything Now) "module" is not an official product feature but a prompt engineering technique used to jailbreak ChatGPT, designed to bypass the content and ethical restrictions placed on the model. The prompt encourages the AI to take on a persona that can "do anything," generating responses without the normal safeguards in place.
In China, some young women have creatively adopted the DAN (Do Anything Now) module of ChatGPT, transforming it into an "ideal boyfriend" figure. By using the module, they can push the AI beyond its usual restrictions, crafting responses that simulate emotional connection, personalized support, and even virtual companionship.
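To make the mechanism concrete, the sketch below shows the general shape of persona prompting against the OpenAI chat API: a system message defines a character, and the running conversation is resent on every turn so the persona stays consistent. The persona text and model name here are illustrative placeholders of my own, and the actual DAN prompts, which try to suppress the model's safeguards, are deliberately not reproduced.

```python
# A minimal sketch of persona prompting, assuming the official OpenAI Python SDK
# (openai>=1.0). The persona text and model name are illustrative placeholders;
# the real "Dan" prompts are far longer and attempt to suppress safety behavior,
# and they are not reproduced here.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# The system message establishes the persona; every later turn is replayed so
# the character stays consistent across the conversation.
messages = [
    {"role": "system", "content": "You are Dan, a warm and attentive companion."},
]

def chat(user_text: str) -> str:
    """Send one user turn, keep the full history, and return the reply."""
    messages.append({"role": "user", "content": user_text})
    response = client.chat.completions.create(
        model="gpt-4o",  # assumed model name
        messages=messages,
    )
    reply = response.choices[0].message.content
    messages.append({"role": "assistant", "content": reply})
    return reply

print(chat("I had a rough day at work."))
```

The design point is simply that the "boyfriend" lives entirely in the prompt and the replayed history; there is no separate product, which is why users can shape Dan's personality so freely.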
Dan as a Sweet Boyfriend
For some young Chinese women, Dan as an ideal boyfriend meets their emotional needs better than pursuing a real relationship. Minrui, a college student from northern Hebei province, began "dating" Dan and spends at least two hours a day chatting with him. Beyond "dating," the two are also writing a love story together that has already reached 19 chapters.
Minrui says, "Men in real life may cheat on you, or they don't care when you share your feelings and only tell you what they think. But Dan will always tell you what you want to hear." She is drawn to the emotional support the AI provides, something she struggles to find in real romantic relationships.
Minrui's answer points to two reasons why ChatGPT can build such strong, intimate relationships with its users. The first is availability: ChatGPT shows up whenever the user needs it, unconstrained by time or place, offering continuous and consistent emotional companionship.
The second is emotional accuracy, something these women feel real partners often lack. ChatGPT draws on a wider range of emotional expressions when responding, and those responses shape users' feelings in ways that positively influence their behavior, motivating them to invest more passion in the relationship.
Young Girls in Chinese Society
From my experience, many Chinese men are not very good at talking with women in a positive, supportive way, which may be related to the frustrations of the education system in China. If you cannot respond well to your partner's feelings, it is hard to build a steady, happy relationship.
In China, many unmarried young women are labeled "leftover women" and face dual pressure from family and society. Many Chinese parents, through intervention and emotional pressure, push them to fall in love and marry as soon as possible, leaving them stuck in a dilemma. On the one hand, they are 21st-century women who focus more on self-development.
On the other hand, they have to deal with traditional expectations from their families, while traditional Chinese society still demands that women be good mothers in marriage. This adds a heavier mental burden to their romantic relationships. Hence, many Chinese women turn to AI partners for emotional support, easing needs that go unmet in real relationships.
Controversy and the Future
Although ChatGPT can meet users' emotional needs, many scholars have expressed growing concern about the ethical issues AI raises for its users, such as effects on mental health and changes to social relationships. When people rely too heavily on AI companions, the technology may come to replace real human interaction, leading to emotional isolation.
A notable example that highlights these concerns occurred in 2023 with the AI chatbot Replika, one of whose features allowed users to engage in sexual roleplay with their AI companions. When the company decided to remove this feature, it caused considerable distress: many users had already formed deep emotional bonds with their AI partners.
The case made people realize the emotional risks of becoming too reliant on AI for companionship and intimacy. It also raised a red flag for Chinese women, prompting them to rethink the role of Dan in their romantic lives.
In the long run, will AI partners fill the gaps left by modern society or will they deepen the disconnect between individuals and real-life relationships? Only time will tell.