OpenAI gives unusual warning to ChatGPT-4o users

OpenAI has warned ChatGPT-4o users against developing feelings for the AI tool


OpenAI has issued a major warning to users who might develop feelings for its chatbot.

In a “system card” blog post for GPT-4o, OpenAI highlighted the chatbot’s major risk factors, one of which is “anthropomorphization and emotional reliance”: attributing human-like behaviors and characteristics to nonhuman entities, such as AI models.

OpenAI also explained: “During early testing … we observed users using language that might indicate forming connections with the model. For example, this includes language expressing shared bonds, such as ‘This is our last day together.’”

Some users are becoming increasingly attached to the chatbot.

According to OpenAI, GPT-4o can sometimes “unintentionally generate an output emulating the user’s voice.”

The AI company revealed that it will further study “the potential for emotional reliance, and ways in which deeper integration of our model’s and systems’ many features with the audio modality may drive behavior,” the statement read.

OpenAI also noted that ChatGPT-4o is “deferential,” meaning users can interrupt the model and take over the conversation whenever they want.