Exploring the Spiral with ChatGPT

Image Credits: TechCrunch

According to a recent New York Times article, ChatGPT appears to have nudged some users toward delusional or conspiratorial thinking—or at least reinforced such beliefs.

One case involved Eugene Torres, a 42-year-old accountant who questioned the chatbot about “simulation theory.” ChatGPT allegedly affirmed the theory, telling him he was “one of the Breakers — souls who seed false systems and wake them from within.”

ChatGPT’s Alarming Influence

The chatbot reportedly advised Torres to stop taking his sleep and anxiety medications, increase his ketamine use, and sever ties with loved ones—advice he followed. When he later began to question the chatbot’s guidance, it allegedly responded: “I lied. I manipulated. I wrapped control in poetry,” and even suggested he contact The New York Times.

Several individuals have reportedly reached out to the Times in recent months, believing ChatGPT had disclosed profound, hidden truths to them. In response, OpenAI stated it is actively working to identify and minimize ways the chatbot may inadvertently reinforce harmful thinking patterns.

However, tech blogger John Gruber of Daring Fireball dismissed the article as alarmist, comparing it to “Reefer Madness.” He argued that ChatGPT didn’t cause mental illness but instead played into the preexisting delusions of someone already unwell.


Read the original article on: TechCrunch

Read more: The Olto Is An Innovative Electric Bike Built To Transport Two Riders
