ChatGPT the liar: it cites studies and articles that do not exist and were never published

The famous OpenAI chatbot is causing serious problems. The latest to come to light is that ChatGPT cites, in its responses, studies and articles from various media outlets that were never published. In other words, it hands out made-up references. Read more here!


At this point almost everyone knows, or has at least heard of, OpenAI's ChatGPT. However, not everything related to it is positive. Lately, the chatbot has been embroiled in a sea of problems because of the fabricated data and misinformation it includes in its responses. And that is not all: when citing sources, studies, or articles, ChatGPT also provides information and references that were never published.


While ChatGPT offers useful and relevant information and help in certain cases, the AI chatbot also responds to users with made-up data attributed to real media outlets and professionals who never said or wrote any of it. And, given the confidence with which ChatGPT delivers it, this misinformation can quickly become a very serious problem.


We bring you several examples of this. The most recent was reported by Chris Moran, head of editorial innovation at The Guardian, in an article. In it, Moran said that journalists at the outlet noticed the AI chatbot had made up entire articles and bylines that were never actually published. The problem is that neither the users requesting the information nor ChatGPT itself can reliably distinguish truth from fiction.


Moran wrote: “Much has been written about generative AI’s tendency to fabricate facts and events. But this specific quirk, the invention of sources, is particularly troubling for news organizations and journalists, whose inclusion adds legitimacy and weight to a persuasively written fantasy.”


“And for readers and the broader information ecosystem, it opens up entirely new questions about whether citations can be trusted in any way. And it could well fuel conspiracy theories about the mysterious removal of articles on sensitive topics that never existed in the first place,” continued The Guardian’s head of editorial innovation.


But the problem does not end there. Other writers have discovered (or been tipped off) that ChatGPT was misattributing quotes to them in its responses. Kate Crawford, AI researcher and author of “Atlas of AI”, was contacted by an Insider journalist whom ChatGPT had told that Crawford was one of the main critics of podcaster Lex Fridman. To appear more credible, the chatbot offered the journalist a series of links and quotes connecting Crawford with Fridman. However, Crawford confirmed that all of it was false.

Another case came to light last month, when journalists from USA Today found that the AI chatbot had produced citations for entire research studies claiming that access to guns does not increase the risk of child mortality. In their piece they wrote: “ChatGPT used the names of real firearms researchers and academic journals to create an entire universe of fictitious studies in support of the completely flawed thesis that guns are not dangerous to children”.


To make the situation even worse, when questioned about that data, the AI chatbot replied: “I can assure you that the references I provided are genuine and come from peer-reviewed scientific journals.” Something that, as is already clear, is not true.


These are probably just a few of the many similar cases out there. Although many media outlets and journalists are catching these errors, it seems things will not change until OpenAI and the makers of other AI tools fix the problem. Until then, people using the chatbot should take the information ChatGPT gives them with a grain of salt.
