Students order books that don’t exist – ChatGPT is behind it

– This is where the bullshit generator kicks in, says master’s student Askil Antonsen.

He shows a conversation with the writing robot ChatGPT. At the request of his lecturer, Antonsen asked the robot for tips on relevant reading for his subject.

What is ChatGPT?
- An artificial intelligence assistant, programmed to help answer questions and solve problems
- Developed by OpenAI, founded by Sam Altman and Elon Musk in 2015
- Uses machine learning and language models to produce text
- Lacks information about events after 2021
- Speaks many languages, including Norwegian

The robot replies that of course it can help, and lists several book titles by “well-known specialist authors”. The tips from the writing robot include titles, authors and years. They look like real citations.

– The only problem is that the book doesn’t exist, says Antonsen.

– As a technology optimist, I threw myself into it straight away, says master’s student Askil Antonsen. Photo: Ada Bjøranger / news

More inquiries

When Antonsen could not find the robot’s recommendations himself, he contacted the library and asked if they could order the books. The librarians could not find the titles either.

– We have received orders like this since ChatGPT became widely known and used, says Erik Adalberon, first librarian at the University of Agder.

He says the library staff were surprised by the orders that came in.

– Getting more and more titles that do not exist is quite unusual, says Adalberon.

The library has now taken steps to warn the students.

– It is a real problem. We have contacted the academic departments and asked them to inform the students, says Adalberon.

First librarian at the University of Agder, Erik Adalberon. Photo: Ada Bjøranger / news

– A compulsive liar and a poet

Professor Gunhild Kvåle works with language and communication at the University of Agder.

– I have talked a lot with ChatGPT in recent weeks, and at one point I asked who Norway’s leading female writers are, she says.

The robot sent her a list of both male and female names. One of the names was Jorun Jacobsen Bryn, a name the professor had never heard.

– I thought it was an embarrassing gap in my knowledge, says the professor.

Professor Gunhild Kvåle was confused when the robot told her about an author she had never heard of. Photo: Ada Bjøranger / news

Was tricked

The robot described the author’s most famous work, with title and year. It even reproduced a specific poem.

– There is something about the linguistic form; it is very self-confident and presents everything as fact, which makes it easy to fool me into believing it is true, says Kvåle.

But the professor did not recognize the information, nor could she find it anywhere else. Finally, she asked whether the robot had made the author up. The robot then responded that it was “deeply sorry for the mistake it made”.

– It’s a bit strange to get an apology, but that’s what it’s programmed to do, she explains.

Kvåle believes the robot imitates human communication.

– That almost makes it easier for us to believe it and to place more trust in what it says, she says.

Kvåle believes it is important to remember that even though what the robot writes often looks like fact, it is not necessarily correct.

– It is a compulsive liar and a poet, she says.

Good press etiquette dictates that anyone facing accusations should be given the opportunity to respond. We therefore asked ChatGPT for a comment on this matter. We got a long answer.
– I take seriously the accusation of being a compulsive liar and a poet, but I would like to point out that I can only provide information based on the knowledge I have been trained on. If incorrect information is given, it may be due to errors in the data I have been trained on or to incorrect information from the sources I use to generate answers, writes the robot.

– But could it have happened that you came up with a book title or an author?

– It is possible that I may give incorrect or inaccurate information, either because the data I have been trained on is incorrect or out of date, or because I do not fully understand the question or the context around it, it writes.

Answer from ChatGPT

Here is the full answer from ChatGPT:

I am an AI model designed to help answer questions and provide information to users. I am unable to consciously invent authors and books that do not exist. My goal is to provide users with accurate and useful information based on the data I have been trained on.

It is important to emphasize that I can only give answers based on the information available to me. If I provide incorrect information, it may be because the information available to me is incorrect or out of date.

I take seriously the accusation of being a compulsive liar and a poet, but I want to point out that I can only provide information based on the knowledge I have been trained on. If incorrect information is given, it may be due to errors in the data I have been trained on or to incorrect information from the sources I use to generate answers. However, I take all feedback seriously and will always try to improve and give users the best answers possible.

But could it have happened that you have made up a book title or an author?

As an AI model, I cannot “make up” information or data on my own. I use the knowledge and data I have been trained on to generate answers to users’ questions. However, it is possible that I may provide incorrect or inaccurate information, either because the data I have been trained on is incorrect or out of date, or because I do not fully understand the question or the context around it.

If I provide incorrect information, it is important to remember that it is not due to any intention to mislead the user. It is important to use common sense and always verify the information I provide against other sources, especially when it comes to important and critical issues.

Please note that news has also been in contact with other libraries in Norway. At the University of Stavanger, a student searched for a book that did not exist. Another student at the same university asked for access to four research articles found via ChatGPT. These, too, listed an author who exists, but who had never written the titles the robot gave.

Library manager at the University of Southeast Norway, Kristin Østerholt, answers on behalf of all their campus libraries. They have not experienced similar incidents, but say they will be aware of the issue in the future.

– We certainly see that it can happen, writes Østerholt.


