Funk: What electrical and electronic engineers tell us about ChatGPT is a wake-up call
The Institute of Electrical and Electronics Engineers (IEEE) is the “world's largest technical professional organization dedicated to advancing technology for the benefit of humanity,” so when an article in its magazine Spectrum, by Craig S. Smith, is entitled “ChatGPT's Hallucinations Could Keep It from Succeeding,” we should listen, says Dr Jeffrey Funk.
Friday, 17 March, 2023 - 02:43
The article argues that these hallucinations are why experienced programmers must review the output from OpenAI’s Codex or GitHub’s Copilot, approving, correcting or rejecting it before allowing it to slip into a code base where it might wreak havoc.
The article says that “High school teachers are learning the same. A ChatGPT-written book report or historical essay may be a breeze to read but could easily contain erroneous “facts” that the student was too lazy to root out.”
The article also takes issue with Bill Gates, who has “mused that ChatGPT or similar large language models could some day provide medical advice to people without access to doctors.” But, as the article puts it, “you can’t trust advice from a machine prone to hallucinations.”
The article cites Yann LeCun, a pioneer in deep learning and a winner of the Turing Award. LeCun believes there is a fundamental flaw that leads to hallucinations. “Large language models have no idea of the underlying reality that language describes,” he said, adding that most human knowledge is non-linguistic. “Those systems generate text that sounds fine, grammatically, semantically, but they don’t really have some sort of objective other than just satisfying statistical consistency with the prompt.”
The article continues: “Humans operate on a lot of knowledge that is never written down, such as customs, beliefs or practices within a community that are acquired through observation or experience. And a skilled craftsperson may have tacit knowledge of their craft that is never written down.”
“Language is built on top of a massive amount of background knowledge that we all have in common, that we call common sense,” LeCun said. He believes that computers need to learn by observation to acquire this kind of non-linguistic knowledge.
“There is a limit to how smart they can be and how accurate they can be because they have no experience of the real world, which is really the underlying reality of language,” said LeCun. “Most of what we learn has nothing to do with language.”
OpenAI, of course, disagrees. Its chief scientist, one of the creators of ChatGPT, says he is confident that the problem will disappear with time as large language models learn to anchor their responses in reality. OpenAI has pioneered a technique to shape its models’ behaviour using reinforcement learning from human feedback (RLHF).
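The core idea behind RLHF is that a reward model, trained on human preference rankings, scores candidate responses, and the language model is then tuned to produce higher-scoring ones. As a loose, toy illustration only (this is not OpenAI's implementation, and the reward function below is entirely hypothetical), the reward-model reranking step can be sketched as a "best-of-n" selection:

```python
# Toy sketch of the reward-model idea behind RLHF (NOT OpenAI's code).
# In real RLHF the reward model is a trained neural network and the policy
# is updated by reinforcement learning; here a hand-written stand-in reward
# simply prefers candidate answers that appear to cite a source.

def toy_reward(response: str) -> float:
    """Hypothetical reward: favour source-citing, moderately long answers."""
    score = 0.0
    if "according to" in response.lower():  # crude proxy for grounding
        score += 1.0
    score += min(len(response.split()), 20) / 20.0  # capped length bonus
    return score

def best_of_n(candidates: list[str]) -> str:
    """Best-of-n selection: return the candidate the reward model scores highest."""
    return max(candidates, key=toy_reward)

candidates = [
    "The moon is made of cheese.",
    "According to NASA, the Moon's surface is mostly silicate rock.",
]
print(best_of_n(candidates))
```

The sketch compresses a training loop into a single reranking step; the point is only that human preference signals, encoded as a reward, can steer a model away from confident-sounding but ungrounded output.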
We will find out in the next few weeks, as GPT-4 is tested, how much those hallucinations have been reduced. OpenAI’s CEO already admits that GPT-4 hallucinates; the question is how much.
And how much more will it cost?
Further reading: https://spectrum.ieee.org/ai-hallucination
Dr. Jeffrey Funk (LinkedIn: https://www.linkedin.com/in/dr-jeffrey-funk-a979435/) has been at the forefront of understanding the relationship between people and technology for more than a quarter of a century. To do that, he has to understand, and to be able to explain, both people and technology. His research has shown what tech will succeed and what tech will not. As the hype of startups and technologies is eventually replaced by more rational thinking, he would like to help organisations pursue this rational thinking.
About this section
From FinTech to RegTech, from "AI" to security, from the terraverse to the metaverse, if there's a technology element, we're interested. But this is not an area for PR. It's an area for considered, structured articles that advance arguments. Think Op-Ed with a purpose.
Opinion pieces or "Op-Eds" are the home-made bombs of the publishing world. So long as they meet editorial standards, are not intentionally offensive with a view to causing hurt or insult and are relevant to our field of endeavour, we will look at submissions.
We like contentious; we like contrarian views. We don't like pretty much any -ism. We recognise that Opinion pieces are one person's view and are not balanced (if they are balanced and reach a reasoned conclusion, they are probably better suited to the Articles section). We do not like empty expressions (reaching out, going forward, circling back, etc.), acronyms or buzzwords. English only, please.
To submit an Opinion / Technology piece, please complete the Contact form.