We explain the meaning of ‘hallucinate’, and why the Cambridge Dictionary chose it as its Word of the Year.
Hallucinating ‘false information’
“To seem to see, hear, feel, or smell something that does not exist, usually because of a health condition or because you have taken a drug.” This was the definition of ‘hallucinate’ in the Cambridge Dictionary before 2023.
But in the year when generative AI tools such as ChatGPT and Bard, which use large language models (LLMs), captured the world’s imagination, ‘hallucinate’ has taken on an additional meaning. This year, the Cambridge Dictionary added the following to its definition: “When an artificial intelligence hallucinates, it produces false information.”
As we live through the nascency of generative AI, with all its deficiencies and limitations, hallucinations are all too common. They vary in form: sometimes utterly nonsensical, at other times seemingly plausible. But at their core, they are falsehoods presented confidently by an AI in response to a prompt.
Why AIs hallucinate
Generative AI produces output based on past data. When given a prompt — a piece of text, an image, or even a snippet of computer code — the AI generates a response based on the information it has been trained on. Popular tools such as ChatGPT and Bard use LLMs, which learn from enormous amounts of data. Effectively, they (attempt to) recreate human thought and linguistic expression by ‘learning’ from millions of (human-created) sources.
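To make the idea concrete, here is a minimal sketch of next-word prediction. Real LLMs use neural networks trained on billions of documents, not the simple bigram counts below, and the toy corpus and function names here are invented purely for illustration. But the underlying principle is similar: the model chains together words that are statistically plausible given its training data, with no built-in notion of whether the resulting sentence is true.

```python
import random
from collections import defaultdict, Counter

# Toy corpus standing in for training data (invented for illustration).
# Real LLMs learn from billions of documents, but the principle is the
# same: learn which words tend to follow which.
corpus = (
    "the judge fined the firm . "
    "the firm cited the cases . "
    "the cases were fictitious . "
    "the judge cited the firm ."
).split()

# Count how often each word follows each other word (a bigram model).
transitions = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    transitions[prev][nxt] += 1

def generate(start: str, max_words: int = 10) -> str:
    """Sample a continuation one word at a time, always choosing among
    words the model has seen follow the current word."""
    words = [start]
    for _ in range(max_words):
        followers = transitions.get(words[-1])
        if not followers:
            break
        choices, weights = zip(*followers.items())
        nxt = random.choices(choices, weights=weights)[0]
        if nxt == ".":
            break
        words.append(nxt)
    return " ".join(words)

print(generate("the"))
# Possible output: "the judge cited the cases" -- fluent and
# statistically plausible, yet a statement that appears nowhere in the
# training corpus: a tiny analogue of a hallucination.
```

Scaling this idea up by many orders of magnitude makes the output far more fluent and far more often correct, but the mechanism remains statistical pattern-matching rather than fact-checking, which is why hallucinations persist even in the most capable models.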
“At their best, large language models can only be as reliable as their training data,” Wendalyn Nichols, Cambridge Dictionary’s Publishing Manager, said in a statement. “AIs are fantastic at churning through huge amounts of data to extract specific information and consolidate it. But the more original you ask them to be, the likelier they are to go astray,” she said.
AIs can learn from factually inaccurate sources, or produce inaccuracies themselves while processing information. Either way, hallucinations can have real-world consequences. For instance, a US law firm used ChatGPT for legal research, which led to fictitious cases being cited in court. The judge fined the firm $5,000 for the mistake.
Why Cambridge Dictionary chose ‘hallucinate’
The Cambridge Dictionary has published a Word of the Year every year since 2015. A team makes the decision based on data about which words saw heavy usage during the year, along with their cultural salience.
“The Cambridge Dictionary team chose hallucinate as its Word of the Year 2023 as it recognised that the new meaning gets to the heart of why people are talking about AI,” the company’s statement announcing the decision on Wednesday read. “Generative AI is a powerful tool but one we’re all still learning how to interact with safely and effectively — this means being aware of both its potential strengths and its current weaknesses,” it said.