Friday, November 17, 2023

Why ‘hallucinate’ is the Cambridge Dictionary’s Word of the Year for 2023 - The Indian Express - Dictionary

We explain the meaning of ‘hallucinate’, and why the Cambridge Dictionary chose it as its Word of the Year.

Hallucinating ‘false information’

“To seem to see, hear, feel, or smell something that does not exist, usually because of a health condition or because you have taken a drug.” This was the definition of ‘hallucinate’ in the Cambridge Dictionary until 2023.

But in the year when generative AI tools such as ChatGPT and Bard, which use large language models (LLMs), captured the world’s imagination, ‘hallucinate’ has taken on additional meaning. The Cambridge Dictionary, this year, made the following addition to its definition: “When an artificial intelligence hallucinates, it produces false information.”

As we live through the nascency of generative AI, with all its deficiencies and limitations, hallucinations are all too common. They vary in form, sometimes utterly nonsensical and at other times seemingly plausible, but at their core they are falsehoods that AIs present confidently in response to a prompt.

Why AIs hallucinate

Generative AI produces output based on past data. When given a prompt — a piece of text, an image, or even a piece of computer code — the AI generates an appropriate response based on the information it has been trained on. Popular tools such as ChatGPT and Bard use LLMs, which learn from enormous volumes of data. Effectively, they attempt to recreate human thought and linguistic expression by ‘learning’ from millions of human-created sources.
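The idea that a language model generates text by reproducing statistical patterns from its training data, fluently but with no built-in notion of truth, can be illustrated with a deliberately tiny sketch. The toy bigram model below is an illustration of the principle only; the training text and function names are invented for this example, and real LLMs are incomparably larger and more sophisticated.

```python
import random
from collections import defaultdict

# Toy "language model": record which word follows which in the
# training text, then generate by sampling those observed pairs.
# Output mimics the form of the training data; nothing checks truth.
training_text = (
    "the judge fined the firm . "
    "the firm cited the cases . "
    "the cases were fictitious ."
)

pairs = defaultdict(list)
words = training_text.split()
for a, b in zip(words, words[1:]):
    pairs[a].append(b)

def generate(start, n=6, seed=0):
    """Sample up to n continuations, always choosing a word that
    followed the current word somewhere in the training text."""
    random.seed(seed)
    out = [start]
    for _ in range(n):
        options = pairs.get(out[-1])
        if not options:
            break
        out.append(random.choice(options))
    return " ".join(out)

print(generate("the"))
```

Every word the sketch emits appeared in its training data, yet the sentences it strings together need not describe anything real, which is the hallucination problem in miniature.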

“At their best, large language models can only be as reliable as their training data,” Wendalyn Nichols, Cambridge Dictionary’s Publishing Manager, said in a statement. “AIs are fantastic at churning through huge amounts of data to extract specific information and consolidate it. But the more original you ask them to be, the likelier they are to go astray,” she said.

AIs can learn from factually inaccurate sources, or produce inaccuracies themselves while processing information. Either way, hallucinations can have real-world consequences. For instance, a US law firm used ChatGPT for legal research, which led to fictitious cases being cited in court. The judge fined the firm $5,000 for the mistake.

Why Cambridge Dictionary chose ‘hallucinate’

The Cambridge Dictionary Word of the Year has been published every year since 2015. A team makes the decision based on data about which words saw heavy usage during the year, together with their cultural salience.

“The Cambridge Dictionary team chose hallucinate as its Word of the Year 2023 as it recognised that the new meaning gets to the heart of why people are talking about AI,” the company’s statement announcing the decision on Wednesday read. “Generative AI is a powerful tool but one we’re all still learning how to interact with safely and effectively — this means being aware of both its potential strengths and its current weaknesses,” it said.

What Is 2023's Word of the Year, According to Dictionaries? - Reader's Digest - Dictionary

What word defined the past year for you? Find out if your guess matches the actual word of the year from two prominent dictionaries.

If there was one word that described the year you just had, what would it be? Maybe it’s one of the 690 new words and phrases Merriam-Webster just added to its dictionary—like beast mode for the workout routine you’ve kept up with since New Year’s, or chef’s kiss after you finally figured out how to make the perfect chocolate chip cookie. Or maybe it’s something you heard repeatedly in conversation, like a cultural trend, food, acronym or even a concept like the Roman Empire. (Yes, the actual Roman Empire.)

Whatever your personal word of the year may be, some words tend to be more broadly significant and influential, according to the world’s most prominent dictionaries. These are the terms that might refer to a cultural zeitgeist, a controversy or our larger thoughts (and often anxieties) about the world. Cambridge Dictionary and Collins Dictionary recently revealed their words of the year for 2023, and they have all these components. Can you guess what they are? Read on to find out.

Cambridge Dictionary’s word of the year

Cambridge Dictionary’s pick for 2023 is—drum roll, please—hallucinate. You might be scratching your head right now, since this isn’t a new word, per se. Of course, the common definition of hallucinate is “to seem to see, hear, feel or smell something that does not exist, usually because of a health condition or because you have taken a drug.” In 2023, however, to hallucinate can mean something different, thanks to AI.

According to Cambridge’s alternate definition of hallucinating, “when an artificial intelligence (= a computer system that has some of the qualities that the human brain has, such as the ability to produce language in a way that seems human) hallucinates, it produces false information.” Sure, using AI can be fun for creating dog selfies or could even help you land a job, but it’s prone to producing misleading or made-up facts—or “hallucinating.”

Which means we’ll likely be dealing with more misinformation, at least in the near future. “The fact that AIs can ‘hallucinate’ reminds us that humans still need to bring their critical-thinking skills to the use of these tools,” notes Wendalyn Nichols, Cambridge Dictionary’s publishing manager. Translation: Don’t believe everything you read—or that AI tells you!

Collins Dictionary’s word of the year

Those dictionary people are apparently (ahem) all on the same page—or at least the editors at Cambridge and Collins are. Similar to Cambridge’s choice, Collins Dictionary’s word of the year is the broader term AI. Collins chose this term because it is “considered to be the next great technological revolution,” “has seen rapid development” and “has been much talked about in 2023.”

ChatGPT was released in late 2022, and as companies attempted to use it to cut costs, employees worried it would replace their jobs. AI pioneers and creators began expressing concerns that AI could be “dangerous” and manipulated by “bad actors.” President Biden even issued an executive order on “safe, secure and trustworthy artificial intelligence.”

And then, of course, there were the months-long writers’ and actors’ strikes, much of which hinged on the potential use of AI. While humans eventually won in both scenarios—with the new contracts stating that AI is not allowed to write or rewrite content or use an actor’s likeness in a way that the actor didn’t originally agree to—this is likely the first battle of many to come, in a variety of industries.


Best books of 2023 — Fiction in translation - Financial Times - Translation

Thursday, November 16, 2023

OpenAI Gains Traction as New Default for Machine Translation - Slator - Translation

Just a year after OpenAI’s November 2022 unveiling of ChatGPT, the company and its flagship product are used as shorthand for large language models and generative AI.

Beyond industry-specific discussions about the tech’s potential impact and politicians’ debates over the need for regulatory legislation, the poster child for generative AI may be on its way to mainstream acceptance as a provider of machine translation (MT). 

Starting November 30, 2023, users of customer communication platform Messagepoint’s content hub can translate text into more than 80 languages, using either DeepL or OpenAI’s services.

While DeepL is a natural choice as a category leader in machine translation, OpenAI is not only a newcomer, but also a generalist compared to models trained specifically to perform MT. If OpenAI’s translation integration indicates widespread confidence in the technology, could this be the start of a transition to generative AI as a standard, or even default, MT provider? 
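The setup described here, where a user picks either DeepL or OpenAI as the machine-translation backend, amounts to a pluggable-provider design. The sketch below is hypothetical: the provider functions are illustrative stand-ins, not the real Messagepoint, DeepL, or OpenAI APIs.

```python
from typing import Callable, Dict

# A translator takes (text, target_language) and returns translated text.
Translator = Callable[[str, str], str]

# Placeholder backends: real integrations would call the vendors' APIs.
def fake_deepl(text: str, target_lang: str) -> str:
    return f"[deepl:{target_lang}] {text}"

def fake_openai(text: str, target_lang: str) -> str:
    return f"[openai:{target_lang}] {text}"

# Registry of available providers; adding one is a single entry here.
PROVIDERS: Dict[str, Translator] = {
    "deepl": fake_deepl,
    "openai": fake_openai,
}

def translate(text: str, target_lang: str, provider: str = "deepl") -> str:
    """Route a translation request to the user-selected backend."""
    try:
        backend = PROVIDERS[provider]
    except KeyError:
        raise ValueError(f"unknown provider: {provider!r}") from None
    return backend(text, target_lang)

print(translate("Hello", "de", provider="openai"))  # prints "[openai:de] Hello"
```

The point of the registry is that switching the default, or adding a third provider, changes one dictionary entry rather than any calling code.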

Case in point: Toronto-headquartered Messagepoint serves healthcare, financial, and insurance companies, including clients such as Xerox and CitiBank. In a November 13, 2023 press release, the company announced new generative AI capabilities for its Intelligent Content Hub, under an Assisted Authoring tool.

Messagepoint VP of AI & Data Science Atif Khan told Slator, “While DeepL has a long history in AI translation services, OpenAI provides an extended set of languages supported (more than 80) and takes advantage of contextual understanding that goes beyond the current DeepL implementation.”

In the press release, Founder and CEO Steve Biancaniello assured users that the platform’s “controlled environment” allows them to benefit from the speed and accuracy of AI translation without introducing risk. 

“These capabilities represent a massive opportunity for organizations to better serve vulnerable populations and those with limited English proficiency,” Biancaniello said, adding, “Leveraging AI to support translation can greatly accelerate processes, reducing the cost and time required.” 

New Tibetan translation software harnesses AI to preserve language - Radio Free Asia - Translation

Developers hope that a new Tibetan-language AI software tool will help preserve Tibetans’ vast repository of cultural heritage, including literature, history, music and Buddhist texts, against China’s ever-encroaching efforts to erode the Tibetan language, which include banning Tibetan-language instruction in schools in some areas.

Tibetan is widely spoken in the Himalayan region, used not only in the Tibet Autonomous Region, but also in western parts of China, northern Pakistan, Nepal, Bhutan and parts of India. But its 30-letter syllabic alphabet, script format, lack of punctuation and several dialects can make translation difficult. 

Considered a breakthrough in Tibetan education software development, the software created by the Monlam Tibetan IT Research Centre uses artificial intelligence to translate written and spoken Tibetan into English, Chinese and other languages faster and more accurately than any existing translation software.

The final prototype of the software, dubbed Monlam AI, was presented to the Dalai Lama, the spiritual leader of Tibetan Buddhism, on Nov. 3, in Dharamsala, India, according to the Central Tibetan Administration, the formal name of the government-in-exile.

“One of the many capabilities of this AI tool is that it will increase the efficiency and accuracy of translating Tibetan religious texts, teachings and literary writings,” said Geshe Lobsang Monlam, founder and CEO of the center.

“Also, in the initial phase of experimenting with this AI tool, some Tibetan and non-Tibetan translators have observed that these tools will not only speed up the process, but also facilitate a better setting in this fast-evolving environment,” Lobsang said.

Technically speaking, the software offers users access to four machine learning models comprising machine translation, optical character recognition, speech-to-text and text-to-speech functionalities.
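A system exposing those four capabilities behind a single entry point can be pictured as a simple task dispatcher. The sketch below is purely illustrative; the function names are placeholders and do not reflect the actual Monlam AI interface.

```python
# Placeholder implementations for the four capabilities the article
# lists: machine translation, optical character recognition,
# speech-to-text and text-to-speech. Each would wrap a real model.
def machine_translation(payload: str) -> str:
    return f"translated({payload})"

def optical_character_recognition(payload: str) -> str:
    return f"text_from_image({payload})"

def speech_to_text(payload: str) -> str:
    return f"transcript({payload})"

def text_to_speech(payload: str) -> str:
    return f"audio({payload})"

# One table maps a task name to the model that handles it.
MODELS = {
    "mt": machine_translation,
    "ocr": optical_character_recognition,
    "stt": speech_to_text,
    "tts": text_to_speech,
}

def run_task(task: str, payload: str) -> str:
    """Dispatch a request to the model registered for the task."""
    if task not in MODELS:
        raise ValueError(f"unsupported task: {task}")
    return MODELS[task](payload)

print(run_task("mt", "sample Tibetan text"))  # prints "translated(sample Tibetan text)"
```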

Developers are also working on other functionalities to recognize Tibetan religious manuscripts within images carved on wood and convert them into machine-encoded text, he said. 

“The launch of Monlam AI represents a significant step forward for the Tibetan community, as it embraces modern technology to preserve its cultural heritage and facilitate communication in the digital age,” said a statement issued by the Tibet Rights Collective, an India-based advocacy and policy research group that aims to increase access to information about Tibetan politics, culture and language.

Chinese officials have denied any human rights violations against Tibetans or efforts to prevent them from practicing Buddhism.

Lobsang founded his company in April 2012 to focus on developing software, fonts and other digital tools related to the Tibetan language and culture. 

He contributed to the design or standardization of fonts — which play a crucial role in representing written languages on computers and other digital platforms — for Tibetan script, developing the first Monlam Tibetan Font in 2005.  

In 2022, Lobsang and a team of more than 150 editors and staff published the Grand Monlam Tibetan Dictionary, containing Tibetan-language definitions for over 360,000 words; in print, it totaled 223 volumes. The dictionary has given rise to 37 apps and a website.

The nine-year project, undertaken with support from the Dalai Lama Trust, has helped preserve and disseminate Tibetan Buddhist teachings and served as a reference tool for general use. 

Translated by Tenzin Dickyi for RFA Tibetan. Edited by Roseanne Gerin and Malcolm Foster.

Wednesday, November 15, 2023

Hallucinate is Cambridge Dictionary AI-inspired word of 2023 - BBC.com - Dictionary

Hallucinate is the Cambridge Dictionary's word of the year, as it gains an additional definition in one of many AI-related updates in 2023.

The traditional definition is "to seem to see, hear, feel, or smell something that does not exist".

It now includes "when an artificial intelligence (AI) hallucinates, it produces false information".

AI ethicist Dr Henry Shevlin said it was "a snapshot of how we're thinking about and anthropomorphising AI".

Dr Shevlin, from the University of Cambridge, said: "Inaccurate or misleading information has long been with us, of course, whether in the form of rumours, propaganda, or fake news.

"Whereas these are normally thought of as human products, hallucinate is an evocative verb implying an agent experiencing a disconnect from reality.

"This linguistic choice reflects a subtle yet profound shift in perception: the AI, not the user, is the one hallucinating."

The definition was added after a surge in interest in generative AI tools like ChatGPT, Bard and Grok.

A US law firm used ChatGPT for legal research, which led to fictitious cases being cited in court, Cambridge Dictionary said.

Wendalyn Nichols, Cambridge Dictionary's publishing manager, said: "The fact that AIs can hallucinate reminds us that humans still need to bring their critical thinking skills to the use of these tools.

"AIs are fantastic at churning through huge amounts of data to extract specific information and consolidate it - but the more original you ask them to be, the likelier they are to go astray."

Prompt engineering, large language model and GenAI were among about 6,000 new words and definitions also added in 2023.

Words that saw spikes in searches on the online dictionary included implosion, after the Titan submersible's implosion in June, and GOAT, an abbreviation of "greatest of all time".

The Qatar World Cup provoked debates about who was the GOAT in football, Lionel Messi, Cristiano Ronaldo or one of the late greats like Pelé or Diego Maradona.

The dictionary is published by Cambridge University Press & Assessment, part of the University of Cambridge.

Tuesday, November 14, 2023

Cambridge Dictionary reveals word of the year – and it has a new meaning thanks to AI - Sky News - Dictionary

Cambridge Dictionary has declared “hallucinate” as the word of the year for 2023 – while giving the term an additional, new meaning relating to artificial intelligence technology.

The traditional definition of "hallucinate" is when someone seems to sense something that does not exist, usually because of a health condition or drug-taking, but it now also relates to AI producing false information.

The additional Cambridge Dictionary definition reads: "When an artificial intelligence (= a computer system that has some of the qualities that the human brain has, such as the ability to produce language in a way that seems human) hallucinates, it produces false information."

This year has seen a surge in interest in AI tools such as ChatGPT. The accessible chatbot has even been used by a British judge to write part of a court ruling, while an author told Sky News how it was helping with their novels.

However, it doesn't always deliver reliable and fact-checked prose.

AI hallucinations, also known as confabulations, occur when the tools provide false information, ranging from suggestions that seem perfectly plausible to ones that are completely nonsensical.

Wendalyn Nichols, Cambridge Dictionary's publishing manager, said: "The fact that AIs can 'hallucinate' reminds us that humans still need to bring their critical thinking skills to the use of these tools.

"AIs are fantastic at churning through huge amounts of data to extract specific information and consolidate it. But the more original you ask them to be, the likelier they are to go astray."

Adding that AI tools using large language models (LLMs) "can only be as reliable as their training data", she concluded: "Human expertise is arguably more important - and sought after - than ever, to create the authoritative and up-to-date information that LLMs can be trained on."

AI can hallucinate in a confident and believable manner - which has already had real-world impacts.

A US law firm cited fictitious cases in court after using ChatGPT for legal research while Google's promotional video for its AI chatbot Bard made a factual error about the James Webb Space Telescope.

'A profound shift in perception'

Dr Henry Shevlin, an AI ethicist at Cambridge University, said: "The widespread use of the term 'hallucinate' to refer to mistakes by systems like ChatGPT provides [...] a fascinating snapshot of how we're anthropomorphising AI."

"'Hallucinate' is an evocative verb implying an agent experiencing a disconnect from reality," he continued. "This linguistic choice reflects a subtle yet profound shift in perception: the AI, not the user, is the one 'hallucinating'.

"While this doesn't suggest a widespread belief in AI sentience, it underscores our readiness to ascribe human-like attributes to AI.

"As this decade progresses, I expect our psychological vocabulary will be further extended to encompass the strange abilities of the new intelligences we're creating."
