Wednesday, May 15, 2024

JC Students Receive Free Dictionaries as Part of The Dictionary Project - WICZ - Dictionary

Japanese words added to Oxford English Dictionary | NHK WORLD-JAPAN News - NHK WORLD - Dictionary

Tuesday, May 14, 2024

OpenAI's New GPT-4o Can Provide Realtime Language Translation With a Simple Verbal Request - Laughing Squid - Translation

OpenAI launched GPT-4o, its new “flagship model that can reason across audio, vision, and text in real time,” which includes a very useful realtime language translation feature. A user simply tells the AI which languages to translate between, and the translation kicks in.

Hey Chat GPT, I’m here with my co-worker today.  We’d like you to act as a translator for us so every time I say something in English can you repeat it back in Spanish and every time he says something in Spanish can you repeat it back in English?

A Live Demo of the Translation Feature

OpenAI CTO Mira Murati demonstrated the realtime translation live while also providing an overall explanation of GPT-4o.

Introducing GPT-4o

With GPT-4o, we trained a single new model end-to-end across text, vision, and audio, meaning that all inputs and outputs are processed by the same neural network. Because GPT-4o is our first model combining all of these modalities, we are still just scratching the surface of exploring what the model can do and its limitations.

Other GPT-4o Features

Some other GPT-4o features include accessibility, singing, vocal inflection, teaching a language, and lots of personality.

Monday, May 13, 2024

OpenAI Just Killed Google Translate with GPT-4o - Analytics India Magazine - Translation

At the OpenAI Spring Update, OpenAI CTO Mira Murati unveiled GPT-4o, a new flagship model that enriches its suite with ‘omni’ capabilities across text, vision, and audio, promising iterative rollouts to enhance both developer and consumer products in the coming weeks. 

“They are releasing a combined text-audio-vision model that processes all three modalities in one single neural network, which can then do real-time voice translation as a special case afterthought, if you ask it to,” said former OpenAI computer scientist Andrej Karpathy, who was quick to respond to the release. 

“The new voice (and video) mode is the best compute interface I’ve ever used. It feels like AI from the movies; and it’s still a bit surprising to me that it’s real. Getting to human-level response times and expressiveness turns out to be a big change,” said OpenAI chief Sam Altman, who wants to bring ‘Universal Basic Compute’ to everyone in the world. 

Further, he said that the original ChatGPT hinted at what was possible with language interfaces; “this new thing feels viscerally different. It is fast, smart, fun, natural, and helpful.” 

Altman said that talking to a computer has never felt really natural for him. “Now it does,” he said, hopeful about the future where people will be using computers to do more than ever before. 

What’s really interesting about GPT-4o is that it will be available to ChatGPT Plus (with some personalisation features) and ChatGPT free users soon. “We are a business and will find plenty of things to charge for, and that will help us provide free, outstanding AI service to (hopefully) billions of people,” said Altman. 

“Thanks to Jensen and the NVIDIA team for bringing us the most advanced GPUs to make this demo possible today,” said Murati during her closing remarks. 

Meanwhile, OpenAI president and co-founder Greg Brockman also demonstrated human-computer interaction (and even human-computer-computer), giving users a glimpse of pre-AGI vibes. 

RIP Google Translate? 

In the demonstration of GPT-4o’s real-time translation capabilities, the model seamlessly translated between English and Italian, exemplifying its sophisticated linguistic adaptability. Many believe that this new feature of OpenAI is likely to replace Google Translate. 

“OpenAI just killed Google Translate with their real-time translator (near 0 delay in response),” said Fraser.

Meanwhile, Google is getting ready to make some major announcements tomorrow at Google I/O. “Super excited for my first Google I/O tomorrow and to share what we’ve been working on!” posted Google DeepMind chief Demis Hassabis, offering a similar glimpse of the company’s multimodal AI assistant.

Not just Google, but many were quick to point out the end of many AI startups offering similar solutions and features. 

“OpenAI just shot Rabbit in the face,” said AI developer Benjamin De Kraker. 

Interestingly, OpenAI also announced the launch of the GPT-4o API, which developers can use to build new products and solutions. 
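
For developers curious about what that looks like in practice, here is a minimal sketch of a call to the new model that approximates the English-Italian translator from the demo in text only; the model name “gpt-4o” comes from the announcement, while the prompt wording and the sample sentences are illustrative assumptions rather than anything OpenAI published.

    # Minimal sketch: a text-only approximation of the two-way translator
    # shown in the GPT-4o demo, using OpenAI's Python client (openai >= 1.0).
    # The system prompt and sample sentences are illustrative, not from OpenAI.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    SYSTEM_PROMPT = (
        "You are acting as a translator. When the user writes English, "
        "reply with the Italian translation. When the user writes Italian, "
        "reply with the English translation. Reply with the translation only."
    )

    def translate(utterance: str) -> str:
        """Send one utterance to GPT-4o and return its translation."""
        response = client.chat.completions.create(
            model="gpt-4o",
            messages=[
                {"role": "system", "content": SYSTEM_PROMPT},
                {"role": "user", "content": utterance},
            ],
        )
        return response.choices[0].message.content

    print(translate("Hey, how have you been?"))    # English in, Italian out
    print(translate("Tutto bene, grazie. E tu?"))  # Italian in, English out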

Meanwhile, Hume AI, which released EVI (Empathic Voice Interface), also felt the pressure, prompting it to launch its API today alongside other planned improvements.

Improves Non-English Language Performance

OpenAI has also expanded the model’s language capabilities, supporting more than 50 languages, including Indian languages. GPT-4o significantly optimises token usage for Indian languages, cutting token counts for Gujarati by 4.4x, Telugu by 3.5x, Tamil by 3.3x, and Marathi and Hindi by 2.9x.
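
For readers who want to sanity-check those ratios, the sketch below compares the older cl100k_base tokenizer (used by GPT-4 and GPT-3.5) with the o200k_base tokenizer used by GPT-4o; it assumes a recent tiktoken release that ships both encodings, and the Hindi sample sentence is an arbitrary choice.

    # Compare token counts between the GPT-4 tokenizer and the GPT-4o tokenizer.
    # Requires a tiktoken version recent enough to include "o200k_base".
    import tiktoken

    old_enc = tiktoken.get_encoding("cl100k_base")  # GPT-4 / GPT-3.5 tokenizer
    new_enc = tiktoken.get_encoding("o200k_base")   # GPT-4o tokenizer

    sample = "नमस्ते, आप कैसे हैं?"  # Hindi: "Hello, how are you?"

    old_count = len(old_enc.encode(sample))
    new_count = len(new_enc.encode(sample))
    print(f"cl100k_base: {old_count} tokens, o200k_base: {new_count} tokens")
    print(f"reduction: {old_count / new_count:.1f}x")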

GPT-4o can engage in natural, real-time voice conversations, and users can also converse with ChatGPT over real-time video. The model understands the emotional tone of the speaker and can adjust its own tone and modulation accordingly.

Moreover, the latest model can understand and discuss images, allowing users to take a picture of a menu in a foreign language and translate it, learn about the food’s history and significance, and receive recommendations.
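
That menu scenario maps onto image inputs in the same chat completions API; the short sketch below assumes the standard image_url content format, and the photo URL is only a placeholder.

    # Sketch: asking GPT-4o to translate and explain a photographed menu.
    # The image URL below is a placeholder, not a real resource.
    from openai import OpenAI

    client = OpenAI()

    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[{
            "role": "user",
            "content": [
                {"type": "text",
                 "text": "Translate this menu into English and briefly explain "
                         "any dishes with a notable history."},
                {"type": "image_url",
                 "image_url": {"url": "https://example.com/menu-photo.jpg"}},
            ],
        }],
    )
    print(response.choices[0].message.content)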

One Step Closer to Autonomous Agents

Another interesting update was OpenAI’s announcement of the ChatGPT (GPT-4o) desktop app, which can read your screen in real-time. The app allows for voice conversations, screenshot discussions, and instant access to ChatGPT.

When will GPT-4 ‘Omni’ Arrive? 

GPT-4o’s text and image capabilities are starting to roll out today in ChatGPT. Developers can now access GPT-4o in the API as a text and vision model.

The company is rolling out GPT-4o to ChatGPT Plus and Team users, with Enterprise users to follow soon. ChatGPT Free users will also have access to advanced tools, including features like GPT-4 level intelligence, web responses, data analysis, and file uploads.

However, ChatGPT Free users will have a message limit, which will increase as usage and demand grow. When the limit is reached, the app will automatically switch to GPT-3.5 so conversations can continue uninterrupted.

Last but not least, the company has also introduced a simplified look and feel for ChatGPT, featuring a new home screen, message layout, and more. The new design is meant to be friendlier and more conversational.

Trenton’s translation ordinance meets obstacles on first introduction - The Trentonian - Translation

An ordinance proposed to support the translation of important City of Trenton documents from English to Spanish stalled last week amid a lack of collaboration between City Council members and the administration.

Law Director Wes Bridges and Business Administrator Adam Cruz said city council members delivered a proposal prematurely.

“This really provides a good example of (the importance of) city council and the administration working collaboratively. It’s imperative that when city council makes these proposals that they speak with the administration,” Cruz advised.

“You could have a great idea, and, this is a great idea. But we have to budget for this, make sure we have the resources for this.” Cruz expressed a need for definitive dollar amounts and specifics for city documents being presented for translation.

At-large Councilwoman Yazminelly Gonzalez sponsored the legislation. She sounded perturbed by questions being asked by several colleagues.

“I sent an email to every member of city council, asking for ideas and suggestions before the rollout. I didn’t get a response from anyone. I didn’t know if anyone had any ideas in regard to this (ordinance) or not,” Gonzalez explained.

Bridges echoed Cruz’s assessment that city council needs to deliver concise legislation before those items make their way onto the docket.

“While we have a great working relationship with city council, this is again an example why it’s imperative when you’re coming and planning these before your governing body for a vote, if it’s something that’s going to require budget (action), something that the administration must put forward, we’re obviously always supportive of council to put these things forward — it’s imperative we work in tandem” to advance initiatives, Bridges said.

He added that both parties should act “conjunctively and not just unilaterally, so we know there’s a concrete plan in place to make sure (proposals) happen but also to assure there’s funding available.”

Bridges said a continued working relationship between city council and the administration requires conversations before items reach docket stage.

City Council expects to reintroduce the legislation shortly. The translation ordinance eventually will extend to other languages.

Sigma Slang Definition: What Does 'Sigma’ Mean? - TODAY - Dictionary

What the sigma?

Some know “sigma” as the 18th letter of the Greek alphabet, but it’s also teen slang for a cool dude.

According to Know Your Meme, the term refers to “a supposed classification for men who are successful and popular, but also silent and rebellious.” Sigma males are “considered ‘equal’ to Alphas on the hierarchy but live outside of the hierarchy by choice,” reads the website.

Urban Dictionary adds that sigma “is what all 10 year olds think they are.” As reported by British GQ, the word “sigma” was born from the misogynistic “manosphere.”

What does ‘sigma’ mean?

Philip Lindsay, a special education math teacher in Payson, Arizona, broke down “Sigma” on TikTok.

“There’s this group of people who have this hierarchy for males — there’s ‘alpha’ and there’s ‘sigma,’” Lindsay said in a video. “This is a group of people that mainly ranks males based on looks, success, that whole thing.”

Lindsay added, “So they have the ‘alpha’ which is the most successful, the best looking and then they have ‘sigma’ which is the same thing as an alpha but humbler.”

Another definition for “sigma,” says Lindsay, is “the best.”

“Kids use Alpha and Sigma interchangeably,” Lindsay tells TODAY.com. “They don’t make much of a distinction between being humble or not, even though that’s (technically) the definition.”

Lindsay clarifies, “Beta is an insult. (It means) ‘You’re inferior to me and I’m better than you.'”

‘What the sigma?’

According to Lindsay, “What the sigma?” is traced to a SpongeBob SquarePants internet meme (which a spokesperson of Nickelodeon tells TODAY.com is fan-created).

In the video, Squidward and SpongeBob, characters from the cartoon “SpongeBob SquarePants,” watch footage of a waffle cone dipped in chocolate syrup and sprinkles.

“That looks insane,” said SpongeBob in the video.

“That also looks very unhealthy,” replied Squidward, adding, “Erm, what the sigma?”

Lindsay said “sigma” comes from the same culture as mewing and looksmaxing, trends that, according to the New York Times, claim to help teens look better.

One example of looksmaxing is “mewing,” in which teens flatten their tongues against the roofs of their mouths to supposedly eliminate a double chin, a method the American Association of Orthodontists says is not scientific. The teen version of “mewing” is making a “hush” gesture and touching the jawline to mean, “I can’t talk.”

Lindsay tells TODAY.com that “sigma” is a classroom trend.

“They say, ‘Are you sigma Mr. Lindsay?’ or ‘Yo, that’s so sigma’ when I do something that pleases them like (assigning) math problems (to solve) with an online game,” he says, adding that in his classroom, “Sigma is going strong.”
