The Queensland-based Yugambeh Museum has worked with Google Arts and Culture to develop Woolaroo, an open-source, artificial intelligence-based digital language tool aimed at teaching and preserving endangered languages.
Built using Google Translate and Cloud Vision, the tool uses machine learning and image recognition to translate photos of objects into Indigenous languages in real time. If multiple objects are detected in a photo, users can scroll through and select the translation for each object.
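Based on that description, the flow resembles object labelling followed by a dictionary lookup. The sketch below is a minimal illustration of that idea using the Google Cloud Vision Python client; the YUGAMBEH_DICTIONARY mapping and translate_photo function are hypothetical stand-ins for Woolaroo's community-maintained word lists, not its actual implementation.

```python
# Minimal sketch, assuming a label-then-lookup pipeline: Cloud Vision labels
# the objects in a photo, and each English label is looked up in a
# community-supplied word list. Placeholder values only; not Woolaroo's code.
from google.cloud import vision

# Hypothetical English -> Yugambeh word list (placeholder entries).
YUGAMBEH_DICTIONARY = {
    "dog": "<yugambeh word for dog>",
    "water": "<yugambeh word for water>",
    "tree": "<yugambeh word for tree>",
}


def translate_photo(path: str) -> list[tuple[str, str]]:
    """Return (english_label, yugambeh_word) pairs for objects found in a photo."""
    client = vision.ImageAnnotatorClient()
    with open(path, "rb") as f:
        image = vision.Image(content=f.read())

    # Ask Cloud Vision for labels describing the objects in the image.
    response = client.label_detection(image=image)

    results = []
    for label in response.label_annotations:
        english = label.description.lower()
        if english in YUGAMBEH_DICTIONARY:
            results.append((english, YUGAMBEH_DICTIONARY[english]))
    return results


if __name__ == "__main__":
    for english, yugambeh in translate_photo("backyard.jpg"):
        print(f"{english} -> {yugambeh}")
```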
"Given the importance of Aboriginal language to Australian culture, we have the incentive to record the known but in particular new words our community members are using as the world evolves bringing us new technology we didn't have before," Yugambeh Museum CEO Rory O'Connor wrote in a blog post.
In addition to translating, Woolaroo is designed to encourage individuals and communities to contribute new words and audio recordings to help with pronunciation.
"Crucial to Indigenous communities is that Woolaroo puts the power to add, edit, and delete entries completely in their hands. So people can respond immediately to newly remembered words and phrases and add them directly," O'Connor said.
The languages supported on Woolaroo include Yugambeh, an Aboriginal language spoken in Queensland and New South Wales, as well as Louisiana Creole, Calabrian Greek, Māori, Nawat, Tamazight, Sicilian, Yang Zhuang, Rapa Nui, and Yiddish. These can be translated into English, French, or Spanish.
Earlier this year, Google revealed it was bringing a new feature to the Google Translate app that transcribes audio from one language into another in near real time.
At the time, Bryan Lin, an engineer on the Translate team, said the audio transcription feature would be available in the coming months.