TOKYO -- SoftBank will start providing an AI-based sign language translation service as soon as 2024 using a smartphone app that can turn 5,000 signs into Japanese text in one second or less, Nikkei has learned.
There are more than 300,000 people with hearing or speech impairments in Japan. SoftBank hopes the sign language translation service can promote smooth communication between hearing people who do not understand sign language and those with hearing impairments.
The system uses artificial intelligence to detect sign language during a video conversation and translate it into Japanese text, adding postpositional particles to form complete sentences. When a hearing person speaks, the AI automatically converts the speaker's words into text on screen. Because neither party has to type, the two can communicate naturally while seeing each other's facial expressions.
SoftBank in 2017 began basic research on the system with Tokyo's University of Electro-Communications. It has also collaborated with the Tokyo-based AI startup Abeja, in which Google invests.
Abeja trained the AI on 50,000 sign language videos, enabling it to recognize the characteristic movements of each signed word. SoftBank aims to develop the service in multiple languages in the future.
The Japanese telecom company has been offering the service on a trial basis to organizations for people with disabilities in Tokyo and Fukushima Prefecture since April, and a total of nine municipalities and organizations have started using it.
SoftBank will begin offering the service to health care facilities and public transportation companies within two years. It will roll out the service to the general public free of charge as early as 2024.
The company is now focused on improving the system's accuracy. As with spoken language, people who communicate in sign language sign at varying speeds, and the positions of their hands and arms also vary. The AI can translate signs with more than 90% accuracy, but when it is insufficiently trained, accuracy can fall below 50%. To raise the recognition rate, the AI needs to learn each word from the movements of more than 100 different people.
SoftBank in July began offering an app through which anybody can register samples of sign language. Tokyo-based AI company Preferred Networks provided technology that automatically generates computer-graphics samples from sign language videos. By displaying these samples for volunteers to imitate, the company is now recruiting people, including those who do not understand sign language, to register sign data.