Lost in Translation
Bad AI has a price, and in the case of the United States immigration system, it could cost people their freedom.
As The Guardian reports, immigration officials have been instructed to use free programs like Google Translate or Microsoft Translator to communicate with the people they detain, which can result in inaccurate or confusing information being given to detainees or put down on their applications.
One agency, Customs and Border Protection, has created its own translation app, known as "CBP One," but as the report notes, it can only translate to and from a handful of languages, and even in the tongues it recognizes, there are errors.
The report cites a number of examples, from an FAQ page coming back as a string of letters when the app was asked to translate it into Haitian Creole to asylum applications being denied over small grammatical discrepancies.
In one such case recounted by Ariel Koren, founder of the emergency interpreter network Respond Crisis Translation, an asylum-seeker trying to flee her abusive father described him in colloquial Spanish as "mi jefe," which the translation app rendered literally as her "boss." Her asylum application was denied as a result.
"Not only do the asylum applications have to be translated, but the government will frequently weaponize small language technicalities to justify deporting someone," Koren, who once worked at Google Translate, told The Guardian. "The application needs to be absolutely perfect."
Dialectics
If things are that bad for Spanish speakers, one can imagine they're worse for those who speak less widely supported languages. Indeed, as Respond Crisis Translation told The Guardian, Afghan refugees who speak Dari, one of Afghanistan's two official languages, often run into issues across agencies and translation apps. Google Translate, the report notes, doesn't recognize Dari at all.
"Afghan languages are not highly resourced in terms of technology, in particular local dialects," Uma Mirkhail, RCT's lead for Afghan languages, told the British newspaper. "It’s almost impossible for a machine to convey the same message that a professional interpreter with awareness about the country of origin can do, including cultural context."
With systemic biases against non-English speakers abounding in both government and machine learning, it's heartbreaking but not surprising that the tools meant to help immigration officials and the people in their charge communicate end up causing harm and headaches.
"AI translation tools should never be used in a way that is unsupervised," Koren, the RCT founder, said. "They should never be used to replace translators and interpreters and they should not be used in high-stakes situations — not in any language and especially not for languages that are marginalized."
More on AI fails: Gannett Promised to Be Super Responsible With AI Before Completely Bungling It