WeChat Apologizes for Mistranslation; Foreign Media Say AI Must Be Taught to Recognize "Bad Things"

(Original title: WeChat Apologizes for Improper Translation of Foreign Media: AI Should Be Taught to Recognize "Bad Things")

According to a report from foreign media outlet Gizmodo, WeChat recently issued an apology after its automatic translation mistakenly rendered the term "black foreigner" as "black nigger." This incident has sparked broader discussions within the industry about how to train artificial intelligence (AI) to recognize and avoid racist language.

On Wednesday, Ann James, a Black film director, sent a message on WeChat saying that she was running late. When a colleague responded in Chinese, WeChat's AI translation tool rendered the reply as "The nigger is late." James later recalled that WeChat's automatic translation had sometimes rendered neutral Chinese terms as the offensive word beginning with "n."

Photo: WeChat Apologizes for Translating “Black People” as “Black Ghosts”

Note: The word "nigger" is a derogatory term used to refer to Black people, often implying low social status. It is highly offensive and should be avoided at all costs.

The term "nigger" originates from the Latin word "niger," meaning "black," and "nigra," which can carry a more negative connotation. Over time, it evolved into an insulting nickname in English, often used to demean Black individuals.

A few days ago, WeChat released an official statement addressing the issue: "We sincerely apologize for the inappropriate translation. Upon receiving user feedback, we immediately resolved the problem." The company also explained that its translation system relies on artificial intelligence, which is trained on vast amounts of text. However, without proper contextual understanding, such systems may misinterpret or misuse sensitive language.

Pattern recognition is a core component of language AI: these systems learn associations from the words that appear together in their training data. In 2016, researchers probing a Google text algorithm found some alarming results. It paired the typically white name "Emily" with the typically Black name "Ebony," and "pancakes" with "fried chicken"; it likewise linked "man" to "doctor" but "woman" to "nurse."
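As a rough illustration of the kind of pattern-learning described above, the sketch below (Python, with an invented toy corpus; all names and data are placeholders, not WeChat's or Google's actual system) counts which words co-occur in training sentences. Biased text produces biased statistics, and the model simply reflects them.

```python
from collections import Counter
from itertools import combinations

# Toy "training data": four invented sentences. A real system trains on
# billions of sentences, but the principle is the same - the model learns
# whatever associations the text contains.
corpus = [
    "the doctor said he would review the chart",
    "the doctor said he was running late",
    "the nurse said she would check the patient",
    "the nurse said she was running late",
]

# Count how often each pair of words appears in the same sentence.
pair_counts = Counter()
for sentence in corpus:
    words = set(sentence.split())
    for a, b in combinations(sorted(words), 2):
        pair_counts[(a, b)] += 1

def cooccurrence(w1, w2):
    """How often two words were seen together in the toy corpus."""
    return pair_counts[tuple(sorted((w1, w2)))]

# The statistics have no notion of fairness; they only mirror the corpus.
print("doctor + he :", cooccurrence("doctor", "he"))    # 2
print("doctor + she:", cooccurrence("doctor", "she"))   # 0
print("nurse  + she:", cooccurrence("nurse", "she"))    # 2
print("nurse  + he :", cooccurrence("nurse", "he"))     # 0
```

On skewed input like this, a system asked to complete or translate a sentence about a doctor will lean toward "he," and one about a nurse toward "she," for no reason other than the counts it absorbed.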

As foreign media pointed out, AI learns from human input, and it often absorbs the worst aspects of our language and culture along with the rest. The question remains: how do we teach AI to recognize and reject racist language? This incident highlights the urgent need for better training data and ethical oversight in AI development.
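One common, if blunt, mitigation is to screen a model's output against a curated list of slurs before it is shown to users. The sketch below is a minimal, hypothetical Python example (the term list and function name are invented for illustration, and the source does not say WeChat works this way); real systems pair such filters with context-aware models and human review.

```python
import re

# Hypothetical denylist; a production system would use a curated,
# locale-aware list maintained by humans, not a hard-coded set.
OFFENSIVE_TERMS = {"slur1", "slur2"}  # placeholders, not actual words

def screen_translation(text: str) -> tuple[str, bool]:
    """Mask denylisted terms in model output and report whether any were found.

    Returns (screened_text, flagged). Flagged output could be routed to a
    fallback translation or to human review instead of being displayed.
    """
    flagged = False

    def mask(match: re.Match) -> str:
        nonlocal flagged
        flagged = True
        return "*" * len(match.group(0))

    pattern = r"\b(" + "|".join(map(re.escape, OFFENSIVE_TERMS)) + r")\b"
    return re.sub(pattern, mask, text, flags=re.IGNORECASE), flagged

# Example: the offending word is masked and the output is flagged for review.
screened, flagged = screen_translation("The slur1 is late")
print(screened)  # "The ***** is late"
print(flagged)   # True
```

A filter like this only catches known terms, which is why the broader question of teaching models context remains open.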

With increasing reliance on AI in communication, it's crucial that platforms like WeChat take responsibility for ensuring their systems are not only accurate but also respectful and inclusive. This event serves as a reminder that technology must evolve alongside our values, not just our data.
