Apple has officially addressed a bizarre voice-to-text issue that went viral, in which the iPhone’s dictation feature automatically replaced the word “racist” with “Trump.” The glitch, first noticed by users on social media, has sparked discussions about Apple’s speech recognition software.
According to ABC News, numerous iPhone users took to platforms like TikTok and Twitter, posting videos of the voice-to-text error in action. As soon as they dictated the word “racist,” their phones initially transcribed it as “Trump” before correcting the mistake a second later. The company confirmed the issue, which appeared to be linked to Apple’s speech recognition model.
“We are aware of an issue with the speech recognition model that powers Dictation, and we are rolling out a fix today,” Apple stated.
What Caused the Glitch?
Apple has explained that voice-to-text algorithms rely on phonetic patterns to determine what words a user intends to say. The system is designed to predict text based on context and sound similarities, but in this case, it appears to have misinterpreted words beginning with the consonant “R,” including “racist.”
Additionally, some users reported that other words containing “R” were also being mistakenly altered, further highlighting flaws in Apple’s speech recognition AI.
Apple’s reliance on machine learning models means that errors like these can occur, especially when contextual cues are misinterpreted. However, the company assured users that the issue is being addressed and that future updates should prevent similar incidents.
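The mechanism Apple describes can be illustrated with a toy sketch. This is not Apple’s actual system; the function, candidate words, and scores below are all hypothetical. It shows how a recognizer that combines acoustic evidence (how well a word matches the audio) with a contextual language-model prior (how likely the model thinks a word is) can surface the wrong word when that prior is skewed:

```python
# Toy illustration (hypothetical, not Apple's implementation): ranking
# transcription candidates by acoustic score plus a contextual prior.

def rank_candidates(acoustic_scores, lm_prior, lm_weight=1.0):
    """Return candidate words ranked best-first.

    acoustic_scores: how well each word matches the audio (higher = better)
    lm_prior: how likely the language model considers each word in context
    """
    combined = {
        word: acoustic_scores[word] + lm_weight * lm_prior.get(word, 0.0)
        for word in acoustic_scores
    }
    return sorted(combined, key=combined.get, reverse=True)

# Hypothetical numbers: "racist" matches the audio best.
acoustic = {"racist": 0.9, "racing": 0.6, "rapid": 0.4}

# With a flat prior, the acoustically best word wins.
flat_prior = {"racist": 0.1, "racing": 0.1, "rapid": 0.1}
print(rank_candidates(acoustic, flat_prior)[0])    # -> racist

# A skewed prior can override the audio and flip the top result.
skewed_prior = {"racing": 1.5}
print(rank_candidates(acoustic, skewed_prior)[0])  # -> racing
```

The point of the sketch is that a misprediction like the one users saw does not require anyone to hard-code a substitution; an over-weighted contextual prior alone can briefly beat the acoustic evidence before the system corrects itself.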
The Viral Reaction: Social Media’s Take
As expected, the internet had a field day with the glitch. Videos demonstrating the bug quickly went viral, with many questioning whether it was intentional or a mere accident. Some Twitter users speculated that Apple’s algorithms were designed with inherent biases, while others found humor in the situation.
Meanwhile, others criticized Apple for not testing its software updates thoroughly before public release.
Apple’s Response and Fix
Following the social media backlash, Apple moved swiftly to correct the error, rolling out an update that adjusts the speech recognition model to prevent words like “racist” from being automatically linked to “Trump.”
Although Apple has not disclosed exactly why the software was making this substitution, the company reiterated that machine learning models evolve over time and occasional misinterpretations are inevitable.