As It Happens · Their names are frequently autocorrected. A U.K.-based collective wants tech companies to change that
Dirty, Dorito, duty — those are just a few words that Dhruti Shah’s name is frequently autocorrected to when typing on a phone or computer.
“It’s frustrating. Like why is a commercial brand of tortilla chips considered more important than my name?” Shah told As It Happens host Nil Köksal.
Shah is a supporter of the “I am not a typo” campaign, formed by a group of people with often-autocorrected names who are calling on tech developers to expand the dictionaries used in the operating systems of phones and computers.
The U.K.-based campaign group made their demand in an open letter addressed to technology companies.
Much text software, from Microsoft Word to smartphone apps, has an autocorrect function that suggests changes to words it thinks are misspelled. Some programs show a red line under a presumed misspelled word, while others automatically change it to the word they assume the user meant to type.
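At its core, that flagging step is a dictionary lookup: any word not in the software's word list gets marked. The sketch below illustrates the idea with a tiny hypothetical word list; it is not any vendor's actual implementation.

```python
# Minimal sketch of dictionary-based spell flagging.
# The word list here is hypothetical and deliberately tiny.
DICTIONARY = {"hello", "and", "nigel", "duty", "dorito"}

def flag_misspellings(text):
    """Return the words that would get a red underline:
    anything not found in the dictionary (case-insensitive)."""
    return [
        word for word in text.split()
        if word.lower().strip(".,!?") not in DICTIONARY
    ]

print(flag_misspellings("Hello Dhruti and Nigel"))  # ['Dhruti']
```

Note the asymmetry the campaign describes: "Nigel" passes because it happens to be in the list, while "Dhruti" is flagged purely because it is not.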
According to the letter, 41 per cent of names given to babies in England and Wales are flagged as “incorrect” by Microsoft’s English (UK) dictionary — and most of those names are of African or Asian origin.
Changing names to words with Western origin or influences “doesn’t reflect a diverse, inclusive society,” the collective writes in the letter.
Many of the names flagged as incorrect are extremely popular, according to the letter. Esmae appears with a red underline when typed, even though 2,328 babies in England and Wales have been given that name since 2021. Meanwhile, the name Nigel, given to 36 people over the same period, is not flagged.
In a statement, Apple told CBC that the keyboards on its iPhones and other devices use personalized information to learn and suggest words or names a user has previously typed. Users can also manually enter names, nicknames and pronunciation details for themselves or their contacts.
CBC also reached out to Microsoft and Google for comment. Neither responded by publication time.
The assumption that non-white names are incorrect is a barrier that makes technology more cumbersome for racialized people to use, according to research from 2021 by Rashmi Dyal-Chand, a law professor at Northeastern University in Boston who is supporting the campaign.
It also contributes to a “cultural devaluation of non-Anglo individuals and communities,” Dyal-Chand argues.
Shah’s name originates from Sanskrit and has a number of meanings. Her dad used to tell her it meant North Star, and others say the name originates from that of a Hindu goddess.
When her name gets corrected to something like “Dirty” and the author doesn’t catch it, Shah said the accident can feel like a microaggression.
“If you get a name incorrect, whether that’s on autocorrect or in real life, it really does make a difference. I think it’s all about due respect,” said Shah.
Randy Goebel, a professor of computing science at the University of Alberta, says that while many issues with predictive text are difficult to solve, the solution here is simple.
Big tech companies could simply add these names to their software's dictionaries, as the campaign group is asking, which would prevent them from being flagged as wrong.
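In code terms, the fix Goebel describes amounts to merging a list of common names into the base word list before any lookup happens. The example below is a hedged sketch with made-up word lists; the names set stands in for something like national birth-name statistics.

```python
# Hypothetical base word list and a supplementary list of popular names
# (stand-in for data such as national birth-name statistics).
BASE_DICTIONARY = {"hello", "and", "nigel"}
POPULAR_NAMES = {"dhruti", "esmae"}

# Merging the two sets is the entire "fix": one set union.
dictionary = BASE_DICTIONARY | POPULAR_NAMES

def is_flagged(word):
    """True if the word would be marked as a misspelling."""
    return word.lower() not in dictionary

print(is_flagged("Esmae"))   # False: no longer marked incorrect
print(is_flagged("Dhruti"))  # False
```

The engineering cost really is this small; as Goebel notes, the obstacle is prioritization, not difficulty.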
“That’s such a simple solution. But as far as I can tell, the urgency of the consequences haven’t landed” with the companies, Goebel said.
Fixing falsely autocorrected words is seen as a mere annoyance compared to some of the bigger operating system issues developers are dealing with, according to Goebel, meaning the problem is never high on a company’s list of priorities.
“Apple really doesn’t care about whether you’re frustrated by this,” said Goebel. “Nobody’s going to invest effort in putting it on the development agenda because it will always cost resources.”
While it is possible to turn off autocorrect in most devices' settings, Shah said the default dictionaries big tech companies use should be as inclusive as possible.
“It’s about finding a positive solution and also making sure that we bring everybody in with us,” she said.