Along with jet packs and hover boards, a machine to translate from any language to any other is so appealing as a fantasy that people are willing to overlook clunky prototypes as long as they can retain the belief that the future promised by science fiction has, at last, arrived. One particularly clunky subspecies of the universal language translator has a rather dismal history: the sign-language glove, which purports to translate sign language in real time to text or speech as the wearer gestures. For people in the Deaf community, and linguists, the sign-language glove is rooted in the preoccupations of the hearing world, not the needs of Deaf signers.
The basic idea dates to the 1980s, when researchers started exploring how humans could interact with computers using gestures. In 1983, a Bell Labs engineer named Gary Grimes invented a glove for data entry using the 26 manual gestures of the American Manual Alphabet, used by signers of American Sign Language. But the first glove intended to make interactions between deaf and non-deaf people easier was announced in 1988 by the Stanford University researchers James Kramer and Larry Leifer. It was called the “talking glove,” and the entire system cost $3,500, not including the price of the CyberGlove itself.
The first sign-language glove to gain any notoriety came out in 2001. A high-school student from Colorado, Ryan Patterson, fitted a leather golf glove with 10 sensors that monitored finger position, then relayed fingerspelled letters to a computer, which rendered them as text on a screen. Patterson received considerable attention for his “translating glove,” including the grand prize in the 2001 Intel International Science and Engineering Fair and a $100,000 scholarship. In 2002, the public-affairs office of the National Institute on Deafness and Other Communication Disorders effused about Patterson, sneaking in the caveat only at the end: The glove doesn’t translate anything beyond individual letters, certainly not the full range of signs used in American Sign Language, and works only with the American Manual Alphabet.
Over the years, similar designs, with corresponding hoopla, have appeared all over the world, but none has ever delivered a product to market. A group of Ukrainians won first prize and $25,000 in the 2012 Microsoft Imagine Cup, a student technology competition, for their glove project. In 2014, Cornell students designed a glove that “helps people with hearing disabilities by identifying and translating the user’s signs into spoken English.” And in 2015, one glove project was announced by two researchers at Mexico’s National Polytechnic Institute, and another by the Saudi designer and media artist Hadeel Ayoub, whose BrightSignGlove “translates sign language into speech in real time” using a data glove.
The most recent project dates from July 2017, when a team at the University of California, San Diego, published a paper in PLOS One describing a gesture-recognizing glove. The project was headed by Darren Lipomi, a chemist who researches the mechanical properties of innovative materials, such as stretchable polymer-based solar cells and skin-like sensors. On July 12, the UCSD news office promoted Lipomi’s publication with a story proclaiming, “Low-cost smart glove translates American Sign Language alphabet and controls virtual objects.” The next day, the online outlet Medgadget lopped “alphabet” out of its headline, and reports of a glove that “translates sign language” again spread far and wide, getting picked up by New Scientist, The Times in the United Kingdom, and other outlets. Medgadget wasn’t entirely to blame: Lipomi had titled his paper “The Language of Glove” and written that the device “translated” the alphabet into text, rather than “converted” it, which would have been more accurate.
Linguists caught wind of the project. Carol Padden, the dean of social sciences at UCSD and a prominent sign-language linguist who is also deaf, passed along a critique of the sign-language glove concept to Lipomi’s dean at the school of engineering. The critique she gave him had been written by two ASL instructors and one linguist and endorsed by 19 others. It was written in response not to Lipomi’s paper, but to a notorious sign-language-glove project from the year before. In 2016, two University of Washington undergraduates, Thomas Pryor and Navid Azodi, won the Lemelson-MIT Student Prize for a pair of gloves that recognized rudimentary ASL signs. Their project, called SignAloud, was covered by NPR, Discover, Bustle, and other outlets, but was also answered by vociferous complaints in blog posts by the linguists Angus Grieve-Smith and Katrina Faust.
“Initially, I didn’t want to deal with [SignAloud, the UW project] because this has been a repeated phenomenon or fad,” says Lance Forshay, who directs the ASL program at UW. “I was surprised and felt somehow betrayed because they obviously didn’t check with the Deaf community or even check with ASL program teachers to make sure that they are representing our language appropriately.” But after SignAloud received national and international media attention, Forshay teamed up with Kristi Winter and Emily Bender, from his department, to write a letter. They gathered input for the letter from the Deaf community and Deaf culture experts.
Their six-page letter, which Padden passed along to the dean, points out how the SignAloud gloves, like all the sign-language translation gloves invented so far, misconstrue the nature of ASL (and other sign languages) by focusing on what the hands do. Key parts of the grammar of ASL include “raised or lowered eyebrows, a shift in the orientation of the signer’s torso, or a movement of the mouth,” reads the letter. “Even perfectly functioning gloves would not have access to facial expressions.” ASL consists of thousands of signs presented in sophisticated ways that have, so far, confounded reliable machine recognition. One challenge for machines is the complexity of ASL and other sign languages. Signs don’t appear like clearly delineated beads on a string; they bleed into one another in a process that linguists call “coarticulation,” in which, for instance, a hand shape in one sign anticipates the shape or location of the following sign. (This happens in spoken languages, too, where sounds take on characteristics of adjacent ones.) Another problem is the lack of large data sets of people signing that can be used to train machine-learning algorithms.
And while signers do use the American Manual Alphabet, it plays a narrow role within ASL. Signers use it “to maintain a contrast of two types of vocabulary: the everyday, familiar, and intimate vocabulary of signs, and the distant, foreign, and scientific vocabulary of words of English origin,” wrote Carol Padden and Darline Clark Gunsauls, who heads Deaf studies at Ohlone College, in a paper on the subject.
And the writers of the UW letter argued that the development of a technology based on a sign language constituted cultural appropriation: College students were gaining accolades and scholarships for technologies based on an element of Deaf culture, while Deaf people themselves remained legally and medically underserved.
Also, though the gloves are often presented as devices to improve accessibility for the Deaf, it’s the signers, not the hearing people, who must wear the gloves, carry the computers, or modify their rate of signing. “This is a manifestation of audist beliefs,” the UW letter states, “the idea that the Deaf person must expend the effort to accommodate to the standards of communication of the hearing person.”
That sentiment is widely echoed. “ASL gloves are mainly created/designed to serve hearing people,” said Rachel Kolb, a Rhodes Scholar and Ph.D. student at Emory University who has been deaf from birth. “The concept of the gloves is to render ASL intelligible to hearing people who don’t know how to sign, but this misses and utterly overlooks so many of the communication difficulties and frustrations that Deaf people can already face.”
Julie Hochgesang, an assistant professor of linguistics at Gallaudet, said she rolls her eyes when another glove is announced. “We can’t get decent access to communication when we go to the doctor. Why bother with silly gloves when we still need to take care of the basic human-rights issues?”
So why do so many inventors keep turning to the sign-language glove concept?
One reason is pretty obvious: Despite the popularity of ASL classes in American colleges (enrollment in such courses grew by 19 percent between 2009 and 2013), non-signers often don’t know that much about sign language. They may not even realize that ASL (along with British Sign Language, Chinese Sign Language, and dozens of others) is a distinct language with its own grammar and phonology, not a word-for-word reformulation of a spoken language. Additionally, says Forshay, “People have no knowledge of the culture of Deaf people and how signed language has been exploited and oppressed over history.” As a result, they’re not aware of why the issue would be so sensitive.
An equally potent but less immediately apparent reason is the way engineers approach problem-solving. In engineering school, students are taught to solve only the mathematical elements of problems, says the Virginia Tech engineering educator Gary Downey. In a 1997 article he noted that “all the nonmathematical features of a problem, such as its politics, its power implications for those who solve it, and so forth, are given,” meaning they’re bracketed off. Students are prepared to focus on sensor placement or algorithm design, but often not the broader social context that the device they’re designing will enter.
The specific application of Lipomi’s glove as an accessibility device seems to have been an afterthought. The project’s purpose, he wrote on his blog later, was to “demonstrate integration of soft electronic materials with low-energy wireless circuitry that can be purchased economically.” The American Manual Alphabet was chosen because “it comprises a set of 26 standardized gestures, which represent a challenge in engineering to detect using our system of materials.”
However, engineers seem to be hearing and responding to linguists’ complaints. Pryor and Azodi, the inventors of the UW SignAloud project, signed on to the UW open letter. And when Darren Lipomi heard about the linguists’ criticisms, he changed the wording of his paper with an addendum to PLOS One and wrote a blog post encouraging researchers to be more culturally sensitive. “The onus is thus on the researcher to be aware of cultural issues and to make sure ... that word choice, nuance, and how the technology may impact a culture is properly conveyed to the journalist and thence to the public,” he wrote.
Still, as long as actual Deaf users aren’t included in these projects, inventors are likely to continue creating devices that offend the very group they say they want to help. “To do this work, the first rule you have to teach yourself is that you are not your user,” says Thad Starner, who directs the Contextual Computing Group at the Georgia Institute of Technology. The group develops accessibility technologies for the deaf, such as a sign-language-based educational game to train the working-memory abilities of deaf children.
That’s not to say that Deaf people don’t have futuristic fantasies that involve technology. For example, Kolb says a dominant fantasy among her friends is for glasses that would auto-caption everything that hearing people say. Several teams of researchers are working on algorithms to make signing videos on YouTube searchable. Even more thorough, higher-quality captioning and better interpreting services would improve the lives of many.