SC cadets build robot hands to teach American Sign Language


As seen in The Post and Courier, by Bo Peterson

Mohamed Baghdady leans toward the microphone and says the word “one.” A lone mechanical finger on the table responds with a curl, then rises upright.

At another table in The Citadel engineering lab, a complete robotic hand spells the sign language letters for “c-a-t” from instructions typed on a computer. The movement is driven by motors and fishing line.
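The article doesn’t describe the cadets’ control code, but the pipeline it implies, typed letters translated into finger positions that pull the fishing-line tendons, can be sketched in a few lines. The following Python sketch is purely illustrative: the pose table and the `set_finger` stand-in are assumptions for demonstration, not The Citadel team’s actual software, and real ASL handshapes involve thumb placement and hand orientation that this simplification ignores.

```python
# Hypothetical fingerspelling controller: each letter maps to per-finger
# curl values (0.0 = straight, 1.0 = fully curled). A real build would
# convert these into servo angles that wind the fishing-line tendons.

# Partial, illustrative pose table; values are rough guesses, not measured.
POSES = {
    "c": {"thumb": 0.5, "index": 0.4, "middle": 0.4, "ring": 0.4, "pinky": 0.4},
    "a": {"thumb": 0.0, "index": 1.0, "middle": 1.0, "ring": 1.0, "pinky": 1.0},
    "t": {"thumb": 0.6, "index": 0.9, "middle": 1.0, "ring": 1.0, "pinky": 1.0},
}

def set_finger(name: str, curl: float) -> None:
    """Stand-in for a servo command; prints instead of driving hardware."""
    print(f"  {name}: curl to {curl:.0%}")

def spell(word: str) -> None:
    """Fingerspell a word one letter at a time."""
    for letter in word.lower():
        pose = POSES.get(letter)
        if pose is None:
            print(f"(no pose defined for {letter!r})")
            continue
        print(f"Signing {letter!r}:")
        for finger, curl in pose.items():
            set_finger(finger, curl)

if __name__ == "__main__":
    spell("cat")  # the demo word from the lab
```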

Baghdady, a cadet, is building a pair of robotic hands to serve as a teaching aid and, potentially, as a human stand-in when an American Sign Language interpreter can’t be brought in.

The hands would work with voice recognition software, replicating the spelling and gestures human hands use to communicate with someone who is deaf.
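The story doesn’t name the voice-recognition software the cadets used. As a hedged illustration of how such a front end could feed the fingerspelling routine sketched above, here is an example built on the open-source Python speech_recognition package; the package choice and the hand-off to the `spell` function from the previous sketch are assumptions, not details from the story.

```python
# Hypothetical voice front end using the open-source speech_recognition
# package (an assumption; the article doesn't name the cadets' software).
import speech_recognition as sr

def listen_and_spell() -> None:
    recognizer = sr.Recognizer()
    with sr.Microphone() as source:
        recognizer.adjust_for_ambient_noise(source)  # calibrate for room noise
        print("Say a word...")
        audio = recognizer.listen(source)
    try:
        word = recognizer.recognize_google(audio)  # free web-based recognizer
    except sr.UnknownValueError:
        # Recognition can fail outright, echoing the cadets' report that
        # words beginning or ending with a vowel give the software trouble.
        print("Could not understand the audio.")
        return
    spell(word)  # hand off to the fingerspelling routine sketched earlier
```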

The work in professor Robert Rabb’s class could lead to another breakthrough in the uses of robotic limbs, a field that already includes prosthetics and “microbots” that diagnose illnesses inside the body.

But maybe the coolest feature of the hands is that you could build them at home. The class’s final product won’t be a pair of hands; it will be a how-to guide published on the website Instructables.

Paul Vargas works with classmates on a robotic hand that they are trying to teach American Sign Language at The Citadel on Wednesday, March 4, 2020. (Courtesy: Lauren Petracca, The Post and Courier)

“Even a middle-schooler could build this project,” said cadet Paul Vargas. “It’s not only mobile, it’s cheaper to use, cheaper to produce than most robotics, and it fills a need.”

There are a few obstacles to overcome.

The first control board the cadets built blew up on them; it needed a shield. And the voice-recognition software struggles with words that begin or end with a vowel.

The class is now trying to get the wrists to flex, a necessary component of communicating in sign. In early prototypes, moving the hand drew so much electricity that the finger joints were left unable to return upright, said cadet Zachery Danis.

But the biggest hurdle is a little-realized subtlety in ASL itself.

The “grammar” of sign language includes the physical and facial expressions of the person signing, said Jason Hurdich, an ASL interpreter who gained fame in 2016 interpreting for Gov. Nikki Haley as she pleaded with residents to evacuate ahead of Hurricane Matthew.

Hurdich, now a Clemson University professor teaching sign language, said he was speaking from his experience as an interpreter, not in his role as a professor.

Students at The Citadel work on a robotic hand that they are trying to teach American Sign Language on Wednesday, March 4, 2020. (Courtesy: Lauren Petracca, The Post and Courier)

In signing, the face, not the hands, conveys intention. A question that requires a yes or no answer, for example, is communicated by facial expression.

“Imagine speaking English with no inflection,” Hurdich said.

That’s the limit of communicating with hands alone.

“Robotic hands are cool. The technology is great. But there are gaps,” he said.

Rabb’s students realize this. The work won’t be completed by this class alone; they hope the classes after them will continue the research and eventually come up with a technical substitute for expression. But just getting the hands to work well enough to teach the basics would be valuable.

Rabb came up with the idea while attending his daughter’s graduation and watching an ASL interpreter.

“How do people out in remote areas get this?” he asked himself.

The answer is, a lot of them don’t.

ASL is the third most-used language in the United States, Hurdich said. But there are not enough instructors, much less interpreters.

So that means The Citadel group is on to something. 

“I’d be concerned about grammar,” he said. “But I’m definitely curious about the hands.”