It also works backward, capturing a person’s spoken words and projecting the appropriate hand sign onto the monitor. Students sampled a database of images to train their software to recognize the hand signs, according to a UH news release. The team used between 200 and 300 images per sign.
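The release does not describe the team's software in any detail, but training a recognizer on a few hundred labeled images per sign is commonly done by fine-tuning an off-the-shelf image classifier. The sketch below is a generic illustration in PyTorch, assuming one folder of images per sign under a hypothetical data/signs/ directory; it is not the students' actual code.

```python
# Illustrative sketch only: fine-tune a pretrained classifier on a small
# dataset (roughly 200-300 images per sign), with images arranged as
# data/signs/<sign_name>/*.jpg. Directory layout and hyperparameters are
# assumptions, not details from the article.
import torch
from torch import nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

# Standard preprocessing for an ImageNet-pretrained backbone (assumed).
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# With only a few hundred examples per class, fine-tuning a pretrained
# network is a common choice for a dataset of this size.
dataset = datasets.ImageFolder("data/signs", transform=preprocess)
loader = DataLoader(dataset, batch_size=32, shuffle=True)

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, len(dataset.classes))

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch + 1} complete")
```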
It’s not clear how well the translation algorithms work; so far, the device has been able to translate only a single phrase: “Good job, Cougars,” congratulating the students who designed it. The team has since graduated, but its members hope to continue developing their prototype and eventually build a functional, marketable device, according to industrial design student Sergio Aleman.