Tuesday, July 23, 2013

Microsoft Is Teaching Kinect to Understand Sign Language

Microsoft Research Asia and the Chinese Academy of Sciences' Institute of Computing Technology have collaborated to use the Kinect to recognize sign language. They have demonstrated the Kinect's ability to translate signs by tracking hand and body movements.

Microsoft Research says that in "translation mode" the Kinect can translate sign language into text or speech. Both isolated word recognition and continuous sentence recognition were tested. In "communications mode," an avatar can help an individual who is deaf or hard of hearing communicate with someone who can hear by converting sign inputs into text and vice versa.

It all works through a process Microsoft Research refers to as "3D trajectory matching": Kinect for Windows software helps decipher the hand movements, which are then matched to identify a word.
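To make the idea of trajectory matching concrete, here is a minimal sketch in Python. It assumes the Kinect provides a sequence of 3D hand positions and that recognition works by comparing that observed trajectory against stored word templates; the use of dynamic time warping and all function names here are illustrative assumptions, not the researchers' published algorithm.

```python
import numpy as np

def dtw_distance(traj_a, traj_b):
    """Dynamic time warping distance between two trajectories of 3D points."""
    n, m = len(traj_a), len(traj_b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(traj_a[i - 1] - traj_b[j - 1])  # Euclidean distance in 3D
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[n, m]

def recognize_word(observed, templates):
    """Return the word whose stored trajectory best matches the observed one."""
    return min(templates, key=lambda word: dtw_distance(observed, templates[word]))

# Usage: each template maps a word to an (N, 3) array of hand positions over time.
templates = {
    "hello": np.random.rand(30, 3),   # placeholder trajectories for illustration
    "thanks": np.random.rand(25, 3),
}
observed = np.random.rand(28, 3)       # hand path captured while signing
print(recognize_word(observed, templates))
```

Dynamic time warping is used here only because it tolerates signs being performed faster or slower than the stored template; a real system would also normalize for body size and hand orientation.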

Microsoft Research Asia and the Institute of Computing Technology at the Chinese Academy of Sciences have teamed up to use Kinect for Windows to effectively track complex hand motions in 3D space. By combining the data from both the RGB camera and the depth-sensing infrared camera in the Kinect, these researchers were able to develop an impressive system to aid communication between the deaf and the hearing.
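A rough sketch of how RGB and depth data can be fused into a 3D hand position follows: a hand pixel located in the RGB image is looked up in the aligned depth map and back-projected using the camera intrinsics. The intrinsic values and the hand-detection step are assumptions for illustration only, not the actual pipeline the researchers used.

```python
import numpy as np

# Approximate Kinect depth-camera intrinsics (focal lengths and principal point),
# assumed values for a 640x480 frame.
FX, FY, CX, CY = 575.8, 575.8, 319.5, 239.5

def back_project(u, v, depth_mm):
    """Convert a pixel (u, v) with depth in millimetres to a 3D point in metres."""
    z = depth_mm / 1000.0
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    return np.array([x, y, z])

def hand_position_3d(hand_pixel, depth_map):
    """Fuse an RGB-detected hand pixel with the depth frame to get a 3D point."""
    u, v = hand_pixel
    return back_project(u, v, depth_map[v, u])

# Usage with synthetic data: a flat depth frame and a detected hand pixel.
depth_map = np.full((480, 640), 1200, dtype=np.uint16)  # everything ~1.2 m away
print(hand_position_3d((320, 240), depth_map))          # hand at image centre
```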
