Sign Language Recognition

About:

Communication and collaboration between deaf and hearing people is hindered by the lack of a common language. Although there has been extensive research in this domain, there is still room for a system that is ubiquitous, non-invasive, works in real time, and can be trained interactively by the user.

The approach utilizes an insight from Sign Language Linguistics: American Sign Language morphemes can be differentiated by the Movement, Location, Orientation, Handshape, or Facial Expression of a signer.
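As a minimal sketch of how these five linguistic parameters can distinguish morphemes, the snippet below represents each sign as a tuple of parameter values and matches an observation against a toy lexicon. The glosses, parameter values, and matching threshold are illustrative assumptions, not the project's actual feature set or recognition method.

```python
# Hypothetical sketch: a sign represented by its five ASL phonological
# parameters, matched against a small toy lexicon. All values below are
# illustrative assumptions, not real system features.
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class SignFeatures:
    movement: str
    location: str
    orientation: str
    handshape: str
    facial_expression: str

PARAMS = ("movement", "location", "orientation", "handshape", "facial_expression")

# Toy lexicon: two signs differing in only one parameter (location),
# showing how a single feature can disambiguate morphemes.
LEXICON = {
    "SIGN-A": SignFeatures("wiggle", "forehead", "palm-down", "X", "neutral"),
    "SIGN-B": SignFeatures("wiggle", "chin",     "palm-down", "X", "neutral"),
}

def recognize(observed: SignFeatures) -> Optional[str]:
    """Return the gloss whose parameters best match the observation,
    or None if fewer than 4 of the 5 parameters agree."""
    def score(candidate: SignFeatures) -> int:
        return sum(getattr(observed, f) == getattr(candidate, f) for f in PARAMS)
    best = max(LEXICON, key=lambda g: score(LEXICON[g]))
    return best if score(LEXICON[best]) >= 4 else None
```

A real recognizer would extract these parameters from video rather than receive them directly, but the lexicon-matching idea is the same.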

Toward this goal we provide two approaches. The datasets used for each are also released.


Faculty Advisor: 
Dr. Sandeep Gupta

Research Faculty:
Dr. Ayan Banerjee

Current Students:
Prajwal Paudyal
Junghyo Lee 

Sponsors:

Collaborators:
Paul Quinn 
Dr. Tamako Azuma