Learn ASL



It would be so awesome if artificial intelligence (AI) could learn American Sign Language. But is it even feasible? Well, a student in India named Priyanjali Gupta built an AI model that can translate American Sign Language into English in real time. Gupta’s model was inspired by data scientist Nicholas Renotte’s video on real-time sign language detection. According to an Inquirer.net article, “She invented the AI model using Tensorflow object detection API that translates hand gestures using transfer learning from a pre-trained model named ssd_mobilenet.” The AI was able to interpret basic signs like hello, please, I love you, thank you, yes and no.
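To make the approach concrete, here is a minimal sketch of the post-processing step such a pipeline would need: a detector fine-tuned from ssd_mobilenet with the TensorFlow Object Detection API returns, for each video frame, parallel arrays of class IDs and confidence scores, which then get mapped to English words. The label map, class numbering, and confidence threshold below are illustrative assumptions, not details from Gupta's actual project.

```python
# Hypothetical label map for the six signs the article mentions.
# Real class IDs depend on how the training data was labeled.
LABEL_MAP = {1: "hello", 2: "please", 3: "I love you",
             4: "thank you", 5: "yes", 6: "no"}

def detections_to_signs(class_ids, scores, threshold=0.5):
    """Return English labels for detections above the confidence threshold.

    class_ids and scores are parallel sequences, as produced per frame by
    a TensorFlow Object Detection API model.
    """
    return [LABEL_MAP[c] for c, s in zip(class_ids, scores)
            if s >= threshold and c in LABEL_MAP]

# Example frame: the detector is confident about "hello" (0.92)
# but unsure about "no" (0.31), so only "hello" is reported.
print(detections_to_signs([1, 6], [0.92, 0.31]))  # → ['hello']
```

In a live system, this function would run inside a webcam loop, once per captured frame, with the filtered labels overlaid on the video.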

Is Learning American Sign Language From an AI Ideal?

Technology is evolving, and people can now experience cutting-edge inventions. While it's amazing that some people design inventions like AIs that can translate ASL to English in hopes of bridging the communication gap between Deaf and hearing people, learning ASL from an AI is probably neither practical nor recommended, for several reasons.

1) AI is very limited.

As mentioned, American Sign Language isn't only about communicating with the hands; it also includes body movements and facial expressions. Facial expressions can mean different things when signing. For example, raised or lowered eyebrows are used depending on the type of question being asked: raised eyebrows typically mark yes/no questions, while lowered eyebrows typically mark wh-questions (who, what, where, when, why) that call for a more detailed answer. Body movements include shifting position when quoting different speakers in a conversation, or conveying attitudes such as pride or timidity. You need to watch the person's face and whole body so that you take in both the facial expressions and the body language. Many people prefer to learn American Sign Language virtually or in person, where they can see the signer's whole body: the signing, body movements, and facial expressions together.

2) AI will not be able to respond to questions

When someone is learning a new language, that person will most likely have a lot of questions about the structure of the language itself. Unless the AI is programmed with extensive knowledge of ASL linguistics and the key aspects of Deaf culture, and is continually updated from within the Deaf community, it would be unable to answer most questions accurately. Real life is constantly changing, and people, along with their language, adapt to those changes. New signs are being developed all the time. An AI wouldn't be able to keep up with those changes and would quickly become outdated. It would hold only superficial knowledge: the basic signs and their English translations.

3) AI will not be able to translate the significance of facial expressions, body language, ASL grammar, and sentence structure, nor key areas of the Deaf culture and community.

ASL is an expressive language, so facial expressions and body language are essential when signing; they can change the message of a story. ASL's grammar and sentence structure are not the same as English's. For example, the correct sentence structure in English is, “I'm going to the store,” but in ASL the sentence changes to, “Store I go.” The person who programs the AI is probably not Deaf; hence, the program could easily convey incorrect ASL.

4) AI doesn't have the day-to-day real-life experience

Artificial intelligence still has a long way to go before it comes close to matching a real person's knowledge. It can't reliably recognize precise signs or individual signers' styles. For anyone to become good at American Sign Language, the most effective approaches are watching slow-motion ASL video courses, taking one-on-one lessons, attending Deaf socials, and connecting with Deaf people. You can learn a lot from real-life interactions about how ASL is used in day-to-day life.

5) A conversation with AI doesn't feel real or authentic.

Artificial intelligence is very robotic and can't sign as fast or as fluidly as a real person. A real person's expressions are also far more animated than any current AI's, which makes the conversation much more personal and meaningful. Beginner signers are always strongly encouraged to communicate with Deaf people in real-life conversations.


To conclude, it's wonderful that people are inventing new kinds of AI to help bridge the communication gap between Deaf and hearing people. Then again, ASL, Deaf culture, and the Deaf community carry a great deal of history and value. Many Deaf people feel that AI would only take away the core value of both their language and culture. If AI teaches ASL, the language can easily be modified incorrectly and stray from authentic ASL structure, and Deaf people would like to keep that from happening.

In any case, AI wouldn't make communication between Deaf and hearing people better or easier. The best solution is for hearing individuals to learn American Sign Language, either online or face-to-face, from an actual Deaf teacher. When more hearing people learn true American Sign Language, Deaf people's day-to-day lives and communication will become much easier.
