How are Alexa, Siri and Google Assistant able to understand language?
How are these applications and devices able to understand the language we speak, and how do they answer in that same language?
This is possible because of 'Natural Language Processing' (NLP), a domain of Artificial Intelligence. The main objective of NLP is to read, interpret, understand and make sense of human languages.
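For instance, before a program can interpret a sentence at all, the raw text is usually broken into normalised tokens. Here is a minimal sketch of that first step in plain Python (no NLP library; purely an illustration):

```python
# A minimal, illustrative sketch of the first step most NLP systems perform:
# breaking raw text into normalised tokens that a program can work with.

import re

def tokenize(text: str) -> list[str]:
    # Lower-case the text and split on anything that is not a letter or digit.
    return [t for t in re.split(r"[^a-z0-9]+", text.lower()) if t]

print(tokenize("What's the weather like in Paris today?"))
# ['what', 's', 'the', 'weather', 'like', 'in', 'paris', 'today']
```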
Alexa is a product of Amazon and is available in the market as a device; Google Assistant is available both as an application on Android phones and as a device; Apple's Siri is available as an application on iOS.
What are the uses of NLP?
NLP is used for developing many AI applications. Following are a few of its major uses:
- NLP is used to create language translation apps such as Google Translate.
- It is used to make personal assistant applications such as Google Assistant, Siri and Alexa.
- NLP is used for creating chatbots and conversational robots (a small sketch of this idea follows the list).
- NLP is used to make Interactive Voice Response (IVR) applications, which are used in call centers to respond to certain user requests.
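To make the chatbot and IVR idea concrete, below is a tiny, hypothetical keyword-matching bot in Python. Real assistants rely on trained statistical models, but the basic input-to-response mapping looks like this (all keywords and replies here are invented examples):

```python
# A tiny, hypothetical sketch of the keyword-matching idea behind simple
# chatbots and IVR menus. The keywords and replies below are invented.

RESPONSES = {
    "balance": "Your account balance is ...",  # placeholder reply
    "hours":   "We are open 9am-5pm, Monday to Friday.",
    "agent":   "Connecting you to a human agent.",
}

def reply(user_text: str) -> str:
    text = user_text.lower()
    # Return the answer for the first keyword found in the user's message.
    for keyword, answer in RESPONSES.items():
        if keyword in text:
            return answer
    return "Sorry, I didn't understand. Please try again."

print(reply("What are your opening hours?"))
# We are open 9am-5pm, Monday to Friday.
```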
Components of NLP:
There are two main components of NLP:
Natural Language Understanding (NLU): NLU involves the following steps:
- It maps the input given in a natural language into useful representations (a small sketch of this idea follows the list).
- It also analyses different aspects of language.
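As a rough illustration of what "mapping input into useful representations" means, here is a minimal Python sketch that turns a weather question into a structured intent with a city slot. The pattern and intent names are invented for this example; real NLU systems use trained models rather than a single regular expression:

```python
# A hedged, minimal sketch of the NLU idea: map a natural-language sentence
# to a structured, machine-usable representation (an intent plus a "city"
# slot). The regex and intent names below are invented for illustration.

import re

def understand(text: str) -> dict:
    text = text.lower()
    match = re.search(r"weather .* in ([a-z ]+)", text)
    if match:
        return {"intent": "get_weather", "city": match.group(1).strip()}
    return {"intent": "unknown"}

print(understand("What is the weather like in New Delhi?"))
# {'intent': 'get_weather', 'city': 'new delhi'}
```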
Natural Language Generation (NLG): NLG is the process of producing meaningful phrases and sentences in the form of natural language from some internal representation.
NLG includes the following steps (illustrated in the sketch after this list):
- Text Planning: It includes retrieving the relevant content from the knowledge base.
- Sentence Planning: It includes choosing the required words, forming meaningful phrases and setting the tone of the sentence.
- Text Realisation: It is the mapping of the sentence plan onto a sentence structure.
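The following minimal Python sketch walks through these three steps end to end. The "knowledge base" and the wording template are invented purely for illustration; a real NLG system would be far more sophisticated:

```python
# A minimal, assumption-laden sketch of the three NLG steps described above.
# The knowledge base and the sentence template are invented for illustration.

KNOWLEDGE_BASE = {"city": "Paris", "condition": "sunny", "temp_c": 24}

def text_planning(kb: dict) -> dict:
    # Step 1 (Text Planning): retrieve the relevant content from the knowledge base.
    return {k: kb[k] for k in ("city", "condition", "temp_c")}

def sentence_planning(content: dict) -> dict:
    # Step 2 (Sentence Planning): choose words and set the tone (here: friendly).
    return {"opener": "Good news!", **content}

def text_realisation(plan: dict) -> str:
    # Step 3 (Text Realisation): map the sentence plan onto a sentence structure.
    return (f"{plan['opener']} It is {plan['condition']} in {plan['city']}, "
            f"around {plan['temp_c']} degrees Celsius.")

print(text_realisation(sentence_planning(text_planning(KNOWLEDGE_BASE))))
# Good news! It is sunny in Paris, around 24 degrees Celsius.
```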