Say What?

Michael Wiese
30 June 2022
5 min read
Natural Language Understanding: what is it? To understand that, we need to look at where it came from. Natural Language Understanding (NLU) is part of Natural Language Processing (NLP), a field of computer science that has been around since the 1940s. After World War II, researchers realized just how valuable a machine that could translate between languages would be, and that ambition launched the field.

Once scientists and researchers started to dig into this problem from a machine’s perspective, it was clear it would be no easy task. Language is tricky to understand: for one thing, it’s constantly evolving, and for another, it takes a certain amount of knowledge about the environment or culture to grasp the intent behind what is spoken or written.

For instance, let’s say you ask a smart device a question like, “What time is Wizard of Oz playing in La Grange?” The computer has to understand that “Wizard of Oz” is not referencing an actual wizard but a movie; knowing that is cultural knowledge. It also has to understand that “La Grange” means you are looking for a movie theater near that city. Once it comes back with a few options, you may reply with something like, “Great, buy one child and 2 adults for the 7:30 showing.” Again, it would need to understand from context that you are not trying to purchase an actual child, but rather one child’s ticket and 2 adult tickets. These are simple sentences that we as humans don’t give but a split second’s thought, and a rough sketch of what the machine has to pull out of them appears below. Because of these and many, many more complexities, NLU is still undergoing much development.
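To make that concrete, here is a minimal sketch of the kind of structured output an NLU system has to produce from those two utterances. The intent names and slot names here are invented for illustration; real systems differ, but the shape of the problem is the same: one intent plus the entities it depends on.

```python
# Hypothetical sketch: what an NLU parser must extract from each utterance.
# The intent and slot names below are made up for illustration.
from dataclasses import dataclass, field

@dataclass
class ParsedUtterance:
    intent: str
    slots: dict = field(default_factory=dict)

# "What time is Wizard of Oz playing in La Grange?"
first_turn = ParsedUtterance(
    intent="find_showtimes",
    slots={
        "movie_title": "Wizard of Oz",  # cultural knowledge: a film, not a wizard
        "location": "La Grange",        # world knowledge: search theaters near this city
    },
)

# "Great, buy one child and 2 adults for the 7:30 showing."
second_turn = ParsedUtterance(
    intent="buy_tickets",
    slots={
        "child_tickets": 1,             # "one child" means one child's *ticket*
        "adult_tickets": 2,
        "showtime": "7:30",             # resolved against the options just offered
    },
)

print(first_turn)
print(second_turn)
```

Notice that the second parse only makes sense in the context of the first; carrying that context across turns is a large part of what makes NLU hard.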

Great, but what is Natural Language Understanding? The goal of NLU is to develop a system that can understand and respond with language the way humans do, and then apply that system to a range of different tasks and applications. Fast forward to the early 2000s: the best NLU programs were built on statistical models. Shortly thereafter, deep learning came on the scene and changed the face of NLU. Using neural networks, NLU was able to achieve things that were previously unattainable.

In 2013, word2vec was introduced. It is a method that uses a shallow neural network to take a large body of text and map each word into a vector space, often of several hundred dimensions. Each word’s context is considered in such a way that words appearing in similar contexts end up close to each other in the vector space. This was the beginning of machines understanding relationships between words. No longer was a model just looking at a word in isolation; it could start to learn how words relate to one another and what role a word plays in a sentence, such as verb, noun, or adjective. Using these methods, we can do things such as compare the sentiment of sentences.
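Here is a toy sketch of the word2vec idea using the gensim library, one common implementation. The corpus below is tiny and invented purely for illustration, so the resulting similarities will be crude; real embeddings are trained on billions of words. Still, it shows the core mechanic: words that appear in similar contexts get nearby vectors.

```python
# A toy word2vec example with gensim. The corpus is invented and far too
# small for meaningful embeddings -- this only illustrates the mechanics.
from gensim.models import Word2Vec

# Each entry is a tokenized sentence; word2vec learns one vector per word.
corpus = [
    ["the", "wizard", "cast", "a", "spell"],
    ["the", "witch", "cast", "a", "spell"],
    ["we", "watched", "a", "movie", "at", "the", "theater"],
    ["we", "watched", "a", "film", "at", "the", "theater"],
    ["buy", "two", "adult", "tickets", "for", "the", "movie"],
    ["buy", "one", "child", "ticket", "for", "the", "film"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,  # dimensions of the vector space (hundreds, in practice)
    window=3,        # how many neighboring words count as "context"
    min_count=1,     # keep every word, since the corpus is tiny
    epochs=200,
    seed=42,
)

# Words used in similar contexts ("movie"/"film") should score higher
# than an unrelated pair ("movie"/"spell").
print(model.wv.similarity("movie", "film"))
print(model.wv.similarity("movie", "spell"))
```

The similarity score here is the cosine of the angle between two word vectors, which is exactly the “closeness in the vector space” described above.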

I have only scratched the surface here, but NLU has come a long way and still has a long way to go. Most people who have used a digital assistant such as Siri or Alexa would agree that the understanding is pretty good, but it quickly falls down if we try to have a “real conversation” with human-like language. Still, NLU has given rise to many services we use every day, from automated phone menus to chatbots, to tools that predict what we are typing, which I have been using right now to write this blog. Maybe one day we will be able to have a natural conversation with a digital assistant that won’t misunderstand us and try to purchase a child from a wizard in the magical land of Oz.