From the course: Applied AI: Getting Started with Hugging Face Transformers
Natural language processing
- (Narrator) Let's begin this course with a brief review of the natural language processing, or NLP, domain. Natural language processing deals with the ability of computers to process, understand, and generate text. This includes both spoken and written human languages, and it enables automation of analytics, self-service actions, and human-machine interactions.

There are multiple branches of NLP. It starts with natural language understanding, or NLU. NLU is used to understand the words, sentences, semantics, and context in text. Popular NLU applications include sentiment analysis and text summarization. Information extraction is the earliest branch of NLP. It deals with extracting structured information from a body of text. Tasks for information extraction include named entity recognition and text search. Natural language generation, or NLG, is a fast-growing field in NLP. NLG is used to generate text that resembles human-generated text. Popular tasks include converting text to spoken voice, called text-to-speech; machine translation between languages; and text content generation. Automatic speech recognition, or ASR, is another branch that has existed for a long time. It started with understanding specific words or contexts, like names and dates. Today, it can be used to understand continuous human speech and transcribe audio to text. Trigger word detection is a popular task seen in devices like Amazon Alexa, Apple Siri, and Google Assistant.

The techniques used for machine learning have also evolved over time for NLP applications. They initially started with bag-of-words-based models, which look for specific words in text and associate static context with them. Converting text to numeric representations helped apply classical machine learning algorithms like Naive Bayes and random forests to text tasks. Representations like one-hot encoding and TF-IDF became popular. The advent of deep learning and associated recurrent neural network architectures opened up new tasks in NLP. Word embeddings then helped represent context and semantics within a language. Finally, transformer architectures helped develop foundation models for a variety of tasks that can be applied out of the box to real-world situations. In this course, we focus on transformers and how Hugging Face helps apply them to your NLP tasks.
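To make the classical approach described above concrete, here is a minimal sketch of TF-IDF text representations feeding a Naive Bayes classifier, using scikit-learn. The tiny training set is hypothetical, purely for illustration.

```python
# A minimal sketch of the classical pipeline: TF-IDF vectors + Naive Bayes.
# The toy dataset below is hypothetical, for illustration only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

train_texts = [
    "I loved this movie, it was fantastic",
    "What a wonderful, uplifting story",
    "Terrible plot and awful acting",
    "I hated every minute of it",
]
train_labels = ["positive", "positive", "negative", "negative"]

# TfidfVectorizer converts each document into a numeric vector;
# MultinomialNB then learns per-class word statistics from those vectors.
model = make_pipeline(TfidfVectorizer(), MultinomialNB())
model.fit(train_texts, train_labels)

# Should print ['positive'] on this toy data.
print(model.predict(["a truly fantastic story"]))
```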
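By contrast, here is a minimal sketch of the transformer-based, out-of-the-box approach using Hugging Face pipelines, which the rest of this course covers in depth. This assumes the transformers library is installed; pipeline() downloads a default pretrained model for each task on first use, so the exact model and scores may vary.

```python
from transformers import pipeline

# Sentiment analysis, an NLU task, with a pretrained transformer.
classifier = pipeline("sentiment-analysis")
print(classifier("Transformers make NLP tasks much easier."))

# Named entity recognition, an information extraction task.
ner = pipeline("ner", aggregation_strategy="simple")
print(ner("Hugging Face is based in New York City."))
```

The same pipeline interface covers other branches mentioned above as well, such as text generation and automatic speech recognition.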
Practice while you learn with exercise files
Download the files the instructor uses to teach the course. Follow along and learn by watching, listening, and practicing.