Beyond Traditional Natural Language Processing (NLP) - Master NLP using Deep Learning in PyTorch

Natural Language Processing (NLP) is the hottest field in the data science space right now. The demand for NLP experts and data scientists is through the roof.

We have seen a remarkable rise in the amount of research and breakthroughs in NLP recently. We can trace this recent rise to one (seismic) paper – “Attention is All You Need” by Google AI. This breakthrough has spawned so many new and exciting NLP libraries that enable us to work with text in ways that were previously limited to our imagination (or Hollywood).

These NLP deep learning developments have spawned some mind-boggling applications. We’re sure you’ve already interacted with and used some of them yourself:

  • Human-level chatbots (dialogue systems like Google Duplex and Meena)
  • Text generation (machines writing text from scratch, such as OpenAI’s GPT-2 model)


Here’s a quick list of the latest Natural Language Processing concepts in deep learning every NLP expert and even enthusiast should know:

  • Language Modeling
  • Sequence-to-Sequence Modeling
  • Attention Mechanism
  • Transformer Architecture
  • Transfer Learning in NLP, and much more!


These are key NLP deep learning concepts you should master if you want to successfully navigate through the NLP goldmine.

Why PyTorch for Natural Language Processing (NLP)?

Every once in a while, there comes a library or framework that reshapes and reimagines how we look at the field of deep learning.

We can safely say that PyTorch is on that list of deep learning frameworks. It has helped accelerate deep learning research with its dynamic computation graphs, Pythonic design, and efficient GPU support (a data scientist’s dream!).

You will love working with PyTorch for NLP tasks like building language models and creating deep learning powered chatbots, and you will quickly find yourself leaning on its flexibility and efficiency.

Course Curriculum

Recent Developments in NLP

In the past couple of years, the amount of research and breakthroughs in NLP has skyrocketed. Now every month we get to see state-of-the-art NLP models and architectures. 

This incredible rise of NLP can be traced back to the paper – “Attention is All You Need” by Google AI in June 2017. This breakthrough has spawned many new and exciting NLP models; some key developments are listed below:

  • Transformer Architecture - It achieved state-of-the-art status on sequence modeling tasks and can handle long-range dependencies in text with ease.

  • BERT - This architecture by Google took the Transformer a notch higher and achieved state-of-the-art performance on a range of language understanding tasks.

  • GPT and GPT-2 - OpenAI’s GPT was the state-of-the-art language model until BERT toppled it from the top spot. OpenAI soon launched GPT-2, a colossal language model trained on many gigabytes of text scraped from the web.

  • Transformers Library - This amazing library by Hugging Face has democratized the use of Transformers. It offers a large number of Transformer-based pre-trained models that can also be fine-tuned.


In the modules given below, all the projects will be implemented using PyTorch.


Language Modeling

In this module, you will understand an important sequence modeling task known as language modeling. It is a key ingredient in many modern NLP tasks such as machine translation, text summarization, and speech recognition.
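To make the idea concrete before the course’s neural approach, here is a toy illustration (not the PyTorch language model you will build): a simple count-based bigram model can already suggest likely next words from a corpus.

```python
from collections import Counter, defaultdict

def train_bigram_model(corpus):
    """Count how often each word follows another in the corpus."""
    model = defaultdict(Counter)
    for sentence in corpus:
        tokens = sentence.lower().split()
        for prev, nxt in zip(tokens, tokens[1:]):
            model[prev][nxt] += 1
    return model

def suggest_next(model, word, k=3):
    """Return up to k of the most likely next words after `word`."""
    return [w for w, _ in model[word.lower()].most_common(k)]

corpus = [
    "i love natural language processing",
    "i love deep learning",
    "deep learning powers modern language models",
]
model = train_bigram_model(corpus)
suggestions = suggest_next(model, "deep")  # most frequent words seen after "deep"
```

Neural language models learn the same conditional distribution over next words, but generalize far beyond exact counts.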


Project: Building an Autocomplete System

In this project, you will train a Language Model from scratch and use it to suggest the next word for a given input text. You will use PyTorch to implement this project.


Sequence-to-Sequence Modeling

Once you are familiar with the language model, you will learn about sequence-to-sequence models, in which both the input and the output are sequences, possibly of different lengths. You will also get a thorough understanding of the architecture, along with the attention mechanism.
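At its core, the attention mechanism lets the decoder weight each encoder state by how relevant it is to the current query. The sketch below (a minimal pure-Python stand-in for what you will later do with PyTorch tensors) computes scaled dot-product attention for a single query vector:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(query, keys, values):
    """Scaled dot-product attention for one query:
    weight each value by how similar its key is to the query."""
    d = len(query)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d) for key in keys]
    weights = softmax(scores)
    # weighted sum of the value vectors
    return [sum(w * v[i] for w, v in zip(weights, values)) for i in range(len(values[0]))]

# the query matches the first key most closely, so the output
# stays closer to the first value vector
out = attention([1.0, 0.0], [[1.0, 0.0], [0.0, 1.0]], [[10.0, 0.0], [0.0, 10.0]])
```

In a real seq2seq model the queries, keys, and values are learned projections of the hidden states, but the weighting logic is exactly this.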


Project: Neural Machine Translation

In this project, you will apply sequence-to-sequence modeling to build a deep learning model that translates text from one language to another automatically.


Transformers and Transfer Learning in NLP

The next step in your journey is the Transformer. In this module, you will understand the Transformer model in detail. You will also learn how Transformers are used to perform transfer learning in NLP, and you will get to use the Transformers library by Hugging Face.


Project: Text Classification using Transformers

You will fine-tune a pre-trained Transformer model to perform text classification on a new dataset. The domain of the new dataset will be different from that of the dataset on which the Transformer was trained.
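The essence of this kind of transfer learning is to reuse a pre-trained feature extractor and train only a small task-specific head on the new data. The toy below illustrates that idea in plain Python with a hand-built "frozen" extractor and a logistic-regression head trained by gradient descent; in the course you will do the same thing with a real pre-trained Transformer and the Transformers library.

```python
import math

# stand-in for a "pre-trained" feature extractor: frozen, never updated
def extract_features(text):
    words = text.lower().split()
    return [
        sum(w in {"great", "love", "good"} for w in words),   # positive cues
        sum(w in {"bad", "awful", "boring"} for w in words),  # negative cues
    ]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_head(texts, labels, lr=0.5, epochs=200):
    """Train only the classification head (logistic regression) on frozen features."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for text, y in zip(texts, labels):
            x = extract_features(text)
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            g = p - y  # gradient of the log loss w.r.t. the logit
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return w, b

texts = ["great movie i love it", "awful and boring", "good fun", "bad film"]
labels = [1, 0, 1, 0]
w, b = train_head(texts, labels)

def predict(text):
    """Probability that `text` belongs to the positive class."""
    x = extract_features(text)
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
```

Fine-tuning a Transformer follows the same pattern, except the extractor itself is a deep pre-trained network whose weights may also be updated gently.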


Project: Build a Dialog System using RASA

Conversational agents, or chatbots, have been instrumental across several industries. In this project, you will build a chatbot using RASA, an open-source framework for conversational AI. You will also learn to deploy the chatbot on Slack.


Project: Build a Voice Command System

In this project, you will use not only text data but audio data as well. The objective of this project is to build a deep learning system that converts a speech input to text. It can be used for tasks such as controlling home automation systems with just a few voice commands.

Project 1: Building an Autocomplete System

Autocomplete, or word completion, is a modern-day feature in which the system predicts the rest of the word a user is typing. It is used in a variety of applications and platforms, such as chatting apps, search engines, and e-commerce websites. In this project, you will train a Language Model from scratch to build an autocomplete system. You will use PyTorch to implement this project.

Project 2: Translating Language using Neural Machine Translation

Machine Translation has been a highly active area of research since the early 1990s. With the resurgence of neural networks, machine translation has achieved state-of-the-art performance. In this project, you will build a neural machine translation model. You will use sequence-to-sequence modeling to build this deep learning model to translate text from one language to another automatically.

Project 3: Text Classification using Transformers

Transformer-based neural networks have taken the field of NLP by storm. They have significant advantages over recurrent neural networks on both sequence modeling and transfer learning. You will fine-tune a pre-trained Transformer model to perform several NLP tasks, such as text classification, language modeling, and named entity recognition.

Project 4: Build a Voice Command System

Speech recognition has always been an exciting field of research. It involves the automatic recognition and translation of spoken language into text, and it has a wide range of real-world applications, from home automation to flying fighter jets. In this project, you will use not only text data but audio data as well. The objective of this project is to build a deep learning system that converts a speech input to text. It can be used for tasks such as controlling home automation systems with just a few voice commands.

Project 5: Build your first Chatbot

Chatbots are everywhere today, from booking your flight tickets to ordering food, chances are that you have already interacted with one. In this module, you will build your first chatbot to search for restaurants online and learn how to use it in a real-world application by deploying it on Slack.

FAQ

  • Who should take this course?

    This course is for people who are looking to get into the field of Natural Language Processing, or those who want to brush up their knowledge of NLP and get familiar with the trends in the field. The course provides you with everything you need to know to become an NLP practitioner.

  • I have 2+ years of programming experience, but I have no background in Machine Learning. Is the course right for me?

    The course assumes a prior background in Machine Learning, so we would recommend that you be familiar with the basics of Machine Learning before taking this course.

  • Do I need to install any software before starting the course?

    Yes, you will need to install some software, and you will get instructions for all the required installations as part of the course.

  • What is the refund policy?

    The fee for this course is non-refundable.

  • Do I need to take the modules in a specific order?

    We would highly recommend taking the modules in the order in which they have been designed, to gain the maximum benefit from the course.

  • Do I get a certificate upon completion of the course?

    Yes, you will be given a certificate upon satisfactory completion of the course.

  • What is the fee for this course?

    The fee for this course is INR 10,999.

  • How long can I access the course?

    You will be able to access the course material for six months from the start of the course.

  • When will the classes be held in this course?

    This is an online self-paced course, which you can take any time at your convenience over the 6 months after your purchase.

Customer Support for our Courses & Programs

We are here to support you whenever you need us!