Neuralearn.AI / Deep Learning for Natural Language Processing

  • $49

Deep Learning for Natural Language Processing

  • Course
  • 77 Lessons

Learn the basics of Machine Learning, then dive deep into Deep Learning for Natural Language Processing using TensorFlow 2 and Hugging Face. Master how to build, train, evaluate, test and deploy deep learning models, and understand key MLOps concepts. Go from beginner to solving real-world problems efficiently!

Contents

Introduction to NLP

Welcome
General introduction
Course material
Link to Code

Tensors and Variables in TensorFlow

Master tensor manipulation in TensorFlow 2
Basics
Initialization and Casting
Indexing
Maths Operations
Linear Algebra Operations
Common Methods
Ragged Tensors
Sparse Tensors
String Tensors
Variables

Building Neural Networks with TensorFlow

Master the basics of TensorFlow by creating and training a simple linear regression model
Task Understanding
Data Preparation
Linear Regression Model
Error Sanctioning (Loss Functions)
Training and optimization
Performance Measurement
Validation and testing
Corrective Measures
TensorFlow Datasets
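The training workflow this chapter builds (error sanctioning, i.e. a loss function, plus optimization) can be previewed in a few lines. This is a minimal plain-Python sketch of gradient descent on mean squared error, not the course's TensorFlow code; the toy data, learning rate and epoch count are invented for the example.

```python
# Toy dataset following y = 2x + 1 (invented for this sketch)
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.0, 3.0, 5.0, 7.0, 9.0]

w, b = 0.0, 0.0   # model parameters of y = w*x + b
lr = 0.05         # learning rate
n = len(xs)

for epoch in range(2000):
    # Forward pass and mean-squared-error loss ("error sanctioning")
    preds = [w * x + b for x in xs]
    loss = sum((p - y) ** 2 for p, y in zip(preds, ys)) / n

    # Analytic gradients of the MSE with respect to w and b
    dw = sum(2 * (p - y) * x for p, y, x in zip(preds, ys, xs)) / n
    db = sum(2 * (p - y) for p, y in zip(preds, ys)) / n

    # Gradient-descent update
    w -= lr * dw
    b -= lr * db

print(round(w, 2), round(b, 2))  # converges toward the true parameters 2 and 1
```

In TensorFlow 2 the same loop is expressed with `tf.GradientTape`, a loss function and an optimizer, but the underlying idea is exactly this.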

Text processing for sentiment analysis

Understanding sentiment analysis
Text standardization
Tokenization
One-hot encoding and bag of words
Term Frequency-Inverse Document Frequency (TF-IDF)
Embeddings
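As a preview of the bag-of-words and TF-IDF lessons, here is a rough pure-Python sketch of one common TF-IDF variant (term frequency times inverse document frequency). The tiny corpus is invented for the example; real pipelines would use a library layer such as Keras's TextVectorization or scikit-learn.

```python
import math

# Tiny invented corpus, already standardized and tokenized
docs = [
    ["the", "movie", "was", "great"],
    ["the", "movie", "was", "terrible"],
    ["great", "acting"],
]

def tf_idf(term, doc, docs):
    # Term frequency: how often the term occurs in this document
    tf = doc.count(term) / len(doc)
    # Inverse document frequency: terms rare across the corpus score higher
    df = sum(1 for d in docs if term in d)
    idf = math.log(len(docs) / df)
    return tf * idf

# "the" appears in 2 of 3 documents, "terrible" in only 1,
# so "terrible" gets the higher weight in the second review.
print(tf_idf("the", docs[1], docs))       # low: common across the corpus
print(tf_idf("terrible", docs[1], docs))  # higher: distinctive term
```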

Sentiment Analysis with Recurrent Neural Networks

How Recurrent Neural Networks work
Data Preparation
Building and training RNNs
Advanced RNNs (LSTM and GRU)
1D Convolutional Neural Network

Sentiment Analysis with Transfer Learning

Exploring different ways of evaluating classification models
Understanding Word2vec
Integrating pretrained Word2vec embeddings
Testing
Visualizing embeddings

Neural Machine Translation with Recurrent Neural Networks

Understanding Machine Translation
Data Preparation
Building, training and testing the model
Understanding BLEU score
Coding BLEU score from scratch
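The last lesson codes BLEU from scratch; as a taste, here is one rough sketch of sentence-level BLEU (clipped n-gram precisions, geometric mean, brevity penalty). It is illustrative and unsmoothed, so it is harsher than corpus-level tools such as sacreBLEU.

```python
import math
from collections import Counter

def ngrams(tokens, n):
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def bleu(candidate, reference, max_n=4):
    """Sentence-level BLEU sketch: clipped n-gram precisions for n = 1..max_n,
    combined by geometric mean and multiplied by a brevity penalty."""
    precisions = []
    for n in range(1, max_n + 1):
        cand = Counter(ngrams(candidate, n))
        ref = Counter(ngrams(reference, n))
        # Clip each candidate n-gram count by its count in the reference
        overlap = sum(min(c, ref[g]) for g, c in cand.items())
        total = max(sum(cand.values()), 1)
        precisions.append(overlap / total)
    if min(precisions) == 0:
        return 0.0  # unsmoothed: any empty precision zeroes the score
    geo_mean = math.exp(sum(math.log(p) for p in precisions) / max_n)
    # Brevity penalty punishes candidates shorter than the reference
    bp = 1.0 if len(candidate) > len(reference) else math.exp(1 - len(reference) / len(candidate))
    return bp * geo_mean

ref = "the cat is on the mat".split()
print(bleu(ref, ref))                          # perfect match scores 1.0
print(bleu("the cat is on mat".split(), ref))  # partial overlap scores lower
```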

Neural Machine Translation with Bahdanau Attention

Understanding Bahdanau Attention
Building, training and testing the Bahdanau Attention model

Neural Machine Translation with Transformers

Understanding Transformer Networks
Building, training and testing Transformers
Building Transformers with Custom Attention Layer
Visualizing Attention scores
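The core computation these lessons build on is scaled dot-product attention, Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V. Here is a plain-Python sketch of that formula, with toy vectors invented for the example; a real custom attention layer would implement the same math over batched tensors.

```python
import math

def softmax(row):
    # Numerically stable softmax over one row of scores
    m = max(row)
    exps = [math.exp(x - m) for x in row]
    s = sum(exps)
    return [e / s for e in exps]

def attention(Q, K, V):
    """Scaled dot-product attention over plain lists.
    Shapes: Q is (m, d_k), K is (n, d_k), V is (n, d_v)."""
    d_k = len(K[0])
    # Attention scores: query-key dot products, scaled by sqrt(d_k)
    scores = [[sum(q * k for q, k in zip(qr, kr)) / math.sqrt(d_k) for kr in K]
              for qr in Q]
    weights = [softmax(row) for row in scores]  # each row sums to 1
    # Output: attention-weighted average of the value vectors
    out = [[sum(w * v[j] for w, v in zip(wr, V)) for j in range(len(V[0]))]
           for wr in weights]
    return out, weights

Q = [[1.0, 0.0]]                 # one query
K = [[1.0, 0.0], [0.0, 1.0]]     # two keys
V = [[10.0, 0.0], [0.0, 10.0]]   # two values
out, weights = attention(Q, K, V)
# The query matches the first key more strongly, so the first value dominates.
```

These `weights` rows are exactly what the "Visualizing Attention scores" lesson plots as a heatmap.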

Sentiment Analysis with Transformers

Sentiment analysis with Transformer encoder
Sentiment analysis with LSH Attention

Transfer Learning and Generalized Language Models

Understanding Transfer Learning
ULMFiT
GPT
BERT
ALBERT
GPT-2
RoBERTa
T5

Sentiment Analysis with DeBERTa in Hugging Face Transformers

Data preparation
Building, training and testing the model

E-commerce Search Engine with Sentence Transformers

Problem Understanding and Sentence Embeddings
Dataset preparation
Building, training and testing the model

Build an LLM from Scratch, Part 1: Refresher

RNNs and Attention Models
How Transformers Work
Difference in Training and Inference

Build an LLM from Scratch, Part 2

Global Architecture of Mistral Model
Tokenization
Rotary Positional Encoding (RoPE)
Rotary Positional Encoding (RoPE) Practice
Grouped-Query Attention
Sliding Window Attention
KV Caching
Transformer Block
Full Transformer Model
Deploying Mistral to the cloud (RunPod)
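The RoPE lessons above can be previewed with a short sketch: rotary positional encoding rotates each pair of embedding dimensions by a position-dependent angle, which is neatly expressed with complex numbers. This plain-Python version uses the common base of 10000 and an invented toy vector; Mistral's actual implementation applies the same rotation to batched query and key tensors.

```python
import math
import cmath

def rope(x, pos, base=10000.0):
    """Apply rotary positional encoding to vector x (even length) at position pos.
    Consecutive pairs (x[2i], x[2i+1]) are treated as complex numbers and
    rotated by the angle pos * base**(-2i/d)."""
    d = len(x)
    out = []
    for i in range(d // 2):
        theta = pos * base ** (-2 * i / d)
        z = complex(x[2 * i], x[2 * i + 1]) * cmath.exp(1j * theta)
        out.extend([z.real, z.imag])
    return out

q = [1.0, 0.0, 0.5, 0.5]  # toy embedding, invented for the example
norm = lambda v: math.sqrt(sum(c * c for c in v))
dot = lambda a, b: sum(x * y for x, y in zip(a, b))

# Rotation changes the vector but preserves its norm ...
print(round(norm(rope(q, 3)), 6) == round(norm(q), 6))
# ... and query-key dot products depend only on the *relative* offset
# between positions (here 7 - 5 == 3 - 1), the key property of RoPE.
print(round(dot(rope(q, 5), rope(q, 7)), 9) == round(dot(rope(q, 1), rope(q, 3)), 9))
```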