Synopsis
Chapter 1: Introduction to Language Models
Chapter Goal: History and introduction to language models
Sub-topics:
• What is a language model
• Evolution of language models, from n-grams to today's Transformer-based models
• High-level intro to Google BERT
Chapter 2: Transformers
Chapter Goal: Introduction to Transformers and their architecture
Sub-topics:
• Introduction to Transformers
• Deep dive into Transformer architecture and how attention plays a key role in Transformers
• How Transformers handle tasks like sentiment analysis, question answering, sentence masking, etc.
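The attention mechanism at the core of the Transformer architecture can be sketched in a few lines of plain Python. This is an illustrative single-query sketch, not code from the book; names and shapes are assumptions:

```python
import math

def scaled_dot_product_attention(query, key, value):
    """Toy attention for one query vector over a tiny token sequence.

    query: one vector (list of floats); key, value: one vector per token.
    Real Transformers do this with batched tensors and multiple heads.
    """
    d_k = len(query)
    # Similarity of the query to each key, scaled by sqrt(d_k).
    scores = [sum(q * k for q, k in zip(query, kv)) / math.sqrt(d_k)
              for kv in key]
    # Softmax turns scores into attention weights that sum to 1.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    weights = [e / total for e in exps]
    # The output is the attention-weighted average of the value vectors.
    output = [sum(w * v[i] for w, v in zip(weights, value))
              for i in range(len(value[0]))]
    return output, weights

# The query matches the first key more closely, so the first value
# dominates the output.
output, weights = scaled_dot_product_attention(
    query=[1.0, 0.0],
    key=[[1.0, 0.0], [0.0, 1.0]],
    value=[[10.0, 0.0], [0.0, 10.0]],
)
```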
Chapter 3: Intro to Hugging Face library
Chapter Goal: Introduce the Hugging Face libraries and how they are used to accomplish NLP tasks
Sub-topics:
• What Hugging Face is, and how it emerged as a relevant library for various datasets and models related to NLP
• Creating simple Hugging Face applications for NLP tasks like sentiment analysis, sentence masking, etc.
• Experimenting with the different models available in this space.
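The kind of application this chapter describes can be as short as a few lines with the `transformers` package's `pipeline` API (assuming the package is installed; default models are downloaded on first use, and the mask token varies by model):

```python
# Requires: pip install transformers
from transformers import pipeline

# Sentiment analysis: the pipeline selects a default pretrained model.
classifier = pipeline("sentiment-analysis")
print(classifier("Hugging Face makes NLP approachable."))

# Sentence masking: predict the hidden word. The mask token depends on
# the model (e.g. <mask> for RoBERTa-style models, [MASK] for BERT).
unmasker = pipeline("fill-mask")
print(unmasker("Transformers are a popular <mask> architecture."))
```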
Chapter 4: Code Generator
Chapter Goal: Cover an example of a code generator using Transformer architecture.
Sub-topics:
• Creating a simple code generator where the user input is natural-language text, such as a request to sort a given array of numbers.
• The generator takes the user's text and produces Python code or a YAML (YAML Ain't Markup Language) file, for example a Kubernetes manifest
• Deploying the model on the cloud as a service in Kubernetes
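The input/output contract of such a generator can be previewed with a deliberately simple rule-based sketch. A real version would use a Transformer model to produce the code; the templates and function names below are hypothetical, not from the book:

```python
# Toy text-to-code generator illustrating the contract: natural-language
# prompt in, Python code or Kubernetes YAML out. Hypothetical templates;
# a real implementation would generate these with a Transformer model.
PYTHON_TEMPLATES = {
    "sort": "def solve(numbers):\n    return sorted(numbers)\n",
}

KUBERNETES_TEMPLATE = """\
apiVersion: apps/v1
kind: Deployment
metadata:
  name: {name}
spec:
  replicas: 1
"""

def generate(prompt: str) -> str:
    """Return Python code or Kubernetes YAML matching the prompt."""
    text = prompt.lower()
    if "kubernetes" in text or "deployment" in text:
        return KUBERNETES_TEMPLATE.format(name="codegen-demo")
    for keyword, code in PYTHON_TEMPLATES.items():
        if keyword in text:
            return code
    return "# no template matched\n"

print(generate("sort a given array of numbers"))
```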
Chapter 5: Transformer Based Applications
Chapter Goal: Summary of the topics around Transformers, Hugging Face libraries, and their usage.
Sub-topics:
• Summary of Transformer based applications and language models.
• Summarize the Hugging Face libraries and why they are relevant in NLP.