Are you looking for a book where you can learn about deep learning and PyTorch without having to spend hours deciphering cryptic text and code? A technical book that’s also easy and enjoyable to read?
This is it!
In this third volume of the series, you’ll be introduced to all things sequence-related: recurrent neural networks and their variations, sequence-to-sequence models, attention, self-attention, and Transformers.
This volume also includes a crash course on natural language processing (NLP), from the basics of word tokenization all the way up to fine-tuning large models (BERT and GPT-2) using the HuggingFace library.
By the time you finish this book, you’ll have a thorough understanding of the concepts and tools necessary to start developing, training, and fine-tuning language models using PyTorch.
This volume is more demanding than the other two, and you’re going to enjoy it more if you already have a solid understanding of deep learning models.