Kickstart your NLP journey by exploring BERT and its variants such as ALBERT, RoBERTa, DistilBERT, VideoBERT, and more with Hugging Face's transformers library
Key Features
- Explore the encoder and decoder of the transformer model
- Become well-versed with BERT along with ALBERT, RoBERTa, and DistilBERT
- Discover how to pre-train and fine-tune BERT models for several NLP tasks
Book Description
BERT (Bidirectional Encoder Representations from Transformers) has revolutionized the world of natural language processing (NLP) with promising results. This book is an introductory guide that will help you get to grips with Google's BERT architecture. With a detailed explanation of the transformer architecture, this book will help you understand how the transformer's encoder and decoder work.
You'll explore the BERT architecture by learning how the BERT model is pre-trained and how to use pre-trained BERT for downstream tasks by fine-tuning it for NLP tasks such as sentiment analysis and text summarization with the Hugging Face transformers library. As you advance, you'll learn about different variants of BERT such as ALBERT, RoBERTa, and ELECTRA, and look at SpanBERT, which is used for NLP tasks like question answering. You'll also cover simpler and faster BERT variants based on knowledge distillation, such as DistilBERT and TinyBERT. The book takes you through MBERT, XLM, and XLM-R in detail and then introduces you to Sentence-BERT, which is used for obtaining sentence representations. Finally, you'll discover domain-specific BERT models such as BioBERT and ClinicalBERT, as well as an interesting variant called VideoBERT.
By the end of this BERT book, you'll be well-versed in using BERT and its variants to perform practical NLP tasks.
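To give a flavour of the hands-on style the description refers to, here is a minimal sketch (not taken from the book) of loading a pre-trained BERT checkpoint with the Hugging Face transformers library and extracting contextual word embeddings; the checkpoint name bert-base-uncased and the example sentence are illustrative assumptions.

```python
# A minimal sketch, not the book's code: extracting contextual word
# embeddings from a pre-trained BERT model with Hugging Face transformers.
# The checkpoint name "bert-base-uncased" is just a common example.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

sentence = "BERT produces contextual representations."
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# One 768-dimensional vector per input token, including [CLS] and [SEP].
token_embeddings = outputs.last_hidden_state
print(token_embeddings.shape)  # torch.Size([1, num_tokens, 768])
```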
What You Will Learn
- Understand the transformer model from the ground up
- Find out how BERT works and pre-train it using masked language model (MLM) and next sentence prediction (NSP) tasks
- Get hands-on with BERT by learning to generate contextual word and sentence embeddings
- Fine-tune BERT for downstream tasks (a minimal sketch follows this list)
- Get to grips with ALBERT, RoBERTa, ELECTRA, and SpanBERT models
- Get the hang of the BERT models based on knowledge distillation
- Understand cross-lingual models such as XLM and XLM-R
- Explore Sentence-BERT, VideoBERT, and BART
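As a companion to the fine-tuning bullet above, the following hedged sketch runs sentiment analysis with an already fine-tuned BERT-family checkpoint through the transformers pipeline API; it downloads a default English sentiment model and is an illustration of the kind of task the book covers, not the book's own code.

```python
# A minimal sketch, not the book's code: sentiment analysis with a
# fine-tuned BERT-family model via the Hugging Face pipeline API.
from transformers import pipeline

# Loads a default English sentiment model (a DistilBERT checkpoint
# fine-tuned on SST-2) the first time it is called.
sentiment = pipeline("sentiment-analysis")

print(sentiment("The explanation of the transformer encoder was very clear."))
# Expected output shape (scores will vary):
# [{'label': 'POSITIVE', 'score': 0.99...}]
```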
Who this book is for
This book is for NLP professionals and data scientists looking to simplify NLP tasks to enable efficient language understanding using BERT. A basic understanding of NLP concepts and deep learning is required to get the best out of this book.
"Sinopsis" puede pertenecer a otra edición de este libro.
Sudharsan Ravichandiran is a data scientist and artificial intelligence enthusiast. He holds a Bachelor's degree in Information Technology from Anna University. His research focuses on practical implementations of deep learning and reinforcement learning, including natural language processing and computer vision. He is an open-source contributor and loves answering questions on Stack Overflow.
"Sobre este título" puede pertenecer a otra edición de este libro.
EUR 8.54 shipping from the United States to Spain
EUR 4.29 shipping from the United Kingdom to Spain
Bookseller: ThriftBooks-Dallas, Dallas, TX, United States
Paperback. Condition: As New. No jacket. Pages are clean and are not marred by notes or folds of any kind. Item ref. no.: G1838821597I2N00
Quantity available: 1
Bookseller: medimops, Berlin, Germany
Condition: Very good. Book and/or dust jacket shows some signs of wear on the binding, dust jacket, or pages. Item ref. no.: M01838821597-V
Quantity available: 1
Bookseller: PBShop.store UK, Fairford, GLOS, United Kingdom
Paperback. Condition: New. New book. Delivered from our UK warehouse in 4 to 14 business days. This book is printed on demand. Established seller since 2000. Item ref. no.: L0-9781838821593
Quantity available: more than 20
Bookseller: Ria Christie Collections, Uxbridge, United Kingdom
Condition: New. Item ref. no.: ria9781838821593_new
Quantity available: more than 20
Bookseller: PBShop.store US, Wood Dale, IL, United States
Paperback. Condition: New. New book. Shipped from the UK. This book is printed on demand. Established seller since 2000. Item ref. no.: L0-9781838821593
Quantity available: more than 20
Bookseller: California Books, Miami, FL, United States
Condition: New. Item ref. no.: I-9781838821593
Quantity available: more than 20
Bookseller: BargainBookStores, Grand Rapids, MI, United States
Paperback or softback. Condition: New. Getting Started with Google BERT: Build and train state-of-the-art natural language processing models using BERT. Item ref. no.: BBS-9781838821593
Quantity available: 5
Bookseller: THE SAINT BOOKSTORE, Southport, United Kingdom
Paperback / softback. Condition: New. This item is printed on demand. New copy - usually dispatched within 5-9 working days. Item ref. no.: C9781838821593
Quantity available: more than 20
Bookseller: Rarewaves.com UK, London, United Kingdom
Paperback. Condition: New. Item ref. no.: LU-9781838821593
Quantity available: more than 20
Bookseller: GreatBookPrices, Columbia, MD, United States
Condition: New. Item ref. no.: 42509814-n
Quantity available: more than 20