Step into the world of LLMs with this practical guide that takes you from the fundamentals to deploying advanced applications using LLMOps best practices
Included with your book: a free PDF copy, AI assistant, and next-gen reader
Artificial intelligence has undergone rapid advancements, and Large Language Models (LLMs) are at the forefront of this revolution. This LLM book offers insights into designing, training, and deploying LLMs in real-world scenarios by leveraging MLOps best practices. The guide walks you through building an LLM-powered twin that’s cost-effective, scalable, and modular. It moves beyond isolated Jupyter notebooks, focusing on how to build production-grade end-to-end LLM systems.
Throughout this book, you will learn data engineering, supervised fine-tuning, and deployment. The hands-on approach to building the LLM Twin use case will help you implement MLOps components in your own projects. You will also explore cutting-edge advancements in the field, including inference optimization, preference alignment, and real-time data processing, making this a vital resource for those looking to apply LLMs in their projects.
By the end of this book, you will be proficient in deploying LLMs that solve practical problems while maintaining low-latency and high-availability inference capabilities. Whether you are new to artificial intelligence or an experienced practitioner, this book delivers guidance and practical techniques that will deepen your understanding of LLMs and sharpen your ability to implement them effectively.
This book is for AI engineers, NLP professionals, and LLM engineers looking to deepen their understanding of LLMs. Basic knowledge of LLMs, the GenAI landscape, Python, and AWS is recommended. Whether you are new to AI or looking to enhance your skills, this book provides comprehensive guidance on implementing LLMs in real-world scenarios.
"Sinopsis" puede pertenecer a otra edición de este libro.
Paul Iusztin is a senior ML and MLOps engineer at Metaphysic, a leading GenAI platform, where he serves as one of the core engineers taking its deep learning products to production. With over seven years of experience, he has also built GenAI, computer vision, and MLOps solutions for CoreAI, Everseen, and Continental. Paul's passion and mission are to build data-intensive AI/ML products that serve the world and to educate others about the process. As the founder of Decoding ML, a channel for battle-tested content on designing, coding, and deploying production-grade ML, Paul has significantly enriched the engineering and MLOps community. His weekly content on ML engineering and his open-source courses on end-to-end ML life cycles, such as Hands-on LLMs and LLM Twin, testify to his valuable contributions.
Maxime Labonne is a Senior Staff Machine Learning Scientist at Liquid AI, serving as the head of post-training. He holds a Ph.D. in Machine Learning from the Polytechnic Institute of Paris and is recognized as a Google Developer Expert in AI/ML. An active blogger, he has made significant contributions to the open-source community, including the LLM Course on GitHub, tools such as LLM AutoEval, and several state-of-the-art models like NeuralBeagle and Phixtral. He is the author of the best-selling book "Hands-On Graph Neural Networks Using Python," published by Packt.
"Sobre este título" puede pertenecer a otra edición de este libro.
EUR 2.28 shipping within the United States of America
Seller: GreatBookPrices, Columbia, MD, United States of America
Condition: New. Seller reference no.: 48596664-n
Quantity available: More than 20 available
Seller: Grand Eagle Retail, Bensenville, IL, United States of America
Paperback. Condition: New.
Key features: build and refine LLMs step by step, covering data preparation, RAG, and fine-tuning; learn essential skills for deploying and monitoring LLMs, ensuring optimal performance in production; utilize preference alignment, evaluation, and inference optimization to enhance the performance and adaptability of your LLM applications.
What you will learn: implement robust data pipelines and manage LLM training cycles; create your own LLM and refine it with the help of hands-on examples; get started with LLMOps by diving into core MLOps principles such as orchestrators and prompt monitoring; perform supervised fine-tuning and LLM evaluation; deploy end-to-end LLM solutions using AWS and other tools; design scalable and modular LLM systems; learn about RAG applications by building a feature and inference pipeline.
Shipping may be from multiple locations in the US or from the UK, depending on stock availability. Seller reference no.: 9781836200079
Quantity available: 1 available
Seller: California Books, Miami, FL, United States of America
Condition: New. Seller reference no.: I-9781836200079
Quantity available: More than 20 available
Seller: Textbook Campus, Lexington, KY, United States of America
Paperback. Condition: New. Appears never used, or very lightly used. All of our books come with a 30-day, money-back guarantee. Item does not include any supplemental items such as access codes, discs, etc. Order ships quickly! Seller reference no.: mon0000023134
Quantity available: 1 available
Seller: GreatBookPrices, Columbia, MD, United States of America
Condition: As New. Unread book in perfect condition. Seller reference no.: 48596664
Quantity available: More than 20 available
Seller: PBShop.store UK, Fairford, GLOS, United Kingdom
Paperback. Condition: New. New book. Delivered from our UK warehouse in 4 to 14 business days. This book is printed on demand. Established seller since 2000. Seller reference no.: L0-9781836200079
Quantity available: More than 20 available
Seller: PBShop.store US, Wood Dale, IL, United States of America
Paperback. Condition: New. New book. Shipped from the UK. This book is printed on demand. Established seller since 2000. Seller reference no.: L0-9781836200079
Quantity available: More than 20 available
Seller: Ria Christie Collections, Uxbridge, United Kingdom
Condition: New. Seller reference no.: ria9781836200079_new
Quantity available: More than 20 available
Seller: Rarewaves USA, Oswego, IL, United States of America
Paperback. Condition: New. LLM Engineering offers a detailed roadmap for building, training, and deploying Large Language Models, complete with practical examples and advanced techniques, making it an essential guide for modern AI professionals. Seller reference no.: LU-9781836200079
Quantity available: More than 20 available
Seller: GreatBookPricesUK, Woodford Green, United Kingdom
Condition: New. Seller reference no.: 48596664-n
Quantity available: More than 20 available