Transformer, BERT, and GPT: Including ChatGPT and Prompt Engineering (MLI Generative AI Series)
Author: Oswald Campesato
Publisher: Mercury Learning and Information
Publication Date: 2023-11-30
Language: English
Print Length: 364 pages
ISBN-10: 1683928989
ISBN-13: 9781683928980
Book Description
This book provides comprehensive coverage of the Transformer architecture, BERT models, and the GPT series, including GPT-3 and GPT-4. Spanning ten chapters, it begins with foundational concepts such as the attention mechanism and tokenization techniques, explores the nuances of the Transformer and BERT architectures, and culminates in advanced topics on the latest models in the GPT series, including ChatGPT. Key chapters provide insights into the evolution and significance of attention in deep learning, the intricacies of the Transformer architecture, a two-part exploration of the BERT family, and hands-on guidance for working with GPT-3. The concluding chapters present an overview of ChatGPT, GPT-4, and visualization using generative AI. The book also covers influential AI organizations such as DeepMind, OpenAI, Cohere, and Hugging Face. Readers will gain a comprehensive understanding of the current landscape of NLP models, their underlying architectures, and their practical applications. Companion files with numerous code samples and figures from the book are included.
FEATURES:
- Provides comprehensive coverage of the Transformer architecture, BERT models, and the GPT series, including GPT-3 and GPT-4.
- Features companion files with numerous code samples and figures from the book.
About the Author
Oswald Campesato is an adjunct instructor at UC Santa Cruz who specializes in deep learning, Python, data science, and GPT-4. He is the author or co-author of more than forty books, including Python and Machine Learning, Data Cleaning, and NLP for Developers (all Mercury Learning and Information).
TABLE OF CONTENTS
1: The Attention Mechanism
2: Tokenization
3: Transformer Architecture Introduction
4: Transformer Architecture in Greater Depth
5: The BERT Family Introduction
6: The BERT Family in Greater Depth
7: Working with GPT-3 Introduction
8: Working with GPT-3 in Greater Depth
9: ChatGPT and GPT-4
10: Visualization with Generative AI
Index