Transformers for Natural Language Processing: Build, train, and fine-tune deep neural network architectures for NLP with Python, PyTorch, TensorFlow, BERT, and GPT-3, 2nd Edition
Author: Denis Rothman (Foreword by Antonio Gulli)
Publisher: Packt Publishing; 2nd edition (March 25, 2022)
Language: English
Print Length: 564 pages
ISBN-10: 1803247339
ISBN-13: 9781803247335
Book Description
The under-the-hood workings of transformers, fine-tuning GPT-3 models, DeBERTa, vision models, and the start of the Metaverse, using a variety of NLP platforms: Hugging Face, OpenAI API, Trax, and AllenNLP
Key Features
Implement models, such as BERT, Reformer, and T5, that outperform classical language models
Compare NLP applications using GPT-3, GPT-2, and other transformers
Analyze advanced use cases, including polysemy, cross-lingual learning, and computer vision
Transformers are a game-changer for natural language understanding (NLU) and have become one of the pillars of artificial intelligence.
Transformers for Natural Language Processing, 2nd Edition, investigates deep learning for machine translations, speech-to-text, text-to-speech, language modeling, question-answering, and many more NLP domains with transformers.
An Industry 4.0 AI specialist needs to be adaptable; knowing just one NLP platform is not enough anymore. Different platforms have different benefits depending on the application, whether it’s cost, flexibility, ease of implementation, results, or performance. In this book, we analyze numerous use cases with Hugging Face, Google Trax, OpenAI, and AllenNLP.
This book takes transformers’ capabilities further by combining multiple NLP techniques, such as sentiment analysis, named entity recognition, and semantic role labeling, to analyze complex use cases, such as dissecting fake news on Twitter. You will also see how transformers can create code from just a brief description.
By the end of this NLP book, you will understand transformers from a cognitive science perspective and be proficient in applying pretrained transformer models to various datasets.
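As a small illustration of the kind of pretrained-transformer usage described above, the following is a minimal sketch (not taken from the book) that runs two of these tasks, sentiment analysis and named entity recognition, through the Hugging Face transformers pipeline API; the default models the pipelines download are the library's own choices, not ones prescribed by the book.

# Minimal sketch (not from the book): pretrained transformers via the
# Hugging Face pipeline API. Requires: pip install transformers torch
from transformers import pipeline

# Sentiment analysis with the pipeline's default pretrained model.
sentiment = pipeline("sentiment-analysis")
print(sentiment("Transformers are a game-changer for natural language understanding."))
# Output shape: [{'label': ..., 'score': ...}]

# Named entity recognition, aggregating sub-word tokens into whole entities.
ner = pipeline("ner", aggregation_strategy="simple")
print(ner("Transformers for Natural Language Processing was published by Packt in 2022."))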
What you will learn
Discover new ways of performing NLP techniques with the latest pretrained transformers
Grasp the workings of the original Transformer, GPT-3, BERT, T5, DeBERTa, and Reformer
Find out how ViT and CLIP label images (including blurry ones!) and reconstruct images using DALL-E
Carry out sentiment analysis, text summarization, casual language analysis, machine translations, and more using TensorFlow, PyTorch, and GPT-3
Measure the productivity of key transformers to define their scope, potential, and limits in production