Introduction to Transformers for NLP

Introduction to Transformers for NLP: With the Hugging Face Library and Models to Solve Problems, 1st ed. edition
by Shashank Mohan Jain (Author)

Publisher: Apress; 1st ed. edition (October 21, 2022)
Language: English
Pages: 176
ISBN-10: 1484288432
ISBN-13: 9781484288436

Book Description
Get a hands-on introduction to Transformer architecture using the Hugging Face library. This book explains how Transformers are changing the AI domain, particularly in the area of natural language processing.

This book covers the Transformer architecture and its relevance in natural language processing (NLP). It starts with an introduction to NLP and the progression of language models from n-grams to Transformer-based architectures. Next, it offers some basic Transformer examples using Google Colab. It then introduces the Hugging Face ecosystem and the libraries and models it provides. Moving on, it explains language models such as Google's BERT with examples, before providing a deep dive into the Hugging Face API, using different language models to address tasks such as sentence classification, sentiment analysis, summarization, and text generation.
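The n-gram models the book starts from can be sketched in a few lines of plain Python. Below is a minimal bigram (2-gram) language model that estimates P(next word | current word) from counts; all names and the toy corpus are illustrative, not taken from the book.

```python
from collections import Counter, defaultdict

def train_bigram_model(text):
    """Count bigram frequencies and normalize them into
    conditional probabilities P(next | current)."""
    tokens = text.lower().split()
    counts = defaultdict(Counter)
    for current, nxt in zip(tokens, tokens[1:]):
        counts[current][nxt] += 1
    # Maximum-likelihood estimate: count / total followers of the word
    return {
        word: {nxt: c / sum(followers.values()) for nxt, c in followers.items()}
        for word, followers in counts.items()
    }

corpus = "the cat sat on the mat and the cat ran"
model = train_bigram_model(corpus)
print(model["the"])  # {'cat': 0.666..., 'mat': 0.333...}
```

A Transformer replaces these fixed-window counts with learned attention over the whole context, which is the progression the book traces.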

After completing Introduction to Transformers for NLP, you will understand Transformer concepts and be able to solve problems using the Hugging Face library.

What You Will Learn
Understand language models and their importance in NLP and NLU (Natural Language Understanding)
Master Transformer architecture through practical examples
Use the Hugging Face library in Transformer-based language models
Create a simple code generator in Python based on Transformer architecture
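As a taste of the Hugging Face API the book covers, the library's `pipeline` helper handles tasks such as sentiment analysis in a few lines. This is a minimal sketch assuming `transformers` is installed; the library downloads a default pre-trained checkpoint on first use.

```python
from transformers import pipeline

# The library picks a default model for the task
# (a DistilBERT checkpoint fine-tuned for sentiment, at the time of writing)
classifier = pipeline("sentiment-analysis")

result = classifier("This book made Transformers finally click for me.")
print(result)
```

The same `pipeline` interface covers the other tasks the book addresses, e.g. `pipeline("summarization")` and `pipeline("text-generation")`.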

Who This Book Is For
Data scientists and software developers interested in developing their skills in NLP and NLU (Natural Language Understanding)
