Transformers for NLP Notes

Edition: Transformers for Natural Language Processing: Build, Train, and Fine-Tune Deep Neural Network Architectures for NLP with Python, PyTorch, TensorFlow, BERT, and GPT-3, 2nd Edition



  • Chapter 1: What are Transformers?
  • Chapter 2: Getting Started with the Architecture of the Transformer Model
  • Chapter 3: Fine-Tuning BERT Models
  • Chapter 4: Pretraining a RoBERTa Model from Scratch
  • Chapter 5: Downstream NLP Tasks with Transformers
  • Chapter 6: Machine Translation with the Transformer
  • Chapter 7: The Rise of Suprahuman Transformers with GPT-3 Engines
  • Chapter 8: Applying Transformers to Legal and Financial Documents for AI Text Summarization
  • Chapter 9: Matching Tokenizers and Datasets
  • Chapter 10: Semantic Role Labeling with BERT-Based Transformers
  • Chapter 11: Let Your Data Do the Talking: Story, Questions, and Answers
  • Chapter 12: Detecting Customer Emotions to Make Predictions
  • Chapter 13: Analyzing Fake News with Transformers
  • Chapter 14: Interpreting Black Box Transformer Models
  • Chapter 15: From NLP to Task-Agnostic Transformer Models
  • Chapter 16: The Emergence of Transformer-Driven Copilots

