E-book, English, 602 pages
Rothman, Transformers for Natural Language Processing
2nd edition, 2022
ISBN: 978-1-80324-348-1
Publisher: De Gruyter
Format: EPUB
Copy protection: 0 - no protection
Build, train, and fine-tune deep neural network architectures for NLP with Python, Hugging Face, and OpenAI's GPT-3, ChatGPT, and GPT-4
Table of Contents
- What are Transformers?
- Getting Started with the Architecture of the Transformer Model
- Fine-Tuning BERT Models
- Pretraining a RoBERTa Model from Scratch
- Downstream NLP Tasks with Transformers
- Machine Translation with the Transformer
- The Rise of Suprahuman Transformers with GPT-3 Engines
- Applying Transformers to Legal and Financial Documents for AI Text Summarization
- Matching Tokenizers and Datasets
- Semantic Role Labeling with BERT-Based Transformers
- Let Your Data Do the Talking: Story, Questions, and Answers
- Detecting Customer Emotions to Make Predictions
- Analyzing Fake News with Transformers
- Interpreting Black Box Transformer Models
- From NLP to Task-Agnostic Transformer Models
- The Emergence of Transformer-Driven Copilots
- The Consolidation of Suprahuman Transformers with OpenAI's ChatGPT and GPT-4
- Appendix I — Terminology of Transformer Models
- Appendix II — Hardware Constraints for Transformer Models
- Appendix III — Generic Text Completion with GPT-2
- Appendix IV — Custom Text Completion with GPT-2
- Appendix V — Answers to the Questions