Examples of Pre-trained Models

August 23, 2023
  • BERT: Bidirectional Encoder Representations from Transformers. Understands text context from both directions, so each token's representation reflects the words before and after it (see the loading sketch after this list).
  • GPT-3: Generative Pre-trained Transformer 3. An autoregressive model with advanced generative capabilities for producing human-like text.
  • RoBERTa: Robustly Optimized BERT Pretraining Approach. Improves on BERT's performance by training longer on more data.
  • T5: Text-to-Text Transfer Transformer. A unified framework that casts different NLP tasks as text-to-text problems (see the second sketch below).
  • DALL-E: Generates images from text descriptions, combining vision and language in a single model.
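
To make this concrete, here is a minimal sketch of loading one of these pre-trained models and getting contextual embeddings out of it. It assumes the Hugging Face transformers library (with PyTorch) and the public "bert-base-uncased" checkpoint; the list above does not prescribe any particular toolkit, so treat these names as one common choice rather than the only option.

    # Minimal sketch: load a pre-trained BERT and extract contextual embeddings.
    # Assumes the Hugging Face transformers library and PyTorch are installed.
    from transformers import AutoModel, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModel.from_pretrained("bert-base-uncased")

    # Tokenize a sentence and run it through the encoder.
    inputs = tokenizer("Pre-trained models transfer knowledge.", return_tensors="pt")
    outputs = model(**inputs)

    # last_hidden_state holds one contextual embedding per token, built from
    # both left and right context (this is BERT's bidirectionality in action).
    print(outputs.last_hidden_state.shape)  # (batch_size, num_tokens, 768)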
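
T5's "unified framework" claim is easiest to see in code: every task, whether translation, summarization, or classification, is phrased as text in, text out. The sketch below uses the same assumed transformers library and the public "t5-small" checkpoint; the translation prompt is the standard one from the T5 paper's task prefixes.

    # Hedged sketch of T5's text-to-text framing, assuming the Hugging Face
    # transformers library and the public "t5-small" checkpoint.
    from transformers import T5ForConditionalGeneration, T5Tokenizer

    tokenizer = T5Tokenizer.from_pretrained("t5-small")
    model = T5ForConditionalGeneration.from_pretrained("t5-small")

    # The task is specified as part of the input text itself.
    input_ids = tokenizer(
        "translate English to German: The house is wonderful.",
        return_tensors="pt",
    ).input_ids
    output_ids = model.generate(input_ids, max_new_tokens=20)
    print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
    # Expected output along the lines of: "Das Haus ist wunderbar."

Swapping the prefix (e.g. "summarize:") retargets the same model to a different task, which is the point of the text-to-text design.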