Generative AI Language Modeling with Transformers

What you’ll learn

  1. Explain the concept of attention mechanisms in transformers, including their role in capturing contextual information.
  2. Describe language modeling with the decoder-based GPT and encoder-based BERT.
  3. Implement positional encoding, masking, and attention mechanisms; perform document classification; and build LLMs such as GPT and BERT.
  4. Use transformer-based models and PyTorch functions for text classification, language translation, and modeling.
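The attention mechanism named in the objectives above can be sketched in a few lines. This is a minimal, hedged illustration (not course material): scaled dot-product attention with a causal mask of the kind a decoder-based model like GPT uses, written in NumPy for self-containment rather than the PyTorch used in the course.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V, mask=None):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)            # pairwise similarity scores
    if mask is not None:
        scores = np.where(mask, scores, -1e9)  # blocked positions -> ~ -inf
    # Row-wise softmax (numerically stabilized)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

# Causal (decoder-style) mask: each token attends only to itself and earlier tokens.
seq_len, d_k = 4, 8
rng = np.random.default_rng(0)
Q = rng.normal(size=(seq_len, d_k))
K = rng.normal(size=(seq_len, d_k))
V = rng.normal(size=(seq_len, d_k))
causal_mask = np.tril(np.ones((seq_len, seq_len), dtype=bool))
out, w = scaled_dot_product_attention(Q, K, V, causal_mask)
```

Each row of `w` sums to 1, and entries above the diagonal are zero, which is what prevents a GPT-style decoder from attending to future tokens; an encoder-based model like BERT simply omits the mask.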
Price: Free
Language: English
Duration: 8 Hours
Certificate: No
Course Pace: Self-Paced
Course Level: Advanced
Course Category: Generative AI
Course Instructor: IBM