Generative AI Language Modeling with Transformers
What you’ll learn
- Explain the concept of attention mechanisms in transformers, including their role in capturing contextual information.
- Describe language modeling with the decoder-based GPT and encoder-based BERT.
- Implement positional encoding, masking, and attention mechanisms; apply them to document classification; and build LLMs such as GPT and BERT.
- Use transformer-based models and PyTorch functions for text classification, language translation, and language modeling.
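As a taste of the implementation work listed above, the sinusoidal positional encoding covered in the course can be sketched in a few lines of PyTorch. This is an illustrative sketch, not course material; the function name and dimensions are our own choices.

```python
import math

import torch


def sinusoidal_positional_encoding(seq_len: int, d_model: int) -> torch.Tensor:
    """Return the (seq_len, d_model) sinusoidal positional-encoding matrix
    from "Attention Is All You Need": sines on even dimensions, cosines on odd."""
    position = torch.arange(seq_len, dtype=torch.float32).unsqueeze(1)  # (seq_len, 1)
    # Frequencies 1 / 10000^(2i/d_model) for each pair of dimensions.
    div_term = torch.exp(
        torch.arange(0, d_model, 2, dtype=torch.float32) * (-math.log(10000.0) / d_model)
    )  # (d_model / 2,)
    pe = torch.zeros(seq_len, d_model)
    pe[:, 0::2] = torch.sin(position * div_term)  # even dimensions
    pe[:, 1::2] = torch.cos(position * div_term)  # odd dimensions
    return pe


pe = sinusoidal_positional_encoding(seq_len=16, d_model=32)
print(pe.shape)  # torch.Size([16, 32])
```

In a transformer, this matrix is simply added to the token embeddings so that the model can distinguish positions, since attention itself is order-agnostic.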
Price
Free
Language
English
Duration
8 Hours
Certificate
No
Course Pace
Self-paced
Course Level
Advanced
Course Category
Generative AI
Course Instructor
IBM