(Weeks 7–8 | Lec 6 Hrs / Lab 12 Hrs / Ext 0 Hrs | 18 Total Hrs | 0.8 Credit Hours)
Students will:
- Understand Transformer architectures and attention mechanisms.
- Fine-tune BERT and GPT models for classification, Q&A, and summarization.
- Deploy LLMs for basic inference tasks.
Prerequisite: ANLP 103 – Word Embeddings & Topic Modeling
Tools: Hugging Face Transformers, TensorFlow
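To preview the first learning outcome, here is a minimal sketch of scaled dot-product attention, the core operation inside Transformer architectures. It uses only the Python standard library with toy matrices; it is an illustration of the formula Attention(Q, K, V) = softmax(QKᵀ/√d_k)V, not code from the course materials.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of floats."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q·Kᵀ / sqrt(d_k)) · V, on plain lists."""
    d_k = len(K[0])
    outputs, weights = [], []
    for q in Q:
        # Similarity of this query to every key, scaled by sqrt(d_k).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k) for k in K]
        w = softmax(scores)  # attention weights sum to 1
        weights.append(w)
        # Output is the attention-weighted average of the value vectors.
        outputs.append([sum(wi * v[j] for wi, v in zip(w, V))
                        for j in range(len(V[0]))])
    return outputs, weights

# Toy example: 2 query vectors attending over 3 key/value pairs (d_k = 2).
Q = [[1.0, 0.0], [0.0, 1.0]]
K = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
V = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
outputs, weights = scaled_dot_product_attention(Q, K, V)
print(weights[0])  # attention weights for the first query
```

In the lab, the same computation runs inside every attention head of the BERT and GPT models loaded through Hugging Face Transformers; this sketch just makes the arithmetic visible at toy scale.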