(Weeks 13–15 | Lecture 9 Hrs / Lab 27 Hrs / Ext 0 Hrs | 36 Total Contact Hrs | 1.2 Semester Credits)
Students will:
- Adapt pre-trained models like MobileNet, ResNet, and BERT to new domains.
- Fine-tune deep models on specific datasets without overfitting.
- Customize transformer models for specialized NLP tasks.
Prerequisite: MLDL 106 – Sequence Models (RNN, LSTM, GRU)
Tools: TensorFlow Hub, Hugging Face Transformers
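The core pattern taught in this module (freeze a pre-trained backbone, train only a new task-specific head) can be illustrated with a minimal, library-light sketch. This is not the course's actual lab code: the "frozen backbone" here is a fixed random projection standing in for pre-trained MobileNet/ResNet/BERT layers, and the data is a synthetic toy problem, so the example shows only the transfer-learning mechanics.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a pre-trained backbone: a fixed (frozen) projection.
# In practice this would be MobileNet/ResNet/BERT layers frozen via
# layer.trainable = False (Keras) or param.requires_grad = False (PyTorch).
W_frozen = rng.normal(size=(4, 8))

def features(x):
    """Frozen feature extractor: never updated during fine-tuning."""
    return np.tanh(x @ W_frozen)

# Toy binary-classification data in the "new domain".
X = rng.normal(size=(200, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

# Only the new classification head is trained (transfer learning).
w_head = np.zeros(8)
b_head = 0.0
lr = 0.5

for _ in range(300):
    h = features(X)                                    # frozen features
    p = 1.0 / (1.0 + np.exp(-(h @ w_head + b_head)))   # sigmoid head
    grad = p - y                                       # dL/dlogit, cross-entropy
    w_head -= lr * h.T @ grad / len(X)
    b_head -= lr * grad.mean()

acc = ((1.0 / (1.0 + np.exp(-(features(X) @ w_head + b_head))) > 0.5) == y).mean()
print(f"train accuracy of the new head: {acc:.2f}")
```

Training only the head keeps the parameter count small relative to the dataset, which is the main lever against overfitting when adapting large pre-trained models to small domain-specific datasets; the same structure carries over to the Keras/Hugging Face workflows used in the labs.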