Efficient and Modular Implicit Differentiation (Machine Learning Research Paper Explained)
#implicitfunction #jax #autodiff
Many problems in Machine Learning involve loops of inner and outer optimization. Finding update steps for the outer loop is usually difficult, because one has to differentiate through the inner loop's optimization procedure over multiple steps. Such loop unrolling is memory-intensive and limited to very few steps. Other papers have found ways around unrolling for specific, individual problems. This paper proposes a unified framework for implicit differentiation of inner optimization procedures without unrolling, and provides implementations that integrate seamlessly into JAX.
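The core idea can be sketched in plain JAX (this is a minimal illustration, not the paper's actual API): instead of unrolling the inner loop, we register a custom backward pass that applies the implicit function theorem at the inner solution. The inner objective `f` and the gradient-descent solver below are made-up examples for demonstration.

```python
# Sketch: implicit differentiation through an inner optimization,
# using jax.custom_vjp instead of unrolling the inner loop.
import jax
import jax.numpy as jnp

def f(x, theta):
    # Toy inner objective (assumed for illustration); optimality
    # condition is F(x, theta) = grad_x f(x, theta) = 0.
    return jnp.sum((x - theta) ** 2) + 0.1 * jnp.sum(x ** 2)

def inner_solve(theta, steps=100, lr=0.1):
    # Black-box inner solver: plain gradient descent on x.
    x = jnp.zeros_like(theta)
    g = jax.grad(f)
    for _ in range(steps):
        x = x - lr * g(x, theta)
    return x

@jax.custom_vjp
def solve(theta):
    return inner_solve(theta)

def solve_fwd(theta):
    x_star = inner_solve(theta)
    return x_star, (x_star, theta)

def solve_bwd(res, v):
    x_star, theta = res
    # Implicit function theorem: with F(x, theta) = 0 at x*,
    # dx*/dtheta = -(dF/dx)^{-1} dF/dtheta.
    # Backprop the cotangent v via one linear solve, no unrolling.
    F = jax.grad(f, argnums=0)
    dF_dx = jax.jacobian(F, argnums=0)(x_star, theta)
    dF_dtheta = jax.jacobian(F, argnums=1)(x_star, theta)
    u = jnp.linalg.solve(dF_dx.T, v)
    return (-dF_dtheta.T @ u,)

solve.defvjp(solve_fwd, solve_bwd)

theta = jnp.array([1.0, 2.0])
# Outer gradient flows through the inner solution implicitly.
grad_outer = jax.grad(lambda t: jnp.sum(solve(t)))(theta)
```

The forward pass can use any solver, even a non-differentiable one; the backward pass only needs derivatives of the optimality condition at the solution, which is exactly the modularity the paper exploits.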
OUTLINE:
0:00 - Intro & Overview
2:05 - Automatic Differentiation of Inner Optimizations
4:30 - Example: Meta-Learning
7:45 - Unrolling Optimization
13:00 - Unified Framework Overview & Pseudocode
21:10 - Implicit Function Theorem
25:45 - More Technicalities
28:45 - Experiments
ERRATA:
- Dataset Distillation is done with respect to the training set, not the validation or test set.
Paper: https://arxiv.