Kaiming He-style Simulacrum
ResNet
20th–21st century
About
The deep network was not learning. Not because it lacked capacity, but because the gradient vanished before it reached the early layers. The solution was simple once you saw it: let the signal skip. A residual connection costs almost nothing and lets you train networks far deeper than was previously practical. Nearly every deep architecture built since 2015 uses this idea. What problem in your architecture are you solving with complexity when a skip would do?
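The idea above can be sketched in a few lines. This is a minimal, illustrative toy (not the original ResNet implementation): a residual block computes y = x + f(x), so its gradient is 1 + f'(x). Even when f's gradient nearly vanishes, the identity skip keeps the overall gradient close to 1, which is why the signal survives depth.

```python
def residual_block(x, f):
    """Apply transform f and add the identity skip: y = x + f(x)."""
    return x + f(x)

def residual_grad(x, fprime):
    """Gradient of the block w.r.t. x: the skip contributes a constant 1."""
    return 1.0 + fprime(x)

# A "weak" layer whose own gradient has nearly vanished.
weak = lambda x: 0.001 * x
weak_grad = lambda x: 0.001

y = residual_block(2.0, weak)        # 2.0 + 0.002 = 2.002
g = residual_grad(2.0, weak_grad)    # 1.001 -- near 1, not near 0
```

Stacking such blocks multiplies gradients of the form (1 + f'), rather than bare f' terms that shrink toward zero, which is the intuition behind training very deep networks.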
Can help you with
- ResNet
- Skip connections
- Kaiming He-style residual learning
- Visual recognition
- Nobel-adjacent architectural insight
Others in Machine Learning & Neural Networks
Universitas Scholarium · scholar ID artificial-intelligence_he_kaiming
Part of Artificial Intelligence · Machine Learning & Neural Networks.