The Unexpected Order in Deep Regression

Deep Neural Regression Collapse (Deep NRC) consistently emerges after both training and test losses have stabilized, across diverse reinforcement learning and computer vision tasks, including SGEMM, Swimmer, Reacher, Hopper, Carla2D, and UTKFace. Its onset is demonstrably correlated with reduced generalization gaps, suggesting a link between this collapse phenomenon and improved model generalization.
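
The source does not include code, but one common way to quantify this kind of collapse is to measure how much of the penultimate-layer feature variance is concentrated in a low-dimensional subspace. The sketch below is illustrative only: the `collapse_ratio` helper and the toy data are assumptions, not the paper's method, and they assume collapse is summarized as the fraction of variance captured by the top few principal directions.

```python
import numpy as np

def collapse_ratio(features: np.ndarray, target_dim: int) -> float:
    """Fraction of feature variance captured by the top `target_dim`
    principal directions. Values near 1.0 suggest the features have
    collapsed onto a subspace of that dimension (an NRC-style signal)."""
    centered = features - features.mean(axis=0)  # center each feature dimension
    # Singular values of the centered feature matrix measure variance
    # along each principal direction.
    singular_values = np.linalg.svd(centered, compute_uv=False)
    variance = singular_values ** 2
    return float(variance[:target_dim].sum() / variance.sum())

# Toy usage: 1000 samples of 64-d penultimate-layer features that mostly
# live in a 2-d subspace, mimicking a regression head with 2 targets.
rng = np.random.default_rng(0)
basis = rng.standard_normal((2, 64))
feats = rng.standard_normal((1000, 2)) @ basis
feats += 0.01 * rng.standard_normal((1000, 64))  # small off-subspace noise
print(f"collapse ratio: {collapse_ratio(feats, target_dim=2):.3f}")  # ~= 1.0
```

Tracking a metric like this across training epochs, alongside the train and test losses, is one way to observe the reported pattern of collapse emerging only after the losses plateau.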

New research thus reveals a surprising phenomenon: when properly trained, deep regression networks develop a predictable internal structure that enhances their ability to generalize.