Not enough data
Conventional wisdom: "Not enough data? Use classic learners (Random Forests, RBF SVMs, ...), not deep nets." New paper: infinitely wide nets beat these classic learners and also beat finite-width nets. The infinite nets even train faster than the finite ones here, since training an infinitely wide net reduces to kernel regression with the Neural Tangent Kernel (NTK).
[1910.01663] Harnessing the Power of Infinitely Wide Deep Nets on Small-data Tasks
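A minimal sketch of the idea (not the paper's exact setup): with the neural-tangents library (JAX), the NTK of an infinitely wide net comes in closed form, and "training" the net amounts to solving one kernel-regression linear system. The toy data, architecture, and regularization value below are illustrative assumptions, not the paper's benchmarks.

```python
import jax.numpy as jnp
from jax import random
import neural_tangents as nt
from neural_tangents import stax

# Infinitely wide 2-hidden-layer ReLU net; kernel_fn gives its NTK exactly.
# (The layer widths are placeholders; they don't affect the infinite-width kernel.)
_, _, kernel_fn = stax.serial(
    stax.Dense(512), stax.Relu(),
    stax.Dense(512), stax.Relu(),
    stax.Dense(1),
)

# Toy small-data task: 100 training points, 20 features (illustrative only).
key1, key2 = random.split(random.PRNGKey(0))
x_train = random.normal(key1, (100, 20))
y_train = jnp.sin(x_train.sum(axis=1, keepdims=True))
x_test = random.normal(key2, (20, 20))

# "Training" the infinite net = exact kernel regression with the NTK,
# which is why it can be faster than gradient-training a finite net on small data.
predict_fn = nt.predict.gradient_descent_mse_ensemble(
    kernel_fn, x_train, y_train, diag_reg=1e-4)
y_test_ntk = predict_fn(x_test=x_test, get='ntk')  # predictions at convergence
```

On small datasets the kernel matrix is tiny, so the linear solve is cheap; there is no architecture-width tuning and no stochastic optimization, only the kernel computation and one solve.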