Seminar | September 26 | 12-1 p.m. | 560 Evans Hall
Russ Webb, Apple
A primary goal of this work is to give a clear example of the limits of current deep-learning techniques and to suggest how progress can be made. The presentation will include a discussion of open questions, unpublished experiments, and suggestions on how to make progress. This work builds on the paper Knowledge Matters: Importance of Prior Information for Optimization by Gulcehre et al., which sought to establish the limits of current black-box deep-learning techniques by posing problems that are difficult to learn without engineering knowledge into the model or training procedure. In our work, we completely solve the previous Knowledge Matters problem using a generic model; pose a more difficult, scalable problem, All-Pairs; and advance this new problem by introducing a learned, spatially varying histogram model called TypeNet, which outperforms conventional models on the problem. On All-Pairs, our model achieves 100% test accuracy, while the best ResNet models reach 79% accuracy; in addition, our model is more than an order of magnitude smaller than ResNet-34. The challenge of solving larger-scale All-Pairs problems with high accuracy is presented to the community for investigation.
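To make the flavor of the task concrete, here is a minimal, hypothetical sketch of an All-Pairs-style binary classification problem. The details (symbol counts, the exact pairing rule, the function and parameter names) are assumptions for illustration, not the dataset construction from the talk: we treat an example as a bag of symbol types and label it positive exactly when every type present can be fully matched into pairs (i.e., occurs an even number of times).

```python
import random
from collections import Counter

def is_all_pairs(symbols):
    """Assumed pairing rule: positive iff every symbol type occurs an
    even number of times, so all symbols can be matched into pairs."""
    return all(count % 2 == 0 for count in Counter(symbols).values())

def make_example(num_types, num_pairs, positive, rng):
    """Generate one toy example: a shuffled list of symbol-type ids
    plus its label. Hypothetical generator, not the talk's dataset."""
    # Start from a guaranteed-positive example: duplicate each sampled
    # type, so every type count is even.
    symbols = []
    for _ in range(num_pairs):
        t = rng.randrange(num_types)
        symbols += [t, t]
    if not positive:
        # Break the pairing by reassigning random symbols until some
        # type occurs an odd number of times.
        while is_all_pairs(symbols):
            i = rng.randrange(len(symbols))
            symbols[i] = rng.randrange(num_types)
    rng.shuffle(symbols)
    return symbols, positive

# Usage: generate one positive and one negative example.
rng = random.Random(0)
pos_symbols, pos_label = make_example(num_types=5, num_pairs=4, positive=True, rng=rng)
neg_symbols, neg_label = make_example(num_types=5, num_pairs=4, positive=False, rng=rng)
```

In the actual benchmark the symbols are rendered into images, so a model must both recognize symbol types and reason globally about whether they all pair up, which is what makes the task hard for generic architectures.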