Notes by Saeran Vasanthakumar --- 21/05/18 - Killing Creativity in DL
"I so deeply love the way you think... sort of going to these big datasets is encouraging people to not think creatively. So you're -it sort of constrains you to train on these large resources and you think more resources will be a bit better and you start... somehow you kill the creativity." - Lex Fridman paraphrasing Jeremy Howard.
I was listening to Lex's interview[1] with Jeremy Howard while driving home tonight, and thought the portion above, where they discuss the creativity bottleneck imposed by working at scale in deep learning, was profound[2]. The context here is the story of FastAI's winning entries for DAWNBench[3], which Howard attributes in part to their ability to rapidly prototype without relying on heavy computational resources (i.e., multiple GPUs, large datasets).

Footnotes
1. Interview with Jeremy Howard
2. This is an example of my favorite type of expert discussion: the kind that confirms a prior bias.
3. Training ImageNet in 3 hours for USD 25, and CIFAR10 for USD 0.26.

---
email: saeranv @ gmail dot com
git: github.com/saeranv