There are more than 70 companies developing new acceleration hardware for the growing field of machine learning. I work in the field of computational neurobiology, in which we image fragments of neural tissue at nanometer resolution in order to fully reconstruct their connectivity maps: their connectomes. This talk will explain how unraveling brain structure can lead to a new understanding of what our future machine learning hardware should look like.
Nir Shavit is a professor in the Department of Electrical Engineering and Computer Science at MIT and a professor of Computer Science at Tel-Aviv University. Shavit is a co-author of the book The Art of Multiprocessor Programming, and is a recipient of the 2004 Gödel Prize in theoretical computer science and the 2012 Dijkstra Prize in Distributed Computing. His recent interests include systems issues in ML and techniques for understanding how neural tissue computes by extracting its connectivity maps, a field called connectomics. He is the CEO of Neural Magic, a startup delivering GPU-class performance for ML workloads on commodity multicore CPUs.