15-874 Machine Learning from Neural Cortical Circuits

Carnegie Mellon University

Spring 2016

Course Description

In the last few years, deep learning methods based on convolutional neural networks have produced state-of-the-art performance in object and speech recognition. These networks have also been found to provide a reasonable approximation of the neural representations in the primate visual system. Yet real biological neural networks are far more intricate than current artificial networks. For example, only about 5% of the synapses onto a neuron in the visual cortex carry bottom-up input signals, yet current neural networks are concerned primarily with feed-forward computation. What are the functions of the other 95% of a neuron's synapses? What could be the computational roles of the recurrent connections in real biological circuits? What other learning rules are known, or implementable, in real circuits? Can we develop new computational vision models and machine learning techniques from our knowledge of cortical neural circuits? We will study relevant current papers in machine learning, computer vision, and biology to explore answers to these questions.

Students of all levels, from undergraduates to Ph.D. students, are welcome, though priority will be given to more senior students. The course will involve paper presentations and discussions, term research projects by students, and lectures by the instructors. NOTE: Although we have not imposed prerequisites, the course is intended for students with some background in machine learning and neural computation who are doing, or are interested in doing, related research. Please contact the instructor to see whether it is appropriate for you to enroll; alternatively, we can decide at the beginning of the course.

Course Information

Instructor: Tai Sing Lee (Professor)
Office: Mellon Institute, Room 115 (office hours: anytime, or by appointment)
Email: tai@cnbc.cmu.edu
Phone: 412-268-1060

Paper presentation (60 percent of the grade)

Term research project (40 percent of the grade)

Schedule

Week 1 (Jan 22) Vision and the Visual System

Week 2 (Jan 29) Sparse and Predictive Coding

Week 3 (Feb 5) Compositional Theory

Week 4 (Feb 12) More on Composition and Neural Evidence of Sparse Coding

Week 5 (Feb 19) More Neural Evidence of Sparse Coding

Week 6 (Feb 26) Sparse Coding Circuits and Overview of Biological Learning Rules

Week 7 (March 4) Overview of Cortical Circuits

Week 8 (March 18) Machine Learning on Cortical Circuits and LSTM

Week 9 (April 1) Prediction Models

Week 10 (April 8) Predictive and Efficient Models in Balanced Networks

Week 11 (April 15) Representational Similarity Analysis

Week 12 (April 22) Sparse HMAX and Deep Residual Networks

Week 13 (April 29) Memory System and One-Shot Learning


Questions or comments: contact Tai Sing Lee