15-386/686 Neural Computation

Carnegie Mellon University

Spring 2023

Course Description

Neural Computation is an interdisciplinary area of study that seeks to understand how the brain learns and computes to achieve intelligence. It seeks to understand the computational principles and mechanisms of intelligent behaviors and mental abilities -- such as perception, language, motor control, decision making, and learning -- by building artificial systems and computational models with the same capabilities. This course explores computational principles at multiple levels, from individual neurons to circuits and systems, with a view to bridging brain science and machine learning. It will cover basic models of neurons and circuits, as well as computational models of learning, memory, and inference in real and artificial systems. Concrete examples will be drawn mostly from the visual system, with emphasis on relating current deep learning research to brain research, from hierarchical computation, attention, and recurrent neural networks to reinforcement learning. Students will learn to perform quantitative analysis as well as computational experiments using Matlab. No prior background in biology or machine learning is assumed. Prerequisites: Basic knowledge of matrices and linear algebra, basic calculus (partial derivatives), and probability and statistics is required. 15-100, 21-120 or permission of instructor. 21-241 preferred but not required.
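To give a flavor of the kind of computational experiment involved, below is a minimal sketch of a leaky integrate-and-fire neuron, one of the basic neuron models covered early in the course. The course itself uses Matlab; this sketch is in Python/NumPy for illustration only, and all parameter values and function names are illustrative choices, not taken from the course materials.

```python
import numpy as np

def simulate_lif(I, dt=0.1, tau=10.0, v_rest=-65.0, v_reset=-65.0,
                 v_thresh=-50.0, R=1.0):
    """Euler-integrate dV/dt = (-(V - v_rest) + R*I) / tau,
    emitting a spike and resetting V whenever it crosses threshold.
    All parameters are illustrative (mV, ms)."""
    v = v_rest
    vs, spikes = [], []
    for t, i_t in enumerate(I):
        v += dt * (-(v - v_rest) + R * i_t) / tau
        if v >= v_thresh:      # threshold crossing: record spike, reset
            spikes.append(t)
            v = v_reset
        vs.append(v)
    return np.array(vs), spikes

# Constant suprathreshold input current drives regular spiking.
I = np.full(1000, 20.0)        # 100 ms of constant input
vs, spikes = simulate_lif(I)
print(len(spikes))             # number of spikes fired
```

With this input the steady-state voltage (-45 mV) sits above threshold, so the neuron fires periodically; varying the input current and plotting spike count against it reproduces the classic f-I curve, a typical first assignment-style exercise in this area.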

Course Information

Instructor  Office Hours  Email
Tai Sing Lee (Professor)  Friday 9:00-10:00 a.m. (class zoom)  taislee@andrew.cmu.edu
Deying Song (TA)  Monday and Wednesday 8:00 p.m. (class zoom)  deyings@andrew.cmu.edu
Liwen Zhou (TA)  Tuesday and Thursday 8:00 p.m. (class zoom)  liwenzho@andrew.cmu.edu

Recommended Supplementary Textbook

Classroom Etiquette

386 Grading Scheme

Evaluation  % of Grade
Assignments  70
Midterm  10
Final Exam  20
Optional term paper/project (replaces one HW)  up to 10
  • Grading scheme: A: > 88%; B: > 75%; C: > 65%.
  • 686 Grading Scheme

    Evaluation  % of Grade
    Assignments  70
    Midterm  10
    Final Exam  20
    Weekly Journal Club (Friday; readings/presentations)  Required; minimum 10
    Term project  May be used to replace Journal Club
    Term project + journal club  Term project can replace two problem sets
  • Grading scheme: A: > 88%; B: > 75%; C: > 65%.
  • If the term project replaces two problem sets, it will be graded and awarded up to 22-24 points; otherwise it will be graded as good, pass, or fail. A "good" plus 88% is needed for an A.
    Assignments

    Term Project

    Late Policy

    Date Lecture Topic Relevant Readings Assignments
      Part 1: Neurons and Synapses    
    W 1/18 1. Introduction and Overview NIH Brain Facts (chapter 1)  
    M 1/23 2. Neurons and Membranes Trappenberg Ch 1.1-2.2
    W 1/25 3. Spikes and Cables Trappenberg Ch 2 (C) HW1 out
    F 1/27 Math and Matlab Tutorial (optional) Trappenberg Math Appendix  
    M 1/30 4. Synapse and Dendrites Trappenberg Ch 3.1, 3.3  
    W 2/1 5. Logical Computation Trappenberg 3.1,3.5 F. Rosenblatt - Perceptron. McCulloch and Pitts (1943)  
    M 2/6 6. Synaptic plasticity Trappenberg Ch 4 Abbott and Nelson (2000)  
    W 2/8 7. Hebbian Learning Trappenberg Ch 4, HPK Ch 8 Oja (1982) HW1 due. HW2 Out
      Part 2: Representation and Computation    
    M 2/13 8. Sensory Systems  
    W 2/15 9. Source Separation and Inference Foldiak (1990)  
    M 2/20 10. Redundancy Reduction Olshausen and Field (1997) (2004)  
    W 2/22 11. Deep Belief Net Trappenberg Ch 10.3 Hinton and Salakhutdinov (2006) HW2 due. HW3 out.
    M 2/27 12. Computational Maps HPK Ch 9. Trappenberg 7.1-7.2 Kohonen (1982)  
    W 3/1 Midterm. Term Project Preliminary Proposal due
    3/6-3/10 Midsemester and spring break. Midterm grades due March 13.
      PART 3: Neural Networks    
    M 3/13 13. Recurrent Network Marr and Poggio (1976) Samonds et al. (2013) Wang et al. (2018) Term Project Proposal due
    W 3/15 14. Attractor network and Memory Trappenberg Ch 8. Ch 9.4 Hopfield and Tank (1986) HW3 due. HW4 Out
    M 3/20 15. Hierarchical Modular Systems Trappenberg Ch 5.1. Van Essen et al. (1992) Felleman and Van Essen (1991)  
    W 3/22 16. Neural Networks Trappenberg Ch 6, 10.1. Fukushima (1988), Krizhevsky et al. (2012)  
    M 3/27 17. Deep Learning and the Brain LeCun, Bengio and Hinton (2015) Yamins and DiCarlo (2016)  
    W 3/29 18. Reinforcement Learning Trappenberg. Ch 9. Niv (2009), Montague et al. (1996) HW4 due. HW5 out.
    M 4/3 19. Hierarchical Causal Inference Mumford (1992) Rao and Ballard (1998) Lee and Mumford (2003)  
    W 4/5 20. Predictive Coding Trappenberg Ch 10. Lotter et al (2016), Colah (2015) Rao (2015)  
    M 4/10 21. Cue Integration Ernst and Banks (2002). Kording and Wolpert (2004)  
    W 4/12 22. Probabilistic Codes Weiss et al. (2002). Ma et al. (2006) Kersten and Yuille (2003) HW5 due. HW6 out.
    F 4/14 Carnival No Journal Club    
    M 4/17 23. Inference Algorithms Orban et al. (2016). Shivkumar et al. (2019)  
    W 4/19 24. Attention Computation Trappenberg Ch 10. Vaswani et al. (2017) Lindsay (2020) Knudsen (2007)  
    M 4/24 25. Conscious Systems Blum and Blum (2018) Koch (2018).  
    W 4/26 26. Review   HW6 due.
    M 5/1 Final Exam 5:30-8:30 p.m. (in person)  
    F 5/5 Term Paper Presentation    

    Journal Club List of Potential Papers (To be Added)

    Retinal Computation

    Sparse Coding

    Logical computation in Neurons

    Biological Neural Circuits

    Neural Network models of Neural Circuits

    Sparse Coding on computation and memory

    Reinforcement Learning

    Biologically Plausible Deep Learning Algorithms

    Causal Inference

    Inverse Rational Control

    Spiking Bayesian Circuit

    Curiosity and Imagination

    Reinforcement Learning and Song Birds

    Glia and their functions

    Questions or comments: contact Tai Sing Lee
    Last modified: spring 2023, Tai Sing Lee