15-386/686 Neural Computation

Carnegie Mellon University

Spring 2021

Course Description

Neural Computation is an area of interdisciplinary study that seeks to understand how the brain learns and computes to achieve intelligence. It seeks to understand the computational principles and mechanisms underlying intelligent behaviors and mental abilities -- such as perception, language, motor control, decision making, and learning -- by building artificial systems and computational models with the same capabilities. This course explores computational principles at multiple levels, from individual neurons to circuits and systems, with a view to bridging brain science and machine learning. It will cover basic models of neurons and circuits, and computational models of learning, memory, and inference in real and artificial systems. Concrete examples will be drawn mostly from the visual system, with emphasis on relating current deep learning research to brain research, spanning hierarchical computation, attention, recurrent neural networks, and reinforcement learning. Students will learn to perform quantitative analysis as well as computational experiments using Matlab and deep learning platforms. No prior background in biology or machine learning is assumed. Prerequisites: 15-100, 21-120, or permission of instructor. 21-241 preferred but not required.

Course Information

Instructor | Office hours | Email
Tai Sing Lee (Professor) | Wednesday 8:00-9:00 p.m.; Friday 9:30-10:30 a.m. | tai@cnbc.cmu.edu
Summer Huang (TA) | Monday 8:00-9:30 p.m.; Saturday 8:00-9:30 p.m. | hgesummer@gmail.com
Tianqin Li (TA) | Monday 5:00-6:30; Tuesday 8:00-9:30 p.m. | tianqinl@andrew.cmu.edu
Hal Rockwell (TA) | Tuesday 4:30-5:30 | hrockwel@andrew.cmu.edu

Recommended Supplementary Textbook

Classroom Etiquette

386 Grading Scheme

Evaluation | % of Grade
6 Assignments | 70
Midterm | 10
Final Exam | 20
Optional term project (replaces 1 HW) | 10
  • Grading scheme: A > 88%, B > 75%, C > 65%.
686 Grading Scheme

    Evaluation | % of Grade
    Assignments | 70
    Midterm | 10
    Final Exam | 20
    Option 1: Weekly Journal Club (Friday reading/presentations) | Required, no credit
    Option 2: Term project | Replaces Journal Club
    Option 3: Term project and Journal Club | Term project replaces two problem sets
  • Total score: 100. 686 students can choose one of the above three options for the 25 points.
  • Grading scheme: A > 88%, B > 75%, C > 65%.
  • Note 1: If you choose Option 1 or 2, you must achieve 88% of the 386 requirement to earn an A. If you choose Option 3, you can use the project grade (24 points) to replace two problem sets.
  • Note 2: If you choose Option 2, the project can be done in a team of two students. If you choose Option 3, you must do the project individually, as it will replace two problem sets.
Assignments

Term Project

Examinations

Late Policy

Syllabus
    Date Lecture Topic Relevant Readings Assignments
      Part 1: Neurons and Synapses    
    M 2/1 1. Introduction and Overview NIH Brain Facts (chapter 1)  
    W 2/3 2. Neurons and Membranes Trappenberg Ch 1.1-2.2 HW 1 out
    F 2/5 Recitation: Matlab tutorial    
    M 2/8 3. Spikes and Cables Trappenberg Ch 2 (C) Matlab tutorial Wean 5201 4:30-5:30
    W 2/10 4. Synapse and Dendrites Trappenberg Ch 3.1, 3.3  
    F 2/12 Recitation: Review for Problem Set 1    
    M 2/15 5. Synaptic plasticity Trappenberg Ch 4 Abbott and Nelson (2000)  
    W 2/17 6. Hebbian Learning Trappenberg Ch 4, HPK Ch 8 Oja (1982) HW 2 out, HW 1 in
    F 2/19 Recitation: Review Part 1    
      Part 2: Representation and Computation    
    M 2/22 7. Computation and Logical Units Trappenberg 3.1,3.5 F. Rosenblatt - Perceptron. McCulloch and Pitts (1943)  
    W 2/24 8. Biological Visual Systems Trappenberg 5.1, Van Essen et al. (1992), Felleman and Van Essen (1991)  
    F 2/26 Recitation: Review Problem Set 2    
    M 3/1 9. Sparse Coding Foldiak (1990) Olshausen and Field (1997) (2004)  
    W 3/3 10. Deep Belief Net Trappenberg Ch 10.3, Hinton and Salakhutdinov (2006) HW 3 out, HW 2 in
    F 3/5 Recitation: Review for Problem set 3    
    M 3/8 11. Computational Maps HPK Ch 9, Trappenberg 7.1-7.2, Kohonen (1982)  
    W 3/10 12. Markov Network Marr and Poggio (1976) Samonds et al. (2013) Wang et al. (2018)  
    F 3/12 Recitation: Review Part 2    
    M 3/15 Midterm    
      Part 3: Neural Networks    
    W 3/17 13. Attractor network and Memory Trappenberg Ch 8. Ch 9.4 Hopfield and Tank (1986)  
    F 3/19 No Recitation, Mid-semester break    
    M 3/22 14. Causal inference (Model-selection)   Mid-semester grade due
    W 3/24 15. Neural network (MLP) Trappenberg Ch 6, 10.1. Fukushima (1988), Krizhevsky et al. (2012) HW3 in. HW4 out.
    F 3/26 Recitation: Review Problem set 4    
    M 3/29 16. Convolutional Neural Networks Zeiler and Fergus (2013) LeCun, Bengio and Hinton (2015)  
    W 3/31 17. Deep Network and the Brain Yamins and DiCarlo (2016) Maheswaranathan et al. (2018) Lillicrap et al. (2016)  
    F 4/2 Recitation: Review Part 3    
    M 4/5 Easter Monday. No Class  
      Part 4: Prediction and Feedback    
    W 4/7 18. Biologically Plausible Learning Akrout et al. (2019), Guerguiev et al. (2017) HW4 due, HW5 out
    F 4/9 Recitation: Review Problem set 5    
    M 4/12 19. Hierarchical Inference Mumford (1992) Rao and Ballard (1998) Lee and Mumford (2003)  
    W 4/14 20. Attention and Self-Attention Trappenberg Ch 10. Vaswani et al. (2017) Lindsay (2020) Knudsen (2007)  
    F 4/16 Carnival No recitation    
    M 4/19 21. Prediction and Surprise Trappenberg Ch 10, Lotter et al. (2016), Colah (2015), Rao (2015)  
    W 4/21 22. Probabilistic Inference Weiss et al. (2002). Ma et al. (2006) Kersten and Yuille (2003) HW 5 in, HW 6 out.
    F 4/23 Recitation: Review Problem set 6 and Journal Club    
    M 4/26 23. Inference Mechanisms Orban et al. (2016). Shivkumar et al. (2019)  
    W 4/28 24. Reinforcement Learning Trappenberg. Ch 9. Niv (2009), Montague et al. (1996)  
    F 4/30 Journal Club    
    M 5/3 25. Curiosity and Imagination Gruber et al. (2014) Kidd and Hayden (2015).  
    W 5/5 26. Emotion and Consciousness Koch (2018) van Hateren (2019) HW 6 in
    F 5/7 Recitation: Review of the Course    
    W 5/12 Term paper and everything due 2:00 p.m.    
    M 5/17 Final Exam Start    
    R 5/20 Final Grade due 4 p.m. for Graduates    
    T 5/25 Final Grade due 4 p.m.    

Journal Club 2021

  • Time to be determined by agreement of the 686 students. Minimum attendance: 10 sessions (10 points); 15 points for 3 presentations. Each week's papers will be read by two students: one serves as presenter, the other as discussant.

    Week 1: Neurons as threshold logic devices

  • Joseph Glanzberg will present the classic McCulloch and Pitts paper. Harideep Nair will present a recent Science paper showing that dendrites can perform the XOR operation (a minimal threshold-unit sketch of XOR follows below).
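    To make the threshold-logic idea concrete, here is a minimal Matlab sketch of McCulloch-Pitts style threshold units (the weights and variable names are illustrative assumptions, not taken from the papers). A single threshold unit cannot compute XOR, but two "dendritic" subunits whose outputs are summed and re-thresholded can:

      % Threshold unit: output 1 if the weighted input sum reaches theta
      mp = @(x, w, theta) double(w * x >= theta);

      X = [0 0 1 1; 0 1 0 1];            % the four binary input pairs (columns)

      % Two subunits acting like dendritic branches
      h1 = mp(X, [ 1 -1], 1);            % fires only for input (1,0)
      h2 = mp(X, [-1  1], 1);            % fires only for input (0,1)

      % "Somatic" unit: OR of the two subunit outputs
      y = mp([h1; h2], [1 1], 1);

      disp([X; y])                       % last row is 0 1 1 0, i.e. XOR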

    Week 2: Dendritic computation modeled by neural networks

  • Sophia Shan will present Polsky's paper, and Joseph Glanzberg will present the neural-network paper.

    Week 3: Spike-timing dependent plasticity

  • Juhi Farooqui will present "Connectivity reflects coding" and Harideep Nair will present "Voltage and spike timing interact in STDP – a unified model," both by Claudia Clopath (a minimal pair-based STDP sketch follows below).
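    For orientation, the sketch below plots the classical pair-based STDP window (a simpler model than Clopath's voltage-based rule; the amplitudes and time constants are illustrative assumptions): pre-before-post spike pairs potentiate the synapse, post-before-pre pairs depress it.

      % Pair-based STDP window (illustrative parameters, not the Clopath model)
      A_plus  = 0.01;   tau_plus  = 20;       % LTP amplitude, time constant (ms)
      A_minus = 0.012;  tau_minus = 20;       % LTD amplitude, time constant (ms)

      dt = -100:100;                          % spike-time difference t_post - t_pre (ms)
      dw = zeros(size(dt));
      dw(dt >= 0) =  A_plus  * exp(-dt(dt >= 0) / tau_plus);   % potentiation branch
      dw(dt <  0) = -A_minus * exp( dt(dt <  0) / tau_minus);  % depression branch

      plot(dt, dw); xlabel('t_{post} - t_{pre} (ms)'); ylabel('\Delta w');
      title('Pair-based STDP window');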

    Week 4: Dendritic Computation

  • Ruoyi Chen and Chloe Chen will present and discuss Larkum's TINS (2013) paper. Ailin Jin will present the group's 2020 paper on how active dendritic currents gate perception.

    Week 5: Neural plausible Online PCA

  • Nicholas Blauch will present Pehlevan and Chklovskii (2019), and Jiayi Shou will present the Obeid et al. NeurIPS (2019) paper. These are Hebbian-based online learning methods, a nice bridge between biology and machine learning (a minimal sketch of the classical Oja rule follows below).
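    As background, here is a minimal Matlab sketch of Oja's (1982) rule, the classical Hebbian online-PCA update that these papers build on (the synthetic data and learning rate are illustrative assumptions): the weight vector converges to the first principal component of the input stream.

      % Oja's rule: online Hebbian learning of the first principal component
      rng(0);
      C = [3 1; 1 1];                     % covariance of a synthetic 2-D input
      X = randn(5000, 2) * chol(C);       % zero-mean samples, one per row

      eta = 0.005;                        % learning rate
      w = randn(2, 1); w = w / norm(w);   % random initial weight vector

      for t = 1:size(X, 1)
          x = X(t, :)';                   % current input
          y = w' * x;                     % linear neuron output
          w = w + eta * y * (x - y * w);  % Hebbian term with built-in decay
      end

      [V, D] = eig(C);                    % true eigenvectors of the covariance
      [~, i] = max(diag(D));
      disp([w, V(:, i)])                  % the two columns match up to sign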

    Week 6: Normalization circuit and Memory Trace Model

  • Nicholas Blauch will present Heeger's paper and Joseph Glanzberg will present

    Week 7: Context-sensitive Associative Memory and Gated Boltzmann machine

  • Rohan Patel will present, assisted/mentored by Hari.

    Week 8: Neural Field, Implicit functions and Grid Cells

  • Yuhao Liu will present Matthew Tancik's and Vincent Sitzmann's Fourier-feature and implicit-function papers.

    Week 9: Representation of part-whole hierarchies

  • Sophia will present Hinton's paper on representing part-whole hierarchies.

    Week X: Grid Cells

    Week 10: Computational Theory of Consciousness

  • Cathy Chen will present.

    Week 10. Reinforcement Learning

    Biologically Plausible Deep Learning Algorithms

Supplementary Reading List

    Logical computation in Neurons

    Biological Neural Circuits

    Neural Network models of Neural Circuits

    Sparse Coding on computation and memory

    Reinforcement Learning

    Biologically Plausible Deep Learning Algorithms

    Causal inference

    Inverse Rational Control

    Spiking Bayesian Circuit

    Curiosity and Imagination

    Reinforcement Learning and Songbirds

    Emotion

    Consciousness

    Glia and their functions


    Questions or comments: contact Tai Sing Lee
    Last modified: spring 2021, Tai Sing Lee