Textbook and reference materials
Please do not purchase any of these books ahead of the first class.
[PRML] Pattern Recognition and Machine Learning
Christopher Bishop, Springer, 2006.
[DL] Deep Learning
Ian Goodfellow, Yoshua Bengio and Aaron Courville, MIT Press, 2016.
[TN] Theoretical Neuroscience
Peter Dayan and Larry Abbott, MIT Press, 2001.
[PNS] Principles of Neural Science
Eric R. Kandel, John D. Koester, Sarah H. Mack and Steven A. Siegelbaum, McGraw Hill, 2021.
We will also read foundational research papers from the neuroscience and deep learning literature.
You may also find these reference materials useful throughout the quarter (modified from CSE 446/546).
- Machine Learning (and related topics)
- Linear Algebra and Matrix Analysis
- Probability and Statistics
  - CSE 312 Course Materials
  - Probability Review by Arian Maleki and Tom Do (from Andrew Ng's machine learning class).
  - All of Statistics, Larry Wasserman. Chapters 1-5 are a great probability refresher, and the book is a good reference for statistics.
  - A First Course in Probability, Sheldon Ross. Covers elementary concepts (previous editions are only a few dollars on Amazon).
- Python
- LaTeX