know basics of calculus and statistics
be familiar with linear algebra and know how to take vector/matrix derivatives (see the sample identities below)
have algorithmic design and programming skills
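As a quick self-check for the vector/matrix derivative prerequisite, the following standard identities (of the kind tabulated in, e.g., the Matrix Cookbook listed under the reading materials) should look familiar; here x and a are column vectors and A is a matrix:

\[
\frac{\partial}{\partial x}\left(a^\top x\right) = a,
\qquad
\frac{\partial}{\partial x}\left(x^\top A x\right) = \left(A + A^\top\right) x .
\]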
Collaboration and cheating: We encourage collaboration; however, we do not tolerate any kind of cheating.
Important: To make clear what you should NOT do, please read the policy on academic misconduct. If you haven’t done so already, sign the acknowledgment form within two weeks of the beginning of the semester.
Honor code
Discussions and study groups about class materials are encouraged, and you may discuss the assignments as well. However, all solutions, proofs, code, and reports must be written entirely by yourself. Do NOT copy from other students or the Web.
The first instance of cheating will result in a failing grade for that submission; the second instance will result in a failing grade for the whole class.
For projects, discussions/collaborations are encouraged within the project group.
No plagiarism or cheating.
Students with disabilities: Please let me know at your earliest convenience. The University of Utah endeavors to offer equal access to its programs, services, and activities for everyone. If you need accommodations under the Americans with Disabilities Act (ADA), please notify the Center for Disability Services as well.
Also see the College of Engineering guidelines for information about appeals, the Americans with Disabilities Act, repeating courses and add and withdraw deadlines.
Please post your questions or initiate discussions on the class discussion board on Canvas. The instructor and TAs will monitor the forum and answer questions, and students who know the answers are welcome to respond as well. For confidential questions, such as grading or project-related questions, please come to office hours.
The major reference textbook for this course is Pattern Recognition and Machine Learning by Christopher Bishop, Springer, 2007. While the lecture slides will cover all the content, students are encouraged to read through the corresponding chapters. A few topics may not be covered by the reference book; for these we will provide extra reading materials. In addition, we list several books below that further extend the depth and breadth of the topics discussed in class.
Kevin P. Murphy. Machine Learning: A Probabilistic Perspective. MIT Press, 2012.
David J. C. MacKay. Information Theory, Inference, and Learning Algorithms. Cambridge University Press, 2003.
Sidney I. Resnick. A Probability Path. Springer Science & Business Media, 2013.
Larry Wasserman. All of Statistics: A Concise Course in Statistical Inference. Springer Science & Business Media, 2013.
Daphne Koller and Nir Friedman. Probabilistic Graphical Models: Principles and Techniques. MIT Press, 2009.
Old and New Matrix Algebra Useful for Statistics, by Thomas P. Minka.
The Matrix Cookbook, by Kaare Brandt Petersen and Michael Syskind Pedersen.
Linear Algebra Review and Reference, from Stanford.
The Linear Algebra chapter in the textbook on deep learning by Goodfellow, Bengio, and Courville.
Review of some elements of linear algebra, by Fernando Paganini
David Blei’s review of probability
Probability for data miners, slides by Andrew Moore
Review of basic concepts in probability by Padhraic Smyth.
A review of probability theory from Stanford
The Probability and Information Theory chapter in the textbook on deep learning by Goodfellow, Bengio, and Courville.