18.S096 Matrix Calculus for Machine Learning and Beyond, January IAP 2022
Author(s)
Edelman, Alan; Johnson, Steven G.
This package contains the same content as the online version of the course. (21.48 MB)
Abstract
We all know that calculus courses such as 18.01 Single Variable Calculus and 18.02 Multivariable Calculus cover univariate and vector calculus, respectively. Modern applications such as machine learning require the next big step, matrix calculus.
This class covers a coherent approach to matrix calculus, showing techniques that allow you to think of a matrix holistically (not just as an array of scalars), compute derivatives of important matrix factorizations, and really understand forward and reverse modes of differentiation. We will discuss adjoint methods, custom Jacobian matrix-vector products, and how modern automatic differentiation is more computer science than mathematics, in that it is neither symbolic nor based on finite differences.
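For context, the forward/reverse-mode distinction mentioned above can be illustrated in a few lines of JAX; the snippet below is a minimal sketch and is not drawn from the course materials, and the function f is an illustrative assumption.

```python
# Minimal sketch (assumption, not from the course): forward- vs. reverse-mode
# differentiation of a matrix -> scalar function using JAX.
import jax
import jax.numpy as jnp

def f(A):
    # Illustrative function: f(A) = sum of squared entries = tr(A^T A)
    return jnp.sum(A * A)

A = jnp.arange(6.0).reshape(2, 3)
V = jnp.ones_like(A)  # direction for the Jacobian-vector product

# Forward mode: Jacobian-vector product df(A)[V]
primal, jvp_out = jax.jvp(f, (A,), (V,))

# Reverse mode: vector-Jacobian product; for a scalar output this yields the gradient
primal2, vjp_fun = jax.vjp(f, A)
(grad_A,) = vjp_fun(jnp.ones_like(primal2))

print(jvp_out)  # equals 2 * sum(A * V)
print(grad_A)   # equals 2 * A
```

Neither mode is symbolic or based on finite differences: each propagates exact derivative values through the program, forward mode alongside the computation and reverse mode by replaying it backwards.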
Date issued
2022
Other identifiers
18.S096-IAP-JanuaryIAP2022
18.S097
Keywords
matrix calculus, modes of differentiation, applied mathematics, calculus, linear algebra, adjoint methods, Jacobian matrix vector products, modern automatic differentiation