Lecture 5 Part 1: Derivative of Matrix Determinant and Inverse

MIT OpenCourseWare
Published on 01 Dec 2023

MIT 18.S096 Matrix Calculus For Machine Learning And Beyond, IAP 2023
Instructors: Alan Edelman, Steven G. Johnson
View the complete course: https://ocw.mit.edu/courses/18-s096-matrix-calculus-for-machine-learning-and-beyond-january-iap-2023/
YouTube Playlist: https://www.youtube.com/playlist?list=PLUl4u3cNGP62EaLLH92E_VCN4izBKK6OE

Description: The first ~6 minutes cover Norms and Derivatives: why norms on the input and output are needed to define a derivative. With that in place, we can find the “matrix gradient” of the determinant function (leading to the “adjugate” matrix) and the “Jacobian” of a matrix inverse.

License: Creative Commons BY-NC-SA
More information at https://ocw.mit.edu/terms
More courses at https://ocw.mit.edu
Support OCW at http://ow.ly/a1If50zVRlQ

We encourage constructive comments and discussion on OCW’s YouTube and other social media channels. Personal attacks, hate speech, trolling, and inappropriate comments are not allowed and may be removed. More details at https://ocw.mit.edu/comments.
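The two identities the description refers to are d(det A) = det(A) tr(A⁻¹ dA) (the adjugate form of the determinant gradient) and d(A⁻¹) = −A⁻¹ dA A⁻¹ (the Jacobian of the inverse). A minimal NumPy sketch can check both against a finite-difference approximation; the test matrix, perturbation direction, and step size below are arbitrary choices, not from the lecture:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))   # arbitrary (generically invertible) test matrix
dA = rng.standard_normal((4, 4))  # arbitrary perturbation direction
h = 1e-6                          # finite-difference step size

# Gradient of the determinant: d(det A) = det(A) * tr(A^{-1} dA)
fd_det = (np.linalg.det(A + h * dA) - np.linalg.det(A)) / h
analytic_det = np.linalg.det(A) * np.trace(np.linalg.inv(A) @ dA)

# Jacobian of the inverse: d(A^{-1}) = -A^{-1} dA A^{-1}
fd_inv = (np.linalg.inv(A + h * dA) - np.linalg.inv(A)) / h
analytic_inv = -np.linalg.inv(A) @ dA @ np.linalg.inv(A)

print("det identity error:", abs(fd_det - analytic_det))
print("inverse identity error:", np.max(np.abs(fd_inv - analytic_inv)))
```

Both printed errors should be on the order of the finite-difference truncation error (roughly h times a curvature term), confirming the closed-form derivatives.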
