[MATH ONLY] Non-Euclidean Therapy for AI Trauma [Analog Archives]

neoknowstic
Published on 10 Sep 2023 / In News & Politics

This is the version of the #SoME3 video that is suitable for all audiences and contains no unsettling themes. Changes include:
- New illustrations that better explain where the row vector is sent in the output space, and that explain basis-vector dimensionality reduction in a different way
- The second half of the video is about finding the correct direction, rather than about matrix approximation
- The AI's voice is clearer and more pleasant-sounding in some parts of the second half

Most of the content is the same; this version is tailored to the math rather than the story. The longer, original story can be found here: https://www.youtube.com/watch?v=FQ9l4v7zB3I

Link to the paper this was based on:
"Unsupervised Discovery of Semantic Latent Directions in Diffusion Models"
https://arxiv.org/abs/2302.12469

-------------------------
PATIENT ALICE: An artificial intelligence suffering from hallucinations of a lost puppet show. These hallucinations need to be erased.
GENERATIVE MODEL TYPE: Diffusion-based.
PRESCRIBED TREATMENT: A latent-space editing method that involves the pullback, the Jacobian matrix, eigenfaces, and SVD.
-------------------------

Timestamps: [spoilers ahead]
00:00 - Beginning
00:54 - Patient Introduction
01:20 - Manifolds and Pushforwards
05:03 - The Three Functions
08:54 - Diffusion Models and the U-Net
10:41 - Matrix Multiplication and the Change of Basis Neurons
16:27 - The Jacobian Matrix
22:40 - The Pullback and the Dot Product
25:40 - Other Algorithms for the Method
26:20 - Finding the Error
29:43 - Row Weight Vector Projection
33:22 - Finding Important Directions in Data
34:55 - Superposition
36:21 - Matrix Correlations
39:03 - Eigenvectors for the SVD
41:12 - The Intuition of the Singular Value Decomposition
44:30 - Eigenface Conclusion

An updated version may be uploaded in the future.
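The core idea behind the prescribed treatment can be sketched in a few lines of toy code (a hypothetical illustration under simplified assumptions, not code from the video or the paper; the `decoder` here is a made-up stand-in for a diffusion U-Net): treat the generator as a smooth map from latent space to output space, take its Jacobian at a latent point (the pushforward), and compute that Jacobian's SVD. The top right-singular vectors are then latent directions along which the output changes most strongly.

```python
import jax
import jax.numpy as jnp

# Hypothetical toy "decoder": a smooth map from a 4-D latent space
# to an 8-D output space (a stand-in for a diffusion model's U-Net).
def decoder(z):
    W = jnp.arange(32.0).reshape(8, 4) / 32.0
    return jnp.tanh(W @ z)

z = jnp.ones(4)  # a latent point on the manifold

# Jacobian of the decoder at z: an 8x4 matrix (the pushforward map).
J = jax.jacobian(decoder)(z)

# SVD of the Jacobian: the rows of Vt are orthonormal latent directions,
# ordered by how strongly they move the output (the singular values).
U, S, Vt = jnp.linalg.svd(J, full_matrices=False)
top_direction = Vt[0]  # latent direction with the largest effect
```

Editing along `top_direction` (rather than along an arbitrary axis) is what makes the latent-space edit semantically meaningful in the pullback framework.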
-------------------------
Helpful Resources:
SVD Visualized, Singular Value Decomposition explained (VisualKernel): https://www.youtube.com/watch?v=vSczTbgc8Rc&t=673s
The Power Method: https://www.gastonsanchez.com/matrix4sl/power-method.html and https://www.youtube.com/watch?v=OzeDqsVoTFc
What is a Jacobian-Vector product (jvp) in JAX?: https://www.youtube.com/watch?v=caoeihy9kLo

Prequels / supplementary videos:
Why do Neural Networks use Linear Algebra? || The Visual Intuition of Cat Mathematics: https://www.youtube.com/watch?v=DHjwbleAgPQ
THE AI AMONG US in your Non-Euclidean Mind 〘 Analog VHS Infomerical 〙: https://www.youtube.com/watch?v=xI7tAjoe4oc&t=174s

More References:
Understanding the Latent Space of Diffusion Models through the Lens of Riemannian Geometry: https://arxiv.org/abs/2307.12868
Discovering Interpretable Directions in the Semantic Latent Space of Diffusion Models: https://arxiv.org/abs/2303.11073
Toy Models of Superposition: https://transformer-circuits.pub/2022/toy_model/index.html
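The power method linked among the resources can be summarized with a short generic sketch (an illustration of the standard algorithm, not code from any of the videos): repeatedly multiplying a vector by a matrix and renormalizing converges to the dominant eigenvector, which is one way to find the top singular direction of a Jacobian J via the eigenvectors of JᵀJ.

```python
import numpy as np

def power_method(A, iters=100):
    """Estimate the dominant eigenvector and eigenvalue of a square matrix A
    by repeated multiplication and renormalization."""
    v = np.ones(A.shape[0])
    for _ in range(iters):
        v = A @ v                   # push v toward the dominant eigenvector
        v /= np.linalg.norm(v)      # renormalize to avoid overflow
    lam = float(v @ A @ v)          # Rayleigh quotient estimates the eigenvalue
    return v, lam

# Example: a symmetric 2x2 matrix with eigenvalues (5 ± sqrt(5)) / 2.
A = np.array([[2.0, 1.0], [1.0, 3.0]])
v, lam = power_method(A)
```

Convergence is fast when the top two eigenvalues are well separated, since the error shrinks by their ratio at each iteration.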
