3Blue1Brown How might LLMs store facts | Chapter 7, Deep Learning
🎯 Uploaded automatically via bot:
🚫 Original video:
📺 This video is the property of the 3Blue1Brown channel. It is presented in our community solely for informational, scientific, educational, or cultural purposes. Our community claims no rights to this video. Please support the author by visiting his original channel: @3blue1brown.
✉️ If you have copyright claims regarding this video, please contact us by email at support@, and we will remove it immediately.
📃 Original description:
Unpacking the multilayer perceptrons in a transformer, and how they may store facts
Instead of sponsored ad reads, these lessons are funded directly by viewers:
An equally valuable form of support is to share the videos.
AI Alignment forum post from the Deepmind researchers referenced at the video’s start:
Anthropic posts about superposition referenced near the end:
Sections:
0:00 - Where facts in LLMs live
2:15 - Quick refresher on transformers
4:39 - Assumptions for our toy example
6:07 - Inside a multilayer perceptron
15:38 - Counting parameters
17:04 - Superposition
21:37 - Up next
------------------
These animations are largely made using a custom Python library, manim. See the FAQ comments here:
#manim
All code for specific videos is visible here:
The music is by Vincent Rubinetti.
------------------
3blue1brown is a channel about animating math, in all senses of the word animate. If you’re reading the bottom of a video description, I’m guessing you’re more interested than the average viewer in lessons here. It would mean a lot to me if you chose to stay up to date on new ones, either by subscribing here on YouTube or otherwise following on whichever platform below you check most regularly.
Mailing list:
Twitter:
Instagram:
Reddit:
Facebook:
Patreon:
Website: