Simple Message Passing on Graphs

33,827 views

WelcomeAIOverlords

Comments: 34
@RaynerGS 2 years ago
Thanks so much for your class; I am a Ph.D. student, and your channel is helping me. Way to go, and a salute from Brazil!
@RalphDratman 1 year ago
Excellent! Thanks very much. That's outstandingly clear. The resulting system implements diffusion with conservation of the total node quantity.
@arp_ai 4 years ago
I enjoyed the video. Thank you! It would be awesome to mention example applications of message passing.
@welcomeaioverlords 4 years ago
Thanks Jay! This is relevant whenever information is gathered over a graph neighborhood; graph neural network techniques and Label Propagation are examples. I hope this helps. A follow-up video will expand on this to show how this same approach is used by Graph Convolutional Networks.
@arp_ai 4 years ago
@welcomeaioverlords Wonderful! Thanks for the answer! Looking forward to it!
@pavelpopov9712 1 year ago
It's a great video. One thing (if I understood correctly) is that the node feature vector can take arbitrary values rather than the nodes' initial positions; I didn't catch that the first time.
@Christian-ty5vn 3 years ago
Thanks! :) Enjoyed the video. Well explained and great examples.
@736939 2 years ago
7:35 I didn't get this: when we sandwich A as D^(-1/2) A D^(-1/2), why do we know that the D on the right side relates to the destination node?
@welcomeaioverlords 2 years ago
Because if it's on the right, it multiplies the columns of A, and if it's on the left, it multiplies the rows of A. To see this, do it by hand. It also helps to recall that D only has non-zero entries along the diagonal.
@736939 2 years ago
@welcomeaioverlords Thanks a lot. I think I will buy your course: I saw the contents, and you provide a good GNN explanation. I had also thought that GNN propagation was done with a BFS algorithm, but you show the "waterdrop" approach; where did you read about it? I've tried to find good, readable GNN sources, but I found only difficult-to-read scientific articles.
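A minimal NumPy sketch of the row-versus-column point above, using an assumed 5-node graph whose degrees (1, 2, 3, 1, 1) match those discussed elsewhere in the thread (the video's exact graph may differ):

```python
import numpy as np

# Assumed graph with degrees (1, 2, 3, 1, 1); edges (0-indexed):
# 0-1, 1-2, 2-3, 2-4.
A = np.array([
    [0, 1, 0, 0, 0],
    [1, 0, 1, 0, 0],
    [0, 1, 0, 1, 1],
    [0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0],
], dtype=float)

deg = A.sum(axis=1)                    # [1. 2. 3. 1. 1.]
D_inv = np.diag(1.0 / deg)
D_inv_sqrt = np.diag(1.0 / np.sqrt(deg))

# Left multiplication rescales the ROWS of A, right multiplication the
# COLUMNS: (D_inv @ A)[i, j] = A[i, j] / deg[i], while
# (A @ D_inv)[i, j] = A[i, j] / deg[j].
print(D_inv @ A)                       # each row i divided by deg[i]
print(A @ D_inv)                       # each column j divided by deg[j]
print(D_inv_sqrt @ A @ D_inv_sqrt)     # the symmetric "sandwich" from 7:35
```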
@moustafa_shomer 2 years ago
Great video, thank you so much!
@leandrostival4326 3 years ago
Great explanation!
@maryamalizadeh4984 3 years ago
Great explanation. Thanks!
@jevoncharles8680 11 months ago
Brilliant
@rupjitchakraborty8012 3 years ago
Loving your channel. Too few likes for such a good video.
@ThyRiki 4 years ago
Great video! Very clear.
@TheMarcosVerissimo 1 year ago
Wonderful video! A pity it looks like you have stopped making them. If that were part of a Udemy course, I'd definitely take it.
@welcomeaioverlords 1 year ago
I have this free course: www.graphneuralnets.com/p/basics-of-gnns, and this paid course: www.graphneuralnets.com/p/introduction-to-gnns. I hope to start making videos again, but I needed a little (or not so little) break :)
@TheMarcosVerissimo 1 year ago
@welcomeaioverlords Thanks for the reply! I read the video description later, and I have already enrolled. Do you have any plans to make the full course self-paced as well? I have a project that will involve GNNs, but I'm on a postdoc that involves learning a lot of things, which leaves me little room to set aside a few months for practically nothing but the course.
@seyitahmetozturk721 2 years ago
I couldn't understand why you changed the degrees of the nodes (1,2,3,1,1) to (1,2,3,4,5). Maybe I missed something, but I'm curious about the answer.
@keremcomert4239 2 years ago
(1,2,3,4,5) is unrelated to the node degrees; the degrees stay as they are. (1,2,3,4,5) was just an arbitrary assignment of values to the nodes to demonstrate node features. You might as well assign any other values; the idea is that when you perform the matrix multiplication, each node's row gathers feature values only from that node's neighbors.
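A small NumPy sketch of this reply, with an assumed adjacency matrix matching the degrees (1, 2, 3, 1, 1) and the arbitrary features (1, 2, 3, 4, 5):

```python
import numpy as np

# Assumed 5-node graph with degrees (1, 2, 3, 1, 1); edges (0-indexed):
# 0-1, 1-2, 2-3, 2-4.
A = np.array([
    [0, 1, 0, 0, 0],
    [1, 0, 1, 0, 0],
    [0, 1, 0, 1, 1],
    [0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0],
])

x = np.array([1, 2, 3, 4, 5])  # arbitrary node features

# Row i of A @ x sums the features of node i's neighbors only:
# node 0 receives x[1] = 2, node 2 receives x[1] + x[3] + x[4] = 11, etc.
print(A @ x)                    # [ 2  4 11  3  3]
```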
@hayleecs4223 4 years ago
Great video, but that t-shirt tho! Edit: seriously, thanks for the animations as well!
@welcomeaioverlords 4 years ago
\m/
@leo.y.comprendo 3 years ago
Great video!! I am currently working on an algorithm that builds an adjacency matrix from features extracted from a convolutional layer for a set of images (deep features). However, the formula just states that A is softmax(F) @ softmax(F).T, which leaves me with an adjacency matrix of real numbers. Is this valid for message passing algorithms? I imagine it gives some sort of "weighted edges"? Thanks!
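For what it's worth, a hedged sketch of the construction described in this question (a row-wise softmax and random placeholder features are assumed); the real-valued entries simply act as edge weights during aggregation:

```python
import numpy as np

rng = np.random.default_rng(0)
F = rng.normal(size=(5, 8))    # hypothetical deep features for 5 nodes

def row_softmax(M):
    # Numerically stabilized row-wise softmax.
    e = np.exp(M - M.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

S = row_softmax(F)
A = S @ S.T                    # dense, real-valued "adjacency" matrix

x = np.arange(1.0, 6.0)        # arbitrary node features
# Message passing works unchanged: each entry of A weights a neighbor's
# contribution instead of gating it with a 0/1 value.
print(A @ x)
```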
@milandoshi7640 3 years ago
Is it possible to create an adjacency matrix from an image of a molecule, or from just coordinates, without knowing which components are connected? Thanks in advance.
@welcomeaioverlords 3 years ago
You might dig into "Latent Graph Learning".
@xabiergarciaandrade2656 3 years ago
In order to derive an adjacency matrix from a coordinate file, you would need to define pairwise bond lengths as thresholds.
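A minimal sketch of this threshold idea, assuming a single illustrative cutoff (a real molecule would need per-element bond-length thresholds):

```python
import numpy as np

# Hypothetical 3D coordinates for four atoms (angstroms).
coords = np.array([
    [0.0, 0.0, 0.0],
    [1.1, 0.0, 0.0],
    [1.1, 1.2, 0.0],
    [3.0, 3.0, 3.0],
])

# Pairwise Euclidean distances via broadcasting.
diff = coords[:, None, :] - coords[None, :, :]
dist = np.sqrt((diff ** 2).sum(axis=-1))

CUTOFF = 1.6  # illustrative value, not taken from a real bond-length table
A = ((dist < CUTOFF) & (dist > 0)).astype(int)  # exclude self-distances
print(A)      # atoms 0-1 and 1-2 bonded; atom 3 isolated
```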
@arjunashok4956 4 years ago
Why wouldn't message passing work without scaling with respect to the destination, and without the square root? I.e., just by using the initial formulation (D * A) with self-loop edges in A (and hence accounted for in D), why can't I do the message passing?
@welcomeaioverlords 4 years ago
It's not that it wouldn't work; it might work as well or better, depending on the problem. Sometimes no normalization is better (i.e., just a sum of messages). But the paper used this parameterization because it aided the derivation and is numerically stable.
@arjunashok4956 4 years ago
Thanks a ton! You're awesome! :)
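To make the options in this exchange concrete, here is a sketch comparing a plain sum of messages, the mean formulation with self-loops, and the symmetric version (same assumed 5-node graph as in the earlier sketches):

```python
import numpy as np

# Same assumed 5-node graph as above.
A = np.array([
    [0, 1, 0, 0, 0],
    [1, 0, 1, 0, 0],
    [0, 1, 0, 1, 1],
    [0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0],
], dtype=float)
x = np.array([1, 2, 3, 4, 5], dtype=float)

A_hat = A + np.eye(5)          # self-loops, so the degrees account for them
deg = A_hat.sum(axis=1)
D_inv = np.diag(1.0 / deg)
D_inv_sqrt = np.diag(1.0 / np.sqrt(deg))

print(A_hat @ x)                             # plain sum of messages
print(D_inv @ A_hat @ x)                     # mean over each neighborhood
print(D_inv_sqrt @ A_hat @ D_inv_sqrt @ x)   # symmetric (GCN-style)
```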
@vishnusureshperumbavoor 2 years ago
I didn't understand the (1,2,3,1,1) part at 4:57.
@rafael_l0321 1 year ago
As explained at 3:01: node 1 has degree 1, node 2 degree 2, node 3 degree 3, node 4 degree 1, and node 5 degree 1.
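In matrix terms, the degrees are just the row sums of the adjacency matrix (same assumed graph as in the sketches above):

```python
import numpy as np

A = np.array([              # assumed adjacency for the 5-node graph
    [0, 1, 0, 0, 0],
    [1, 0, 1, 0, 0],
    [0, 1, 0, 1, 1],
    [0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0],
])
print(A.sum(axis=1))        # [1 2 3 1 1] -- the degrees quoted above
D = np.diag(A.sum(axis=1))  # the degree matrix used in the normalizations
```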
@robmarks6800 2 years ago
Even though I've almost finished my master's, the generalization of simple scalar arithmetic such as multiplication and division to vectors and matrices still surprises me. For example, going from a/d to the inv(D) * A formula, which seems even harder to comprehend if D weren't diagonal. Can anyone point me to resources that explain (not just define) these fundamental matrix operations?
@welcomeaioverlords 2 years ago
D is always diagonal by construction. I'm not sure this is what you're asking, but I'm not claiming that element-wise division can always be straightforwardly expressed by an inverted matrix. In this particular case, since D is diagonal and we have the definition D * D_inv = I, I think it's straightforward to see (do it by hand for an example) that the inverse of a diagonal matrix simply has 1/element at each diagonal entry.
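A quick NumPy confirmation of that last point, using the degrees quoted in the thread:

```python
import numpy as np

deg = np.array([1.0, 2.0, 3.0, 1.0, 1.0])
D = np.diag(deg)
D_inv = np.diag(1.0 / deg)   # reciprocal of each diagonal entry

print(np.allclose(D @ D_inv, np.eye(5)))     # True: D @ D_inv = I
print(np.allclose(D_inv, np.linalg.inv(D)))  # True: matches the full inverse
```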
Related videos

Graph Convolutional Networks using only NumPy
13:21
WelcomeAIOverlords
40K views
Graph Neural Networks - a perspective from the ground up
14:28
Intro to Graphs and Label Propagation Algorithm in Machine Learning
11:16
WelcomeAIOverlords
41K views
Deep learning with dynamic graph neural networks
15:08
Jacob Heglund
13K views
Understanding Graph Attention Networks
15:00
DeepFindr
91K views
Graph Convolutional Networks (GCNs) made simple
9:25
WelcomeAIOverlords
126K views
Converting a Tabular Dataset to a Graph Dataset for GNNs
15:22
The basics of spatio-temporal graph neural networks
13:09
Jacob Heglund
27K views
Graph Neural Networks: A gentle introduction
29:15
Aladdin Persson
49K views
Graph Attention Networks (GAT) in 5 minutes
5:10
WelcomeAIOverlords
45K views