Sheesh. After many videos on backpropagation, this is by far the best explanation. I just wrote everything down on a whiteboard and everything made sense. Thank you, Andrej. You are my GOAT
@georgiandanciu3567 22 days ago
GPT watching these lectures eating popcorn
@EurekaAILabs a month ago
Can someone explain what's happening at the 39-minute mark??
@AymanFakri-ou8ro a month ago
best explanation, thank you!
@zhenyusong7024 2 months ago
kzbin.info/www/bejne/imi8nIONpdx5epYsi=w5Zc3NyijbNQhc6L&t=3235 Attention, at the time, didn't deserve even a word.
@john_olu 2 months ago
Damn, I would have stocked up on Nvidia stock if I had watched this in 2016. 51:55
@shahainmanujith2109 2 months ago
MINDBLOWING SIMPLICITY!!! Well explained!
@mikeba3809 2 months ago
Andrej is so good
@ashrafibrahim8152 3 months ago
This lecture feels like the lecturer is only reading the slides aloud. The explanations are unclear, and he hastily skipped through many parts without providing any clear explanation.
@LiveLifeWithLove 3 months ago
I can confirm from the future that Theano is dead, and Caffe is in a coma.
@happier13 4 months ago
done
@channelforwhat 4 months ago
@14:47
@aryanbhushanwar9083 5 months ago
Many thanks!
@Zynqify 5 months ago
If you're confused by the example at 19:50: it's actually very simple, but the slide writes the output-size rule as (N-F) / stride + 1, and the lack of parentheses causes the confusion. It should be read as ((N-F) / stride) + 1, which gives the correct answer. Wrong: (32 + 2 + 2 - 5) / (1 + 1) = 31/2? Correct: ((32 + 2 + 2 - 5) / 1) + 1 = 31 + 1 = 32.
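The parenthesized rule can be sketched as a small helper (illustrative only; `conv_output_size` is a hypothetical name, not from the lecture or slides):

```python
def conv_output_size(n, f, stride=1, pad=0):
    """Spatial output size of a conv layer: ((N + 2*pad - F) / stride) + 1."""
    out = (n + 2 * pad - f) / stride + 1
    # Non-integer result means the filter doesn't tile the padded input evenly.
    assert out.is_integer(), "hyperparameters don't fit the input size"
    return int(out)

# The 19:50 example: 32x32 input, 5x5 filter, stride 1, pad 2
print(conv_output_size(32, 5, stride=1, pad=2))  # → 32
```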
@doomsdaymachiene91 5 months ago
Were the cats okay??
@Kornackifs 5 months ago
Why aren't there any captions on the video?
@realnicorobin8798 5 months ago
Watching this 8 years later because I can't understand what my professor was trying to explain (she doesn't know how to explain).
@Clammer999 6 months ago
Always loved hearing Dr. Fei-Fei Li speak. Even though this was nearly 10 years ago, she is still pretty hilarious.
@enestemel9490 7 months ago
Very good lecture, Andrej. Also, the student who interrupted Andrej by saying "that was what I said" didn't have a good tone, which annoyed me.
@SiD-hq2fo 7 months ago
hey internet, i was here
@susdoge3767 7 months ago
Andrej has the most unique way of explaining things, and his explanations are generally more intuitive. What a maniac!
@anon-yn9rc 7 months ago
Yeah, completely agree - he is just excellent!
@suvarnakadam6557 7 months ago
Just want to say a big thank you: this course helped me prepare the foundation for my PhD back in 2017. I literally watched it multiple times to internalise the concepts.
@ayushchaudhary663 8 months ago
and this is the 100th
@ayushchaudhary663 8 months ago
this is the 99th comment
@phangb580 8 months ago
27:40
@huongdo1758 8 months ago
6:00
@vq8gef32 9 months ago
Amazing Course ! Thank you!
@vq8gef32 9 months ago
Started my journey and watching this course now. I couldn't attend Stanford University, but it's still good to be in the class in 2024 : )
@twentyeightO1 9 months ago
This is helping me quite a lot, thanks!!!
@akzsh 9 months ago
In 2015, the world didn't yet know about the problems with batch normalization.
@sezaiburakkantarci 10 months ago
1:14:27 - The network never fully converges, but at some point you stop caring, because it has been 2 weeks and you are just tired. 😅
@sezaiburakkantarci 11 months ago
You are one of the best Andrej. You make learning so fun, with moments like 27:45 😄 Forever grateful.
@vil9386 11 months ago
This clears up a lot of doubts I had in my head. Thank you, Andrej.
@vil9386 a year ago
Can't thank Andrej, the cs231n team, and Stanford enough. I thoroughly enjoy your lectures. Knowledge is one form of addiction and pleasure, and thank you so much for providing it freely. I hope you all enjoy giving it as much as we enjoy receiving it.
@egemeyvecioglu3165 a year ago
1:09:10 it worked :)
@vijaypalmanit a year ago
Does he speak at 1.5x by default? 😛
@Siwon-vv5mi a year ago
At 39:00, what did he mean by jiggling the scores?
@mannemsaisivadurgaprasad8987 a year ago
One of the best videos on RNNs that explains the code perfectly from scratch.
@jenishah9825 a year ago
This content here, is GOLD.
@pravachanpatra4012 a year ago
28:00
@lifeisbeautifu1 a year ago
I love you Andrej ❤
@reachmouli a year ago
This is a beautiful lecture - it gave a very fundamental understanding of backward propagation and its concepts. I see that backward propagation correlates to demultiplexing, and forward propagation corresponds to multiplexing, where we are multiplexing the input.
@GohOnLeeds a year ago
Haha, at 1:17:00 Justin says "it's cool but not sure why you would want to generate images"... 🙂
@GohOnLeeds a year ago
Seems to be a mistake in the "Computing Convolutions: Recap" slide - it says "FFT: Big speedups for small kernels" when it should be "big kernels"?
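A quick numpy sketch (illustrative, not from the slides) showing why FFT-based convolution matters for big kernels: its cost is roughly O(n log n) regardless of kernel length, while direct convolution scales with the kernel size. It also verifies the two methods agree:

```python
import numpy as np

rng = np.random.default_rng(0)
signal = rng.random(4096)
kernel = rng.random(512)  # a "big" kernel, where the FFT route shines

# Direct full convolution: O(len(signal) * len(kernel))
direct = np.convolve(signal, kernel, mode="full")

# FFT-based convolution: pad both to the full output length,
# multiply in the frequency domain, then invert the transform.
n = len(signal) + len(kernel) - 1
fft_conv = np.fft.irfft(np.fft.rfft(signal, n) * np.fft.rfft(kernel, n), n)

assert np.allclose(direct, fft_conv)
```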
@zeeshankhanyousafzai5229 a year ago
YOLO is king now, hahaha.
@padenzimmermann1892 a year ago
When this video was recorded I could not even factor a quadratic equation. Now I can watch this and follow the math with relative ease. Wow.