This is awesome. Right level of detail balanced with simplification. Great work!
@NM-vw6xq 2 years ago
Misra, thank you so much for putting together these videos. They helped me out a lot. Seeing you walk through a simple example made a lot of things click for me vs. just seeing random notation in a textbook that I never fully understood.
@misraturp 2 years ago
Glad it was helpful!
@jameelabduljalil25 2 years ago
Thanks for the time and effort you put into these lessons, and for making them available for free. Really appreciate it. 💐
@misraturp 2 years ago
You're very welcome Jameel!
@HomeDesign_Austin 11 months ago
Great detail for new learners, great job.
@hshrestha2811 2 years ago
You have presented things in a very simple and comprehensive manner. Thanks!
@misraturp 2 years ago
You're very welcome :)
@zainulhassan8557 1 year ago
Thank you so much, ma'am. Really appreciate your efforts. You explain everything in an easy way.
@behradio 2 years ago
Simple and understandable, thank you 🙏
@misraturp 2 years ago
You're welcome :)
@alexanderkamke3774 1 year ago
Thanks a lot! It's very easy to understand...
@viktorkovacs9680 2 years ago
Thanks, I understand better now. But are the slides correct at 9:11 and 12:13?
@bay-bicerdover 1 year ago
Yep
@prasad4685 6 months ago
Thanks, ma'am, very useful.
@SheepKev 5 months ago
I'm a bit confused at 12:34, how does `(w3x1 + w4x2) + b2` turn into `(w1x1 + w2x2) + b1`?
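I can't see the slide, so I can't say whether there is a typo at that timestamp, but in a standard single hidden layer those two expressions belong to two different neurons, each with its own weight pair and bias, so one does not "turn into" the other. A small illustrative sketch (all values and names are made up, not taken from the video):

```python
# Two neurons in the same hidden layer, each with its own weights and bias
# (purely illustrative values).
x1, x2 = 0.5, 1.5
w1, w2, b1 = 0.2, -0.4, 0.1      # parameters of neuron 1
w3, w4, b2 = 0.7, 0.3, -0.2      # parameters of neuron 2

z1 = (w1 * x1 + w2 * x2) + b1    # neuron 1's pre-activation
z2 = (w3 * x1 + w4 * x2) + b2    # neuron 2's pre-activation
```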
@joniendog6693 2 years ago
For a beginner like me, learning this feels a bit difficult, so to understand it faster I keep rewatching the video. Thank you, and I hope you stay healthy.
@deepthireddy1789 9 days ago
Malay?
@went-rogue 2 years ago
So I first saw you on AssemblyAI, loved your content, thank you so much.
@misraturp 2 years ago
Glad you enjoyed!
@dancode0775 2 years ago
Clear and precise, thank you.
@misraturp 2 years ago
Glad it was helpful!
@hshrestha2811 2 years ago
I think you need two columns in the W transpose matrix. They are currently shown as a single column, with w1 and w2 written together as the product w1w2. For the matrix multiplication to work, the number of columns in the W transpose matrix should equal the number of rows in the X matrix (2).
@bay-bicerdover 1 year ago
Disagree
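For anyone following this shape discussion, here is a minimal NumPy sketch of the rule being referenced. The sizes and weight values are illustrative only, not taken from the video's slides:

```python
import numpy as np

# Illustrative sizes: 2 input features, 3 neurons in the layer.
X = np.array([[0.5],
              [1.5]])            # shape (2, 1): inputs x1, x2 as a column vector
W = np.random.randn(2, 3)        # shape (2, 3): one column of weights per neuron

# W.T has shape (3, 2); its column count (2) matches the row count of X (2),
# so the product is defined and yields one pre-activation per neuron.
Z = W.T @ X                      # shape (3, 1)
print(Z.shape)                   # (3, 1)
```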
@schorrer17 1 year ago
Hey, thanks for the great video! What would the formula be if there were a second hidden layer? Assuming B is the output of the second layer, W2^T is the weight matrix of the second layer, b2 is its bias, and so on, would it be B = beta{W2^T * [alpha(W1^T * X + b1)] + b2}? Is this correct? Thanks for any help.
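That expression matches the usual forward pass through two layers. Here is a minimal NumPy sketch of it, assuming illustrative layer sizes and picking ReLU and sigmoid as stand-ins for the activations alpha and beta (the video may use different ones):

```python
import numpy as np

def alpha(z):
    """Hidden-layer activation; ReLU chosen here only as an example."""
    return np.maximum(0, z)

def beta(z):
    """Output activation; sigmoid chosen here only as an example."""
    return 1 / (1 + np.exp(-z))

# Illustrative sizes: 2 inputs -> 3 hidden units -> 1 output.
X  = np.array([[0.5], [1.5]])        # (2, 1)
W1 = np.random.randn(2, 3)           # first-layer weights
b1 = np.random.randn(3, 1)           # first-layer bias
W2 = np.random.randn(3, 1)           # second-layer weights
b2 = np.random.randn(1, 1)           # second-layer bias

A1 = alpha(W1.T @ X + b1)            # first layer output, shape (3, 1)
B  = beta(W2.T @ A1 + b2)            # B = beta(W2^T * alpha(W1^T * X + b1) + b2), shape (1, 1)
```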
@himanshupandey9902 2 years ago
Great... please keep it up.
@misraturp 2 years ago
:)
@mohammedzia1015 2 years ago
Please share the slides and course notes for Lesson 2, Module 1.
@misraturp 2 years ago
Thanks for the heads up. They're there now.
@lakeguy65616 2 years ago
ReLU: if the value is less than zero, ReLU returns 0; if the value is zero or greater, it returns the value unchanged.
@bay-bicerdover 1 year ago
Well?
@lakeguy65616 1 year ago
@bay-bicerdover Well, what?
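A one-line NumPy sketch of the ReLU described above (illustrative, not code from the course):

```python
import numpy as np

def relu(z):
    # Negative values become 0; zero and positive values pass through unchanged.
    return np.maximum(0, z)

print(relu(np.array([-2.0, 0.0, 3.5])))   # [0.  0.  3.5]
```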
@bkentffichter 1 year ago
This is all fine, but I have no idea what all of this means as far as learning goes. I need some sort of practical example.