Thanks, sir ji, for such videos! Please start a BERT/Transformer series!
@SethuIyer95, 8 months ago
I think you nailed it with the random forest analogy. That's just brilliant.
@SurajitDas-gk1uv, a year ago
Very good tutorial. Thank you very much.
@alirazatareen5651, 3 months ago
Amazing, you are the king of deep learning!
@MonikaSingh-nu5sg, 2 years ago
Amazing video again ...
@JiwanRai-mj7ke, 5 months ago
@Nitish Singh, I really loved all your videos and the way you explain the concepts. Looking forward to more videos of this nature in the future. I am learning AI through your videos. Thanks for the amazing work you are doing. Lots of support from Bhutan!
@sakshamjain4640, a month ago
It's amazing. I was thinking the Dropout layer is like a random forest with regression logic; maybe the intuition behind everything is similar, with slight differences. Amazing video, really impressed!
@narendraparmar1631, 9 months ago
Thanks for this simple explanation.
@rayanali9737, a year ago
Very simple and precise explanation. Thanks.
@sujithsaikalakonda4863, a year ago
Very well explained.
@nomannosher8928, a month ago
I wish I had watched this video before my interview 😥. The interviewer watched it before me.
@ShahnazMalik., a year ago
Brilliantly explained. May I please know what pen and drawing pad you use to write the mathematics and explanations? Thanks, and keep uploading videos; I am sure you will soon have millions of subscribers.
@aakashbarwad2420, 8 months ago
Wonderful videos all
@hamzakhanswati9087, a year ago
Amazing❤❤
@ekanshuagrawal5931, 10 months ago
Thanks for providing the paper
@mustafizurrahman5699, a year ago
Sir, you are a genius.
@paragbharadia2895, 3 months ago
Amazing video, thank you. Learned so much!
@ParthivShah, 7 months ago
Thank You Sir.
@vinaynaik953, 2 years ago
Great
@sandipansarkar9211, 2 years ago
Finished watching.
@ParthivShah, 7 months ago
Best video.
@chetanchavan647, 8 months ago
Great Video
@AayushShrivastavaIITH, 3 months ago
If the weights are learned while nodes are available with ratio 0.75, then during testing we use those same weight values to predict the output. Why multiply by the availability ratio of 0.75 during testing?
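A possible way to see the 0.75 factor (my own NumPy illustration, not from the video): scaling by the keep probability at test time makes the deterministic output match, in expectation, the stochastic outputs the next layer saw during training.

```python
import numpy as np

rng = np.random.default_rng(0)
p_drop = 0.25          # probability a node is dropped
keep = 1.0 - p_drop    # 0.75 = probability a node is kept

activations = np.array([2.0, 4.0, 6.0, 8.0])

# Training: each node survives a pass with probability `keep`, so,
# averaged over many stochastic passes, each node contributes
# keep * activation in expectation.
n_passes = 100_000
masks = rng.random((n_passes, activations.size)) >= p_drop
train_expectation = (activations * masks).mean(axis=0)

# Testing: all nodes are active, so we multiply by `keep` to match
# the expected layer output seen during training.
test_output = activations * keep

print(train_expectation)  # ≈ [1.5, 3.0, 4.5, 6.0]
print(test_output)        # exactly [1.5, 3.0, 4.5, 6.0]
```

Without the 0.75 factor, the test-time layer outputs would be systematically larger than anything the downstream weights were trained on.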
@barryallen3051, 2 years ago
I have a request. Can you please make a video about Monte Carlo (MC) Dropout too?
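For anyone curious before such a video exists: Monte Carlo Dropout keeps dropout active at prediction time and averages many stochastic forward passes; the spread of those predictions gives a rough uncertainty estimate. A minimal NumPy sketch with a toy fixed-weight layer (the weights and function names here are my own illustration, not a real model):

```python
import numpy as np

rng = np.random.default_rng(42)

def dropout_layer(x, p_drop, training, rng):
    """Inverted dropout: scale at train time so no test-time rescaling is needed."""
    if not training:
        return x
    mask = rng.random(x.shape) >= p_drop
    return x * mask / (1.0 - p_drop)

def model(x, training, rng):
    # A toy one-hidden-layer net with fixed weights, just to show the idea.
    h = np.maximum(0.0, x * np.array([1.0, -0.5, 2.0]))   # ReLU hidden layer
    h = dropout_layer(h, p_drop=0.25, training=training, rng=rng)
    return h.sum()

x = 1.5
# MC Dropout: keep dropout ON at prediction time, run many stochastic
# passes, and read off the mean (prediction) and std (uncertainty).
samples = np.array([model(x, training=True, rng=rng) for _ in range(1000)])
print(samples.mean(), samples.std())
```

The mean stays close to the ordinary deterministic prediction, while the standard deviation quantifies how sensitive the output is to which nodes are dropped.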
@Singasongwithme2004, 3 months ago
In the last topic, on how prediction works, you said p = 0.25 means only one node will be ignored during training in every epoch. But I think if you want to drop only one node from a layer (the layer is made up of 4 nodes), the p value should be 0.75, using the formula p = neurons to keep / total neurons. If I am wrong please correct me, and if we are both right then let me know please 🙂
@savyasachi6988, 5 months ago
hidden gem
@amLife07, 4 months ago
Thank you so much, Sir...
@vanshshah6418, 2 years ago
Thanks, awesome!
@AtharvaAnkushSadanshive, 11 months ago
Hello Sir, you mentioned in the video that the probability of each weight being present during testing is 0.75, given that p is 0.25. But since the nodes to drop are selected randomly, isn't it possible in the worst case that one node is present all the time, e.g., present for all 100 epochs?
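On the worst case raised above: it is possible in principle, but because the mask is redrawn independently on each pass, the probability shrinks geometrically. A back-of-envelope sketch (assuming, as in the comment, one independent mask per epoch):

```python
keep = 0.75     # probability a given node is present in one pass
n_passes = 100

# Independent draws each pass, so the chance that one particular node
# is never dropped across all 100 passes is keep ** 100.
p_always_present = keep ** n_passes
print(p_always_present)  # ~3e-13: possible in principle, vanishingly rare
```

In practice masks are sampled per mini-batch rather than per epoch, which multiplies the number of draws and makes the event rarer still; and even if it happened, it would only mean that one node received no extra regularization, not that training breaks.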
@rb4754, 5 months ago
Awesome teaching...
@arin_p, a month ago
good
@upa240, 2 years ago
Is it more accurate to say "for every combination of forward and backward passes" instead of "for each epoch"? One epoch may consist of several mini-batches. Also, is it common to apply dropout to the input layer? Thanks...
@arush6070, 2 years ago
I guess we usually don't apply dropout to the input layer.
@ickywitchy4667, 8 months ago
Yeah, you are right.
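On the epoch vs. mini-batch point in this thread: in standard implementations the dropout mask is resampled on every forward pass, i.e., once per mini-batch, not once per epoch. A small NumPy sketch of that behaviour (my own illustration):

```python
import numpy as np

rng = np.random.default_rng(3)
p_drop = 0.25

def new_mask(shape, rng):
    # A fresh Bernoulli mask is drawn on EVERY forward pass
    # (once per mini-batch), not fixed for a whole epoch.
    return rng.random(shape) >= p_drop

# Simulate 10,000 mini-batch passes over a 4-node layer.
masks = np.stack([new_mask((4,), rng) for _ in range(10_000)])
print(masks.mean())  # ≈ 0.75: each node is kept with probability 1 - p_drop
```

Each pass sees a different thinned sub-network, which is exactly what gives dropout its ensemble-like flavour.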
@sharangkulkarni1759, a year ago
Good
@hussainshah7065, a year ago
Sir, please share the OneNote link for the notes.
@prakharagarwal9448, 2 years ago
Sir, please send the notes link.
@sajalhsn13, 2 years ago
Off-topic: do you have any connection with the Bengali language? I am a Bengali, and for some reason I think you do. Just curious. :)
@siddharthagrawal6157, 2 years ago
Sir, how do you make these amazing tutorials? What do you use to write on the screen? Just curious.