Dropout Regularization | Deep Learning Tutorial 20 (Tensorflow2.0, Keras & Python)

  91,642 views

codebasics

A day ago

Comments: 76
@codebasics · 2 years ago
Check out our premium machine learning course with 2 Industry projects: codebasics.io/courses/machine-learning-for-data-science-beginners-to-advanced
@ITVishal · 1 year ago
no
@codinghighlightswithsadra7343 · 1 year ago
I'm at a loss for words to express my gratitude towards you. Your tutorial is amazing, thank you so much.
@shaiksuleman3191 · 4 years ago
The sky has no limits, and your teaching leaves no questions behind. Those who dislike this are like people trying to find the color of water, or running a car without petrol.
@codebasics · 4 years ago
😊
@geekyprogrammer4831 · 2 years ago
Such a funny analogy at the beginning! You are a true genius educator :D :D
@clementvirgeniya8000 · 4 years ago
Sir, your way of teaching is awesome. Please do videos on multi-class classification problems in deep learning.
@rafibasha4145 · 2 years ago
Hi, can you please let me know how dropout works during the testing phase?
@nitishkeshri2378 · 2 years ago
@@rafibasha4145 basically it randomly drops neurons from the hidden layers during training
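To make that concrete, here is a minimal NumPy sketch of the (inverted) dropout behaviour being discussed; the rate, layer size, and seed are illustrative choices, not values from the video.

```python
import numpy as np

rng = np.random.default_rng(42)
rate = 0.5                          # fraction of neurons to drop
activations = rng.random((4, 8))    # batch of 4 samples, 8 neurons

# Training: zero out a random subset of neurons and scale the survivors
# by 1/(1 - rate) so the expected activation stays the same.
mask = rng.random(activations.shape) >= rate
train_out = activations * mask / (1 - rate)

# Inference: dropout does nothing; every neuron participates.
test_out = activations
```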
@karthikc8992 · 4 years ago
happy to be here again!!!
@codebasics · 4 years ago
Sure, Karthik. Thanks.
@manideep1882 · 3 years ago
And here I was waiting for dropout regularization to delete dense layers #2 and #3!! hahaha. Great stuff. Keep up the good work.
@pa5119 · 4 years ago
Sir, your explanation is great. But please make the videos in this series faster, so that as our exams come near we can prepare well and finish in less time. Thanks a lot for making such good videos.
@codebasics · 4 years ago
I will try my best, Pa. And thanks for your kind words.
@bratsummer1980 · 4 years ago
@@codebasics Sir, actually my project is on continuous data with 5 inputs and one output. I have used multiple linear regression, and I am also building an ANN with batch gradient descent and TensorFlow. I request you to please upload a video on continuous outputs in supervised learning, like predicting salary with TF.
@jongcheulkim7284 · 3 years ago
I am enjoying your tutorials. Thank you so much.
@codebasics · 3 years ago
Glad you like them!
@hardikvegad3508 · 4 years ago
Sir, please cover the concept of EARLY STOPPING... I know the implementation part but want to understand it in depth.
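For anyone curious before a dedicated video exists: the stopping rule behind Keras's EarlyStopping callback can be sketched in plain Python. The patience value and the loss curve below are made-up illustrations, not anything from the course.

```python
def early_stop_epoch(val_losses, patience=3):
    """Return the epoch at which training would stop: the first epoch
    with no validation-loss improvement for `patience` epochs."""
    best, best_epoch = float('inf'), 0
    for epoch, loss in enumerate(val_losses):
        if loss < best:
            best, best_epoch = loss, epoch
        elif epoch - best_epoch >= patience:
            return epoch          # no improvement for `patience` epochs
    return len(val_losses) - 1    # never triggered: train to the end

# Hypothetical loss curve: best at epoch 2, so with patience=3
# training stops at epoch 5.
losses = [1.0, 0.8, 0.7, 0.72, 0.71, 0.73, 0.74]
```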
@achelias8477 · 2 years ago
Thank you for this amazing tutorial! I even understood batch size, even though that wasn't my goal with this video!
@sandiproy330 · 1 year ago
Thank you. Nicely explained with a clear-cut example.
@MrSHANKSHINE · 3 years ago
Awesome sir, thank you so much for making us understand such important concepts in a simple and easy way..!!!
@balajiplatinum08 · 3 years ago
Hi, in deep learning, can you please post some videos on hyperparameter tuning? Thanks
@tchintchie · 4 years ago
Invaluable 👏
@codebasics · 4 years ago
Glad you think so!
@iaconst4.0 · 8 months ago
The accuracy came out almost the same for me; still, I appreciate the video.
@devilzwishbone · 1 year ago
05:10 so effectively dropout could be considered similar to a train/test split, in that it trains neurons A and C, then adjusts B and D based on the results from A and C?
@ncf2294 · 3 years ago
Thank you for your tutorial. I have learned a lot from it.
@mohdsyukur1699 · 7 months ago
You are the best, my boss.
@jansirani4429 · 7 months ago
Very good explanation
@jvandeal · 2 years ago
This was so good, thank you!
@fahadreda3060 · 4 years ago
Great tutorial, love the biryani example 😂😂
@codebasics · 4 years ago
Ha ha.. I knew, Fahad, that some biryani lovers were going to like it for sure. Looks like you like biryani, correct? :)
@fahadreda3060 · 4 years ago
@@codebasics Who doesn't like biryani 😂😂? How much time does it take you to make these amazing videos? I teach data science in Arabic, which is much harder than in English, because some terms don't have a proper translation and there is no source for data science in Arabic!! So one webinar takes me 5 days, and a video takes me around 2 days, including video editing. So how much time does it take you to make these videos? Thanks again!
@ITVishal · 1 year ago
what a lecture omg
@very_nice_777 · 1 year ago
Sir, can you explain why dropping 50% of the neurons isn't the same as reducing the number of neurons by 50%? For example, instead of taking 60 neurons and dropping 50%, why don't we just take 30 neurons to begin with? Thanks in advance.
@NitinKumar-wm2dg · 1 year ago
Because the network might become biased towards some neurons. With each epoch, dropout randomly drops neurons and trains the model, then back-propagates. This way we avoid bias in the neural network and train our model more robustly.
@r0cketRacoon · 8 months ago
@@NitinKumar-wm2dg so the whole point of dropout is that for each epoch it RANDOMLY drops neurons (different from the ones dropped in the previous and following epochs) and trains with the remaining ones?
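That is the idea, with one refinement: in Keras a fresh random mask is actually sampled on every forward pass (every batch), not just once per epoch. A small illustrative-only simulation (made-up sizes and seed) shows why this differs from permanently removing half the neurons: over many steps, every neuron still gets trained about half the time.

```python
import numpy as np

rng = np.random.default_rng(0)
n_neurons, steps, rate = 10, 200, 0.5
times_active = np.zeros(n_neurons, dtype=int)

for _ in range(steps):
    mask = rng.random(n_neurons) >= rate  # a new mask every step
    times_active += mask                  # count when each neuron trains
```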
@vishaltanawade7637 · 3 years ago
9:10 can we replace M and R with 0 and 1 instead of using dummy variables??
@parthjasani2114 · 2 years ago
Yes, you can.
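For a binary target either encoding works; this sketch (with hypothetical 'M'/'R' labels standing in for the sonar dataset's target column) shows both options side by side. A single 0/1 column pairs with one sigmoid output unit, while the dummy columns pair with two softmax outputs.

```python
import pandas as pd

# Hypothetical binary labels, like the video's 'M'/'R' target
y = pd.Series(['M', 'R', 'M', 'R', 'M'])

# Option 1: a single 0/1 column
y_binary = y.map({'M': 1, 'R': 0})

# Option 2: one-hot / dummy columns
y_dummies = pd.get_dummies(y)
```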
@clementvirgeniya8000 · 4 years ago
Sir, my dataset has 20 target classes (i.e., a multi-class problem). When I train and test, the accuracy is only 45%. I am a little stuck with this. It would be helpful if you could give me some suggestions.
@bratsummer1980 · 4 years ago
We are huge fans of yours.....
@Martyniqo · 2 years ago
Thank You so much!
@bratsummer1980 · 4 years ago
Sir, please give an example with continuous-output regression or multi-class classification.
@AKSHAY99552 · 3 years ago
Really great explanation..
@AlonAvramson · 3 years ago
Thank you!
@abdansyakura2982 · 2 years ago
Hi sir, I have a question about the dropout technique. As we can see, it randomly deactivates neurons; what about during testing, are they still deactivated?
@fakharyarkhan5848 · 2 years ago
They're only deactivated during training. During testing all the neurons are active, and the weights are rescaled by the keep probability (one minus the dropout rate).
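A small NumPy sketch (illustrative numbers only) of the two equivalent conventions: classic dropout rescales at test time as described above, while the inverted dropout that Keras's Dropout layer implements rescales during training, so test time needs no change at all.

```python
import numpy as np

rng = np.random.default_rng(0)
drop_rate = 0.5
keep_prob = 1 - drop_rate
acts = np.ones(200_000)          # pretend activations, all 1.0

# Classic dropout: mask during training, scale by keep_prob at test.
classic_train = acts * (rng.random(acts.size) >= drop_rate)
classic_test = acts * keep_prob

# Inverted dropout (Keras): scale during training, identity at test.
inverted_train = acts * (rng.random(acts.size) >= drop_rate) / keep_prob
inverted_test = acts
```

In both schemes the expected activation seen at test time matches what the network was trained on, which is why the two conventions are interchangeable.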
@AquarianVikas · 2 years ago
Hi, when I ran the classification report after adding the dropout layers, I got slightly lower accuracy and F1 scores. Is this normal, or could I have made some mistake?
@PhaniHarshithKotturu · 1 year ago
It is correct; adding a dropout layer often decreases the accuracy a little, but that is fine because it generalizes your model so that it performs well in all conditions.
@r0cketRacoon · 8 months ago
High accuracy is not always good; it can also mean your model is dealing with overfitting.
@souvikghosh6509 · 3 years ago
Sir, is it possible to apply dropout in a deep autoencoder?
@osamashawky622 · 3 years ago
very good
@faezeabdolinejad731 · 3 years ago
It's awesome, thanks
@codebasics · 3 years ago
Glad you like it!
@sumitchhabra2419 · 3 years ago
I love your tutorials, brother. I have just one question: in every iteration we have a new batch of data, and the dropped neurons are chosen at random, right? From this I infer that neurons will learn from different data and will not be biased towards certain inputs, right?
@codebasics · 3 years ago
Yes, that is the correct understanding.
@Adinasa2 · 3 years ago
Looks like you always get confused between the input layer dimensions and the hidden layer dimensions.
@mdsifath7741 · 4 years ago
Sir, can you make or suggest a video on the ADAM optimizer? @codebasics
@codebasics · 4 years ago
Yes, I will be adding that to this series.
@wbh786 · 2 years ago
love u sir
@bratsummer1980 · 4 years ago
Sir, one more question: is batch gradient descent also an artificial neural network?
@codebasics · 4 years ago
Batch gradient descent is just a variant of gradient descent. It can be used in artificial neural networks as well as in classical models such as linear or logistic regression.
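As a tiny illustration of the "batch" part, here is a made-up gradient-descent sketch fitting a line; the data, learning rate, and step count are all arbitrary. The defining feature is that each update averages the gradient over the full dataset rather than a single sample or mini-batch.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.random(100)
y = 3 * x + 2                     # ground truth: w = 3, b = 2

w, b, lr = 0.0, 0.0, 0.5
for _ in range(2000):
    err = (w * x + b) - y
    w -= lr * (err * x).mean()    # gradient averaged over the FULL batch
    b -= lr * err.mean()          # (that is what makes it "batch")
```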
@karthikb.s.k.4486 · 4 years ago
Sir, what is the keyboard that you are using for programming?
@codebasics · 4 years ago
Any keyboard is OK, Karthik.
@karthikb.s.k.4486 · 4 years ago
@codebasics Thanks, sir
@asifurrahmankhan5006 · 3 years ago
Why use one-hot encoding to convert 'y' into integers? Can't we do that with a simple 0 and 1 conversion? Can you clarify this please?
@kmnm9463 · 2 years ago
Hi - if you, say, LabelEncode the y, the values will be 0 and 1, obviously, but the model may treat those values as having a numerical order, i.e. 1 > 0. That is not what we want here; we want the model to learn M and R, not M > R or R > M. That is why one-hot encoding is the better option. Sir has a detailed video where he explains the logic of one-hot encoding very clearly; you can find it by searching YouTube for 'One Hot Encode Codebasics'. Thanks.
@spadiyar6725 · 3 years ago
Why is dropout not used for the test and validation data?
@ashishsasankar1479 · 1 year ago
I am getting an 'accuracy not defined' error message.
@shaiksuleman3191 · 4 years ago
Sir, can you please make videos on PyTorch?
@codebasics · 4 years ago
Sure, it is on my to-do list. But first let me finish this TensorFlow series. I am also working on a data structures series that I need to finish as well.
@shaiksuleman3191 · 4 years ago
@@codebasics sorry sir, by mistake. I meant PyCaret.
@rishavbhattacharjee7182 · 4 years ago
Sir, I can't download the CSV file.
@codebasics · 4 years ago
Check the video description; there is a CSV file path.
@rishavbhattacharjee7182 · 4 years ago
@@codebasics Thank you sir 😁
@dataflex4440 · 2 years ago
Tea set or t-shirt 😂