Check out our premium machine learning course with 2 Industry projects: codebasics.io/courses/machine-learning-for-data-science-beginners-to-advanced
@ITVishal · 1 year ago
no
@codinghighlightswithsadra7343 · 1 year ago
I'm at a loss for words to express my gratitude towards you. Your tutorial is amazing, thank you so much.
@shaiksuleman3191 · 4 years ago
The sky has no limits, and your teaching leaves no more questions. Those who dislike this are like someone trying to find the color of water, or run a car without petrol.
@codebasics · 4 years ago
😊
@geekyprogrammer4831 · 2 years ago
Such a funny analogy at the beginning! You are a true genius educator :D :D
@clementvirgeniya8000 · 4 years ago
Sir, your way of teaching is awesome. Please also make videos on multi-class classification problems in deep learning.
@rafibasha4145 · 2 years ago
Hi, can you please let me know how dropout works during the testing phase?
@nitishkeshri2378 · 2 years ago
@@rafibasha4145 Basically it drops neurons randomly from the hidden layers.
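To fill in the rest of the answer to the question above: dropout only drops neurons while training; at test time every neuron stays active. A minimal NumPy sketch of that behavior (a toy illustration, not the actual Keras internals, which also rescale the surviving activations):

```python
import numpy as np

rng = np.random.default_rng(7)
rate = 0.5                      # fraction of neurons to drop
activations = np.ones(10)       # pretend outputs of a hidden layer

# Training: multiply by a random 0/1 mask, silencing roughly half the neurons
mask = rng.random(activations.shape) >= rate
train_out = activations * mask

# Testing: dropout is switched off; the layer passes values through unchanged
test_out = activations
```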
@karthikc8992 · 4 years ago
Happy to be here again!!!
@codebasics · 4 years ago
Sure, Karthik. Thanks.
@manideep1882 · 3 years ago
And here I was, waiting for dropout regularization to make you delete dense layers #2 and #3!! Hahaha. Great stuff, keep up the good work.
@pa5119 · 4 years ago
Sir, your explanation is great, great, great. But please release the videos in this series faster, so that as our exams come near we can prepare well in less time. Thanks a lot for making such good videos.
@codebasics · 4 years ago
I will try my best, Pa. And thanks for your kind words.
@bratsummer1980 · 4 years ago
@@codebasics Sir, my project is actually on continuous data with 5 inputs and one output. I have used multiple linear regression, and I am also building an ANN with batch gradient descent in TensorFlow. I request you to please upload a video on continuous-output (regression) supervised learning, like predicting salary with tf.
@jongcheulkim7284 · 3 years ago
I am enjoying your tutorials. Thank you so much.
@codebasics · 3 years ago
Glad you like them!
@hardikvegad3508 · 4 years ago
Sir, please cover the concept of EARLY STOPPING... I know the implementation part, but I want to understand it in depth.
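For anyone curious about the mechanics behind the request above: this is roughly the logic that Keras's `EarlyStopping(monitor="val_loss", patience=...)` callback implements, sketched here in plain Python against a made-up validation-loss curve (the numbers are invented for illustration):

```python
# Hypothetical validation losses, one per epoch
val_losses = [0.90, 0.70, 0.60, 0.62, 0.61, 0.65, 0.64]

patience = 2          # epochs to tolerate without improvement
best = float("inf")   # best validation loss seen so far
wait = 0              # epochs since the last improvement
stopped_epoch = None

for epoch, loss in enumerate(val_losses):
    if loss < best:           # improvement: remember it, reset the counter
        best = loss
        wait = 0
    else:                     # no improvement this epoch
        wait += 1
        if wait >= patience:  # patience exhausted: stop training early
            stopped_epoch = epoch
            break
```

With this curve, training stops at epoch 4 with a best loss of 0.60: epochs 3 and 4 both fail to beat it, exhausting the patience of 2.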
@achelias8477 · 2 years ago
Thank you for this amazing tutorial! I even understood batch size, without that being my goal with this video!
@sandiproy330 · 1 year ago
Thank you. Nicely explained with a clear-cut example.
@MrSHANKSHINE · 3 years ago
Awesome, sir. Thank you so much for making us understand such important concepts in a simple and easy way!!!
@balajiplatinum08 · 3 years ago
Hi, in deep learning, can you please post some videos on hyperparameter tuning? Thanks.
@tchintchie · 4 years ago
Invaluable 👏
@codebasics · 4 years ago
Glad you think so!
@iaconst4.0 · 8 months ago
My accuracy came out almost the same; nevertheless, I appreciate the video.
@devilzwishbone · 1 year ago
05:10 So effectively a dropout could be considered similar to a test/train split, in that it trains neurons A and C, then adjusts B and D based on the results from A and C.
@ncf2294 · 3 years ago
Thank you for your tutorial. I have learned a lot from it.
@mohdsyukur1699 · 7 months ago
You are the best, my boss.
@jansirani4429 · 7 months ago
Very good explanation
@jvandeal · 2 years ago
This was so good, thank you!
@fahadreda3060 · 4 years ago
Great tutorial, love the biryani example 😂😂
@codebasics · 4 years ago
Ha ha... I knew, Fahad, that some biryani lovers were going to like it for sure. Looks like you like biryani, correct? :)
@fahadreda3060 · 4 years ago
@@codebasics Who doesn't like biryani 😂😂? How much time does it take you to make these amazing videos? I teach data science in Arabic, which is way harder than in English, because some terms don't have a proper translation and there is no source for data science in Arabic!! So one webinar takes me 5 days, and a video takes me around 2 days, including video editing. So how much time does it take you to make these videos? Thanks again!
@ITVishal · 1 year ago
What a lecture, omg!
@very_nice_777 · 1 year ago
Sir, can you explain why dropping 50% of the neurons isn't the same as reducing the number of neurons by 50%? For example, instead of taking 60 neurons and dropping 50%, why don't we just take 30 neurons to begin with? Thanks in advance.
@NitinKumar-wm2dg · 1 year ago
Because the network might become biased towards some neurons. With each epoch, dropout randomly drops a different set of neurons, trains the model, then back-propagates. This way we avoid that bias in the neural network and train it more robustly.
@r0cketRacoon · 8 months ago
@@NitinKumar-wm2dg So the whole point of dropout is that for each epoch it RANDOMLY drops neurons (different from the neurons dropped in the previous and following epochs) and trains with the remaining ones?
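That understanding can be checked with a tiny sketch: a fresh random mask is drawn on every pass (frameworks actually redraw it per mini-batch, not just per epoch), so different neurons survive each time. A toy NumPy illustration with the 60-neuron layer mentioned above:

```python
import numpy as np

rng = np.random.default_rng(42)
n_neurons, rate = 60, 0.5

# A fresh random keep/drop mask is drawn on each pass through the layer
mask_epoch1 = rng.random(n_neurons) >= rate
mask_epoch2 = rng.random(n_neurons) >= rate
```

The two masks almost surely differ, which is exactly why dropout is not equivalent to simply building a smaller layer: every sub-network gets a turn at training, and no single set of 30 neurons can co-adapt.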
@vishaltanawade7637 · 3 years ago
9:10 Can we replace M and R with 0 and 1 instead of using dummy variables?
@parthjasani2114 · 2 years ago
Yes, you can.
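For a binary target like M/R, that direct replacement is a one-liner. A minimal pandas sketch, using a few toy labels rather than the actual sonar dataset from the video:

```python
import pandas as pd

# Toy stand-in for the M/R target column
df = pd.DataFrame({"label": ["R", "M", "R", "M"]})

# Map the two classes straight to 0/1 instead of creating dummy columns
df["y"] = df["label"].map({"R": 0, "M": 1})
```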
@clementvirgeniya8000 · 4 years ago
Sir, my dataset has 20 target classes (i.e., a multi-class problem). When I train and test, my accuracy is only 45%. I am a little bit stuck with this. It would be helpful if you could give me some suggestions.
@bratsummer1980 · 4 years ago
We are huge fans of yours.....
@Martyniqo · 2 years ago
Thank You so much!
@bratsummer1980 · 4 years ago
Sir, please give an example with continuous-output regression or multi-class classification.
@AKSHAY99552 · 3 years ago
Really great explanation.
@AlonAvramson · 3 years ago
Thank you!
@abdansyakura2982 · 2 years ago
Hi sir, I have a question about the dropout technique. Since it randomly deactivates neurons, what about during testing? Are neurons still deactivated?
@fakharyarkhan5848 · 2 years ago
They're only deactivated during training. During testing, all neurons stay active and the weights are rescaled by the keep probability (1 − p, where p is the probability that a node gets dropped).
@AquarianVikas · 2 years ago
Hi, when I ran the classification report after adding the dropout layers, I got slightly lower accuracy and F1 scores. Is this normal? Or could it be that I made some mistake?
@PhaniHarshithKotturu · 1 year ago
It is normal; adding a dropout layer often decreases accuracy a little, but that is fine because it generalizes your model so that it performs well in all conditions.
@r0cketRacoon · 8 months ago
High accuracy is not always good; it can also mean your model is overfitting.
@souvikghosh6509 · 3 years ago
Sir, is it possible to apply dropout in a deep autoencoder?
@osamashawky622 · 3 years ago
Very good.
@faezeabdolinejad731 · 3 years ago
It's awesome, thanks.
@codebasics · 3 years ago
Glad you like it!
@sumitchhabra2419 · 3 years ago
I love your tutorials, brother. I have just one question: in every iteration we have a new set of data, and neurons are chosen at random, right? From this I infer that neurons will learn different data and will not be biased towards certain inputs, right?
@codebasics · 3 years ago
Yes, that is the correct understanding.
@Adinasa2 · 3 years ago
Looks like you always get confused between input layer dimensions and hidden layer dimensions
@mdsifath7741 · 4 years ago
@codebasics Sir, can you make or suggest a video on the ADAM optimizer?
@codebasics · 4 years ago
Yes, I will be adding that in this series.
@wbh786 · 2 years ago
Love you, sir.
@bratsummer1980 · 4 years ago
Sir, one more question: is batch gradient descent also an artificial neural network?
@codebasics · 4 years ago
Batch gradient descent is just one variant of gradient descent, which is an optimization technique, not a model. It can be used to train artificial neural nets as well as classical statistical models such as logistic regression.
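As a sketch of what "batch" means in the reply above: the gradient is computed over the entire dataset on every update step (as opposed to one sample, or a mini-batch). A toy NumPy example fitting a single weight by minimizing mean squared error; the data and learning rate are invented for illustration:

```python
import numpy as np

# Toy 1-D regression data where the true relationship is y = 2 * x
x = np.array([1.0, 2.0, 3.0, 4.0])
y = 2.0 * x

w, lr = 0.0, 0.05          # initial weight and learning rate

for _ in range(200):
    # Gradient of MSE over the FULL batch: d/dw mean((y - w*x)^2)
    grad = -2 * np.mean(x * (y - w * x))
    w -= lr * grad          # one update per full pass over the data
```

After enough full-batch steps the weight converges to the true value of 2.0.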
@karthikb.s.k.4486 · 4 years ago
Sir, what keyboard do you use for programming?
@codebasics · 4 years ago
Any keyboard is OK, Karthik.
@karthikb.s.k.4486 · 4 years ago
codebasics Thanks sir
@asifurrahmankhan5006 · 3 years ago
Why one-hot encode to convert 'y' into integers? Can't we do that with a simple 0 and 1 conversion? Can you clarify this please?
@kmnm9463 · 2 years ago
Hi. If you, say, label-encode y, the values will obviously be 0 and 1, but the model may treat those values as having a numerical order, that is, 1 > 0. That is not what we want here: we want the model to learn M and R, not M > R or R > M. That is why one-hot encoding is the better option. There is a detailed video by Sir where he explains the logic of one-hot encoding very clearly; you can find it by searching YouTube for "One Hot Encode Codebasics". Thanks.
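One-hot encoding replaces the single label column with one indicator column per class, so no ordering is implied between them. A minimal pandas sketch with toy M/R labels (the video itself uses `pd.get_dummies` on the sonar target):

```python
import pandas as pd

labels = pd.Series(["M", "R", "R", "M"])

# One indicator column per class; neither column ranks above the other
one_hot = pd.get_dummies(labels)
```

For a strictly binary target, a plain 0/1 mapping and one-hot encoding carry the same information; the ordering concern matters most once there are three or more classes.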
@spadiyar6725 · 3 years ago
Why is dropout not used on test and validation data?
@ashishsasankar1479 · 1 year ago
I am getting the message "accuracy not defined".
@shaiksuleman3191 · 4 years ago
Sir, can you please make videos on PyTorch?
@codebasics · 4 years ago
Sure, it is on my to-do list, but first let me finish this TensorFlow series. I am also working on a data structures series that I need to finish as well.