Thanks. This couldn't have come at a better time. I have a university project that needs PyTorch, and this will be greatly helpful.
@patloeber 4 years ago
Thanks for watching! I'm glad it is helpful :)
@alteshaus3149 3 years ago
Thank you for these great videos. I can really learn a lot, and it helps me with my future career.
@haoranwan1233 2 years ago
Thanks! The best tutorial I have ever seen!
@mahmoudabbasi1994 3 years ago
A great course. Thank you!
@Ftur-57-fetr 3 years ago
Simple, clear, THANKS!!!!!!!!
@patloeber 3 years ago
Glad it helped!
@riadhossainbhuiyan4978 7 months ago
Great teaching ❤
@nicolasgabrielsantanaramos291 4 years ago
Thanks for the class!!!
@beltraomartins 2 years ago
Thank you!
@urosgrandovec3409 1 year ago
Without activation functions, a composition of linear transformations is again a linear transformation.
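This is easy to verify in PyTorch. A minimal sketch (the layer sizes are arbitrary, chosen only for illustration) showing that two stacked nn.Linear layers with no activation in between collapse into a single linear map:

```python
import torch
import torch.nn as nn

# Two linear layers with no activation in between (bias omitted for clarity;
# sizes are arbitrary, just for illustration).
f = nn.Linear(4, 8, bias=False)
g = nn.Linear(8, 3, bias=False)

x = torch.randn(5, 4)
y = g(f(x))                      # composition of two linear maps

# The same result comes from a single linear map whose weight is W_g @ W_f.
W = g.weight @ f.weight          # shape (3, 4)
y_single = x @ W.t()

print(torch.allclose(y, y_single, atol=1e-6))  # True
```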
@amrutapatil1417 3 years ago
Great tutorial, thank you so much!!!
@patloeber 3 years ago
You're very welcome!
@DanielWeikert 4 years ago
Thanks for all your effort. Appreciate it! Could you do a video on text generation with PyTorch? Best regards
@patloeber 4 years ago
Thanks for watching! I have 4 more tutorials planned for this playlist:
- feed-forward net
- convolutional neural net
- transfer learning
- TensorBoard
After that I want to do more practical applications, so yeah, text generation is a nice suggestion :)
@alirezamohseni5045 6 months ago
Thanks a lot 😀😀😀😀
@HoangNguyen-be4vy 4 years ago
As far as I understand your example, this NN has 3 layers:
- 1 input layer
- 1 hidden layer (uses the ReLU function)
- 1 output layer (size 1, uses the sigmoid function)
Am I understanding it correctly? Correct me if I am wrong.
@patloeber 4 years ago
Exactly :)
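For reference, a minimal sketch of that architecture in PyTorch (the layer sizes here are assumptions for illustration, not taken from the video):

```python
import torch
import torch.nn as nn

class NeuralNet(nn.Module):
    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.linear1 = nn.Linear(input_size, hidden_size)  # input layer -> hidden layer
        self.relu = nn.ReLU()                              # hidden activation
        self.linear2 = nn.Linear(hidden_size, 1)           # hidden layer -> output of size 1
        self.sigmoid = nn.Sigmoid()                        # output activation

    def forward(self, x):
        out = self.relu(self.linear1(x))
        return self.sigmoid(self.linear2(out))

# Hypothetical sizes, just to show a forward pass.
model = NeuralNet(input_size=10, hidden_size=5)
y = model(torch.randn(3, 10))   # y has shape (3, 1), values in (0, 1)
```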
@saurrav3801 4 years ago
Thanks bro, I saw it.
@abderrahmanebououden5173 4 years ago
thanks bro :)
@VarunKumar-pz5si 3 years ago
Please do some videos on Computer Vision projects.
@patloeber 3 years ago
Yes, good suggestion!
@anonim5052 8 months ago
great
@bluebox6307 3 years ago
If anybody wonders (like I did) whether there is any difference between instantiating nn.Sigmoid() and calling torch.sigmoid(), check out the answer from KFrank at discuss.pytorch.org/t/torch-nn-sigmoid-vs-torch-sigmoid/57691/3 :) (and according to discuss.pytorch.org/t/is-there-any-different-between-torch-sigmoid-and-torch-nn-functional-sigmoid/995, torch.nn.functional.sigmoid seems to be deprecated by now)
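In short, both compute the same function; nn.Sigmoid is a module you instantiate (handy inside nn.Sequential), while torch.sigmoid is a plain function. A minimal sketch:

```python
import torch
import torch.nn as nn

x = torch.randn(3)

# Module form: an object you create once (useful inside nn.Sequential).
sig = nn.Sigmoid()
a = sig(x)

# Function form: called directly on a tensor.
b = torch.sigmoid(x)

print(torch.allclose(a, b))  # True -- both compute 1 / (1 + exp(-x))
```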
@joywang8173 1 year ago
Does anyone know why tanh is a good option for hidden layers? I'm new to deep learning 😊
@saimakhalil5137 4 months ago
The tanh function outputs values between -1 and 1, which helps center the activations around zero. It is also non-linear, which allows neural networks to learn complex relationships in the data. Tanh has stronger gradients than sigmoid, particularly around zero, which can facilitate learning and convergence, especially in deeper networks. Unlike sigmoid, which outputs values between 0 and 1 centered around 0.5, tanh's output is centered around 0, which can make optimization easier because weights and biases can be updated in both directions in weight space.
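The "stronger gradients around zero" point is easy to check with autograd. A small sketch: at x = 0, tanh has derivative 1 while sigmoid has derivative 0.25:

```python
import torch

x = torch.tensor(0.0, requires_grad=True)
torch.tanh(x).backward()
print(x.grad)    # tensor(1.) -- tanh'(0) = 1 - tanh(0)**2 = 1

x = torch.tensor(0.0, requires_grad=True)
torch.sigmoid(x).backward()
print(x.grad)    # tensor(0.2500) -- sigmoid'(0) = 0.5 * (1 - 0.5) = 0.25
```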