PyTorch Tutorial 12 - Activation Functions

38,714 views

Patrick Loeber


Comments: 26
@normalperson1130 4 years ago
Thanks. This couldn't have come at a better time. I have a project at my university that needs PyTorch, and this will be greatly helpful.
@patloeber 4 years ago
Thanks for watching! I'm glad it is helpful :)
@alteshaus3149 3 years ago
Thank you for these great videos. I can really learn a lot, and it will help me in my future career.
@haoranwan1233 2 years ago
Thanks! The best tutorial I have ever seen!
@mahmoudabbasi1994 3 years ago
A great course. Thank you!
@Ftur-57-fetr 3 years ago
Simple, clear, THANKS!!!!!!!!
@patloeber 3 years ago
Glad it helped!
@riadhossainbhuiyan4978 7 months ago
Great teaching ❤
@nicolasgabrielsantanaramos291 4 years ago
Thanks for the class!!!
@beltraomartins 2 years ago
Thank you!
@urosgrandovec3409 1 year ago
Without activation functions, a composition of linear transformations is again a linear transformation.
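A quick plain-Python sketch of this point (the scalar coefficients are made up for illustration): composing two affine maps g(f(x)) collapses into a single affine map a*x + b, so without a non-linearity between layers, any stack of linear layers can only ever represent one linear transformation.

```python
# two affine maps, f(x) = a1*x + b1 and g(x) = a2*x + b2
a1, b1 = 2.0, 1.0
a2, b2 = -3.0, 0.5
f = lambda x: a1 * x + b1
g = lambda x: a2 * x + b2

# the composition g(f(x)) collapses into a single affine map a*x + b
a, b = a2 * a1, a2 * b1 + b2

x = 1.7
assert abs(g(f(x)) - (a * x + b)) < 1e-9  # same map, one layer
```

The same collapse happens with matrices: W2 @ (W1 @ x + b1) + b2 is again of the form W @ x + b.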
@amrutapatil1417 3 years ago
Great tutorial, thank you so much!!!
@patloeber 3 years ago
You're very welcome!
@DanielWeikert 4 years ago
Thanks for all your effort. Appreciate it! Could you do a video on text generation with pytorch? Best regards
@patloeber 4 years ago
Thanks for watching! I have 4 more tutorials planned for this playlist: feed-forward net, convolutional neural net, transfer learning, and TensorBoard. After that I want to do more practical applications, so yeah, text generation is a nice suggestion :)
@alirezamohseni5045 6 months ago
Thanks a lot 😀
@HoangNguyen-be4vy 4 years ago
As far as I understand your example, this NN has 3 layers: 1 input layer, 1 hidden layer (uses the ReLU function), and 1 output layer (size 1, uses the sigmoid function). Am I understanding it correctly? Correct me if I am wrong.
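For illustration, here is a plain-Python sketch of that architecture (the weights are toy values made up for this example, not taken from the video): a hidden layer with ReLU, then a size-1 output layer with sigmoid, so the final value is a probability in (0, 1).

```python
import math

def relu(x):
    return max(0.0, x)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# toy weights: 2 inputs -> 2 hidden units -> 1 output
W1 = [[0.5, -0.3], [0.8, 0.2]]
b1 = [0.1, -0.1]
W2 = [0.7, -0.4]
b2 = 0.05

def forward(x):
    # hidden layer: linear transform followed by ReLU
    h = [relu(sum(w * xi for w, xi in zip(row, x)) + b)
         for row, b in zip(W1, b1)]
    # output layer: linear transform followed by sigmoid
    return sigmoid(sum(w * hi for w, hi in zip(W2, h)) + b2)

y = forward([1.0, 2.0])
assert 0.0 < y < 1.0  # sigmoid squashes the output into (0, 1)
```

In PyTorch this is the same structure as two nn.Linear layers with relu applied after the first and sigmoid after the second.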
@patloeber 4 years ago
Exactly :)
@saurrav3801 4 years ago
Thanks bro, I saw it
@abderrahmanebououden5173 4 years ago
Thanks bro :)
@VarunKumar-pz5si 3 years ago
Please do some videos on computer vision projects.
@patloeber 3 years ago
Yes good suggestion!
@anonim5052 8 months ago
great
@bluebox6307 3 years ago
If anybody wonders (like I did) whether there is any difference between instantiating nn.Sigmoid() and calling torch.sigmoid(), check out the answer from KFrank at discuss.pytorch.org/t/torch-nn-sigmoid-vs-torch-sigmoid/57691/3 :) (and according to discuss.pytorch.org/t/is-there-any-different-between-torch-sigmoid-and-torch-nn-functional-sigmoid/995, torch.nn.functional.sigmoid seems to be deprecated by now)
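A minimal sketch of the two spellings (assuming PyTorch is installed): nn.Sigmoid() is a module you can drop into an nn.Sequential, while torch.sigmoid() is a plain function you call on a tensor; both compute the same values.

```python
import torch
import torch.nn as nn

x = torch.linspace(-3.0, 3.0, 7)

module = nn.Sigmoid()          # module form: usable as a layer in nn.Sequential
y_module = module(x)
y_function = torch.sigmoid(x)  # functional form: called directly on a tensor

# both forms compute exactly the same values
assert torch.allclose(y_module, y_function)
```

Which one to use is mostly a style choice: the module form fits declarative model definitions, the function form fits an explicit forward() method.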
@joywang8173 1 year ago
Does anyone know why tanh is a good option for hidden layers? I'm new to deep learning 😊
@saimakhalil5137 4 months ago
The tanh function outputs values between -1 and 1, which helps center the activations around zero. Tanh is also non-linear, which allows neural networks to learn complex relationships in data. Tanh has stronger gradients than sigmoid, particularly around zero, which can speed up learning and convergence, especially in deeper networks. Unlike sigmoid, which outputs values between 0 and 1 centered around 0.5, tanh outputs values between -1 and 1 centered around 0. This can make optimization easier because weights and biases can be updated in both directions in weight space.
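These properties are easy to check numerically with a small plain-Python sketch: at 0, tanh is zero-centered while sigmoid sits at 0.5, and tanh's gradient there is four times stronger (1.0 vs 0.25).

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def d_sigmoid(x):
    # sigmoid'(x) = sigmoid(x) * (1 - sigmoid(x))
    s = sigmoid(x)
    return s * (1.0 - s)

def d_tanh(x):
    # tanh'(x) = 1 - tanh(x)^2
    return 1.0 - math.tanh(x) ** 2

# zero-centered output vs output centered at 0.5
assert math.tanh(0.0) == 0.0
assert sigmoid(0.0) == 0.5

# gradient at 0: tanh is 4x stronger than sigmoid
assert d_tanh(0.0) == 1.0
assert d_sigmoid(0.0) == 0.25
```

Both functions still saturate for large |x|, which is one reason ReLU is the more common default in deep networks.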