Activation Functions In Neural Networks Explained | Deep Learning Tutorial

41,362 views

AssemblyAI

1 day ago

Get your Free Token for AssemblyAI Speech-To-Text API 👇
www.assemblyai.com/?...
In this video we are going to learn about Activation Functions in Neural Networks. We go over:
* The definition of activation functions
* Why they are used
* Different activation functions
* How to use them in code (TensorFlow and PyTorch)
Deep Learning In 5 Minutes video: • Deep learning in 5 min...
Different activation functions we go over (a short PyTorch sketch follows the list):
Step Functions, Sigmoid, TanH, ReLU, Leaky ReLU, Softmax
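Below is a minimal PyTorch sketch of the activations listed above (PyTorch being one of the two libraries the video covers); the input values are arbitrary and the snippet is illustrative, not the video's own code:

```python
# Illustrative sketch: the activations covered in the video, applied to a
# small sample tensor with PyTorch. Input values are arbitrary.
import torch
import torch.nn.functional as F

x = torch.tensor([-2.0, -0.5, 0.0, 1.5])

print(torch.heaviside(x, torch.tensor(0.0)))  # step function: 0 for x < 0, 1 for x > 0
print(torch.sigmoid(x))                       # squashes each value into (0, 1)
print(torch.tanh(x))                          # squashes each value into (-1, 1)
print(F.relu(x))                              # max(0, x): zero for negative inputs
print(F.leaky_relu(x, negative_slope=0.01))   # small slope for negative inputs
print(F.softmax(x, dim=0))                    # normalizes the vector into probabilities
```

The same functions are available in TensorFlow (e.g. tf.nn.sigmoid, tf.nn.relu) or as Keras layer arguments such as activation='relu'.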
Timestamps:
00:00 Introduction
00:35 Activation Functions Explained
01:48 Different activation functions
05:23 How to implement them
06:20 Get your Free AssemblyAI API link now!

Comments: 31
@reireireireireireireireirei 2 years ago
Actuation functions.
@kaiserkonok 2 years ago
🤣
@_Anna_Nass_ 4 months ago
OMG, you actually made this easy to understand. I can't believe it. The animations are so helpful. Thank you immensely!
@draziraphale 1 year ago
These videos from AssemblyAI are excellent. Distilled clarity.
@oberstoffer 1 day ago
Wow! Really good explanation.
@machinelearningexplained 4 days ago
Really sharp tutorial!
@alpeshdongre8196 7 months ago
🎯 Key Takeaways for quick navigation:
01:35 🧠 Activation functions are crucial in neural networks because they introduce non-linearity, enabling the model to learn complex patterns. Without them, the network collapses into a stacked linear regression model.
02:43 🔄 The sigmoid function, commonly used in the last layer for binary classification, outputs probabilities between 0 and 1, squashing very negative inputs toward 0 and very positive inputs toward 1.
03:25 ⚖️ Hyperbolic tangent, ranging from -1 to +1, is often chosen for hidden layers. ReLU (Rectified Linear Unit) is simple but effective, outputting the input for positive values and 0 for negatives, though it can leave neurons stuck at zero (the dying ReLU problem).
04:32 🔍 Leaky ReLU is a modification of ReLU that prevents neurons from becoming "dead" during training by allowing a small output for negative inputs. It is useful in hidden layers to avoid the dying ReLU problem.
05:13 🌐 The softmax function is employed in the last layer for multi-class classification, converting raw scores into probabilities. It's commonly used to determine the class with the highest probability.
Made with HARPA AI
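To make the layer-placement advice in the takeaways above concrete, here is a minimal PyTorch sketch; the layer sizes and the three-class output are invented for illustration and are not from the video:

```python
# Sketch of typical activation placement: a non-linearity after each hidden
# Linear layer, softmax on the output layer for multi-class classification.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(10, 32),   # hidden layer (10 input features, 32 units: made up)
    nn.ReLU(),           # or nn.Tanh() / nn.LeakyReLU(0.01), as discussed above
    nn.Linear(32, 3),    # output layer for 3 classes (also made up)
    nn.Softmax(dim=1),   # converts raw scores into per-class probabilities
)

probs = model(torch.randn(4, 10))  # batch of 4 samples, 10 features each
print(probs.sum(dim=1))            # each row sums to 1.0
```

Note that in practice the softmax is often left out of the model itself: losses such as PyTorch's nn.CrossEntropyLoss expect raw scores and apply the (log-)softmax internally, which is one reason a last layer sometimes has no explicit activation.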
@terrylee6904 1 year ago
Excellent Presentation.
@wagsman9999 1 year ago
Thank you. I am a little smarter now!
@igrok878 2 years ago
Thank you. Good pronunciation and good content.
@narendrapratapsinghparmar91 6 months ago
Thanks for this informative video
@bernardoolisan1010 2 years ago
Very good video!
@_dion_ 1 month ago
excellent.
@thepresistence5935 2 years ago
Explained clearly
@AssemblyAI 2 years ago
thank you!
@anurajms 11 months ago
thank you
@ianlinify 1 month ago
Excellent explanation! Very easy to understand this complex concept through your clear presentation. By the way, it looks like in some cases we don't need to include an activation function in a layer. Any explanation of why activation functions are sometimes not necessary?
@muskduh 1 year ago
thanks
@joguns8257 1 year ago
Superb introduction. Other videos have just been vague and hazy in approach.
@AssemblyAI 1 year ago
Glad you liked it
@bezelyesevenordek 8 months ago
nice
@rashadloulou 10 months ago
We could apply an AI tool to this video to replace actuation with activation :D
@be_present_now 1 year ago
Good video! One thing I want to point out is that the presenter is talking too fast; a slower pace would make the video great!
@valentinleguizamon9957 3 months ago
❤❤❤❤
@canygard 4 months ago
Why was the ReLU neuron so depressed? ...It kept getting negative feedback, and couldn't find any positive input in its life.
@B_knows_A_R_D-xh5lo 1 month ago
😊😊😊😊🎉🎉🎉🎉
@DahBot-nr7rf 1 month ago
V can be W..
@sumanbhattacharjee7550 7 months ago
real life Sheldon Cooper
@brianp9054 1 year ago
It was said before, but it's worth the emphasis: 'actuation' function 🤣🤣🤣. Repeat after me, one, two, and three: A-C-T-I-V-A-T-I-O-N. Great, now keep doing it yourself until you stop saying 'actuation function'...
@Huffman_Tree 1 year ago
Ok I'll give it a try: Activatizeron!