Activation Functions In Neural Networks Explained | Deep Learning Tutorial

53,297 views

AssemblyAI

1 day ago

Comments
@_Anna_Nass_ · 10 months ago
OMG, you actually made this easy to understand. I can't believe it. The animations are so helpful. Thank you immensely!
@reireireireireireireireirei · 3 years ago
Actuation functions.
@kaiserkonok · 2 years ago
🤣
@draziraphale · 2 years ago
These videos from Assembly AI are excellent. Distilled clarity.
@FarizDarari · 4 months ago
This video activates my understanding of activation functions!
@terrylee6904 · 1 year ago
Excellent Presentation.
@deeplearningexplained · 5 months ago
Really sharp tutorial!
@thepresistence5935 · 3 years ago
Explained clearly
@AssemblyAI · 3 years ago
thank you!
@wagsman9999 · 1 year ago
Thank you. I am a little smarter now!
@igrok878 · 2 years ago
Thank you. Good pronunciation and good content.
@oberstoffer · 5 months ago
Wow! Really good explanation.
@alpeshdongre8196 · 1 year ago
🎯 Key Takeaways for quick navigation:
01:35 🧠 Activation functions are crucial in neural networks because they introduce non-linearity, enabling the model to learn complex patterns. Without them, the network collapses into a stacked linear regression model.
02:43 🔄 The sigmoid function, commonly used in the last layer for binary classification, outputs probabilities between 0 and 1, squashing even very negative or very positive inputs.
03:25 ⚖️ Hyperbolic tangent, ranging from -1 to +1, is often chosen for hidden layers. ReLU (Rectified Linear Unit) is simple but effective, outputting the input for positive values and 0 for negative ones, though it can leave neurons stuck at zero (the "dying ReLU" problem).
04:32 🔍 Leaky ReLU is a modification of ReLU that prevents neurons from becoming "dead" during training by allowing a small output for negative inputs. It is useful in hidden layers to avoid the dying ReLU problem.
05:13 🌐 The softmax function is employed in the last layer for multi-class classification, converting raw scores into probabilities. It is commonly used to pick the class with the highest probability.
Made with HARPA AI
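A minimal NumPy sketch of the five functions the summary above mentions may help make them concrete. This is illustrative only, not code from the video; the function names and the alpha=0.01 slope for Leaky ReLU are assumptions.

```python
import numpy as np

def sigmoid(x):
    # Squashes any real input into (0, 1); typical in the last
    # layer of a binary classifier.
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Squashes input into (-1, 1); a common hidden-layer choice.
    return np.tanh(x)

def relu(x):
    # Passes positive inputs through unchanged, zeroes out negatives.
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Like ReLU, but negative inputs keep a small slope (alpha),
    # so neurons cannot get stuck outputting zero ("dying ReLU").
    return np.where(x > 0, x, alpha * x)

def softmax(x):
    # Turns a vector of raw scores into probabilities summing to 1;
    # subtracting the max first is a standard numerical-stability trick.
    e = np.exp(x - np.max(x))
    return e / e.sum()

scores = np.array([2.0, -1.0, 0.5])
print(softmax(scores))  # ≈ [0.79, 0.04, 0.18], sums to 1
```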
@ianlinify · 7 months ago
Excellent explanation! Very easy to understand this complex concept through your clear presentation. By the way, it looks like in some cases we don't need to include an activation function in a layer. Any explanation of why activation functions are sometimes not necessary?
@joguns8257 · 1 year ago
Superb introduction. Other videos have just been vague and hazy in approach.
@AssemblyAI · 1 year ago
Glad you liked it
@narendrapratapsinghparmar91 · 1 year ago
Thanks for this informative video
@bernardoolisan1010 · 2 years ago
Very good video!
@canygard · 10 months ago
Why was the ReLU neuron so depressed? ...It kept getting negative feedback, and couldn't find any positive input in its life.
@ifeoluwarutholonijolu6944 · 1 month ago
Can the softmax be used for a regression response?
@_dion_ · 7 months ago
excellent.
@be_present_now · 2 years ago
Good video! One thing I want to point out is that the presenter talks too fast; a slower pace would make the video great!
@anurajms · 1 year ago
thank you
@Rashad99990 · 1 year ago
We could apply an AI tool to this video to replace "actuation" with "activation" :D
@muskduh · 1 year ago
thanks
@beypazariofficial · 1 year ago
nice
@DahBot-nr7rf · 7 months ago
V can be W..
@valentinleguizamon9957 · 8 months ago
❤❤❤❤
@sumanbhattacharjee7550 · 1 year ago
Real-life Sheldon Cooper.
@brianp9054 · 2 years ago
It was said before, but worth the emphasis: 'actuation' function 🤣🤣🤣. Repeat after me, one, two, and three: A-C-T-I-V-A-T-I-O-N. Great, now keep doing it yourself until you stop saying actuation function...
@Huffman_Tree · 1 year ago
Ok I'll give it a try: Activatizeron!
Why Do We Need Activation Functions in Neural Networks?
14:32
NeuralNine
3.3K views
Regularization in a Neural Network | Dealing with overfitting
11:40
But what is a neural network? | Deep learning chapter 1
18:40
3Blue1Brown
18M views
Batch normalization | What it is and how to implement it
13:51
AssemblyAI
67K views
AI can't cross this line and we don't know why.
24:07
Welch Labs
1.5M views
Convolutional Neural Networks from Scratch | In Depth
12:56
Watching Neural Networks Learn
25:28
Emergent Garden
1.4M views
Graph Neural Networks - a perspective from the ground up
14:28
The moment we stopped understanding AI [AlexNet]
17:38
Welch Labs
1.5M views