Why ReLU Is Better Than Other Activation Functions | Tanh Saturating Gradients

  3,067 views

DataMListic

A day ago

Comments: 14
@datamlistic · 2 years ago
ReLU is cool, but its variants are cooler: kzbin.info/www/bejne/iZSqnqV8d9KijKs
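For context on the title's claim: tanh saturates because its derivative, 1 - tanh(x)^2, drops toward zero for large |x|, whereas ReLU's derivative stays exactly 1 for any positive input, so gradients do not shrink as they flow backward through deep stacks. Below is a small, hedged NumPy sketch (the function names and sample points are illustrative choices, not taken from the video) that prints the two derivatives side by side, plus Leaky ReLU as one of the variants the pinned comment links to.

```python
import numpy as np

def tanh_grad(x):
    # d/dx tanh(x) = 1 - tanh(x)^2: decays toward 0 as |x| grows (saturation)
    return 1.0 - np.tanh(x) ** 2

def relu_grad(x):
    # d/dx ReLU(x) = 1 for x > 0, else 0: no shrinking on the active side
    return (x > 0).astype(float)

def leaky_relu_grad(x, alpha=0.01):
    # Leaky ReLU keeps a small slope alpha for x <= 0, so "dead" units still receive gradient
    return np.where(x > 0, 1.0, alpha)

xs = np.array([-5.0, -1.0, 0.5, 1.0, 5.0])
print("x      :", xs)
print("tanh'  :", np.round(tanh_grad(xs), 4))   # ~0.0002 at |x| = 5 -> vanishing signal
print("relu'  :", relu_grad(xs))                # exactly 1 wherever x > 0
print("leaky' :", leaky_relu_grad(xs))          # small but nonzero for x <= 0
```

At x = ±5 the tanh derivative is already below 0.001, which is the saturation the title refers to, while the ReLU derivative is still 1 at x = 5.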
@omaralkhasawneh1968 · a year ago
Great video, thanks!
@datamlistic · a year ago
Glad you enjoyed it! :)
@iacobsorina6924 · 2 years ago
Great video! Keep going!
@datamlistic · 2 years ago
Thank you!!!
@davidryanguasrojas2241 · 2 years ago
The content is pretty good; dedicate some time to editing and you may have a very good channel in this area soon enough 👍
@datamlistic · 2 years ago
Thank you so much for your feedback and your kind words! I have dedicated more time to editing in my last two videos, and I really hope they are better. Also, if it's not too much to ask, could you tell me which parts of this video you think need more editing? Maybe I missed some points.
@davidryanguasrojas2241 · 2 years ago
@@datamlistic I would recommend focusing on two issues:
- Try to explain with more confidence and in a more fluid way. At several moments in the video you slow down or introduce filler words into the explanation (2:35 for instance), which reduces the quality of the result. Recording several small takes and choosing the best would lead to a much better one.
- Try to improve the visuals: using different colors, improving the calligraphy (or switching to digital fonts) and/or introducing animations would lead to a better final product.
The topics you are addressing are very interesting and I wish you success with this project. I would recommend taking as a reference channels such as 3blue1brown, which addresses math topics in a very clear and elegant way. Try improving step by step and you may get something very interesting.
@datamlistic · 2 years ago
@@davidryanguasrojas2241 Thank you so much for this detailed feedback! I have started to remove the filler words and pauses in my videos, while trying to sound a little bit more confident. Regarding the visuals, this is a bit harder to do, as I don't know exactly which tools are better and it's quite an experimental process. As you said, I am trying to improve step by step and hopefully the quality of the videos will also increase.
@w3w3w3 · a year ago
The best channels/videos have the fewest views/subs, I find lol
@tomoki-v6o · 2 years ago
Also, the Adam optimizer works faster with ReLU.
@datamlistic · 2 years ago
Thanks for bringing this up! I am not fully aware of why the Adam optimizer works faster with ReLU. Could you point me to some references?
@tomoki-v6o · 2 years ago
@@datamlistic Through experimentation only. I don't really know why.
@datamlistic · 2 years ago
@@tomoki-v6o Oh, OK. Maybe other people have observed this too; I will take a deeper look into it. This might become a subject for a future video. :)
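For anyone who wants to check the commenter's observation rather than take it on faith, here is a minimal sketch (assuming PyTorch; the toy data, network sizes, step count, and learning rate are arbitrary assumptions, not something discussed in the thread) that trains the same small MLP with Adam twice, swapping only the activation, and reports the final loss after a fixed number of steps.

```python
import torch
import torch.nn as nn

def train(activation, steps=500, seed=0):
    torch.manual_seed(seed)
    # Toy regression task: fit y = sin(x) on [-3, 3]
    x = torch.linspace(-3, 3, 256).unsqueeze(1)
    y = torch.sin(x)
    model = nn.Sequential(
        nn.Linear(1, 64), activation,
        nn.Linear(64, 64), activation,
        nn.Linear(64, 1),
    )
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()
    for _ in range(steps):
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()
    return loss.item()

# Same Adam setup, same steps, only the activation differs
print("ReLU :", train(nn.ReLU()))
print("Tanh :", train(nn.Tanh()))
```

Whether ReLU actually wins here depends on the seed, depth, and task; the point is only that the comparison is cheap to run before turning the observation into a claim in a future video.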
Why The Reset Gate is Necessary in GRUs
12:24
DataMListic
392 views
A Review of 10 Most Popular Activation Functions in Neural Networks
15:59
Machine Learning Studio
12K views
Why Do Neural Networks Love the Softmax?
10:47
Mutual Information
66K views
Activation Functions - EXPLAINED!
10:05
CodeEmporium
119K views
Why Do We Use the Sigmoid Function for Binary Classification?
8:50
Why is ReLU a Non-Linear Activation function?
9:24
Bhavesh Bhatt
4.2K views
Neural Networks Pt. 3: ReLU In Action!!!
8:58
StatQuest with Josh Starmer
274K views
Watching Neural Networks Learn
25:28
Emergent Garden
1.3M views