Custom Activation and Loss Functions in Keras and TensorFlow with Automatic Differentiation

12,832 views

Jeff Heaton

A day ago

Comments: 26
@mockingbird3809 5 years ago
Great video on autograds, amazing as always. Loved it, Dr. Jeff.
@NisseOhlsen 2 years ago
Small correction @1:36: you don't "take the partial derivative of each weight"; you take the partial derivative of the loss function with respect to each weight. Also, @7:24, the derivative of x^2 is 2x, not x. And @7:46: that IS the definition of the ANALYTIC derivative. It is also used in the discrete case, the difference being that the jumps are finite, not infinitesimal.
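To make the correction concrete, here is a minimal sketch (not from the video) computing the derivative of f(x) = x^2 both by automatic differentiation and by a finite-difference approximation; the point x = 3 and the step h are arbitrary choices for the example:

```python
import tensorflow as tf

# Automatic differentiation: for f(x) = x^2, df/dx = 2x,
# so the gradient at x = 3 should be 6.
x = tf.Variable(3.0)
with tf.GradientTape() as tape:
    y = x ** 2
print(tape.gradient(y, x).numpy())  # 6.0

# Finite-difference approximation: the same idea, but the
# step h is finite rather than infinitesimal.
h = 1e-4
f = lambda v: v ** 2
print((f(3.0 + h) - f(3.0 - h)) / (2.0 * h))  # ~6.0
```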
@ChandraShekhar-rn9ty 4 years ago
Hi Jeff. Thank you so much. I spent a couple of hours figuring out how on earth Keras manages any change to a custom loss so easily. I was worried whether it even checks that the function is differentiable. With this video, things are pretty clear now.
@tanyajain3461 4 years ago
Does GradientTape break when math operations are applied to custom indexes of input_tensor? Or when stacking tensors and then using them in our loss function? Please suggest a workaround; I've been trying to implement it, but it returns all gradients as NaN.
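Indexing and stacking are themselves differentiable in TensorFlow, so NaN gradients more often come from a numerically unsafe op (e.g., tf.sqrt at 0, or a division by zero) somewhere in the loss. A minimal sketch, assuming a loss built from gathered columns; the shapes, indices, and epsilon here are illustrative guesses at the setup:

```python
import tensorflow as tf

y_true = tf.constant([[1.0, 2.0, 3.0]])
y_pred = tf.Variable([[1.5, 2.5, 2.0]])

with tf.GradientTape() as tape:
    # Indexing and stacking are differentiable: tf.gather selects
    # columns, tf.stack recombines them into a new tensor.
    a = tf.gather(y_pred, [0, 2], axis=1)
    b = tf.gather(y_true, [0, 2], axis=1)
    stacked = tf.stack([a, b], axis=0)
    # Guard numerically unsafe ops: sqrt has an infinite slope at 0,
    # a frequent source of NaN gradients in custom losses.
    loss = tf.sqrt(tf.reduce_mean(tf.square(stacked[0] - stacked[1])) + 1e-12)

print(tape.gradient(loss, y_pred))  # finite gradients, no NaN
```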
@SrEngr 2 years ago
Which version of TensorFlow is this?
@kbd2820 3 years ago
Learning from the legend. It was an amazing experience. Thank you.
@tonsandes 2 years ago
Hi Jeff, is there a way to access the y_pred information? I want to build my loss function, but not in the conventional way that passes y_pred and y_true to a tf or backend function. I need a step that accesses y_pred, then applies a function to estimate the standard deviation and returns that std value as the output of my loss function. Do you know how to do this?
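One way to read the question: the loss ignores y_true entirely and returns a statistic of y_pred. A minimal sketch under that assumption, using tf.math.reduce_std; the name std_loss and the tiny model are hypothetical, for illustration only:

```python
import tensorflow as tf

# Hypothetical loss: return the standard deviation of the predictions.
# y_true is accepted but unused, purely to match the (y_true, y_pred)
# signature that Keras passes to any custom loss.
def std_loss(y_true, y_pred):
    return tf.math.reduce_std(y_pred)

model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])
model.compile(optimizer="adam", loss=std_loss)
```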
@heecheolcho3246 4 years ago
Thank you for a good lecture. I have a question: is there any difference between y = tf.divide(1.0, tf.add(1, tf.exp(tf.negative(x)))) and y = 1.0/(1 + tf.exp(-x))?
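The two forms compute the same function: Python operators on tensors are overloaded to dispatch to the corresponding TensorFlow ops, so the values, and hence the gradients, agree. A quick check (my addition, not from the video):

```python
import tensorflow as tf

x = tf.Variable([-1.0, 0.0, 1.0])

with tf.GradientTape(persistent=True) as tape:
    y1 = tf.divide(1.0, tf.add(1.0, tf.exp(tf.negative(x))))
    y2 = 1.0 / (1.0 + tf.exp(-x))  # overloaded operators call the same ops

# Values and gradients agree; the operator form is just sugar.
print(tf.reduce_max(tf.abs(y1 - y2)).numpy())
print(tf.reduce_max(tf.abs(tape.gradient(y1, x) - tape.gradient(y2, x))).numpy())
```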
@subhajitpaul8391 A year ago
Thank you so much for this amazing video.
@prajith3676 4 years ago
I was actually looking for this GradientTape() everywhere. Thank you, finally my doubt is cleared. :-)
@StormiestOdin2 5 years ago
Hi Jeff, thank you for all these great videos. I have a question about TensorFlow. If I create a model with no hidden layers, does that make my model not a neural network but linear discriminant analysis? Like this:

model = keras.Sequential([
    keras.layers.Dense(12, activation="relu"),
    keras.layers.Dense(3, activation="softmax")
])
@HeatonResearch 5 years ago
It's both at that point.
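For reference, a Sequential model whose only layer is a softmax Dense output is structurally a (minimal) neural network and mathematically a linear classifier, specifically multinomial logistic regression. A sketch with arbitrary sizes (4 inputs, 3 classes):

```python
import tensorflow as tf
from tensorflow import keras

# No hidden layers: a single softmax Dense layer maps inputs
# linearly to class probabilities, i.e. multinomial logistic
# regression expressed as a Keras model.
model = keras.Sequential([
    keras.layers.Dense(3, activation="softmax", input_shape=(4,))
])
model.compile(optimizer="adam", loss="categorical_crossentropy")
model.summary()
```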
@StormiestOdin2 5 years ago
Ahh, thank you. I really appreciate all the videos you put on KZbin; they have helped me loads with making my own neural network. :)
@zonexo5364 4 years ago
Strange, why did I get "Tensor("AddN:0", shape=(), dtype=float32)" as output instead?
@zonexo5364 4 years ago
Realized that in TensorFlow 1.x I have to run a session: with tf.Session() as sess: print(dz_dx.eval())
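That output is TensorFlow 1.x graph-mode behavior: a tensor prints as a symbolic handle until it is evaluated in a session, whereas TensorFlow 2.x eager execution prints values directly. A sketch of both, where dz_dx stands in for the gradient tensor from the video:

```python
import tensorflow as tf

# TensorFlow 1.x graph mode: tensors are symbolic until evaluated.
# (Assumes a TF1 runtime; dz_dx is a placeholder name.)
# with tf.Session() as sess:
#     print(sess.run(dz_dx))

# TensorFlow 2.x eager mode: the same computation prints its
# value immediately, no session required.
x = tf.Variable(4.0)
with tf.GradientTape() as tape:
    z = x * x
print(tape.gradient(z, x))  # tf.Tensor(8.0, shape=(), dtype=float32)
```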
@tonihullzer1611 4 years ago
Awesome work, liked and subscribed, excited to see more.
@slime121212 4 years ago
Thank you for this video; this question was very important to me, and now I know how to work it out. :)
@brubrudsi 5 years ago
Also, in the beginning you said the derivative of x^2 is x. It is 2x.
@HeatonResearch 5 years ago
Yes, you are correct, good point. All the more reason for me to use automatic differentiation. 😞
@maulikmadhavi 4 years ago
Super explanation! Subscribed!
@shunnie8482 3 years ago
Thanks for the amazing explanation, I finally understand GradientTape (I think, at least, haha).
@HeatonResearch 3 years ago
Glad it helped!
@luchofrancisco 4 years ago
Thanks, nice video!
@brubrudsi 5 years ago
The derivative of 4^2 is 0, not 8.
@mockingbird3809 5 years ago
I think you should take the derivative of the function, not of the inputted number. You take the derivative of the function and then plug the value (a number) into the differentiated function to get the numeric output. You were wrong: 0 comes up only if you take the derivative of a constant, since the derivative of f(x) = C is zero. In this case, the function is f(x) = x^2.
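Spelled out (an added worked step, not from the thread): differentiate first, then substitute the point.

f(x) = x^2  ⇒  f'(x) = 2x  ⇒  f'(4) = 2 · 4 = 8

Treating 4^2 as the constant function g(x) = 16 gives g'(x) = 0, which is where the 0 comes from.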
@HA-pz7mv 3 years ago
Great video, thanks!!