Calculating Loss MADE EASY [4/11]

497 views

Andrew Jones

9 months ago

Deep Learning & Neural Networks are behind the vast majority of the Artificial Intelligence that is sweeping the world.
In Part 4, we take a look at calculating loss, and the logic behind the most common loss functions - enjoy!
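As a rough companion to the video (the exact formulations covered aren't reproduced here), a minimal NumPy sketch of two of the most common loss functions might look like the following; the function names and example values are illustrative only:

import numpy as np

def mean_squared_error(y_true, y_pred):
    # Regression loss: the average squared difference between
    # predictions and targets.
    return np.mean((y_true - y_pred) ** 2)

def binary_cross_entropy(y_true, y_pred):
    # Classification loss: penalises confident wrong predictions heavily.
    eps = 1e-15
    y_pred = np.clip(y_pred, eps, 1 - eps)  # avoid log(0)
    return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

# Confident, correct predictions give a small loss:
print(mean_squared_error(np.array([1.0, 0.0]), np.array([0.9, 0.1])))    # 0.01
print(binary_cross_entropy(np.array([1.0, 0.0]), np.array([0.9, 0.1])))  # ~0.105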
DATA SCIENCE CAREER WEBINAR:
--------------------
I've helped people just like you land FAANG roles, high 6-figure roles, and roles with enormous stock options.
They all started by joining thousands & thousands of others in watching my free 60-minute Data Science Career Webinar.
Click here: www.data-science-infinity.com...
DATA SCIENCE INFINITY:
---------------------------
The world's leading Data Science program. I've helped thousands move toward incredible roles at top companies.
Click here: www.data-science-infinity.com/
LET'S CONNECT
-----------------------------------------
LinkedIn: / andrew-jones-dsi

Comments: 4
@andrew-jones-data-science 9 months ago
Check out Data Science Infinity in full here: www.data-science-infinity.com/
@Rick88888888 4 months ago
Where is the next tutorial, "5/11" etc.? Tutorials 5/11 to 11/11 are all missing from your channel!
@exzorttt 5 months ago
How do you code it?
@TheRealDanNguyen 3 months ago
import numpy as np

def binary_crossentropy(y_true, y_pred):
    epsilon = 1e-15
    y_pred = np.clip(y_pred, epsilon, 1 - epsilon)  # to avoid log(0) error
    return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

class SGD:
    def __init__(self, learning_rate=0.01):
        self.learning_rate = learning_rate

    def update(self, weights, gradients):
        return weights - self.learning_rate * gradients

def accuracy(y_true, y_pred):
    predictions = np.round(y_pred)  # convert probabilities to binary predictions (0 or 1)
    return np.mean(predictions == y_true)

for epoch in range(num_epochs):
    for x_batch, y_batch in data_loader:  # assuming data_loader yields batches of data
        # Forward pass
        y_pred = model.forward(x_batch)
        # Compute loss
        loss = binary_crossentropy(y_batch, y_pred)
        # Backward pass (compute gradients)
        gradients = model.backward(y_batch, y_pred)
        # Update weights (SGD.update returns the new weights, so assign them back)
        model.weights = optimizer.update(model.weights, gradients)
        # Compute accuracy
        acc = accuracy(y_batch, y_pred)
    # Print or log the loss and accuracy
    print(f"Epoch {epoch}, Loss: {loss}, Accuracy: {acc}")

# or using tensorflow/keras for dog and cat
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])

# or using tensorflow/keras for dog, cat, and cow
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
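Note that the training loop above assumes a model object exposing forward/backward methods, plus num_epochs and data_loader defined elsewhere. The loss and accuracy helpers can be sanity-checked standalone on toy arrays, with hypothetical values:

y_true = np.array([1, 0, 1, 1])
y_pred = np.array([0.9, 0.2, 0.7, 0.4])  # hypothetical model probabilities

print(binary_crossentropy(y_true, y_pred))  # ~0.40
print(accuracy(y_true, y_pred))             # 0.75 (3 of 4 rounded predictions correct)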