Calculating Loss MADE EASY [4/11]

945 views

Andrew Jones

1 day ago

Deep Learning & Neural Networks are behind the vast majority of the Artificial Intelligence that is sweeping the world.
In Part 4, we take a look at calculating loss, and the logic behind the most common loss functions - enjoy!
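For a quick, concrete feel for what a loss value actually is, here is a minimal NumPy sketch (illustrative only - not code from the video) of two of the most common loss functions: mean squared error for regression and binary cross-entropy for classification. Lower values mean the predictions sit closer to the targets, which is exactly what training pushes towards.

import numpy as np

def mean_squared_error(y_true, y_pred):
    # Average squared difference between targets and predictions
    return np.mean((y_true - y_pred) ** 2)

def binary_crossentropy(y_true, y_pred):
    # Clip predictions away from exactly 0 and 1 to avoid log(0)
    eps = 1e-15
    y_pred = np.clip(y_pred, eps, 1 - eps)
    return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

# Illustrative targets and predicted probabilities (made-up numbers)
y_true = np.array([1.0, 0.0, 1.0, 1.0])
y_pred = np.array([0.9, 0.2, 0.7, 0.6])
print(mean_squared_error(y_true, y_pred))   # 0.075
print(binary_crossentropy(y_true, y_pred))  # ~0.299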
DATA SCIENCE CAREER WEBINAR:
--------------------
I've helped people just like you land FAANG roles, high 6-figure roles, and roles with enormous stock options.
They all started by joining thousands & thousands of others in watching my free 60-minute Data Science Career Webinar.
Click here: training.data-...
DATA SCIENCE INFINITY:
---------------------------
The world's leading Data Science program - I've helped thousands move into incredible roles at top companies.
Click here: www.data-scien...
LET'S CONNECT
-----------------------------------------
LinkedIn: / andrew-jones-dsi

Comments: 4
@andrew-jones-data-science · 1 year ago
Check out Data Science Infinity in full here: www.data-science-infinity.com/
@Rick88888888 · 10 months ago
Where is the next tutorial "5/11", etc.? Tutorials 5/11 to 11/11 are all missing from your channel!
@exzorttt · 11 months ago
how to code it?
@TheRealDanNguyen · 9 months ago
import numpy as np

def binary_crossentropy(y_true, y_pred):
    epsilon = 1e-15
    y_pred = np.clip(y_pred, epsilon, 1 - epsilon)  # To avoid log(0) error
    return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

class SGD:
    def __init__(self, learning_rate=0.01):
        self.learning_rate = learning_rate

    def update(self, weights, gradients):
        return weights - self.learning_rate * gradients

def accuracy(y_true, y_pred):
    predictions = np.round(y_pred)  # Convert probabilities to binary predictions (0 or 1)
    return np.mean(predictions == y_true)

# model, optimizer, num_epochs and data_loader are assumed to be defined elsewhere
for epoch in range(num_epochs):
    for x_batch, y_batch in data_loader:  # Assuming data_loader yields batches of data
        # Forward pass
        y_pred = model.forward(x_batch)
        # Compute loss
        loss = binary_crossentropy(y_batch, y_pred)
        # Backward pass (compute gradients)
        gradients = model.backward(y_batch, y_pred)
        # Update weights (assign the result back, since update() returns new weights)
        model.weights = optimizer.update(model.weights, gradients)
        # Compute accuracy
        acc = accuracy(y_batch, y_pred)
    # Print or log the loss and accuracy
    print(f"Epoch {epoch}, Loss: {loss}, Accuracy: {acc}")

# or using tensorflow/keras for dog and cat (binary classification)
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])

# or using tensorflow/keras for dog, cat, and cow (multi-class classification)
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
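A note on those last two Keras lines: binary_crossentropy pairs with a single sigmoid output unit for two-class problems (dog vs. cat), while categorical_crossentropy pairs with a softmax output layer and one-hot encoded labels for three or more classes (dog, cat, cow).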