If you are reading this, you are among the ten percent (as of the time of writing) that didn't up and leave after the intro. I hope to see you all at lecture 22.
@m-aun · 4 years ago
You want to do this course together?
@conradwiebe7919 · 4 years ago
I'm really just skimming these to better form intuition. I'm not sure what you mean by do the course together, I'd be happy to discuss anything in the lectures but I'm not going on to do any projects with computer vision out of this.
@m-aun · 4 years ago
@@conradwiebe7919 I was planning to do all the HWs/assignments given on the course website along with the lectures.
@conradwiebe7919 · 4 years ago
Didn't even see they had those lol, I'm still going to stick with my original plan though. I'm trying a more organic entrance to ML. I made some really rudimentary search algos — BFS (queue), DFS (stack), greedy, and A* — and have now started generating mazes. I want to try to train something that behaves like A* search. It's a long way from deep learning, but I don't think I can make that leap and still know everything that's going on. Maybe I'll join you a month from now; I'd still be happy to discuss the topics with you.
@m-aun · 4 years ago
@@conradwiebe7919 Then you should start with the ML course taught by Andrew Ng.
@xanderlewis · 1 year ago
25:22 He just described a well-known exam technique beloved of students everywhere!
@guavacupcake · 4 years ago
Much better audio, thanks!
@huesOfEverything · 3 years ago
I like how he says, 'This is WRONG… so bad… you should not do this!' It cracks me up for some reason.
@terryliu3635 · 7 months ago
Great lectures!! Pls keep posting the latest series! Thank you!!
@raphaelmourad3983 · 4 years ago
Very good teaching of computer vision! Thanks, Justin Johnson, for these very nice lectures.
@zhaobryan4441 · 10 months ago
He taught the essentials in a great way.
@DariaShcherbak · 5 months ago
Thank you for the lecture! Greetings from Ukraine)
@훼에워어-u1n · 1 year ago
Thanks! Such an informative video.
@andrewstang8590 · 9 months ago
Hi, I thought the MNIST dataset had 60k training images, no?
@veggeata1201 · 4 years ago
For the nearest neighbor classifier isn't training time going to be O(n)? If we are going to store pointers for each training example, we still have to iterate over the number of training examples, which is n.
@bhavin_ch · 4 years ago
If you have to iterate over the elements, yes. But if you just store a reference to the list, "training" is a single pointer assignment, i.e. O(1).
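To make the O(1)-train / O(n)-predict trade-off from this thread concrete, here is a minimal sketch (not the course's official code; the class and variable names are made up for illustration) of a 1-nearest-neighbor classifier in Python with NumPy:

```python
import numpy as np

class NearestNeighbor:
    """Toy 1-nearest-neighbor classifier (illustrative sketch)."""

    def train(self, X, y):
        # "Training" only stores references to the arrays: no copy,
        # no loop over the n examples, so it is O(1).
        self.X_train = X
        self.y_train = y

    def predict(self, X):
        # Prediction pays the O(n) cost instead: it computes the
        # distance to every stored training example.
        preds = []
        for x in X:
            dists = np.abs(self.X_train - x).sum(axis=1)  # L1 distance
            preds.append(self.y_train[np.argmin(dists)])
        return np.array(preds)

# Tiny demo with made-up points
clf = NearestNeighbor()
clf.train(np.array([[0.0, 0.0], [10.0, 10.0]]), np.array([0, 1]))
print(clf.predict(np.array([[1.0, 1.0], [9.0, 9.0]])))  # -> [0 1]
```

As the lecture points out, this asymmetry is backwards from what we usually want in practice: slow training is tolerable, but prediction should be fast.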
@randomsht-cy7we · 5 months ago
That "hot dog / not hot dog" bit was from Silicon Valley. The professor watches the show :)
@adarshtiwari6374 · 4 years ago
14:06
@mahmoudatiaead7347 · 2 years ago
How can I get the homework, does anyone know?
@sampathkovvali6255 · 1 year ago
Assignments? Check out the course page linked in the description.
@ДаниилГусев-с9л · 2 years ago
Well, maybe I'm missing something, but I totally disagree with the train/valid/test idea as Justin described it. We train a model on the training data and evaluate on the validation set to change the model's behavior. That's correct; however, it does not mean we should look at the test set only once at the very end of our research. We should evaluate our model on the test set at least several times, and if its performance on the test set differs greatly from the validation set, something was done very wrong — e.g. the splitting strategy. Of course, using the test set influences our decisions, but by how much? Can you really say that evaluating the finished model on the test set spoils everything? I doubt that.
@sampathkovvali6255 · 1 year ago
Nope, your model is not allowed to look at the test set during tuning, not even a peek. You, as part of the model-selection loop, will also overfit. 😂
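For reference, the protocol the lecture describes can be sketched in a few lines. Everything below is made up for illustration — the data is random and the "model" is just a threshold rule, not a real classifier — but the split discipline is the point: hyperparameters are chosen using only the validation split, and the test split is scored exactly once at the very end.

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up data: label is 1 exactly when feature 0 exceeds 0.5.
X = rng.normal(size=(300, 1))
y = (X[:, 0] > 0.5).astype(int)

# One shuffle, three disjoint splits: train / validation / test.
idx = rng.permutation(len(X))
train, val, test = idx[:200], idx[200:250], idx[250:]

def fit_and_score(threshold, split):
    # Toy "model": predict 1 when feature 0 > threshold.
    preds = (X[split, 0] > threshold).astype(int)
    return (preds == y[split]).mean()

# Tune the hyperparameter using ONLY the validation split...
candidates = [-1.0, 0.0, 0.5, 1.0]
best = max(candidates, key=lambda t: fit_and_score(t, val))

# ...then evaluate on the test split exactly once, at the end.
print("chosen threshold:", best)
print("test accuracy:", fit_and_score(best, test))
```

The reply above is describing why this matters: every time you score a candidate on a split and act on the result, you leak information into that split, so a test set that is consulted repeatedly slowly turns into a second validation set.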