Hey, I'm following every video you make in this series. Thanks a lot, it's been useful! ⭐⭐⭐
@SlimeMage32 6 years ago
I'm so tempted to watch this but I really need to sleep
@TheCodingTrain 6 years ago
I recommend sleep 😴
@lvdw3653 6 years ago
Sleep? NOW?
@lvdw3653 6 years ago
HAIL MR SHIFFMAN
@Absfor30 6 years ago
I just watched the live stream right through. One thing I noticed is that your learning rate is 0.1 and you train for only 1 epoch, then expect the NN to give a correct answer. You'll need to train on the data repeatedly, in a randomised order each pass; not just the 2400 examples once, but many passes through the training set in different orders. Instead of 5-10 epochs, try something like 300+ with a smaller learning rate. You could track the error and stop training when the error rate stays below a level of your choosing over a couple of epochs, to make sure it wasn't just a lucky guess, and then start testing your "toy" NN (there is nothing toy-like about this BTW!). It's amazing, the work you've done with this, and it'll teach sooooo many people how to approach data science, prepare data, or use NNs for a practical purpose. I'd love to see the result after you have tried a smaller learning rate and a higher number of epochs! Great work either way Dan, just a few tweaks and tuning to do to get the results you are looking for :-)
@TheCodingTrain 6 years ago
Thanks for this amazing feedback, it's super helpful!
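For anyone who wants to try what that comment suggests, here is a rough sketch of repeated, shuffled training with a crude early stop. It assumes a NeuralNetwork class with train(), predict(), and a way to set the learning rate, like the toy library built in this series, and that trainingData is an array of {inputs, targets} objects loaded elsewhere; the layer sizes, learning rate, epoch count, and error threshold are all placeholder guesses.

```js
// Sketch: many shuffled passes over the training set, stopping early
// once the average error is low enough. All numbers here are guesses.
let nn = new NeuralNetwork(784, 64, 3);
nn.setLearningRate(0.01);   // smaller than the 0.1 used on stream
                            // (or set the learning_rate property directly)
const MAX_EPOCHS = 300;
const ERROR_TARGET = 0.05;

for (let epoch = 0; epoch < MAX_EPOCHS; epoch++) {
  trainingData = shuffle(trainingData);   // p5.js shuffle() returns a shuffled copy
  for (const example of trainingData) {
    nn.train(example.inputs, example.targets);
  }

  // Crude average absolute error over the training set after this epoch.
  let totalError = 0;
  for (const example of trainingData) {
    const guess = nn.predict(example.inputs);
    for (let i = 0; i < guess.length; i++) {
      totalError += Math.abs(guess[i] - example.targets[i]);
    }
  }
  if (totalError / trainingData.length < ERROR_TARGET) {
    break;   // error stayed low, stop training
  }
}
```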
@geoffwagner4935 A year ago
Lol, what a scary error: columns and rows in nn.js. Then your other video for XOR was like "just use mine", and it was such a sinking feeling after spending most of the day on it. After a while comparing mine to yours there were a few differences, and then it worked. I was like, no way, and I didn't want to just drag and drop yours in, I wanted mine... lol, my backpropagation worked even with an error in it. I did have one little variable wrong in nn.js xD xD xD I got EPOCHED xD It finally ran. For those with any doubt about the playlist not working: it does work.
@cosmoserdean 6 years ago
You should train on multiple examples at once (a batch of examples). By training on one example at a time the model won't be able to generalize well, because every optimization step is made for one specific example, which may not be in favour of the rest of the examples, and they can end up adjusting against each other.
@TheCodingTrain 6 years ago
This is a very important point that I glossed over! Thanks for the comment.
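The toy library in the series updates the weights after every single example (pure stochastic gradient descent). The comment above is pointing at mini-batch training: average the gradients over a small batch before applying them, so no single example drags the weights too far on its own. The sketch below is purely conceptual; computeGradients, addGradients, scaleGradients, and applyGradients are hypothetical helpers, not methods of the toy NeuralNetwork class.

```js
// Conceptual mini-batch loop. The gradient helpers are hypothetical,
// not part of the toy library: they stand in for "compute the weight
// deltas for one example" and "apply averaged deltas to the network".
const BATCH_SIZE = 32;

for (let start = 0; start < trainingData.length; start += BATCH_SIZE) {
  const batch = trainingData.slice(start, start + BATCH_SIZE);

  // Accumulate gradients over the whole batch...
  let summed = null;
  for (const example of batch) {
    const grads = computeGradients(nn, example.inputs, example.targets);
    summed = (summed === null) ? grads : addGradients(summed, grads);
  }

  // ...then apply the average as a single weight update.
  applyGradients(nn, scaleGradients(summed, 1 / batch.length));
}
```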
@guozhangliew7302 4 years ago
Will we have a follow-up video on creating the neural network with multiple hidden layers?
@wolfisraging 6 years ago
You were, you are, and you always will be the best
@linzhang6618 6 years ago
Rishik Mourya so true :)
@Bopas2 6 years ago
Are some of the input numbers still negative? Is that a problem (some of my input data is negative)? Also, instead of shuffling your inputs, would it be ok to alternate between your three examples over and over? I understand how training on one whole data set after another is very bad, but is it ok to have a small pattern like that? Love this series! Thanks for doing this!
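Two small sketches related to the questions above, with made-up ranges and names: squashing inputs that may be negative into a 0..1 range before training, and a plain Fisher-Yates shuffle so the examples arrive in a random order each epoch rather than a fixed repeating pattern.

```js
// Map a value from a known range (here a guessed -100..100) into 0..1
// before feeding it to the network.
function normalize(value, min = -100, max = 100) {
  return (value - min) / (max - min);
}

// Plain Fisher-Yates shuffle, so the training order is different each
// epoch instead of a fixed A/B/C pattern.
function shuffleInPlace(arr) {
  for (let i = arr.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1));
    [arr[i], arr[j]] = [arr[j], arr[i]];
  }
  return arr;
}
```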
@santiagocalvo 3 years ago
You are a god!
@lomasviralyt6447 6 years ago
what syntax do you use?
@salim444 6 years ago
can you please use some .map?
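For anyone curious what that might look like: the per-pixel normalization done with a for loop in the video could be written with Array.prototype.map. The variable names here are just illustrative, assuming data holds raw 0-255 pixel values.

```js
// For-loop version (roughly what the video does):
let inputs = [];
for (let i = 0; i < data.length; i++) {
  inputs[i] = data[i] / 255;
}

// Same thing with .map. Array.from() gives a plain array first,
// because a typed array's own .map would round the results back
// to integers (0 or 1) instead of keeping the 0..1 fractions.
let inputs2 = Array.from(data).map(pixel => pixel / 255);
```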
@geoffwagner4935 A year ago
I might have to check my underwear after this video. The toy network we built through that huge playlist really just plugs into this??? It's rhetorical, I know it does now. Wow.
@mikaelrindmyr 4 years ago
I got sad when you used nn instead of NN for NeuralNetwork :( Everything else was the best!
@mishacalifornia5670 6 years ago
I love to escape on the coding train...
@JurajPecháč 6 years ago
Train Your Machine Learning Models on Google’s GPUs for Free - Forever : Google Colab
@KanalMcLP 6 years ago
Would be wise to add a category: "something else / dunno" for training.
@cosmoserdean 6 years ago
McLP: I don't really think this is a good idea. Every category should be represented in the training data, and I don't really know what examples of "something else" you could add without confusing the network.
@KanalMcLP 6 years ago
Cosmin Serdean: random drawings from other categories, not represented in the rest of the training set.
@cosmoserdean 6 years ago
Hmm... Good point... Still having my doubts, but this is definitely something worth trying out
@KanalMcLP 6 years ago
Cosmin Serdean: I got this idea from the description of the Google Speech Commands dataset. And if humans are involved, "I don't know" is an important answer.
@cosmoserdean 6 years ago
Maybe there is a certain confidence threshold under which it would respond this way.
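A tiny sketch of that threshold idea, assuming predict() returns an array of per-category confidences like the doodle classifier in this series; the 0.5 cutoff and the label list are placeholder assumptions.

```js
// Pick the most confident category, but fall back to "I don't know"
// when even the best guess is weak. The 0.5 threshold is arbitrary.
const CONFIDENCE_THRESHOLD = 0.5;
const labels = ['cat', 'rainbow', 'train'];   // example doodle categories

function classify(nn, inputs) {
  const outputs = nn.predict(inputs);
  let best = 0;
  for (let i = 1; i < outputs.length; i++) {
    if (outputs[i] > outputs[best]) best = i;
  }
  return outputs[best] >= CONFIDENCE_THRESHOLD ? labels[best] : "I don't know";
}
```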