I was glad to find a tutorial that didn't go "how to use a library to make a neural network so we can skip the important details." I wanted to see the low-level construction to help fill in some missing pieces in my knowledge, and this was perfect.
@hynesie11 · 9 months ago
That's why I searched for a C++ neural network. You don't learn anything from using those high-level Python libraries.
@grantallan7355 · 2 years ago
This guy has a great "I think you don't know anything so I'll talk down to your level" voice. It doesn't sound like a compliment, but I do mean it as one.
@Toy_Bubble_7 · a year ago
I jumped into something far too complicated in C++ without knowing much of the language, but after following this whole video and coming back the next day, I learned how to use pointers and how classes work. I made test files to get more familiar with some of the concepts you were using. I'm glad I jumped in head first, because I know way more about C++ now than I would have otherwise. The next day I was able to fix issues I had created in the program using some of the things I learned. Thank you for making this great video!
@gautamarora9573 · 3 months ago
I am unable to run the last line, ./neural-net-tutorial.cpp > out.txt; can anybody please help with that? It is failing with exit code 1. Edit: if anyone in the future gets an error, try debugging the code first. One possible explanation, at least in my case: the trainingData.txt file made in Notepad stored the data as UTF-16 LE instead of UTF-8, so try converting the encoding to UTF-8.
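If anyone wants to check that from code before digging further: a minimal sketch that looks for the UTF-16 LE byte order mark at the start of trainingData.txt (the file name comes from this thread; everything else is illustrative, not from the video):

    #include <fstream>
    #include <iostream>

    int main()
    {
        // Open the training file in binary mode and peek at the first two bytes.
        std::ifstream f("trainingData.txt", std::ios::binary);
        if (!f.is_open()) {
            std::cout << "Could not open trainingData.txt\n";
            return 1;
        }
        unsigned char b0 = 0, b1 = 0;
        f.read(reinterpret_cast<char *>(&b0), 1);
        f.read(reinterpret_cast<char *>(&b1), 1);

        // UTF-16 LE text starts with the byte order mark 0xFF 0xFE;
        // a plain ASCII/UTF-8 training file will not.
        if (b0 == 0xFF && b1 == 0xFE) {
            std::cout << "Looks like UTF-16 LE -- re-save the file as UTF-8\n";
        } else {
            std::cout << "No UTF-16 LE BOM found -- the encoding is probably fine\n";
        }
        return 0;
    }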
@noiseflow4306 · a month ago
This might be one of the best tutorials out there
@cobusuys4787 · 3 months ago
This is one of the best hours I have ever spent.
@ArielLorusso · 4 years ago
0:00 Scope & requirements
0:51 Neural net uses and example
6:00 Net :: (class)
6:47 topology
10:43 inputVals (vector of inputs)
15:52 m_layers (vector of layers)
17:26 Net::Net (constructor)
23:39 Neuron :: (class)
28:31 Neuron::Neuron (constructor)
30:48 Net::feedForward
38:18 Neuron::setOutputVal
38:50 Neuron::getOutputVal
39:11 Neuron::feedForward
39:11 Neuron::transferFunction
44:43 Net::backProp
50:57 Neuron::calcOutputGradients
51:55 Neuron::calcHiddenGradients
53:00 Neuron::sumDOW (sum of weight derivatives)
54:05 Neuron::updateInputWeights
55:20 eta & alpha (static Neuron members that adjust the learning rate)
58:30 Neuron::getResults
58:55 TrainingData explained?
59:18 Prepare for the superspeed!
59:26 TrainingData :: object instantiation
59:32 showVectorVals (function)
59:34 getRecentAverageError (function)
59:34 TrainingData :: (class)
59:37 TrainingData::getTopology
59:38 TrainingData::getNextInputs
59:38 TrainingData::getTargetOutputs
59:40 End of the superspeed!
59:50 Set bias output to 1
1:00:25 Training data file
1:00:32 makeTrainingSamples.cpp
1:00:47 trainingData.txt
1:01:23 Testing the neural-net .cpp!
1:01:52 Output of the neural-net .cpp
@DaJodad · 3 years ago
Thank you very much for this
@ArielLorusso · 3 years ago
Accounts made 1 & 2 weeks ago, no subs, talking about a movie website unrelated to this video. 80% sure you are bots posting publicity in YouTube comments... prove me wrong.
@dreamhollow · 3 years ago
This was pretty intense, but I now have a far better understanding of neural networks. Thank you.
@richardrisner921 · 3 years ago
Wow!! I have been looking for a walkthrough exactly like this for many months. This got me past the mathematics so that I could get a real working example. Thank you so much!
@dbr_199 · 7 months ago
This is a very high-quality tutorial, thank you so much. Would've loved to see it in higher resolution!
@techmedia1360 · 4 years ago
This is probably the best video I've watched in my life
@jamesfield5415 · 3 years ago
Innit tho
@FPChris · 2 years ago
Agreed. Amazing video
@per-axelskogsberg3861 · 2 years ago
Yupp
@coding_with_thomas · a year ago
I've been looking around for neural network stuff for a long time. I found a lot of articles and videos, and I could have saved all that time by just watching this video. Thank you for this one!!
@laureven · 4 months ago
Shame you are busy with work (a good thing :) ). You are absolutely brilliant as a teacher. Thank you for the video.
@davidnoble5347 · 2 years ago
My guy, you really smashed it with this video. Explaining such a complex algorithm, while still managing to not assume your viewers know every nook and cranny of C++ syntax. Well done.
@ykesav7743 · 9 months ago
Great implementation of a neural network in C++ using the STL. Thanks for the tutorial.
@FatLingon · 4 years ago
"We want to compile often"... then goes on to compile only once during the whole write-up.
@jdm8963 · 4 years ago
Don't be greedy, dude. It's obviously better for us to have a smooth explanation without stopping at each modification to compile and test code he already said he tested before making the video. You can pause the video if you want to compile and test.
@JrPlays · 4 years ago
Yes, because the perfect tutorial is one expanded by 20-30 more minutes of just compiling after every module is added... compile it yourself.
@worldshaper1723 · 3 years ago
You have changed the world!!
@majedali1478 · 2 years ago
Flexing your muscles 😄😄😄 with the excessive use of references. Thank you dude, really helpful.
@sticktogether2326 · 4 years ago
Thank you, great man =) Finally there was someone who wrote a neural network in an adequate programming language. P.S. damn Python
@awesomesauce1157 · 3 years ago
very true
@puppergump4117 · 2 years ago
Python is not inadequate, it's just barely adequate
@Uvuv6969 · 2 years ago
Yes burn python with fire
@mustafatuncer4780 · a year ago
Wonderful course. Thanks a lot.
@rajatmond · 2 years ago
Invaluable, I can't thank you enough. I had been using TMVA, included in CERN's ROOT, but it doesn't really compile on Windows. I have the option of OpenCV, but that's bloated. You're a lifesaver.
@FPChris · 2 years ago
Outstanding. Just what I was looking for.
@kohltonpeterson3287 · 10 months ago
He kind of sounds like a programming Bob Ross, and it's amazing.
@Gabriel-V · 3 years ago
Really nice video, but he (Dave Miller) doesn't explain the recentAverageError and the smoothingFactor. There are a few loose ends I had to sort out myself.
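For anyone else tying up that particular loose end: a "recent average error" like this is normally kept as a running average controlled by a smoothing factor. A minimal sketch of that idea (the member names follow the ones mentioned in this thread; the value 100.0 is the one a reply below reports from the source code, and the exact update formula is an assumption, not quoted from the video):

    #include <cstdio>

    // Assumed smoothing constant; a reply below reports 100.0 from the source code.
    static const double recentAverageSmoothingFactor = 100.0;

    // One update of the running average: weight the old average heavily and
    // fold in the newest error, so the value averages over roughly the last
    // 100 training samples.
    double updateRecentAverageError(double recentAverageError, double currentError)
    {
        return (recentAverageError * recentAverageSmoothingFactor + currentError)
               / (recentAverageSmoothingFactor + 1.0);
    }

    int main()
    {
        double avg = 0.0;
        for (int pass = 0; pass < 5; ++pass) {
            double err = 0.5 / (pass + 1);  // stand-in for the net's error on this pass
            avg = updateRecentAverageError(avg, err);
            std::printf("pass %d: error %.4f, recent average %.4f\n", pass, err, avg);
        }
        return 0;
    }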
@mustafatuncer4780 · a year ago
I wish you would also prepare C++ encoder-decoder and GPT project videos.
@marzipug5439 · 2 years ago
I can feel my mind expanding.
@Hugo-go6yq · 2 years ago
Amazing tutorial thank you so much 🙏🙏🙏
@Darkl0ud_Productions · a year ago
Maybe I'm an idiot, but using Visual Studio and trying to run the neural network with trainingData.txt present, I keep getting an abort() called error. It seems that the program does not recognize my trainingData.txt file... any ideas? EDIT: If anyone is having this issue, it is most likely that you are using UTF-16 formatting for the txt file. I noticed it would work flawlessly on my Linux machine, but not on Windows. I realized that when redirecting the output to trainingData.txt in Windows PowerShell, it outputs UTF-16.
@Gauthamphongalkar · 8 months ago
thank you very much!
@MadMax-mw3og · a year ago
Hey! Amazing video. If I could ask a question: would I be able to implement this inside of Unreal Engine? Anything I need to know before attempting it?
@f_ftactics7928 · 4 years ago
I don't see where you use the error in the backProp function after you calculate it.
@agaveboy · a year ago
Thank you
@mehmetfatih6750 · 3 years ago
Hello. In the file in the Drive folder, the Neuron::updateInputWeights function is incomplete: the calculation of the newDeltaWeight variable is missing.
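For reference, the combination that line is expected to compute (also quoted further down this thread) is the input magnified by the gradient and learning rate, plus a momentum term. A stand-alone sketch, with eta and alpha values assumed rather than taken from the video:

    #include <cstdio>

    // Assumed hyperparameters (the video keeps these as static members on Neuron).
    static const double eta = 0.15;   // overall net learning rate
    static const double alpha = 0.5;  // momentum, a fraction of the previous delta

    // The delta applied to one connection weight in updateInputWeights:
    // the feeding neuron's output magnified by the gradient and learning rate,
    // plus a fraction of the previous delta weight. Note the '+' before alpha.
    double newDeltaWeight(double inputVal, double gradient, double oldDeltaWeight)
    {
        return eta * inputVal * gradient + alpha * oldDeltaWeight;
    }

    int main()
    {
        // 0.15 * 0.8 * 0.1 + 0.5 * 0.02 = 0.022
        std::printf("newDeltaWeight = %.4f\n", newDeltaWeight(0.8, 0.1, 0.02));
        return 0;
    }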
@puppergump4117 · 2 years ago
Nobody's gonna mention the fact that this guy woke up on 9/11 and thought it was a great day for machine learning? He must be a robot.
@muhammadzaid308 · 2 years ago
I have been vehemently trying to avoid learning Python, and I came so close too.
@pythagorasaurusrex9853 · 2 years ago
Fantastic! Exactly what I needed! Thank you so much!
@valekprometey · 3 years ago
Great, thanks a lot for the video. But what is an epoch in this example? If I understand correctly, backpropagation starts only after a whole epoch is completed.
@Madsycode · 2 years ago
Nice vid though!! I'd recommend making a series out of this; it would bring more subscribers!!
@tamas5002 · 2 years ago
I like this video, thank you very much for it. Is it mandatory that the Neuron class knows about the Net class and vice versa? Couldn't we simply eliminate this dependency? And could we put the random function into a util class? It has nothing to do with a neuron.
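On that last point, pulling the random initializer out of Neuron is easy to do. A minimal sketch (the namespace and function names here are made up for illustration, and the rand()-based formula is an assumption about what the tutorial uses):

    #include <cstdio>
    #include <cstdlib>
    #include <ctime>

    // A tiny utility namespace so Neuron no longer owns the random generator.
    // The names here are illustrative, not from the video.
    namespace nnutil {
        // Uniform random weight in [0, 1], the usual rand()/RAND_MAX idea.
        inline double randomWeight()
        {
            return std::rand() / static_cast<double>(RAND_MAX);
        }
    }

    int main()
    {
        std::srand(static_cast<unsigned>(std::time(nullptr)));
        for (int i = 0; i < 3; ++i) {
            std::printf("weight %d = %.4f\n", i, nnutil::randomWeight());
        }
        return 0;
    }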
@gauravuttarkar9682 · 3 years ago
Is there one for convolutional neural nets?
@puppergump4117 · 2 years ago
I had some issues (probably my fault) with the forward declaration of class Neuron at the beginning. I just removed the declaration and moved the Neuron definition up.
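For context, the forward declaration exists because the Layer typedef (a vector of Neuron) gets used in member signatures before Neuron is fully defined. A minimal sketch of that pattern, with everything except the class names invented for illustration:

    #include <cstdio>
    #include <vector>

    class Neuron;                       // forward declaration: the name now exists...
    typedef std::vector<Neuron> Layer;  // ...so this typedef can already mention it

    class Neuron {
    public:
        explicit Neuron(double v) : m_outputVal(v) {}
        double getOutputVal() const { return m_outputVal; }
        // Member signatures refer to Layer, which is itself a vector of Neuron;
        // that circularity is what the forward declaration untangles.
        double sumOutputs(const Layer &prevLayer) const;
    private:
        double m_outputVal;
    };

    double Neuron::sumOutputs(const Layer &prevLayer) const
    {
        double sum = 0.0;
        for (unsigned n = 0; n < prevLayer.size(); ++n) {
            sum += prevLayer[n].getOutputVal();
        }
        return sum;
    }

    int main()
    {
        Layer layer;
        layer.push_back(Neuron(0.5));
        layer.push_back(Neuron(0.25));
        std::printf("sum = %.2f\n", Neuron(0.0).sumOutputs(layer));  // prints 0.75
        return 0;
    }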
@Tio_kappa · a year ago
Can you re-upload it in 1080p?
@willr0073 · a year ago
This was amazing!! Can we use this code? How would we have to reference it? Thank you!!
@slubblesthuggy · 3 years ago
Is it normal that it can spit out negative values? Like -0.00234 or something
@rayenghanem6643 · 5 months ago
Hello guys, I have a question: why is he always using unsigned and not int in some of the trivial places, like for (unsigned i ...)? Why not simply for (int i = 0 ...)?
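A short illustration of the usual reason: vector::size() returns an unsigned type, so an unsigned loop counter avoids the signed/unsigned comparison warning. This is general C++ behavior rather than anything quoted from the video:

    #include <cstdio>
    #include <vector>

    int main()
    {
        std::vector<double> vals(3, 1.0);

        // vals.size() returns an unsigned type (std::vector<double>::size_type),
        // so an unsigned counter compares cleanly against it. This mirrors the
        // video's style of for (unsigned i = 0; ...).
        for (unsigned i = 0; i < vals.size(); ++i) {
            std::printf("vals[%u] = %.1f\n", i, vals[i]);
        }

        // A plain int works too for small containers, but mixing int with the
        // unsigned size() typically triggers a -Wsign-compare warning unless cast:
        for (int i = 0; i < static_cast<int>(vals.size()); ++i) {
            std::printf("vals[%d] = %.1f\n", i, vals[i]);
        }
        return 0;
    }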
@TheGoldenriff · a year ago
Bob Ross for computer scientists!
@filakabamba9584 · 2 years ago
How can I do the following? Please help ASAP.
1. Make an artificial neural network with dynamic input and binary output.
2. Make a self-organizing map with dynamic input and binary output.
@shotakhakhishvili8640 · 2 years ago
Does anyone have any idea why this doesn't work for me? I followed his path and I understand what he did very well, but it didn't work. Then I copied the working file, and it still doesn't work; the neural network just won't progress at all. It behaves on test 50,000 just as it did on the first 100 tests. Any idea what I should do? Edit: on test 1,000,000 now, and it "works" just the same; nothing has changed at all...
@galayuda · 10 months ago
Hello. Kurt Gödel proved that there can indeed be a statement that can neither be proved nor disproved, but only when you restrict yourself to formalism and standard logic for solving the problem. Computer architecture is built on formalism and standard logic, since von Neumann was one of the adherents of formalism in Hilbert's mathematical school. It may well be that routine work can and should be handed off to the computer until an "unsolvable problem" arises, and that problem's solution should then be found not with the computer but with the so-called human factor. By the human factor I mean fantasy and imagination, as well as the creative power granted to humans by their creator. As a good example I would point to the discovery of complex numbers, which solved the pressing problems of its time.
@virozz1024 · 2 years ago
How the heck is this gradient derivative calculated?
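For what it's worth, the derivative comes straight from the tanh transfer function: d/dx tanh(x) = 1 - tanh(x)^2, and since a neuron already stores tanh of its summed input as its output value, 1 - output^2 gives the same number, which is presumably where the code's 1 - x*x comes from. A minimal sketch of that identity, assuming the tanh transfer function used in the video:

    #include <cmath>
    #include <cstdio>

    int main()
    {
        // If the transfer function is output = tanh(sum), then
        // d(output)/d(sum) = 1 - tanh(sum)^2 = 1 - output^2.
        // That identity is the source of the "1 - x*x" expression,
        // with x being the neuron's stored output value.
        for (double sum = -2.0; sum <= 2.0; sum += 1.0) {
            double output = std::tanh(sum);
            double derivViaSum    = 1.0 - std::tanh(sum) * std::tanh(sum);
            double derivViaOutput = 1.0 - output * output;
            std::printf("sum=%+.1f  output=%+.4f  1-tanh^2=%.4f  1-out^2=%.4f\n",
                        sum, output, derivViaSum, derivViaOutput);
        }
        return 0;
    }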
@Zi7ar21 · 3 years ago
I learned C++ in a very different order lol I still dunno how to use classes
@waxitoto1234 · 3 years ago
they are not that hard, but it took me some time to understand the concept, it's just a struct with extra steps LEL
@destiny_02 · 3 years ago
In C++ a struct is the same as a class; the only difference is that struct members are public by default and class members are private by default.

    struct jj {
        int a;
    };

is the same as:

    class jj {
    public:
        int a;
    };
@chadgregory9037 · 2 years ago
@Jacob maybe you've not really done anything yet that truly required them lol fukin datanerds man
@sqbi4614 · 3 years ago
It just doesn't work for me.
@sqbi4614 · 3 years ago
It is very important to set

    double newDeltaWeight =
        // Individual input, magnified by the gradient and train rate:
        eta * neuron.getOutputVal() * m_gradient
        // Also add momentum = a fraction of the previous delta weight:
        + alpha * oldDeltaWeight;

with (...) + alpha * oldDeltaWeight and not (...) * alpha * oldDeltaWeight
@richardrisner921 · 3 years ago
@@sqbi4614 I did miss that, thanks! I also found that I needed to set m_recentAverageSmoothingFactor to 100.0 like in the source code. I then initialized the instance variable m_recentAverageError. But my program still does not work: it gets stuck at about 0.46 recent average error after tens of thousands of passes.
@sqbi4614 · 3 years ago
@@richardrisner921 well, stay determined and don't forget to set mutation variable to double (and pass it to functions as double, not unsigned)
@richardrisner921 · 3 years ago
I can compile the source code on his website in C++, so I must have missed something when I was following along in C#. I will have to study this further.
@richardrisner921 · 3 years ago
@@sqbi4614 Thanks!! I got a tremendous improvement with this: replace the approximate derivative 1 - x*x with 1 - tanh(x)*tanh(x). The network as designed was aggressively aiming for binary values and would get stuck at +1 for my training data. This change seems to have allowed it to tune in continuous values.
@tyforman1821 · 2 years ago
crazy white guy #crazies #spook
@davidporterrealestate · 4 years ago
Where is the original source?
@krayt5875 · 4 years ago
This looks like the original source for the video above: www.millermattson.com/dave/?cat=13
@thechoosen4240 · a year ago
Good job bro, JESUS IS COMING BACK VERY SOON;WATCH AND PREPARE