This needs more recognition; I've never seen a NN loop back on itself.
@neatai6702 · 3 years ago
Thanks for that.. It does make a big difference for the asteroids pilot to know what it did on the previous move..
@Kraus- · 3 years ago
It's getting pretty good at blasting asteroids.
@neatai6702 · 3 years ago
It is.. I ran it again yesterday for a while with an improved AI and it just blasts away for hours..
@captainjj7184 · 3 months ago
1:32 I'm gonna blow y'all's minds. This is how life just flows as fractals and everything repeats as mimicry. Dunno who or what came before us, but now we're doing what it was doing, like a domino effect, and soon we'll build another version of... us. This NEAT algorithm already prefers a solution that a lot of us used as kids in fighting games: that annoying yet effective continuous burst of endless low kicks! It's alive😅
@tomoki-v6o · 11 months ago
Recurrent connections keep a memory of the ordering of the data, so to get good performance on XOR data, the only thing you need to do is shuffle the data randomly every iteration.
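A minimal sketch of that suggestion, assuming a NEAT-style network object with an activate() method (the method name and the fitness scaling are my own illustration, not anything from the video's code):

```python
import random

# The four XOR cases; reshuffled every evaluation so a recurrent network
# can't just memorise the order in which they arrive.
XOR_CASES = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def xor_fitness(network):
    cases = XOR_CASES[:]
    random.shuffle(cases)                      # new ordering every iteration
    error = 0.0
    for (a, b), expected in cases:
        # no reset between cases, so any recurrent state carries over;
        # the shuffle stops it from encoding the sequence instead of XOR
        output = network.activate([a, b])[0]   # assumed API
        error += (output - expected) ** 2
    return 4.0 - error                         # higher is better
```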
@typicalhog3 жыл бұрын
I'm not sure if you are using multiple mutatable activation functions or not, but I've got another potentially interesting idea: two new types of neurons. An integrator/accumulator neuron that would sum up the states into a "pool", and a derivator/delta neuron (I really don't know all the math terms that could be applied here) that would return the change in its state's value instead of returning the state like a normal neuron. Another possibly emergent property could arise if the derivator/delta neuron fed its absolute output value into the integrator/accumulator neuron (meaning the pair would essentially "collect" changes in values and you'd get something that measures volatility). There could also be a decay factor to prevent the values in the accumulator from going to infinity. I might do a lil sketch in Paint cause I'm really bad at explaining this.
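A minimal sketch of the two proposed neuron types; the class names, update rules, and decay parameter are my own reading of the idea, not anything from the video's code:

```python
class IntegratorNeuron:
    """Accumulates (pools) its inputs over time, with optional decay."""
    def __init__(self, decay=0.99):
        self.pool = 0.0
        self.decay = decay  # keeps the pool from growing without bound

    def activate(self, input_sum):
        self.pool = self.pool * self.decay + input_sum
        return self.pool


class DeltaNeuron:
    """Outputs the change in its input since the previous step."""
    def __init__(self):
        self.prev = 0.0

    def activate(self, input_sum):
        delta = input_sum - self.prev
        self.prev = input_sum
        return delta


# Feeding |delta| into an integrator gives a rough volatility measure:
integ, delta = IntegratorNeuron(decay=0.95), DeltaNeuron()
for x in [0.1, 0.9, 0.2, 0.8]:
    volatility = integ.activate(abs(delta.activate(x)))
```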
@neatai6702 · 3 years ago
All great ideas.. please keep them coming.. I'll try and work them into future videos..
@typicalhog · 3 years ago
@@neatai6702 I also tried to draw this to better explain it, but YT seemed to auto remove the link. ipfs . io/ipfs/QmS2ptefbYMGcLetEiiYzWibKNfBohDtJ4fYyU9kFpxJ4z?filename=2021_06_23_0rf_Kleki.png Hope this works. Also, maybe all this is completely useless, no idea, really.
@typicalhog · 3 years ago
XOR is probably the simplest thing a non-recurrent network can learn to solve. The simplest problem I can think of that would benefit from recurrent connections might be counting to 10, or generating Fibonacci numbers? If you wanted to make a video that focuses more on recurrent connections, you could use one of those, or even a simple memory game: say a 4x4 grid where we try to get the AI to find all the pairs in fewer tries than just choosing random cards.
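A minimal sketch (my own illustration, with an assumed reset()/activate() network API) of a fitness function for the Fibonacci suggestion: the network gets only a constant bias input each step, so it can only score well by carrying its previous outputs forward through recurrent connections.

```python
def fibonacci_fitness(network, steps=10):
    # target sequence: 1, 1, 2, 3, 5, ...
    target = [1, 1]
    while len(target) < steps:
        target.append(target[-1] + target[-2])

    network.reset()                            # clear recurrent state (assumed API)
    error = 0.0
    for expected in target:
        output = network.activate([1.0])[0]    # bias-only input (assumed API)
        error += (output - expected) ** 2
    return 1.0 / (1.0 + error)                 # higher is better
```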
@typicalhog3 жыл бұрын
What language are you using? Also, what GFX lib? I'm sorry if you mentioned it already or if I asked you before, I may have forgotten.
@dough6081 · 3 years ago
hi! just want to point out, before my tiny rant, that I love your videos! So, this isn't really how recurrent networks (or RNNs) work. If you want to make a layer recurrent, you need to make a new hidden layer, with its own weights, whose input depends on the layer you chose to make recurrent. Its input: at the 0th step the input is all 0s, and on every other iteration of your recurrent layer its input is the output of that layer from the previous iteration, so: activation_func(W·input + W_recurrent·previous_output + b). If you are not using an alternative to backpropagation (here NEAT AI does use an alternative), you need to use BPTT (backpropagation through time) to train the weights of the recurrent layer.
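For reference, a minimal sketch of the recurrent layer that comment describes, written with NumPy; the shapes and names are my own illustration:

```python
import numpy as np

def rnn_forward(inputs, W_x, W_h, b, activation=np.tanh):
    """inputs: list of input vectors, one per time step."""
    h = np.zeros(W_h.shape[0])        # state at step 0 is all zeros
    outputs = []
    for x in inputs:
        # current input plus the layer's own output from the previous step
        h = activation(W_x @ x + W_h @ h + b)
        outputs.append(h)
    return outputs

# e.g. a 3-unit recurrent layer over 2-dimensional inputs:
rng = np.random.default_rng(0)
W_x, W_h, b = rng.normal(size=(3, 2)), rng.normal(size=(3, 3)), np.zeros(3)
states = rnn_forward([np.array([0.0, 1.0]), np.array([1.0, 0.0])], W_x, W_h, b)
```

Training W_x and W_h with gradient descent would need backpropagation through time; as the replies below note, NEAT sidesteps that by evolving the weights instead.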
@neatai6702 · 3 years ago
Thanks for the comment (and the rant!).. Fully agree with you on RNNs... All I'm doing here though is allowing a specific mutation which can add recurrent connections and seeing what the impact is.. maybe I have the naming wrong?
@DavidGillespie1987 · 1 year ago
Is there a place to see the code for this?
@Dalroc · 7 months ago
Isn't it a bit weird to feed the recurrent connection into the input layer? I feel like it should be between different hidden layers or from the output layer to a hidden layer. The input layer shouldn't be messed with.
@FSckaff · 11 days ago
The bias node isn’t really an input
@jonathanwilson8809 · 3 years ago
Is that Poly Bridge music?
@domc2909 · 3 years ago
I'm not quite following. So the previous output of the node gets fed back to the input node and added to its current value? How many times is this done? Is it just one previous value each time, or do they build up and average? Also, if recurrent connections connect back to the input layer, do input nodes also require activation functions? Usually they just take a scaled input and pass it on without any squashing function applied.
@neatai6702 · 3 years ago
When you're working out the total input sum to a node, it simply takes the outputs from the connections that terminate at that node and adds them up.. If a connection is coming from the output of a node later in the network, that value won't have been updated yet because the input signal won't have reached it, so it's using the 'old' value for that node's output.. So it's using what it did previously as an input for what to do now.. hence the memory effect of recurrent connections.. It doesn't matter if it's going back to an input layer node.. You might be scaling the input signals, but the recurrent connection component just gets added on, and the sum appears at the node's output as there's no activation function for the input layer nodes..
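A minimal sketch of that evaluation scheme as I read it (my own illustration in Python, not the video's actual code): a connection whose source node hasn't been updated yet this pass simply uses the source's output from the previous evaluation, which is where the memory comes from.

```python
import math

def evaluate(nodes, connections, inputs):
    """
    nodes: dict ordered roughly input -> hidden -> output; each entry is
           {'type': 'input'|'hidden'|'output', 'output': float}, where
           'output' still holds last evaluation's value until overwritten.
    connections: list of (src_id, dst_id, weight), feed-forward or recurrent.
    inputs: dict input_node_id -> scaled input value.
    """
    for node_id, node in nodes.items():
        # Sum every connection terminating at this node; if the source node
        # hasn't been updated yet this pass (a recurrent connection), this
        # picks up its 'old' output from the previous move.
        total = sum(w * nodes[src]['output']
                    for src, dst, w in connections if dst == node_id)

        if node['type'] == 'input':
            # scaled input plus any recurrent component, no activation applied
            node['output'] = inputs[node_id] + total
        else:
            node['output'] = math.tanh(total)   # activation choice is assumed

    return [n['output'] for n in nodes.values() if n['type'] == 'output']
```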