Neat AI does Recurrent Connections

14,473 views

Neat AI

1 day ago

Comments: 19
@okboing
@okboing 3 years ago
This needs more recognition; I've never seen a NN loop back on itself.
@neatai6702
@neatai6702 3 years ago
Thanks for that. It does make a big difference for the asteroids pilot to know what it did on the previous move.
@Kraus-
@Kraus- 3 years ago
It's getting pretty good at blasting asteroids.
@neatai6702
@neatai6702 3 years ago
It is. I ran it again yesterday for a while with an improved AI, and it just blasts away for hours.
@captainjj7184
@captainjj7184 3 months ago
1:32 I'm gonna blow y'all's minds. This is how life just flows as fractals, and everything repeats as mimicry. Dunno who or what came before us, but now we're doing what it was doing, like a domino effect, and soon we'll build another version of... us. This NEAT algorithm already prefers a solution that a lot of us used as kids in fighting games: those annoying yet effective, continuous bursts of endless low kicks! It's alive 😅
@tomoki-v6o
@tomoki-v6o 11 months ago
Recurrent connections keep a memory of the ordering of the data, so to get good performance on XOR data, the only thing you need to do is shuffle the data randomly every iteration.
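A minimal sketch of that shuffling idea in Python (the `net.activate` single-step interface is a hypothetical stand-in, not the video's code):

```python
import random

# The four XOR cases: ((input_a, input_b), expected_output).
XOR_CASES = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def evaluate_xor(net):
    """Fitness of a (possibly recurrent) network on XOR.

    Shuffling the cases before every evaluation stops a recurrent
    network from exploiting a fixed presentation order as a hidden
    signal instead of actually solving XOR.
    """
    cases = XOR_CASES[:]        # copy, so the canonical list is untouched
    random.shuffle(cases)       # new random order every iteration
    error = 0.0
    for (a, b), target in cases:
        output = net.activate((a, b))[0]
        error += (output - target) ** 2
    return 4.0 - error          # max fitness 4.0 at zero squared error
```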
@typicalhog
@typicalhog 3 years ago
I'm not sure if you are using multiple mutatable activation functions or not, but I've got another potentially interesting idea: two new types of neurons. An integrator/accumulator neuron that would sum up its states into a "pool", and a derivator/delta neuron (I really don't know all the math terms that could be applied here) that would return the change in its state's value instead of returning the state like a normal neuron. Another possibly emergent property could arise if the derivator/delta neuron fed its absolute output value into the integrator/accumulator neuron, meaning the pair would essentially "collect" changes in values, and you'd get something that measures volatility. There could also be a decay factor to prevent the values in the accumulator from going to infinity. I might do a lil sketch in Paint, 'cause I'm really bad at explaining this.
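A minimal Python sketch of the two proposed neuron types, following the commenter's description (class names and the decay constant are illustrative, not anything from the video):

```python
class IntegratorNeuron:
    """Accumulates incoming values into a running pool.

    A decay factor below 1.0 keeps the pool from growing to
    infinity, as suggested above.
    """
    def __init__(self, decay=0.95):
        self.pool = 0.0
        self.decay = decay

    def step(self, value):
        self.pool = self.pool * self.decay + value
        return self.pool


class DeltaNeuron:
    """Returns the change in its input since the previous step,
    instead of the state itself."""
    def __init__(self):
        self.previous = 0.0

    def step(self, value):
        delta = value - self.previous
        self.previous = value
        return delta


# Chaining them as described: the integrator collects absolute
# deltas, giving a rough running measure of volatility.
delta, pool = DeltaNeuron(), IntegratorNeuron(decay=0.95)
for x in [0.0, 1.0, 0.5, 1.5, 1.4]:
    print(round(pool.step(abs(delta.step(x))), 3))
```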
@neatai6702
@neatai6702 3 years ago
All great ideas. Please keep them coming; I'll try and work them into future videos.
@typicalhog
@typicalhog 3 years ago
@neatai6702 I also tried to draw this to better explain it, but YT seemed to auto-remove the link: ipfs . io/ipfs/QmS2ptefbYMGcLetEiiYzWibKNfBohDtJ4fYyU9kFpxJ4z?filename=2021_06_23_0rf_Kleki.png Hope this works. Also, maybe all this is completely useless, no idea, really.
@typicalhog
@typicalhog 3 years ago
XOR is probably the simplest thing a non-recurrent network can learn to solve. The simplest problems I can think of that would benefit from recurrent connections might be counting to 10 or generating Fibonacci numbers. If you wanted to make a video that focuses more on recurrent connections, you could do that, or even a simple memory game: say, a 4x4 grid where we try to get the AI to find all the pairs in fewer tries than just choosing random cards.
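As a sketch of how the counting task could be scored (purely illustrative; the `net.reset`/`net.activate` interface and the scoring scheme are assumptions, not from the video), a recurrent genome gets a constant input each step and must rely on its internal state to count:

```python
def count_to_ten_fitness(net):
    """Score a recurrent network on the 'count to 10' task.

    The network receives the same constant input at every step, so
    only its recurrent state can tell the steps apart.
    """
    net.reset()                              # clear recurrent state
    error = 0.0
    for target in range(1, 11):
        output = net.activate((1.0,))[0]     # constant input each step
        error += (output - target) ** 2
    return 1.0 / (1.0 + error)               # 1.0 means a perfect count
```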
@typicalhog
@typicalhog 3 years ago
What language are you using? And what graphics library? I'm sorry if you mentioned it already or if I've asked you before; I may have forgotten.
@dough6081
@dough6081 3 years ago
Hi! Just want to point out, before my tiny rant, that I love your videos! So, this isn't really how recurrent networks (RNNs) work. If you want to make a layer recurrent, you need to make a new hidden layer, with its own weights, whose input depends on the layer you chose to make recurrent. Its input: at the 0th step, the input is all zeros; during every other iteration of your recurrent layer, its input is the output of that layer from the previous iteration. So: activation_func(w + w_from_recurrent_layer + b). If you are not using an alternative to backpropagation (here Neat AI does use an alternative), you need to use BPTT (backpropagation through time) to train the weights of the recurrent layer.
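A minimal sketch of the layer-level recurrence described above (a plain Elman-style RNN cell in NumPy; this is the textbook construction, not the video's code):

```python
import numpy as np

def rnn_forward(inputs, W_in, W_rec, b):
    """Run an Elman-style recurrent layer over a sequence.

    inputs: (timesteps, input_size) array
    W_in:   (input_size, hidden_size) input weights
    W_rec:  (hidden_size, hidden_size) recurrent weights
    b:      (hidden_size,) bias
    """
    hidden = np.zeros(W_rec.shape[0])   # step 0: recurrent input is all zeros
    outputs = []
    for x in inputs:
        # activation(input contribution + recurrent contribution + bias)
        hidden = np.tanh(x @ W_in + hidden @ W_rec + b)
        outputs.append(hidden)
    return np.array(outputs)

# Example: 5 timesteps of 3 features into a 4-unit recurrent layer.
rng = np.random.default_rng(0)
seq = rng.normal(size=(5, 3))
out = rnn_forward(seq, rng.normal(size=(3, 4)),
                  rng.normal(size=(4, 4)), np.zeros(4))
```

Training W_rec by gradient descent would require BPTT; NEAT sidesteps that by evolving the weights instead.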
@neatai6702
@neatai6702 3 years ago
Thanks for the comment (and the rant!). Fully agree with you on RNNs. All I'm doing here, though, is allowing a specific mutation which can add recurrent connections and seeing what the impact is. Maybe I have the naming wrong?
@DavidGillespie1987
@DavidGillespie1987 1 year ago
Is there a place to see the code for this?
@Dalroc
@Dalroc 7 months ago
Isn't it a bit weird to feed the recurrent connection into the input layer? I feel like it should be between different hidden layers or from the output layer to a hidden layer. The input layer shouldn't be messed with.
@FSckaff
@FSckaff 11 days ago
The bias node isn't really an input.
@jonathanwilson8809
@jonathanwilson8809 3 years ago
Is that Poly Bridge music?
@domc2909
@domc2909 3 years ago
I'm not quite following. So the previous output of the node gets fed back to the input node and added to its current value? How many times is this done? Is it just one previous value each time, or do they build up and average? Also, if recurrent connections connect back to the input layer, do input nodes also require activation functions? Usually they just take a scaled input and pass it on, without any squashing function applied.
@neatai6702
@neatai6702 3 years ago
When you're working out the total input sum to a node, it simply takes the outputs from the connections that terminate at that node and adds them up. If a connection comes from the output of a node later in the network, that value won't have been updated yet, because the input signal won't have reached it, so the node is using the 'old' value for that output. In other words, it's using what it did previously as an input for what to do now; hence the memory effect of recurrent connections. It doesn't matter if the connection goes back to an input-layer node: you might be scaling the input signals, but the recurrent connection's contribution just gets added on, and the sum appears at the node's output, since there's no activation function for input-layer nodes.
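A minimal sketch of that evaluation scheme (the structure is illustrative, not the video's actual code): nodes are updated in feed-forward order, so a connection whose source comes later in that order still reads the value left over from the previous pass.

```python
import math

def activate(nodes, connections, inputs, eval_order):
    """One activation pass over a NEAT-style network.

    nodes:       dict node_id -> output value; persists between calls,
                 which is what gives recurrent connections memory.
    connections: list of (src_id, dst_id, weight) for enabled genes.
    inputs:      dict input node_id -> external input value.
    eval_order:  node ids in feed-forward order.
    """
    for node_id in eval_order:
        # A source not yet updated this pass still holds last pass's
        # output: the 'old' value that recurrent connections read.
        total = sum(nodes[src] * w
                    for src, dst, w in connections if dst == node_id)
        if node_id in inputs:
            # Input nodes: external signal plus any recurrent
            # contribution, with no activation function applied.
            nodes[node_id] = inputs[node_id] + total
        else:
            nodes[node_id] = math.tanh(total)
    return nodes

# Node 0 is an input that also receives a recurrent connection back
# from hidden node 1, as discussed above.
state = {0: 0.0, 1: 0.0}
genes = [(1, 0, 0.3), (0, 1, 0.5)]   # (1, 0) is the recurrent gene
for step in range(3):
    state = activate(state, genes, {0: 1.0}, eval_order=[0, 1])
```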