NeuroEvolution of Augmenting Topologies (NEAT) and Compositional Pattern Producing Networks (CPPN)

6,786 views

Aleksa Gordić - The AI Epiphany

1 day ago

❤️ Become The AI Epiphany Patreon ❤️
/ theaiepiphany
👨‍👩‍👧‍👦 Join our Discord community 👨‍👩‍👧‍👦
/ discord
In this video I cover 2 papers:
1) NEAT: NeuroEvolution of Augmenting Topologies - a seminal paper from 2002 that evolves not just the network weights but also network architectures
2) CPPN: Compositional Pattern Producing Networks: A Novel Abstraction of Development - an interesting model of developmental biology with a completely different approach compared to, e.g., cellular automata models
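To make the CPPN idea concrete, here is a minimal toy sketch (my own hand-written composition, not an evolved network from the paper): a CPPN is simply a function of the spatial coordinates (x, y), built by composing primitives such as Gaussians and sines, and the "image" is obtained by querying that function at every pixel. Symmetric primitives yield symmetry, periodic ones yield repetition.
```python
import math

def gaussian(t):
    # Bell-shaped and symmetric around t = 0 (abstracting a chemical gradient).
    return math.exp(-t * t)

def cppn(x, y):
    # One hand-written composition of primitives; in the papers, NEAT evolves
    # this structure instead of a human writing it down.
    return math.tanh(gaussian(3 * x) + math.sin(5 * y) + gaussian(3 * x * y))

# Query the function over a coordinate grid in [-1, 1] x [-1, 1] to "paint" a pattern.
width, height = 64, 64
image = [[cppn(2 * i / (width - 1) - 1, 2 * j / (height - 1) - 1)
          for i in range(width)] for j in range(height)]
```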
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
✅ NEAT paper: nn.cs.utexas.edu/downloads/pap...
✅ CPPN paper: eplex.cs.ucf.edu/papers/stanle...
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
⌚️ Timetable:
00:00 Intro to NEAT and CPPNs
02:35 Basic ideas behind NEAT
07:55 NEAT genome explained
11:05 Competing conventions problem
13:25 NEAT mutations explained
15:30 NEAT genome mating explained
19:20 Maintaining innovations via speciation
25:25 Explicit fitness sharing
29:45 NEAT on XOR task
31:30 CPPNs and neural automata
36:40 Spatial signal as a chemical gradient abstraction
39:20 Composing functions
45:10 CPPN main idea recap
46:45 Breeding "images" using CPPNs
49:20 CPPNs are highly expressive (symmetries, repetition...)
54:17 HyperNEAT idea explained
57:43 Outro
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
💰 BECOME A PATREON OF THE AI EPIPHANY ❤️
If these videos, GitHub projects, and blogs help you,
consider helping me out by supporting me on Patreon!
The AI Epiphany - / theaiepiphany
One-time donation - www.paypal.com/paypalme/theai...
Huge thank you to these AI Epiphany patrons:
Eli Mahler
Kevin Stone
Petar Veličković
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
💼 LinkedIn - / aleksagordic
🐦 Twitter - / gordic_aleksa
👨‍👩‍👧‍👦 Discord - / discord
📺 KZbin - / theaiepiphany
📚 Medium - / gordicaleksa
💻 GitHub - github.com/gordicaleksa
📢 AI Newsletter - aiepiphany.substack.com/
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
#NEAT #CPPN #evolution

Comments: 17
@chankhavu · 2 years ago
Another great paper review! NEAT was actually the first AI paper I ever implemented (in... C++ lol), and also my personal introduction to AI (at that time I didn't know about CNNs or perceptrons; I didn't even know about trees, SVMs, or regressions). I even wrote a 2D game to test out NEAT. This video was my motivation: kzbin.info/www/bejne/p6eZhomFhpV5apY&ab_channel=SethBling
@Aniket7Tomar · 2 years ago
Just came across this on Twitter, glad that you decided to do a video on it.
@fmj.mytube8846 · 10 months ago
Nice video. I'm honestly struggling with NEAT, and over really ridiculous things. I have two questions.
First, why does "trait" exist? I don't see any purpose for it in the code, yet the class shows up everywhere.
Second, the crossover, where I'm thinking of writing my own algorithm. Let's go to 18:05. Innovation 5 gets disabled, cool. Now consider the hypothetical case (which actually happens, so it's not that hypothetical) that Parent 2 has no further innovations coming out of node 5. Node 5 then has no output and becomes a dead end. That sounds like a defect in the crossover algorithm, and in fact it is. Another crossover defect: you don't want recurrence, but one parent has a disjoint innovation going, say, 4->5 while the other has an excess innovation going 5->4. You have to check for this, and then you start breaking the topology, so you end up with a broken neural network with endless loops like 4->5 and 5->4. Which one is the recurrent connection if both parents store their innovation as a forward connection? Just because node 4 was created before node 5? It's a vague algorithm.
You might say "well, the rest of the algorithm is fine" (let's suppose so; the speciation distance calculation looks like it was designed blindfolded, there's no real logic behind it), but crossover is the heart of NEAT, and a weak crossover doesn't improve the fitness of the organisms. I'm torn between throwing NEAT away and doing something else (the problem is that I don't know the math) or designing my own algorithms, which will take a lot of time and attempts.
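A rough sketch of how innovation-number-aligned crossover is typically implemented (a simplified illustration under my own assumptions, not the reference NEAT code): matching genes are inherited at random, while disjoint and excess genes come only from the fitter parent, so the child keeps that parent's connectivity and the dead-end situation described above cannot arise from mixing structures.
```python
import random
from dataclasses import dataclass

@dataclass
class ConnGene:
    innovation: int   # historical marking, shared across genomes
    in_node: int
    out_node: int
    weight: float
    enabled: bool = True

def crossover(fitter, other, reenable_prob=0.25):
    """Cross over two lists of ConnGene, aligned by innovation number."""
    other_by_innov = {g.innovation: g for g in other}
    child = []
    for gene in fitter:
        match = other_by_innov.get(gene.innovation)
        # Matching gene: pick either parent's copy at random.
        # Disjoint/excess gene: keep the fitter parent's copy.
        src = random.choice([gene, match]) if match is not None else gene
        new_gene = ConnGene(src.innovation, src.in_node, src.out_node,
                            src.weight, src.enabled)
        # A gene disabled in either parent stays disabled in the child most of
        # the time; it is re-enabled only with a small probability.
        if (not gene.enabled) or (match is not None and not match.enabled):
            new_gene.enabled = random.random() < reenable_prob
        child.append(new_gene)
    return child
```
When the parents have equal fitness, the paper inherits disjoint and excess genes from both parents, and that is where disconnected or looping structures like the ones described above can appear; implementations either repair such genomes or simply let low fitness weed them out.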
@alexanderchernyavskiy9538 · 2 years ago
Thank you for such an inspiration, reviewing great things from so unusual perspectives!
@TheAIEpiphany · 2 years ago
🙏🙏🙏
@alexijohansen · 2 years ago
This is so awesome! Thank you for doing these videos.
@TheAIEpiphany · 2 years ago
Thanks!!
@user-co6pu8zv3v · 1 year ago
Thank you for the video. During the pandemic, I implemented NEAT in C++ using SML and Box2D (an Asteroids game). It worked quite well. Unfortunately, I didn't save my work. I'll now try to redo it in Python; I think it will help me get some practice with Python.
@TheAIEpiphany · 2 years ago
Very interesting, non-standard ideas inspired by genetics, evolution and developmental biology.
@tanguypledel3247 · 2 years ago
Thanks for this video. I think I'm not skilled enough to understand everything yet, but I'll come back to it later.
@TheAIEpiphany · 2 years ago
Become comfortable with not understanding everything - that's a very important trait/skill one needs in the field of deep learning. You'll be collecting pieces of the puzzle from multiple resources. Good luck!
@tanguypledel · 2 years ago
@@TheAIEpiphany thanks for the advice :) I'll try to apply it even if it's scary
@TheMLover · 2 years ago
Thanks a lot! This was very helpful! Maybe you could do a quick series of videos implementing some of this in Python? Just to see how it should be done following the paper.
@rolexianibuyat2818 · 1 year ago
I'd love to see that too. I hope he makes it in the future.
@OleguitoSwagbucks · 1 year ago
Hi! Thanks!!! I have a question: could you suggest which method of pruning unfit individuals is generally better?
@tomaszsikora6723 · 10 months ago
Today I thought I had found an easier way to generate CPPNs acting as a substrate for generating neural networks (HyperNEAT). I used linear genetic programming. You define functions that each take, say, 2 arguments for simple crossover, and you introduce registers. A single instruction can be SUM R1 R2 -> R3, which sums registers 1 and 2 and saves the result to register 3. At the beginning you write your inputs into these registers, and for the output at the end of the program you, say, average them. I couldn't make it produce an image of a circle, even with functions like sin, tanh, and gauss, so it looks like LGP isn't the way to go. I thought I was clever.
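For context, a register-machine program in the spirit of that description might look like the following toy sketch (my own assumed instruction format and register conventions, just to make the idea concrete):
```python
import math

# Each instruction applies a 2-argument function to two registers and writes
# the result into a third. Registers R0 and R1 hold the (x, y) input coordinates.
OPS = {
    "SUM":   lambda a, b: a + b,
    "SIN":   lambda a, b: math.sin(a + b),
    "TANH":  lambda a, b: math.tanh(a + b),
    "GAUSS": lambda a, b: math.exp(-(a + b) ** 2),
}

def run_program(program, x, y, n_registers=4):
    """Evaluate a linear GP program at a single (x, y) coordinate."""
    regs = [0.0] * n_registers
    regs[0], regs[1] = x, y                      # write the inputs
    for op, src1, src2, dst in program:
        regs[dst] = OPS[op](regs[src1], regs[src2])
    return sum(regs) / len(regs)                 # average the registers as output

# Example: intensity at each pixel of a small pattern queried over [-1, 1] x [-1, 1].
program = [("GAUSS", 0, 1, 2), ("SIN", 2, 0, 3), ("SUM", 2, 3, 2)]
image = [[run_program(program, x / 16 - 1, y / 16 - 1) for x in range(32)]
         for y in range(32)]
```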
@kutay8421 · 1 year ago
Very much liked the idea. 1) Can I reach you by email to share some inspirations and ideas on this very subject? 2) Are you an academic? If so, how do you get to choose the topics you like (biology, deep learning, etc.), and if not, how do you make a living? I really want to know, because it seems impossible to me to follow your passion and earn a living at the same time in this era. Best,