@leo A typical genetic algorithm relies mostly on recombination. Mutation is largely unnecessary if you start with a large enough population.

It would be more interesting if you generated a population of random neural networks and combined their weights (taking some parts from one parent and some from another) to make the next generation.
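
Roughly what I mean, as a minimal sketch: the XOR task, the tiny 2-4-1 networks, and the elite-selection scheme are all just illustrative choices of mine, not anything fixed by the idea itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy task: fit XOR with tiny 2-4-1 networks, evolved by crossover alone.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)

N_WEIGHTS = 2 * 4 + 4 + 4 * 1 + 1  # W1, b1, W2, b2, flattened into one genome

def forward(w, x):
    W1 = w[:8].reshape(2, 4)
    b1 = w[8:12]
    W2 = w[12:16].reshape(4, 1)
    b2 = w[16]
    h = np.tanh(x @ W1 + b1)
    return (h @ W2).ravel() + b2

def fitness(w):
    # Higher is better: negative mean squared error on the task.
    return -np.mean((forward(w, X) - y) ** 2)

def crossover(a, b):
    # Uniform crossover: each weight is taken from one parent at random.
    mask = rng.random(N_WEIGHTS) < 0.5
    return np.where(mask, a, b)

pop = rng.normal(0, 1, size=(200, N_WEIGHTS))  # large random initial population
for gen in range(100):
    scores = np.array([fitness(w) for w in pop])
    elite = pop[np.argsort(scores)[-50:]]               # keep the best quarter
    parents = elite[rng.integers(0, 50, size=(len(pop), 2))]
    pop = np.array([crossover(a, b) for a, b in parents])  # no mutation at all
    if gen % 20 == 0:
        print(f"gen {gen}: best MSE = {-scores.max():.4f}")
```

Note that this relies on the initial population already containing the right building blocks: selection plus crossover can only recombine what is there, which is why the claim above needs a large enough starting population.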

Otherwise, what you are doing is just a less efficient way of doing gradient descent.

@leo I would really like to read what comes out of this if you find time to implement it.

On the one hand, there are reasons why it shouldn't work. If you take two different trained *deep* networks, their internal representations in the hidden layers may be incompatible: the same feature can live in different hidden units in each parent, so swapping weights between them mixes up unrelated roles. But, on the other hand, if there are two classes of mutually incompatible neural networks in the population, one of them will at some point outnumber the other.
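
To make the incompatibility point concrete, here's a sketch (the 2-4-1 shape and the particular permutation are arbitrary choices of mine): two parents that compute exactly the same function, but with their hidden units in a different order, so a naive crossover child is far worse than either parent.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 2))

def net(W1, b1, W2, x):
    return np.tanh(x @ W1 + b1) @ W2

# Parent A: a random 2-4-1 network.
W1, b1 = rng.normal(size=(2, 4)), rng.normal(size=4)
W2 = rng.normal(size=(4, 1))

# Parent B computes the *same function*, but its hidden units are permuted.
perm = np.array([2, 0, 3, 1])
W1p, b1p, W2p = W1[:, perm], b1[perm], W2[perm]

# Crossover child: first two hidden units from A, last two from B.
Wc1 = np.concatenate([W1[:, :2], W1p[:, 2:]], axis=1)
bc1 = np.concatenate([b1[:2], b1p[2:]])
Wc2 = np.concatenate([W2[:2], W2p[2:]])

target = net(W1, b1, W2, X)
print("A vs B max diff:    ", np.max(np.abs(target - net(W1p, b1p, W2p, X))))  # ~0
print("A vs child max diff:", np.max(np.abs(target - net(Wc1, bc1, Wc2, X))))  # large
```

This is the permutation-symmetry problem: one and the same function has many weight-space representations, so crossover only helps between parents that happen to use compatible ones.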

There are already articles on this topic from 1989, though: ijcai.org/Proceedings/89-1/Pap
