Projects / neuroevolution paper
I met Dániel Barabási at the Neuroscience Imbizo in 2019, and chatted about his toy "XOY" model of how gene expression in different neuron types could establish broad-scale connectivity patterns in the developing brain. Later, while attending the Cosyne neuroscience conference, I realized there was a simple way to recast the evolutionary problem of designing neural connectivity that makes an organism capable of rapid learning as a metalearning problem, in which backpropagation itself could be applied to optimize the matrices that establish the gene expression templates.
I explained how an outer optimization loop, which models evolution, optimizes the X, O, and Y matrices that establish connectivity, while an inner optimization loop, modeling learning within an organism's lifetime, optimizes the resulting connectivity matrices. This captures how evolution shapes brain connectivity so as to promote rapid learning. I scribbled the mathematical core of the idea on a napkin at the conference. Later, after much discussion, we met in Budapest and hacked on the idea. I wrote the code and helped design the diagrams, and Dániel wrote the majority of the paper itself.
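To make the two-loop structure concrete, here is a minimal NumPy sketch of the idea. All sizes, names, and the toy regression task are my own illustrative assumptions, not the paper's actual setup; the paper backpropagates through the inner loop and optimizes X, O, and Y jointly, whereas this sketch estimates the outer gradient with finite differences and, for brevity, evolves only O.

```python
import numpy as np

rng = np.random.default_rng(0)
n_pre, n_post, g = 6, 4, 3          # hypothetical sizes: neurons and genes

# Gene-expression factors (names and scales are illustrative assumptions)
X = 0.5 * rng.normal(size=(n_pre, g))   # expression of presynaptic neurons
Y = 0.5 * rng.normal(size=(n_post, g))  # expression of postsynaptic neurons
O = rng.normal(size=(g, g))             # gene-gene interaction rules

def connectivity(X, O, Y):
    # Compact genetic factors expand into a full connectivity matrix.
    return X @ O @ Y.T                  # shape (n_pre, n_post)

# A toy "lifetime" task: linear regression from n_pre inputs to n_post outputs.
inputs = rng.normal(size=(32, n_pre))
targets = inputs @ rng.normal(size=(n_pre, n_post))

def task_loss(W):
    return np.mean((inputs @ W - targets) ** 2)

def lifetime_loss(O, inner_steps=3, inner_lr=0.2):
    # Inner loop (organism learning): a few gradient steps on the
    # connectivity matrix, starting from the genetically specified wiring.
    W = connectivity(X, O, Y)
    n, p = inputs.shape[0], targets.shape[1]
    for _ in range(inner_steps):
        grad_W = 2 * inputs.T @ (inputs @ W - targets) / (n * p)
        W = W - inner_lr * grad_W
    return task_loss(W)

def outer_step(O, outer_lr=0.05, eps=1e-4):
    # Outer loop (evolution): improve the genetic rules O so that a few
    # steps of lifetime learning already yield low loss. Central finite
    # differences stand in for backprop through the inner loop here.
    grad_O = np.zeros_like(O)
    for i in range(g):
        for j in range(g):
            Op, Om = O.copy(), O.copy()
            Op[i, j] += eps
            Om[i, j] -= eps
            grad_O[i, j] = (lifetime_loss(Op) - lifetime_loss(Om)) / (2 * eps)
    return O - outer_lr * grad_O

losses = [lifetime_loss(O)]
for _ in range(100):
    O = outer_step(O)
    losses.append(lifetime_loss(O))
# Post-learning loss drops as "evolution" shapes the initial wiring.
```

The key design point the sketch illustrates is the genomic bottleneck: the inner loop never sees X, O, or Y directly, only the connectivity they generate, so the outer loop must encode useful wiring priors in far fewer parameters than the connectome itself.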
After many experiments and discussions, a gruelling slog of reviewer interaction (which Dániel heroically led the charge on), and the invaluable contributions of two other authors, we were able to publish in Nature Communications under the title "Complex computation from developmental priors" (nature.com).