Would it be possible to write an algorithm like NEAT, which evolves the topology of a neural network, in Fold? From my understanding, Fold does not support dynamic computation graphs in this sense; the dynamic graphs it supports are input-dependent rather than evolved.
crisbodnar changed the title from "NEAT implementation" to "NEAT Algorithm implementation" on Oct 29, 2018.
The original NEAT algorithm evolved small neural networks at the level of individual neurons. That obviously won't work with TensorFlow; calling a Python function to schedule a TensorFlow operation to invoke a CUDA kernel to multiply two scalars would have ridiculously high overhead.
However, if you want to implement NEAT at the level of NN layers, rather
than individual neurons, then Tensorflow Fold should work quite well.
You'll want to use the loom library, not the blocks library.
First, create a separate LoomOp class for every NN operation you want to
support. Second, implement NEAT on a population of programs. Each program
in the population is a DAG of NN operations. Write a recursive Python
function which traverses the DAG for every program in the population, and
invokes the appropriate LoomOp for each node. There is a calculator
example in loom that shows how to do this for arithmetic expressions. Loom
will handle the dynamic batching for you, and evaluate the LoomOps for all
programs using TensorFlow.
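The recursive traversal described above can be sketched in plain Python. This is only a structural sketch: the op table below uses ordinary Python callables as hypothetical stand-ins for LoomOp instances (real loom code would subclass `loom.LoomOp` and emit loom constants), and the genome encoding (`node -> (op_name, dependencies)`) is an assumption, not Fold's API.

```python
# Structural sketch of evaluating one NEAT genome encoded as a DAG.
# Each node maps to (op_name, list_of_dependency_nodes). In real loom
# code, OPS entries would be LoomOp instances and evaluate() would emit
# loom graph nodes; here they are plain callables for illustration.

# Hypothetical stand-ins for LoomOps (names are illustrative only).
OPS = {
    "input": lambda inputs, x: x,                 # feed the network input
    "dense": lambda inputs, x: sum(inputs) * 0.5, # placeholder for a NN layer
    "add":   lambda inputs, x: sum(inputs),       # merge two branches
}

def evaluate(node, graph, x, cache=None):
    """Recursively evaluate `node` of a genome DAG on input `x`,
    memoizing results so shared sub-nodes are computed once."""
    if cache is None:
        cache = {}
    if node in cache:
        return cache[node]
    op_name, deps = graph[node]
    inputs = [evaluate(d, graph, x, cache) for d in deps]
    result = OPS[op_name](inputs, x)
    cache[node] = result
    return result

# A tiny genome with a skip connection: out = dense(in) + in
genome = {
    "in":  ("input", []),
    "h1":  ("dense", ["in"]),
    "out": ("add",   ["h1", "in"]),
}
print(evaluate("out", genome, 2.0))  # h1 = 0.5 * 2.0 = 1.0; out = 1.0 + 2.0 = 3.0
```

Running this traversal over every genome in the population, but emitting loom ops instead of computing values directly, is what lets loom batch identical operations across programs.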
It might be easier to build a prototype using TensorFlow Eager, and then
switch to loom. Loom should give you a nice ~30x speedup over eager due to
dynamic batching.
-DeLesley