Getting Started


To begin evolution, we need to create a seed genome and a population from it. Before anything else, though, we create an object that holds all the parameters used by NEAT:


import MultiNEAT as NEAT
params = NEAT.Parameters() 


params.PopulationSize = 100


This is usually the point where all custom values for the parameters are set. Here we set the population size to 100 individuals (the default is 300). Now we create a genome with 3 inputs and 2 outputs:


genome = NEAT.Genome(0, 3, 0, 2, False, NEAT.ActivationFunction.UNSIGNED_SIGMOID, NEAT.ActivationFunction.UNSIGNED_SIGMOID, 0, params)    


Notice that we set more properties of the genome than just the number of inputs/outputs. Also note the extra input: if your project uses 2 inputs, you need to write 3 in the constructor. Always add one extra input, because the last input serves as bias; whenever you activate the network, set it to 1.0 (or any other constant non-zero value). The constructor also sets the activation function type for the output and hidden neurons; hidden neurons are optional. After the genome is created, we create the population like this:


pop = NEAT.Population(genome, params, True, 1.0)


The last two parameters specify whether the initial population should be randomized and by how much. Because we are starting from a new genome and not one that was previously saved, we randomize the initial population.
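As a side note, the bias convention described above just means that every vector of raw sensor values gets one constant appended before it is fed to the network. A minimal sketch of that idea, assuming a helper named with_bias (illustrative, not part of MultiNEAT's API):

```python
# Sketch: appending the constant bias input before activation. The helper
# name with_bias is illustrative and not part of MultiNEAT's API.
def with_bias(sensor_values, bias=1.0):
    # the genome was declared with one extra input reserved for the bias,
    # so every input vector gets the constant appended at the end
    return list(sensor_values) + [bias]

inputs = with_bias([1.0, 0.0])  # a genome declared with 3 inputs takes 2 real values
```

A wrapper like this keeps the bias convention in one place instead of being repeated at every net.Input call.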


Evolution can run now. For this we need an evaluation function. It takes a Genome as a parameter and returns a float: the fitness of the genome's phenotype.


def evaluate(genome):
    # this creates a neural network (phenotype) from the genome
    net = NEAT.NeuralNetwork()
    genome.BuildPhenotype(net)

    # let's input just one pattern to the net, activate it once and get the output
    net.Input( [ 1.0, 0.0, 1.0 ] )
    net.Activate()
    output = net.Output()

    # the output can be used like any other Python iterable. For the purposes of this
    # tutorial, we consider an individual fit if its first output is close to 0.0
    # (the second output is ignored)
    fitness = 1.0 - output[0]
    return fitness
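To see how this fitness behaves, here is a small sketch in plain Python: the closer the first output is to 0.0, the higher the score. The candidate output vectors are made up for illustration.

```python
# Sketch of the fitness mapping used above: fitness = 1.0 - output[0],
# so a first output of exactly 0.0 earns the maximum fitness of 1.0.
def fitness_from_output(output):
    return 1.0 - output[0]

# three hypothetical networks' outputs (second value is ignored)
candidates = [[0.0, 0.7], [0.5, 0.1], [1.0, 0.3]]
scores = [fitness_from_output(o) for o in candidates]
# the first candidate scores 1.0, the last scores 0.0
```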


Now that we have our evaluation function, we can enter the basic generational evolution loop.


for generation in range(100): # run for 100 generations

    # retrieve a list of all genomes in the population
    genome_list = NEAT.GetGenomeList(pop)

    # apply the evaluation function to all genomes
    for genome in genome_list:
        fitness = evaluate(genome)
        genome.SetFitness(fitness)

    # at this point we may output some information regarding the progress of evolution,
    # best fitness, etc. It's also the place for any code that tracks the progress and
    # saves the best genome or the entire population. We skip all of this in the tutorial.

    # advance to the next generation
    pop.Epoch()


The rest of the algorithm is controlled by the parameters we used to initialize the population. One can modify the parameters during evolution through the pop.Parameters object. When a population is saved, its parameters are saved along with it.
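Modifying parameters mid-run can be sketched with a simple stand-in. In MultiNEAT the same idea would mutate fields on pop.Parameters inside the evolution loop; the SimpleParams class and MutationRate field below are stand-ins for illustration, not the library's actual API.

```python
# Sketch: annealing a hypothetical mutation-rate parameter between generations.
# SimpleParams is a stand-in for a parameters object like pop.Parameters.
class SimpleParams:
    def __init__(self):
        self.MutationRate = 0.25

params = SimpleParams()
history = []
for generation in range(5):
    history.append(params.MutationRate)
    params.MutationRate *= 0.5  # e.g. halve the rate each generation
```

Because the parameters object is consulted every generation, changes like this take effect on the very next Epoch call.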