is not keeping all evaluated parameter sets.
Every time the optimizer gets new parameter sets, it sorts them in ascending order and splits them at the darlings range. The parameter sets inside the darlings range are kept forever. Every parameter set outside the darlings range is checked for its innovation value and is removed if that value is lower than the innovation keep threshold.
Each parameter set is assigned an innovation value, which decreases (it is multiplied by the innovation abrasion factor) every time the optimizer gets new parameter sets.
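The pruning rule described above could be sketched roughly as follows. This is a minimal illustration, not the project's actual code: the names (`update_knowledge`, `DARLINGS_RANGE`, the threshold and abrasion constants), the initial innovation value of 1.0, and the lower-score-is-better ordering are all assumptions.

```python
DARLINGS_RANGE = 3             # best N sets are kept forever (assumed value)
INNOVATION_KEEP_THRESHOLD = 0.1
INNOVATION_ABRASION_FACTOR = 0.9
INITIAL_INNOVATION = 1.0       # assumed starting innovation value


def update_knowledge(knowledge, new_sets):
    """Merge new (params, score) pairs into the knowledge and prune it."""
    # Fresh parameter sets enter with full innovation.
    knowledge.extend(
        {"params": p, "score": s, "innovation": INITIAL_INNOVATION}
        for p, s in new_sets
    )
    # Sort ascending by score (assuming lower is better) and split
    # at the darlings range.
    knowledge.sort(key=lambda entry: entry["score"])
    darlings = knowledge[:DARLINGS_RANGE]   # kept forever
    rest = knowledge[DARLINGS_RANGE:]
    # Outside the darlings range: decay the innovation value and drop
    # every set that falls below the keep threshold.
    survivors = []
    for entry in rest:
        entry["innovation"] *= INNOVATION_ABRASION_FACTOR
        if entry["innovation"] >= INNOVATION_KEEP_THRESHOLD:
            survivors.append(entry)
    return darlings + survivors
```

Calling `update_knowledge` repeatedly lets non-darling sets fade out: their innovation shrinks geometrically until they cross the threshold and are removed, while the darlings survive every round.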
Interestingly, in the prototype example it was not even necessary to keep any result forever in the knowledge (the darlings range was 0). Without eventually removing parameter sets (killing darlings), it was sometimes too hard to escape local optima fast enough.