
Create Your Filter
Machine Learning: Parameter optimization for custom IIR filter
The Right Parameters

My research often tends to produce prototypes with many parameters. Adjusting those parameters can be time-consuming and, well, boring. For those occasions I always wanted an automated process that would do it for me. That is why I created a JavaScript prototype for a black box optimizer that is able to figure out good sets of parameters on its own. It needs no neural network or topology; it only needs to know how good the parameters are (cost function). Below is a demo that figures out the coefficients (parameters) of an IIR filter (bi-quad low-pass) by measuring the difference in the time domain on processed random signals (cost function).
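As a rough sketch of that idea (the function and option names here are mine, not the prototype's actual API), the optimizer only ever sees a cost function and a parameter count; even a plain random search fits this interface:

```javascript
// Minimal black box optimization interface (illustrative names).
// The optimizer knows nothing about the problem except the cost function.
function optimize(costFn, dim, { threshold = 1e-8, maxIterations = 10000 } = {}) {
  let best = null, bestCost = Infinity;
  for (let i = 0; i < maxIterations; i++) {
    // Stand-in proposal strategy: pure random search in [-1, 1)^dim.
    const params = Array.from({ length: dim }, () => Math.random() * 2 - 1);
    const cost = costFn(params); // the only feedback available
    if (cost < bestCost) { best = params; bestCost = cost; }
    if (bestCost <= threshold) break; // good enough: success
  }
  return { best, bestCost };
}
```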
Let's Try 'Em All

The optimizer itself manages a collection of different algorithms that cover a wide range of optimization strategies. They focus differently on gradients, statistics, geometry or randomness. During the search for parameters, the optimizer picks one randomly and asks it for parameter proposals, which are evaluated by the cost function and given a cost. This repeats until the threshold for the cost of one parameter set is met (success) or a maximum trials limit is reached (fail). During the process, the optimizer can adapt the chance of how likely each algorithm gets picked and can thus self-optimize its success.
[Figures: optimizer-layout.svg, optimizer-chance.svg]
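A minimal sketch of that loop, assuming each algorithm exposes a propose() method and that the pick chance is adapted by simply rewarding an algorithm whose proposal improves the best cost (the actual prototype's adaptation rule may differ):

```javascript
// Sketch of the pick-and-adapt loop (illustrative names). Each
// algorithm has a pick chance (weight) that grows when one of its
// proposals improves the best cost so far: self-optimization.
function runOptimizer(algorithms, costFn, threshold, maxIterations) {
  const weights = algorithms.map(() => 1.0);
  let best = { params: null, cost: Infinity };
  for (let i = 0; i < maxIterations; i++) {
    // Weighted random pick of one algorithm.
    const total = weights.reduce((sum, w) => sum + w, 0);
    let r = Math.random() * total, picked = 0;
    while ((r -= weights[picked]) > 0) picked++;
    // Ask the picked algorithm for a proposal and evaluate its cost.
    const params = algorithms[picked].propose();
    const cost = costFn(params);
    if (cost < best.cost) {
      best = { params, cost };
      weights[picked] *= 1.1; // reward: raise this algorithm's pick chance
    }
    if (best.cost <= threshold) return best; // success
  }
  return best; // fail: maximum trials limit reached
}
```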
Ants' Knowledge

The algorithms are instances (instead of stateless global functions) that are reused while searching and can memorize things on their own. In addition, a list of evaluated parameter sets (of all algorithms) is stored in the optimizer: the knowledge. Every time an algorithm is asked for proposals, it is passed this knowledge and may use it for creating proposals. The knowledge is inspired by ant colony optimization: it helps to identify and mark promising paths in the parameter space and also allows analyzing multiple previous proposals.
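For illustration, a stateful algorithm instance could bias its proposals toward the best entries of the shared knowledge; here I assume the knowledge is an array of { params, cost } objects sorted ascending by cost (my assumption, not necessarily the prototype's layout):

```javascript
// Sketch of a stateful algorithm instance (not the prototype's actual
// code): it keeps its mutation radius across calls and mutates one of
// the best-known parameter sets from the shared knowledge.
class NeighborhoodSearch {
  constructor(dim, radius = 0.1) {
    this.dim = dim;
    this.radius = radius; // instance state, reused between proposals
  }
  // knowledge: array of { params, cost }, sorted ascending by cost
  propose(knowledge) {
    if (knowledge.length === 0) {
      // Nothing known yet: start from a random point.
      return Array.from({ length: this.dim }, () => Math.random() * 2 - 1);
    }
    // Follow a "promising path": perturb one of the three best sets.
    const pick = Math.floor(Math.random() * Math.min(3, knowledge.length));
    return knowledge[pick].params.map(
      p => p + (Math.random() - 0.5) * this.radius
    );
  }
}
```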
Kill Your Darlings

The optimizer's knowledge does not keep all evaluated parameter sets. Every time the optimizer receives new parameter sets, it sorts them ascending by least cost and splits them at the darlings range. The parameter sets inside the darlings range are kept indefinitely. Everything outside the darlings range is checked for its innovation and removed if the innovation value is lower than the innovation keep threshold. A parameter set's innovation value is set to 1.0 on creation and decreases (multiplied by the innovation abrasion factor) every time the optimizer receives new parameter sets.
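Expressed as code, the pruning step might look like this (illustrative names for the entries and settings):

```javascript
// Sketch of the pruning step (illustrative names). Entries carry an
// innovation value that starts at 1.0 when they are created.
function pruneKnowledge(knowledge, darlingsRange, abrasionFactor, keepThreshold) {
  knowledge.sort((a, b) => a.cost - b.cost); // ascending by least cost
  return knowledge.filter((entry, i) => {
    if (i < darlingsRange) return true; // darling: kept indefinitely
    entry.innovation *= abrasionFactor; // decays on every update
    return entry.innovation >= keepThreshold; // otherwise removed
  });
}
```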
Interestingly, in the prototype example it was even necessary not to keep any result forever in the knowledge (the darlings range was 0). Without eventually removing parameter sets (killing darlings), it was sometimes too hard to escape local optima fast enough.
Filter Design Prototype

The optimizer in the demo finds the randomly chosen coefficients (parameters) of a bi-quad low-pass filter. It creates a session and runs trials, each with an initial random configuration and initial parameter sets for the optimizer. If a trial is not successful after 10000 iterations (stuck in a local optimum), it is cleared and a new trial is started (with a different configuration and initial sets). If the cost is less than 1.0e-8, the trial succeeds and the session is closed. Afterwards, a new session is started with new coefficients.
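The session/trial logic, sketched with hypothetical names and an optimizer interface that does not exist in this exact form in the prototype:

```javascript
// Sketch of the session/trial driver (hypothetical interface). A trial
// is one optimizer run with a fresh random configuration; it is
// abandoned after 10000 iterations and replaced by a new trial until
// the cost threshold of 1.0e-8 is reached.
function runSession(makeRandomOptimizer, costFn) {
  for (let trials = 1; ; trials++) {
    const optimizer = makeRandomOptimizer(); // new configuration + initial sets
    const result = optimizer.run(costFn, { maxIterations: 10000, threshold: 1e-8 });
    if (result.cost < 1e-8) {
      return { ...result, trials }; // success: close the session
    }
    // Not successful: presumably stuck in a local optimum, try again.
  }
}
```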
The cost function computes the cost by creating a random signal and processing it with both the chosen and the proposed coefficients (parameter sets). The sum of the average and the maximum difference of those signals equals the cost.
[Figures: optimizer-demo-processing.svg, optimizer-demo-cost.svg]
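Under this reading, a sketch of the cost function (the coefficient layout [b0, b1, b2, a1, a2] and the signal length are my assumptions):

```javascript
// Direct form I bi-quad filter; the coefficient layout
// [b0, b1, b2, a1, a2] is an assumption for this sketch.
function biquad(signal, [b0, b1, b2, a1, a2]) {
  let x1 = 0, x2 = 0, y1 = 0, y2 = 0;
  return signal.map(x0 => {
    const y0 = b0 * x0 + b1 * x1 + b2 * x2 - a1 * y1 - a2 * y2;
    x2 = x1; x1 = x0; y2 = y1; y1 = y0;
    return y0;
  });
}

// Cost = average + maximum absolute difference between the signal
// filtered with the chosen and with the proposed coefficients.
function cost(chosenCoeffs, proposalCoeffs, length = 512) {
  const signal = Array.from({ length }, () => Math.random() * 2 - 1);
  const a = biquad(signal, chosenCoeffs);
  const b = biquad(signal, proposalCoeffs);
  const diffs = a.map((v, i) => Math.abs(v - b[i]));
  const avg = diffs.reduce((sum, d) => sum + d, 0) / diffs.length;
  return avg + Math.max(...diffs);
}
```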
Besides numeric statistics about ongoing and past sessions, the demo shows the current progress of the optimizer. The left graphic is the parameter search space: it renders the chosen parameters, the best proposal parameters and the optimizer's knowledge. The right graphic is the history of the cost of the best proposals for all algorithms. (It actually also draws a history for each algorithm separately, but that is not important.)

First: parameter search space - chosen (green), best (orange) and knowledge (purple).
Second: best cost history (white) and per-algorithm cost histories (colored).
[Figures: optimizer-parameters.jpg, optimizer-cost.jpg]
Conclusion

Although it has a lot of prototype overhead (naïve JavaScript, single-threaded and competing with the UI, wasteful memory management, etc.), it still looks quite promising. The (computation and time) performance also depends on the perspective: it is not even close to the performance of calculating the coefficients directly, but it is still significantly faster than a human. Most of the time it needs approx. 1.1 trials per session to solve the coefficients, with ca. 2000 iterations for a succeeding trial. Since it can easily be run in multiple threads, it is probably a good candidate for evolutionary algorithms: by running multiple optimizers in parallel, the genes (knowledge) of different species (optimizers with different randomizations) could be shared to shorten the time to succeed. Most settings of the optimizer itself are set up by hand and could be optimized as well.