Optimizer Tuning

The simulated annealing optimizer has a number of tunable parameters that affect the results of the registration. This page documents those parameters and the effect each has on the outcome.

Optimizer Parameters

Step Size:

from numpy import array, pi

# Per-axis step sizes for the six pose parameters; the last three
# entries are angular steps given in degrees and converted to radians
search_range = 2*array([0.75, 0.75, 0.75, 1.*pi/180, 1.25*pi/180, 1.25*pi/180])

Initial Temperature:

T0=0.50

Final Temperature:

Tf=1e-100

Max Cooling Iterations:

maxiter=20

Max Function Evaluations:

maxeval=400

Max Acceptances:

maxaccept=100

Number of evaluations at each cooling iteration:

dwell=30

Tolerance of cost function:

feps=1e-100

Boltzmann constant (chance of accepting a 'worse' pose):

boltzmann=5e-3
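
To make the Boltzmann constant concrete: a candidate pose that raises the cost by some delta is accepted with probability exp(-delta / (boltzmann * T)), the standard Metropolis rule. A minimal sketch, assuming the optimizer applies the rule this way:

import numpy as np

def accept_worse_pose(delta_cost, T, boltzmann=5e-3):
    # Metropolis test: a cost increase is accepted with probability
    # exp(-delta_cost / (boltzmann * T)); a smaller boltzmann value or a
    # lower temperature makes accepting a worse pose less likely
    p = np.exp(-delta_cost / (boltzmann * T))
    return p > np.random.uniform(0.0, 1.0)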

Learning Rate:

learn_rate=0.5  # Not used in the fast schedule

Quench (for Fast SA schedule):

quench=1.5

M (for Fast SA schedule):

m=1.3

N (for Fast SA schedule):

n=0.7
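
The quench, m, and n values only matter for the fast schedule. In SciPy's fast schedule, the temperature after k cooling steps is T0 * exp(-c * k**quench) with c = m * exp(-n * quench); a small sketch for illustration:

from numpy import exp

def fast_schedule_temp(k, T0=0.50, quench=1.5, m=1.3, n=0.7):
    # Temperature after k cooling steps under the 'fast' schedule
    c = m * exp(-n * quench)
    return T0 * exp(-c * k**quench)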

Cooling Schedule (fast, cauchy, boltzmann):

schedule='fast'
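
For reference, the parameter names above match the keyword arguments of SciPy's scipy.optimize.anneal routine (present in older SciPy releases, since removed), which appears to be the optimizer in use. A minimal sketch of assembling the values into one call; cost_function and initial_pose are placeholders, and the mapping of search_range onto the lower/upper step bounds is an assumption:

from numpy import array, pi
from scipy.optimize import anneal  # older SciPy releases only

search_range = 2*array([0.75, 0.75, 0.75, 1.*pi/180, 1.25*pi/180, 1.25*pi/180])

best_pose, retval = anneal(
    cost_function, initial_pose,   # placeholders: metric and starting pose
    schedule='fast', T0=0.50, Tf=1e-100,
    maxiter=20, maxeval=400, maxaccept=100,
    dwell=30, feps=1e-100, boltzmann=5e-3,
    learn_rate=0.5, quench=1.5, m=1.3, n=0.7,
    # Assumption: search_range spans the random step, so bound it at +/- half
    lower=-search_range/2, upper=search_range/2,
)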

There are some additional parameters that are not directly related to the optimizer but that still influence the outcome.

Additional Parameters

Image Blur (gsigma) - for gradient-based metric algorithms:

gsigma = 1.4
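
A minimal sketch of applying the blur with SciPy's Gaussian filter; the image array fixed_image is a placeholder:

from scipy.ndimage import gaussian_filter

gsigma = 1.4
# Smooth the image before evaluating a gradient-based metric
blurred = gaussian_filter(fixed_image, sigma=gsigma)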

Image Size:

NOT IMPLEMENTED

Autotune

All of the parameters in the system could be tuned automatically by including them as inputs to an outer optimizer. For this to work, the same images and starting pose would be used for every run, and the error of the image registration process would serve as the cost function. Averaging the results of several trials might be necessary because of the stochastic nature of the simulated annealing algorithm. After a run of registrations using one set of parameters, the parameter set would be modified and the registration run again, repeating with the goal of minimizing the registration error.

The first autotune test will be a proof of concept and will focus only on optimizing the step size for the registration.
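
A sketch of what that proof of concept could look like; run_registration is a hypothetical helper that performs one registration with the given step sizes and returns its final pose error, and the outer optimizer here is Nelder-Mead rather than anything the project has committed to:

import numpy as np
from scipy.optimize import minimize

base_range = 2*np.array([0.75, 0.75, 0.75,
                         1.*np.pi/180, 1.25*np.pi/180, 1.25*np.pi/180])

def mean_registration_error(scale, n_trials=5):
    # Average several trials to smooth out the stochastic annealing results
    errors = [run_registration(search_range=scale*base_range)  # hypothetical helper
              for _ in range(n_trials)]
    return np.mean(errors)

result = minimize(mean_registration_error, x0=1.0, method='Nelder-Mead')
print('Best step-size scale:', result.x)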