The simulated annealing optimizer has a number of tunable parameters that affect the results of the registration. This page documents those parameters and the effect each of them has on the outcome.
from numpy import array, pi
# First three entries are translations; the last three are rotations
# (specified in degrees and converted to radians).
search_range = 2*array([0.75, 0.75, 0.75, 1.*pi/180, 1.25*pi/180, 1.25*pi/180])
Max Cooling Iterations
Max Function Evaluations
Number of evaluations at each cooling iteration
Tolerance of cost function
Boltzmann constant (chance of accepting a 'worse' pose)
learn_rate = 0.5  # Not used in the Fast schedule
Quench (for Fast SA schedule)
M (for Fast SA schedule)
N (for Fast SA schedule)
Cooling Schedule (fast, cauchy, boltzmann)
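As a rough illustration of how several of these parameters interact, the sketch below is a minimal 1-D fast-schedule annealer. The function name `fast_sa`, the exponential cooling form, and the toy cost function are assumptions for illustration only, not the actual registration code.

```python
import math
import random

def fast_sa(cost, x0, search_range, T0=1.0, maxiter=400, dwell=20,
            boltzmann=1.0, quench=1.0, feps=1e-6, seed=0):
    """Minimal fast-schedule simulated annealing sketch (1-D)."""
    rng = random.Random(seed)
    x, fx = x0, cost(x0)
    best_x, best_f = x, fx
    last_f = fx
    for k in range(maxiter):             # max cooling iterations
        T = T0 * math.exp(-quench * k)   # fast schedule: exponential cooling
        for _ in range(dwell):           # evaluations per cooling iteration
            cand = x + rng.uniform(-search_range, search_range)
            fc = cost(cand)
            dE = fc - fx
            # Boltzmann acceptance: always take improvements, sometimes
            # accept a 'worse' pose with probability exp(-dE / (k_B * T)).
            if dE <= 0 or rng.random() < math.exp(-dE / (boltzmann * T + 1e-300)):
                x, fx = cand, fc
                if fx < best_f:
                    best_x, best_f = x, fx
        if abs(last_f - best_f) < feps:  # cost-function tolerance reached
            break
        last_f = best_f
    return best_x, best_f
```

Raising `quench` cools the system faster, which cuts run time but increases the chance of getting stuck in a local minimum of the cost function.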
There are some additional parameters that are not directly related to the optimizer, but still influence the outcome.
Image Blur (gsigma) - for gradient-based metric algorithms
gsigma = 1.4
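A minimal sketch of applying the blur, assuming a SciPy-based pipeline (the toy impulse image is a stand-in for the actual registration images):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

gsigma = 1.4
# Toy image: a single impulse stands in for a registration image.
image = np.zeros((64, 64))
image[32, 32] = 1.0
# Gaussian blur spreads the impulse; larger gsigma smooths gradients more.
blurred = gaussian_filter(image, sigma=gsigma)
```

Larger values of `gsigma` smooth the image gradients more aggressively, which can widen the capture range of a gradient-based metric at the cost of fine detail.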
All of the parameters in the system could be tuned automatically by including them as inputs to an optimizer. For this to work, the same images and starting pose would be used for every trial, and the error of the image registration process would serve as the cost function. Averaging the results of several trials might be necessary due to the stochastic nature of the simulated annealing algorithm. After a run of registrations using one set of parameters, the parameter set would be modified and the registration run again. This process would continue with the goal of minimizing the errors of the registration.
The first autotune test will be a proof of concept and will focus only on optimizing the step size for the registration.
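A proof-of-concept sketch of that outer tuning loop might look like the following. Here `registration_error` is a hypothetical stand-in for one full registration run (a noisy function with a known best step size), not the actual pipeline; the sweep averages several trials per candidate to smooth out the stochastic optimizer, then keeps the step size with the lowest mean error.

```python
import random
import statistics

def registration_error(stepsize, rng):
    """Toy stand-in for one registration run: noisy error with a
    minimum near stepsize = 0.5 (purely illustrative)."""
    return (stepsize - 0.5) ** 2 + rng.gauss(0, 0.01)

def mean_error(stepsize, trials=20, seed=1):
    """Average several trials to damp the optimizer's randomness."""
    rng = random.Random(seed)
    return statistics.mean(registration_error(stepsize, rng)
                           for _ in range(trials))

# Outer loop: coarse sweep over candidate step sizes, keeping the
# setting with the lowest averaged registration error.
candidates = [0.1 * k for k in range(1, 10)]
best = min(candidates, key=mean_error)
```

A real autotune run would replace the coarse sweep with a proper optimizer once the proof of concept works, but a grid keeps the first test easy to inspect.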