Write a class that performs basic numerical optimization using something like the Nelder–Mead simplex method (derivative-free) or a Newton-type descent. The class takes in a function to optimize, the initial conditions, a data set (e.g. images paired with the desired output), and an objective function.
The objective function returns a single value that measures how well the function performs the task. The optimize method applies the function to the data set with a given set of parameters, then uses the objective function to score those results and return a value.
For example:
optimal_params = optimizer.optimize(myMethod,           # the routine to optimize
                                    initial_conditions, # the initial set of params as a dict
                                    data_set,           # combination of data and desired results
                                    objective_function) # a function that evaluates the results
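A minimal sketch of what such a class could look like, using a simplified Nelder–Mead simplex loop written from scratch (no SciPy). All names here (Optimizer, the dict-based parameter passing, the (inputs, desired) shape of data_set) are assumptions chosen to match the call signature above, not a reference implementation:

```python
class Optimizer:
    """Derivative-free optimization of a routine's parameters
    via a simplified Nelder-Mead simplex (sketch, not a library API)."""

    def __init__(self, max_iter=500, tol=1e-8):
        self.max_iter = max_iter
        self.tol = tol

    def optimize(self, method, initial_conditions, data_set, objective_function):
        # Fix a parameter ordering so dicts map onto simplex vertices.
        keys = sorted(initial_conditions)
        x0 = [initial_conditions[k] for k in keys]

        def score(x):
            params = dict(zip(keys, x))
            inputs, desired = data_set  # assumed shape: (data, desired results)
            results = [method(sample, params) for sample in inputs]
            return objective_function(results, desired)

        best = self._nelder_mead(score, x0)
        return dict(zip(keys, best))

    def _nelder_mead(self, f, x0, step=0.1):
        n = len(x0)
        # Initial simplex: x0 plus one perturbed vertex per dimension.
        simplex = [list(x0)]
        for i in range(n):
            v = list(x0)
            v[i] += step
            simplex.append(v)
        scores = [f(v) for v in simplex]
        for _ in range(self.max_iter):
            # Order vertices from best (lowest score) to worst.
            order = sorted(range(n + 1), key=lambda i: scores[i])
            simplex = [simplex[i] for i in order]
            scores = [scores[i] for i in order]
            if abs(scores[-1] - scores[0]) < self.tol:
                break
            # Centroid of all vertices except the worst.
            centroid = [sum(v[i] for v in simplex[:-1]) / n for i in range(n)]
            worst = simplex[-1]
            # Reflect the worst vertex through the centroid.
            refl = [c + (c - w) for c, w in zip(centroid, worst)]
            fr = f(refl)
            if scores[0] <= fr < scores[-2]:
                simplex[-1], scores[-1] = refl, fr
            elif fr < scores[0]:
                # Expansion: try moving further in the same direction.
                exp = [c + 2 * (c - w) for c, w in zip(centroid, worst)]
                fe = f(exp)
                if fe < fr:
                    simplex[-1], scores[-1] = exp, fe
                else:
                    simplex[-1], scores[-1] = refl, fr
            else:
                # Contraction: pull the worst vertex toward the centroid.
                con = [c + 0.5 * (w - c) for c, w in zip(centroid, worst)]
                fc = f(con)
                if fc < scores[-1]:
                    simplex[-1], scores[-1] = con, fc
                else:
                    # Shrink all vertices toward the best one.
                    best = simplex[0]
                    simplex = [best] + [
                        [b + 0.5 * (x - b) for b, x in zip(best, v)]
                        for v in simplex[1:]
                    ]
                    scores = [scores[0]] + [f(v) for v in simplex[1:]]
        return simplex[0]
```

A possible usage, fitting a line y = a*x + b to data with a sum-of-squared-error objective (model and sse are illustrative names):

```python
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]  # generated from y = 2x + 1

def model(x, params):
    return params['a'] * x + params['b']

def sse(results, desired):
    return sum((r - d) ** 2 for r, d in zip(results, desired))

optimizer = Optimizer()
optimal_params = optimizer.optimize(model, {'a': 0.0, 'b': 0.0}, (xs, ys), sse)
```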
See:
http://en.wikipedia.org/wiki/Nelder%E2%80%93Mead_method
http://en.wikipedia.org/wiki/Levenberg%E2%80%93Marquardt_algorithm
Anonymous