Optimization with logging via a hook.
Author: Jing Wu and Frank Dellaert
def gtsam.utils.logging_optimizer.optimize(optimizer, check_convergence, hook)
Given an optimizer and a convergence check, iterate until convergence.
After each iteration, hook(optimizer, error) is called.
After the function returns, use the optimizer's values and the errors recorded by the hook to inspect the result.
Arguments:
optimizer (T): must provide iterate() and error() methods.
check_convergence: T * float * float -> bool
hook ([T, float] -> None): hook function to record the error, called as hook(optimizer, error).
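For illustration, a minimal sketch of driving optimize directly; the toy Pose2 prior problem, the convergence test built from the optimizer parameters, and the variable names are assumptions for this example, not part of the documented API:
```python
import gtsam
from gtsam.utils.logging_optimizer import optimize

# Toy problem (illustrative): a single Pose2 prior pulls the estimate to the origin.
graph = gtsam.NonlinearFactorGraph()
graph.add(gtsam.PriorFactorPose2(0, gtsam.Pose2(0, 0, 0), gtsam.noiseModel.Unit.Create(3)))
initial = gtsam.Values()
initial.insert(0, gtsam.Pose2(1.0, 2.0, 0.3))

params = gtsam.GaussNewtonParams()
optimizer = gtsam.GaussNewtonOptimizer(graph, initial, params)

errors = []  # the hook appends every error value it receives

def hook(optimizer, error):
    errors.append(error)

def check_convergence(optimizer, current_error, new_error):
    # Stop at the iteration cap or when the error decrease is within the tolerances.
    return (optimizer.iterations() >= params.getMaxIterations()) or \
        gtsam.checkConvergence(params.getRelativeErrorTol(),
                               params.getAbsoluteErrorTol(),
                               params.getErrorTol(),
                               current_error, new_error)

optimize(optimizer, check_convergence, hook)
print("recorded errors:", errors)
print("solution:", optimizer.values())
```
After optimize returns, the result is read from optimizer.values() and the per-iteration errors from whatever the hook recorded.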
Definition at line 58 of file logging_optimizer.py.
def gtsam.utils.logging_optimizer.optimize_using(OptimizerClass, hook, *args) -> gtsam.Values
Wraps the constructor and "optimize()" call for an Optimizer together and adds an iteration
hook.
Example usage:
```python
def hook(optimizer, error):
    print("iteration {:}, error = {:}".format(optimizer.iterations(), error))

solution = optimize_using(gtsam.GaussNewtonOptimizer, hook, graph, init, params)
```
The iteration hook's arguments are (optimizer, error) and its return type should be None.
Args:
OptimizerClass (T): A NonlinearOptimizer class (e.g. GaussNewtonOptimizer,
LevenbergMarquardtOptimizer)
hook ([T, double] -> None): Callback invoked after each iteration with (optimizer,
error); it should return None.
*args: Arguments that would be passed into the OptimizerClass constructor, usually:
graph, init, [params]
Returns:
(gtsam.Values): A Values object representing the optimization solution.
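As a further sketch, the hook is often a closure that collects errors for later inspection or plotting; the toy graph, noise model, and variable names below are assumptions for illustration:
```python
import gtsam
from gtsam.utils.logging_optimizer import optimize_using

# Toy problem (illustrative): a single Pose2 prior.
graph = gtsam.NonlinearFactorGraph()
graph.add(gtsam.PriorFactorPose2(0, gtsam.Pose2(0, 0, 0),
                                 gtsam.noiseModel.Isotropic.Sigma(3, 0.1)))
init = gtsam.Values()
init.insert(0, gtsam.Pose2(2.0, -1.0, 0.5))
params = gtsam.LevenbergMarquardtParams()

errors = []  # collects every error value passed to the hook
solution = optimize_using(gtsam.LevenbergMarquardtOptimizer,
                          lambda optimizer, error: errors.append(error),
                          graph, init, params)

print("recorded errors:", errors)
print("optimized pose:", solution.atPose2(0))
```
The lambda works as a hook because list.append returns None, which matches the expected return type.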
Definition at line 20 of file logging_optimizer.py.