seisflows.tests.test_optimize
Test the optimization module by setting up a Rosenbrock optimization problem and running line searches.
Module Contents
Functions
- rosenbrock_objective_function: Rosenbrock objective function used to test the optimization module.
- rosenbrock_gradient: Define the gradient of the Rosenbrock function, i.e., the gradient of the rosenbrock_objective_function.
- setup_optimization_vectors: Create vectors required for a Rosenbrock optimization problem, used to test the line search and optimization algorithms.
- test_gradient_compute_direction: Ensure that gradient computes the correct search direction.
- test_gradient_initialize_search: Test that the optimization module properly initializes the line search.
- test_gradient_update_line_search: Ensure that updating the line search works as advertised, i.e., we get a status on how the line search should proceed.
- test_bracket_line_search: Run a small optimization problem to reduce the Rosenbrock objective function using the bracketing line search.
- test_optimize_checkpoint_reload: Checkpointing is used to store the status of the optimization module; test that it works as advertised.
- test_optimize_attempt_line_search_restart: Make sure we can tell when we're supposed to attempt a line search restart.
- test_line_search_recover_from_failure: Recover a line search from a checkpointed state after a simulated job failure.
- _test_inversion_optimization_problem_general: Run a barebones inversion workflow rather than a single line search evaluation.
- test_inversion_optimization_problem_with_gradient: Wrapper function to test the Gradient descent optimization problem.
- test_inversion_optimization_problem_with_LBFGS: Wrapper function to test the L-BFGS optimization problem.
- test_inversion_optimization_problem_with_NLCG: Wrapper function to test the NLCG optimization problem.
- seisflows.tests.test_optimize.rosenbrock_objective_function(x)
Rosenbrock objective function used to test the optimization module. The Rosenbrock is defined mathematically as:
f(x,y) = (a-x)^2 + b(y-x^2)^2
where the global minimum is at (x,y) == (a, a^2) and typical constant values are: a==1, b==100
https://en.wikipedia.org/wiki/Rosenbrock_function
- Parameters
x (np.array) – input model [x, y] to feed into the objective function
- Return type
float
- Returns
misfit value for the given input model
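As a minimal sketch (not the test module's verbatim source), the objective function described above can be implemented in a few lines, assuming the typical constants a=1, b=100:

```python
import numpy as np

def rosenbrock_objective_function(x):
    """Rosenbrock misfit f(x, y) = (a - x)^2 + b(y - x^2)^2 with a=1, b=100."""
    a, b = 1.0, 100.0
    return (a - x[0]) ** 2 + b * (x[1] - x[0] ** 2) ** 2

# The global minimum sits at (x, y) == (a, a^2) == (1, 1), where the misfit is 0
print(rosenbrock_objective_function(np.array([1.0, 1.0])))  # -> 0.0
```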
- seisflows.tests.test_optimize.rosenbrock_gradient(x)
Define the gradient of the Rosenbrock function, i.e., the gradient of the rosenbrock_objective_function
- Parameters
x (np.array) – input model [x, y] to feed into the gradient of the obj function
- Return type
np.array
- Returns
gradient vector of the Rosenbrock function
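A sketch of the analytic gradient, with partial derivatives df/dx = -2(a-x) - 4bx(y-x^2) and df/dy = 2b(y-x^2). The starting model [-1.2, 1.0] below is an assumption, chosen because it reproduces the expected gradient [-215.6, -88.] quoted in test_gradient_compute_direction:

```python
import numpy as np

def rosenbrock_gradient(x):
    """Analytic gradient of the Rosenbrock function with a=1, b=100."""
    a, b = 1.0, 100.0
    dfdx = -2 * (a - x[0]) - 4 * b * x[0] * (x[1] - x[0] ** 2)
    dfdy = 2 * b * (x[1] - x[0] ** 2)
    return np.array([dfdx, dfdy])

# At the classic starting point [-1.2, 1.0] the gradient evaluates to
# [-215.6, -88.] (to float precision); the gradient-descent search
# direction is simply its negative
print(rosenbrock_gradient(np.array([-1.2, 1.0])))
```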
- seisflows.tests.test_optimize.setup_optimization_vectors(tmpdir)
Create vectors required for a Rosenbrock optimization problem to be used to test the line search and optimization algorithms.
Note
The optimization module requires a model (m_new), corresponding misfit of that model (f_new), the gradient of that misfit function (g_new) and a search direction (p_new; calculated by each individual optimization sub-class)
- seisflows.tests.test_optimize.test_gradient_compute_direction(tmpdir, setup_optimization_vectors)
Ensure that gradient computes the correct search direction. The gradient vector created by the Rosenbrock function is expected to be [-215.6, -88.]
- seisflows.tests.test_optimize.test_gradient_initialize_search(tmpdir, setup_optimization_vectors)
Test that the optimization module properly initializes the line search by providing a new model and misfit value.
Initializing the search requires 4 vectors: 'm_new' (model), 'g_new' (gradient), 'p_new' (search direction) and 'f_new' (misfit)
- seisflows.tests.test_optimize.test_gradient_update_line_search(tmpdir, setup_optimization_vectors)
Ensure that updating the line search works as advertised, i.e., we get a status on how the line search should proceed
- seisflows.tests.test_optimize.test_bracket_line_search(tmpdir, setup_optimization_vectors)
Run a small optimization problem to reduce the Rosenbrock objective function using the bracketing ('Bracket') line search method. Checks that the line search takes only a few steps and that the reduced misfit value is as expected.
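To illustrate what a single line search evaluation looks like, here is a simple backtracking search with an Armijo sufficient-decrease test; this is an illustrative stand-in, not SeisFlows' Bracket method, and the helper name and constants are hypothetical:

```python
import numpy as np

def rosenbrock(x):
    return (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2

def rosenbrock_grad(x):
    return np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
                     200 * (x[1] - x[0] ** 2)])

def backtracking_line_search(m, f, g, p, alpha=1.0, shrink=0.5, c=1e-4):
    """Shrink the trial step until the Armijo sufficient-decrease test holds."""
    for _ in range(50):
        m_try = m + alpha * p  # trial model for this step length
        if rosenbrock(m_try) <= f + c * alpha * g.dot(p):
            return m_try, alpha  # status: line search passed
        alpha *= shrink
    raise RuntimeError("line search failed")  # status: attempt a restart

m = np.array([-1.2, 1.0])
f, g = rosenbrock(m), rosenbrock_grad(m)
m_try, alpha = backtracking_line_search(m, f, g, p=-g)
assert rosenbrock(m_try) < f  # misfit reduced after a handful of trial steps
```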
- seisflows.tests.test_optimize.test_optimize_checkpoint_reload(tmpdir)
Checkpointing is used to store the status of the optimization module in the case of a failed or re-started workflow. Test that this works as advertised
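A checkpoint/reload round trip can be sketched with numpy archives; `checkpoint` and `load_checkpoint` are hypothetical helpers and the saved keys mirror the vector names above, not SeisFlows' actual checkpoint format:

```python
import os
import tempfile
import numpy as np

def checkpoint(path, **state):
    """Save optimization state (model, misfit, step count) to disk."""
    np.savez(path, **state)

def load_checkpoint(path):
    """Reload a saved state, e.g., after a failed or restarted workflow."""
    with np.load(path) as data:
        return {key: data[key] for key in data.files}

path = os.path.join(tempfile.gettempdir(), "optimize_checkpoint.npz")
checkpoint(path, m_new=np.array([-1.2, 1.0]), f_new=np.array(24.2),
           step_count=np.array(3))
state = load_checkpoint(path)
print(int(state["step_count"]))  # -> 3
```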
- seisflows.tests.test_optimize.test_optimize_attempt_line_search_restart(tmpdir, setup_optimization_vectors)
Make sure we can tell when we’re supposed to attempt a line search restart, i.e., when the gradient and search direction are the same
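One way to detect that condition is to compare the angle between the gradient and the search direction; `attempt_line_search_restart` and its threshold are hypothetical, shown only to illustrate the parallel-vector check:

```python
import numpy as np

def attempt_line_search_restart(g, p, threshold=0.99):
    """Flag a restart when the search direction p is (anti-)parallel to the
    gradient g, i.e., the scheme has degenerated to plain steepest descent."""
    cos_angle = abs(np.dot(g, p)) / (np.linalg.norm(g) * np.linalg.norm(p))
    return bool(cos_angle > threshold)

g = np.array([-215.6, -88.0])
print(attempt_line_search_restart(g, p=-g))                 # parallel -> True
print(attempt_line_search_restart(g, np.array([1.0, -1.0])))  # oblique -> False
```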
- seisflows.tests.test_optimize.test_line_search_recover_from_failure(tmpdir, setup_optimization_vectors)
Run a small optimization problem to reduce the Rosenbrock objective function using the bracketing ('Bracket') line search method. Simulates a job failure during the line search and attempts to recover the line search from a checkpointed state, mimicking a real-life inversion where a line search might fail a job (forward simulation), and we do not want to have to run the line search from the beginning.
- seisflows.tests.test_optimize._test_inversion_optimization_problem_general(optimize, iterations=250)
Rather than run a single line search evaluation, as all the previous tests have done, we want to run an inversion workflow to find a best-fitting model. To do this we essentially have to mimic the inversion workflow, but with barebones functions.
This function is written to be general, other tests should populate the optimize input parameter with instantiated Optimization modules.
Note
We do not save m_try to disk each time it is evaluated because it is small. However in real workflows, m_try must be saved to disk rather than passed in memory because it is likely to be a large vector.
Note
This replaces workflow.test_optimize from the original code
- Parameters
optimize (module) – specific SeisFlows optimization module to test
iterations (int) – number of iterations to run. Defaults to 250
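The barebones loop described above can be sketched as follows, using plain steepest descent in place of an instantiated Optimization module; `run_inversion` and its backtracking step rule are hypothetical, not the SeisFlows API:

```python
import numpy as np

def rosenbrock(x):
    return (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2

def rosenbrock_grad(x):
    return np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
                     200 * (x[1] - x[0] ** 2)])

def run_inversion(m, iterations=250):
    """Barebones inversion loop: compute direction, line search, accept model."""
    for _ in range(iterations):
        f, g = rosenbrock(m), rosenbrock_grad(m)
        if np.linalg.norm(g) < 1e-12:  # already converged
            break
        p = -g  # steepest-descent direction; the real tests swap in L-BFGS/NLCG
        alpha = 1.0
        while rosenbrock(m + alpha * p) >= f and alpha > 1e-20:
            alpha *= 0.5  # crude backtracking on the trial model m_try
        m = m + alpha * p  # accept the trial model as the new model
    return m

m_final = run_inversion(np.array([-1.2, 1.0]))
print(rosenbrock(m_final))  # misfit reduced well below the starting value of 24.2
```

Here m_try is kept in memory because the Rosenbrock model is tiny; as the note above says, real workflows write m_try to disk.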
- seisflows.tests.test_optimize.test_inversion_optimization_problem_with_gradient(tmpdir, setup_optimization_vectors)
Wrapper function to test the Gradient descent optimization problem
- seisflows.tests.test_optimize.test_inversion_optimization_problem_with_LBFGS(tmpdir, setup_optimization_vectors)
Wrapper function to test the L-BFGS descent optimization problem
- seisflows.tests.test_optimize.test_inversion_optimization_problem_with_NLCG(tmpdir, setup_optimization_vectors)
Wrapper function to test the NLCG (nonlinear conjugate gradient) optimization problem