Evaluation of Derivative-Free Optimizers for Parameter Estimation in Systems Biology
1 Citation
Y. Schälte, P. Stapor, J. Hasenauer. Evaluation of Derivative-Free Optimizers for Parameter Estimation in Systems Biology. IFAC-PapersOnLine 51(19): 98-101, 2018.
2 Summary
Different parameter estimation settings necessitate different optimization techniques. Hence, this paper compared various local and global optimization methods on classic optimization test problems as well as on 8 ODE models. The article focused in particular on the performance of derivative-free optimization (DFO) methods and on whether they are a valuable alternative to gradient-based methods.
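To make the setting concrete, below is a minimal Python sketch of such an ODE parameter estimation problem: a small reaction model is simulated and a sum-of-squares objective is minimized. The model, data, and noise level are illustrative assumptions and are not taken from the paper.

```python
# Minimal sketch of ODE parameter estimation (illustrative only; model,
# data, and noise level are not from the paper).
import numpy as np
from scipy.integrate import solve_ivp

def simulate(theta, t_obs):
    """Simulate a conversion reaction A -> B (rate theta[0]) and
    B -> A (rate theta[1]); the observable is the concentration of B."""
    k1, k2 = theta
    def rhs(t, x):
        a, b = x
        return [-k1 * a + k2 * b, k1 * a - k2 * b]
    sol = solve_ivp(rhs, (0.0, t_obs[-1]), [1.0, 0.0], t_eval=t_obs)
    return sol.y[1]

# synthetic data: known "true" parameters plus additive Gaussian noise
rng = np.random.default_rng(0)
t_obs = np.linspace(0.0, 10.0, 20)
true_theta = np.array([0.8, 0.3])
data = simulate(true_theta, t_obs) + 0.02 * rng.standard_normal(t_obs.size)

def objective(theta):
    """Sum-of-squares distance between simulation and data, i.e. the
    quantity the optimizers minimize."""
    residuals = simulate(theta, t_obs) - data
    return float(np.sum(residuals ** 2))
```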
3 Study outcomes
This section focuses exclusively on the outcomes obtained for the ODE models.
3.1 ODE models are complicated
The results on the classic test problems are not representative of the behavior of the optimization routines on ODE models.
3.2 Performance of DFOs
Gradient-based methods outperformed DFOs on the realistic ODE models in terms of converged runs. However, in cases where gradient-based methods failed, the particle swarm method PSWARM and the evolutionary method CMA-ES were reasonable alternatives (see Fig. 2 in the original publication).
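As a rough illustration of this comparison (reusing the objective from the sketch above), the snippet below contrasts a gradient-based local optimizer with a derivative-free global method. The paper used fmincon, PSWARM, and CMA-ES in MATLAB; scipy's L-BFGS-B and differential_evolution merely stand in here for those classes of methods, and the bounds and start point are assumptions.

```python
# Hedged sketch: gradient-based local optimizer vs. derivative-free
# global method on the objective defined above (stand-ins, not the
# paper's MATLAB optimizers).
from scipy.optimize import minimize, differential_evolution

bounds = [(1e-3, 10.0), (1e-3, 10.0)]  # assumed parameter bounds
x0 = np.array([2.0, 2.0])              # arbitrary start point

# gradient-based local optimization (gradient via internal finite differences)
res_grad = minimize(objective, x0, method="L-BFGS-B", bounds=bounds)

# derivative-free global optimization (evolutionary strategy)
res_dfo = differential_evolution(objective, bounds, seed=1)

print("gradient-based:", res_grad.x, res_grad.fun)
print("derivative-free:", res_dfo.x, res_dfo.fun)
```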
3.3 Finite Differences vs Sensitivity Analysis
In the gradient-based approach, gradients computed via finite differences led to inferior convergence compared to forward/adjoint sensitivity analysis.
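The difference between the two gradient schemes can be sketched as follows, again reusing the model, data, and objective from above. Forward sensitivities augment the ODE with equations for dx/dθ and integrate once, whereas finite differences require extra objective (ODE) evaluations per parameter. This implementation is an illustrative assumption, not the paper's actual code.

```python
# Hedged sketch: objective gradient via forward sensitivities vs. finite
# differences (illustrative; not the paper's implementation).

def objective_with_forward_sensitivities(theta, t_obs, data):
    k1, k2 = theta

    def rhs_aug(t, z):
        # state x = (a, b) and sensitivities s_i = dx/dk_i, stacked as
        # z = [a, b, s1_a, s1_b, s2_a, s2_b]
        a, b = z[0], z[1]
        s1, s2 = z[2:4], z[4:6]
        f = np.array([-k1 * a + k2 * b, k1 * a - k2 * b])
        Jx = np.array([[-k1, k2], [k1, -k2]])   # df/dx
        dfdk1 = np.array([-a, a])               # df/dk1
        dfdk2 = np.array([b, -b])               # df/dk2
        return np.concatenate([f, Jx @ s1 + dfdk1, Jx @ s2 + dfdk2])

    z0 = np.zeros(6)
    z0[0] = 1.0  # a(0) = 1, b(0) = 0, sensitivities start at zero
    sol = solve_ivp(rhs_aug, (0.0, t_obs[-1]), z0, t_eval=t_obs)
    b = sol.y[1]                  # observable
    db_dtheta = sol.y[[3, 5], :]  # d b / d k1 and d b / d k2 at each time
    residuals = b - data
    value = float(np.sum(residuals ** 2))
    gradient = 2.0 * db_dtheta @ residuals
    return value, gradient

def finite_difference_gradient(theta, h=1e-6):
    # central finite differences: two extra ODE solves per parameter;
    # accuracy depends on the step h and on the integrator tolerances,
    # which is the main drawback noted above
    grad = np.zeros_like(theta)
    for i in range(theta.size):
        e = np.zeros_like(theta)
        e[i] = h
        grad[i] = (objective(theta + e) - objective(theta - e)) / (2 * h)
    return grad
```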
4 Study design and evidence level
- The optimizers were tested on 466 classic test problems and 8 ODE models.
- The study advised that special care is needed to adequately compare local and global optimization techniques.
- The study protocol for the ODE models deviated from the one used for the classic test problems, to account for the different requirements of comparing optimizers in the ODE model setting.
- The biological models were fitted using the MATLAB parameter estimation toolbox PESTO.
- Gradient-based optimization was performed with the MATLAB function fmincon, with gradients computed via finite differences as well as via forward/adjoint sensitivity analysis (a Python analogue of the multi-start setup is sketched after this list).
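The paper evaluates convergence across many optimizer runs; a minimal Python analogue of a multi-start gradient-based setup is sketched below. The number of starts, the uniform sampling of start points, and the choice of scipy's L-BFGS-B are assumptions for illustration; the study itself used fmincon within PESTO.

```python
# Hedged multi-start sketch (reusing the objective from above); settings
# are illustrative assumptions, not the study's configuration.
from scipy.optimize import minimize

def multistart(objective, bounds, n_starts=20, seed=0):
    rng = np.random.default_rng(seed)
    lb = np.array([b[0] for b in bounds])
    ub = np.array([b[1] for b in bounds])
    results = []
    for _ in range(n_starts):
        x0 = lb + (ub - lb) * rng.random(lb.size)  # uniform random start
        results.append(minimize(objective, x0, method="L-BFGS-B", bounds=bounds))
    # sort runs by final objective value, as done when counting converged runs
    return sorted(results, key=lambda r: r.fun)

best = multistart(objective, bounds=[(1e-3, 10.0), (1e-3, 10.0)])[0]
print("best fit:", best.x, best.fun)
```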