Performance of objective functions and optimization procedures for parameter estimation in system biology models
__NUMBEREDHEADINGS__
=== Citation ===
Andrea Degasperi, Dirk Fey & Boris N. Kholodenko, [https://www.nature.com/articles/s41540-017-0023-2.pdf Performance of objective functions and optimisation procedures for parameter estimation in system biology models], 2017, npj Systems Biology and Applications, volume 3, Article number: 20
=== Summary ===
In systems biology, relative data are common. In ODE-based models, this is addressed either by introducing scaling parameters or by data-driven normalization, both of which bring data and simulations onto the same scale. The article shows that data-driven normalization improves optimization performance and does not aggravate non-identifiability problems compared to a scaling-factor approach. Furthermore, it reports that hybrid optimization methods, which combine stochastic global and deterministic local search, outperform purely deterministic local gradient-based strategies.
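To make the two formulations concrete, here is a minimal sketch in notation of our own choosing (not taken from the paper): let <math>y_i</math> be the measured relative data at time points <math>t_i</math> and <math>x(t_i,\theta)</math> the corresponding model observable. The scaling-factor approach estimates an extra parameter <math>s</math> alongside the dynamic parameters <math>\theta</math>, whereas data-driven normalization divides both data and simulation by a reference value, for example their respective maxima, before comparing them:

<math>
\chi^2_{\mathrm{scaling}}(\theta, s) = \sum_i \big(y_i - s\,x(t_i,\theta)\big)^2,
\qquad
\chi^2_{\mathrm{norm}}(\theta) = \sum_i \left(\frac{y_i}{\max_j y_j} - \frac{x(t_i,\theta)}{\max_j x(t_j,\theta)}\right)^2.
</math>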
=== Study outcomes ===
− | |||
− | |||
− | |||
==== Identifiability ====
Employing data-driven normalization instead of scaling factors improved the identifiability of the dynamic parameters; the authors provide a computational example demonstrating how this occurs.
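One classic way such a degeneracy can arise (an illustrative sketch under assumptions of our own, not the paper's computational example): if a dynamic parameter <math>k</math> enters the observable only as a prefactor, <math>x(t,\theta) = k\,\tilde{x}(t,\tilde{\theta})</math>, then with a free scaling factor <math>s</math> the data only constrain the product

<math>
s\,x(t,\theta) = (s\,k)\,\tilde{x}(t,\tilde{\theta}),
</math>

so <math>s</math> and <math>k</math> cannot be estimated individually. Data-driven normalization does not introduce the extra parameters <math>s</math> (typically one per dataset or observable) in the first place, and thereby avoids this kind of compensation between scaling and dynamic parameters.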
− | |||
− | |||
− | |||
− | |||
− | |||
− | ==== | ||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
==== Convergence Speed ====
As visualized in Fig. 4 and Fig. 5 of the original publication, convergence speed was consistently better with data-driven normalization than with scaling factors. Combining data-driven normalization with the hybrid optimization algorithm GLSDC gave the best performance, especially for problems with many parameters.
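The following is a minimal, generic sketch of the hybrid idea, i.e. stochastic global sampling followed by deterministic local refinement. It is not the authors' GLSDC implementation; the toy two-parameter model, the parameter bounds and the normalization-by-maximum choice are purely illustrative.

<syntaxhighlight lang="python">
# Generic hybrid strategy: random global sampling of parameter vectors,
# followed by deterministic local least-squares refinement of the best candidates.
# Illustrative sketch only, not the GLSDC algorithm used in the paper.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)

# Hypothetical relative data: a two-exponential observable, normalized to its maximum.
t_obs = np.linspace(0.0, 10.0, 25)
y_raw = np.exp(-0.3 * t_obs) + np.exp(-2.0 * t_obs) + 0.02 * rng.normal(size=t_obs.size)
y_obs = y_raw / y_raw.max()                    # data-driven normalization of the data

def residuals(theta):
    """Residuals between normalized data and the normalized model simulation."""
    k1, k2 = theta
    sim = np.exp(-k1 * t_obs) + np.exp(-k2 * t_obs)
    return y_obs - sim / sim.max()             # normalize the simulation the same way

lower, upper = np.array([1e-2, 1e-2]), np.array([1e2, 1e2])

# Global phase: log-uniform random sampling of candidate parameter vectors.
samples = np.exp(rng.uniform(np.log(lower), np.log(upper), size=(200, 2)))
scores = [0.5 * np.sum(residuals(s) ** 2) for s in samples]
best_starts = samples[np.argsort(scores)[:5]]

# Local phase: deterministic least-squares refinement of the most promising candidates.
fits = [least_squares(residuals, x0, bounds=(lower, upper)) for x0 in best_starts]
best = min(fits, key=lambda f: f.cost)
print("estimated rate constants:", best.x, "objective value:", best.cost)
</syntaxhighlight>

The point of the two phases is that cheap global exploration locates promising regions of parameter space, while the deterministic local solver converges quickly within them; actual hybrid algorithms such as GLSDC use more elaborate global search strategies than plain random sampling.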
=== Study design and evidence level ===
* The claims are tested on three parameter estimation problems with varying numbers of parameters.
* The three main algorithms tested were GLSDC, LevMar SE and LevMar FD, each combined with scaling factors and with data-driven normalization; each configuration was run 96 times.
* Although the previously best-performing method, LSQNONLIN with sensitivity equations as found in [[Lessons Learned from Quantitative Dynamical Modeling in Systems Biology]], is mentioned, the comparison with GLSDC was restricted to the authors' own implementation of the algorithm.
* The study used least squares instead of a likelihood as the objective function, omitting error-model fits (see the sketch below).
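For reference, a brief sketch of that distinction in notation of our own choosing (not the paper's): plain least squares compares data and simulation directly, whereas a likelihood-based objective additionally includes an error model, for example observation-wise standard deviations <math>\sigma_i(\phi)</math> whose parameters <math>\phi</math> are estimated together with the dynamic parameters <math>\theta</math>:

<math>
\chi^2_{\mathrm{LS}}(\theta) = \sum_i \big(y_i - x(t_i,\theta)\big)^2,
\qquad
-2\log L(\theta,\phi) = \sum_i \left[\frac{\big(y_i - x(t_i,\theta)\big)^2}{\sigma_i(\phi)^2} + \log\!\big(2\pi\,\sigma_i(\phi)^2\big)\right].
</math>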
− | |||
− | |||
− | |||
− | |||
− | |||
− | * The | ||
− | |||
− | * . | ||
− | |||
− | |||
− | |||
− | |||
− | * The | ||
− | |||
− | |||
=== Further comments and aspects ===
* In addition to the performance advantages of avoiding scaling factors, the article also states that the amount of overfitting is reduced.
* The notion of practical identifiability deviates from other literature, see e.g. [https://doi.org/10.1093/bioinformatics/btp358 Structural and practical identifiability analysis of partially observed dynamical models by exploiting the profile likelihood] (a brief sketch of that notion follows below).
* The objective function values in Fig. 4 and Fig. 5 are not straightforward to interpret, since both stochastic and multi-start algorithms are analyzed.
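For context, the profile-likelihood notion of practical identifiability used in that reference can be sketched as follows (notation ours): the profile of a parameter <math>\theta_i</math> is obtained by fixing it to a value <math>c</math> and re-optimizing all remaining parameters,

<math>
\mathrm{PL}_i(c) = \min_{\theta\,:\,\theta_i = c} \chi^2(\theta),
</math>

and <math>\theta_i</math> is called practically identifiable if this profile exceeds a chosen confidence threshold on both sides of the optimum, i.e. if the resulting confidence interval is finite.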