Oral Presentations - Abstracts

Model 1 - Cardiac Excitation

James Ritchie and Iain Murray

Obtaining first-place estimates for Model 1:

We used a Bayesian approach to infer the parameters of Cside 2018 model 1. Models such as this often cause problems for standard MCMC approaches due to the strong constraints on the posterior. We were able to develop a robust procedure for inferring the parameters using careful initialisation and affine invariant sampling, without being overly adapted to the specific problem.

I'll describe how we implemented our solution and how we checked it with simulations, and discuss some drawbacks and possible improvements.
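
As a concrete illustration (not the authors' competition code), here is a minimal sketch of affine-invariant ensemble sampling with careful initialisation, using the emcee package, which is one widely used implementation of this sampler; the ODE right-hand side `rhs`, initial state `y0`, starting point `theta_init`, and the `times`, `data` and `sigma` arrays are all placeholders.

    import numpy as np
    import emcee
    from scipy.integrate import solve_ivp

    def simulate(theta, times):
        # Placeholder: integrate the three-state cardiac excitation ODEs for
        # parameter vector theta and return the states at the observation times.
        sol = solve_ivp(lambda t, y: rhs(t, y, theta), (times[0], times[-1]),
                        y0, t_eval=times)
        return sol.y.T

    def log_posterior(theta, times, data, sigma):
        if np.any(theta <= 0):                      # crude positivity constraint (flat prior otherwise)
            return -np.inf
        resid = data - simulate(theta, times)
        return -0.5 * np.sum((resid / sigma) ** 2)  # Gaussian noise log-likelihood up to a constant

    # Careful initialisation: start all walkers in a tight ball around a point
    # already known to lie in the narrow high-posterior region.
    ndim, nwalkers = 12, 64
    p0 = theta_init * (1.0 + 1e-3 * np.random.randn(nwalkers, ndim))

    sampler = emcee.EnsembleSampler(nwalkers, ndim, log_posterior,
                                    args=(times, data, sigma))
    sampler.run_mcmc(p0, 5000, progress=True)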

Gary Mirams, Chon Lok Lei and Michael Clerx

Statistical Inference for the Cardiac Excitation Model using Pints:

Model 1 is a simple non-linear dynamical system of three ODEs describing cardiac excitation, with 12 parameters. A simplistic fitting approach, such as the Nelder-Mead simplex algorithm in the original model parameter space, does not converge to the global optimum. We used Pints (Probabilistic Inference on Noisy Time Series), an open-source Python package (https://github.com/pints-team/pints) we have developed, which interfaces to a range of optimisers and Monte Carlo methods. We used synthetic data based on the default parameters to study the model and define our likelihood function. We will show how we used 1D slices to explore the shape of the log-likelihood response surface, allowing us to pick a suitable parameter transformation and boundaries that simplify the inference problem. The procedure first used CMA-ES (a global optimisation technique) to find good starting points, and then an adaptive covariance MCMC method (with a uniform prior) to explore the space more fully, from which we picked the maximum posterior density sample as our competition entry. In the synthetic data study, we recovered the default parameter set with a reasonable estimate of a precision matrix (inverse of the covariance matrix) based on the MCMC chain covariance; we then applied the same techniques to the competition data.
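
A minimal sketch of this pipeline, with class names following the Pints documentation; the model wrapper, its ODE solver, the `times`/`values` data arrays, the prior bounds and the starting point `x0` are placeholders, and the parameter transformation step is omitted.

    import pints

    class CardiacModel(pints.ForwardModel):
        """Placeholder wrapper around the three-ODE cardiac excitation model."""
        def n_parameters(self):
            return 12
        def n_outputs(self):
            return 3
        def simulate(self, parameters, times):
            return solve_cardiac_odes(parameters, times)   # hypothetical solver, shape (len(times), 3)

    problem = pints.MultiOutputProblem(CardiacModel(), times, values)
    log_likelihood = pints.GaussianLogLikelihood(problem)   # appends 3 noise parameters (15 in total)
    log_prior = pints.UniformLogPrior(lower, upper)         # boundaries informed by the 1D slices
    log_posterior = pints.LogPosterior(log_likelihood, log_prior)

    # Global optimisation with CMA-ES to find a good starting point
    opt = pints.OptimisationController(log_posterior, x0, method=pints.CMAES)
    x_best, f_best = opt.run()

    # Adaptive covariance MCMC started from the CMA-ES optimum
    mcmc = pints.MCMCController(log_posterior, 3, [x_best] * 3,
                                method=pints.HaarioBardenetACMC)
    mcmc.set_max_iterations(20000)
    chains = mcmc.run()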

Edward Acheampong

Heuristic identification of practical parameter spaces for ODE estimation:

Initialising starting values for parameter estimation is a vital step when fitting dynamical systems. A heuristic approach to parameter estimation is carried out which involves refining the parameter space until a practically meaningful space is found for the given challenge, that is, one from which parameter estimates best describe the data. This requires exploring the system under study to gain deeper insight into the credible parameter spaces by restricting the ranges of the parameters, which reduces the risk of becoming trapped in the local minima/maxima that optimisation algorithms usually encounter. Based on this consideration, a Latin hypercube sampling technique is then applied to sample initial starting values from the chosen parameter space. The method is thus useful for analysing the effects of combinations of parameters on a system. Accordingly, with efficient coding and a combination of gradient-free and gradient-based optimisation algorithms, the classical approach to parameter estimation, nonlinear weighted least squares, yielded 'good' estimates. The weights are computed by taking the inverse of the estimated standard deviations of the three state variables.
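
A minimal sketch of this recipe, assuming SciPy; the simulator `simulate`, the refined bounds `lower`/`upper`, and the `times`, `data` and `weights` arrays are placeholders.

    import numpy as np
    from scipy.stats import qmc
    from scipy.optimize import minimize

    def weighted_ssr(theta, times, data, weights):
        """Weighted least-squares objective; the weights are the inverse
        estimated standard deviations of the three state variables."""
        resid = (data - simulate(theta, times)) * weights
        return np.sum(resid ** 2)

    # Latin hypercube sample of starting values from the refined parameter space
    sampler = qmc.LatinHypercube(d=len(lower))
    starts = qmc.scale(sampler.random(n=50), lower, upper)

    best = None
    for x0 in starts:
        # Gradient-free search first, then a gradient-based refinement
        r1 = minimize(weighted_ssr, x0, args=(times, data, weights),
                      method="Nelder-Mead")
        r2 = minimize(weighted_ssr, r1.x, args=(times, data, weights),
                      method="L-BFGS-B", bounds=list(zip(lower, upper)))
        if best is None or r2.fun < best.fun:
            best = r2

    print(best.x, best.fun)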

Model 2 - Vessel Stiffness in the Pulmonary-Artery System

Umberto Noè

The Scaled Expected Improvement method:

The model of the pulmonary circulation (the simulator) was treated as an expensive black-box function. The likelihood of the parameters, which involves a forward simulation of the pulmonary circulation model at every parameter query, can likewise be seen as an expensive objective function.

Inference in this setting is often performed using Bayesian Optimization (BO). Different BO algorithms vary in the choice of acquisition function. For this challenge I used the Scaled Expected Improvement method introduced by Noè and Husmeier (2018).

Bayesian Optimization iteratively updates an emulator of the objective function by evaluating the (expensive) simulator at the point suggested by the acquisition function, in this case the Scaled Expected Improvement. Typically these points have a high information content compared to the function evaluations requested by standard non-emulation-based solvers, leading to a reduction in the required number of (expensive) function evaluations and computational time savings.
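
A minimal sketch of the acquisition function, assuming a GP emulator that returns a posterior mean `mu` and standard deviation `sigma` at candidate points, and taking the Scaled EI to be the expected improvement divided by the standard deviation of the improvement, following Noè and Husmeier (2018); the objective is assumed to be minimised.

    import numpy as np
    from scipy.stats import norm

    def scaled_expected_improvement(mu, sigma, f_best):
        """Scaled EI for minimisation: EI(x) / sqrt(Var[I(x)]), with
        I(x) = max(f_best - Y(x), 0) and Y(x) ~ N(mu, sigma^2) the GP posterior."""
        sigma = np.maximum(sigma, 1e-12)
        z = (f_best - mu) / sigma
        ei = sigma * (z * norm.cdf(z) + norm.pdf(z))                         # E[I]
        ei2 = sigma ** 2 * ((z ** 2 + 1.0) * norm.cdf(z) + z * norm.pdf(z))  # E[I^2]
        var_i = np.maximum(ei2 - ei ** 2, 1e-24)
        return ei / np.sqrt(var_i)

    # In the BO loop: evaluate the GP posterior (mu, sigma) over candidate points,
    # run the simulator at the argmax of the acquisition, and refit the GP.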

The preprint of "On a new improvement-based acquisition function for Bayesian optimization" by Umberto Noè and Dirk Husmeier is available at https://arxiv.org/abs/1808.06918.

Mihaela Paun and Alan Lazarus 

Nonlinear optimization and a comparison of frequentist and Bayesian uncertainty quantification methods for PDEs describing pulmonary circulation:

Parameter inference for partial differential equations is typically challenging, with the large computational cost limiting the number of forward simulations that can be run in real time. This setting lends itself to approximate procedures such as statistical emulation, where one constructs a cheaper approximation to the true function that can be queried at new points in the parameter space. The emulation approach is particularly important in the clinical setting, where one wishes to provide quick parameter inference as patients arrive in the clinic, a constraint shared with Cside. Through discretisation of the parameter space, we were able to generate a high-resolution emulator for the simulator, which enabled us to view the objective function once the data were made available. This provided more informative constraints for a sequential quadratic programming (SQP) optimisation procedure, which yielded our point estimate of the model parameter.
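
A minimal sketch of the emulate-then-optimise step, assuming for illustration a single stiffness parameter, a scikit-learn GP emulator and SciPy's SLSQP routine (one SQP implementation); the simulator, the bounds `lower`/`upper` and the `data` array are placeholders.

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, ConstantKernel
    from scipy.optimize import minimize

    # Offline: evaluate the expensive PDE simulator on a fine grid of the
    # stiffness parameter and emulate its output (placeholder `simulator`).
    grid = np.linspace(lower, upper, 200).reshape(-1, 1)
    outputs = np.array([simulator(x[0]) for x in grid])          # shape (200, n_obs)
    emulator = GaussianProcessRegressor(ConstantKernel() * RBF()).fit(grid, outputs)

    # Online: once the data arrive, minimise the emulated sum-of-squares objective
    def objective(x):
        pred = emulator.predict(np.atleast_2d(x))[0]
        return np.sum((data - pred) ** 2)

    fit = minimize(objective, x0=np.array([(lower + upper) / 2]),
                   method="SLSQP", bounds=[(lower, upper)])
    print(fit.x)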

Uncertainty quantification can proceed under two alternative statistical frameworks. Under the frequentist setting, one can adopt a parametric bootstrapping approach, in which variation in the parameters arises as a direct consequence of replicating the data-generating process. One can use knowledge of the signal-to-noise ratio to obtain a distribution for the residuals, repeatedly resampling noisy data instantiations and re-optimising to find a point estimate of the parameter. Alternatively, one can proceed under a Bayesian framework, placing a prior over the parameter space and sampling from the intractable posterior using Markov chain Monte Carlo. Both of these methods will be considered in this talk, providing some consistency between the objective Bayes and frequentist approaches and a comparison with the Cramér-Rao lower bound.
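
A minimal sketch of the parametric bootstrap, assuming the noise level implied by the signal-to-noise ratio is available; the fitted curve `y_hat`, the noise standard deviation `noise_sd_from_snr` and the refitting routine are placeholders.

    import numpy as np

    y_hat = fitted_curve              # model prediction at the point estimate (placeholder)
    sigma = noise_sd_from_snr         # residual standard deviation implied by the SNR (placeholder)

    def refit(noisy_data):
        # Placeholder: re-run the optimisation of the previous paragraph on the
        # resampled data and return the new point estimate.
        return fit_parameter(noisy_data)           # hypothetical fitting routine

    estimates = []
    for _ in range(200):                            # 200 bootstrap replicates
        resampled = y_hat + np.random.normal(0.0, sigma, size=y_hat.shape)
        estimates.append(refit(resampled))

    estimates = np.array(estimates)
    print(estimates.mean(axis=0), estimates.std(axis=0))   # bootstrap mean and standard error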

Iain Murray and James Ritchie

Expressing and checking uncertainty:

For this model we did a simple brute-force computation of the parameter's posterior on a grid. I'll review some consistency checks that we can run on code for probabilistic inference, such as Geweke's "getting it right" method. I'll propose how these checks could be adapted to assess the robustness of a competitive ranking of methods, and how we might reduce variance in future competition criteria. Finally I'll ask what our competition criteria should be, and whether conventional log-probability loss reflects what practitioners of differential equation models are interested in.
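
A minimal sketch of a brute-force grid posterior for a scalar parameter; `log_likelihood`, `log_prior`, the grid bounds and `data` are placeholders.

    import numpy as np

    grid = np.linspace(lower, upper, 2000)                # grid over the parameter (placeholder bounds)
    log_post = np.array([log_likelihood(theta, data) + log_prior(theta)
                         for theta in grid])              # placeholder likelihood and prior

    log_post -= log_post.max()                            # guard against underflow
    post = np.exp(log_post)
    post /= np.trapz(post, grid)                          # normalise numerically on the grid

    mean = np.trapz(grid * post, grid)
    sd = np.sqrt(np.trapz((grid - mean) ** 2 * post, grid))
    print(mean, sd)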

Model 3 - Cell Migration & Additional Challenge - Cell Migration (Partially Observed System)

Agnieszka Borowska

Gaussian process emulation for parameter inference in a stochastic differential equation system for chemotaxis:

An emulation-based method is developed for inferring the 10 parameters of the stochastic differential equation (SDE) system describing the behaviour of cells migrating in response to chemotaxis. The proposed emulator employs Gaussian process (GP) regression for each parameter separately (multivariate-output GP regression is left for future research). The hyperparameters of each GP are estimated by maximum likelihood on a training set of 1000 points generated from a Sobol sequence. Since parameter ranges were not specified for the problem, the bounds of the space-filling design are set to [0.5Θ, 2Θ], where Θ denotes the default parameter value from the Supplementary Notes by Luke Tweedy provided by the Organisers. The kernel function for each regression is chosen to be optimal in the sense of minimising the RMSE, based on out-of-sample performance on a test set of 100 points generated from a Latin hypercube.
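
A minimal sketch of the design and kernel-selection step, assuming SciPy's quasi-Monte Carlo generators and scikit-learn GPs; `default_parameters`, the simulator and the feature extraction (described below) are placeholders.

    import numpy as np
    from scipy.stats import qmc
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, Matern, RationalQuadratic

    theta0 = np.asarray(default_parameters)        # Θ, the default parameter vector (placeholder)
    lower, upper = 0.5 * theta0, 2.0 * theta0

    # Space-filling designs: Sobol sequence for training, Latin hypercube for testing
    train = qmc.scale(qmc.Sobol(d=10, scramble=True).random(1000), lower, upper)
    test = qmc.scale(qmc.LatinHypercube(d=10).random(100), lower, upper)

    # `simulate` and `extract_features` are placeholders for the SDE simulator
    # and the feature extraction described in the following paragraphs.
    X_train = np.array([extract_features(simulate(t)) for t in train])
    X_test = np.array([extract_features(simulate(t)) for t in test])

    # One GP per parameter; hyperparameters estimated by maximum likelihood inside
    # fit(), and the kernel chosen by out-of-sample RMSE on the test set.
    for j in range(10):
        scored = []
        for kernel in (RBF(), Matern(nu=2.5), RationalQuadratic()):
            gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
            gp.fit(X_train, train[:, j])
            rmse = np.sqrt(np.mean((gp.predict(X_test) - test[:, j]) ** 2))
            scored.append((rmse, kernel))
        rmse, kernel = min(scored, key=lambda s: s[0])
        print(f"parameter {j}: best kernel {kernel}, RMSE {rmse:.3g}")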

Efficient feature extraction is crucial for the performance of any regression model and is particularly challenging in this case due to the high-dimensional output of the SDE simulator. The lack of alignment between the output points on the membrane over time, and the time-varying dimensions of the data, pose additional difficulties.

To detect potentially relevant features, an extensive exploration was performed, separately for Model 3 (including signals unobserved in laboratory experiments) and the Additional Challenge (based on observed measurements only). The resulting insights suggest the importance of allowing for scale, translation and rotation invariance of the candidate measures. Hence, PCA based on Fourier transforms of the signals and cell contours is implemented, leading to the extraction of several features. In addition, the time series of the ratio of the cell's area to its perimeter is investigated, and the label of the model best fitting this series is taken as an explanatory variable. In total, 56 and 31 variables are considered for Model 3 and the Additional Challenge, respectively.
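
A minimal sketch of one way to obtain scale-, translation- and rotation-invariant contour features with Fourier descriptors followed by PCA (the exact construction used for the entry may differ); the membrane is assumed to be resampled to a fixed number of points per frame, and `contours` is a placeholder.

    import numpy as np
    from sklearn.decomposition import PCA

    def fourier_descriptors(contour, n_keep=20):
        """contour: (n_points, 2) membrane coordinates for one frame."""
        z = contour[:, 0] + 1j * contour[:, 1]        # complex representation of the contour
        coeffs = np.fft.fft(z)
        coeffs[0] = 0.0                               # drop the DC term  -> translation invariance
        mags = np.abs(coeffs)                         # drop the phase    -> rotation invariance
        if mags[1] > 0:
            mags = mags / mags[1]                     # normalise         -> scale invariance
        return mags[1:n_keep + 1]

    # One descriptor vector per frame (`contours` is a placeholder list of frames),
    # compressed with PCA; the per-frame scores are then summarised over time.
    descriptors = np.array([fourier_descriptors(c) for c in contours])
    scores = PCA(n_components=5).fit_transform(descriptors)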


Diana Giurghita

Applying approximate Bayesian computation algorithms to cell migration:

A sequential Monte Carlo approximate Bayesian computation (SMC ABC) algorithm was applied to perform inference in a stochastic differential equation (SDE) model with 10 unknown parameters that describes the behaviour of a cell migrating in response to chemotaxis.

Following the semi-automatic ABC idea developed by Fearnhead & Prangle (2012), the summary statistics for the algorithm are first based on predictions from a linear model, and the Euclidean distance is employed to link these summary statistics with those calculated on the observed data set. Second, the approach of Fearnhead & Prangle (2012) is extended to allow for a more flexible regression model, namely Gaussian process (GP) regression applied to each parameter separately.

The regressors in these models consist of various features extracted from the high-dimensional output of the SDE simulator. The lack of alignment between the output points on the membrane over time, and the time-varying dimensions of the data, pose additional difficulties.
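
A minimal sketch of the semi-automatic summary-statistics idea, shown here with plain rejection ABC rather than the SMC version actually used; `thetas`, `features`, `observed_features`, `prior_sample`, `simulate`, `extract_features` and the tolerance `epsilon` are placeholders.

    import numpy as np
    from sklearn.linear_model import LinearRegression

    # Pilot stage: `thetas` (n, 10) drawn from the prior and `features` (n, p)
    # extracted from the corresponding SDE simulations (both placeholders).
    reg = LinearRegression().fit(features, thetas)

    def summary(feature_vector):
        """Semi-automatic summary statistic: the regression prediction of theta."""
        return reg.predict(np.atleast_2d(feature_vector))[0]

    s_obs = summary(observed_features)                  # summaries of the observed data

    accepted = []
    for _ in range(100000):                             # rejection ABC, for illustration only
        theta = prior_sample()                          # placeholder prior sampler
        s_sim = summary(extract_features(simulate(theta)))
        if np.linalg.norm(s_sim - s_obs) < epsilon:     # Euclidean distance, tolerance epsilon
            accepted.append(theta)

    posterior_sample = np.array(accepted)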

Due to the stochastic nature of the model, the extracted features have to be scale, translation and rotation invariant. Hence, PCA based on Fourier transforms of the signals and cell contours is implemented, leading to the extraction of several features. In addition, the time series of the ratio of the cell's area to its perimeter is investigated, and the label of the model best fitting this series is taken as an explanatory variable. Other extracted features include the ratio between the minimum and maximum "radius" of the cell (a radius being the distance from the cell's midpoint to a point on the membrane), the total distance travelled by the cell's midpoint, and the distance to the furthest point reached by the cell's midpoint.
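
A minimal sketch of these geometric features, assuming the membrane is stored as an array of shape (n_frames, n_points, 2); averaging the per-frame radius ratio over time is one possible choice.

    import numpy as np

    def geometric_features(membrane):
        """membrane: (n_frames, n_points, 2) cell-boundary coordinates over time."""
        midpoints = membrane.mean(axis=1)                           # cell midpoint per frame
        radii = np.linalg.norm(membrane - midpoints[:, None, :], axis=2)
        radius_ratio = np.mean(radii.min(axis=1) / radii.max(axis=1))

        steps = np.linalg.norm(np.diff(midpoints, axis=0), axis=1)
        total_distance = steps.sum()                                # path length of the midpoint
        max_displacement = np.linalg.norm(midpoints - midpoints[0], axis=1).max()

        return np.array([radius_ratio, total_distance, max_displacement])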

The observed data for the Additional Challenge consist of a cell's membrane coordinates over time, while for Model 3 they also contain signals normally unobserved in laboratory experiments; accordingly, 31 and 56 features were considered for the two models, respectively.

Both the linear model and the GP regression model were trained on a set of 1000 points generated from a Sobol sequence. Since parameter ranges were not specified for the problem, the bounds of the space-filling design are set to [0.5Θ, 2Θ], where Θ denotes the default parameter value from the Supplementary Notes by Luke Tweedy provided by the Organisers.