2021 - Online - In the cloud


Subspace MCMC algorithm for Bayesian parameter estimation of hierarchical PK/PD models in Pumas

Manu Francis, Vijay Ivaturi, Mohamed Tarek

Pumas AI

Objectives: To speed up sampling from high-dimensional posteriors of hierarchical Pumas [1] models whose parameters are highly correlated.

Methods: The number of parameters in a hierarchical PK/PD model scales linearly with the number of subjects in a population. Sampling from the full posterior using Hamiltonian Monte Carlo (HMC) [2] can be computationally expensive when the number of parameters is large and the curvature of the posterior is uneven along one or more dimensions. Sampling in a lower-dimensional subspace, also known as subspace inference, has been shown in the literature to be effective for over-parameterized Bayesian neural networks [3]. In this work, subspace inference is shown to be effective for quantifying uncertainty in high-dimensional parameter spaces when the parameters are highly correlated, such that most of the variance in the posterior and the data can be reproduced in a lower-dimensional subspace. The subspace basis is computed by first pretraining the model on the data using maximum likelihood estimation, followed by a stochastic gradient descent maximum a posteriori routine that estimates the mean parameter values and the covariance matrix. Principal component analysis is then used to find the lower-dimensional subspace in which most of the variance occurs. The No-U-Turn Sampler (NUTS) variant of HMC [4], as implemented in AdvancedHMC.jl [5], is finally used to perform the sampling in this subspace.
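The pipeline above (pretrain toward the posterior mode, collect SGD-style iterates, extract a PCA basis, then run MCMC in the subspace) can be sketched in a toy setting. This is a minimal illustration, not the Pumas implementation: the correlated Gaussian target, the dimensions, the step sizes, and the random-walk Metropolis sampler standing in for NUTS are all assumptions made for the sake of a self-contained example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a hierarchical PK/PD posterior: a correlated Gaussian
# whose covariance is low-rank plus jitter, so most variance lies in a
# few directions (the regime where subspace inference helps).
D = 20
A = rng.normal(size=(D, 3))
Sigma = A @ A.T + 0.05 * np.eye(D)
Sigma_inv = np.linalg.inv(Sigma)

def log_post(theta):
    return -0.5 * theta @ Sigma_inv @ theta

def grad_log_post(theta):
    return -Sigma_inv @ theta

# Step 1: "pretrain" with noisy gradient ascent (an SGD-style MAP routine),
# collecting iterates after a burn-in period.
theta = rng.normal(size=D)
iterates = []
for t in range(500):
    theta = theta + 0.01 * grad_log_post(theta) + 0.05 * rng.normal(size=D)
    if t >= 100:
        iterates.append(theta.copy())
W = np.array(iterates)

# Step 2: PCA of the iterates gives the subspace mean and an orthonormal basis.
mu = W.mean(axis=0)
_, _, Vt = np.linalg.svd(W - mu, full_matrices=False)
k = 3            # assumed subspace dimension
P = Vt[:k].T     # D x k basis; columns are the top principal directions

# Step 3: MCMC in the k-dimensional subspace, mapping z -> theta = mu + P z.
# (Random-walk Metropolis here for brevity; the abstract uses NUTS.)
def log_post_sub(z):
    return log_post(mu + P @ z)

z = np.zeros(k)
lp = log_post_sub(z)
samples = []
for _ in range(2000):
    z_prop = z + 0.3 * rng.normal(size=k)
    lp_prop = log_post_sub(z_prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        z, lp = z_prop, lp_prop
    samples.append(mu + P @ z)
samples = np.asarray(samples)
print(samples.shape)  # (2000, 20): full-dimensional draws from 3-dim sampling
```

Each draw is a full D-dimensional parameter vector, but the sampler only explores k dimensions, which is the source of the speed-up when the posterior is strongly correlated.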

Results: In our limited set of benchmarks on standard PK models, the proposed method roughly halved the running time. The means of the posterior samples were generally close to those produced by the full-dimensional NUTS algorithm, and the variances of the posterior samples approached those of the full-dimensional samples as the parameters' correlation or the number of subspace dimensions increased.

Conclusions: The subspace inference algorithm outperforms the full-dimensional NUTS algorithm in running time at the expense of some loss of accuracy in the posterior variance. The proposed method can be used as a cheaper MCMC algorithm for Bayesian inference of hierarchical PK/PD models when high parameter correlation is detected using a few NUTS samples.

[1] Rackauckas, C., et al. (2020). Accelerated Predictive Healthcare Analytics with Pumas, a High Performance Pharmaceutical Modeling and Simulation Platform. bioRxiv 2020.11.28.402297. doi: https://doi.org/10.1101/2020.11.28.402297
[2] Neal, R. M. (2011). MCMC using Hamiltonian Dynamics. Handbook of Markov Chain Monte Carlo, 2(11), 2.
[3] Izmailov, P., et al. (2020). Subspace Inference for Bayesian Deep Learning. Proceedings of the 35th Uncertainty in Artificial Intelligence Conference, 1169-1179. PMLR.
[4] Betancourt, M. (2017). A conceptual introduction to Hamiltonian Monte Carlo. arXiv Preprint arXiv:1701.02434.
[5] Xu, K., Ge, H., Tebbutt, W., Tarek, M., Trapp, M., Ghahramani, Z. (2020). AdvancedHMC.jl: A robust, modular and efficient implementation of advanced HMC algorithms. Symposium on Advances in Approximate Bayesian Inference, 1-10. PMLR.

Reference: PAGE 29 (2021) Abstr 9862 [www.page-meeting.org/?abstract=9862]
Poster: Methodology - Other topics