
How many bits of information did my study provide? Examining Kullback-Leibler divergence, standard errors and shrinkage

Douglas Eleveld (1), Pieter J. Colin (1,2)

(1) Department of Anesthesiology, University Medical Center Groningen, University of Groningen, The Netherlands; (2) Laboratory of Medical Biochemistry and Clinical Analysis, Faculty of Pharmaceutical Sciences, Ghent University, Belgium

Objectives: To explore Kullback-Leibler divergence (information gain) as a measure of the information gained by model estimation. A useful information measure should 1) have units of bits, a natural unit of information, 2) increase proportionally with the number of individuals, 3) increase with the number of observations per individual, 4) increase with decreasing observation error, 5) increase with decreasing parameter relative standard error, and 6) be insensitive to non-eta-influential individuals.

Methods: Simulated data sets were constructed using NONMEM V7.3.0 [1] and a one-compartment PK model with absorption. The number of simulated individuals, the number of observations per individual, and the observation error were varied. The directed Kullback-Leibler divergence of each (assumed multivariate normal) individual post hoc distribution (NONMEM eta and phi) from the population distribution was calculated, and the sum (or the average per individual) was evaluated as a measure of the total quantity (or average quality) of information obtained from model estimation. Parameter relative standard errors and shrinkage were also calculated for comparison.
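
The directed Kullback-Leibler divergence between two multivariate normal distributions has a closed form, and conversion from nats to bits only requires dividing by ln 2. The following Python sketch illustrates the calculation for one individual; it is not the NONMEM implementation used in the study, and the OMEGA matrix, post hoc eta, and conditional covariance values are hypothetical:

import numpy as np

def kl_mvn_bits(mu0, cov0, mu1, cov1):
    # Directed Kullback-Leibler divergence D(N0 || N1) between two
    # multivariate normal distributions, converted from nats to bits.
    k = len(mu0)
    cov1_inv = np.linalg.inv(cov1)
    diff = mu1 - mu0
    kl_nats = 0.5 * (np.trace(cov1_inv @ cov0)
                     + diff @ cov1_inv @ diff
                     - k
                     + np.log(np.linalg.det(cov1) / np.linalg.det(cov0)))
    return kl_nats / np.log(2)

# Hypothetical values: population eta distribution N(0, OMEGA) and the
# (assumed multivariate normal) post hoc distribution of one individual.
omega = np.array([[0.09, 0.00], [0.00, 0.04]])  # population covariance
eta_i = np.array([0.25, -0.10])                 # post hoc eta estimate
cov_i = np.array([[0.02, 0.00], [0.00, 0.01]])  # conditional covariance

bits_i = kl_mvn_bits(eta_i, cov_i, np.zeros(2), omega)
print(f"individual information: {bits_i:.2f} bits")
# The total information is the sum of bits_i over all individuals;
# the average per individual measures information quality.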

Results: For the model and datasets tested, the total individual Kullback-Leibler divergence satisfied all of the expected properties of an information measure. In contrast, parameter shrinkage did not reflect parameter certainty when individuals varied in informativeness. A close relationship was observed between parameter shrinkage and the average (per-individual) single-parameter (1-dimensional) Kullback-Leibler divergence.
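
To make the two quantities concrete for a single random effect, the sketch below computes eta-shrinkage with the usual SD-based definition alongside the average 1-dimensional Kullback-Leibler divergence of each post hoc normal from the population normal. The shrunken post hoc means and conditional standard deviations are simulated, hypothetical values, not the study data:

import numpy as np

rng = np.random.default_rng(1)
omega_sd = 0.3                # population SD of a single eta
n_id = 200

# Hypothetical post hoc results: means shrunken towards zero, with a
# common conditional (posterior) SD for every individual.
eta_hat = rng.normal(0.0, 0.8 * omega_sd, n_id)
cond_sd = np.full(n_id, 0.6 * omega_sd)

# Eta-shrinkage, SD-based definition: 1 - SD(eta_hat) / omega_sd.
shrinkage = 1.0 - np.std(eta_hat, ddof=1) / omega_sd

# Per-individual 1-D KL divergence D(posthoc || population), in bits.
kl_bits = (np.log(omega_sd / cond_sd)
           + (cond_sd**2 + eta_hat**2) / (2.0 * omega_sd**2)
           - 0.5) / np.log(2.0)

print(f"eta-shrinkage: {shrinkage:.2f}")
print(f"average KL divergence per individual: {kl_bits.mean():.2f} bits")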

Conclusions: The sum of the Kullback-Leibler divergences of the individual post hoc estimates from the population estimate may be useful as a quantification of the amount of information obtained from model estimation. It can be interpreted in a per-experiment, per-parameter, or per-individual context. The approach avoids a number of shortcomings of shrinkage as an information quality (or quantity) measure, and it allows model diagnostics to summarize the total amount of information gained by model estimation in a natural unit of information, the bit.



References:
[1] Beal SL, Sheiner LB, Boeckmann AJ & Bauer RJ (Eds.) NONMEM Users Guides. 1989-2011. Icon Development Solutions, Ellicott City, Maryland, USA.


Reference: PAGE 26 (2017) Abstr 7088 [www.page-meeting.org/?abstract=7088]
Poster: Methodology - Model Evaluation