What is PAGE?

We represent a community with a shared interest in data analysis using the population approach.


2001
   Basel, Switzerland

LOQ, a relic of the past?

Ferdie Rombout

Bayer AG, Wuppertal, Germany

A reliable calculation of pharmacokinetic parameters for drugs is of high importance. In recent years, newly developed drugs have become increasingly specific and thus more potent. As a consequence, lower concentrations of the parent drug and its metabolites have to be determined in biological matrices. In addition, when a new chemical entity is administered to humans for the first time, the dose is usually distinctly lower than the final therapeutic dose, which further increases the sensitivity required of the assay. Although the sensitivity of the assay methods applied has increased, under these circumstances it is not always possible, for example, to quantify the plasma concentration-time profile for a sufficiently long period of time to obtain a reliable estimate of the terminal half-life, because the concentrations often fall below the lower limit of quantitation (LOQ) too early after dosing. To avoid this, either higher doses must be employed to raise the drug concentrations or a more sensitive assay has to be developed. The first approach is not always possible (e.g. for safety reasons), whereas the second may no longer be needed at a later stage of drug development and could thus be a waste of resources.

According to several national and international guidelines, the LOQ is defined as the concentration below which the variability of the assay (imprecision) and/or the deviation from the target value (inaccuracy) exceeds a certain limit (usually 20 %). As soon as a calculated concentration drops below this limit, the value is normally not reported because it does not fulfill the quality criteria. This leaves us with the strange situation that, if the LOQ is set at, for example, 1 µg/L, a value of 1.000 µg/L is acceptable, whereas a value of 0.999 µg/L is not reported. The LOQ is usually determined with artificial quality control (QC) samples prepared by spiking blank matrix with known drug concentrations. The variability that defines the LOQ has thus been observed by analyzing (calibration and QC) samples of known concentration only. The assumption that the factors influencing this variability are identical in magnitude for the analysis of unknown, biologically derived samples is usually not investigated. Therefore, in principle, the unknown factors causing samples of known concentration to be rejected are not taken into account when calculating or reporting concentrations of the unknown samples. Especially towards the end of the drug concentration-time profile, when approaching the LOQ, these factors could strongly influence the calculated pharmacokinetic parameters and could affect decisions based upon these parameters.
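The 20 % criterion described above can be made concrete with a small sketch. All QC levels and replicate values below are invented for illustration; the helper name `passes_loq_criteria` is hypothetical, not part of any guideline or assay software:

```python
# Illustrative sketch of the 20 % LOQ criterion on hypothetical QC replicates.
import statistics

qc_results = {  # nominal concentration (µg/L) -> invented replicate measurements
    0.5: [0.38, 0.62, 0.55, 0.31, 0.70],
    1.0: [0.92, 1.08, 1.01, 0.88, 1.10],
    5.0: [4.9, 5.2, 5.1, 4.8, 5.0],
}

def passes_loq_criteria(nominal, measured, limit=0.20):
    """True if both imprecision (CV) and inaccuracy (bias) stay within the limit."""
    mean = statistics.mean(measured)
    cv = statistics.stdev(measured) / mean    # imprecision
    bias = abs(mean - nominal) / nominal      # inaccuracy
    return cv <= limit and bias <= limit

# LOQ: the lowest QC level at which both quality criteria are met.
loq = min(c for c, vals in qc_results.items() if passes_loq_criteria(c, vals))
print(f"LOQ = {loq} µg/L")
```

With these invented numbers the 0.5 µg/L level fails on imprecision, so the LOQ lands at 1.0 µg/L, and any measured value just below it, however plausible, would conventionally go unreported.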

This presentation describes an alternative approach to estimating concentration values from chromatographic results, and tests whether the new method can give reliable estimates of residual variability and of the factors influencing it, thereby enabling reliable quantitation of unknown samples below the conventional LOQ.

The alternative approach is based on nonlinear mixed-effects modeling using the software program NONMEM. Although originally designed to model sparsely sampled pharmacokinetic data [4], it can be applied more generally to model responses of other origin. The program offers the possibility to model variabilities, for example within subjects (= residual variability), between subjects and between occasions, and to search for and model factors influencing these variabilities. Within this paper we replaced the subject by the analytical run and limited the variabilities to intra-run (= residual) and inter-run variability.
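The split into inter-run and intra-run variability can be illustrated outside NONMEM with a classical one-way variance-components decomposition. This is a minimal sketch, not the abstract's actual model: the run and replicate counts, the "true" variances, and the simulated responses are all invented for illustration:

```python
# Sketch: separating inter-run from intra-run (residual) variability
# via a balanced one-way random-effects decomposition on simulated data.
import random
import statistics

random.seed(1)
TRUE_INTER_RUN_SD, TRUE_INTRA_RUN_SD = 0.10, 0.05  # assumed, for simulation only
N_RUNS, N_REPS = 30, 6

# Each analytical run gets its own offset (inter-run variability);
# replicates within a run scatter around it (intra-run / residual).
runs = []
for _ in range(N_RUNS):
    run_effect = random.gauss(0, TRUE_INTER_RUN_SD)
    runs.append([run_effect + random.gauss(0, TRUE_INTRA_RUN_SD)
                 for _ in range(N_REPS)])

run_means = [statistics.mean(r) for r in runs]

# Mean squares of the balanced one-way layout.
ms_within = statistics.mean(statistics.variance(r) for r in runs)
ms_between = N_REPS * statistics.variance(run_means)

sigma2_intra = ms_within                                   # residual variance
sigma2_inter = max((ms_between - ms_within) / N_REPS, 0.0)  # run-to-run variance

print(f"intra-run SD ~ {sigma2_intra ** 0.5:.3f}, "
      f"inter-run SD ~ {sigma2_inter ** 0.5:.3f}")
```

The recovered standard deviations land close to the simulated 0.05 and 0.10; a mixed-effects program such as NONMEM estimates the analogous components jointly with the calibration model rather than from balanced replicates.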


