Modeling biological systems - The Pharmaco-Kinetic-Dynamic paradigm


Modeling biological systems - The Pharmaco-Kinetic-Dynamic paradigm

Main features: 1. Time profiles and complex systems. 2. Disturbed and sparse measurements. 3. Mixed variabilities.

"One of the most highly developed skills in contemporary Western civilization is dissection: the split-up of problems into their smallest possible components. We are good at it. So good, we often forget to put the pieces back together again."
Alvin Toffler, foreword "Science and Change", in: Ilya Prigogine (1977 Nobel Prize in Chemistry) and Isabelle Stengers, Order out of Chaos: Man's New Dialogue with Nature, Bantam, New York, 1984.

CHAPTER I: Processes and models

- Real process and mathematical model. The functional scheme.
- Modeling in modern PKs. Structure and parameters in models.
- Computational tools: matrix calculus. Linear and nonlinear operators.
- The parsimony principle. Redundancy and misspecification. Indexes for goodness of fit: RMSE and determination coefficient. The fotemustine example: modeling ANC toxicity.
- Dynamic functional scheme. Initialization, iterations, parameter convergence. Residual error.
- Real-time processing of data. Bayesian estimation, the Bayesian attractor. Geometric analysis. Influence of the noise on parameter estimates.
- Model validation. State and parameter spaces. Checking structural and parametric identifiability. Optimizing PK experiments.
- Building and using models. Identification, simulation, dosage adjustment. Synthesis of main tasks. Controlled systems, cybernetics.

Real process and mathematical model

[Figure: concentration-time observations y [ng/mL] from the real process (0-8 h) and the curve of the fitted model.]

Mathematical model: y(t) = (D/V) exp(-k t), with fitted parameter values V = 16 L and k = 0.1 h^-1.
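A minimal numerical sketch of this model (V = 16 L and k = 0.1 h^-1 as on the slide; taking D = 1 mg as an illustrative dose, which is an assumption):

```python
import numpy as np

def one_cpt(t, D, V, k):
    """One-compartment bolus model: y(t) = (D/V) * exp(-k*t)."""
    return (D / V) * np.exp(-k * t)

# Illustrative values: D = 1 mg expressed in ng and V in mL,
# so the concentrations come out in ng/mL.
D = 1e6       # dose [ng]
V = 16e3      # volume of distribution [mL] (16 L)
k = 0.1       # elimination rate constant [h^-1]

t = np.linspace(0.0, 8.0, 5)      # sampling grid [h]
y = one_cpt(t, D, V, k)           # predicted concentrations [ng/mL]
print(np.round(y, 1))
```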

Standardize observations

Ex: administer the same dose to 2 different subjects and compare kinetics.

Nominal sampling times [h]: 1, 2, 6, 12.
Real sampling times [h], subject no. 1: 0.975, 2.06, 6.12, - (concentrations y11, y12, y13).
Real sampling times [h], subject no. 2: 1.03, 2.10, 5.9, 12.06 (concentrations y21, y22, y23, y24).

Heterogeneity in real sampling: the comparison is not possible. Compare the kinetics outside the "world" of concentrations: use a mathematical model (a transformation) smoothing the observations and computing PK parameters, then compare the PK parameters.

Compare PKs for 3 subjects

Dataset { x̂_j ; j = 1 : n }:

Subject   V̂ [L]   k̂ [h^-1]
1         11.02    0.147
2         12.24    0.086
3         18.97    0.073

Subjects are statistically distinguishable: thanks to V for subjects 2 and 3, thanks to k for subjects 1 and 2.

[Figure: the estimates plotted in the (V, k) plane.]

Functional scheme

[Diagram: the administration protocol feeds both the PK process, whose observation is corrupted by measurement noise, and the PK model, which produces a prediction. An equivalence criterion compares observation and prediction, and nonlinear programming, helped by prior information, adjusts the model parameters.]

Topics in modern PKs

- PK process. Classical PKs: analyze physiology, drug characteristics, biopharmacy, etc.
- Mathematical PK models: structural and parametric aspects. Error variance models.
- Equivalence criterion: Bayesian and maximum likelihood estimators.
- Nonlinear programming: iterative methods, heuristic algorithms, quasi-Newton methods.
- Prior information: population approaches, parametric and nonparametric methods.

Modeling

- Models: dialectic, mental, physical, mathematical, etc.
- Elements: structural and parametric knowledge, state variables.
- Goals: a model is proposed to be efficient for a specified action: recognition, control, simulation, etc.
- Validation: experimental and statistical tools.
- Applications: in engineering (fast dynamics); in biology, sociology, economics, PKs (slow dynamics); in physics, astronomy, PDs (mixed dynamics).
- Ex: tourist map of the city of Marseille.

Species of models

- Static / dynamic (time dimension, differential equations).
- Discrete time / continuous time (computer control).
- Linear / nonlinear (which is the reference?).
- Deterministic / statistic (precision of estimates) / stochastic (dispersion of parameters) models.
- Input-output form, representation (macro-rates) / state-space, knowledge (micro-rates) models.
- Lumped / distributed parameter models (space-time differential equations).
- Time-invariant / time-varying (parameters).

In PKs: { dynamic, continuous time, [nonlinear in the PK parameters, linear in the drug amounts], statistical and stochastic, I-O form and state-space, lumped, time invariant }.

Mathematical modeling

Models are defined by:
- their structure (number and connectivity of compartments, etc.), expressed by mathematical operations involving adjustable parameters. Ex: 1-cpt model y(t) = (D/V) exp(-k t): exponential structure, parameters x = [V k]^T;
- the numerical values of the parameters used: V = 16 L, k = 0.1 h^-1.

CHARACTERIZATION (structure) + ESTIMATION (parameters) = MODELING (system identification).

Matrix definitions

An array of numbers written as

A = [ a11 a12 ... a1n ; a21 a22 ... a2n ; ... ; am1 am2 ... amn ]

is called a matrix of order m x n. Square matrix if m = n. Column vector if n = 1. Row vector if m = 1. Scalar if m = n = 1.

A square matrix may be: symmetric if aij = aji, diagonal if aij = 0 for i != j, etc. A diagonal matrix with aii = 1, i = 1 : n, is the identity matrix I.

Notation: matrix A (capital), matrix element aij (lower case), vector a (lower case, underlined).

Matrix algebra

- Transposition: A (m x n) -> C = A^T (n x m), cij = aji.
- Addition: A, B (m x n) -> C = A + B (m x n), cij = aij + bij.
- Multiplication: A (m x n), B (n x q) -> C = A B (m x q), cij = sum_{k=1}^{n} aik bkj.

Square matrices:
- Determinant (characteristic scalar): |A| = a11 a22 - a12 a21 for n = 2; in general, expanded with the cofactors Aij = (-1)^(i+j) |Mij|.
- Inversion: C = A^-1 such that C A = A C = I.
- Rules: (A B)^T = B^T A^T; (A B)^-1 = B^-1 A^-1; (A^T)^-1 = (A^-1)^T.
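The transposition and inversion rules above can be verified numerically; a small NumPy sketch with arbitrary 2 x 2 matrices:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 5.0]])
B = np.array([[2.0, 0.0],
              [1.0, 4.0]])

# Determinant of a 2x2 matrix: |A| = a11*a22 - a12*a21
assert np.isclose(np.linalg.det(A), 1.0*5.0 - 2.0*3.0)

# Transposition rule: (A B)^T = B^T A^T
assert np.allclose((A @ B).T, B.T @ A.T)

# Inversion rules: (A B)^-1 = B^-1 A^-1, and A^-1 A = I
AB_inv = np.linalg.inv(A @ B)
assert np.allclose(AB_inv, np.linalg.inv(B) @ np.linalg.inv(A))
assert np.allclose(np.linalg.inv(A) @ A, np.eye(2))
```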

Operators: analytic geometry - 1

Rotation of a rectangular coordinate system by angle ω; [X, Y] are the old and [x, y] the new coordinates:

x = X cos ω + Y sin ω
y = -X sin ω + Y cos ω

Matrix form: c_new = Ω c_old, with c = [x y]^T and Ω = [ cos ω  sin ω ; -sin ω  cos ω ]. Matrix inversion recovers the old coordinates: c_old = Ω^-1 c_new.
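A quick numerical check of the rotation operator (the angle below is an arbitrary choice): the inverse of a rotation matrix is its transpose, so Ω^T undoes the change of coordinates.

```python
import numpy as np

def rotation(omega):
    """Axis-rotation operator: c_new = Omega @ c_old."""
    c, s = np.cos(omega), np.sin(omega)
    return np.array([[ c, s],
                     [-s, c]])

omega = np.pi / 6              # 30 degrees, arbitrary
Omega = rotation(omega)
c_old = np.array([2.0, 1.0])   # [X, Y]
c_new = Omega @ c_old          # [x, y]

# For a rotation matrix, the inverse equals the transpose,
# and the inverse rotation recovers the old coordinates.
assert np.allclose(np.linalg.inv(Omega), Omega.T)
assert np.allclose(Omega.T @ c_new, c_old)
```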

Operators: analytic geometry - 2

Conversion of rectangular to polar coordinates:

r = sqrt(X^2 + Y^2), ξ = arctan(Y/X); inversely, X = r cos ξ, Y = r sin ξ.

It is impossible to link [r, ξ] and [X, Y] by a matrix operator: the conversion relations are nonlinear.

Linear operators

Let:
1. e1(t) -T-> s1(t)
2. e2(t) -T-> s2(t)

For the input combination g e1(t) + h e2(t) -T-> ??

The operator T is linear if and only if ?? = g s1(t) + h s2(t).

PK example - 1

For the 1-cpt model y(t) = (D/V) exp(-k t), associate the input e(t) with the dose D and the output s(t) with y(t). The model is a linear operator with respect to the dose, because when

y1(t) = (D1/V) exp(-k t) and y2(t) = (D2/V) exp(-k t),

then for the dose g D1 + h D2:

y(t) = ((g D1 + h D2)/V) exp(-k t) = g y1(t) + h y2(t).

PK linearity (dose proportionality).

PK example - 2

For the 1-cpt model y(t) = (D/V) exp(-k t), associate the input e(t) with the rate constant k and the output s(t) with y(t). The model is a nonlinear operator with respect to k, because when

y1(t) = (D/V) exp(-k1 t) and y2(t) = (D/V) exp(-k2 t),

then

(D/V) exp(-(g k1 + h k2) t) != g y1(t) + h y2(t).

Parametric nonlinearity.
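Both properties — linearity in the dose and nonlinearity in the rate constant — can be verified numerically; a sketch with arbitrary doses, rate constants, and combination weights g, h:

```python
import numpy as np

def one_cpt(t, D, V, k):
    """One-compartment bolus model: y(t) = (D/V) * exp(-k*t)."""
    return (D / V) * np.exp(-k * t)

t = np.linspace(0.0, 8.0, 9)      # time grid [h]
V, g, h = 16e3, 2.0, 3.0          # arbitrary volume [mL] and weights

# Linear in the dose: y(g*D1 + h*D2) == g*y(D1) + h*y(D2)
D1, D2, k = 1e6, 5e5, 0.1
lhs = one_cpt(t, g*D1 + h*D2, V, k)
rhs = g*one_cpt(t, D1, V, k) + h*one_cpt(t, D2, V, k)
assert np.allclose(lhs, rhs)

# Nonlinear in k: y(g*k1 + h*k2) != g*y(k1) + h*y(k2)
D, k1, k2 = 1e6, 0.1, 0.3
lhs = one_cpt(t, D, V, g*k1 + h*k2)
rhs = g*one_cpt(t, D, V, k1) + h*one_cpt(t, D, V, k2)
assert not np.allclose(lhs, rhs)
```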

Choose the best model

Rule 1: The model should be a necessary and sufficient description of the real process: necessary - good fitting of the observed data; sufficient - no redundancy of parameters in the structure.

Rule 2: The model structure should be parsimonious: it should adequately represent the real process while containing the smallest number of parameters.

Ex: the 1-cpt model is misspecified, the 3-cpt model is redundant; the 2-cpt model is the best.

Fotemustine dataset

Rectangular array, i = 1 : n, with n = 29 individuals. Columns: ID; baseline count ANC_BASE; nadir count ANC_Nadir; dose D; clearance CL; body surface area BSA; weight W.

Output / observations: relative ANC count y_i = (ANC_BASE,i - ANC_Nadir,i) / ANC_BASE,i.

Modeling neutrophil toxicity

Logit model: y_Mi = e^{z_i} / (1 + e^{z_i}), with

z_i = x1 + D_i x2 + CL_i x3 + BSA_i x4 + ... + W_i x_p = g_i^T x,

where g_i = [1 D_i CL_i BSA_i ... W_i]^T collects the data (constants) for individual i and x = [x1 x2 ... x_p]^T the parameters, p = 14.

Problem: obtain numerical values x̂ so that the predictions y_Mi fit the observations y_i.

Application: predict the toxicity y_M0 for a new individual with given data g_0: y_M0 = logit(g_0^T x̂).
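A sketch of the prediction step; the coefficient vector and covariate values below are made-up placeholders, not the fitted fotemustine estimates, and only the first few covariates are used:

```python
import numpy as np

def logit_predict(g, x):
    """Logit model: y_M = exp(z) / (1 + exp(z)), with z = g^T x."""
    z = g @ x
    return np.exp(z) / (1.0 + np.exp(z))

# Hypothetical coefficients x_hat and covariate vector [1, D, CL, BSA, W]
x_hat = np.array([-1.0, 0.004, -0.05, 0.3, 0.01])   # placeholder values
g0 = np.array([1.0, 150.0, 4.0, 1.8, 70.0])         # new individual's data
y_M0 = logit_predict(g0, x_hat)
print(round(float(y_M0), 3))    # predicted relative ANC drop, between 0 and 1
```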

Indexes for the goodness of fit

Regression line y_i = a y_Mi + b + e_i (correlation coefficient R).

Coefficient of determination: R^2 = 1 - var(y_i - y_Mi) / var(y_i) is the part of the data variability (with respect to the average y) explained by the model.

Root mean squared error: RMSE = sqrt( sum_{i=1}^{n} w_i (y_i - y_Mi)^2 / (n - p) ). The RMSE takes into account the sample size n, the number of model parameters p, and the closeness of the predictions to the observations.

[Figure: observed y_i vs. predicted y_Mi (0.5 to 0.9 on both axes), with the regression line.]
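Both indexes can be computed directly; a sketch with unit weights w_i = 1 and synthetic observed/predicted values:

```python
import numpy as np

def r_squared(y, y_m):
    """R^2 = 1 - var(y - y_m) / var(y): fraction of variability explained."""
    return 1.0 - np.var(y - y_m) / np.var(y)

def rmse(y, y_m, p, w=None):
    """RMSE = sqrt( sum w_i (y_i - y_Mi)^2 / (n - p) )."""
    w = np.ones_like(y) if w is None else w
    n = len(y)
    return np.sqrt(np.sum(w * (y - y_m) ** 2) / (n - p))

y   = np.array([0.62, 0.71, 0.55, 0.90, 0.80])   # observations (synthetic)
y_m = np.array([0.60, 0.74, 0.57, 0.85, 0.78])   # model predictions
print(round(r_squared(y, y_m), 3), round(rmse(y, y_m, p=2), 4))
```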

Reduce the model structure

At each stage, the parameter whose estimate has the largest coefficient of variation (CV) is removed from the structure (* = removed):

p        14     13     12     11     10     9      8      7      6      5
x1       0.69   0.75   0.73   0.46   0.5    0.4    0.3    0.5    0.9    0.22
x2       0.35   0.20   0.9    0.8    0.3    0.09   0.08   0.08   0.2    0.4
x3      -2.7   -4.3   -3.29   *      *      *      *      *      *      *
x4       0.60   0.33   0.30   0.26   0.24   0.22   0.2    0.5    0.23   0.33
x5      -2.56   3.75   *      *      *      *      *      *      *      *
x6      -3.27   2.6    2.5    3.67   *      *      *      *      *      *
x7       1.87  -2.50  -2.32  -2.36  -2.8    *      *      *      *      *
x8       1.36   0.7    0.70   0.67   0.39   0.35   0.34   0.25   *      *
x9       0.84   0.66   0.64   0.54   0.52   0.42   0.37   0.20   0.20   0.27
x10      1.69   0.79   0.77   0.67   0.57   0.5    0.52   *      *      *
x11      7.67   *      *      *      *      *      *      *      *      *
x12     -1.66  -2.08  -0.98  -1.10  -1.00  -1.09   *      *      *      *
x13     -4.49  -0.69  -0.62  -0.60  -0.52  -0.23  -0.23  -0.7   -0.26   *
x14     -1.66  -0.77  -0.69  -0.68  -0.62  -0.34  -0.29  -0.20  -0.23  -0.32
RMSE(%)  6.04   5.46   5.30   5.16   5.03   4.92   4.90   5.30   7.08   8.66

Redundancy, misspecification

[Figure: RMSE (%) and R^2 plotted against the number of parameters p, from 14 down to 5. Misspecification (too few parameters): high RMSE, the necessary condition is not satisfied. Redundancy (too many parameters): increasing RMSE, the sufficient condition is not satisfied.]

Warning: R^2, the squared correlation coefficient, does not detect redundancy!

Functional scheme (dynamic)

Model linear in the parameters: no loop, one-stage estimation. Model nonlinear in the parameters: many loops until convergence.

[Diagram: as before — administration protocol, PK process with measurement noise, PK model started from arbitrary initial values, equivalence criterion, prior information, and the nonlinear-programming loop.]

Iterations, parameter convergence

Ex: fotemustine neutrophil toxicity, nonlinear modeling, n = 29, p = 8.

[Figure: the parameter values (1st, 2nd, 3rd, ...) and the RMSE vs. the iteration number, from the arbitrary initial values to the optimized final values after about 25 iterations.]

Errors in the functional scheme

The existing errors: experimental, structural, parametric. Residual error: experimental + structural (model misspecification).

[Figure: RMSE vs. iteration number; the initial parametric error is canceled at convergence, leaving the residual error.]

The system identification loop

Prior knowledge -> experimental design -> dataset. Choose the model structure. Calculate the model. Choose the criterion of fit. Validate the model. Not OK: revise. OK: use it!

Real-time processing of data

Data processing:
- In batch, or off-line: wait until the last observation and process all data at once ({ m observations } -> 1 data processing).
- Sequentially, or in real-time: perform the modeling while the observations become available ({ 1 observation -> 1 datum processing } x m times).

Warning: in the first stages, the lack of individual information should be offset by the prior information (attractor - population studies). As the number of individual samples increases, the prior information can be forgotten. -> Bayesian estimation.

Estimates in real-time processing

Only one observation (t1, y1): y1 = (D/V) exp(-k t1) -> infinitely many pairs of values for (V, k).

Two observations (t1, y1) and (t2, y2): y1 = (D/V) exp(-k t1), y2 = (D/V) exp(-k t2) -> exact solution

k = ln(y1/y2) / (t2 - t1), V = (D/y1) exp(-k t1).

More than two observations: no exact solution; find the (V, k) minimizing the total discrepancy between observations and predictions, sum_{i=1:m} [ y_i - (D/V) exp(-k t_i) ]^2 -> RMSE.
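The two-observation case has a closed form; a sketch using the observed pairs (1 h, 52.9 ng/mL) and (2 h, 38.4 ng/mL) quoted on the geometric-analysis slides, with D = 1 mg assumed:

```python
import math

def two_point_estimates(t1, y1, t2, y2, D):
    """Exact (V, k) from two bolus observations of y = (D/V) exp(-k t)."""
    k = math.log(y1 / y2) / (t2 - t1)
    V = (D / y1) * math.exp(-k * t1)
    return V, k

D = 1e6                    # assumed dose: 1 mg in ng, so V comes out in mL
V, k = two_point_estimates(1.0, 52.9, 2.0, 38.4, D)
print(round(V / 1000, 2), "L,", round(k, 4), "1/h")
```

With only two noisy samples the result is biased with respect to the reference values V = 16 L, k = 0.1 h^-1, as the later slides show.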

Equivalence criterion, real-time

Indexing: i for the observation times, j for the individual associated with the p parameters x_j.

Equivalence criterion (intra-kinetic weighting): SE = sum_{i=1}^{m} w_i (obs_i - pred_i(x_j))^2. In real-time, when m < p, SE leads to infinite solutions.

Bayesian attractor (inter-kinetic weighting): select the most appealing solution by using an attractor located at x_A and associated with a force of attraction D_A: prior = (x_j - x_A)^T D_A (x_j - x_A).

Combine the two criteria into a single objective, e.g. ln(SE) + ln(prior).
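A minimal sketch of the attractor idea with m = 1 observation and p = 2 parameters; the attractor location, the force-of-attraction matrix, and the simple combination SE + prior are all illustrative assumptions (the slide's exact weighting may differ):

```python
import numpy as np

def one_cpt(t, D, V, k):
    return (D / V) * np.exp(-k * t)

D = 1e6                             # assumed dose: 1 mg in ng
t_obs = np.array([1.0])             # m = 1 observation < p = 2 parameters
y_obs = np.array([52.9])            # ng/mL

x_A = np.array([16e3, 0.10])        # attractor (population values), assumed
D_A = np.diag([1 / 4e3**2, 1 / 0.03**2])   # force of attraction, assumed

def criterion(x):
    V, k = x
    se = np.sum((y_obs - one_cpt(t_obs, D, V, k)) ** 2)   # intra-kinetic term
    prior = (x - x_A) @ D_A @ (x - x_A)                   # inter-kinetic term
    return se + prior

# Coarse grid search for the minimizer of the combined criterion
Vs = np.linspace(8e3, 30e3, 221)
ks = np.linspace(0.02, 0.40, 191)
best = min(((criterion(np.array([V, k])), V, k) for V in Vs for k in ks))
_, V_hat, k_hat = best
print(round(V_hat / 1000, 2), "L,", round(k_hat, 3), "1/h")
```

With a single sample, the estimate lands near the geometric locus of the observation while being pulled toward the attractor.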

Concept of simulation

[Diagram: the PK model with known parameter values x_0 and no structural error plays the role of the PK process; a random generator of measurement noise disturbs the model output y_M(t_i, x_0) to produce the "observation", which then enters the usual identification loop.]

Geometric analysis

y(t) = (D/V) exp(-k t). Each observed pair (t_i, y_i) is represented by a geometric locus in the parameter space:

k = (1/t_i) ln( D / (V y_i) ).

Reference values: V = 16 L, k = 0.1 h^-1. Experimental design: D = 1 mg, measurement noise 15%.

[Figure: the concentration-time profile y [ng/mL] vs. t [h], 0-12 h.]

nbr. samples < nbr. parameters

Infinitely many solutions on the geometric locus of the 1st observed pair (1 h, 52.9 ng/mL). Set the solution using the Bayesian attractor (x_A, D_A):

V [L]: Ref. 16.0, Est. 16.5. k [h^-1]: Ref. 0.10, Est. 0.1357.

[Figure: the locus and the attractor in the (V, k) plane.]

nbr. samples = nbr. parameters

Two observations, (1 h, 52.9 ng/mL) and (2 h, 38.4 ng/mL): an exact solution exists, but it is biased!

V [L]: Ref. 16.0, Calc. 13.73. k [h^-1]: Ref. 0.10, Calc. 0.3202.

Remove the bias -> intensive sampling.

nbr. samples > nbr. parameters

The calculated parameters are the coordinates of the intersecting loci. With m = 5 sampling times (1, 2, 4, 6, 8 h), the intersections are the C(m, p) = C(5, 2) = 10 combinations of observations:

V [L]: med 17.5, iqr 8.3, MLE 17.27. k [h^-1]: med 0.0859, iqr 0.0805, MLE 0.0937.

[Figure: the loci and their intersections in the (V, k) plane.]
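The construction can be reproduced: each of the C(5, 2) = 10 pairs of observations gives one (V, k) intersection, and the median of the cloud is a robust estimate. The concentrations below are illustrative, not the original data:

```python
import math
from itertools import combinations

def two_point_estimates(t1, y1, t2, y2, D):
    """Intersection of the two loci defined by (t1, y1) and (t2, y2)."""
    k = math.log(y1 / y2) / (t2 - t1)
    V = (D / y1) * math.exp(-k * t1)
    return V, k

D = 1e6                               # assumed dose: 1 mg in ng
times = [1.0, 2.0, 4.0, 6.0, 8.0]     # sampling schedule [h]
obs = [52.9, 47.1, 40.2, 36.8, 30.5]  # illustrative noisy concentrations [ng/mL]

pairs = list(combinations(range(len(times)), 2))
assert len(pairs) == 10               # C(5, 2) = 10 intersections
sols = [two_point_estimates(times[i], obs[i], times[j], obs[j], D)
        for i, j in pairs]

# Medians (upper of the two middle values) of the solution cloud
V_med = sorted(V for V, _ in sols)[len(sols) // 2]
k_med = sorted(k for _, k in sols)[len(sols) // 2]
print(round(V_med / 1000, 1), "L,", round(k_med, 3), "1/h")
```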

Sensitivity w.r.t. noise

Reference: V = 16.0 L, k = 0.10 h^-1.
- 5% noise: med V = 16.37 L, k = 0.0956 h^-1; iqr V = 2.3 L, k = 0.0257 h^-1.
- 15% noise: med V = 17.5 L, k = 0.0859 h^-1; iqr V = 8.3 L, k = 0.0805 h^-1.

[Figure: clouds of locus intersections in the (V, k) plane for the two noise levels.]

Without measurement error

Accurate experimental devices -> undisturbed observations: all the loci intersect at a single point, and m = p samples are enough. When?

[Figure: loci in the (V, k) plane intersecting at a single point.]

Bayesian estimation

- It analyzes only a few samples per patient: respect of the ethical constraints, reduction of the cost of the PK analysis.
- It requires: obtaining the prior information (attractor, or density f(x)); balancing the population and the individual information (balance the prior and likelihood terms); operating with staff having a good experience and highly specialized health professionals.
- It may be used in a feedback loop: efficient data analyses.

State and parameter spaces

[Diagram: the real process (PK process) and the artificial mechanism (PK model) are compared in the state space, through observation vs. prediction and the measurement error; the model's random component and the precision of the estimates live in the parameter space.]

State vs. parameter space

y(t) = (D/V) exp(-k t): 90% confidence corridors in the state space (concentration-time profiles, y [µg/mL] vs. t [h]) and 90% confidence regions in the parameter space (the (V, k) plane). "Likelihoodists."

[Figure: four panels showing the corridors and the corresponding regions.]

Checking identifiability

Structural identifiability: given a hypothetical structure with unknown parameters and a set of proposed experiments (not measurements!), would the unknown parameters be uniquely determinable from these experiments?

Parametric identifiability: estimating the parameters from measurements with errors, and optimizing the sampling designs.

Ex: a 2-cpt structure with input u(t), volume V1 and rate constants k12, k21, kp, ke is structurally non-identifiable -> non-consistent estimate of V.

[Figure: 90% region in the (V, k) plane.]

Overall notes

Modeling extensions in PKs: repeated doses (multi-input) and simultaneous observation of several kinetics (multi-output systems); modeling of the error process.

Validation: check whether the model is "good enough", whether it is valid for its purpose: { statistical tests, confidence intervals, sensitivity analysis, analysis of residuals, information concepts, new experiments }.

Remember: The model never explains; it only describes the available observations. The model is not an exact description of the real process. The success of modeling depends on the richness of the available knowledge.

Building and using models

[Diagram: the real process maps the administration input (time schedule + drug amounts) to the output (observations, or predictions). The fitted sum-of-exponentials model y(t) = sum_{j=1}^{L} A_ij exp(-a_j t) supports simulation, and the clinical constraints drive the dosage adjustment.]

Devil's triangle

The three vertices: the process, the input (administration protocol), the output (concentrations). Knowing two of them, obtain the third:
- Modeling / identification: input + output -> process.
- Simulation: process + input -> output.
- Control: process + desired output -> input.

Dosage adjustment flowchart

Individual dosage adjustment: identify the individual PK parameters by Bayesian estimation, compiling the prior information with the individual observations, then invert the model under the ethical constraints.

Processes and models

Biological real processes: preserve the integral functionality of the real process; avoid experiments leading to breakdown of the feedback mechanisms, or excessively stimulating the process just to record quantifiable observations.

Analysis by modeling: obtain a reliable and efficient (simple) tool complying with the functional properties of the real process; manage complex situations; optimize the experiments (administration and sampling); individualize the pharmaco-therapy (adjust dosage regimens, predict and control the real process).

Modeling pros vs. cons

Pros:
- Discovery of fundamental properties of the PK process by assaying biological samples.
- Data reduction: large datasets are summarized with a few PK parameters.
- Standardization of observations: by means of mechanistic models, the overall observed profile is split into its constitutive [administration + system + drug] components.
- Description of mixed variabilities (population studies): compare, discriminate and classify individuals.
- Ethical individual recognition: combine population information and incomplete individual information in a Bayesian estimator.

Cons:
- Pluridisciplinarity, with emphasis on applied mathematics and engineering techniques.
- Long training course: [undergraduate + graduate] studies followed by a Ph.D. thesis.

Optimize experiments

- Administration protocol: administration site, schedule + amounts (dosage adjustment) -> structural identifiability.
- Sampling protocol: sampling site + schedule -> parametric identifiability.

Controlled systems

- Old: the human operator transfers and uses energy; management of simple situations.
- New: control devices allow rapid actions, independent of the nature of the support; complex but optimized situations; observation devices and remote control.

Information transmission and data processing: computer help, communication theory. The operator acts through the control channel and is informed through the feedback channel.

N. Wiener, Cybernetics, 1940s.

W. von Braun, NASA, 1960s.

R. Jelliffe, Clinical PKs, 1970s. [Photographs: R. Bellman, R. Jelliffe.]

System identification

x̂_B = arg max_x { f(x | y) } (Bayesian estimate); x̂_L = arg max_x { f(y | x) } (maximum likelihood, R. A. Fisher); least-squares fitting of y(t) = (D/V) exp(-k t) (C. F. Gauss).

[Figure: fitted concentration-time curve, y (ng/mL) vs. t (h).]