
Basic Methods in Theoretical Biology


Presentation Transcript


  1. Basic Methods in Theoretical Biology 1 Methodology 2 Mathematical toolkit 3 Models for processes 4 Model-based statistics http://www.bio.vu.nl/thb/course/tb/tb.pdf

  2. Empirical cycle 1.1

  3. Assumptions summarize insight 1.1 • task of research: make all assumptions explicit • these should fully specify subsequent model formulations • assumptions: interface between experimentalist and theoretician • discrepancy between model predictions and measurements: identify which assumption needs replacement • models that give wrong predictions can be very useful to increase insight • structure the list of assumptions by replaceability (mind consistency!)

  4. Model: definition & aims 1.1 • model: • scientific statement in mathematical language • “all models are wrong, some are useful” • aims: • structuring thought; • the single most useful property of models: • “a model is not more than you put into it” • how do factors interact? (mechanisms/consequences) • design of experiments, interpretation of results • inter-, extra-polation (prediction) • decision/management (risk analysis) • observations/measurements: • require interpretation, so involve assumptions • best strategy: be as explicit as possible about assumptions

  5. Model properties 1.1 • language errors: • mathematical, dimensions, conservation laws • properties: • generic (with respect to application) • realistic (precision; consistency with data) • simple (math. analysis, aid in thinking) • complex models are easy to make, difficult to test • simple models that capture the essence are difficult to make • plasticity in parameters (support, testability) • ideals: • assumptions for mechanisms (coherence, consistency) • distinction between action variables and measured quantities • need for core and auxiliary theory

  6. Modelling 1 1.1 • model: • scientific statement in mathematical language • “all models are wrong, some are useful” • aims: • structuring thought; • the single most useful property of models: • “a model is not more than you put into it” • how do factors interact? (mechanisms/consequences) • design of experiments, interpretation of results • inter-, extra-polation (prediction) • decision/management (risk analysis)

  7. Modelling 2 1.1 • language errors: • mathematical, dimensions, conservation laws • properties: • generic (with respect to application) • realistic (precision) • simple (math. analysis, aid in thinking) • plasticity in parameters (support, testability) • ideals: • assumptions for mechanisms (coherence, consistency) • distinction between action variables and measured quantities • core/auxiliary theory

  8. Presumptions → Laws 1.1 [Diagram: presumptions, hypotheses, theories, laws ordered by increasing demonstrated support; the amount of support is always limited (proofs only exist in mathematics); the role of abstract concepts runs from 0 (“facts”, no predictions possible) to large (“general theories”, predictions possible)]

  9. Theories → Models 1.1 Theory: a set of coherent and consistent assumptions from which models can be derived for particular situations Models may or may not represent theories; it depends on the assumptions on which they are based If a model itself is the assumption, it is only a description; if it is inconsistent with data and must be rejected, you have nothing If a model that represents a theory must be rejected, a systematic search can start for the assumptions that need replacement Unrealistic models can be very useful in guiding research to improve assumptions (= insight) Many models don’t need to be tested against data because they fail more important consistency tests Testability of models/theories comes in gradations

  10. Auxiliary theory 1.1 Quantities that are easy to measure (e.g. respiration, body weight) have contributions from several processes → they are not suitable as variables in explanatory models Variables in explanatory models are not directly measurable → we need auxiliary theory to link core theory to measurements Standard DEB model: isomorph with 1 reserve & 1 structure that feeds on 1 type of food

  11. Measurements typically involve interpretations, models 1.1 Given: “the air temperature in this room is 19 degrees Celsius” Used equipment: mercury thermometer Assumption: the room has a temperature (spatially homogeneous) Actual measurement: height of the mercury column Height of the mercury column → temperature: a model! How realistic is this model? What if the temperature is changing? Task: make assumptions explicit and be aware of them Question: what is calibration?

  12. Complex models 1.1 • hardly contribute to insight • hardly allow parameter estimation • hardly allow falsification Avoid complexity by • delineating modules • linking modules in simple ways • estimating parameters of modules only

  13. Causation 1.1 Cause and effect sequences can work in chains A → B → C But are problematic in networks (A, B and C all affecting each other) The framework of dynamic systems allows for a holistic approach

  14. Dimension rules 1.2 • quantities left and right of = must have equal dimensions • + and – are only defined for quantities with the same dimension • ratios of variables with similar dimensions are only dimensionless if addition of these variables has a meaning within the model context • never apply transcendental functions (log, exp, sin, …) to quantities with a dimension What about pH, and pH1 – pH2? • don’t replace parameters by their values in model representations • y(x) = a x + b, with a = 0.2 M^-1, b = 5 → y(x) = 0.2 x + 5 What dimensions do y and x have? Distinguish dimensions and units!
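The dimension rules above can be checked mechanically. The sketch below is my own addition, not part of the course notes; the Quantity class and the dimension labels are hypothetical helpers. It shows a toy quantity type that refuses addition of unlike dimensions and transcendental functions of dimensioned arguments.

```python
from dataclasses import dataclass
import math

@dataclass(frozen=True)
class Quantity:
    """Toy dimensioned quantity: a value plus a dimension label (hypothetical helper)."""
    value: float
    dim: str  # e.g. "M" for molar, "1/M", "1" for dimensionless

    def __add__(self, other: "Quantity") -> "Quantity":
        # + and - are only defined for quantities with the same dimension
        if self.dim != other.dim:
            raise ValueError(f"cannot add {self.dim} to {other.dim}")
        return Quantity(self.value + other.value, self.dim)

def log_of(q: Quantity) -> float:
    # transcendental functions only accept dimensionless arguments
    if q.dim != "1":
        raise ValueError("log of a dimensioned quantity is undefined")
    return math.log(q.value)

# y(x) = a x + b with a = 0.2 M^-1 and b = 5 (dimensionless):
# x must carry dimension M so that a*x is dimensionless and can be added to b.
a = Quantity(0.2, "1/M")
b = Quantity(5.0, "1")
x = Quantity(3.0, "M")
y = Quantity(a.value * x.value, "1") + b        # fine: both terms are dimensionless
# Quantity(1.0, "M") + Quantity(1.0, "1")       # would raise ValueError
print(y)
```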

  15. Models with dimension problems 1.2 • Allometric model: y = a W^b • y: some quantity, a: proportionality constant • W: body weight, b: allometric parameter in (2/3, 1) • Usual form: ln y = ln a + b ln W • Alternative form: y = y0 (W/W0)^b, with y0 = a W0^b • Alternative model: y = a L^2 + b L^3, where L ∝ W^1/3 • Freundlich’s model: C = k c^1/n • C: density of compound in soil, k: proportionality constant • c: concentration in liquid, n: parameter in (1.4, 5) • Alternative form: C = C0 (c/c0)^1/n, with C0 = k c0^1/n • Alternative model: C = 2 C0 c (c0 + c)^-1 (Langmuir’s model) • Problem: no natural reference values W0, c0 • Values of y0, C0 depend on the arbitrary choice
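A short numerical illustration of the reference-value problem (my own sketch; the parameter values are made up): y0 in the alternative form y = y0 (W/W0)^b changes with the arbitrary choice of W0, even though the predicted y does not.

```python
a, b = 0.3, 0.75                       # hypothetical allometric parameters, y = a W^b
W = 25.0
for W0 in (1.0, 10.0, 100.0):          # arbitrary reference weights
    y0 = a * W0**b                     # y0 depends on the choice of W0 ...
    y = y0 * (W / W0)**b               # ... but the predicted value y(W) does not
    print(f"W0 = {W0:6.1f}   y0 = {y0:7.3f}   y(25) = {y:7.3f}")
```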

  16. Egg development time 1.2 Bottrell, H. H., Duncan, A., Gliwicz, Z. M., Grygierek, E., Herzig, A., Hillbricht-Ilkowska, A., Kurasawa, H., Larsson, P. and Weglenska, T. 1976 A review of some problems in zooplankton production studies. Norw. J. Zool. 24: 419-456

  17. Space-time scales 1.3 Each process has its characteristic domain of space-time scales [Figure: space against time, with levels ranging from molecule, via cell, individual and population, to ecosystem and system earth] When changing the space-time scale, new processes will become important and others will become less important Models with many variables & parameters hardly contribute to insight

  18. Problematic research areas 1.3 Small time scale combined with large spatial scale Large time scale combined with small spatial scale Reason: likely to involve models with a large number of variables and parameters Such models rarely contribute to new insight due to uncertainties in formulation and parameter values

  19. Different models can fit equally well 1.5 Two curves fitted: a L^2 + b L^3 with a = 0.0336 μl h^-1 mm^-2, b = 0.01845 μl h^-1 mm^-3 a L^b with a = 0.0156 μl h^-1 mm^-2.437, b = 2.437 [Figure: O2 consumption (μl/h) against length (mm), with both fitted curves]
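The point of this slide can be reproduced with simulated data. The sketch below is my addition (the data are synthetic, not the original measurements): it fits both a L^2 + b L^3 and a L^b with scipy and shows that their residual sums of squares are comparable.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)
L = np.linspace(1.0, 5.0, 30)                            # length, mm (synthetic)
y_true = 0.034 * L**2 + 0.018 * L**3                     # "true" O2 consumption curve
y = y_true * (1 + 0.05 * rng.standard_normal(L.size))    # add 5% scatter

def poly(L, a, b):                                       # a L^2 + b L^3
    return a * L**2 + b * L**3

def allo(L, a, b):                                       # a L^b
    return a * L**b

for name, f, p0 in [("a L^2 + b L^3", poly, [0.03, 0.02]),
                    ("a L^b", allo, [0.02, 2.5])]:
    p, _ = curve_fit(f, L, y, p0=p0)
    ssq = np.sum((y - f(L, *p))**2)
    print(f"{name:15s} parameters = {np.round(p, 4)}  SSQ = {ssq:.3g}")
# Both fits have a comparable SSQ: goodness of fit alone cannot distinguish them.
```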

  20. Plasticity in parameters 1.7 • If the plasticity of the shapes of y(x|a) is large as a function of a: • few problems in estimating the value of a from {xi, yi}i (small confidence intervals) • little support from the data for the underlying assumptions (if the data were different: another parameter value results, but still a good fit, so no rejection of the assumption) • A model can fit data well for wrong reasons

  21. Biodegradation of compounds 1.7 n-th order model: dX/dt = –k X^n Monod model: dX/dt = –k X / (K + X) X: conc. of compound, X0: X at time 0, t: time, k: degradation rate, n: order, K: saturation constant

  22. Biodegradation of compounds 1.7 [Figure: scaled concentration against scaled time for the n-th order model (left) and the Monod model (right)]
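The scaled curves of slides 21-22 can be reproduced numerically. This is a sketch under the assumption that the equations (not shown in the transcript) are the standard forms dX/dt = -k X^n and dX/dt = -k X / (K + X); the parameter values are illustrative only.

```python
import numpy as np
from scipy.integrate import solve_ivp

X0, k, K, n = 1.0, 0.5, 0.3, 2.0                  # illustrative values
t = np.linspace(0.0, 10.0, 201)

def nth_order(t, X):                              # n-th order model: dX/dt = -k X^n
    return -k * X**n

def monod(t, X):                                  # Monod model: dX/dt = -k X / (K + X)
    return -k * X / (K + X)

sol_n = solve_ivp(nth_order, (t[0], t[-1]), [X0], t_eval=t)
sol_m = solve_ivp(monod, (t[0], t[-1]), [X0], t_eval=t)
print("scaled conc, n-th order:", np.round(sol_n.y[0] / X0, 3)[::50])
print("scaled conc, Monod:     ", np.round(sol_m.y[0] / X0, 3)[::50])
```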

  23. Verification vs falsification 1.9 Verification cannot work because different models can fit data equally well Falsification cannot work because models are idealized simplifications of reality “All models are wrong, but some are useful” Support works to some extent Usefulness works but depends on context (aim of the model); a model without context is meaningless

  24. Model without dimension problem 1.2 Arrhenius model: ln k = a – T0/T k: some rate, T: absolute temperature, a: parameter, T0: Arrhenius temperature Alternative form: k = k0 exp{1 – T0/T}, with k0 = exp{a – 1} Difference with the allometric model: no reference value required to solve the dimension problem
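A minimal fitting sketch (my addition; the rate data are invented): because ln k is linear in 1/T, the parameters a and T0 follow from ordinary linear regression.

```python
import numpy as np

T = np.array([278.0, 285.0, 293.0, 300.0, 310.0])   # absolute temperature, K
k = np.array([0.12, 0.20, 0.35, 0.52, 0.95])        # some rate (invented values)

slope, a = np.polyfit(1.0 / T, np.log(k), 1)        # ln k = a + slope * (1/T)
T0 = -slope                                         # Arrhenius temperature, K
print(f"a = {a:.2f}, T0 = {T0:.0f} K")
```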

  25. Central limit theorems 2.6 The sum of n independent identically (i.i.) distributed random variables becomes normally distributed for increasing n. The sum of n independent point processes tends to behave as a Poisson process for increasing n. The number of events in a time interval is i.i. Poisson distributed The time intervals between subsequent events are i.i. exponentially distributed
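A quick simulation of the first theorem (a sketch; the exponential distribution and the sample sizes are arbitrary choices): sums of n i.i.d. exponential variables become increasingly symmetric, i.e. closer to normal, as n grows.

```python
import numpy as np

rng = np.random.default_rng(0)
for n in (1, 2, 10, 100):
    sums = rng.exponential(scale=1.0, size=(100_000, n)).sum(axis=1)
    z = (sums - n) / np.sqrt(n)          # standardize: the sum has mean n, variance n
    skew = np.mean(z**3)                 # skewness; 0 for a normal distribution
    print(f"n = {n:4d}   skewness = {skew:+.3f}")
```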

  26. Sums of random variables 2.6 [Figure: exponential probability density (left) and Poisson probability (right) for sums of random variables]

  27. Normal probability density 2.6

  28. Dynamic systems 3.2 Defined by the simultaneous behaviour of input, state variables, output Supply systems: input + state variables → output Demand systems: input ← state variables + output Real systems: mixtures between supply & demand systems Constraints: mass and energy balance equations State variables: span a state space behaviour: usually a set of ODEs with parameters Trajectory: path of the state variables in state space Parameters: constant, functions of time, functions of modifying variables compound parameters: functions of parameters
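A minimal sketch of such a dynamic system (my own toy example, loosely in the spirit of a supply system; the names "reserve" and "structure" and all parameter values are made up): two state variables driven by a constant input, with the behaviour given by a pair of ODEs and the trajectory obtained by integration.

```python
import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, state, input_rate, k1, k2):
    reserve, structure = state                      # the two state variables
    d_reserve = input_rate - k1 * reserve           # input feeds the first variable
    d_structure = k1 * reserve - k2 * structure     # which in turn feeds the second
    return [d_reserve, d_structure]

sol = solve_ivp(rhs, (0.0, 50.0), [0.1, 0.1], args=(1.0, 0.3, 0.1),
                t_eval=np.linspace(0.0, 50.0, 11))
print(sol.y)        # the trajectory of the state variables in state space
```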

  29. Statistics 4.1 • Deals with • estimation of parameter values, and confidence in these values • tests of hypotheses about parameter values • does a parameter value differ from a known value? • do parameter values differ between two samples? • Does NOT deal with • does model 1 fit better than model 2 if model 1 is not a special case of model 2? • Statistical methods assume that the model is given • (Non-parametric methods only use some properties of the given model, rather than its full specification)

  30. Stochastic vs deterministic models 4.1 • Only stochastic models can be tested against experimental data • Standard way to extend a deterministic model to a stochastic one: • regression model: y(x|a,b,..) = f(x|a,b,..) + e, with e ~ N(0, σ^2) • Originates from physics, where e stands for measurement error • Problem: • deviations from the model are frequently not measurement errors • Alternatives: • deterministic systems with stochastic inputs • differences in parameter values between individuals • Problem: • parameter estimation methods become very complex
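The regression extension can be made concrete with a few lines of simulation (a sketch; the straight-line f and all values are hypothetical): the deterministic part f(x|a,b) plus a normally distributed error e.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(2)
a_true, b_true, sigma = 2.0, 0.5, 0.3
x = np.linspace(0.0, 10.0, 50)
e = rng.normal(0.0, sigma, x.size)            # e ~ N(0, sigma^2)
y = a_true + b_true * x + e                   # y(x|a,b) = f(x|a,b) + e

f = lambda x, a, b: a + b * x                 # deterministic part
(a_hat, b_hat), _ = curve_fit(f, x, y)
print(f"a = {a_hat:.2f}, b = {b_hat:.2f}")
```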

  31. Stochastic vs deterministic models 4.1 • Tossing a die can be modelled in two ways • Stochastically: each possible outcome has the same probability • Deterministically: detailed modelling of take-off and bouncing, with initial conditions; many parameters • Imperfect control of the process makes the deterministic model impractical

  32. Large scatter 4.1 • complicates parameter estimation • complicates falsification Avoid large scatter by • Standardization of factors that contribute to measurements • Stratified sampling

  33. Kinds of statistics 4.1 Descriptive statistics: sometimes useful, frequently boring Mathematical statistics: a beautiful mathematical construct, rarely applicable due to the assumptions needed to keep it simple Scientific statistics: still in its childhood due to research workers being specialised; coming up thanks to the increase in computational power (Monte Carlo studies)

  34. Tasks of statistics 4.1 • Deals with • estimation of parameter values, and confidence in these values • tests of hypotheses about parameter values • does a parameter value differ from a known value? • do parameter values differ between two samples? • Does NOT deal with • does model 1 fit better than model 2 if model 1 is not a special case of model 2? • Statistical methods assume that the model is given • (Non-parametric methods only use some properties of the given model, rather than its full specification)

  35. Independent observations 4.1 X and Y are independent iff Prob{X ≤ x, Y ≤ y} = Prob{X ≤ x} Prob{Y ≤ y} for all x and y If X and Y are independent: E(XY) = E(X) E(Y) and var(X + Y) = var(X) + var(Y)

  36. Statements to remember 4.1 • “proving” something statistically is absurd • if you do not know the power of your test, • you don’t know what you are doing while testing • you need to specify the alternative hypothesis to know the power • this involves knowledge about the subject (biology, chemistry, ..) • parameters only have a meaning if the model is “true” • this involves knowledge about the subject

  37. Nested models 4.5 Venn diagram

  38. Testing of hypotheses 4.5 Error of the first kind: reject the null hypothesis while it is true Error of the second kind: accept the null hypothesis while the alternative hypothesis is true Level of significance of a statistical test: α = probability of an error of the first kind Power of a statistical test: 1 – probability of an error of the second kind [Table: decision (accept/reject) against the true state of the null hypothesis] No certainty in statistics
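Power only has a meaning once a specific alternative is given. The sketch below is my addition, using a two-sample t-test and an arbitrary effect size as an example; it estimates the power by Monte Carlo simulation.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
alpha, n, effect = 0.05, 20, 0.8              # illustrative choices
n_reject = 0
for _ in range(5_000):
    x = rng.normal(0.0, 1.0, n)               # sample under the null mean
    y = rng.normal(effect, 1.0, n)            # sample under the alternative
    if stats.ttest_ind(x, y).pvalue < alpha:
        n_reject += 1
print("estimated power:", n_reject / 5_000)   # = 1 - P(error of the second kind)
```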

  39. Parameter estimation 4.6 Most frequently used method: maximization of the (log) likelihood likelihood: probability of finding the observed data (given the model), considered as a function of the parameter values If we repeat the collection of data many times (same conditions, same number of data points), the resulting ML estimates vary; for large samples they are approximately normally distributed around the true value
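A minimal ML example (my own sketch with simulated exponential data; the choice of distribution is arbitrary): write down the log likelihood and maximize it numerically.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(4)
data = rng.exponential(scale=2.0, size=100)        # simulated data, true rate = 0.5

def neg_log_lik(rate):
    # minus the log likelihood of i.i.d. exponential data with parameter `rate`
    return -(data.size * np.log(rate) - rate * data.sum())

res = minimize_scalar(neg_log_lik, bounds=(1e-6, 10.0), method="bounded")
print("ML estimate:", res.x, "  analytic answer:", 1.0 / data.mean())
```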

  40. Profile likelihood 4.6 [Figure: profile of the log likelihood against a parameter value, with the large-sample approximation of the 95% confidence interval]
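Continuing the exponential sketch above (still hypothetical data), the large-sample 95% profile-likelihood interval keeps every rate whose log likelihood lies within 0.5 times the 95% quantile of a chi-square(1) distribution of the maximum.

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(4)
data = rng.exponential(scale=2.0, size=100)

def log_lik(rate):
    return data.size * np.log(rate) - rate * data.sum()

rate_hat = 1.0 / data.mean()                        # ML estimate
cutoff = log_lik(rate_hat) - 0.5 * chi2.ppf(0.95, df=1)

grid = np.linspace(0.2, 1.2, 2001)
inside = grid[log_lik(grid) >= cutoff]              # rates not rejected at the 5% level
print(f"95% profile-likelihood interval: ({inside.min():.3f}, {inside.max():.3f})")
```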

  41. Comparison of models 4.6 Akaike Information Criterion for sample size n and K parameters, in the case of a regression model You can compare the goodness of fit of different models to the same data, but statistics will not help you to choose between the models
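A sketch of the comparison (my addition; the slide's exact formula is not in the transcript, so the least-squares form AIC = n ln(SSQ/n) + 2K used here is an assumption): the criterion is computed for the two models of slide 19 on simulated data.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(5)
L = np.linspace(1.0, 5.0, 30)
y = 0.034 * L**2 + 0.018 * L**3 + 0.01 * rng.standard_normal(L.size)

models = {"a L^2 + b L^3": lambda L, a, b: a * L**2 + b * L**3,
          "a L^b":         lambda L, a, b: a * L**b}
for name, f in models.items():
    p, _ = curve_fit(f, L, y, p0=[0.03, 2.0])
    ssq = np.sum((y - f(L, *p))**2)
    aic = L.size * np.log(ssq / L.size) + 2 * len(p)    # assumed least-squares AIC
    print(f"{name:15s} AIC = {aic:.1f}")
```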

  42. Confidence intervals 4.6 Correlations among parameter estimates can have big effects on simultaneous confidence intervals [Figure: length (mm) against time (d); 95% confidence intervals for a fit excluding point 4 and a fit including it]
