
Bayesian parameter estimation in cosmology with Population Monte Carlo

By Darell Moodley (UKZN). Supervisor: Prof. K Moodley (UKZN). SKA Postgraduate Conference, 29 Nov 2010. Applying Bayesian statistics to cosmology: estimate cosmological parameters for specified models efficiently.



Presentation Transcript


  1. Bayesian parameter estimation in cosmology with Population Monte Carlo. By Darell Moodley (UKZN). Supervisor: Prof. K Moodley (UKZN). SKA Postgraduate Conference, 29 Nov 2010

  2. Applying Bayesian statistics to cosmology Estimate cosmological parameters for specified models efficiently. Quantitatively discriminate one model from another in light of the data (model testing). Other relevant applications of parameter estimation include optimising experimental configurations, e.g. MeerKAT antenna parameters.

  3. Bayesian Inference Provides an expression for the posterior probability, which quantifies the uncertainty in the parameters of interest. The posterior is difficult to evaluate directly because of the normalising constant (the evidence). Solution: use simulation to draw samples from this distribution.
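For reference, the standard form of Bayes' theorem behind this slide, with the evidence Z as the normalising constant (notation is mine, not transcribed from the slide):

    p(\theta \mid d, M) = \frac{\mathcal{L}(d \mid \theta, M)\,\pi(\theta \mid M)}{Z(d \mid M)},
    \qquad Z(d \mid M) = \int \mathcal{L}(d \mid \theta, M)\,\pi(\theta \mid M)\,\mathrm{d}\theta .

Z is the evidence used for model selection on the later slides.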

  4. Population Monte Carlo (PMC) • Is an adaptive version of importance sampling. • Constructs a sequence of samples that provides progressively improved estimates of the parameters. • Based on the fundamental importance-sampling identity \mathbb{E}_\pi[f] = \int f(x)\,\frac{\pi(x)}{q(x)}\,q(x)\,\mathrm{d}x. • If x_1, \dots, x_N are drawn from q, we estimate this by \hat{\mathbb{E}}_\pi[f] = \frac{1}{N}\sum_{i=1}^{N} f(x_i)\,\frac{\pi(x_i)}{q(x_i)}.
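A minimal runnable sketch of the identity above, using an illustrative one-dimensional Gaussian target and proposal (the distributions, names, and numbers are my own illustration, not from the talk):

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    # Illustrative unnormalised target pi(x) ∝ exp(-x^2 / 2)
    log_target = lambda x: -0.5 * x**2
    # Importance function q: a wider Gaussian whose tails cover the target
    q = stats.norm(loc=0.0, scale=3.0)

    N = 10_000
    x = q.rvs(size=N, random_state=rng)        # x_1, ..., x_N drawn from q
    w = np.exp(log_target(x) - q.logpdf(x))    # importance weights pi(x_i) / q(x_i)
    w_bar = w / w.sum()                        # self-normalised weights

    # Estimate E_pi[x^2]; for a unit Gaussian target this should be close to 1
    print("E_pi[x^2] ≈", np.sum(w_bar * x**2))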

  5. Methodology • Draw samples from the importance function q(x) = \sum_{d=1}^{D} \alpha_d\,\varphi_d(x; \theta_d), where the \alpha_d are the D component weights giving the proportion of the sample taken from each mixture density \varphi_d with parameters \theta_d. • Allocate importance weights to the samples: w_i = \pi(x_i)/q(x_i), normalised so that \sum_i \bar{w}_i = 1.
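A sketch of this sampling-and-weighting step, assuming Gaussian mixture components (the Gaussian form, the toy target, and all names are my assumptions for illustration):

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    D, N, dim = 3, 5000, 2                      # components, sample size, dimension

    alpha = np.full(D, 1.0 / D)                 # component weights (sum to 1)
    mu = rng.normal(size=(D, dim))              # component means
    cov = np.stack([np.eye(dim)] * D)           # component covariances

    # Draw N samples: pick a component for each point, then draw from it
    comp = rng.choice(D, size=N, p=alpha)
    x = np.array([rng.multivariate_normal(mu[d], cov[d]) for d in comp])

    # Mixture density q(x_i) = sum_d alpha_d * phi_d(x_i)
    q_pdf = sum(alpha[d] * stats.multivariate_normal.pdf(x, mu[d], cov[d])
                for d in range(D))

    # Toy unnormalised target (standard Gaussian); a real run would use the posterior
    log_pi = -0.5 * np.sum(x**2, axis=1)
    w = np.exp(log_pi) / q_pdf                  # importance weights pi(x_i) / q(x_i)
    w_bar = w / w.sum()                         # normalised weights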

  6. Updating rule The component weights \alpha_d and parameters \theta_d of the mixture are updated at each iteration, using the current weighted sample, so as to reduce the Kullback-Leibler distance between the target distribution and the importance function (see slide 16); a sketch for Gaussian components follows below.
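A sketch of such an update for Gaussian mixture components, following the EM-type rule used in adaptive-mixture PMC (Cappé et al. 2008); the function name, interface, and Gaussian assumption are mine, not taken from the talk:

    import numpy as np
    from scipy import stats

    def pmc_update(x, w_bar, alpha, mu, cov):
        """One PMC update of a Gaussian-mixture importance function.

        x      : (N, dim) samples drawn from the current mixture
        w_bar  : (N,) normalised importance weights
        alpha, mu, cov : current component weights, means, covariances
        """
        D = len(alpha)
        # Responsibility of component d for sample i: rho_d(x_i) ∝ alpha_d * phi_d(x_i)
        rho = np.array([alpha[d] * stats.multivariate_normal.pdf(x, mu[d], cov[d])
                        for d in range(D)])
        rho /= rho.sum(axis=0, keepdims=True)

        # Weighted-EM updates of the mixture parameters
        new_alpha = np.array([np.sum(w_bar * rho[d]) for d in range(D)])
        new_mu = np.array([np.sum((w_bar * rho[d])[:, None] * x, axis=0) / new_alpha[d]
                           for d in range(D)])
        new_cov = []
        for d in range(D):
            diff = x - new_mu[d]
            outer = np.einsum('ni,nj->nij', diff, diff)   # per-sample outer products
            new_cov.append(np.sum((w_bar * rho[d])[:, None, None] * outer, axis=0)
                           / new_alpha[d])
        return new_alpha, new_mu, np.array(new_cov)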

  7. Mixture densities Figure: the mixture densities plotted over successive iterations. The weighted sum of the mixture densities iteratively approaches the target distribution.

  8. Convergence of the importance function Convergence is reached when the importance function adequately resembles the target distribution.

  9. MCMC versus PMC Both methods generate samples that are representative of complex distributions. MCMC draws from a proposal distribution, while PMC draws from an importance function that can be chosen to be a mixture of densities. PMC can reduce computational time: its samples are not correlated, so there is no ‘burn-in’ period. PMC is also readily parallelisable, which makes it computationally feasible. Each iteration produces an independent sample, so the algorithm can be stopped at any time.

  10. Illustrative example Figure: a banana-shaped target distribution that we wish to simulate draws from, and the updated importance function after 11 iterations (Wraith et al. 2009).

  11. Applications • Optimisation: search regions of high likelihood to determine optimal parameters and obtain maximum-likelihood estimates. • Model selection: the Bayesian evidence, and hence the Bayes factor between models, can be computed from the existing PMC sample; the evidence is immediately accessible from the same sample used for parameter estimation.

  12. Model Testing • PMC can be used to determine the Bayes factor, which is used to discriminate between models. • Test extensions of the standard model with dark-energy and curvature scenarios (Kilbinger et al. 2010).
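For reference, the Bayes factor between two models M_1 and M_2 is the ratio of their evidences (standard definition; notation mine):

    B_{12} = \frac{Z_1}{Z_2}
           = \frac{\int \mathcal{L}(d \mid \theta_1, M_1)\,\pi(\theta_1 \mid M_1)\,\mathrm{d}\theta_1}
                  {\int \mathcal{L}(d \mid \theta_2, M_2)\,\pi(\theta_2 \mid M_2)\,\mathrm{d}\theta_2}

The value of \ln B_{12} is then interpreted against Jeffreys' scale (slide 27).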

  13. Project Objectives • Carry out a systematic study of the PMC method. • Examine how the efficiency of the algorithm depends on its tuning parameters. • Make a quantitative comparison between PMC and MCMC efficiency. • Estimate cosmological parameters using current data, and discriminate between cosmological models.

  14. References • Wraith D., Kilbinger M., Benabed K., et al., 2009, Phys. Rev. D, 80, 023507 • Kilbinger M., Wraith D., Robert C. P., et al., 2010, MNRAS, 405, 2381

  15. THE END. THANK YOU

  16. Adaptive Importance sampling • Use the Kullback-Leibler distance as the measure of fit between the target distribution and the importance function. • Incorporate mixture densities: q(x) = \sum_{d=1}^{D} \alpha_d\,\varphi_d(x; \theta_d). • The D component weights \alpha_d satisfy \alpha_d > 0, such that \sum_{d=1}^{D} \alpha_d = 1.
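A standard way of writing the criterion referred to here, i.e. the Kullback-Leibler distance from the importance function q to the target \pi (notation mine):

    K(\pi \,\|\, q) = \int \pi(x)\,\ln\frac{\pi(x)}{q(x)}\,\mathrm{d}x

The updating rule of slide 6 adjusts (\alpha_d, \theta_d) at each iteration so as to decrease this quantity.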

  17. Estimator for the Evidence • Using importance sampling, the evidence is estimated by \hat{Z} = \frac{1}{N}\sum_{i=1}^{N} w_i, where w_i = \mathcal{L}(d \mid x_i)\,\pi(x_i)/q(x_i) are the importance weights for importance distribution q, and x_i \sim q. • The variance of the estimator is \sigma^2 = \frac{1}{N}\,\mathrm{Var}_q\!\left[\mathcal{L}\,\pi/q\right]. • We want to choose the optimal q such that σ is minimised.
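A small sketch of this estimator and its Monte Carlo error, computed from log-quantities for numerical stability (the function name and interface are mine):

    import numpy as np

    def evidence_estimate(log_like, log_prior, log_q):
        """Importance-sampling estimate of the evidence and its Monte Carlo error.

        log_like, log_prior, log_q : arrays of log L(d|x_i), log pi(x_i), log q(x_i)
        evaluated at samples x_i drawn from q.
        """
        log_w = log_like + log_prior - log_q        # log importance weights
        scale = log_w.max()
        w = np.exp(log_w - scale)                   # rescaled weights, avoids overflow
        N = len(w)
        Z_hat = np.exp(scale) * w.mean()            # evidence estimate (1/N) sum_i w_i
        sigma = np.exp(scale) * w.std(ddof=1) / np.sqrt(N)   # standard error of Z_hat
        return Z_hat, sigma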

  18. Diagnostics • We want to maximise the overlap between the importance function and the target, so we use the normalised perplexity as an estimate of it: p = \exp(H_N)/N, where H_N = -\sum_i \bar{w}_i \ln \bar{w}_i is the Shannon entropy of the normalised importance weights. Values of p close to 1 indicate good agreement between q and the target.
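A sketch of this diagnostic as a function of the importance weights (the function name is mine):

    import numpy as np

    def normalised_perplexity(w):
        """Normalised perplexity exp(H_N)/N of a set of importance weights.

        Returns a value in (0, 1]; values near 1 indicate that the importance
        function closely matches the target distribution.
        """
        w_bar = np.asarray(w, dtype=float)
        w_bar = w_bar / w_bar.sum()
        nz = w_bar > 0                                # ignore zero weights in the entropy
        H = -np.sum(w_bar[nz] * np.log(w_bar[nz]))    # Shannon entropy of the weights
        return np.exp(H) / len(w_bar)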

  19. Application to cosmology • Compare the flat ΛCDM (cosmological constant) model to dark energy models. • A flat (Ωk = 0) and a curved (Ωk ≠ 0) version is assumed for each dark energy model.

  20. Priors for dark energy and curvature models

  21. Specifying the PMC parameters • For the dark energy models: • Number of iterations T = 10, which can be increased if the perplexity is still low. • Sample size per iteration N = 7 500. • Number of mixture components D = 10. • N/D should not be chosen too small, to ensure numerically stable updating of the components. These settings are collected in the sketch below.
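A small sketch collecting these run-control settings, together with a simple perplexity-based stopping rule (the threshold value, dictionary, and function name are my own illustration):

    # Run-control settings quoted on slide 21
    PMC_SETTINGS = {
        "T": 10,      # maximum number of PMC iterations (increase if perplexity stays low)
        "N": 7_500,   # sample points drawn per iteration
        "D": 10,      # number of mixture components in the importance function
    }

    def keep_iterating(perplexities, settings=PMC_SETTINGS, target=0.9):
        """Continue while iterations remain and the latest perplexity is still low."""
        if len(perplexities) >= settings["T"]:
            return False
        return not perplexities or perplexities[-1] < target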

  22. Results • Standard ΛCDM model is favoured.

  23. Testing Stability • Repeat the PMC runs 25 times.

  24. Primordial fluctuation models • The density fluctuations are given by the primordial power spectrum, characterised by the scalar spectral index n_s, • with tensor modes characterised by the tensor spectral index n_t and the tensor-to-scalar ratio r. • These parameters are then expressed in terms of the slow-roll parameters.
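To first order in slow roll, the relations usually meant by such a parametrisation are the standard ones (textbook results, not transcribed from the slide):

    n_s - 1 = 2\eta - 6\epsilon, \qquad n_t = -2\epsilon, \qquad r = 16\epsilon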

  25. Assumptions and results Tables: priors assumed for the primordial fluctuation models and the corresponding results.

  26. Constraining parameters and Model discrimination • PMC can be used to determine the Bayes factor, which is used to discriminate between models. • PMC is also used to constrain the dark energy equation of state using various data sets (Kilbinger et al. 2010).

  27. Jeffreys’ scale The value of the log Bayes factor is interpreted against Jeffreys’ scale, which grades the strength of the evidence for one model over another.
