## New approaches to variable stars data processing and interpretation


**Zdeněk Mikulášek**
Institute for Theoretical Physics and Astrophysics, Masaryk University, Brno, Czech Republic

**Introduction**

The field of variable star (VS) research has developed enormously since Tsessevich's times. There has been growth in:

- the number of variable stars themselves, by one or two orders of magnitude, as well as in the number of their observers and interpreters;
- the volume of, and common access to, high-quality VS observing data and computational techniques;
- the number of new, efficient statistical techniques and methods, available to everybody thanks to widespread PCs.

Nevertheless, the methods used for data processing have mostly remained the same as those used in Vladimir Platonovich's era.

While every astrophysicist welcomes the large quantity and improved quality of modern observational data, new processing methods are not nearly so popular, since most of them require a good knowledge of matrix calculus.

- A frequent syndrome among VS observers: matrixphobia.
- There are exceptions: a few mathematically erudite theoreticians love the new methods and matrices so much that they never apply them to real observational data.
- Both extremes in data processing are bad; we should find our golden mean.
- Contemporary statistics offers an inexhaustible supply of methods. It is necessary to select several of the most versatile and diverse ones, master them, and learn to combine them.
- The processing method must never be one-size-fits-all; it must always be made to measure for the problem at hand.

**Advanced Principal Component Analysis**

The majority of VS data processing tasks are solved using the least squares method (LSM), strictly speaking linear regression (polynomials, harmonic polynomials). Many other methods can give the same or better results; one example is advanced principal component analysis (APCA).
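The slides do not spell out the APCA algorithm itself, so the following numpy sketch only illustrates the underlying idea of combining linear regression with standard PCA: fit each passband of a multicolour light curve with a shared harmonic basis, then run a PCA (via SVD) on the fitted coefficients. The simulated data, the harmonic basis, and all variable names are invented for this example.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated multicolour light curves: 200 epochs x 3 passbands, all sharing
# one underlying shape with different amplitudes (illustrative data only).
phase = rng.uniform(0.0, 1.0, 200)
shape = np.cos(2 * np.pi * phase) + 0.3 * np.cos(4 * np.pi * phase)
amplitudes = np.array([1.0, 0.8, 0.6])             # per-passband amplitudes
mags = np.outer(shape, amplitudes) + 0.02 * rng.standard_normal((200, 3))

# Step 1 (linear regression): fit every passband with the same
# low-order harmonic polynomial basis.
X = np.column_stack([np.ones_like(phase),
                     np.cos(2 * np.pi * phase), np.sin(2 * np.pi * phase),
                     np.cos(4 * np.pi * phase), np.sin(4 * np.pi * phase)])
coefs, *_ = np.linalg.lstsq(X, mags, rcond=None)   # (5 basis fns) x (3 bands)

# Step 2 (PCA): SVD of the coefficient matrix centred across passbands;
# the first principal component captures the common light-curve shape.
C = coefs - coefs.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(C, full_matrices=False)
explained = s**2 / np.sum(s**2)
print(f"variance explained by PC1: {explained[0]:.3f}")
```

Because the three simulated bands differ only in amplitude, the first component absorbs nearly all of the inter-band variance; for real data the lower components diagnose colour-dependent deviations.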
APCA, a combination of linear regression and standard PCA, is optimal for solving many astrophysical problems:

- realistic fitting of multicolour light curves;
- determination of the moments of extrema of multicolour light curves;
- modelling of the multicolour light curves needed for the improvement of ephemerides;
- diagnostics of secular light-curve changes;
- classification of light curves.

**Least squares method**

- The most popular method among astronomers: minimization of the sum of the squares of the deflections of y with respect to a previously established model S of the observed dependence. The LSM solution is the vector of the free parameters of the model, together with their uncertainties.
- The scientist's real invention is an adequate model of reality; the subsequent steps are only the technique of the solution.
- Finding the real solution is quick if one knows a good estimate of it: the function S can then be substituted, in the (k + 1)-dimensional space of the free parameters, by a paraboloid.
- This converts the problem to linear regression: the solution of a system of k equations for k unknown parameters.
- Linear regression: the model is a linear combination of k functions; the favourites are polynomial regression and harmonic polynomial regression (hpr).

**Benefits of orthogonal models**

- Linear (linearized) LSM also yields uncertainties of the individual parameters.
- Are these uncertainties valid on their own? No! With a non-orthogonal basis the parameters are mutually correlated.
- What use is it, then, to assign errors to the parameters individually?

**How to estimate the uncertainty of the prediction?**

You must know the covariance matrix H. You can transform the functions fi so that they form an orthogonal basis, e.g. by the Gram–Schmidt orthogonalization procedure. Then H becomes diagonal and the parameter uncertainties regain their expected meaning. Orthogonal polynomials serve exactly this purpose.

**True weights in LSM**

- Canonical weights used by VS observers: visual – 1, photographic – 3, photoelectric – 10 (20).
- True weights for TW Dra (before 1942): faintening – 1; visual I – 4; visual II – 28; PEP + photoseries – 266!
- True weights should not be stated in advance! They should be the result of a preliminary iterative analysis.
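The preliminary iterative analysis of weights can be illustrated with a minimal simulation: start from equal weights, fit by weighted LSM, estimate each method's variance from its own residuals, and reset the weights to the reciprocal variances. The simulated data, the fixed iteration count, and the variance estimator are assumptions made for this sketch, not the author's actual procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated observations of a linear trend from two methods with very
# different accuracies (the sigmas are unknown to the fitting code).
t = np.linspace(0.0, 1.0, 120)
true = 2.0 + 3.0 * t
group = np.repeat([0, 1], 60)              # 0 = "visual", 1 = "photoelectric"
sigma_true = np.array([0.5, 0.05])
y = true + rng.standard_normal(t.size) * sigma_true[group]

X = np.column_stack([np.ones_like(t), t])
w = np.ones_like(t)                        # start from equal weights

# Iterate: weighted LSM fit, then re-estimate each group's variance
# from its residuals and set weights = 1 / variance.
for _ in range(10):
    W = np.sqrt(w)
    beta, *_ = np.linalg.lstsq(X * W[:, None], y * W, rcond=None)
    resid = y - X @ beta
    var = np.array([np.mean(resid[group == g] ** 2) for g in (0, 1)])
    w = 1.0 / var[group]

ratio = var[1] ** -1 / var[0] ** -1        # recovered weight ratio of the methods
print(f"fitted parameters: {beta}, weight ratio: {ratio:.0f}")
```

With a true sigma ratio of 10, the recovered weight ratio comes out near 100, echoing the point that true weights (like 266 for photoelectric series) can differ from the canonical ones by more than an order of magnitude.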
The weight is determined not only by the inner accuracy of a particular observational method, but also by the adequacy of the model function: if the model is wrong, the weights of all types of measurements may come out nearly equal!

**Robust regression**

- Practically all real (untrimmed) astrophysical data contain gross errors: outliers. They devastate the LSM; its results become a vagary of the number and distribution of the outliers.
- A second problem: observers intending to clean their data of outliers occasionally erase non-outliers as well.
- Both problems can be treated properly by a suitable robust regression.
- We prefer a robust regression that modifies the weights of the particular measurements by a special function of the deflection of the measured quantity from the predicted value.

**Conclusions**

- New methods of variable star data processing enable us to better exploit the information hidden in the observations. The effort spent on mastering them will be repaid in new, subtle discoveries and revelations.
- Matrix calculus, the true use of weights, advanced principal component analysis, factor analysis, robust regression, the creation and use of orthogonal models, and several other processing techniques should belong to the compulsory outfit of every variable star observer of the 21st century.
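The robust reweighting scheme described under "Robust regression" can be sketched as follows. The slides do not reproduce the author's favourite weight function, so a standard Huber-type function and a MAD scale estimate are assumed here; the data are simulated.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated light-curve-like data with 10% gross outliers.
t = np.linspace(0.0, 1.0, 100)
y = 1.0 + 2.0 * t + 0.05 * rng.standard_normal(t.size)
y[::10] += 3.0                                   # every 10th point is an outlier

X = np.column_stack([np.ones_like(t), t])

# Plain LSM is dragged away from the true line by the outliers.
beta_ls, *_ = np.linalg.lstsq(X, y, rcond=None)

# Robust regression by iterative reweighting: the weight of each point
# decreases with its scaled deflection from the current fit (Huber-type).
beta = beta_ls.copy()
for _ in range(20):
    resid = y - X @ beta
    scale = 1.4826 * np.median(np.abs(resid - np.median(resid)))  # robust sigma
    u = np.abs(resid) / scale
    w = np.where(u <= 1.345, 1.0, 1.345 / u)     # Huber weights
    W = np.sqrt(w)
    beta, *_ = np.linalg.lstsq(X * W[:, None], y * W, rcond=None)

print(f"plain LSM intercept: {beta_ls[0]:.2f}, robust intercept: {beta[0]:.2f}")
```

The plain LSM intercept is biased upward by roughly the outliers' mean contribution, while the reweighted fit recovers the true parameters without any manual trimming, addressing both problems noted above.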
