
FISHER INFORMATION


Presentation Transcript


  1. FISHER INFORMATION SUBMITTED BY CHETHAN M

  2. INTRODUCTION • Fisher information is named for its inventor, R.A. Fisher (1890–1962), a British biostatistician who was among the first to develop and employ—when and as the need arose during his work in genetics and eugenics—such methods as maximum likelihood estimation, the analysis of variance, and the design of experiments. • He also pointed out that Gregor Mendel had probably falsified the “data” in his famous pea-plant experiments, which seem too clean to be the result of any natural process.

  3. FISHER’S IDEA • Fisher’s idea was that attempts to measure physical quantities—such as the time required for the winner of a 100-yard dash to reach the finish line—are invariably frustrated by “noise.” • That’s why multiple stopwatches are ordinarily employed. Moreover, the quantity of information concerning the actual elapsed time contained in such a sample varies directly with the degree to which the sample measurements cluster about a common value.
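A minimal numerical sketch of this idea (my own illustration, not from the slides): if each stopwatch reading is modeled as X ~ Normal(θ, σ²), the Fisher information per reading works out to 1/σ², so tightly clustered readings (small σ) carry more information about the true elapsed time.

```python
import numpy as np

# Sketch under an assumed Gaussian noise model X ~ Normal(theta, sigma^2):
# the Fisher information per reading is 1/sigma^2, estimated here by the
# Monte Carlo average of the squared score.
rng = np.random.default_rng(0)
theta = 10.0  # hypothetical "true" elapsed time, in seconds

for sigma in (0.5, 0.1, 0.02):  # progressively less noisy stopwatches
    x = rng.normal(theta, sigma, size=100_000)
    score = (x - theta) / sigma**2            # d/dtheta log f(x; theta)
    print(f"sigma={sigma}: I ~ {np.mean(score**2):.1f} "
          f"(exact 1/sigma^2 = {1/sigma**2:.1f})")
```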

  4. DEFINITION The Fisher information is the amount of information that an observable random variable X carries about an unknown parameter θ upon which the likelihood function of θ, L(θ) = f(X;θ), depends. The likelihood function is the joint probability of the data, the Xs, conditional on the value of θ, considered as a function of θ. The score is the derivative of the log of the likelihood function with respect to θ, $U(\theta) = \frac{\partial}{\partial\theta}\log f(X;\theta)$. Since the expectation of the score is zero, its variance is simply its second moment,

  5. which implies $\mathcal{I}(\theta) = \mathrm{E}\!\left[\left(\frac{\partial}{\partial\theta}\log f(X;\theta)\right)^{2}\right]$. The Fisher information is thus the expectation of the squared score. A random variable carrying high Fisher information implies that the absolute value of the score is often high. The Fisher information is not a function of a particular observation, as the random variable X has been averaged out. The concept of information is useful when comparing two methods of observing a given random process.
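As a quick check of these definitions (a sketch I added, using a Bernoulli(p) observation as the example), the score has expectation zero and the Fisher information equals the mean squared score, here 1/(p(1-p)):

```python
import numpy as np

# Sketch: verify E[score] = 0 and I(p) = E[score^2] for X ~ Bernoulli(p),
# where score = d/dp log f(X; p) = X/p - (1-X)/(1-p).
rng = np.random.default_rng(1)
p = 0.3
x = rng.binomial(1, p, size=1_000_000).astype(float)

score = x / p - (1 - x) / (1 - p)
print("E[score]   ~", np.mean(score))      # close to 0
print("E[score^2] ~", np.mean(score**2))   # close to 1/(p(1-p))
print("exact I(p) =", 1 / (p * (1 - p)))   # 4.7619...
```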

  6. INFORMATION VIEW Over the past 15 or so years, it has become increasingly clear that the fundamental laws of science are expressions of the concept of information. This includes laws governing the small (at subatomic scales), the large (astronomical scales), and everything in between.

  7. INFORMATION VIEW In between are chemistry, biology and even higher-level effects involving willful human activity, such as economics and sociocultural organization. These laws all derive from a principle of information optimization called extreme physical information, or EPI.

  8. EPI - Extreme Physical Information EPI is an expression of the imperfection of observation: • Owing to random interaction of a subject with its observer and other possible disturbances, a measurement of the subject contains less Fisher information than does the subject per se. • Moreover, the information loss is an extreme value.

  9. Contd… • An EPI output may alternatively be viewed as the payoff of a zero-sum game of information acquisition between the observer and a ‘demon’ in subject space. • EPI derives, Escher-like, the very probability law that gave rise to the measurement. • In applications, EPI is used to derive both existing and new analytical relations governing probability laws of physics, genetics, cancer growth, ecology and economics.

  10. EPI Principle The two types of Fisher information, I and J • There is a decisive difference between passively observing a lamp voltage of 120.0 V on a meter and • becoming an active part of the electrical phenomenon by sticking your finger in the lamp socket.

  11. EPI Principle • The difference may be expressed on the level of information by the incredibly simple form I - J = extremum. (1) • This is called the EPI principle. The numbers I and J are the outputs of integrals that define values of the “Fisher information.”

  12. Why do two Fisher informations arise in eq. (1), and why is their simple difference extremized? The information that is acquired in a message does not generally arise out of nothing. Any acquired information is usually about something. That something is generally called an information source, and it has information level J. That is, the information level J is required to completely describe it. The source is an effect, like the lamp voltage in the example above.

  13. Physics from Fisher Information Physics is fundamentally tied into measurement. One may regard "physics", by which we mean the equations of physics, as simply a manmade code that represents all past (and future) measurement as briefly, concisely and correctly as is possible. Thus physics is a body of equations that describes measurements.

  14. Physics from Fisher Information In fact we could equally well replace the word "physics" in the foregoing with "chemistry", "biology," or any other quantitative science. All describe measurements and potential measurements by manmade codes called "equations." But measurements are generally made for the purpose of knowing, in particular knowing the state of a system. Thus, physics presumes definite states to exist.

  15. Contd… We characterize these by definite values of a parameter such as θ above (for example a position or time value). A definite parameter value is presumed to characterize a definite system or "object". Ideally, accurate versions of the laws of science should follow from maximally accurate estimates, and therefore, by the Cramér-Rao inequality $e^{2} \geq 1/I$ (with $e^{2}$ the mean-squared error of the estimate), from maximum (note: not minimum) Fisher information.
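To make the role of the Cramér-Rao inequality concrete, here is a small simulation I added (assuming Gaussian measurements, which the slides do not specify): for n readings X_i ~ Normal(θ, σ²) the total information is I = n/σ², and the sample mean attains the bound e² ≥ 1/I = σ²/n.

```python
import numpy as np

# Sketch: Monte Carlo check that the mean-squared error of the sample mean
# matches the Cramér-Rao bound sigma^2/n for Gaussian data.
rng = np.random.default_rng(2)
theta, sigma, n, trials = 5.0, 2.0, 25, 200_000

estimates = rng.normal(theta, sigma, size=(trials, n)).mean(axis=1)
mse = np.mean((estimates - theta) ** 2)       # e^2 for the sample mean
print("MSE of sample mean ~", mse)            # close to 0.16
print("Cramér-Rao bound   =", sigma**2 / n)   # 4/25 = 0.16
```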

  16. Contd… Fisher information measures how much information is present, whereas entropy measures how much is missing. B. R. Frieden uses a single procedure (called extreme physical information) with the aim of deriving 'most known physics, from statistical mechanics and thermodynamics to quantum mechanics, the Einstein field equations and quantum gravity'.
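The present/missing contrast can be made concrete with a Gaussian family (my own example, not Frieden's): widening the distribution raises its differential entropy (more is missing) while lowering the Fisher information about its mean (less is present).

```python
import numpy as np

# Sketch: for Normal(mu, sigma^2), differential entropy 0.5*ln(2*pi*e*sigma^2)
# grows with sigma, while Fisher information about mu, I = 1/sigma^2, shrinks.
for sigma in (0.5, 1.0, 2.0, 4.0):
    entropy = 0.5 * np.log(2 * np.pi * np.e * sigma**2)  # in nats
    fisher = 1.0 / sigma**2
    print(f"sigma={sigma}: entropy={entropy:+.3f} nats, I={fisher:.3f}")
```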

  17. Contd… A notable exception to this, based on Shannon's work on information theory (Shannon and Weaver 1964), was initiated by Brillouin (1956) and developed by Jaynes (1957). This approach (usually referred to as the maximum entropy method) has now been developed to encompass other aspects of statistics and probability theory (Jaynes 1983).

  18. Differences between Jaynes and Frieden “All things physical are information-theoretic in origin and this is a participatory universe. Observer participancy gives rise to information; and information gives rise to physics.”

  19. Contd… (J. A. Wheeler) His emphasis is, therefore, on getting rather than simply having information; that is to say on measurement, though whether he really believes that the physics would not be there without the measurement is difficult to say. While Jaynes, within the area of the foundations of physics, confined himself to statistical mechanics, Frieden claims to be able to derive the fundamental equations of almost all of physics.

  20. CONCLUSIONS The application of the ideas of information theory to physics is interesting; and the use of Fisher information to provide the gradient terms in the Lagrangian for a variational procedure is of some importance. The crucial step, however, is to provide in some rational and widely-applicable manner the remaining terms of the Lagrangian. Frieden believes he is able to do this by using the idea of bound information.

  21. Final Fisher The contributions Fisher made included the development of methods suitable for small samples, the discovery of the precise distributions of many sample statistics and the invention of analysis of variance.

  22. Ronald Aylmer Fisher, 1929, as sketched by B. Roy Frieden, author of the book under review. The sketch, which appears in Physics from Fisher Information: A Unification, was done from a photograph taken at the time of Fisher's election as a Fellow of the Royal Society.

  23. • “perhaps the most original mathematical scientist of the [twentieth] century” (Bradley Efron, Annals of Statistics, 1976) • “Fisher was a genius who almost single-handedly created the foundations for modern statistical science ….” (Anders Hald, A History of Mathematical Statistics, 1998) • “Sir Ronald Fisher … could be regarded as Darwin’s greatest twentieth-century successor.” (Richard Dawkins, River out of Eden, 1995) • “I occasionally meet geneticists who ask me whether it is true that the great geneticist R. A. Fisher was also an important statistician.” (Leonard J. Savage, Annals of Statistics, 1976)
