This lecture discusses the log-likelihood ratio test and the challenge of computing conditional error probabilities in statistical hypothesis testing. It addresses the difficulty of finding simple formulas for detection and false alarm probabilities and introduces Monte Carlo simulations as a cumbersome alternative. Chernoff Bounds are proposed as a more efficient method for obtaining analytical bounds on error probabilities. The discussion includes moment-generating functions, methods of deriving tighter bounds, and the asymptotic behavior of detection probabilities under certain conditions.
Chernoff Bounds (Theory) • ECE 7251: Spring 2004, Lecture 25, 3/17/04 • Prof. Aaron D. Lanterman, School of Electrical & Computer Engineering, Georgia Institute of Technology • AL: 404-385-2548 <lanterma@ece.gatech.edu>
The Setup • General purpose likelihood ratio test • Consider the log-likelihood ratio test • Conditional error probabilities:
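The equations on this slide were images and did not survive extraction; in Van Trees's notation they presumably read:

\[
\Lambda(\mathbf{y}) = \frac{p(\mathbf{y} \mid H_1)}{p(\mathbf{y} \mid H_0)} \;\underset{H_0}{\overset{H_1}{\gtrless}}\; \eta,
\qquad
\ell(\mathbf{y}) = \ln \Lambda(\mathbf{y}) \;\underset{H_0}{\overset{H_1}{\gtrless}}\; \ln \eta \triangleq \gamma
\]
\[
P_F = \Pr(\ell \ge \gamma \mid H_0), \qquad P_M = \Pr(\ell < \gamma \mid H_1), \qquad P_D = 1 - P_M
\]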
The Problem • Alas, it is often difficult, if not impossible, to find simple formulas for the conditional error probabilities • This makes computing probabilities of detection and false alarm difficult • Could use Monte Carlo simulations, but those are cumbersome • Alternative: find easy-to-compute, analytic bounds on the error probabilities (see the sketch below) • Discussion based on Van Trees, pp. 116-125
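To make the Monte-Carlo-versus-bound comparison concrete, here is a minimal Python sketch for a hypothetical i.i.d. Gaussian mean-shift problem (my example, not from the slides; the parameters m and n are arbitrary). It uses the bound P_F <= exp[mu(s) - s*gamma] developed in the remainder of the lecture, which has a closed form for this problem:

import numpy as np
from scipy.stats import norm

# Hypothetical example (not from the slides): detect an i.i.d. mean shift,
# H0: y_i ~ N(0,1)  vs  H1: y_i ~ N(m,1), with threshold gamma = 0
# (equal priors and equal costs).
m, n = 0.5, 50
rng = np.random.default_rng(0)

# Monte Carlo estimate of P_F: simulate under H0, evaluate the
# log-likelihood ratio ell(y) = m*sum(y) - n*m^2/2, count threshold crossings.
trials = 200_000
y = rng.standard_normal((trials, n))
ell = m * y.sum(axis=1) - n * m**2 / 2
pf_mc = np.mean(ell > 0)

# Chernoff bound: here mu(s) = (n*m^2/2)*s*(s-1), so mu'(s) = 0 at s = 1/2
# and P_F <= exp(mu(1/2)) = exp(-n*m^2/8).
pf_chernoff = np.exp(-n * m**2 / 8)

# Exact answer for comparison: under H0, ell ~ N(-n*m^2/2, n*m^2).
pf_exact = norm.sf((n * m**2 / 2) / (m * np.sqrt(n)))

print(f"Monte Carlo estimate: {pf_mc:.4g}")
print(f"Chernoff bound:       {pf_chernoff:.4g}")
print(f"Exact:                {pf_exact:.4g}")

The bound comes out looser than the exact tail probability, but it needs no sampling and remains usable in problems where the exact tail integral is intractable.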
Tilted Densities • Define a new random variable Xs (for various values of s) with density
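The density itself was an image; following Van Trees, the tilted density is presumably

\[
p_{x_s}(X) = \frac{e^{sX}\, p_{\ell \mid H_0}(X \mid H_0)}{\displaystyle\int_{-\infty}^{\infty} e^{sX}\, p_{\ell \mid H_0}(X \mid H_0)\, dX}
= e^{sX - \mu(s)}\, p_{\ell \mid H_0}(X \mid H_0),
\]

where the normalizing constant e^{\mu(s)} is defined on the next slide.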
The Happy Mu Function (EFTR)
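The definition, an image in the original, is presumably Van Trees's

\[
\mu(s) \triangleq \ln \int_{-\infty}^{\infty} e^{sL}\, p_{\ell \mid H_0}(L \mid H_0)\, dL
= \ln \int \big[p(\mathbf{y} \mid H_1)\big]^{s}\, \big[p(\mathbf{y} \mid H_0)\big]^{1-s}\, d\mathbf{y},
\]

with the (EFTR) properties \mu(0) = \mu(1) = 0, \mu'(s) = E[x_s], and \mu''(s) = \mathrm{Var}(x_s) \ge 0, so \mu(s) is convex.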
A Weird Way of Writing PFA • With • Then
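The "With" and "Then" bullets each introduced an equation image; the chain presumably follows Van Trees: with p_{\ell \mid H_0}(L \mid H_0) = e^{\mu(s) - sL}\, p_{x_s}(L) (inverting the tilting), then

\[
P_F = \int_{\gamma}^{\infty} p_{\ell \mid H_0}(L \mid H_0)\, dL
= e^{\mu(s)} \int_{\gamma}^{\infty} e^{-sL}\, p_{x_s}(L)\, dL
\le e^{\mu(s) - s\gamma} \quad \text{for } s \ge 0,
\]

since e^{-sL} \le e^{-s\gamma} over the region of integration and p_{x_s} integrates to at most one there.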
Find the Tightest Bound • We want the s0 which makes the RHS as small as possible • Assuming everything works (things exist, the equation for the optimal s is solvable, etc.):
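Setting the derivative of the exponent \mu(s) - s\gamma to zero gives

\[
\mu'(s_0) = \gamma \quad \Longrightarrow \quad P_F \le \exp\big[\mu(s_0) - s_0\, \mu'(s_0)\big], \qquad s_0 \ge 0;
\]

convexity of \mu guarantees this stationary point is the minimizer.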
Similar Analysis Bounds PM • We want the s1 which makes the RHS as small as possible • Assuming everything works (things exist, the equation for the optimal s is solvable, etc.):
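Here the key identity is p_{\ell \mid H_1}(L \mid H_1) = e^{L}\, p_{\ell \mid H_0}(L \mid H_0), so in terms of the tilted density

\[
P_M = \int_{-\infty}^{\gamma} e^{\mu(s) + (1-s)L}\, p_{x_s}(L)\, dL
\le e^{\mu(s) + (1-s)\gamma} \quad \text{for } s \le 1,
\]

and minimizing the exponent again yields \mu'(s_1) = \gamma, the same stationarity condition as for P_F:

\[
P_M \le \exp\big[\mu(s_1) + (1 - s_1)\, \mu'(s_1)\big], \qquad s_1 \le 1.
\]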
Putting It All Together • Why is this useful? • L can often be easily described by its moment generating function
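Combining the two bounds at the common solution s of \mu'(s) = \gamma, 0 \le s \le 1:

\[
P_F \le e^{\mu(s) - s\,\mu'(s)}, \qquad P_M \le e^{\mu(s) + (1-s)\,\mu'(s)}.
\]

Since e^{\mu(s)} = E[e^{s\ell} \mid H_0], knowing the moment generating function of the log-likelihood ratio is enough to evaluate both bounds.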
Case of Equal Costs and Equal Priors • Let s = sm satisfy
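With equal costs and equal priors the threshold is \gamma = \ln \eta = 0, so s_m solves \mu'(s_m) = 0, and the bound on the total error probability is presumably Van Trees's

\[
\Pr(\epsilon) = \tfrac{1}{2}(P_F + P_M)
= \tfrac{1}{2} \int_{-\infty}^{\infty} \min\big(p_{\ell \mid H_0}, p_{\ell \mid H_1}\big)\, dL
\le \tfrac{1}{2}\, e^{\mu(s_m)},
\]

using \min(a,b) \le a^{1-s} b^{s} for 0 \le s \le 1.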
Another Look at the Derivation
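The exact expression, presumably, factors the bound out of P_F at \gamma = \mu'(s):

\[
P_F = \exp\big[\mu(s) - s\,\mu'(s)\big] \int_{\mu'(s)}^{\infty} e^{-s\,[L - \mu'(s)]}\, p_{x_s}(L)\, dL,
\]

where p_{x_s} is the tilted density defined earlier. The integral factor lies between 0 and 1.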
A Revelation About the Constant • The original Chernoff inequality was formed by replacing this integral factor with 1 • We can get a tighter constant in some asymptotic cases
Asymptotic Gaussian Approximation • In some cases, Z approaches a Gaussian random variable as the number of samples n grows large (e.g., data points i.i.d. with finite means and variances) (EFTR)
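Z here is presumably the standardized tilted variable

\[
Z \triangleq \frac{x_s - \mu'(s)}{\sqrt{\mu''(s)}},
\]

which has zero mean and unit variance since \mu'(s) and \mu''(s) are the mean and variance of x_s. When Z is approximately N(0,1), the integral factor above becomes a Gaussian expectation:

\[
P_F \approx \exp\Big[\mu(s) - s\,\mu'(s) + \tfrac{1}{2} s^2 \mu''(s)\Big]\, Q\big(s\sqrt{\mu''(s)}\big).
\]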
Yet Another Approximation • If we can approximate Q using an upper bound
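Using the standard upper bound Q(x) \le \frac{1}{x\sqrt{2\pi}} e^{-x^2/2} for x > 0, the exponential factors cancel and presumably

\[
P_F \approx \frac{1}{s\sqrt{2\pi\,\mu''(s)}}\, \exp\big[\mu(s) - s\,\mu'(s)\big].
\]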
Similar Analysis Works for PM • If we can approximate Q using the upper bound
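The mirror-image steps for the miss probability presumably give

\[
P_M \approx \frac{1}{(1-s)\sqrt{2\pi\,\mu''(s)}}\, \exp\big[\mu(s) + (1-s)\,\mu'(s)\big].
\]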
Asymptotic Analysis for Pe • For the case of equal priors and equal costs, if the conditions for the approximation to Q on the previous two slides hold, we have (EFTR)
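Working the (EFTR) out under \gamma = 0 and \mu'(s_m) = 0, averaging the two approximations presumably gives

\[
\Pr(\epsilon) = \tfrac{1}{2}(P_F + P_M)
\approx \frac{1}{2\sqrt{2\pi\,\mu''(s_m)}} \left[\frac{1}{s_m} + \frac{1}{1 - s_m}\right] e^{\mu(s_m)}.
\]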