Calculating the entropy on-the-fly


Presentation Transcript


  1. Calculating the entropy on-the-fly Daniel Lewandowski Faculty of Information Technology and Systems, TU Delft

  2. Introducing a function h h – a measure of the uncertainty about the outcome of an experiment modelled using probability distributions

  3. Assumptions We assume that h: • depends only on the probability of the outcome of an experiment or event • takes values in the non-negative real numbers • is a continuous and decreasing function • satisfies h(p1·p2) = h(p1) + h(p2) These assumptions force h to be of the form: h(p) = −C·log(p), with C > 0
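The step from the assumptions to the logarithmic form is not spelled out on the slide; a standard sketch (my addition) goes through Cauchy's functional equation:

```latex
% Substitute g(x) = h(e^{-x}); products of probabilities become sums:
\[
  g(x + y) = h\!\left(e^{-x} e^{-y}\right) = h(e^{-x}) + h(e^{-y}) = g(x) + g(y).
\]
% Continuity forces the linear solution g(x) = Cx, hence
\[
  h(p) = -C \log p ,
\]
% and h decreasing and non-negative on (0, 1] requires C > 0.
```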

  4. Definition of entropy The entropy H is the expectation of the function h. Example: x1, x2, …, xn are realizations of a random variable X with probabilities p1, p2, …, pn respectively. Then the entropy of X is: H(X) = \sum_{i=1}^{n} p_i h(p_i) = −\sum_{i=1}^{n} p_i \log p_i (taking C = 1)

  5. Units in which the entropy is measured • log2(x) – bits • log3(x) – trits • ln(x) – nats • log10(x) – Hartleys
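As a small illustration (my addition, not from the slides), the discrete entropy in each of these units differs only by the base of the logarithm:

```python
import math

def entropy(probs, base=math.e):
    """Shannon entropy of a discrete distribution; the log base fixes the unit."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

p = [0.5, 0.25, 0.25]
print(entropy(p, base=2))   # 1.5    bits
print(entropy(p, base=3))   # 0.9464 trits
print(entropy(p))           # 1.0397 nats
print(entropy(p, base=10))  # 0.4515 Hartleys
```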

  6. Entropy of some continuous distributions • The standard normal (Gaussian) distribution: H = 1,4189 • The Weibull distribution (scale 1,127; shape 2,5): H = 0,5496 • The Weibull distribution (scale 1,107; shape 1,5): H = 0,8892 • The gamma distribution (shape 5; rate 5): H = 0,5441
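The Greek parameter symbols were lost in the transcript; the scale/shape/rate readings above are my reconstruction, checked against the printed entropies with scipy.stats (my addition):

```python
from scipy import stats

# Differential entropies in nats, matching the slide's values
print(stats.norm().entropy())                         # 1.4189 = 0.5*ln(2*pi*e)
print(stats.weibull_min(2.5, scale=1.127).entropy())  # 0.5496
print(stats.weibull_min(1.5, scale=1.107).entropy())  # ~0.889
print(stats.gamma(5, scale=1/5).entropy())            # 0.5441 (rate 5 = scale 1/5)
```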

  7. Approximation of the density Y1, Y2, …, Yn – the samples, sorted in ascending order D0, D1, …, Dn – the midpoints: D_0 = Y_1 − (Y_2 − Y_1)/2, D_i = Y_i + (Y_{i+1} − Y_i)/2 for i = 1, …, n−1, D_n = Y_n + (Y_n − Y_{n−1})/2

  8. Computations The density above Y_i is estimated as: \hat f(y) = 1 / (n (D_i − D_{i−1})) for y ∈ (D_{i−1}, D_i), i.e. each sample carries probability mass 1/n spread uniformly over its cell. The entropy is then computed as: H_n = −\int \hat f \ln \hat f = (1/n) \sum_{i=1}^{n} \ln( n (D_i − D_{i−1}) )
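A compact batch version of this computation (my sketch in Python; the talk's own program, per slide 14, was written in VBA):

```python
import math

def spacing_entropy(samples):
    """Plug-in entropy (nats) of the midpoint-cell density estimate.

    Assumes at least two samples from a continuous distribution
    (tied samples give zero-width cells and break the logarithm).
    """
    ys = sorted(samples)
    n = len(ys)
    # Midpoint grid D_0, ..., D_n around the sorted samples
    d = [ys[0] - (ys[1] - ys[0]) / 2]
    d += [(ys[i] + ys[i + 1]) / 2 for i in range(n - 1)]
    d += [ys[-1] + (ys[-1] - ys[-2]) / 2]
    # H_n = (1/n) * sum_i ln(n * (D_i - D_{i-1}))
    return sum(math.log(n * (d[i + 1] - d[i])) for i in range(n)) / n
```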

  9. Grouping samples Remark: The result of calculating the entropy without grouping equal samples is biased – the bias is asymptotically equal to γ − 1 + ln 2 (γ – the Euler–Mascheroni constant)
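A quick Monte Carlo check of that constant for the standard normal distribution (my addition; adding γ − 1 + ln 2 to the raw estimate is my reading of the "adding the bias" correction mentioned on slide 15):

```python
import math, random

def spacing_entropy(samples):
    ys = sorted(samples)
    n = len(ys)
    # Cell widths D_i - D_{i-1}, written out directly
    w = ([ys[1] - ys[0]]
         + [(ys[k + 1] - ys[k - 1]) / 2 for k in range(1, n - 1)]
         + [ys[-1] - ys[-2]])
    return sum(math.log(n * wk) for wk in w) / n

random.seed(1)
bias = 0.57721566 - 1 + math.log(2)   # gamma - 1 + ln 2 ~ 0.2704
runs = [spacing_entropy([random.gauss(0, 1) for _ in range(5000)])
        for _ in range(20)]
print(sum(runs) / 20 + bias)          # close to the exact 1.4189
```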

  10. Numerical test – 5000 samples [Figure: estimated densities with computed entropies 1,159; 1,372; 1,399; 1,458; 1,438; 1,418; 1,324 – exact value: 1,418. The red line marks the exact density function of a standard normal variable.]

  11. Results – 20 iterations (1000 samples) Compare the results with the exact values from slide 6

  12. Updating the distribution [Diagram: the midpoint grid before and after updating – midpoints D_k, D_{k+1}, D_{k+2} around samples Y_k, Y_{k+1}; after a new sample Y_{N+1} arrives, the neighbouring midpoints (D_{N+1}, D_{N+2}) are recomputed.]

  13. Updating the entropy H_N – the entropy calculated based on N samples. Writing S_N = \sum_i \ln(D_i − D_{i−1}), slide 8 gives H_N = \ln N + S_N / N, so when sample N+1 arrives: H_{N+1} = \ln(N+1) + (S_N + ΔS) / (N+1), where: ΔS collects the changed terms \ln(D_i − D_{i−1}) of the (at most three) cells around the new sample – everything else in the sum is untouched
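The update rule above, turned into a runnable sketch (my Python rendering of the idea; per slide 14 the original program was written in VBA, and I have reconstructed the bookkeeping from slide 8's estimator):

```python
import bisect
import random
from math import log, nan

class OnlineEntropy:
    """Entropy of the midpoint-cell density estimate, updated sample by sample.

    Maintains S = sum_i ln(D_i - D_{i-1}); then H_N = ln(N) + S / N.
    Assumes a continuous distribution (tied samples give zero-width cells).
    """

    def __init__(self):
        self.ys = []    # samples kept in ascending order
        self.S = 0.0    # running sum of ln(cell width)

    def _width(self, k):
        ys, n = self.ys, len(self.ys)
        if k == 0:
            return ys[1] - ys[0]
        if k == n - 1:
            return ys[-1] - ys[-2]
        return (ys[k + 1] - ys[k - 1]) / 2

    def add(self, y):
        ys = self.ys
        if len(ys) < 2:                      # no grid yet with fewer than 2 samples
            ys.insert(bisect.bisect(ys, y), y)
            if len(ys) == 2:                 # both end cells have width Y_2 - Y_1
                self.S = 2 * log(ys[1] - ys[0])
            return
        j = bisect.bisect(ys, y)
        # Only the neighbours of the insertion point change width:
        for k in (j - 1, j):                 # their old contributions go out...
            if 0 <= k < len(ys):
                self.S -= log(self._width(k))
        ys.insert(j, y)
        for k in (j - 1, j, j + 1):          # ...and the new ones (incl. y's) come in
            if 0 <= k < len(ys):
                self.S += log(self._width(k))

    @property
    def H(self):
        n = len(self.ys)
        return log(n) + self.S / n if n >= 2 else nan

random.seed(0)
est = OnlineEntropy()
for i in range(1, 5001):
    est.add(random.gauss(0, 1))
    if i >= 4 and i % 1000 == 0:   # the talk's program reports from N = 4 onward
        print(i, est.H)            # approaches ~1.149 = 1.4189 - bias (slide 9)
```

Maintaining the sorted list still costs O(N) per insert, but the entropy bookkeeping itself is O(1) per sample, which is the point of the on-the-fly scheme.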

  14. The program – properties • Uses the approach from the previous slide • Starts updating the entropy from N = 4 • Written in VBA; uses the spreadsheet only to store the samples • Gives exactly the same results as Matlab (for the same samples) • Does not group samples

  15. Results Comparison of the results obtained using the formula and using the program (5000 samples – without grouping and adding the bias). The program updates the entropy H_N starting from N = 4.

  16. Results, cont.

  17. Relative information – theoretical value = 2,0345
