This article explores core concepts of information theory, focusing on entropy, conditional entropy, and mutual information. It highlights the role of maximum mutual information (MMI) in optimal sensor parameter selection, illustrated by the 12 Coin Problem. It emphasizes the need to learn an observation model and to solve the resulting optimization problem, using techniques such as discretization and Monte Carlo methods to estimate MI. It also discusses the challenge posed by local maxima when maximizing mutual information, supported by experimental results.
Information Theory • Entropy: $H(X) = -\sum_{x} p(x) \log p(x)$ • Conditional Entropy: $H(X \mid Y) = -\sum_{x,y} p(x, y) \log p(x \mid y)$ • Mutual Information: $I(X; Y) = H(X) - H(X \mid Y) = \sum_{x,y} p(x, y) \log \frac{p(x, y)}{p(x)\,p(y)}$
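To make the definitions concrete, here is a minimal Python sketch that computes all three quantities from a small made-up joint distribution; the numbers are illustrative only, not from the slides.

```python
import numpy as np

# Hypothetical joint distribution p(x, y); rows index x, columns index y.
p_xy = np.array([[0.25, 0.25],
                 [0.40, 0.10]])

p_x = p_xy.sum(axis=1)   # marginal p(x)
p_y = p_xy.sum(axis=0)   # marginal p(y)

def entropy(p):
    """H = -sum p log2 p over the non-zero entries, in bits."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

H_x = entropy(p_x)                      # H(X)
H_joint = entropy(p_xy.ravel())         # joint entropy H(X, Y)
H_x_given_y = H_joint - entropy(p_y)    # chain rule: H(X|Y) = H(X,Y) - H(Y)
I_xy = H_x - H_x_given_y                # I(X;Y) = H(X) - H(X|Y)

print(f"H(X)={H_x:.3f}  H(X|Y)={H_x_given_y:.3f}  I(X;Y)={I_xy:.3f} bits")
```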
Optimal Sensor Parameter Selection • MMI: Maximum Mutual Information • Choose the sensor parameter $a$ that maximises the mutual information $I(X; Z \mid a)$ between the state $X$ and the observation $Z$
Problem • Need to learn: the observation model $p(z \mid x, a)$ • Need to solve: $a^{*} = \arg\max_{a} I(X; Z \mid a)$
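One way to read the MMI criterion, using the identities above: maximising mutual information is the same as minimising the expected posterior entropy of the state, since

$$ I(X; Z \mid a) \;=\; H(X) - H(X \mid Z, a) \;=\; H(X) - \mathbb{E}_{z \sim p(z \mid a)}\big[\, H(X \mid Z = z, a) \,\big] $$

so the best sensor parameter is the one whose observations are expected to leave the least residual uncertainty about the state.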
Observation Model • Can be learnt over many experiments • Or modelled by a recognition system
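A sketch of the first option, assuming we can run the sensor repeatedly and tally outcomes into an empirical $p(z \mid x, a)$. The set sizes, the `simulate` stand-in, and the smoothing constant are all hypothetical placeholders, not part of the original slides.

```python
import numpy as np

n_states, n_obs, n_params = 4, 5, 3
counts = np.zeros((n_params, n_states, n_obs))

def simulate(x, a, rng):
    """Hypothetical stand-in for running the real sensor with parameter a
    in state x; here it just draws an observation uniformly at random."""
    return rng.integers(n_obs)

rng = np.random.default_rng(0)
for _ in range(10_000):                # many experiments
    x = rng.integers(n_states)         # a state we can set up or observe
    for a in range(n_params):
        z = simulate(x, a, rng)
        counts[a, x, z] += 1

# Normalise the counts (with a small additive smoothing term so no
# probability is exactly zero) into an estimate of p(z | x, a).
p_z_given_xa = (counts + 1e-3) / (counts + 1e-3).sum(axis=2, keepdims=True)
```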
Solve the argmax problem • The MI integral is difficult to compute: $I(X; Z \mid a) = \iint p(x, z \mid a) \log \frac{p(x, z \mid a)}{p(x)\,p(z \mid a)} \, dx \, dz$ • Discretise the state and observation spaces • Or use Monte Carlo methods to estimate the integral (see the sketch below) • Even if we can compute the MI, we still need to maximise it • Local maxima are possible
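A sketch of the Monte Carlo route under the discretised setting: sample $(x, z)$ pairs from the model and average $\log p(z \mid x, a) - \log p(z \mid a)$. The observation-model table, the uniform prior, and `mi_monte_carlo` are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(1)
n_states, n_obs = 3, 3
p_x = np.full(n_states, 1.0 / n_states)   # assumed uniform prior over states

# Hypothetical observation models for two candidate parameters:
# a=0 is a noisy sensor, a=1 is nearly deterministic, so a=1 should win.
p_z_given_xa = np.array([
    [[0.4, 0.3, 0.3], [0.3, 0.4, 0.3], [0.3, 0.3, 0.4]],        # a = 0
    [[0.9, 0.05, 0.05], [0.05, 0.9, 0.05], [0.05, 0.05, 0.9]],  # a = 1
])

def mi_monte_carlo(a, n_samples=20_000):
    """Estimate I(X; Z | a) by sampling (x, z) pairs from the model."""
    p_z_given_x = p_z_given_xa[a]          # shape (n_states, n_obs)
    p_z = p_x @ p_z_given_x                # marginal p(z | a)
    xs = rng.choice(n_states, size=n_samples, p=p_x)
    zs = np.array([rng.choice(n_obs, p=p_z_given_x[x]) for x in xs])
    return np.mean(np.log2(p_z_given_x[xs, zs]) - np.log2(p_z[zs]))

# Discretised maximisation: score every candidate parameter, take the best.
# With a continuous parameter, gradient ascent can stall at a local maximum,
# so several random restarts (or a coarse grid then refinement) are safer.
a_star = max(range(len(p_z_given_xa)), key=mi_monte_carlo)
print("best sensor parameter:", a_star)
```

Exhaustive scoring sidesteps local maxima entirely, which is why it is the natural choice once the parameter space has been discretised; the restart strategy only becomes necessary when the parameter stays continuous.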
Experimental Results • [Figure: plots of MI and Max MI]