Description |
1 online resource (xiv, 209 pages) : illustrations |
Physical Medium |
polychrome |
Description |
text file |
Bibliography |
Includes bibliographical references (pages 199-206) and index. |
Contents |
Information Theory and The Central Limit Theorem; Preface; Contents; 1. Introduction to Information Theory; 2. Convergence in Relative Entropy; 3. Non-Identical Variables and Random Vectors; 4. Dependent Random Variables; 5. Convergence to Stable Laws; 6. Convergence on Compact Groups; 7. Convergence to the Poisson Distribution; 8. Free Random Variables; Appendix A Calculating Entropies; Appendix B Poincaré Inequalities; Appendix C de Bruijn Identity; Appendix D Entropy Power Inequality; Appendix E Relationships Between Different Forms of Convergence; Bibliography; Index. |
Summary |
This book provides a comprehensive description of a new method of proving the central limit theorem, through the use of apparently unrelated results from information theory. It gives a basic introduction to the concepts of entropy and Fisher information, and collects together standard results concerning their behaviour. It brings together results from a number of research papers as well as unpublished material, showing how the techniques can give a unified view of limit theorems. |
Local Note |
eBooks on EBSCOhost EBSCO eBook Subscription Academic Collection - North America |
Subject |
Central limit theorem. |
Information theory -- Statistical methods. |
Information theory. |
Probabilities. |
Genre/Form |
Electronic books. |
Other Form: |
Print version: Johnson, Oliver (Oliver Thomas). Information theory and the central limit theorem. London : Imperial College Press ; River Edge, NJ : Distributed by World Scientific Pub., ©2004 1860944736 (OCoLC)56905822 |
ISBN |
1860945376 (electronic book) |
9781860945373 (electronic book) |
9781860944734 |
1860944736 |