Information theory is a blend of applied statistics, computer science, and electrical engineering focused on the fundamental limits of processing information. Although it was originally developed to establish limits on signal processing operations such as data compression and the reliable storage and transmission of data, its applications have expanded into diverse fields such as statistical inference, language processing, cryptography, neurobiology, molecular coding, model selection in ecology, thermal physics, quantum computation, plagiarism detection, and many others.
A course in information theory will most likely cover the following topics, in whatever depth individual professors prefer:
- entropy, relative entropy, and mutual information
- the asymptotic equipartition property
- entropy rates of stochastic processes
- data compression
- Kolmogorov complexity
- channel capacity
- differential entropy
- Gaussian channels
- spectral estimation
- rate distortion theory
- the stock market
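As a quick taste of the first topic above, entropy measures the average information (in bits) conveyed by a random outcome. The short sketch below (the function name is our own choice, not from any particular textbook) computes the Shannon entropy of a discrete distribution:

```python
import math

def entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally unpredictable: one full bit per toss.
print(entropy([0.5, 0.5]))   # 1.0

# A biased coin is more predictable, so each toss carries less information.
print(entropy([0.9, 0.1]))   # about 0.469 bits
```

The same quantity underlies data compression: no lossless code can use fewer bits per symbol, on average, than the entropy of the source.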
To stay up to date, a terrific compilation of books and articles covering both the theory and the widespread applications of information theory is available through Google Scholar.
To fulfill our mission of educating students, our online tutoring centers are standing by 24/7, ready to assist students who need help with information theory.