A Mathematical Theory of Communication
Claude Elwood Shannon. 1948.
From the paper's introduction: "In this paper we will extend the theory to include a number of new factors, in particular the effect of noise in the channel, and the savings possible due to the statistical structure of the original message and due to the nature of the final destination of the information."
This paper introduces information theory and establishes the foundation for understanding the quantification, transmission, and efficient encoding of information.
Shannon introduces entropy as a measure of information, providing a quantitative method for assessing the information content of a message. He addresses the problem of noise in communication systems and establishes the channel capacity (often called the Shannon limit): the maximum rate at which information can be transmitted over a noisy channel with arbitrarily low error probability. This concept is crucial for understanding and enhancing the reliability of communication systems. Additionally, Shannon lays the theoretical groundwork for error detection and correction codes, which are essential for dependable data transmission across noisy channels.
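As a brief illustration (not code from the paper), the two quantities mentioned above can be computed directly: entropy as H = -Σ pᵢ log₂ pᵢ, and the capacity of a binary symmetric channel with crossover probability p as C = 1 - H(p):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def bsc_capacity(p):
    """Capacity (bits per channel use) of a binary symmetric channel
    with crossover probability p: C = 1 - H(p)."""
    return 1.0 - entropy([p, 1.0 - p])

print(entropy([0.5, 0.5]))   # a fair coin flip carries exactly 1 bit
print(bsc_capacity(0.11))    # a channel flipping 11% of bits: roughly 0.5
```

Note how capacity falls to zero when p = 0.5 (the output is pure noise) and reaches 1 bit per use when p = 0 (a noiseless channel), matching the intuition behind the Shannon limit.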
The theories outlined in this paper form the basis of much of our modern technology infrastructure, including the internet and cellular networks.