
Shannon & The Birth of Information Theory

Homanga Bharadwaj

This is the story of Claude Shannon, the man who single-handedly founded one of the most influential and widely applied fields of knowledge: Information Theory. Regarded as one of the greatest geniuses of the 20th century, he was a scientific force whose impact was felt right from his student days. His master's thesis has been called the most important master's thesis of the 20th century, because it laid the foundations of everything that has to do with ‘digital technology’. It was there that Shannon, at the precocious age of 21, proved how Boolean algebra could become the building block of computers. Simply put, Shannon showed how a bunch of 0’s and 1’s could be a language in itself.
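To make the idea concrete, here is a minimal Python sketch, an illustration of the principle rather than anything from the thesis itself: Boolean operations on 0s and 1s suffice to build a half adder, the first step toward binary arithmetic in hardware. The gate and function names here are ours, not Shannon’s.

```python
# A minimal sketch of Shannon's core insight: Boolean algebra over 0s and
# 1s can describe switching circuits, and those circuits can compute.
# A half adder built from AND and XOR gates adds two one-bit numbers,
# producing a sum bit and a carry bit.

def AND(a, b):   # in relay terms: two contacts in series
    return a & b

def XOR(a, b):   # "exclusive or": 1 when exactly one input is 1
    return a ^ b

def half_adder(a, b):
    """Add two bits; returns (sum_bit, carry_bit)."""
    return XOR(a, b), AND(a, b)

for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} -> carry {c}, sum {s}")
```

Chaining such adders gives arithmetic on numbers of any width, which is exactly the sense in which 0s and 1s become a language for computation.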

Information Theory is one of the few scientific fields fortunate enough to have an identifiable beginning: Claude Shannon’s landmark 1948 paper, A Mathematical Theory of Communication, which laid the foundations of what is now known as Information Theory. The story of how it progressed from a single theoretical paper to a broad field that has redefined our world is a fascinating one. One of the paper’s key concepts was the definition of channel capacity and its limit. Like Moore’s Law, the Shannon limit can be seen as a self-fulfilling prophecy: a benchmark that tells engineers what can be done and what remains to be done, compelling them to achieve it. As a piece of trivia, the word bit, now so ubiquitous in digital communication, first appeared in print in this 1948 paper, where Shannon credited the coinage to his colleague John W. Tukey.
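To give a feel for what a bit measures, the short sketch below, an illustrative aside rather than anything from the paper’s text, computes the information content of a coin flip in bits: a fair coin carries exactly one bit per flip, a biased coin strictly less.

```python
import math

def entropy_bits(probs):
    """Shannon entropy H = -sum(p * log2(p)), measured in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy_bits([0.5, 0.5]))  # fair coin: 1.0 bit per flip
print(entropy_bits([0.9, 0.1]))  # biased coin: ~0.47 bits per flip
```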

Shannon modeled a source of information as an entity that generates messages composed of a finite number of symbols. These are transmitted through a channel, with each symbol occupying the channel for a finite time. The treatment is statistical: if xₙ is the nth symbol produced by the source, the sequence (xₙ) is assumed to be a stationary stochastic process. He also gave a method of analyzing a sequence of error terms in a signal to find their inherent variety, matching it to the designed variety of the control system.
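Here is a hedged Python sketch of that source model; the alphabet and symbol probabilities are invented for illustration, and the source is memoryless, the simplest stationary case. Counting symbol frequencies in a long output recovers the per-symbol information rate of the source.

```python
import math
import random
from collections import Counter

random.seed(0)

# Illustrative source (alphabet and probabilities invented for this
# sketch): a stationary, memoryless source emitting one symbol per step.
alphabet = ["a", "b", "c", "d"]
probs = [0.5, 0.25, 0.125, 0.125]

sequence = random.choices(alphabet, weights=probs, k=100_000)

# Empirical per-symbol entropy in bits; the true value for this source
# is -sum(p * log2(p)) = 1.75 bits/symbol.
n = len(sequence)
H = -sum((c / n) * math.log2(c / n) for c in Counter(sequence).values())
print(f"estimated rate: {H:.3f} bits/symbol (true value: 1.75)")
```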

The most celebrated of Shannon’s results is that every communication channel has an upper speed limit, measured in bits per second and famously known as the Shannon limit. Below this limit, information can be transmitted with an arbitrarily small probability of error; above it, error-free communication is mathematically impossible. Shannon also showed that in any communication channel a trade-off must be made between bandwidth and transmit power. These results are enshrined in the Noisy Channel Coding Theorem, which establishes that for any given degree of noise contamination of a communication channel, it is possible to communicate discrete data (digital information) nearly error-free up to a computable maximum rate.
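For the special case of a bandlimited channel with Gaussian noise, the limit takes the concrete form of the Shannon–Hartley formula, C = B·log2(1 + S/N), which also makes the bandwidth/power trade-off visible. The numbers in this sketch are invented for illustration:

```python
import math

def capacity_bps(bandwidth_hz, snr_linear):
    """Shannon-Hartley capacity C = B * log2(1 + S/N) of a bandlimited
    channel with additive white Gaussian noise."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Invented example: a 3 kHz telephone-style channel at 30 dB SNR (S/N = 1000).
print(f"{capacity_bps(3000, 1000):,.0f} bit/s")       # ~29,902 bit/s

# The trade-off: halving the bandwidth must be paid for by squaring the
# linear signal-to-noise ratio (30 dB -> 60 dB) to keep the same capacity.
print(f"{capacity_bps(1500, 1_000_000):,.0f} bit/s")  # ~29,897 bit/s
```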

His later work focused on ideas in artificial intelligence. He devised chess-playing programs and built an electromechanical mouse, Theseus, that could solve mazes. The chess-playing proposal appeared in the paper Programming a Computer for Playing Chess, published in 1950, and led to the first game played by the Los Alamos MANIAC computer in 1956. That was also the year in which Shannon published a paper showing that a universal Turing machine can be constructed with only two internal states.
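The chess paper proposed searching the game tree with minimax and scoring positions with a numerical evaluation function. The sketch below shows the bare minimax idea only; the toy ‘game’, its move rule, and its scoring are invented for brevity and are certainly not chess.

```python
def minimax(position, depth, maximizing, moves, evaluate):
    """Best score reachable from `position`, searching `depth` plies.

    moves(position)    -> list of successor positions (assumed helper)
    evaluate(position) -> numeric score of a position (assumed helper)
    """
    successors = moves(position)
    if depth == 0 or not successors:
        return evaluate(position)
    scores = (minimax(p, depth - 1, not maximizing, moves, evaluate)
              for p in successors)
    return max(scores) if maximizing else min(scores)

# Toy game, invented for illustration: positions are integers, each move
# adds 1 or 2, and a position's evaluation is simply its value.
best = minimax(0, 3, True,
               moves=lambda p: [p + 1, p + 2],
               evaluate=lambda p: p)
print(best)  # 5: maximizer adds 2 twice, minimizer adds only 1 in between
```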

Shannon himself came to feel that the communications revolution he had pioneered was being oversold. He famously wrote, “Information theory has perhaps ballooned to an importance beyond its actual accomplishments.”

Shannon received many honours for his work. Among a long list of awards are the Alfred Noble Prize of the American engineering societies in 1940, the National Medal of Science in 1966, the Audio Engineering Society Gold Medal in 1985, and the Kyoto Prize in 1985. In 2000 he received the Marconi Lifetime Achievement Award from the Guglielmo Marconi International Fellowship Foundation, the first time that this organization, known for its annual Fellowship Prize, had given that particular award.

Towards the end of his life, he was afflicted by Alzheimer’s disease and spent his last few years in a Massachusetts nursing home. What Shannon left behind is more than a legacy: it is a recipe for a whole new world. Data compression, error detection and correction, coding theory, and cryptology are just a few of the areas that have benefited greatly from the impactful research of this towering legend.
