From Wires to Bits: Claude Shannon and the Reduction of Communication

It was the 1920s in rural Michigan, not far from the little logging town of Gaylord, and a skinny boy in a wool cap was wondering, “What’s a better way to get information?” He had connected a battery to the wire fence that ran to his friend’s house half a mile away. The fence wire served as a simple telegraph system, with a key to tap out Morse code at one end and a sounder at the other. The boy was the son of a small-town businessman and schoolteacher, and his name was Claude Shannon.

This was a time before most homes in the county had telephones. To the two boys, the idea of speaking across the distance without leaving the farmyard was a scientific marvel. Flashlights would not carry that far, especially in daytime. Sure, the fence was prone to failure, such as when it snowed or rained. But it gave them a line that turned the field between them into a channel for words.

Little Claude Shannon liked the problem as much as the solution. The way he saw it, there were two challenges. The first was scientific: How do you make sure the receiver gets the message? As he discovered, inserting strips of rubber or leather between the wire and the wooden fence poles helped insulate the line from grounding out and kept the signal strong enough to carry.

The second challenge was clarity: How do you make sure the receiver understands the message? Short, precise signals, repeated when needed, reduced the risk of misunderstanding.

I imagine Shannon was the kind of boy who never stopped asking why. You know, that boy who presses his parents with a question, listens to the answer, then fires back with another why. Again and again.

“What’s a better way to get information?” This is the question that Shannon would pursue for the rest of his life, growing up to become the “father of information theory,” shaping how the world sends and stores information, and laying the groundwork for computers, the internet, and the digital age.

Shannon’s curiosity led him to take things apart to see how they worked, and to put things together to accomplish something new. He enjoyed fixing other people’s radios. He grew to be a lifelong tinkerer who crafted many novel devices: a trumpet that blew flames, a rocket-propelled frisbee, foam shoes for walking on the surface of the lake, a toy clown that juggled, a robot mouse with the capacity for memory and logic to find its way out of a maze.

And then there was Shannon’s “useless box.” It looked like a plain wooden box with a switch on it. If you flipped the switch on, a little robot arm would emerge from the top, flip the switch off, and then retreat back inside. That was all the machine was built to do: shut itself off. The science fiction author Arthur C. Clarke, upon seeing the box on Shannon’s desk, allegedly quipped that there was something unspeakably sinister about a machine that does nothing, absolutely nothing, except switch itself off. To Shannon, the tinkerer, his sinister box was incredibly funny.

Shannon graduated from the University of Michigan in 1936 and took a job as a research assistant at MIT. His love of dismantling things and rebuilding them landed him in the lab of Vannevar Bush, the future director of the U.S. Office of Scientific Research and Development (OSRD), which oversaw projects like radar and the early development of atomic weapons.

Shannon’s job was to maintain what was at that time the world’s most advanced computer, a sprawling contraption of gears, shafts, pulleys, vacuum tubes, and rotating disks called the “differential analyzer.” The machine weighed several tons and filled a room with massive integrators and thousands of precision-machined parts. Long belts transferred motion from one wheel to another, and the output traced a line on graph paper. Shannon kept it running properly, tightening belts, fixing jams, and recalibrating delicate linkages.

But the machine’s complex nature made it fragile. The scientists would enter some information, set the machine in motion, and hope to get the correct answer. Dust or wear on a gear might throw off a calculation. Belts slipped. Alignments drifted. Complex problems required an elaborate choreography of interconnected components, each one multiplying the chance of error. To Shannon, it resembled a Rube Goldberg device, temperamental and overbuilt to solve problems that could be solved more simply. The engineering challenge was essentially the same as his childhood fence-telegraph system.

Again, he wondered, “What’s a better way to get information?”

Shannon realized the computer’s switches were essentially implementing the Boolean logic he had studied as an undergraduate. This insight became his master’s thesis. He demonstrated that any complex calculation could be broken down into a series of true-or-false statements. This meant intricate problems could be solved physically using basic open-and-closed electrical circuits, creating the foundations for digital computing.
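The idea is easy to sketch in modern terms. The example below (my own illustration, not Shannon's notation) builds binary addition entirely out of true-or-false logic, the way switching circuits chain open and closed contacts:

```python
# Arithmetic built from nothing but true/false logic, in the spirit of
# Shannon's thesis: each "gate" is a simple logical operation a relay
# or switch could implement.

def half_adder(a: bool, b: bool) -> tuple[bool, bool]:
    """Add two bits: XOR gives the sum bit, AND gives the carry."""
    return a != b, a and b

def full_adder(a: bool, b: bool, carry: bool) -> tuple[bool, bool]:
    """Chain two half-adders to add three bits."""
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry)
    return s2, c1 or c2

def add_bits(x: list[bool], y: list[bool]) -> list[bool]:
    """Add two binary numbers, least significant bit first."""
    result, carry = [], False
    for a, b in zip(x, y):
        s, carry = full_adder(a, b, carry)
        result.append(s)
    result.append(carry)
    return result

# 3 (binary 011, least significant bit first) + 5 (binary 101):
print(add_bits([True, True, False], [True, False, True]))
# [False, False, False, True] -> binary 1000 -> 8
```

Nothing in the circuit "knows" about numbers; the arithmetic emerges entirely from composing open/closed states, which is the insight that made switching hardware a general-purpose calculator.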

After the U.S. entered World War II, Shannon moved to Bell Telephone Laboratories in 1941. At the time, Bell Labs was an unprecedented hub of innovation. It brought together an unparalleled concentration of scientific and engineering talent, all fueled by immense resources and the urgency of a nation at war.

At Bell, he was assigned to the problem of encryption. The stakes were clear: messages needed to reach allies without being compromised by enemies. The U.S. and Britain relied on electromechanical cipher machines for secure communication, such as the Bell Labs-designed SIGSALY. The system worked by transforming the sound of a voice into a numerical stream and mixing it with a random key. Only someone with an identical key could decipher the voice message. While ingenious, the system was unwieldy. It was massive, consumed enormous amounts of power, and required that physical key records be perfectly synchronized. Other cipher machines had similar drawbacks, often being too slow, fragile, or vulnerable if their keys were ever compromised.
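The key-mixing principle can be sketched in a few lines. This is a simplified illustration, not SIGSALY's actual design: the real system quantized speech levels and added a noise key recorded on phonograph discs, and the constants and function names below are mine:

```python
# Simplified sketch of key mixing: add a shared random key to each
# quantized sample, and subtract it at the other end to recover the
# message. Only someone holding the identical key can decipher it.
import random

MOD = 6  # illustrative modulus for the quantized signal levels

def make_key(n: int, seed: int) -> list[int]:
    """Both ends must generate (or hold) the identical key stream."""
    rng = random.Random(seed)
    return [rng.randrange(MOD) for _ in range(n)]

def encrypt(samples: list[int], key: list[int]) -> list[int]:
    return [(s + k) % MOD for s, k in zip(samples, key)]

def decrypt(cipher: list[int], key: list[int]) -> list[int]:
    return [(c - k) % MOD for c, k in zip(cipher, key)]

voice = [3, 1, 4, 1, 5, 0]            # quantized "voice" samples
key = make_key(len(voice), seed=42)
cipher = encrypt(voice, key)          # looks like random noise
assert decrypt(cipher, key) == voice  # identical key recovers the voice
```

The unwieldiness the text describes follows directly from this scheme: with a truly random key, both ends must hold physically distributed key records and keep them perfectly synchronized, sample for sample.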

Analog systems like SIGSALY and the MIT differential analyzer were elaborate constructions straining against their own limitations. They were fragile, sensitive to noise, and difficult to scale. Their solutions added more machinery, more amplification, and more moving parts, piling complexity on complexity.

Shannon’s thinking shifted from simply improving communication to a far more ambitious goal. As he saw it, the question evolved from “What’s a better way to get information?” to “What’s the perfect way to get information?” He sought a system as reliable and elegant as a mathematical proof.

This pursuit led him, in the years after the war, to develop Information Theory, which he formalized in his groundbreaking 1948 paper: “A Mathematical Theory of Communication.”

In his paper, Shannon established a new definition for information. It’s one of those definitions that, in retrospect, seems obvious, but at the time was novel and counterintuitive. Shannon argued that information was not about content, but about uncertainty. His theory suggested that the purpose of a communication system isn’t to convey meaning, but to accurately reproduce a message from one point to another. The system simply needs to handle the statistical properties of the signal. The meaning of the message, whether it is a poem or a grocery list, is irrelevant to the engineering problem of transmitting it.

By abstracting information into a universal, quantifiable form, he created a paradigm that could be applied to any communication system. This shift allowed engineers to create robust systems that work on any type of data, treating all messages as sequences of bits and bytes, regardless of their semantic content.
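Shannon's quantitative measure is entropy: the average number of bits of uncertainty per symbol. The short sketch below (function name mine) computes it from a message's own symbol frequencies, showing how a predictable message carries less information than a varied one:

```python
# Shannon entropy: the average number of bits needed per symbol,
# computed from how often each symbol occurs. Predictable messages
# carry little information; uncertain ones carry more.
from collections import Counter
from math import log2

def entropy(message: str) -> float:
    """Bits of information per symbol in the message."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * log2(c / n) for c in counts.values())

print(entropy("aaaa"))  # 0.0 -- fully predictable, zero information
print(entropy("abab"))  # 1.0 -- two equally likely symbols, one bit each
print(entropy("abcd"))  # 2.0 -- four equally likely symbols, two bits each
```

Note that the formula never asks what the symbols mean, only how surprising they are, which is exactly the separation of statistics from semantics described above.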

In the next post, we unpack how Shannon’s theory of communication demonstrates an important process of singular links, a process that I call transformation by reduction.


Singular Links: The Innovator’s Guide to Compounding Connections, by Tony Parish