Code: The Hidden Language of Computer Hardware and Software — A Review

Ximena Sandoval
4 min read · Oct 28, 2021


TL;DR

Code: The Hidden Language of Computer Hardware and Software by Charles Petzold walks through the fundamental ideas behind building a modern computer from scratch. Computers are an essential tool for software engineers, and to use that tool effectively we need to understand the main concepts behind it. Code is a good first approach to this study.

Let’s talk about code

Codes have been used for many years to represent information, and they come in many forms, but the most basic code we can have, using only two symbols, is binary code. These symbols can be anything: the on and off states of a flashlight, “-” and “.”, and of course, “0” and “1”.

This simplicity is what makes binary code so powerful: with only these two symbols we can represent any kind of object, numbers being one. Even better, with binary numbers (numbers represented in binary code) we can perform the basic operations: addition, subtraction, multiplication, and even division.
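As a small illustration (mine, not from the book), here is binary representation and digit-by-digit addition sketched in Python:

```python
def to_binary(n: int) -> str:
    """Represent a non-negative integer using only the symbols '0' and '1'."""
    return bin(n)[2:]          # e.g. 5 -> '101'

def add_binary(a: str, b: str) -> str:
    """Add two binary numbers digit by digit, carrying just like in decimal."""
    width = max(len(a), len(b))
    a, b = a.zfill(width), b.zfill(width)
    digits, carry = [], 0
    for da, db in zip(reversed(a), reversed(b)):
        total = int(da) + int(db) + carry
        digits.append(str(total % 2))
        carry = total // 2
    if carry:
        digits.append('1')
    return ''.join(reversed(digits))
```

For example, `add_binary('101', '11')` gives `'1000'`, i.e. 5 + 3 = 8.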

From the basics to the not so basics

Petzold then introduces the notion of operating binary digits (bits) using algebra. The basic operations (gates) we can do with bits are:

  • AND — Outputs 1 if all inputs are 1, 0 otherwise.
  • OR — Outputs 1 if at least one input is 1, 0 otherwise.
  • NOT — Outputs 1 if the input was 0, 0 otherwise.

And of course, we can build more sophisticated gates, like NAND, NOR, XOR, and so on. Now that we have these gates, we can start operating on bits with them. Believe it or not, a circuit that adds two bits is just a composition of these basic gates.
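To make that concrete, here is a minimal sketch of the three basic gates operating on bits (0 or 1), and a two-bit adder (a half adder) composed purely from them; the function names are mine, not Petzold's:

```python
def AND(a, b):
    return a & b          # 1 only if both inputs are 1

def OR(a, b):
    return a | b          # 1 if at least one input is 1

def NOT(a):
    return 1 - a          # flips the bit

def XOR(a, b):
    # XOR(a, b) = (a OR b) AND NOT(a AND b)
    return AND(OR(a, b), NOT(AND(a, b)))

def add_two_bits(a, b):
    """Half adder: the sum bit is XOR, the carry bit is AND."""
    return XOR(a, b), AND(a, b)
```

So `add_two_bits(1, 1)` yields sum 0 with carry 1, which is exactly binary 1 + 1 = 10.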

Now that we know how to add two bits, we can chain these bit-adding units to build a more complex operation: adding numbers composed of multiple bits. We can also subtract, by adding the complement of the subtrahend. Since multiplying is just adding multiple times, we can implement that too, and finally, division is just subtracting multiple times.

Now we have a basic arithmetic unit that takes binary numbers and operates on them.

How do computers remember?

We already know about circuits, but we are now presented with the idea of flip-flops, a basic combination of gates that can hold 1 bit of information. This is the most basic memory, and by combining multiple flip-flops we can build a memory unit that holds multiple bits of information.
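How can gates alone remember anything? The trick is feedback: each gate's output loops back into the other. Here is a rough simulation of a NOR-based latch (one simple kind of flip-flop), iterated a few steps until the feedback settles; this is my sketch, not the book's exact circuit:

```python
def NOR(a, b):
    return 1 - (a | b)

def sr_latch(s: int, r: int, q: int = 0) -> int:
    """Settle a NOR-based SR latch.

    s=1 sets the stored bit to 1, r=1 resets it to 0,
    and s=r=0 simply remembers the previous value of q.
    """
    q_bar = 1 - q
    for _ in range(4):            # a few feedback passes are enough to settle
        q = NOR(r, q_bar)
        q_bar = NOR(s, q)
    return q
```

Notice that with `s=0, r=0` the latch just returns whatever `q` it was holding, which is exactly what "1 bit of memory" means.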

Now we can add memory to our arithmetic unit, start stacking multiple operations whose outputs are stored in these memory units, and finally, we have built a very basic computer.

A little bit of history

We talked about the construction of these arithmetic operations and memory units and how to use them to build a computer, but back in the day, they were made using relays. These weren’t the best fit for computers, since after a period of extended use they would wear out and break.

An alternative to relays was the vacuum tube, which lasted longer and performed operations faster; vacuum tubes were used to build the first electronic computers. But they had their disadvantages: they were expensive, used a lot of electricity, and generated a lot of heat. This is why transistors came into the game. They were smaller than vacuum tubes, required less power, generated less heat, and lasted longer.

By fabricating transistors and resistors (along with other components) out of a single piece of silicon, chips, also known as integrated circuits, came to be.

And finally, we can start talking about microprocessors — “a consolidation of all the components of a central processing unit (CPU) of a computer into a single chip of silicon”.

Computer representations

We are now presented with a new problem: how do we represent text using binary code? Several encodings were proposed to solve this, like the Baudot encoding, but they had problems; Baudot's shift codes, for example, could garble the output if a shift character was lost during decoding.

The solution to this problem was ASCII (the American Standard Code for Information Interchange). This code uses 7 bits per character and includes letters, numeric digits, and punctuation.
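ASCII is easy to see in action; a quick Python check (my example, not the book's):

```python
# Each ASCII character maps to a number that fits in 7 bits (0..127).
message = "Code"
codes = [ord(c) for c in message]           # the numeric code of each character
bits = [format(n, '07b') for n in codes]    # the same codes as 7-bit binary

print(codes)   # [67, 111, 100, 101]
print(bits)    # ['1000011', '1101111', '1100100', '1100101']
```

So the letter “C” is the number 67, or `1000011` in the binary code the whole book has been building on.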

We know how to represent whole numbers and text, but how do we represent real numbers? The answer is floating-point notation, which is based on the scientific notation of numbers.
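In other words, a float is binary scientific notation: a significand times a power of two. A quick way to see this in Python (my illustration, not the book's presentation):

```python
import math

# math.frexp splits a float into significand * 2**exponent,
# with the significand in the interval [0.5, 1).
significand, exponent = math.frexp(6.5)

print(significand, exponent)   # 0.8125 3, since 6.5 == 0.8125 * 2**3
```

The stored bits of a float are essentially these two pieces, the significand and the exponent, packed into a fixed number of bits.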

What was missing

Finally, we can start talking about how to coordinate everything that was built throughout the book: an operating system. An operating system handles the interaction between all the components of the computer: the processor, the memory, and the I/O devices. Operating systems also provide an easy way to access the hardware.

Final thoughts

The book finishes with low- and high-level programming languages, which differ from one another in how close they are to assembly code, and with the evolution of graphical interfaces in computers (everything we talked about before assumed a CLI kind of environment).

I enjoyed reading this book: it has simple yet illustrative examples in each section and a natural flow of ideas that keeps building toward the next chapter. I believe this book is a really good first approach for people who want to learn the main ideas of how computers work, even if they don’t have a strong technical background.

