Saturday, 3 June 2017

Review: Code: The Hidden Language of Computer Hardware and Software by Charles Petzold

A few months ago, I was doing that "but how does it work fundamentally" thing I do and Leon's friend Scarlett managed to shut me up by recommending this book, CODE, that goes from the absolute basics to modern-day (well, 1999) computing. 

CODE covers, in order: the idea of a code (Morse code and Braille), telegraphs and relays, the decimal system and other number systems including binary, octal and hexadecimal, logic and logic gates, binary addition and subtraction, feedback and flip-flops, bytes, memory, microprocessors, ASCII characters, operating systems, fixed point and floating point numbers, and programming languages and graphics, along with a couple of other things whose chapter titles I don't remember.

In a weird twist, this book called CODE only has one chapter out of twenty-five on actual programming languages (not counting assembly language and machine code, of which it has a lot). I was (admittedly naively) hoping to see some of my beloved Ruby or at least Javascript, but the only high-level language it mentioned was an obscure one called ALGOL. It's very much about the fundamentals of computing, so I did (mostly) get what I wished for when I asked for an explanation from the ground up.

Things I liked:
  • Logic is fun! I really enjoyed reading about Boolean logic, probably because I knew it already. He had a nice metaphor for AND, OR, and NOT gates: going to the shelter looking for a cat, which can be white, grey or black, neutered or non-neutered, and male or female. Say you want a cat that's non-neutered and female and not grey: you could say (non-neutered AND female) AND (not grey), aka (NN && F) && (!G). He then showed how you'd translate that into circuits, where the lightbulb only lights up when all those conditions are satisfied.
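Since I can't resist, here's that cat query the way it might look in my beloved Ruby (my own toy sketch, not from the book -- the method and trait names are mine):

```ruby
# The shelter query as a boolean expression: true only when the cat
# is non-neutered AND female AND not grey -- i.e. (NN && F) && (!G).
def wanted?(neutered:, sex:, colour:)
  (!neutered && sex == :female) && !(colour == :grey)
end

wanted?(neutered: false, sex: :female, colour: :black)  # => true
wanted?(neutered: false, sex: :female, colour: :grey)   # => false
wanted?(neutered: true,  sex: :female, colour: :white)  # => false
```

The lightbulb in the book's circuit is exactly the truthiness of that expression.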

  • The parts about other number systems were cool, again probably because I already knew them. 
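(As an aside: Ruby will happily do the book's base conversions for you. My own example, not the book's:)

```ruby
# The same number written in the bases the book covers
n = 2017
n.to_s(2)   # binary      => "11111100001"
n.to_s(8)   # octal       => "3741"
n.to_s(16)  # hexadecimal => "7e1"

# And back again:
"7e1".to_i(16)  # => 2017
```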

  • I enjoyed the explanations of things like bar code scanners and how they read in bits.

  • The chapters on binary addition and subtraction were laborious to read if I wanted to get anything out of them, but they were good and seemed very comprehensive. It was interesting to learn about things like one's complement.
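To see the one's complement trick in action, here's a little Ruby sketch of 8-bit subtraction done the way the book's circuits do it (my code, not Petzold's, and it assumes the result is non-negative):

```ruby
# Subtraction by one's complement in 8 bits:
# a - b = a + (one's complement of b), then add the carry-out
# back into the result (the "end-around carry").
BITS = 8
MASK = (1 << BITS) - 1  # 0b1111_1111

def ones_complement(x)
  x ^ MASK  # flip every bit
end

def subtract(a, b)
  sum   = a + ones_complement(b)
  carry = sum >> BITS       # 1 if the addition overflowed past 8 bits
  (sum & MASK) + carry      # end-around carry
end

subtract(0b1001_0110, 0b0011_0101)  # 150 - 53 => 97
```

The point of the trick is that the hardware never needs a subtractor circuit: flipping bits and adding is enough.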

  • The memory parts were sometimes hard to understand but interesting. I'm not sure how I didn't realise before why RAM gets wiped when the computer turns off. I also didn't realise before I read this that when your storage isn't full, it's not like some places are just empty: those places hold disorganised electrical signals, while the places you've saved things to hold organised information. So you're not adding something to nothing, you're overwriting that chaos.

  • The snippets of history were enjoyable and well-written.

Things I didn't like:
  • The learning curve and pacing are really weird. It starts out very simply, with an introduction to the idea of a code that goes on for about five chapters; the chapters about other number systems are fine as well (although I had already studied them in college), I loved the mathematical logic, and even the simple circuits were okay, but then I got destroyed by circuits with dozens of wires, flip-flops and microprocessors.

  • The second half of the book was pretty boring. It went into HUGE detail on two specific microprocessors, and I had no interest in learning the location of every individual input on an Intel 8080 or whatever. It also talked about "the graphics revolution", but it was written in 1999, so it presented as impressive things that no longer seem impressive, and again there was too much detail on the specifics of operating systems. I liked seeing how they worked in general, though.

  • I would really have liked more of an emphasis on programming -- how compilers work to get code from nice "while x < 7 do" statements to something the computer understands. It was explained for assembly language, not that I really got it, but not for higher-level languages. I suppose there's only so much you can fit in a book, and it does specifically say (late into the book) that it doesn't have room to cover compilers.
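(You can actually peek at this in Ruby itself: MRI's YARV virtual machine will show you the low-level instructions it compiles a loop into. This is Ruby-specific and not something the book covers -- just a way to scratch that itch:)

```ruby
# Compile a while loop to YARV bytecode and look at the instructions
iseq = RubyVM::InstructionSequence.compile(<<~RUBY)
  x = 0
  x += 1 while x < 7
  x
RUBY

puts iseq.disasm  # the jumps, compares, and adds the loop becomes
iseq.eval         # => 7
```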

  • He spent the vast majority of the book talking about telegraph relays, and transistors barely made an appearance -- I would've liked to understand those more. The input-output system he spent most of the book on was lightbulbs, and I get that it was supposed to be clear to everyone, but it didn't really suit my level of knowledge, i.e. knowing how to do basic programming and being "decent with computers" but not understanding what's under the hood. He specifically says in the Preface: 'The "bit" isn't defined until page 68; "byte" isn't defined until page 180. I don't mention transistors until page 142, and that's only in passing', and he seems proud of that, but it's not really my thing.

  • The Internet is pretty much ignored.

In short:

This is a book you really need to have a pen and paper out for, because it's not the sort of thing you can just read and absorb -- it's very involved. This is especially important because it starts incredibly easy and lulls you into a false sense of security, and then flip-flops appear. As long as you're not looking for a book on modern programming but want to understand where computers come from, I'd still recommend this book. It did an impressive job with a big part of what I was after: translating logic into circuits and showing how you'd use that to do fundamental computing operations like addition and subtraction. I know I've said a lot of negative things here, but the first half is excellent. Recommended if you can take time with it.
