This is an open question to programmers: is it worth teaching someone assembly language anymore? My instinctive reaction is “of course,” but let me explain why I’m second-guessing that instinct.

The Case For

Back in the 1990s I had my moment of revelation: a pal of mine, a far more experienced programmer than me, showed me how C code compiled into assembly. This was on the Motorola 68K instruction set. I learned to walk through how the stack works, how pointers work, how loops work… it all finally made sense. From that time forward, I’ve had no fear of C or C++, because I know how the language morphs the code I write into the code the computer sees.

As I write my book for Pragmatic Bookshelf, one of my must-have chapters is “how the machine works,” because I want my readers to have that same moment of revelation. It’s not my goal to teach readers everything there is to know about computers, but I figure one chapter spent digging into the machine is worthwhile.

The Goal

What I’m trying to accomplish with assembly is not teaching people to hand-code assembly for the purpose of writing production code. That’s crazy. Rather, the goal is to use assembly as a tool for better understanding the machine. For example, when assembly instructions are manipulating the stack pointer on function entry and exit, you can’t help but learn how the stack works.
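
To make that concrete, here’s a trivial C function and, in a comment, roughly what a 32-bit x86 compiler produces for it with optimization off. (This is a sketch; the exact instructions vary by compiler and flags, but the shape of the prologue and epilogue doesn’t.)

    /* add.c -- a function whose compiled form shows the stack at work */
    int add_one(int x)
    {
        int result = x + 1;
        return result;
    }

    /* Roughly what "gcc -S -O0 -m32 add.c" emits for add_one:
     *
     *   add_one:
     *       pushl %ebp            # save the caller's frame pointer
     *       movl  %esp, %ebp      # start a new stack frame
     *       subl  $4, %esp        # reserve room for "result"
     *       movl  8(%ebp), %eax   # load the argument x off the stack
     *       addl  $1, %eax        # x + 1
     *       movl  %eax, -4(%ebp)  # store it in result
     *       movl  -4(%ebp), %eax  # the return value travels in %eax
     *       leave                 # tear the frame down
     *       ret                   # pop the return address, jump back
     */

Walk through those nine instructions and you’ve covered frame pointers, argument passing, and local variable storage in one sitting.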

I could take two approaches to teaching assembly: writing it by hand or disassembling C code. The first approach is easier for distilling concepts since I can make the assembly as simple as needed. The second is easier for teaching how C compiles, but examples quickly get complicated, and I don’t expect readers to trace through page-long assembly listings. Then there’s a middle ground where I use some of both; that’s the approach I’d likely take.
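
The nice thing about the disassembly route is that the tooling is already on the reader’s desk. Here’s the sort of exercise I have in mind, assuming gcc (any compiler with an assembly-output switch works the same way):

    /* loop.c -- a small loop worth disassembling */
    int sum_to(int n)
    {
        int total = 0;
        for (int i = 1; i <= n; i++)
            total += i;
        return total;
    }

    /* To see what the compiler did with it:
     *
     *   gcc -S -O0 loop.c    # unoptimized: the loop is easy to follow
     *   gcc -S -O2 loop.c    # optimized: often barely recognizable
     *
     * Comparing the two outputs is a lesson in itself.
     */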

The Problem

I set out to update my assembly knowledge to i386 since that’s what everyone has on their desks. (Yes, it does seem like i386 is a huge step backward from M68K, but that’s another story.) As I dug into the architecture of modern CPUs, however, I had a new revelation:

The computer advertised in the instruction set is not the computer you think it is.

Here’s the thing: while Intel has retained a backward-compatible instruction set dating back to the 8086, which had roughly the horsepower of a gerbil on an exercise wheel, these days the instruction set is really just a front-end binary interface into a totally different computer.

Intel says as much in their developer documentation. Look in their Software Developer’s Manual, Volume 1: Basic Architecture, and you’ll find that your machine instructions are decoded into simpler micro-operations on the chip. Furthermore, the processor can play all kinds of tricks with those instructions:

  • Branch prediction, where it decodes instructions past a branch in the direction the processor guesses it will go. (The C sketch after this list shows what those guesses are worth.)

  • Speculative execution, where it runs instructions down the predicted path before the branch is resolved, then throws away the work if the guess was wrong.

  • Out-of-order instruction execution to keep pipelines full.

  • Caching of damn near everything, plus register renaming, so the handful of registers you can name are backed by a much larger hidden register file.

  • Pre-fetch of data from memory into cache to reduce cache misses.

  • …and all kinds of other stuff. That’s even before you get to the Core micro-architecture where they start making up words like “Macrofusion” and “Microfusion” to describe what the hell the chip is doing.
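
You don’t need a logic analyzer to glimpse some of this, either. Below is a deliberately unscientific C sketch that does identical work over random data and then over sorted data; on typical hardware the sorted pass runs noticeably faster, because the branch predictor stops guessing wrong. (Whether the effect shows, and how big it is, depends on your CPU, your compiler, and your flags.)

    /* branchy.c -- identical work, different branch predictability */
    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    #define N 1000000

    static int cmp_int(const void *a, const void *b)
    {
        int x = *(const int *)a, y = *(const int *)b;
        return (x > y) - (x < y);
    }

    static long long sum_big(const int *data, int n)
    {
        long long total = 0;
        for (int i = 0; i < n; i++)
            if (data[i] > 128)        /* this branch is the whole experiment */
                total += data[i];
        return total;
    }

    static double timed_passes(const int *data, int n, long long *out)
    {
        clock_t start = clock();
        long long total = 0;
        for (int rep = 0; rep < 100; rep++)
            total += sum_big(data, n);
        *out = total;                 /* returned so the work can't be optimized away */
        return (double)(clock() - start) / CLOCKS_PER_SEC;
    }

    int main(void)
    {
        int *data = malloc(N * sizeof *data);
        long long sum;
        double t;
        if (!data)
            return 1;

        for (int i = 0; i < N; i++)
            data[i] = rand() % 256;   /* unpredictable: the branch is a coin flip */

        t = timed_passes(data, N, &sum);
        printf("random order: %.2f s (sum %lld)\n", t, sum);

        qsort(data, N, sizeof *data, cmp_int);
        t = timed_passes(data, N, &sum);
        printf("sorted order: %.2f s (sum %lld)\n", t, sum);

        free(data);
        return 0;
    }

If the two timings come out nearly identical, your compiler probably replaced the branch with a conditional move, which rather proves the larger point: the chip and the compiler are conspiring behind your back.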

So what exactly is under the hood? Not a gerbil, that’s for sure. It’s got fourteen gerbils, each strapped to a Dodge Viper with nitro boost and a jet engine on the back and propellers on the front and rockets on the side. And now it’s got macrofusion, too.

The problem is that if you apply your enlightened assembly language skills to a modern x86 processor, you’re more likely to slow the system down than speed it up. Intel says this, too: “Processors today are so complex that performance snags can occur in places that even experienced developers would never consider.”

Therefore, it seems a fallacy to use x86 assembly as a way to better understand how your computer works, because that’s not actually how your computer works anymore.

Where To Go?

My first reaction was, “screw it, this will just lead readers into a rathole.” But I still think there’s something worth learning about how most computers work. For example, I was interviewing a junior-level programmer earlier this week and I asked him to write a Fibonacci sequence function. He couldn’t get it, and when I showed him the recursive solution he was just baffled.

One could argue “that guy is just stupid” but I don’t think that’s the case. He just didn’t know how function calls and the stack work. Once you understand how the computer can keep track of state when recursing, there’s no mystery to recursion anymore.
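
For reference, the answer I was fishing for is a few lines of C. The point isn’t the function itself; it’s that each pending call gets its own stack frame with its own copy of n, and that per-call state is exactly what makes recursion feel like magic until you’ve watched the stack do it:

    /* fib.c -- the textbook recursive Fibonacci */
    #include <stdio.h>

    long fib(int n)
    {
        if (n < 2)
            return n;                       /* base cases: fib(0)=0, fib(1)=1 */
        return fib(n - 1) + fib(n - 2);     /* two calls, each with its own frame */
    }

    int main(void)
    {
        for (int i = 0; i < 10; i++)
            printf("%ld ", fib(i));         /* prints 0 1 1 2 3 5 8 13 21 34 */
        printf("\n");
        return 0;
    }

Disassemble fib and the mystery evaporates: every call pushes a return address and builds a fresh frame, so the two recursive calls can’t step on each other’s n.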

My next reaction was, “maybe I should teach a different processor.” It turns out there’s a really cool MIPS simulator where you can run instructions forwards and backwards, watch registers and memory, all kinds of neat stuff. MIPS is a very elegant architecture once you understand its pipelining goals. (If you want to learn more, I highly recommend the book See MIPS Run.)

While MIPS is a very straightforward architecture that has inspired a great many designs since, the downside is that nobody’s got a MIPS box on their desk. (Are you reading this page on a MIPS? Or PowerPC, even? Probably not.) Any lesson in MIPS is therefore bound to be abstract at best.

Options

I can see three options:

1) Go ahead and teach i386, which is easy to compile and experiment with on the computer in front of you. Include a big warning up front that any thoughts of optimizing with hand-coded assembly are foolish.

2) Teach MIPS, which has a relatively simple instruction set and is easy to visualize with a simulator. Discard any notion of “use this on your own code to see what it’s doing” since that would require cross-compiling, and I don’t think many readers would follow along with that.

3) Screw it, tell people that C is the lowest-level language that you should program in these days.

So that’s where I’m at. Now it’s your turn: what do you think?