Machine Code

We write programs all the time using high-level programming languages like Java, C, etc. With these programs we can instruct the computer to do something useful. But have you ever wondered how exactly the computer executes these instructions? The idea of this post is not to teach you how to program in a low-level language (or, worse, directly in binary) but to show you what happens under the hood.

As you might already know, computers work with bits. A bit is simply something that represents two states: on or off, 1 or 0, true or false, plus or minus, and so on. We can use any two symbols to represent these states, but by convention we use 1 and 0 to say the bit is on or off, respectively. We know that computers can store words, pictures, sounds and so on, but in reality all of these are nothing but bits; we humans have grouped them and given them some meaning (a code).

For example, take a byte, which is just 8 bits. How many patterns can we make with 8 bits? With 1 bit there are only two possibilities, 1 or 0. With 2 bits there are 4 patterns, and in general n bits give 2^n patterns, so with 8 bits we have 256 possible arrangements. So what did people do? They got together, held some meetings, agreed on a code, and made it a standard. Take ASCII, for example: the bit pattern '01000001' represents the letter 'A'. We just gave some meaning to a bit pattern. But keep in mind that how these patterns are interpreted depends on the context in which they are used.
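To make this concrete, here is a minimal C sketch (the variable name is mine, purely for illustration) that stores the bit pattern 01000001 in a byte and prints it twice, once interpreted as an ASCII character and once as a plain number:

    #include <stdio.h>

    int main(void) {
        unsigned char b = 0x41;  /* the bit pattern 01000001 */

        /* The same 8 bits, two interpretations: context decides. */
        printf("As a character: %c\n", b);  /* prints: A (ASCII) */
        printf("As a number:    %d\n", b);  /* prints: 65 */

        /* With n bits there are 2^n patterns; 8 bits give 2^8 = 256. */
        printf("Patterns in a byte: %d\n", 1 << 8);  /* prints: 256 */
        return 0;
    }

The point is that nothing in the byte itself says "character" or "number"; the format specifier we hand to printf supplies the context.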

In a high-level sense, what happens when you run a program is this: your program first gets loaded into RAM, and then the processor starts to execute the instructions in it. An instruction is simply a sequence of one or more bytes, and different processors follow different Instruction Set Architectures (ISAs) in their instruction encoding. (In Linux, you can use a command like the following to find out which ISA your machine uses.)
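    $ uname -m
    x86_64

Here uname -m prints the machine hardware name, which on a typical 64-bit Intel/AMD machine is x86_64 (lscpu gives a more detailed report, including the architecture line). If you want to go a step further and look at actual instruction bytes, you can compile a small C file (say, hello.c; the file name here is just for illustration) and disassemble it:

    $ gcc -c hello.c       # compile to an object file, hello.o
    $ objdump -d hello.o   # print each instruction's bytes alongside its mnemonic

For instance, on x86-64 the single byte c3 encodes the ret instruction, while under another ISA the same byte would mean something entirely different, or nothing at all.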