Many of you know how to code, and most of you have probably seen at least a few lines of a program written in a language like C, Python, or Pascal, all of which are based on English and designed to be readable by humans to some degree.

But when you get right down to it, the CPU that processes all of that code is just a hunk of silicon, and the last time I had a conversation with an inanimate object, I didn't feel like it had a whole lot to contribute to the discussion.

So how do CPUs actually take code that's expressed in letters and numbers and put things onto the screen that make sense to us carbon-based life forms?

Well, it first helps to realize that what we call code can actually refer to a number of things. Usually when people say "code," they're talking about source code: the set of instructions based on English, or Chinese, or whatever human language, written in whatever programming language you like.

But after a programmer finishes writing a program in source code, it needs to be further processed so that the CPU can actually understand it. This is done by running the source code through a special kind of program called a compiler, which checks the code for errors and converts it into a CPU-understandable form called object code or machine code. The reason a CPU can interpret machine code is that it's compiled into binary: the series of ones and zeros that is the basis for all modern digital computing.
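You can watch this source-to-instructions step happen in Python itself. A caveat: Python's built-in compiler targets a virtual machine rather than a physical CPU, so what it emits is bytecode, not true machine code, but it's a handy sketch of the same idea.

```python
import dis

# A tiny piece of human-readable source code.
source = "answer = 6 * 7"

# Python's compiler turns the source into a code object whose
# instructions a machine (here, the Python VM) can execute.
code_obj = compile(source, filename="<example>", mode="exec")

# dis prints the low-level instructions the compiler produced,
# a rough analogue of the machine code a C compiler would emit.
dis.dis(code_obj)

# Executing the compiled code actually runs those instructions.
namespace = {}
exec(code_obj, namespace)
print(namespace["answer"])  # 42
```

The programmer only ever writes the one readable line; everything the CPU (or, here, the virtual machine) sees is the compiler's output.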

But hold on a minute: why can CPUs understand ones and zeros, which are just another form of human-readable information? Well, they can do this because those ones and zeros are really just representations of an electrical signal that's either on or off.
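Here's a minimal sketch of that idea: every character you type is ultimately stored as a pattern of bits, and each bit is just a label for a signal being on or off.

```python
# The character "A" is stored as the number 65, which as eight
# bits is the pattern 01000001.
letter = "A"
bits = format(ord(letter), "08b")
print(bits)  # 01000001

# Each "1" names a signal that is on, each "0" one that is off.
signals = ["on" if b == "1" else "off" for b in bits]
print(signals)
```

The binary notation is purely for our benefit; inside the machine there are only the two electrical states.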

Machine code travels around the inside of your PC as a series of electrical pulses that correspond to each one and zero that the compiler spits out, and when these pulses hit your CPU, a large number of things happen.

An average CPU has millions of transistors, many of which serve as logic gates that open or shut depending on whether they're receiving an electrical impulse, in other words, whether they're receiving a one or a zero.
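To get a feel for how simple gates combine into useful work, here's a sketch that models gates as functions of one-or-zero inputs. Real CPUs build these gates out of transistors; this just models the logic. Combining only an AND gate and an XOR gate gives a half adder, the basic building block for binary addition.

```python
# Model each logic gate as a function of on/off (1/0) inputs.
def AND(a, b):   # output is on only if both inputs are on
    return a & b

def XOR(a, b):   # output is on if exactly one input is on
    return a ^ b

# Two gates wired together form a half adder, which adds two bits.
def half_adder(a, b):
    return XOR(a, b), AND(a, b)  # (sum bit, carry bit)

# 1 + 1 in binary: sum bit 0 with a carry of 1, i.e. binary 10 (decimal 2).
print(half_adder(1, 1))  # (0, 1)
```

Chain enough adders (and other gate arrangements) together and you get circuits that can do full arithmetic, which is essentially what the transistors in a CPU are doing at enormous scale.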

Logic gates open and shut to manipulate machine code in very complex ways until the CPU spits out processed data that travels to other parts of your computer. Although the principles behind processor design are immensely complicated, you can think of the transistors inside a CPU as beads on a really big abacus. These beads are arranged according to the processor's microarchitecture, denoted by code names such as Haswell, Broadwell, and Ivy Bridge for Intel CPUs, or Bulldozer, Steamroller, and Piledriver for AMD chips.

However, even with all these different architectures, most modern applications will run on any of these processors, because nearly every PC CPU uses the same instruction set, which is just what it sounds like: the set of binary instructions that the CPU can understand and execute.

Current consumer CPUs for desktops and laptops virtually all use either the x86 instruction set or the newer but backwards compatible x86-64 instruction set for 64-bit systems.
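To make "a set of binary instructions" concrete, here's a sketch of a made-up toy instruction set (not real x86; the opcodes and encoding are invented for illustration) and a tiny interpreter that plays the role of the CPU, fetching, decoding, and executing each instruction.

```python
# A hypothetical toy instruction set: each instruction is a number
# whose upper bits are the opcode and lower four bits the operand.
LOAD, ADD, HALT = 0b01, 0b10, 0b11  # made-up opcodes

def run(program):
    """Fetch, decode, and execute instructions, like a very simple CPU."""
    accumulator = 0
    for instruction in program:
        opcode, operand = instruction >> 4, instruction & 0b1111
        if opcode == LOAD:      # put the operand into the accumulator
            accumulator = operand
        elif opcode == ADD:     # add the operand to the accumulator
            accumulator += operand
        elif opcode == HALT:    # stop and return the result
            return accumulator
    return accumulator

# "Machine code" for: load 6, add 7, halt.
program = [(LOAD << 4) | 6, (ADD << 4) | 7, (HALT << 4)]
print(run(program))  # 13
```

Any chip that decodes these same bit patterns the same way could run this program, which is exactly why software built for x86 runs on both Intel and AMD processors.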

Since so many different microarchitectures make use of the same instruction set, the main difference between them is how quickly and efficiently they can execute those instructions.

Although a newer-model Intel Core i7 and an older-model Core 2 Duo can understand the same instructions, the i7 is often going to be much faster due to its radically different microarchitecture.

It's a little bit like the difference between accelerating in a Maserati versus a Yugo. Both cars understand that pressing down the gas pedal means "go," but because of the Maserati's different engine architecture, it can execute that instruction much faster than the Yugo, and suddenly you're being pulled over. Don't worry, though: you'll never get pulled over for overclocking your processor. You might just void your warranty.