How does your computer know how to run programs? How does it translate the code from an app into an action? Here’s how it works.
It is common knowledge that code is written by developers and is how humans communicate with computers. However, have you ever thought about how software such as code interacts with computer hardware such as a CPU (Central Processing Unit)? If the answer is yes, then you've come to the right place.
To understand how code is executed on a computer, you must understand what makes a computer tick and how it can be manipulated. Let's first talk about the fundamental ideas of computer hardware before progressing to the software side of things.
Binary is a base-2 number system that processors and memory use to execute code. A binary digit can only be a 1 or a 0, hence the name. A single binary digit (0) is called a bit, while a group of eight binary digits (00000000) is known as a byte.
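To see this in action, here is a tiny Python sketch (the values are arbitrary examples) showing a single bit, a byte, and how to move between binary and decimal notation:

```python
# A bit is a single binary digit: 0 or 1.
bit = 0b1

# A byte is a group of eight bits, for example 01000001.
byte = 0b01000001

print(bin(byte))            # 0b1000001 (Python drops the leading zero)
print(int("01000001", 2))   # 65 -- the same byte read as a decimal number
```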
Everything logical about computing with machines starts with the simple switch. A simple switch has two conductors and a mechanism for connecting and disconnecting them. Connecting both conductors allows current to flow, which produces a signal at the other end of the conductor. On the other hand, if the conductors are disconnected, current will not flow, meaning no signal will be produced.
Since a switch can only be on or off at any given moment, switches provide the ideal mechanism for producing the high and low signals used to create square wave signals.
When you flick a switch, it produces a signal, or one bit of data. A regular photo taken on a smartphone is around five megabytes of data, which equals 40,000,000 bits. That means you would need to flick the switch tens of millions of times just to produce enough data for one photo from your smartphone.
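As a quick sanity check on that arithmetic (using decimal megabytes, where 1 MB is 1,000,000 bytes):

```python
photo_size_megabytes = 5
photo_size_bytes = photo_size_megabytes * 1_000_000  # 5,000,000 bytes
photo_size_bits = photo_size_bytes * 8               # 8 bits per byte
print(photo_size_bits)                               # 40000000
```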
With a switch's mechanical limitations, engineers needed something that didn't have any moving parts and provided faster switching speeds.
Thanks to the discovery of doping (manipulating the electrical conductivity of semiconductors like silicon), engineers were able to make electrically controlled switches known as transistors. This invention allowed for much faster switching speeds while needing very little voltage, ultimately making it possible to pack over a billion of these transistors onto a single modern CPU.
These transistors are then cleverly arranged to make logic gates, half-adders, adders, flip-flops, multiplexers, registers, and the various other components that make a CPU functional. The way these components are arranged and connected defines what is known as a CPU architecture.
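To get a feel for how switches build up into something useful, here is a rough Python sketch of a half-adder: an XOR gate and an AND gate wired together to add two single bits, producing a sum bit and a carry bit.

```python
def and_gate(a, b):
    """Outputs 1 only when both inputs are 1."""
    return a & b

def xor_gate(a, b):
    """Outputs 1 when exactly one input is 1."""
    return a ^ b

def half_adder(a, b):
    """Adds two single bits, returning (sum, carry)."""
    return xor_gate(a, b), and_gate(a, b)

print(half_adder(1, 0))  # (1, 0): 1 + 0 = 1, no carry
print(half_adder(1, 1))  # (0, 1): 1 + 1 = 10 in binary, so sum 0 and carry 1
```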
The CPU architecture also dictates a processor's ISA (Instruction Set Architecture). An ISA is the built-in list of instructions that a CPU can execute natively. These instructions are then sequenced together through a programming language to make what is known as a program. Usually, hundreds of instructions are readily available on a CPU, including addition, subtraction, move, save, and load.
Here is a sample of an instruction set:
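The exact instructions and opcodes vary from one CPU architecture to another, so the set below is purely hypothetical; it is written as a Python dictionary just to make the idea concrete, pairing each mnemonic with a made-up 4-bit opcode.

```python
# A hypothetical four-instruction set (not taken from any real CPU).
INSTRUCTION_SET = {
    "LOAD":  0b0001,  # copy a value from RAM into a register
    "ADD":   0b0010,  # add the contents of two registers
    "STORE": 0b0011,  # copy a register's contents to a RAM address
    "JUMP":  0b0100,  # continue execution at another instruction
}

for mnemonic, opcode in INSTRUCTION_SET.items():
    print(f"{mnemonic:<6} -> opcode {opcode:04b}")
```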
Each instruction in an instruction set has its own binary code known as an opcode. The opcode makes up the first few bits of an instruction and tells the CPU which operation from the instruction set to use.
Following the opcode is the operand. The operand contains the values and addresses that the operation will be applied to.
Here, the whole instruction is eight bits wide: a few bits of opcode followed by the operand. If a CPU has a 64-bit architecture, its instructions can span up to 64 bits in width, making it a more capable processor.
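Sticking with the hypothetical 8-bit layout (a 4-bit opcode followed by a 4-bit operand), pulling an instruction apart is just a matter of shifting and masking bits:

```python
# Hypothetical 8-bit instruction: upper four bits = opcode, lower four bits = operand.
instruction = 0b0010_0111       # opcode 0010, operand 0111

opcode = instruction >> 4       # shift the operand away, keeping the top four bits
operand = instruction & 0b1111  # mask the opcode away, keeping the bottom four bits

print(f"opcode:  {opcode:04b}")   # 0010
print(f"operand: {operand:04b}")  # 0111
```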
Now that you understand binary signals, you can learn about how your computer interprets them. How machine code is interpreted depends on the encoding standard used by the assembler (a low-level program that translates human-readable code into the proper binary).
For example, if our assembler uses the ASCII (American Standard Code for Information Interchange) standard, it would take the machine code it is given and interpret each byte according to the ASCII table.
Since our assembler uses ASCII (the 8-bit version), every group of eight binary digits is interpreted as one character. The assembler takes each byte and interprets it according to the standard. For example, 01100010 01101001 01110100 would translate into the word "bit."
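A short Python sketch does the same job: it splits the binary string into bytes and looks each one up as an ASCII character.

```python
binary = "01100010 01101001 01110100"

# Convert each 8-bit group to its decimal value, then to its ASCII character.
word = "".join(chr(int(byte, 2)) for byte in binary.split())
print(word)  # bit
```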
Assembly language is a human-readable, low-level programming language that maps directly onto a CPU architecture's opcodes and operands.
Here is an example of a simple assembly program using the instruction set shown earlier:
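Since the exact mnemonics depend on the instruction set, the listing below is a hypothetical stand-in, kept as plain strings inside a Python list: it loads two numbers into registers, adds them, and stores the result at a RAM address.

```python
# A hypothetical assembly program: add two numbers and store the result.
program = [
    "LOAD  R0, 2        ; put the value 2 into register R0",
    "LOAD  R1, 3        ; put the value 3 into register R1",
    "ADD   R2, R0, R1   ; add R0 and R1, placing the result in R2",
    "STORE R2, 0x0A     ; store R2's contents at RAM address 0x0A",
]

for line in program:
    print(line)
```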
This block of code is stored in RAM until the CPU fetches each instruction, one by one.
The CPU executes code through a cycle known as Fetch, Decode, and Execute. This sequence shows how a CPU processes each line of code.
Fetch: The instruction counter within the CPU points to the next instruction in RAM, which the CPU fetches so it knows what to execute next.
Decode: The assembler decodes the human-readable code and assembles it into properly formatted binary for the computer to understand.
Execute: The CPU then executes the binaries by applying the instructions indicated by the opcode to the operands provided.
The computer will execute it as follows:
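Here is a rough Python simulation of that cycle (a toy model, not a real CPU), running the hypothetical add-and-store program from above:

```python
# Toy fetch-decode-execute loop for the hypothetical program above.
ram = {0x0A: 0}                               # a tiny slice of "RAM"
registers = {"R0": 0, "R1": 0, "R2": 0}
program = [
    ("LOAD", "R0", 2),
    ("LOAD", "R1", 3),
    ("ADD", "R2", "R0", "R1"),
    ("STORE", "R2", 0x0A),
]

instruction_counter = 0
while instruction_counter < len(program):
    instruction = program[instruction_counter]   # fetch the next instruction
    opcode, *operands = instruction              # decode it into opcode and operands
    if opcode == "LOAD":                         # execute it
        register, value = operands
        registers[register] = value
    elif opcode == "ADD":
        destination, left, right = operands
        registers[destination] = registers[left] + registers[right]
    elif opcode == "STORE":
        register, address = operands
        ram[address] = registers[register]
    instruction_counter += 1

print(registers)  # {'R0': 2, 'R1': 3, 'R2': 5}
print(ram)        # {10: 5} -- the sum stored at RAM address 0x0A
```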
The computer has successfully added two numbers together and stored the value at the specified RAM address.
Great! Now you know how a computer executes code. However, it doesn't stop there.
With the proper hardware, an assembler, and an assembly language, people could execute code with reasonable ease. However, as both programs and computer hardware became more complex, engineers and programmers had to think of a way to make programming less tedious and to ensure compatibility with different kinds of CPU architectures. Thus came the creation of compilers and interpreters.
The compiler and the interpreter are translation programs that take source code (programs written in high-level programming languages) and translate it into assembly language, which the assembler then turns into binary.
An interpreter will take one line of code and immediately execute it. Interpreters are commonly used in terminals such as the Linux Bash shell and Windows PowerShell, which makes them great for performing simple one-off tasks.
In contrast, a compiler will take multiple lines of code and compile them into a program. Examples of compiled programs include Microsoft Word, Photoshop, Google Chrome, Safari, and Steam.
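Python itself blurs the line a little, but its built-in compile() and exec() functions give a rough feel for the difference: compile() translates source text into a reusable code object ahead of time, while typing the same lines into the interactive interpreter would execute them immediately, one at a time.

```python
source = "total = 2 + 3\nprint(total)"

# "Compile" step: translate the source text into a reusable code object.
compiled_program = compile(source, "<example>", "exec")

# "Run" step: execute the compiled program whenever you like.
exec(compiled_program)  # prints 5
exec(compiled_program)  # reusable -- prints 5 again without recompiling
```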
With the creation of compilers and interpreters, high-level programming languages were created.
High-level programming languages are any languages above assembly code. Some of these languages you may be familiar with are C, Python, Java, and Swift. These languages made programming more human-readable and simpler than assembly language.
Here is a side-by-side comparison illustrating how much harder it is to program in assembly than in a high-level programming language like Python:
Both codes will print "Hello World."
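For a sense of scale, the Python side of that comparison fits in a single line, while a typical assembly version needs a dozen or more instructions just to set up and make the equivalent output call:

```python
# The entire high-level program: one readable line.
print("Hello World")
```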
With these programming languages, developers can program games, websites, applications, and drivers in a reasonable amount of time.
A computer is a device that can only read binary. That binary is produced by over a billion microscopic transistors packed inside a CPU. The arrangement of those transistors dictates a CPU's ISA (Instruction Set Architecture), which provides hundreds of instructions that the CPU can readily perform once its opcode is called through code. Developers mix and match these instructions sequentially to create entire programs such as game engines, web browsers, applications, and drivers.
A CPU executes code through a sequence known as the fetch, decode, execute cycle. Once a piece of code is loaded into RAM, the CPU will fetch its contents one by one, decode the contents into binary through the assembler, and then execute the code.
Since the assembler can only translate code written explicitly for its CPU architecture, compilers and interpreters were built on top of the assembler (much like an adapter) to work across different types of CPU architectures. An interpreter will take one command and execute it immediately. In contrast, a compiler will take all your commands and compile them into a reusable program.
High-level programming languages such as Python, C, and Java were created to make programming easier, faster, and more convenient. The large majority of programmers no longer have to code in assembly language, as easy-to-use high-level programming languages can be translated into assembly by a compiler.
Hopefully, you now have a better understanding of the fundamentals of computers and how they execute code.
Craving to learn how things worked, Jayric Maning started tinkering with all kinds of electronic and analog devices during his early teens. He took up forensic science at the University of Baguio, where he got acquainted with computer forensics and cyber security. He is currently doing lots of self-study and tinkering with tech, figuring out how things work and how we can use them to make life easier (or at least cooler!).