With so much talk about Apple’s M1 and smartphone chips these days, you might hear about the “system on a chip” (SoC) designs used in them. But what are SoCs, and how do they differ from CPUs and microprocessors? We’ll explain.
A system on a chip is an integrated circuit that combines many elements of a computer system into a single chip. An SoC always includes a CPU, but it might also include system memory, peripheral controllers (for USB, storage), and more advanced peripherals such as graphics processing units (GPUs), specialized neural network circuitry, radio modems (for Bluetooth or Wi-Fi), and more.
The system-on-a-chip approach contrasts with that of a traditional PC, which pairs a CPU chip with separate controller chips, a GPU, and RAM that can be replaced, upgraded, or interchanged as necessary. Using SoCs makes computers smaller, faster, cheaper, and less power-hungry.
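To make that contrast concrete, here's a minimal Python sketch; the component lists are illustrative examples, not an official parts inventory for any real chip or PC.

```python
# Illustrative only: component names are examples, not a real parts list.
from dataclasses import dataclass, field

@dataclass
class SystemOnAChip:
    """One package: everything below is integrated into a single chip."""
    integrated: list[str] = field(default_factory=lambda: [
        "CPU cores", "GPU", "system memory", "neural engine",
        "USB controller", "Wi-Fi/Bluetooth radio", "storage controller",
    ])

@dataclass
class TraditionalPC:
    """Separate parts: each can be replaced or upgraded independently."""
    cpu: str = "socketed CPU"
    gpu: str = "discrete graphics card"
    ram: str = "DIMM memory modules"
    chipset: str = "motherboard chipset (USB, storage, networking)"

print(SystemOnAChip().integrated)  # one chip carries the whole list
print(vars(TraditionalPC()))       # many chips, each independently swappable
```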
RELATED: What Is Bluetooth?
Since the early 20th century, electronics has progressed along two major trends: miniaturization and integration. Miniaturization has seen individual electronic components such as capacitors, resistors, and transistors shrink over time. And since the invention of the integrated circuit (IC) in 1958, integration has combined multiple electronic components onto a single piece of silicon, allowing for even further miniaturization.
As this miniaturization of electronics took place over the 20th century, computers got smaller, too. The earliest digital computers were built from large discrete components such as relays or vacuum tubes. Later machines used discrete transistors, then groups of integrated circuits. In 1971, Intel combined the elements of a computer's central processing unit (CPU) into a single integrated circuit, and the first commercial single-chip microprocessor, the Intel 4004, was born. With the microprocessor, computers could be smaller and use less power than ever before.
RELATED: The Microprocessor Is 50: Celebrating the Intel 4004
In 1974, Texas Instruments released the first microcontroller, a type of microprocessor that integrates RAM and I/O circuitry alongside the CPU on a single chip. Instead of needing separate ICs for a CPU, RAM, memory controller, serial controller, and more, all of it could be placed on one chip for small embedded applications such as pocket calculators and electronic toys.
Throughout most of the PC era, using a microprocessor with separate controller chips, RAM, and graphics hardware produced the most flexible and powerful personal computers. Microcontrollers were generally too limited for general-purpose computing, so the traditional approach of pairing a microprocessor with discrete supporting chips endured.
More recently, the drive toward smartphones and tablets has pushed integration even further than the microprocessor or the microcontroller. The result is the system on a chip, which can pack many elements of a modern computer system (a GPU, a cellular modem, AI accelerators, a USB controller, a network interface) along with the CPU and system memory into a single package. It's one more step in the ongoing integration and miniaturization of electronics, a trend that will likely continue long into the future.
Putting more elements of a computer system on a single piece of silicon lowers power requirements, reduces cost, increases performance, and shrinks physical size. All of that helps dramatically when you're trying to create ever-more-powerful smartphones, tablets, and laptops that use less battery power.
For example, Apple prides itself on making capable, compact computing devices, and it has used SoCs in its iPhone and iPad lines since those products debuted. At first, it used ARM-based SoCs designed by other firms. In 2010, Apple debuted the A4, the first iPhone SoC designed by Apple itself. Since then, Apple has iterated on its A-series chips with great success. SoCs help iPhones stay compact and use less power while growing more capable all the time. Other smartphone manufacturers use SoCs as well.
Until recently, SoCs rarely appeared in desktop computers. In 2020, Apple introduced the M1, its first SoC for desktop and notebook Macs. The M1 combines a CPU, GPU, memory, and more on one piece of silicon. In 2021, Apple improved on the M1 with the M1 Pro and M1 Max. All three of these chips give Macs impressive performance while sipping power relative to the traditional discrete microprocessor architecture found in most PCs.
The Raspberry Pi 4, a popular hobbyist computer, also uses a system on a chip (a Broadcom BCM2711) for its core functions, which keeps the device's cost low (about $35) while providing plenty of power; you can even ask the board which SoC it uses, as the sketch below shows. The future is bright for SoCs, which continue the tradition of electronics integration and miniaturization that began over a century ago. Exciting times ahead!
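Here's a minimal Python sketch of that check, assuming a device-tree-based Linux board such as the Raspberry Pi 4 (the sample outputs in the comments are illustrative):

```python
# Check which SoC a Linux single-board computer reports. Assumes a
# device-tree-based board (like the Raspberry Pi 4); these paths won't
# exist on a typical x86 desktop. Example outputs are illustrative.
from pathlib import Path

def read_dt(prop: str) -> str:
    """Read a device-tree property; entries are NUL-separated strings."""
    try:
        raw = Path("/proc/device-tree", prop).read_bytes()
    except OSError:
        return "unavailable"
    return ", ".join(part.decode() for part in raw.split(b"\x00") if part)

print("Board:", read_dt("model"))       # e.g. "Raspberry Pi 4 Model B Rev 1.4"
print("SoC:  ", read_dt("compatible"))  # e.g. "raspberrypi,4-model-b, brcm,bcm2711"
```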
RELATED: What’s the Difference Between Apple’s M1, M1 Pro, and M1 Max?