Meet the FPGA

An FPGA, short for field programmable gate array, is a highly capable type of integrated circuit, an electronic circuit in a single package. The field programmable part of the name indicates that FPGAs can be reprogrammed when in the field (that is, without having to return them to the manufacturer). The gate array part indicates that an FPGA is made up of a two-dimensional grid featuring a large number of gates, fundamental units of digital logic.


The name is actually a bit of an anachronism. The reality is that some FPGAs aren’t field programmable, and most are no longer just an array of simple gates. In fact, they’re much more sophisticated than that. Despite these exceptions, the name has stuck over the years, and it highlights a unique characteristic of FPGAs: their incredible flexibility. An FPGA’s uses are limited only by the designer’s imagination. Other digital programmable devices, such as microcontrollers, are designed with a specific set of capabilities; you can only do something if that feature is built in. By contrast, an FPGA’s array of gates (or the more modern equivalent) is like a blank slate that you can program, and reprogram, and reprogram to do almost anything you want, with fewer restrictions. This freedom doesn’t come without trade-offs, however, and FPGA development demands a unique set of skills.


Learning how to work with FPGAs requires a different style of thinking from traditional computer programming. Traditional software engineering, like programming in C, for example, is serial: first this happens, then this happens, and finally this happens. This is because C is compiled to run on a single processor, or CPU, and that CPU is a serial machine. It processes one instruction at a time.

FPGAs, on the other hand, work in parallel: everything is happening at the same time. Understanding the difference between serial and parallel programming is fundamental to working with FPGAs. When you can think about solving a problem using parallel methods, your overall problem-solving skills will increase. These skills will also translate to other, non-FPGA applications; you’ll begin to see problems differently than if you were only thinking about them serially. Learning how to think in parallel rather than serially is a critical skill for becoming an FPGA engineer, and it’s one you’ll develop throughout this book.
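To make the serial/parallel contrast concrete, here is a minimal Verilog sketch (the module and signal names are illustrative, not from any particular design). Unlike lines of C, which execute one after another, the two assign statements below describe two separate pieces of hardware that both operate continuously and at the same time:

```verilog
// Both continuous assignments below exist simultaneously in hardware.
// There is no "first" or "second" statement -- each describes its own
// logic gate, and both gates are always active.
module parallel_demo (
  input  wire a,
  input  wire b,
  output wire xor_out,
  output wire and_out
);
  assign xor_out = a ^ b;  // an XOR gate, evaluating at all times
  assign and_out = a & b;  // an AND gate, evaluating at the same time
endmodule
```

If you swapped the order of the two assign statements, the synthesized hardware would be identical; order simply doesn’t carry the meaning it does in serial code.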

FPGAs are a lot of fun to work with. When you create an FPGA design using Verilog or VHDL (more on these languages later in this chapter), you’re writing code at the lowest possible level. You’re literally creating the physical connections, the actual wires, between electrical components and input/output pins on your device. This allows you to solve almost any digital problem: you have complete control. It’s a much lower level of programming than working with a microcontroller that has a processor, for example. For this reason, learning about FPGAs is an excellent way to become familiar with hardware programming techniques and better understand how exactly digital logic works in other applications. You’ll gain a newfound respect for the complexities of even the simplest integrated circuits once you start working with FPGAs.

A Brief History of FPGAs

The very first FPGA was the XC2064, created by Xilinx in 1985. It was very primitive, with a measly 800 gates, a tiny fraction of the millions of gates found on today’s FPGAs. It was also relatively expensive, costing $55, which adjusted for inflation would be around $145 today. Still, the XC2064 kicked off an entire industry, and (alongside Altera) Xilinx has remained one of the dominant companies in the FPGA market for more than 30 years.

Early FPGAs like the XC2064 were only able to perform very simple tasks: Boolean operations such as taking the logical OR of two input pins and putting the result onto an output pin (you’ll learn much more about Boolean operations and logic gates later). In the 1980s, this type of problem required a dedicated circuit built of OR gates. If you also needed to perform a Boolean AND on two different pins, you might have to add another circuit, filling up your circuit board with these dedicated components. When FPGAs came along, a single device could replace many discrete gate components, lowering costs, saving component space on the circuit board, and allowing the design to be reprogrammed as the requirements of the project changed.
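As a rough sketch of the idea (pin and module names are illustrative), a single Verilog module can absorb what once required separate dedicated gate chips on the board:

```verilog
// One FPGA design replacing two discrete gate components:
// a dedicated OR-gate part and a dedicated AND-gate part.
module gate_replacement (
  input  wire in1,
  input  wire in2,
  output wire or_result,   // previously a discrete OR-gate circuit
  output wire and_result   // previously a discrete AND-gate circuit
);
  assign or_result  = in1 | in2;  // logical OR of the two input pins
  assign and_result = in1 & in2;  // logical AND of the same two pins
endmodule
```

If the project’s requirements changed, say, the AND needed to become a NAND, you would simply reprogram the FPGA rather than rework the circuit board.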

From these humble beginnings, the capabilities of FPGAs have increased dramatically. Over the years, the devices have been designed with more hard intellectual property (IP), or specialized components within the FPGA that are dedicated to performing a specific task (as opposed to soft components that can be used to perform many tasks). For example, hard IP blocks in modern FPGAs let them interface directly with USB devices, DDR memory, and other off-chip components. Some of these capabilities (like a USB-C interface) would simply not be possible without some dedicated hard IP to do the job. Companies have even placed dedicated processors (called hard processors) inside FPGAs so that you can run normal C code within the FPGA itself.

As the devices have evolved, the FPGA market has undergone many mergers and acquisitions. In 2020, the chip-making company AMD purchased Xilinx for $35 billion. It’s plausible that this purchase was a response to its main competitor Intel’s 2015 acquisition of Altera for $16.7 billion. It’s interesting that two companies focused predominantly on CPUs decided to purchase FPGA companies, and there’s much speculation as to why. In general, it’s thought that as CPUs mature, dedicating some part of the chip to FPGA-like reprogrammable hardware seems to be an idea worth pursuing.

Apart from Xilinx and Altera (which from here on I’ll be calling by their parent company names, AMD and Intel, respectively), other companies have carved out their own niches within the FPGA market. For example, Lattice Semiconductor has done well for itself making mostly smaller, less expensive FPGAs. Lattice has been happy to play on its own in this lower end of the market, while letting AMD and Intel slug it out at the higher end. Today, the open source community has embraced Lattice FPGAs, which have been reverse-engineered to allow for low-level hacking. Another medium-sized player in the FPGA space, Actel, was acquired by Microsemi in 2010 for $430 million. Microsemi itself was acquired by Microchip Technology in 2018.

Popular FPGA Applications

In their modern, highly capable and flexible form, FPGAs are used in many interesting areas. For example, they’re a critical component in the telecommunications industry, where they’re often found in cell phone towers. They route network traffic to bring the internet to your smartphone, allowing you to stream YouTube videos on your bus ride to work.

FPGAs are also widely used in the finance industry for high-frequency trading, where companies use algorithms to automatically buy and sell stocks incredibly quickly. Traders have found that if you can execute a stock purchase or sale slightly quicker than the competition, you can gain a financial edge. The speed of execution is paramount; a tiny bit of latency can cost a company millions of dollars. FPGAs are well suited to this task because they’re very fast and can be reprogrammed as new trading algorithms are discovered. This is an industry where microseconds matter, and FPGAs can provide an advantage.

FPGAs are used in the defense industry as well, for applications like radar digital signal processing. FPGAs can process received radar reflections using mathematical filters to see small objects hundreds of miles away. They’re also used to process and manipulate images from infrared (IR) cameras, which can see heat rather than visible light, allowing military operatives to see people even in complete darkness. These operations are often highly math-intensive, requiring many multiplication and addition operations to happen in parallel—something that FPGAs excel at.

Another area where FPGAs have found a niche is in the space industry: they can be programmed with redundancies to hedge against the effects of radiation bombardment, which can cause digital circuits to fail. On Earth, the atmosphere protects electronics (and people) from lots of solar radiation, but outer space doesn’t have that lovely blanket, so the electronics on satellites are subjected to a much harsher environment.

Finally, FPGAs are also attracting interest from the artificial intelligence (AI) community. They can be used to accelerate neural nets, another massively parallel computational problem, and thus are helping humans attack issues that weren’t solvable using traditional programming techniques: image classification, speech recognition and translation, robotics control, game strategy, and more.

This look at popular FPGA applications is far from exhaustive. Overall, FPGAs are a good candidate for any digital electronics problem where high bandwidth, low latency, or high processing capability is needed.
