Can FPGAs deliver large computing power?

9/29/2024 11:55:23 AM

The enormous computing power demanded by AI has brought real revenue to many processor companies. In contrast, facing this wave of AI dividends, the high-end FPGA market seems somewhat "lonely".
The role of a "supporting player"
In today's data centers, the FPGA plays more of an auxiliary role.
The FPGA's first identity is as a hardware emulation or prototype verification platform used before an ASIC tape-out.
Compared with CPUs and GPUs, the FPGA, as an integrated circuit with programmable characteristics, is more flexible. Once a GPU or another type of processor is manufactured, its logic functions are "solidified" on the chip. Although many chips can be programmed through on-chip registers, such programming mostly changes the chip's configuration and does not alter its underlying logic functions. To change those logic functions, or to fix flaws and vulnerabilities in the design, the chip must be redesigned, re-verified and remanufactured, which consumes a great deal of time, manpower and money, and brings higher investment and risk to the design company.
The FPGA, by contrast, contains only the most basic logic elements, and by arranging and interconnecting them its logical structure can be changed through programming. As a result, processor suppliers, including computing chip companies, purchase FPGAs for hardware pre-research before committing a chip design.
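To make that reconfigurability concrete, here is a minimal Python sketch (an illustration only, not vendor tooling) of a 4-input lookup table, the basic building block most FPGAs are built from: the same element implements a different logic function simply by loading a different truth table, which is why the hardware's behaviour can change without remanufacturing the chip.

```python
# Minimal illustrative sketch: a 4-input lookup table (LUT), the basic
# reconfigurable element in most FPGAs. The "logic function" is just a
# 16-entry truth table held in configuration memory, so loading different
# configuration bits changes the behaviour without a new chip.

class LUT4:
    def __init__(self, truth_table):
        assert len(truth_table) == 16          # 2**4 possible input patterns
        self.truth_table = list(truth_table)   # the "configuration bits"

    def evaluate(self, a, b, c, d):
        index = (a << 3) | (b << 2) | (c << 1) | d
        return self.truth_table[index]

# Configure the same LUT first as a 4-input AND, then as a 4-input XOR.
and4 = LUT4([1 if i == 0b1111 else 0 for i in range(16)])
xor4 = LUT4([bin(i).count("1") & 1 for i in range(16)])

print(and4.evaluate(1, 1, 1, 1))  # 1
print(xor4.evaluate(1, 0, 1, 0))  # 0
```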
The FPGA's other identity is what the industry nicknames "glue logic": connecting different types of processors so that the overall processor system performs better.
Accelerating AI inference is the first.
"A large part of the application of AI technology is on the reasoning side. Fpgas provide an effective acceleration of AI reasoning through their hardware flexibility, low latency, low power consumption, and the ability to run hybrid computations with cpus and the like. Moreover, in parallel intensive tasks or application scenarios such as matrix operations running neural networks, the use of FPgas can significantly improve the performance and efficiency of AI reasoning." An FPGA industry insider said in an interview with a reporter from China Electronics News.
Boosting the computing power of general-purpose processors is the second.
FPGAs, with their flexible hardware acceleration capabilities, rich configurable interfaces such as PCIe Gen5 and CXL, and their ability to work alongside processors, give general-purpose processors powerful support in meeting increasingly complex AI computing needs. For example, FPGAs can work with general-purpose processors to form a hybrid computing environment: the processor and the FPGA complement each other, jointly handling different types of computing tasks and delivering more comprehensive computing performance. In addition, the industry is helping customers flexibly run multiple AI workloads and reduce total cost of ownership (TCO) through open-source software suites.
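The division of labour in such a hybrid setup can be sketched roughly as below. The Task and HybridDispatcher names are hypothetical stand-ins for illustration, not any real vendor API; the point is only that control-heavy work stays on the CPU while regular, data-parallel kernels are handed to the accelerator card (over PCIe or CXL in practice).

```python
# Hedged sketch of the hybrid CPU + FPGA pattern described above.
# Task and HybridDispatcher are hypothetical illustrations, not a real API.
from dataclasses import dataclass
from typing import Callable, Any

@dataclass
class Task:
    name: str
    parallel_friendly: bool     # regular, data-parallel work suits the FPGA
    run_on_cpu: Callable[[], Any]
    run_on_fpga: Callable[[], Any]

class HybridDispatcher:
    def dispatch(self, task: Task) -> Any:
        if task.parallel_friendly:
            return task.run_on_fpga()   # in practice, offloaded over PCIe/CXL
        return task.run_on_cpu()

# Toy usage: branchy pre-processing stays on the CPU, matrix-heavy
# inference work goes to the "FPGA".
tasks = [
    Task("tokenize request", False,
         run_on_cpu=lambda: "tokens", run_on_fpga=lambda: "tokens"),
    Task("dense layer matvec", True,
         run_on_cpu=lambda: "slow result", run_on_fpga=lambda: "fast result"),
]
dispatcher = HybridDispatcher()
for t in tasks:
    print(t.name, "->", dispatcher.dispatch(t))
```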
Difficult to develop
AI has created a rare growth opportunity for almost every type of processor, and the data center market has drawn particular attention because of its high product value and profit margins. Over the past two years, leading computing chip companies such as Nvidia, cloud computing vendors entering chip design such as Google, and emerging computing chip design companies springing up like mushrooms have all been frequently updating their product lineups and technical specifications.
FPGAs, by contrast, have been far less active.

AMD's current high-end flagship products for the data center are still the Versal portfolio, which entered volume production in 2021 and comprises three product families: Versal AI Core, Versal Prime and Versal Premium. Having acquired FPGA company Xilinx, AMD is the world's largest FPGA manufacturer.

Altera's current flagship data center product, Agilex, was released in 2019 and designed on Intel's 10-nanometer process. The product range has gradually expanded since then; among its members, the Agilex 9 and the data-center-oriented Agilex 7 F-Series and I-Series flagships have entered production.
Compared with the two companies' other AI-oriented products, their FPGA product lines are not updated at the same pace.
As for why, "difficult to develop" was the most important reason the reporter heard.
Di Zhixiong, deputy dean of the School of Integrated Circuit Science and Engineering at Southwest Jiaotong University, told China Electronics News that the FPGA is an integrated chip based on digital circuits, so its development threshold and deployment difficulty are higher than those of GPUs and other computing chips.
Sheng Linghai, vice president at market research firm Gartner, told reporters that, compared with FPGAs, ASICs such as NPUs do not need to meet requirements for generality; as long as they fulfill a single function, their design is simpler.
"This is one of the reasons why NPU manufacturers have mushroomed in the past two years, but there are only two high-end FPGA manufacturers so far." Shenglinghai added.
Facing a squeeze from computing chips
Computing chips, which iterate rapidly, are also more profitable than FPGAs. The compute cards introduced by companies such as Nvidia are highly integrated, large in scale, and move quickly to new process nodes, so they can keep pace with rapidly growing computing power demand. By contrast, FPGAs find it hard to match that pace of process-node updates and performance improvement, making it difficult for them to break out of their original "comfort zone".
Li Guoqiang, a veteran of the semiconductor industry, said in an interview that FPGAs do have value in verifying the soundness of AI chip architectures, but when it comes to meeting AI's high computing power demands, their performance falls far short of dedicated AI chips.
While AMD has the world's largest FPGA business, it is also the world's second-largest supplier of CPUs and GPUs for the data center. According to its second-quarter 2024 results, data center revenue (including GPU and CPU) was $2.8 billion, while embedded revenue (including FPGA) was $861 million; the two are clearly not on the same level. Moreover, a considerable share of AMD's embedded business targets edge and endpoint applications, so the share attributable to data center FPGAs is even lower. This is probably also why AMD's product investment and publicity differ so greatly between its data center CPUs and GPUs on one hand and its FPGAs on the other: when AMD Chairman and CEO Lisa Su delivered a speech of more than an hour at COMPUTEX 2024 in June this year, the data center embedded business was barely mentioned.
Not only that, the roles FPGAs originally played in data centers also face the risk of being taken over by new computing chips.
Li Guoqiang said that because newly released AI chips are so large in scale, they gradually integrate many functions originally handled by other chips, so some FPGA functions that once helped the processor improve computing efficiency and supplemented computing and system functions have been displaced. "Just like mobile phone chips: as the main chip becomes more integrated, the market value of the surrounding peripheral chips gets squeezed out," he said.
However, whether or not its scale of application in the data center shrinks, market demand for the FPGA's chip verification role persists, and this is a market space that other product types cannot replace. Although the scale is not large, it grows steadily, just not as dramatically as computing chips. This is why FPGA suppliers' profit margins remain basically stable even when their revenue fluctuates significantly.
