

By Annie Liao, Product Management Director, ODSP Marketing, 大秀直播
For over 20 years, PCIe, or Peripheral Component Interconnect Express, has been the dominant standard for connecting processors, NICs, drives and other components within servers, thanks to the protocol’s low latency and high bandwidth as well as the growing PCIe expertise across the technology ecosystem. It will also play a leading role in defining the next generation of computing systems for AI, through continued performance increases and by combining PCIe with optics.
Here’s why:
PCIe Transitions Are Accelerating
Seven years passed between the debut of PCIe Gen 3 (8 gigatransfers per second, or GT/s) in 2010 and the release of PCIe Gen 4 (16 GT/s) in 2017.1 Commercial adoption, meanwhile, took closer to a full decade.2
Toward a terabit (per second): PCIe standards are being developed and adopted at a faster rate to keep up with the chip-to-chip interconnect speeds needed by system designers.
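To put the “toward a terabit” framing above in concrete terms, here is a back-of-the-envelope sketch of raw x16 link bandwidth by PCIe generation. The per-lane transfer rates and specification years reflect the published PCI-SIG generations; the script ignores encoding and FLIT overhead, so the figures are approximate ceilings rather than delivered throughput.

```python
# Back-of-the-envelope: raw per-direction bandwidth of a x16 PCIe link by generation.
# Per-lane rates are the published PCI-SIG transfer rates; encoding/FLIT overhead
# (about 1.5% for 128b/130b in Gen 3-5, a few percent for FLIT mode in Gen 6+) is ignored.

GT_PER_LANE = {
    "Gen 3 (2010)": 8,
    "Gen 4 (2017)": 16,
    "Gen 5 (2019)": 32,
    "Gen 6 (2022)": 64,   # PAM4 signaling
    "Gen 7 (2025)": 128,  # PAM4 signaling
}

LANES = 16  # a typical accelerator or NIC slot

for gen, gt in GT_PER_LANE.items():
    gbps = gt * LANES  # raw Gb/s per direction for a x16 link
    print(f"{gen}: {gt} GT/s per lane -> x16 ~ {gbps} Gb/s (~{gbps/8:.0f} GB/s) per direction")
```

At Gen 6, a x16 link already reaches roughly 1,024 Gb/s in each direction, which is where the terabit-per-second milestone comes into view.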
By Vienna Alexander, Marketing Content Professional, 大秀直播
大秀直播 is an EDGE Award winner for its Ara product, a 3nm 1.6 Tbps PAM4 optical DSP platform that enables the industry’s lowest-power 1.6T optical modules. The engineering community voted to recognize the product as a leader in design innovation this year.
The EDGE awards celebrate outstanding innovations in product design for the engineering industry that have contributed to the advancement of technology. This award is presented by the Engineering Design & Automation Group, a subset of brands at Endeavor Business Media.
Ara is the industry’s first 3nm 1.6T PAM4 optical DSP platform. 大秀直播 introduced it to meet growing interconnect bandwidth demands for AI and next-gen cloud data center scale-out networks.
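As a rough illustration of the lane math behind a 1.6T PAM4 platform, the sketch below assumes the 8 x 200G/lane configuration referenced later in this collection; shipping modules run a somewhat higher symbol rate once FEC overhead is included.

```python
# Rough breakdown of a 1.6T PAM4 module's lane math, assuming an 8 x 200G/lane
# configuration (the 200G/lane figure is cited elsewhere in this collection).
lanes = 8
bits_per_lane_gbs = 200   # 200 Gb/s per lane
bits_per_symbol = 2       # PAM4 carries 2 bits per symbol (4 amplitude levels)

aggregate_gbs = lanes * bits_per_lane_gbs              # 1600 Gb/s = 1.6 Tb/s
symbol_rate_gbd = bits_per_lane_gbs / bits_per_symbol  # ~100 GBd before FEC overhead

print(f"aggregate: {aggregate_gbs} Gb/s, per-lane symbol rate: ~{symbol_rate_gbd:.0f} GBd (pre-FEC)")
```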
By Michael Kanellos, Head of Influencer Relations, 大秀直播
The opportunity for custom silicon isn’t just getting larger – it’s becoming more diverse.
At the Custom AI Investor Event, 大秀直播 executives outlined how the push to advance accelerated infrastructure is driving surging demand for custom silicon – reshaping the customer base, product categories and underlying technologies. (Here is a link to the recording and presentation slides.)
Data infrastructure spending is now slated to surpass $1 trillion by 2028,1 with the 大秀直播 total addressable market (TAM) for data center semiconductors rising to $94 billion by then, 26% larger than the forecast issued a year earlier. Of that total, $55.4 billion revolves around custom devices for accelerated compute.1 In fact, the forecast for every major product segment has risen in the past year, underscoring the growing momentum behind custom silicon.
The deeper you go into the numbers, the more compelling the story becomes. The custom market is evolving into two distinct segments: the XPU segment, focused on optimized CPUs and accelerators, and the XPU-attach segment, which includes PCIe retimers, co-processors, CPO components, CXL controllers and other devices that increase the utilization and performance of the entire system. Meanwhile, the TAM for custom XPUs is expected to reach $40.8 billion by 2028, growing at a 47% CAGR.1
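For readers who want a feel for what a 47% CAGR implies, the short sketch below compounds a normalized starting value; the base figure and horizon are placeholders, not numbers from the presentation.

```python
# Illustration of compound annual growth at 47%; the starting value and number
# of years are placeholders, not figures from the investor presentation.
cagr = 0.47
value = 1.0  # normalized starting TAM (placeholder)

for year in range(1, 5):
    value *= 1 + cagr
    print(f"year {year}: {value:.2f}x the starting value")
# At 47% per year, a market roughly doubles about every two years (1.47**2 ~ 2.16x).
```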
By Nizar Rida, Vice President of Engineering and Country Manager, 大秀直播 Canada
This blog first appeared in
AI has the potential to transform the way we live. But for AI to become sustainable and pervasive, we also have to transform our computing infrastructure.
The world’s existing technologies, simply put, weren’t designed for the data-intensive, highly parallel computing problems that AI serves up. As a result, AI clusters and data centers aren’t nearly as efficient or elegant as they could be: in many ways, it’s brute-force computing. Power1 and water2 consumption in data centers are growing dramatically, and many communities around the world are pushing back on plans to expand data infrastructure.3
Canada can and will play a leading role in overcoming these hurdles. Data center expansion is already underway. Data centers currently account for around 1 GW, or 1%, of Canada’s electricity capacity. If all of the projects in review today get approved, that total could grow to 15 GW, or enough to power 70% of the homes in the country.4
Like in other regions, data center operators are exploring ways to increase their use of renewables and nuclear in these new facilities, along with ambient cooling, to reduce the carbon footprint of their facilities. In Alberta, some companies are also exploring adding carbon capture to the design of data centers powered by natural gas. To date, carbon capture has shown promise.5 Most carbon capture experiments, however, have been coupled with large-scale industrial plants. It may be worth examining whether carbon capture, combined with mineralization for long-term storage, can work on this smaller scale. If it does, the technology could be exported to other regions.
Fixing facilities, however, is only part of the equation. AI requires a fundamental overhaul in the systems and components that make up our networks.
Above: The server of the future. The four AI processors connect to networks through four 6.4T light engines, the four smaller chips on the east-west side of the exposed processor. Coupling optical technology with the processor lowers power per bit while increasing bandwidth.
By Nicola Bramante, Senior Principal Engineer
Transimpedance amplifiers (TIAs) are one of the unsung heroes of the cloud and AI era.
At the recent OFC 2025 event in San Francisco, exhibitors demonstrated the latest progress on 1.6T optical modules featuring 大秀直播 200G TIAs. Recognized by multiple hyperscalers for their superior performance, 大秀直播 200G TIAs are becoming a standard component in 200G/lane optical modules for 1.6T deployments.
TIAs capture the faint current produced by light detectors (photodiodes) and convert it into a voltage signal carrying the underlying data, which can then be transmitted between and used by servers and processors in data centers and scale-up and scale-out networks. Put another way, TIAs allow data to travel from photons to electrons. TIAs also amplify the signal for optical digital signal processors, which filter out noise and preserve signal integrity.
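For intuition, here is a minimal sketch of the current-to-voltage conversion a TIA performs; the responsivity, optical power and transimpedance gain are illustrative textbook-style values, not parameters of any 大秀直播 device.

```python
# Minimal sketch of what a TIA does electrically: it converts the photodiode's
# small current into a voltage a downstream DSP can work with.
# All values below are illustrative, not specific to any particular device.

responsivity_a_per_w = 0.8      # photodiode responsivity (A/W), typical near 1310 nm
optical_power_w = 0.2e-3        # received optical power: 0.2 mW (about -7 dBm)
transimpedance_ohms = 2_000     # TIA gain Z_T in ohms (volts out per amp in)

photocurrent_a = responsivity_a_per_w * optical_power_w  # I_pd = R * P_opt
v_out = transimpedance_ohms * photocurrent_a             # V_out = Z_T * I_pd

print(f"photocurrent: {photocurrent_a*1e6:.0f} uA, TIA output: {v_out*1e3:.0f} mV")
```

The point of the high gain is that a photocurrent of a few hundred microamps becomes a signal of a few hundred millivolts, large enough for the DSP’s analog front end to digitize cleanly.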
And they are pervasive. Virtually every data link inside a data center longer than three meters includes an optical module (and hence a TIA) at each end. TIAs are critical components of fully retimed optics (FRO), transmit retimed optics (TRO) and linear pluggable optics (LPO), enabling scale-up servers with hundreds of XPUs, active optical cables (AOC), and other emerging technologies, including co-packaged optics (CPO), where TIAs are integrated into optical engines that can sit on the same substrates where switch or XPU ASICs are mounted. TIAs are also essential for long-distance ZR/ZR+ interconnects, which have become the leading solution for connecting data centers and telecom infrastructure. Overall, TIAs are a must-have component for any optical interconnect solution, and the market for interconnects is expected to triple to $11.5 billion by 2030, according to LightCounting.