With artificial intelligence (AI) spearheading many of the innovations in the microelectronics industry, Mike Ranjram, assistant professor at Arizona State University, shared his insights on data center power delivery for the evolving AI ecosystem. Speaking to a room of advanced packaging professionals at IMAPS's last 2024 chapter event at Saras Micro Devices in Chandler, Arizona, he underscored the challenges brought on by AI's rising power demands.
AI’s need for immense amounts of power is one of many hurdles preventing the technology from evolving beyond its current state. In my latest blog, I even cited power management as the biggest issue facing AI scalability today.
To put this in perspective, Ranjram shared that over the next five years, Arizona's grid will see a staggering 40% increase in peak load. As of now, Arizona Public Service, a local energy company, has 10 gigawatts (GW) in pending interconnect requests from data centers. For context, a gigawatt is the scale at which large power plants operate: a typical nuclear reactor produces about 1 GW, so these requests alone amount to roughly ten large plants' worth of capacity. Suffice it to say, 10 GW is a lot, and power needs only ever seem to increase.
Addressing Power Delivery Challenges for AI
To accommodate AI, Ranjram shared that upstream converters must keep up with today’s power demands without neglecting isolation and power factor correction. To break that down:
- Upstream converters are the power conversion stages earlier in the delivery chain, conditioning power before it reaches the load
- Power isolation galvanically separates a converter's input from its output, protecting downstream electronics and users from faults and electric shock
- Power factor correction shapes a converter's input current so it tracks the AC line voltage, letting the grid deliver real, usable power instead of circulating reactive power (see the sketch after this list)
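To make the power factor point concrete, here is a minimal sketch, using assumed numbers of my own rather than anything from Ranjram's talk, of how a poor power factor forces a facility to draw more current from the grid for the same useful power:

```python
# How power factor affects grid current draw.
# All figures below are assumptions for illustration.

def input_current(real_power_w: float, v_rms: float, power_factor: float) -> float:
    """RMS input current needed to deliver a given real power.

    Apparent power S = V * I and real power P = S * PF, so I = P / (V * PF).
    """
    return real_power_w / (v_rms * power_factor)

RACK_POWER_W = 10_000  # assumed 10 kW rack
LINE_V_RMS = 480       # assumed 480 V AC feed

for pf in (0.70, 0.90, 0.99):
    amps = input_current(RACK_POWER_W, LINE_V_RMS, pf)
    print(f"PF {pf:.2f}: {amps:.1f} A for {RACK_POWER_W / 1000:.0f} kW of real power")
```

At a power factor of 0.70, this hypothetical 10 kW rack pulls roughly 30 A from the line; corrected to 0.99, it pulls about 21 A. Every unnecessary amp means extra conduction loss upstream, which is why PFC matters at data center scale.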
In summary, each of these is a key consideration, and failure to adequately address even one will create power delivery issues down the line. As Ranjram put it, “Voltages go up, and everyone says, ‘great, we solved it.’ And the power levels go up, and the current levels go up again, so you end up back in the same corner of the problems you’ve got to fix.”
Although power delivery concerns seem to arise like Whac-a-Mole, Ranjram highlighted that using upstream converters with more power density is key for mitigating this issue, at least for now. By packing more power into the same footprint, the power delivery industry can help reduce stress on the grid until AI's power needs rise again. Migration from 12 volt (V) to 48V converters, he said, is a short-term approach helping to alleviate the issue: because power is the product of voltage and current, quadrupling the bus voltage cuts the current to a quarter for the same power, and conduction losses, which scale with the square of current, fall sixteenfold. The calculation below makes this concrete.
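Here is a minimal sketch of that arithmetic, with a load and distribution resistance of my own choosing:

```python
# Why 48 V distribution beats 12 V: for the same power, current scales
# as 1/V and conduction loss as I^2 * R. Figures below are assumptions.

BUS_RESISTANCE_OHM = 0.001  # assumed 1 milliohm of distribution resistance
LOAD_POWER_W = 1_000        # assumed 1 kW load

for bus_v in (12, 48):
    current = LOAD_POWER_W / bus_v          # I = P / V
    loss = current**2 * BUS_RESISTANCE_OHM  # P_loss = I^2 * R
    print(f"{bus_v} V bus: {current:.1f} A, {loss:.2f} W lost in distribution")
```

The 12 V bus carries about 83 A and loses roughly 6.9 W in this toy example; the 48 V bus carries about 21 A and loses around 0.43 W, a sixteenfold reduction for the same delivered power.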
Upstream converters aren't the only bottleneck, however: point-of-load (POL) converters struggle to keep pace with rising power demands as well. POLs are DC-to-DC converters placed as close to the power load as possible, and they're widely used to feed high-performance semiconductors. Most POLs, Ranjram said, are multi-phase buck converters or a similar variant. In the field of power delivery, multi-phase buck converters are mature, proven, and widely used, but they have a scaling issue.
Despite scaling setbacks, POLs have the benefit of parallelization, or the ability to connect multiple power stages together. Parallelization increases the total available current by distributing the load evenly across the connected phases, as the sketch below illustrates. In addition, POLs are attractive because they have broad market support, they don't need ultra-low voltage blocking devices, and current data center architecture is widely set up to accommodate them.
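As a rough illustration of why parallelization scales so gracefully, here is a sketch with an assumed per-phase resistance and load current, not figures from the talk: splitting the load across N phases divides the per-phase current by N, and total conduction loss drops by the same factor.

```python
# Multi-phase current sharing: per-phase current is total current / N,
# and total conduction loss = N * (I/N)^2 * R = I^2 * R / N.
# All figures below are assumptions for illustration.

def phase_stats(total_current_a: float, phases: int, r_phase_ohm: float):
    """Per-phase current and total conduction loss, assuming ideal sharing."""
    i_phase = total_current_a / phases
    total_loss = phases * i_phase**2 * r_phase_ohm
    return i_phase, total_loss

TOTAL_CURRENT_A = 400  # assumed load current for a modern accelerator
R_PHASE_OHM = 0.005    # assumed 5 milliohms of resistance per phase

for n in (4, 8, 16):
    i_phase, loss = phase_stats(TOTAL_CURRENT_A, n, R_PHASE_OHM)
    print(f"{n:>2} phases: {i_phase:.0f} A per phase, {loss:.0f} W total loss")
```

Doubling the phase count halves the total conduction loss in this idealized model, which is exactly why accelerator boards sprout so many converter phases.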
Given all of the positives associated with POLs, the industry seems unlikely to move away from them, or to fully resolve their scaling concerns, in the foreseeable future. Ranjram highlighted NVIDIA's commercial state-of-the-art H100 GPU card as an example of the industry's reliance on these converters.
“Most of the area on the card is a power converter,” he said. “When NVIDIA decides to sell you this, what they’re really selling you is 28 power converters surrounding their amazing chip.”
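A rough back-of-the-envelope calculation, using power and voltage figures I am assuming rather than NVIDIA's published specifications, shows why so much of the card is conversion hardware:

```python
# Why an accelerator card needs dozens of converter phases.
# Power and core voltage below are assumptions; only the converter
# count (28) comes from Ranjram's description of the H100 card.

GPU_POWER_W = 700     # assumed power draw of a high-end accelerator
CORE_VOLTAGE_V = 0.8  # assumed core voltage of the chip
PHASES = 28           # converter count cited in the talk

total_current = GPU_POWER_W / CORE_VOLTAGE_V  # I = P / V
print(f"Total core current: {total_current:.0f} A")  # ~875 A
print(f"Per phase: {total_current / PHASES:.0f} A")  # ~31 A per phase
```

Hundreds of amps at well under a volt is more current than any single converter can comfortably deliver, so the load gets split across dozens of phases packed around the die.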
A Path Forward
So, in addition to increasing power density and leveraging parallelization, what else can be done to mitigate AI scaling concerns? Vertical power delivery (VPD), Ranjram said, may be part of the solution.
He explained that today's standard lateral power converters route current horizontally across the board to reach the point of load, which can create issues with impedance, a measure of how much a circuit resists alternating current. The more impedance in the delivery path, the larger the voltage drop across it and the more power is lost along the way. VPD structures mitigate this by placing the converter directly beneath the load, shortening the distance the currents need to travel, and they also have the added benefits of lower resistance and inductance. The comparison below gives a feel for the difference.
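As a simplified illustration, consider the conduction loss over an assumed lateral path versus an assumed vertical one; the geometry and resistance figures here are my own, not from the talk:

```python
# Lateral vs. vertical power delivery: conduction loss scales with path
# resistance, which scales with path length. All figures are assumptions.

RESISTANCE_PER_MM = 0.000002  # assumed 2 micro-ohms per mm of path
LOAD_CURRENT_A = 875          # current from the earlier rough estimate

paths_mm = {
    "lateral (across the board)": 25.0,   # assumed 25 mm path
    "vertical (through the board)": 1.0,  # assumed 1 mm path
}

for name, length_mm in paths_mm.items():
    r_ohm = length_mm * RESISTANCE_PER_MM
    loss_w = LOAD_CURRENT_A**2 * r_ohm  # P_loss = I^2 * R
    print(f"{name}: {r_ohm * 1000:.3f} mOhm, {loss_w:.1f} W lost")
```

Even with these toy numbers, shortening the path from tens of millimeters to roughly one cuts the delivery loss in proportion, which is the core appeal of VPD.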
Given the complexity of AI’s power demands, this blog barely scratches the surface of the work that needs to be done. Nevertheless, I’m looking forward to the solutions that will inevitably emerge.