SEMICON West 2024 Keynotes

Deepak Chopra said, “All great changes are preceded by chaos.”

Here’s my Aha! moment from SEMICON West 2024: after the great chaos precipitated by the pandemic, geopolitical tensions, the climate crisis, and the talent shortage, one singular force is driving unprecedented change: Generative AI.

Generative AI was referenced in almost every talk presented at the show and in every conversation I had as the official podcaster of SEMICON West. It’s no wonder: as SEMI President Ajit Manocha noted, Gen AI currently drives 52% of semiconductor market growth. “HBM” and “digital twins” were also bandied about as industry trends driving change.

AI’s Impact on Market

Generative AI is impacting the markets by upending how we measure growth.

For many years, the market was volume-driven, and manufacturers like TSMC and Intel reaped the profits. Now, it is value-driven, and design companies like Nvidia are getting the lion’s share of the profits. Charles Shi of Needham & Company said Nvidia is replacing Intel as the King of 2024. (Why “King,” I wonder, and why not Queen? But I digress.)

This is because growth is less about the PC and mobile markets, where cost is crucial, and more about AI chipsets for data centers, where high performance is critical. Because chip volumes are lower, Shi predicted this will negatively impact the wafer fab equipment (WFE) market, which, after hitting a historical peak in 2024, is set to decline for the first time since 2015.

I asked Jean-Christophe Eloy of Yole Group for his take on this, because with all the fab expansion expected in the next few years, it didn’t track that equipment sales would hit a slump. He explained that the market will stabilize at a higher level after such enormous growth (50% in the past year), and that this stabilization shouldn’t be confused with a decline in WFE. Equipment companies will still do just fine.

AI’s Impact on Chips Act Funding

The American semiconductor industry faces challenges due to supply chain disruptions, and investing in domestic capacity is crucial for technological leadership, noted Under Secretary Laurie Locascio in her keynote address.

She cited AI as one of the critical areas where the US hopes to dominate. To this end, the administration has proposed $50 billion in funding for the industry, which will support the addition of 19 greenfield fabs in the U.S.

Some say that $50B is a drop in the bucket compared to the investments of other regions – such as the Chinese government, which invested $100B ten years ago and another $100B since then. However, Locascio noted that the US investment goes beyond government funding.

“The total public and private investment from our four leading-edge companies will equal roughly 300 billion between now and the end of the decade, far and away, the most investment in new production in the history of the US semiconductor industry,” she said.

“Prior to 2022 and the passage of the Chips and Science Act, the US produced 0% of the world’s leading-edge chips,” said Locascio, noting that the US is on pace to grow its share of (advanced node) logic manufacturing to 28% by 2032. This is according to a recent Semiconductor Industry Association report.

I fact-checked this with SEMI Market Intelligence analyst Christian Dieseldorff, who confirmed that this number applies only to the market share of advanced node technology at 7nm and below. Before 2019, the most advanced nodes produced in the U.S. were 12nm.

During his Market Symposium presentation, Dieseldorff laid out a detailed explanation of global fab expansion, where the fabs are located, and the impact investment will make on regional market share (Figure 1).

Figure 1: Details on volume fabs that started construction in 2020-2022, versus where they are today. (Source: SEMI MIT)

Due to investment across the globe, he said, each region is unlikely to increase its market share beyond current levels; but if the investment is not made, regions risk losing market share. For the U.S., that number is 10% of total capacity, including mature technologies. By 2027, it will be difficult for that to go beyond 13%, he explained.

Locascio also announced a $1.6 billion open competition for R&D activities to advance packaging in the US, critical for addressing sustainability concerns and enabling leadership in AI. She invited the microelectronics community to propose prototypes demonstrating research advances in packaging flows. Locascio expanded on this initiative in a 3D InCites podcast interview that drops on Thursday, July 18.

Generative AI and Sustainability

As Marco Chisari of Samsung Electronics noted during Monday’s Market Symposium, artificial intelligence – specifically Generative AI – is powering the next wave of semiconductor innovation. But not always for the reasons you might think.

Chisari provided some sobering statistics. While computing power scales at 2x – 3x per year, AI models are scaling at 10x per year. The bottom line: there isn’t enough electricity to support total Generative AI demand. Global energy usage is growing faster than production. So what can we do to reduce power consumption?
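The compounding effect of those two growth rates is easy to underestimate. Here is a quick back-of-the-envelope sketch (illustrative only, using the midpoint of Chisari’s 2x–3x figure for compute against 10x for model scale):

```python
# Back-of-the-envelope: how fast AI model growth outpaces compute growth,
# using rough figures from Chisari's talk (illustrative, not a forecast).
COMPUTE_GROWTH = 2.5   # computing power scales ~2x-3x per year (midpoint)
MODEL_GROWTH = 10.0    # AI model scale grows ~10x per year

gap = 1.0
for year in range(1, 6):
    gap *= MODEL_GROWTH / COMPUTE_GROWTH
    print(f"Year {year}: models outpace compute by {gap:.0f}x")
# After 5 years, the gap is 1024x
```

Even granting compute the optimistic midpoint, the demand curve pulls away from the supply curve by roughly 4x every year, which is why the electricity question dominates the conversation.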

One of the most compelling rhetorical questions Chisari asked in his talk was, “Do we really need so much computing? The industry isn’t asking this question enough.” I find myself asking that question every day.

Chisari described the historical trend of increasing compute speeds to improve performance. If we continue on this trajectory, how much compute power will Gen AI require in 2025? Samsung is trying to work with companies that don’t need matrix and vector multiplication for compute, he said. Chisari called the situation a “three fundamental dilemma” comprising power, the memory wall, and capacity.

During the CEO Summit, Mousumi Bhat, SEMI’s Chief Sustainability Officer, asked a panel that included Frank Sanders, Intel; Angela Baker, Qualcomm; and John Powers, Schneider Electric, how we can address generative AI’s thirst for energy. They offered solutions ranging from stepping up renewable energy production to accelerating the progress of AI at the edge to reduce the power demands on the grid (Figure 2).

Figure 2: SEMI Chief Sustainability Officer, Mousumi Bhat discusses ways to quench AI’s thirst for energy with Frank Sanders, Intel; Angela Baker, Qualcomm, and John Powers, Schneider Electric.

When I asked Stephan Haferl, CEO of Comet Group, what he thought of that solution, he said he had doubts simply because there is not enough battery power at the edge (on a local PC, for example) to run an AI chipset.

“You need the data centers, and ideally, would have a super performing 5G, 6G, or 7G network with practically no latency communication between the devices,” he said.

At Comet Xylon, a division of Comet Group, they are putting AI to work in x-ray systems to find defects. “This is the perfect playground for AI because it’s all about investigating patterns to detect anomalies and defects,” he said.

3D Packaging Leads the Way

AI’s thirst for power is driving the transition from traditional CMOS scaling to 3D heterogeneous integration and chip stacking to reduce power needs. It’s even inspired a change in the very definition of Moore’s Law. (Although if you recall, that began back in 2017 when someone took the time to read beyond the first paragraph of Gordon Moore’s famous paper.)

More IDMs and foundries – traditionally focused on front-end technology advancement – are integrating advanced packaging solutions into their roadmaps to address this dilemma.

According to Chisari, Samsung’s approach to solving this is through leading-edge gate-all-around technology, high-bandwidth memory, and 2.5D and 3D packaging (Figure 3). Instead of traditional stand-alone HBM integrated on an interposer, Custom HBM is stacked on top of an advanced logic die to achieve power savings.

Figure 3: In Samsung’s iCube™ package platform, custom HBM is stacked on top of advanced logic die to achieve power savings.

Haferl is also firmly in the 3D advanced packaging camp when it comes to addressing some of the energy challenges. 3D stacking lets you shorten the wires between memory and compute, he explained, which helps reduce transmission and power losses along those wires. “That’s why everything is being 3D-ified,” he said.

When the conversation leads with the importance of advanced packaging in AI applications to reduce power consumption and overcome capacity and bandwidth limitations of the memory wall – you know we’ve arrived. ~ FvT


To catch more of our SEMICON West coverage, stay tuned for blog posts from Dean Freeman and Jillian Carapella, and tune in to the 3D InCites Podcast.

Francoise von Trapp
