The widespread deployment of 3D-stacked CMOS image sensors (CIS) in consumer electronics, namely smartphones, by handset makers both domestic (Apple, iPhone) and overseas (Samsung, Galaxy), is proof that 3D integration has pivoted over the last ten years. It went from a technology useful only for fairly esoteric applications and high-ASP products to one that reached the right market, at the right cost, at the right time, in volumes high enough to push yields up, push costs down, and, particularly in Sony’s case, put money in the bank. (See Bloomberg on how, for Sony, “Image Sensors Bigger Than Spider Man, Beyonce, or TVs.”)
An exploding market
According to Yole Developpement, as reported by Peter Clarke, “The (CIS) market was up 19.8% from $11.6B in 2016 mainly driven by smartphones and the desire to add improved cameras. However, Yole believes the CMOS image sensor has a bright future driven by new applications in autonomous vehicles and industrial and machine vision. By 2023 Yole predicts the annual market will have climbed to more than $23B, a compound annual growth rate of 9.4% from 2017 to 2023.” That’s a lot of simoleons. And that’s a lot of CIS.
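For the spreadsheet-inclined, the arithmetic behind those projections is easy to sanity-check. The Python sketch below assumes the 2017 base implied by the quoted figures (roughly $11.6B × 1.198 ≈ $13.9B) and uses $23B as the 2023 endpoint; the exact end value behind Yole’s 9.4% figure is theirs, not ours, and sits a bit above that.

```python
# Sanity-check the quoted CIS market CAGR (illustrative figures only).
# The 2016 value and the 19.8% growth come from the quoted Yole numbers;
# the 2023 endpoint of "more than $23B" is Yole's projection, approximated here.

def cagr(start_value: float, end_value: float, years: int) -> float:
    """Compound annual growth rate over `years` periods."""
    return (end_value / start_value) ** (1 / years) - 1

market_2016 = 11.6                      # $B, per Yole (as quoted)
market_2017 = market_2016 * 1.198       # up 19.8% -> roughly $13.9B
market_2023 = 23.0                      # $B, "more than $23B" projected

growth = cagr(market_2017, market_2023, years=2023 - 2017)
print(f"Implied 2017 base: ${market_2017:.1f}B")
print(f"Implied 2017-2023 CAGR: {growth:.1%}")  # ~8.8% with a $23B endpoint;
                                                # Yole's 9.4% is consistent with
                                                # an end value a bit above $23B
```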
As Coventor reports, “A 3D-stacked image sensor consists of a backside illuminated (BSI) image-sensor die, face-to-face stacked on a logic die. The motivation to invest in stacked chip BSI CIS development has been somewhat varied depending upon the manufacturer, but can be summarized…”
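For readers who think in code better than in cross-sections, here is a minimal sketch of that stack as a data structure. It only restates what the Coventor description above says (a BSI sensor die face-to-face on a logic die); the field names and the note on what the logic die handles are illustrative assumptions, not any particular manufacturer’s recipe.

```python
from dataclasses import dataclass

@dataclass
class Die:
    name: str
    role: str

@dataclass
class StackedCIS:
    """Toy model of a 3D-stacked BSI image sensor (illustrative only)."""
    top: Die       # backside-illuminated pixel-array die; light enters from the back
    bottom: Die    # logic die (readout/processing role is an assumption here)
    bond: str      # how the two dies meet

sensor = StackedCIS(
    top=Die("BSI image-sensor die", "pixel array"),
    bottom=Die("logic die", "readout and processing (assumed)"),
    bond="face-to-face die stacking",
)
print(sensor)
```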
Who talks on their phone anymore?
Sure, people like their smartphones for texting, talking, and surfing the internet, but what they really like is recording videos and snapping still photos; build a better camera into a phone and you’ve built a better phone. Engadget UK ranks the most important smartphone features this way:
- Design and build quality
- Screen
- Great camera
- Headphone jack
- Battery life
- Processor power
- Price
A picture’s worth a thousand words, and even if Andy Instagram or Sally Snapchat doesn’t know it, 3D-stacked CIS have made their social media feeds insanely great.
Is AI the next stop for 3D ICs?
Where will 3D and heterogeneous integration technologies be in the next 10 years? How about a leap from big silicon in the data center to small silicon making on-the-fly decisions at the edge?
We already know about high-bandwidth memory (HBM) integrated with graphics processing units (GPUs) for high-performance computing and autonomous vehicles. But if the direction IBM is heading (“IBM Guns for 8-bit AI Breakthroughs”) pans out, analog artificial intelligence (AI) chips, using 8-bit precision in-memory multiplication with projected phase-change memory, may supplant trillion-transistor GPUs as the industry moves from “narrow AI” (puppy or muffin?) to “broader AI” (reading medical images, for example).
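To make the 8-bit idea concrete, here is a rough sketch comparing a full-precision dot product against one computed from 8-bit quantized operands. This is a plain digital simulation in Python using a common symmetric-quantization assumption, not IBM’s analog in-memory scheme or its projected phase-change memory.

```python
import numpy as np

def quantize_int8(x: np.ndarray):
    """Symmetric 8-bit quantization: map floats onto integers in [-127, 127]."""
    scale = np.max(np.abs(x)) / 127.0
    q = np.round(x / scale).astype(np.int8)
    return q, scale

rng = np.random.default_rng(0)
weights = rng.standard_normal(1024).astype(np.float32)
activations = rng.standard_normal(1024).astype(np.float32)

# Full-precision reference.
ref = float(weights @ activations)

# 8-bit path: multiply-accumulate in integers, rescale once at the end.
qw, sw = quantize_int8(weights)
qa, sa = quantize_int8(activations)
approx = int(qw.astype(np.int32) @ qa.astype(np.int32)) * sw * sa

print(f"fp32 dot product : {ref:.4f}")
print(f"int8 dot product : {approx:.4f}")
print(f"relative error   : {abs(approx - ref) / abs(ref):.2%}")
```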
Since “existing hardware can’t efficiently handle the largest neural networks that researchers have built,” we’re probably going to be heterogeneously packaging more devices together with memory to run those 8-bit precision calculations, and deploying heterogeneously integrated SiPs wherever the world needs ubiquitous intelligence.