As any driver knows, accidents can happen in the blink of an eye. So, when it comes to the camera system in autonomous vehicles, processing time is critical. The time it takes for the system to take an image and transmit the data to the microprocessor for image processing can be the difference between avoiding an obstacle or ending up in a major accident.
In-sensor image processing, in which important features are extracted from the raw data by the image sensor itself instead of the separate microprocessor, can speed up visual processing. To date, in-sensor processing demonstrations have been limited to emerging research materials that are, at least for now, difficult to integrate into commercial systems.
Now, researchers at Harvard’s John A. Paulson School of Engineering and Applied Sciences (SEAS) have developed the first sensor-integrated processor that could be incorporated into commercial silicon imaging chips – known as complementary metal-oxide-semiconductor (CMOS) image sensors – which are used in almost all commercial devices that need to capture visual information, including smartphones.
The research is published in Nature Electronics.
“Our work can leverage the consumer semiconductor electronics industry to rapidly bring in-sensor computing to a wide variety of real-world applications,” said Donhee Ham, Gordon McKay Professor of Electrical Engineering and Applied Physics at SEAS and senior author of the paper.
Ham and his team developed an array of silicon photodiodes. Commercially available image-sensing chips also have a silicon photodiode array to capture images, but the team’s photodiodes are electrostatically doped, which means that the sensitivity of the individual photodiodes, or pixels, to incoming light can be tuned by applied voltages. A network connecting multiple voltage-tunable photodiodes can perform an analog version of the multiply-and-add operations at the heart of many image-processing pipelines, extracting relevant visual information as soon as the image is captured.
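To make the idea concrete, here is a minimal numerical sketch of the multiply-and-add principle described above. It is not the team’s circuit: the function names and the assumption that each pixel’s photocurrent equals light intensity times a voltage-set responsivity, with currents summing on a shared wire, are illustrative simplifications.

```python
# Illustrative sketch, not the actual device: each photodiode's photocurrent
# is modeled as its light intensity scaled by a voltage-tuned responsivity
# (the "weight"). Summing the currents of wired-together photodiodes then
# performs an analog multiply-and-add in a single readout.

def photocurrent(light, responsivity):
    """Current from one pixel: intensity times its voltage-set sensitivity."""
    return light * responsivity

def summed_output(light_patch, weights):
    """Currents from a patch of pixels sum on a shared line, yielding the
    weighted sum (a dot product) without any digital multiplier."""
    return sum(photocurrent(l, w) for l, w in zip(light_patch, weights))

# A 3-pixel patch with weights acting as a simple edge detector [-1, 0, 1]:
# the output is large only where intensity changes across the patch.
patch = [0.2, 0.5, 0.9]
weights = [-1.0, 0.0, 1.0]
result = summed_output(patch, weights)
```

The key point is that the "multiplication" happens in the physics of each pixel and the "addition" happens on the wire, so the weighted sum exists before any data reaches a microprocessor.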
“These dynamic photodiodes can simultaneously filter images as they are captured, moving the first stage of vision processing from the microprocessor to the sensor itself,” said Houk Jang, a postdoctoral fellow at SEAS and first author of the paper.
The silicon photodiode array can be programmed with different image filters to remove unnecessary detail or noise for various applications. An imaging system in an autonomous vehicle, for example, may require a high-pass filter to track lane markings, while other applications may require a blurring filter to reduce noise.
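The two filter types mentioned above can be sketched in software with standard 3x3 kernels. This is a hypothetical digital stand-in, assuming the programmable array behaves like a reconfigurable convolution kernel; the specific kernel values are common textbook choices, not taken from the paper.

```python
import numpy as np

# Two textbook 3x3 kernels standing in for reprogrammed pixel weights.
# High-pass: emphasizes edges (e.g. lane markings); Blur: averages away noise.
HIGH_PASS = np.array([[-1., -1., -1.],
                      [-1.,  8., -1.],
                      [-1., -1., -1.]])
BLUR = np.ones((3, 3)) / 9.0

def filter_image(img, kernel):
    """Slide a 3x3 kernel over the image ('valid' region, no padding).
    Both kernels are symmetric, so correlation equals convolution here."""
    h, w = img.shape
    out = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            out[i, j] = np.sum(img[i:i+3, j:j+3] * kernel)
    return out
```

On a perfectly uniform image, the high-pass kernel outputs zero everywhere (no edges to report), while the blur kernel reproduces the uniform value; switching between behaviors is just a change of weights, which is what reprogramming the photodiode sensitivities would accomplish in hardware.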
“For the future, we foresee the use of this silicon-based sensor-integrated processor not only in machine vision applications, but also in bio-inspired applications, in which early information processing enables the co-localization of sensor and computational units, such as in the brain,” said Henry Hinton, graduate student at SEAS and co-first author of the paper.
Next, the team aims to increase the density of the photodiodes and integrate them into silicon integrated circuits.
“By replacing the standard non-programmable pixels of commercial silicon image sensors with the programmable ones developed here, imaging devices can intelligently discard unnecessary data, and thus could be made more power and bandwidth efficient to meet the demands of the next generation of sensory applications,” Jang said.
The research was co-authored by Woo-Bin Jung, Min-Hyun Lee, Changhyun Kim, Min Park, Seoung-Ki Lee, and Seongjun Park. It was supported by the Samsung Advanced Institute of Technology under contract A30216 and by the National Science Foundation Science and Technology Center for Integrated Quantum Materials under contract DMR-1231319.