Recently, a research team led by Professor Shen Guozhen and researcher Wang Zhuoran at the Institute of Flexible Electronic Devices and Intelligent Manufacturing, School of Integrated Circuits and Electronics, Beijing Institute of Technology, published a paper titled 'A symmetry-reconfigurable photodiode for sensing and computing' in the journal Nature Electronics. The study introduces a symmetry-reconfigurable photodiode (SRPD) that integrates two operating modes, sensing and computing, in a single device.
The team also carried out system-level validations for two application scenarios, transmission imaging and neuromorphic eye-machine interaction, offering a device-level solution for low-power edge vision intelligence systems. The SRPD is built on the I-V-VI group semiconductor AgBiS₂. In sensing mode, the device performs wide-spectrum transmission imaging spanning the ultraviolet to the short-wave infrared; in asymmetric (computing) mode, it exhibits non-volatile, bipolar modulation of its photoresponse weights.
The team achieved monolithic integration of the SRPD on a 64×64 TFT chip and validated its compatibility with silicon-based readout circuits. Exploiting the device's in-situ photocurrent computing, the researchers implemented convolution operations such as image edge extraction and image sharpening, and built an artificial neural network classifier, demonstrating the device's advantages for in-sensor information preprocessing and parallel analog computing.
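The in-sensor convolution idea can be illustrated numerically: each detector's bipolar (signed) photoresponsivity plays the role of one kernel weight, and the summed photocurrent of a kernel-sized patch yields one output pixel. Below is a minimal software sketch of that principle, assuming an idealized linear photoresponse; the kernel and images are illustrative and not taken from the paper.

```python
import numpy as np

def in_sensor_convolve(image, kernel):
    """Simulate in-sensor convolution: each pixel's signed weight
    (its programmed responsivity) multiplies the local light
    intensity, and the patch's summed photocurrent gives one
    output value (valid padding, no flipping)."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # Summed photocurrent = sum(intensity * signed weight)
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# Laplacian-style kernel for edge extraction (weights sum to zero,
# so uniform illumination produces zero net output current)
edge_kernel = np.array([[ 0, -1,  0],
                        [-1,  4, -1],
                        [ 0, -1,  0]], dtype=float)

flat = np.ones((5, 5))
print(in_sensor_convolve(flat, edge_kernel))  # all zeros: no edges
```

Because the negative weights are realized physically (as reversed photocurrent polarity), the subtraction happens in the current summation itself rather than in digital post-processing, which is where the parallelism and power savings come from.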
The SRPD also shows promise for neuromorphic eye-machine interaction. Combined with a hybrid convolutional neural network, it recognizes eye-movement directions with high accuracy, and the team demonstrated real-time eye-tracking-based coordination of a robotic arm and follow-up control of a UAV.
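The eye-machine-interaction pipeline can be caricatured in a few lines: locate the pupil (the dark region of an eye image) and map its offset from the frame centre to a gaze direction. The sketch below is a toy centroid heuristic standing in for the paper's hybrid CNN, assuming an idealized bright-background frame where the pupil is the darkest region; all names and the threshold are illustrative.

```python
import numpy as np

def classify_gaze(frame):
    """Toy gaze classifier (NOT the paper's network): weight each
    pixel by its darkness so the pupil dominates, take the darkness
    centroid, and map its offset from the frame centre to a label."""
    h, w = frame.shape
    darkness = frame.max() - frame          # pupil -> large values
    ys, xs = np.indices(frame.shape)
    cy = (ys * darkness).sum() / darkness.sum()
    cx = (xs * darkness).sum() / darkness.sum()
    dy, dx = cy - (h - 1) / 2, cx - (w - 1) / 2
    if max(abs(dy), abs(dx)) < 0.5:         # illustrative dead zone
        return "centre"
    # Dominant axis decides the direction label
    if abs(dy) > abs(dx):
        return "down" if dy > 0 else "up"
    return "right" if dx > 0 else "left"

frame = np.ones((9, 9))
frame[3:6, 0:3] = 0.0                       # dark pupil at the left
print(classify_gaze(frame))                 # -> "left"
```

In a real system this direction label would feed the robotic-arm or UAV controller; in the paper, the classification operates on SRPD photocurrent maps and a trained network rather than a hand-written rule.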
