
HKU researchers unveil neuromorphic exposure control system to improve machine vision in extreme lighting environments
- March 25, 2025
A research team led by Professor Jia Pan and Professor Yifan Evan Peng from the Department of Computer Science and the Department of Electrical & Electronic Engineering under the Faculty of Engineering at the University of Hong Kong (HKU), in collaboration with researchers at the Australian National University, has recently developed a groundbreaking neuromorphic exposure control (NEC) system that improves machine vision under extreme lighting variations. Published in Nature Communications, this biologically inspired system mimics human peripheral vision to achieve high speed and robustness in dynamic perception environments.

Traditional automatic exposure (AE) systems rely on iterative image feedback, creating a chicken-and-egg dilemma that fails under sudden brightness shifts (e.g., tunnel exits, glare). The NEC system solves this by integrating event cameras (sensors that capture per-pixel brightness changes as asynchronous "events") with a novel Trilinear Event Double Integral (TEDI) algorithm. The approach processes up to 130 million events per second on a single CPU, enabling edge deployment. A simplified code sketch of this event-driven exposure idea appears after the test results below.

"Like how our pupils instantly adapt to light, NEC mimics the biological synergy between retinal pathways," explained Mr. Shijie Lin, first author of the article. "By fusing event streams with physical light metrics, we bypass traditional bottlenecks to deliver lighting-agnostic vision."

In tests, the team validated NEC across mission-critical scenarios:
- Autonomous Driving: Improved detection accuracy (mAP +47.3%) when vehicles exit tunnels into blinding sunlight.
- Augmented Reality (AR): Achieved an 11% higher pose-estimation score (PCK) for hand tracking under surgical lights.
- 3D Reconstruction: Enabled continuous SLAM in overexposed environments where conventional methods fail.
- Medical AR Assistance: Maintained clear intraoperative visualization despite dynamic spotlight adjustments.
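To make the idea concrete, the sketch below shows, in Python, how exposure might be adjusted directly from event statistics rather than from captured frames. It is a minimal illustration under simplifying assumptions, not the authors' TEDI algorithm; the Event and EventExposureController classes and all parameter values are hypothetical.

```python
# Toy event-driven exposure controller: a minimal sketch, not the authors' TEDI algorithm.
import math
from dataclasses import dataclass
from typing import Iterable

@dataclass
class Event:
    x: int          # pixel column
    y: int          # pixel row
    t: float        # timestamp in seconds
    polarity: int   # +1 if the pixel got brighter, -1 if darker

class EventExposureController:
    """Adjusts exposure time from event statistics instead of waiting for full image frames."""

    def __init__(self, exposure_s=0.01, contrast_threshold=0.2,
                 gain=0.5, min_exp=1e-4, max_exp=0.05):
        self.exposure_s = exposure_s                  # current exposure time (s)
        self.contrast_threshold = contrast_threshold  # log-intensity change signalled by one event
        self.gain = gain                              # how aggressively exposure reacts
        self.min_exp, self.max_exp = min_exp, max_exp # hardware exposure limits

    def update(self, events: Iterable[Event], num_pixels: int) -> float:
        """Integrate event polarities over one control window and return the new exposure time."""
        net_polarity = sum(e.polarity for e in events)
        # Average log-brightness change across the sensor during this window.
        avg_log_change = self.contrast_threshold * net_polarity / max(num_pixels, 1)
        # Scene got brighter -> shorten exposure; darker -> lengthen it.
        self.exposure_s = min(max(self.exposure_s * math.exp(-self.gain * avg_log_change),
                                  self.min_exp), self.max_exp)
        return self.exposure_s

# Usage: feed each batch of events from the sensor driver into update().
controller = EventExposureController()
burst = [Event(10, 20, 0.001, +1), Event(11, 20, 0.002, +1), Event(5, 7, 0.003, -1)]
print(f"new exposure: {controller.update(burst, num_pixels=640 * 480):.6f} s")
```

The point the sketch illustrates is that events arrive asynchronously and carry signed brightness changes, so exposure can be updated without waiting for a full frame to be captured and metered, which is what lets the published system avoid the iterative image-feedback loop of conventional AE.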
About Professor Jia Pan
Jia Pan is an Associate Professor in the Department of Computer Science at the University of Hong Kong (HKU). His research focuses on robotics, artificial intelligence, motion planning, and human-robot interaction. Professor Pan is particularly known for his work on algorithms for robot motion, collision detection, and optimization, with applications in autonomous systems and industrial robotics. He has published extensively in top-tier robotics and AI conferences and journals, earning recognition for his innovative contributions to the field.

About Professor Yifan "Evan" Peng
Yifan Evan Peng is an Assistant Professor in Electrical & Electronic Engineering and Computer Science at HKU, where he leads the Computational Imaging & Mixed Representation Laboratory. He was previously a Postdoctoral Research Scholar at Stanford University. He received his PhD in Computer Science from the University of British Columbia, and both his MS and BS in Optical Science and Engineering from the State Key Lab of Modern Optical Instrumentation, Zhejiang University. Professor Peng's research lies in the interdisciplinary field of optics, graphics, vision, and artificial intelligence, with a particular focus on computational optics, imagers, sensors, and displays; holography and VR/AR/MR; and human-centered visual and sensory systems.
List of References
- Shijie Lin, Guangze Zheng, Ziwei Wang, Ruihua Han, Wanli Xing, Zeqing Zhang, Yifan Peng, Jia Pan. Embodied neuromorphic synergy for lighting-robust machine vision to see in extreme bright. Nature Communications, 2024; 15(1). DOI: 10.1038/s41467-024-54789-8