Researchers at the University of Zurich have developed an “event-based” camera in partnership with NCCR Robotics. Inspired by the human eye, this new camera can handle both fast-motion and near-dark conditions, capabilities that are becoming more appealing to various industries.
Unlike current UAVs, whose cameras need sufficient light, or relatively slow flight speeds, to pinpoint their exact positions for reliable tracking, this new camera overcomes both motion-blur issues and speed restrictions.
According to ZDNet, this event-based camera was inspired by human ocular biology, and has vision sensors that give the camera information about changes in pixel-level brightness, as opposed to capturing standard frames at a fixed rate. Most notable is the aforementioned lack of a requirement for “enough” light to capture a clear image.
“This research is the first of its kind in the fields of artificial intelligence and robotics, and will soon enable drones to fly autonomously and faster than ever, including in low-light environments,” said Professor Davide Scaramuzza, Director of the Robotics and Perception Group at the University of Zurich, according to ZDNet.
Naturally, this kind of advancement will allow UAVs to operate at full efficiency on pitch-black, moonless nights. It may very well change the landscape of drone operations regarding time of day, and what we are able to capture clearly in other low-light conditions.
So what’s the difference between using frames per second and pixel-level brightness? It’s actually easier to understand than you may think. Traditionally, film and video are a series of frames that come together to create clear, seemingly uninterrupted motion. An event-based sensor, however, looks at each individual pixel being captured and focuses exclusively on changes in brightness from one instant to the next. As its name suggests, then, an event camera needs movement to register changes: it wouldn’t benefit from a stationary setup, but it certainly would from zipping through the skies. A substantial development in capturing light and images for drones, in other words.
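To make the idea concrete, here is a toy sketch of that per-pixel “change detection” principle. This is not the Zurich team’s code or actual sensor hardware; the threshold value and the (x, y, polarity, timestamp) event format are illustrative assumptions.

```python
# Toy model of an event-based sensor: each pixel emits an "event" only when
# its brightness changes by more than a threshold; unchanged pixels stay silent.

THRESHOLD = 15  # assumed brightness-change threshold (arbitrary units)

def events_between(prev_frame, curr_frame, timestamp):
    """Return (x, y, polarity, timestamp) tuples for pixels whose
    brightness changed by at least THRESHOLD between the two snapshots."""
    events = []
    for y, (prev_row, curr_row) in enumerate(zip(prev_frame, curr_frame)):
        for x, (prev, curr) in enumerate(zip(prev_row, curr_row)):
            delta = curr - prev
            if abs(delta) >= THRESHOLD:
                polarity = 1 if delta > 0 else -1  # got brighter or darker
                events.append((x, y, polarity, timestamp))
    return events

# A static scene produces no events; a brightening pixel produces one.
static = [[100, 100], [100, 100]]
moved  = [[100, 200], [100, 100]]

print(events_between(static, static, 0.0))    # -> []
print(events_between(static, moved, 0.001))   # -> [(1, 0, 1, 0.001)]
```

This is why a stationary camera watching a static scene outputs almost nothing, while a fast-moving drone generates a dense stream of events, and why low absolute light levels matter less than the relative change at each pixel.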
Let’s take a look at what kind of footage an event-based camera feeds back to the research team, shall we?
According to ZDNet, this advancement has already brought about some very interesting developments. Autonomous flight at a higher speed and with less light than usual has been successfully tested at the university, which is something standard commercial drones are currently incapable of.
A spokesman from NCCR Robotics told ZDNet that technology like this, paired with drones, could help search and rescue teams in situations where traditional drones would be of little to no use. Think nighttime emergencies, as well as reaching the areas in question at higher speeds.
This is all very exciting, but as usual, getting this technology into the hands of consumers will take time.
“There is still a lot of work to be done before these drones can be deployed in the real world since the event camera used for our research is an early prototype,” PhD student Henri Rebecq warned. Scaramuzza says that he and his team “think this is achievable, however, and our recent work has already demonstrated that combining a standard camera with an event-based camera improves the accuracy and reliability of the system.”