Hyper-dimensional scalable sensors: the road forward to always-on context-awareness in electronic devices?

01 January 2018 → 31 December 2021
Research Foundation - Flanders (FWO)
Project description

We humans are masters at constantly capturing tons of sensory information in an "always-on" fashion. Yet crucial to our ability to process this hyperdimensional stream of sensory information is that we do not always devote the same level of mental effort to all sensory inputs. This dynamic scalability allows us to extract the relevant information from the sensory data with our limited human computational bandwidth.

Wouldn't it be great if electronics could also benefit from such scalable processing of a hyperdimensional stream of sensory data? This would enable robots, drones, cars, or buildings to be constantly aware of their complete surroundings. Currently, such devices struggle to process hyperdimensional visual data under the energy and processing constraints of embedded devices. This can be overcome by bringing similar dynamic scalability to the processing of the hyperdimensional data.

HYPERSCALES will enable such always-on, hyperdimensional, scalable sensing, focusing on visual sensors. The goal is to demonstrate a ring of many low-cost visual sensors capturing a rich data stream of omnidirectional information. Using a new paradigm of online scalable neural networks, together with aligned dynamic scaling of custom-designed hardware, always-on visual awareness will become feasible at an order of magnitude lower energy consumption than the state of the art.
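
The project description does not spell out how "online scalable neural networks" are realised. One common way to obtain this kind of dynamic scalability is an early-exit network: a cheap always-on stage produces a preliminary prediction, and a deeper, costlier stage is invoked only when that prediction is not confident enough, so easy inputs consume little energy. The sketch below illustrates the idea in PyTorch; the names (ScalableNet, the confidence threshold tau) and the layer sizes are illustrative assumptions, not the project's actual design.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ScalableNet(nn.Module):
    """Illustrative early-exit network: a cheap always-on stage with an
    auxiliary classifier, plus a costlier stage used only on demand.
    (Hypothetical sketch; not the HYPERSCALES architecture.)"""

    def __init__(self, num_classes: int = 10):
        super().__init__()
        # Cheap stage: runs on every frame.
        self.stage1 = nn.Sequential(
            nn.Conv2d(3, 8, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.exit1 = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(8, num_classes)
        )
        # Expensive stage: only invoked when the early exit is unsure.
        self.stage2 = nn.Sequential(
            nn.Conv2d(8, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.exit2 = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, num_classes)
        )

    def forward(self, x: torch.Tensor, tau: float = 0.9):
        """Return (logits, used_full_network). `tau` is the confidence
        threshold deciding whether the costly stage is needed."""
        h = self.stage1(x)
        logits = self.exit1(h)
        confidence = F.softmax(logits, dim=1).max().item()
        if confidence >= tau:
            return logits, False                   # cheap path was enough
        return self.exit2(self.stage2(h)), True    # escalate to full network


if __name__ == "__main__":
    net = ScalableNet().eval()
    frame = torch.randn(1, 3, 64, 64)  # one dummy camera frame
    with torch.no_grad():
        logits, escalated = net(frame)
    print("used expensive stage:", escalated)
```

In such a scheme, the energy savings come from how often the cheap path suffices on typical scenes; the aligned hardware scaling the project mentions would then let the processor itself power down the resources the skipped layers would have used.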

Unique to the project is the tight interplay between its algorithmic (Prof. Dambre) and hardware (Prof. Verhelst) research.