Project

MULTISCALE - Transformer-based framework for integrating and interpreting multiscale, multirate and multimodal data in mechatronic systems

Acronym
MULTISCALE
Code
180F6724
Duration
01 September 2025 → 31 August 2029
Funding
Regional and community funding: various
Other information
Research disciplines
  • Engineering and technology
    • Control engineering
    • Mechanical drive systems
    • Numerical modelling and design
    • Kinematics and dynamics
    • Physical system modelling
    • Sensing, estimation and actuating
    • Signals and systems
    • Signal processing not elsewhere classified
Keywords
system identification, control engineering, mechatronics, hybrid physics-based/data-driven models, hybrid AI, transformers, robotics, machines, uncertainties, multimodal sensor interpretation
 
Project description

MULTISCALE will develop an advanced framework for analyzing, merging and interpreting multiscale data, addressing critical challenges in industrial sectors such as manufacturing. These industries often face productivity losses due to difficulties in synchronizing and merging multirate, multimodal data (e.g., time series vs. camera feeds), which hinders effective maintenance actions and leads to unplanned downtime. The project aims to provide actionable insights by efficiently integrating data across varying temporal scales and by handling intermittent or missing data, which is essential for optimizing mechatronic systems and production processes.
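In its simplest form, the synchronization problem described above reduces to a nearest-past-timestamp join between a fast and a slow stream. The following minimal sketch (sampling rates, timestamps and labels are purely illustrative, not the project's data) shows such a baseline alignment using only the Python standard library:

```python
from bisect import bisect_right

def align_backward(fast_t, slow_t, slow_vals):
    """For each fast timestamp, take the most recent slow observation.

    Returns None for fast samples that precede the first slow observation,
    mimicking a stream with intermittent or missing slow-rate data.
    """
    out = []
    for t in fast_t:
        i = bisect_right(slow_t, t) - 1   # index of the last slow_t <= t
        out.append(slow_vals[i] if i >= 0 else None)
    return out

fast_t = [i * 0.1 for i in range(20)]   # hypothetical 10 Hz sensor stream
slow_t = [0.0, 1.0]                     # hypothetical 1 Hz camera frames
labels = align_backward(fast_t, slow_t, ["ok", "wear"])
print(labels[0])    # ok    (t = 0.0 matches the first frame)
print(labels[12])   # wear  (t = 1.2 falls after the 1.0 s frame)
```

Such rule-based joins are exactly the kind of manual preprocessing the learned framework is meant to reduce: they discard timing uncertainty and cannot interpolate across gaps in either stream.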

The framework is built around the attention mechanism of transformers, a modality-independent architecture that is better suited than other state-of-the-art techniques to learning long-range dependencies between sensor signals (e.g., seasonal effects, degradation dynamics), making it particularly suitable for merging multiscale data. To unlock the full potential of the framework in an industrial context, we will enhance it with physics-based models and expert insights. This approach allows the framework to operate effectively in data-scarce environments and reduces manual preprocessing and feature-engineering effort while maintaining high forecasting accuracy. Moreover, the attention-based layers of the transformer make it possible to interpret the actions taken by the framework and to visualize relations between the various sensors, yielding multiscale system insights.

Through transfer-learning methods, we will investigate how to ease the transfer of a pretrained transformer to similar systems or unseen operating conditions, which will speed up the learning process and further reduce data requirements. MULTISCALE will leverage the potential of transformers for industrial applications by efficiently merging multirate, multimodal and intermittent data, and by providing actionable insights, such as discovering multiscale dependencies and detecting anomalous behavior, in order to improve system performance, efficiency and product quality.