Abstract
Structural health monitoring aims to detect and analyze damage in mechanical, aerospace, and industrial structures in order to prevent catastrophic failures. The effectiveness of identifying structural faults and operational damage depends heavily on the underlying sensing technology. Multisensory data fusion, a promising technique, combines data from multiple sensor modalities to achieve reliable and accurate fault diagnosis. In this paper, we propose a novel decision-level multisensory data fusion approach that, for the first time, combines data from an ultrasonic guided-wave sensor modality and a vibration sensor modality, aiming at precise and stable damage identification. Bayesian probabilistic distributions for each sensor modality are obtained using a multi-level Metropolis–Hastings algorithm. A multilayer perceptron network is then trained to automatically allocate confidence scores to each sensor modality. The final fusion results are derived from the proposed decision-level fusion approach, which combines the Bayesian probabilistic distributions from each modality with the allocated confidence scores. Simulation and experimental results indicate that the decision-level fusion approach yields more accurate and reliable damage identification than either single modality alone, especially in the case of progressive damage.
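The pipeline described in the abstract (per-modality Bayesian posteriors via Metropolis–Hastings, then confidence-weighted decision-level fusion) can be sketched in simplified form. Everything below is illustrative: the one-dimensional damage parameter, the Gaussian-shaped log-posteriors for each modality, the random-walk proposal, and the fixed confidence weights are all assumptions for the sketch — the paper itself uses a multi-level Metropolis–Hastings sampler and trains a multilayer perceptron to allocate the confidence scores.

```python
import numpy as np

rng = np.random.default_rng(0)

def metropolis_hastings(log_post, x0, n_samples=5000, step=0.5):
    """Random-walk Metropolis-Hastings sampler for a scalar parameter."""
    samples = np.empty(n_samples)
    x, lp = x0, log_post(x0)
    for i in range(n_samples):
        prop = x + step * rng.standard_normal()       # symmetric proposal
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:        # accept/reject
            x, lp = prop, lp_prop
        samples[i] = x
    return samples

# Hypothetical posteriors over a damage parameter (e.g. a crack-size index):
# both modalities observe roughly the same damage state, with different noise.
def log_post_ultrasonic(x):
    return -0.5 * ((x - 2.0) / 0.2) ** 2   # tighter posterior

def log_post_vibration(x):
    return -0.5 * ((x - 2.1) / 0.5) ** 2   # broader posterior

burn_in = 1000
s_us = metropolis_hastings(log_post_ultrasonic, x0=0.0)[burn_in:]
s_vib = metropolis_hastings(log_post_vibration, x0=0.0)[burn_in:]

# Decision-level fusion: combine per-modality posterior estimates using
# confidence weights (fixed stand-ins for the paper's MLP-assigned scores).
w_us, w_vib = 0.7, 0.3
fused_estimate = w_us * s_us.mean() + w_vib * s_vib.mean()
print(fused_estimate)
```

Fusing at the decision level, as here, keeps each modality's inference independent; only the final estimates and their confidence scores are combined.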
| Original language | English |
|---|---|
| Article number | 111597 |
| Journal | Mechanical Systems and Signal Processing |
| Volume | 219 |
| DOIs | |
| Publication status | Published - 1 Oct 2024 |
Keywords
- Bayesian theorem
- Multisensory data fusion
- Structural health monitoring
- Ultrasonic modality
- Vibration modality
ASJC Scopus subject areas
- Control and Systems Engineering
- Signal Processing
- Civil and Structural Engineering
- Aerospace Engineering
- Mechanical Engineering
- Computer Science Applications