Thanh Bui-Tien (https://orcid.org/0000-0002-4001-9246), Dung Bui-Ngoc (https://orcid.org/0000-0001-9934-7474), Hieu Nguyen-Tran (https://orcid.org/0000-0002-3007-6554), Lan Nguyen-Ngoc (https://orcid.org/0000-0002-0078-4074), Hoa Tran-Ngoc (https://orcid.org/0000-0003-2161-8064), Hung Tran-Viet


The process of damage identification in Structural Health Monitoring (SHM) provides practical information about the current condition of the inspected structure. The goal of the process is to detect damage by processing data collected from sensors and identifying the differences between the damaged and undamaged states. Various machine learning techniques have been applied to extract features or knowledge from vibration data; however, they require prior knowledge of the factors affecting the structure. In this paper, a novel method of structural damage detection is proposed using a convolutional neural network (CNN) and a recurrent neural network (RNN). The CNN is used to extract deep features, while the RNN is trained to learn the long-term historical dependency in the time series data. Combining the two types of features increases discrimination ability compared with using deep features alone. Finally, a neural network classifies the time series into two states, undamaged and damaged. The accuracy of the proposed method was tested on a benchmark dataset of the Z24 bridge (Switzerland). The results show that the hybrid method provides a high level of accuracy in damage identification of the tested structure.
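The hybrid pipeline described above can be sketched as a forward pass in plain NumPy. This is an illustrative sketch only, not the authors' actual model: the layer sizes, window length, weights, and function names are all assumptions chosen for a minimal self-contained example of CNN feature extraction followed by a recurrent pass and a binary (damaged/undamaged) output.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def conv1d(signal, kernels, stride=2):
    """Valid 1-D convolution: (T,) signal, (n_filters, k) kernels -> (T', n_filters)."""
    n_filters, k = kernels.shape
    steps = (len(signal) - k) // stride + 1
    out = np.empty((steps, n_filters))
    for t in range(steps):
        window = signal[t * stride : t * stride + k]
        out[t] = kernels @ window
    return relu(out)

def rnn_last_state(seq, W_x, W_h):
    """Simple (Elman) RNN over the feature sequence; returns the final hidden state."""
    h = np.zeros(W_h.shape[0])
    for x_t in seq:
        h = np.tanh(W_x @ x_t + W_h @ h)
    return h

# Illustrative dimensions (assumed): a 256-sample vibration window,
# 8 convolution filters of width 5, and a 16-unit recurrent layer.
signal = rng.standard_normal(256)            # one sensor time-series window
kernels = rng.standard_normal((8, 5)) * 0.1  # random (untrained) conv filters
W_x = rng.standard_normal((16, 8)) * 0.1
W_h = rng.standard_normal((16, 16)) * 0.1
w_out = rng.standard_normal(16) * 0.1

features = conv1d(signal, kernels)       # CNN stage: local "deep" features
h = rnn_last_state(features, W_x, W_h)   # RNN stage: long-term temporal dependency
p_damaged = sigmoid(w_out @ h)           # output: probability of the damaged state
print(f"p(damaged) = {float(p_damaged):.3f}")
```

In a trained system the weights would of course be learned end to end; the sketch only shows how the convolutional features feed the recurrent layer before the final two-class decision.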






    SI: Steels and Composites for Engineering Structures

    How to Cite

    Bui-Tien, T., Bui-Ngoc, D., Nguyen-Tran, H., Nguyen-Ngoc, L., Tran-Ngoc, H., & Tran-Viet, H. (2021). Damage Detection in Structural Health Monitoring using Hybrid Convolution Neural Network and Recurrent Neural Network. Frattura Ed Integrità Strutturale, 16(59), 461–470. https://doi.org/10.3221/IGF-ESIS.59.30