Quantitative Bridge Inspection Ratings using Autonomous Robotic Systems (Im-2)

Universities : The City University of New York
                            Missouri University of Science and Technology
                            University of Colorado - Boulder

Principal Investigator : Dr. Anil Agrawal, The City University of New York

PI Contact Information : phone: (202) 650-8442  |  email: anil@ce-mail.engr.ccny.cuny.edu

Co-Principal Investigators : Dr. Jizhong Xiao, The City University of New York
                                                          Dr. Genda Chen, Missouri S&T
                                                          Dr. George Hearn, University of Colorado - Boulder

Funding Sources and Amounts Provided:
The City University of New York: $144,079
INSPIRE UTC: $89,550

Total Project Cost : $233,629

Match Agencies ID or Contract Number:
CCNY: In-Kind Match    |     INSPIRE UTC: 00055082-01B

INSPIRE Grant Award Number : 69A3551747126

Start Date : March 1, 2017
End Date :  July 31, 2018

Brief Description of Research Project:

A 2001 study sponsored by FHWA raised serious concerns about the consistency and reliability of visual inspection. Although consistent ratings can be obtained with a good QA/QC program, a recent study by the PI indicates that the concern about the reliability of defect detection remains. With the adoption of the recent AASHTO Manual for Bridge Element Inspection, the new inspection approach requires not only a rating for each bridge element, but also the location and extent of deterioration. Since autonomous robotic systems generate an enormous amount of inspection data, distilling the data into a simple rating along with the location and extent of deterioration is a significant challenge. For example, RABIT™ has been used to inspect concrete bridge decks with six devices, including ground penetrating radar (GPR), impact-echo, and ultrasonic surface wave. However, it has not been fully demonstrated that the probability of detection (POD) for damage is significantly improved by using multiple devices.
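To make the multi-device POD question concrete, a minimal sketch (not part of the project, and using hypothetical per-device POD values) shows why independent devices are often expected to raise the combined POD: a defect is missed only if every device misses it, so POD_combined = 1 - ∏(1 - POD_i). The project's point is that this naive independence assumption has not been validated in practice.

```python
def combined_pod(pods):
    """Combined POD of several NDE devices under the (idealized)
    assumption that their detections are statistically independent."""
    missed = 1.0
    for p in pods:
        missed *= (1.0 - p)  # probability that this device also misses the defect
    return 1.0 - missed

# Hypothetical PODs for, say, GPR, impact-echo, and ultrasonic surface wave:
print(round(combined_pod([0.7, 0.6, 0.5]), 3))  # 1 - 0.3*0.4*0.5 = 0.94
```

In reality, device responses to the same defect are correlated through the underlying wave-structure interaction, which is why the actual improvement can fall well short of this idealized figure.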

Approach and Methodology. Data fusion will be used to derive a rating from test data from multiple NDE devices and visual inspections in two steps. First, the location and extent of deterioration, such as delamination, will be determined by fusing data from NDE devices. Currently, almost all data fusion techniques are based on the measurement outputs of multiple devices. In this study, the wave-structure interaction leading to the measurement outcomes will also be taken into account in data fusion, potentially resulting in more consistent identification of deterioration. Second, once deterioration is identified reliably, damage data and visual inspection findings can be fused to determine a rating through algorithms such as artificial neural networks and fuzzy logic, while minimizing false positives and, in particular, false negatives. Training data for the algorithms will come from existing experience with the type and extent of deterioration and damage, drawn from inspection reports and the experience of inspectors (e.g., the 21 bridge inspection teams that the PI worked with during the recent study).
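The two-step idea above can be illustrated with a toy sketch. Everything here is a hypothetical placeholder, not the project's method: the device weights, thresholds, and the conservative max-combination stand in for what would actually be learned by a neural network or encoded in fuzzy-logic rules from the training data described above.

```python
def fuse_delamination_evidence(readings, weights):
    """Step 1 (toy version): fuse per-device delamination evidence
    (each scaled to 0..1) into a single weighted score."""
    return sum(r * w for r, w in zip(readings, weights)) / sum(weights)

def element_rating(fused_score, visual_severity):
    """Step 2 (toy version): map the fused NDE score and a visual-inspection
    severity (0..1) to a condition-state-style rating (1 = good, 4 = severe).
    Taking the max of the two inputs is a conservative choice that biases
    the rating against false negatives, as the project intends."""
    severity = max(fused_score, visual_severity)
    if severity < 0.2:
        return 1
    elif severity < 0.4:
        return 2
    elif severity < 0.7:
        return 3
    return 4

# Hypothetical readings from three NDE devices with hypothetical weights:
score = fuse_delamination_evidence([0.8, 0.6, 0.7], weights=[0.5, 0.3, 0.2])
print(element_rating(score, visual_severity=0.3))  # fused score 0.72 -> rating 4
```

A trained network or fuzzy inference system would replace the fixed weights and hard thresholds with calibrated, smooth mappings, but the input/output structure — deterioration evidence plus visual findings in, element rating out — is the same.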

Overall Objectives. This project aims to develop new fusion strategies of data collected from multiple NDE devices for improved POD based on further understanding and modeling of damage detection mechanisms, and to develop algorithms for the derivation of bridge ratings from identified damage and visual inspection findings.

Scope of Work in Year 1. The focus of the first year will be to: (1) Develop a framework of quantitative bridge inspection using relevant data from the literature and those derived from NDE devices, (2) Identify potential NDE devices for different bridge elements, (3) Characterize POD and its improvement through data fusion.

Describe Implementation of Research Outcomes:
Research outcomes and implementation plan will be described towards the end of this project.

Impacts/Benefits of Implementation:
Impact/Benefits of Implementation will be summarized at the end of this project.

Project website: http://inspire-utc.mst.edu/researchprojects/im-2/
Progress reports: