A Field Deployable Wall-Climbing Robot for Bridge Inspection using Vision and Impact Sounding Techniques (AS-6)

Lead University: The City College of New York (CCNY)

Principal Investigator: Dr. Jizhong Xiao, CCNY

PI Contact Information: Phone: (973) 851-7345  |  Email: jxiao@ccny.cuny.edu

Co-Principal Investigator: Dr. Anil Agrawal, CCNY

Funding Sources and Amounts Provided:
CCNY: $107,901

INSPIRE UTC: $107,901

Total Project Cost: $215,802

Match Agencies ID or Contract Number:  CCNY In-kind and Cash | INSPIRE UTC: 00055082-01C

INSPIRE Grant Award Number: 69A3551747126

Start Date: January 1, 2020
End Date: June 30, 2021

Brief Description of Research Project:

Over the past three years, the CCNY robotics team developed wall-climbing robot prototypes that provide vertical mobility for nondestructive evaluation (NDE) devices such as RGB-D cameras, impact sounding units, and ground penetrating radar (GPR). The team also developed visual inspection and machine learning algorithms to detect and localize surface flaws such as cracks and spalling, and explored impact sounding methods to detect subsurface defects such as delamination and voids. Building on these achievements, this project focuses on hardware and software system integration toward a reliable, field deployable wall-climbing robot for bridge inspection.

Approach and Methodology: Impact sounding is recognized as an effective NDE technique for detecting delamination and voids in concrete structures. It can be realized with an inspection system consisting of a pair of impactors and microphones. The main challenges in applying this technology in practice are automating data collection and developing advanced impact sounding data analytics for subsurface defect identification. This project will explore the potential use of a wind-rider robot that can maneuver on curved surfaces for visual and impact sounding inspection. In particular, the team will investigate a new impact sounding mechanism, enhance the visual simultaneous localization and mapping (V-SLAM) algorithm for visual inspection and mapping, integrate the sounding data analytics into the robotic software, and visualize the subsurface defects on a 2D map to assist professional engineers in structural condition analysis.
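
To illustrate the kind of sounding data analytics referred to above, the sketch below shows a simple spectral screening of a single impact recording: a hollow, lower-frequency response suggests possible delamination, while a higher-frequency ring suggests solid concrete. This is a minimal illustration only; the split frequency and threshold are assumed values, not the project's calibrated analytics.

```python
# Minimal sketch (assumed approach, not the project's actual pipeline):
# classify one impact recording as "solid" or "possibly delaminated"
# from its frequency content.
import numpy as np

def classify_impact(signal: np.ndarray, fs: float, split_hz: float = 2000.0) -> str:
    """Compare spectral energy below and above an assumed split frequency."""
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal)))) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    low = spectrum[freqs < split_hz].sum()
    high = spectrum[freqs >= split_hz].sum()
    ratio = low / (low + high + 1e-12)
    # Thresholds are illustrative; real values would come from field calibration.
    return "possibly delaminated" if ratio > 0.7 else "solid"

# Synthetic example: a 6 kHz "solid" tap vs. a 900 Hz "hollow" tap.
fs = 44100.0
t = np.arange(0, 0.02, 1.0 / fs)
solid_tap = np.sin(2 * np.pi * 6000 * t) * np.exp(-t * 400)
hollow_tap = np.sin(2 * np.pi * 900 * t) * np.exp(-t * 200)
print(classify_impact(solid_tap, fs))   # -> solid
print(classify_impact(hollow_tap, fs))  # -> possibly delaminated
```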

Overall Objectives: This project aims to: (1) develop a reliable and field deployable robot to provide vertical mobility for data collection on concrete bridges and other civil infrastructure, (2) improve image processing algorithms to detect surface flaws using cameras, (3) modify the impact sounding device and improve software algorithms to detect subsurface defects using sound, and (4) integrate the hardware and software system into a holistic solution for a field deployable robot to automate the bridge inspection process with minimal human intervention.
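
As an illustration of objective (2), the sketch below shows a simplified, classical edge-based screening of a concrete surface image. It is not the team's learning-based detector; the thresholds, kernel size, and file name are illustrative assumptions.

```python
# Illustrative baseline only: classical edge-based crack screening.
import cv2
import numpy as np

def detect_crack_pixels(image_bgr: np.ndarray) -> np.ndarray:
    """Return a binary mask of candidate crack pixels."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)   # suppress surface texture
    edges = cv2.Canny(blurred, 50, 150)           # thresholds are illustrative
    kernel = np.ones((3, 3), np.uint8)
    # Close small gaps so thin, elongated crack edges form connected regions.
    return cv2.morphologyEx(edges, cv2.MORPH_CLOSE, kernel)

# Usage (hypothetical image file):
# mask = detect_crack_pixels(cv2.imread("deck_patch.jpg"))
# crack_ratio = mask.mean() / 255.0   # fraction of pixels flagged as crack-like
```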

Scope of Work in Year 1: (1) Investigate solutions to overcome the limitations of the current impact sounding mechanism, (2) Investigate impact sounding and/or impact echo instruments that can be mounted on a robot for structural integrity testing, (3) Select the most feasible solution and conduct detailed mechanical design and CAD modeling of the new impacting mechanism, and (4) Design the embedded control and automatic sounding data collection system.
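
For scope item (4), the following conceptual sketch shows one way an automatic, pose-tagged sounding data collection loop could be structured. All hardware interface functions shown (trigger_impactor, record_microphone, get_robot_pose) are hypothetical placeholders, not the project's actual drivers.

```python
# Conceptual sketch of an automatic sounding data collection loop.
import csv
import time
from typing import Callable, List, Tuple

def collect_sounding_points(
    trigger_impactor: Callable[[], None],          # hypothetical actuator driver
    record_microphone: Callable[[float], List[float]],  # hypothetical audio capture
    get_robot_pose: Callable[[], Tuple[float, float]],  # e.g., pose from V-SLAM
    n_points: int,
    dwell_s: float = 0.5,
    out_csv: str = "sounding_log.csv",
) -> None:
    """Trigger the impactor at successive robot poses and log pose-tagged recordings."""
    with open(out_csv, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["timestamp", "x", "y", "num_samples"])
        for _ in range(n_points):
            x, y = get_robot_pose()           # position on the inspected surface
            trigger_impactor()                # strike the surface once
            samples = record_microphone(0.1)  # capture ~100 ms of response
            writer.writerow([time.time(), x, y, len(samples)])
            time.sleep(dwell_s)               # allow the robot to advance
```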

Implementation of Research Outcomes:
The research outcomes and an implementation plan will be described toward the end of this project.

Impacts/Benefits of Implementation: 
Impacts/benefits of implementation will be summarized at the end of this project.

Project Website: http://inspire-utc.mst.edu/researchprojects/as-6/

Progress Reports: