
Publications and Research
Document Type
Article
Publication Date
8-14-2025
Abstract
Modern composite materials promise superior performance and load-bearing capabilities, yet evaluating their structural integrity remains challenging. Current testing methods, such as visual, thermographic, ultrasonic, optical, electromagnetic, terahertz, X-ray, and neutron imaging, as well as shearography, are hampered by long scan durations, limited fields of view, suboptimal accuracy, and high costs, particularly when applied to large structures.
This paper addresses these issues by introducing a novel robotic multimodal imaging system that overcomes the limitations of traditional methods. The system captures both the static and dynamic properties of materials using advanced motion compensation techniques. By integrating multiple radiographic modalities into a coordinated robotic platform, it provides rapid, high-resolution imaging of composite materials in large structures without the need for disassembly.
The system was validated through simulations of a four-robot radiographic setup, treated as two single-plane imaging systems. The 3D position and orientation of a cube phantom were determined by generating digitally reconstructed radiographs from a computed tomography model and applying a 3D line intersection method based on the known imaging geometries. Comparisons between marker-based and markerless kinematic tracking methods yielded differences of only 0.03 mm in translation and 0.06° in rotation.
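For readers unfamiliar with the 3D line intersection step, the following minimal sketch (not code from the paper; all names and geometry values are hypothetical) shows one standard way to triangulate a feature from two calibrated single-plane views: each feature defines a back-projected ray from the X-ray source toward its projection on the detector, and the feature's 3D position is estimated as the midpoint of the shortest segment between the two rays.

```python
import numpy as np

def ray_midpoint(p1, d1, p2, d2, eps=1e-12):
    """Estimate a 3D point from two back-projected rays.

    Each ray runs from an X-ray source position (p1, p2) toward the
    projection of the same feature on its detector (directions d1, d2).
    Returns the midpoint of the shortest segment between the two rays,
    a standard closest-point construction for skew lines.
    """
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    w0 = p1 - p2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b              # approaches 0 for parallel rays
    if abs(denom) < eps:
        raise ValueError("rays are (nearly) parallel")
    s = (b * e - c * d) / denom        # parameter along ray 1
    t = (a * e - b * d) / denom        # parameter along ray 2
    return 0.5 * ((p1 + s * d1) + (p2 + t * d2))

# Hypothetical example: two calibrated views of one phantom fiducial.
source_a = np.array([0.0, -1000.0, 0.0])   # source of view A (mm)
source_b = np.array([-1000.0, 0.0, 0.0])   # source of view B (mm)
ray_a = np.array([10.0, 1000.0, 5.0])      # toward the marker on detector A
ray_b = np.array([1000.0, 12.0, 5.0])      # toward the marker on detector B
point = ray_midpoint(source_a, ray_a, source_b, ray_b)
```

Repeating this for several markers or image features yields a set of 3D points from which the cube phantom's six-degree-of-freedom pose can then be fit, for example by least-squares rigid registration.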
These findings indicate that the proposed system substantially reduces scan times and improves accuracy, offering a robust, scalable solution for dynamic inspection in fields such as aerospace and medical device manufacturing.
Included in
Artificial Intelligence and Robotics Commons, Biomedical Informatics Commons, Radiology Commons
Comments
This article was originally published in London Journal of Research in Computer Science & Technology, available at https://journalspress.com/LJRCST_Volume25/
This work is distributed under a Creative Commons Attribution 4.0 International License (CC BY 4.0).