
Autonomous Robotic Inspection System for NDT Utilising Automated Digital Twin Generation

Project Code: 35924

Start date and planned duration: April 2024, 24 months

Objective

TWI Technology Centre (Wales) has developed a Software Development Kit (SDK) which allows for robotic integration and inspection for common NDT tasks. This includes the ability to acquire ultrasonic data, link it to a digital twin, and display the data in 2D and 3D, with post-processing via a number of data reprocessing algorithms. The core aim of this CRP is to extend the capability of this SDK to support an autonomous robotic inspection system for NDT utilising automated digital twin generation. Sensor technologies are to be investigated with the objective of creating a digital twin in real time. AI solutions are to be developed to assist the autonomous path planning of complex geometric structures utilising the digital twin data. There are four core objectives:

  1. Determine the best technology for generation of a digital twin and CAD model, including an investigation into different camera technologies and structure-from-motion algorithms (a minimal two-view sketch follows this list).
  2. Develop an AI solution to allow for autonomous path planning of complex geometric structures.
  3. Implement the algorithms and code into TWI’s Crystal SDK to improve current operational capability.
  4. Report on findings, identifying any limitations and providing recommendations for future work.
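
By way of illustration for the camera-technology and structure-from-motion investigation in objective 1, the sketch below shows a minimal two-view reconstruction using OpenCV: feature matching, essential-matrix pose recovery and triangulation of a sparse point cloud. The image file names and camera intrinsics are illustrative assumptions, and this is one candidate building block for evaluation rather than the project's chosen implementation.

```python
# Minimal two-view structure-from-motion sketch. File names and the camera
# intrinsics matrix K are hypothetical placeholders for illustration.
import cv2
import numpy as np

K = np.array([[2000.0, 0.0, 960.0],      # assumed focal length / principal point
              [0.0, 2000.0, 540.0],
              [0.0, 0.0, 1.0]])

img1 = cv2.imread("view_01.png", cv2.IMREAD_GRAYSCALE)  # hypothetical images
img2 = cv2.imread("view_02.png", cv2.IMREAD_GRAYSCALE)

# Detect and match SIFT features between the two views (Lowe ratio test)
sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)
matches = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des1, des2, k=2)
good = [p[0] for p in matches if len(p) == 2 and p[0].distance < 0.7 * p[1].distance]

pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
pts2 = np.float32([kp2[m.trainIdx].pt for m in good])

# Recover the relative camera pose and triangulate a sparse point cloud
E, _ = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
_, R, t, _ = cv2.recoverPose(E, pts1, pts2, K)
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([R, t])
pts4d = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
points3d = (pts4d[:3] / pts4d[3]).T      # sparse 3D points in the first camera frame
print(f"Triangulated {len(points3d)} sparse points")
```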


Project Outline

Within the manufacturing sector, as well as in non-destructive testing (NDT), robotic solutions are primarily based on repetitive pre-programmed tasks, where precise knowledge of the component is available via a digital twin in the form of a CAD model of the component and its surrounding environment. This is acceptable for most scenarios; however, in the case of unknown or ill-defined geometry (where the CAD model does not match the component) this approach is ineffective, leading to delays and additional cost. In this instance, a highly skilled operator is required to manually determine and measure deviations from the CAD model in an effort to define an appropriate robotic scanning path. This requirement is amplified where the part geometry has the potential to restrict probe access to joint areas (e.g. wing spar joints). The manual process is therefore often iterative, taking several hours to days to resolve.

Recent advancements in robotics have seen robots deployed in a wide range of industrial sectors outside traditional manufacturing scenarios. With recent advancements in Artificial Intelligence (AI) systems for autonomous vehicles, coupled with the reduction in cost of robotic sensors (Lidar, 3D cameras, etc.) and the availability of open-source libraries and frameworks, implementation of autonomous robotic inspection has never been more accessible. The use of sensor-driven robotics (such as the technology found in self-driving vehicles) allows for the creation and validation of the digital twin in real time.

This CRP aims to explore AI methods for autonomous robotic scan-path planning, using low-cost sensors to provide feedback and data before, during and after the manufacturing process, extending well beyond current capability. The problem can be broken down into two discrete challenges:

  1. Given a component of unknown geometry, how can the system learn its environment and generate a digital twin?
  2. Given a component of complex geometry, how can AI make decisions and fully automate the robotic scanning process?

In the case of item 1, the objective is to generate a live CAD model of the component using appropriate sensor feedback and robotic movement. To support this, three technologies are to be investigated: Lidar, 3D depth cameras and structure-from-motion algorithms (using traditional 2D high-resolution cameras). The goal is to develop a software solution that allows for the creation of a CAD model and digital twin (where the co-ordinate systems of the real world and the digital world align).
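
As a rough illustration of how a captured point cloud could become an aligned digital twin, the sketch below uses the open-source Open3D library (not TWI's Crystal SDK) to downsample a scan, reconstruct a surface mesh via Poisson reconstruction, and register the result to a reference cloud expressed in the robot base frame using ICP. The file names, voxel size and registration tolerances are assumptions for illustration only.

```python
# Sketch: point cloud -> mesh -> alignment to the robot frame.
# All file names and numeric parameters are illustrative assumptions.
import numpy as np
import open3d as o3d

# Point cloud captured by Lidar or a depth camera (hypothetical file)
scan = o3d.io.read_point_cloud("component_scan.ply")
scan = scan.voxel_down_sample(voxel_size=0.005)            # 5 mm grid
scan.estimate_normals(
    o3d.geometry.KDTreeSearchParamHybrid(radius=0.02, max_nn=30))

# Poisson surface reconstruction yields a watertight mesh as a digital-twin proxy
mesh, _ = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(scan, depth=9)

# Register the scan to a reference cloud in the robot base frame so that the
# real-world and digital-world co-ordinate systems align (reference is assumed)
reference = o3d.io.read_point_cloud("robot_frame_reference.ply")
reg = o3d.pipelines.registration.registration_icp(
    scan, reference,
    max_correspondence_distance=0.01,
    init=np.eye(4),                                        # coarse initial guess
    estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint())
mesh.transform(reg.transformation)
o3d.io.write_triangle_mesh("digital_twin_mesh.ply", mesh)
```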

In the case of item 2, the objective is to develop a system that can learn the best scanning path from the data acquired in item 1 through post-processing of that data. This is to be achieved using an Artificial Neural Network (ANN) supported by reinforcement learning. The goal is that, within the digital twin, the best approach and scan solution will be computed autonomously before execution in the real world.
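
At its simplest, the reinforcement learning element could be prototyped as a coverage problem over discretised surface patches of the digital twin. The toy sketch below uses tabular Q-learning with NumPy; the grid size, rewards and hyperparameters are illustrative assumptions, and the state deliberately ignores coverage history, which a practical system (such as the ANN described above) would need to encode.

```python
# Toy Q-learning sketch for scan-path planning over a 5x5 grid of surface
# patches. Rewards, grid size and hyperparameters are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
GRID = 5
N_PATCHES = GRID * GRID
ACTIONS = [(-1, 0), (1, 0), (0, -1), (0, 1)]      # move probe up/down/left/right

def step(pos, covered, action):
    """Move the probe one patch; reward new coverage, penalise travel."""
    r, c = divmod(pos, GRID)
    dr, dc = ACTIONS[action]
    nr = min(max(r + dr, 0), GRID - 1)
    nc = min(max(c + dc, 0), GRID - 1)
    nxt = nr * GRID + nc
    reward = -1.0                                  # travel cost
    if nxt not in covered:
        covered.add(nxt)
        reward += 5.0                              # bonus for inspecting a new patch
    return nxt, reward, len(covered) == N_PATCHES

Q = np.zeros((N_PATCHES, len(ACTIONS)))
alpha, gamma, eps = 0.1, 0.95, 0.2
for episode in range(2000):
    pos, covered = 0, {0}
    for _ in range(200):
        a = rng.integers(len(ACTIONS)) if rng.random() < eps else int(np.argmax(Q[pos]))
        nxt, reward, done = step(pos, covered, a)
        Q[pos, a] += alpha * (reward + gamma * np.max(Q[nxt]) - Q[pos, a])
        pos = nxt
        if done:
            break

print("Greedy first action from patch 0:", int(np.argmax(Q[0])))
```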

The successful implementation of the proposed solution will allow for the post-manufacture, automated robotic NDT scanning of components without initial CAD data. This would remove the requirement for a costly skilled operator to develop accurate CAD models and would automate the robotic path planning operation, which currently requires expensive dedicated software. In addition, a significant reduction in total inspection time (from days to minutes) is expected to be achievable as a result of the project. Ultimately, although beyond the scope of the baseline work proposed here, the aim is to provide a self-driving, self-inspecting solution with little or no operator input. This has not previously been achieved for NDT applications. Successful realisation would place TWI’s Crystal SDK software at the forefront of inspection solutions and would benefit TWI members in meeting their Industry 4.0 requirements. The technology would allow for site-deployable robotic NDT inspection where, for example, a cobot is taken to a mid-production lifeboat hull or an aircraft wing in a hangar, and the system learns its environment before autonomously conducting the inspection with little operator input.

Industry Sectors

- Aerospace
- Oil and gas

Benefits to Industry

Current robotic NDT methods rely on experienced robotics personnel, precise knowledge of the component and a high level of skill. This research would provide a high level of automation, reducing the skill required and the time taken to complete an inspection.

For more information please email:


contactus@twi.co.uk