
Artificial Intelligence for NDT Scanning of Unknown Geometries using Collaborative Robots

Project Code: 34241

Start date and planned duration: February 2021, 36 months

Objectives


  • Review current artificial intelligence (AI) techniques for goal-oriented action planning and their fields of application, including robotic and self-driving car applications that currently use deep-learning networks. The suitability of these approaches for use with cobots will be determined.
  • Integrate a phased array probe as an additional sensor tool with a cobot equipped with highly sensitive force/torque sensors.
  • Develop algorithms that can correct the motion and orientation of a cobot based on force and ultrasonic feedback.
  • Implement search algorithms within the cobot control software to ensure the full surface coverage required by the application.
  • Demonstrate the benefits and drawbacks of the developed system through a parametric study based on industry-provided samples.
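The surface-coverage objective above can be illustrated with one of the simplest coverage strategies, a boustrophedon (lawnmower) raster over a bounded patch. This is a generic sketch, not the project's algorithm; the patch dimensions and scan pitch are hypothetical values.

```python
def boustrophedon_waypoints(width, height, step):
    """Yield (x, y) waypoints that raster-scan a width x height patch.

    Successive rows alternate direction so the probe traces a continuous
    serpentine path rather than lifting off between passes.
    """
    waypoints = []
    y = 0.0
    row = 0
    while y <= height:
        # Even rows sweep left-to-right, odd rows right-to-left.
        xs = [0.0, width] if row % 2 == 0 else [width, 0.0]
        for x in xs:
            waypoints.append((x, y))
        y += step
        row += 1
    return waypoints

# Example: cover a 100 mm x 40 mm heat-affected zone at a 10 mm pitch.
path = boustrophedon_waypoints(100.0, 40.0, 10.0)
```

In practice the coverage region for an unknown geometry would be discovered incrementally rather than known up front, which is what turns coverage into the exploration problem described in the outline below.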

Project Outline

The project will look at how artificial intelligence can be used in conjunction with ultrasonic sensors and collaborative robots (cobots) to perform NDT of unknown complex geometries. The use of force/torque sensors means that the robot can detect when a probe touches a surface. The feedback from these sensors can measure forces and torques in three dimensions, which allows the robot to control both position and orientation. These data can be utilised to follow a surface, alleviating the need to know the full surface geometry. NDT inspections often require a probe to cover a certain area, such as the heat-affected zone around a weld. Deciding how to cover this area is a task that is well suited to artificial intelligence (AI) as it becomes an exploration problem.
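The force-feedback surface following described above can be sketched, in its simplest form, as a proportional controller that nudges the probe along the surface normal to hold a target contact force. The target force, gain, and the idea of a single scalar "normal force" reading are all illustrative assumptions, not details from the project.

```python
TARGET_FORCE_N = 5.0   # desired probe contact force in newtons (assumed value)
GAIN_M_PER_N = 0.0005  # proportional gain: metres of correction per newton of error

def normal_correction(measured_force_n):
    """Return a displacement along the surface normal, in metres.

    Positive = push towards the surface (contact force too low),
    negative = back off (contact force too high).
    """
    error = TARGET_FORCE_N - measured_force_n
    return GAIN_M_PER_N * error
```

A real controller would run this in the cobot's control loop, use the full 3-D force/torque vector to correct orientation as well as position, and add damping to avoid oscillation; the sketch only shows the core feedback idea.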

Ultrasonic phased array probes have been accepted as state-of-the-art in industry due to their enhanced imaging capabilities. By using a phased array probe (with a wedge) on a cobot, it is possible to incorporate this additional sensory information to align the cobot with the surface, in order to ensure the capture of adequate data. Combining force/torque sensing with ultrasonic feedback will lead to an automated scanning system that is optimised for NDT of complex surfaces.
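One simple way to use the ultrasonic signal as alignment feedback, as described above, is a greedy hill-climbing search that tilts the probe towards the orientation giving the strongest echo. The `measure_amplitude` callable stands in for a real acquisition routine and is purely hypothetical, as are the step size and iteration count.

```python
def align_by_amplitude(measure_amplitude, tilt_deg=0.0, step_deg=0.5, iters=20):
    """Greedy 1-D search over probe tilt for the strongest echo amplitude."""
    best = measure_amplitude(tilt_deg)
    for _ in range(iters):
        # Try a small tilt in each direction; keep whichever improves the echo.
        for candidate in (tilt_deg - step_deg, tilt_deg + step_deg):
            amp = measure_amplitude(candidate)
            if amp > best:
                best, tilt_deg = amp, candidate
    return tilt_deg

# Toy check: amplitude peaks when the probe is normal to the surface (2.0 deg here).
peak = 2.0
tilt = align_by_amplitude(lambda t: -abs(t - peak))
```

A phased array probe also allows electronic beam steering, so in practice alignment feedback could come from comparing amplitudes across elements rather than physically dithering the wedge; the sketch shows only the feedback principle.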

The project will investigate an aspect of AI known as goal-oriented action planning. Goal-oriented AIs have been used for a number of years in computer games and have recently been implemented in self-driving cars. The goal(s) are typically defined at a high level, such as “reach point B from point A”, allowing the AI to make a number of decisions autonomously about how to reach that state. The project will apply this type of AI approach to the ultrasonic inspection of complex geometries using cobots.
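The goal-oriented planning idea above can be illustrated with a toy planner: the goal is stated as a desired fact, and the planner searches for an action sequence whose preconditions and effects chain to achieve it. The action names and state facts are invented for illustration and do not come from the project.

```python
from collections import deque

ACTIONS = [
    # (name, preconditions, effects) over a set of state facts
    ("approach_surface", {"idle"}, {"near_surface"}),
    ("make_contact",     {"near_surface"}, {"in_contact"}),
    ("align_probe",      {"in_contact"}, {"aligned"}),
    ("scan_weld",        {"aligned"}, {"weld_scanned"}),
]

def plan(start, goal):
    """Breadth-first search over action sequences (shortest plan first)."""
    queue = deque([(frozenset(start), [])])
    seen = {frozenset(start)}
    while queue:
        state, steps = queue.popleft()
        if goal <= state:          # goal facts all satisfied
            return steps
        for name, pre, eff in ACTIONS:
            if pre <= state:       # action applicable in this state
                nxt = frozenset(state | eff)
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append((nxt, steps + [name]))
    return None

# The goal is given at a high level; the planner fills in the steps.
steps = plan({"idle"}, {"weld_scanned"})
```

Production goal-oriented planners additionally handle action costs, delete effects, and replanning when the world changes mid-execution; this sketch shows only the precondition/effect chaining at the heart of the technique.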

To ensure a well-defined problem space, the project will be limited to the inspection of complex weld geometries, supplied by industry partners, using a cobot with a phased array probe. This will ensure that the number of parameters is kept at a reasonable level while algorithms and AI approaches are developed. It is envisaged that the project will result in actual, on-site trials to determine the applicability of the developed system in real-world scenarios.


Industry Sectors


Surface Transport



Benefits to Industry

Deployable automated techniques are required for in-service inspections as well as at the manufacture and assembly stages. The drive towards Industry 4.0 requires that digital automated systems are developed to provide feedback to design and manufacture and to enable the creation of digital twins. In addition, industry needs to leverage the power of AI to decrease set-up times and ensure good data quality.

The capability developed in this project will enable industry to employ collaborative robots to undertake NDT and other tasks at manufacture and in-service without the need for a dedicated safety cell.

