January 13, 2023
Automated optical inspection of FAST's reflector surface using drones and computer vision
The Five-hundred-meter Aperture Spherical radio Telescope (FAST), also known as the China Sky Eye, is the world's largest single-dish radio telescope. Its reflector is a partial sphere of radius R = 300 m, and the spherical cap has a planar diameter of 519.6 m, 1.7 times that of the previously largest single-dish radio telescope.
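The two figures are mutually consistent if the cap subtends a 120° opening angle at the sphere's center (an assumption for illustration; the angle is not stated above). The planar (chord) diameter of a spherical cap is d = 2R sin(θ/2):

```python
import math

R = 300.0      # sphere radius in metres (from the article)
theta = 120.0  # full opening angle in degrees (assumed, not stated)

# Chord diameter of a spherical cap: d = 2 * R * sin(theta / 2)
d = 2 * R * math.sin(math.radians(theta / 2))
print(round(d, 1))  # 519.6
```

With θ = 120°, d = 600 · sin 60° = 300√3 ≈ 519.6 m, matching the stated diameter.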
The large reflecting surface makes FAST the world's most sensitive radio telescope. It was used by astronomers to observe, for the first time, fast radio bursts in the Milky Way and to identify more than 500 new pulsars, four times the total number of pulsars identified by other telescopes worldwide. More interesting and exotic objects may yet be discovered using FAST.
However, a larger reflecting surface is also more exposed to environmental damage. The FAST reflector comprises a total of 4,450 spliced triangular panels, made of aluminum and uniformly perforated to reduce weight and wind loading. Falling objects (e.g., during extreme events such as rockfalls, severe windstorms, and hailstorms) can leave severe dents and holes in the panels. Such defects degrade observations of short-wavelength radio waves, which demand a near-perfect dish surface: any irregularity in the reflector scatters these short waves away from the focus, causing information loss.
The rapid detection of surface defects for timely repair is hence critical for maintaining the normal operation of FAST. This is traditionally done by direct visual inspection. Skilled inspectors climb up the reflector and visually examine the entire surface, searching for and replacing any panels showing dents and holes.
However, this procedure has several limitations. First, accessing hard-to-reach places high above the ground is dangerous. Second, scrutinizing thousands of panels is labor-intensive and time-consuming. Third, the procedure relies heavily on the inspectors' expertise and is prone to human error and inconsistency.
Automated inspection remedies these shortcomings of manual inspection at FAST. In a new paper published in Light: Advanced Manufacturing, a team of scientists led by Professors Jianan Li and Tingfa Xu of the Beijing Institute of Technology has taken the first step towards automating the inspection of FAST by integrating deep-learning techniques with drone technology.
The team began by manually piloting a drone equipped with a high-resolution RGB camera over the reflector surface along a predetermined route. During the flight, the camera captured and recorded videos of the surface conditions.
One benefit of drones' advanced flight stability is that the recorded videos capture fine surface detail. Moreover, thanks to the GPS device and the RTK module onboard the drone platform, every video frame can be tagged with the corresponding drone location to centimeter-level accuracy, so the physical locations of the panels appearing in each frame are easily determined.
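With centimeter-accurate positions per frame, mapping a frame to the panel beneath it reduces to a nearest-neighbor lookup against a catalogue of panel centers. A minimal sketch, with an entirely hypothetical panel catalogue and local coordinate frame (the article does not describe this step's implementation):

```python
import math

# Hypothetical catalogue: panel id -> (east, north) centre in metres,
# in a local coordinate frame. Real panel positions are not given here.
panels = {
    "P-0001": (0.0, 0.0),
    "P-0002": (11.0, 0.0),
    "P-0003": (5.5, 9.5),
}

def nearest_panel(frame_pos, catalogue):
    """Return the id of the panel whose centre is closest to the
    RTK-tagged drone position recorded for a video frame."""
    return min(
        catalogue,
        key=lambda pid: math.dist(frame_pos, catalogue[pid]),
    )

print(nearest_panel((10.2, 1.1), panels))  # P-0002
```

In practice the lookup would also account for the camera's footprint and orientation, but the centimeter-level RTK fix is what makes a simple geometric association feasible at all.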
To tackle the challenge of finding surface defects in drone imagery, which exhibits large scale variation and high inter-class similarity, they introduced a simple yet effective cross-fusion operation for deep detectors. It aggregates multi-level features in a point-wise selective manner, helping to detect defects of various scales and types. The cross-fusion operation is lightweight and computationally efficient, both particularly valuable properties for onboard drone applications.
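The idea of point-wise selective aggregation can be sketched as follows: at every spatial location, softmax gating weights decide how much each detector level contributes to the fused feature. This is only an illustrative reconstruction; the shapes and the gating scheme (a channel mean here, where a learned 1x1 convolution would be used in practice) are assumptions, and the paper's exact cross-fusion design may differ.

```python
import numpy as np

rng = np.random.default_rng(0)

# Three feature maps from different detector levels, assumed already
# resized to a common H x W x C shape (4 x 4 x 8 here).
levels = [rng.standard_normal((4, 4, 8)) for _ in range(3)]

# Per-location, per-level gating logits (illustrative: channel mean).
logits = np.stack([f.mean(axis=-1) for f in levels], axis=-1)  # (4, 4, 3)

# Softmax over levels -> point-wise selection weights summing to 1.
w = np.exp(logits - logits.max(axis=-1, keepdims=True))
w /= w.sum(axis=-1, keepdims=True)

# Weighted sum of the levels at every spatial location.
fused = sum(w[..., i:i + 1] * levels[i] for i in range(3))  # (4, 4, 8)
print(fused.shape)
```

Because the fusion is a per-location weighted sum rather than extra convolutional layers, its cost is small, which is consistent with the lightweight, onboard-friendly design described above.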
Future work will implement the algorithm on embedded hardware platforms so that captured videos can be processed onboard the drone, making the inspection system more autonomous and more robust.
More information: Jianan Li et al, Automated optical inspection of FAST's reflector surface using drones and computer vision, Light: Advanced Manufacturing (2022). DOI: 10.37188/lam.2023.001