A research team from the Shenyang Institute of Automation of the Chinese Academy of Sciences has developed an artificial intelligence–based model, O-TabPFN, that significantly improves the precision and consistency of robotic grinding for cast turbine blades. These blades are core components of aerospace, energy, and gas turbine systems.
The study, published in Precision Engineering on February 16, introduces a material removal depth prediction model that allows robots to automatically adjust grinding process parameters according to the distribution of machining allowance across different areas of the blade. This enables precise point-by-point material removal and significantly improves machining accuracy and surface consistency.
Cast blades typically have complex free-form surface structures with characteristics such as twisting, bending, and variable cross-sections. Traditional grinding methods often struggle to apply uniform force and achieve consistent machining across the entire blade surface. Additionally, factors such as cooling shrinkage and mold deviations can lead to an uneven distribution of machining allowance. Therefore, adaptive adjustments based on the actual allowance are required during the grinding process.
Abrasive belt grinding is a key technology for machining the complex-profile blades used in aero-engines and gas turbines. However, existing robotic systems generally rely on constant-force grinding, applying fixed parameters across the entire surface. While this approach offers simplicity and efficiency, it faces practical challenges: insufficient machining accuracy, limited control over material removal, and inconsistent surface quality. These difficulties are compounded by the uneven allowance distribution on the blade profile and its significant curvature variations.
To address these limitations, the researchers developed a data-driven model capable of capturing the nonlinear relationship between grinding parameters and material removal depth. The model is based on TabPFN, a meta-learning framework that uses prior knowledge to perform well on small tabular datasets. It also integrates hyperparameter optimization using the Optuna algorithm to improve performance and robustness.
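As a rough illustration of this idea, the sketch below mimics the workflow: fit a small regression model mapping grinding parameters to removal depth, then search over a hyperparameter to minimize validation error. Everything here is a stand-in: the synthetic removal-depth law, the parameter ranges, and the k-nearest-neighbor regressor (used instead of the actual TabPFN model) are illustrative assumptions, and the random-search loop stands in for Optuna's sampler; none of it reflects the paper's actual data or implementation.

```python
import random
import math

random.seed(0)

# Hypothetical nonlinear removal law for demonstration only (not from the paper):
# depth grows with contact force and belt speed, shrinks with feed rate.
def true_depth(v, f, p):
    return 0.002 * p * math.sqrt(v) / (1.0 + f)

# Synthetic dataset: (belt speed m/s, feed rate mm/s, contact force N) -> depth (mm).
data = []
for _ in range(200):
    v = random.uniform(5, 15)
    f = random.uniform(0.5, 2.0)
    p = random.uniform(5, 30)
    data.append(((v, f, p), true_depth(v, f, p) + random.gauss(0, 0.001)))

train, valid = data[:150], data[150:]

# Simple k-nearest-neighbor regressor as a stand-in predictor.
def knn_predict(k, x, samples):
    nearest = sorted(samples, key=lambda s: sum((a - b) ** 2 for a, b in zip(s[0], x)))
    return sum(y for _, y in nearest[:k]) / k

def validation_mse(k):
    return sum((knn_predict(k, x, train) - y) ** 2 for x, y in valid) / len(valid)

# Hyperparameter search loop: random sampling here, where the study uses Optuna.
best_k, best_err = None, float("inf")
for _ in range(20):
    k = random.randint(1, 20)
    err = validation_mse(k)
    if err < best_err:
        best_k, best_err = k, err

print(f"best k = {best_k}, validation MSE = {best_err:.6f} mm^2")
```

In the study itself, the tuned predictor is TabPFN rather than kNN, and Optuna's sampler replaces the random search, but the structure of the loop — propose hyperparameters, score on held-out data, keep the best — is the same.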
According to ZHU Guang, a member of the research team led by LI Lun, the proposed O-TabPFN prediction model captures the nonlinear relationship between process parameters and material removal depth more accurately while reducing the risk of converging to local optima. These characteristics make it particularly well suited to high-precision, data-efficient prediction tasks in robotic abrasive belt grinding.
Experimental results show that the model predicts material removal depth with 95.81% accuracy and an average prediction error of only 0.007316 mm, outperforming several mainstream prediction models.
The researchers suggested that this approach could enable robots to adaptively adjust processing parameters according to the allowance distribution, thereby significantly enhancing machining accuracy and surface consistency.
This study was supported by the National Natural Science Foundation of China, the Natural Science Foundation of Liaoning Province, and others.