Research Info

Title: Sensor data fusion and cutting tool status recognition by k-means clustering
Type: Journal Paper
Keywords: multi-sensory, data fusion, cutting tool, k-means clustering
Abstract: In this study, a novel multi-sensory data fusion approach is developed for real-time tool wear condition monitoring during the turning process, addressing the limitations of single-sensor systems, which often suffer from noise and uncertainty. By integrating data from four distinct sensors (machine vision, electrical current, accelerometer, and strain gauge), this method enhances the reliability and robustness of wear state identification. The extracted features include the entropy of the workpiece's surface texture via the stationary wavelet transform, the time-frequency marginal integral of the motor current, and the Shannon entropy of both the cutting tool's bending strain and acceleration signals. These features are fused using k-means clustering with Lloyd's algorithm to classify tool wear into three distinct categories: low (0–0.1 mm), medium (0.1–0.2 mm), and high (> 0.2 mm). Experimental results demonstrate that this approach achieves a classification accuracy of 95%, significantly outperforming traditional single-sensor methods, which typically yield accuracies below 80%. This scalable and efficient technique is well suited for intelligent manufacturing, offering precise tool replacement decisions with minimal computational overhead.
Researchers: Khalil Khalili (First Researcher), Mehdi Danesh (Second Researcher)
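The fusion step described in the abstract, k-means clustering via Lloyd's algorithm over a four-dimensional feature vector, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the feature extraction (wavelet-based texture entropy, time-frequency marginal integral, Shannon entropy of strain and acceleration) is not reproduced, and the function names and synthetic feature vectors are assumptions for demonstration only.

```python
import random


def dist2(a, b):
    """Squared Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))


def kmeans(points, k, iters=50, seed=0):
    """Lloyd's algorithm: alternate nearest-centroid assignment
    and centroid (mean) update until convergence."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)  # initialize from the data
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        # Assignment step: each point joins its nearest centroid.
        clusters = [[] for _ in range(k)]
        for p in points:
            idx = min(range(k), key=lambda i: dist2(p, centroids[i]))
            clusters[idx].append(p)
        # Update step: each centroid moves to the mean of its cluster.
        new_centroids = [
            tuple(sum(c) / len(cl) for c in zip(*cl)) if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
        if new_centroids == centroids:  # converged
            break
        centroids = new_centroids
    return centroids, clusters


# Hypothetical 4-D feature vectors (texture entropy, current marginal
# integral, strain entropy, acceleration entropy), one per measurement.
features = (
    [(0.1, 0.2, 0.1, 0.2), (0.2, 0.1, 0.2, 0.1)]      # low wear
    + [(1.1, 1.0, 1.2, 1.1), (1.0, 1.2, 1.1, 1.0)]    # medium wear
    + [(2.2, 2.1, 2.0, 2.2), (2.1, 2.2, 2.1, 2.0)]    # high wear
)
centroids, clusters = kmeans(features, k=3, seed=1)
```

In this sketch the three resulting clusters would be mapped to the low, medium, and high wear classes by inspecting their centroids; the paper's thresholds (0.1 mm and 0.2 mm of flank wear) apply to the measured wear, not to the clustered feature values.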