Machine learning for workpiece mass prediction using real and synthetic acoustic data

Bibliographic Details
Published in: Scientific Reports, Vol. 15, No. 1, Article 19534 (12 pp.)
Main Authors: Whittaker, D. S., Gregório, J., Byrne, T. F.
Format: Journal Article
Language: English
Published: England: Nature Publishing Group, 04.06.2025 (Nature Publishing Group UK; Nature Portfolio)
ISSN: 2045-2322
DOI: 10.1038/s41598-025-03018-3


More Information
Summary: We apply a feedforward neural network using supervised learning to sound recordings, obtained without specialised equipment as workpieces undergo a simple manufacturing process, to predict their mass. We also report a simple technique to seed synthetic data from real data for training and testing the algorithm. Work was performed in the frequency domain, with spectrally resolved magnitudes fed as variables (or features) to the network. The mean absolute percentage deviation when predicting the mass of each of thirty-two workpieces of different material composition, considering just real data, was 19.2%. This fell to 8.7%, however, when the exercise was repeated using a significantly greater amount of synthetic data to train and test the network. Fractional uncertainty in measured mass is of the order of 10 , and so this serves well as a general proxy to test our approach. The approach could find application when data to augment algorithm performance must be obtained robustly, relatively quickly and without much computational effort, in the wider context of waste minimisation and green manufacturing.
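
The abstract describes the pipeline only at a high level: magnitude spectra of the recordings serve as input features to a feedforward network that regresses mass, real spectra are used to seed additional synthetic examples, and accuracy is summarised as a mean absolute percentage deviation. The sketch below is a minimal illustration of that workflow under assumed choices, not the authors' implementation: the feature dimensionality, the Gaussian-perturbation seeding of synthetic spectra, the scikit-learn MLPRegressor architecture, and all array shapes and mass values are placeholders.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def spectral_features(signal, n_bins=256):
    """Magnitude spectrum of one recording, truncated to n_bins frequency bins.

    The feature definition is an assumption; the abstract only states that
    spectrally resolved magnitudes are fed to the network."""
    mag = np.abs(np.fft.rfft(signal))
    return mag[:n_bins]

def seed_synthetic(X, y, copies=20, noise_frac=0.05):
    """Seed synthetic spectra by perturbing real ones with small multiplicative
    Gaussian noise. This is a stand-in for the paper's (unspecified) technique."""
    X_syn = np.concatenate(
        [X * (1.0 + noise_frac * rng.standard_normal(X.shape)) for _ in range(copies)]
    )
    y_syn = np.tile(y, copies)
    return X_syn, y_syn

# Placeholder data: 32 recordings and hypothetical mass labels (shapes invented).
signals = rng.standard_normal((32, 44_100))        # ~1 s of audio per workpiece
masses = rng.uniform(50.0, 500.0, size=32)          # masses in grams, illustrative only

X = np.stack([spectral_features(s) for s in signals])
X_aug, y_aug = seed_synthetic(X, masses)

X_tr, X_te, y_tr, y_te = train_test_split(X_aug, y_aug, test_size=0.2, random_state=0)

# Small feedforward regressor; the layer sizes are illustrative, not the paper's.
model = MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=2000, random_state=0)
model.fit(X_tr, y_tr)

pred = model.predict(X_te)
mapd = np.mean(np.abs(pred - y_te) / y_te) * 100.0   # mean absolute percentage deviation
print(f"MAPD on held-out synthetic-seeded data: {mapd:.1f}%")
```

With random placeholder data the printed error is meaningless; the point is the shape of the workflow the abstract outlines: spectral feature extraction, seeding synthetic examples from real ones, feedforward regression, and percentage-error evaluation.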