Machine learning-based prediction of compressive strength and slump in high-performance concrete
| Published in | Multiscale and Multidisciplinary Modeling, Experiments and Design Vol. 8; no. 10; p. 463 |
|---|---|
| Main Authors | |
| Format | Journal Article |
| Language | English |
| Published | Cham: Springer International Publishing, 01.11.2025 (Springer Nature B.V.) |
| Subjects | |
| ISSN | 2520-8160; 2520-8179 |
| DOI | 10.1007/s41939-025-01001-z |
| Summary: | The design of a concrete structure depends heavily on one of its most significant performance parameters: concrete strength. Reliable strength prediction can save design expense and time and reduce the material waste of repeated trial mixes. Because it has a very long service life, gains strength over time, and is highly fire-resistant, concrete is among the safest and most environmentally friendly building materials in worldwide use; an estimated 21 to 31 billion tons are consumed annually. Designing a concrete mix entails selecting appropriate constituents and determining their relative proportions so as to achieve the required strength, durability, and functionality at the lowest cost. This work used Extreme Gradient Boosting Regression (XGBR) to estimate the Compressive Strength (CS) and Slump (SL) of high-performance concrete. Three optimization methods were employed to increase the predictive capacity of the underlying model: the Population-Based Vortex Search Algorithm (PBVSA), the Escaping Bird Search for Constrained Optimization (EBSCO), and the Honey Badger Algorithm (HBA). Combining these optimizers with the base learner yielded novel hybrid schemes with enhanced prediction capability, named XGBR + PBVSA (XGPB), XGBR + EBSCO (XGEB), and XGBR + HBA (XGHB). For SL, the XGEB model performs exceptionally well during the test phase, with an R² index value of 0.985, followed by XGPB at 0.975. For CS, XGEB is again the top performer during the test phase, with an R² index value of 0.991, while XGB is the weakest model, at 0.968.
Graphical abstract |
|---|---|
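The hybrid schemes described in the abstract pair a gradient-boosted regressor with a metaheuristic that searches over its hyperparameters. As a rough illustration only, the sketch below builds a toy stump-based gradient booster in pure Python and tunes its (rounds, learning-rate) pair with plain random search standing in for PBVSA/EBSCO/HBA; the synthetic "mix design" data and every function name here are hypothetical, not from the paper.

```python
import random

def fit_stump(X, residuals):
    """Fit a one-split regression stump to the current residuals."""
    best = None  # (error, feature index, threshold, left mean, right mean)
    for j in range(len(X[0])):
        for t in sorted({row[j] for row in X}):
            left = [r for row, r in zip(X, residuals) if row[j] <= t]
            right = [r for row, r in zip(X, residuals) if row[j] > t]
            if not left or not right:
                continue
            lm, rm = sum(left) / len(left), sum(right) / len(right)
            err = (sum((r - lm) ** 2 for r in left)
                   + sum((r - rm) ** 2 for r in right))
            if best is None or err < best[0]:
                best = (err, j, t, lm, rm)
    _, j, t, lm, rm = best
    return lambda row: lm if row[j] <= t else rm

def boost(X, y, n_rounds, lr):
    """Gradient boosting for squared error: repeatedly fit stumps to residuals."""
    base = sum(y) / len(y)
    stumps, pred = [], [base] * len(y)
    for _ in range(n_rounds):
        resid = [yi - pi for yi, pi in zip(y, pred)]
        stump = fit_stump(X, resid)
        stumps.append(stump)
        pred = [pi + lr * stump(row) for pi, row in zip(pred, X)]
    return lambda row: base + lr * sum(s(row) for s in stumps)

def mse(model, X, y):
    return sum((model(row) - yi) ** 2 for row, yi in zip(X, y)) / len(y)

# Synthetic stand-in data: features loosely mimic cement and water dosage,
# target loosely mimics strength (rises with cement, falls with water).
random.seed(0)
X = [[random.uniform(300, 500), random.uniform(140, 200)] for _ in range(40)]
y = [0.1 * c - 0.05 * w + random.gauss(0, 0.5) for c, w in X]
train_X, test_X, train_y, test_y = X[:30], X[30:], y[:30], y[30:]

# Random search stands in for the paper's metaheuristics: each candidate is a
# (rounds, learning-rate) pair, scored by held-out error; the best is kept.
best_cfg, best_err = None, float("inf")
for _ in range(15):
    cfg = (random.randint(5, 25), random.uniform(0.05, 0.5))
    err = mse(boost(train_X, train_y, *cfg), test_X, test_y)
    if err < best_err:
        best_cfg, best_err = cfg, err
print("best (rounds, lr):", best_cfg, "held-out MSE:", round(best_err, 3))
```

A real reproduction would swap the stump booster for an XGBoost regressor and the random search for the PBVSA, EBSCO, or HBA update rules, but the optimizer-wraps-model structure is the same.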
| Bibliography: | ObjectType-Article-1 SourceType-Scholarly Journals-1 ObjectType-Feature-2 content type line 14 |
| ISSN: | 2520-8160 2520-8179 |
| DOI: | 10.1007/s41939-025-01001-z |