sparsesurv: a Python package for fitting sparse survival models via knowledge distillation

Bibliographic Details
Published in: Bioinformatics (Oxford, England), Vol. 40, No. 9
Main Authors: Wissel, David; Janakarajan, Nikita; Schulte, Julius; Rowson, Daniel; Yuan, Xintian; Boeva, Valentina
Format: Journal Article
Language: English
Published: Oxford University Press, England, 02.09.2024
ISSN: 1367-4811, 1367-4803
DOI: 10.1093/bioinformatics/btae521

Summary:

Motivation: Sparse survival models are statistical models that select a subset of predictor variables while modeling the time until an event occurs, which can improve interpretability and transportability. The subset of important features is often obtained with regularized models, such as the Cox proportional hazards model with Lasso regularization, which limit the number of non-zero coefficients. However, such models can be sensitive to the choice of regularization hyperparameter.

Results: In this work, we develop a software package and demonstrate how knowledge distillation, a powerful machine-learning technique that transfers knowledge from a complex teacher model to a simpler student model, can be leveraged to learn sparse survival models while mitigating this challenge. For this purpose, we present sparsesurv, a Python package that contains a set of teacher-student model pairs, including the semi-parametric accelerated failure time and extended hazards models as teachers, which previously lacked Python implementations. It also contains in-house survival function estimators, removing the need for external packages. sparsesurv is validated against R-based Elastic Net regularized linear Cox proportional hazards models as implemented in the widely used glmnet package. Our results show that knowledge distillation-based approaches achieve competitive discriminative performance relative to glmnet across the regularization path while making the choice of the regularization hyperparameter significantly easier. These features, combined with an sklearn-like API, make sparsesurv an easy-to-use Python package that enables survival analysis on high-dimensional datasets by fitting sparse survival models via knowledge distillation.
Availability and implementation: sparsesurv is freely available under a BSD 3-Clause license on GitHub (https://github.com/BoevaLab/sparsesurv) and on the Python Package Index (PyPI) (https://pypi.org/project/sparsesurv/).
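The teacher-student recipe described in the abstract can be sketched in a few lines of plain scikit-learn: a flexible teacher model produces continuous scores, and a sparse linear student (here a Lasso) is fit to those distilled scores, so feature selection happens on the teacher's smoothed targets rather than the raw outcome. Everything below (the synthetic data, the choice of a random forest teacher, the alpha value) is an illustrative assumption and is not the sparsesurv API.

```python
# Minimal sketch of knowledge distillation into a sparse linear student.
# NOTE: illustrative only -- this does not use the sparsesurv package.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))
# Synthetic continuous "risk" that depends on only a few of the 50 features.
true_risk = 2.0 * X[:, 0] - X[:, 1] + 0.5 * X[:, 2]
y = true_risk + rng.normal(scale=0.1, size=200)

# Teacher: a flexible model fit to the observed outcome.
teacher = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)
teacher_scores = teacher.predict(X)

# Student: a sparse linear model distilled from the teacher's scores.
# The Lasso penalty drives most coefficients to exactly zero.
student = Lasso(alpha=0.1).fit(X, teacher_scores)
selected = np.flatnonzero(student.coef_)
print("non-zero coefficients:", selected)
```

In sparsesurv the same pattern is applied to survival data, with semi-parametric survival models (e.g. accelerated failure time or extended hazards) acting as teachers; the point of distilling is that the student's sparsity pattern tends to be more stable across the regularization path than a Lasso fit directly to censored outcomes.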