Supervised Robustness-preserving Data-free Neural Network Pruning

Bibliographic Details
Published in Proceedings (International Conference on Engineering of Complex Computer Systems. Online), pp. 22-31
Main Authors Meng, Mark Huasong, Bai, Guangdong, Teo, Sin G., Dong, Jin Song
Format Conference Proceeding
Language English
Published IEEE 14.06.2023
Subjects
ISSN 2770-8535
DOI 10.1109/ICECCS59891.2023.00013


Abstract When deploying pre-trained neural network models in real-world applications, model consumers often encounter resource-constrained platforms such as mobile and smart devices. They typically use pruning techniques to reduce the size and complexity of the model, generating a lighter one with lower resource consumption. Nonetheless, most existing pruning methods are proposed on the premise that the pruned model can be fine-tuned or even retrained on the original training data. This may be unrealistic in practice, as data controllers are often reluctant to provide their model consumers with the original data. In this work, we study neural network pruning in the data-free context, aiming to yield lightweight models that are not only accurate in prediction but also robust against undesired inputs in open-world deployments. Considering the absence of the fine-tuning and retraining that could fix mis-pruned units, we replace the traditional aggressive one-shot strategy with a conservative one that treats model pruning as a progressive process. We propose a pruning method based on stochastic optimization that uses robustness-related metrics to guide the pruning process. Our method is evaluated with a series of experiments on diverse neural network models. The experimental results show that it significantly outperforms existing one-shot data-free pruning approaches in terms of robustness preservation and accuracy.
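The progressive, metric-guided loop the abstract describes can be sketched roughly as follows. The simulated-annealing acceptance rule matches the record's subject terms ("Simulated annealing", "Stochastic processes"), but the `robustness_proxy` score, the smallest-magnitude candidate order, and all parameter values are illustrative assumptions, not the authors' actual method.

```python
import math
import random

# Hedged sketch: progressive data-free pruning, one unit at a time,
# with each candidate step accepted or rejected by a simulated-annealing
# rule driven by a robustness-related score. The toy metric and all
# parameters are assumptions for illustration only.

def robustness_proxy(original, pruned):
    # Toy stand-in for a robustness metric: how little the pruned
    # weights deviate from the originals (larger, i.e. closer to 0,
    # is better).
    return -sum(abs(o - p) for o, p in zip(original, pruned))

def progressive_prune(weights, target_sparsity=0.5, t0=1.0, cooling=0.9,
                      max_steps=1000):
    original = list(weights)
    w = list(weights)
    rng = random.Random(0)
    temperature = t0
    goal = int(len(w) * target_sparsity)
    removed = 0
    for _ in range(max_steps):
        if removed >= goal:
            break
        # Conservative candidate: zero the smallest surviving weight,
        # rather than pruning a whole block in one shot.
        i = min((j for j, v in enumerate(w) if v != 0.0),
                key=lambda j: abs(w[j]))
        candidate = list(w)
        candidate[i] = 0.0
        delta = (robustness_proxy(original, candidate)
                 - robustness_proxy(original, w))
        # Accept improvements outright; accept harmful steps with a
        # probability that shrinks as the temperature cools.
        if delta >= 0 or rng.random() < math.exp(delta / temperature):
            w = candidate
            removed += 1
            temperature *= cooling
    return w

pruned = progressive_prune([0.9, -0.05, 0.4, 0.01, -0.7, 0.02])
print(sum(1 for v in pruned if v == 0.0))  # 3
```

Because acceptance is probabilistic, small-magnitude weights are pruned almost surely while large ones are increasingly protected as the temperature drops, which is one way to realize the "conservative, progressive" strategy the abstract contrasts with aggressive one-shot pruning.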
Author Meng, Mark Huasong
Dong, Jin Song
Teo, Sin G.
Bai, Guangdong
Author_xml – sequence: 1
  givenname: Mark Huasong
  surname: Meng
  fullname: Meng, Mark Huasong
  email: menghs@i2r.a-star.edu.sg
  organization: National University of Singapore, Singapore
– sequence: 2
  givenname: Guangdong
  surname: Bai
  fullname: Bai, Guangdong
  email: g.bai@uq.edu.au
  organization: The University of Queensland, Australia
– sequence: 3
  givenname: Sin G.
  surname: Teo
  fullname: Teo, Sin G.
  email: teo_sin_gee@i2r.a-star.edu.sg
  organization: Institute for Infocomm Research, A*STAR, Singapore
– sequence: 4
  givenname: Jin Song
  surname: Dong
  fullname: Dong, Jin Song
  email: dcsdjs@nus.edu.sg
  organization: National University of Singapore, Singapore
CODEN IEEPAD
ContentType Conference Proceeding
DBID 6IE
6IL
CBEJK
RIE
RIL
DOI 10.1109/ICECCS59891.2023.00013
DatabaseName IEEE Electronic Library (IEL) Conference Proceedings
IEEE Xplore POP ALL
IEEE Xplore All Conference Proceedings
IEEE Electronic Library (IEL)
IEEE Proceedings Order Plans (POP All) 1998-Present
DatabaseTitleList
Database_xml – sequence: 1
  dbid: RIE
  name: IEEE Electronic Library (IEL)
  url: https://proxy.k.utb.cz/login?url=https://ieeexplore.ieee.org/
  sourceTypes: Publisher
DeliveryMethod fulltext_linktorsrc
Discipline Computer Science
EISBN 9798350340044
EISSN 2770-8535
EndPage 31
ExternalDocumentID 10321913
Genre orig-research
GroupedDBID 6IE
6IL
6IN
ABLEC
ADZIZ
ALMA_UNASSIGNED_HOLDINGS
BEFXN
BFFAM
BGNUA
BKEBE
BPEOZ
CBEJK
CHZPO
IEGSK
OCL
RIE
RIL
IEDL.DBID RIE
IngestDate Wed Aug 27 02:37:21 EDT 2025
IsDoiOpenAccess false
IsOpenAccess true
IsPeerReviewed false
IsScholarly false
Language English
LinkModel DirectLink
OpenAccessLink http://hdl.handle.net/10072/428247
PageCount 10
ParticipantIDs ieee_primary_10321913
PublicationCentury 2000
PublicationDate 2023-June-14
PublicationDateYYYYMMDD 2023-06-14
PublicationDate_xml – month: 06
  year: 2023
  text: 2023-June-14
  day: 14
PublicationDecade 2020
PublicationTitle Proceedings (International Conference on Engineering of Complex Computer Systems. Online)
PublicationTitleAbbrev ICECCS
PublicationYear 2023
Publisher IEEE
Publisher_xml – name: IEEE
SSID ssj0003320732
Score 1.8546678
SourceID ieee
SourceType Publisher
StartPage 22
SubjectTerms Computational modeling
Data models
model optimization
Neural networks
pruning
Robustness
Simulated annealing
Stochastic processes
Training data
Title Supervised Robustness-preserving Data-free Neural Network Pruning
URI https://ieeexplore.ieee.org/document/10321913
hasFullText 1
inHoldings 1
isFullTextHit
isPrint
linkProvider IEEE