An explainable multi-sparsity multi-kernel nonconvex optimization least-squares classifier method via ADMM
| Published in | Neural computing & applications Vol. 34; no. 18; pp. 16103 - 16128 |
|---|---|
| Main Authors | Zhang, Zhiwang; He, Jing; Cao, Jie; Li, Shuqing; Li, Xingsen; Zhang, Kai; Wang, Pingjiang; Shi, Yong |
| Format | Journal Article |
| Language | English |
| Published | London: Springer London; Springer Nature B.V., 01.09.2022 |
| Subjects | |
| Online Access | https://link.springer.com/article/10.1007/s00521-022-07282-6 |
| ISSN | 0941-0643 (print); 1433-3058 (electronic) |
| DOI | 10.1007/s00521-022-07282-6 |
| Abstract | Convex optimization techniques are extensively applied to various models, algorithms, and applications of machine learning and data mining. For optimization-based classification methods, the sparsity principle can greatly help to select simple classifier models, while the single- and multi-kernel methods can effectively address nonlinearly separable problems. However, the limited sparsity and kernel methods hinder the improvement of the predictive accuracy, efficiency, iterative update, and interpretable classification model. In this paper, we propose a new Explainable Multi-sparsity Multi-kernel Nonconvex Optimization Least-squares Classifier (EM2NOLC) model, which is an optimization problem with a least-squares objective function and multi-sparsity multi-kernel nonconvex constraints, aiming to address the aforementioned issues. Based on reconstructed multiple kernel learning (MKL), the proposed model can extract important instances and features by finding the sparse coefficient and kernel weight vectors, which are used to compute importance or contribution to classification and obtain the explainable prediction. The corresponding EM2NOLC algorithm is implemented with the Alternating Direction Method of Multipliers (ADMM). On real classification datasets, compared with three ADMM classifiers (Linear Support Vector Machine Classifier, SVMC, and Least Absolute Shrinkage and Selection Operator Classifier), two MKL classifiers (SimpleMKL and EasyMKL), and the gradient-descent classifier of Feature Selection for SVMC, our proposed EM2NOLC generally obtains the best predictive performance and explainable results with the fewest important instances and features, each having different contribution percentages. |
|---|---|
| Authors and Affiliations | Zhiwang Zhang (College of Information Engineering, Nanjing University of Finance and Economics; ORCID 0000-0002-1060-5797; email: 9120211050@nufe.edu.cn); Jing He (Department of Neuroscience, University of Oxford); Jie Cao (College of Information Engineering, Nanjing University of Finance and Economics); Shuqing Li (College of Information Engineering, Nanjing University of Finance and Economics); Xingsen Li (Research Institute of Extenics and Innovation Methods, Guangdong University of Technology); Kai Zhang (Quanzhou HUST Research Institute of Intelligent Manufacturing); Pingjiang Wang (Quanzhou HUST Research Institute of Intelligent Manufacturing, School of Mechanical Science and Engineering, Huazhong University of Science and Technology); Yong Shi (Research Center on Fictitious Economy and Data Science, Chinese Academy of Sciences) |
| Copyright | The Author(s), under exclusive licence to Springer-Verlag London Ltd., part of Springer Nature 2022 |
| Grant Information | National Natural Science Foundation of China (61877061, 71271191, 71871109, 91646204, 71701089); Key Program of National Natural Science Foundation of China (92046026); Jiangsu Provincial Policy Guidance Program (BZ2020008); Jiangsu Provincial Key Research and Development Program (BE2020001-3); High-End Foreign Experts Projects (G2021194011L) |
| Keywords | Multiple kernel learning; Nonconvex optimization; Sparse learning; Explainable; Least squares; Classification |
| Subject Terms | Algorithms; Artificial Intelligence; Classification; Classifiers; Computational Biology/Bioinformatics; Computational geometry; Computational Science and Engineering; Computer Science; Convexity; Data mining; Data Mining and Knowledge Discovery; Feature extraction; Image Processing and Computer Vision; Iterative methods; Kernel functions; Least squares; Machine learning; Optimization; Optimization techniques; Original Article; Performance prediction; Probability and Statistics in Computer Science; Sparsity; Support vector machines |
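The abstract describes a least-squares classifier with sparsity-inducing constraints over (multi-)kernel representations, solved by ADMM. As a minimal sketch of that general machinery only, and not the authors' EM2NOLC implementation, the code below applies standard scaled-form ADMM to a generic L1-regularized kernel least-squares classifier. The kernel matrix `K`, the names `fit_admm` and `soft_threshold`, and the parameters `lam` and `rho` are all illustrative assumptions rather than identifiers from the paper.

```python
# Minimal ADMM sketch for an L1-regularized kernel least-squares classifier:
#     minimize_w  0.5 * ||K w - y||^2 + lam * ||w||_1
# where K is a precomputed Gram matrix (e.g. an RBF kernel, or a weighted
# combination of several kernels) and y holds +/-1 class labels.
# This is NOT the paper's EM2NOLC algorithm; it only illustrates the ADMM steps.
import numpy as np


def soft_threshold(v, kappa):
    """Proximal operator of the L1 norm (element-wise soft thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - kappa, 0.0)


def fit_admm(K, y, lam=0.1, rho=1.0, n_iter=200, tol=1e-6):
    """Scaled-form ADMM iterations:
    w^{k+1} = (K^T K + rho I)^{-1} (K^T y + rho (z^k - u^k))
    z^{k+1} = soft_threshold(w^{k+1} + u^k, lam / rho)
    u^{k+1} = u^k + w^{k+1} - z^{k+1}
    Returns the sparse coefficient vector z (nonzeros mark influential instances).
    """
    n = K.shape[1]
    z = np.zeros(n)
    u = np.zeros(n)
    # Cache the Cholesky factor of (K^T K + rho I); it is reused every iteration.
    L = np.linalg.cholesky(K.T @ K + rho * np.eye(n))
    Kty = K.T @ y
    for _ in range(n_iter):
        rhs = Kty + rho * (z - u)
        w = np.linalg.solve(L.T, np.linalg.solve(L, rhs))
        z_old = z
        z = soft_threshold(w + u, lam / rho)
        u = u + w - z
        # Stop when both primal (w - z) and dual (z - z_old) residuals are small.
        if np.linalg.norm(w - z) < tol and rho * np.linalg.norm(z - z_old) < tol:
            break
    return z


def predict(K_test, w):
    """Classify by the sign of the kernel expansion K_test @ w."""
    return np.sign(K_test @ w)


# Usage sketch, assuming e.g. scikit-learn's rbf_kernel for the Gram matrices:
#   K = rbf_kernel(X_train, X_train)
#   w = fit_admm(K, y_train, lam=0.1)
#   y_hat = predict(rbf_kernel(X_test, X_train), w)
```

In a multi-kernel setting like the one the abstract outlines, `K` would presumably be a weighted sum of several Gram matrices, with an additional (nonconvex) update of the kernel weight vector alternated with the steps above; the resulting sparse coefficient and kernel weight vectors are what the paper uses to quantify instance and feature contributions for explainable predictions.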