An empirical study on the effects of code visibility on program testability

Bibliographic Details
Published in: Software Quality Journal, Vol. 25, No. 3, pp. 951–978
Main Authors: Ma, Lei; Zhang, Cheng; Yu, Bing; Sato, Hiroyuki
Format: Journal Article
Language: English
Published: New York: Springer US, 01.09.2017 (Springer Nature B.V.)
ISSN: 0963-9314
EISSN: 1573-1367
DOI: 10.1007/s11219-016-9340-8


Abstract Software testability represents the degree of ease with which a software artifact supports testing. When it is easy to detect defects in a program through testing, the program has high testability; otherwise, its testability is low. As an abstract property of programs, testability can be measured by various metrics, which are affected by different factors of design and implementation. In object-oriented software development, code visibility is important for supporting design principles such as information hiding. It is widely believed that code visibility has some effect on testability. However, little empirical evidence has been presented to clarify whether and how software testability is influenced by code visibility. We have performed a comprehensive empirical study to shed light on this problem. We first used code coverage as a concrete proxy for testability. We selected 27 real-world software programs as subjects and ran two state-of-the-art automated testing tools, Randoop and EvoSuite, on these programs to analyze their code coverage, in comparison with that of developer-written tests. The results show that code visibility does not necessarily affect code coverage, but can significantly affect automated tools: developer-written tests achieve similar coverage on code areas with different visibility, while low code visibility often leads to low code coverage for automated tools. In addition, we developed two enhanced variants of Randoop that implement multiple strategies to handle code visibility. The results on these Randoop variants show that different treatments of code visibility can result in significant differences in code coverage for automated tools. In the second part, our study uses fault detection rate as another concrete measurement of testability. We applied the automated testing tools to 357 real faults. The result of our in-depth analysis is consistent with that of the first part, demonstrating the significant effects of code visibility on program testability.
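The visibility effect described in the abstract can be sketched with a small, invented Java class (the names `Order`, `total`, and `discount` are hypothetical, not taken from the paper): an automated test generator can call the public method directly, but it can reach the private helper only through the public one. Bypassing visibility via reflection is one plausible strategy a visibility-handling generator variant might use; the paper does not specify which strategies its Randoop variants implement, so this is an assumption for illustration only.

```java
import java.lang.reflect.Method;

// Hypothetical subject class: only total() is visible to a test generator;
// discount() has low visibility and is reachable only indirectly.
class Order {
    private double discount(double price) {      // low visibility
        return price > 100 ? price * 0.9 : price;
    }
    public double total(double price) {          // high visibility
        return discount(price) + 5.0;            // plus a flat shipping fee
    }
}

public class VisibilityDemo {
    public static void main(String[] args) throws Exception {
        Order o = new Order();
        // A generated test exercises discount() only through total():
        System.out.println(o.total(200.0));      // 185.0 (180.0 + 5.0)

        // Reflection-based access (an assumed strategy, not the paper's):
        // the private method is invoked directly despite its visibility.
        Method m = Order.class.getDeclaredMethod("discount", double.class);
        m.setAccessible(true);
        System.out.println((double) m.invoke(o, 50.0)); // 50.0
    }
}
```

Under this sketch, branch coverage of `discount` depends on which `price` values the generator happens to pass through `total`, which is why low-visibility code tends to be harder for automated tools to cover.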
Authors:
– Ma, Lei (malei@hit.edu.cn), Harbin Institute of Technology
– Zhang, Cheng, University of Waterloo
– Yu, Bing, Waseda University
– Sato, Hiroyuki, The University of Tokyo
Copyright: Springer Science+Business Media New York 2016. Software Quality Journal is a copyright of Springer, 2017.
Funding:
– Fundamental Research Funds for the Central Universities (grant AUGA5710000816)
– The National High-tech R&D Program of China (863 Program) (grant 2015AA020101)
Peer Reviewed: Yes
Keywords: Software testing; Software testability; Fault detection; Code coverage; Code visibility; Automated testing; Code accessibility
SSID ssj0010074
Score 2.12787
Snippet Software testability represents the degree of ease with which a software artifact supports testing. When it is easy to detect defects in a program through...
SourceID proquest
crossref
springer
SourceType Aggregation Database
Enrichment Source
Index Database
Publisher
StartPage 951
SubjectTerms Automation
Compilers
Computer Science
Data Structures and Information Theory
Defects
Design factors
Empirical analysis
Fault detection
Faults
Interpreters
Object oriented programming
Operating Systems
Programming Languages
Software
Software development
Software development tools
Software Engineering/Programming and Operating Systems
Test systems
Testability
Visibility
Title An empirical study on the effects of code visibility on program testability
URI https://link.springer.com/article/10.1007/s11219-016-9340-8
https://www.proquest.com/docview/1923215431
Volume 25