On the difficulty of replicating human subjects studies in software engineering


Bibliographic Details
Published in: 2008 ACM/IEEE 30th International Conference on Software Engineering; Vol. 2008; no. 24; pp. 191-200
Main Authors: Lung, J., Aranda, J., Easterbrook, S., Wilson, G.
Format: Conference Proceeding; Journal Article
Language: English
Published: IEEE, 01.01.2008
ISBN: 1424444861; 9781424444861; 1605580791; 9781605580791
ISSN: 0270-5257
DOI: 10.1145/1368088.1368115


More Information
Summary: Replications play an important role in verifying empirical results. In this paper, we discuss our experiences performing a literal replication of a human subjects experiment that examined the relationship between a simple test for consistent use of mental models and success in an introductory programming course. We encountered many difficulties in achieving comparability with the original experiment, due to a series of apparently minor differences in context. Based on this experience, we discuss the relative merits of replication, and suggest that, for some human subjects studies, literal replication may not be the most effective strategy for validating the results of previous studies.