On the difficulty of replicating human subjects studies in software engineering
| Published in | 2008 ACM/IEEE 30th International Conference on Software Engineering; Vol. 2008; no. 24; pp. 191-200 |
|---|---|
| Main Authors |  |
| Format | Conference Proceeding; Journal Article |
| Language | English |
| Published | IEEE, 01.01.2008 |
| ISBN | 1424444861; 9781424444861; 1605580791; 9781605580791 |
| ISSN | 0270-5257 |
| DOI | 10.1145/1368088.1368115 |
Summary: | Replications play an important role in verifying empirical results. In this paper, we discuss our experiences performing a literal replication of a human subjects experiment that examined the relationship between a simple test for consistent use of mental models, and success in an introductory programming course. We encountered many difficulties in achieving comparability with the original experiment, due to a series of apparently minor differences in context. Based on this experience, we discuss the relative merits of replication, and suggest that, for some human subjects studies, literal replication may not be the most effective strategy for validating the results of previous studies. |
---|---|