We would like to thank InfoTrax Systems, Inc. for their generous financial support.
Deadlines
Submission: 7 May 2011 (extended)
Notification: 7 June 2011
Camera Ready: 28 June 2011
Organizing Committee
Natalia Juristo – Universidad Politécnica de Madrid, Spain
Charles Knutson – Brigham Young University, USA
Jonathan Krein – Brigham Young University, USA
Lutz Prechelt – Freie Universität Berlin, Germany
Advisory Committee
Joerg Doerr – Fraunhofer IESE, Germany
Peri Tarr – IBM T. J. Watson Research Center, USA
Program Committee
Maria Teresa Baldassarre – Università di Bari, Italy
Christian Bird – Microsoft Research, USA
Marcela Genero Bocco – Universidad de Castilla-La Mancha, Spain
Andrew Brooks – University of New Haven, USA
Jeffrey Carver – University of Alabama, USA
Marcus Ciolkowski – Fraunhofer IESE, Germany
Kevin Crowston – Syracuse University, USA
Daniel Delorey – Google, Inc., USA
Daniel German – University of Victoria, Canada
Jesus Gonzalez-Barahona – Universidad Rey Juan Carlos, Spain
Alicia Grubb – University of Toronto, Canada
Andreas Jedlitschka – Fraunhofer IESE, Germany
James Miller – University of Alberta, Canada
Dietmar Pfahl – Lund University, Sweden
Martin Pinzger – Delft University of Technology, Netherlands
Marc Roper – University of Strathclyde, UK
Carolyn Seaman – Fraunhofer CESE & University of Maryland, USA
Megan Squire – Elon University, USA
Sira Vegas – Universidad Politécnica de Madrid, Spain
Patrick Wagstrom – IBM T. J. Watson Research Center, USA
Claes Wohlin – Blekinge Institute of Technology, Sweden
Murray Wood – University of Strathclyde, UK
Previous RESER Workshops
RESER 2010 – Cape Town, South Africa
Many results in software engineering suffer from threats to validity that can be addressed by replicating previous empirical studies. These threats include:
1) Lack of independent validation of empirical results;
2) Contextual shifts in software engineering practices or environments since the original studies;
3) Limited data sets available at the time of the original studies.
However, several factors discourage replication studies:
1) A perception persists that replication studies are less valuable than original studies;
2) Data sets are often not made publicly available;
3) Reports of empirical studies are often not detailed enough to enable replication;
4) Research tools are either unavailable or unusable, making precise replication impractical.
The primary goal of this workshop is to raise both the quality and the quantity of replication work in software engineering research. In particular, this means:
The workshop is intended as a forum for small-scale replications that are otherwise hard to publish. Accordingly, in addition to general paper submissions, we collect results for one specific joint replication each year: we solicit small-scale replications from which we intend to form a large-scale replication by meta-analysis. Through this process, we expect to produce valuable insights into practical issues concerning replication. In addition, the workshop seeks to identify, and suggest solutions for, recurring practical problems in selecting, designing, performing, reporting, and publishing replication studies by furthering appropriate methods, tools, and standards.
Joint Replication Project
Participate in this year's Joint Replication Project! Simply visit the experiment portal, conduct the guided experiment (framework provided), and write up the results. Individual results will be compiled into a major, distributed, joint replication. Additionally, all reports will be published in full, and authors will share their work in a joint session at the workshop.
Update: The complete proceedings are now available in IEEE Xplore.
All accepted papers will be published as full papers. Additionally, 5–7 papers will be selected for formal presentation at the workshop, and all submissions for the joint replication will be presented in a joint session. All other accepted papers will be invited for poster presentation. Valid results will not be rejected. At least one author of each accepted paper must register for and attend the workshop.
We appreciate questions and comments about the workshop.