1st International Workshop on Recruiting Participants for Empirical Software Engineering (RoPES’22)

  • Co-located with ICSE 2022
  • Virtual
  • May 17, 2022

Registration: https://conf.researchr.org/attending/icse-2022/registration

Theme & Goals

Software studies, and the field of software engineering research, benefit from large numbers of high-quality participants. However, participant recruitment is challenging. Recruiting people from open source projects often involves ethical hurdles such as privacy protection [1], or participant fatigue from being over-researched. Using students can be helpful, but raises concerns about generalizability to the wider software developer population [2]. On top of this, pandemic-related health measures have made in-person studies even more difficult to conduct. Finally, our community's maturity in understanding sampling issues [3], and the associated power calculations, effect-size measurements, and other statistics derived from samples, can always be improved [4].
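
To illustrate the power calculations mentioned above, here is a minimal sketch (not from the workshop materials) of the standard normal-approximation formula for the per-group sample size of a two-sided, two-sample t-test; the effect size d = 0.5 is a hypothetical "medium" effect, and the exact t-based calculation would give a slightly larger n.

```python
from math import ceil
from statistics import NormalDist  # Python 3.8+ standard library

def sample_size_per_group(effect_size: float,
                          alpha: float = 0.05,
                          power: float = 0.80) -> int:
    """Approximate per-group n for a two-sided, two-sample t-test,
    via the normal approximation: n = 2 * ((z_{1-alpha/2} + z_power) / d)^2."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)  # critical value for a two-sided test
    z_beta = z.inv_cdf(power)           # quantile for the desired power
    return ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

# A hypothetical medium effect (Cohen's d = 0.5) at alpha = 0.05, power = 0.80:
print(sample_size_per_group(0.5))  # → 63 participants per group
```

The cost implication is the point: detecting a medium effect already requires well over a hundred participants in total, which is exactly why recruitment strategy and budgeting matter.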

There are also new, potentially useful tools for crowd-sourcing research participants, such as Amazon Mechanical Turk or Prolific. These tools are not yet widely used in software engineering, and when they are, they require careful study design to prevent bias and flawed sampling [5]. Other communities, such as HCI and psychology, have extensive knowledge of how to use these tools, but may not have the same expertise requirements for their subjects.

Topics of Interest

RoPES 2022 seeks contributions addressing, but not limited to, the following topics related to recruitment of participants in empirical software engineering:

  • strategies for new platforms like Prolific, such as payment and filtering;
  • students vs practitioners;
  • determining adequate sample sizes;
  • ethical considerations of participant involvement (such as GDPR, use of emails);
  • equity, diversity and inclusion considerations, such as oversampling women;
  • incentive structures;
  • building reputation with a community (so they don’t reject study invites);
  • challenges with joint studies with industry and academia;
  • self-selection bias (the same people always respond);
  • privacy and IRB restrictions;
  • overfishing problem (returning to the same well);
  • personal contacts: effective but limited;
  • study fadeout effect and how to get people to complete the study;
  • use of attention questions;
  • challenges related to recruiting the “right” participants: right number, right background, right demographics, etc.;
  • budgeting for recruitment costs.

Keynotes

Jason Jacques. City, University of London.

Title: Recruiting Developers at Scale: the case for (and against) crowdsourcing

Abstract: Recruiting programmers for software development studies remains challenging. Costs can be high, samples may be small, and the impact of unintentional selection bias is easy to overlook. Crowdsourcing – the recruitment and use of online labour – has seen notable success in lowering costs and broadening access to data in areas such as machine learning and human-computer interaction. Here, we consider the opportunities afforded by web-based crowdsourcing platforms: the potential to rapidly reach large numbers of software developers at low cost. We discuss some of the challenges arising from large-scale participation, such as user validation and data analysis. And, finally, we look forward as we consider and reflect upon the increasing flexibility, relevance, and use of web technologies in the developer experience and development lifecycle.

Bio: Jason is a Lecturer in the Department of Computer Science at City, University of London. His research focuses on both the micro- and macro-impacts of digital technology on people and society, with an extensive background in many areas of human-computer interaction. Research topics include economics, crowdsourcing, software development, ubiquitous computing, and interface design, with a focus on human factors, engineering design, user behaviour, and digital markets at internet scale, particularly as they extend and shape broader policy.

Jason is a member of the SIGCHI Sustainability Committee, underlining his interest not only in the application and use of novel technologies but the broader impact of both research and practice on wider society. Jason began his postgraduate work at the University of St Andrews, as a member of the SACHI research group, and completed his PhD at the University of Cambridge.

Accepted Papers

1. Austen Rainer and Claes Wohlin. Recruiting participants and sampling items of interest in field studies of software engineering
2. Steffen Herbold, Alexander Trautsch and Benjamin Ledel. The researcher turk in action: experiences from the LLTC4J project
3. Melina Vidoni and Nicolás E. Díaz Ferreyra. Should I Get Involved? On the Privacy Perils of Mining Software Repositories for Research Participants
4. Brittany Reid, Markus Wagner, Marcelo d’Amorim and Christoph Treude. Software Engineering User Study Recruitment on Prolific: An Experience Report
5. Mohammad Tahaei and Kami Vaniea. Lessons Learned From Recruiting Participants With Programming Skills for Empirical Privacy and Security Studies
6. Mary Sánchez-Gordón and Ricardo Colomo-Palacios. Challenges in Recruiting Andean Indigenous Participants for Software Engineering Research
7. Helen Sharp, Tamara Lopez and Michel Wermelinger. Informed consent and participant recruitment in studies of software practice
8. Carolin Brandt and Andy Zaidman. Strategies and Challenges in Recruiting Interview Participants for a Qualitative Evaluation
9. Daniel Russo. Recruiting Software Engineers on Prolific
10. Chris Brown. Nudging Developers to Participate in SE Research
11. Elisa Hartmann and Janet Siegmund. How (Not) to Recruit Students Outside of Computer Science: An Experience Report
12. Felipe Ebert, Alexander Serebrenik, Christoph Treude, Nicole Novielli and Fernando Castor. On Recruiting Experienced GitHub Contributors for Interviews and Surveys on Prolific
13. Nikhil Patnaik, Joseph Hallett, Mohammad Tahaei and Awais Rashid. If You Build It, Will They Come? Developer Recruitment for Security Studies
14. Marco Gutfleisch, Jan H. Klemmer, Yasemin Acar, Sascha Fahl and Martina Angela Sasse. Recruiting Software Professionals for Research Studies: Lessons Learned with the Freelancer Platform Upwork
15. Irum Rauf, Tamara Lopez, Helen Sharp and Marian Petre. Challenges of Recruiting Developers in Multidisciplinary Studies
16. Oscar Dieste, Davide Fucci, Valentina Lenarduzzi and Sira Vegas. Population Characterization Comes Before Sample Selection
17. Alena Naiakshina, Anastasia Danilova and Matthew Smith. Lessons Learned in Five Years of Conducting Security Studies With Software Developers
18. Michael Coblenz and Felix Sosa. Using Games to Broaden Audiences for Programming Studies
19. Matthew Smith, Anastasia Danilova and Alena Naiakshina. A Meta-Research Agenda for Recruitment and Study Design for Developer Studies
20. Madeline Endres, Westley Weimer and Amir Kamil. Making a Gamble: Recruiting SE Participants on a Budget

Workshop Program (final)

All times are Eastern Daylight Time (EDT, UTC-4) on May 17, 2022, on Midspace, ICSE’s virtual conference platform.

Time | Title | Who
9:00 AM | Welcome and overview of the workshop | RoPES organizers
9:05 AM | Keynote | Jason Jacques (introduced by N. Ernst)
10:00 AM | Break |
10:15 AM | Session 1: Finding Participants (#20, #11, #10, #2, #18, #5, #6) | chair: P. Chatterjee
11:15 AM | Break |
11:30 AM | Session 2: Platforms and Participants (#14, #9, #12, #4, #13, #8) | chair: B. Sharif
12:30 PM | Lunch Break |
1:30 PM | Session 3: Ethics, lessons learned, and meta concerns (#19, #17, #16, #15, #7, #1, #3) | chair: J. Carver
2:30 PM | Break |
2:45 PM | Session 4: Paper Brainstorming | Organizers, using Miro and other tools
4:00 PM | Closing, end of RoPES 2022 |

Submission Guidelines

Workshop papers must follow the ICSE 2022 Format and Submission Guidelines, but will use a single-blind submission process. All submitted papers will be reviewed by the program committee on the basis of technical quality, relevance, significance, and clarity. All workshop papers should be submitted electronically in PDF format through the EasyChair workshop website. Accepted papers will become part of the workshop proceedings.

Important Dates

Event | Deadline
Paper submission | January 21, 2022
Author notification | February 18, 2022
Camera-ready | March 18, 2022
Date of workshop | May 17, 2022

Organizing Committee

Name | Affiliation | Twitter
Neil Ernst | U. Victoria | @neilernst
Jeffrey C. Carver | U. Alabama | @JeffCarver32
Carianne Pretorius | TU Eindhoven | @cari_pretorius
Preetha Chatterjee | Drexel U. | @PreethaChatterj
Alexander Serebrenik | TU Eindhoven | @aserebrenik
Bonita Sharif | Nebraska | @shbonita
Matthew Smith | Bonn | @m42smith

Program Committee

The organizing committee plus the following generous individuals:

Name | Organization
Alex Bezzubov | JetBrains
Christian Bird | Microsoft
Kelly Blincoe | The University of Auckland
Fabiano Dalpiaz | Utrecht University
Anastasia Danilova | University of Bonn
Felipe Ebert | Eindhoven University of Technology
Felipe Fronchetti | Virginia Commonwealth University
Davide Fucci | HITeC, University of Hamburg
Fabian Gilson | University of Canterbury
Valentina Lenarduzzi | LUT University
Tamara Lopez | The Open University
Daniel Mendez | Blekinge Institute of Technology, Sweden, and fortiss, Germany
Kevin Moran | College of William & Mary
Nicole Novielli | Dipartimento di Informatica, University of Bari
Richard Paige | McMaster University
Paul Ralph | Dalhousie University
Martin Robillard | McGill University
Daniel Russo | Department of Computer Science, Aalborg University
Igor Scaliante Wiese | Federal University of Technology – Paraná (UTFPR)
Janet Siegmund | Chemnitz University of Technology
Melina Vidoni | Australian National University, CECS School of Computing
Andy Zaidman | Delft University of Technology

Call for Papers

We invite short, two-page papers that fall into the following categories and pertain to the workshop themes:

  • short position papers describing issues related to participant recruitment;
  • summaries of previously published work that involved significant participant recruitment challenges, i.e., experience reports;
  • extended abstracts of ongoing work or challenges.

In all cases, papers should be no more than two pages. All workshop papers should be submitted electronically in PDF format through the EasyChair workshop website. Accepted papers will be hosted on the workshop home page. All papers will be reviewed by the PC for suitability to the topics, but we hope to accept all on-topic submissions.

The workshop outcome will be an extended paper summarizing the workshop for a venue such as Software Engineering Notes or IEEE Software.

Format

Please follow the ICSE formatting guidelines; anonymization is not required.

References

  1. Nicolas E. Gold and Jens Krinke. 2020. Ethical Mining. In Proceedings of the 17th International Conference on Mining Software Repositories. ACM. https://doi.org/10.1145/3379597.3387462

  2. Robert Feldt, Thomas Zimmermann, Gunnar R. Bergersen, Davide Falessi, Andreas Jedlitschka, Natalia Juristo, Jürgen Münch, Markku Oivo, Per Runeson, Martin Shepperd, Dag I. K. Sjøberg, and Burak Turhan. 2018. Four commentaries on the use of students and professionals in empirical software engineering experiments. Empirical Software Engineering 23, 6 (Nov. 2018), 3801–3820. https://doi.org/10.1007/s10664-018-9655-0

  3. Sebastian Baltes and Paul Ralph. 2020. Sampling in Software Engineering Research: A Critical Review and Guidelines. CoRR abs/2002.07764 (2020). https://arxiv.org/abs/2002.07764

  4. Francisco Gomes de Oliveira Neto, Richard Torkar, Robert Feldt, Lucas Gren, Carlo A. Furia, and Ziwei Huang. 2019. Evolution of statistical analysis in empirical software engineering research: Current state and steps forward. Journal of Systems and Software 156 (Oct. 2019), 246–267. https://doi.org/10.1016/j.jss.2019.07.002

  5. Anastasia Danilova, Alena Naiakshina, Stefan Horstmann, and Matthew Smith. 2021. Do you Really Code? Designing and Evaluating Screening Questions for Online Surveys with Programmers. In 2021 IEEE/ACM 43rd International Conference on Software Engineering (ICSE). IEEE. https://dl.acm.org/doi/abs/10.1109/ICSE43902.2021.00057