Recruitment Problems
- sample size: we need more participants to have confidence in (and adequate statistical power for) the results
- industry experience
- domain knowledge
- EDI (equity, diversity, and inclusion) considerations
- incentives
- cost vs value
- ethical considerations
- e.g. oversampling women
- building reputation with a community (so they don’t reject your invites)
- self-selection bias (the same people always respond)
- privacy and IRB restrictions
- overfishing problem (returning to the same well)
- personal contacts are effective but limited in scale
- easy to get lots of data, but is the data reliable?
- e.g. the fadeout effect
- filtering and selection: verifying basic knowledge of X
- how to get people to complete the study
- they complete the survey carelessly
- they don’t take it seriously
- attention-check questions (see the filtering sketch below)
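To make the attention-check point concrete, here is a minimal filtering sketch in Python/pandas. It assumes a hypothetical responses.csv export with made-up column names (attn_1 for an instructed-response item, duration_s for completion time in seconds) and an arbitrary 120-second speed floor; a real study would tune both.

```python
import pandas as pd

# Hypothetical survey export: one row per respondent.
df = pd.read_csv("responses.csv")

# Instructed-response attention check, e.g. "Please select 'Strongly disagree'".
passed_attention = df["attn_1"] == "Strongly disagree"

# Respondents who finish implausibly fast probably clicked through;
# the 120-second floor is an example threshold, not a standard.
plausible_speed = df["duration_s"] >= 120

clean = df[passed_attention & plausible_speed]
print(f"kept {len(clean)} of {len(df)} responses; "
      f"{len(df) - len(clean)} flagged as careless")
```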
Recruitment Anecdotes
- Mechanical Turk: screening was not the problem so much as getting workers to “click through to the end”
- Carianne Pretorius, Prolific for design tasks
- Daniel Russo, Prolific
- Chris Brown, in-person attendance at meetups
- The North West Branch of the British Computer Society (Julian Bass, Peggy Gregory) ran this event in 2019, and I (Lucy H) found study partners through it. We presented our research proposals to local SE people/orgs and shared the challenges of research and publication. https://www.bcs.org/events/2019/july/wor…
A Workshop at ICSE
- is a good idea [ ]
- is a bad idea [ ]
Other suggestions
- a special issue at (Venue)
- a seminar or online invite-only working session
- …
Workshop Output
(what should be the result of this workshop)
- reviewer guidelines
- including guidance on when recruitment limitations are not grounds for rejection
- getting away from strict thresholds for sample size (see the power-analysis sketch after this list)
- best practices for online crowd-sourcing marketplaces
- task structure
- compensation
- identification of the “target population” on these marketplaces
- educational material for non-empirical researchers (build understanding)
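To ground the point about strict sample-size thresholds (in the spirit of Lakens’ “Sample Size Justification”, listed under Related Work), here is a minimal power-analysis sketch using statsmodels. It shows that the defensible n per group follows from the smallest effect size of interest, alpha, and desired power, not from a universal cutoff; the effect sizes below are illustrative.

```python
import math
from statsmodels.stats.power import TTestIndPower

# Required n per group for a two-sided independent-samples t-test
# at alpha = 0.05 and 80% power, across illustrative effect sizes.
analysis = TTestIndPower()
for d in (0.2, 0.5, 0.8):  # Cohen's d: small, medium, large
    n = analysis.solve_power(effect_size=d, alpha=0.05, power=0.80,
                             alternative="two-sided")
    print(f"d = {d}: n per group = {math.ceil(n)}")
```

The spread (roughly 26 to 394 per group) is the argument: a single fixed threshold cannot serve every study design.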
Keynote ideas
(who, what topics)
- Daniel Lakens, Eindhoven University of Technology, The Netherlands
- Stefan van der Stigchel, Utrecht University, The Netherlands
- a community development manager, e.g. at ROS, the Linux Foundation, etc.
- Chris Bird, Ciera Jaspan, and other researchers at MSFT, Google, JetBrains
- Successful industry-partnered researchers (e.g. Gorschek, Runeson)
- Jason Jacques, crowdwork expert at Cambridge
Submission formats
(tend to favor informal, 2-4 page submissions)
Schedule ideas
(1 day, breakouts, working sessions)
- add an industry session: how practitioners work with research and how they recruit
Workshop Name
- ROPES (Recruitment Of Participants in Experiments for Software)
- ??
Please add me to a potential workshop PC
(this section will be assumed to be a subset of the next section)
- name, affiliation, email
- Alex Bezzubov, JetBrains
- Alexander Serebrenik, Eindhoven University of Technology, a.serebrenik@tue.nl
Please contact me with updates
- name, affiliation, email
- Janja Garnbret, Utrecht, janja@example.com
- Chris Brown, Virginia Tech, dcbrown@vt.edu
- Lucy H
Related Work
(papers/sections of papers, and workshops on this topic people should be aware of)
- Danilova et al., “Do you Really Code? Designing and Evaluating Screening Questions for Online Surveys with Programmers”
- Feldt et al., “Four commentaries on the use of students and professionals in empirical software engineering experiments”
- Lakens, “Sample Size Justification”
- Baltes and Ralph, “Sampling in Software Engineering Research: A Critical Review and Guidelines”
- “Information Visualization Evaluation Using Crowdsourcing” - how to run better crowdsourcing evaluations
- “Microdiversions to improve task attention”
- Matthias Hirth, Jason Jacques, Peter Rodgers, Ognjen Scekic, and Michael Wybrow, “Crowdsourcing Technology to Support Academic Research”
- Jason T. Jacques and Per Ola Kristensson. 2021. “Studying Programmer Behaviour at Scale: A Case Study Using Amazon Mechanical Turk”. Programming 2021.
- Valentina Lenarduzzi, Oscar Dieste, Davide Fucci, Sira Vegas, “Towards a Methodology for Participant Selection in Software Engineering Experiments. A Vision of the Future”