Gaby Goldstein, JD, PhD, Director of Research at SDAN;
Mallory Roman, PhD, Associate Director of Research at SDAN
In 2018, we ran two studies in partnership with Dr. Katherine Haenschen at Virginia Tech to explore the best ways to get volunteers to RSVP and attend team events. This post synthesizes the main findings of those studies. For specifics about each study, you can find recruitment study results here and confirmation study results here.
Personally contacting volunteers more than doubled RSVP and attendance rates compared with email alone: RSVP rates increased by 168% and attendance rates by 177% over just sending emails asking people to come. Our findings indicate that calling volunteers and leaving a voicemail, calling plus texting, and texting alone are all more effective than email alone if you want people to RSVP and then show up to your team activity!
Personally contacting volunteers to confirm their plan to attend an event (confirming their RSVP) increased attendance rates by 51% over sending only a confirmation email. Again, calling and leaving a voicemail, calling plus texting, and texting alone were all more effective than email alone at getting volunteers who had already RSVP'd to actually show up!
Caveats: Both studies recruited Sister District volunteers only, so the results cannot be generalized beyond that population. This was also our first investigation into event recruitment and confirmation, and both studies were underpowered; we'll need to replicate them (run them again) to see whether the results are reliable. We plan to replicate these or very similar studies in 2019.
SDAN’s commitment: It is SDAN’s intention to provide as much context as possible to allow for nuanced interpretation of our data. SDAN’s convention is to contextualize effects by reporting p values, confidence intervals, and standardized/contextualized effect sizes for all models tested (see the linked reports if these metrics are missing from the blog). Additionally, SDAN always differentiates between planned and exploratory analyses and between a priori and post hoc tests, and reports the results of all planned analyses regardless of statistical significance. These findings were peer reviewed by a subset of the Sister District Data and Research team composed of senior-level statisticians, called the Quantitative Advisory Committee. If you are interested in joining the Quantitative Advisory Committee, please email Mallory.