
Email recruitment for chronic pain clinical trials: results from the LAMP trial

Abstract

Background

Recruitment for clinical trials and large-scale studies is challenging, especially for patients with complex conditions like chronic pain. Email recruitment has the potential to increase efficiency, to reduce costs, and to improve access for underrepresented patient populations. The objective of this study was to examine the effectiveness, efficiency, and equitability of email versus postal mail recruitment for the Learning to Apply Mindfulness to Pain (LAMP) study, a three-site clinical trial of mindfulness-based interventions for chronic pain.

Methods

Patients with chronic pain diagnoses were recruited from three United States Department of Veterans Affairs (VA) facilities using the VA electronic health record (EHR). Recruitment materials were sent using either postal mail (n = 7986) or email (n = 19,333). Patients in the email recruitment group were also mailed introductory postcards before any emails. Mailing addresses and email addresses were obtained from the EHR. Effectiveness was measured by the response rate of patients who logged into the secure LAMP study website. Efficiency was measured by the number of days from when the recruitment materials were sent to when patients logged into the LAMP portal as well as the estimated costs of each recruitment approach. To assess equitability, we examined whether email recruitment was less effective for underrepresented populations, based on demographic information from the EHR.

Results

Effectiveness: unadjusted response rates were greater for email versus postal mail recruitment (18.9% versus 6.3%), and adjusted response rates were over three times greater for email recruitment (RR = 3.5, 95% CI 3.1–3.8) in a multivariable analysis controlling for age, gender, race, ethnicity, rurality, and site. Efficiency: email recruitment had a markedly shorter median response time (1 day versus 8 days) and a lower cost. Equity: email recruitment led to higher response rates in all subpopulations, including older, non-White, Hispanic, rural, and female Veterans.

Conclusions

Email recruitment is an effective, efficient, and equitable way to recruit VA patients to large-scale, chronic pain clinical trials.

Trial registration

Clinical Trial Registration Number: NCT04526158. Patient enrollment began on December 4, 2020.


Background

Clinical trial recruitment is an active area of study due to its importance in contributing to the success of clinical trials as well as its many practical challenges [1]. Clinical trials with ineffective recruitment efforts can lead to underpowered or failed studies [2] and can have significant financial and ethical implications [3]. Clinical trials often have difficulty recruiting underrepresented patient groups, resulting in study populations that do not reflect the targeted populations [4,5,6]. Chronic pain clinical trials, in particular, often have difficulty recruiting sufficient sample sizes and recruiting underrepresented patient groups, yet very few studies have investigated the success of different recruitment methods for chronic pain clinical trials [7].

In recent years, digital approaches to clinical trial recruitment (e.g., email, text messages, websites, and social media) have been compared to more traditional approaches (e.g., mailing, phone calls, newspaper advertisements, and media campaigns). The results in terms of response rates, costs, time to recruit participants, and access have been mixed, depending on the specific details of how the digital recruitment tools were implemented [8, 9]. However, several studies have shown that combining digital and traditional recruitment tools may have the potential to improve recruitment outcomes [10, 11].

The Learning to Apply Mindfulness to Pain (LAMP) study is a three-site clinical trial testing the effectiveness of mindfulness-based interventions (MBIs) for chronic pain. Patients with moderate to severe chronic pain were recruited from three U.S. Veterans Affairs facilities and randomly assigned to two intervention groups (Group MBI and Self-paced MBI), which were compared against usual care. The primary outcome was change in the Brief Pain Inventory (BPI) interference score at 10 weeks, 6 months, and 12 months. Secondary outcomes included changes in pain intensity, global improvement in pain, anxiety, depression, fatigue, post-traumatic stress disorder, physical function, sleep disturbance, and participation in social roles and activities. Additional details can be found in the study protocol paper [12].

Partway through the LAMP study, we switched from traditional postal recruitment to email recruitment, which allowed us to compare the two recruitment modalities in terms of equity, efficiency, and effectiveness in a United States Department of Veterans Affairs (VA) population.

Methods

Patients were recruited from the Minneapolis VA Health Care System (MVAHCS), Durham VA Health Care System (DVAHCS), and VA Greater Los Angeles Healthcare System (VAGLAHS) if their electronic health record (EHR) showed qualifying pain diagnoses on at least two occasions within the same pain category, at least 90 days apart, during the previous two years [12]. The qualifying pain categories were defined using the International Classification of Diseases, Tenth Revision, Clinical Modification (ICD-10-CM) diagnostic codes for common pain conditions [13]. To ensure generalizability of the pragmatic clinical trial, minimal exclusion criteria were used. This study is part of the LAMP trial, which was approved by the VA Central Institutional Review Board.
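
For illustration, the diagnosis-based eligibility rule above can be sketched as follows; the data structure and function name are hypothetical and do not represent the study's actual EHR extraction code.

```python
from datetime import date, timedelta

def meets_pain_criterion(diagnoses, index_date):
    """Illustrative check of the qualifying-pain rule: at least two diagnoses in the
    same pain category, at least 90 days apart, within the two years before index_date.
    `diagnoses` is a list of (diagnosis_date, pain_category) tuples; this structure is
    an assumption for the sketch, not the study's actual data model."""
    window_start = index_date - timedelta(days=730)
    dates_by_category = {}
    for dx_date, category in diagnoses:
        if window_start <= dx_date <= index_date:
            dates_by_category.setdefault(category, []).append(dx_date)
    return any(
        len(dates) >= 2 and (max(dates) - min(dates)).days >= 90
        for dates in dates_by_category.values()
    )

# Example: two diagnoses in the same pain category, 120 days apart, within the window
example = [(date(2020, 1, 10), "back pain"), (date(2020, 5, 9), "back pain")]
print(meets_pain_criterion(example, index_date=date(2020, 9, 1)))  # True
```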

Recruitment materials were sent to six separate waves of patients, recruited at different times between 2020 and 2022. Each wave was a random sample of potentially eligible men and women from the recruitment sites included at that time. Part of the recruitment occurred during the COVID-19 pandemic. The first two waves used postal recruitment only, but we switched to email recruitment for the remaining waves because of rising printing and mailing costs; we also felt that email recruitment might integrate better with the technology focus of the study. Women were oversampled from the identified population to achieve approximately equal numbers of men and women randomized into the trial. Patients were sent recruitment materials by either postal mail or email, depending on their wave of recruitment, and their postal and email addresses were obtained from the VA EHR. The number of recruitment packets sent in each wave was based on an estimated response rate that would efficiently fill the intervention session times, and we ensured that the group sizes for each wave would not exceed the capacity of the Group MBI intervention facilitators.

Patients in the postal recruitment group, waves 1 and 2, were mailed a packet of documents that included information about the study and instructions for accessing the study website. The mailed packet included an optional quick response (QR) code that could be used to simplify access to the study website. Patients were also given contact information and a prefilled postcard they could return to opt out of the study. Patients who logged into the study website using a study-specific identifier were prompted to complete the study screener. All patients in the postal recruitment group were from the MVAHCS site.

Patients in the email recruitment group, waves 3–6, were first mailed an introductory postcard, a requirement of our Institutional Review Board (IRB). The postcard notified patients that they would receive an email about participation in the study and provided contact information and a website link to opt out of the study. Approximately one week after the postcard was mailed, patients were sent an email containing the same information as the packet of documents sent to the postal recruitment group. No one who was sent email recruitment materials requested paper documents. Waves 3 and 4 were patients from the MVAHCS and DVAHCS sites; waves 5 and 6 were mainly patients from the DVAHCS and VAGLAHS sites.

Reminder postcards were mailed to non-responders in waves 1–3, and reminder emails were sent to non-responders in waves 3, 5, and 6. Reminder emails were not sent to patients in wave 4 because the maximum number of participants who could be included for that wave had already been reached. In line with the pragmatic nature of the study, the reminder methods changed over the different waves as we tried to improve recruitment strategies.

Effectiveness was measured by the response rate of patients to the recruitment materials, where a response was defined as a patient logging into the study website using their study-specific identifier. No more than a single response per patient was recorded in the dataset. Logging into the study website was chosen as a response, as opposed to completing the study screener, because patients may exit the screener early for reasons that do not reflect their engagement with the recruitment method, such as inclusion and exclusion criteria. We then calculated a ratio of email to postal response rates by dividing the email response rate by the postal response rate.
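
For illustration, the unadjusted response rates and their ratio could be computed as in the following sketch, assuming a per-patient table with hypothetical file and column names (`group`, `responded`):

```python
import pandas as pd

# Hypothetical per-patient table: one row per patient sent recruitment materials,
# with the recruitment group ("email" or "postal") and a 0/1 indicator of whether
# the patient logged into the study website (at most one response per patient).
patients = pd.read_csv("lamp_recruitment.csv")  # hypothetical file name

# Unadjusted response rate for each recruitment group
rates = patients.groupby("group")["responded"].mean()

# Ratio of email to postal response rates (the reported unadjusted rates,
# 18.9% and 6.3%, give a ratio of about 3.0)
rate_ratio = rates["email"] / rates["postal"]
print(rates.round(3), round(rate_ratio, 2))
```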

Efficiency was measured by the time for patients to respond to the recruitment materials as well as the difference in estimated costs. Response time was calculated as the number of days from mailing the packet of study documents (postal group) or sending the email (email group) to logging into the study website. The postcard mailing date was not included in this calculation because patients were unable to log into the study website until they received either the mailed packet or the email. To compare response times between email and postal recruitment, we generated box and whisker plots by recruitment strategy. The cost for postal recruitment materials included printing and mailing the ten items in the mailed recruitment packet. The cost for email recruitment materials was primarily the cost of the postcards.
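
A minimal plotting sketch consistent with the Fig. 2 caption (whiskers at the 2.5th and 97.5th percentiles); the response-time values below are placeholders, not study data.

```python
import matplotlib.pyplot as plt

# Placeholder response times in days (illustrative values only): days from sending
# the recruitment materials to logging into the study website.
postal_days = [5, 6, 7, 8, 8, 9, 11, 15, 22, 30]
email_days = [0, 0, 1, 1, 1, 2, 3, 5, 9, 14]

fig, ax = plt.subplots(figsize=(6, 2.5))
# whis=(2.5, 97.5) places the whiskers at the 2.5th and 97.5th percentiles,
# matching the convention described in the Fig. 2 caption.
ax.boxplot([postal_days, email_days], vert=False, whis=(2.5, 97.5), showfliers=False)
ax.set_yticklabels(["Postal", "Email"])
ax.set_xlabel("Days from sending materials to website login")
plt.tight_layout()
plt.show()
```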

Equity was based on an analysis of response rates by recruitment method across key demographic groups. We coded age, gender, race, ethnicity, and rurality based on the patient’s entry in the VA EHR. VA rurality data is based on the Rural–Urban Commuting Areas (RUCA) system, which classifies United States census tracts using measures of urbanization, population density, and daily commuting. We also conducted a multivariable analysis of response rates controlling for age, gender, race, ethnicity, rurality, and site.
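
The paper does not detail the regression model; one standard way to obtain adjusted risk ratios for a binary response is modified Poisson regression with robust standard errors, sketched below with hypothetical file and column names.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical per-patient table; the file and column names are assumptions for this sketch.
patients = pd.read_csv("lamp_recruitment.csv")

# Modified Poisson regression (log link) with robust (sandwich) standard errors is a
# common way to estimate adjusted risk ratios for a binary outcome such as
# "logged into the study website", controlling for the covariates listed above.
model = smf.glm(
    "responded ~ C(group, Treatment('postal')) + age + C(gender) + C(race)"
    " + C(ethnicity) + C(rurality) + C(site)",
    data=patients,
    family=sm.families.Poisson(),
).fit(cov_type="HC1")

# Exponentiate the email-group coefficient to obtain the adjusted risk ratio and 95% CI.
print(np.exp(model.params.filter(like="group")))
print(np.exp(model.conf_int().filter(like="group", axis=0)))
```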

Results

We identified 121,441 potential participants from the VA EHR and sent postal mail recruitment materials to 7986 patients and email recruitment materials to 19,333 patients (Fig. 1). Table 1 shows the demographic information for the patients sent recruitment materials. Because the waves drew on sites with different Veteran demographics, the patients sent email materials were younger, more likely to be female, more racially and ethnically diverse, and less rural.

Fig. 1 Recruitment flow diagram

Table 1 Baseline characteristics of patients sent recruitment materials

Effectiveness

Unadjusted response rates were higher for email recruitment (18.9%) compared to postal mail recruitment (6.3%). Additionally, in a multivariable analysis controlling for age, gender, race, ethnicity, rurality, and site, the adjusted response rates were over three times greater for email recruitment (RR = 3.5, 95% CI 3.1–3.8).

Most non-responders did not contact the study team. However, 1240 (15.5%) patients in the postal group actively refused the screener, mostly by returning opt-out postcards, compared with 289 (1.2%) patients in the email group. Recruitment materials were returned to the study team as undeliverable for 314 (3.9%) patients in the postal group and 521 (2.7%) patients in the email group. A small number of patients in both groups, or their family members, contacted the study team to inform us that the patient was ineligible or deceased. Additionally, a few patients were determined to be ineligible or deceased based on a chart review performed by the study team approximately six weeks after the recruitment materials were sent.

Following initial recruitment, 1524 of the 3662 (42%) responders in the email group and 213 of the 506 (42%) responders in the postal group completed the baseline survey, indicating that recruitment method did not negatively affect engagement in subsequent study activities. We ultimately randomized 667 (18%) responders from the email group and 144 (28%) responders from the postal mail group into the trial. Because of the unexpectedly high effectiveness of email recruitment, the maximum capacity of the intervention sessions was reached in later waves, and some eligible patients were not randomized to an intervention group in that wave but were instead included in the next recruitment wave.

Efficiency

The time to respond to the recruitment materials was much shorter for email than for postal mail recruitment (Fig. 2). The median time to respond was 1 day for the email group compared to 8 days for the postal group; many patients in the email group responded the same day the recruitment email was sent. Printing and mailing recruitment materials to the 19,333 patients in the email group would have cost approximately $2.33 per patient, so email recruitment saved approximately $45,000 in printing and postage. Additionally, the personnel time required to prepare and send the recruitment materials was estimated at 130–200 h for the 7986 people in the postal group and 5–20 h for the 19,333 people in the email group.
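
The roughly $45,000 figure is consistent with the stated per-packet estimate; a quick back-of-envelope check:

```python
# Back-of-envelope check of the avoided printing and postage cost cited above.
n_email_group = 19_333      # patients sent recruitment materials by email
cost_per_packet = 2.33      # estimated printing + mailing cost per packet (USD)

avoided_cost = n_email_group * cost_per_packet
print(f"Estimated avoided cost: ${avoided_cost:,.0f}")  # about $45,000
```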

Fig. 2 Response time following email or postal mail recruitment materials. The median is indicated by the vertical line, the interquartile range by the box, and the 2.5th and 97.5th percentiles by the whiskers

Equity

Email recruitment had a higher unadjusted response rate than postal mail recruitment for every age, gender, race, ethnicity, and rurality category (Table 2). The unadjusted ratio of email to postal response rates was at least 2.2 for each subpopulation, and the highest unadjusted ratio was 7.1 for Black patients.

Table 2 Response rates

Discussion

We found email recruitment to be an effective, efficient, and equitable way to recruit VA patients to the LAMP study. The time to respond was consistently shorter for patients in the email group; the median response time of 1 day was shorter than the minimum estimated time to deliver postal mail. Response rates were higher for email recruitment overall and across individual subpopulations, including for older, non-White, Hispanic, rural, and female Veterans. Additionally, Black and multiracial patients had the largest ratios of email to postal response rates, highlighting the potential of email recruitment to reach populations often underrepresented in research.

The introductory postcards mailed to patients in the email recruitment group may have increased response rates by combining the benefits of digital and traditional recruitment methods, as has also been reported in other non-chronic-pain clinical trials [10, 11]. We heard from members of the study’s Veteran Engagement Panel, a diverse group of Veterans with chronic pain, that the introductory postcards lent credibility to the recruitment email, making it less likely to be disregarded, deleted, or marked as spam. Additionally, the recruitment email made it easy for patients to click on a link to access the study website, which required less effort than for patients who received the mailed documents and had to manually enter their login information or use a QR code. Overall, email recruitment combined with introductory postcards improved recruitment outcomes and reduced burden for both study staff and potential participants.

There were limitations to this study. We were not able to conduct a randomized controlled trial, which would have been the gold-standard method to evaluate postal versus email recruitment. Because of the pragmatic nature of the clinical trial, different waves had different recruitment methods and were recruited at different times from different sites, resulting in groups with different demographics. Response rates may have been affected by the different phases of the pandemic and other external factors at the time each wave was recruited. Additionally, postal mail recruitment was only tested with patients from the Minneapolis VA site. Nevertheless, the multivariable analysis showed that response rates were greater for email recruitment after controlling for site (as well as age, gender, race, ethnicity, and rurality). Email-only recruitment (i.e., without postcards) was not tried with any of the waves, as this was not permitted by our Institutional Review Board (IRB). This study examined only VA patients, and the outcomes of email and postal mail recruitment might differ for non-VA populations. Other demographic factors that could affect recruitment, such as educational status, socioeconomic status, and household income, were not available for analysis. Also, the study required interested participants to sign into a website, which may have been easier for those who received recruitment materials by email. We did not track the time required for email and postal recruitment and instead used an estimate. Finally, recruitment materials were sent during the COVID-19 pandemic, when people spent more time at home and might have been more likely to respond to recruitment materials.

Conclusions

Email recruitment is an effective, efficient, and equitable way to recruit VA patients to large-scale, chronic pain clinical trials. Printing and mailing costs and personnel time were also much lower for email recruitment. Future studies are needed to further explore how email recruitment affects groups who do not have regular access to email via computer or smartphone. As more VA studies consider using electronic recruitment and data collection, it will be important to ensure that all Veterans have access to resources that enable them to participate in VA research.

Availability of data and materials

The datasets used and/or analyzed during the current study are available from the corresponding author on reasonable request.

Abbreviations

BPI: Brief Pain Inventory
COVID-19: Coronavirus disease 2019
DVAHCS: Durham VA Health Care System
EHR: Electronic health record
ICD-10-CM: International Classification of Diseases, Tenth Revision, Clinical Modification
IRB: Institutional Review Board
LAMP: Learning to Apply Mindfulness to Pain
MBI: Mindfulness-based intervention
MVAHCS: Minneapolis VA Health Care System
QR: Quick response
VA: Veterans Affairs
VAGLAHS: VA Greater Los Angeles Healthcare System

References

  1. Treweek S, Pitkethly M, Cook J, Fraser C, Mitchell E, Sullivan F, Jackson C, Taskila TK, Gardner H. Strategies to improve recruitment to randomised trials. Cochrane Database Syst Rev. 2018;2(2):MR000013.

  2. McDonald AM, Knight RC, Campbell MK, Entwistle VA, Grant AM, Cook JA, Elbourne DR, Francis D, Garcia J, Roberts I, Snowdon C. What influences recruitment to randomised controlled trials? A review of trials funded by two UK funding agencies. Trials. 2006;7(1):1–8.


  3. Gul RB, Ali PA. Clinical trials: the challenge of recruitment and retention of participants. J Clin Nurs. 2010;19(1–2):227–33.


  4. Bonevski B, Randell M, Paul C, Chapman K, Twyman L, Bryant J, Brozek I, Hughes C. Reaching the hard-to-reach: a systematic review of strategies for improving health and medical research with socially disadvantaged groups. BMC Med Res Methodol. 2014;14:1–29.


  5. Ellard-Gray A, Jeffrey NK, Choubak M, Crann SE. Finding the hidden participant: solutions for recruiting hidden, hard-to-reach, and vulnerable populations. Int J Qual Methods. 2015;14(5):1609406915621420.


  6. Chau AJ, Sudore RL, Hays RD, Tseng CH, Walling AM, Rahimi M, Gibbs L, Patel K, Sanz Vidorreta FJ, Wenger NS. Telephone outreach enhances recruitment of underrepresented seriously ill patients for an advance care planning pragmatic trial. J Gen Intern Med. 2023;30:1–6.


  7. Kennedy N, Nelson S, Jerome RN, Edwards TL, Stroud M, Wilkins CH, Harris PA. Recruitment and retention for chronic pain clinical trials: a narrative review. PAIN Reports. 2022;7(4):e1007.


  8. Brøgger-Mikkelsen M, Ali Z, Zibert JR, Andersen AD, Thomsen SF. Online patient recruitment in clinical trials: systematic review and meta-analysis. J Med Internet Res. 2020;22(11):e22179.


  9. Tan RK, Wu D, Day S, Zhao Y, Larson HJ, Sylvia S, Tang W, Tucker JD. Digital approaches to enhancing community engagement in clinical trials. NPJ Digit Med. 2022;5(1):37.


  10. Frampton GK, Shepherd J, Pickett K, Griffiths G, Wyatt JC. Digital tools for the recruitment and retention of participants in randomised controlled trials: a systematic map. Trials. 2020;21(1):1–23.


  11. Murphy CC, Craddock Lee SJ, Geiger AM, Cox JV, Ahn C, Nair R, Gerber DE, Halm EA, McCallister K, Skinner CS. A randomized trial of mail and email recruitment strategies for a physician survey on clinical trial accrual. BMC Med Res Methodol. 2020;20:1–7.


  12. Burgess DJ, Evans R, Allen KD, et al. Learning to Apply Mindfulness to Pain (LAMP): design for a pragmatic clinical trial of two mindfulness-based interventions for chronic pain. Pain Med. 2020;21(2):S29-36.


  13. Mayhew M, DeBar LL, Deyo RA, Kerns RD, Goulet JL, Brandt CA, Von Korff M. Development and assessment of a crosswalk between ICD-9-CM and ICD-10-CM to identify patients with common pain conditions. J Pain. 2019;20(12):1429–45.



Acknowledgements

We thank our LAMP study Veteran Engagement Panel (VEP) for its continued feedback.

Funding

The U.S. Army Medical Research Acquisition Activity, 820 Chandler Street, Fort Detrick, MD 21702-5014, is the awarding and administering acquisition office. This work was supported by the Assistant Secretary of Defense for Health Affairs endorsed by the Department of Defense, through the Pain Management Collaboratory—Pragmatic Clinical Trials Demonstration Projects under Award No. W81XWH-18-2-0003. Research reported in this publication was supported by Grant Number U24AT009769 from the National Center for Complementary and Integrative Health (NCCIH) and the Office of Behavioral and Social Sciences Research (OBSSR). This material is the result of work supported with resources at the Minneapolis VA Health Care System, Durham VA Health Care System, and VA Greater Los Angeles Healthcare System. This work is a product of the Pain Management Collaboratory. For more information about the Collaboratory, visit https://painmanagementcollaboratory.org/. Opinions, interpretations, conclusions, and recommendations are those of the authors and are not necessarily endorsed by the Department of Defense, NCCIH, OBSSR, National Institutes of Health, U.S. Department of Veterans Affairs, or United States Government.

Author information

Contributions

AB, LJSC, KB, MB, and SH collected participant data. JEF, EHC, AB, LJSC, CC, JKF, BCT, and DJB analyzed and interpreted the data. JEF wrote the first draft of the manuscript, and all authors edited and revised the manuscript. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Lee J. S. Cross.

Ethics declarations

Ethics approval and consent to participate

A waiver of documentation of informed consent and the data collection reported herein were approved by the VA Central Institutional Review Board (IRB) (project 18-21). Participants were provided an information sheet about the study, which included the required consent form language and instructions for contacting study staff with any questions.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

Ferguson, J.E., Hagel Campbell, E., Bangerter, A. et al. Email recruitment for chronic pain clinical trials: results from the LAMP trial. Trials 25, 491 (2024). https://doi.org/10.1186/s13063-024-08301-8
