This substudy is a quantitative content analysis [14] of help tickets logged in a participant management database during the enrollment phase of a larger RCT. The purpose of the main RCT was to test whether adding an online community to an automated internet-mediated walking intervention called Stepping Up to Health increased participant adherence to the intervention. The Stepping Up to Health intervention uses automated goal setting in conjunction with enhanced pedometers that objectively measure participants' daily walking, with the aim of encouraging increased physical activity [15]. Participants upload their recorded walking data to the Stepping Up to Health website through a USB port on the pedometer and can then view detailed graphs, tailored feedback on their walking, and tailored motivational messages.
Participants in the main clinical trial were randomized to a 16-week trial of either the basic Stepping Up to Health intervention (control group) or the basic intervention plus an online community (intervention group). As an incentive to participate, enrolled participants received either a $25 Amazon.com gift card or, if they were University of Michigan employees, $25 paid through the university payroll. In addition, participants received a free one-year subscription to walkingspree.com, a commercial version of the Stepping Up to Health intervention.
Human Subjects Protection
The University of Michigan Medical Institutional Review Board reviewed and approved the methods used in this investigation (UM IRB HUM00012230). A waiver of written informed consent was granted; participants indicated consent by clicking a button on the website after reading a short online informed consent document.
Main Clinical Trial Recruitment
One-page invitation letters were mailed to a random subset of 5,954 individuals ≥ 18 years of age who were treated at the University of Michigan Health System between August 2007 and January 2008 and who had a diagnosis of coronary artery disease, type 2 diabetes, or a BMI ≥ 25, as identified by the University of Michigan clinical data warehouse. Individuals with diagnosis codes for quadriplegia or paraplegia, or with pregnancy-related diagnoses or procedures within the previous year, were excluded from our sample. The invitations briefly described the study and directed recipients to a website that contained further information as well as directions on how to enroll.
To be eligible for this study, participants were required to be at least 18 years of age, sedentary, and capable of walking at least one block, and to have at least one of the following conditions: coronary artery disease, type 2 diabetes, or a BMI ≥ 25. Participants were also required to have internet access through a computer running Windows 2000, XP, or Vista with a USB port, and to self-report using email at least once weekly. Exclusion criteria were pregnancy and inability to communicate in English. Although the University of Michigan clinical data warehouse was used to identify individuals who were likely to be eligible, eligibility was ultimately based on participants' self-report of meeting all criteria.
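To make the self-reported eligibility logic concrete, the following is a minimal illustrative sketch in Python; it is not part of the study software, and all field and function names are hypothetical.

```python
# Hypothetical screening check mirroring the self-reported criteria above.
from dataclasses import dataclass

@dataclass
class ScreeningResponse:
    age: int
    sedentary: bool
    can_walk_one_block: bool
    has_cad: bool                 # coronary artery disease
    has_type2_diabetes: bool
    bmi: float
    compatible_windows_os: bool   # Windows 2000, XP, or Vista
    has_usb_port: bool
    checks_email_weekly: bool
    pregnant: bool
    communicates_in_english: bool

def is_eligible(r: ScreeningResponse) -> bool:
    qualifying_condition = r.has_cad or r.has_type2_diabetes or r.bmi >= 25
    internet_access = r.compatible_windows_os and r.has_usb_port and r.checks_email_weekly
    excluded = r.pregnant or not r.communicates_in_english
    return (
        r.age >= 18
        and r.sedentary
        and r.can_walk_one_block
        and qualifying_condition
        and internet_access
        and not excluded
    )
```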
Enrollment Process for Main Clinical Trial
The enrollment process for this RCT consisted of multiple steps. After receiving the recruitment letter containing the URL of the study website, potential participants visited the website to read more about the clinical trial. The introductory text further explained the study and indicated that participants would not only be given a pedometer to wear daily but would also be asked to upload pedometer data on a regular basis and to complete periodic surveys. The website also explained that, to join the study, participants would be required to 1) obtain medical clearance from a treating physician, 2) complete an initial online baseline survey, and 3) wear a blinded pedometer (step-count display covered) for seven days to collect baseline walking data and upload these data to the study server through the pedometer's USB port. Potential participants interested in enrolling were asked to provide an email address. To validate these addresses, submission of an email address triggered an automated email containing a link to the online screening survey, which interested individuals then followed and completed. Potential participants who met the eligibility criteria progressed to an online consent form, whereas ineligible individuals were thanked for their time. Once a potential participant consented, they were mailed a box of study materials that included a checklist of items in the shipment, study contact information, website login information, study enrollment procedures, pedometer instructions, download software instructions, disease-specific information on exercising with coronary artery disease and/or type 2 diabetes (if applicable), a copy of the consent form, a medical clearance form, and a blinded pedometer. Because outcome assessment was automated and participants were not blinded to the intervention, no portion of the trial other than the baseline pedometer-wearing period was blinded. Finally, as indicated on the study website, prior to randomization potential participants were required to complete three tasks: 1) obtain medical clearance from a treating physician and submit it to study personnel by fax; 2) complete the online baseline survey, which asked about baseline physical activity, social history, computer experience, and global health status; and 3) wear the blinded pedometer for seven days to collect baseline walking data and upload these data to the study server. Once these three tasks were completed, participants were randomized into the trial. This enrollment process was tested and refined in a pilot feasibility study before being implemented on a larger scale in the RCT.
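As a simple illustration of the three pre-randomization requirements described above, the sketch below (hypothetical names; not the study's actual enrollment system) checks whether a consented participant is ready to be randomized.

```python
# Minimal sketch (hypothetical field names) of the three pre-randomization tasks.
from dataclasses import dataclass

@dataclass
class EnrollmentRecord:
    medical_clearance_received: bool = False   # clearance form faxed by a treating physician
    baseline_survey_completed: bool = False    # online baseline survey submitted
    baseline_wear_days_uploaded: int = 0       # days of blinded pedometer data uploaded

def ready_for_randomization(rec: EnrollmentRecord, required_days: int = 7) -> bool:
    return (
        rec.medical_clearance_received
        and rec.baseline_survey_completed
        and rec.baseline_wear_days_uploaded >= required_days
    )
```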
To foster a vibrant online community and increase the likelihood that participants would use the feature, randomization into the control and intervention arms was unbalanced: participants had a 78% probability of being randomized into the intervention arm. Upon completion of the 16-week trial, participants received their $25 Amazon gift card or $25 through payroll, as well as the one-year subscription to walkingspree.com.
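For illustration only, the unbalanced allocation could be represented as a simple Bernoulli draw, as in the following Python sketch; this is an assumption about the mechanism, and the trial's actual randomization procedure (e.g., blocked randomization) may have differed.

```python
# Illustrative only: unbalanced allocation as a Bernoulli draw with
# P(intervention) = 0.78.
import random

def allocate_arm(p_intervention: float = 0.78) -> str:
    return "intervention" if random.random() < p_intervention else "control"
```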
Substudy Description
The purpose of this substudy was to identify the obstacles potential participants encountered during the enrollment phase of the RCT. We analyzed correspondence between study staff and potential participants documented in help tickets within our participant management system. During enrollment, communication between potential participants and the study team occurred by phone and/or email; each time a potential participant contacted the study team, or vice versa, the participant management system created an electronic help ticket documenting the initial communication and all subsequent correspondence on that topic. These help tickets were stored in a database, with each ticket linked to a potential participant by a unique participant ID number. Participants typically initiated correspondence when they had questions or concerns and wanted to speak with study staff. Study staff typically initiated correspondence to follow up with participants who had not completed all steps required for randomization, or when there was a problem with submitted information. Because separate issues generated separate help tickets, an individual participant could be associated with multiple tickets. Help tickets included in this analysis were created between the time potential participants first responded to the recruitment letter by visiting the study website and the time they were randomized into the trial.
To manage help tickets, the study protocol dictated that new incoming tickets be checked at least twice a day on weekdays (once in the morning and once in the afternoon) and at least once per day on weekends. Study staff responded to new correspondence as swiftly as possible during the week regardless of urgency, and urgent messages received an immediate response on weekends. Because participant ID numbers were not assigned until individuals consented to participate, it was not always possible to tell whether multiple tickets concerning the same issue had been sent by the same potential participant. Whenever possible, tickets known to have originated from the same person and to concern the same issue were merged into a single ticket by study staff using their best judgment. To facilitate appropriate merging, the number of study staff responsible for responding to help tickets was intentionally kept small. After consent but prior to randomization, new incoming help tickets from a participant were compared with already-logged tickets to determine whether the new correspondence concerned a new issue or was a follow-up to a previous ticket.
For this substudy, each help ticket was coded according to who initiated the correspondence (staff vs. potential participant), the mode of communication used to initiate the correspondence, the mode of communication used to resolve the issue, and the presence or absence of different themes discussed in the help ticket. Help ticket themes were developed a priori by study staff. A total of 623 help tickets were logged during this sampling period.
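To illustrate how the coded ticket data could be represented for analysis, the following hypothetical Python sketch defines a record with the coded fields described above and tallies tickets per participant, the count later used as the outcome in the Poisson regression; field names and values are illustrative and do not reproduce the study's codebook.

```python
# Hypothetical representation of a coded help ticket; the actual codes resided
# in NVivo and are not reproduced here.
from collections import Counter
from dataclasses import dataclass

@dataclass
class CodedTicket:
    participant_id: str     # links the ticket to a potential participant
    initiator: str          # "staff" or "participant"
    initiation_mode: str    # e.g., "phone" or "email"
    resolution_mode: str    # mode of communication used to resolve the issue
    themes: frozenset       # presence/absence of a priori themes

def tickets_per_participant(tickets: list) -> Counter:
    """Count help tickets associated with each potential participant."""
    return Counter(t.participant_id for t in tickets)

# Example: two tickets for P001, one for P002.
sample = [
    CodedTicket("P001", "participant", "email", "email", frozenset({"technical"})),
    CodedTicket("P001", "staff", "phone", "phone", frozenset({"medical clearance"})),
    CodedTicket("P002", "participant", "phone", "email", frozenset({"eligibility"})),
]
print(tickets_per_participant(sample))   # Counter({'P001': 2, 'P002': 1})
```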
Intercoder Reliability and Data Analysis for Substudy
Coding of help tickets was completed using NVivo 8 (QSR International Pty Ltd, Doncaster, Victoria, Australia), and quantitative data analysis was conducted using Stata 10 (StataCorp LP, College Station, TX, USA). To establish intercoder reliability, two of the authors, both of whom were involved in intervention delivery, analyzed a 10% subsample of help tickets. The predetermined cut-point for interrater reliability was a Cohen's kappa of 0.80 for each coded variable. We used descriptive statistics to characterize the help tickets and Poisson regression to identify predictors of the number of help tickets associated with each potential participant.
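As a worked illustration of the reliability criterion (not the NVivo/Stata workflow actually used), the Python sketch below computes Cohen's kappa for one binary code applied by two raters to the same subsample and compares it with the 0.80 cut-point; the rating vectors are made-up example data.

```python
# Cohen's kappa: (observed agreement - chance agreement) / (1 - chance agreement).
from collections import Counter

def cohens_kappa(rater1: list, rater2: list) -> float:
    assert len(rater1) == len(rater2) and rater1
    n = len(rater1)
    observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    counts1, counts2 = Counter(rater1), Counter(rater2)
    categories = set(rater1) | set(rater2)
    expected = sum((counts1[c] / n) * (counts2[c] / n) for c in categories)
    if expected == 1.0:   # perfect chance agreement; kappa is undefined
        return 1.0
    return (observed - expected) / (1 - expected)

# Example (hypothetical ratings): kappa = 0.80, meeting the predetermined cut-point.
r1 = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
r2 = [1, 0, 1, 0, 0, 1, 0, 0, 1, 1]
print(cohens_kappa(r1, r2) >= 0.80)   # True
```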