
Is this study feasible? Facilitating management of pragmatic trial planning milestones under a phased award funding mechanism

Abstract

Background

Improving efficiencies in clinical research is crucial to translation of findings into practice and delivery of effective, patient-centered health care. This paper describes a project that monitored pragmatic clinical trials by working with investigators to track achievement of early phase milestones. The National Institutes of Health (NIH) Pragmatic Trials Collaborative Project supported scientifically diverse, low-cost, randomized, controlled, pragmatic clinical intervention trials. Funds were available through a cooperative agreement award mechanism, with the initial phase supporting trial planning and the subsequent 4-year awards funding trial implementation. A coordinating center provided evaluation and administrative support, which included capturing progress toward achieving milestones.

Methods

Six funded trials participated in monthly calls throughout the first year to identify metrics for each milestone in the Notice of Grant Award and the deliverables that would demonstrate their achievement. Interviews were conducted with investigators, trial team members, and NIH program officers/project scientists to discuss their perceptions of the impact and value of the management strategy.

Results

Five of the six trials transitioned to the implementation phase, with 6 to 15 milestones and 15 to 33 quantifiable metrics per trial, for a total of 121 deliverables. About one third of the metrics (42 of 121, 35%) were trial-specific. Trial teams reported that the oversight was at times onerous but complemented their management strategies; program officers/project scientists found the documentation submitted for review sufficient to assess trial feasibility; and investigators reported advantages of the phased award mechanism, such as leverage to secure commitments from stakeholders and collaborators, help with task prioritization, and earlier consultation with key members of the trial team.

Conclusions

Implementing systematic approaches to identify milestones and track metrics can strengthen the evidence base regarding time and effort to plan and conduct pragmatic clinical trials. Investigators were unaccustomed to producing evidence of performance, and it was challenging to determine what documentation to provide. Efforts to standardize expectations regarding milestones that mark a significant change or stage in trial development or that represent minimum success criteria may provide guidance for more effective and efficient trial management. A framework with clearly specified metrics is especially critical for transparency, particularly when funding decisions are contingent on both merit and feasibility.


Background

Improving efficiencies across all phases and types of clinical research is crucial to accelerating translation of findings into practice, leading to better delivery of effective, patient-centered care [1,2,3]. The complexities of conducting clinical trials are well known [4, 5], and numerous strategies at multiple levels have been proposed or adopted to address challenges with research design and conduct [6]. This paper describes outcomes of a unique National Institutes of Health (NIH) project that provided management and coordination support for a set of pragmatic clinical trials (RFA-HL-14-019) by working closely with principal investigators (PIs) during the early phase of the trial to identify and track achievement of explicit trial planning milestones.

The NIH Pragmatic Trials Collaborative Project, initiated in 2014 to support scientifically diverse, low-cost, patient-centered, randomized, controlled, pragmatic clinical intervention trials, incorporated several strategies to ensure optimal trial planning and conduct and to promote early identification of potential threats to trial success [7]. The first is the use of cooperative agreements, wherein NIH program officers (POs) and project scientists (PSs) work jointly with the PIs to serve as a resource and provide scientific guidance throughout the life cycle of the trial. Under this cooperative agreement, members of the project participated in joint activities to gain a better understanding of the struggles and successes of trial planning, explore the significance of stakeholder engagement and other factors, and anticipate potential challenges to meeting patient accrual and data management objectives.

The second strategy to enhance the likelihood of trial success is the phased award mechanism, increasingly used across NIH in recent years, which incorporates processes to identify early phase (i.e., first year) milestones and trials at risk. Funds for the trial implementation phase (i.e., subsequent 4 years) are contingent on administrative review of milestone achievement. Milestones are the qualitative benchmarks of accomplishment of essential goals, and most require a sequence of steps that collectively represent milestone achievement. All trials were required to complete the identified planning milestones within the early phase time period. The milestones reflect the critical start-up steps as delineated in the application, and are incorporated in the Notice of Grant Award (NoGA). The NIH has utilized various approaches to the phased award mechanism, including varying the length of time allotted to complete the planning milestones. Under the NIH Pragmatic Trials Collaborative Project described in this paper, activities expected to be accomplished in the early phase (approximately 12 months) included refinement of existing resources, further development of study partnerships, and finalization of trial protocols. There were sufficient funds for full implementation of all trials. To advance the sponsor’s interest in pragmatic trial designs and determine whether they can help to bridge the translation gap, a companion award (RFA-HL-14-020) was made to a coordinating center (awarded to Westat, an employee-owned research organization headquartered in Rockville, MD, USA) to evaluate the funded trials from a process and operational view, particularly during the planning phase, which included assembling appropriate documentation for the administrative review conducted by the NIH [8]. Figure 1 illustrates this process and timeline.

Fig. 1 Planning phase: process flowchart

Pragmatic trials conducted in real-world settings have design features that distinguish them from more explanatory trials [9,10,11,12,13,14,15,16]. Given the likelihood of additional unanticipated challenges that may be encountered as investigators secure buy-in from stakeholders and gatekeepers, test feasibility of systems and data collection methods pertaining to primary outcomes, and confirm availability of patient populations, understanding the management of critical start-up activities in the early phase of more pragmatic trials may be especially relevant to trial designers, sponsors, and research partners [17,18,19,20]. While considerable research exists on milestones or features associated with traditional clinical trials [21, 22], only recently have efforts been undertaken to systematically capture critical factors, contingencies, and timelines associated with trial planning for more pragmatic research [23]. Furthermore, the trials funded under this specific initiative had the additional requirement of a lower cost budget compared to many other funding opportunities, such that factors related to management efficiency, workflow, and resource utilization were even more critical [19]. With the support and cooperation of the awardees and their POs, the additional management support provided by the coordinating center facilitated a learning and collaborative platform and offered an opportunity to capture and share lessons learned regarding identifying evidence of achievement of planning milestones under this phased award.

Methods

Based on the available literature on clinical trial milestones and requirements of the funding announcement, the coordinating center developed a general framework to categorize milestones as Collaborations, Materials and Methods, Clearances, Study Population, Resources, and Patient Information Management. The framework (Table 1) was used to align milestones for each trial, with those appearing to fall outside these categories classified as Trial-Specific.

Table 1 Pragmatic trial planning: general framework for milestones and metrics
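For readers who prefer a concrete illustration, the following is a minimal sketch, in Python, of how the Table 1 framework could be encoded for tracking purposes. The category names are taken from the framework; the keyword-matching helper and its mappings are illustrative assumptions rather than the coordinating center's actual classification procedure.

```python
from enum import Enum
from typing import Dict


class MilestoneCategory(Enum):
    """Framework categories from Table 1; Trial-Specific is the fallback."""
    COLLABORATIONS = "Collaborations"
    MATERIALS_AND_METHODS = "Materials and Methods"
    CLEARANCES = "Clearances"
    STUDY_POPULATION = "Study Population"
    RESOURCES = "Resources"
    PATIENT_INFORMATION_MANAGEMENT = "Patient Information Management"
    TRIAL_SPECIFIC = "Trial-Specific"


def classify(milestone_text: str,
             keyword_map: Dict[str, MilestoneCategory]) -> MilestoneCategory:
    """Assign a milestone to a framework category by simple keyword match;
    anything unmatched defaults to Trial-Specific, mirroring the paper's rule."""
    lowered = milestone_text.lower()
    for keyword, category in keyword_map.items():
        if keyword in lowered:
            return category
    return MilestoneCategory.TRIAL_SPECIFIC


if __name__ == "__main__":
    # The keyword mappings below are illustrative assumptions, not the
    # coordinating center's actual classification rules.
    keywords = {
        "irb": MilestoneCategory.CLEARANCES,
        "agreement": MilestoneCategory.COLLABORATIONS,
        "protocol": MilestoneCategory.MATERIALS_AND_METHODS,
        "accrual": MilestoneCategory.STUDY_POPULATION,
    }
    print(classify("Obtain IRB approval at all participating sites", keywords))
    # -> MilestoneCategory.CLEARANCES
```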

Six awards were made under this initiative. As is customary under this phased approach, the NIH PO assigned to each trial worked with the PI to finalize the planning milestones (September 2014) and subsequently to determine specific metrics associated with each (January 2015). The lag reflected the project leadership's recognition that the indicators needed for the administrative review were at the metric, rather than the milestone, level. Metrics provide objectively measurable evidence of milestone progress, overall functioning of the trial, and forewarning of factors that need attention. Tracking achievement on these performance metrics was intended to encourage improvement, increase effectiveness, and manage expectations.

Management support

From December 2014 (introductory kick-off meeting) through June of 2015, each PI and members of their trial teams participated in recorded monthly conference calls with the coordinating center to discuss progress. These were collaborative 1-h discussions of task prioritization, alignment of metrics with milestones, estimates of completion dates, and negotiation on the type of deliverable to be provided (e.g., screen shots, lists of variables found in data dictionaries, copies of signed agreements). Acceptable forms of documentation included PDFs, Word documents, or Excel files. A tailored tracking form, developed for each trial and updated and redistributed following each monthly call, was used to capture information including date of completion, type of deliverable, and deliverable receipt date. One 2-day in-person meeting was also held toward the end of the first year.
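As an illustration only, the sketch below models the kind of per-trial tracking record described above, assuming the fields named in the text (milestone, metric, deliverable type, estimated and actual completion dates, deliverable receipt date). The coordinating center's actual tracking form was a document updated and redistributed after each monthly call, not software; the class and method names here are hypothetical.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional


@dataclass
class MetricRecord:
    """One row of a per-trial tracking form (fields assumed from the text)."""
    milestone: str
    metric: str
    deliverable_type: str  # e.g., screen shot, signed agreement, data dictionary excerpt
    estimated_completion: Optional[date] = None
    actual_completion: Optional[date] = None
    deliverable_received: Optional[date] = None


@dataclass
class TrialTracker:
    """Tracking form for one trial, refreshed after each monthly call."""
    trial_name: str
    records: List[MetricRecord] = field(default_factory=list)

    def outstanding(self) -> List[MetricRecord]:
        """Metrics whose deliverable has not yet been received."""
        return [r for r in self.records if r.deliverable_received is None]


if __name__ == "__main__":
    tracker = TrialTracker("Example trial")
    tracker.records.append(
        MetricRecord(
            milestone="Finalize trial protocol",
            metric="Protocol approved by DSMB",
            deliverable_type="PDF of approved protocol",
            estimated_completion=date(2015, 5, 1),
        )
    )
    print(len(tracker.outstanding()))  # -> 1
```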

A process was developed to package and deliver documents to the NIH for their internal administrative panel review. Performance documentation was collected using a secure File Transfer Protocol (FTP) server. Submission instructions were provided to the trial teams and included a document naming convention to identify documents and maintain version control as they were received via the FTP server. The coordinating center conducted an adequacy check as materials were collected and worked in collaboration with the trial teams if questions arose. The coordinating center assessed the documentation from an operational rather than a scientific standpoint. A binder for each trial was compiled and included a one-page summary; all the documentation received; and a Reviewer Checklist that itemized each deliverable, provided a column for optional reviewer comments, and requested indication of a satisfactory assessment for each metric. Hard copies of the binders and a flash drive with all documentation were delivered to the NIH, where two independent NIH POs (i.e., not the PO assigned to the grant) served as reviewers and provided recommendations for continued funding.
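The exact naming convention is not specified in the text. The sketch below is a hypothetical example of how such a convention might be checked automatically during the adequacy review; the filename pattern (trial acronym, metric identifier, version number, date) and the helper function are assumptions for illustration.

```python
import re

# Hypothetical convention: TRIAL_Mnn_vK_YYYYMMDD.ext, restricted to the
# accepted file types (PDF, Word, Excel). The real convention may have differed.
NAME_PATTERN = re.compile(
    r"^(?P<trial>[A-Z]+)_(?P<metric>M\d{2})_v(?P<version>\d+)_(?P<date>\d{8})\.(pdf|docx|xlsx)$"
)


def check_filenames(filenames):
    """Split filenames into (accepted, rejected) lists by the convention."""
    accepted, rejected = [], []
    for name in filenames:
        (accepted if NAME_PATTERN.match(name) else rejected).append(name)
    return accepted, rejected


if __name__ == "__main__":
    ok, bad = check_filenames(["HUSH_M03_v2_20150601.pdf", "protocol_final.docx"])
    print("accepted:", ok)   # -> ['HUSH_M03_v2_20150601.pdf']
    print("flagged:", bad)   # -> ['protocol_final.docx']
```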

Evaluation

The qualitative data for the evaluation comprised recordings and meeting minutes from the 1-h monthly calls, proceedings of the annual 2-day in-person meeting, and semi-structured interviews. A semi-structured guide was developed for the interviews conducted with PIs and trial team members (August through September 2015) and with the NIH POs/PSs (October 2015) on their perceptions of the impact and value of the management strategy. Data were analyzed by the coordinating center (PDL and LD) using a modified grounded theory approach [24], with recordings accessed for clarity or to supplement meeting minutes and notes.

Results

During the early phase of funding, one of the investigators recognized that assumptions about eligibility criteria and availability of patients were flawed, leading to withdrawal prior to the administrative review to pursue a more appropriate funding mechanism. Based on recommendations from the NIH internal administrative review, the remaining five trials [25,26,27,28,29] were approved for implementation funding (summarized in Table 2).

Table 2 Funded trials (Phase II)

The number of milestones delineated in the five award notices (NoGAs) ranged from 6 to 15. Most milestones had one or more associated metrics; the total number of metrics per trial ranged from 15 to 33. About one third of the metrics (42 of 121, 35%) were associated with trial-specific milestones. Metrics, rather than milestones, are presented in Table 3, as these were the explicit indicators by which performance and progress were assessed.

Table 3 Planning phase: number of metrics by trial

Specification of deliverables

A common challenge in discussions with the trial teams was specifying the deliverable or documentation associated with each metric; this was particularly evident for metrics that were trial-specific or that represented technological or system-level progress. Occasionally the same deliverable was linked to more than one metric, and this was clearly documented on the tracking form and in the administrative review materials for the NIH. These issues were resolved through discussion of options and clear communication with the PIs about expectations.

Table 4 provides a list of trial-specific metrics and a description of their deliverables, further grouped as related to Training (of research staff or interventionists); Stakeholder buy-in or partner engagement; Data management; Intervention refinement and finalization; Recruitment/accrual feasibility; and Information technology (IT) or systems interoperability. Testing feasibility of systems, ensuring buy-in from stakeholders, and assessing intervention acceptability were among the critical achievements required in the planning phase.

Table 4 Illustrative planning phase trial-specific metrics and deliverables

Table 4 illustrates that many metrics categorized as trial-specific test assumptions regarding patient recruitment or accrual, intervention delivery, and management of outcome data, requiring associated deliverables that demonstrate, for example, access to electronic health records or functioning IT and database systems. Descriptions of each deliverable were included in the summary reports provided to reviewers.

Qualitative findings

Analysis of qualitative data sources, including monthly meeting minutes, the annual in-person meeting transcript, and semi-structured qualitative interviews, indicated that members of the trial teams found the oversight process onerous at times but reflected that it mostly improved or complemented their own management strategies. POs/PSs benefited from enhanced engagement with the PIs and the opportunity to learn more about pragmatic trial management and implementation of the phased award mechanism, and reported that the extensive documentation submitted by the coordinating center provided sufficient evidence to assess trial feasibility. Overall, the PIs reported several distinct advantages of the phased award mechanism, including how pressure to demonstrate progress helped to prioritize essential project management tasks, led to earlier engagement with technical and data management staff, and provided additional leverage to secure commitments from external stakeholders and collaborators. Table 5 provides a summary of themes identified in the interviews conducted during the first project year with PIs and POs/PSs, and Table 6 includes select quotes that capture these sentiments.

Table 5 Summary of themes from the planning phase with PIs, research teams, and POs/PSs
Table 6 Illustrative quotes from participants in monthly calls and year 1 in-person project meeting

Discussion

The observations from this project have the potential to improve the knowledge base regarding macro-level strategies to increase clinical research productivity, thereby demonstrating responsible stewardship of publicly funded science. As this was a unique design under a specific NIH solicitation with a small number of low-cost trials, future efforts are needed to expand upon our preliminary findings, for example by assessing the association between planning milestones and successful participant recruitment or accrual. However, this effort achieved one of its overarching intentions — early identification of an at-risk trial — as one of the Phase I awardees discovered during this phase that the patient population in their single-site trial was insufficient. Other positive elements included co-management of the planning process, support for generating reliable metrics to assess progress, and a collaborative environment that provided a forum for investigators to share their progress with other researchers in different fields and to communicate in person with their NIH POs/PSs.

Synthesis of lessons from strategies for early identification of trial risk factors can contribute to management guidance and standardization [21], potentially benefiting both trial designers and funding organizations. Results from our efforts to systematically categorize critical start-up milestones illustrate the need for additional research in this area [30]. The approach used to differentiate milestones specific to a trial from those more likely to be common across all trials suggests that this distinction is not clear-cut. We also speculated, but could not confirm, that the relatively large proportion of trial-specific milestones reflects constraints unique to more pragmatic trials conducted in real-world settings. However, given the failure of many trials to meet recruitment or dissemination goals [31, 32], there is value in efforts such as the phased award mechanism to identify critical precursors, such as evidence of a study population adequate to meet the trial's sample size requirements, and other factors associated with trial feasibility and effective resource utilization.

Conclusions

Strategies such as cooperative agreements and phased mechanisms are increasingly being adopted and integrated into biomedical funding practices. From the perspective of the investigator, advantages of the phased mechanism include clear delineation of the development time period, as well as specification of critical milestones to be accomplished, which helps prioritize task management, galvanize gatekeepers, and emphasize feasibility testing [33]. The funding institute also benefits: its investment in the trial is potentially less risky, with a clearly delineated process for internal review and clear stopping rules. The methodology developed and implemented by the coordinating center to facilitate management of early phase progress has been disseminated and adapted for similar projects within the NIH.

With regard to the field of pragmatic research more generally, implementing systematic approaches to identify milestones and track metrics can strengthen the evidence base regarding the time and effort required to efficiently conduct and manage large simple trials [4], and this process has been proposed among a set of solutions to improve community-engaged implementation research [34] and the efficiency and effectiveness of clinical trial recruitment planning [35]. Although each awardee in this project was required to provide evidence of completion of metrics, there was considerable variability in the number and type of metrics required. Future efforts to link early phase management support with trial implementation outcomes can inform guidance regarding when flexibility and adaptation, rather than strict adherence to pre-determined milestones, are appropriate [13]. Developing and disseminating a classification or framework to guide trial design and review is especially critical for transparency, particularly when funding decisions are contingent on both merit and feasibility [2].

Abbreviations

DSMB: Data and Safety Monitoring Board
ENGAGES: Electro-encephalograph Guidance of Anesthesia to Alleviate Geriatric Syndromes (trial)
FTP: File Transfer Protocol
HUSH: Pragmatic Trial of Behavioral Interventions for Insomnia in Hypertensive Patients
IRB: Institutional Review Board
IT: Information technology
NHLBI: National Heart, Lung, and Blood Institute
NIH: National Institutes of Health
NoGA: Notice of Grant Award
PART: Pragmatic Trial of Airway Management in Out-of-Hospital Cardiac Arrest
PI: Principal investigator
PO: Program officer
PROOFCheck: Prevention of Severe Acute Respiratory Failure in Patients with PROOFCheck (Electronic Checklist to Prevent Organ Failure)
PS: Project scientist
REDAPS: Default Palliative Care Consultation for Seriously Ill Hospitalized Patients
RFA: Request for Applications

References

1. Baer AR, Bridges KD, O'Dwyer M, Ostroff J, Yasko J. Clinical research site infrastructure and efficiency. J Oncol Pract. 2010;6(5):249–52.
2. Hudson KL, Lauer MS, Collins FS. Toward a new era of trust and transparency in clinical trials. JAMA. 2016;316(13):1353–4.
3. Duley L, Gillman A, Duggan M, Belson S, Knox J, McDonald A, Rawcliffe C, Simon J, Sprosen T, Watson J, Wood W. What are the main inefficiencies in trial conduct: a survey of UKCRC registered clinical trials units in the UK. Trials. 2018;19(1):15.
4. Eapen ZJ, Lauer MS, Temple RJ. The imperative of overcoming barriers to the conduct of large, simple trials. JAMA. 2014;311(14):1397–8.
5. Farrell B. Efficient management of randomised controlled trials: nature or nurture. BMJ. 1998;317(7167):1236.
6. Rubio DM. Common metrics to assess the efficiency of clinical research. Eval Health Prof. 2013;36(4):432–46.
7. Lauer MS, Bonds D. Eliminating the “expensive” adjective for clinical trials. Am Heart J. 2014;167(4):419–20.
8. Lipman PD, Loudon K, Dluzak L, Moloney R, Messner D, Stoney CM. Framing the conversation: use of PRECIS-2 ratings to advance understanding of pragmatic trial design domains. Trials. 2017;18(1):532.
9. Treweek S, Zwarenstein M. Making trials matter: pragmatic and explanatory trials and the problem of applicability. Trials. 2009;10(1):37.
10. Schwartz D, Lellouch J. Explanatory and pragmatic attitudes in therapeutical trials. J Clin Epidemiol. 2009;62(5):499–505.

11. Schwartz D, Lellouch J. Explanatory and pragmatic attitudes in therapeutical trials. J Chronic Dis. 1967;20(8):637–48.

12. Patsopoulos NA. A pragmatic view on pragmatic trials. Dialogues Clin Neurosci. 2011;13(2):217.
13. Maclure M. Explaining pragmatic trials to pragmatic policy-makers. Can Med Assoc J. 2009;180(10):1001–3.
14. Tunis SR, Stryer DB, Clancy CM. Practical clinical trials: increasing the value of clinical research for decision making in clinical and health policy. JAMA. 2003;290(12):1624–32.
15. Sox HC, Lewis RJ. Pragmatic trials: practical answers to “real world” questions. JAMA. 2016;316(11):1205–6.
16. Dal-Ré R, Janiaud P, Ioannidis JP. Real-world evidence: How pragmatic are randomized controlled trials labeled as pragmatic? BMC Med. 2018;16(1):49.
17. Tickle-Degnen L. Nuts and bolts of conducting feasibility studies. Am J Occup Ther. 2013;67(2):171–6.
18. Whicher DM, Miller JE, Dunham KM, Joffe S. Gatekeepers for pragmatic clinical trials. Clin Trials. 2015;12(5):442–8.
19. Johnson KE, Tachibana C, Coronado GD, Dember LM, Glasgow RE, Huang SS, Martin PJ, Richards J, Rosenthal G, Septimus E, Simon GE. A guide to research partnerships for pragmatic clinical trials. BMJ. 2014;349:g6826.
20. Thabane L, Ma J, Chu R, Cheng J, Ismaila A, Rios LP, Robson R, Thabane M, Giangregorio L, Goldsmith CH. A tutorial on pilot studies: the what, why and how. BMC Med Res Methodol. 2010;10(1):1.
21. Farrell B, Kenyon S, Shakur H. Managing clinical trials. Trials. 2010;11(1):78.
22. Ioannidis JP. Why most clinical research is not useful. PLOS Med. 2016;13(6):e1002049.
23. NIH Collaboratory website. https://rethinkingclinicaltrials.org/. Accessed 24 Sept 2018.
24. Strauss A, Corbin JM. Basics of qualitative research: grounded theory procedures and techniques. Thousand Oaks: Sage; 1990.
25. Wildes TS, Winter AC, Maybrier HR, Mickle AM, Lenze EJ, Stark S, Lin N, Inouye SK, Schmitt EM, McKinnon SL, Muench MR. Protocol for the Electroencephalography Guidance of Anesthesia to Alleviate Geriatric Syndromes (ENGAGES) study: a pragmatic, randomised clinical trial. BMJ Open. 2016;6(6):e011505. https://doi.org/10.1136/bmjopen-2016-011505.
26. Levenson JC, Rollman BL, Ritterband LM, Strollo PJ, Smith KJ, Yabes JG, Moore CG, Harvey AG, Buysse DJ. Hypertension with unsatisfactory sleep health (HUSH): study protocol for a randomized controlled trial. Trials. 2017;18(1):256. https://doi.org/10.1186/s13063-017-2001-9.
27. Gong MN, Schenk L, Gajic O, Mirhaji P, Sloan J, Dong Y, Festic E, Herasevich V. Early intervention of patients at risk for acute respiratory failure and prolonged mechanical ventilation with a checklist aimed at the prevention of organ failure: protocol for a pragmatic stepped-wedged cluster trial of PROOFCheck. BMJ Open. 2016;6:e011347. https://doi.org/10.1136/bmjopen-2016-011347.
28. Courtright KR, Madden V, Gabler NB, Cooney E, Small DS, Troxel A, Casarett D, Ersek M, Cassel JB, Nicholas LH, Escobar G. Rationale and design of the Randomized Evaluation of Default Access to Palliative Services (REDAPS) trial. Ann Am Thorac Soc. 2016;13(9):1629–39. https://doi.org/10.1513/AnnalsATS.201604-308OT.
29. Wang HE, Prince DK, Stephens SW, Herren H, Daya M, Richmond N, Carlson J, Warden C, Colella MR, Brienza A, Aufderheide TP. Design and implementation of the Resuscitation Outcomes Consortium Pragmatic Airway Resuscitation Trial (PART). Resuscitation. 2016;101:57–64.
30. Treweek S, Littleford R. Trial management–building the evidence base for decision-making. Trials. 2018;19:11.
31. Carlisle B, Kimmelman J, Ramsay T, MacKinnon N. Unsuccessful trial accrual and human subjects protections: an empirical analysis of recently closed trials. Clin Trials. 2015;12(1):77–83.
32. Ross JS, Tse T, Zarin DA, Xu H, Zhou L, Krumholz HM. Publication of NIH funded trials registered in ClinicalTrials.gov: cross sectional analysis. BMJ. 2012;344:d7292.
33. Mickle AM, Maybrier HR, Winter AC, McKinnon SL, Torres BA, Lin N, Lenze EJ, Stark S, Muench MR, Jacobsohn E, Inouye SK. Achieving milestones as a prerequisite for proceeding with a clinical trial. Anesth Analg. 2018;126(6):1851–8.
34. Mensah GA, Cooper RS, Siega-Riz AM, Cooper LA, Smith JD, Brown CH, Westfall JM, Ofili EO, Price LN, Arteaga S, Parker MC. Reducing cardiovascular disparities through community-engaged implementation research: a National Heart, Lung, and Blood Institute workshop report. Circ Res. 2018;122(2):213–30.
35. Huang GD, Bull J, McKee KJ, Mahon E, Harper B, Roberts JN. Clinical trials recruitment planning: a proposed framework from the Clinical Trials Transformation Initiative. Contemp Clin Trials. 2018;66:74–9.


Acknowledgements

We thank the Pragmatic Trial Awardees and trial teams, and the National Heart, Lung, and Blood Institute (NHLBI) and National Institute on Aging (NIA) program officers and scientists for their support of our collaborative activities. Mary N. Masters (formerly of Westat) played a critical role in the project during this phase.

Funding

The study is supported by NHLBI Grant 1R01HL125114-01 to Paula Darby Lipman, Ph.D. (Westat); the NHLBI, and the NIH, Bethesda, MD. Dr. Sean Tunis, Center for Medical Technology Policy, is co-investigator.

Availability of data and materials

The datasets used and/or analyzed during the current study are available from the corresponding author on reasonable request.

Declarations

The content is solely the responsibility of the authors and does not necessarily represent the official views of the NHLBI, the National Institutes of Health (NIH), or the US Department of Health and Human Services.

Author information


Contributions

PDL was the primary writer of the manuscript, led the design of the study and evaluation protocol, and conducted the monthly calls. LD contributed to the design of the study, participated in monthly calls, tracked milestones, and contributed to the analysis and manuscript. CMS provided feedback on the study design and evaluation protocol as well as the interpretations and implications of the findings. PDL and LD reviewed and conducted quality control of tables and figures. All authors contributed to writing, reviewing, and approving drafts leading to the final manuscript. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Paula Darby Lipman.

Ethics declarations

Ethics approval and consent to participate

The Westat Institutional Review Board (IRB) reviews all studies involving research on human subjects. This study received an exemption from IRB review from the Chair of the Westat IRB on October 23, 2014 (FWA 00005551). Per 45 CFR 46.101(b)(5) and a letter received on October 16, 2014, from Denise Bonds, Medical Officer, NHLBI, this research constitutes a program evaluation and is therefore exempt from IRB review.

Westat is conducting an evaluation of the methods and processes that contribute to successful pragmatic, low-cost clinical trials. The work involves monitoring the design/planning of these trials and, in years 2–5, the implementation of the trials. As members of this cooperative agreement, all investigators consented to participate in these activities and, specifically, provided oral consent prior to the conduct of the interviews as reported in this submission.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.


Cite this article

Lipman, P.D., Dluzak, L. & Stoney, C.M. Is this study feasible? Facilitating management of pragmatic trial planning milestones under a phased award funding mechanism. Trials 20, 307 (2019). https://doi.org/10.1186/s13063-019-3387-3
