
An international, Delphi consensus study to identify priorities for methodological research in behavioral trials in health research

Abstract

Background

Non-communicable chronic diseases are linked to behavioral risk factors (including smoking, poor diet and physical inactivity), so effective behavior change interventions are needed to improve population health. However, uptake and impact of these interventions is limited by methodological challenges. We aimed to identify and achieve consensus on priorities for methodological research in behavioral trials in health research among an international behavioral science community.

Methods

An international, Delphi consensus study was conducted. Fifteen core members of the International Behavioral Trials Network (IBTN) were invited to generate methodological items that they consider important. From these, the research team agreed a “long-list” of unique items. Two online surveys were administered to IBTN members (N = 306). Respondents rated the importance of items on a 9-point scale, and ranked their “top-five” priorities. In the second survey, respondents received feedback on others’ responses, before rerating items and re-selecting their top five.

Results

Nine experts generated 144 items, which were condensed to a long-list of 33 items. The four most highly endorsed items, in both surveys 1 (n = 77) and 2 (n = 57), came from two thematic categories: “Intervention development” (“Specifying intervention components” and “Tailoring interventions to specific populations and contexts”) and “Implementation” (“How to disseminate behavioral trial research findings to increase implementation” and “Methods for ensuring that behavioral interventions are implementable into practice and policy”). “Development of novel research designs to test behavioral interventions” also emerged as a highly ranked research priority.

Conclusions

From a wide array of identified methodological issues, intervention development, implementation and novel research designs are key themes to drive the future behavioral trials’ research agenda. Funding bodies should prioritize these issues in resource allocation.


Introduction

Rapidly increasing rates of chronic disease are a key global societal challenge [51]. The leading behavioral risk factors are similar across chronic diseases and include tobacco use, harmful alcohol consumption, unhealthy diet (including high salt/sodium intake), physical inactivity, and overweight and obesity [45]. Effective, evidence-based behavior change interventions are urgently needed to reduce the prevalence of chronic disease internationally and the burden these conditions place on patients and health services.

For the purposes of this study, behavioral interventions were defined as: “interventions that require the active participation of a target group (e.g., the patient/individual, health professional, health care systems) with the proximal or ultimate goal of changing health-related behavior.” Behavioral interventions may be delivered in person or digitally, employing digital technologies such as the Internet, telephones and mobile and environmental sensors [23]. Interventions may also be delivered as national campaigns, or through communities.

Within behavioral medicine, much research is focused on developing behavior change interventions to reduce chronic disease prevalence, mortality, and burden of disease [24]. However, despite the significant potential to improve health and clinical outcomes, the reach and impact of behavioral interventions remains limited [35]. Suboptimal behavior change research not only reduces the likelihood that this research impacts on health outcomes, but it is also cost-ineffective. In 2010, expenditure on life sciences (mostly biomedical) research internationally was US$240 billion [47]. An estimated 85% of the billions spent on medical research (clinical and other types) each year is wasted [25], and commentators have criticized clinical research, suggesting that most research is not useful [18].

An array of reasons has been suggested for the limited success in behavior change research, including: low investment in this area of research [33], poor-quality evaluation methods [13], lack of application of behavior change theory [29], poor specification of intervention content [30] and lack of an interdisciplinary team science approach [12]. Behavior change intervention research involves development, testing and implementation of “complex” interventions, with multiple components and involving multiple stakeholders [8]. This type of research requires a more complex, biopsychosocial approach to evidence generation than has previously been applied to answering questions about the effectiveness of clinical interventions [48]. Behavior change research raises unique methodological challenges for the researcher, which need to be addressed and overcome if we are to develop a strong evidence base for behavior change interventions.

The International Behavioral Trial Network (IBTN; www.ibtnetwork.org) was established by a team of behavioral researchers in June 2013 to address methodological challenges specifically relevant to behavioral trials’ research. The IBTN is a global network of professionals working to improve the quality of clinical trials and behavioral interventions, with three main goals: first, to facilitate the global improvement of the quality of behavioral trials; second, to create networks and capacity to undertake more and higher-quality trials; and third, to develop a repository of existing recommendations, tools and methodology papers on behavioral trials and intervention development. Currently (June 2019), the IBTN has 322 members from 30 different countries across the world, including academics/researchers, postgraduate students, health professionals, members of the general public and industry representatives.

Improving the quality and potential of behavioral trials requires methodological issues in this area to be identified and research to be conducted with the specific aim of addressing these issues. Previously discussed methodological issues specific to the design and conduct of behavioral trials include intervention development and piloting, intervention reporting, identifying suitable comparison groups, selection of appropriate outcome measures and intervention fidelity [3]. However, a formal, systematic process to identify and specify methodological priorities is now needed to facilitate the development of an international and cohesive behavioral trials’ research agenda.

Research prioritization provides such a process, whereby key stakeholders generate ideas and move towards consensus on important research topics [43]. The prioritization process has been used to identify priorities across conditions and populations [26]. In the area of trials’ research, prioritization has been conducted with Directors of UK Clinical Research Collaboration Clinical Trials Units to inform the broader trials’ methodological research agenda [49] and, more recently, a priority-setting exercise has been reported to inform the global health trials’ methodology research agenda [46]. Research prioritization can provide useful information to guide research funders.

The aim of this study was to identify priorities for, and achieve consensus on, methodological research in behavioral trials in health research. This information is needed to inform and guide the direction of the behavioral trials’ research agenda internationally. This study used a Delphi priority-setting consensus approach, inviting all members of the IBTN to participate.

Methods

The study protocol has been published elsewhere [5]. This Delphi study was conducted and is reported following the reporting standard for Conducting and REporting of DElphi Studies (CREDES) [21].

The Delphi process

An electronic Delphi (e-Delphi), with online administration of questionnaires, was used for this research prioritization to facilitate international participation [10]. The Delphi process is a structured group facilitation technique to obtain consensus among anonymous respondents through iterative rounds with feedback [28]. The Delphi approach has been widely used in health research [20, 21]. The features of the Delphi process which make it suitable for gaining consensus include anonymity, which facilitates balanced participation, and iterative rounds with controlled feedback, in which participants receive information on the distribution of overall group responses from previous rounds and may change their opinions in light of it [20].

Participants

Participants for Phase 1, the topic generation phase, were 15 experts in behavioral trials selected by the research team. Experts included founding members of IBTN, members of the IBTN Executive Committee and members of the research team. All experts had a minimum of 10 years’ experience of behavioral trials and a reputation for leadership in the field. Participants for Phase 2, the e-Delphi survey, were all those registered as members of the IBTN in February 2018 (N = 306, including members from five continents).

Delphi stages

The flow chart in Fig. 1 illustrates the stages of the Delphi process.

Fig. 1 Flow chart to illustrate the stages of the Delphi process

Delphi Phase 1: expert topic generation

Fifteen experts in behavioral trials were contacted by a member of the research team (MB) by email in May 2017 and invited to generate a list of all possible topics or research questions which they consider important for behavioral trials’ methodology research. Respondents were asked to provide demographic information including: sex, current professional position, country of residence and number of years of experience of working in the area of trials of behavioral interventions.

Two members of the research team (MB and JMS) reviewed generated items initially, removing duplicates and merging similar topics, and along with two other members of the research team (KL and SB) agreed a draft “long-list” of unique items. This list was emailed to respondents to check for agreement and to see if items were faithful to the originally generated items, and feedback was discussed by the research team. The final long-list was approved and agreed by the research team in July 2017.

Delphi Phase 2: E-Delphi survey

All members of the IBTN were invited by email to participate in two online surveys, using LimeSurvey online survey software (LimeSurvey GmbH, Hamburg, Germany. URL http://www.limesurvey.org).

The first survey was emailed to IBTN members (N = 306) in February 2018. Recipients were asked for their views on priorities for methodological research in trials of behavioral interventions. They were asked to rate the importance of each item on a 9-point scale, where 9 indicated items of highest importance and 1 indicated lowest importance. Following rating of the 33 items, they were asked to select and rank their “top-five” most important methodological research topics for trials of behavioral interventions. Respondents were provided with an open text-box to add any items which they believed were important and were missing from the list. Respondents were asked to provide demographic information including: sex, current professional position, country of residence, age group and number of years of experience of working in the area of trials of behavioral interventions.

In the second survey (administered 3 weeks after the closing of survey 1), participants who had responded to survey 1 received information reminding them of how they had responded in survey 1 and information about how others rated and ranked the items in survey 1. For rating the importance of individual items, bar charts plotting group responses to each item were provided, as well as the group mean importance rating for each item, and the individual’s own importance rating from survey 1. Respondents were asked to re-rate items with this information in mind. For the top-five ranking question, participants were reminded of their top-five selection from survey 1, and were presented with the percentage of respondents who had ranked each item in their top five in survey 1. Participants were asked to re-rank their top-five priority items with this information in mind.
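The controlled-feedback step described above can be sketched as follows. This is a minimal illustration only, assuming a simple nested-dictionary data shape; the item labels, respondent identifiers and ratings are hypothetical, not drawn from the study dataset:

```python
# Sketch of the survey-2 feedback step: for each item, compute the group
# mean importance rating from survey 1 and pair it with the individual
# respondent's own survey-1 rating, so both can be shown before re-rating.
from statistics import mean

def build_feedback(ratings, respondent_id):
    """ratings: {item: {respondent_id: importance rating on a 1-9 scale}}."""
    feedback = {}
    for item, by_respondent in ratings.items():
        feedback[item] = {
            "group_mean": round(mean(by_respondent.values()), 2),
            "your_rating": by_respondent.get(respondent_id),
        }
    return feedback

# Hypothetical survey-1 ratings from three respondents for two items
survey1 = {
    "Specifying intervention components": {"r1": 9, "r2": 7, "r3": 8},
    "Tailoring interventions": {"r1": 8, "r2": 8, "r3": 6},
}
print(build_feedback(survey1, "r2"))
```

In the study itself this feedback was presented graphically (bar charts of the group response distribution) alongside the group mean and the respondent's own prior rating.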

Any additional items proposed in the free-text comment box in survey 1 were discussed by the research team and included for rating in survey 2 if the majority of team members agreed that the item was a unique, novel, previously excluded item. New items added to survey 2 were, therefore, rated only once in the Delphi process.

To encourage participation, the names of respondents to both surveys were entered into a draw for two prizes (personal fitness tracking devices). Only those who had responded to both surveys were included in the draw.

All data were extracted from the online survey software and imported into an SPSS database, which was stored anonymously on password-protected computers to which only members of the research team had access. Survey 2 ranked priority items were allocated a “ranking weighted score,” as follows: first priority was given 5; second priority was given 4; third priority was given 3; fourth priority was given 2; and fifth priority was given 1.
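The "ranking weighted score" described above can be sketched in a few lines. This is a minimal illustration; the item labels and sample rankings are hypothetical, not drawn from the study dataset:

```python
# Weighted scoring of "top-five" rankings: a first-priority choice earns
# 5 points, second earns 4, ..., fifth earns 1. Each respondent submits
# an ordered list of up to five items; scores are summed across respondents.
from collections import Counter

def ranking_weighted_scores(rankings):
    """rankings: list of ordered top-five item lists, one per respondent."""
    scores = Counter()
    for top_five in rankings:
        for position, item in enumerate(top_five):  # position 0..4
            scores[item] += 5 - position            # weights 5 down to 1
    return scores

# Hypothetical example: two respondents' ordered top-five selections
example = [
    ["Tailoring", "Implementable", "Specifying", "Systematic", "Novel designs"],
    ["Specifying", "Tailoring", "Novel designs", "Implementable", "Systematic"],
]
print(ranking_weighted_scores(example).most_common(3))
```

Summing across respondents in this way lets items that are consistently ranked near the top outscore items that a few respondents rank first but most omit.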

Ethical approval

Ethical approval was granted by the National University of Ireland Galway Research Ethics Committee (reference: 17-Jun-13).

Results

Delphi Phase 1: expert topic generation

Nine of the 15 experts contacted agreed to participate and returned a list of items, representing a response rate of 60%. Of these, four were women and five were men. They were working in Canada, the UK, the US, Ireland and France, and had between 10 and 35 years of experience working in the area of behavioral interventions. Four of these nine experts were members of the research team; no other conflicts of interest related to research were disclosed by included experts.

In total, the nine experts generated 144 items. Following the initial review (by MB and JMS), removing duplicates and merging similar topics, the list was reduced to 40 items, which were organized for ease of review and to aid comprehension into 12 categorical themes. The categorical themes, agreed by the research team, were: Intervention Development; Comparison Group; Intervention Fidelity; Pilot/Feasibility trials; Reporting; Novel Trial Designs; Data Issues; Outcomes; Cost-effectiveness; Implementation; Stakeholder engagement; and Development of behavioral science and theory. Following feedback from the experts and discussions among the research team, the final list for survey 1 included 33 items, organized into the same 12 categorical themes. The list can be seen in Table 1.

Table 1 The “long-list” of items for methodological research in trials of behavioral interventions agreed in Phase 1

Delphi Phase 2: E-Delphi survey

Response rates: of the 306 invitations sent in survey 1, complete responses were received from 77 people (25% response rate); incomplete responses were returned by 11 people and 218 people did not respond. Of the 77 invitations sent in survey 2, complete responses were received from 57 people (74% response rate); an incomplete response was returned by one person and 19 people did not respond. Only complete responses were used in the analysis. The professional background and demographic data for survey-1 and -2 completers are shown in Table 2. In survey 1, 69% of respondents were female. The majority (64%) held academic positions, 22% were students (undergraduate and graduate) and the remainder were health care practitioners, policy-makers or described themselves as “other.” Forty-three percent lived in Canada, 16% in the US and 16% in Ireland, with the remaining 25% of participants living in Israel, Australia, the Netherlands, Portugal, Sweden, the UK, Brazil, China, Colombia and France. The majority of respondents were between the ages of 31 and 50 years (58%). Thirty-five percent of respondents had between 1 and 5 years’ experience in behavioral trials’ research, but it is worth noting that 26% reported having more than 10 years’ experience.

Table 2 Professional background and demographic data for survey completers

The mean importance ratings for individual items in surveys 1 and 2 can be seen in Table 3. The same six items were the six most highly rated items in both surveys 1 and 2, although the order changed slightly. These were (in order of descending levels of importance from the most highly rated item from the ratings in survey 2): Specifying intervention components; How to disseminate behavioral trial research findings to increase implementation; Methods for ensuring that behavioral interventions are implementable into practice and policy; Use of systematic approaches to move from evidence to intervention components; Selecting appropriate behavioral outcomes for trials; and Tailoring interventions to specific populations and contexts. The four most highly rated items, in both surveys 1 and 2, came from two of the categories: Intervention development (Specifying intervention components and Use of systematic approaches to move from evidence to intervention components) and Implementation (How to disseminate behavioral trial research findings to increase implementation, and Methods for ensuring that behavioral interventions are implementable into practice and policy).

Table 3 Mean importance ratings for individual items in surveys 1 and 2, ordered by survey 2 importance ratings (possible score range 1–9: 1 = lowest importance, 9 = highest importance)

Two new items were generated from suggestions made by survey-1 respondents: Investigating the impact of intervention intensity on outcomes and Engaging stakeholders in the selection of outcomes. Participants therefore received a list of 35 items to rate and rank in survey 2. Neither of these items scored above the median in survey 2: Investigating the impact of intervention intensity on outcomes received a mean rating of 7.11 (SD 1.18), putting it in 19th place of the 35 items; Engaging stakeholders in the selection of outcomes received a mean rating of 6.63 (SD 1.54), putting it in 28th place of the 35 items.

The number and percentage of participants who ranked each item as their top priority in surveys 1 and 2 are shown in Table 4. As in the item ratings, there were high levels of similarity in the items ranked most highly in surveys 1 and 2. The three items most frequently ranked as top priority in survey 2 were: Tailoring interventions to specific populations and contexts; Methods for ensuring that behavioral interventions are implementable into practice and policy; and Development of novel research designs to test behavioral interventions as alternatives to, or to complement, standard randomized controlled trials (RCTs). As with the item-importance ratings, the first and second of these come from the categories Intervention development and Implementation; the third, from the category Novel Trial Designs, had not featured among the most highly rated items and emerged only in the priority ranking.

Table 4 Number and percentage of participants who ranked each item as their top priority in surveys 1 and 2, listed in order of the items that were most often selected as the top priority in survey 2

When respondents’ top-five priorities were weighted and each item allocated a “ranking weighted score,” the top-five ranked items in surveys 1 and 2 were the same, although the order changed slightly. Scores can be seen in Table 5. In descending order of weighted score in survey 2, these were: Tailoring interventions to specific populations and contexts; Methods for ensuring that behavioral interventions are implementable into practice and policy; Specifying intervention components; Use of systematic approaches to move from evidence to intervention components; and Development of novel research designs to test behavioral interventions as alternatives to, or to complement, standard RCTs. Again, the four highest-scoring items in both surveys 1 and 2 came from the categories Intervention development and Implementation.

Table 5 Weighted ranking of participant responses to the “top-five” priorities question, ordered by the most highly ranked item in survey 2

Discussion

Summary of findings

The aim of this study was to identify priorities for methodology research specific to trials of behavioral interventions, and to seek the views of, and achieve consensus from, an international community of researchers working in this field. A large number of items was generated by the nine experts, and many items from the long-list of 33 items were strongly endorsed as important methodological issues for behavioral trials’ research. There were no major changes between responses in survey 1 and responses in survey 2. From item ratings and rankings in both surveys, there was consensus around the types of items considered most important or of highest priority. The four most highly rated items in terms of importance, in both surveys 1 and 2, came from two of the thematic categories, highlighting consensus that these are important priority areas for future methodological research within behavioral trials: Intervention development (Specifying intervention components and Use of systematic approaches to move from evidence to intervention components) and Implementation (How to disseminate behavioral trial research findings to increase implementation and Methods for ensuring that behavioral interventions are implementable into practice and policy). These items re-emerged as priorities in respondents’ ranking of their top-five priorities, with one new item appearing in the ranking that had not been highlighted in the importance ratings: Development of novel research designs to test behavioral interventions as alternatives to, or to complement, standard RCTs.

Methodological challenges associated with the development of behavioral interventions were consistently identified as priorities within this study. These included, specifically, the challenges associated with specifying intervention components and the use of systematic approaches to move from evidence to intervention components. There has been significant recent progress in classifying the active components of behavior change interventions and methodological advances in the development of behavior change interventions. Replicable methods for identifying and reporting the active ingredients of behavioral interventions have recently been developed, including the Template for Intervention Description and Replication (TIDieR) Checklist and Guide [17], the taxonomy of Behavior Change Methods [22] and the Behavior Change Technique (BCT) Taxonomy [34]. The BCT taxonomy has been widely adopted within health psychology; it provides an extensive, consensually agreed, hierarchically structured taxonomy of 93 BCTs used in behavior change interventions.

In addition, frameworks have been developed to support the process of systematically moving from behavioral theory to intervention content. For example, Intervention Mapping [11], the Theoretical Domains Framework [6] and the Behavior Change Wheel [31] are all frameworks developed to support this process. While there has been rapid uptake of these tools since their publication, it is too early to determine their impact on the quality and outcomes of behavioral intervention research, and difficulties remain. For example, the process of identifying BCTs from behavioral interventions is not straightforward [19]. There is a lack of reliable methods for identifying which specific BCTs or BCT combinations have the potential to be effective for a given behavior in a given context [36]. The priorities identified in the current study reinforce the need for future work to focus on improving the reliability and robustness of descriptions of behavioral intervention components, and ensuring that during intervention development the active contents of interventions can be linked to the theoretical premises for behavior change. These issues are central to an ongoing program of research called the “Human Behavior-Change Project,” in which behavioral scientists are working with computer scientists to develop an online knowledge system (an ontology) to facilitate the identification, extraction and synthesis of knowledge related to behavior change interventions [32, 40].

The identification of the methodological research priority “Development of novel research designs to test behavioral interventions as alternatives to, or to complement, standard randomized controlled trials (RCTs)” may assist in resolving some of the challenges identified above in the development and specification of theory-based interventions. There has been growing interest within behavioral science in novel research designs that can provide information beyond that offered by the standard RCT design. The classic, two-armed RCT allows us to test the effectiveness of one intervention package against another. However, this design is of limited use for understanding the relative importance or potency of constituent intervention components, the optimal dose of each component, the optimal combination or sequence of delivery of components, or their mechanisms of action in effecting behavior change [7]. A growing number of studies in the literature leverage alternative frameworks and trial designs such as the Multiphase Optimization Strategy (MOST) and the Sequential Multiple Assignment Randomized Trial (SMART) design. MOST is an engineering-inspired methodological framework for optimizing and evaluating interventions [7]. It uses randomized experimentation to assess the performance of individual intervention components and their interactions in an optimization trial, so that interventions can be optimized in advance of testing through RCTs. MOST has been applied in a number of settings, including to optimize interventions in Internet cognitive-behavioral therapy for depression [50], human immunodeficiency virus (HIV) care [16], smoking cessation [44] and remotely delivered intensive lifestyle treatment for obesity [42]. The SMART design allows evaluation of adaptive interventions in which the type or dose of treatment is individually tailored based on the patient’s needs [2, 37].
A SMART design has been used in a number of areas; for example, to evaluate alternative combinations of perinatal interventions and sequencing patterns to optimize women’s health outcomes [14]. These approaches are still in their infancy and behavioral scientists should use and develop these frameworks to enhance the quality of behavioral intervention research.

There is potential for digital health-behavior change interventions to enhance our understanding of behavior change mechanisms [38] and enable more sophisticated research designs which promote a more nuanced understanding of intervention processes. For example, the just-in-time adaptive intervention (JITAI) is an intervention design developed within digital health intervention research which aims to provide the right type and amount of support, at the right time, by adapting to an individual’s changing internal and contextual state [39]. Increasingly powerful mobile and sensing technologies within JITAIs enable the monitoring of changes to an individual’s state and tailored delivery of intervention components. Research on the development and evaluation of these interventions is still very limited, and it is critical that researchers develop sophisticated and nuanced health behavior theories capable of guiding the construction of such interventions in line with the rapidly growing technological capabilities for delivering JITAIs.

In addition, qualitative research should be used more comprehensively within behavior change intervention research to enhance quality. Qualitative research can enhance pre-trial intervention development and strengthen the interpretation of the findings of intervention trials by shedding light on implementation issues and clarifying the impact of intervention context on effectiveness [41].

The other methodological research category identified as a high priority in this study was the area of implementation. Gaps in methods to ensure translation of behavioral trial research findings into practice and policy were strongly endorsed as important by respondents in this study, as was the lack of strategies to effectively disseminate behavioral trial research findings to increase implementation. Difficulties in dissemination and implementation of research findings are not unique to behavioral trials; the gap between research evidence and routine practice has been identified as a consistent feature of health care delivery [27]. Integrated Knowledge Translation (IKT) has been suggested as a method to increase the relevance and applicability of research by engaging knowledge users throughout the entire research process, not just at the end of a project [15]. Indeed, stakeholder engagement, which refers to the involvement of the public, patients, health professionals, service users, funders and other decision-makers in research, should be used throughout the whole research process to enhance the relevance, quality and impact of behavior change intervention research [4]. Exploring ways to incorporate emerging IKT methods within behavioral trials’ research may strengthen the potential impact of behavioral science research in improving health and health care.

Strengths and limitations

This is the first study which has attempted to systematically achieve consensus on methodological research priorities for behavioral trials’ research. The study protocol was published on an open-access publication platform and was subjected to transparent peer review [5]. The study was conducted in line with internationally recognized guidelines for the Conducting and REporting of DElphi Studies (CREDES) [21].

Caution is needed in generalizing the findings, as the response rate for survey 1 (25%) was relatively low compared with other research prioritization e-Delphi studies (for example, [9] achieved a 42% response rate to survey 1 in their study). However, the retention rate for participants in survey 2 was adequate (74%). The sampling frame was limited to members of the IBTN and the sample of the e-Delphi survey was relatively small. Responders may have differed from non-responders; we did not have data on the full sampling frame to enable comparison. While we achieved a reasonable spread of countries internationally in the sample, respondents are drawn largely from developed countries. Developing countries are not represented. Methodological challenges associated with behavior change intervention research in developing countries are likely to differ significantly from those relevant in developed countries [1]. The majority of IBTN members and participants in this study are researchers, with academic appointments or are in graduate training programs. Health professionals, policy-makers, patients and the public were underrepresented or absent from the study. It would be useful to obtain the views of more diverse stakeholder groups in future research.

A further potential limitation to note in relation to the Delphi process was that members of the research team (MB, JMS, KL and SB) were also members of the expert panel that generated the initial long-list of items. This was done as we wanted to maximize the number of items generated for the long-list. However, this may have been a source of bias in the initial process of refining the list of items for the survey.

Conclusion

Given the significant potential impact of behavioral interventions on global health, ensuring that we are conducting high-quality research is imperative. While caution is needed in interpreting the findings of this study due to the relatively low response rate and small sample size, the priorities identified here can inform the research agenda of the IBTN and, more broadly, the behavioral trials methodology agenda internationally. Furthermore, the results can be leveraged by national and international funding bodies to shape resource allocation and to advocate for targeted research calls. Specifically, future research should prioritize: improving strategies to systematically develop interventions and specify intervention components; exploring novel research designs that allow us to develop more effective interventions and better understand which intervention components work for whom in what settings; and developing strategies to ensure that the findings of behavioral intervention research can be translated into practice and policy.

Availability of data and materials

The datasets used and/or analyzed during the current study are available from the corresponding author on reasonable request.

References

  1. Aboud FE, Singla DR. Challenges to changing health behaviours in developing countries: a critical overview. Soc Sci Med. 2012;75(4):589–94. https://doi.org/10.1016/j.socscimed.2012.04.009.

  2. Almirall D, Nahum-Shani I, Sherwood NE, Murphy SA. Introduction to SMART designs for the development of adaptive interventions: with application to weight loss research. Transl Behav Med. 2014;4(3):260–74. https://doi.org/10.1007/s13142-014-0265-0.

  3. Bacon SL, Lavoie KL, Ninot G, Czajkowski S, Freedland KE, et al. An international perspective on improving the quality and potential of behavioral clinical trials. Curr Cardiovasc Risk Rep. 2015;9(1):427.

  4. Byrne M. Increasing the impact of behavior change intervention research: Is there a role for stakeholder engagement? Health Psychol. 2019;38(4):290.

  5. Byrne M, McSharry J, Meade O, Lavoie K, Bacon S. An international, Delphi consensus study to identify priorities for methodological research in behavioural trials: a study protocol [version 2; referees: 2 approved]. HRB Open Res. 2018;1(11). https://doi.org/10.12688/hrbopenres.12795.2.

  6. Cane J, O’Connor D, Michie S. Validation of the theoretical domains framework for use in behaviour change and implementation research. Implement Sci. 2012;7(1):1–17. https://doi.org/10.1186/1748-5908-7-37.

  7. Collins L. Optimization of behavioral, biobehavioral, and biomedical interventions: the Multiphase Optimization Strategy (MOST). New York: Springer; 2018.

  8. Craig P, Dieppe P, Macintyre S, Michie S, Nazareth I, Petticrew M. Developing and evaluating complex interventions: the new Medical Research Council guidance. BMJ. 2008;337:a1655.

  9. Deane HC, Wilson CL, Babl FE, Dalziel SR, Cheek JA, et al. PREDICT prioritisation study: establishing the research priorities of paediatric emergency medicine physicians in Australia and New Zealand. Emerg Med J. 2018;35(1):39–45. https://doi.org/10.1136/emermed-2017-206727.

  10. Donohoe H, Stellefson M, Tennant B. Advantages and limitations of the e-Delphi technique: implications for health education researchers. Am J Health Educ. 2012;43(1):38–46.

  11. Eldredge LKB, Markham CM, Ruiter RA, Kok G, Parcel GS. Planning health promotion programs: an intervention mapping approach. Hoboken: Wiley; 2016.

  12. Freedland KE. A new era for health psychology. Health Psychol. 2017;36(1):1–4.

  13. Gardner B, Smith L, Lorencatto F, Hamer M, Biddle S, et al. How to reduce sitting time? A review of behaviour change strategies used in sedentary behaviour reduction interventions among adults. Health Psychol Rev. 2016;10(1):89–112. https://doi.org/10.1080/17437199.2015.1082146.

  14. Germeroth LJ, Benno MT, Conlon RPK, Emery RL, Cheng Y, et al. Trial design and methodology for a non-restricted sequential multiple assignment randomized trial to evaluate combinations of perinatal interventions to optimize women’s health. Contemp Clin Trials. 2019;79:111–21. https://doi.org/10.1016/j.cct.2019.03.002.

  15. Graham ID, Kothari A, McCutcheon C, Angus D, Banner D, Bucknall T, on behalf of the Integrated Knowledge Translation Research Network Project Leads. Moving knowledge into action for more effective practice, programmes and policy: protocol for a research programme on integrated knowledge translation. Implement Sci. 2018;13(1):22. https://doi.org/10.1186/s13012-017-0700-y.

  16. Gwadz MV, Collins LM, Cleland CM, Leonard NR, Wilton L, Gandhi M, et al. Using the multiphase optimization strategy (MOST) to optimize an HIV care continuum intervention for vulnerable populations: a study protocol. BMC Public Health. 2017;17(1):383. https://doi.org/10.1186/s12889-017-4279-7.

  17. Hoffmann TC, Glasziou PP, Boutron I, Milne R, Perera R, et al. Better reporting of interventions: Template for Intervention Description and Replication (TIDieR) checklist and guide. BMJ. 2014;348:g1687.

  18. Ioannidis JPA. Why most clinical research is not useful. PLoS Med. 2016;13(6):e1002049. https://doi.org/10.1371/journal.pmed.1002049.

  19. Johnston M, Johnston D, Wood CE, Hardeman W, Francis J, Michie S. Communication of behaviour change interventions: can they be recognised from written descriptions? Psychol Health. 2018;33(6):713–23. https://doi.org/10.1080/08870446.2017.1385784.

  20. Jones J, Hunter D. Consensus methods for medical and health services research. BMJ. 1995;311(7001):376–80.

  21. Jünger S, Payne SA, Brine J, Radbruch L, Brearley SG. Guidance on Conducting and REporting DElphi Studies (CREDES) in palliative care: recommendations based on a methodological systematic review. Palliat Med. 2017;31(8):684–706. https://doi.org/10.1177/0269216317690685.

  22. Kok G, Gottlieb NH, Peters G-JY, Mullen PD, Parcel GS, et al. A taxonomy of behaviour change methods: an intervention mapping approach. Health Psychol Rev. 2016;10(3):297–312. https://doi.org/10.1080/17437199.2015.1077155.

  23. Kraft P, Yardley L. Current issues and new directions in psychology and health: what is the future of digital interventions for health behaviour change? Psychol Health. 2009;24(6):615–8. https://doi.org/10.1080/08870440903068581.

  24. Lim SS, Vos T, Flaxman AD, Danaei G, Shibuya K, et al. A comparative risk assessment of burden of disease and injury attributable to 67 risk factors and risk factor clusters in 21 regions, 1990–2010: a systematic analysis for the Global Burden of Disease Study 2010. Lancet. 2012;380(9859):2224–60.

  25. Macleod MR, Michie S, Roberts I, Dirnagl U, Chalmers I, et al. Biomedical research: increasing value, reducing waste. Lancet. 2014;383(9912):101–4. https://doi.org/10.1016/S0140-6736(13)62329-6.

  26. McSharry J, Fredrix M, Hynes L, Byrne M. Prioritising target behaviours for research in diabetes: using the nominal group technique to achieve consensus from key stakeholders. Res Involv Engagem. 2016;2(1):1–19. https://doi.org/10.1186/s40900-016-0028-9.

  27. McGlynn EA, Asch SM, Adams J, Keesey J, Hicks J, DeCristofaro A, Kerr EA. The quality of health care delivered to adults in the United States. N Engl J Med. 2003;348(26):2635–45. https://doi.org/10.1056/NEJMsa022615.

  28. McMillan SS, King M, Tully MP. How to use the nominal group and Delphi techniques. Int J Clin Pharm. 2016;38(3):655–62.

  29. Michie S. Designing and implementing behaviour change interventions to improve population health. J Health Serv Res Policy. 2008;13(3_Suppl):64–9.

  30. Michie S, Abraham C, Eccles M, Francis J, Hardeman W, Johnston M. Strengthening evaluation and implementation by specifying components of behaviour change interventions: a study protocol. Implement Sci. 2011;6:10.

  31. Michie S, Atkins L, West R. The behaviour change wheel: a guide to designing interventions. London: Silverback Publishing; 2014.

  32. Michie S, Carey RN, Johnston M, Rothman AJ, de Bruin M, Kelly MP, Connell LE. From theory-inspired to theory-based interventions: a protocol for developing and testing a methodology for linking behaviour change techniques to theoretical mechanisms of action. Ann Behav Med. 2018a;52(6):501–12. https://doi.org/10.1007/s12160-016-9816-6.

  33. Michie S, Fixsen D, Grimshaw JM, Eccles MP. Specifying and reporting complex behaviour change interventions: the need for a scientific method. Implement Sci. 2009;4(1):40. https://doi.org/10.1186/1748-5908-4-40.

  34. Michie S, Richardson M, Johnston M, Abraham C, Francis J, et al. The behavior change technique taxonomy (v1) of 93 hierarchically clustered techniques: building an international consensus for the reporting of behavior change interventions. Ann Behav Med. 2013;46(1):81–95. https://doi.org/10.1007/s12160-013-9486-6.

  35. Michie S, Thomas J, Johnston M, MacAonghusa P, Shawe-Taylor J, et al. The human behaviour-change project: harnessing the power of artificial intelligence and machine learning for evidence synthesis and interpretation. Implement Sci. 2017;12(1):121.

  36. Michie S, West R, Sheals K, Godinho CA. Evaluating the effectiveness of behavior change techniques in health-related behavior: a scoping review of methods used. Transl Behav Med. 2018b;8(2):212–24. https://doi.org/10.1093/tbm/ibx019.

  37. Murphy SA. An experimental design for the development of adaptive treatment strategies. Stat Med. 2005;24(10):1455–81. https://doi.org/10.1002/sim.2022.

  38. Murray E, Hekler EB, Andersson G, Collins LM, Doherty A, et al. Evaluating digital health interventions: key questions and approaches. Am J Prev Med. 2016;51(5):843–51. https://doi.org/10.1016/j.amepre.2016.06.008.

  39. Nahum-Shani I, Smith SN, Spring BJ, Collins LM, Witkiewitz K, Tewari A, Murphy SA. Just-in-Time Adaptive Interventions (JITAIs) in mobile health: key components and design principles for ongoing health behavior support. Ann Behav Med. 2017;52(6):446–62. https://doi.org/10.1007/s12160-016-9830-8.

  40. Norris E, Finnerty AN, Hastings J, Stokes G, Michie S. A scoping review of ontologies related to human behaviour change. Nat Hum Behav. 2019;3(2):164–72. https://doi.org/10.1038/s41562-018-0511-4.

  41. O’Cathain A, Thomas KJ, Drabble SJ, Rudolph A, Hewison J. What can qualitative research do for randomised controlled trials? A systematic mapping review. BMJ Open. 2013;3(6):e002889. https://doi.org/10.1136/bmjopen-2013-002889.

  42. Pellegrini CA, Hoffman SA, Collins LM, Spring B. Optimization of remotely delivered intensive lifestyle treatment for obesity using the Multiphase Optimization Strategy: Opt-IN study protocol. Contemp Clin Trials. 2014;38(2):251–9. https://doi.org/10.1016/j.cct.2014.05.007.

  43. Petit-Zeman S, Firkins L, Scadding JW. The James Lind Alliance: tackling research mismatches. Lancet. 2010;376(9742):667–9.

  44. Piper ME, Fiore MC, Smith SS, Fraser D, Bolt DM, et al. Identifying effective intervention components for smoking cessation: a factorial screening experiment. Addiction. 2016;111(1):129–41. https://doi.org/10.1111/add.13162.

  45. Riley L, Guthold R, Cowan M, Savin S, Bhatti L, Armstrong T, Bonita R. The World Health Organization STEPwise approach to noncommunicable disease risk-factor surveillance: methods, challenges, and opportunities. Am J Public Health. 2015;106(1):74–8. https://doi.org/10.2105/AJPH.2015.302962.

  46. Rosala-Hallas A, Bhangu A, Blazeby J, Bowman L, Clarke M, et al. Global health trials methodological research agenda: results from a priority setting exercise. Trials. 2018;19(1):48. https://doi.org/10.1186/s13063-018-2440-y.

  47. Røttingen J-A, Regmi S, Eide M, Young AJ, Viergever RF, et al. Mapping of available health research and development data: what’s there, what’s missing, and what role is there for a global observatory? Lancet. 2013;382(9900):1286–307. https://doi.org/10.1016/S0140-6736(13)61046-6.

  48. Rutter H, Savona N, Glonti K, Bibby J, Cummins S, et al. The need for a complex systems model of evidence for public health. Lancet. 2017;390(10112):2602–4.

  49. Smith CT, Hickey H, Clarke M, Blazeby J, Williamson P. The trials methodological research agenda: results from a priority setting exercise. Trials. 2014;15(1):32.

  50. Watkins E, Newbold A, Tester-Jones M, Javaid M, Cadman J, et al. Implementing multifactorial psychotherapy research in online virtual environments (IMPROVE-2): study protocol for a phase III trial of the MOST randomized component selection method for Internet cognitive-behavioural therapy for depression. BMC Psychiatry. 2016;16(1):345. https://doi.org/10.1186/s12888-016-1054-8.

  51. World Health Organization. Noncommunicable diseases country profiles 2018. Geneva: World Health Organization; 2018. Retrieved 02/12/2018.


Acknowledgements

We acknowledge the assistance of Dr. Geneviève Szczepanik and Ms. Thalie Labonté, of the Montreal Behavioral Medicine Centre (www.mbmc-cmcm.ca), who helped with online survey support. We acknowledge the generous input of the experts who contributed to the generation of the list of items in Phase 1: Kenneth E Freedland, Linda M Collins, Paul Montgomery, Justin Presseau and Gregory Ninot.

Funding

This work was supported by the following awards held by MB: Health Research Board Ireland Research Leadership Award 2013 [RL-2013-8] and an Ireland Canada University Foundation James M Flaherty Visiting Professor Award 2016–17. The IBTN was developed with funding from the Canadian Institutes of Health Research (MPE 309504) and is supported by SLBs CIHR SPOR Chair (SMC 383472) and KLLs UQAM Behavioral Medicine Chair and Salary Award from the Fonds de la Recherche du Québec – Santé (FRQS).

Author information

MB, KLL, SLB and JMS made substantial contributions to the conception and design of the work. All authors made substantial contributions to the acquisition, analysis and interpretation of data. MB and OM drafted the paper and all authors reviewed it and substantively revised it. All authors have approved the submitted version. All authors have agreed both to be personally accountable for the author’s own contributions and to ensure that questions related to the accuracy or integrity of any part of the work, even ones in which the author was not personally involved, are appropriately investigated, resolved, and the resolution documented in the literature.

Correspondence to Molly Byrne.

Ethics declarations

Ethics approval and consent to participate

Ethical approval was granted by the National University of Ireland Galway Research Ethics Committee (reference: 17-Jun-13). Informed consent was obtained from all study participants.

Consent for publication

Not applicable

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

Byrne, M., McSharry, J., Meade, O. et al. An international, Delphi consensus study to identify priorities for methodological research in behavioral trials in health research. Trials 21, 292 (2020). https://doi.org/10.1186/s13063-020-04235-z


Keywords

  • Behavior change interventions
  • Research prioritization
  • Randomized controlled trials
  • Methodological research
  • Delphi study
