  • Study protocol
  • Open access

The effectiveness of a low-intensity problem-solving intervention for common adolescent mental health problems in New Delhi, India: protocol for a school-based, individually randomized controlled trial with an embedded stepped-wedge, cluster randomized controlled recruitment trial



Background

Conduct, anxiety, and depressive disorders account for over 75% of the adolescent mental health burden globally. The current protocol will test a low-intensity problem-solving intervention for school-going adolescents with common mental health problems in India. The protocol also tests, in an embedded recruitment trial, the effects of a classroom-based sensitization intervention on the demand for counselling services.


Methods

We will conduct a two-arm, individually randomized controlled trial in six Government-run secondary schools in New Delhi. The targeted sample is 240 adolescents in grades 9–12 with persistent, elevated mental health symptoms and associated distress/impairment. Participants will receive either a brief problem-solving intervention delivered over 3 weeks by lay counsellors (intervention) or enhanced usual care comprising problem-solving booklets (control). Self-reported adolescent mental health symptoms and idiographic problems will be assessed at 6 weeks (co-primary outcomes) and again at 12 weeks post-randomization. In addition, adolescent-reported distress/impairment, perceived stress, mental wellbeing, and clinical remission, as well as parent-reported adolescent mental health symptoms and impact scores, will be assessed at 6 and 12 weeks post-randomization. We will also complete a parallel process evaluation, including estimations of the costs of delivering the interventions.

An embedded recruitment trial will apply a stepped-wedge, cluster (class)-randomized controlled design in 70 classes across the six schools. This will evaluate the added effect of a classroom-based sensitization intervention over and above school-level sensitization activities on the primary outcome of referral rate into the host trial. Other outcomes will be the proportion of referrals eligible to participate in the host trial, proportion of self-generated referrals, and severity and pattern of symptoms among referred adolescents in each condition. Power calculations were undertaken separately for each trial. A detailed statistical analysis plan will be developed separately for each trial prior to unblinding.


Discussion

Both trials were initiated on 20 August 2018. A single research protocol for both trials offers a resource-efficient methodology for testing the effectiveness of linked procedures to enhance uptake and outcomes of a school-based psychological intervention for common adolescent mental health problems.

Trial registration

Both trials are registered prospectively with the US National Institutes of Health ClinicalTrials.gov registry (registration numbers NCT03633916 and NCT03630471, registered on 16 August 2018 and 14 August 2018, respectively).

Background

Adolescence is a critical period for the prevention and treatment of mental health problems. Around 10% of adolescents experience a mental disorder [1] and about half of all mental disorders have their onset by the mid-teens, rising to three-quarters by the mid-20s [2]. Effective early intervention is therefore vital to mitigate the substantial personal, familial, and societal costs of mental disorders [3]. Low- and middle-income countries (LMICs) are home to 90% of the world’s 1.3 billion adolescents, but there is a severe shortage of mental health services targeting this age group in most LMICs [4]. This includes India, which is home to one-fifth of the global population of adolescents. Resource constraints are compounded by low demand for mental health care and the scarcity of context-specific evidence on the effectiveness of interventions [5]. Although a robust body of research testifies to the treatability of adolescent mental disorders, mainly through psychological interventions such as cognitive behavioral therapy (CBT), the bulk of such evidence originates from high-income countries [6]. Generalizability of the existing evidence base to LMICs is further restricted by the widespread use of specialist providers in intervention trials, with supervision often provided directly by program developers [7].

Transdiagnostic approaches have been advocated as a means of providing more scalable psychological interventions [8], with emerging evidence (mainly from adult populations) supporting their use in LMICs [9, 10]. Transdiagnostic interventions recognize the considerable overlap that exists in the constituent elements of disorder-specific protocols and the abundance of shared risk and protective factors for psychopathology in general [11]. The available data suggest that transdiagnostic interventions may be comparable in effectiveness to their disorder-specific counterparts, although head-to-head comparisons are scarce [12]. There are also indications that transdiagnostic protocols may confer advantages in terms of improved efficiencies afforded by the parsimonious use of a single intervention framework for multiple problems [13], as well as meeting an expressed need among practitioners for therapies that are designed to fit ‘real-world’ settings where psychosocial complexity and comorbidity are commonplace [14].

The PRIDE (PRemIum for aDolEscents) research program involves linked studies in India with the goal of designing and evaluating a scalable transdiagnostic intervention model that addresses common mental health problems (i.e., anxiety, depression, and conduct difficulties) in school-going adolescents. The public health importance of adolescent mental health has been recognized in the National Adolescent Health Program (the Rashtriya Kishor Swasthya Karyakram) [15]. PRIDE was initiated in response to these national and global priorities, and challenges, for improving the quality and coverage of adolescent mental health interventions. The process of aligning the global evidence base on youth psychotherapies with local evidence followed recommendations from an earlier research program (PREMIUM) on psychological intervention development in low-resource settings, which led to the design and demonstration of the clinical effectiveness of two brief psychological treatments for adult mental health problems [16,17,18].

Our formative and pilot studies have informed the design of a stepped-care architecture involving two interventions of incremental intensity [19,20,21,22]. The current trial protocol focuses on the first step: a low-intensity problem-solving intervention designed for delivery by non-specialist school counsellors. Problem solving is strongly represented in the global literature, where it is among the most commonly used practice elements in evidence-based mental health programs for children and adolescents [23, 24]. It has been applied successfully as the main element in other low-intensity psychological interventions in LMICs [25, 26]. The emphasis on problem solving also reflects the primacy of psychosocial factors in adolescents’ narratives around explanatory models of distress and help-seeking [21]. Our provisional theory of change for the intervention draws on evidence-based principles of stress and coping [27], such that the impact of an ecological stressor is assumed to be mediated by appraisals of the stressor and of the repertoire of available coping resources. Our problem-solving intervention can be considered transdiagnostic in the sense that a single procedure is assumed to have generalized benefits for a diversity of clinical presentations. Non-responders to this first-line intervention will be offered a more intensive and dynamic transdiagnostic treatment incorporating additional cognitive and behavioral procedures. The effectiveness of the second step will be evaluated in a separate randomized trial for which participants will be recruited from a different school cohort.

As well as shaping the design of the two intervention steps, formative and pilot work suggested a need for awareness generation around the topics of mental health and psychological help-seeking. We therefore developed a sensitization plan to address factors such as low mental health literacy and confidentiality concerns, which might otherwise impede the demand for school mental health services. In so doing, we noted the lack of consistent evidence for the effects of school-based and other youth-focused mental health sensitization interventions. Existing approaches have varied considerably in their design and intensity [28, 29] and their ability to increase demand from adolescents for mental health care has yet to be established [30, 31]. We therefore identified an opportunity to test an additional component of the PRIDE intervention architecture—a classroom sensitization session led by school counsellors—by embedding a recruitment trial within a host intervention trial [32]. Where applicable, the distinctive features of the two trials are presented sequentially, structured according to the Standard Protocol Items: Recommendations for Interventional Trials (SPIRIT) guidelines [33]. Shared features of the two trials (e.g., data management) are presented under unified headings.

Objectives and hypotheses

Embedded recruitment trial

The primary objective of this stepped-wedge, cluster randomized controlled trial is to evaluate the impact of a classroom sensitization session (intervention condition), over and above school-level sensitization activities (control condition), on the rate of referred adolescents (i.e., the proportion of adolescents referred as a function of the total sampling frame in each condition) into the host trial. The primary hypothesis is that the classroom-level sensitization intervention will be associated with a higher referral rate into the host trial compared with referrals arising from school-level sensitization activities in isolation. The secondary hypotheses are that, compared with the control condition, the intervention condition will be associated with a greater proportion of referred students who meet eligibility criteria for inclusion in the host trial (Table 1) and a greater proportion of students who self-refer. We will also explore whether there are any differences between conditions in terms of the severity of total symptoms and symptom subtypes presented by referred adolescents.

Table 1 Eligibility criteria for the host trial

Host trial

The primary objective of this two-armed, parallel-design, individually randomized controlled trial is to evaluate the effectiveness of a low-intensity, problem-solving intervention (intervention arm) in reducing adolescent-reported mental health symptoms and idiographic problems at 6 weeks post-randomization, compared with enhanced usual care (control arm), for adolescents with common mental health problems. The primary hypothesis is that the problem-solving intervention will be superior to the control arm in reducing the severity of adolescent-reported mental health symptoms and idiographic problems at 6 weeks post-randomization.

The secondary objectives are:

  • To evaluate the effectiveness of the intervention on adolescent-reported distress/functional impairment, perceived stress, mental wellbeing, and clinical remission

  • To explore whether a theoretically informed a priori factor (perceived stress at 6 weeks) mediates the effects of the intervention on mental health symptoms and idiographic problems at 12 weeks

  • To explore the effectiveness of the intervention on caregiver-reported adolescent mental health symptoms, associated distress/functional impairment, and adolescent-reported prosocial behavior

  • To evaluate intervention delivery processes in order to assist in the interpretation of the trial results and to inform potential implementation of the PRIDE interventions on a wider scale

  • To estimate the costs and cost-effectiveness of implementing the PRIDE interventions


Methods

The methods for both trials are described according to SPIRIT guidelines [33]. Completed SPIRIT checklists for the two trials are provided as Additional files 1 and 2.

Study setting

The two trials will be conducted in six Government-run secondary schools in New Delhi, India. The schools were purposively selected in consultation with the Department of Education, Government of New Delhi, to focus on relatively under-served, low-income communities. Of the six schools, three are boys’ schools, two are girls’ schools and one is co-educational. Each school contains grades 6–12, of which grades 9–12 will be the focus of this research. As of August 2018, there were 172 classes in grades 9–12 with a total student population of 8448 (ranging from 1050 to 1632 per school; mean = 1408; standard deviation (SD) = 225), including 4694 (56%) boys and 3754 (44%) girls.


Figures 1 and 2 summarize the participant timeline and flow for the embedded recruitment trial as per CONSORT guidelines for reporting stepped-wedge, cluster-randomized controlled trials [36]. Figure 3 presents the CONSORT diagram for the host trial [37].

Fig. 1
figure 1

Illustration showing implementation of the control and intervention conditions in the embedded recruitment trial. The white boxes indicate the group of classes in the control condition and the colored boxes indicate the group of classes in the intervention condition. *0 = control condition; 1 = intervention condition

Fig. 2
figure 2

CONSORT flowchart for the embedded recruitment trial. The white boxes indicate the group of classes in the control condition and the colored boxes indicate the group of classes in the intervention condition

Fig. 3
figure 3

CONSORT flowchart for the host trial

Embedded recruitment trial

Seventy classes will participate in the embedded recruitment trial. These classes will be selected at random using computer-generated random numbers, stratified by school and grade, drawing from a pool of 118 eligible classes (excluding 54 classes that had received sensitization during earlier pilot work in these schools). The participating 70 classes will be randomized to receive the control and intervention conditions across two sequences. A small block size of 2 will be used to allocate the 70 classes across the two sequences in order to ensure balance, as the number of classes within each grade from the individual schools is relatively small. In the rare instance that a selected class has been dissolved or merged with another class, the next class in the random list will be included to replace the unavailable class. Each sequence will be implemented over three consecutive 4-week intervals (excluding holidays and exam breaks). Thus, each class will switch over from the control to the intervention condition at 4-week intervals, over two steps (Fig. 1). Schedules for sensitization in the allocated classes will be shared with the schools in advance to ensure access.
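The class-allocation procedure described above can be sketched as follows. This is a minimal illustration, not the trial's actual code; the stratum keys, class labels, and random seed are placeholders:

```python
import random

def allocate_classes(classes_by_stratum, seed=2018):
    """Allocate classes to two sequences using permuted blocks of size 2,
    within strata defined by school and grade. A sketch only: stratum keys,
    class labels, and the seed are placeholders, not trial data."""
    rng = random.Random(seed)
    allocation = {}
    for stratum, classes in classes_by_stratum.items():
        for i in range(0, len(classes), 2):
            block = classes[i:i + 2]
            sequences = [1, 2]
            rng.shuffle(sequences)           # random order within each block
            for cls, seq in zip(block, sequences):
                allocation[cls] = seq        # sequence 1 or 2
    return allocation
```

With blocks of size 2, every pair of classes within a stratum contributes one class to each sequence, which is why the small block size preserves balance despite the small number of classes per school and grade.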

Host trial

Referrals to the host trial will be generated through a combination of self-referrals, teacher referrals, and referrals made by friends, siblings, and/or parents. These referrals will be drawn from the 70 classes sampled in the embedded recruitment trial, with additional participants recruited from the remaining 102 classes as needed. The precise schedule of recruitment activities in the latter classes will be calibrated according to referral patterns and caseload capacity for intervention providers in the various schools. When initiating a self-referral, students will have the option to either approach a counsellor directly or else post a completed referral form/written note in a secure drop-box. The school counsellor will also serve as a central point of contact for other potential referrers, and will offer referral forms on request. All referred adolescents will be followed up by a researcher and screened for eligibility to participate in the host trial (Table 1).

Consenting participants (see section on consent procedures below) will be enrolled by researchers and randomized to the intervention or the control arm after baseline outcome assessments are completed. Participants randomized to the intervention arm will be escorted by a researcher to meet the counsellor in an adjacent room/cubicle, ensuring an efficient and discreet handover. The randomization list will be developed by an independent statistician (HW), applying stratification by school (and by gender for the co-educational school) and using randomly sized blocks of four or six. The randomization code will be concealed using sequentially numbered opaque sealed envelopes to maximize allocation concealment. Errors in randomization will be recorded and reported.
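As an illustration of the permuted-block scheme described above (our sketch, not the statistician's actual procedure; arm labels and the seed are placeholders), a 1:1 list for a single stratum could be generated as:

```python
import random

def randomization_list(n, seed=42):
    """Generate a 1:1 randomization list for one stratum (e.g., one school)
    using randomly ordered permuted blocks of size 4 or 6. A sketch only:
    arm labels and the seed are placeholders, not the trial's actual list."""
    rng = random.Random(seed)
    arms = []
    while len(arms) < n:
        size = rng.choice([4, 6])                        # randomly sized blocks
        block = ["intervention", "control"] * (size // 2)
        rng.shuffle(block)                               # permute within block
        arms.extend(block)
    return arms[:n]   # truncating mid-block can leave a small imbalance
```

Randomly varying the block size (4 or 6) makes the next allocation harder to predict than fixed-size blocks, which supports allocation concealment when combined with the sealed-envelope procedure.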

Sample size and power calculations

Embedded recruitment trial

We based our power calculation on a within-period comparison [38] for a stepped-wedge design, using the Stata package “clustersampsi”. Based on pilot data, we anticipated referral rates of 5% and 15% in the control and intervention conditions, respectively, with an intra-cluster correlation coefficient (ICC) of 0.124. Given the short follow-up period, we assumed the same value for the between-time correlation; in practice it may be smaller than 0.124, and both ICCs will be reported. Using these parameters, a sample size of 70 classes (average class size of 50 students) will have 92% power to detect a difference of 10 percentage points (treating the outcome as a binomial variable) at a significance level of 0.05.
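As a rough cross-check of these parameters, the calculation can be approximated with a simple cluster-design effect, ignoring the stepped-wedge between-period correlation (a simplifying assumption; the protocol's own calculation used Stata's "clustersampsi"):

```python
from math import sqrt
from statistics import NormalDist

# Rough within-period power check for the embedded trial, using a simple
# cluster-design effect and ignoring the stepped-wedge between-period
# correlation (the protocol's own calculation used Stata's "clustersampsi").
p_control, p_intervention = 0.05, 0.15
icc, cluster_size = 0.124, 50
classes_per_condition = 35   # assumes the 70 classes split evenly per period

deff = 1 + (cluster_size - 1) * icc            # design effect ~7.1
n_effective = classes_per_condition * cluster_size / deff

variance = p_control * (1 - p_control) + p_intervention * (1 - p_intervention)
z_alpha = NormalDist().inv_cdf(1 - 0.05 / 2)   # two-sided, alpha = 0.05
z = abs(p_intervention - p_control) * sqrt(n_effective / variance) - z_alpha
power = NormalDist().cdf(z)
```

This crude approximation yields power in the mid-90s, in the same region as the reported 92%; the exact figure differs because the stepped-wedge calculation additionally accounts for between-period correlation.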

Host trial

Sample size estimations were produced for two co-primary outcomes: severity of adolescent-reported mental health symptoms measured by the Total Difficulties score on the Strengths and Difficulties Questionnaire (SDQ) and severity of idiographic problems measured by the Youth Top Problems (YTP; Table 2). We based the estimations on two data sources. First, we obtained uncontrolled effect sizes (ES = difference in means/pooled SD) for both co-primary outcomes from a group of 52 adolescents who received the problem-solving intervention during pilot work in the six secondary schools in New Delhi. Among these students, all of whom met the same baseline eligibility criteria as intended for the current trial, the mean SDQ Total Difficulties score changed from 23.4 (SD 3.4) at baseline to 16.1 (SD 5.9) at the end of the intervention (ES = 1.4). The mean YTP score for the same group changed from 5.6 (SD 2.0) at baseline to 2.9 (SD 2.6) at the end of the intervention (ES = 0.9). Second, we obtained a paired effect size on the SDQ Total Difficulties score from another cohort of 47 adolescents participating in a later phase of piloting, including 29 students who received the problem-solving intervention and 18 waitlisted controls (ES = 1.0). YTP data were unavailable for this second cohort. Because effect sizes in full-scale trials are often smaller than those observed in pilot studies, we conservatively hypothesized an ES of 0.5 on both co-primary outcomes and powered the trial at 90% to detect this difference. We assumed a 1:1 allocation ratio of individual participants within each of the six schools, loss to follow-up of 15% over 6 weeks (based on piloting), and a Bonferroni correction to adjust for multiple primary outcomes. Based on these assumptions, we determined that N = 240 participants would be required. This sample size also provides 80% power to detect an ES of 0.44.
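The sample-size arithmetic can be reconstructed with standard two-sample normal approximations. This is our reconstruction under the stated assumptions, not the authors' code:

```python
from math import ceil, sqrt
from statistics import NormalDist

# Reconstruction of the host-trial sample-size arithmetic using standard
# two-sample normal approximations (our sketch, not the authors' code).
effect_size = 0.5      # conservative hypothesized ES on both co-primary outcomes
alpha = 0.05 / 2       # Bonferroni correction for two co-primary outcomes
attrition = 0.15       # anticipated 15% loss to follow-up at 6 weeks

z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # two-sided test
z_beta = NormalDist().inv_cdf(0.90)             # 90% power
n_per_arm = ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)   # ~100 per arm
n_total = 2 * n_per_arm / (1 - attrition)                       # ~236 before rounding

# Minimum detectable ES at 80% power with 240 enrolled and 15% attrition
n_complete_per_arm = 240 * (1 - attrition) / 2
detectable_es = (z_alpha + NormalDist().inv_cdf(0.80)) * sqrt(2 / n_complete_per_arm)
```

Under these approximations the arithmetic lands close to the reported figures: roughly 100 analysable participants per arm, a total of about 236 after inflating for attrition (consistent with the target of 240), and a minimum detectable ES of about 0.43 at 80% power.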


Interventions

Embedded recruitment trial

Intervention condition

This will comprise a one-off, 30-min classroom session intended to improve understanding of the signs and symptoms of mental health problems, raise awareness about the school counselling service, and generate demand for the service. The session will be delivered to individual classes (approximately 50 students per class) by a counsellor (drawn from the same group responsible for the problem-solving intervention in the host trial) with assistance from a researcher, who will have additional responsibilities for processing referrals and conducting eligibility assessments. The classroom session will start with a short animated video, which provides age-appropriate information about the types, causes, and impacts of common mental health problems and ways of coping with them. The video will be followed by a guided group discussion, structured around a standardized script that builds on the topics covered in the video. If technical difficulties prevent the video from being shown, the counsellor will use a flipchart based on printed images from the video. At the end of the session, students will be handed a self-referral form which includes normalizing information and question-based prompts to assist with self-identification of mental health problems. Interested students can approach the facilitators immediately after the session with completed self-referral forms, or else deposit the forms discreetly in a secure drop-box located outside or near the counsellor’s usual room.

The counsellors and researchers delivering the classroom sensitization sessions will be provided with a structured manual and will complete one day of office-based training. Training will be conducted by master’s-level psychologists (who will also serve as supervisors) and will comprise lectures, demonstrations, and role-plays. The training will be followed by a period of supervised field practice, during which the counsellors and researchers will be required to complete at least two classroom sessions independently under direct observation by supervisors. Fidelity of intervention delivery will be assessed using a checklist of observable procedures distilled from the intervention manual. Each procedure will be rated on a three-point Likert scale (not completed, partially completed, fully completed). A ‘refresher’ training session will also be conducted before the trial begins.

Control condition

This will comprise whole-school sensitization activities. The supervisor will meet the principal of each school individually to inform them about planned counselling and research activities and to seek their cooperation. This meeting will also provide structured information about common mental health problems faced by adolescents and address any concerns related to planned procedures and resource demands. Teachers will be invited to participate in separate group sensitization meetings (up to 30 teachers at a time). A standardized script will mirror the topics covered in the meetings with the school principals, but with additional emphasis on referral procedures for the host trial. Up to three meetings will be held in each school to maximize coverage of teaching staff. These meetings will be conducted by the same counsellor-researcher pairings responsible for delivering the classroom intervention. Posters will be placed in highly visible locations such as noticeboards and common corridors, in addition to signage on the drop-box, to remind students (and teachers) of the counselling service.

Host trial

Intervention arm

A problem-solving intervention will be delivered to individual students across four to five face-to-face sessions spread over 3 weeks. Each session will last for up to 30 min (aligned with the usual duration of school periods) and will be delivered in the local language (Hindi). The sessions will be conducted on school premises in private rooms or, where private rooms are not available, behind screens and curtains in a suitable location (e.g., the school library). Such contingencies to address space limitations were piloted in earlier work and deemed to be feasible and acceptable in the local context, enabling a temporary counselling space in which students would not be on direct view.

Session 1 will focus on fostering engagement, understanding the participant’s difficulties, and introducing the structure and process of the intervention. Over the next three sessions, the participant will be helped to learn and apply a structured problem-solving strategy involving three steps (following the acronym “POD”): (1) identify and prioritize distressing/impairing problems (“Problem identification”); (2) generate and select coping options for modifying the identified problem directly (problem-focused strategies) and/or the associated stress response (emotion-focused strategies) (“Option generation”); and (3) implement and evaluate the outcome of this strategy (“Do it”). The intervention may be concluded after four sessions or else extended to a fifth session, depending on the adolescent’s preferences and logistical barriers to intervention completion such as exam breaks and holidays. The concluding session will focus on consolidating learning and generalizing problem-solving skills across different contexts. With permission, all sessions will be audio-recorded for office-based quality and fidelity assessments. Adolescents will be encouraged to practice problem-solving skills between the sessions, aided by a set of three “POD booklets” which explain problem solving using illustrated vignettes and suggest corresponding between-session practice exercises. The booklets (each corresponding to one of the steps of problem solving) will be distributed sequentially over the first three intervention sessions. In the concluding session, the adolescent will be additionally handed a full-color POD poster that summarizes the three steps of problem-solving.

Each school will have one or two counsellors, depending on demand. The counsellors will be Hindi-speaking college graduates aged 18 years or above, with no formal training or qualifications related to psychotherapy or mental health. They will be recruited through online job portals commonly used in the NGO/public sector in India. Selection will be based on reasoning capacity (assessed by written test) and interpersonal skills (assessed by structured role-plays and interview). Selected candidates will receive an intervention manual and complete one week of classroom-based training involving a combination of lectures, demonstrations, and role-plays. This will be followed by a minimum 6-week period of field training in which counsellors will carry out casework (with at least four cases) under the supervision of psychologists. Trainees’ performance will be evaluated using structured role-plays at the end of classroom-based training, as well as supervisors’ ratings of audio-recorded intervention sessions.

Counsellors will participate in weekly peer group supervision meetings, based on an approach tested in the PREMIUM trials, where it was found to be an acceptable, effective, and scalable supervision model for lay counsellors in low-resource settings [39]. Each 2-h meeting will be facilitated by one of the counsellors in rotation and overseen by a supervisor. Counsellors will review and discuss one or two audio-recorded sessions in each meeting. Audio-recordings will be rated by all group members using a therapy quality rating scale that incorporates elements from two established scales [40, 41] and assesses skills specific to problem solving as well as non-specific therapeutic skills (e.g., empathic understanding). Recurrent skills deficits noted by supervisors will be addressed through supplementary training workshops held on a monthly basis. The supervision schedule will ensure a representative selection of audio-recorded sessions, with the intention that all counsellors should receive equal opportunities to discuss their cases. In addition, supervisors will undertake weekly telephone calls (lasting 20–30 min) with each counsellor in order to monitor the progress of their caseload, and identify and manage risks. The counsellors will be able to initiate ad hoc calls if urgent consultation is needed on any case.

Control arm

There are no mental health services in the participating schools. A standardized control arm was devised accordingly, keeping in mind the requirement to offer a pragmatic, resource-efficient mode of support with minimal risk of contamination between trial arms. In terms of contamination, a recent scoping review of complex intervention trials in mental health [42] found that the principal processes leading to contamination were the same clinicians treating participants across conditions and communication between clinicians/participants. Moreover, the review recommended that methods other than cluster randomization should be considered to minimize contamination, given the lack of evidence for a link between the level of randomization and intervention effect size.

Participants allocated to the control arm will therefore receive the same printed problem-solving materials used in the intervention arm, but without any counsellor contact. Immediately following random allocation to this condition, a researcher (rather than a counsellor) will provide a set of POD booklets and explain their purpose and contents using a standardized script. Students will be encouraged to read through the booklets in sequence and complete the specified practice exercises. No further guidance will be provided. In this way, all trial participants will receive the POD booklets, thereby eliminating the likeliest source of contamination. The counselling process itself is less likely to spill over, as it will be delivered in a one-to-one format, and our formative and pilot work showed that students place a strong emphasis on confidentiality (mentioned earlier), making disclosure of counselling experiences unlikely.

Screening and outcome measures

Embedded recruitment trial

The primary outcome (referral rate based on the proportion of referred adolescents as a function of the total sampling frame in each condition) will be collated from referral logs maintained by researchers in each school. Referral data will be aggregated over each 4-week calendar period. Students deemed ineligible for participation in the host trial will be allowed to re-refer themselves after a gap of 4 weeks, offering a suitable time period to re-assess mental health status in line with the host trial’s inclusion criterion about symptom chronicity (Table 1). Secondary outcomes pertaining to the eligibility and clinical characteristics of students referred to the host trial will be derived from screening data on the SDQ (see below).

Host trial

All screening and outcome assessments will be undertaken using standardized self-report measures that have been translated into Hindi. Clinical eligibility criteria (i.e., severity, chronicity, and impacts of mental health symptoms) will be assessed using the adolescent-reported form of the SDQ (including the Impact Supplement). The same screening data will also serve as the baseline SDQ/Impact Supplement outcomes for eligible participants who are subsequently enrolled in the trial; baseline assessments for other outcome measures will be completed as soon as possible after completing consent procedures (ideally within 2 working days). The adolescent-reported SDQ/Impact Supplement will be repeated at 6 and 12 weeks post-randomization, along with the parent-reported SDQ/Impact Supplement, and adolescent-reported Youth Top Problems (YTP) [43], Perceived Stress Scale-4 (PSS-4) [44] and Short Warwick-Edinburgh Mental Wellbeing Scale (SWEMWBS) [45]. These measures are described in Table 2. The SDQ will also serve as the basis for assessing remission at both end-points, defined as falling below cut-offs for eligibility on both the SDQ Total Difficulties score and Impact score.
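The remission rule amounts to a conjunction of two threshold checks. A minimal sketch, with placeholder cut-offs since the actual eligibility thresholds (specified in Table 1) are not reproduced in this excerpt:

```python
def in_remission(total_difficulties, impact_score,
                 td_cutoff=19, impact_cutoff=2):
    """Remission = scoring below the eligibility cut-offs on BOTH the SDQ
    Total Difficulties score and the Impact score. The cut-off values here
    are placeholders; the trial's actual thresholds appear in Table 1."""
    return total_difficulties < td_cutoff and impact_score < impact_cutoff
```

Requiring both conditions means a participant whose symptoms fall but whose reported impact remains elevated (or vice versa) is not classified as remitted.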

Table 2 Outcome measures in the host trial

Process measures

Process data on enrollment, randomization, and assessment procedures in both trials will be obtained from researcher-completed record forms. These will be collated to obtain assent/consent rates of adolescents and parents (and reasons for missing assent/consent); randomization rates (and reasons for randomization errors); completion rates of baseline and follow-up outcome assessments (and reasons for non-completion); and time lags between intended and completed assessments (and reasons for deviating from targets). In addition, motivations for help-seeking and expectancies for the school counselling program will be explored at the time of eligibility assessment through a brief qualitative interview with a sub-sample of referred students. Assent/consent to use the interview data in the research will be obtained as part of the consent process for the embedded recruitment trial.

Intervention processes will be assessed using additional data sources. In the embedded recruitment trial, counsellor-completed record forms will provide data on key participation indicators, including attendance rates and duration for all teacher meetings and classroom sensitization sessions, in addition to the numbers of posters and drop-boxes installed in the schools.

In the intervention arm of the host trial, counsellor-completed session record forms will be used to obtain process data on duration, spacing, and frequency of attended sessions (and reasons for non-attendance); and intervention uptake and completion rates (and reasons for pre-intervention and mid-intervention drop-out). Participants’ adherence to intervention activities and potential engagement challenges will be assessed using checklists within the same record forms, indicating whether or not the student completed practice exercises at home; used the POD booklets at home; brought the POD booklets to the session; and demonstrated understanding of POD booklets and session content. Use of POD booklets will be assessed in each arm of the trial at 6- and 12-week follow-up using a brief adolescent-reported measure that asks about estimated frequency of home use and perceived helpfulness of POD booklets in the preceding 6 weeks. Service satisfaction data will also be obtained from participants in each trial arm at 12 weeks using an eight-item service satisfaction questionnaire [53]. Three supplementary questions will elicit open-ended written feedback on the most preferred aspects of the service, potential areas for improvement and recommended changes.

Intervention quality and fidelity will be assessed in both trials using independent ratings of audio-recorded sessions. For the classroom sensitization intervention, 20% of all recordings will be selected at random and rated by a psychologist who is not directly involved with supervision of the intervention providers. A similar approach will be taken with the problem-solving intervention, for which 10% of all audio-recorded sessions will be rated independently. Reliability of the independent raters will be established initially by comparison with intervention quality and fidelity ratings from supervisors (see above).


Embedded recruitment trial

The researchers who co-facilitate the classroom sensitization sessions will also record referrals and conduct the host trial eligibility assessments. Blinding of the outcome assessors will therefore not be possible.

Host trial

Baseline and outcome assessments will be conducted by separate teams of researchers. All trial investigators, apart from the data manager (BB), will be blind to allocation status until the trial arms are revealed in the presence of both the Trial Steering Committee (TSC) and Data Safety and Monitoring Committee (DSMC). However, unblinding of individual participants may be undertaken if requested by the DSMC (e.g., in case of a serious adverse event).

Data collection, management, and analysis

Data collection

There will be a seamless flow of adolescents from the embedded recruitment trial to the host trial. The schedules for enrollment, interventions, and assessments are summarized in separate SPIRIT diagrams for the embedded recruitment trial (Fig. 4) and host trial (Fig. 5). A team of school-based researchers will process the referrals, undertake eligibility assessments for the host trial (within a target of ≤ 3 working days from the date of referral) and obtain adolescent assent/consent (within the same day if possible). A separate team of community-based researchers will visit parents/guardians (within a target of ≤ 2 working days after confirming an adolescent’s eligibility) to obtain consent and complete baseline outcome assessments (within the same day if possible). The school-based research team will complete baseline outcome assessments with adolescents once all consent procedures are completed (within a target of ≤ 2 working days). All assessment procedures should therefore be completed within 7 working days from the date of referral.

Fig. 4 SPIRIT figure for the embedded recruitment trial

Fig. 5 SPIRIT figure for the host trial

The community-based research team (blinded to allocation) will complete follow-up assessments at 6 and 12 weeks post-randomization. Assessments will take place at participants’ homes or other convenient locations, within a maximum period of 7 calendar days from the due date. Researchers will make up to four attempts for each scheduled contact.

Process data from researchers’ logs and counsellors’ session records will be captured on paper forms. All other measures, except for the YTP (which rates idiographic problems and does not readily lend itself to a digital format), will be administered via a tablet computer.

Data management

Data will be collected digitally using the customized STAR software program [54], and will be remotely uploaded as comma-separated values (CSV) files on a secured server. The date and time stamps for original data entry will be included, and an audit trail documenting any subsequent changes will be maintained. All paper-based data will be entered manually into SQL-based Epi Info forms and linked by participant ID with digitally collected data. Range and consistency checks will be performed at weekly intervals, with all inconsistencies and corrections logged to maintain an audit trail. All data will be anonymized and backed up on external hard disks daily. All session audio-recordings will be linked with the participant ID and stored in a separate, secure, password-protected folder. A separate password-protected file linking names, participant IDs, and the random allocation code will be maintained securely by the data manager and will not be accessed until the unblinding of the trial. All data will be shared in an encrypted form in password-protected files and through secure electronic transfer, when necessary.

Data analysis

Quantitative analysis will be conducted using STATA (version 15). A detailed analysis plan will be agreed with the DSMC before any analysis is undertaken. Findings will be reported as per CONSORT guidelines [37] for the host trial, and the CONSORT extension for reporting of stepped-wedge, cluster-randomized trials for the embedded recruitment trial [36].

Embedded recruitment trial

The baseline characteristics of the 70 participating classes, including class size and gender composition, will be described and assessed for any systematic differences between the two sequences. The primary outcome will be analyzed using generalized estimating equations (GEE) with robust standard errors. GEE is a recommended method for the analysis of stepped-wedge, cluster-randomized controlled trials, providing population-averaged effects of exposure across trial conditions [55]. GEE allows for longitudinal data analysis without resorting to fully specified random-effects models and can be applied to both continuous and categorical outcomes [56]. It provides parameter estimates and standard errors that are corrected for clustering and remain consistent even if the correlation structure is misspecified. For this trial, clustering will be specified at the class level. Analyses of the secondary and exploratory outcomes will also use the GEE method. A sensitivity analysis will be conducted using a ‘within-period comparison’ [38] of data from the second period only. No interim analyses will be undertaken.
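The ‘within-period comparison’ can be illustrated with cluster-level summaries: a referral rate is computed for each class in the second period, and the two conditions are compared directly. The sketch below uses invented class-level data and a simple difference in means; the actual analysis will follow the agreed statistical analysis plan.

```python
# Hedged sketch of a within-period comparison using cluster-level summaries
# (second period only). All data values are invented for illustration.
from statistics import mean

# (referrals, class_size) per class in period 2, by condition (invented data)
sensitized = [(6, 40), (9, 45), (7, 38)]  # classes already sensitized
not_yet = [(2, 42), (3, 39), (1, 41)]     # classes awaiting sensitization

def class_rates(classes):
    """Cluster-level summary: referral rate for each class."""
    return [referrals / size for referrals, size in classes]

# Compare conditions directly on the class-level summaries:
effect = mean(class_rates(sensitized)) - mean(class_rates(not_yet))
print(f"Within-period difference in mean referral rate: {effect:.3f}")
```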

Host trial

The trial flowchart will include the number of students referred, screened, eligible, randomized, followed up at the 6-week and 12-week endpoints, and analyzed for the primary outcomes. The numbers refusing participation or excluded (with reasons), actively withdrawing, and passively lost to follow-up will be shown by arm. Baseline characteristics will be summarized by means (standard deviation), medians (interquartile range), or numbers and proportions, as appropriate, for relevant subgroups (defined by age, gender, and baseline outcome score). For continuous outcomes, histograms within each arm will be plotted to assess normality and determine whether transformation is required.

The primary analyses will be on an intention-to-treat basis at the 6-week end-point, adjusted for baseline values of the outcome measure; school (as a fixed effect in the analysis) to allow for within-school clustering; counsellor variation (as a random effect); variables for which randomization did not achieve reasonable balance between the arms at baseline; and variables associated with missing outcome data [57]. Analyses of outcomes will be conducted using linear mixed-effects regression models for continuous outcomes with normally distributed errors (e.g., SDQ Total Difficulties score) and generalized (logistic) mixed-effects regression models for binary outcomes (e.g., remission rate). Intervention effects will be presented as adjusted mean differences and effect sizes (ES), defined as standardized mean differences. We will use 95% confidence intervals (CIs) for continuous outcomes, and adjusted odds ratios with 95% CIs for binary outcomes. Additionally, intervention effects for students who receive fewer sessions than prescribed will be estimated using the Complier Average Causal Effect structural equation model [58]. Repeated-measures analysis will be used to analyze data from the two end-points (6 and 12 weeks). Initial models will include an interaction effect between arm and time to allow for differential effects at these two end-points. This will be retained if there is evidence of effect modification by time. No interim analyses of outcomes will be undertaken.
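As a minimal illustration of the planned effect-size metric, the snippet below computes a standardized mean difference from an adjusted mean difference and a pooled standard deviation. All numbers are invented; they are not trial estimates.

```python
# Illustrative effect-size (ES) calculation: adjusted mean difference
# standardized by the pooled standard deviation across the two arms.
import math

def pooled_sd(sd1: float, n1: int, sd2: float, n2: int) -> float:
    """Pooled standard deviation across the two trial arms."""
    return math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))

def standardized_mean_difference(mean_diff, sd1, n1, sd2, n2):
    """ES = adjusted mean difference / pooled SD."""
    return mean_diff / pooled_sd(sd1, n1, sd2, n2)

# e.g., a hypothetical 2.5-point adjusted difference on the SDQ Total
# Difficulties score, with invented arm SDs and the targeted sample split:
es = standardized_mean_difference(2.5, sd1=5.0, n1=120, sd2=5.5, n2=120)
print(f"ES = {es:.2f}")
```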

We will explore potential moderators of intervention effects, with respect to a priori defined modifiers (chronicity of mental health difficulties, severity of mental health difficulties, YTP type, and SDQ caseness profile). We will fit relevant interaction terms and test for heterogeneity of intervention effects in regression models. A mediation analysis will be conducted to examine whether the theoretically driven a priori factor (perceived stress at 6 weeks) mediates the effects of the intervention on mental health symptoms and idiographic problems at 12 weeks.

Process evaluation

We will undertake descriptive statistical analysis of quantitative process data to explore the differential implementation of intervention procedures. In addition, thematic analysis will be used to code and organize qualitative interview data on service expectancies (assessed prior to enrolment in the host trial) and qualitative written feedback on service satisfaction (assessed at 12-week follow-up in the host trial). Findings from the various data sources will be triangulated and used to develop explanatory hypotheses about potential differences in intervention delivery and participation across schools, subgroups of participants, and providers. Process evaluation findings will be used to facilitate interpretation of the main trial results. The trial statisticians may conduct further analyses to test hypotheses generated from integration of the process evaluation and trial outcome data; these will necessarily be post hoc and identified as such in any subsequent publications.

Cost-effectiveness analysis

An economic evaluation will be conducted to estimate the costs and incremental cost-effectiveness of the problem-solving intervention. A combination of top-down and ingredients-based costing approaches will be used to generate cost estimates for the whole package, and for each package component (e.g., counselling sessions and POD booklets), in the intervention and control arms. All costs will be estimated from the providers’ perspective (the schools and the implementing partner Sangath); financial and economic costs will be calculated for all inputs (e.g., materials, training, supervision, staff time, overheads). The cost analysis will assess the costs of setting up and running the interventions; the distribution of costs across different forms of inputs; the unit cost per student/adolescent reached; the cost per additional case remitted; the cost of delivering all activities in intervention schools; and the cost per unit of measure for selected primary and secondary outcomes. We will estimate the incremental cost-effectiveness of the intervention relative to the control condition (enhanced usual care). The resulting cost-effectiveness measure will be compared with similar school programs in the region and will inform program replication, scalability, and financial sustainability.
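As an illustration of the ingredients-based costing logic, the sketch below computes the unit cost per student reached and the cost per additional case remitted (an incremental cost-effectiveness ratio). All cost categories and figures are hypothetical placeholders, not trial data.

```python
# Hypothetical ingredients-based costing sketch. Figures are placeholders;
# the economic evaluation will use actual financial and economic cost data
# collected from the providers' perspective.

inputs = {  # hypothetical costs per input category (arbitrary currency unit)
    "materials": 1500.0,
    "training": 4000.0,
    "supervision": 6000.0,
    "staff_time": 12000.0,
    "overheads": 2500.0,
}

total_cost = sum(inputs.values())
students_reached = 240        # targeted host-trial sample
extra_cases_remitted = 30     # hypothetical additional remissions vs control
control_cost = 5000.0         # hypothetical cost of enhanced usual care

unit_cost_per_student = total_cost / students_reached
# Incremental cost-effectiveness ratio (ICER):
# (intervention cost - control cost) / additional cases remitted
icer = (total_cost - control_cost) / extra_cases_remitted

print(f"Unit cost per student: {unit_cost_per_student:.2f}")
print(f"Cost per additional case remitted (ICER): {icer:.2f}")
```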

Results will be plotted on a cost-effectiveness plane and presented as cost-effectiveness acceptability curves to show the probability of the intervention being cost-effective at a range of willingness-to-pay threshold levels. A sensitivity analysis will be conducted to take account of uncertainty and imprecision in the measurements.
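A cost-effectiveness acceptability curve can be sketched, under assumptions, as the share of bootstrap replicates in which the intervention's net monetary benefit is positive at each willingness-to-pay (WTP) threshold. The per-participant incremental costs and effects below are simulated placeholders, not trial data.

```python
# Sketch of a cost-effectiveness acceptability curve (CEAC) via bootstrap.
# All inputs are simulated; real analyses would use observed trial data.
import random

random.seed(0)

def bootstrap_ceac(delta_costs, delta_effects, wtp_grid, n_boot=1000):
    """P(intervention is cost-effective) at each WTP threshold."""
    n = len(delta_costs)
    curve = []
    for wtp in wtp_grid:
        hits = 0
        for _ in range(n_boot):
            # Resample participants with replacement and compute the mean
            # net monetary benefit: NMB = WTP * effect - cost.
            idx = [random.randrange(n) for _ in range(n)]
            nmb = sum(wtp * delta_effects[i] - delta_costs[i] for i in idx) / n
            hits += nmb > 0
        curve.append(hits / n_boot)
    return curve

# Simulated per-participant incremental costs and effects:
d_cost = [random.gauss(100, 40) for _ in range(50)]
d_eff = [random.gauss(0.4, 0.3) for _ in range(50)]
ceac = bootstrap_ceac(d_cost, d_eff, wtp_grid=[0, 250, 500, 1000])
print(ceac)
```

Plotting the probabilities in `ceac` against the WTP grid yields the acceptability curve; the probability of cost-effectiveness rises as the willingness-to-pay threshold increases.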

Trial governance

Monitoring and governance for both trials will be provided by a Trial Management Group (TMG; comprising senior investigators and project staff involved in day-to-day coordination of research activities), TSC (comprising senior investigators and independent subject experts), and DSMC (a fully independent group with relevant clinical and trials expertise). The TMG and TSC will review trial process indicators (e.g., rates of screening, eligibility, consent, outcome assessments, adverse events) fortnightly and quarterly, respectively. The independent DSMC will meet at the outset of the two linked trials and again at the time of unblinding the trial results, as well as receiving reports of emergent serious adverse events (as per criteria below). Any trial protocol amendments will be agreed and formulated in conjunction with the TSC and DSMC and submitted to relevant Institutional Review Boards for approval.


Research ethics

Approvals have been obtained from the Institutional Review Boards of Sangath, Harvard Medical School, the London School of Hygiene and Tropical Medicine, and Indian Council of Medical Research. Harvard Medical School is the trial sponsor while Sangath is the implementing agency in India.


A two-stage consent process will be used across both trials. To begin, a school-based researcher will provide each referred student with structured verbal and written information about the use of their screening data for research purposes (as part of the embedded recruitment trial), irrespective of their eligibility to take part in the host trial. Students will be able to opt-out from providing any self-reported data for the embedded recruitment trial. Students meeting eligibility criteria for the host trial will be provided with additional structured verbal information and a printed participant information sheet. Assent will be sought for adolescents aged below 18 years and consent will be sought for adolescents who are 18 years or older. For assenting participants aged under 18 years, consent will also be sought from a parent/guardian for participation of the index adolescent and for their own participation in outcome assessments. Consenting adolescents aged 18 years or older will be able to take part without permission from their parent/guardian. We will seek their permission before approaching a parent/guardian about participating in assessments. When approaching the family member of an index adolescent, telephone contact will be initiated by a community-based researcher in the first instance, after which a meeting will be arranged at their home or another convenient location, if agreed.

Serious adverse events

Serious adverse events (SAEs) include death, life-threatening events, clinical deterioration requiring hospitalization or other specialist treatment, victimization, sexual abuse, and chronic absenteeism and/or drop-out from school. Immediate safeguarding actions will prioritize participants’ safety and may involve suicide risk assessment, informing stakeholders, facilitating treatment with specialists, and statutory reporting in line with relevant legislation, such as the Protection of Children from Sexual Offences Act, 2012 and the Juvenile Justice (Care and Protection of Children) Act, 2000 (last amended in 2015).

Each potential SAE will also be assessed for causality by two clinically qualified co-investigators and classified as unrelated, unlikely, possible, probable, or definitely related to trial participation. In the event that consensus is not reached, a third clinical psychologist (independent of the trial) will review the SAE report. Where causality is deemed to be anything other than unrelated to trial participation, the DSMC will advise on further actions such as withdrawal of individual participants, modifications to the trial protocol, continuing without modifications, or suspending/terminating the trial.


This paper describes an integrated protocol that will evaluate the demand for a school counselling program delivering a low-intensity psychological intervention, and the effectiveness of that intervention for school-going adolescents with elevated mental health presentations in New Delhi, India. The interventions used in the host trial and embedded recruitment trial will be provided by lay counsellors, working under the supervision of psychologists, in Government-run secondary schools catering to adolescents from lower socio-economic groups of the city. Concurrent process evaluation and cost-effectiveness analysis will complement the effectiveness findings, generating important evidence relevant to the scaling up of the interventions. To the best of our knowledge, these two trials have no comparable precedent from any low-resource context, and our findings have the potential to inform the design of school-based interventions to address adolescent mental health problems on a large scale in India and other global settings.

An individually randomized design was chosen for the host trial due to the relatively small number of available schools, which ruled out an alternative cluster-randomized design. The inherent risk of contamination associated with individual randomization was minimized by the inclusion of an enhanced usual care control arm, in which participants received the same printed materials as provided in the intervention arm. The augmentation of face-to-face counselling in the intervention arm was not expected to pose a significant risk of spill-over due to the reluctance of participants to share confidential counselling experiences with peers. Moreover, enhanced usual care was designed in such a way that the same delivery agents would not be involved in treating participants across conditions, ruling out another commonly cited source of contamination [42].

The use of a stepped-wedge cluster randomized design for the embedded recruitment trial was also influenced by pragmatic considerations. Formative and pilot work showed that classroom-based sensitization activities had the potential to increase the volume of referrals for school-based counselling. A stepped-wedge design—in which classes formed natural clusters in each school—offered the potential to stagger the roll-out of classroom sessions so that school-based counsellors could accommodate the anticipated flow of referrals within their limited caseload capacity.

Despite the use of contextually adapted sensitization activities, some potentially eligible students (and/or their caregivers) may be unwilling to participate. Reasons for non-participation will be systematically recorded and examined in the embedded process evaluation. We will also seek to address the concerns of referred adolescents who are not eligible for inclusion in the host trial despite a felt need for counselling. In anticipation, we have designed hand-outs with advice on self-management of common problems (such as academic stress). These hand-outs will be distributed to relevant students by the researchers conducting the baseline screening assessments. Another recruitment challenge relates to the academic calendar in the participating schools, which includes frequent breaks for exams, festivals, and other holidays. These scheduling disruptions may require temporary halting of recruitment (for example, so that students are not recruited immediately prior to a long break, as they would not be able to receive the intervention without a delay).

In addition to the publication of our findings in separate papers for each trial, we will share trial outcomes and implications with the participants and other stakeholder groups, including school principals and the local Department of Education in New Delhi. If effective, we will use the process and economic data to model the costs for scaling up the interventions across the school system in New Delhi. This may involve the deployment of counsellors by the state government under the Educational and Vocational Guidance Scheme (EVGC), due for implementation in some sectors of New Delhi starting from the 2018–2019 academic year.

Trial status

Both trials are registered with ClinicalTrials.gov (host trial: NCT03630471; embedded recruitment trial: NCT03633916). Recruitment for both trials was initiated on 20 August 2018. We expect to conclude participant recruitment by February 2019 and complete follow-up assessments by June 2019.

Availability of data and materials

Data from pilot studies used for arriving at sample size and power calculations can be made available by the corresponding author upon reasonable request.



Abbreviations

CI: Confidence interval

DSMC: Data Safety and Monitoring Committee

ES: Effect size

LMICs: Low- and middle-income countries

PSS-4: Perceived Stress Scale-4

SAE: Serious adverse event

SD: Standard deviation

SDQ: Strengths and Difficulties Questionnaire

SWEMWBS: Short Warwick-Edinburgh Mental Wellbeing Scale

TSC: Trial Steering Committee

YTP: Youth Top Problems


  1. Belfer ML. Child and adolescent mental disorders: the magnitude of the problem across the globe. J Child Psychol Psychiatry Allied Discip. 2008;49(3):226–36.

    Article  Google Scholar 

  2. Kessler RC, Amminger GP, Aguilar-Gaxiola S, Alonso J, Lee S, Ustun TB. Age of onset of mental disorders: A review of recent literature. Curr Opin Psychiatry. 2007;20(4):359–64.

    Article  Google Scholar 

  3. Holmes EA, Ghaderi A, Harmer CJ, Ramchandani PG, Cuijpers P, Morrison AP, et al. The Lancet Psychiatry Commission on psychological treatments research in tomorrow's science. Lancet Psychiatry. 2018;5(3):237–86.

    Article  Google Scholar 

  4. Morris J, Belfer M, Daniels A, Flisher A, Ville L, Lora A, et al. Treated prevalence of and mental health services received by children and adolescents in 42 low-and-middle-income countries. J Child Psychol Psychiatry Allied Discip. 2011;52(12):1239–46.

    Article  Google Scholar 

  5. Collins PY, Patel V, Joestl SS, March D, Insel TR, Daar AS. Grand challenges in global mental health: A consortium of researchers, advocates and clinicians announces here research priorities for improving the lives of people with mental illness around the world, and calls for urgent action and investment. Nature. 2011;475(7354):27–30.

    Article  CAS  Google Scholar 

  6. Das JK, Salam RA, Lassi ZS, Khan MN, Mahmood W, Patel V, et al. Interventions for Adolescent Mental Health: An Overview of Systematic Reviews. J Adolesc Health. 2016;59(4s):S49-s60.

    Article  Google Scholar 

  7. Weisz JR, Krumholz LS, Santucci L, Thomassin K, Ng MY. Shrinking the gap between research and practice: tailoring and testing youth psychotherapies in clinical care contexts. Annu Rev Clin Psychol. 2015;11:139–63.

    Article  Google Scholar 

  8. Murray LK, Jordans MJ. Rethinking the service delivery system of psychological interventions in low and middle income countries. BMC Psychiatry. 2016;16:234.

    Article  CAS  Google Scholar 

  9. Bolton P, Lee C, Haroz EE, Murray L, Dorsey S, Robinson C, et al. A transdiagnostic community-based mental health treatment for comorbid disorders: Development and outcomes of a randomized controlled trial among Burmese refugees in Thailand. PLoS Med. 2014;11(11):e1001757.

    Article  Google Scholar 

  10. Murray LK, Hall BJ, Dorsey S, Ugueto AM, Puffer ES, Sim A, et al. An evaluation of a common elements treatment approach for youth in Somali refugee camps. Global Mental Health. 2018;5:e16.

    Article  CAS  Google Scholar 

  11. Marchette LK, Weisz JR. Practitioner review: Empirical evolution of youth psychotherapy toward transdiagnostic approaches. J Child Psychol Psychiatry. 2017;58(9):970–84.

    Article  Google Scholar 

  12. García-Escalera J, Chorot P, Valiente RM, Reales JM, Sandín B. Efficacy of transdiagnostic cognitive-behavioral therapy for anxiety and depression in adults, children and adolescents: A meta-analysis. Revista de Psicopatología y Psicología Clínica. 2016;21(3):147.

    Article  Google Scholar 

  13. Chorpita BF, Daleiden EL, Park AL, Ward AM, Levy MC, Cromley T, et al. Child STEPs in California: A cluster randomized effectiveness trial comparing modular treatment with community implemented treatment for youth with anxiety, depression, conduct problems, or traumatic stress. J Consult Clin Psychol. 2017;85(1):13–25.

    Article  Google Scholar 

  14. Santucci LC, Thomassin K, Petrovic L, Weisz JR. Building evidence-based interventions for the youth, providers, and contexts of real-world mental-health care. Child Dev Perspect. 2015;9(2):67–73.

    Article  Google Scholar 

  15. Ministry of Health and Family Welfare. Implementation Guidelines Rashtriya Kishor Swasthya Karyakram (RKSK) 2018. New Delhi: Ministry of Health and Family Welfare, Government of India; 2018.

  16. Vellakkal S, Patel V. Designing psychological treatments for scalability: The PREMIUM approach. PLoS One. 2015;10(7):e0134189.

    Article  Google Scholar 

  17. Nadkarni A, Weobong B, Weiss HA, McCambridge J, Bhat B, Katti B, et al. Counselling for Alcohol Problems (CAP), a lay counsellor-delivered brief psychological treatment for harmful drinking in men, in primary care in India: a randomised controlled trial. Lancet. 2017;389(10065):186–95.

    Article  Google Scholar 

  18. Patel V, Weobong B, Weiss HA, Anand A, Bhat B, Katti B, et al. The Healthy Activity Program (HAP), a lay counsellor-delivered brief psychological treatment for severe depression, in primary care in India: a randomised controlled trial. Lancet. 2017;389(10065):176–85.

    Article  Google Scholar 

  19. Michelson D, Malik K, Krishna M, Sharma R, Mathur S, Bhat B, et al. Development of a transdiagnostic, low-intensity, psychological intervention for common adolescent mental health problems in Indian secondary schools. Behav Res Ther. 2019; in press.

  20. Parikh R, Michelson D, Sapru M, Sahu R, Singh A, Cuijpers P, et al. Priorities and preferences for school-based mental health services in India: a multi-stakeholder study with adolescents, parents, school staff and mental health providers. Global Mental Health. 2019; in press.

  21. Parikh R, Sapru M, Krishna M, Cuijpers P, Patel V, Michelson D. "It is like a mind attack": stress and coping among urban school-going adolescents in India. BMC Psychol. 2019;7(1):31.

    Article  Google Scholar 

  22. Roy K, Shinde S, Sarkar BK, Malik K, Parikh R, Patel V. India's response to adolescent mental health: a policy review and stakeholder analysis. Soc Psychiatry Psychiatr Epidemiol. 2019;54(4):405-14.

    Article  Google Scholar 

  23. Boustani MM, Frazier SL, Becker KD, Bechor M, Dinizulu SM, Hedemann ER, et al. Common elements of adolescent prevention programs: minimizing burden while maximizing reach. Admin Pol Ment Health. 2015;42(2):209–19.

    Article  Google Scholar 

  24. Chorpita BF, Daleiden EL. Mapping evidence-based treatments for children and adolescents: application of the distillation and matching model to 615 treatments from 322 randomized trials. J Consult Clin Psychol. 2009;77(3):566–79.

    Article  Google Scholar 

  25. World Health Organization. Problem management plus (PM+): psychological help for adults in communities exposed to adversity: WHO Kenyan field-trial version 1.0, 2016: World Health Organization; 2016 [Available from: Accessed 30 July 2019.

  26. Dias A, Azariah F, Cohen A, Anderson S, Morse J, Cuijpers P, et al. Intervention development for the indicated prevention of depression in later life: The “DIL” protocol in Goa, India. Contemp Clin Trials Commun. 2017;6:131-9.

    Article  Google Scholar 

  27. Lazarus RS, Folkman S. Stress, appraisal, and coping. New York: Springer Pub. Co.; 1984.

    Google Scholar 

  28. Chiappetta L, Stark S, Mahmoud KF, Bahnsen KR, Mitchell AM. Motivational interviewing to increase outpatient attendance for adolescent psychiatric patients. J Psychosoc Nurs Ment Health Serv. 2018;56(6):31–5.

    Article  Google Scholar 

  29. Notley C, Christopher R, Hodgekins J, Byrne R, French P, Fowler D. Participant views on involvement in a trial of social recovery cognitive-behavioural therapy. Br J Psychiatry. 2015;206(2):122–7.

    Article  Google Scholar 

  30. Michelson D, Day C. Improving attendance at child and adolescent mental health services for families from socially disadvantaged communities: evaluation of a pre-intake engagement intervention in the UK. Admin Pol Ment Health. 2014;41(2):252–61.

    Article  Google Scholar 

  31. Bonevski B, Randell M, Paul C, Chapman K, Twyman L, Bryant J, et al. Reaching the hard-to-reach: a systematic review of strategies for improving health and medical research with socially disadvantaged groups. BMC Med Res Methodol. 2014;14:42.

    Article  Google Scholar 

  32. Treweek S, Lockhart P, Pitkethly M, Cook JA, Kjeldstrøm M, Johansen M, et al. Methods to improve recruitment to randomised controlled trials: Cochrane systematic review and meta-analysis. BMJ open. 2013;3(2).

    Article  Google Scholar 

  33. Madurasinghe VW. Sandra Eldridge on behalf of MRCSG, Gordon Forbes on behalf of the SECG. Guidelines for reporting embedded recruitment trials. Trials. 2016;17(1):27.

    Article  Google Scholar 

  34. Bhola P, Sathyanarayanan V, Rekha DP, Daniel S, Thomas T. Assessment of Self-Reported Emotional and Behavioral Difficulties Among Pre-University College Students in Bangalore, India. Indian J Community Med. 2016;41(2):146-50.

    Article  Google Scholar 

  35. Goodman R, Ford T, Simmons H, Gatward R, Meltzer H. Using the Strengths and Difficulties Questionnaire (SDQ) to screen for child psychiatric disorders in a community sample. Br J Psychiatry. 2000;177:534–9.

    Article  CAS  Google Scholar 

  36. Hemming K, Taljaard M, McKenzie JE, Hooper R, Copas A, Thompson JA, et al. Reporting of stepped wedge cluster randomised trials: extension of the CONSORT 2010 statement with explanation and elaboration. BMJ. 2018;363:k1614

  37. Moher D, Hopewell S, Schulz KF, Montori V, Gøtzsche PC, Devereaux PJ, et al. CONSORT 2010 Explanation and Elaboration: updated guidelines for reporting parallel group randomised trials. BMJ. 2010;340:c869.

    Article  Google Scholar 

  38. Thompson JA, Davey C, Fielding K, Hargreaves JR, Hayes RJ. Robust analysis of stepped wedge trials using cluster-level summaries within periods. Stat Med. 2018;37(16):2487–500.

    Article  CAS  Google Scholar 

  39. Singla DR, Weobong B, Nadkarni A, Chowdhary N, Shinde S, Anand A, et al. Improving the scalability of psychological treatments in developing countries: an evaluation of peer-led therapy quality assessment in Goa, India. Behav Res Ther. 2014;60:53–9.

    Article  Google Scholar 

  40. Kohrt BA, Jordans MJ, Rai S, Shrestha P, Luitel NP, Ramaiya MK, et al. Therapist competence in global mental health: Development of the ENhancing Assessment of Common Therapeutic factors (ENACT) rating scale. Behav Res Ther. 2015;69:11–21.

    Article  Google Scholar 

  41. Muse K, McManus F, Rakovshik S, Thwaites R. Development and psychometric evaluation of the Assessment of Core CBT Skills (ACCS): An observation-based tool for assessing cognitive behavioral therapy competence. Psychol Assess. 2017;29(5):542–55.

  42. Magill N, Knight R, McCrone P, Ismail K, Landau S. A scoping review of the problems and solutions associated with contamination in trials of complex interventions in mental health. BMC Med Res Methodol. 2019;19(1):4.

  43. Weisz JR, Chorpita BF, Frye A, Ng MY, Lau N, Bearman SK, et al. Youth Top Problems: using idiographic, consumer-guided assessment to identify treatment needs and to track change during psychotherapy. J Consult Clin Psychol. 2011;79(3):369–80.

  44. Cohen S, Kamarck T, Mermelstein R. A global measure of perceived stress. J Health Soc Behav. 1983;24(4):385–96.

  45. Clarke A, Friede T, Putz R, Ashdown J, Martin S, Blake A, et al. Warwick-Edinburgh Mental Well-being Scale (WEMWBS): Validated for teenage school students in England and Scotland. A mixed methods assessment. BMC Public Health. 2011;11:487.

  46. Srikala B, Kishore KK. Empowering adolescents with life skills education in schools - School mental health program: Does it work? Indian J Psychiatry. 2010;52(4):344–9.

  47. Srinath S, Kandasamy P, Golhar TS. Epidemiology of child and adolescent mental health disorders in Asia. Curr Opin Psychiatry. 2010;23(4):330–6.

  48. Singh K, Junnarkar M, Sharma S. Anxiety, stress, depression, and psychosocial functioning of Indian adolescents. Indian J Psychiatry. 2015;57(4):367–74.

  49. Augustine LF, Vazir S, Rao SF, Rao MV, Laxmaiah A, Nair KM. Perceived stress, life events & coping among higher secondary students of Hyderabad, India: a pilot study. Indian J Med Res. 2011;134:61–8.

  50. Patalay P, Fitzsimons E. Correlates of mental illness and wellbeing in children: Are they the same? Results from the UK Millennium Cohort Study. J Am Acad Child Adolesc Psychiatry. 2016;55(9):771–83.

  51. Stewart-Brown S. The Warwick-Edinburgh Mental Well-Being Scale (WEMWBS): Performance in Different Cultural and Geographical Groups. In: Keyes CLM, editor. Mental Well-Being: International Contributions to the Study of Positive Mental Health. Dordrecht: Springer Netherlands; 2013. p. 133–50.

  52. Wolpert M, Görzig A, Deighton J, Fugard AJ, Newman R, Ford T. Comparison of indices of clinically meaningful change in child and adolescent mental health services: difference scores, reliable change, crossing clinical thresholds and ‘added value’ - an exploration using parent rated scores on the SDQ. Child Adolesc Ment Health. 2015;20(2):94–101.

  53. Larsen DL, Attkisson CC, Hargreaves WA, Nguyen TD. Assessment of client/patient satisfaction: development of a general scale. Eval Program Plann. 1979;2(3):197–207.

  54. OPSPL. STAR: Sangath digital tool for advanced research. 2013.

  55. Barker D, McElduff P, D'Este C, Campbell MJ. Stepped wedge cluster randomised trials: a review of the statistical methodology used and available. BMC Med Res Methodol. 2016;16:69.

  56. Liang K-Y, Zeger SL. Longitudinal data analysis using generalized linear models. Biometrika. 1986;73(1):13–22.

  57. Sullivan TR, White IR, Salter AB, Ryan P, Lee KJ. Should multiple imputation be the method of choice for handling missing data in randomized trials? Stat Methods Med Res. 2018;27(9):2610–26.

  58. Peugh JL, Strotman D, McGrady M, Rausch J, Kashikar-Zuck S. Beyond intent to treat (ITT): A complier average causal effect (CACE) estimation primer. J Sch Psychol. 2017;60:7–24.


Acknowledgements

We gratefully acknowledge the contributions made by Aoife Doyle in preparing the statistical analysis plan for the host trial and by Bhagwant Chilhate in data collection and management of the host trial.


Funding

This study was supported by a Principal Research Fellowship awarded to Vikram Patel by the Wellcome Trust (grant number 106919/A/15/Z). The funding agency has no role in the design of the study, data collection, or the writing of the manuscript.

Author information

Contributions

VP led the conception of the trials with contributions from all other authors. DM, RP, SS, and VP led on trial design and developed the initial draft of the manuscript with contributions from HW, AH, and JR on statistical considerations; KM, RS (R. Sharma), SM, and MK (M. Krishna) on the description of trial interventions; and BB and RS (R. Sahu) on data collection and management. MK (M. King), CF, BC, PC, and PS reviewed the manuscript at different stages and provided critical revisions. All authors have read and approved the final manuscript.

Corresponding author

Correspondence to Vikram Patel.

Ethics declarations

Ethics approval and consent to participate

Written informed consent will be obtained from all participants in both trials. For adolescents younger than 18 years of age (minors), informed consent will be obtained from their parents or guardians. The study procedures were approved by the Institutional Review Boards of the Harvard Faculty of Medicine (reference IRB17–0379), Sangath (references VP_2018_41 and RP_2018_47), and the London School of Hygiene and Tropical Medicine (LSHTM Ethics reference 15907). Additional approvals from the Indian Council of Medical Research (HMSC/1/2016/SBR) and the Department of Education, Government of India (DE40 [20]/EVGB/2017/711–717) were also obtained before the commencement of the trials.

Consent for publication

Consent for publication of anonymized data will be obtained from trial participants.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Additional files

Additional file 1:

SPIRIT checklist for embedded recruitment trial. (DOC 122 kb)

Additional file 2:

SPIRIT checklist for host trial. (DOC 122 kb)

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

About this article

Cite this article

Parikh, R., Michelson, D., Malik, K. et al. The effectiveness of a low-intensity problem-solving intervention for common adolescent mental health problems in New Delhi, India: protocol for a school-based, individually randomized controlled trial with an embedded stepped-wedge, cluster randomized controlled recruitment trial. Trials 20, 568 (2019).
