Abstract

Objective:

Mental Health First Aid (MHFA) is a globally disseminated course that trains members of the public to recognize and respond to mental health issues in their communities. Although substantial evidence suggests that MHFA training is associated with positive changes in knowledge, attitudes, and behavioral intent, little is known about how MHFA trainee–delivered aid supports mental health needs. This systematic review sought to summarize the extant research evaluating MHFA trainees’ helping behaviors and the impacts of these behaviors on people experiencing a mental health problem (i.e., recipients).

Methods:

Electronic databases were searched for MHFA evaluations published before or on March 9, 2021. Studies that evaluated at least one outcome related to trainee helping behavior or recipient mental health were included in the synthesis. Outcomes were organized into three categories: trainee use of MHFA skills, helpfulness of trainees’ actions, and recipients’ mental health. Only studies that compared pre- and posttraining outcomes, included a control group, and directly evaluated MHFA were used to assess its efficacy.

Results:

The search identified 31 studies, nine of which met criteria to assess MHFA efficacy. The findings of the nine studies indicated that MHFA had mixed effects on trainees using the skills taught in the course and no effects on the helpfulness of trainees’ actions or on recipient mental health.

Conclusions:

The findings indicate that there is insufficient current evidence that MHFA improves the helping behaviors of trainees or the mental health of those receiving helping behaviors. They highlight a crucial research gap that should be prioritized as MHFA continues to grow in popularity.

HIGHLIGHTS

  • Mental Health First Aid (MHFA) is a widely promoted course that trains members of the public to recognize and respond to mental health issues in their communities.

  • Most of the evidence base is focused on direct training outcomes, and little is known about how MHFA performs outside the classroom.

  • This systematic review found insufficient evidence that MHFA improves the helping behaviors of MHFA trainees or the mental health of aid recipients.

  • Although MHFA likely is useful as a psychoeducational initiative, much more research is needed to understand how it supports the mental health of its intended recipients.

In the United States, approximately one in two adults will experience a mental illness in their lifetime, often beginning during childhood or adolescence (1, 2). Because of a plethora of factors, including cost, stigma, availability of services, and a fragmented mental health care system (3, 4), less than half will receive adequate treatment (2). The combination of high prevalence and limited access to services has led to increasing efforts to train the lay public in first-line mental health response.

Modeled on the conventional approach to first aid, Mental Health First Aid (MHFA) is a course that trains the general public to recognize and respond to mental health issues in their communities (5). MHFA was founded in 2000 in Australia and has since expanded to 24 countries, including the United States (6). Besides the standard adult course, specialized MHFA curricula exist for law enforcement, firefighters and emergency medical services personnel, teens, higher education, rural settings, workplaces, and people who work with youths, older adults, and veterans (7, 8).

To provide basic first-line assistance and make referrals to professional care, most people who participate in the MHFA course (i.e., trainees) learn a five-step action plan known as ALGEE: Assess for risk for suicide or harm, Listen nonjudgmentally, Give reassurance and information, Encourage appropriate professional help, and Encourage self-help and other support strategies (9). Trainees of the teen MHFA course learn a modified action plan: Look for warning signs, Ask how they are, Listen up, Help them connect with an adult, and Your Friendship is important (10). Course length and content may vary by country, and changes to the program are often made in light of new evidence. For example, the MHFA Australia ALGEE action plan language was recently updated to the following: Approach the person, Assess and assist with any crisis, Listen and communicate nonjudgmentally, Give support and information, Encourage the person to get appropriate professional help, and Encourage other supports (11). MHFA USA continues to use the original ALGEE action plan. Additionally, in Australia, the standard adult course is 12 hours long, and the youth course is 14 hours long (8), whereas in the United States, both classes are 8 hours long (12). Several MHFA courses have been adapted for specific national, cultural, and linguistic contexts (6, 13).

Since MHFA was adapted and introduced to the United States in 2008, >2 million people have been trained (14). MHFA trainings are available to the general public and are typically voluntary. However, MHFA training is increasingly required by some police departments, fire departments, and schools (15), given the likelihood of encountering a person experiencing a mental health crisis in those settings. In an effort to support mental health programming and prevention efforts, MHFA has been explored in >87 studies (16) and is supported by policy makers (17, 18) as well as several health, education, and police departments (19). MHFA encourages mental health literacy among the general public and professionals (e.g., paramedics, law enforcement officers, and teachers) who are likely to be called on to support mental health needs in their communities. Significant funding for MHFA-related projects is awarded through Project AWARE (Advancing Wellness and Resiliency in Education) state agency grants (17, 20).

MHFA has consistently been shown to reduce stigma regarding mental health conditions and increase mental health knowledge, recognition of mental disorders, belief in effective treatments, and confidence and intent to help among its trainees, with mixed results for its effect on the amount of actual helping behavior performed by trainees (21–26). However, because most existing program evaluations (16) and systematic reviews (21, 22, 25, 26) have focused on evaluating direct training outcomes (e.g., changes in knowledge, attitudes, and behavioral intent), little is known about how effective MHFA is in addressing the mental health needs of those who receive helping behaviors of MHFA trainees. Two meta-analyses, with literature searches conducted in 2017 (22) and 2018 (24), have begun to examine this gap in the literature, finding no significant effects on the quality of helping behaviors provided by MHFA trainees (22) or the mental health of recipients of MHFA-guided helping behaviors (22, 24). Mei and McGorry's recent commentary highlights the growing interest in and need to evaluate MHFA-related mental health outcomes among recipients (27).

This systematic review included solely evaluations of MHFA actions taken outside of the classroom to provide an understanding of whether—and how—MHFA actions are helpful to those experiencing a mental health crisis. Although changing the knowledge, attitudes, and behavioral intent of MHFA trainees is an important and worthwhile goal, ultimately, it is crucial to understand how MHFA affects those it intends to help. Furthermore, because of the continued proliferation of programming (6, 15, 19) and evaluation studies (16), there is a need for an up-to-date synthesis of MHFA's effects. Finally, although recent meta-analyses (22, 24) have examined the evidence for MHFA's effects on trainee behavior and recipient mental health, they did so among other training outcomes, with limited attention paid to the particular challenges associated with evaluating posttraining, "real-world" outcomes. To our knowledge, no other evaluations, systematic reviews, or meta-analyses have selectively focused on trainee behavior and recipient mental health outcomes. We conclude by providing several recommendations for how to strengthen the evidence base of the MHFA program.

Methods

We conducted a systematic review by using the Preferred Reporting Items for Systematic Reviews and Meta-Analysis (PRISMA) Statement (28). (A table showing a PRISMA checklist of items is available as an online supplement.)

Search Strategy

We searched the PubMed, PsycINFO, PTSDpubs, and EMBASE electronic databases for studies published before or on March 9, 2021. Owing to linguistic similarity in intervention names, we used the search terms “psychological first aid,” “mental health first aid,” “psychological crisis intervention,” and “mental health crisis intervention.” (The full search strategy and exact search terms are detailed in the online supplement.) The research protocol was developed prospectively and in adherence to the PRISMA Protocols guidelines (29).

Selection

We included peer-reviewed studies that evaluated behaviors taken by MHFA trainees to help people experiencing a mental health problem (i.e., recipients). We examined outcomes that assessed trainee behavior, ranging from whether the trainee approached someone in crisis and engaged in an MHFA-guided action (e.g., encouraged appropriate professional help) to the effect of MHFA-guided action on the recipients’ mental health. To isolate the impact of training outside of the classroom, we excluded outcomes evaluating changes in trainee mental health, knowledge, attitudes, or behavioral intent. Commentaries, book chapters, opinion pieces, protocols, reviews, and studies not published in English were also excluded. There were no restrictions on setting.

Two authors (S.F., K.S.) independently reviewed the database search results by title and abstract and selected studies on the basis of predetermined inclusion and exclusion criteria (see online supplement). In total, 15% of titles and 10% of abstracts were randomly selected for review by both authors and compared for quality control. A third author (S.H.) compared the selections and settled any disagreements. Two authors (S.F., K.S.) then independently reviewed the full texts of selected studies for final inclusion.

Data Analysis

Four authors (S.F., K.S., M.B., K.J.) independently extracted study-level data related to setting, design (including whether the study incorporated a control group with or without pre- and posttest results), participant characteristics, intervention details, and outcomes evaluated. Four studies were randomly selected to be extracted independently again by a different author as a quality check.

Behavioral outcomes were categorized by type (trainee use of MHFA skills, helpfulness of trainee's actions, or recipient mental health) and by the person who reported them (trainee or recipient). The main findings of each study were then summarized and identified as evidence of positive effect, partial positive effect, no effect, or negative effect, on the basis of a p<0.05 significance level. If studies did not compare pre- and posttraining outcomes or did not have a control group, their findings were considered to have insufficient information to assess MHFA efficacy. Additionally, to ensure comparability across courses, only studies evaluating a form of MHFA that explicitly taught the ALGEE or the "Look, Ask, Listen, Help Your Friend" action plan were used to evaluate efficacy. Last, we did not report on behavioral outcomes measured immediately posttraining, because trainees would not have had sufficient time to perform any real-world MHFA actions by then.
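To make these classification rules concrete, the following Python sketch restates the decision logic described in this section. It is illustrative only, not the analysis code used for this review, and the study fields shown (control group, pre/post comparison, action plan, direction, p value) are a hypothetical simplification of the extracted data.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class StudyFinding:
        has_control_group: bool
        compares_pre_post: bool
        teaches_action_plan: bool          # ALGEE or "Look, Ask, Listen, Help Your Friend"
        direction: Optional[str] = None    # "positive", "partial", or "negative"
        p_value: Optional[float] = None

    def classify(finding: StudyFinding, alpha: float = 0.05) -> str:
        # Studies lacking a control group, a pre/post comparison, or one of the
        # two core action plans were treated as insufficient to assess efficacy.
        if not (finding.has_control_group
                and finding.compares_pre_post
                and finding.teaches_action_plan):
            return "insufficient information"
        # Findings were judged against a p<0.05 significance level.
        if finding.p_value is None or finding.p_value >= alpha:
            return "no effect"
        return {"positive": "positive effect",
                "partial": "partial positive effect",
                "negative": "negative effect"}.get(finding.direction, "no effect")

    # Hypothetical example: an RCT teaching ALGEE with a significant improvement
    print(classify(StudyFinding(True, True, True, direction="positive", p_value=0.01)))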

Two authors (S.F., S.H.) independently assessed risk for bias using the Cochrane Risk of Bias tool, which rates studies as “low risk,” “high risk,” or “unclear risk” for bias in the following domains: random sequence generation, allocation concealment, blinding of participants and personnel, blinding of outcome assessment, incomplete outcome data, selective reporting, and other sources of bias (30).

Results

The search identified 9,855 records, of which 1,093 were duplicates, 7,827 were excluded after title review, and 786 were excluded after abstract review. Of the 149 articles reviewed in full, 119 were excluded. These articles were excluded because they evaluated an intervention that was not MHFA (e.g., psychological first aid, N=57 studies), they did not measure at least one trainee behavior or recipient mental health outcome (N=36), the full papers were unavailable or not in English (N=11), they were systematic reviews (N=8), or they were other nonevaluation studies (N=7). One additional study (31) was identified through an anonymous peer reviewer of this study, leaving 31 studies to be included in the synthesis.

Of these 31 studies, nine (31–39) were rigorous enough to be used to evaluate MHFA efficacy. Studies that did not meet minimum rigor criteria lacked a control group and reported posttraining outcomes only (40–47), had no control groups and reported pre- and posttraining outcomes (9, 10, 48–57), were a cluster-randomized controlled trial (RCT) with relevant outcomes not measured in the control group (58), or were a cluster RCT that did not evaluate the ALGEE or the "Look, Ask, Listen, Help Your Friend" action plan (59) (see online supplement).

Description of Studies Used to Assess Efficacy

Of the nine studies we used to assess MHFA efficacy, three (31, 38, 39) had not been synthesized in previous systematic reviews. All were RCTs (including two cluster RCTs). Follow-up periods ranged from 4 months to 3 years after the training. (A summary of the studies used to assess efficacy is available in the online supplement.)

MHFA evaluations occurred in predominantly high-income countries, including six in Australia. Trainees came from diverse backgrounds and represented the general public, students, teachers, government employees, and parents. Nearly all MHFA courses were in person and taught by certified MHFA instructors or mental health professionals. Course formats varied from multiple, shorter sessions to long, 1-day sessions. Training totaled between 9 and 14 hours. Six studies evaluated the standard adult MHFA course, and three evaluated the youth course (for adults who work with youths).

Outcomes

Trainee use of MHFA skills, reported by trainee.

All of the reviewed studies measured trainee-reported use of MHFA skills to help a person experiencing a mental health condition (see online supplement), with nine (31–39) meeting our criteria to allow a conclusion. Studies asked trainees whether they used MHFA skills at all (31, 33, 34, 36–39) and, if they did, the frequency of use (33, 36, 37), with one study (32) not specifying the questionnaire wording. Five studies (31, 33, 37–39) additionally considered the fidelity of trainee actions to the ALGEE plan. Three studies found a statistically significant increase in use of MHFA skills after 4 (34), 6 (31), and 24 (37) months, whereas six (32, 33, 35, 36, 38, 39) found no change in such use (see online supplement). Four of the studies that found no change (32, 36, 38, 39) were underpowered at posttraining because of significant loss to follow-up.

Trainee use of MHFA skills, reported by recipient.

Four studies asked about receipt of MHFA help from the recipient’s perspective (see online supplement), one of which (35) had enough information to enable a conclusion. The study found that high school students reported receiving increased information about mental health conditions from their trainee teachers after 6 months but did not report receiving increased help from them. The study was adequately powered.

Helpfulness of trainee’s actions, reported by trainee.

Seven studies asked trainees whether they perceived the assistance they provided as helpful to recipients. However, none of the studies had enough information to allow a conclusion (see online supplement).

Helpfulness of trainee’s actions, reported by recipient.

Two studies (38, 39) reported on the same RCT at different follow-up periods and asked adolescents how well their parents who were trained in MHFA supported them when they experienced a mental health difficulty. The studies found no effect of the training at 12, 24 (39), or 36 months (38) (see online supplement). Both studies were underpowered.

Recipient mental health, reported by trainee.

Using the parent report version of the Strengths and Difficulties Questionnaire (SDQ), the aforementioned two studies (38, 39) also found no change in parent-reported adolescent mental health difficulties at 12, 24 (39), or 36 months (38) (see online supplement). Again, the two studies were underpowered.

Recipient mental health, reported by recipient.

Three studies (35, 38, 39) assessed recipient mental health as reported by the recipient. In one study (35), recipients were high school students whose teachers were trained, and in the other two studies (reporting on the same RCT at different follow-up periods) (38, 39), recipients were adolescents whose parents were trained in MHFA. None of the studies found significant changes in mental health difficulties assessed with child report versions of the SDQ (see online supplement). Two studies (38, 39) were underpowered.

Risk for Bias

Risk for bias, according to the Cochrane Risk of Bias tool, is summarized in Table 1 (40–59). Thirteen studies were identified as having overall high risk for bias (10, 40–44, 48–52, 54, 58), two as having medium-to-high risk (45, 57), seven as having medium risk (9, 31, 46, 47, 53, 56, 59), seven as having medium-to-low risk (32–37, 55), and two as having low risk (38, 39). Few studies used an allocation strategy that included random sequence generation (31–39) or allocation concealment (32–34, 36, 38, 39). Blinding of participants and personnel was impossible for all MHFA evaluations; this resulted in a high risk for bias only if study participants could have become aware that they were participating in an intervention evaluation and, as a result, might have responded to surveys in a systematically different way than control group participants did. Therefore, studies with no control group (9, 10, 40–57) or with a comparable intervention as control (38, 39) were classified as having a low risk for bias, whereas studies with as-usual or waitlist control groups were classified as having a high risk for bias. Very few studies blinded outcome assessors (32, 36–39). Studies that did not have control groups were again classified as having a low risk for bias. Results for incomplete outcome data were mixed and dependent on loss to follow-up rates. The main other source of bias identified was not controlling for participant-level characteristics, such as trainee's previous mental health response experience.

TABLE 1. Cochrane Risk of Bias ratings from 31 studies included in this review measuring MHFA trainee behavior and recipient mental health outcomes

Study | Random sequence generation | Allocation concealment | Blinding of participants and staff | Blinding of outcome assessment | Incomplete outcome data | Selective reporting | Other sources of bias | Overall
Armstrong et al., 2020 (48) | High | High | Low | Low | High | Unclear | High | High
Ashoorian et al., 2019 (40) | High | High | Low | Low | High | Unclear | High | High
Banh et al., 2019 (49) | High | High | Low | Low | High | Unclear | High | High
Bond et al., 2020 (50) | High | High | Low | Low | High | Unclear | High | High
Carpini et al., 2020 (41) | High | High | Low | Low | High | Unclear | High | High
Currie and Davidson, 2015 (42) | High | High | Low | High | Unclear | Unclear | High | High
Fisher et al., 2020 (58) | Unclear | Unclear | High | High | High | Unclear | Low | High
Hart et al., 2012 (51) | High | High | Low | High | Low | Unclear | High | High
Hart et al., 2012 (52) | High | High | Low | Low | High | Unclear | High | High
Hart et al., 2016 (10) | High | High | Low | Low | High | Unclear | High | High
Hart et al., 2019 (53) | High | High | Low | Low | Low | Unclear | High | Medium
Hung et al., 2021 (31)a | Low | High | High | High | Low | Unclear | Low | Medium
Jensen et al., 2016 (32)a | Low | Low | High | Low | High | Unclear | Low | Medium to low
Jorm et al., 2004 (34)a | Low | Low | High | High | Unclear | Low | Low | Medium to low
Jorm et al., 2010 (33)a | Low | Low | High | Unclear | Low | Unclear | Low | Medium to low
Jorm et al., 2010 (35)a | Low | High | High | Unclear | Low | Low | Low | Medium to low
Kelly et al., 2011 (54) | High | High | Low | Low | High | Unclear | High | High
Kitchener and Jorm, 2002 (9) | High | High | Low | Low | Low | Unclear | High | Medium
Kitchener and Jorm, 2004 (36)a | Low | Low | High | Low | High | Low | Low | Medium to low
Mendenhall et al., 2013 (43) | High | High | Low | Low | High | Unclear | High | High
Morgan et al., 2019 (39)a | Low | Low | Low | Low | High | Low | Low | Low
Morgan et al., 2020 (38)a | Low | Low | Low | Low | High | Low | Low | Low
Rodgers et al., 2021 (44) | High | High | Low | Low | High | Unclear | High | High
Svensson and Hansson, 2014 (37)a | Low | Unclear | High | Low | Low | Unclear | Low | Medium to low
Svensson et al., 2015 (45) | High | High | Low | Low | Unclear | Unclear | High | Medium to high
Svensson and Hansson, 2017 (55) | High | High | Low | Low | Low | Unclear | Low | Medium to low
Thombs et al., 2015 (59) | Unclear | Unclear | High | Unclear | Low | Unclear | Low | Medium
Uribe Guajardo et al., 2018 (57) | High | High | Low | Low | High | Unclear | Unclear | Medium to high
Uribe Guajardo et al., 2019 (56) | High | High | Low | Low | Low | Unclear | High | Medium
Witry et al., 2020 (46) | High | High | Low | Low | High | Unclear | Low | Medium
Witry et al., 2020 (47) | High | High | Low | Low | High | Unclear | Low | Medium

aThe study met minimum criteria for rigor (i.e., compared pre- and posttraining outcomes, had control groups, and evaluated a form of Mental Health First Aid [MHFA] that explicitly taught the ALGEE or “Look, Ask, Listen, Help Your Friend” action plan).


Discussion

We identified 31 studies that evaluated the behaviors of MHFA trainees and the mental health of recipients. All of the included studies asked trainees whether they used MHFA skills in real-life situations. However, few evaluated the helpfulness of their actions or their effects on recipient mental health. Only nine studies assessed MHFA efficacy in a rigorous manner. On the basis of these nine studies, we found mixed (positive and neutral) evidence of changes in trainees’ use of MHFA skills and no evidence of improvements in the helpfulness of trainees’ behaviors or recipient mental health.

Few of the included studies used the rigorous study designs needed to establish effects of the MHFA program. Although several studies included pre- and posttraining outcomes (9, 31–33, 35–39, 48–59), fewer used a control group (31–39, 58, 59); randomly assigned participants to treatment (31–39, 58, 59); were adequately powered (31, 33–35, 37, 49); or accounted for participant-level characteristics that might influence MHFA helping behaviors (31–34, 36–39, 46, 47, 49, 55, 57–59), such as profession or previous mental health experience. Furthermore, most outcomes were reported by trainees. Although this form of assessment is an important first step and facilitates data collection, it is based on the subjective impression of the trainee and may be susceptible to social desirability and recall biases. Trainee reports are particularly undesirable for evaluating the impact of MHFA on recipients, because they involve making assumptions about recipients' experiences. Some studies included both trainee and recipient reports for the same outcomes, which helped address reliability but did not address the aforementioned biases. Ideally, studies would use standardized surveys or professional assessments to evaluate mental health outcomes. Finally, although follow-up periods varied, none of the studies measured recipient mental health immediately after MHFA-guided trainee interventions. Because MHFA trainees are trained to provide a first-line response to a crisis situation, initial reductions in mental health difficulties of recipients may be more appropriate to measure than medium- and long-term effects. Overall, the risk for bias of the included studies was medium to high.

Previous systematic reviews of MHFA evaluations have addressed behavioral outcomes only minimally and alongside training outcomes. Systematic reviews and meta-analyses by Hadlaczky et al. (21) and Maslowski et al. (24) found moderate improvements in trainees' use of MHFA skills, whereas Morgan et al. (22) found small improvements. Morgan et al. also considered the quality of helping behaviors offered, reporting no significant improvements on this measure. Neither Morgan et al. nor Maslowski et al. found significant improvements in recipients' mental health. Ng et al.'s 2020 systematic review (25) focused on teen and youth MHFA and found that both training curricula generally resulted in more helping behavior by trainees. However, findings in a 2020 systematic review of youth MHFA for educators by Sánchez et al. (26) were inconclusive for this measure. Differences in results for trainee use of MHFA skills were likely due to differences in study inclusion criteria and search dates across reviews. Hadlaczky et al. (21), Ng et al. (25), and Sánchez et al. (26) did not restrict by study type, and Morgan et al. (22) and Maslowski et al. (24) included all controlled trials. Also of note, Maslowski et al.'s reporting of trainee use of MHFA skills included confidence measures, which increased the number of eligible studies used to assess this outcome. In our systematic review, we used only RCTs measuring actual helping behaviors and their effects on recipients to assess MHFA efficacy. Three of the nine studies we used to evaluate efficacy had not been synthesized in previous reviews. Finally, unlike other reviews, we focused solely on posttraining behavioral outcomes and on whether outcomes were reported by trainees or recipients, an essential element in program evaluations.

Given that there currently are few studies of MHFA with adequate rigor, and that findings from these studies are mixed, we conclude that there is insufficient evidence that MHFA achieves the desired impact on the helping behaviors of trainees and the mental health of recipients. MHFA implementers should take particular care when describing the intervention as evidence based and be specific about outcomes when evidence does exist, such as improving trainee knowledge, attitudes, and behavioral intent (21–26), and when evidence is insufficient, such as whether MHFA measurably affects trainee behavior or recipient mental health. Notably, some evidence suggests that MHFA trainees who rated themselves as having high intent to help were more likely to report at follow-up that they had actually provided help (60–62). Thus, it is possible that training outcomes such as behavioral intent are mediating the relationship between MHFA training and trainee helping behavior, but this possibility requires further investigation. Future research could seek to isolate the specific training components and mechanisms that affect trainee behavior and recipient mental health to ultimately inform updates to the curricula.

A lack of good-quality evidence does not necessarily render MHFA an unhelpful intervention; in fact, it has been shown to positively affect trainees' knowledge of mental health issues, attitudes toward mental illness, and intent to help (21–26). Rather, it highlights the gaps in our understanding of how it affects trainee behavior and recipient mental health. More—and more rigorous—evaluations of these outcomes are necessary. Researchers interested in building the evidence base for MHFA can draw on decades of development and evolution in program evaluation. Rigorous designs including MHFA trainee randomization, control groups, and longitudinal follow-up (beyond pre- and posttraining designs) are the minimum required to establish the efficacy of MHFA on trainee helping behavior and recipient mental health outcomes. To address the primary weaknesses in the existing literature, such as lack of power to detect statistically significant effects and bias introduced by a reliance on trainee reports, we consider it critical that all future study designs prepare for substantial loss to follow-up and measure recipient responses to MHFA-guided helping behaviors. Moreover, future studies should choose follow-up times that allow adequate time for trainees to encounter a situation requiring MHFA actions (22) while still capturing the initial, short-term impact of these actions on recipients. To overcome challenges related to data collection and design, studies that use pre- and post-MHFA training assessments of trainee-recipient dyad outcomes (e.g., parents trained in MHFA and their children) (38, 39) can serve as exemplars for future designs.
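As a simple illustration of planning for attrition, the sketch below inflates per-arm enrollment so that the sample remaining at follow-up still meets a power target for a two-arm comparison. The effect size, power, and 40% attrition values are hypothetical assumptions chosen for illustration, not estimates drawn from the reviewed studies, and the calculation uses a standard two-sample power routine rather than any method specific to MHFA evaluations.

    import math
    from statsmodels.stats.power import TTestIndPower

    # Hypothetical planning values (illustrative only, not from the reviewed studies)
    effect_size = 0.3          # assumed small-to-moderate standardized difference
    alpha, power = 0.05, 0.80  # conventional thresholds
    expected_attrition = 0.40  # assumed 40% loss to follow-up

    # Per-arm sample needed at follow-up to detect the assumed effect
    n_followup = TTestIndPower().solve_power(effect_size=effect_size,
                                             alpha=alpha, power=power, ratio=1.0)

    # Inflate enrollment so the retained sample still meets the target
    n_enroll = math.ceil(n_followup / (1 - expected_attrition))

    print(f"Per arm at follow-up: {math.ceil(n_followup)}; enroll per arm: {n_enroll}")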

To facilitate data collection among potential recipients, studies may first be restricted to smaller populations in which recipients can be monitored more easily (e.g., families and schools) (33, 38, 39); then, as programmatic effect is established, surveillance can be expanded to larger populations of potential recipients. Using a validated rubric to observe and rate simulated role-play may also help to address self-reporting biases of trainee behavior (63). Several challenges are associated with evaluating posttraining outcomes, and, notably, evaluations of the even more ubiquitous physical first aid have been similarly limited (64). However, innovative research designs (33, 38, 39) and tools (63) have begun to address these challenges and should continue to be supported and further improved. To enable this essential research, it is crucial that MHFA-supporting institutions and funding mechanisms (20) allocate sufficient funds to evaluations of trainee behavior and recipient mental health that meet at least the aforementioned standards of rigor.

The main strengths of this review were its focus on trainee behavior and recipient mental health outcomes and conclusions that were based only on studies that met a standard of adequate rigor. This allowed us to highlight the current state of the evidence for outcomes that are infrequently studied yet crucial to understanding MHFA’s practical, real-world applications. The review was limited by the exclusion of non-English studies and of studies that were not available online, although there were relatively few such studies. Additionally, several of the included studies were underpowered, raising the possibility of type II error. Although pooling the studies into a meta-analysis would have addressed this, measurements for each outcome were too few and varied to do so in a meaningful way. Last, most evaluations were performed in high-income Western countries—Australia, in particular—thus limiting the generalizability of our findings. MHFA has been licensed and adapted in 24 countries (6) and will likely continue to expand. Rigorous evaluations should be conducted in every setting where MHFA is performed, if possible, and particularly in low- and middle-income countries.

Conclusions

Our review found insufficient evidence that MHFA improves the helping behaviors of trainees or the mental health of recipients of such behaviors. Our findings highlight a crucial research and evaluation gap whose closure must be prioritized as MHFA continues to become more popular. As a psychoeducational intervention, MHFA addresses critical barriers to improving community mental health, such as stigma and limited public knowledge about mental health. Furthermore, the rapid proliferation of MHFA (5) and the funding allocated to it (20) indicate a growing desire to understand and address mental health issues in the United States, and this momentum should not be lost. However, as MHFA trainees are expected and encouraged to provide first-line support to people experiencing mental distress, it is just as, if not more, important to understand how their actions affect those they intend to help.

Columbia–WHO Center for Global Mental Health, Department of Psychiatry, Columbia University Irving Medical Center, New York City (Forthal, Sadowska, Pike, Balachander, Jacobsson); Institute for Social Research, University of Michigan, Ann Arbor (Hermosilla).
Send correspondence to Ms. Forthal ().

This work was supported by the National Institute of Mental Health (R01-MH110872) and program funding for the Columbia–WHO Center for Global Mental Health.

The funders had no role in study design, data collection, analysis, interpretation, or writing of this article.

The authors report no financial relationships with commercial interests.

References

1 Kessler RC, Berglund P, Demler O, et al.: Lifetime prevalence and age-of-onset distributions of DSM-IV disorders in the National Comorbidity Survey Replication. Arch Gen Psychiatry 2005; 62:593–602

2 Mental Illness. Bethesda, MD, National Institute of Mental Health, 2020. https://www.nimh.nih.gov/health/statistics/mental-illness. Accessed November 19, 2020

3 America's Mental Health 2018. Stamford, CT, Cohen Veterans Network and National Council for Behavioral Health, 2018. https://www.cohenveteransnetwork.org/americasmentalhealt. Accessed April 24, 2021

4 Compton-Phillips A, Mohta NS: Care redesign survey: it's time to treat physical and mental health with equal intent. NEJM Catal 2018. https://catalyst.nejm.org/doi/full/10.1056/CAT.18.0255. Accessed July 9, 2021

5 Mental Health First Aid USA: About MHFA. Washington, DC, National Council for Mental Wellbeing, 2021. https://www.mentalhealthfirstaid.org/about. Accessed April 24, 2021

6 Global Presence. Parkville, Victoria, Australia, MHFA International, 2021. https://mhfainternational.org/international-mental-health-first-aid-programs. Accessed April 24, 2021

7 Mental Health First Aid USA: Programs. Washington, DC, National Council for Mental Wellbeing, 2021. https://www.mentalhealthfirstaid.org/programs. Accessed April 24, 2021

8 Which Courses Are Available. Parkville, Victoria, Australia, MHFA Australia, 2019. https://mhfa.com.au/courses. Accessed April 24, 2021

9 Kitchener BA, Jorm AF: Mental health first aid training for the public: evaluation of effects on knowledge, attitudes and helping behavior. BMC Psychiatry 2002; 2:10

10 Hart LM, Mason RJ, Kelly CM, et al.: 'teen Mental Health First Aid': a description of the program and an initial evaluation. Int J Ment Health Syst 2016; 10:3

11 Mental Health First Aid Australia: What We Do at Mental Health First Aid. Parkville, Victoria, Australia, MHFA Australia, 2019. https://mhfa.com.au/about/our-activities/what-we-do-mental-health-first-aid. Accessed April 24, 2021

12 Course Types. Washington, DC, National Council for Mental Wellbeing, 2021. https://www.mentalhealthfirstaid.org/take-a-course/course-type. Accessed April 24, 2021

13 Jorm AF, Kitchener BA, Reavley NJ: Mental Health First Aid training: lessons learned from the global spread of a community education program. World Psychiatry 2019; 18:142–143

14 Mental Health First Aid. Washington, DC, National Council for Mental Wellbeing, 2021. https://www.thenationalcouncil.org/topics/mental-health-first-aid. Accessed April 24, 2021

15 Alexander D: Mental Health First Aid Enacted Laws (2015–2018). Washington, DC, National Council for Mental Wellbeing, 2021. https://www.mentalhealthfirstaid.org/wp-content/uploads/2018/08/MHFA-enacted-laws-15-18.pdf

16 Mental Health First Aid USA: Research & Evidence Base. Washington, DC, National Council for Mental Wellbeing, 2021. https://www.mentalhealthfirstaid.org/about/research. Accessed April 24, 2021

17 Obama B: Now Is the Time: The President's Plan to Protect Our Children and Our Communities by Reducing Gun Violence. Washington, DC, White House, 2013. https://obamawhitehouse.archives.gov/sites/default/files/docs/wh_now_is_the_time_full.pdf

18 Mental Health First Aid Act of 2016, HR 1877, 114th Cong (2015–2016), HR Rep No 114-786

19 Mental Health First Aid USA: Partners. Washington, DC, National Council for Mental Wellbeing, 2021. https://www.mentalhealthfirstaid.org/partners-2. Accessed April 24, 2021

20 Mental Health First Aid: Funding Opportunities. Washington, DC, National Council for Mental Wellbeing, 2021. https://www.mentalhealthfirstaid.org/funding-opportunities. Accessed April 24, 2021

21 Hadlaczky G, Hökby S, Mkrtchian A, et al.: Mental Health First Aid is an effective public health intervention for improving knowledge, attitudes, and behaviour: a meta-analysis. Int Rev Psychiatry 2014; 26:467–475

22 Morgan AJ, Ross A, Reavley NJ: Systematic review and meta-analysis of Mental Health First Aid training: effects on knowledge, stigma, and helping behaviour. PLoS One 2018; 13:e0197102

23 Kitchener BA, Jorm AF: Mental health first aid training: review of evaluation studies. Aust N Z J Psychiatry 2006; 40:6–8

24 Maslowski AK, LaCaille RA, LaCaille LJ, et al.: Effectiveness of mental health first aid: a meta-analysis. Ment Health Rev 2019; 24:245–261

25 Ng SH, Tan NJH, Luo Y, et al.: A systematic review of youth and teen mental health first aid: improving adolescent mental health. J Adolesc Health (Epub ahead of print, Nov 18, 2020)

26 Sánchez AM, Latimer JD, Scarimbolo K, et al.: Youth Mental Health First Aid (Y-MHFA) trainings for educators: a systematic review. School Ment Health 2021; 13:1–12

27 Mei C, McGorry PD: Mental Health First Aid: strengthening its impact for aid recipients. Evid Based Ment Health 2020; 23:133–134

28 Moher D, Liberati A, Tetzlaff J, et al.: Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. PLoS Med 2009; 6:e1000097

29 Moher D, Shamseer L, Clarke M, et al.: Preferred reporting items for systematic review and meta-analysis protocols (PRISMA-P) 2015 statement. Syst Rev 2015; 4:1

30 Higgins JP, Altman DG, Gøtzsche PC, et al.: The Cochrane Collaboration's tool for assessing risk of bias in randomised trials. BMJ 2011; 343:d5928

31 Hung MS, Chow MC, Chien WT, et al.: Effectiveness of the Mental Health First Aid programme for general nursing students in Hong Kong: a randomised controlled trial. Collegian 2021; 28:106–113

32 Jensen KB, Morthorst BR, Vendsborg PB, et al.: Effectiveness of Mental Health First Aid training in Denmark: a randomized trial in waitlist design. Soc Psychiatry Psychiatr Epidemiol 2016; 51:597–606

33 Jorm AF, Kitchener BA, Fischer J-A, et al.: Mental health first aid training by e-learning: a randomized controlled trial. Aust N Z J Psychiatry 2010; 44:1072–1081

34 Jorm AF, Kitchener BA, O'Kearney R, et al.: Mental health first aid training of the public in a rural area: a cluster randomized trial [ISRCTN53887541]. BMC Psychiatry 2004; 4:33

35 Jorm AF, Kitchener BA, Sawyer MG, et al.: Mental Health First Aid training for high school teachers: a cluster randomized trial. BMC Psychiatry 2010; 10:51

36 Kitchener BA, Jorm AF: Mental Health First Aid training in a workplace setting: a randomized controlled trial [ISRCTN13249129]. BMC Psychiatry 2004; 4:23

37 Svensson B, Hansson L: Effectiveness of Mental Health First Aid training in Sweden: a randomized controlled trial with a six-month and two-year follow-up. PLoS One 2014; 9:e100911

38 Morgan AJ, Fischer J-AA, Hart LM, et al.: Long-term effects of Youth Mental Health First Aid training: randomized controlled trial with 3-year follow-up. BMC Psychiatry 2020; 20:487

39 Morgan AJ, Fischer J-AA, Hart LM, et al.: Does Mental Health First Aid training improve the mental health of aid recipients? The training for parents of teenagers randomised controlled trial. BMC Psychiatry 2019; 19:99

40 Ashoorian D, Albrecht KL, Baxter C, et al.: Evaluation of Mental Health First Aid skills in an Australian university population. Early Interv Psychiatry 2019; 13:1121–1128

41 Carpini JA, Chandra J, Lin J, et al.: Mental Health First Aid by Australian tertiary staff: application rates, modes, content, and outcomes. Early Interv Psychiatry (Epub ahead of print, Nov 25, 2020)

42 Currie R, Davidson K: An evaluation of the initial impact of using educational psychologists to deliver NHS Scotland's 'Scottish Mental Health First Aid: Young People' training programme. Educ Child Psychol 2015; 32:42–48

43 Mendenhall AN, Jackson SC, Hase S: Mental Health First Aid USA in a rural community: perceived impact on knowledge, attitudes, and behavior. Soc Work Ment Health 2013; 11:563–577

44 Rodgers G, Burns S, Crawford G: "I was able to actually do something useful": evaluating the experiences of university students after completing Mental Health First Aid: a mixed-methods study. Adv Ment Health 2021; 19:40–62

45 Svensson B, Hansson L, Stjernswärd S: Experiences of a Mental Health First Aid training program in Sweden: a descriptive qualitative study. Community Ment Health J 2015; 51:497–503

46 Witry MJ, Fadare O, Pudlo A: Pharmacy professionals' preparedness to use Mental Health First Aid (MHFA) behaviors. Pharm Pract 2020; 18:2102

47 Witry M, Karamese H, Pudlo A: Evaluation of participant reluctance, confidence, and self-reported behaviors since being trained in a pharmacy Mental Health First Aid initiative. PLoS One 2020; 15:e0232627

48 Armstrong G, Sutherland G, Pross E, et al.: Talking about suicide: an uncontrolled trial of the effects of an Aboriginal and Torres Strait Islander mental health first aid program on knowledge, attitudes and intended and actual assisting actions. PLoS One 2020; 15:e0244091

49 Banh MK, Chaikind J, Robertson HA, et al.: Evaluation of Mental Health First Aid USA using the Mental Health Beliefs and Literacy Scale. Am J Health Promot 2019; 33:237–247

50 Bond KS, Reavley NJ, Kitchener BA, et al.: Evaluation of the effectiveness of online mental health first aid guidelines for helping someone experiencing gambling problems. Adv Ment Health (Epub ahead of print, May 21, 2020). doi: 10.1080/18387357.2020.1763815

51 Hart LM, Jorm AF, Paxton SJ: Mental health first aid for eating disorders: pilot evaluation of a training program for the public. BMC Psychiatry 2012; 12:98

52 Hart LM, Jorm AF, Paxton SJ, et al.: Mental health first aid guidelines: an evaluation of impact following download from the World Wide Web. Early Interv Psychiatry 2012; 6:399–406

53 Hart LM, Bond KS, Morgan AJ, et al.: Teen Mental Health First Aid for years 7–9: a description of the program and an initial evaluation. Int J Ment Health Syst 2019; 13:71

54 Kelly CM, Mithen JM, Fischer JA, et al.: Youth Mental Health First Aid: a description of the program and an initial evaluation. Int J Ment Health Syst 2011; 5:4

55 Svensson B, Hansson L: Mental health first aid for the elderly: a pilot study of a training program adapted for helping elderly people. Aging Ment Health 2017; 21:595–601

56 Uribe Guajardo MG, Kelly C, Bond K, et al.: An evaluation of the teen and Youth Mental Health First Aid training with a CALD focus: an uncontrolled pilot study with adolescents and adults in Australia. Int J Ment Health Syst 2019; 13:73

57 Uribe Guajardo MG, Slewa-Younan S, Kitchener BA, et al.: Improving the capacity of community-based workers in Australia to provide initial assistance to Iraqi refugees with mental health problems: an uncontrolled evaluation of a Mental Health Literacy Course. Int J Ment Health Syst 2018; 12:2

58 Fisher H, Harding S, Bell S, et al.: Delivery of a Mental Health First Aid training package and staff peer support service in secondary schools: a process evaluation of uptake and fidelity of the WISE intervention. Trials 2020; 21:745

59 Thombs DL, Gonzalez JMR, Osborn CJ, et al.: Resident assistant training program for increasing alcohol, other drug, and mental health first-aid efforts. Prev Sci 2015; 16:508–517

60 Rossetto A, Jorm AF, Reavley NJ: Predictors of adults' helping intentions and behaviours towards a person with a mental illness: a six-month follow-up study. Psychiatry Res 2016; 240:170–176

61 Jorm AF, Nicholas A, Pirkis J, et al.: Associations of training to assist a suicidal person with subsequent quality of support: results from a national survey of the Australian public. BMC Psychiatry 2018; 18:132

62 Yap MBH, Jorm AF: Young people's mental health first aid intentions and beliefs prospectively predict their actions: findings from an Australian National Survey of Youth. Psychiatry Res 2012; 196:315–319

63 El-Den S, Moles RJ, Zhang R, et al.: Simulated patient role-plays with consumers with lived experience of mental illness post-Mental Health First Aid training: interrater and test re-test reliability of an observed behavioral assessment rubric. Pharmacy 2021; 9:28

64 Tannvik TD, Bakke HK, Wisborg T: A systematic literature review on first aid provided by laypeople to trauma victims. Acta Anaesthesiol Scand 2012; 56:1222–1227