Effectiveness of workshop training for psychosocial addiction treatments: a systematic review

This entry is our analysis of a review or synthesis of research findings considered particularly relevant to improving outcomes from drug or alcohol interventions in the UK. The original review was not published by Findings; click Title to order a copy. Free reprints may be available from the authors – click prepared e-mail. The summary conveys the findings and views expressed in the review. Below is a commentary from Drug and Alcohol Findings.


Effectiveness of workshop training for psychosocial addiction treatments: a systematic review.

Walters S.T., Matson S.A., Baer J.S. et al.
Journal of Substance Abuse Treatment: 2005, 29(4), p. 283–293.
Unable to obtain a copy by clicking title? Try asking the author for a reprint by adapting this prepared e-mail or by writing to Dr Walters at Scott.Walters@unthsc.edu.

Concludes that retaining psychosocial therapy skills after the popular workshop training format requires follow-up consultation, supervision or feedback. Rather than simply ‘ticking the box’ by sending staff to one-off workshops, the implication is that services must invest much more to be confident the investment has paid off for clients.

Summary

Training providers to deliver evidence-based substance use interventions is an important step in disseminating these interventions. Training can take many forms, including readings, internet-based courses, stand-alone lectures, and immersion experiences. Most commonly, training appears to be a discrete one- to three-day workshop consisting of lectures, discussion, and role play of clinical strategies.

This review focused on workshop training for the treatment of substance use problems. It included workshops aimed at all providers who might reasonably provide substance use treatment in the community. It was restricted to training formats that were reasonably discrete and designed for practitioners already working in the community [Editor’s note: rather than students learning their professions]. To be included, studies had to have been published in English in a peer-reviewed journal and measured outcomes (eg, knowledge, attitudes, demonstration of skills) from training in a psychosocial intervention for substance abuse, such as motivational or cognitive-behavioural approaches, counselling, or brief advice. The training had to have spanned less than 40 hours of contact time and trained practitioners to provide services in the community rather than as part of a clinical trial or academic training.

Seventeen such studies were found. Of the 17 workshops, eight were primarily directed at health care providers and another at a mix of health care workers, psychologists, and social workers. Workshops varied in length from 20 minutes to five-day residency programmes and tended to emphasise relatively circumscribed screening, advice, and negotiation skills for smoking or other substance use disorders.

Eight other workshops were directed at behavioural health providers such as psychologists, counsellors, social workers, and criminal justice caseworkers. These tended to be longer (12 to 35 hours plus additional supervision components) and to emphasise more sophisticated clinical strategies such as motivational interviewing or cognitive-behavioural and network therapies.

Across both categories of trainees, the most widely used format was a discrete educational workshop; only one health care programme, but six aimed at behavioural health workers, included follow-up components. Featuring in 10 of the 17 training programmes, motivational interviewing was the most frequently taught method. Typically, assessment of impact was confined to comparing trainees’ responses to questionnaires after versus before workshops, or post-workshop ratings of trainees’ interactions with people acting as standardised patients. One study of training health care workers included a follow-up evaluation of skill retention beyond the end of the workshop, and two additional studies measured patient outcomes as a function of provider training. In contrast, five of the eight studies of behavioural health providers followed up attendees for at least two months, and two studies measured patient outcomes.

Main findings and conclusions

In summary, a review of 17 studies evaluating workshop training in substance use treatment revealed generally positive effects on self-reported knowledge and attitudes, some support for skill acquisition immediately after workshops, but less evidence that these gains were sustained over time. Additional contact tended to reduce the attrition in skills seen after one-off training experiences. Because of the limited number of studies, small sample sizes, and variations in training topics and goals, these conclusions should be seen as preliminary. In fact, given the ubiquity of workshop training in substance use sectors, the lack of outcome research is striking. More details below.

Almost without exception, studies found improvements in attendees’ own assessments of their knowledge and attitudes toward clients with substance use problems, and their competence in responding to them. However, although a positive attitude towards and knowledge about an intervention may be preconditions to successfully adopting it, belief in one’s ability may not be a good indicator of proficiency. Studies which compared trainees’ perceptions with observers’ ratings of their interactions with clients uniformly found trainees overestimated their skills. There was likewise a limited relation between self-reported skills and client responses. Indeed, the (perhaps false) belief that one has learned a technique may counterproductively deter additional training. For these reasons, it seems important to balance positive self-perception with objective feedback on clinical performance.

Workshops did seem to produce modest increases in skills for some attendees as demonstrated through videotaped or audiotaped interactions with real or simulated clients, or reports from clients. There was evidence that relatively brief training improved skills and increased the frequency of screening and brief interventions in medical settings. A related finding is that distance training via computer or videotapes can have a modest impact on skills, significant given the relatively low cost and high portability of these formats. Workshops spanning two or three days ended with 40–60% of trainees being assessed as competent in the intervention. The most intensive training programme (workshop training, videotaped interactions, and supervision of three to four clinical cases over 12 weeks) achieved a figure approaching 90%. There was also good evidence for positive changes in trainees’ written responses to clinical vignettes, both immediately after workshops and at follow-ups up to four months later, though the relationship between these responses and clinical performance needs clarifying.

Although several studies analysed results by trainees’ sex, prior education and experience, or type of profession/occupation, few such characteristics were related to how well they learnt and adopted the interventions. One study did however find that trainees more proficient in motivational interviewing before training learned and retained more skills at the follow-up assessment.

Following training with additional learning contacts in the form of, for example, coaching and supervision appears to have uniformly positive effects. Findings included:
• improved 24-week patient outcomes as a result of provider training, weekly supervision, and monthly videotaped role play;
• counsellors who received feedback on their performance and/or coaching retained more skills than those who only attended a workshop;
• better outcomes with more intensive training: highest adherence and skill ratings among trainees who received a seminar, supervision, and manual (33 hours of training over four months); intermediate improvements among those who completed web training with a manual (26 hours); and the poorest ratings among those who completed a manual only (10 hours).

In contrast, there was somewhat less evidence that substance use clinicians readily learned complex techniques, such as motivational interviewing or cognitive-behavioural therapy, solely from a discrete two- to four-day workshop, and maintained those skills over time. These important findings may reflect the need to build ongoing components into training initiatives, to train existing supervisors to provide supervision, or to consider five- to ten-day immersion training programmes with extensive experiential content. Likewise, there was some evidence for a deterioration in skills if not accompanied by additional consultation or support. For many trainees, a discrete workshop does not lead to long-term improvements in clinical skills. Though findings strongly support the value of additional supervision or monitoring, it seems difficult to get some practitioners to engage in these. When studies encouraged participants to attend additional supervisory groups or submit audiotapes of client sessions, few actually did so.

In contrast to the clear benefits of additional post-workshop components, it was unclear whether the length of the initial training experience itself makes a difference.

These results from training programmes should be interpreted in the context of the larger process of disseminating innovations. This has been described as a series of steps. Exposure to new ideas usually comes through educational lectures, readings, or consultation with experts; adoption is a formal decision by an individual or organisation to try an innovation; implementation is when providers are testing the innovation; and, finally, providers incorporate an innovation into regular use. Viewed in this context, discrete workshop training is probably best seen as exposure to new information; it is important to realise that other experiences may be needed to promote exploratory use and incorporation into routine practice.

These 17 workshops may not be representative of workshops delivered in the field outside of a research context. Research on training outcomes appears to be in its infancy and few evaluation designs meet standards now assumed for research in clinical treatments. For example, only four of the 17 studies included a control condition and none documented wider changes in the treatment organisation. In addition, apparent improvements in knowledge and skills can be due simply to repeated administrations of the tests and not genuinely reflect skill acquisition. The studies also largely relied on changes from baseline to post-training to indicate training effects; few established that trainees were actually proficient in the targeted skills.


Commentary

The process of selecting seminal and key research for the drug and alcohol treatment matrices revealed that this review from 2005 has not been superseded and remains a valuable resource. Its findings on the value of post-training supervision or coaching based on feeding back trainees’ actual performance were confirmed by a later review focused on training in motivational interviewing, which included but was not limited to substance use studies.

The implication of such findings is that post-workshop feedback on performance with clients and/or expert coaching should be mandated and supported by services and managers, because without this practitioners tend not to engage with the ongoing support needed to translate training into better outcomes for clients. That turns training into an extended workforce development programme. Given the review’s findings that self-ratings bear little relation to actual competence, the process cannot be considered successful until the trainee has demonstrated competence, preferably through ratings of session recordings – a labour-intensive process. Rather than simply ‘ticking the training box’ by sending staff to one-off workshops, managers and services must invest much more before they can be confident that the investment has paid off for clients.

Last revised 06 October 2014. First uploaded 06 October 2014


