Article Text

Teaching general practitioners and doctors-in-training to discuss advance care planning: evaluation of a brief multimodality education programme
  1. Karen Detering1,
  2. William Silvester1,
  3. Charlie Corke2,3,
  4. Sharyn Milnes3,
  5. Rachael Fullam1,
  6. Virginia Lewis4 and
  7. Jodie Renton1
  1. 1Respecting Patient Choices, Austin Health, Heidelberg, Victoria, Australia
  2. 2Respecting Patient Choices, Barwon Health
  3. 3School of Medicine, Deakin University, Victoria, Australia
  4. 4Australian Institute for Primary Care and Ageing, La Trobe University, Melbourne, Victoria, Australia
  1. Correspondence to Dr Karen Detering, Respecting Patient Choices, Austin Hospital, PO Box 5555, Heidelberg, Victoria 3084, Australia; karen.detering{at}


Objective To develop and evaluate an interactive advance care planning (ACP) educational programme for general practitioners and doctors-in-training.

Design Development of training materials was overseen by a committee; informed by literature and previous teaching experience. The evaluation assessed participant confidence, knowledge and attitude toward ACP before and after training.

Setting Training provided to metropolitan and rural settings in Victoria, Australia.

Participants 148 doctors participated in training. The majority were aged 40 years or less, with less than 10 years of work experience; 63% had not trained in Australia.

Intervention The programme included prereading, a DVD, interactive patient e-simulation workshop and a training manual. All educational materials followed an evidence-based stepwise approach to ACP: Introducing the topic, exploring concepts, introducing solutions and summarising the conversation.

Main outcome measures The primary outcome was the change in doctors’ self-reported confidence to undertake ACP conversations. Secondary measures included pretest/post-test scores in patient ACP e-simulation, change in ACP knowledge and attitude, and satisfaction with programme materials.

Results 69 participants completed the preworkshop and postworkshop evaluation. Following education, there was a significant change in self-reported confidence in six of eight items (p=0.008–0.08). There was a significant improvement (p<0.001) in median scores on the e-simulation (pre 7/85, post 60/85). There were no significant differences observed in ACP knowledge following training, and most participants were supportive of patient autonomy and ACP pretraining. Educational materials were rated highly.

Conclusions A short multimodal interactive education programme improves doctors’ confidence with ACP and performance on an ACP patient e-simulation.

  • Advance Care Planning
  • Education Program
  • General Practice
  • Medical Education
  • Communication Training


Advance care planning improves quality of care including end-of-life care1; increases the knowledge of and respect for a person's end-of-life wishes1,2; reduces the likelihood of a person receiving unwanted burdensome treatments1,3; and improves patient and family satisfaction with care.1,3,4 Advance care planning also reduces the risk of stress, anxiety and depression in the surviving relatives of deceased patients.1 Most patients do not have access to advance care planning,5,6 despite research showing that many patients, such as those with chronic diseases, and their families would like information related to their medical condition and prognosis, and would like to participate in discussion about their future medical treatment wishes.7–11 Patients expect their doctors to initiate advance care planning conversations12,13 and appreciate it when they do. Many doctors, however, find these conversations difficult14 and have inadequate training in advance care planning and end-of-life care communication.15,16

Quality communication between doctors, patients and their families is an essential component of advance care planning discussions.1,17,18 Doctors' communication skills do not reliably improve with experience alone19 but can be learned,20–23 and once learned they can be retained.22,24 Interactive workshops on breaking bad news and having end-of-life conversations with patients have been shown to enable doctors to feel better prepared to perform these tasks20–22,24 and to facilitate improved outcomes for patients.25,26

Learning should be an active process, have a problem-centred focus, and should consist of realistic clinically relevant scenarios that doctors may anticipate encountering.25,27,28 It should draw on participants’ previous experience, allow the learner to apply what is being learned, include the opportunity for feedback and reflection,28 be structured and, ideally, should include the use of role models.29 The optimal length of doctor communication skills training programmes is unknown, with previous studies reporting on programme durations ranging from less than 1 day21 to 4 days.20,22–24,26,28,29

We developed a multimodal training programme on advance care planning specifically for general practitioners and doctors-in-training, designed to be time efficient and, therefore, to improve access and completion by these doctors. We hypothesised that this training, the ‘Next Steps’ education programme, would improve doctors’ confidence in undertaking advance care planning conversations with their patients, would improve performance on an advance care planning patient e-simulation, and that the ‘Next Steps’ programme materials would be acceptable to participants. Based on existing evidence, the assumptions underlying the programme logic30 were that developing skills and confidence would make it more likely that advance care planning would be implemented with patients and that, when implemented appropriately, advance care planning would lead to better patient/family outcomes. We report the results of the evaluation of this education programme.


Methods

Development of materials

The ‘Next Steps’ steering committee was established to determine the content and format of the materials and to design the evaluation of the training programme. The committee comprised doctors (specialists and general practitioners), an evaluation expert and other key stakeholders.31 The project was funded by the Victorian Quality Council division of the Department of Health Victoria, Australia.31

The education programme included a DVD, an interactive patient e-simulation, a structured 2 h workshop and a training manual to assist with facilitation of the workshop. The content, delivery and format of materials and the workshop structure were informed by a review of the literature and previous experience in teaching advance care planning and discussing end-of-life issues using e-learning and workshops (KD, WS, CC),32 illustrative videos (CC) and e-simulation (SM)33–36 to Australian doctors and medical students. The development of the programme materials was a detailed process whereby each component was initially developed, approved by the steering committee, and then tested on a sample of doctors, prior to further refinement and finalisation of the programme.

Throughout the different training materials, a structured step-wise approach to advance care planning is followed. The steps are (1) introducing the topic, (2) exploring the concepts, (3) introducing a solution and (4) summarising the conversation. These steps are based on previous literature37 (see box 1). The DVD showed real doctors conducting simulated discussions with actor patients, using their own words but following the four key steps of an advance care planning conversation.

Box 1

‘Next Steps’ Education materials

  • Prereading material—information on advance care planning and the law

  • DVD

    • Scenarios of successful and unsuccessful advance care planning conversations

    • Short acute hospital vignettes for initiating advance care planning discussions

    • Scenario on follow-up discussion

    • Document completion information

  • Patient e-simulation

  • Workshop—120 min interactive workshop

    • Facilitated group discussion

    • DVD scenarios

    • Role play

    • Information provision

  • Training manual—resource for facilitators wishing to deliver the ‘next steps’ advance care planning education

Throughout the DVD scenarios and the e-simulation the four key steps of advance care planning conversations are reinforced:

  1. Introducing the topic

  2. Exploring the topic in relation to the patient's situation

  3. Creating opportunities for change by providing information, offering a solution and discussing it

  4. Summarising conversation and planning for follow-up

The patient e-simulation was adapted from a similar programme developed by the School of Medicine, Deakin University, which successfully taught medical students to have not-for-resuscitation, end-of-life and breaking-bad-news conversations with patients. In the e-simulation, the participant plays the role of the doctor and conducts an advance care planning conversation with a virtual patient. At each time-point during the conversation, the participant chooses from several possible responses/questions (prompts), and a video clip is played to display the patient's response to the selection. The doctor then asks further questions and the conversation progresses. Depending on the choices, the conversation proceeds to a successful advance care planning conversation, or the patient may become upset or angry and leave. Each prompt choice triggers specific feedback informed by the current literature. At the end of the simulation, the participant receives a score and a transcript of their conversation, with information as to why a question/statement scores well or poorly. The e-simulation scoring is based on the appropriateness of each of the responses, and scores of −10, −5, 0, +5 or +10 are assigned to each prompt choice. For example, a perfect response that explored the views of the patient was scored +10, a neutral response that did not advance the conversation but did not upset the patient was scored 0, and a response that would cause the patient to become upset and to terminate the interview was scored −10. The scoring for each of the possible responses was based on the literature review, previous experience with patient e-simulation with several hundred medical students (SM),36 and the advance care planning experience of the members of the steering committee.
The range of possible scores is from −30 (which occurs if the participant makes three consecutive ‘bad’ choices, following which the conversation ends because the simulated patient ‘leaves the consultation’) to +85 (where a very good, structured and appropriate conversation has occurred). The e-simulation could be undertaken as often as participants desired.
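The scoring rules above can be sketched as a small state tracker. This is an illustrative reconstruction only: the class name and prompts are hypothetical, and the real programme's video playback and branching content are omitted.

```python
# Illustrative sketch of the e-simulation scoring rules described in the text:
# each prompt choice is worth -10, -5, 0, +5 or +10, and three consecutive
# 'bad' (-10) choices end the conversation (-30 if they open it).
ALLOWED_SCORES = {-10, -5, 0, 5, 10}

class ACPSimulation:
    """Tracks score and termination for one simulated ACP conversation."""

    def __init__(self):
        self.score = 0
        self.consecutive_bad = 0
        self.transcript = []   # (prompt, score, feedback) for each choice
        self.ended = False

    def choose(self, prompt, score, feedback=""):
        """Record one prompt choice and apply the scoring rules."""
        if self.ended:
            raise RuntimeError("the simulated patient has left the consultation")
        if score not in ALLOWED_SCORES:
            raise ValueError("each prompt is worth -10, -5, 0, +5 or +10")
        self.score += score
        self.transcript.append((prompt, score, feedback))
        self.consecutive_bad = self.consecutive_bad + 1 if score == -10 else 0
        if self.consecutive_bad == 3:
            self.ended = True  # simulated patient leaves the consultation
        return self.score
```

At the end of a run, `score` and `transcript` correspond to the score and conversation transcript the participant receives.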

Education programme and setting

A total of eight workshops were held, four in metropolitan and four in rural settings. The aim was to recruit similar numbers of general practitioners and doctors-in-training. Prior to the workshop, participants received brief prereading materials (up to 1 h reading time) and were encouraged to undertake at least one e-simulation patient conversation. The workshop duration was 2 h and included facilitated discussion, use of DVD scenarios, role play and information provision. All participants volunteered to attend the free workshops and were recruited by their local education providers. All workshops were facilitated by one of the ‘Next Steps’ steering committee doctors in conjunction with another facilitator (either the ‘Next Steps’ project officer or a local educational facilitator). Separate workshops were developed for general practitioners and for doctors-in-training. The only significant difference in content between the two is in the section on how to initiate advance care planning discussions: the doctors-in-training workshops include acute hospital vignettes in the DVD and subsequent discussion of these, whereas the general practitioner workshop instead includes a general discussion of how these doctors could initiate advance care planning conversations as part of their usual work. See the online supplementary material for a detailed outline of the workshops.

Evaluation design and tools

The study was approved by the Austin Health research ethics committee. Analysis was based on a convenience sample of participants who volunteered to attend the training programme. All participants received written information regarding the ‘Next Steps’ education programme and were invited to participate in the evaluation by completing pre-education and posteducation questionnaires. Participants completed these anonymously but were asked to create a unique 4–6-digit code so that presurveys and postsurveys could be matched to a unique individual. Presurveys were completed prior to attendance at workshops, and could be either mailed prior to, or returned at the beginning of, the workshop. Participants were asked to complete the survey before the required workshop prereading; however, it was not possible to ensure this sequence. Upon completion of the workshop, participants were given the posteducation evaluation questionnaire and asked to return it immediately or by mail within 2 weeks. Participants were required to register by email address in order to access the e-simulation. Three components were assessed before and after the education programme was undertaken: (1) knowledge of advance care planning was assessed with eight statements requiring responses of true, false or not sure; (2) attitudes towards advance care planning were assessed using 10 statements with responses of strongly agree, agree, disagree or strongly disagree; and (3) confidence in discussing advance care planning was assessed using eight statements with responses of very confident, confident, unconfident or very unconfident. Demographic information and a summary of previous experience with advance care planning were collected on the first questionnaire only.
The post-education questionnaire also included questions regarding the experience, satisfaction and acceptability of the workshop and educational materials, whereby respondents were asked to rate statements with responses of strongly agree, agree, disagree or strongly disagree.

The e-simulation was used to evaluate participant performance in a step-wise advance care planning conversation. Participants’ e-simulation scores, preworkshop and postworkshop, were evaluated as a measure of improved performance. The e-simulation exercise could be completed as often as desired and gave a score out of a maximum of 85 after each attempt. Participants were asked to complete their first attempt before reading the prereading materials or attending the workshop. Each e-simulation attempt was recorded, including the date and time of completion. For the evaluation, the first e-simulation attempt score was used, provided it occurred prior to the workshop, and was compared with the maximum score obtained during subsequent attempts. Given the design of the e-simulation, which measures current performance and assists learning by providing written feedback to participants, subsequent attempts may improve numerical scores; therefore, the maximum score was used rather than the score on the second attempt.
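The attempt-selection rule described above (first attempt as baseline, but only if it predates the workshop; maximum of all later attempts as the post measure) can be sketched as follows. The function name and data shape are assumptions for illustration, not the study's actual analysis code.

```python
def baseline_and_best(attempts, workshop_time):
    """Apply the attempt-selection rule described in the text.

    attempts: list of (timestamp, score) pairs sorted by timestamp, where
    timestamps need only be orderable (datetimes, ISO date strings, ...).
    Returns (baseline_score, best_subsequent_score), or None if there is
    no valid pre-workshop baseline or no later attempt to compare against.
    """
    if not attempts:
        return None
    first_time, first_score = attempts[0]
    if first_time >= workshop_time:
        return None  # first attempt was not completed before the workshop
    later_scores = [score for _, score in attempts[1:]]
    if not later_scores:
        return None  # only one attempt: nothing to compare against
    return first_score, max(later_scores)
```

For example, a participant whose first (pre-workshop) attempt scored 7 and whose later attempts scored 45 and 60 would contribute the pair (7, 60) to the analysis.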

Outcome measures

The primary outcome measure was the change in doctors’ self-reported confidence to undertake advance care planning conversations. Secondary outcome measures included performance on the e-simulation advance care planning discussion, change in advance care planning knowledge and attitude and, finally, the participants’ satisfaction with, and reported acceptability of, the ‘next steps’ education programme materials and approach.

Statistical analysis

Demographic differences and differences in prior advance care planning experience between the matched and unmatched samples were assessed using χ2 tests. Differences between total general knowledge scores, pre-education and posteducation, were assessed using a paired-samples t test. Respondents who did not answer one or more of the general knowledge questions were coded as incorrect for those items. Pre-education versus posteducation changes on individual general knowledge items, and responses to attitude and confidence statements, were assessed using McNemar tests. Participants who did not respond to one or more of the latter items were excluded from the analyses. As the data were not normally distributed, Wilcoxon signed rank tests were used to assess differences between scores on the first and subsequent attempts on the e-simulation task. A Bonferroni correction to adjust for multiple statistical comparisons was applied (see table footnotes for details). Adjusted p values are reported, with a probability equal to or less than 0.05 considered meaningful.
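Two of the procedures named above, the McNemar test on paired yes/no responses and the Bonferroni adjustment, can be illustrated in a few lines of pure Python. This is a generic sketch of the standard methods, not the study's actual analysis code, which would typically use a statistics package.

```python
from math import comb

def mcnemar_exact_p(b, c):
    """Exact two-sided McNemar test p value from the discordant pair
    counts: b = pre-yes/post-no, c = pre-no/post-yes. Under the null,
    min(b, c) follows Binomial(b + c, 0.5)."""
    n = b + c
    if n == 0:
        return 1.0
    k = min(b, c)
    tail = sum(comb(n, i) for i in range(k + 1)) / 2 ** n
    return min(2 * tail, 1.0)

def bonferroni(p_values):
    """Bonferroni-adjust p values: multiply by the number of
    comparisons and cap at 1.0."""
    m = len(p_values)
    return [min(p * m, 1.0) for p in p_values]
```

For example, if 8 participants changed from a ‘no’ to a ‘yes’ response after training and only 1 changed the other way, `mcnemar_exact_p(1, 8)` gives p≈0.039 before adjustment.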


Results

The workshops were attended by 148 doctors. The number of doctors who could potentially have attended cannot be determined, as recruitment was via advertisements posted by local education providers. Of those attending a workshop, 119 completed the preworkshop survey, 98 completed the postworkshop evaluation survey, and 69 participants completed both surveys. A total of 64 participants completed the e-simulation at least once (see figure 1).


There was a significant difference in the number of years of practice (0–10 years vs 10+ years, p=0.04) between the participants who only completed the presurvey (n=50) and those who completed both the presurvey and postsurvey (n=69) (see table 1). There was also a significant difference in primary workplace, with more participants in the group who completed both surveys being general practitioners. The majority of participants were 40 years old or less, with less than 10 years of working experience. Seventy-five doctors (63%) had not trained in Australia, and most had practised in Australia for less than 5 years.

Table 1

Demographics for the sample

Prior experience with advance care planning

The preworkshop survey revealed that in the previous 6 months 55% of respondents had initiated an advance care planning conversation, 30% had completed an Advance Care Plan with a patient, and 30% had been a witness for an Advance Care Plan (see online supplementary table S1). Twenty-six percent of respondents, however, also reported observing patients receive treatments that were not consistent with the patients’ wishes (see online supplementary table S1). Forty-eight percent of responding doctors indicated that it was the doctor who usually initiated advance care planning and end-of-life conversations with patients, but 49% of participants felt more than one person should be involved in the discussion (see online supplementary table S2).


The level of self-reported confidence to undertake an advance care planning conversation showed a significant improvement in six out of eight of the areas surveyed (table 2). In all statements there were fewer ‘no answer’ responses in the posteducation survey, possibly reflecting participants’ greater comfort in answering following the education programme.

Table 2

Self-reported confidence related to advance care planning skills, roles and responsibilities (n=69)

Performance in the e-simulation

Of the 64 participants who attempted the e-simulation at least once prior to the workshop, 43 completed it two or more times (median number of attempts=2; see table 3). Among participants with more than one attempt, there was a significant improvement in scores, from a median of 7 pre-education to a median of 60 posteducation (p<0.001).

Table 3

E-simulation performance

Advance care planning general knowledge

The majority of the doctors (pre 88.4%; post 85.4%) answered six or more of the eight advance care planning general knowledge items correctly, with no significant difference in mean scores at baseline compared to posteducation (pre 6.3; post 6.5). There was a significant improvement in the proportion of correct responses to one question: ‘Victorian law permits a person to complete a certificate directing a doctor to withhold or withdraw treatment for their current medical condition’ (p=0.01) (see online supplementary table S3).


The majority of doctors exhibited attitudes that were supportive of patient autonomy and advance care planning, both pre-education and posteducation (table 4). There was no difference between the mean scores at baseline compared with follow-up. The only statement showing a change following education was statement 10: ‘Helping patients complete an Advance Care Plan is emotionally draining’, with fewer doctors considering this to be the case following education (p=0.02). As with the confidence section of the survey, there were fewer ‘no answer’ responses in the postsurveys.

Table 4

Attitudes to advance care planning, patient participation, and decision making (n=69)

Satisfaction with, and usability of, the ‘next steps’ education materials

The majority of participants rated the workshop highly, with 90% agreeing or strongly agreeing that the workshop was well presented, 88% agreeing or strongly agreeing that the pace was appropriate, and 85% reporting the group discussions were useful (see online supplementary table S4). Sixty-seven percent of participants agreed or strongly agreed that the prereading materials were useful, 85% of respondents agreed or strongly agreed that the DVD was ‘a valuable way of learning concepts and skills that might be difficult to learn in a real workplace’, and 83% agreed or strongly agreed that the DVD ‘provided a non-threatening way of learning real-life work-related experiences’. With respect to the e-simulation, 69% of respondents agreed or strongly agreed that this was ‘a valuable way of learning concepts and skills that might be difficult to learn in a real workplace’, and 73% agreed or strongly agreed that it ‘provided a non-threatening way of learning real-life work-related experiences’.


Discussion

Although the study participants may be a self-selected group interested in advance care planning, this is the first study to report on an intervention that successfully improved the confidence and simulated ability of general practitioners and doctors-in-training to undertake advance care planning conversations with patients. This brief (approximately 3 h duration) multimodal advance care planning education programme significantly improved the doctors’ self-reported confidence in discussing advance care planning, improved their scores on the patient e-simulation exercise and was rated highly by the majority of participants. The generalisability of these findings may be limited by the participant evaluation completion rate. While there was no direct measure of clinician behaviour following the training, the e-simulation and role-play elements of the programme enabled participants to benchmark their performance and evaluate their personal improvement over time. A meta-analysis of technology-enhanced simulation for health professions education found that simulation training showed a moderate pooled effect size of 0.50 (95% CI 0.34 to 0.66; p<0.001).38

The optimal length of doctor communication skills training programmes is unknown. Previous studies that focus on clinical communication skills23 and end-of-life communication skills training have shown similar improvements in levels of confidence and performance,20,22,24,28,29 but these programmes are 1–4 days in duration. For many doctors, attendance at such workshops may be unrealistic. Recently, an Australian study of junior doctors undertaking a training programme, consisting of 3 h of face-to-face training and a further 2 h reviewing take-home written and audiovisual materials, showed improvements in their communication skills and confidence with end-of-life communication.21

Training programme elements

This training programme was built around the following elements, based on evidence from the literature:25,27,28,37,39 the structured standardisation of the advance care planning discussion into four steps; the use of video role modelling; standardised patient e-simulation; the use of role play, group discussion and feedback during the workshop; and the provision of prereading material. These components are discussed further below.

In our experience and that of others, one of the frequently cited barriers to undertaking advance care planning discussions is the belief that the conversation takes too long.6,15 This may be partly related to the difficulty doctors have in planning the conversation in such a way that it can be incorporated into clinical practice in a time-efficient manner. This education programme attempted to overcome this barrier by teaching a semistructured approach to the advance care planning conversation. Based on the work by Briggs,37 we structured the conversation into four sequential steps that form a linear progression the conversation can follow (see box 1). These steps can be taught in a similar way to the ‘PREPARED’ skills relevant to end-of-life and palliative discussions.40

The materials used in this training programme were specifically designed to facilitate learning. First, prereading materials were developed to ensure baseline knowledge of advance care planning and the relevant law, and to encourage positive attitudes prior to undertaking the active skills-based part of the programme. This decision was based on our previous experience in teaching advance care planning to health professionals,32 and is supported by the work of others.22,25,28 In this study, prior to attending the workshop, our participants demonstrated a very high level of pretraining advance care planning knowledge, and generally exhibited attitudes that were supportive of patient autonomy and advance care planning. Due to the evaluation design, we are unable to determine whether this was, in some cases, the result of the prereading materials or whether these doctors, all of whom were volunteers, already had high pre-existing knowledge and positive attitudes in this area.

The patient e-simulation and the DVD with actor patients and real doctors enabled a practical and standardised learning experience for participants, and provided highly skilled communicator doctor role models. The e-simulation also delivered a novel and valuable measure of pretraining practical skill that provided a baseline for comparison after completing the programme. Despite the high levels of knowledge and positive attitudes expressed in the pretraining survey, the data from the first attempt at the e-simulation indicated relatively low levels of communication skills, and these increased significantly after the workshop.

All the tools used in the training have previously been shown to be important in facilitating the learning of communication skills.20,22,26–28,39,41 Furthermore, the e-simulation has the additional advantages of allowing assessment, immediate feedback and the ability to repeat the exercise as frequently as desired, thus also assisting learning. The DVD could also be viewed again after the training. The workshop included facilitated discussion and role play, tools that have been shown to be extremely useful in communication skills training programmes.20,22,25,26,28,29 Participants in this study found the programme materials acceptable, as evidenced by their ratings in the posteducation questionnaire. Most participants rated the education materials highly, found them useful and agreed or strongly agreed with statements describing the tools as a valuable and non-threatening way of learning.

Although this education package was quite short (total duration of approximately 3 h) compared with many other communication skills programmes,20,22,24,26,28,29 some very positive and encouraging results were achieved. Clayton and colleagues also found positive results with their brief training programme targeting junior doctors,21 but their programme required highly skilled facilitators and the use of actor patients during the workshop, which may limit its generalisability and accessibility. In the ‘Next Steps’ programme, the development of a workshop manual to assist less experienced facilitators to deliver the programme, and the use of the standardised e-simulation and DVD tools, will potentially give more doctors the opportunity to undertake this training.

Limitations of this study

The pre/post design of the evaluation was appropriate for the purpose of demonstrating the acceptability of the training to participants and its effectiveness in increasing self-confidence. Future studies could use a randomised controlled design and include follow-up of participants to assess change in actual clinical practice related to advance care planning discussions. A further limitation of this study is the amount of missing data in each of the sections. As all participants volunteered, it is unknown whether similar results would be seen if the training were made compulsory. On the one hand, the participants may be a biased sample already primed with a positive attitude; on the other hand, the results may suggest that the training would be of even more benefit to those with a less positive baseline attitude to, or knowledge of, advance care planning. During this study, we only used ‘Next Steps’ project doctors as facilitators for the workshops and, thus, do not know whether using potentially less experienced facilitators (aided by the training manual) would produce similar results. Finally, the evaluation performed during this study was not able to ascertain the impact of the individual components of the package to determine which ones were responsible for the positive outcomes.

Implications of this study

We believe this education programme offers a practical and acceptable way to assist doctors to develop skills to improve their confidence and competence in undertaking advance care planning conversations with their patients. By making this programme widely available, and encouraging doctors to participate in this training, it would be expected that more patients would have access to advance care planning and, thereby, an improved quality of care including at the end of their life.


Conclusions

This short multimodal education programme improved doctors’ confidence in undertaking advance care planning discussions with patients, and improved their performance on an e-simulation. Additionally, the materials were rated highly and were acceptable to the participants.


Acknowledgements

‘Next Steps’ steering committee: Associate Professor Charlie Corke, Ms Maree Cuddihy, Dr Karen Detering, Mr Robbie Ferguson, Dr Robert Grenfell, Dr Patrick Kinsella, Associate Professor Virginia Lewis, Ms Sharyn Milnes, Mr Bryce Prosser, Dr Tom Rozen, Associate Professor William Silvester, Ms Carolyn Stapleton. We would also like to acknowledge all the participants who volunteered to undertake the ‘Next Steps’ training and complete the evaluation, and the doctors and actors who were involved in the DVD/e-simulation scenarios.


Supplementary materials


  • Contributors KD: Responsible for the research design, implementation and manuscript writing. WS: Overseeing research design, implementation and manuscript writing. CC: Involved in the research design and implementation and review of the manuscript. SM: Involved in the research design, the e-simulation and implementation and review of the manuscript. RF: Conducted the data analysis and was involved in the writing of the manuscript. VL: Contributed to the design, implementation and review of the manuscript. JR: Involved in the design and implementation of the intervention.

  • Funding The development and evaluation of the ‘Next Steps’ education programme was funded by the Victorian Quality Council, Department of Health Victoria.

  • Competing interests None.

  • Ethics approval Austin Human Research Ethics Committee.

  • Provenance and peer review Not commissioned; externally peer reviewed.