Int J Med Sci 2012; 9(3):228-236. doi:10.7150/ijms.3353
Assessment of Resident Physicians in Professionalism, Interpersonal and Communication Skills: a Multisource Feedback
1. School of Public Health, Harbin Medical University, Harbin 150086, China;
This is an open access article distributed under the terms of the Creative Commons Attribution (CC BY-NC) License. See http://ivyspring.com/terms for full terms and conditions.
How to cite this article:
Qu B, Zhao Yh, Sun Bz. Assessment of Resident Physicians in Professionalism, Interpersonal and Communication Skills: a Multisource Feedback. Int J Med Sci 2012; 9(3):228-236. doi:10.7150/ijms.3353. Available from http://www.medsci.org/v09p0228.htm
Objective: To assess the internal validity and reliability of a multisource feedback (MSF) program by China Medical Board for resident physicians in China.
Method: Multisource feedback was used to assess professionalism and interpersonal and communication skills. A total of 258 resident physicians at 19 hospitals in China were assessed through self-evaluation and by attending doctors, resident peers, nurses, office staff, and patients, each of whom completed a sealed questionnaire. Cronbach's alpha coefficient was used to assess reliability. Validity was assessed by exploratory factor analyses and by profile ratings.
Results: A total of 4128 questionnaires were collected in this study. All responses had high internal consistency and reliability (Cronbach's α > 0.90), which suggests that both the questions and the form data were internally consistent. Exploratory factor analysis with varimax rotation of the evaluators' questionnaires accounted for 70.67% to 75.54% of the total variance.
Conclusion: The current MSF assessment tools are internally valid and reliable for assessing resident physician professionalism and interpersonal and communication skills in China.
Keywords: Resident physician, Multisource feedback, Professionalism, Interpersonal and Communication Skills, international
The Accreditation Council for Graduate Medical Education (ACGME) recommends that residency programs evaluate their trainees under six core competencies: patient care, medical knowledge, practice-based learning and improvement, interpersonal and communication skills, professionalism, and systems-based practice (2). While the ACGME criteria have been gradually accepted by hospitals as well as medical education organizations in China, interpersonal and communication skills remain underrepresented in the Chinese medical education curriculum (4-5).
The ACGME suggests that assessment tools include written examinations, global ratings, 360-degree global ratings, and procedure/case logs (6). Multisource feedback (MSF), or 360-degree feedback, is used to assess physicians' competencies in a broad range of residency programs, including residency programs in family medicine and internal medicine (7-9). By reviewing MSF feedback, physicians can improve their communication skills with patients, modify their communication strategies with nurses, and improve the printed material in their offices (10). Studies of MSF show that reliable and valid feedback instruments (questionnaires) are acceptable to practitioners (11, 12). This study is intended to evaluate the psychometric characteristics of a multisource feedback tool for assessing residents' interpersonal and communication skills in China.
The study received approval from the North China Center of Medical Education Development (NCCMED) and the Conjoint Health Research Ethics Board of the 19 collaborating hospitals.
A total of 258 first-year resident physicians participated at 19 hospitals in 11 provinces of China. These 19 hospitals are part of the “Reform Residency Training Program in China,” a program funded by the China Medical Board, which uses competencies and assessments derived from the Global Medical Education Requirements (Table 1). Each resident received descriptive data (mean and SD) on the results for both himself or herself and the whole group.
Characteristics of the 258 resident physicians who participated at the 19 hospitals
*: Hospital 1 (Affiliated Hospital of Chongqing Medical University); Hospital 2 (Affiliated Hospital of Shanxi Medical University); Hospital 3 (Affiliated Hospital of Capital University of Medical Sciences); Hospital 4 (Affiliated Hospital of Hebei Medical University); Hospital 5 (Affiliated Hospital of Luzhou Medical College); Hospital 6 (Affiliated Hospital of Nanjing Medical University); Hospital 7 (Affiliated Hospital of Harbin Medical University); Hospital 8 (First Affiliated Hospital of China Medical University); Hospital 9 (Shengjing Affiliated Hospital of China Medical University); Hospital 10 (Fourth Affiliated Hospital of China Medical University); Hospital 11 (Affiliated Hospital of Chengde Medical College); Hospital 12 (Affiliated Hospital of Binzhou Medical College); Hospital 13 (Affiliated Hospital of Zunyi Medical College); Hospital 14 (Affiliated Hospital of Inner Mongolia Medical College); Hospital 15 (Affiliated Hospital of Weifang Medical University); Hospital 16 (Affiliated Hospital of Hainan Medical University); Hospital 17 (First Affiliated Hospital of Guangzhou Medical College); Hospital 18 (Second Affiliated Hospital of Guangzhou Medical University); Hospital 19 (Third Affiliated Hospital of Guangzhou Medical University).
The questionnaire for professionalism and interpersonal and communication skills from the Education Outcomes Service Group (EOS group) of the Arizona Medical Education Consortium was developed for attending doctors, residents (self- and peer evaluation), nurses, office staff, and patients. The goal of the EOS group was to provide assistance and support to program directors as they prepare to meet the ACGME outcome requirements. One item on the EOS group's suggested list of methods for evaluating core competencies is the 360-degree evaluation. The assessment tool was refined from 2002 to 2006 by the EOS group to address new curricular elements and evaluation measurements recommended by the ACGME (13). The final questionnaires for attending doctors, resident self-evaluation, and resident peers each consisted of 21 items (Tables 2-4). The questionnaires for nurses (Table 5), office staff (Table 6), and patients (Table 7) consisted of 26, 15, and 23 items respectively, with the same 5-point rating scale (1 = never to 5 = always). The questionnaires also included negative statements, such as “… is condescending to you or patients/families” and “… is abusive to you or patients/families,” for which 1 was the best possible score. All questionnaires provided respondents with the option of indicating whether they were able to evaluate the resident on each item.
The participating residents were enrolled in the selected hospitals in September 2007, and the survey was carried out in May 2008. All investigators were uniformly trained, and questionnaires were kept sealed and confidential when researchers dispatched them to evaluators. Residents were required to complete a self-evaluation. For each resident physician, the education management department of the hospital appointed a group of evaluators: 1 attending doctor, 3 nurses, 7 patients, 2 resident peers, and 2 office staff members. Global assessments of residents' performance were based on at least eight months of contact with each evaluator.
The results of the collected questionnaires were entered into our database and analyzed. When data were found to be incomplete, the questionnaires were returned to the evaluators to be completed again. All evaluations were conducted according to the same principles and guidelines as previous attempts.
Response rates were used to determine feasibility for each of the respondent groups. The percentage of unable-to-evaluate (UE) items, along with the mean and SD, was computed to determine the viability of items in the survey and the score profiles for every item. When the percentage of unable-to-evaluate responses exceeds 10% for an item, it suggests a need to examine the item for revision or deletion. We used exploratory factor analysis to identify the factors and the number of factors for each questionnaire, to describe the relative variance accounted for by each factor, and to assess the coherence of the items within each factor. Reliability was assessed with Cronbach's alpha coefficient for each evaluator group, which enables an assessment of overall instrument stability. Statistical analysis was performed using SPSS version 13.0 (SPSS Inc., Chicago, IL, USA) for Windows®.
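To illustrate the two item-level computations described above, the following sketch (not the authors' code; the ratings matrix and function names are hypothetical) computes Cronbach's alpha for a respondents-by-items ratings matrix and the UE rate for a single item, with unable-to-evaluate responses coded as NaN:

```python
import numpy as np

def cronbach_alpha(ratings):
    """Cronbach's alpha for a (respondents x items) ratings matrix.

    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)
    """
    ratings = np.asarray(ratings, dtype=float)
    k = ratings.shape[1]                          # number of items
    item_vars = ratings.var(axis=0, ddof=1)       # per-item sample variance
    total_var = ratings.sum(axis=1).var(ddof=1)   # variance of respondents' total scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

def ue_rate(responses):
    """Share of 'unable to evaluate' answers (coded as NaN) for one item."""
    responses = np.asarray(responses, dtype=float)
    return np.isnan(responses).mean()

# Hypothetical 5-point ratings from 4 evaluators on 3 items.
ratings = np.array([[4, 5, 4],
                    [3, 4, 3],
                    [5, 5, 5],
                    [2, 3, 2]])
print(round(cronbach_alpha(ratings), 4))  # ~0.98: strongly correlated items

# One item answered by 10 evaluators, one of whom could not evaluate it.
item = [4, 5, float('nan'), 3, 4, 5, 4, 5, 4, 5]
print(ue_rate(item))  # 0.1, i.e. exactly at the 10% revision threshold
```

Under the rule stated above, an item whose UE rate exceeds 0.10 would be flagged for revision or deletion.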
A total of 258 resident physicians participated, producing 258 self-assessments (100% return rate). In addition, 258 (100%) attending doctor surveys were returned (a mean of 1 per resident), 774 (100%) nurse surveys (a mean of 3 per resident), 1806 (100%) patient surveys (a mean of 7 per resident), 516 (100%) resident peer surveys (a mean of 2 per resident), and 516 (100%) office staff surveys (a mean of 2 per resident).
Cronbach's alpha was calculated to determine the internal consistency and reliability of the questionnaires. The overall alpha was 0.913, 0.924, 0.930, 0.921, 0.901, and 0.933 on the attending doctor, self-evaluation, resident peer, nurse, office staff, and patient surveys, respectively. The factor analysis identified the same 2 factors, communication skills and professionalism, on the attending doctor, resident self, resident peer, nurse, and office staff surveys; these accounted for 70.87% (Table 2), 71.01% (Table 3), 70.67% (Table 4), 75.54% (Table 5), and 74.62% (Table 6) of the variance, respectively. There were 4 factors on the patient questionnaire: patient care, professionalism, interpersonal and communication skills, and systems-based practice, which together accounted for 72.67% (Table 7) of the total variance.
Most items on the questionnaires could be answered by the respondents. As presented in Tables 2 to 7, the “Demonstrates respect for the patient's sexual orientation” item had UE rates of more than 10% on the attending doctor (13.2%), resident peer (11.6%), nurse, and patient (10.1%) surveys.
The scores for most items on the attending doctor, resident self, resident peer, nurse, patient, and office staff questionnaires were greater than 4. Low scores from the attending doctors were found on the “Demonstrates respect for nurses,” “Demonstrates respect for support staff,” “Demonstrates responsibility,” and “Maintains complete medical records” items. The “Demonstrates respect for nurses” and “Demonstrates respect for support staff” items received low scores on the resident self-evaluations, and the “Demonstrates respect for support staff,” “Shows compassion for patients and their families,” and “Maintains complete medical records” items received low scores on the resident peer surveys. The “Demonstrates respect for nurses” items on the nurse surveys and the “Demonstrates respect for office staff/unit assistant” item on the office staff surveys received high scores. The “time spent” and “community resources” items on the patient surveys received low scores.
Attending doctor descriptive statistics, unable-to-evaluate (UE) rates and rotated component matrix
PR=professionalism; ICS=interpersonal and communication skills; SD=standard deviation
*: UE rate is more than 10%; †: high score.
Resident self-descriptive statistics, unable-to-evaluate (UE) rates and rotated component matrix
PR=professionalism; ICS= interpersonal and communication skills
*: UE rate is more than 10%, †: high score.
Resident peer descriptive statistics, unable-to-evaluate (UE) rates and rotated component matrix
PR=professionalism; ICS= interpersonal and communication skills
*: UE rate is more than 10%, †: high score
Nurse descriptive statistics, unable-to-evaluate (UE) rates and rotated component matrix
PR=professionalism; ICS= interpersonal and communication skills
*: UE rate is more than 10%; †: high score; #: “negative” statements
Patient descriptive statistics, unable-to-evaluate (UE) rates and rotated component matrix
PC=patient care; PR=professionalism; ICS=interpersonal and communication skills; SP=systems-based practice; *: UE rate is more than 10%; ‡: low score
Office staff descriptive statistics, unable-to-evaluate (UE) rates and rotated component matrix
PR=professionalism; ICS= interpersonal and communication skills
†: high score; #: “negative” statements
In this study, we assessed the psychometric validity and reliability of MSF assessment questionnaires that evaluated resident physicians in professionalism and interpersonal and communication skills. The evaluation was mandatory and the response rates were high, as expected. While most of the items could be answered, specific items on the attending doctor, resident self, resident peer, nurse, and patient questionnaires had UE percentages higher than anticipated. For attending doctors, resident peers, nurses, and patients, these tended to concern aspects of professionalism, specifically respect for patients' sexual orientation. This is a sensitive subject in traditional Chinese culture and may explain the reticence of evaluators to score this domain.
Compared to traditional evaluation methods, the MSF, or 360-degree evaluation method, is more accurate and reliable (14, 15). All instruments had high internal consistency and reliability (Cronbach's α > 0.90), which suggests that both the questions and form data are internally consistent. The exploratory factor analyses with varimax rotation for the attending doctor, resident self, resident peer, nurse, office staff, and patient questionnaires accounted for 70.87%, 71.01%, 70.67%, 75.54%, 74.62%, and 72.67% of the total variance, respectively. These results show that the MSF assessment tools are internally valid.
Resident physicians did well in several respects. Most resident physicians were conscientious and still learning how to become medical professionals. They respected patients' disabilities and appreciated their colleagues' work, and they completed medical records as soon as possible.
However, the results also showed that resident physicians did not pay much attention to spending enough time with their patients or to suggesting community resources for additional information and support. It may be that available community resources in China are not in a readily accessible and searchable format for residents, such as an electronic database. In addition, resident physicians' busy daily work schedules in large hospitals limit the time they have to research community health service programs and to share pertinent information with patients. Furthermore, both Chinese doctors and patients tend to have less affinity for community health services and public health programs, as patients are often satisfied with just a visit to a larger hospital or institution. The resident physicians in this study infrequently inquired about community health services and public health programs.
There are limitations to this study. Data collection was limited to resident physicians at the 19 collaborating hospitals, which are relatively large teaching hospitals in China. The quality of physicians in these hospitals may differ from that of physicians working in smaller, community-based hospitals that are not affiliated with academic medical centers, and we do not know whether resident physicians in other parts of China would have similar performance profiles. There were only 1 attending doctor, 2 resident peers, 3 nurses, 7 patients, and 2 office staff evaluators per resident physician in this study. Future research should increase the number of evaluators per resident to improve the overall reliability of the questionnaire. In addition, most teachers in China are not inclined to give low scores in evaluations like this, which may explain the relatively high scores (16).
A follow-up study to determine how the residents used their data, the changes they made as a result of the feedback, and their perceptions of this type of assessment is certainly warranted and was undertaken in recent resident training programs. The results of the study were collated in a second survey completed in 2009. This data will be used to provide formative feedback in a confidential manner to each resident, and suggestions for improvements will be made. The effects of such feedback and suggestions may then be reflected in the scores obtained during the following year's evaluation. In this way, a progressive improvement in professionalism and interpersonal skills and communication skills could be encouraged and measured.
The MSF or 360-degree feedback questionnaires for resident physicians may provide an internally valid and reliable way of assessing resident physician competencies.
The authors thank the 19 collaborating hospitals.
Financial Support: The study was supported by the China Medical Board; grant number: 06-844.
The authors have declared that no competing interest exists.
1. ACGME competencies: July 1, 2007, requirements. Accreditation Council for Graduate Medical Education. http://www.acgme.org/outcome/comp/GeneralCompetenciesStandards21307.pdf
2. Leach DC. The ACGME competencies: substance or form? Accreditation Council for Graduate Medical Education. J Am Coll Surg. 2001;192:396-398
3. Cao Q, Cao Y, Liu Q. The direction of GMER in Chinese medical education. China Higher Medical Education. 2007;5:24-26
4. Xiao HP, Xian LQ, Yu XQ, Wang JP. Medical curriculum reform in Sun Yat-sen University: implications from the results of GMER evaluation in China. Medical Teacher. 2007;29(7):706-710
5. Sun B, Zhao Y. Medical curricula in China and the USA: a comparative study. Medical Teacher. 2003;25(4):422-427
6. Hobgood CD, Riviello RJ, Jouriles N, Hamilton G. Assessment of communication and interpersonal skills competencies. Acad Emerg Med. 2002;11:1257-69
7. Sargeant J, Mcnaughton E, Mercer S. et al. Providing feedback: Exploring a model (emotion, content, outcomes) for facilitating multisource feedback. Medical Teacher. 2011;33(9):744-749
8. Allerup P, Aspegren K, Ejlersen E. et al. Use of 360-degree assessment of residents in internal medicine in a Danish setting: a feasibility study. Medical Teacher. 2007;29(2-3):166-170
9. Iramaneerat C, Myford CM, Yudkowsky R, Lowenstein T. Evaluating the effectiveness of rating instruments for a communication skills assessment of medical residents. Advances in Health Sciences Education. 2009;14(4):575-594
10. Brinkman WB, Geraghty SR, Lanphear BP, Khoury JC, Gonzalez del Rey JA, DeWitt TG, Britto MT. Effect of multisource feedback on resident communication skills and professionalism: a randomized controlled trial. Archives of Pediatrics & Adolescent Medicine. 2007;161(1):44-49
11. Burford B, Illing J, Kergon C. et al. User perceptions of multi-source feedback tools for junior doctors. Medical Education. 2010;44(2):165-176
12. Lockyer JM. Multisource feedback in the assessment of physician competencies. J Contin Educ Health Prof. 2003;23:4-12
13. Introduction to the EOSG Manual. http://azmec.med.arizona.edu/eos.htm
14. Higgins RSD, Bridges J, Burke JM, O'Donnell MA, Cohen NM, Wilkes SB. Implementing the ACGME general competencies in a cardiothoracic surgery residency program using 360-degree feedback. Ann Thorac Surg. 2004;77:12-17
15. Joshi R, Ling FW, Jaeger J. Assessment of a 360-degree instrument to evaluate residents' competency in interpersonal and communication skills. Acad Med. 2004;79:458-63
16. Zhao W, Ouyang X, Feng X, Li L. The construction of the evaluation model of head nurses with 360-degree feedback as the core. Modern Hospital Management. 2010;5:60-62
Corresponding author: Bao-zhi Sun, MD, School of Public Health, Harbin Medical University. Tel: 86-24-23256666-5466; Fax: 86-24-23261090; E-mail: baozhisun6666com