Can a structured, behavior-based interview predict future resident success?




Objective


To determine whether a structured, behavior-based applicant interview predicts future success in an obstetrics and gynecology residency program.


Study Design


Using a modified pre-post study design, we compared behavior-based interview scores of our residency applicants to a postmatch evaluation completed by the applicant’s current residency program director. Applicants were evaluated on the following areas: academic record, professionalism, leadership, trainability/suitability for the specialty, and fit for the program.


Results


Information was obtained for 45 (63%) applicants. The overall interview score did not correlate with overall resident performance. Applicant leadership subscore was predictive of leadership performance as a resident (P = .042). Academic record was associated with patient care performance as a resident (P = .014), but only for graduates of US medical schools. Five residents changed programs; these residents had significantly lower scores for trainability/suitability for the specialty (P = .020).


Conclusion


Behavioral interviewing can provide predictive information regarding success in an obstetrics and gynecology training program.


Each academic year, residency programs and applicants expend significant time, energy, and financial resources interviewing applicants for future residency positions. In the 2009 National Resident Matching Program (NRMP) main residency match, a total of 51,882 applicants pursued the 25,185 training positions available in the United States. This included 1796 applicants for the 1185 residency positions in the specialty of obstetrics and gynecology.




For Editors’ Commentary, see Table of Contents




See related editorial, page 369



Despite these investments, little is known about the value of the residency interview in predicting an applicant’s future performance as a resident. For instance, Metro et al reviewed interview scores for their applicants to determine whether the scores correlated with any measures used to evaluate the residents during their training. Interview scores did not correlate with any of the measures, including knowledge, judgment, motor skills, or intrapersonal attitudes. Performance on the United States Medical Licensing Examination (USMLE) has been shown to correlate positively with in-training service examinations, but not with faculty evaluations of resident performance. Other studies have shown that interview scores did not identify physicians at risk of later impairment. Furthermore, resident attrition is a significant problem for many programs. In a cohort of 1055 residents who started obstetrics and gynecology residency training in 2001, McAlister et al found that 21.6% of residents switched programs, changed specialties, completed training on an atypical academic cycle, or left graduate medical education altogether.


In 2006, the St. Vincent Hospital Department of Obstetrics/Gynecology embarked on a program of behavior-based interviewing for all applicants to its residency program, with the hope of identifying applicants who would be successful both in the chosen specialty and in the St. Vincent program. This study reports the ability of this interview process to predict future resident performance.


Materials and Methods


The St. Vincent Hospital Obstetrics/Gynecology residency is a community-based program consisting of 4 residents per academic year (approved for expansion to 5 positions in June 2007). The hospital is 1 of 4 main teaching sites for the required third year obstetrics/gynecology clerkship for medical students from the Indiana University School of Medicine. Approximately 200-250 students apply for the available residency positions, with the majority of the US graduates matriculating from programs in the Midwest region.


Beginning in November 2006, the St. Vincent Hospital Department of Obstetrics/Gynecology adopted a structured, behavior-based interview for applicants to its residency program. Applicants were scored in the following areas: academic record, professionalism, leadership, trainability/suitability for the specialty, and fit for the program. Academic record was scored by the program director through a review of the information available in the written application. The other 4 aspects were measured through a series of behavior-based questions at individual interview stations. Interviews for professionalism, leadership, and trainability/suitability were conducted simultaneously by 2 faculty members; when scheduling prevented the presence of both members, 1 faculty member conducted the interview. Fit for the program was assessed by a panel of 3 residents.


Through literature review and discussions with subject matter experts, behavior-based questions pertinent to each subject area were developed, as were scoring sheets with specific examples of scores for particular responses (measured aspects of each subject area, sample scoring sheets, and sample questions, Supplementary Figures 1-4). Faculty and residents were educated regarding the new interview protocol during a series of formal and informal meetings. During the interviews, faculty and residents could ask any of the potential questions available for their session, allowing some flexibility from interview to interview.


At the end of each interview encounter, every interviewing faculty member or resident completed a score sheet separately and independently from their partner(s). The score for each of the 5 categories was determined by the average of the interviewers’ scores. With each category having a maximum value of 36, the maximum interview score for each applicant was 180. These scores provided the initial rank list of applicants on which all future discussions were based. A database was maintained with all interview data, as well as medical school location (US-based program or international-based program).
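The scoring arithmetic described above can be sketched as follows. This is an illustrative reconstruction only: the category names, data structure, and sample ratings are hypothetical, not the study's actual score sheets.

```python
# Sketch of the interview scoring: each category score is the average of
# that station's interviewers, and the total is the sum over 5 categories.
CATEGORIES = ["academic_record", "professionalism", "leadership",
              "trainability_suitability", "fit_for_program"]
MAX_PER_CATEGORY = 36  # so the maximum total interview score is 180


def score_applicant(ratings):
    """ratings maps each category to the list of scores (each out of 36)
    given by the interviewers at that station."""
    category_scores = {cat: sum(vals) / len(vals)
                       for cat, vals in ratings.items()}
    total = sum(category_scores.values())  # out of 180
    return category_scores, total


# Hypothetical applicant: 1 program-director score for academic record,
# 2 faculty per behavioral station, and a 3-resident panel for fit.
cats, total = score_applicant({
    "academic_record": [30],
    "professionalism": [28, 32],
    "leadership": [27, 29],
    "trainability_suitability": [31, 33],
    "fit_for_program": [24, 30, 27],
})
# total here is 30 + 30 + 28 + 32 + 27 = 147 of a possible 180
```

Ranking applicants by this total would then produce the initial rank list described above.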


For all applicants from November 2006 through January 2008, eventual match sites were identified through a combination of (1) the NRMP Match Results by Matched Applicant provided to the St. Vincent program, (2) the Council on Resident Education in Obstetrics and Gynecology (CREOG) resident database, and (3) individual program web site reviews. The program director for each program was identified through the listing of accredited programs at the Accreditation Council for Graduate Medical Education website ( www.acgme.org ).


An electronic survey was developed and sent to each director of a program into which a St. Vincent applicant had matched. Program director e-mails were identified through the ACGME database. E-mails with links to the electronic survey were sent to the appropriate program directors in September 2009, at which time the former applicants would have completed either 1 or 2 years of residency training. The survey was designed to measure the following resident attributes:




  • Patient care



  • Medical knowledge



  • Surgical skills



  • Communication



  • Professionalism



  • Clinical documentation



  • Leadership



  • Teamwork



  • Overall impression



In addition, questions regarding awards, disciplinary actions for academic or professionalism-related issues, and the resident’s continued status in the program were included. If the resident was no longer with the training program, program directors were asked to explain the change in training status. Information on type of program (community or university) was also collected.


For residents matching into the St. Vincent training program, it was thought that completion of the survey by the program director could introduce bias, as he is the principal investigator of this study (E.S.). Therefore, for St. Vincent residents the same survey was completed by the consensus opinion of a core group of 3 faculty members with extensive clinical experience with the residents.


To maintain applicant confidentiality, all data were deidentified once the electronic survey responses had been collected and matched to the original interview scores. Student t tests were used to compare means between groups, and linear regression was used to determine relationships between pretest (interview) and posttest (electronic survey) scores. A P value < .05 was considered significant.
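The analysis plan can be illustrated with a short SciPy sketch. The numbers below are synthetic (the study's individual-level data are not reported); the grouping and scales are assumptions for illustration only.

```python
# Illustrative sketch of the two analyses: a Student t test comparing
# group means, and a linear regression of posttest on pretest scores.
from scipy import stats

# Hypothetical total interview scores (out of 180) for residents who
# stayed in their program vs. residents who changed programs.
stayed = [150.0, 142.5, 161.0, 138.0, 155.5]
changed = [120.0, 131.5, 118.0, 127.0, 124.5]
t_stat, t_p = stats.ttest_ind(stayed, changed)

# Hypothetical pretest (interview) and posttest (survey) scores for the
# same residents; linregress reports the slope and its P value.
interview = [150.0, 142.5, 161.0, 138.0, 155.5, 120.0, 131.5]
survey = [7.8, 7.1, 8.4, 6.9, 8.0, 5.5, 6.2]
reg = stats.linregress(interview, survey)

significant = t_p < .05  # the study's significance threshold
```

With these synthetic values the group means clearly differ and the regression slope is positive, so both tests come out significant; with real data either could fail to reach the .05 threshold, as most comparisons in this study did.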


The project was submitted to the St. Vincent Hospital Institutional Review Board (IRB) and was granted exempt status as an educational project.




Results


From November 2006 through January 2008, a total of 80 applicants interviewed for residency positions in obstetrics/gynecology at St. Vincent Hospital. Of these applicants, 65 (81%) were women, 15 (19%) were men, and 74 (92.5%) were from US allopathic training programs. Of these, 72 matched into obstetrics/gynecology training programs, and 8 either did not match or matched into other specialty programs (4 family medicine, 1 pediatrics, 1 general surgery, 1 anesthesia, and 1 unmatched).


The 72 applicants matched into a total of 42 obstetrics/gynecology residency training programs. Of the 42 electronic surveys sent to these program directors, a total of 24 surveys (57%) were completed, providing follow-up data for 36 of the applicants. Including the 9 residents matching into the St. Vincent program, this provided follow-up data for 45 (63%) of the applicants matching into obstetrics/gynecology training programs.


In evaluating the program directors’ assessment of overall resident performance, there was no significant relationship between the overall interview score and the overall performance score (P = .784, Figure 1). This remained the case when results were stratified by type of training program (university based, n = 25, P = .337; community based, n = 20, P = .952).

