Every year, residency program directors and departments of obstetrics and gynecology undertake the time-consuming, labor-intensive, but important process of screening, interviewing, and ranking applicants for their residency programs. Each hopes to select a new class of future obstetrician-gynecologists who will love their chosen career; become successful, knowledgeable, and skilled physicians and surgeons in the specialty; and continue to advance the field. The impact on applicants is also substantial in terms of time away from home or medical school, application fees, and travel expenses for interviews.
See related article, page 446
In the 2009 National Resident Matching Program (NRMP) main residency match, 1591 applicants applied for the 1185 positions offered in obstetrics and gynecology. Successfully matched applicants from US schools ranked an average of 9.7 programs; unmatched US applicants ranked 6.2, and matched and unmatched independent applicants ranked 6.7 and 3.4 programs, respectively. From these numbers we can estimate that over 12,000 interviews for residency positions in obstetrics and gynecology were conducted in the 2009-2010 application season.
The NRMP surveyed all residency program directors participating in the 2010 main residency match; 49.6% of program directors from the 246 Accreditation Council for Graduate Medical Education (ACGME)-approved obstetrics and gynecology programs in the United States responded. The average obstetrics and gynecology program received and screened 356 applications. For an average of 5 available resident positions, each program interviewed 62 applicants, or roughly 12 applicants for each available slot. Multiple factors were considered in selecting applicants to interview and in ranking them: letters of recommendation, academic performance, United States Medical Licensing Examination (USMLE) scores, the medical school performance evaluation (MSPE, or Dean's letter), the personal statement, medical school reputation, research experience, volunteer and other experiences, and perceived commitment to the specialty. However, on a scale from 1 (not at all important) to 5 (very important), program directors rated interactions with faculty and residents and interpersonal skills demonstrated during the interview as the most important factors in determining their rank list (4.7 each).
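The interview-volume estimate can also be sanity-checked from the program side; a minimal sketch, assuming (and this is an assumption, not a reported result) that the survey average from the roughly half of program directors who responded generalizes to all accredited programs:

```python
# Back-of-envelope cross-check of the "over 12,000 interviews" estimate,
# using the program-side figures cited in the text.
# Assumption: the survey average of 62 interviews per program (reported
# by the 49.6% of directors who responded) holds across all programs.

accredited_programs = 246      # ACGME-approved ob-gyn programs
interviews_per_program = 62    # average among survey respondents

estimated_interviews = accredited_programs * interviews_per_program
print(estimated_interviews)    # 15252
```

The resulting figure of 15,252 is of the same order as, and consistent with, the "over 12,000" estimate derived from the applicant-side numbers.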
Once an applicant is matched to a program, there is considerable investment on the part of the institution, hospital, department, and faculty in the professional development of each resident. A discouragingly high rate of attrition has been reported among residents entering obstetrics and gynecology training programs in the United States. McAlister et al examined American Medical Association Graduate Medical Education (GME) Census data for residents entering an obstetrics and gynecology residency program in 2001 and found that almost 22% did not complete training in their initial program. Thirteen percent of the entering residents changed to a different obstetrics and gynecology program, 7% changed specialty, and 2% left GME altogether. When a resident leaves a program, it can be highly disruptive, having a negative effect on the educational environment for the remaining residents and challenging the residency program director, who must rearrange schedules, ensure compliance with duty hour requirements, and attempt to recruit a qualified replacement resident.
Unfortunately, the criteria currently used for residency selection do not always predict future resident performance. In obstetrics and gynecology as well as in other specialties, selection criteria, including interview scores, do not consistently correlate with subsequent resident success or risk for attrition. Given the high stakes in selecting the right residents for a program as well as the considerable time and expense involved in the selection process, program directors in obstetrics and gynecology and other specialties continue to search for selection criteria or methods that will better predict the likely success of an applicant.
In the business world, the traditional unstructured resume-based job interview has largely been replaced by the structured behavior-based interview. Behavioral interviewing was first described in the 1980s by the industrial psychologist Tom Janz and is based on the premise that past behavior predicts future performance. The employer identifies job-specific critical behaviors and characteristics important for success and develops a set of standard interview questions. These questions ask applicants to describe a time when they faced a situation similar to one they would encounter in the prospective position, how they handled it, and what they learned from the experience. This structured interview process has been shown to better predict subsequent job performance in fields outside the health professions.
Over the last several years, behavioral interviewing has begun to be used in the GME selection process in several specialties. In obstetrics and gynecology, educators have been introduced to the technique through presentations at various venues by the Council on Resident Education in Obstetrics and Gynecology (CREOG). However, limited data confirm that it predicts future job success in medical education as it has in business.
In this edition of the Journal, Dr Strand and colleagues report their experience with structured, behavior-based interviews and the prediction of future resident success in an obstetrics and gynecology residency. The authors developed behavior-based interview questions with specific scoring rubrics in the areas they deemed important for the specialty and their program: professionalism, leadership, trainability/suitability for the specialty, and fit for the program. Faculty and residents were trained in the method and conducted interviews over 2 application seasons. Subsequent performance in postgraduate year (PGY) 1 or 2 was determined by identifying the program to which each interviewed applicant matched and by anonymous survey of that program's director.
The authors found that high leadership scores in the interview were positively associated with leadership as a resident and that low scores in trainability/suitability for the specialty were significantly associated with resident attrition. The study has some limitations: performance evaluations in residency were obtained for only 63% of the applicants interviewed; the extent of training required for faculty and residents in the new interview technique and the reliability of the ratings were not well described; and it is uncertain whether the findings from this single community-based program generalize to other types of residency programs. Nevertheless, these findings are certainly encouraging for residency program directors.
There may be a new strategy on the horizon that can better predict who will be the right resident for our specialty or for an individual program. Structured behavioral interviewing is more objective than traditional interviewing and has been successful in the business world. In addition, the interview questions are based on job-specific skills or competencies, and the process of developing them prompts a program to reflect on the characteristics that may be important for success in its own setting. We hope that investigation will continue in this promising area of residency selection and the prediction of future resident success.