The Coalition for
Physician Accountability’s
Undergraduate Medical
Education-Graduate
Medical Education
Review Committee (UGRC):
Recommendations for Comprehensive
Improvement of the UME-GME Transition
https://physicianaccountability.org/
Table of Contents

Overview ..... 3
UGRC Members ..... 4
Acknowledgments ..... 5
Executive Summary ..... 6-9
Final UGRC Recommendations and Themes ..... 10-26
Learner’s Journey ..... 27
UGRC Process ..... 28-33
Future Ideal State ..... 34-37
Impact of Public Commentary ..... 38
Consolidation and Sequencing ..... 39-41
Limitations ..... 42
References ..... 43
Appendix A: Glossary of Terms and Abbreviations ..... 45-46
Appendix B: Workgroup Ishikawa Diagrams (Fishbones) Created for Root Cause Analysis ..... 47-50
Appendix C: Final Recommendations with Complete Templates ..... 50-123
Appendix D: Preliminary Recommendations Released on April 26, 2021 ..... 125-138
Appendix E: Analysis of the Public Comments ..... 139-275
Overview
In 2020, the Coalition for Physician Accountability (Coalition) formed a new committee to examine the transition
from undergraduate medical education (UME) to graduate medical education (GME). The UME-GME Review
Committee (the “UGRC” or the “Committee”) was charged with the task of recommending solutions to identified
challenges in the transition. These challenges are well known, but the complex nature of the transition together
with the reality that no single entity has responsibility over the entire ecosystem has perpetuated the problems
and thwarted attempts at reform.
Using deliberate and thoughtful methods, the UGRC spent 10 months exploring, unpacking, discussing, and
debating all aspects of the UME-GME transition. The Committee envisioned a future ideal state, performed a root-cause analysis of the identified challenges, repeatedly sought stakeholder input, explored the literature, sought
innovations being piloted across the country, and generated a preliminary set of potential solutions to the myriad
problems associated with the transition. Initial recommendations were widely released in April 2021, and feedback
was obtained from organizational members of the Coalition as well as interested stakeholders through a public
call for comment. This feedback was instrumental to refining, altering, and improving the recommendations into
their final form. The UGRC also responded to feedback by consolidating similar recommendations, organizing
them into more cogent themes, and sequencing them to guide implementation.
The UGRC has presented a total of 34 final recommendations, organized around nine themes, for comprehensive
improvement of the UME-GME transition. The Committee has formally handed off these recommendations to the
Coalition for their consideration and implementation. Importantly, the UGRC strove to abide by an agreed upon set
of guiding principles that gave primacy to the public good and that championed diversity, equity, and inclusion. The
Committee believes that the recommendations are interconnected and should be implemented as a complete
set. Doing so will create better organizational alignment, likely decrease student costs, reduce work, enhance
wellness, address inequities, better prepare new physicians, and enhance patient care.
UGRC Members
The members of the UME-GME Review Committee (UGRC), their pertinent constituency or organizational
aliation, and their workgroup assignments are listed in the table below. Please refer to the section of this report
on “UGRC Process” for details about each workgroup. A complete glossary of terms and constituency/organization
names can be found in Appendix A.
Name Constituency/Organization Workgroup Assignment
Richard Alweis DIO B
Steven Angus DIO A
Michael Barone NBME A
Jessica Bienstock DIO D and Bundling
Maura Biszewski AOA D
Craig Brater ECFMG A and Bundling
Jesse Burk Rafel Resident C
Andrea Ciccone Lead Coalition Staff Member Unassigned
Susan Enright (Workgroup B Leader) Medical Education B
Sylvia Guerra Student B and DEI
Daniel Giang (Bundling Workgroup Leader) DIO C and Bundling
John Gimpel NBOME B
Karen Hauer (Workgroup A Leader) Medical Education A
Carmen Hooker Odom Public Member B
Donna Lamb NRMP B
Grant Lin Resident D, DEI, and Bundling
Elise Lovell (UGRC co-chair) OPDA C
George Mejicano (UGRC co-chair) Medical Education D and DEI
Thomas Mohr AACOM C
Greg Ogrinc ABMS D and DEI
Juhee Patel Student A
Michelle Roett (DEI Workgroup Leader) Medical Education D and DEI
Dan Sepdham Residency Program Director C
Susan Skochelak AMA D
Julie Story Byerley (Workgroup D Leader) Medical Education D and Bundling
Jennifer Swails (Workgroup C Leader) Residency Program Director C and Bundling
Jacquelyn Turner Clerkship Director C and DEI
Alison Whelan AAMC B
Pamela Williams Medical Education A
William Wilson Public Member A
Acknowledgements
The UGRC would like to acknowledge the significant contributions of the following individuals, whose efforts were
integral to the successful completion of the Committee’s charge.
We greatly appreciate the dedication and commitment demonstrated over the past year by the UGRC
workgroup leads: Julie Byerley, Susan Enright, Daniel Giang, Karen Hauer, Michelle Roett, and Jennifer Swails. Our
public members, Carmen Hooker Odom and Reverend William Wilson, consistently focused our work on our
ultimate responsibility, the public good.
Andrea Ciccone (NBME, AOA) served as primary Coalition staff support to the UGRC and provided invaluable
insights, perspectives, and organization.
We sincerely thank our outstanding project manager Chris Hanley (AAMC) and communications director Joe
Knickrehm (FSMB). Research librarians Kris Alpi, Robin Champieux, and Andrew Hamilton (Oregon Health & Science
University) enthusiastically guided our evidence-informed approach. A team consisting of Dana Kerr, Matthew
Roumaya, Carol Morrison, Ulana Dubas, and Lauren Foster (NBME) performed the important analysis of the public
commentary, while Susan Morris and Sheila FitzPatrick (ABMS) managed graphic design. Our medical writer
Victoria Stagg Elliott (AMA) contributed significantly to the creation of this report.
The members of the Coalition Management Committee met throughout this process and offered relevant
guidance and context.
Finally, we are eternally grateful to each member of the UGRC, who contributed their time, passion, expertise, and
experience from across the arc of medical education, in the common cause of improving the UME-GME transition
for all involved, and improving the medical care provided in our society.
Elise Lovell, MD
George Mejicano, MD, MS
UGRC co-chairs
Executive Summary
In the summer of 2020, a Planning Committee of the Coalition for Physician
Accountability (Coalition) selected the members of a new committee – the
Undergraduate Medical Education (UME) to Graduate Medical Education
(GME) Review Committee (UGRC) – and charged them with the task of
recommending solutions to identified challenges in the UME-GME transition.1
The UGRC is pleased to submit this report, which includes the 34 final
recommendations for comprehensive improvement of the UME-GME
transition, to the Coalition for their consideration and implementation.
Introduction:
The charge to the UGRC stated that there are identified challenges in the transition between medical school and
residency that are negatively impacting the UME-GME transition.1 These challenges include, but are not limited to,
the following:
• Disproportionate attention towards finding and filling residency positions rather than on assuring learner competence and readiness for residency training;
• Unacceptable levels of stress on learners and program directors throughout the entire process;
• Inattention to optimizing congruence between the goals of the applicants and the mission of the programs to ensure the highest quality health care for patients and communities;
• Mistrust between medical school officials and residency program personnel;
• Overreliance on licensure examination scores in the absence of valid, trustworthy measures of students’ competence and clinical abilities;
• Lack of transparency to students on how residency selection actually occurs;
• Increasing financial costs to students as well as opportunity costs to programs associated with skyrocketing application numbers;
• The presence of individual and systemic bias throughout the transition; and
• Inequities related to specific types of applicants such as international medical graduates.
In recent years, these and related challenges have expanded to the point that they are causing severe strain on
the entire system. Simply put, there is an emerging consensus and urgency to bring forth solutions; as stated
by the Planning Committee,1 the “UME-GME community is energized at this moment to solve these problems,
and should therefore act boldly and fairly with transparency, while thoughtfully considering stakeholder input, and
utilizing data when available.”
In addition to understanding the challenges noted above, the UGRC had to develop a shared concept of what
comprises the “UME-GME transition.” Through its deliberations, the Committee came to a collective understanding
that the transition encompasses a complex ecosystem involving many individuals and organizations. The transition
begins during the preclinical phase of medical school as students consider specialty options, are counseled by
mentors and faculty advisors, and embark on the long journey of professional identity formation. During their
clinical years, students participate in patient care in numerous settings and on different rotations, choose a
variety of electives, decide on a specialty, prepare application materials, research residency programs, apply to
many programs, are offered and partake in interviews, interact with program personnel, are selected through a
matching process, undergo hiring and credentialing, complete advanced skills training courses, experience major
life transitions, initiate new support structures, begin employment, participate in orientation, assume significantly
more patient care responsibilities, and embed themselves within a learning and work environment that they will
call home for the next three to seven years. In other words, the UME-GME transition is not simply the application,
interview, and match process. Moreover, the transition does not end at the start of orientation to their first year of
training. For unmatched students and international medical graduates, the process may take even longer.
As learners navigate through the UME-GME transition, they interact with numerous organizations with jurisdiction
over specic components of the process. Each organization plays a role and impacts the success of the
transition. However, the ecosystem is not governed by a single entity. In essence, it is a decentralized collection of
interdependent parts, each with their own interests, which currently do not communicate effectively or function
cohesively. Solutions that bring the components of the transition into better alignment could have many positive
outcomes and will likely decrease student costs, reduce work, enhance wellness, address inequities, better
prepare new physicians, and enhance patient care.
Background:
In 2018, a national conversation culminated regarding the use of numeric scores associated with medical licensing
examinations in residency applicant screening and selection. In response, the chief executive officers of five
national organizations (AMA, AAMC, ECFMG, FSMB, and NBME) agreed to co-sponsor an Invitational Conference on
USMLE Scoring (InCUS).2 InCUS took place in March 2019 with a primary goal of reviewing the practice of numeric
score reporting. Three of the recommendations that emerged focused on the USMLE:
(a) Reduce the adverse impact of the overemphasis on USMLE performance in residency screening and
selection through consideration of changes such as pass/fail scoring;
(b) Accelerate research on the correlation of USMLE performance to measures of residency performance
and clinical practice; and
(c) Minimize racial demographic differences in USMLE performance.
In contrast, the fourth InCUS recommendation focused on the UME-GME transition: Convene a cross-organizational
panel to create solutions for the assessment and transition challenges from UME to GME. The final report from
InCUS noted that there was general agreement that changes in scoring of licensure examinations would not
address important aspects of the UME-GME transition system that needed attention. “It was acknowledged
that many organizations and stakeholder groups have responsibility for improving this transition. Yet if many are
responsible, a concern exists that no one group will take ownership or feel empowered to carry on the broader
conversation necessary to bring about appropriate change.”2
In September 2019, a proposal was made to the Coalition to convene a UME-GME Review Committee in line
with the fourth recommendation from InCUS.3 As a result, a Planning Committee was created by the Coalition
to develop the construct, membership, and charge of the Review Committee, which would be responsible for
recommending solutions to identified challenges in the UME-GME transition.1
The UGRC’s Guiding Principles
As stated above, the UGRC was charged with the task of recommending solutions to identified challenges in the
UME-GME transition. Although the Committee was encouraged to act boldly, thoughtfully consider stakeholder
input, and utilize data whenever possible, the UGRC’s primary goals were to ensure learner competence and
readiness for residency and to foster wellness in learners, staff, faculty members, and program directors.1 In
addition, the UGRC was tasked to devote attention to the following items:
Optimizing t between applicants and programs to ensure the highest quality health care for patients
and communities;
Increasing trust between medical schools and residency programs;
Mitigating current reliance on licensure examinations in the absence of valid, standardized, trustworthy
measures of students’ competence and clinical care;
• Increasing transparency for applicants to understand how residency selection operates;
• Considering the needs of all types of applicants in making its recommendations;
• Considering nancial cost to applicants throughout the application process; and
• Minimizing individual and systemic bias throughout the UME-GME transition process.
The UGRC melded these principles into a single tenet that was kept front of mind during its deliberations and
related work: above all else, the UME-GME transition must optimally serve the public good. Inherent to that tenet,
the Committee consistently focused on the importance of increasing diversity, enhancing equity, and championing
inclusion.
The Work of the UGRC
Seven consensus steps prepared the UGRC to successfully accomplish the task of generating recommendations:
• Elaborate the charge to include optimal preparation for residency by leveraging learners’ time and experiences between the Match and the initial months of training.
• Require level setting to ensure that all UGRC members had a common understanding of the UME-GME transition.
• Use the concept of backward design to envision a future ideal state that helps create a system that produces it.
• Produce Ishikawa diagrams (i.e., fishbones) to determine the root causes that underlie the many challenges currently associated with the transition.
• Articulate the desired outcome and understand the root problems before generating solutions.
• Identify potential solutions and innovations described in the literature or implemented by institutions across the country.
• Embrace a consensus approach to endorsing recommendations, informed by available evidence.
Generation and Adoption of Preliminary Recommendations
The UGRC did not begin the process of generating potential solutions to the identified problems of the transition
until the work described above was complete. Even then, the generation of the preliminary recommendations
was focused and deliberate to ensure that background material could be assembled, that each potential solution
was thoughtfully considered, and that there was ample time and space to discuss contentious ideas. Each
recommendation was linked to the future ideal state as well as to root causes of problems with the transition.
In addition, the co-chairs decided to frame each recommendation in broad terms, to include specific
examples of how a recommendation might be implemented, and to list both pros and cons for each potential
recommendation. Successful implementation of the UGRC’s recommendations relies on the cooperation of
multiple entities since the challenges within the transition are interdependent and not under the control of any
one organization or stakeholder group. Recommendations that are based on principles and that describe characteristics
of what can be achieved are more likely to garner support than granular recommendations that might
be readily dismissed as unrealistic or politically difficult. In addition, recommendations with a high degree of
consensus will be harder to ignore than those adopted by the UGRC by a simple majority.
In April 2021, the UGRC adopted 42 preliminary recommendations organized around 12 themes. The preliminary
recommendations and pertinent background material were presented to the Coalition, followed one week later
by their widespread release and a call for public comment.
Response to Feedback and Next Steps
Feedback about the preliminary recommendations was obtained from the organizational members of the
Coalition as well as from stakeholders through a public, month-long call for comments. This feedback was shared
with each member of the UGRC so that input from the Coalition and external stakeholders could inform the
Committees nal recommendations. Feedback obtained by the UGRC co-chairs through dialogue with students,
program directors, DIOs, medical educators, medical school deans, and international medical graduates –
obtained through purposeful outreach to those groups – was considered before nalizing the recommendations.
In response to all feedback, the UGRC made important changes to its preliminary recommendations. The changes
included signicant editing, clarication, and renement of language; complete reworking of a recommendation
addressing application inflation; and judiciously combining similar ideas to reduce the overall number of
recommendations. Of note, 32 of the preliminary recommendations were impacted by the feedback obtained
through public commentary.
Further, the co-chairs created a “bundling workgroup” tasked to consolidate similar recommendations, to
sequence those that were interdependent with one another, and to re-organize them into more cogent themes.
As a result of these efforts, the UGRC has adopted 34 final recommendations, organized around nine themes,
to comprehensively improve the UME-GME transition. Moreover, the recommendations within each theme are
sequenced in chronologic order to guide their implementation. A full and comprehensive narrative on
each recommendation can be found in Appendix C.
The Committee believes that each proposed change will produce positive results and that implementation of
the complete set of recommendations will improve the entire transition. The UGRC also recognizes that each
recommendation may be categorized as transactional, investigational, or transformational in nature. Though
certain recommendations are designed to garner “early wins” by reducing the significant stress felt by students
and program directors, the UGRC believes that the transformational recommendations are of greatest
importance because they align the medical community with a shared interest to promote the public good.
With the delivery of the 34 final recommendations and this accompanying report, the work of the UGRC is now
complete. The Coalition will meet in late summer 2021 to discuss the final recommendations and consider next
steps towards implementation.
The UGRC recommends the following, organized around nine themes:

Collaboration and Continuous Quality Improvement

RECOMMENDATIONS
1
Convene a national ongoing committee to manage continuous quality improvement
of the entire process of the UME-GME transition, including an evaluation of the
intended and unintended impact of implemented recommendations.
NARRATIVE DESCRIPTION OF RECOMMENDATION:
One of the challenges in creating alignment and making improvements is the lack of a single body with
broad perspective over the entire continuum. This creates a situation where organizations and institutions
are unnecessarily and counterproductively isolated, without a shared mental model or mission. A convened
committee that includes learner and public representatives should champion continuous improvement to the
UME-GME transition, with a focus on the public good.
2
In addition to supporting collaboration around the UME-GME transition, this national
committee should: develop and articulate consensus around the components of
a successful residency selection cycle; explore the growing number of unmatched
physicians in the context of a national physician shortage; and foster future research
to understand which factors are most likely to translate into physicians who fulfill the
physician workforce needs of the public.
NARRATIVE DESCRIPTION OF RECOMMENDATION:
Currently, the medical education community lacks a shared mental model of what constitutes a successful
transition from UME to GME, and also what factors predict that success. The lack of agreement leads to conflict
over the content of applications as well as the resources required for a residency selection cycle. Success could
include simple educational outcomes such as completing training, board certification, or lack of remediation.
Alternatively, applicant-specic factors may be more important, such as likelihood of choosing the same
program again. Success may be dened solely on the public good, based on the ll rate of programs and
the number of physicians practicing in underserved areas. Or, it may be that successful residency selection is
institutionally specic based on its mission and community served, with some institutions focused on research
and others on rural communities. The committee should articulate the factors associated with a successful
residency selection cycle so they can be appropriately emphasized in the UME-GME transition, especially as
changes are made to the process.
The committee should report on data trends, implications, and recommended interventions to address the
growing number of unmatched physicians. This analysis should include demographic data to examine diversity,
specialty disparities in unmatched students, number of applications, grading systems, participation in SOAP,
post-SOAP unmatched candidates, match rate in subsequent years of re-entering the match pool, and attrition
rates of learners during residency. This recommendation is intended to urge UME programs and institutions
to utilize a continuous quality improvement approach and review unmatched graduates by specialties,
demographics, number of programs applied to, and clinical grading; to offer alternative pathways; and to add
faculty development for clinical advising. Both UME and GME data would identify patterns within the continuum
of medical education that negatively impact unmatched physicians and attrition rates of GME programs. Ideally,
shared resources and innovation across the continuum would be identified and disseminated.
Graduates of U.S. medical schools fill many residency positions, which means GME is constrained by the
decisions made by U.S. medical school admissions committees. However, international medical graduates are
also considered at many programs and provide an opportunity to serve the public good. The committee should
foster research to help program directors understand which applicant characteristics are useful indicators to
address ongoing medical workforce issues. Further changes to the transition should be informed by evidence
whenever possible.
3
The U.S. Centers for Medicare and Medicaid Services (CMS) should change the
current GME funding structure so that the Initial Residency Period (IRP) is calculated
starting with the second year of postgraduate training. This will allow career choice
reconsideration, leading to improved resident wellbeing and positive effects on the
physician workforce.
NARRATIVE DESCRIPTION OF RECOMMENDATION:
Given the timing of the residency recruiting season and the Match, students have limited time to definitively
establish their specialty choice. If a resident decides to switch to another program or specialty after beginning
training, the hospital may not receive full funding due to the IRP and thus be far less likely to approve such a
change. The knowledge that residents usually only have one chance to choose a specialty path increases the
pressure on the entire UME-GME transition. Furthermore, educational innovation is limited without flexibility for
time-variable training.
Diversity, Equity, and Inclusion
RECOMMENDATIONS
4
Specialty-specic salutary practices for recruitment to increase diversity across
the educational continuum should be developed and disseminated to program
directors, residency programs, and institutions.
NARRATIVE DESCRIPTION OF RECOMMENDATION:
Recognizing that program directors, residency programs, and institutions have wide variability in goals, definitions,
and community needs for increasing diversity, shared resources should be made available for mission-aligned
entities, with specialty-specific contributions including successful strategies and ongoing challenges. This
recommendation is intended for specialty organizations to perform workforce evaluations and specifically
address diversity, equity, and inclusion (DEI) associated with specialty-specific disparities in recruitment.
5
Members of the medical educational continuum must receive continuing
professional development regarding anti-racism, avoiding bias, and ensuring
equity. Principles of equitable recruitment, mentorship and advising, teaching, and
assessment should be included.
NARRATIVE DESCRIPTION OF RECOMMENDATION:
Inclusive excellence requires avoiding bias and improving racial equity; these are essential skills for faculty in
today’s teaching. Many physicians lack these skills, perpetuating health disparities, lack of diversity, and learner
mistreatment. ACGME Common Program Requirements already include specific applicable requirements.
This recommendation reinforces the importance of addressing issues related to DEI for all members of the
educational community, including residents starting from orientation. This will ultimately promote belonging,
eliminate bias, and provide social support.
Trustworthy Advising and Definitive Resources
RECOMMENDATIONS
6
Create an interactive database with verifiable GME program/track information and
make it available to all applicants, medical schools, and residency programs and at
no cost to the applicants. This will include aggregate characteristics of individuals
who previously applied to, interviewed at, were ranked by, and matched for each
GME program/track.
NARRATIVE DESCRIPTION OF RECOMMENDATION:
Veriable and trustworthy GME program/track information should be developed and made available in an easily
accessible database to all applicants. Information for the database should be directly collected and sources should
be transparent. Each programs interviewed or ranked applicants reflect the programs desired characteristics
more accurately than the small proportion of applicants the program matches. Data must be searchable and
allow for data analytics to assist with program decision making (e.g., allowing applicants and their advisors to input
components of their individual application to identify programs/tracks with similar current residents). Applicants
and advisors should be able to sort the information according to demographic and educational features that
may signicantly impact the likelihood of matching at a program (e.g., geography, scores, degree, visa status, etc.).
This database would also provide information on the characteristics of individuals who previously applied to and
matched into various specialties.
7
Evidence-informed, general career advising resources should be available for all
medical school faculty and staff career advisors, both domestic and international.
All students should have free access to a single, comprehensive electronic
professional development career planning resource, which provides universally
accessible, reliable, up-to-date, and trustworthy information and guidance. General
career advising should focus on students’ professional development; inclusive
practices such as valuing diversity, equity, and belonging; clinical and alternate
career pathways; and meeting the needs of the public. Specialty-specific match
advising should focus on the individual student obtaining an optimal match.
NARRATIVE DESCRIPTION OF RECOMMENDATION:
Centralized advising resources, developed in collaboration with specialty societies, should reflect a common core,
with supplemental information as needed, and be evidence-informed and data-driven. This will fill an information
gap and increase the transparency and reliability of information shared with students. Resources should support
the unique needs of traditionally underrepresented, disadvantaged, and marginalized student groups. Guidance
contained in the resources can support faculty in managing or eliminating conflicts of interest related to recruiting
students to the specialty, advising for the Match, and advocating for students in the Match. Advising tools
should incorporate strengths-based approaches to career selection. The resources should include the option of
nonclinical careers without stigma. Three areas of focus are envisioned: basic advising information, general career
advising, and specialty-specic match advising.
Clear and accurate information regarding clinical and nonclinical career choices should be available for all
students. The AAMC’s Careers in Medicine (CiM) platform achieves some of the aims of this recommendation.
The strengths and limitations of CiM should be examined, expanding the content and broadening access to this
resource, including to all students (U.S. MD, U.S. DO, IMG) at no cost throughout their medical school training, or at
a minimum, at key career decision-making points, in order to support students’ professional development. The
public good can be prioritized within this resource with content emphasis on workforce strategies to address
the needs of the public, including specialty selection and practice location as well as alternative nonclinical career
choices. Links to specialty-specic medical student advising resources should also be incorporated.
Basic advising information should be created for all faculty and sta who interact with students to promote
common understanding of career advising, professional development, specialty selection, and application
procedures; introduce the role of specialty-specific advisors as distinct from other faculty teachers; and minimize
sharing outdated or incorrect information with students. General career advising should be differentiated from
specialty-specific match advising or specialty recruiting. General career advisors require expertise in career
advising; they should incorporate strengths-based approaches to career selection, including the option of nonclinical careers
without stigma; focus on professional development; value diversity, equity, and belonging; incorporate the needs
of the public; and introduce the role of specialty-specific match advisors. Specialty-specific match advisors
should undergo a training process, created as part of this resource development, that includes equity in advising
and mitigation of bias.
8
Educators should develop a salutary practice curriculum for UME career advising.
NARRATIVE DESCRIPTION OF RECOMMENDATION:
Guidelines are needed to inform U.S. MD, U.S. DO, and international medical schools in developing their career
advising programs. Standardized approaches to advising along with career advisor preparation (both general
and specialty-specic) can enhance the quality, equity, and quantity of advising and improve student trust in
the advice. Educators can improve medical student career advising by developing formal guidelines with key
recommendations based upon professional development frameworks and competencies. Implementation
of such guidelines will result in greater consistency, thoroughness, effectiveness, standardization, and equity of
medical school career advising programs to better support students in making career decisions and will lay
the foundation for career planning across the continuum.
Outcome Framework and
Assessment Processes
RECOMMENDATIONS
9
UME and GME educators, along with representatives of the full educational
continuum, should jointly define and implement a common framework and set of
outcomes (competencies) to apply to learners across the UME-GME transition.
NARRATIVE DESCRIPTION OF RECOMMENDATION:
A shared mental model of competence facilitates agreement on assessment strategies used to evaluate a
learner’s progress, and the inferences that can be drawn from assessments. Shared outcomes language can
convey information on learner competence with the patient/public trust in mind. For individual learners, defining
these outcomes will facilitate learning and may promote a growth mindset. For faculty, defining outcomes will
allow for the use of assessment tools aligned with performance expectations and faculty development. For
residency programs, defining outcomes will be useful for resident selection and learner handovers from UME,
resident training, and resident preparation for practice.
10
To eliminate systemic biases in grading, medical schools must perform initial and
annual exploratory reviews of clinical clerkship grading, including patterns of grade
distribution based on race, ethnicity, gender identity/expression, sexual identity/
orientation, religion, visa status, ability, and location (e.g., satellite or clinical site
location), and perform regular faculty development to mitigate bias. Programs
across the UME-GME continuum should explore the impact of bias on student and
resident evaluations, match results, attrition, and selection to honor societies.
NARRATIVE DESCRIPTION OF RECOMMENDATION:
Recognizing that inherent biases exist in clinical grading and assessment in the clinical learning environment, each
UME and GME program must have a continuous quality improvement process for evaluating bias in clinical grading
and assessment and the implications of these biases, including honor society selection. This recommendation is
intended to mitigate bias in clinical grading, transcript notations, MSPE reflections of remediation, and residency
evaluations. This recommendation is not intended to create requirements for reporting race, ethnicity, gender
identity, sexual identity, religion, or ability of learners as data analysis must be limited to data readily available to
each school.
11
The UME community, working in conjunction with partners across the continuum,
must commit to using robust assessment tools and strategies, improving upon
existing tools, developing new tools where needed, and gathering and reviewing
additional evidence of validity.
NARRATIVE DESCRIPTION OF RECOMMENDATION:
Educators from across the education continuum should use shared competency outcomes language to guide
development or use of assessment tools and strategies that can be used across schools to generate credible,
equitable, value-added competency-based information. Assessment information should be shared in residency
applications and a post-match learner handover. Licensing examinations should be used for their intended
purpose to ensure requisite competence.
12
Using the shared mental model of competency and assessment tools and
strategies, create and implement faculty development materials for incorporating
competency-based expectations into teaching and assessment.
NARRATIVE DESCRIPTION OF RECOMMENDATION:
Faculty must understand the purpose of outcomes-focused education, specific language used to define
competence, and how to mitigate biases when assessing learners. They must understand the purpose and
use of each assessment tool. The intensity and depth of faculty development can be tailored to the amount
and type of contact that individual faculty have with students. Clerkship directors, academic progress
committees, student competency committee members, and other educational leaders require a more in-depth
understanding of the assessment system and how determinations of readiness for advancement
are made. This faculty development requires centralized electronic resources and training for trainers within
institutions. Review of training materials, and completion of any required activities to document review and/or
understanding, should be required on a regular basis.
Away Rotations
RECOMMENDATIONS
13
Convene a workgroup to explore the multiple functions and value of away rotations
for applicants, medical schools, and residency programs. Specifically, consider
the goals and utility of the experience, the impact of these rotations, and issues of
equity including accessibility, assessment, and opportunity for students from groups
underrepresented in medicine and financially disadvantaged students.
NARRATIVE DESCRIPTION OF RECOMMENDATION:
Away rotations can be cost prohibitive yet may allow a student to get to know a program, its health system, and
surrounding community. Some programs are reliant on away rotations to showcase their unique strengths to
attract candidates. Given the multifactorial and complex role that away rotations fulfill, a committee should be
convened to conduct a thorough and comprehensive review of the cost versus benefit of away rotations, followed
by recommendations from that review. Non-traditional methods of conducting and administering away
rotations should be explored (e.g., offering virtual away rotations, waiving application fees, or offering away
stipends, particularly for financially disadvantaged students).
Equitable, Mission-Driven
Application Review
RECOMMENDATIONS
14
A convened group including UME and GME educators should reconsider the
content and structure of the MSPE as new information becomes available to
improve access to longitudinal assessment data about applicants. Short-term
improvements should include structured data entry fields with functionality to
enable searching.
NARRATIVE DESCRIPTION OF RECOMMENDATION:
The development of UME competency outcomes to apply across learners and the continuum is essential in
decreasing the reliance on board scores in the evaluation of the residency applicant. These will take time to
develop and implement and may be developed at different intervals. As new information becomes available
to improve applicant data, the MSPE should be utilized to improve longitudinal applicant information. In addition,
improvements in the MSPE, such as structured data entry fields with functionality to enable searching, should be
explored.
15
Structured Evaluative Letters (SELs) should replace all Letters of Recommendation
(LORs) as a universal tool in the residency program application process.
NARRATIVE DESCRIPTION OF RECOMMENDATION:
A Structured Evaluative Letter (SEL), which would include specialty-specific questions, would provide knowledge
from the evaluator on student performance that was directly observed versus a narrative recommendation.
The template should be based on an agreed upon set of core competencies and allow equitable access to
completion for all candidates. The SEL should be based on direct observation and must focus on content that
the evaluator can complete. Faculty resources should be developed to improve the quality of the standardized
evaluation template and decrease bias.
16
To raise awareness and facilitate adjustments that will promote equity and
accountability, self-reported demographic information of applicants (e.g., race,
ethnicity, gender identity/expression, sexual identity/orientation, religion, visa
status, or ability) should be measured and shared with key stakeholders, including
programs and medical schools, in real time throughout the UME-GME transition.
NARRATIVE DESCRIPTION OF RECOMMENDATION:
Inequitable distribution of applicants among specialties is not in the best interest of programs, applicants, or the
public good. Bias can be present at any level of the UME-GME transition. A decrease in diversity at any point along
the continuum provides an important opportunity to intervene and potentially serve the community in ways
that are more productive. An example of accountability and transparency in an inclusive environment across the
continuum is a diversity dashboard for residency applicants. A residency program that finds bias in its selection
process could go back in real time to find qualified applicants who may have been missed, potentially improving
outcomes.
17
To optimize utility, discrete fields should be available in the existing electronic
application system for both narrative and ordinal information currently presented
in the MSPE, personal statement, transcript, and letters. Fully using technology
will reduce redundancy, improve comprehensibility, and highlight the unique
characteristics of each applicant.
NARRATIVE DESCRIPTION OF RECOMMENDATION:
Optimally, each applicant will be reviewed individually and holistically to evaluate merit. However, some
circumstances may require rapid review. The 2020 NRMP program directors’ survey found that only 49% of
applications received an in-depth review. The application system should utilize modern technology to maximize
the likelihood that applications are evaluated in a way that is holistic, mission-based, and equitable.
Currently, applications are assessed based on the information that is readily available, which may place undue
emphasis on scores, geography, medical school, or other factors that perpetuate bias. Adding specific data
gives an opportunity for applicants to demonstrate their strengths in a way that is user-friendly for program
directors. Maximizing the amount of accurate information readily available in the application will increase
capacity for holistic review of more applicants and improve trust during the UME to GME transition. Although
not all schools and programs will align on which information should be included, areas of agreement should be
identied and emphasized.
18
To promote equitable treatment of applicants regardless of licensure examination
requirements, comparable exams with different scales (COMLEX-USA and USMLE)
should be reported within the electronic application system in a single field.
NARRATIVE DESCRIPTION OF RECOMMENDATION:
Osteopathic medical students make up 25% of medical students in U.S. schools and these students are required
to complete the COMLEX-USA examination series for licensure. Residency programs may filter out applicants
based on their USMLE score, leading many osteopathic medical students to sit for the USMLE series. This creates
a substantial increase in cost, time, and stress for osteopathic students who believe duplicate testing is necessary
to be competitive in the Match. A combined field should be created in the Electronic Residency Application
Service (ERAS) that normalizes the scores between the two exams and allows programs to filter based only on
the single normalized score. This will mitigate structural bias and reduce financial and other stress for applicants.
19
Filter options available to programs for sorting applicants within the electronic
application system should be carefully created and thoughtfully reviewed to ensure
each one detects meaningful differences among applicants and promotes review
based on mission alignment and likelihood of success at a program.
NARRATIVE DESCRIPTION OF RECOMMENDATION:
Currently, residency programs receive more applications than they can meaningfully review. For this reason,
filters are sometimes used to identify candidates that meet selection criteria. However, some commonly used
filters may exclude applicants who are not meaningfully different from ones who are included (e.g., students
who took a different licensure examination, students with statistically insignificant differences in scores, students
from different campuses of the same institution, etc.). The use of free-text filters increases the risk of not
identifying, or of mischaracterizing, applicant characteristics. All applications should be evaluated fairly, independent
of software idiosyncrasies. Filters should be developed in conjunction with all stakeholders. Each filter that is
offered should align with the missions and requirements of residency programs.
20
Convene a workgroup of educators across the continuum to begin planning for a
dashboard/portfolio to collect assessment data in a standard format for use during
medical school and in the residency application process. This will enable consistent
and equitable information presentation during the residency application process
and in a learner handover.
NARRATIVE DESCRIPTION OF RECOMMENDATION:
Key features of a dashboard/portfolio in the UME-GME transition, and across the continuum, should include
competency-based information that aligns with a shared mental model of outcomes, clarity about how
and when assessment data were collected, and narrative data that uses behavior-based and competency-
focused language. Learner reflections and learning goals should be included. Dashboard development will
require careful attention to equity and minimizing harmful bias, as well as a focus on the competencies
and measurements that predict future performance with patients. Transparency with students about the
purpose, use, and reporting of assessments, as well as attention to data access and security, will be essential.
Optimization of Application,
Interview, and Selection Processes
RECOMMENDATIONS
21
All interviewing should be virtual for the 2021-2022 residency selection season. To
ensure equity and fairness, there should be ongoing study of the impact of virtual
interviewing as a permanent means of interviewing for residency.
NARRATIVE DESCRIPTION OF RECOMMENDATION:
Virtual interviewing has had a significant positive impact on applicant expenses. With elimination of travel, students
have been able to dedicate more time to their clinical education. Due to the risk of inequity with hybrid interviewing
(virtual and in person interviews occurring in the same year or same program), all interviews should be conducted
virtually for the 2021-2022 season. Hybrid interviewing (virtual combined with onsite interviewing) should be
prohibited.
A thorough review of the data around virtual interviewing is also recommended. Candidate accessibility, equity,
match rates, and attrition rates should be evaluated. Residency program feedback from multiple types of
residencies should be solicited. In addition, separating applicant and program rank order list deadlines in time
should be explored, as this would allow students to visit programs without pressure and minimize influence on a
program’s rank list.
22
Develop and implement standards for the interview offer and acceptance
process, including timing and methods of communication, for both learners and
programs, to improve equity and fairness, to minimize educational disruption, and
to improve wellbeing.
NARRATIVE DESCRIPTION OF RECOMMENDATION:
The current process of extending interview offers and scheduling interviews is unnecessarily complex and onerous,
with little to no regulation. Applicant stress and loss of rotation education while attempting to conform to some
elements (e.g., obsessively checking emails to accept short-timed interview offers) can be improved with changes
to the application platform, policies, and procedures. Development of a common interview offering/scheduling
platform and creation of policies (e.g., forbidding residency programs from over-offering or over-scheduling interviews and
from setting inappropriately short time limits for applicant replies) would result in important improvements. While these
processes are being developed, residency programs involved in the 2021-2022 residency selection season should
allow applicants 24 to 48 hours to accept or decline an interview offer. In addition, for the 2021-2022 residency
selection season, programs should not offer more interviews to applicants than available interview positions.
Likewise, applicants should not accept multiple interviews that are scheduled at the same time.
23
Innovations to the residency application process should be piloted to reduce
application numbers and concentrate applicants at programs where mutual
interest is high, while maximizing applicant placement into residency positions. Well-designed
pilots should receive all available support from the medical community
and be implemented as soon as the 2022-2023 application cycle; successful pilots
should be expanded expeditiously toward a unified process.
NARRATIVE DESCRIPTION OF RECOMMENDATION:
Application inflation is a major contributor to the current dysfunction in the UME-GME transition. The 2020 NRMP
program directors’ survey found that only 49% of applications received an in-depth review; an unread
application represents wasted time and expense for applicants. Yet doubling the program resources available
for review is not practical. Informational interventions – like improved career advising and transparency – are
unlikely to reduce application numbers significantly in the context of a high-stakes prisoner’s dilemma. In sum,
the current process is costly to applicants and program directors and does not optimally serve the public good.
To address this dysfunction, Coalition organizations and other groups in the medical community should utilize
all available logistic, analytic, and financial resources to lead and support innovative pilots to reduce application
numbers and concentrate applicants at programs where mutual interest is high, while maximizing applicant
placement into residency positions. Pilots should be based on best available evidence, specialty-specific needs,
potential impact (both positive and negative), and collaboration among stakeholders. Pilot innovations, some
of which are ongoing, could include, but are not limited to, the following: expanding integrated UME-GME
pathways, preference signaling, application caps, and/or additional application or match rounds.
Groups sponsoring pilots should be accountable for using a continuous quality improvement approach to
gather and monitor evidence of effectiveness and equity across applicant groups with historically distinct
application behaviors and outcomes, including United States MD and DO graduates, international medical
graduates, couples applicants, previously unmatched applicants, and individuals belonging to groups that are
underrepresented in medicine.
While pilot studies may vary across specialties, ultimately the redesigned residency application process should
be as consistent as possible across specialties, recognizing that applicants, advisors, and program directors
may be subject to the rules of multiple specialties in the context of combined tracks, couples, and dual
applicants.
24
Implement a centralized process to facilitate evidence-based, specialty-specific
limits on the number of interviews each applicant may attend.
NARRATIVE DESCRIPTION OF RECOMMENDATION:
Identify evidence-based, specialty-specific interview caps, envisioned as the number of interviews an applicant
attends within a specialty above which further interviews are not associated with significantly increased match
rates, across all core applicant types. Create a centralized process to operationalize interview caps, which could
include an interview ticket system or a single scheduling platform.
Educational Continuity and
Resident Readiness
RECOMMENDATIONS
25
Early and ongoing specialty-specific resident assessment data should be
automatically fed back to medical schools through a standardized process to
enhance accountability and to inform continuous improvement of UME programs
and learner handovers.
NARRATIVE DESCRIPTION OF RECOMMENDATION:
Instruments for feedback from GME to UME should be standardized and utilized to identify gaps in curriculum and inform
program improvement. UME institutions should respond to the GME feedback on their graduates’ performance in
a manner that leads to quality improvement of the program.
26
Develop a portfolio of evidence-based resident support resources for program
directors, designated institutional officials (DIOs), and residency programs. These will
be identified as salutary practices, and accessible through a centralized repository.
NARRATIVE DESCRIPTION OF RECOMMENDATION:
A centralized source of resident support resources will assist programs with effective approaches to address
resident concerns. This will be especially relevant for competency-based remediation and resident wellbeing
resources in the context of increased demand for support around the UME-GME transition. Access for programs
and program directors will be low/no cost, confidential, and straightforward.
27
Targeted coaching by qualified educators should begin in UME and continue during
GME, focused on professional identity formation and moving from a performance
to a growth mindset for effective lifelong learning as a physician. Educators should
be attuned to the needs of the learner and equipped to provide assistance to learners of all
backgrounds.
NARRATIVE DESCRIPTION OF RECOMMENDATION:
Coaching can benet a student’s transition to become a master adaptive learner with a growth mindset. While
this transition should begin early in medical school, it should be complete by the time that the student moves from
UME to GME. If a learner does not transition to a growth mindset, their wellness and success will be compromised.
The addition of specic validated mentoring programs (e.g., Culturally Aware Mentoring) and formation of anity
groups to improve sense of belonging should be considered.
28
Specialty-specic, just-in-time training must be provided to all incoming rst-year
residents, to support the transition from the role of student to a physician ready to
assume increased responsibility for patient care.
NARRATIVE DESCRIPTION OF RECOMMENDATION:
The intent of this recommendation is to level set incoming resident preparation regardless of medical school
experience. Recent research has shown that residents reported greater preparedness for residency if they
participated in a medical school “boot camp,” and participation in longer residency preparedness courses
was associated with high perceived preparedness for residency. This training must incorporate all six specialty
competency domains and be conducive to performing a baseline skills assessment. These curricula might
be developed by specialty boards, specialty societies, or other organized bodies. To minimize costs, specialty
societies could provide centralized recommendations and training could be executed regionally or through
online modules.
29
Residents must be provided with a robust orientation and ramp-up into their
specific program at the start of internship. In addition to clinical skills and system
utilization, content should include introduction to the patient population, known
health disparities, community service and engagement, faculty, peers, and
institutional culture.
NARRATIVE DESCRIPTION OF RECOMMENDATION:
Improved orientation to residency has the potential to enhance trainee wellbeing and improve patient safety.
Residents should have orientation that includes not only employee policies, but also education that optimizes
their success in their specic clinical environment. Residents, like other employees, should be paid for attending
orientation.
30
Meaningful assessment data based on performance after the MSPE must
be collected and collated for each graduate, reflected on by the learner with
an educator or coach, and utilized in the development of a specialty-specific,
individualized learning plan to be presented to the residency program to serve as a
baseline at the start of residency training.
NARRATIVE DESCRIPTION OF RECOMMENDATION:
Guided self-assessment by the learner is an important component in this process and may be all that is
available for some international medical graduates. This recommendation provides meaning and importance
for the assessment of experiences during the final year of medical school (and possibly practice for some international graduates), helps to develop the habits necessary for lifelong learning, and holds students and schools accountable for quality senior experiences. It also uses the resources of UME to prepare an individualized learning plan (ILP) to serve as a baseline at the start of GME. This initial ILP will be refined by additional assessments envisioned as an "In-Training Examination" (ITE) experience early in GME. The time for this experience should be protected in orientation, and the feedback should be formative, similar to how most programs manage the results of ITEs. This assessment might occur in the authentic workplace based on direct observation or might be accomplished as an Objective Structured Clinical Exam using simulation.
This assessment should inform the learner’s ILP and set the stage for the work of the clinical competency
committee of the program.
Health and Wellness
RECOMMENDATIONS
31
Anticipating the challenges of the UME-GME transition, schools and programs
should ensure that time is protected, and systems are in place, to guarantee that
individualized wellness resources – including health care, psychosocial supports, and
communities of belonging – are available for each learner.
NARRATIVE DESCRIPTION OF RECOMMENDATION:
Given that the wellness of each learner significantly impacts learner performance, it is in the program's and the public's best interest to ensure the learner is optimally prepared to perform as a resident. There should be a focus on applying resources that are already available rather than depending on the creation of new resources. Examples of wellness resources include enrollment in health insurance, establishing care with a primary care provider and dentist, securing a therapist if appropriate, identifying local communities of belonging, and other supports that optimize wellbeing. These resources may especially benefit the most vulnerable trainees.
32
Adequate and appropriate time must be assured between graduation and the learner's start of residency to facilitate this major life transition.
NARRATIVE DESCRIPTION OF RECOMMENDATION:
The transition from medical school to residency typically marks a concrete transition from paying for education to becoming a full-time employee focused on the lifelong pursuit of professional improvement. This transition is life-changing for many. It often requires a move from one location to another, sometimes across the world. There must be time for licensing and, in some cases, visa attainment. Often this life transition is accompanied by other major
life events such as partnering or childbearing. Once residency starts, the learner may work many hours each
week and may have little time to establish a home. Thus, it is important for wellness and readiness to practice that
adequate time be provided to accomplish this major life transition.
The predictability of this transition must be recognized by both UME and GME institutions, and cooperation on both
sides is required for this transition to be accomplished smoothly. There is a desire to better prepare learners overall for the start of residency, and an assured transition time would allow related recommendations to be more easily accomplished.
33
All learners need equitable access to adequate funding and resources for the
transition to residency prior to residency launch.
NARRATIVE DESCRIPTION OF RECOMMENDATION:
As almost every learner graduating from medical school transitions to residency, the need to fund a geographic
move and establishment of a new home is predictable. This financial planning should be incorporated into medical school expenses, for example through equitable, low-interest student loans. Options to support the transitional expenses of international medical graduates should also be identified. These costs should not be incurred by GME
programs.
34
There should be a standardized process throughout the United States for initial
licensing at entrance to residency to streamline the process of credentialing for both
residency training and continuing practice.
NARRATIVE DESCRIPTION OF RECOMMENDATION:
To benet the public good, costs to support the U.S. healthcare workforce should be minimized. To this end, all
medical students should be able to begin licensure earlier in their educational continuum to better distribute
the work burden and costs associated with this predictable process. When learners are applying to programs
in many dierent states, the varied requirements are unnecessarily cumbersome. Especially for states where
a training license is required, the time between the Match and the start of the rst year of residency is often
inadequate for this purpose. This is a potential cost saving measure.
The Learner's Journey

[Figure: The Learner's Journey. The diagram maps the 34 final recommendations, grouped by the nine themes, onto the stages a learner moves through during the final year of medical school and the UME-GME transition: researching residency programs, preparing application materials, applying to (often many) programs, being offered and partaking in interviews, the selection process, medical school graduation, hiring and credentialing, residency preparedness, the initiation of support structures, and embedding in the learning and work environment, with professional identity formation continuing throughout. Throughout the transition, learners are being assessed, including through preclinical course examinations, rotation evaluations, and licensure examinations. Medical schools and programs have a responsibility to promote equity and diversity across the continuum. The UME-GME transition encompasses a complex ecosystem: as learners navigate the transition, they interact with numerous organizations with jurisdiction over specific components of the process, yet the ecosystem is not governed by a single entity.]
UGRC Process
In the summer of 2020, a Planning Committee of the Coalition selected the
members of the UGRC and charged them with the task of recommending solutions
to identied challenges in the transition. The following is a description of the work
process of the UGRC.
ORIGIN OF THE UGRC
In 2018, a national conversation culminated regarding the use of numeric scores associated with medical licensing examinations in residency applicant screening and selection. In response, the chief executive officers of five national organizations agreed to co-sponsor an Invitational Conference on USMLE Scoring (InCUS) in March 2019, with the primary goal of reviewing the practice of numeric score reporting. Three of the recommendations that emerged focused on the USMLE; however, the fourth InCUS recommendation focused on the UME-GME transition:
Convene a cross-organizational panel to create solutions for the assessment and transition challenges from UME
to GME.
In September 2019, a proposal was made to the Coalition to convene a UME-GME Review Committee in line with the fourth recommendation from InCUS. The Coalition's members are the national organizations responsible for the oversight, education, and assessment of medical students and physicians throughout their medical careers.4
As a result, a Planning Committee was created by the Coalition to develop the construct, membership, and charge of the Review Committee, which would be responsible for recommending solutions to identified challenges in the UME-GME transition.1
In January 2020, a call for nominations was issued for individual representatives to the
Planning Committee from undergraduate medical educators, residency program directors, learners, and the
public. The Coalition's Management Committee selected the individual members of the Planning Committee from
over 60 responses. In addition, organizational representatives from AACOM, AAMC, AOGME, ECFMG, NBME, NBOME,
and OPDA were appointed to the Planning Committee.
The Planning Committee met in March 2020 and identified the construct and structure of the UGRC, developed
a process for selecting its members, and determined the key questions that the UGRC should consider. The
Planning Committee discussed the scope of the UGRC and organized pertinent issues into three broad themes:
(a) preparation and selection for residency, (b) the application process, and (c) overall considerations such as
diversity and specialty specic competencies. The Planning Committee also spelled out the timeline, deliverables,
expectations, and composition of the UGRC. An open call for nominations took place in May and June of 2020 and
the Planning Committee reviewed 183 applications to populate a balanced UGRC that included undergraduate
and graduate medical educators, organizational members, public members, students, and residents. Care was
taken to ensure that multiple perspectives would be represented on the UGRC, including type of degree (DO
and MD), racial and ethnic diversity, range of specialties, geographic distribution, and persons with a focus on
undergraduate medical education (faculty and deans) and graduate medical education (program directors and
DIOs). All UGRC members were selected in July, the co-chairs were named in August, and the UGRC held its first
meeting in September 2020.
UGRC STRUCTURE
The UGRC was led by an Executive Committee composed of the two co-chairs, the lead Coalition staff member, and the four original workgroup leads. The co-chairs and lead staff member initially created four workgroups to optimize group dynamics and distribute Committee work in an organized fashion. Because the charge from the Planning Committee included an ambitious start-to-finish timeline (September 2020 to June 2021), this structure allowed groups to work in parallel and delve more deeply into assigned tasks. Beyond the individual workgroup areas of focus, all workgroups also incorporated four overall cross-cutting themes throughout their deliberations: diversity, equity, inclusion, and fairness; wellbeing; specialty focus; and the public good. In addition, the four original workgroups were asked to develop research questions and to consider next steps after the UGRC completed its charge, both of which were needed to help implement final recommendations and to inform future discussions out of scope of the UGRC. In February 2021, the co-chairs created a fifth workgroup to ensure that the UGRC appropriately addressed the critical issues of diversity, equity, and inclusion (DEI). Finally, in April 2021, the co-chairs tasked a sixth "bundling workgroup" to consolidate similar recommendations, sequence interdependent recommendations, and reorganize the final recommendations into more cogent themes.
The UGRC was assisted in its work process by generous staff support from Coalition member organizations, including a project manager, communications director, medical writer, survey analysts, and graphic designers.
Medical librarians searched the literature to support an evidence-informed approach.
UGRC Workgroup Focus Areas
Workgroup A: Ensuring Residency Readiness
General competencies
Selection of residency/specialty field
Workgroup B: Mechanics of the Application/Selection Process from the UME Perspective
Information sharing
Application content
Application mechanics
Workgroup C: Mechanics of the Application/Selection Process from the GME Perspective
Information sharing
Application process
Interviewing
The Match
Workgroup D: Post-Match Optimization
Optimizing UME by enhancing residency readiness
Optimizing GME by ensuring patient safety
Information sharing
Feedback to UME
DEI Workgroup: Focus on Diversity, Equity, and Inclusion
Bundling Workgroup: Consolidation, Sequencing, and Reorganization of Themes
FOUNDATIONAL WORK PROCESS OF THE UGRC
Between September 2020 and June 2021, the entire UGRC met virtually on six separate occasions. Each of
these meetings consisted of multiple sessions spread over two or three days. In addition, a special session of the
UGRC occurred on April 5, 2021, to reconsider several initial recommendations. In between the full Committee
meetings, each workgroup met intermittently to fulfill its tasks. A summary was widely distributed to the public after each Committee meeting to update the community on the UGRC's progress to date. Further, the UGRC issued three explicit calls for external stakeholder engagement. The first one occurred in December 2020 and focused on envisioning the ideal state of the UME-GME transition. The second occurred in March 2021 and focused on descriptions of current innovations to improve the UME-GME transition. The third opened in late April 2021 and specifically asked for feedback on the UGRC preliminary recommendations.
The rst virtual meeting of the UGRC occurred in September 2020. Seven consensus ideas quickly emerged
on how to manage the work of the Committee. First, the members agreed that the UME-GME transition
encompassed far more than preparation, application, and selection for residency. This led to an elaboration of
the charge to include both optimal preparation for caring for patients early in residency as well as considerations
on how to leverage learners’ time and experiences between the Match and the initial months of training. In other
words, the successful transition requires adopting and valuing a growth mindset, accompanied by a dramatic
change in focus where the emphasis shifts away from being student-centric and towards being patient-centric.
Second, it was evident that level setting was needed to ensure that all UGRC members had common
understanding of the UME-GME transition because not all UGRC members were knowledgeable about each
aspect and component of the transition ecosystem. To address this problem, the co-chairs called upon members
of the UGRC, or in some cases employees of Coalition organizations, to create a series of video presentations (i.e.,
voice-over PowerPoint presentations) that members could watch asynchronously. The video presentations helped all the UGRC
members reach a baseline level of understanding about the transition.
Third, there was a strong sentiment that the Committee should approach its work using the concept of backward
design (i.e., rst imagine an idealized desired state and then think about how to create a system that produces
it). Each UGRC workgroup spent two months envisioning an idealized state for their area of focus, and then the
workgroup leaders harmonized them into a single ideal state for the UME-GME transition. As described earlier in
this report, the nished product included elements of the overall ecosystem and addressed wellness, specialty
selection, learner selection, competence, continuum and hando, technology, licensing and credentialing, life
transition, residency launch, and residency environment. This exercise allowed the UGRC to articulate a blue-sky
denition of success for an equitable, ecient, and transparent system across the UME-GME transition.
In December 2020, the UGRC released a survey designed to engage external stakeholder organizations about
what should be included in the ideal state. Thirty-two organizations responded to the survey and the ideas they
shared were organized into eight themes, each of which had been identified by the UGRC workgroup leads when creating the Committee's harmonized ideal state. Thus, this first call for stakeholder engagement did not result in
any substantive changes to what the UGRC had created. The UGRC’s shared vision of the ideal state has guided its
ongoing work.
The fourth consensus idea was that the UGRC should approach the identified challenges in a systematic manner to unearth the root causes of problems with the current UME-GME transition. Thus, four workgroups spent many weeks discussing why the identified challenges existed and why they persisted. This series of exercises produced workgroup-specific Ishikawa diagrams (i.e., fishbones) that identified the myriad problems underlying the challenges associated with the transition. Each fishbone was presented to the entire Committee so that UGRC
members could reflect on the problems found by each of the four workgroups. To ensure that the root cause
analyses were sound, UGRC members responded to a series of provocative questions designed to challenge
common assumptions about the transition, and they were then asked to rate which problems were most
important to address. The Ishikawa diagrams are included in Appendix B of this report.
Importantly, the fth consensus idea that the UGRC agreed upon was to avoid premature discussion or advocacy
for any specic solution to the identied challenges of the UME-GME transition. The idea was simple: articulate
the desired outcome and understand the root problems before generating solutions. Discussion about possible
remedies was not permitted until the UGRC had created a shared ideal state for the UME-GME transition and each
workgroup had completed its root cause analysis (i.e., Ishikawa diagram). Indeed, even after both exercises were
nished, the UGRC took the time to examine the ecosystem for components of the current UME-GME transition
that worked well. This exercise helped identify current aspects and processes that should be preserved.
In January 2021, the UGRC began to brainstorm solutions to the root causes identified by the workgroups. These brainstorming sessions occurred in both the workgroups and meetings of the entire Committee. The UGRC used a virtual whiteboard to help with discussion, dissection, debate, and refinement of ideas before they could be incorporated into recommendations. At this stage, the UGRC's sixth consensus idea was set into motion, which was simply "to not reinvent the wheel." Thus, a concerted effort was made to identify potential solutions and
innovations described in the literature or implemented by institutions across the country.
In February 2021, the UGRC released a second call for external stakeholder input. This effort to engage
stakeholders asked individuals and organizations to share innovations that had been implemented to address
concerns about the UME-GME transition. In total, 35 responses containing 39 self-described innovations were
submitted for review to the UGRC. Of note, a majority of the innovations submitted through this process had
previously been identied by the Committee.
Lastly, the seventh consensus idea was to strive to be evidence-based whenever possible. To that end, the
UGRC secured the services of three research librarians who could search the literature and public databases
when a member or a workgroup had a question about an issue. UGRC members had hopes of generating
recommendations that were data-driven and evidence-based. However, relatively few aspects of the UME-GME
transition have undergone systematic review. Similarly, many innovations reported in the literature are descriptive
in nature without generalizable outcomes. This led the co-chairs to embrace a consensus approach to endorsing
recommendations, informed by available evidence, as opposed to identifying evidence-based recommendations.
GENERATION AND ADOPTION OF RECOMMENDATIONS
By February 2021, the workgroups had begun the process of forming preliminary recommendations for the entire
UGRC to consider. As those efforts progressed, the workgroup leaders identified two issues that required attention by the co-chairs. The first was to provide a forum for contentious issues to be discussed by the full UGRC, and the second was to provide guidance regarding the level of granularity for the recommendations. To address the first concern, the co-chairs asked each workgroup leader to select a few recommendations that might generate disagreement, and the majority of the February UGRC meeting was devoted to discussion and debate about these topics. To address the second issue, a template was created that included instructions on how to frame each recommendation in broad terms, and to include specific examples of how a recommendation might be
implemented.
The initial recommendation template was designed to be comprehensive and included the following ten fields: recommendation; narrative description; specific examples of how the recommendation might be implemented; questions for librarians; known citations or references; organizations or stakeholders that could help implement the recommendation; links to the ideal state and Ishikawa diagrams; cross-cutting themes that are impacted; potential desired outcomes and consequences; potential barriers to implementation; and future research questions. The co-chairs later created a streamlined version of the templates that accompany each of the UGRC's final recommendations. All 34 templates are included in Appendix C. Importantly, the templates provide
essential background information, supporting evidence, important context, and the rationale for each UGRC
recommendation.
As the groups worked to refine their recommendations and complete the templates, the co-chairs devised a process for sharing, presenting, adopting, reconsidering, and editing the preliminary recommendations from each workgroup. The co-chairs determined that a supermajority of 67% (two-thirds of the members present) would
be required to adopt a recommendation, and that the process would allow any member who had concerns
to bring them forward and propose edits that would facilitate a vote to adopt. In other words, the underlying
philosophy was for the Committee to “get to yes” and achieve a high degree of consensus. Importantly, each
recommendation brought to the full Committee was sponsored by one of the workgroups, whose members had
more thoroughly debated and thought through pertinent issues.
The UGRC met virtually in March 2021 to take decisional votes on each recommendation proposed by the four
main workgroups. In total, the workgroup leaders presented 41 recommendations to the UGRC. Each presentation
included (a) the recommendation, (b) the narrative description, (c) components that each recommendation
required (i.e., “must haves”) as well as those that would be “nice to have,” and (d) a table outlining pros and cons
of the recommendation. The presentation was followed by a facilitated discussion that allowed members to
ask questions, seek clarications and raise concerns about the proposed recommendation. Potential edits
to the recommendation were also entertained, followed by a binding vote to either adopt or not adopt the
recommendation as written. Of the 41 recommendations initially presented, 36 were adopted with at least a 67%
majority, and ve were not adopted.
Workgroups that had proposed a recommendation that was not adopted were given the option of altering the
recommendation and asking for the modied recommendation to be reconsidered. In addition, every member
was allowed to propose new recommendations. However, only the DEI workgroup used that mechanism to
propose new recommendations. The new recommendations, together with the recommendations being
reconsidered, were processed in the same manner as the original 41 recommendations (i.e., a preliminary vote,
presentation of the recommendation, facilitated discussion, and entertainment of suggested edits). When the
UGRC convened for a special session in April 2021, six more recommendations were adopted (three altered
recommendations brought back for reconsideration and three new recommendations related to diversity, equity,
and inclusion).
In total, the UGRC adopted 42 preliminary recommendations, organized under 12 themes: oversight; advising of
learners; competencies and assessments; away rotations; diversity, equity, and inclusion in medicine; application
process; interviewing; matching process; faculty support resources; post-match transition to residency; policy
implications; and research questions. The preliminary recommendations and pertinent background material were
presented to the Coalition in April 2021, followed one week later by their widespread release and a one-month call
for public comment. The preliminary recommendations of the UGRC can be found in Appendix D.
The solicitation for public comment was facilitated by the creation of a digital survey instrument with a prominent
link on the Coalition's website. The link was made widely available to interested parties and all stakeholder groups.
The call for public comment was disseminated through numerous communication channels including social
media platforms, email distribution lists, outreach presentations, and individual networks. In addition, periodic
reminders were issued throughout the open call to increase the number of responses.
In total, the survey instrument collected 2,673 comments from 768 distinct respondents during the period that the survey was administered. Of these responses, 13.7% were submitted on behalf of an organization or group in an official capacity, which accounted for 21.2% of the overall comments. The survey responses were
analyzed as follows by a team from the NBME with expertise in qualitative and quantitative methods.
Prior to the survey administration window, UGRC stakeholders were asked to provide a list of potential codes
or topics that would likely be discussed in the respondents' comments. After the first week of the survey administration window, four NBME staff members read portions of the response data and identified a list of potential thematic codes. The list of codes was presented to UGRC stakeholders for review and approval. The four NBME staff members then coded the first two weeks of comments using the initial codebook. Subsequently, through an iterative process, additional codes and tags were added, which resulted in a final set of agreed-upon codes and tags. The final codebook was used by the NBME staff members to code the remainder of responses in weekly batches. Two NBME staff members reviewed 10% of all coded comments from the first two weeks of
the survey window to ensure that codes were being adequately and accurately used. This review resulted in the
application of additional codes to the comments and not to the deletion of previously applied codes. Through
discussion, NBME sta members also attended to their reactions to the responses, their backgrounds, and their
potential biases. To clarify relationships between associated codes, codes were organized using a parent-child
code structure in which a parent code could include any number of subcategories, or “children.” In all tables and
figures in the results section, an asterisk was used to indicate which of the codes are parent codes. If a child
code was applied to a free-text response, its parent code was also applied or “upcoded.” All free-text responses
were also assigned sentiment (agree, disagree, or mixed) when distinct sentiment was expressed in a comment.
Additionally, a list of tags was applied to all free-text responses when applicable. The full report from the team was
made available to all Committee members before the UGRC's recommendations were finalized. This report from
the NBME team can be found in Appendix E.
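To make the parent-child coding scheme concrete, the sketch below illustrates in Python how a child code can be "upcoded" to its parent and how sentiment and tags might be attached to a comment. This is an illustrative sketch only; the code names, data structures, and helper function are hypothetical and do not represent the NBME team's actual tooling.

```python
# Illustrative sketch of a parent-child codebook with "upcoding," sentiment, and tags.
# Code names and structures below are hypothetical examples, not the NBME team's tools.
from dataclasses import dataclass, field
from typing import Optional, Set

# Hypothetical codebook: each child code maps to its parent code.
PARENT_OF = {
    "interview_caps": "application_process",
    "virtual_interviews": "application_process",
    "away_rotations": "clinical_experiences",
}

@dataclass
class CodedComment:
    text: str
    codes: Set[str] = field(default_factory=set)
    sentiment: Optional[str] = None   # "agree", "disagree", or "mixed", when expressed
    tags: Set[str] = field(default_factory=set)

    def apply_code(self, code: str) -> None:
        """Apply a code; if it is a child code, also apply ("upcode") its parent."""
        self.codes.add(code)
        parent = PARENT_OF.get(code)
        if parent is not None:
            self.codes.add(parent)

# Example with a made-up comment: the child code brings its parent code along.
comment = CodedComment(text="Interview caps would reduce hoarding of interview slots.")
comment.apply_code("interview_caps")
comment.sentiment = "agree"
comment.tags.add("responded_as_individual")
print(sorted(comment.codes))  # ['application_process', 'interview_caps']
```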
To prepare for the June 2021 UGRC meeting, multiple members of the UGRC’s Executive Committee read the
survey report in full, and ve workgroup leaders were assigned to summarize commentary about each of the
preliminary recommendations. In addition to the information contained in the survey report, feedback from the
organizational members of the Coalition and input obtained by the co-chairs through dialogue with students,
program directors, DIOs, medical educators, medical school deans, and international medical graduates was
shared with each member of the UGRC to inform the Committee's final recommendations. During the June meeting, all stakeholder feedback, strategies for consolidating and sequencing the recommendations, and reconsidered themes were presented, discussed, and finalized. New language for the recommendation
addressing application inflation was also proposed, discussed, and adopted. As a result of these efforts, the UGRC adopted 34 final recommendations organized around nine themes. Moreover, the recommendations within each
theme are sequenced in chronologic order to guide their implementation.
FINAL STEPS
The Executive Committee was responsible for writing this final report on behalf of the UGRC. The report includes
the templates created by the workgroups as well as input from all members of the Committee. There are ongoing
discussions regarding possible opportunities for scholarly activity with the purpose of codifying and further
sharing the work of the Committee. The UGRC co-chairs will deliver this final report to the Coalition in mid-July
2021, and the UGRC will disband shortly thereafter. The Coalition will then meet in late July to consider adoption of
the recommendations and determine next steps towards implementation.
Future Ideal State
From the outset, the UGRC agreed to envision an idealized future state for the
transition from UME to GME before developing any recommendations. The idea
was to use the concept of backward design so that the UGRC could identify the
characteristics of a system that would create that ideal state. What would success
look like if the transition worked as a cohesive ecosystem that served all learners,
faculty, clinical supervisors, and patients?
Beginning with the rst virtual meeting of the UGRC in September 2020, four UGRC workgroups spent two months
conceptualizing an idealized state for their area of focus:
• Ensuring Residency Readiness
• Mechanics of the Application/Selection Process from the UME Perspective
• Mechanics of the Application/Selection Process from the GME Perspective
• Post-Match Optimization
The workgroup leaders then harmonized each component into a composite ideal state for the UME-GME
transition. Soon thereafter, a public comment period was opened to solicit additional ideas from external
stakeholders. In total, 32 organizations responded. Overall, the stakeholder input affirmed the concepts developed
by the UGRC and led to an improvement in clarity and wording but did not result in substantive content changes to
the UGRC’s proposed ideal state. The nalized composite ideal state for the UME-GME transition guided the UGRC’s
ongoing work and addressed the following areas: wellness, specialty selection, learner selection, competence,
continuum, hando, technology, licensing, credentialing, life transition, residency launch, and residency environment.
THE IDEAL STATE
Overall
The foundation of the ideal state as envisioned by the UGRC is a set of core values and concepts. The ideal UME-
GME transition is equitable, coordinated, efficient, transparent, and cohesive. It is an ecosystem that supports
each learner’s growth, evidence-informed specialty selection, achievement of competence, and maintenance
and improvement of wellness. Learners progress from medical school to a residency program in a manner that
acknowledges each learner’s unique strengths and learning needs and optimizes professional identity formation.
The components of the transition balance the tension between individual freedoms and the public good and
provide trustworthy documentation of competence across the continuum using reliable assessment tools that
generate meaningful information for learners, educators, and where appropriate, regulators. Additionally, the
UME-GME transition is flexible and adaptable to changes in medical education and the health care system, with a
commitment to continuous quality improvement.
Key to the success of the ideal state is a commitment to the broad inclusion of students, educators, schools,
programs, and the public in the design, evaluation, and continuous improvement of the UME-GME transition.
Stakeholders are transparent and reliably provide necessary information to each other; stakeholders are trusting
and trustworthy.
Costs, nancial and otherwise, are right-sized throughout the process to maximize value, acknowledge conflicts
of interest, and allocate resources to advance the public good. Learners are prepared to serve diverse patient
populations, minimize disparities, and elevate equity as they execute the social mission of medicine and its contract
with the public. Diversity is present and valued throughout all specialties, programs, and geographic areas.
Appropriate action is taken to mitigate racism and harmful bias throughout the medical education and health
care systems. Faculty, learners, and the system structure cultivate inclusive learning environments that foster a
growth mindset. Medical students are provided reliable, high quality advising, and are ultimately responsible for
their own career progression after medical school.
Wellness
An ideal state for the UME-GME transition optimizes wellbeing for all involved. For learners, the financial challenges
of applying and transitioning to residency and being a resident are minimized. Learners have adequate funding
to establish and maintain their new living arrangements and focus on their training. There is adequate but
not excessive time for the geographic move from medical school to residency. GME programs facilitate the
creation of supportive social networks for each learner with special consideration of the needs of those from
underrepresented backgrounds. A focus on health and wellbeing is integral throughout the transition.
Specialty selection
Specialty selection can be an especially fraught process for learners and impacts the effectiveness of the entire
health system. In an ideal state, medical schools have a structured approach to career advising that begins early, is
based on professional development frameworks and competencies, is integrated within an educational program,
provides broad exposure, and aligns with the needs of society. The culture of career advising programs is inclusive,
trustworthy, non-judgmental, and equitable for all students. Advising tools are high quality, interactive, honest, and
readily available.
Educators determining the structure for UME and GME programs as well as those providing advice, mentorship,
or coaching to learners recognize career indecision as a normal part of professional formation and permit
flexibility for undecided learners at key transition points. This includes allowing non-standard timelines and
nonclinical careers as necessary. Students are supported by both UME and GME to seek specialties based on
a holistic assessment of their aptitude and goals that allows learners to be aspirational about their ambitions
while pragmatic about their possibilities. This support includes access to trustworthy, data-driven resources.
Students are informed about the workforce needs of society. They are advised against contributing to a culture of
competition.
Learner selection
While learners are challenged by specialty selection, GME programs are challenged by learner selection. In an
ideal state for learner selection that benets GME programs, learners, and most importantly patients, all residency
programs receive applications from individuals with a sincere interest in attending and who are academically
prepared and aligned with the program and institutional mission. Every program receives enough applications to
ll their class and has sucient resources to conduct a holistic review of the applications received. Interviews are
oered and scheduled to promote student wellness and minimize conflict with ongoing rotations. There are ample
interview slots for those invited. Applicants interview only with programs they are likely to attend if accepted. Away
electives broaden educational exposure but are not essential for successful residency selection.
Applicants are certied by their medical school as fully prepared and trustworthy for residency training. There is
social accountability and transparency for medical schools in the validity of this certication. Residency programs
have information regarding current competence of an applicant, the trajectory of their growth during medical
school, and the accuracy of measurements. These details are available in some form for all applicants in the
Match including U.S. medical graduates (MD and DO), U.S. citizen international medical graduates (IMGs), and non-
U.S. IMGs. Programs receive early notice about any student performance concerns. These are described clearly, in
context, and with a description of the resources required for remediation or ongoing support.
Competence
The ideal state for learner selection requires an ideal state for the definition, assessment, and assuredness of competence, wherein graduated medical students are prepared to serve as physicians in training. They are facile with the appropriate knowledge, skills, and efficiency and have advancing professional identity and a confident humility. They are prepared for the realities of residency and a physician's career. They are trustworthy to practice
under supervision, asking for help when needed.
A shared mental model of competency across the medical education continuum exists in the ideal state that
involves a standardized set of general competencies as well as specialty-focused competencies for certain
domains such as patient care and medical knowledge. Faculty development clarifies expectations for faculty with learners at each level of training, teaches remediation strategies, and describes how patient safety is ensured. Educators define those competencies that programs believe, and data support, are the best predictors of a learner's abilities to succeed. Reliable and valid standardized assessment tools document competence. All medical students engage in specialty-aligned knowledge and skills training during the final year of medical school to achieve the defined general and specialty-focused competencies.
Continuum and Handoff
This ideal state for competence smooths a learner’s way along the continuum of medical education and allows
for seamless handoffs between stages. The timeline for this continuum prioritizes competence, and learners,
along with educators and institutions, approach training with a growth mindset and value lifelong learning.
Students have the time, space, and coaching to reflect on their growth and progress, grieve losses associated
with the transition to residency, and emotionally prepare for the launch of residency.
Areas for growth and gaps in a learner’s knowledge or skills are recognized and addressed by medical school
educators and GME programs as well as by learners themselves. Educators and learners value a learner's competence in
identifying knowledge and skills gaps and together enact interventions for improvement. Assessment data from
the end of medical school are utilized to create an evidence-informed handover, engaging the learner in the
process and establishing directed self-learning. These data do not negatively affect a learner's career.
Technology
An ideal UME-GME continuum is supported by useful technology that facilitates holistic review through a common,
structured format that is trustworthy and searchable. Such technology allows programs to find applicants based
on multiple academic metrics, details of clinical and life experiences, and additional attributes. The integration of
information from schools, letter writers, and applicants allows programs to identify U.S. MD and U.S. DO students
and IMGs who will succeed at their programs. Applicants are identified by what they desire in a program, including
but not limited to a specic program, program experiences, or program mission. Evidence-based assessments are
available, meaningful, trustworthy, and presented in a useful format.
Licensing and Credentialing
The ideal state for technology in the UME-GME transition supports an ideal state for licensing and credentialing,
which is accomplished eciently for all learner groups (U.S. MD, U.S. DO, and IMGs). Varying state requirements are
addressed smoothly, creating a timely process without excessive cost. Necessary general and specialty specic
credentialing and certication are facilitated. As appropriate, an ideal state for licensing and credentialing includes
visa management.
Life Transition and Residency Launch
An ideal state for licensing and credentialing is one factor that optimizes the ideal launch of residency training.
Other factors include program directors and residency faculty who have the training, resources, infrastructure,
and perspective to approach the resident workforce as learners. Residency faculty welcome each learner as an
individual, knowing their strengths and weaknesses and trusting their competence appropriately. They are able
to tailor the rst months of the residency experience to the individual trainee, with appropriate supervision and
learning tools in place to facilitate success.
Additionally, residency faculty and peers recognize and mitigate bias to ensure optimal entrustment and success
for all learners in an inclusive environment. Special populations receive additional attention. This includes ensuring
that those who are underrepresented in medicine are introduced to support networks. International medical
graduates have focused training to prepare for success in the U.S.
Meaningful information about learners that is identified after the start of residency is also shared back to medical schools to continually improve the preparatory process.
The ideal UME-GME transition also includes the cooperation of patients who are appropriately oriented to a clinical
environment that includes learners.
Residency Environment
Once residents start a GME program, the ideal residency environment includes adequate resources to support the
pursuit of individual learning plans for every resident.
In the ideal state, program directors and faculty have protected time, educational support, administrative staff,
professional development, and funding to support the ongoing individualized growth and wellbeing of residents.
Sponsoring institutions and all other parties recognize the primary role of resident physicians as learners and fully
support the educational environment. At the same time, the developmental path of resident physicians includes
progressive responsibility, self-directed learning, and professional identity formation, which leads to readiness for
independent practice at the time training is complete. Resources invested in medical education are appropriately
allocated to address the demands of the continuum.
Conclusion
With the successful execution of the steps of the ideal state, learners achieve an optimal transition from the role of
student to resident physician and are well prepared for the rigors of residency training.
Impact of Public Commentary
Stakeholder engagement has been a consistent priority for the UGRC. Ongoing updates about the Committee's
work have been provided through the Coalition website and press releases, and through deliberate outreach and
meetings with stakeholder groups including students, program directors, DIOs, medical educators, medical school
deans, and international medical graduates. Eight of these meetings occurred after the release of the preliminary
recommendations, facilitating discussion about individual recommendations.
There have been three formal opportunities for individuals and organizations to provide feedback to the UGRC.
In December 2020, a survey was released to stakeholder organizations asking for input on the ideal state of the
UME-GME transition. Thirty-two organizations responded to the survey and the ideas they shared reinforced the
shared vision created by the UGRC for the future ideal state of the transition.
In February 2021, the UGRC issued a second call, inviting individuals and organizations to share ongoing or piloted
innovations that address concerns about the UME-GME transition. In total, 35 responses containing 39 self-
described innovations were submitted for review. Of note, the majority of the innovations submitted had previously
been identied by the Committee.
The most ambitious solicitation requested external stakeholder feedback on the UGRC preliminary
recommendations and coincided with their public release on April 26, 2021. A digital survey instrument was
created with a prominent link on the Coalition's website. The link was made widely available to interested parties
and all stakeholder groups. The call for public comment was disseminated through numerous communication
channels including social media platforms, email distribution lists, outreach presentations, and individual networks.
In addition, periodic reminders were issued throughout the one-month open call to increase the number of
responses.
The public comment survey responses were analyzed by a team from the NBME with expertise in qualitative
and quantitative methods, and their full report can be found in Appendix E. This report, all survey comments,
and the Coalition organizational responses were made available to all UGRC members before the Committee's recommendations were finalized. Multiple members of the UGRC's Executive Committee read the survey report in full, and five workgroup leaders were assigned to summarize and present commentary relevant to each of the
preliminary recommendations at the June 2021 UGRC meeting.
During that meeting, a comprehensive discussion about stakeholder feedback regarding the preliminary
recommendations occurred, which included 1) individual Coalition member feedback; 2) reactions from meetings
with stakeholder groups; and 3) individual and organizational responses to the public comment survey, including
statements from organizational members of the Coalition.
In response to stakeholder feedback during the public comment period, the UGRC made important changes to its
preliminary recommendations. The changes included significant editing, clarification, and refinement of language;
complete reworking of a recommendation addressing application inflation; judiciously combining similar ideas to
reduce the overall number of recommendations from 42 to 34; and sequencing of recommendations to provide
prioritization and a timeline for implementation. The feedback also helped clarify the concept of bottlenecks
or critical recommendations that must be implemented to allow other downstream recommendations to
move forward. The themes used to organize the recommendations were condensed from 12 to nine, with
reconsideration of the descriptive theme titles. Of note, 32 of the preliminary recommendations were impacted by
the feedback obtained through public commentary. Stakeholder input from individuals, stakeholder groups, and
Coalition organizations was invaluable in informing the UGRC's final recommendations.
Consolidation and Sequencing
Early feedback about the UGRC’s 42 preliminary recommendations suggested that the sheer number was
somewhat overwhelming, that some recommendations were markedly similar to each other, and that critical
recommendations might serve as upstream bottlenecks that could hinder downstream implementation of other
recommendations. To address these concerns, the co-chairs created a sixth workgroup and tasked it with reviewing the preliminary recommendations and determining which ones were interdependent. Once identified, these interdependencies would serve as the basis for deciding how the final recommendations might be organized,
consolidated, and sequenced.
This new team – known as the “bundling workgroup” – considered the preliminary recommendations for the
purpose of consolidation and reducing their total number. The workgroup also reviewed feedback from the public
commentary to decide which recommendations could be grouped together, and in what sequence, to facilitate
orderly and ecient implementation. This work led to the identication of four distinct bottlenecks: organizational
collaboration and continuous quality improvement (Recommendation 1), the creation of an interactive database
(Recommendation 6), developing consensus around a common outcomes framework (Recommendation 9), and
away rotations (Recommendation 13).
The gure depicts how Recommendation 9 acts as a
bottleneck. Specically, Recommendation 9 calls out
the need for a common outcomes framework shared
by both UME and GME. If no consensus is achieved on a
common outcomes framework, progress in implementing
the following three recommendations will be impeded:
Recommendation 11 (related to assessment tools associated
with the common framework); Recommendation 12
(related to faculty development for both teaching and
assessment to optimally utilize the common framework);
and Recommendation 20 (related to an electronic
dashboard that shows each learner’s assessment data
within the framework).
Although each bottleneck is associated with a number of downstream recommendations, the most important
bottleneck is Recommendation 1, which calls out the need to convene a national ongoing committee to manage
continuous quality improvement of the entire process of the UME-GME transition. As shown in the table below, 27
of the nal 34 recommendations depend on the implementation of Recommendation 1. Without organizational
collaboration to convene a national committee focused on continuous quality improvement, the UGRC
recommendations for comprehensive improvement of the UME-GME transition will fail for lack of implementation.
[Figure: Recommendation 9 as a bottleneck, with downstream Recommendations 11, 12, and 20.]
Downstream, Dependent Recommendations for each Identified Bottleneck
Bottleneck 1: Recommendations 2, 3, 4, 5, 6, 7, 8, 11, 12, 14, 15, 16, 17, 18, 19, 20, 23, 24, 25, 26, 28, 29, 30, 31, 32, 33, 34
Bottleneck 6: Recommendations 7, 8, 16, 23, 24
Bottleneck 9: Recommendations 11, 12, 20
Bottleneck 13: Recommendations 23, 24
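Read as data, the table above is a simple dependency map from each bottleneck to its downstream recommendations. The short sketch below is illustrative only: the mapping reproduces the table, but the function and variable names are hypothetical. It shows how one could list which recommendations are impeded when a given bottleneck has not yet been implemented.

```python
# Dependency map reproduced from the bottleneck table above; the helper is illustrative.
DOWNSTREAM = {
    1: {2, 3, 4, 5, 6, 7, 8, 11, 12, 14, 15, 16, 17, 18, 19, 20,
        23, 24, 25, 26, 28, 29, 30, 31, 32, 33, 34},
    6: {7, 8, 16, 23, 24},
    9: {11, 12, 20},
    13: {23, 24},
}

def blocked_recommendations(unimplemented_bottlenecks):
    """Return the recommendations impeded by the given unimplemented bottlenecks."""
    blocked = set()
    for bottleneck in unimplemented_bottlenecks:
        blocked |= DOWNSTREAM.get(bottleneck, set())
    return blocked

# Example: stalling Recommendation 9 impedes Recommendations 11, 12, and 20;
# stalling Recommendation 1 impedes 27 of the 34 final recommendations.
print(sorted(blocked_recommendations({9})))    # [11, 12, 20]
print(len(blocked_recommendations({1})))       # 27
```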
As noted above, the bundling workgroup's discussions were informed by feedback obtained from the public comments. However, the tasks of organizing and sequencing the recommendations could not be finished until the UGRC had adopted final recommendations. After the public comment period closed and the analysis of the commentary was completed, most of the preliminary recommendations were modified and one was completely reworked. The workgroup then proposed that the Committee consolidate 13 preliminary recommendations into five new bundled recommendations. The UGRC accepted this proposal, which reduced the number of final recommendations from 42 to 34. Further, the UGRC adopted the workgroup's proposal to reorganize the
recommendations into the following nine themes:
• Collaboration and Continuous Quality Improvement
• Diversity, Equity, and Inclusion
• Trustworthy Advising and Denitive Resources
• Outcome Framework and Assessment Processes
• Away Rotations
• Equitable Mission-Driven Application Review
• Optimizing Application, Interview, and Selection Processes
• Educational Continuity and Resident Readiness
• Health and Wellness
Lastly, the UGRC adopted the bundling workgroup's proposal on sequencing the final recommendations within
each theme. This organizational structure reflects the interdependence of the recommendations and is intended
to help stakeholders, including the organizational members of the Coalition, consider next steps. As shown in the
table on the next page, each recommendation has a proposed initial timeframe to help guide its implementation.
Note that the UGRC considers certain recommendations (i.e., numbers 21 and 22) to be time sensitive and that
these should be enacted immediately by the Coalition. In contrast, the Committee members understand that
others may require several years of development before they can be fully implemented.
Sequencing Timeline
[Table: the recommendations below, grouped by theme, were each assigned a proposed implementation timeframe ranging from "Now" (July 2021) and "Immediate" (2021-22) to "Soon" and "Longer" (2022-23, 2023-24, and 2024+).]
Theme Collaboration and Continuous Quality Improvement
1 Committee to Manage CQI across Transition
2 Residency Selection and Physician Workforce Research
3 IRP Reform
Theme Diversity, Equity and Inclusion
4 Specialty Specic Practices to Increase Diversity
5 DEI Education Across the Continuum
Theme Trustworthy Advising and Denitive Resources
6 Interactive GME Database
7 Career Advising Resources
8 Career Advising Curriculum
Theme Outcome Framework and Assessment Processes
9 Common Competencies across Transition
10 CQI to Mitigate Bias across Transition
11 Improved Assessment Tools
12 Competency Based Faculty Development Materials
Theme Away Rotations
13 Review of Away Rotations
Theme Equitable Mission-Driven Application Review
14 MSPE Revision
15 Structured Evaluative Letters
16 Sharing Applicant Demographics to Improve Diversity
17 Electronic Application System Improvements
18 Reporting Licensure Exams in a Single Field
19 Review of Filter Content and Use
20 Standardized Dashboard and Portfolio for Learners
Theme Optimizing Application, Interview and Selection Processes
21 Virtual Interviews for 2021-2022
22 Standards for Interview Offer and Acceptance Process
23 Residency Application Process Innovations
24 Interview Limits
Theme Educational Continuity and Resident Readiness
25 Feedback from GME to UME
26 Centralized Resident Support Resources
27 Coaching for Professional Identity Formation
28 Specialty Specic Residency Preparation
29 Improved Residency Program Orientation
30 UME to GME ILP Handoff
Theme Health and Wellness
31 Wellness Resources for the Transition
32 Assured Time Between UME and GME
33 Equitable Access to Funding for Transition
34 Standardized Process for Initial Licensing
Limitations
The UGRC faced a number of limitations to its work process, beginning with the allotted timeframe. Although the
deadline for creation of the nal recommendations was 10 months, the complexity of the problems intrinsic to the
current UME-GME transition and the comprehensive scope of the charge were considerable.
In addition, a signicant amount of time was devoted to the steps prior to generating recommendations, including
the elaboration of the charge, level setting, envisioning an ideal future state, and generating Ishikawa diagrams
(shbone analyses) of root causes of the identied challenges to the transition. In the end, this eort was thought
necessary, as the process goal was to articulate the desired outcomes and understand the root problems before
generating solutions.
Due to time constraints, it was also necessary to divide the charge among workgroups and have work proceed
in parallel. More opportunities to share information and explore revisions across workgroups might have allowed for
earlier consolidation of recommendations.
Although the virtual format of the meetings caused some limitations with regard to the social norms of in-person
meetings, the ability to come together virtually had a positive impact on the efficiency of the work process.
The chat function of the platform also enabled additional layers of interactions and an alternative modality of
engagement. Managing a committee of 30 members can be challenging, but UGRC meetings were structured to
include both small and large group sessions.
Although the UGRC had representation from across the continuum, there was an acknowledged desire for more
diversity within the Committee and more representation from learners.
The workgroups and the librarians searched for medical literature supporting proposed solutions to the problems
facing the transition; however, there is currently limited evidence for many of the recommendations, and
generating future research questions became an additional focus of the Committee’s work.
Although many innovations to improve the UME-GME transition are being explored across the country, it is too early
to draw conclusions regarding their overall effectiveness. As evidence accrues, a continuous quality improvement
process will help to advocate for changes that are evidence based.
The number of comments received through the public comment survey was lower than expected. This was likely
secondary to the large number of preliminary recommendations presented for feedback, and there was ongoing
discussion on how to most effectively engage stakeholders. Overall, however, the survey comments received were
thoughtful and of high quality, and they significantly influenced the wording of the final recommendations.
Although there was the potential for conflict of interest from the inclusion of organizational leadership on the
UGRC, members honored the explicit request to bring their experience and expertise to discussions, and to
participate as individual members rather than as organizational representatives.
The scope of the UGRC was limited to the UME-GME transition, and therefore recommendation timeframes do
not extend for the duration of residency or into fellowship training. The resources available to the UGRC also did not
allow for a cost analysis of the recommendations.
Finally, the charge for the UGRC was to generate solutions in the form of recommendations to comprehensively
improve the UME-GME transition. We look to the collaboration of the Coalition member organizations for
implementation of the recommendations.
References
1. Whelan A, Joshi A. Planning Committee for the UME-GME Review Committee Final Report. Coalition for Physician
Accountability Planning Committee. 2020. https://physicianaccountability.org/wp-content/uploads/2020/05/
Planning-Committee-Final-Report-Draft-Final_520.pdf (Accessed June 30, 2021).
2. Barone MA, Filak AT, Johnson D, Skochelak S, and Whelan A. Summary Report and Preliminary
Recommendations from the Invitational Conference on USMLE Scoring (InCUS), March 11-12, 2019. https://www.
usmle.org/pdfs/incus/incus_summary_report.pdf (Accessed June 30, 2021)
3. Cain R, Catanese V, Katsufrakis P, McMahon G, Odom C, and Skochelak S. Proposal: Planning process to
collaboratively review the transition from UME to GME. Coalition for Physician Accountability Management
Committee. 2019. https://physicianaccountability.org/wp-content/uploads/2020/01/UME-GME-Proposal-
Final-1.17.20.pdf (Accessed June 30, 2021)
4. Chaudhry HJ, Kirch DG, Nasca TJ, et al. Navigating Tumultuous Change in the Medical Profession: The Coalition for
Physician Accountability. Acad Med. 2019 Aug; 94(8):1103-1107.
APPENDICES
Appendix A: Glossary of Terms and
Abbreviations
Organizations:
ACCME: Accreditation Council for Continuing Medical Education
ACGME: Accreditation Council for Graduate Medical Education
AACOM: American Association of Colleges of Osteopathic Medicine
ABMS: American Board of Medical Specialties
AMA: American Medical Association
AOA: American Osteopathic Association
AAMC: Association of American Medical Colleges
CMS: Centers for Medicare and Medicaid Services
CMSS: Council of Medical Specialty Societies
COALITION: Coalition for Physician Accountability
ECFMG: Educational Commission for Foreign Medical Graduates
FSMB: Federation of State Medical Boards
LCME: Liaison Committee on Medical Education
NBME: National Board of Medical Examiners
NBOME: National Board of Osteopathic Medical Examiners
NRMP: National Resident Matching Program
OPDA: Organization of Program Director Associations
UGRC: Undergraduate Medical Education to Graduate Medical Education Review Committee
Terms:
Away Rotations: A clinical experience at a teaching hospital or clinic that is not affiliated with a student’s medical
school
Basic Advising: Common understanding of career advising, professional development, specialty selection, and
application procedures.
CiM: Careers in Medicine
COMLEX-USA: Comprehensive Osteopathic Medical Licensing Examination of the United States
Competence: The array of abilities (knowledge, skills, and attitudes) across multiple domains or aspects of
performance in a certain context. Statements about competence require descriptive qualifiers to define the
relevant abilities, context, and stage of training. Competence is multi-dimensional and dynamic. It changes with
time, experience, and setting. (Frank et al. 2010)
Competency: An observable ability of a health professional related to a specific activity that integrates knowledge,
skills, values, and attitudes. Since competencies are observable, they can be measured and assessed to ensure
their acquisition. Competencies can be assembled like building blocks to facilitate progressive development. (Frank
et al. 2010)
CQI: Continuous Quality Improvement
DEI: Diversity, equity and inclusion
DIOs: Designated institutional officials
DO: Doctor of osteopathic medicine
Dual Applicants: Applicants applying to more than one specialty
Educational Continuum: Term that describes the span of a physician’s education, from undergraduate medical
education (medical school) to graduate medical education (residency and fellowship) to continuing medical
education (ongoing during years in practice)
ERAS: Electronic Residency Application Service
General Career Advising: Assisting students in selecting an appropriate career path and specialty
GME: Graduate medical education (residency training)
Holistic Review: Mission aligned selection process that considers an applicant’s experiences, attributes, and metrics
ILP: Individualized learning plan
IMG: International medical graduate
InCUS: Invitational Conference on USMLE Scoring
Initial Residency Period (IRP): Number of years it takes for a resident to become board eligible in the first medical
specialty the resident entered, set when a physician enters residency
In-Training Examination: Annual specialty-specic standardized multiple choice question medical knowledge
examination
Longitudinal Assessment: Measurement of a learner’s knowledge, skills, and attitudes that occurs on an ongoing
basis over a prolonged period of time
LOR: Letters of recommendation
Matched: A student who was able to secure a residency position to continue their medical education
Match Day: The day, typically occurring in March of each year, on which students find out which residency program
they have been assigned to for training after they graduate from medical school
MD: Doctor of medicine
MSPE: Medical Student Performance Evaluation
PCP: Primary care provider
PD: Program director
SELs: Structured evaluative letters
SOAP: Supplemental Offer and Acceptance Program
Specialty-Specific: Pertaining to a specific medical specialty (e.g., pediatrics, surgery, psychiatry, etc.)
Specialty-Specific Advising: Assisting students with strategies for optimal placement in their chosen specialty
SLOEs: Standardized letters of evaluation
Un-Matched: A student who was unable to secure a residency position to continue their medical education
USMLE: United States Medical Licensing Examination
Appendix B: Workgroup Ishikawa Diagrams (Fishbones) Created for Root Cause Analysis
Key for anticipated time-based solutions: short-term solution, mid-term solution, long-term solution
Lack of alignment:
advising & stakeholder
needs
Advising misaligned
with student
preferences
Advising not aligned
with patient &
population health needs
Student advising | Assessment tools and strategies | Culture
Lack of trustworthy data to
inform advising
Lack of transparency —
programs don’t share all data
Data not standardized across
schools and programs
Inadequate advisor preparation
Lack of time
Lack of funding
Lack of institutional value placed
on advising
Lack of current advising resources
Lack of single coordinated system
Health care financing uncoordinated
System needs things that don’t have a good business model
Each stakeholder has own financial interest
Separate funding streams
Rigid business model in US and international schools
Each stakeholder has own accountability
structure
All stakeholders are looking at different parts of
the problem
All stakeholders are working in their own interest
Public as stakeholder is undervalued
Metrics of success focus on learners,
schools & programs
Accreditation is process-oriented
rather than patient outcome-focused
Stakeholders | Match system | Definition of competence
Lack of shared mental model
Purpose of assessment; tension of
formative vs. summative
Varied approaches to assessment
at schools
Culture of individual ownership of
approaches
Lack of trust in available tools and
strategies, users
Varied, insufficient resources for
assessment
Lack of validity evidence for
assessment tools and strategies
Lack of expertise to generate
evidence
Inadequate resources
Transactional, high-stakes nature
of match disconnected from
educational priorities
PDs are focused on comparative
data rather than competence
— Inadequate time to review
applications
Pressure to achieve high rankings
and be perceived as excellent
using familiar metrics
— Don’t have a way to measure
some important outcomes
Rigid, one-size-fits-all approach
Fixed timepoint for match
Implicit assumption that all
learners will progress at equal pace
Culture is competitive
Students are competitive to get into
medical school
Residency selection is competitive
Schools are competitive
Lack of trust in other stakeholders
Focus on individual achievement
over social good
Insufficient or ineffective attention
to professional identity formation in
training
Challenges of labeling
unprofessional behavior
Fear about impact of labeling
unprofessional behavior
Lack of understanding of
professional development
Generational Differences
Legal risks
Fear of failure
Risk of unmatched student
Culture success
Variable denition of competence
Competence has local meaning
in the local context & culture
Variable faculty development
Competence as achievement vs.
a developmental progression
Dierent denitions of
competence in UME & GME
Variable value placed on
competence
Variable faculty and institutional
buy-in to CBME
CBME doesn’t provide maximally
useful info in the Match
Misaligned incentives
Easier to advance a learner than
stop them from advancing to
ensure competence
The current dysfunctional UME-GME transition system is characterized by mistrust and mismatch of expectations among learners, UME educators, and GME educators who use the wrong information to make wrong inferences
Workgroup A
Applicants’ clinical
experience and
knowledge of specialty
limited
Applications
Recruitment
Application Mechanics: Lack of
clarity with reporting required
information (LOA, probation, etc.),
costly, long application season,
couples match process difficult to
understand
Letters of recommendation lack
consistency among specialty requirements
with confusion/bias to templates
Interviews lack standardization across
programs and specialties, lack faculty
development and education on
appropriate questioning
Virtual interviewing:
• Bias to tech issues, perception of
staging, and resource cost
Tech limitations due to cost,
availability and tech difficulties
Student interviewing-
Preparation resource intensive
Cost of interview high in both
tangible costs (travel) and time
away from rotations
Interviews
Assessment and
Holistic Review
Medical student reporting lacks consistent,
comparable information from objective and
universal reporting tools leading to mistrust by
residency programs
Application inflation increasing
program reliance on board scores
Programs lack consistent data and
resources for holistic review of
applicants
Medical school advising challenges
due to lack of feedback from
programs to school, advisor
experience, variation/lack of
program data, variabilities in
student clinical exposure, lack of
clarity on application requirements
Chaotic Interview Process
More interview offers than slots
resulting in cancellation
Insufficient opportunity to
respond to interview invitation
Holding interview slots
Lack of real-time interview update to
allow schools to intervene/counsel
students appropriately
Medical school reliance on
high residency match rates
for their own recruitment
may conflict with GME goals
The system of scheduling elective rotations
is problematic for programs (duplicative
scheduling of rotations, cancellations)
EPAs and other competency reporting requirements
vary among schools and lack standardized tools that
are measurable, reproducible, universal, and
understood by students
Lack of professionalism information, as
definitions, faculty development, and tools
for reporting vary among schools
Inflexibility of med school
curriculum to tailor educational
experiences for successful
residency transition
The UME/GME process for the applicant, medical school, and program requires improvement
The system lacks transparency and visibility of
program requirements for away or audition
rotations, including if these ‘filters’ differ from
interview/residency application filters
The system lacks an easily accessible
database for programmatic information
that is accurate and comparable that
includes the program’s desired applicant,
including ‘filters’ used by the program to
determine student eligibility
Lack of metrics in multiple key competencies
to compare candidates, leading to program
reliance on board scores
Day 1 Readiness/Later Transition
lacks denition and consistent tools
for measurement and reporting
For all 4 components, there is a lack of consistent definition and application of DEI practices
Workgroup B
Fear
Lack of Trustworthy,
Validated Information
to Programs
Lack of Trustworthy,
Validated Information
to Applicants
Needs of Society
Not Prioritized
Program Director
Stress (Expectations)
Program Director Stress
(Limited Resources)
Applicant Stress
Bias
Lack of trustworthy assessment
especially with respect to
longitudinal, workplace-based
and 360 degree assessment,
including for IMGs
Applicant information is not in a
structured, validated format
usable for large scale review
Lack of understandable, plain
language reporting of student
assessment pre- and post-match,
especially in longitudinal,
workplace based and 360 degree
assessments, especially for IMGs
Unclear what data should be used
for resident selection, or how we
would define a successful resident
Fear of missed opportunity:
PDs want “best” applicants,
applicants want “best”
programs, both think that
one more interview or
application will help
UME-GME transition process is inequitable, inefficient, wasteful, costly, and unnecessarily stressful for all involved
Fear of not filling: pressure for
programs to fill due to funding,
clinical need, prestige, etc.
Fear of not matching: Applicants
have limited career options outside of
the Match and perceive no flexibility
to change specialty or the timeline
Medical schools fear unmatched
students, may limit their
transparency
Inflexible timeline
Unfamiliar process
ACGME requirements to address wellness,
QI, diversity, and board pass rates
increase documentation and stress
Frequent Program director turnover, so new
PDs must learn unfamiliar rules
Hospital partners have clinical
expectations for the program without
a lot of backup if the program can’t
meet those expectations, which leads
to significant risk aversion (for learners
who could struggle) and fear of not
filling (which also affects program
funding)
There are many applications per
position, and many applicants
have similar qualications. PDs
have little guidance on how to
select applicants for interview as a
part of holistic review
Limited time and staffing for
individual holistic review at initial
application review, so may rely
on simplistic filters
Limited funding may fall
further if program begins to
struggle and can’t fill
Inadequate resources for
trainees requiring additional
support (educational to pass
boards, psych, clinical backup
if unable to care for patients,
faculty development, etc.).
Lack of resources means that
learners who needed support
previously are avoided
Limited time and staffing
for interviews
Program director burnout and
depression may lower their
capacity further
Financial burden, educational
burden, and opportunity cost
for time spent on application
process
Unfamiliarity with the
process
Obligation for away
electives for more than
broadening clinical diversity
or learning about a
program– some specialties
required them for interest
signaling and student
assessment
Process is very different for
different groups of applicants
(USMD, USDO, IMG, etc.),
without clear expectations
Any measurement technique
(including standardized
metrics) can hinder some
applicants, but programs
need some way to tell the
dierence among them, and
applicants want a way to
distinguish themselves
Using biased metrics for
selection leads to a more
transparent, predictable
process compared with
holistic review
Filters can cause bias without
alerting programs (i.e., USMLE
filters removing DO applicants)
Bias favors certain applicants,
schools, etc., who may resist
complete equity
Conflicting advice from
multiple sources (peers, UME,
GME, online)
Yearly variability in residents
matched, especially at
smaller programs
Programs are not always
transparent in how they
select applicants for
interview and ranking, or
who actually matches with
the program
Applicants do not seem
to utilize or trust the
information that is
available.
Student effort spent on transition
instead of working toward the greater
good (research, patient care, wellness)
Students learn to hide their
weaknesses, reinforcing unhelpful
patterns for future practice
Sucient applicants do not go to
underserved areas/specialties
(FM, IM, and peds are the most
unlled specialties)
Learner-centered educational
requirements for residency programs
may conflict with patient-centered
health system needs
DO, URM, and IMG applicants are
underrepresented in certain
specialties and geographic areas
Significant waste due to
redundant licensing exams
(multiple steps of both COMLEX
and USMLE, some applicants take
both). Uncertain that these metrics
are predictive of competence.
Workgroup C
Professional Identity
Formation
Information Sharing -
Handoffs
Optimizing UME:
Residency Ready
Optimizing GME:
Ensuring Patient Safety
DEI
Logistics of
Transitioning
Feedback to UME
Wellbeing
Learning Environment
Cultural Differences | Mistreatment | Hidden Curriculum | Service/Education Balance | Specialty Disrespect
Clinical Environment
Patient Safety | Quality | Wellbeing | Professionalism | Supervision and Care Transition
Individual
Attributes
Diversity
URM
IMG
MD/DO
Support
Partners
Parenting
Finances
Post Match Optimization
Effect
Pressure for schools
to advocate for
applicants
Meaningful data, ILPs
FERPA
No consequence to
UME for inaccuracy
Discipline-specific
curriculum linked to GME
Few non-residency
options
Conflict of interest
for schools
CoreEPAs
Boot camps —
Skill building
Lack of flexibility to
make customized
schedules
Appropriate supervision
Board pass rate as
measure of success
Cultural humility
Lack of early
assessment
Low risk capacity —
Challenge of the
struggling intern
Importance of IPE
Need for representative,
inclusive community
Need for bystander
training
Need for equity lens
throughout process
Mindset transition:
student to worker &
lifelong learner
Life transitions
Impact of
mistreatment
Imposter
syndrome
Lack of
coaching
Graduation
Life long
learning
Specialty
bias
Moving
Finances
Licensure
Varied state
requirements
GME performance
data
Visa issues
Curricular gaps
Trainings
(ACLS, etc.)
MSPE utility &
accuracy
Welcome &
support
Need for early
assessment
Workgroup D
Appendix C: UGRC Final Recommendations
With Complete Templates
Recommendation 1:
Convene a national ongoing committee to manage continuous quality improvement of the entire process
of the UME-GME transition, including an evaluation of the intended and unintended impact of implemented
recommendations.
Narrative description of recommendation:
One of the challenges in creating alignment and making improvements is the lack of a single body with
broad perspective over the entire continuum. This creates a situation where organizations and institutions
are unnecessarily and counterproductively isolated, without a shared mental model or mission. A convened
committee that includes learner and public representatives should champion continuous improvement to the
UME-GME transition, with a focus on the public good.
This recommendation creates the ideal state for the UME-GME transition because:
The ideal state requires an equitable, coordinated, efficient, and transparent system across the UME-GME
transition. Further, the ideal state specifically endorses the idea that the transition ecosystem must adapt to
changes in both medical education and health care, with a commitment to continuous quality improvement.
An ongoing committee that is focused on the entire process will ensure that the efforts to implement all
recommendations occur in a coordinated fashion, and that sufficient attention is given to doing so in a manner
that is committed to continuous quality improvement.
How this recommendation links to the fishbone diagrams used to develop the ideal state:
• Much of the UME-GME transition occurs within clinical learning environments
• Inequitable, inefficient, wasteful, costly, and unnecessarily stressful
Implementation “must haves” include:
• Buy-in from each constituency to allow for effective launch and operations
• Commitment to continue the work of implementing the UGRC recommendations
• Continuous quality improvement mindset
Implementation “nice to haves” include:
• Benchmarking and dashboards to show progress toward implementation
• Communications vehicles to disseminate future recommendations
Pros:
• Alignment
• Cost savings
• Focus on the public good
• Iterative and evolving
Cons:
• Reluctance to support central authority
• Creation of a new organization or expansion of a current one with associated costs and turf/scope/boundary challenges
• Risk of increasing expense or hassle
Specific examples on how this recommendation might be implemented:
The Coalition for Physician Accountability is itself an example of an overarching organization that has self-
organized into a body issuing recommendations on numerous topics. Similarly, a subset of organizations
(i.e., American Association of Colleges of Osteopathic Medicine, Association of American Medical Colleges,
Accreditation Council for Graduate Medical Education, Educational Commission for Foreign Medical Graduates/
Foundation for Advancement of International Medical Education and Research) has recently provided resources
to help graduates from the Class of 2021 make a successful transition from students to residents. Further,
various organizations have collaborated on a number of projects such as professionalism (American Board of
Medical Specialties, National Board of Medical Examiners) and USMLE Scoring (Association of American Medical
Colleges, American Medical Association, Educational Commission for Foreign Medical Graduates, Federation of
State Medical Boards, and National Board of Medical Examiners). Another example is the creation of steering
committees that include individuals to provide guidance on specific initiatives (e.g., the AMA’s Accelerating Change
in Medical Education initiative). The Coalition for Physician Accountability could create a permanent committee
or group to launch implementation, provide ongoing guidance, monitor progress, and recommend future action
based on data and environmental factors.
A subset of the Coalition for Physician Accountability (i.e., those organizations with the most interest in the UME-
GME transition) could self-organize to create and support a central oversight body.
Research questions:
Apart from accrediting agencies, are there descriptions in the health professions, law or business literature of
ongoing, permanent oversight bodies that have been created to oversee the transition of professionals from
learners to practitioners?
What is the structure of such oversight bodies and is there any evidence that they have operationalized a
continuous quality improvement mechanism?
Are there examples of collaboration or public-private partnerships (e.g., government agencies and NGOs) that
have successfully implemented oversight over a professional transition?
Recommendation 2:
In addition to supporting collaboration around the UME-GME transition, this national committee should: develop
and articulate consensus around the components of a successful residency selection cycle; explore the growing
number of unmatched physicians in the context of a national physician shortage; and foster future research to
understand which factors are most likely to translate into physicians who fulfill the physician workforce needs of
the public.
Narrative description of recommendation:
Currently, the medical education community lacks a shared mental model of what constitutes a successful
transition from UME to GME, and also what factors predict that success. The lack of agreement leads to conflict
over the content of applications as well as the resources required for a residency selection cycle. Success could
include simple educational outcomes such as completing training, board certification, or lack of remediation.
Alternatively, applicant-specific factors may be more important, such as likelihood of choosing the same program
again. Success may be defined solely on the public good, based on the fill rate of programs and the number
of physicians practicing in underserved areas. Or, it may be that successful residency selection is institutionally
specific based on its mission and community served, with some institutions focused on research and others on
rural communities. The committee should articulate the factors associated with a successful residency selection
cycle so they can be appropriately emphasized in the UME-GME transition, especially as changes are made to the
process.
The committee should report on data trends, implications, and recommended interventions to address the
growing number of unmatched physicians. This analysis should include demographic data to examine diversity,
specialty disparities in unmatched students, number of applications, grading systems, participation in SOAP,
post-SOAP unmatched candidates, match rate in subsequent years of re-entering the match pool, and attrition
rates of learners during residency. This recommendation is intended to urge UME programs and institutions
to utilize a continuous quality improvement approach and review unmatched graduates by specialties,
demographics, number of programs applied to, and clinical grading; to offer alternative pathways; and to add
faculty development for clinical advising. Both UME and GME data would identify patterns within the continuum
of medical education that negatively impact unmatched physicians and attrition rates of GME programs. Ideally,
shared resources and innovation across the continuum would be identied and disseminated.
Graduates of U.S. medical schools fill many residency positions, which means GME is constrained by the decisions
made by U.S. medical school admissions committees. However, international medical graduates are also
considered at many programs and provide an opportunity to serve the public good. The committee should foster
research to help program directors understand which applicant characteristics are useful indicators to address
ongoing medical workforce issues. Further changes to the transition should be informed by evidence whenever
possible.
This recommendation creates the ideal state for the UME-GME transition because:
A shared mental model for a successful transition will improve trust and allow the process to come into alignment
with the agreed upon outcomes, balancing the tensions between all stakeholders. Understanding how best to
meet the specialty-specific physician workforce needs of the public will assist program directors in designing
selection strategies based on characteristics beyond academic metrics. Careful consideration is due to applicants
who do not match to ensure they are receiving equitable treatment during the process.
How this recommendation links to the fishbone diagrams used to develop the ideal state:
• Needs of society not prioritized
• Wellness
Implementation “must haves” include:
• Longitudinal data access (applicant characteristics, survey data, and practice outcomes)
• Access to match participants for data collection
• Broad participation among Coalition for Physician Accountability organizations and stakeholder groups
Implementation “nice to haves” include:
• Regularly available UME program-specific data on unmatched students and specialties, with demographic distribution and additional information, e.g., clinical grading, advising methods, alternative pathways
Pros:
• Use of data-driven selection characteristics for residency
• Clearer definition of a successful match, its frequency, and how the selection process can support applicant and program alignment in educational goals and program mission
• UME programs could implement continuous quality improvement regarding unmatched applicants
• Shared resources for common challenges and successful strategies
• Shared resources regarding unmatched applicant data
• Innovation
Cons:
• Consensus satisfactory to all groups may not be possible
• Research will reveal population-level predictors of practice patterns but may impact individual students whose interests don’t fit the broader trends
• Resources and funding will be needed to support innovation for decreasing unmatched applicants
• Guidelines and recommendations unavailable for alternative pathways
Relevant examples from the literature (if applicable):
1. Association of American Medical Colleges. The Complexities of Physician Supply and Demand: Projections from
2018 to 2033. June 2020. https://www.aamc.org/media/45976/download. Accessed June 2, 2021.
2. Zhang X, Lin D, Pforsich H, Lin VW. Physician workforce in the United States of America: forecasting nationwide
shortages. Hum Resour Health. 2020 Feb 6;18(1):8
3. O’Connell TF, Ham SA, Hart TG, Curlin FA, Yoon JD. A National Longitudinal Survey of Medical Students’ Intentions to
Practice Among the Underserved. Acad Med. 2018 Jan;93(1):90-97.
4. Goodfellow A, Ulloa JG, Dowling PT, et al. Predictors of Primary Care Physician Practice Location in Underserved
Urban or Rural Areas in the United States: A Systematic Literature Review. Acad Med. 2016 Sep;91(9):1313-21.
5. Gatell VI, Nguyen T, Anderson EE, McCarthy MP, Hardt JJ. Characteristics of Medical Students Planning to Work in
Medically Underserved Settings. J Health Care Poor Underserved. 2017;28(4):1409-1422.
6. Rabinowitz HK, Petterson S, Boulger JG, Hunsaker ML, Diamond JJ, Markham FW, Bazemore A, Phillips RL. Medical
school rural programs: a comparison with international medical graduates in addressing state-level rural family
physician and primary care supply. Acad Med. 2012 Apr;87(4):488-92.
Specic examples on how this recommendation might be implemented:
Convene a collaborative group with representation from key stakeholder organizations and the public charged
to define a successful transition and understand which characteristics predict a successful match. The group
would need cooperation from all key stakeholder organizations including access to existing data.
Convene a research group with representation from key stakeholder organizations and the public, charged to
support and conduct research aimed at determining which applicant characteristics (e.g., degree, demographic,
experiences, academic metrics, etc.) are most likely to result in physicians who fulfill the needs of the public in terms
of medical specialty shortages, ethnic diversity, geographic distribution, and other important needs. The group would
need cooperation from all key stakeholder organizations, including access to any existing data.
Regularly available data on unmatched applicants by specialty, with demographic distribution and additional
information, e.g., on clinical grading systems. Ideally, unmatched graduates and UME programs will have the
resources and meaningful options for successful reapplication or alternative pathways, with appropriate
individualized advising.
Committee formation with diverse representation, specialty organizations, and a timeline for reporting
Research questions:
1. Does providing data on unmatched applicants and feedback on institutional trends allow for continuous quality
improvement?
2. Are attrition rates for GME programs affected by the pipeline of unmatched applicants or length of time before
matching?
3. Existing demographic, socioeconomic/disadvantaged status, number of applications, and specialty data on
unmatched applicants is needed.
4. Beyond the unmatched applicants, there are also individuals who did not apply to residency, applied but did not
receive an interview, or interviewed but were not placed on the program’s rank list. More information is needed about
these people.
Citations:
1. Abraham HN, Opara IN, Dwaihy RL, Acuff C, Brauer B, Nabaty R, Levine DL. Engaging Third-Year Medical Students on
Their Internal Medicine Clerkship in Telehealth During COVID-19. Cureus. 2020. 12(6): e8791.
2. Adams CC, Shih R, Peterson PG, Lee MH, Heltzel DA, Lattin GE. The Impact of a Virtual Radiology Medical Student
Rotation: Maintaining Engagement During COVID-19 Mitigation. Mil Med. Volume 186, Issue 1-2, January-February 2021:
e234–e240.
3. Akers A, Blough C, Iyer MS. COVID-19 Implications on Clinical Clerkships and the Residency Application Process for
Medical Students. Cureus. 2020. 12(4): e7800.
4. Asaad, M. Glassman G, Allam O. Virtual Rotations During COVID-19: An Opportunity for Enhancing Diversity. J Surg Res.
2021 260: 516-519.
5. Ayala A, Ukeje C. There Is No Place Like Home: Rethinking Away Rotations. Acad Med. 2020. 95(11): e5.
6. Boyd CJ, Inglesby DC, Corey B. Impact of COVID-19 on Away Rotations in Surgical Fields. J Surg Res. 2020. 255: 96-98.
7. Byrnes YM, Civantos AM, Go BC, McWilliams TL, Rajasekaran K. Effect of the COVID-19 pandemic on medical student
career perceptions: a national survey study. Med Educ Online. 2020. 25(1): 1798088.
8. Dean RA, Reghunathan M, Hauch A, Reid CM, Gosman AA, Lance SH. Establishing a Virtual Curriculum for Surgical
Subinternships. Plast Reconstruc Surg. 2020 146(4): 525e-527e.
9. DeAtkine AB, Grayson JW, Singh NP, Nocera AP, Rais-Bahrami S, Greene BJ. #ENT: Otolaryngology Residency Programs
Create Social Media Platforms to Connect With Applicants During COVID-19 Pandemic. Ear Nose Throat J. 2020.
145561320983205.
10. Everett AS, Strickler S, Marcrom SR, McDonald AM. Students’ Perspectives and Concerns for the 2020 to 2021
Radiation Oncology Interview Season. Adv Radiat Oncol. 2021. 6(1): 100554.
11. Farlow JL, Marchiano EJ, Fischer IP, Moyer JS, Thorne MC, Bohm LA. Addressing the Impact of COVID-19 on the
Residency Application Process Through a Virtual Subinternship. Otolaryngology Head Neck Surg. 2020 163(5): 926-928.
12. Franco I, Oladeru OT, Saraf A, et al. Improving Diversity and Inclusion in the Post-Coronavirus Disease 2019 Era Through
a Radiation Oncology Intensive Shadowing Experience (RISE). Adv Radiat Oncol. 2021. 6(1): 100566.
13. Gabrielson AT, Kohn JR, Sparks HT, Clifton M, Kohn T. Proposed Changes to the 2021 Residency Application Process in
the Wake of COVID-19. Acad Med. 2020. 95(9): 1346-1349.
14. Goldenberg MN, Hersh DC, Wilkins KM, Schwartz ML. Suspending Medical Student Clerkships Due to COVID-19. Med
Sci Educat. 2020. June 3. 1-4.
15. Hanson KA, Borofsky MS, Hampson LA, et al. Capturing the Perspective of Prospective Urology Applicants: Impacts
of COVID-19 on Medical Education. Urology. 2020. 146: 36-42.
16. Hayes JR, Johnston B, Lundh R. Building a Successful, Socially-Distanced Family Medicine Clerkship in the COVID
Crisis. PRiMER (Leawood, Kan.) 2020. 4: 34.
17. Iancu AM, Kemp MT, Alam HB. Unmuting Medical Students’ Education: Utilizing Telemedicine During the COVID-19
Pandemic and Beyond. J Med Internet Res. 2020. 22(7): e19667.
18. Jiang J, Key P, Deibert CM. Improving the Residency Program Virtual Open House Experience: A Survey of Urology
Applicants. Urology. 2020. 146: 1-3.
19. Kahn JM, Fields EM, Pollom E, et al. Increasing Medical Student Engagement Through Virtual Rotations in Radiation
Oncology. Adv Radiat Oncol. 2021. 6(1): 100538.
20. Kasle DA, Torabi SJ, Izreig S, Rahmati RW, Manes RP. COVID-19’s Impact on the 2020-2021 Resident Match: A Survey
of Otolaryngology Program Directors. Ann Otol Rhinol Laryngol. 2021. 3489420967045.
21. Katirji L, Smith L, Pelletier-Bui A, et al. Addressing Challenges in Obtaining Emergency Medicine Away Rotations and
Standardized Letters of Evaluation Due to COVID-19 Pandemic. West J Emerg Med. 2020. 21(3): 538-541.
22. Krawiec C, Myers A. Remote Assessment of Video-Recorded Oral Presentations Centered on a Virtual Case-Based
Module: A COVID-19 Feasibility Study. Cureus. 2020. 12(6): e8726.
23. Kronenfeld JP, Ryon EL, Kronenfeld DS, et al. Medical Student Education During COVID-19: Electronic Education Does
Not Decrease Examination Scores. Am Surg. 2020. Dec 29; 3134820983194.
24. Margolin EJ, Margolin EJ, Gordon RJ, Anderson CB, Badalato GM. Reimagining the Away Rotation: A 4-Week Virtual
Subinternship in Urology. J Surg Ed. 2021. Jan 20;S1931-7204(21)00008-8.
25. Murphy B. Match: Which specialties place most residents through SOAP. American Medical Association website.
https://www.ama-assn.org/residents-students/match/match-which-specialties-place-most-residents-through-
soap. Accessed June 22, 2021.
26. Muzumdar S, Grant-Kels, Feng H. Medical student dermatology rotations in the context of COVID-19. J Am Acad
Dermatol. 2020. 83(5): 1557-1558.
27. Nackers K, Becker A, Stewart K, Beamsley M, Aughenbaugh W, Chheda S. Patient care, public health, and a
pandemic: adapting educational experiences in the clinical years. FASEB bioAdvances. 2020.
28. Nagji A, Yilmaz Y, Zhang P, et al. Converting to Connect: A Rapid RE-AIM Evaluation of the Digital Conversion of a
Clerkship Curriculum in the Age of COVID-19. AEM education and training 2020. 4(4): 330-339.
29. National Resident Matching Program. Main residency match data and reports. https://www.nrmp.org/main-
residency-match-data/. Accessed June 22, 2021.
30. Nnamani Silva ON, Hernandez S, Kim AS, et al. Where Do We Go From Here? Assessing Medical Students’ Surgery
Clerkship Preparedness During COVID-19. J Surg Ed. 2021. Jan 16;S1931-7204(21)00010-6
31. Nnamani Silva ON, Hernandez S, Kim EH, et al. Surgery Clerkship Curriculum Changes at an Academic Institution
during the COVID-19 Pandemic. J Surg Ed. 2021. 78(1): 327-331.
32. Ooi R, Ooi SZY. The role of virtual sub-internships in influencing career perceptions: an international medical
graduate perspective. Med Ed Online. 2020. 25(1): 1821463.
33. Patel PM, Tsui CL, Aakaash V, Levitt J. Remote learning for medical student-level dermatology during the COVID-19
pandemic. J Am Acad Dermatol. 2020. 83(6): e469-e470.
34. Patel V, Nolan IT, Morrison SD, Fosnot J. Visiting Subinternships in Wake of the COVID-19 Crisis: An Opportunity for
Improvement. Ann Plast Surg. 2020. 85(2S Suppl 2): S153-S154.
35. Pelletier-Bui A, Franzen D, Smith L, et al. COVID-19: A Driver for Disruptive Innovation of the Emergency Medicine
Residency Application Process. West J Emerg Med. 2020. 21(5): 1105-1113.
36. Peterseim C, Watson KH. Family Medicine Telehealth Clinic With Medical Students. PRiMER (Leawood, Kan.). 2020. 4: 35.
37. Pollom EL, Sandhu N, Frank J, et al. Continuing Medical Student Education During the Coronavirus Disease 2019
(COVID-19) Pandemic: Development of a Virtual Radiation Oncology Clerkship. Adv Radiat Oncol. 2020. 5(4): 732-736.
38. Rajesh A, Asaad M. Alternative Strategies for Evaluating General Surgery Residency Applicants and an Interview Limit
for MATCH 2021: An Impending Necessity. Ann Surg. 2021. 273(1): 109-111.
39. Richardson MA, Islam W, Magruder M. The Evolving Impact of COVID-19 on Medical Student Orthopedic Education:
Perspectives From Medical Students in Dierent Phases of the Curriculum. Geriatr Orthop Surg Rehabil. 2020. 11:
2151459320951721.
40. Ruthberg JS, Quereshy HA, Ahmadmehrabi S, et al. A Multimodal Multi-institutional Solution to Remote Medical Student
Education for Otolaryngology During COVID-19. Otolaryngol Head Neck Surg. 2020. 163(4): 707-709.
41. Samueli B, Sror N, Jotkowitz A, Taragin B. Remote pathology education during the COVID-19 era: Crisis converted to
opportunity. Ann Diagn Pathol. 2020. 49: 151612.
42. Sandhu N, Frank J, von Eyben R, et al. Virtual Radiation Oncology Clerkship During the COVID-19 Pandemic and Beyond.
Int J Radiat Oncol Biol Physics. 2020. 108(2): 444-451.
43. Shin TH, Klingler M, Han A, et al. Efficacy of Virtual Case-Based General Surgery Clerkship Curriculum During COVID-19
Distancing. Med Sci Educ. 2020: 1-8.
44. Smith E, Boscak A. A virtual emergency: learning lessons from remote medical student education during the COVID-19
pandemic. Emerg Radiol. 2021.
45. Vollbrecht PJ, Porter-Stransky KA, Lackey-Cornelison WL. Lessons learned while creating an effective emergency
remote learning environment for students during the COVID-19 pandemic. Adv Physiol Educ. 2020. 44(4): 722-725.
46. Weber AM, Dua A, Chang K, et al. An outpatient telehealth elective for displaced clinical learners during the COVID-19
pandemic. BMC Med Educ. 2021. 21(1): 174.
47. Wendt S, Abdullah Z, Barrett S, et al. A virtual COVID-19 ophthalmology rotation. Surv Ophthal. 2021. 66(2): 354-361.
48. Williams C, Familusi OO, Ziemba J, et al. Adapting to the Educational Challenges of a Pandemic: Development of a
Novel Virtual Urology Subinternship During the Time of COVID-19. Urology. 2021. 148: 70-76.
49. Xu L, Ambinder D, Kang J, et al. Virtual grand rounds as a novel means for applicants and programs to connect in the
era of COVID-19. Am J Surg. 2020. Sep 2.
Recommendation 3:
The U.S. Centers for Medicare and Medicaid Services (CMS) should change the current GME funding structure so
that the Initial Residency Period (IRP) is calculated starting with the second year of postgraduate training. This will
allow career choice reconsideration, leading to improved resident wellbeing and positive effects on the physician
workforce.
Narrative description of recommendation:
Given the timing of the residency recruiting season and the Match, students have limited time to definitively
establish their specialty choice. If a resident decides to switch to another program or specialty after beginning
training, the hospital may not receive full funding due to the IRP and thus be far less likely to approve such a
change. The knowledge that residents usually only have one chance to choose a specialty path increases the
pressure on the entire UME-GME transition. Furthermore, educational innovation is limited without flexibility for
time-variable training.
How this recommendation links to the fishbone diagrams used to develop the ideal state:
• Needs of society not prioritized
Pros:
• Use of data-driven selection characteristics for residency that help to address ongoing medical workforce issues.
Cons:
• Research will reveal population-level predictors of practice patterns but may impact individual students whose interests don’t fit the broader trends.
Relevant examples from the literature (if applicable):
1. Association of American Medical Colleges. The Complexities of Physician Supply and Demand: Projections From
2018 to 2033. June 2020. https://www.aamc.org/media/45976/download. Accessed June 2, 2021.
2. Zhang X, Lin D, Pforsich H, Lin VW. Physician workforce in the United States of America: forecasting nationwide
shortages. Hum Resour Health. 2020 Feb 6;18(1):8
3. O’Connell TF, Ham SA, Hart TG, Curlin FA, Yoon JD. A National Longitudinal Survey of Medical Students’ Intentions to
Practice Among the Underserved. Acad Med. 2018 Jan;93(1):90-97.
4. Goodfellow A, Ulloa JG, Dowling PT, et al. Predictors of Primary Care Physician Practice Location in Underserved
Urban or Rural Areas in the United States: A Systematic Literature Review. Acad Med. 2016 Sep;91(9):1313-21.
5. Gatell VI, Nguyen T, Anderson EE, McCarthy MP, Hardt JJ. Characteristics of Medical Students Planning to Work in
Medically Underserved Settings. J Health Care Poor Underserved. 2017;28(4):1409-1422.
6. Rabinowitz HK, Petterson S, Boulger JG, Hunsaker ML, Diamond JJ, Markham FW, Bazemore A, Phillips RL. Medical
school rural programs: a comparison with international medical graduates in addressing state-level rural family
physician and primary care supply. Acad Med. 2012 Apr;87(4):488-92.
Specic examples on how this recommendation might be implemented:
Convene a research group with representation from key stakeholder organizations and the public, charged to
support and conduct research aimed at determining which applicant characteristics (e.g., degree, demographic,
experiences, academic metrics, etc.) are most likely to result in physicians who fulfill the needs of the public in
terms of medical specialty shortages, ethnic diversity, geographic distribution, and other important needs. The
group would need cooperation from all key stakeholder organizations, including access to any existing data.
Recommendation 4:
Specialty-specific salutary practices for recruitment to increase diversity across the educational continuum should
be developed and disseminated to program directors, residency programs, and institutions.
Narrative description of recommendation:
Recognizing that program directors, residency programs, and institutions have wide variability in goals, definitions,
and community needs for increasing diversity, shared resources should be made available for mission-aligned
entities, with specialty-specific contributions including successful strategies and ongoing challenges. This
recommendation is intended for specialty organizations to perform workforce evaluations and specifically
address diversity, equity, and inclusion (DEI) associated with specialty-specific disparities in recruitment.
Pros:
• Specialty societies absorbing the burden of compiling information to contribute to best practices
• Shared resources for common challenges
• Shared successful recruitment strategies
• Innovation
Cons:
• Resources and funding needed to support programs may be limited
• Guidelines and recommendations unavailable for implementation of diversity dashboards
Specific examples aligned with the overall thematic recommendation:
Specialty organizations would provide best practices for recruiting for diversity to provide guidelines for program
directors, programs, and institutions.
Organizations and stakeholder groups that could deploy this change:
• Specialty organizations
• American Association of Colleges of Osteopathic Medicine
• Accreditation Council for Graduate Medical Education
Research questions:
1. Are there specific specialties with challenges for recruiting for diversity that require more targeted resources?
2. What resources are available from specialty organizations for recruiting for diversity?
3. Are there existing or recommended diversity dashboards for program directors, programs, and institutions that
may be helpful to disseminate for targeted recruitment programs?
4. Is the implementation of a specialty-specific approach to recruiting for diversity more impactful than overall
diversity efforts?
Citations:
1. Abraham HN, Opara IN, Dwaihy RL, Acuff C, Brauer B, Nabaty R, Levine DL. Engaging Third-Year Medical Students
on Their Internal Medicine Clerkship in Telehealth During COVID-19. Cureus. 2020. 12(6): e8791.
2. Adams CC, Shih R, Peterson PG, Lee MH, Heltzel DA, Lattin GE. The Impact of a Virtual Radiology Medical Student
Rotation: Maintaining Engagement During COVID-19 Mitigation. Mil Med. Volume 186, Issue 1-2, January-February
2021: e234–e240.
3. Akers A, Blough C, Iyer MS. COVID-19 Implications on Clinical Clerkships and the Residency Application Process for
Medical Students. Cureus. 2020. 12(4): e7800.
4. Asaad, M. Glassman G, Allam O. Virtual Rotations During COVID-19: An Opportunity for Enhancing Diversity. J Surg
Res. 2021 260: 516-519.
5. Ayala A, Ukeje C. There Is No Place Like Home: Rethinking Away Rotations. Acad Med. 2020. 95(11): e5.
6. Boyd CJ, Inglesby DC, Corey B. Impact of COVID-19 on Away Rotations in Surgical Fields. J Surg Res. 2020. 255: 96-98.
7. Byrnes YM, Civantos AM, Go BC, McWilliams TL, Rajasekaran K. Effect of the COVID-19 pandemic on medical student
career perceptions: a national survey study. Med Educ Online. 2020. 25(1): 1798088.
8. Dean RA, Reghunathan M, Hauch A, Reid CM, Gosman AA, Lance SH. Establishing a Virtual Curriculum for Surgical
Subinternships. Plast Reconstruc Surg. 2020 146(4): 525e-527e.
9. DeAtkine AB, Grayson JW, Singh NP, Nocera AP, Rais-Bahrami S, Greene BJ. #ENT: Otolaryngology Residency Programs
Create Social Media Platforms to Connect With Applicants During COVID-19 Pandemic. Ear Nose Throat J. 2020.
145561320983205.
10. Everett AS, Strickler S, Marcrom SR, McDonald AM. Students’ Perspectives and Concerns for the 2020 to 2021
Radiation Oncology Interview Season. Adv Radiat Oncol. 2021. 6(1): 100554.
11. Farlow JL, Marchiano EJ, Fischer IP, Moyer JS, Thorne MC, Bohm LA. Addressing the Impact of COVID-19 on the
Residency Application Process Through a Virtual Subinternship. Otolaryngology Head Neck Surg. 2020 163(5): 926-928.
12. Franco I, Oladeru OT, Saraf A, et al. Improving Diversity and Inclusion in the Post-Coronavirus Disease 2019 Era Through
a Radiation Oncology Intensive Shadowing Experience (RISE). Adv Radiat Oncol. 2021. 6(1): 100566.
13. Gabrielson AT, Kohn JR, Sparks HT, Clifton M, Kohn T. Proposed Changes to the 2021 Residency Application Process in
the Wake of COVID-19. Acad Med. 2020. 95(9): 1346-1349.
14. Goldenberg MN, Hersh DC, Wilkins KM, Schwartz ML. Suspending Medical Student Clerkships Due to COVID-19. Med Sci
Educat. 2020. June 3. 1-4.
15. Hanson KA, Borofsky MS, Hampson LA, et al. Capturing the Perspective of Prospective Urology Applicants: Impacts of
COVID-19 on Medical Education. Urology. 2020. 146: 36-42.
16. Hayes JR, Johnston B, Lundh R. Building a Successful, Socially-Distanced Family Medicine Clerkship in the COVID Crisis.
PRiMER (Leawood, Kan.) 2020. 4: 34.
17. Iancu AM, Kemp MT, Alam HB. Unmuting Medical Students’ Education: Utilizing Telemedicine During the COVID-19
Pandemic and Beyond. J Med Internet Res. 2020. 22(7): e19667.
18. Jiang J, Key P, Deibert CM. Improving the Residency Program Virtual Open House Experience: A Survey of Urology
Applicants. Urology. 2020. 146: 1-3.
19. Kahn JM, Fields EM, Pollom E, et al. Increasing Medical Student Engagement Through Virtual Rotations in Radiation
Oncology. Adv Radiat Oncol. 2021. 6(1): 100538.
20. Kasle DA, Torabi SJ, Izreig S, Rahmati RW, Manes RP. COVID-19’s Impact on the 2020-2021 Resident Match: A Survey of
Otolaryngology Program Directors. Ann Otol Rhinol Laryngol. 2021. 3489420967045.
21. Katirji L, Smith L, Pelletier-Bui A, et al. Addressing Challenges in Obtaining Emergency Medicine Away Rotations and
Standardized Letters of Evaluation Due to COVID-19 Pandemic. West J Emerg Med. 2020. 21(3): 538-541.
22. Krawiec C, Myers A. Remote Assessment of Video-Recorded Oral Presentations Centered on a Virtual Case-Based
Module: A COVID-19 Feasibility Study. Cureus. 2020. 12(6): e8726.
23. Kronenfeld JP, Ryon EL, Kronenfeld DS, et al. Medical Student Education During COVID-19: Electronic Education Does
Not Decrease Examination Scores. Am Surg. 2020. Dec 29; 3134820983194.
24. Margolin EJ, Margolin EJ, Gordon RJ, Anderson CB, Badalato GM. Reimagining the Away Rotation: A 4-Week Virtual
Subinternship in Urology. J Surg Ed. 2021. Jan 20;S1931-7204(21)00008-8.
25. Muzumdar S, Grant-Kels, Feng H. Medical student dermatology rotations in the context of COVID-19. J Am Acad
Dermatol. 2020. 83(5): 1557-1558.
26. Nackers K, Becker A, Stewart K, Beamsley M, Aughenbaugh W, Chheda S. Patient care, public health, and a
pandemic: adapting educational experiences in the clinical years. FASEB bioAdvances. 2020.
27. Nagji A, Yilmaz Y, Zhang P, et al. Converting to Connect: A Rapid RE-AIM Evaluation of the Digital Conversion of a Clerkship
Curriculum in the Age of COVID-19. AEM education and training 2020. 4(4): 330-339.
28. Nnamani Silva ON, Hernandez S, Kim AS, et al. Where Do We Go From Here? Assessing Medical Students’ Surgery
Clerkship Preparedness During COVID-19. J Surg Ed. 2021. Jan 16;S1931-7204(21)00010-6
29. Nnamani Silva ON, Hernandez S, Kim EH, et al. Surgery Clerkship Curriculum Changes at an Academic Institution during
the COVID-19 Pandemic. J Surg Ed. 2021. 78(1): 327-331.
30. Ooi R, Ooi SZY. The role of virtual sub-internships in influencing career perceptions: an international medical graduate
perspective. Med Ed Online. 2020. 25(1): 1821463.
31. Patel PM, Tsui CL, Aakaash V, Levitt J. Remote learning for medical student-level dermatology during the COVID-19
pandemic. J Am Acad Dermatol. 2020. 83(6): e469-e470.
32. Patel V, Nolan IT, Morrison SD, Fosnot J. Visiting Subinternships in Wake of the COVID-19 Crisis: An Opportunity for
Improvement. Ann Plast Surg. 2020. 85(2S Suppl 2): S153-S154.
33. Pelletier-Bui A, Franzen D, Smith L, et al. COVID-19: A Driver for Disruptive Innovation of the Emergency Medicine
Residency Application Process. West J Emerg Med. 2020. 21(5): 1105-1113.
34. Peterseim C, Watson KH. Family Medicine Telehealth Clinic With Medical Students. PRiMER (Leawood, Kan.). 2020. 4: 35.
35. Pollom EL, Sandhu N, Frank J, et al. Continuing Medical Student Education During the Coronavirus Disease 2019
(COVID-19) Pandemic: Development of a Virtual Radiation Oncology Clerkship. Adv Radiat Oncol. 2020. 5(4): 732-736.
36. Rajesh A, Asaad M. Alternative Strategies for Evaluating General Surgery Residency Applicants and an Interview Limit
for MATCH 2021: An Impending Necessity. Ann Surg. 2021. 273(1): 109-111.
37. Richardson MA, Islam W, Magruder M. The Evolving Impact of COVID-19 on Medical Student Orthopedic Education:
Perspectives From Medical Students in Dierent Phases of the Curriculum. Geriatr Orthop Surg Rehabil. 2020. 11:
2151459320951721.
38. Ruthberg JS, Quereshy HA, Ahmadmehrabi S, et al. A Multimodal Multi-institutional Solution to Remote Medical Student
Education for Otolaryngology During COVID-19. Otolaryngol Head Neck Surg. 2020. 163(4): 707-709.
39. Samueli B, Sror N, Jotkowitz A, Taragin B. Remote pathology education during the COVID-19 era: Crisis converted to
opportunity. Ann Diagn Pathol. 2020. 49: 151612.
40. Sandhu N, Frank J, von Eyben R, et al. Virtual Radiation Oncology Clerkship During the COVID-19 Pandemic and Beyond.
Int J Radiat Oncol Biol Physics. 2020. 108(2): 444-451.
41. Shin TH, Klingler M, Han A, et al. Efficacy of Virtual Case-Based General Surgery Clerkship Curriculum During COVID-19
Distancing. Med Sci Educ. 2020: 1-8.
42. Smith E, Boscak A. A virtual emergency: learning lessons from remote medical student education during the COVID-19
pandemic. Emerg Radiol. 2021.
43. Vollbrecht PJ, Porter-Stransky KA, Lackey-Cornelison WL. Lessons learned while creating an effective emergency
remote learning environment for students during the COVID-19 pandemic. Adv Physiol Educ. 2020. 44(4): 722-725.
44. Weber AM, Dua A, Chang K, et al. An outpatient telehealth elective for displaced clinical learners during the COVID-19
pandemic. BMC Med Educ. 2021. 21(1): 174.
45. Wendt S, Abdullah Z, Barrett S, et al. A virtual COVID-19 ophthalmology rotation. Surv Ophthal. 2021. 66(2): 354-361.
46. Williams C, Familusi OO, Ziemba J, et al. Adapting to the Educational Challenges of a Pandemic: Development of a
Novel Virtual Urology Subinternship During the Time of COVID-19. Urology. 2021. 148: 70-76.
47. Xu L, Ambinder D, Kang J, et al. Virtual grand rounds as a novel means for applicants and programs to connect in the
era of COVID-19. Am J Surg. 2020. Sep 2.
Recommendation 5:
Members of the medical educational continuum must receive continuing professional development regarding anti-racism,
avoiding bias, and ensuring equity. Principles of equitable recruitment, mentorship and advising, teaching, and assessment
should be included.
Narrative description of recommendation:
Inclusive excellence requires avoiding bias and improving racial equity; these are essential skills for today's teaching faculty. Many physicians lack these skills, which perpetuates health disparities, a lack of diversity, and learner mistreatment. The ACGME Common Program Requirements already include specific, applicable requirements. This recommendation reinforces the importance of addressing diversity, equity, and inclusion for all members of the educational community, including residents beginning at orientation. Doing so will ultimately promote belonging, eliminate bias, and provide social support.
This recommendation creates the ideal state for the UME-GME transition because:
In the ideal state for the UME-GME transition, residency faculty and peers will recognize and mitigate bias to ensure optimal
entrustment and support for all learners in an inclusive environment. This training will help address entrenched inequities
in medical training, with particular focus on developing support networks for those underrepresented in medicine. The
application of anti-racism and bias mitigation throughout the UME-GME transition will help improve the diversity of our
future physician workforce, which is also important in furthering the public good. Creating welcoming and inclusive environments for all residents requires intentional effort by the institution. Such environments are not created by accident; explicitly training the entire organization is a first step toward sustaining an inclusive environment for all.
How this recommendation links to the fishbone diagrams used to develop the ideal state:
An overarching thread throughout all of the root problems in the post-match period is the need to address diversity, equity, and inclusion. In particular, opportunities identified in the fishbone exercise included improving bystander training,
creation of a representative and inclusive community, and the application of an equity lens throughout the entire UME-GME
transition period.
Implementation “must haves” include:
Eective training for faculty in UME and GME programs, including how anti-racism strategies and bias mitigation can be
applied to recruitment, mentorship, advising, teaching, and assessment.
Orientation early in GME to the community of faculty, sta, and learners as well as the patient population served by the
trainees.
Implementation “nice to haves” include:
• Feedback on faculty performance
• Evolution of training programs to reflect best practices
Pros:
• Improvement in the learning environment through development of inclusive practices
• Serves the public good through improved health equity and development of a representative workforce
• Better retention and promotion of medical trainees underrepresented in medicine
• Health equity improvement
• Changing the system can start with educational programs

Cons:
• Cost of implementation of training modules for faculty
• Time required for training and questions of how to build accountability into training
Research questions:
1. What is the impact of robust anti-racism and bias mitigation faculty training on inclusiveness of the learning
environment, including implicit and explicit microaggressions and learner experiences?
2. How does introduction of anti-racism and bias mitigation faculty training affect recruitment, retention, and promotion
of trainees underrepresented in medicine?
Citations:
1. Acosta, D. and K. Ackerman-Barger (2017). “Breaking the Silence: Time to Talk About Race and Racism.” Academic
Medicine 92(3): 285-288.
2. Argueza, B. R., et al. (2021). “From Diversity and Inclusion to Antiracism in Medical Training Institutions.” Academic
Medicine
3. Benoit, L. J., et al. (2020). “Toward a Bias-Free and Inclusive Medical Curriculum: Development and Implementation of
Student-Initiated Guidelines and Monitoring Mechanisms at One Institution.” Academic Medicine 95(12S Addressing
Harmful Bias and Eliminating Discrimination in Health Professions Learning Environments): S145-S149.
4. Castillo, E. G., et al. (2020). “Reconsidering Systems-Based Practice: Advancing Structural Competency, Health Equity,
and Social Responsibility in Graduate Medical Education.” Academic Medicine: 1817-1822.
5. Davis, D. L. F., et al. (2021). “Start the Way You Want to Finish: An Intensive Diversity, Equity, Inclusion Orientation
Curriculum in Undergraduate Medical Education.” Journal of Medical Education and Curricular Development 8: 23821205211000352.
6. Diaz, T., et al. (2020). “An Institutional Approach to Fostering Inclusion and Addressing Racial Bias: Implications for
Diversity in Academic Medicine.” Teaching and learning in medicine 32(1): 110-116.
7. Edgoose, J., et al. (2021). “Teaching About Racism in Medical Education: A Mixed-Method Analysis of a Train-the-Trainer
Faculty Development Workshop.” Family medicine 53(1): 23-31.
8. Hassen, N., et al. (2021). “Implementing anti-racism interventions in healthcare settings: A scoping review.” International
journal of environmental research and public health 18(6): 1-15.
9. Sotto-Santiago, S., et al. (2020). “”I Didn’t Know What to Say”: Responding to Racism, Discrimination, and Microaggressions with the OWTFD Approach.” MedEdPORTAL: The Journal of Teaching and Learning Resources 16: 10971.
10. Wingard, D., et al. (2019). “Faculty Equity, Diversity, Culture and Climate Change in Academic Medicine: A Longitudinal
Study.” Journal of the National Medical Association 111(1): 46-53.
Recommendation 6:
Create an interactive database with verifiable GME program/track information and make it available to all applicants, medical schools, and residency programs at no cost to applicants. This will include aggregate characteristics of individuals who previously applied to, interviewed at, were ranked by, and matched into each GME program/track.
Narrative description of recommendation:
Veriable and trustworthy GME program/track information should be developed and made available in an easily
accessible database to all applicants. Information for the database should be directly collected and sources should
be transparent. Each programs interviewed or ranked applicants reflect the programs desired characteristics more
accurately than the small proportion of applicants the program matches. Data must be searchable and allow for data
analytics to assist with program decision making (e.g., allowing applicants and their advisors to input components of their
individual application to identify programs/tracks with similar current residents). Applicants and advisors should be able to
sort the information according to demographic and educational features that may signicantly impact the likelihood of
matching at a program (e.g., geography, scores, degree, visa status, etc.). This database would also provide information on
the characteristics of individuals who previously applied to and matched into various specialties.
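To make the intent concrete, the following is a minimal, hypothetical sketch (Python with pandas) of the kind of applicant-facing filtering such a database could support. The table schema, field names, and thresholds are illustrative assumptions only, not a specification of any existing service.

    import pandas as pd

    # Hypothetical program/track table; field names and values are illustrative only.
    programs = pd.DataFrame([
        {"program": "Program A", "region": "Midwest", "sponsors_visa": True,
         "median_step2": 245, "pct_do_matched": 0.30, "pct_img_matched": 0.20},
        {"program": "Program B", "region": "Northeast", "sponsors_visa": False,
         "median_step2": 252, "pct_do_matched": 0.05, "pct_img_matched": 0.02},
        {"program": "Program C", "region": "South", "sponsors_visa": True,
         "median_step2": 238, "pct_do_matched": 0.45, "pct_img_matched": 0.35},
    ])

    def candidate_programs(df, needs_visa, degree, step2_score, window=10):
        """Return programs whose previously ranked/matched cohorts resemble this applicant."""
        out = df.copy()
        if needs_visa:
            out = out[out["sponsors_visa"]]
        if degree == "DO":
            out = out[out["pct_do_matched"] > 0.10]    # cohorts with meaningful DO representation
        elif degree == "IMG":
            out = out[out["pct_img_matched"] > 0.10]
        # keep programs whose historical median score falls within a window of the applicant's score
        out = out[(out["median_step2"] - step2_score).abs() <= window]
        return out.sort_values("median_step2")

    print(candidate_programs(programs, needs_visa=True, degree="IMG", step2_score=240))

A production system would draw these fields from verified sources and expose the same logic through a searchable interface rather than code.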
This recommendation creates the ideal state for the UME-GME transition because:
This technology will allow applicants to identify what they want in a program, whether that means a specific program,
program experiences, attributes, or something else. Additionally, all stakeholders will be committed to the inclusion of
students, schools, programs, and the public in the design, evaluation, and continual improvement of the system. Applicants
and advisers will have the information necessary to target applications toward specialties and programs where they are
most likely to be considered and be successful, potentially decreasing application inflation. Automatic reporting on program
selection data and universal availability for all applicants will promote trust and transparency within the system.
How this recommendation links to the fishbone diagrams used to develop the ideal state:
Lack of reliable, program-specific selection information.
The system lacks transparency and visibility of program requirements for away or audition rotations, including whether these filters differ from interview/residency application filters.
The system lacks an easily accessible database for programmatic information that is accurate and comparable and includes the program details desired by an applicant, including the filters used by the program to determine student eligibility.
Trustworthy data to inform advising is lacking.
• Available data is not always meaningful.
The system for how programs select candidates for interviews and ranking is opaque.
Implementation “must haves” include:
• Automatic reporting of selection data, not dependent on programs or applicants
• Allow comparison of applicants who apply with those who are interviewed/ranked and match for context
• Single source for all information important to applicants, including program-generated (such as mission and training site information) and applicant-generated (such as reviews)
• Available to all applicants without additional cost
Implementation “nice to haves” include:
• Stakeholder oversight board for continuous quality improvement and feedback to advisers, applicants, and programs
Pros:
• Applicants and their advisers will have reliable information about the type of applicant who receives an interview with each program. Because this information is automatically generated from the rank order list or interview invitation list, it will be transparent and reliable. Applicants will be able to apply more selectively to programs that meet their educational objectives.
• Programs will have a better understanding of whom they interview through the real-time use of dashboards. This could raise awareness of potential bias and help improve the equity of the interview process.
• The desirable features of other databases, such as program aims from the programs and crowd-sourced input from applicants, should be incorporated into one platform.
• Student well-being is supported by one source of truthful programmatic information.
• Programs show their aspirations with the rank/interview invitation list instead of just who has decided to attend their program.

Cons:
• Small and selective programs are unable to be included because of short rank lists and the potential disclosure of individual data. This could be mitigated by including batched data over several years.
• Programs could rank applicants who are not interviewed or otherwise game the system.
• Programs with few applicants from certain groups may continue to struggle to increase diversity (self-fulfilling prophecy).
• Organizations that hold residency program data may be unwilling to share it with the database.
Relevant examples from the literature (if applicable):
1. FREIDA. American Medical Association. https://freida.ama-assn.org/. Accessed June 25, 2021.
2. Residency Navigator. Doximity. https://residency.doximity.com/. Accessed June 25, 2021.
3. Accreditation Council for Graduate Medical Education Institution/Program Finder. https://apps.acgme.org/ads/public/.
Accessed June 25, 2021.
4. Residency Explorer. Hosted by Association of American Medical Colleges but sponsored by several medical education
organizations. https://www.residencyexplorer.org. Accessed June 25, 2021.
5. Scutwork. Student Doctor Network. https://www.scutwork.com/. Accessed June 25, 2021.
6. Student Doctor Network. https://www.studentdoctor.net/. Accessed June 25, 2021.
7. Reddit and other social media sites including specialty-specific crowdsourced spreadsheets such as this one for
general surgery: https://docs.google.com/spreadsheets/d/1TZ31hgTNSNTVFrd5YQy8rL8Byy9HParaaCKlmyO0KlU/
edit#gid=107922705. Accessed June 25, 2021.
Specic examples on how this recommendation might be implemented:
One existing repository that approximates the characteristics of programs’ interviewees is the “Rank Order List”
submitted by programs to the National Resident Matching Program. Aggregated characteristics from the Electronic
Residency Application Service® (ERAS) of deidentied students on this list, potentially pooled over several years, would
approximate a programs desired applicant qualities. The unranked ID numbers of individuals appearing on this list linked
to the information on their ERAS applications would create aggregated characteristics (potentially pooled over several
years) that approximate those of the individuals interviewed by that program.
If the National Resident Matching Program is not able to coordinate de-identified rank order list information sharing, the
Electronic Residency Application Service® has a feature to select for interview and to rank applicants. Use of this
feature could be encouraged through visual dashboards that demonstrate to programs how their selection
process aects groups of interest. Statistics that will later be made publicly available could be visibly apparent on the
dashboard for programs to monitor.
A third way to collect this information is to contract with private interview scheduling vendors to automatically create
a list for each program.
A database utilizing significant applicant input and an oversight body of diverse stakeholders (including program directors from different types of programs) can be developed. The oversight body will be responsible for continual quality review and improvement based on stakeholder needs (i.e., incorporating applicant reviews post-interview, adding specialty-specific procedure data, ensuring confidentiality of aggregate data, etc.).
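As a rough illustration of the aggregation step described above, the sketch below (Python with pandas, synthetic data) pools de-identified rank-order-list entries over several years, joins them to application characteristics, and suppresses aggregates built from small cells. All column names, values, and the suppression threshold are assumptions for illustration only.

    import pandas as pd

    # Synthetic, de-identified rank-order-list entries (one row per ranked applicant per year).
    rank_lists = pd.DataFrame({
        "program":      ["A", "A", "A", "B", "B", "A", "B", "A"],
        "year":         [2019, 2019, 2020, 2019, 2020, 2021, 2021, 2021],
        "applicant_id": [1, 2, 3, 4, 5, 6, 7, 8],
    })

    # Synthetic application characteristics keyed to the same de-identified IDs.
    applications = pd.DataFrame({
        "applicant_id": [1, 2, 3, 4, 5, 6, 7, 8],
        "degree": ["MD", "DO", "IMG", "MD", "MD", "DO", "IMG", "MD"],
        "step2":  [238, 245, 250, 255, 260, 242, 248, 251],
    })

    MIN_CELL = 5  # suppress aggregates built from fewer ranked applicants than this

    pooled = rank_lists.merge(applications, on="applicant_id")
    summary = (pooled.groupby("program")
                     .agg(n_ranked=("applicant_id", "size"),
                          median_step2=("step2", "median"),
                          pct_do=("degree", lambda s: (s == "DO").mean()),
                          pct_img=("degree", lambda s: (s == "IMG").mean())))

    # Small-cell suppression protects individuals at small or highly selective programs.
    small = summary["n_ranked"] < MIN_CELL
    summary.loc[small, ["median_step2", "pct_do", "pct_img"]] = float("nan")
    print(summary)

The same pattern would extend to additional characteristics and to interview-invitation lists, with the threshold and pooling window set by the oversight body.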
Research questions:
1. To what extent are offers to interview not accepted by applicants? This information may be available from the Electronic Residency Application Service® or from proprietary scheduling software. Are these offers of interest to applicants? To what extent are interviewees not listed in the rank order list?
2. Because each year the vast majority of individuals go through the UME-GME transition only once, it is difficult to
know how well new resources are used. However, Residency Explorer should experience an increase in number
of users, user satisfaction, adviser satisfaction, and the time spent by each user using this tool. Trust in outside
resources should decrease.
3. Does the database improve residency recruitment of desired candidates?
4. Does a database with verifiable information affect programmatic match rates?
5. Does a database with verifiable information, including general details about program candidates interviewed and accepted, decrease the number of applications per student or applications received by a residency program? How does increased program transparency affect applicant well-being and stress during the application process?
Citations:
1. Rowley BD. AMA—Fellowship and Residency Electronic Interactive Database Access (AMA-FREIDA): A Computerized
Residency Selection Tool. JAMA. 1988;260(8):1059.
2. Embi PJ, Desai S, Cooney TG. Use and utility of Web-based residency program information: a survey of residency
applicants. J Med Internet Res. 2003;5(3):e22.
Recommendation 7:
Evidence-informed, general career advising resources should be available for all medical school faculty and staff career
advisors, both domestic and international. All students should have free access to a single, comprehensive electronic
professional development career planning resource, which provides universally accessible, reliable, up-to-date, and
trustworthy information and guidance. General career advising should focus on students’ professional development;
inclusive practices such as valuing diversity, equity, and belonging; clinical and alternate career pathways; and meeting
the needs of the public. Specialty-specific match advising should focus on the individual student obtaining an optimal
match.
Narrative description of recommendation:
Centralized advising resources, developed in collaboration with specialty societies, should reflect a common core, with
supplemental information as needed, and be evidence-informed and data-driven. This will fill an information gap and
increase the transparency and reliability of information shared with students. Resources should support the unique needs
of traditionally underrepresented, disadvantaged, and marginalized student groups. Guidance contained in the resources
can support faculty in managing or eliminating conflicts of interest related to recruiting students to the specialty, advising
for the Match, and advocating for students in the Match. Advising tools should incorporate strengths-based approaches
to career selection. The resources should include the option of non-clinical careers without stigma. Three areas of focus
are envisioned: basic advising information, general career advising, and specialty-specific match advising.
Clear and accurate information regarding clinical and nonclinical career choices should be available for all students.
The AAMC’s Careers in Medicine (CiM) platform achieves some of the aims of this recommendation. The strengths
and limitations of CiM should be examined, expanding the content and broadening access to this resource, including
to all students (U.S. MD, U.S. DO, IMG) at no cost throughout their medical school training, or at a minimum, at key career
decision-making points, in order to support students’ professional development. The public good can be prioritized within
this resource with content emphasis on workforce strategies to address the needs of the public, including specialty
selection and practice location as well as alternative nonclinical career choices. Links to specialty-specific medical
student advising resources should also be incorporated.
Basic advising information should be created for all faculty and staff who interact with students to promote common understanding of career advising, professional development, specialty selection, and application procedures; introduce the role of specialty-specific advisors as distinct from other faculty teachers; and minimize sharing outdated or incorrect information with students. General career advising should be differentiated from specialty-specific match advising or specialty recruiting. General career advisors require expertise in career advising; incorporate strengths-based approaches to career selection including the option of nonclinical careers without stigma; focus on professional development; value diversity, equity, and belonging; incorporate the needs of the public; and introduce the role of specialty-specific match advisors. Specialty-specific match advisors should undergo a training process created as part of this resource development that includes equity in advising and mitigation of bias.
This recommendation creates the ideal state for the UME-GME transition because:
The culture of career advising will be inclusive, trustworthy, non-judgmental, and equitable for all students. Advising tools
will be high quality, interactive, honest, and readily available. Both UME and GME will recognize career indecision as a
normal part of professional formation and allow flexibility for undecided learners at key transition points including non-
standard timelines as necessary. Students will be supported by both UME and GME and use trustworthy, data-driven
resources to seek specialties based on a holistic assessment of fit that allows them to be aspirational about their
ambitions while being pragmatic about the possibilities. Students will be informed about the workforce needs of society.
Students and advisers will avoid contributing to a culture of competition.
How this recommendation links to the fishbone diagrams used to develop the ideal state:
Student advising—lack of trustworthy data to inform advising
Stakeholders—public as stakeholder is undervalued
Culture—culture is competitive. More transparent sharing of information can help the culture to be more open and
inclusive.
Lack of alignment—advising and stakeholder needs
• Inadequate adviser preparation
Implementation “must haves” include:
• Career planning electronic resources must be available at no cost to students.
• Advising resources must be evidence-informed and data driven.
GME program directors and specialty societies create specialty-specific medical student advising resources to link to
from general advising resources.
• A process for regular updates to materials must be developed and implemented.
• The information on non-clinical careers must be useful and non-stigmatizing.
Implementation “nice to haves” include:
This resource may be linked to existing data sources on disease burden, health disparities, and public health to show the
potential public good impact of specialty selection and practice location.
• Some information on non-physician careers should be included.
There may be a need for buy-in from Council of Deans/Board of Deans, regarding recognizing the value of advising
through faculty time allocation and in promotions processes.
• Consider creating a certification process for those who complete training.
Pros:
• Can enhance consistency and trustworthiness of advising information
• Opportunity to strengthen attention to students' professional and personal development as part of career advising
• Opportunity to align advising with workforce needs and the needs of the public
• Single source of information for advisers
• Supports the professional development of students
• Increases quality, consistency, transparency, and reliability of advising
• Promotes evidence-informed, student-centered advising
• Advising materials attentive to issues of equity and minimizing bias can support faculty in learner-focused advising and minimize faculty conflicts of interest.
• Opportunity to focus on specialty selection as separate from recruitment to a field
• Advising about non-clinical careers decreases pressure on schools to match all learners
• Advising about non-clinical careers is a cost-effective strategy to address students' changing priorities and encourage student self-actualization

Cons:
• Ownership by one organization may be controversial or promote a specific organization's lens
• One-size-fits-all may constrain the depth of information/scope needed to address needs across all MD, DO, and international schools and clinical/non-clinical careers
• Advising about non-clinical careers could limit the physician workforce (minimally)
Relevant examples from the literature (if applicable):
1. General: Careers in Medicine®
2. Specialty specic:
Anesthesiology: https://www.asahq.org/education-and-career/asa-medical-student-component/guide-to-a-
career-in-anesthesiology
Emergency medicine: https://www.cordem.org/resources/professional-development/ascem/
Family medicine: https://www.aafp.org/students-residents/medical-students/explore-career-in-family-
medicine/why-choose-family-medicine.html
Internal medicine: https://www.acponline.org/membership/medical-students/residency
Neurosurgery: https://www.aans.org/Trainees/Medical-Students
OB/GYN: https://www.acog.org/career-support/medical-students/medical-student-toolkit
Ophthalmology: https://www.aao.org/medical-students
Otolaryngology: https://www.cordem.org/resources/professional-development/ascem/
Neurology: https://www.aan.com/tools-and-resources/medical-students/how-to-apply-for-residency/
Pathology: https://www.cap.org/member-resources/residents/cap-for-medical-students
Psychiatry: https://www.psychiatry.org/residents-medical-students/medical-students/apply-for-psychiatric-
residency
Pediatrics: https://services.aap.org/en/career-resources/medical-students/
Surgery: https://www.facs.org/Education/Resources/Residency-Search
3. Johns Hopkins University School of Medicine Colleges Advisory Program: Faculty development on content, skills
incorporated. https://www.hopkinsmedicine.org/som/education-programs/md-program/our-students/colleges-
advisory.html
4. Frosch E, Goldstein M. Relationship-centered advising in a medical school learning community. J Med Educ Curric Dev.
2019; 6. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6434435/.
5. The University of Washington Medical School Colleges System
6. Council of Residency Directors in Emergency Medicine, Advising Up: A Guide for Medical School Deans Regarding the
Emergency Medicine Applicant. https://www.cordem.org/globalassets/files/committees/student-advising/2020-
updates/asc-em-advising-up.pdf.
7. American Academy of Family Physicians Web resource, Advising Medical Students on Medical School and Career. https://
www.aafp.org/students-residents/premed-medical-students-educators/advising-medical-students.html.
8. Bumsted T, Schneider BN, Deiorio NM. Considerations for Medical Students and Advisors After an Unsuccessful Match.
Acad Med. July 2017 - Volume 92 - Issue 7 - p 918-922.
9. Association of American Medical Colleges. Settings and Environments. https://www.aamc.org/cim/career/
alternativecareers/.
10. Jobs for Physicians Without Residency. Non Clinical Doctors. http://www.nonclinicaldoctors.com/careers-for-physicians-
without-residency.html
Specic examples on how this recommendation might be implemented:
Convene a group of representatives of UME faculty and/or sta advisers and leaders, GME residency program
educators, individuals with physician workforce expertise, students, and residents to review the Careers in Medicine®
(CiM) platform to identify its strengths and limitations and to review other resources commonly used by students for
career planning to understand how CiM could be improved. Diversity, equity, and inclusion concerns and goals should
be included in this analysis, with attention to ensuring the needs of the public good. This same group should review the
professional development opportunities for medical school career advisers.
Based upon the above analysis, resources should be revised, updated, expanded, developed, and placed within one
centralized resource. A plan for periodic review is needed to ensure the materials remain current. Recommendations for
standardized and/or expanded adviser training and professional development should also be developed.
This resource could be incorporated in and linked to existing data sources on disease burden, health disparities, and
public health to allow students to gauge the potential public good impact of specialty selection and practice location.
To ensure a process for universal access by all medical school career advisers (domestic and international), key
stakeholders should be convened to establish a fiscal strategy for the long-term support of this shared resource.
UME and GME educators in each specialty, either in collaboration with or separate from specialty societies and/or
boards, could be convened to identify and build consensus on the foundational content for specialty-specific advising to be understood and delivered by advisers. GME program directors would be engaged to identify specialty-specific
medical student advising resources to link to the general resource.
Associations that provide application, assessment, and Match services would be partners to ensure trusted, updated
information. A plan for periodically (i.e., annually) updating information would be needed to ensure advising is data-
informed and maintains relevance to the needs of the public.
Research questions:
1. How and when do students use a career advising resource?
2. What value do students perceive from a career advising resource, and how does it change their career planning
behavior?
3. Do faculty development participants transfer the knowledge, skills, and attitudes into their general career advising
practice and/or specialty-specific advising practice?
4. What benefits and limitations do faculty identify from participating in faculty development about general career advising and/or specialty-specific advising?
5. What is the landscape of alternative career pathways for medical school graduates who choose not to practice
clinical medicine?
6. How prevalent is not practicing clinical medicine among medical school graduates versus other professions (law,
dentistry, veterinary medicine, nursing)?
Citations:
1. Patel S, Ahmed R, Rosenbaum BP, Rodgers SM. Career guidance and the Web: bridging the gap between the AAMC
Careers in Medicine Web site and Local Career Guidance Programs. Teach Learn Med. 2008;20:230-4.
2. Harris JA. McKay DW. Evaluation of medical-career counseling resources across Canada. Teach Learn Med.
2012;24:29-35.
3. Byerley J. Tilly A. A Simple Pyramid Model for Career Guidance. J Grad Med Educ. 2018;10:497-9.
4. Association of American Medical Colleges Group on Student Affairs. GSA Professional Development Initiative. https://www.aamc.org/professional-development/affinity-groups/gsa/professional-development-initiative. Accessed May
26, 2021.
5. Hillman E, Lutfy-Clayton L, Desai S, Kellogg A, Zhang XC, Hu K, Hess J. Student-Advising Recommendations from the
Council of Residency Directors Student Advising Task Force. West J Emerg Med. 2017;18:93-6.
6. Woods SK, Burgess L, Kaminetzky C, McNeill D, Pinheiro S, Heflin MT. Defining the roles of advisors and mentors in
postgraduate medical education: faculty perceptions, roles, responsibilities, and resource needs. JGME. 2010;2:195-200.
Recommendation 8:
Educators should develop a best practice curriculum for UME career advising.
Narrative description of recommendation:
Guidelines are needed to inform U.S. MD, U.S. DO, and international medical schools in developing their career advising
programs. Standardized approaches to advising along with career advisor preparation (both general and specialty-
specic) can enhance the quality, equity, and quantity of advising and improve student trust in the advice. Educators
can improve medical student career advising by developing formal guidelines with key recommendations based upon
professional development frameworks and competencies. Implementation of such guidelines will result in greater
consistency, thoroughness, effectiveness, standardization, and equity of medical school career advising programs to
better support students in making career decisions and will lay the foundation for career planning across the continuum.
This recommendation creates the ideal state for the UME-GME transition because:
Medical schools will use a structured approach to career advising that begins early, is based on professional development
frameworks and competencies, is integrated within an educational program, provides broad exposure to both clinical
specialties and alternative career paths, supports early opportunities for exploration, and educates medical students to
consider the school’s social accountability mandate and public good.
How this recommendation links to the fishbone diagrams used to develop the ideal state:
Student Advising—lack of alignment of advising and stakeholder needs (i.e., advising not aligned with patient and
population health needs)
Inadequate advisor preparation—lack of current advising resources
Culture—lack of trust in other stakeholders
Implementation “must haves” include:
The best practice curriculum for undergraduate medical education career advising will be available online along with
supporting resources allowing access by medical school faculty in the U.S. and internationally.
This curriculum will be aligned with other key career advising resources such as the Association of American Medical
Colleges’ Careers in Medicine® online tool.
This curriculum will be created with attention to fairness, equity, and public good.
Medical school faculty and other advisors will be equipped with the skills, resources, training, and time to implement the
recommendations of this curriculum.
Implementation “nice to haves” include:
The best practice curriculum for undergraduate medical education career advising will be available at no cost to
individual faculty and its creation and distribution will be supported by institutions and organizations that comprise the
Coalition for Physician Accountability.
Pros:
• Can improve quality and consistency of advising for students across schools.
• Allows for broad focus of career advising to include but not be limited to the GME application cycle.

Cons:
• May be resource intensive to develop and implement into curricula.
• Schools could feel constrained regarding ability to focus on desired school-specific missions. A one-size-fits-all approach may not be the best choice for this recommendation to be successful.
Relevant examples from the literature (if applicable):
1. Howse K, Harris J, Dalgarno N. Canadian National Guidelines and Recommendations for Integrating Career Advising Into
Medical School Curricula. Acad Med. 2017; 92:1543-8.
2. Navarro AM, Taylor AD, Pokorny, AP. Three innovative curricula for addressing medical students’ career development.
Acad Med. 2011; 86:72-76.
3. Cooke M, Irby DM, O’Brien BC. Educating Physicians: A Call for Reform of Medical School and Residency. 1st edition. San
Francisco, CA: Jossey-Bass; 2010.
4. Welch B, Spooner JJ, Tanzer K, Dintzner MR. Design and Implementation of a Professional Development Course Series. Am J Pharm Educ. 2017 Dec; 81(10): 6394.
Specic examples on how this recommendation might be implemented:
A task force comprised of representatives from UME, GME, medical students, and the public could be convened
to develop consensus guidelines and recommendations for career advising in medical schools. Ideally the
recommendations will address the factors that have been associated with effectively integrated career advising
programs: structured, timely, standardized, resourced, and based on professional development frameworks and
competencies. Diversity, equity, inclusion, fairness, and public good should be included in their charge.
National guidelines and recommendations would be published in a key academic medical journal or through stakeholder publications, as well as presented at national medical education meetings.
The content of ongoing training opportunities (e.g., Association of American Medical Colleges Careers in Medicine®) could
be reviewed to ensure content alignment with the recommendations and guidelines.
Research questions:
1. What are the facilitators and barriers to medical schools implementing guidelines for professional development and
career advising?
2. Does student participation in a guideline-informed career advising curriculum improve student satisfaction with support
for and outcomes of their own career decision making?
Citations:
1. Howse K, Harris J, Dalgarno N. Canadian National Guidelines and Recommendations for Integrating Career Advising Into
Medical School Curricula. Acad Med. 2017; 92:1543-8.
2. Zink BJ, Hammond MM, Middleton E, Moroney D, Schigelone A. A Comprehensive medical school career development
program improves medical student satisfaction with career planning. Teach Learn Med. 2007; 19:55-60.
3. Sweeney KR, Fritz RA, Rodgers SM. Careers in medicine at Vanderbilt University School of Medicine: an innovative
approach to specialty exploration and selection. Acad Med. 2012; 87:942-8.
Recommendation 9:
UME and GME educators, along with representatives of the full educational continuum, should jointly define and
implement a common framework and set of outcomes (competencies) to apply to learners across the UME-GME
transition.
Narrative description of recommendation:
A shared mental model of competence facilitates agreement on assessment strategies used to evaluate a
learner’s progress, and the inferences that can be drawn from assessments. Shared outcomes language can
convey information on learner competence with the patient/public trust in mind. For individual learners, defining these outcomes will facilitate learning and may promote a growth mindset. For faculty, defining outcomes will allow for the use of assessment tools aligned with performance expectations and faculty development. For residency programs, defining outcomes will be useful for resident selection and learner handovers from UME,
resident training, and resident preparation for practice.
This recommendation creates the ideal state for the UME-GME transition because:
An equitable, coordinated, efficient, and transparent system across the UME-GME transition will support each
learner’s growth, evidence-informed specialty selection, achievement of competence, and wellness. There
will be a shared mental model of competency across the continuum. This could entail a standardized set of
general competencies and specialty-focused competencies in certain domains (for example, patient care and
medical knowledge). Professionalism of students will be accurately and transparently reported to future program
administrators. Educators will define those competencies that programs believe, and data support, are the best predictors of a student's abilities to succeed.
How this recommendation links to the fishbone diagrams used to develop the ideal state:
Student advising—better data on learner competence allows better advice.
Assessment tools and strategies—harmonized mental model of outcomes allows development of appropriate
tools and more standardized faculty development in their use.
Culture—builds a culture of trust and valuing of medical education. If properly defined, desired attributes would include elements of professionalism so that there is less ambiguity in defining unprofessional behavior.
Definition of competence—more precise definition of competence allows more clarity for schools, for advisers,
for curriculum development, and for the Accreditation Council for Graduate Medical Education.
Match system—the more robust data available for program directors and presumably a parallel in terms of
robust program data available to learners should result in fewer applications as they can be more targeted.
Stakeholders—the competencies can become elements in the Electronic Residency Application Service (ERAS®)
through which they can be searched and filters constructed.
Implementation “must haves” include:
The language of outcomes must be shared from UME to GME.
Accountability must be ensured through adoption by accreditors (Liaison Committee on Medical Education,
Commission on Osteopathic College Accreditation, Accreditation Council for Graduate Medical Education).
Implementation “nice to haves” include:
Shared outcomes language may be expanded so that it is applicable to fellowship training and practice.
Pros:
• Transparency and consistency of expectations for students, faculty, schools, and residency programs
• Strengthens attention to the UME-GME continuum
• Facilitates shared assessment tools and strategies

Cons:
• Schools could feel constrained regarding ability to focus on desired school-specific outcomes
• Challenging for international schools to adopt
• Likely to require extended time to develop based on similar prior efforts; could slow efforts at other changes for the UME-GME transition
Relevant examples from the literature (if applicable):
American Board of Pediatrics/Accreditation Council for Graduate Medical Education pediatrics milestones
project, with progression from novice to expert
Transitional year milestones
Association of American Medical Colleges Core Entrustable Professional Activities for Entering Residency
Physician Competency Reference Set
Accreditation Council for Graduate Medical Education competencies and harmonized milestones
Consensus Statement on a Framework for Professional Competence by the Coalition for Physician Accountability
https://physicianaccountability.org/wp-content/uploads/2020/05/Coalition-Competencies-Consensus-Statement-
FINAL.pdf
Specic examples on how this recommendation might be implemented:
1. Convene a group of educators and public representatives to dene the general competencies that are relevant
to all of medicine.
2. After #1 is done, convene groups representing each discipline (medicine, surgery, pediatrics, etc.) to use the
general competencies to define specialty-specific competencies necessary at the UME-GME transition.
3. In collaboration with accrediting bodies (Liaison Committee on Medical Education, Commission on Osteopathic
College Accreditation, Accreditation Council for Graduate Medical Education), develop mechanisms to ensure
the use of the outcomes language across the education continuum.
Research questions:
1. What are facilitators and barriers to UME and GME programs incorporating shared outcomes language into
their curricula?
2. What are student or resident-sensitive quality measures that capture performance on the competency-based
outcomes?
Citations:
1. Frank JR, Snell LS, ten Cate O, et al. Competency-based medical education: theory to practice. Med Teach.
2010;32:638-45.
2. Englander R, Frank JR, Carraccio C, et al. Toward a shared language for competency-based medical education.
Med Teach. 2017;39:582-7.
3. McConville JF, Woodruff JN. A shared evaluation platform for medical training. N Engl J Med. 2021;384:491-3.
Recommendation 10:
To eliminate systemic biases in grading, medical schools must perform initial and annual exploratory reviews
of clinical clerkship grading, including patterns of grade distribution based on race, ethnicity, gender identity/
expression, sexual identity/orientation, religion, visa status, ability, and location (e.g., satellite or clinical site location),
and perform regular faculty development to mitigate bias. Programs across the UME-GME continuum should
explore the impact of bias on student and resident evaluations, match results, attrition, and selection to honor
societies.
Narrative description of recommendation:
Recognizing that inherent biases exist in clinical grading and assessment in the clinical learning environment, each
UME and GME program must have a continuous quality improvement process for evaluating bias in clinical grading
and assessment and the implications of these biases, including honor society selection. This recommendation is
intended to mitigate bias in clinical grading, transcript notations, MSPE reflections of remediation, and residency
evaluations. This recommendation is not intended to create requirements for reporting race, ethnicity, gender
identity, sexual identity, religion, or ability of learners, as data analysis must be limited to data readily available to each school.
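As one concrete illustration (not a prescribed method), the sketch below uses Python with pandas and SciPy to run the kind of exploratory check an annual review might include: a chi-square test of independence between group membership and clerkship grade distribution. The data are synthetic and the group labels are placeholders; a real review would use the school's own readily available data and appropriate statistical and equity expertise.

    import pandas as pd
    from scipy.stats import chi2_contingency

    # Synthetic clerkship grades for two illustrative groups of students.
    grades = pd.DataFrame({
        "group": ["A"] * 60 + ["B"] * 60,
        "clerkship_grade": (["Honors"] * 25 + ["High Pass"] * 20 + ["Pass"] * 15 +
                            ["Honors"] * 12 + ["High Pass"] * 22 + ["Pass"] * 26),
    })

    # Cross-tabulate grade distribution by group and test for independence.
    table = pd.crosstab(grades["group"], grades["clerkship_grade"])
    chi2, p, dof, expected = chi2_contingency(table)

    print(table)
    print(f"chi-square = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
    # A small p-value flags a grade distribution that differs across groups and warrants
    # closer review; it does not by itself establish the cause of the difference.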
Pros:
• Shared resources for methods of assessment of bias in clinical grading and existing faculty development to eliminate bias
• Eliminating bias in clinical grading
• Equitable access for students and residents to honor society selection
• Increased awareness and reporting of inherent biases in clinical grading

Cons:
• Need for exploratory resources and funding to examine extent and impact
• Variations in assessments limiting collaboration and shared approaches, e.g., tiered grading, pass/fail, milestones across the UME-GME continuum
• Variations in institutions' selection processes for honor societies
Specic examples on how this recommendation might be implemented:
1. Each medical school will perform a validated analysis of clinical grading and implement targeted faculty
development to eliminate bias in the clinical learning environment.
2. UME and GME programs will explore the impact of bias at their institutions on student and resident evaluations,
match results, attrition, remediation processes, and student or resident selection for honor societies.
Research questions:
1. What are the established methods for examining bias in clinical grading by race, ethnicity and gender?
Citations:
1. Abraham HN, Opara IN, Dwaihy RL, Acuff C, Brauer B, Nabaty R, Levine DL. Engaging Third-Year Medical Students
on Their Internal Medicine Clerkship in Telehealth During COVID-19. Cureus. 2020. 12(6): e8791.
2. Adams CC, Shih R, Peterson PG, Lee MH, Heltzel DA, Lattin GE. The Impact of a Virtual Radiology Medical Student
Rotation: Maintaining Engagement During COVID-19 Mitigation. Mil Med. Volume 186, Issue 1-2, January-February
2021: e234–e240.
3. Akers A, Blough C, Iyer MS. COVID-19 Implications on Clinical Clerkships and the Residency Application Process for
Medical Students. Cureus. 2020. 12(4): e7800.
4. Asaad M, Glassman G, Allam O. Virtual Rotations During COVID-19: An Opportunity for Enhancing Diversity. J Surg Res. 2021. 260: 516-519.
5. Ayala A, Ukeje C. There Is No Place Like Home: Rethinking Away Rotations. Acad Med. 2020. 95(11): e5.
6. Boyd CJ, Inglesby DC, Corey B. Impact of COVID-19 on Away Rotations in Surgical Fields. J Surg Res. 2020. 255: 96-98.
7. Byrnes YM, Civantos AM, Go BC, McWilliams TL, Rajasekaran K. Effect of the COVID-19 pandemic on medical student
career perceptions: a national survey study. Med Educ Online. 2020. 25(1): 1798088.
8. Dean RA, Reghunathan M, Hauch A, Reid CM, Gosman AA, Lance SH. Establishing a Virtual Curriculum for Surgical
Subinternships. Plast Reconstruc Surg. 2020 146(4): 525e-527e.
9. DeAtkine AB, Grayson JW, Singh NP, Nocera AP, Rais-Bahrami S, Greene BJ. #ENT: Otolaryngology Residency
Programs Create Social Media Platforms to Connect With Applicants During COVID-19 Pandemic. Ear Nose Throat
J. 2020. 145561320983205.
10. Everett AS, Strickler S, Marcrom SR, McDonald AM. Students’ Perspectives and Concerns for the 2020 to 2021
Radiation Oncology Interview Season. Adv Radiat Oncol. 2021. 6(1): 100554.
11. Farlow JL, Marchiano EJ, Fischer IP, Moyer JS, Thorne MC, Bohm LA. Addressing the Impact of COVID-19 on the
Residency Application Process Through a Virtual Subinternship. Otolaryngology Head Neck Surg. 2020 163(5): 926-
928.
12. Franco I, Oladeru OT, Saraf A, et al. Improving Diversity and Inclusion in the Post-Coronavirus Disease 2019 Era
Through a Radiation Oncology Intensive Shadowing Experience (RISE). Adv Radiat Oncol. 2021. 6(1): 100566.
13. Gabrielson AT, Kohn JR, Sparks HT, Clifton M, Kohn T. Proposed Changes to the 2021 Residency Application Process
in the Wake of COVID-19. Acad Med. 2020. 95(9): 1346-1349.
14. Goldenberg MN, Hersh DC, Wilkins KM, Schwartz ML. Suspending Medical Student Clerkships Due to COVID-19. Med
Sci Educat. 2020. June 3. 1-4.
15. Hanson KA, Borofsky MS, Hampson LA, et al. Capturing the Perspective of Prospective Urology Applicants: Impacts
of COVID-19 on Medical Education. Urology. 2020. 146: 36-42.
16. Hayes JR, Johnston B, Lundh R. Building a Successful, Socially-Distanced Family Medicine Clerkship in the COVID
Crisis. PRiMER (Leawood, Kan.) 2020. 4: 34.
17. Iancu AM, Kemp MT, Alam HB. Unmuting Medical Students’ Education: Utilizing Telemedicine During the COVID-19
Pandemic and Beyond. J Med Internet Res. 2020. 22(7): e19667.
18. Jiang J, Key P, Deibert CM. Improving the Residency Program Virtual Open House Experience: A Survey of Urology
Applicants. Urology. 2020. 146: 1-3.
19. Kahn JM, Fields EM, Pollom E, et al. Increasing Medical Student Engagement Through Virtual Rotations in Radiation
Oncology. Adv Radiat Oncol. 2021. 6(1): 100538.
20. Kasle DA, Torabi SJ, Izreig S, Rahmati RW, Manes RP. COVID-19’s Impact on the 2020-2021 Resident Match: A Survey
of Otolaryngology Program Directors. Ann Otol Rhinol Laryngol. 2021. 3489420967045.
21. Katirji L, Smith L, Pelletier-Bui A, et al. Addressing Challenges in Obtaining Emergency Medicine Away Rotations and
Standardized Letters of Evaluation Due to COVID-19 Pandemic. West J Emerg Med. 2020. 21(3): 538-541.
22. Krawiec C, Myers A. Remote Assessment of Video-Recorded Oral Presentations Centered on a Virtual Case-
Based Module: A COVID-19 Feasibility Study. Cureus. 2020. 12(6): e8726.
23. Kronenfeld JP, Ryon EL, Kronenfeld DS, et al. Medical Student Education During COVID-19: Electronic Education
Does Not Decrease Examination Scores. Am Surg. 2020. Dec 29; 3134820983194.
24. Margolin EJ, Gordon RJ, Anderson CB, Badalato GM. Reimagining the Away Rotation: A 4-Week Virtual Subinternship in Urology. J Surg Ed. 2021. Jan 20;S1931-7204(21)00008-8.
25. Muzumdar S, Grant-Kels JM, Feng H. Medical student dermatology rotations in the context of COVID-19. J Am Acad Dermatol. 2020. 83(5): 1557-1558.
26. Nackers K, Becker A, Stewart K, Beamsley M, Aughenbaugh W, Chheda S. Patient care, public health, and a pandemic:
adapting educational experiences in the clinical years. FASEB bioAdvances. 2020.
27. Nagji A, Yilmaz Y, Zhang P, et al. Converting to Connect: A Rapid RE-AIM Evaluation of the Digital Conversion of a
Clerkship Curriculum in the Age of COVID-19. AEM education and training 2020. 4(4): 330-339.
28. Nnamani Silva ON, Hernandez S, Kim AS, et al. Where Do We Go From Here? Assessing Medical Students’ Surgery
Clerkship Preparedness During COVID-19. J Surg Ed. 2021. Jan 16;S1931-7204(21)00010-6
29. Nnamani Silva ON, Hernandez S, Kim EH, et al. Surgery Clerkship Curriculum Changes at an Academic Institution during
the COVID-19 Pandemic. J Surg Ed. 2021. 78(1): 327-331.
30. Ooi R, Ooi SZY. The role of virtual sub-internships in influencing career perceptions: an international medical graduate
perspective. Med Ed Online. 2020. 25(1): 1821463.
31. Patel PM, Tsui CL, Aakaash V, Levitt J. Remote learning for medical student-level dermatology during the COVID-19
pandemic. J Am Acad Dermatol. 2020. 83(6): e469-e470.
32. Patel V, Nolan IT, Morrison SD, Fosnot J. Visiting Subinternships in Wake of the COVID-19 Crisis: An Opportunity for
Improvement. Ann Plast Surg. 2020. 85(2S Suppl 2): S153-S154.
33. Pelletier-Bui A, Franzen D, Smith L, et al. COVID-19: A Driver for Disruptive Innovation of the Emergency Medicine
Residency Application Process. West J Emerg Med. 2020. 21(5): 1105-1113.
34. Peterseim C, Watson KH. Family Medicine Telehealth Clinic With Medical Students. PRiMER (Leawood, Kan.). 2020. 4: 35.
35. Pollom EL, Sandhu N, Frank J, et al. Continuing Medical Student Education During the Coronavirus Disease 2019
(COVID-19) Pandemic: Development of a Virtual Radiation Oncology Clerkship. Adv Radiat Oncol. 2020. 5(4): 732-736.
36. Rajesh A, Asaad M. Alternative Strategies for Evaluating General Surgery Residency Applicants and an Interview Limit
for MATCH 2021: An Impending Necessity. Ann Surg. 2021. 273(1): 109-111.
37. Richardson MA, Islam W, Magruder M. The Evolving Impact of COVID-19 on Medical Student Orthopedic Education:
Perspectives From Medical Students in Different Phases of the Curriculum. Geriatr Orthop Surg Rehabil. 2020. 11:
2151459320951721.
38. Ruthberg JS, Quereshy HA, Ahmadmehrabi S, et al. A Multimodal Multi-institutional Solution to Remote Medical Student
Education for Otolaryngology During COVID-19. Otolaryngol Head Neck Surg. 2020. 163(4): 707-709.
39. Samueli B, Sror N, Jotkowitz A, Taragin B. Remote pathology education during the COVID-19 era: Crisis converted to
opportunity. Ann Diagn Pathol. 2020. 49: 151612.
40. Sandhu N, Frank J, von Eyben R, et al. Virtual Radiation Oncology Clerkship During the COVID-19 Pandemic and Beyond.
Int J Radiat Oncol Biol Physics. 2020. 108(2): 444-451.
41. Shin TH, Klingler M, Han A, et al. Efficacy of Virtual Case-Based General Surgery Clerkship Curriculum During COVID-19
Distancing. Med Sci Educ. 2020: 1-8.
42. Smith E, Boscak A. A virtual emergency: learning lessons from remote medical student education during the COVID-19
pandemic. Emerg Radiol. 2021.
43. Vollbrecht PJ, Porter-Stransky KA, Lackey-Cornelison WL. Lessons learned while creating an effective emergency
remote learning environment for students during the COVID-19 pandemic. Adv Physiol Educ. 2020. 44(4): 722-725.
44. Weber AM, Dua A, Chang K, et al. An outpatient telehealth elective for displaced clinical learners during the COVID-19
pandemic. BMC Med Educ. 2021. 21(1): 174.
45. Wendt S, Abdullah Z, Barrett S, et al. A virtual COVID-19 ophthalmology rotation. Surv Ophthal. 2021. 66(2): 354-361.
46. Williams C, Familusi OO, Ziemba J, et al. Adapting to the Educational Challenges of a Pandemic: Development of a
Novel Virtual Urology Subinternship During the Time of COVID-19. Urology. 2021. 148: 70-76.
47. Xu L, Ambinder D, Kang J, et al. Virtual grand rounds as a novel means for applicants and programs to connect in the
era of COVID-19. Am J Surg. 2020. Sep 2.
Recommendation 11:
The UME community, working in conjunction with partners across the continuum, must commit to using robust
assessment tools and strategies, improving upon existing tools, developing new tools where needed, and
gathering and reviewing additional evidence of validity.
Narrative description of recommendation:
Educators from across the education continuum should use shared competency outcomes language to guide
development or use of assessment tools and strategies that can be used across schools to generate credible,
equitable, value-added competency-based information. Assessment information should be shared in residency
applications and a post-match learner handover. Licensing examinations should be used for their intended
purpose to ensure requisite competence.
This recommendation creates the ideal state for the UME-GME transition because:
An equitable, coordinated, efficient, and transparent system across the UME to GME transition will provide
trustworthy documentation of competence across the continuum using reliable assessment tools that generate
meaningful information for learners, educators, and where appropriate, regulators. Graduated medical students
will be ready to serve as physicians in training. They will be facile with the appropriate knowledge, skills, and
attitudes and will be equipped with an advancing professional identity and a confident humility. They will be
prepared for the realities of residency and a lifelong career as well as trustworthy to practice under supervision,
asking for help when needed. Professionalism of students will be accurately and transparently reported to future
program administrators. Reliable and valid standardized assessment tools will document competence.
How this recommendation links to the fishbone diagrams used to develop the ideal state:
Assessment tools and strategies — lack of validity evidence for assessment tools and strategies; varied
approaches to assessment across schools; fostering of mistrust between UME and GME
Implementation “must haves” include:
Assessment tools must address multiple competencies needed for practice.
There needs to be a plan for study to gather validity evidence.
Attention to fairness, equity, and minimizing bias is critical.
Implementation “nice to haves” include:
The use of assessment tools within systems for programmatic assessment may need to be optimized.
There may be a need for a strategy to encourage longitudinal learner-educator relationships and some
continuity in education setting.
An evaluation of the usefulness and risks of a mechanism for aggregate assessment data feeding into a
database used for evaluating learner success in programs and practice may be necessary.
Pros:
• Improve the quality of assessment data to provide meaningful information in resident selection
• Use assessment tools to promote students' achievement of competence and readiness for GME
Cons:
• Excess focus on assessment data as well as ranking and sorting learners inhibits learning, heightens student stress
• Risk of drawing inferences from assessment data that do not predict performance in GME — based on small differences in performance that are not clinically meaningful
• Risk of focus on small differences in performance that are not educationally or clinically meaningful (non-significant differences, standard error of mean)
Relevant examples from the literature (if applicable):
Accreditation Council for Graduate Medical Education toolbox of assessment methods
CanMEDS assessment tools http://canmeds.royalcollege.ca/en/tools
Specic examples on how this recommendation might be implemented:
1. In GME, the example of harmonized milestones across all disciplines is a model encouraging consistent language
for outcomes across a broad range of programs and institutions.
2. Where possible, use of existing assessment tools that have evidence of validity, which could come from the
Accreditation Council for Graduate Medical Education or programs, is recommended rather than creating
new tools. With the discontinuation of the United States Medical Licensing Examination® Step 2 Clinical Skills
examination, the National Board of Medical Examiners may be able to work with schools to improve the
reliability and standards of school-based Objective Structured Clinical Examinations and other simulations.
3. Convene a group of education and assessment leaders to ensure that implementation of shared assessments
would be achievable for schools; beneficial, fair, and equitable for students; and helpful to program directors at
the UME to GME transition.
Research questions:
1. What are facilitators and barriers to implementation of recommended assessment tools in UME and GME
programs?
2. Are recommended assessment tools perceived by program directors and residency selection committees as
useful in the resident selection process?
3. What is the validity evidence for assessments of performance measured with any of the assessment tools?
Citations:
1. Boursicot K, Kemp S, Wilkinson T, et al. Performance assessment: Consensus statement and recommendations from the 2020 Ottawa Conference. Med Teach. 2021;43:58-67.
2. McConville JF, Woodruff JN. A shared evaluation platform for medical training. N Engl J Med. 2021;384:491-3.
3. Van der Vleuten et al. A model for programmatic assessment fit for purpose. Med Teach. 2012;34:205-14.
4. Lockyer J, Carraccio C, Chan MK, et al. Core principles of assessment in competency-based medical education. Med Teach. 2017;39:609-16.
Recommendation 12:
Using the shared mental model of competency and assessment tools and strategies, create and implement
faculty development materials for incorporating competency-based expectations into teaching and assessment.
Narrative description of recommendation:
Faculty must understand the purpose of outcomes-focused education, specific language used to define
competence, and how to mitigate biases when assessing learners. They must understand the purpose and use
of each assessment tool. The intensity and depth of faculty development can be tailored to the amount and type
of contact that individual faculty have with students. Clerkship directors, academic progress committees, student
competency committee members, and other educational leaders require a more in-depth understanding of the
assessment system and how determinations of readiness for advancement are made. This faculty development
requires centralized electronic resources and training for trainers within institutions. Review of training materials,
and completion of any required activities to document review and/or understanding, should be required on a
regular basis.
This recommendation creates the ideal state for the UME-GME transition because:
An equitable, coordinated, efficient, and transparent system across the UME to GME transition will support each learner's growth, evidence-informed career and specialty selection, achievement of competence, and wellness. It also will provide trustworthy documentation of competence across the continuum using reliable assessment tools that generate meaningful information for learners, educators, and where appropriate, regulators. Faculty, learners, and the structure of the system will cultivate inclusive learning environments that foster a growth mindset. The medical education and health care systems will minimize the effects of racism and harmful bias. Faculty development will clarify expectations at each level of training, teach remediation strategies, and describe how patient safety is ensured (direct vs. indirect supervision, schedule variation, etc.).
How this recommendation links to the fishbone diagrams used to develop the ideal state:
Assessment tools and strategies—consistent, fair, and equitable use of assessment tools across programs in UME
and GME
Denition of competence—faculty assessment of learners is based on a shared outcomes language and
understanding of the purpose and use of assessment tools
Implementation “must haves” include:
Faculty development materials must be available electronically at MD, DO, and international schools.
Content addressing competency-based assessment, direct observation and feedback, purpose and use of assessment tools, minimizing bias, and promoting equity is essential.
A process needs to be developed and implemented to confirm faculty understanding, such as required mastery questions.
Implementation “nice to haves” include:
• Materials intended to train the trainer may be developed for participants to train additional local faculty.
• There may be a need for a specic plan for periodic review and updating of these resources.
Pros:
• Promotes appropriate use of a shared mental model and assessment tools and strategies
• Can promote consistent, fair, and equitable assessment
• Enhances shared language, expectations, and assessment approaches across the UME-GME transition
• Can standardize approaches across MD, DO, international schools
Cons:
• Training may be perceived to be too resource-intensive
• Faculty may dislike training on assessment or resent requirements for training
Relevant examples from the literature (if applicable):
Association of American Medical Colleges Core Entrustable Professional Activities Curriculum Developers’ Guide
and Faculty and Learners’ Guide
Accreditation Council for Graduate Medical Education Assessment Guidebook
Royal College of Physicians and Surgeons of Canada. Competence by Design. Faculty Development. https://www.
royalcollege.ca/rcsite/cbd/cbd-faculty-development-e
University of Virginia: Keeley M, Gusic M, Morgan H, et al. Moving Toward Summative Competency Assessment to Individualize the Post Clerkship Phase. Acad Med. December 2019;94:1858-1864.
Specic examples on how this recommendation might be implemented:
1. After the development of shared outcomes language and selection of assessment tools and strategies,
assessment experts from UME and GME convene to develop online modular faculty development materials
that would be available in MD, DO, and international schools.
2. Educators develop a plan to evaluate the eectiveness of the online faculty development materials.
3. Educators develop a plan to update the online faculty development materials periodically as needed.
Research questions:
1. Do faculty development participants transfer the knowledge, skills and attitudes into their assessment practice?
2. What benets and limitations do faculty identify from participating in faculty development about assessment?
3. What faculty training interventions help to mitigate bias and improve equity in assessment?
Citations:
1. Sirianni G, Takahashi SG, Myers J. Taking stock of what is known about faculty development in competency-
based medical education: A scoping review paper. Med Teach. 2020;42:909-15.
Recommendation 13:
Convene a workgroup to explore the multiple functions and value of away rotations for applicants, medical
schools, and residency programs. Specifically, consider the goals and utility of the experience, the impact of these rotations, and issues of equity including accessibility, assessment, and opportunity for students from groups underrepresented in medicine and financially disadvantaged students.
Narrative description of recommendation:
Away rotations can be cost prohibitive yet may allow a student to get to know a program, its health system, and surrounding community. Some programs are reliant on away rotations to showcase their unique strengths to attract candidates. Given the multifactorial and complex role that away rotations fulfill, a committee should be convened to conduct a thorough and comprehensive review of the costs versus benefits of away rotations, followed by recommendations from that review. Non-traditional methods of conducting and administering away rotations should be explored (e.g., offering virtual away rotations, waiving application fees, or offering away stipends, particularly for financially disadvantaged students). Questions explored by the workgroup should include:
The circumstances when a learner should complete an away rotation
How the learner's medical school offerings, or lack of offerings, should factor into the decision to complete away rotations
Identification of learners who would most benefit from away rotations despite cost
The probability that completion of away rotations will lead to a residency position at the program where the away rotation was completed
Whether there should be a limit on the number of away rotations, and under what circumstances
The cost of completing away rotations
Alternatives to away rotations
Student impact when home institutions cannot provide specific clinical experiences
This recommendation creates the ideal state for the UME-GME transition because:
Away electives will serve the purpose of broadening educational exposure and will not be essential for successful matching.
How this recommendation links to the fishbone diagrams used to develop the ideal state:
Applicants' clinical experience and knowledge of specialty are limited
Financial burden
Opportunity cost for time spent on application process
Specic examples on how this recommendation might be implemented:
Convene a group of stakeholders to review any data and explore away rotation benefits; the group should include the ACGME, specialty colleges, student organizations, residency program representation from all types of organizations, specialties, and program types (university-based, independent academic medical center, community, military), MD- and DO-granting medical schools, as well as ECFMG.
Research questions:
1. Did the lack of away rotations negatively impact any particular subset of medical students’ ability to match into
a desired program compared to previous years?
2. Did the lack of away rotations negatively impact any types of residency programs (community, university, etc.)
or residency specialties in matching their desired candidates when compared to previous years?
Citations:
1. Winterton M, Ahn J, Bernstein J. The prevalence and cost of medical student visiting rotations. BMC Med Educ.
2016;16(1):291. Published 2016.
2. Higgins E, Newman L, Halligan K, Miller M, Schwab S, Kosowicz L. Do audition electives impact match success?
Med Educ Online. 2016;21:31325. Published 2016 Jun 13.
Recommendation 14:
A convened group including UME and GME educators should reconsider the content and structure of the MSPE as
new information becomes available to improve access to longitudinal assessment data about applicants. Short-term
improvements should include structured data entry fields with functionality to enable searching.
Narrative description of recommendation:
The development of UME competency outcomes to apply across learners and the continuum is essential in
decreasing the reliance on board scores in the evaluation of the residency applicant. These will take time to develop
and implement and may be developed at different intervals. As new information becomes available to improve applicant data, the MSPE should be utilized to improve longitudinal applicant information. In addition, improvements in the MSPE, such as structured data entry fields with functionality to enable searching, should be explored.
This recommendation creates the ideal state for the UME-GME transition because:
A reconsidered Medical Student Performance Evaluation will provide trustworthy documentation of competence
across the continuum using reliable assessment tools that generate meaningful information for learners, educators,
and where appropriate, regulators. It will also create a foundation of trust, transparency, and reliability among
students, schools, programs, and the communities served. Applicants will be certified by their medical school as fully prepared, appropriate, and trustworthy for residency training. There will be social accountability and transparency for medical schools in the validity of this certification, and programs will have information regarding an applicant's
current competence, the trajectory of growth during medical school, and measurement accuracy.
How this recommendation links to the fishbone diagrams used to develop the ideal state:
• Lack of a shared mental model
• Varied approaches to assessment at schools
• Varied and insucient resources for assessment
Medical student reporting lacks consistent, comparable information from objective and universal reporting tools
leading to mistrust by residency programs.
• Lack of consistent data and resources for holistic review of applicants
Lack of metrics in multiple key competencies to compare candidates leading to program reliability on board scores
• Lack of trustworthy, validated, bidirectional information
• There are no consequences to UME for inaccuracy
Pros:
• Utilizing an established means of communication between UME/GME for any newly developed applicant outcome/competencies
• Consistent formatting of the Medical Student Performance Evaluation if fillable document is established
• Improved communication between UME and GME on candidate performance
Cons:
• Medical school noncompliance with the Medical Student Performance Evaluation
• Mistrust by programs of information provided
Relevant examples from the literature (if applicable):
1. Giang D. Medical Student Performance Evaluation (MSPE) 2017 Task Force Recommendations as Reflected in the
Format of 2018 MSPE. J Grad Med Educ. 2019;11(4):385-388.
2. Swide C, Lasater K, Dillman D. Perceived predictive value of the Medical Student Performance Evaluation (MSPE) in
anesthesiology resident selection. J Clin Anesth. 2009 Feb;21(1):38-43.
Specic examples on how this recommendation might be implemented:
• A group of stakeholders could be convened to develop and implement MSPE improvement.
Research questions:
1. Do Medical Student Performance Evaluation (MSPE) improvements increase residency program reliance on MSPE
information?
2. Is the MSPE an accurate representation of medical student performance?
Recommendation 15:
Structured Evaluative Letters (SELs) should replace all Letters of Recommendation (LORs) as a universal tool in the
residency program application process.
Narrative description of recommendation:
A Structured Evaluative Letter (SEL), which would include specialty-specific questions, would provide the evaluator's knowledge of directly observed student performance rather than a narrative recommendation. The
template should be based on an agreed upon set of core competencies and allow equitable access to completion
for all candidates. The SEL should be based on direct observation and must focus on content that the evaluator can
complete. Faculty resources should be developed to improve the quality of the standardized evaluation template
and decrease bias.
This recommendation creates the ideal state for the UME-GME transition because:
Increased standardization of letters of recommendation will reduce unnecessary variability in the materials residency
programs consider when reviewing candidates’ applications and will help streamline the selection process. These
letters will work toward mitigating racism and other biases that should not be a part of the learner selection process.
How this recommendation links to the fishbone diagrams used to develop the ideal state:
Letters of recommendation lack consistency among specialty requirements, with confusion/bias related to templates
Applicant information not in a structured, validated format usable for large-scale review
Lack of understandable plain-language reporting of a student assessment
Varied, insufficient resources for assessment
Pros:
• Applicant information shared with potential residency programs on observed outcomes and performance
• Applicants' ability to obtain the structured evaluative letter
• Convenient, fillable document for the evaluator
• Decreased bias in applicant evaluations
Cons:
• Template that is difficult to complete by an evaluator (process or content)
• Residency programs not utilizing the structured evaluative letter as a means of applicant evaluation
Relevant examples from the literature (if applicable):
1. Ocial Cord Standardized Letter of Evaluation (SLOE). Council of Residency Directors in Emergency Medicine. https://
www.cordem.org/resources/residency-management/sloe/. Accessed June 7, 2021.
2. SLOE – IM: DOM Summary Letter. Internal Medicine Letter of Evaluation Template https://higherlogicdownload.
s3.amazonaws.com/IM/fecab58a-0e31-416b-8e56-46fc9eda5c37/UploadedImages/Documents/resources/SLOE_
DOM_Summary_Letter_Template.pdf. Accessed June 7, 2021.
Specic examples on how this recommendation might be implemented:
A template can be developed by convening a group of learners, schools, and programs with input from specialty
colleges.
Decisions need to be made on the appropriate location of the structured evaluative letters to enable electronic
completion and submission. Housing the document within the application system should be considered.
Faculty development tools to support structured evaluative letters should be developed.
Research questions:
1. Search and review additional articles on the outcomes of existing emergency medicine standardized letters of evaluation or other templated letters of recommendation.
2. Does the addition of a structured evaluative letter assist the program in the evaluation of an applicant more than the letter of recommendation?
3. Does the addition of a structured evaluative letter decrease bias in the evaluation of an applicant?
Citations:
1. Jackson JS, Bond M, Love JN, Hegarty C. Emergency Medicine Standardized Letter of Evaluation (SLOE): Findings
From the New Electronic SLOE Format. J Grad Med Educ. 2019;11(2):182-186.
2. Katirji L, Smith L, Pelletier-Bui A, et al. Addressing Challenges in Obtaining Emergency Medicine Away Rotations and
Standardized Letters of Evaluation Due to COVID-19 Pandemic. West J Emerg Med. 2020;21(3):538-541.
3. Negaard M, Assimacopous E, Harland K, Van Heukelom J. Emergency Medicine Residency Selection Criteria: An Update and Comparison. AEM Education and Training. 2018;2:146-153.
Recommendation 16:
To raise awareness and facilitate adjustments that will promote equity and accountability, self-reported
demographic information of applicants (e.g., race, ethnicity, gender identity/expression, sexual identity/orientation,
religion, visa status, or ability) should be measured and shared with key stakeholders, including programs and medical
schools, in real time throughout the UME-GME transition.
Narrative description of recommendation:
Inequitable distribution of applicants among specialties is not in the best interest of programs, applicants, or the
public good. Bias can be present at any level of the UME-GME transition. A decrease in diversity at any point along
the continuum provides an important opportunity to intervene and potentially serve the community in ways that are
more productive. An example of accountability and transparency in an inclusive environment across the continuum is
a diversity dashboard for residency applicants. A residency program that finds bias in its selection process could go back in real time to find qualified applicants who may have been missed, potentially improving outcomes.
How this recommendation links to the fishbone diagrams used to develop the ideal state:
• Bias
• Needs of society not met (underrepresented in medicine applicants excluded from some programs/specialties)
• Program director stress
Pros:
• Increased attention and support to trainees underrepresented in medicine (due to race, ethnicity, gender, sexual orientation/identity, visa status, or ability) moving through medical school and the transition.
• More underrepresented in medicine applicants (due to race, ethnicity, gender, sexual orientation/identity, visa status, or ability) into competitive areas and subspecialties.
Cons:
• Medical schools with discrepancies in grading could change academic expectations instead of ensuring all students have support and resources to excel. This could make it more difficult for students to demonstrate academic excellence (e.g., if honor societies are removed from medical schools), or result in applicants beginning residency less prepared.
• Fewer applicants into primary care specialties, since curricula may increase exposures to sub-specialties.
• Medical schools in less diverse areas may be penalized if compared to national means instead of the applicant pool or population served.
• Self-fulfilling prophecy, that less diverse institutions may struggle to increase diversity in the absence of a track record of success.
Relevant examples from the literature (if applicable):
1. Lattanza LL, Maszaros-Dearolf L, O'Connor MI, Ladd A, Bucha A, Trauth-Nare A, Muchley JM. The Perry Initiative's Medical Student Outreach Program recruits women into orthopaedic residency. Clin Orthop Relat Res (2016) 474:1962-1966.
2. Nellis JC, Eisele DW, Francis HW, Hillel, AT, Lin SY. Impact of a mentored student clerkship on underrepresented
minority diversity in otolaryngology-head and neck surgery. Laryngoscope, 126:2684-2688, 2016
3. Vajapey S, Cannada LK, Samora JB. What proportion of women who received funding to attend a Ruth Jackson
Orthopaedic Society meeting pursued a career in orthopedics? Clin Orthop Relat Res (2019) 477:1722-1726
4. Yoon JD, Ham SA, Reddy ST, Curlin FA. Role models' influence on specialty choice for residency training: a national longitudinal study. J Grad Med Educ. April 2018. 149
Specic examples on how this recommendation might be implemented:
Medical schools and residency programs are provided with “dashboards” which give them rapid feedback on the
status of underrepresented groups within their student and applicant population. These dashboards should interact
with the level of detail provided in the electronic application platform, so that it is apparent when bias is present in
a specic metric/search term/lter (for medical schools and residency programs) or overall selection strategy (for
residency programs).
Application management is reported for medical schools (i.e. proportion of a specic population entering
competitive subspecialties, selected for honor societies, earning honors in each clerkship) and residency programs
(i.e. proportion of applications received from populations of interest, and how many of those applicants were
interviewed and ranked), creating accountability.
A pop-up warning appears to a program director that a certain group has been removed 50% by a certain lter.
Similarly, a letter writer or student aairs representative could receive a pop-up alert that a certain keyword being
entered has signicant bias.
Research questions:
1. Have any medical schools already started monitoring their underrepresented in medicine (UiM) pipeline in the UME-
GME transition? What programs are already in place to promote success of UiM in medical school? What programs
are in place to increase recruitment into competitive subspecialties?
2. How do dashboards alone, without reporting, affect behavior? Does behavior change with internal accountability (i.e. to the dean, designated institutional official), or is public reporting required? What are the reasons applicants choose a specialty, and how much do curricular choices and mentorship in medical school affect that decision?
Recommendation 17:
To optimize utility, discrete fields should be available in the existing electronic application system for both narrative and
ordinal information currently presented in the MSPE, personal statement, transcript, and letters. Fully using technology
will reduce redundancy, improve comprehensibility, and highlight the unique characteristics of each applicant.
Narrative description of recommendation:
Optimally, each applicant will be reviewed individually and holistically to evaluate merit. However, some circumstances
may require rapid review. The 2020 NRMP program directors’ survey found that only 49% of applications received
an in-depth review. The application system should utilize modern technology to maximize the likelihood that
applications are evaluated in a way that is holistic, mission-based, and equitable.
Currently, applications are assessed based on the information that is readily available, which may place undue
emphasis on scores, geography, medical school, or other factors that perpetuate bias. Adding specic data gives
an opportunity for applicants to demonstrate their strengths in a way that is user-friendly for program directors.
Maximizing the amount of accurate information readily available in the application will increase capacity for holistic
review of more applicants and improve trust during the UME to GME transition. Although not all schools and programs
will align on which information should be included, areas of agreement should be identied and emphasized.
How this recommendation links to the fishbone diagrams used to develop the ideal state:
Lack of trustworthy, validated, and bidirectional information — improved reporting of the Medical Student
Performance Evaluation and re-designing Electronic Residency Application Service® to include a central report of
the applicant’s medical school attributes helps the reviewer to better process data and make informed decisions.
Program Director stress — streamlining and aggregating data succinctly, along with searchable data, will reduce program director stress and improve efficiency in reviewing applications.
Program Director Fear of Missing Out – improved transparency to program directors who will have a better
understanding of the applicants selected.
• Bias – increased holistic review, even when resources do not allow detailed individual application review
Pros:
• Maximize breadth and variety of information (including personal background and medical school evaluation data) available to program directors in a user-friendly format for rapid, wide-scale review.
• Standardize and streamline the information available about all applicants (U.S. and international applicants) applying to U.S. residency programs.
• Aggregate accessible, easily processed data with searchable data fields to reduce time and stress spent on processing applicant data.
• Improve an equitable application process for all students including those underrepresented in medicine, international, and DO applicants.
• Increase transparency in reporting applicant data and increase trust in the system.
• Students can easily demonstrate excellence in multiple domains, aligned with program mission.
Cons:
• UME institutions have different grade reporting systems (H/HP/P/F vs A/B/C/D/E) that are not easily translatable from one institution to another, which may limit the utility of a single data field for grades. May need to focus on non-metric entries.
• Letter writers may be hesitant to report data that negatively affects the applicant.
• Reporting of data could decrease the competency-based mindset and decrease wellness for medical students.
• Will require additional faculty development for program directors to maximally utilize the new application functions.
• Program directors may become over reliant on automated searches, leading to a decrease in detailed reading of applications and holistic review.
Relevant examples from the literature (if applicable):
1. Hammoud MM, Standiford T, Carmody B. Potential implication of COVID-19 for the 2020-2021 Residency Application
Cycle. JAMA. 2020;321(1):29-30.
2. Geary A, Wang V, Cooper J, et al. Analysis of Electronic Residency Application Service (ERAS) Data Can Improve House Staff Diversity. J Surg Res. 2021 Jan;257:246-251.
3. Kang HP, Robertson DM, Levine WN, et al. Evaluating the Standardized Letter of Recommendation Form in
Applicants to Orthopaedic Surgery Residency. J Am Acad Orthop Surg. 2020 Oct 1;28(19):814-822.
Specic examples aligned with the overall thematic recommendation (up to 3 granular examples of how the
recommendation could be applied/implemented):
Allow applicants to prioritize lists of extracurricular activities within Electronic Residency Application Service®.
Incorporate drop-down menus for role, location, and area of interest to align applicant entries with program search
options.
Integrate some aspects of the standardized letters (such as the Standardized Letter of Evaluation and Medical
Student Performance Evaluation[MSPE]) into the application system. Data and reported metrics from these letters
are entered into discrete, searchable elds, potentially also with keywords based on areas of excellence (currently
captured as “three key points” in the MSPE), to promote holistic review. Information about the letter writer (location,
community vs. university setting, clinical volume, etc.) is automatically collected and searchable.
Optional structured elds for schools to enter awards, honor societies, clerkship examination scores, and other
metrics used in a student’s summative evaluation. These structured data elds could include search options and
reference statistics to streamline information retrieval. Areas of concern (professionalism, remediation, etc.), could
also be available in searchable elds. The lters for these sensitive topics can be monitored to ensure they are not
being used for immediate rejections. Although not all schools will use these elds, they may be very important for
some schools trying to help students display excellence.
Discrete eld is added for the personal statement so programs can perform a free text search lter for their
program name, suggesting an individualized personal statement.
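As a rough illustration of what "discrete, searchable fields" could look like in practice, the sketch below models a structured application record and one of the searches described above. The field names are hypothetical assumptions for illustration only and are not drawn from ERAS or any existing system.

```python
from dataclasses import dataclass, field

@dataclass
class ApplicationRecord:
    """Illustrative structured application fields (hypothetical, not an ERAS schema)."""
    applicant_id: str
    key_points: list[str] = field(default_factory=list)              # e.g., MSPE "three key points"
    clerkship_grades: dict[str, str] = field(default_factory=dict)   # clerkship -> school's grade label
    awards: list[str] = field(default_factory=list)
    areas_of_concern: list[str] = field(default_factory=list)        # professionalism, remediation, etc.
    personal_statement: str = ""

def mentions_program(record: ApplicationRecord, program_name: str) -> bool:
    """Free-text check of the personal statement for the program's name,
    suggesting an individualized statement (as described above)."""
    return program_name.lower() in record.personal_statement.lower()

# Example: querying a discrete field rather than reading the full document.
record = ApplicationRecord(applicant_id="A001",
                           key_points=["quality improvement", "rural health"],
                           personal_statement="...drawn to Example County Hospital because...")
print(mentions_program(record, "Example County Hospital"))  # True
```

Keeping ordinal data (grades, awards, concerns) in typed fields rather than free text is what would make the search options and reference statistics described above possible.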
Research questions:
1. What data are available on the current Standardized Letters of Evaluation used by the emergency medicine, orthopedic, and other specialties?
2. Is there any data on which part of the Medical Student Performance Evaluation is most useful?
3. Is there any data on which metrics in the Electronic Residency Application Service® are most useful? (Will help us
understand what we should keep and what can be eliminated.)
4. Was there improved efficacy of reviewing applications using this recommendation? (more applications reviewed,
more detailed review, more components considered)
5. Does this recommendation improve honest reporting and depiction of an applicant?
6. How does transparent reporting aect the number of interviews an applicant may receive?
Recommendation 18:
To promote equitable treatment of applicants regardless of licensure examination requirements, comparable exams with different scales (COMLEX-USA and USMLE) should be reported within the electronic application system in a single field.
Narrative description of recommendation:
Osteopathic medical students make up 25% of medical students in U.S. schools, and these students are required to complete the COMLEX-USA examination series for licensure. Residency programs may filter out applicants based on their USMLE score, leading many osteopathic medical students to sit for the USMLE series. This creates a substantial increase in cost, time, and stress for osteopathic students who believe duplicate testing is necessary to be competitive in the Match. A combined field should be created in the Electronic Residency Application Service (ERAS) that normalizes the scores between the two exams and allows programs to filter based only on the single normalized score. This will mitigate structural bias and reduce financial and other stress for applicants.
How this recommendation links to the fishbone diagrams used to develop the ideal state:
Fear:
Fear of not matching — Applicants have limited career options outside of the Match and perceive no flexibility to change specialty or the timeline.
Financial burden, educational burden, and opportunity cost for time spent on application process
Applicant Stress:
Process is very different for different groups of applicants (U.S. MD, U.S. DO, IMG, etc.), without clear expectations
Bias:
Filters can cause bias without alerting programs (e.g., USMLE filters removing DO applicants)
Bias favors certain applicants, schools, etc., who may resist complete equity
Lack of Trustworthy, validated information to applicants:
Conflicting advice from multiple sources (peers, UME, GME, online)
Needs of Society not prioritized:
Student effort spent on transition instead of working toward the greater good (research, patient care, wellness)
Significant waste due to redundant licensing exams (multiple steps of both COMLEX and USMLE, some applicants take both). Uncertain that these metrics are predictive of competence.
Pros:
• This change will decrease the number of programs that inadvertently filter out DOs or who do not understand the equivalency of the licensure examinations. As a result, fewer DO applicants will feel compelled to take the USMLE. Currently, over 70% of osteopathic medical students believe they must take the USMLE to be competitive in the Match, resulting in over 4,000 students wasting time, enduring additional stress, and paying over $5M out of pocket that is not needed.
Cons:
• If filtering based on exam is a manifestation of program bias against osteopathic students, this may persist through other avenues.
• Students may continue to have a fear that they will be discriminated against if they don't take the USMLE and continue to take additional examinations.
• Promoting equity for osteopathic students may decrease match rates for students from other groups.
Relevant examples from the literature (if applicable):
1. American Medical Association Officially Recognizes COMLEX-USA's Equality with USMLE. National Board of Osteopathic Medical Examiners. Published December 3, 2018. https://www.nbome.org/news/american-medical-association-officially-recognizes-comlex-usas-equality-with-usmle/. Accessed June 2, 2021.
2. Sandella JM, Gimpel JR, Smith LL, Boulet JR. The Use of COMLEX-USA and USMLE for Residency Applicant
Selection. J Grad Med Educ (2016) 8 (3): 358–363.
3. COMLEX-USA and Acceptance for ACGME Fellowship Program Applications. NBOME Update. Published January 18,
2016. https://www.nbome.org/docs/NBOME_COMLEX_ACGME_Fellowships.pdf. Accessed June 2, 2021.
4. COMLEX-USA for Residency Program Directors. National Board of Osteopathic Medical Examiners. Published March
2020. https://www.nbome.org/Content/Exams/COMLEX-USA/COMLEX-USA_Residency_Program_Directors_Guide.
pdf. Accessed June 2, 2021.
5. FREIDA. American Medical Association. https://freida.ama-assn.org. Accessed June 2, 2021.
6. Hasty RT, Snyder S, Suciu GP, Moskow JM. Graduating osteopathic medical students’ perceptions and
recommendations on the decision to take the United States Medical Licensing Examination. J Am Osteopath Assoc.
2012 Feb;112(2):83-9.
7. Performance Data. USMLE®. https://www.usmle.org/performance-data/. Accessed June 2, 2021.
8. AACOM Reports on Student Enrollment. American Association of the Colleges of Osteopathic Medicine. https://
www.aacom.org/reports-programs-initiatives/aacom-reports/student-enrollment. Accessed June 2, 2021.
9. Convert Your COMLEX-USA 3-Digit Score. National Board of Osteopathic Medical Examiners. https://www.nbome.
org/cbt_score_conv/. Accessed June 2, 2021.
10. Carmody B. The USMLE for DO Students: How to Stop Fleecing Osteopathic Medical Students. The Sheriff of Sodium. Published December 13, 2019. https://thesheriffofsodium.com/2019/12/13/the-usmle-for-dos-how-to-stop-fleecing-osteopathic-medical-students/. Accessed June 2, 2021.
Specic examples on how this recommendation might be implemented:
Electronic Residency Application Service® (ERAS) combines the elds for licensure exams and reports dates and
percentiles (or pass/fail) for COMLEX/USMLE together. Alternatively, ERAS could apply a conversion formula so that all
scores are reported in a comparable manner.
Discrete elds in Electronic Residency Application Service® for clerkship exams report National Board of Medical
Examiners and National Board of Osteopathic Medical Examiners together similar to the process above.
Electronic Residency Application Service® will only allow programs to lter based on the percentile or converted
score and will remove any ability to lter applicants by the specic exam taken.
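The recommendation does not prescribe a particular conversion formula. One simple approach consistent with reporting percentiles for both exams together is to convert each score to a percentile within its own exam's distribution and expose only that single comparable value for filtering. The sketch below is illustrative only; the reference distributions are invented placeholders, not real norm tables or the NBOME conversion tool.

```python
from bisect import bisect_right

def percentile(score: float, reference_scores: list[float]) -> float:
    """Percentile of `score` within a reference distribution for the same exam."""
    ranks = sorted(reference_scores)
    return 100.0 * bisect_right(ranks, score) / len(ranks)

def normalized_exam_field(exam: str, score: float,
                          reference: dict[str, list[float]]) -> dict:
    """Single combined field: the exam name is kept for transparency, but the
    comparable value programs may filter on is the percentile."""
    return {"exam": exam, "percentile": round(percentile(score, reference[exam]), 1)}

# Hypothetical reference distributions (illustrative only).
reference = {"USMLE": [210, 225, 240, 245, 255], "COMLEX": [450, 500, 550, 600, 650]}
print(normalized_exam_field("COMLEX", 600, reference))  # {'exam': 'COMLEX', 'percentile': 80.0}
print(normalized_exam_field("USMLE", 245, reference))   # {'exam': 'USMLE', 'percentile': 80.0}
```

With a field like this, the platform could permit filtering on the percentile value alone while disallowing filters keyed to which exam was taken, as described above.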
Research questions:
1. Is there any data to suggest the accuracy of the conversion tool currently being utilized to convert COMLEX scores
to the USMLE scoring convention?
2. Has there been any published information as to why the National Board of Osteopathic Medical Examiners would want to continue using their current scoring convention versus the one used by the National Board of Medical Examiners?
3. Is there any data on how many programs currently refuse to accept the COMLEX or will not consider applicants with only the COMLEX and not the USMLE?
4. Will this change in exam score reporting reduce the number of osteopathic medical students taking the USMLE?
Recommendation 19:
Filter options available to programs for sorting applicants within the electronic application system should be carefully created and thoughtfully reviewed to ensure each one detects meaningful differences among applicants and promotes review based on mission alignment and likelihood of success at a program.
Narrative description of recommendation:
Currently, residency programs receive more applications than they can meaningfully review. For this reason, filters are sometimes used to identify candidates that meet selection criteria. However, some commonly used filters may exclude applicants who are not meaningfully different from ones who are included (e.g., students who took a different licensure examination, students with statistically insignificant differences in scores, students from different campuses of the same institution, etc.). The use of free-text filters increases the risk of not identifying, or of mischaracterizing, applicant characteristics. All applications should be evaluated fairly, independent of software idiosyncrasies. Filters should be developed in conjunction with all stakeholders. Each filter that is offered should align with the missions and requirements of residency programs.
How this recommendation links to the fishbone diagrams used to develop the ideal state:
• Program director stress/limited resources
• Bias
• Lack of trustworthy, validated, bidirectional information
Pros:
• Program directors decrease utilization of score filters and increase the use of a broader set of mission-based filters.
• Applicants identify what qualities they most want to display to residency programs.
• Medical schools thoughtfully identify the important qualities of each applicant, including which ones truly stand out academically, and pursue transparency in reporting.
Cons:
• Applicants feel pressure to indicate interests that they do not hold in order to maximize likelihood of match in a specific program or specialty.
• Residency programs use workarounds outside of the application system, which will limit monitoring.
• Students without clinical opportunities for letters are at a disadvantage.
Relevant examples from the literature (if applicable):
1. Garber AM, Kwan B, Williams CM, et al. Use of Filters for Residency Application Review: Results From the Internal
Medicine In-Training Examination Program Director Survey. J Grad Med Educ. 2019;11(6):704-707.
2. Prober CG, Kolars JC, First LR, Melnick DE. A plea to reassess the role of United States Medical Licensing Examination
Step 1 Scores in Residency Selection. Acad Med. 2016;91(1):12-15.
Specic examples on how this recommendation might be implemented:
Point-based score lters are replaced by lters which identify applicants who performed with statistical
equivalence on either USMLE or COMLEX. Similarly, lters accommodate for test date variability among applicants
as licensure examinations convert to pass/fail score reporting.
Electronic Residency Application Service® tabulates grades entered by medical schools or standardized letters of
evaluation to allow programs to identify applicants by percentile tiers across dierent schools.
Instead of free text lters, missional keywords for lters are developed, which allow applicants to select a limited
number of focus areas, e.g. geographical locations, academic pursuits, patient populations, or other areas
UME-GME REVIEW COMMITTEE
96
of potential alignment with programs. Keywords would be a less detailed addition to the three noteworthy
characteristics in the Medical Student Performance Evaluation, but would allow programs to identify applicants out
of a large applicant pool based on factors other than academic metrics. They would be more transparent than free
text lters.
Filters are added to classify letters of recommendation by their writer (i.e. chair, program director, institution, and
geographic location) and underlying clinical site (i.e. academic vs. rural, number of patient encounters). In this way, a
program could identify applicants who rotated at their institution, at a rural community program, or in a particular
state, or if the letter was from a writer who observed the applicant in more than 5 patient encounters per day. This
would be especially useful for evaluating international medical graduates who have variable clinical experiences or
for community programs that need to gauge applicant interest in their program.
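One hedged interpretation of the "statistical equivalence" filtering mentioned in the first example above is to treat scores within one standard error of measurement (SEM) of a cutoff as indistinguishable from it, rather than applying a hard point cutoff. The SEM value and scores below are placeholders for illustration, not published figures.

```python
def meets_threshold(score: float, cutoff: float, sem: float = 6.0) -> bool:
    """Keep applicants whose score is within one SEM of the cutoff,
    treating such differences as not meaningfully different (placeholder SEM)."""
    return score >= cutoff - sem

# A hard cutoff of 230 would drop a 226; an equivalence band keeps it for review.
print(meets_threshold(226, cutoff=230))  # True
print(meets_threshold(220, cutoff=230))  # False
```

The same idea could be applied per exam (USMLE or COMLEX) so that the filter does not discriminate between applicants who took different licensure examinations.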
Research questions:
1. How do mission-based filters affect vulnerable populations of applicants?
2. Do the new filters identify applicants who are able to succeed at a program?
3. How does the use of any filter affect the system — should our goal be to have individual thorough holistic review of each application?
Recommendation 20:
Convene a workgroup of educators across the continuum to begin planning for a dashboard/portfolio to collect
assessment data in a standard format for use during medical school and in the residency application process. This
will enable consistent and equitable information presentation during the residency application process and in a
learner handover.
Narrative description of recommendation:
Key features of a dashboard/portfolio in the UME-GME transition, and across the continuum, should include
competency-based information that aligns with a shared mental model of outcomes, clarity about how and when
assessment data were collected, and narrative data that uses behavior-based and competency-focused language.
Learner reflections and learning goals should be included. Dashboard development will require careful attention to
equity and minimizing harmful bias, as well as a focus on the competencies and measurements that predict future
performance with patients. Transparency with students about the purpose, use, and reporting of assessments, as
well as attention to data access and security, will be essential.
How this recommendation links to the fishbone diagrams used to develop the ideal state:
• Assessment tools and strategies
Implementation “must haves” include:
• The platform for this dashboard must be electronic.
• This dashboard must include competency-based performance data.
• This dashboard must include numerical, qualitative, and narrative information.
Implementation “nice to haves” include:
• An interesting option for this dashboard would be to include student learning goals and reflections.
• Once this dashboard is developed, there may be interest in devising a plan to gather evidence of validity regarding predictors of successful educational and patient care outcomes.
Pros:
• Uniform data display of performance information for potential comparison of students within and across schools
• Could enhance or replace the medical student performance evaluation letter
• Group performance data from the dashboard could inform predictive analytics, research, advising
Cons:
• Risk of assessment data being used to draw inferences about future performance that are not supported by evidence
• Could encourage learners' focus on scores and performance orientation at expense of fostering a growth mindset
• Challenging to implement at international schools
Relevant examples from the literature (if applicable):
• Vanderbilt Student Dashboard VSTAR. https://vstar.app.vanderbilt.edu/
• University of Cincinnati internal medicine residency
• Sidney Kimmel Medical College JEFF CAT. https://www.ama-assn.org/system/files/2019-06/spring-consortium-meeting-poster-sidney-kimmel-jefferson.pdf
Specic examples on how this recommendation might be implemented:
1. Convene a group of UME and GME educators with expertise in competency-based medical education and
assessment and information technology experts with expertise in data visualization.
2. Create mockups of potential dashboard learner performance data and collect feedback from UME educators
and residency program directors about the usefulness for summarizing competency-based performance.
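A mockup of the dashboard's underlying data could be as simple as the sketch below. The field names are illustrative assumptions meant only to show how numerical ratings, narrative comments, and learner reflections might sit together in a standard format; they are not a UGRC-specified schema.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class CompetencyEntry:
    """One dashboard row: what was assessed, how, when, and the result."""
    competency: str          # shared outcomes language, e.g. "Patient Care 1"
    assessment_method: str   # e.g. "direct observation", "OSCE"
    date_observed: date
    milestone_level: float   # numeric/ordinal rating
    narrative: str           # behavior-based, competency-focused comment

@dataclass
class LearnerDashboard:
    learner_id: str
    entries: list[CompetencyEntry]
    reflections: list[str]   # learner reflections and learning goals

    def trajectory(self, competency: str) -> list[tuple[date, float]]:
        """Chronological ratings for one competency, for growth-over-time views."""
        points = [(e.date_observed, e.milestone_level)
                  for e in self.entries if e.competency == competency]
        return sorted(points)
```

A structure of this kind would let mockups show both a point-in-time summary and a trajectory of growth, which is the information program directors are asked to evaluate for usefulness in the implementation steps above.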
Research questions:
1. How do residency selection committee members interpret information in a learner performance dashboard/
portfolio and use that information in candidate selection?
2. How does a learner performance dashboard/portfolio aect learners’ self-reflection and approach to their
learning?
3. How can a learner performance dashboard/portfolio provide learning analytics to shape teaching, learning and
curricular design?
Citations:
1. Boscardin CK, Fergus KB, Hellevig B, Hauer KE. Twelve tips to promote successful development of a learner
performance dashboard within a medical education program. Med Teach. 2018;40:855-61.
2. Carey R, Wilson G, Bandi V, et al. Developing a dashboard to meet the needs of residents in a competency-based
training program: A design-based research project. Can Med Educ J. 2020;11:e31-45.
Recommendation 21:
All interviewing should be virtual for the 2021-2022 residency selection season. To ensure equity and fairness, there
should be ongoing study of the impact of virtual interviewing as a permanent means of interviewing for residency.
Narrative description of recommendation:
Virtual interviewing has had a significant positive impact on applicant expenses. With elimination of travel, students have been able to dedicate more time to their clinical education. Due to the risk of inequity with hybrid interviewing (virtual and in-person interviews occurring in the same year or same program), all interviews should be conducted virtually for the 2021-2022 season. Hybrid interviewing (virtual combined with onsite interviewing) should be prohibited. A thorough review of the data around virtual interviewing is also recommended. Candidate accessibility, equity, match rates, and attrition rates should be evaluated. Residency program feedback from multiple types of residencies should be solicited. In addition, the separation in time of applicant and program rank order list deadlines should be explored, as this would allow students to visit programs without pressure and minimize influence on a program's rank list.
This recommendation creates the ideal state for the UME-GME transition because:
Interviews will be offered and scheduled to promote student wellness and minimize conflict with ongoing rotations.
There will be ample interview slots for those invited. Applicants will interview only with programs they are likely to
attend. This life transition will be accomplished in a manner supporting wellness.
How this recommendation links to the fishbone diagrams used to develop the ideal state:
• Interviews lack standardization across programs and specialties.
• There is a lack of faculty development and education.
• Virtual interviewing is subject to bias related to technical issues, staging, and resource costs.
• Cost of interviewing is high in both tangible costs and time away from rotations.
• Limited time and staffing for interviews
Pros:
• Public safety and reduced anxiety for some in 2021 with an ongoing global pandemic
• Equity for applicants for the upcoming recruitment cycle
• Significant cost savings for applicants and programs
• Data-driven decision making for future recruitment cycles
• Diversity in educational milieu at programs if expanded applicant pool and interview/ranked/matched applicants includes a wide array of MD and DO granting medical schools as well as international graduates.
Cons:
• If virtual interviewing was not beneficial for certain applicants or programs, the applicants/programs would be disadvantaged.
• Increased anxiety and perception of disadvantage for some applicants with the virtual format
• Implicit bias could be magnified in virtual interviewing formats, reducing diversity, equity, and inclusion
• Programs without technologic support for virtual interviews may have limited success.
• Students without technologic support for virtual interviews may have limited success.
• Programs may weigh other factors (e.g., numeric exam scores, geographic familiarity) more substantially in completing rank order lists if virtual interview formats show limitations.
• Applicants may apply to more programs, contributing to increases in applications to programs and reducing a program's ability for holistic review.
• Higher transfer rate or levels of unhappiness if applicants match in a program that had a culture that was inconsistent with their desired setting.
Specic examples on how this recommendation might be implemented:
Convene a diverse committee of stakeholders to review the data from virtual and in-person interviews. The
workgroup must contain representation from student organizations and diverse residency program representation
(university and community programs).
Request comprehensive review and recommendation by this group by June 30, 2022 for subsequent interviewing
cycles.
Research questions:
1. Do virtual interviews negatively affect the matching of any candidates?
2. Do virtual interviews negatively affect the match rate in any programs?
3. Do virtual interviews impact specialty choice, match rates, or the composition of the workforce?
Recommendation 22:
Develop and implement standards for the interview offer and acceptance process, including timing and methods of communication, for both learners and programs, to improve equity and fairness, to minimize educational disruption, and to improve wellbeing.
Narrative description of recommendation:
The current process of extending interview offers and scheduling interviews is unnecessarily complex and onerous, with little to no regulation. Applicant stress and lost rotation education resulting from elements of the current process (e.g., obsessively checking email to accept interview offers with short response deadlines) can be improved with changes to the application platform, policies, and procedures. Development of a common interview offering/scheduling platform and creation of policies (e.g., forbidding residency programs from over-offering/over-scheduling interviews and from setting inappropriately short response deadlines for applicants) would result in important improvements. While these processes are being developed, residency programs involved in the 2021-2022 residency selection season should allow applicants 24 to 48 hours to accept or decline an interview offer. In addition, for the 2021-2022 residency selection season, programs should not offer more interviews to applicants than available interview positions. Likewise, applicants should not accept multiple interviews that are scheduled at the same time.
This recommendation creates the ideal state for the UME-GME transition because:
Developing and implementing standards for the interview offer and acceptance process will support learner growth, evidence-informed specialty selection, achievement of competence, and wellness. Interviews will be offered and
scheduled to promote student wellness and minimize conflict with ongoing rotations. There will be ample interview
slots for those invited. Applicants will interview only with programs they are likely to attend. This life transition will be
accomplished in a manner supporting wellness.
How this recommendation links to the fishbone diagrams used to develop the ideal state:
• The cost of interviewing is high in both tangible costs and time away from rotations.
• Lack of real-time interview updates to allow schools to intervene/counsel students appropriately
• Chaotic interview process of over-offering slots, with insufficient opportunity to respond
• Student effort not spent toward the greater good
• Opportunity cost for time spent on application process
• Culture is competitive
Pros:
• A consistent and fair approach to interview offers/acceptances
• A common, consistent platform for interview offers/acceptances
• Preservation of learning and concentration on student rotations during residency interview season
Cons:
• Inaccessibility to candidates and programs
• Costly for candidates or programs
Specic examples on how this recommendation might be implemented:
A diverse group of stakeholders could be convened to develop a platform as well as relevant policies and procedures
to improve interview scheduling. Stakeholders would include students, residency programs, Association of American
Medical Colleges, American Association of Colleges of Osteopathic Medicine, and medical schools.
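For illustration only, the sketch below encodes the two interim standards named in the narrative above: a minimum response window for applicants and offers capped at available interview positions. The function names and the choice of the 24-hour lower bound are hypothetical assumptions and do not describe any existing scheduling platform.

```python
from datetime import datetime, timedelta

# Lower bound of the 24-48 hour response window described in the recommendation.
MIN_RESPONSE_WINDOW = timedelta(hours=24)

def can_extend_offer(offers_outstanding: int, slots_available: int) -> bool:
    """Programs should not offer more interviews than available interview positions."""
    return offers_outstanding < slots_available

def offer_deadline(sent_at: datetime) -> datetime:
    """Earliest allowable expiration time for an interview offer."""
    return sent_at + MIN_RESPONSE_WINDOW

# Example: a program with 10 slots and 10 outstanding offers may not extend another.
print(can_extend_offer(offers_outstanding=10, slots_available=10))  # False
print(offer_deadline(datetime(2021, 10, 1, 9, 0)))                  # 2021-10-02 09:00:00
```

Constraints of this kind would naturally live inside the common offering/scheduling platform the stakeholder group is asked to develop, rather than being enforced manually by individual programs.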
Research questions:
1. What is the student satisfaction with the new process and how does it compare to the previous version?
2. What is the program satisfaction with the new process and how does it compare to the previous version?
Citations:
1. Shreler J, Platt M, Thé S, Huecker. Planning virtual residency interviews as a result of COVID-19: insight from residency
applicants and physicians conducting interviews. Postgrad Med J. Published Online First: 27 January 2021. doi: 10.1136/
postgradmedj-2020-139182. Accessed June 7, 2021.
Recommendation 23:
Innovations to the residency application process should be piloted to reduce application numbers and concentrate applicants at programs where mutual interest is high, while maximizing applicant placement into residency positions. Well-designed pilots should receive all available support from the medical community and be implemented as soon as the 2022-2023 application cycle; successful pilots should be expanded expeditiously toward a unified process.
Narrative description of recommendation:
Application inflation is a major contributor to the current dysfunction in the UME-GME transition. The 2020 NRMP program director survey found that only 49% of applications received an in-depth review; an unread application represents wasted time and expense for applicants. Yet doubling the program resources available for review is not practical. Informational interventions, such as improved career advising and transparency, are unlikely to reduce application numbers significantly in the context of a high-stakes prisoner's dilemma. In sum, the current process is costly to applicants and program directors and does not optimally serve the public good.
To address this dysfunction, Coalition organizations and other groups in the medical community should utilize all available logistic, analytic, and financial resources to lead and support innovative pilots to reduce application numbers and concentrate applicants at programs where mutual interest is high, while maximizing applicant placement into residency positions. Pilots should be based on best available evidence, specialty-specific needs, potential impact (both positive and negative), and collaboration among stakeholders. Pilot innovations, some of which are ongoing, could include, but are not limited to, the following: expanding integrated UME-GME pathways, preference signaling, application caps, and/or additional application or match rounds.
Groups sponsoring pilots should be accountable for using a continuous quality improvement approach to gather and monitor evidence of effectiveness and equity across applicant groups with historically distinct application behaviors and outcomes, including United States MD and DO graduates, international medical graduates, couples applicants, previously unmatched applicants, and individuals belonging to groups that are underrepresented in medicine.
While pilot studies may vary across specialties, ultimately the redesigned residency application process should be as consistent as possible across specialties, recognizing that applicants, advisors, and program directors may be subject to the rules of multiple specialties in the context of combined tracks, couples, and dual applicants.
This recommendation creates the ideal state for the UME-GME transition because:
Piloting these innovations will determine which ones should be implemented more broadly because they balance the tension between individual freedoms and the public good to provide a learner-centered experience that is sustainable for program directors and institutions. A successful intervention will also be flexible and adaptable to changes in medical education and health care systems, with a commitment to continuous quality improvement. These innovations will mean each residency program receives applications from individuals who are likely to attend and aligned with the institutional mission. Additionally, every program will receive enough applications to fill their class and will have sufficient resources to conduct holistic review on the applications received. Financial challenges will be minimized. Learners will have adequate funding to establish their living arrangement and support a focus on their work in residency.
Pros:
• Pilot innovations will aim to reduce application numbers and concentrate applicants at institutions where mutual interest is high, while maximizing applicant placement into available residency positions.
• Measurable positive outcomes could include:
  - Fewer applications submitted per position, with a target reduction of 20% (i.e., from ~132 to 102 applications per position, which are the 2010 levels)
  - Stabilization of the percent of applicants matching outside their top three ranks to 2020 levels or better
• Several additional outcomes that are difficult to measure include:
  - Increase the number of applications (and applicants) that undergo holistic review
  - Concentrate applicants and interviewees at institutions with which they have a high level of mutual interest
  - Reduce time and money spent by both applicants and programs
  - Increase applicants' and programs' satisfaction with the application process and match, while reducing stress
  - Provide an opportunity for some students to signal their preference for specific programs, potentially improving equity
• The following outcomes must be monitored and should be unaffected:
  - The Match continues to have stable match rates within +/- 2% of 2020 rates (for example, 2020 PGY-1 match rates by group were: U.S. MD seniors (94%), U.S. DO seniors (91%), and IMG (61%))
  - The Match continues to have a stable pre-SOAP position fill rate of >93% and post-SOAP fill rate of >98%
  - The average number of ranked specialties remains stable (U.S. MD/DO ~1.2, IMG ~1.4)
Cons:
• Lengthening the overall duration of the application and recruitment season for applicants and programs, respectively
• Reduction in the applicant evaluative information available to programs at the time of review (for early application or match rounds)
• Other undesired consequences that are possible but of uncertain likelihood include:
  - Mismatch of supply (positions available) and demand (applications permitted) if additional constraints are added to the application process
  - Increasing overall stress in the process and difficulty accommodating some applicant groups, such as couples and applicants applying to more than one specialty
  - Fear by applicants and programs of "missing the great for the good"
  - Loss of cohesive application strategy due to multiple pilots and different preferred interventions in different fields; increased complication in the application process overall
Specic examples on how this recommendation might be implemented:
Pilot innovations could include (ongoing examples in parentheticals):
1. Expanding integrated UME-GME pathways (New York University 3-year pathway with guaranteed residency
placement, Penn State University 3+3 Family Medicine pathway)
2. Preference signaling (Otolaryngology preference signaling)
3. Application caps (no known pilots)
4. Additional application rounds (no known pilots)
5. Additional match rounds (Obstetrics and Gynecology Early Result Acceptance Program)
Research questions:
1. Conduct experimental game theory modeling using incentivized scenarios to assess and compare different commonly cited pilot innovations according to specialty-specific parameters
Citations:
1. Zastrow RK, Burk-Rafel J, London DA. Systems-Level Reforms to the US Resident Selection Process: A Scoping Review.
J Grad Med Educ. 2021;13(3):355–370.
Integrated UME-GME Pathways
2. Wong BJ. Reforming the match process - early decision plans and the case for a consortia match. JAMA
Otolaryngol Head Neck Surg. 2016;142(8):727-728.
3. Cangiarella J, Fancher T, Jones B, Dodson L, Leong SL, Hunsaker M, Pallay R, Whyte R, Holthouser A, Abramson SB.
Three-year MD programs: perspectives from the consortium of accelerated medical pathway programs (CAMPP).
Acad Med. 2017;92(4):483-90.
4. Andrews JS, Bale JF, Soep JB, Long M, Carraccio C, Englander R, Powell D. Education in pediatrics across
the continuum (EPAC): first steps toward realizing the dream of competency-based education. Acad Med.
2018;93(3):414-20.
5. Pereira AG, Williams CM, Angus SV. Disruptive innovation and the residency match: The time is now. J Grad Med Educ.
2019;11(1):36-38.
6. Cangiarella J, Cohen E, Rivera R, Gillespie C, Abramson S. Evolution of an accelerated 3-year pathway to the MD
degree: the experience of New York University Grossman School of Medicine. Acad Med. 2020;95(4):534-9.
Preference Signaling
7. Bernstein J. Not the last word: Want to match in an orthopaedic surgery residency? Send a rose to the program
director. Clin Orthop Relat Res. 2017;475(12):2845-2849.
8. Salehi PP, Benito D, Michaelides E. A novel approach to the national resident matching program - the star system.
JAMA Otolaryngol Head Neck Surg. 2018;144(5):397-398.
9. Chen JX, Deng F, Gray ST. Preference signaling in the national resident matching program. JAMA Otolaryngol Head
Neck Surg. 2018;144(10):951.
10. Whipple ME, Law AB, Bly RA. A computer simulation model to analyze the application process for competitive
residency programs. J Grad Med Educ. 2019;11(1):30-35.
Application Caps
11. Naclerio RM, Pinto JM, Baroody FM. Drowning in applications for residency training: A program's perspective and simple solutions. JAMA Otolaryngol Head Neck Surg. 2014;140(8):695-696.
12. Weissbart SJ, Kim SJ, Feinn RS, Stock JA. Relationship between the number of residency applications and the yearly
match rate: Time to start thinking about an application limit? J Grad Med Educ. 2015;7(1):81-85.
13. Pereira AG, Chelminski PR, Chheda SG, Angus SV, Becker J, Chudgar SM, et al; Medical Student to Resident Interface
Committee Workgroup on the Interview Season. Application inflation for internal medicine applicants in the match:
Drivers, consequences, and potential solutions. Am J Med. 2016;129(8):885-891.
14. Putnam-Pite D. Viewpoint from a former medical student/now intern playing the game - balancing numbers and
intangibles in the orthopedic surgery match. J Grad Med Educ. 2016;8(3):311-313.
15. Kraeutler MJ. It is time to change the status quo: Limiting orthopedic surgery residency applications. Orthopedics.
2017;40(5):267-268.
16. Ward M, Pingree C, Laury AM, Bowe SN. Applicant perspectives on the otolaryngology residency application process.
JAMA Otolaryngol Head Neck Surg. 2017;143(8):782-787.
17. Zhao H, Freedman A, Lerman S. Reforming the urology match application process: A role for the residency programs.
J Urol. 2020;203(1):44-45.
Additional Application Rounds
18. Ward M, Pingree C, Laury AM, Bowe SN. Applicant perspectives on the otolaryngology residency application process.
JAMA Otolaryngol Head Neck Surg. 2017;143(8):782-787.
Additional Match Rounds
19. Wong BJ. Reforming the match process - early decision plans and the case for a consortia match. JAMA
Otolaryngol Head Neck Surg. 2016;142(8):727-728.
20. Berger JS, Cioletti A. Viewpoint from 2 graduate medical education deans: Application overload in the residency
match process. J Grad Med Educ. 2016;8(3):317-321.
21. Hueston WJ. A proposal to address the increasing number of residency applications. Acad Med. 2017;92(7):896-897.
22. London DA. SOAP for everyone: An evolutionary development of the match. Acad Med. 2017;92(6):730.
23. Ward M, Pingree C, Laury AM, Bowe SN. Applicant perspectives on the otolaryngology residency application process.
JAMA Otolaryngol Head Neck Surg. 2017;143(8):782-787.
24. Arnold L, Sullivan C, Okah FA. A free-market approach to the Match: A proposal whose time has not yet come. Acad
Med. 2018;93(1):16-19.
25. Monir JG. Reforming the match: A proposal for a new 3-phase system. J Grad Med Educ. 2020;12(1):7-9.
26. Hammoud MM, Andrews J, Skochelak SE. Improving the residency application and selection process: An optional
early result acceptance program. JAMA. 2020;323(6):503-504.
Appendix C:
UGRC Final Recommendations With Complete Templates
UME-GME REVIEW COMMITTEE
107
Recommendation 24:
Implement a centralized process to facilitate evidence-based, specialty-specific limits on the number of interviews each applicant may attend.
Narrative description of recommendation:
Identify evidence-based, specialty-specific interview caps, envisioned as the number of interviews an applicant attends within a specialty above which further interviews are not associated with significantly increased match rates, across all core applicant types. Create a centralized process to operationalize interview caps, which could include an interview ticket system or a single scheduling platform.
This recommendation creates the ideal state for the UME-GME transition because:
A centralized process of evidence-based, specialty-specific applicant interview caps will balance the tension between individual freedoms and the public good to provide a learner-centered experience that is sustainable for program directors and institutions. Interviews will be offered and scheduled to promote student wellness and minimize conflict with ongoing rotations. There will be ample interview slots for those invited. Applicants will interview only with programs they are likely to attend.
Pros:
• Implementation of interview caps is intended to ensure every interview represents genuine mutual interest between an applicant and program, with capping mostly impacting the strongest applicants who occupy a disproportionate number of interview slots.
• A more equitable distribution of interviews among matched applicants, with a target reduction of the interview distribution Gini coefficient by 20%
• Lower average number of interviews attended per applicant, with a target reduction of 20%
• Stabilizing the percent of applicants matching outside their top three ranks to 2020 levels
• Concentrate interviewees at institutions with which they have a high level of mutual interest
• Reducing late cancelation of interviews
• Increase applicants' and programs' satisfaction with the interview process, while reducing stress
• The following outcomes must be monitored and should be unaffected:
  - The Match continues to have stable match rates within +/- 2% of 2020 rates (for example, 2020 PGY-1 match rates by group were: U.S. MD seniors (94%), U.S. DO seniors (91%), and IMG (61%))
  - The Match continues to have a stable pre-SOAP position fill rate of >93% and post-SOAP fill rate of >98%
  - The average number of ranked specialties remains stable (U.S. MD/DO ~1.2, IMG ~1.4)
Cons:
• Challenges accommodating unique groups, such as couples
• Increased numbers of applicants applying to multiple specialties
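One of the measurable targets above is a 20% reduction in the Gini coefficient of the interview distribution. As a point of reference only, that coefficient can be computed directly from counts of interviews attended per matched applicant; the following is a minimal Python sketch using hypothetical numbers, not data from the UGRC or the NRMP.

    def gini(counts):
        """Gini coefficient of interview counts: 0 means a perfectly even distribution,
        values near 1 mean interviews are concentrated among a few applicants.
        Uses the standard mean-absolute-difference formulation."""
        n = len(counts)
        total = sum(counts)
        if n == 0 or total == 0:
            return 0.0
        pairwise_differences = sum(abs(x - y) for x in counts for y in counts)
        return pairwise_differences / (2 * n * total)

    # Hypothetical illustration: a distribution in which a few applicants hold most
    # interview slots yields a higher coefficient than a flatter, capped distribution.
    print(gini([20, 18, 15, 4, 3, 2, 1]))  # concentrated distribution, higher value
    print(gini([10, 9, 9, 9, 9, 9, 8]))    # flatter distribution, lower value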
Relevant examples from the literature (if applicable):
1. Katsufrakis PJ, Uhler TA, Jones LD. The residency application process: Pursuing improved outcomes through better
understanding of the issues. Acad Med. 2016;91(11):1483-1487.
2. Frush BW, Byerley J. High-Value Interviewing: A Call for Quality Improvement in the Match Process. Acad Med.
2019;94(3):324-327.
3. Gruppuso PA, Adashi EY. Residency Placement Fever: Is It Time for a Reevaluation? Acad Med. 2017;92(7):923-926.
4. Hammoud MM, Standiford T, Carmody JB. Potential Implications of COVID-19 for the 2020-2021 Residency
Application Cycle. JAMA. 2020;324(1):29–30.
5. Burk-Rafel J, Standiford TC. A Novel Ticket System for Capping Residency Interview Numbers: Reimagining Interviews
in the COVID-19 Era. Acad Med. 2021;96(1):50-55.
Specic examples on how this recommendation might be implemented:
A centralized interview ticket system that would permit use of multiple interview scheduling platforms could be
created.
A single interview scheduling platform across all programs within each specialty, with caps built into the scheduling
platform could be implemented.
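As an illustration of the interview ticket concept described above, the sketch below (in Python, with hypothetical names and a hypothetical cap value; this is not a UGRC specification) shows how a scheduling platform could enforce a specialty-specific cap by issuing each applicant a fixed number of tickets, each of which is consumed when an interview is scheduled and returned if it is canceled.

    class TicketWallet:
        """Tracks an applicant's remaining interview tickets within one specialty."""

        def __init__(self, specialty_cap):
            self.remaining = specialty_cap   # evidence-based cap for the specialty
            self.scheduled = set()           # identifiers of scheduled programs

        def schedule(self, program_id):
            """Spend one ticket to schedule an interview; refuse once the cap is reached."""
            if self.remaining == 0 or program_id in self.scheduled:
                return False
            self.remaining -= 1
            self.scheduled.add(program_id)
            return True

        def cancel(self, program_id):
            """Return a ticket when an interview is canceled, freeing it for another program."""
            if program_id in self.scheduled:
                self.scheduled.remove(program_id)
                self.remaining += 1

    # Usage sketch: wallet = TicketWallet(specialty_cap=12); wallet.schedule("PROGRAM-001")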
Research questions:
1. Are there examples of modeling interview caps under different constraints?
2. What data exist regarding propensity for matching by applicant characteristics and numbers of interviews attended, including for individuals who do not match?
3. Conduct modeling to examine different interview cap numbers and their impact on applicant and program behaviors as measured by the intended outcomes.
Recommendation 25:
Early and ongoing specialty-specific resident assessment data should be automatically fed back to medical schools through a standardized process to enhance accountability and to inform continuous improvement of UME programs and learner handovers.
Narrative description of recommendation:
Instruments for feedback from GME to UME should be standardized and utilized to identify curricular gaps and inform program improvement. UME institutions should respond to the GME feedback on their graduates' performance in a manner that leads to quality improvement of the program.
This recommendation creates the ideal state for the UME-GME transition because:
The ideal state for the UME-GME transition is enhanced by a shared mental model on the continuum with standardized competencies. This shared model allows for a mutually understood learner handoff. Bidirectional information sharing enhances trust and accountability.
How this recommendation links to the fishbone diagrams used to develop the ideal state:
Feedback from GME to UME builds trust and enhances accountability. Additionally, this can illustrate curricular gaps to UME providers to allow for continuous quality improvement.
Implementation “must haves” include:
• Meaningful GME feedback to UME on all competency domains of their graduates’ performance
Implementation “nice to haves” include:
• Comparison between assessment from the GME side to the MSPE descriptions from the UME side
• Sharing data with the public to hold medical schools accountable
Pros:
• Build trust between GME and UME
Cons:
• Cost
• High stakes assessment and its effect on well-being
• Incorrect inference regarding data: medical school preparation is not the only thing that influences residency performance
Specic examples on how this recommendation might be implemented:
• Patient safety reports that are shared regarding hospitals
• Physician evaluation data that is now being shared online by the ACGME as milestone data is collected
• Board pass rate that is shared in evaluation of programs
Research questions:
• What examples exist of GME programs providing feedback to UME programs on how their graduates are doing? Specifically, are there examples of data from GME and beyond (such as board pass rates, in-training exams, patient safety reports, and quality of care data) being used by medical schools to inform their curricular needs?
Recommendation 26:
Develop a portfolio of evidence-based resident support resources for program directors, designated institutional officials (DIOs), and residency programs. These will be identified as salutary practices, and accessible through a centralized repository.
Narrative description of recommendation:
A centralized source of resident support resources will assist programs with effective approaches to address resident concerns. This will be especially relevant for competency-based remediation and resident wellbeing resources in the context of increased demand for support around the UME-GME transition. Access for programs and program directors will be low/no cost, confidential, and straightforward.
How this recommendation links to the fishbone diagrams used to develop the ideal state:
• Targets program director stress/limited resources
Pros:
• Decreased time and effort for program directors, improved well-being
• Easier for program directors to consider learners with prior difficulties or those viewed for other reasons as potentially at risk
• Consistent, standardized approach to recurrent resident issues is reassuring for learners/programs
• Centralized repository of resources to address unusual resident concerns will be helpful for smaller/community programs
• Cost efficiency
• Offload faculty development
• Normalizes asking for support early, for both learner and faculty
• Demonstration by health care systems of increased engagement in the well-being and success of their future workforce (culture change)
• When paired with student portfolios, would identify areas for immediate focus during the transition to residency
• A more resilient, well supported workforce that has had individualized coaching and mental health prioritized throughout the continuum
Cons:
• Risk establishing one organization as the primary source of resources, stifling innovation by other organizations/institutions
• Some resources still optimally managed at program/institution level (e.g., mental health collaborations, coaching), and this system may divert resources away from the local level
• If this portfolio is housed at university centers, which already have this infrastructure, this may worsen the dynamic of the existing inequitable power distribution
Relevant examples from the literature (if applicable):
• I.M. Emotional Support Hub, American College of Physicians: https://www.acponline.org/practice-resources/physician-well-being-and-professional-fulfillment/im-emotional-support-hub (including free therapists)
• Alliance for Academic Internal Medicine Program Resources:
  - Faculty development toolkit: https://www.im.org/resources/ume-gme-program-resources/faculty-development-resources
  - Wellness/Resiliency: https://www.im.org/resources/wellness-resiliency
  - Diversity Equity and Inclusion resources: https://www.im.org/resources/diversity-inclusion/dei-resources
Specic examples on how this recommendation might be implemented:
1. Create remediation resources aligned with specialty-specic competencies, some of which may be applicable
across specialties and developed based on a shared educational framework, with dened follow up plan. For
example, a video-based curriculum on communication barriers followed by quizzes that are scored centrally, with a
competency report sent to the program director when it is completed.
2. Create specialty-specic resident resources for issues of concern relevant to individual specialties
3. Develop well-being resources e.g creation of a wellness curriculum
4. Bring in online specialists who can meet with residents to assist with educational and psychological support,
including creating individual learning plans, providing following up, and checking in with the program director on
resident progress.
5. Launch national online resident support group meetings (similar to the sacred vocation program, art in medicine,
narrative medicine, etc.), that are oered across specialties, linked to the medical school of origin, or specialty society
Research questions:
1. Are there published examples of competency-based resident remediation portfolios or well-being resources,
especially examples that cross specialties?
2. How frequently are program directors using resources? Are they useful to program directors? To residents? Do they
improve learning outcomes? Clinical outcomes? Do they save time, reduce stress, or promote wellness among
program directors and residents?
3. Continuous quality improvement: can the outcomes of interventions be tracked anonymously in order to reassess the quality/effectiveness of the resources?
Recommendation 27:
Targeted coaching by qualified educators should begin in UME and continue during GME, focused on professional identity formation and moving from a performance to a growth mindset for effective lifelong learning as a physician. Educators should be attuned to the needs of the learner and equipped to support learners of all backgrounds.
Narrative description of recommendation:
Coaching can benet a student’s transition to become a master adaptive learner with a growth mindset. While this
transition should begin early in medical school, it should be complete by the time that the student moves from UME
to GME. If a learner does not transition to a growth mindset, their wellness and success will be compromised. The
addition of specic validated mentoring programs (e.g., Culturally Aware Mentoring) and formation of anity groups
to improve sense of belonging should be considered.
This recommendation creates the ideal state for the UME-GME transition because:
In the ideal state, graduated medical students will be ready to serve as physicians in training, with evolving professional identity formation that includes confident humility in skills. Through targeted coaching by trained faculty, new residents will also be challenged to identify areas for growth and gaps in their competency with a growth mindset appropriate for training. This coaching could also include tailored experiences for trainees based upon their existing skills and knowledge and identified gaps. Finally, effective coaching groups could help build community that can develop resilience necessary to thrive in residency.
How this recommendation links to the shbone diagrams used to develop the ideal state:
Transitioning from the role of a medical student into a licensed physician providing direct patient care requires significant evolution as a professional and an individual. This transition also carries threats of imposter syndrome and mistreatment as new graduates progress in their training.
Implementation “must haves” include:
• Naming and recognition of this phenomenon (i.e., growth mindset and the role of growth mindset in the setting of evaluation during medical school)
• Time and expectations for completion of this process: it should begin early in medical school and be complete by the time the learner begins residency
Implementation “nice to haves” include:
• Training for optimal coaching
• Ratios that allow meaningful relationships between coach and learner
• Career goal match
Pros:
• Learner centered focus with emphasis on professional identity formation
• Addresses imposter syndrome proactively
• Improves wellness of learners by creating a safe approach toward addressing areas for growth and developing resilience skills
• Potential for improved performance in residency
• Opportunity to develop UME-GME handoff through beginning coaching in UME during the post-match period
Cons:
• Expense in faculty/programmatic time for this investment
• GME educators already overextended
• Perceived by some as an unnecessary support of maturing learner/impression that learners should not need this support
Relevant examples from the literature (if applicable):
1. https://med.nyu.edu/departments-institutes/innovations-medical-education/research-scholarship/grants/transition-to-residency-advantage/coaches/coaching-curriculum
2. https://med.stanford.edu/peds/prospective-applicants/coaching.html
Specic examples on how this recommendation might be implemented:
Newly matched medical students could meet with trained UME faculty to discuss application of growth mindset to
training and reflect upon areas of strength and opportunities for improvement in their own training. Such sessions
could form the basis for development of individual learning plans. These coaching interactions could be new or built
into existing capstone rotations at medical schools.
Trained GME faculty could coach new residents as they navigate changes in professional identity and goal setting in
residency.
Research questions:
1. What is the impact of coaching focused on growth mindset and professional identity formation on early resident
performance, well-being, and burnout?
2. What are the effective practices of positive coaching relationships in the post-match period?
Citations:
1. Lovell B. What do we know about coaching in medical education? A literature review. Med Educ. 2018 Apr;52(4):376-
390. doi: 10.1111/medu.13482. Epub 2017 Dec 11. PMID: 29226349.
2. Wol M, Hammoud M, Santen S, Deiorio N, Fix M. Coaching in undergraduate medical education: a national survey.
Med Educ Online. 2020 Dec;25(1):1699765. PMID: 31793843; PMCID: PMC6896497.
Recommendation 28:
Specialty-specic, just-in-time training must be provided to all incoming rst-year residents, to support the transition
from the role of student to a physician ready to assume increased responsibility for patient care.
Narrative description of recommendation:
The intent of this recommendation is to level set incoming resident preparation regardless of medical school
experience. Recent research has shown that residents reported greater preparedness for residency if they
participated in a medical school “boot camp,” and participation in longer residency preparedness courses
was associated with high perceived preparedness for residency. This training must incorporate all six specialty
competency domains and be conducive to performing a baseline skills assessment. These curricula might be
developed by specialty boards, specialty societies, or other organized bodies. To minimize costs, specialty societies
could provide centralized recommendations and training could be executed regionally or through online modules.
This recommendation creates the ideal state for the UME-GME transition because:
All medical students will engage in specialty-aligned knowledge and skills training during the final year of medical school in order to achieve the defined general and specialty-focused competencies. Graduated medical students will be ready to serve as physicians in training, facile with the appropriate knowledge, skills, and efficiency, and equipped with an advancing professional identity and a confident humility.
How this recommendation links to the fishbone diagrams used to develop the ideal state:
Boot camps and skill building are not standardized and vary by program, medical school, and specialty.
The service needs and structure of the many GME programs create a situation in which there is little flexibility to make customized schedules tailored to learners' unique strengths and learning needs.
Implementation “must haves” include:
• Meaningful content in each competency area
Implementation “nice to haves” include:
• Nationally standardized curriculum
Pros:
• Emphasis on the non-medical knowledge pieces
• Can be utilized nationally for data sharing and public accountability after some experience
• Can be used for feedback to UME
Cons:
• Expense in time and money, especially on the GME side, which is already overstretched
Relevant examples from the literature (if applicable):
There are many outstanding models of transition to residency courses, including Association of Professors of
Gynecology and Obstetrics’ Right Resident, Right Program, Ready Day One.
Specic examples on how this recommendation might be implemented:
1. Specialty societies could provide centralized recommendations, and training could be executed regionally or
through online modules.
Research questions:
1. Is there any information in the published literature on intern performance/preparedness when participating in a boot
camp or other residency preparedness course?
2. Is there any information in the published literature on ideal length of residency preparedness courses?
3. Is there any information in the published literature on whether efficacy differs by the entity providing the boot camp/residency preparedness course (medical school, professional organization, or program)?
Recommendation 29:
Residents must be provided with robust orientation and ramp-up into their specific program at the start of internship. In addition to clinical skills and system utilization, content should include introduction to the patient population, known health disparities, community service and engagement, faculty, peers, and institutional culture.
Narrative description of recommendation:
Improved orientation to residency has the potential to enhance trainee wellbeing and improve patient safety.
Residents should have orientation that includes not only employee policies, but also education that optimizes their
success in their specic clinical environment. Residents, like other employees, should be paid for attending orientation.
This recommendation creates the ideal state for the UME-GME transition because:
In the ideal state, an equitable, coordinated, efficient, and transparent system across the UME-GME transition will progress learners from medical school to an ideal residency program that acknowledges the learner's unique strengths and learning needs, and will ensure optimized professional identity formation. Residency faculty will welcome each learner as an individual, knowing their strengths and weaknesses, and trusting their competency appropriately. Residency faculty and peers will recognize and mitigate bias to ensure optimal entrustment and support for all learners in an inclusive environment. The first months of the residency experience will be tailored to the individual trainee. Patients will be appropriately oriented to a clinical environment that includes learners. Feedback will be delivered from GME to UME to continually improve the preparatory process.
How this recommendation links to the fishbone diagrams used to develop the ideal state:
• Lack of flexibility to make customized schedules tailored to learner's unique strengths and learning needs
Implementation “must haves” include:
• Community building and familiarity with the clinical site and its resources
Implementation “nice to haves” include:
• A patient care, systems, and community service month to provide a thorough orientation with trainee formative assessments and a specialty-specific competency focus
Pros:
• Patient safety should be improved with a more robust orientation
• Improved resident wellness and sense of community
Cons:
• Expense
• Current calendar/block rotation constraints
Relevant examples from the literature (if applicable):
Some programs, such as those in family medicine and emergency medicine, have an entire month rotation of patient
care, systems, and community orientation before the intern is launched into regular block rotations.
Specic examples on how this recommendation might be implemented:
1. Many family medicine and emergency medicine programs have implemented an entire orientation month beginning
in July.
Research questions:
1. Is there any information in the published literature on the ideal length of residency orientation (1 month, 2 weeks, etc.)?
2. Is there any information in the published literature that includes the typical length of orientation by specialty?
Recommendation 30:
Meaningful assessment data based on performance after the MSPE must be collected and collated for each graduate, reflected on by the learner with an educator or coach, and utilized in the development of a specialty-specific, individualized learning plan to be presented to the residency program to serve as a baseline at the start of residency training.
Narrative description of recommendation:
Guided self-assessment by the learner is an important component in this process and may be all that is available for some international medical graduates. This recommendation provides meaning and importance for the assessment of experiences during the final year of medical school (and possibly practice for some international graduates), helps to develop the habits necessary for life-long learning, and holds students and schools accountable for quality senior experiences. It also uses the resources of UME to prepare an individualized learning plan (ILP) to serve as a baseline at the start of GME. This initial ILP will be refined by additional assessments envisioned as an "In-Training Examination" (ITE) experience early in GME. The time for this experience should be protected in orientation, and the feedback should be formative, similar to how most programs manage the results of ITEs. This assessment might occur in the authentic workplace and be based on direct observation, or might be accomplished as an Objective Structured Clinical Exam using simulation. This assessment should inform the learner's ILP and set the stage for the work of the clinical competency committee of the program.
This recommendation creates the ideal state for the UME-GME transition because:
Medical education is a continuum of learning. Even after the MSPE is provided to residency programs during the application process, students continue to develop mastery. Areas of strength and areas in need of special attention can be identified during the last year of medical school, and the resources within UME can be utilized to set the stage for the resident as learner in GME. This information can prove invaluable to both learners and their residency program faculty as learners seek to achieve entrustability and competency. Learners will come into residency with an ILP that can then be compared to early assessment based on the other five competencies, rather than just medical knowledge, to enhance trust between GME and UME, and to specify learner progress on the continuum.
How this recommendation links to the fishbone diagrams used to develop the ideal state:
Information sharing from UME to GME optimizes learning and allows for individualization of residency experiences.
Continuing to collect data during MS4 with the goal of creating an Individualized Learning Plan helps the student transition from a performance mindset to a growth mindset.
Early assessment in GME compared to UME descriptors will build trust between GME and UME.
Implementation “must haves” include:
• Standardized specialty-specific handover instrument to assess learner performance at the end of medical school and launch of residency ILP.
• Meaningful formative assessment to promote ILPs and growth mindset.
• Feedback to UME to enhance trust and confidence.
• Nationally standardized toolkit from specialty resources.
• Workplace assessment/direct observation with feedback.
• "Milestone Zero" measurement of all competency domains.
• Cost to learners must be minimized
Implementation "nice to haves" include:
• A "warm handoff" from UME educators to GME educators with the learner present might optimize this experience but may only be feasible if PD shares this responsibility with other program faculty.
Pros:
• More complete information transfer
• Enhance trust in the system
• Establish habits of lifelong learning
• Emphasis on all competencies
• Data could be fed back to med schools and/or used to support accountability to the public
Cons:
• Risk of perpetuated bias in evaluating performance
• Risk of lack of trust to enter a growth mindset with the ILP, as typically UME assessment is summative not formative
• Attribution bias to the med school when other factors could influence resident early performance
Relevant examples from the literature (if applicable):
• Many of the 3+3 programs' learners benefit from this already
1. Schiller JH, Burrows HL, Fleming AE, Keeley MG, Wozniak L, Santen SA. Responsible Milestone-Based Educational
Handover With Individualized Learning Plan From Undergraduate to Graduate Pediatric Medical Education. Acad
Pediatr. 2018 Mar;18(2):231-233. doi: 10.1016/j.acap.2017.09.010. Epub 2017 Sep 20. PMID: 28939503.
2. Morgan HK, Mejicano GC, Skochelak S, Lomis K, Hawkins R, Tunkel AR, Nelson EA, Henderson D, Shelgikar AV, Santen SA.
A Responsible Educational Handover: Improving Communication to Improve Learning. Acad Med. 2020 Feb;95(2):194-
199.
3. Warm EJ, Englander R, Pereira A, Barach P. Improving Learner Handovers in Medical Education. Acad Med. 2017
Jul;92(7):927-931.
Specic examples on how this recommendation might be implemented:
Schools may choose to assess their MS4 students using the ACGME Milestones format in order to inform the
transition to GME.
Graduating medical students, working together with their advisor, could produce an Individualized Learning Plan to
focus the learner’s experiences during the early part of their residency education.
Specialty societies may develop assessment tools and/or handover tools.
Research questions:
1. What is the current state of handoffs between medical schools and residency programs? Is the MSPE the last official communication (beyond the final transcript and degree verification)?
2. Does a framework for a student to develop an ILP currently exist? What is the evidence for its utility?
3. Is there any evidence that data obtained after submission of the MSPE is in any way different than that obtained before the MSPE?
Citations:
1. Chitkara MB, Satnick D, Lu W-H, Fleit H, Go RA, Chandran L. Can Individualized Learning Plans in an advanced clinical
experience course for fourth year medical students foster Self-Directed Learning? BMC Medical Education.
2016;16(1). doi:10.1186/s12909-016-0744-8
2. Schüttpelz-Brauns K, Karay Y, Gehlhar K, Arias J, Zupanic M. Comparison of the evaluation of formative assessment at two medical faculties with different conditions of undergraduate training, assessment and feedback. GMS Journal for Medical Education. 2020;37(4):1-23. doi:10.3205/zma001334
3. Tewksbury LR, Carter C, Konopasek L, Sanguino SM, Hanson JL. Evaluation of a National Pediatric Subinternship
Curriculum Implemented Through Individual Learning Plans. Academic Pediatrics. 2018;18(2):208-213. doi:10.1016/j.
acap.2017.11.009
4. Li S-TT, Tancredi DJ, Co JPT, West DC. Factors Associated with Successful Self-Directed Learning Using
Individualized Learning Plans During Pediatric Residency. Academic Pediatrics. 2010;10(2):124-130. doi:10.1016/j.
acap.2009.12.00720.
5. Svrakic M, Bent JP. Individualized Learning Plan (ILP) Is an Effective Tool in Assessing Achievement of Otology-related Subcompetency Milestones. Otology and Neurotology. 2018;39(7):816-822. doi:10.1097/MAO.0000000000001855
6. Schiller JH, Burrows HL, Fleming AE, Keeley MG, Wozniak L, Santen SA. Responsible Milestone-Based Educational
Handover With Individualized Learning Plan From Undergraduate to Graduate Pediatric Medical Education.
Academic Pediatrics. 2018;18(2):231-233. doi:10.1016/j.acap.2017.09.010
7. Shepard ME, Sastre EA, Davidson MA, Fleming AE. Use of individualized learning plans among fourth-year sub-
interns in pediatrics and internal medicine. Medical Teacher. 2012;34(1):e46-e51. doi:10.3109/0142159X.2012.638013
8. Lockspeiser TM, Kaul P. Using Individualized Learning Plans to Facilitate Learner-Centered Teaching. Journal of
Pediatric and Adolescent Gynecology. 2016;29(3):214-217. doi:10.1016/j.jpag.2015.10.020
Recommendation 31:
Anticipating the challenges of the UME-GME transition, schools and programs should ensure that time is protected,
and systems are in place, to guarantee that individualized wellness resources – including health care, psychosocial
supports, and communities of belonging – are available for each learner.
Narrative description of recommendation:
Given that the wellness of each learner significantly impacts learner performance, it is in the program and public's best interest to ensure the learner is optimally prepared to perform as a resident. There should be a focus on applying resources that are already available rather than depending on the creation of new resources. Examples of wellness resources include enrollment in health insurance, establishing care with a primary care provider and dentist, securing a therapist if appropriate, identifying local communities of belonging, and other supports that optimize wellbeing. These resources may especially benefit the most vulnerable trainees.
This recommendation creates the ideal state for the UME-GME transition because:
The diculty of moving to a new setting, often for ones rst employed position, is stressful. The transition from learner
to employee-learner is dicult, and ones individual wellness might not be a top priority. Easing that transition with
access to resources will enable individuals to focus on the clinical work and learning that is necessary as a resident.
How this recommendation links to the fishbone diagrams used to develop the ideal state:
• Logistics of transition
• Well-being
Implementation “must haves” include:
• Access to resources to support learner wellness
Implementation “nice to haves” include:
• One on one wellness coaching
Pros:
• Allows learner to focus on residency
Cons:
• Perceived by some as coddling of interns
• Cost may be a barrier
• GME programs are already overextended regarding resource utilization
Specic examples on how this recommendation might be implemented:
1. Vertical “families” for mentorship
2. Availability of wellness coaches
3. “Taking Care of our Own” programs
4. Anity groups
Research questions:
1. For interns who are starting a residency, what are the characteristics of excellent onboarding programs that
promote wellness?
2. How might the strength/quality of these programs be measured?
Recommendation 32:
Adequate and appropriate time must be assured between graduation and learner start of residency to facilitate this
major life transition.
Narrative description of recommendation:
The transition from medical school to residency typically marks a concrete transition from paying for education
to becoming a fulltime employee focused on the lifelong pursuit of professional improvement. This transition is life
changing for many. It often requires a move from one location to another, sometimes across the world. There must
be time for licensing and in some cases, visa attainment. Often this life transition is accompanied by other major life
events such as partnering or childbearing. Once residency starts, the learner may work many hours each week and
may have little time to establish a home. Thus, it is important for wellness and readiness to practice that adequate
time be provided to accomplish this major life transition.
The predictability of this transition must be recognized by both UME and GME institutions, and cooperation on both
sides is required for this transition to be accomplished smoothly. There is a desire to better prepare learners overall for the start of residency, and an assured transition time would allow related recommendations to be more easily accomplished.
This recommendation creates the ideal state for the UME-GME transition because:
The transition from medical school to residency is a major life transition, and those transitions are ideally accomplished
in a manner supporting wellness. Adequate but not excessive time for moving is built into the process in order to allow
time for a move if necessary and also to allow time for supports for health and well-being to be established before
residency starts. Ideally a supportive social network will be in place for each trainee, especially considering the needs
of those from underrepresented backgrounds, before residency starts.
How this recommendation links to the fishbone diagrams used to develop the ideal state:
• The logistics of the transition sometimes challenge the post-match optimization.
Implementation “must haves” include:
Required orientation for incoming interns must not begin until those interns have had enough time to establish their
homes.
Implementation “nice to haves” include:
• A minimum of 6 weeks between graduation and residency start would be ideal
Pros:
• Reasonable work-life integration for residents
Cons:
• Schools on a quarter system often have a late graduation.
• Universities are unlikely to move graduation for one of their schools.
• Many learners need a paycheck as soon as possible.
• Learners staying at their home institution may feel ready to start early
Specic examples on how this recommendation might be implemented:
1. Many programs already have this, but it is dependent on medical school graduation date.
Research questions:
1. The optimum time of transition between medical school and residency should be studied. It is influenced by many
individual factors, and these factors should be evaluated in an evidence-informed manner.
2. Does learner performance vary based on time between medical school graduation and internship orientation
start?
3. Does learner wellness vary based on time between medical school graduation and internship orientation start?
4. What do learners and program directors want regarding timing of the start of internship?
5. Are there any studies that look at the transition time between medical school graduation and orientation to
internship?
6. Is there any literature more broadly describing the ideal timing of the transition from schooling to work initiation?
Recommendation 33:
All learners need equitable access to adequate funding and resources for the transition to residency prior to residency
launch.
Narrative description of recommendation:
As almost every learner graduating from medical school transitions to residency, the need to fund a geographic move and establishment of a new home is predictable. This financial planning should be incorporated into medical school expenses, for example through equitable low interest student loans. Options to support the transitional expenses of international medical graduates should also be identified. These costs should not be incurred by GME programs.
This recommendation creates the ideal state for the UME-GME transition because:
In the ideal state, costs (financial, educational, patient care, well-being, and otherwise) will be rightsized throughout the process to maximize value. Life transitions will be accomplished in a manner supporting wellness. Financial challenges will be minimized. Learners will have adequate funding to establish their living arrangements in order to support a focus on their work in residency. Supports for health and well-being are established before residency starts.
How this recommendation links to the shbone diagrams used to develop the ideal state:
• Managing the logistics of transitioning is a root cause of some of the current challenges
• Adequate funding availability would allow logistics to be managed more smoothly.
Implementation “must haves” include:
• Moving and living costs through July 1 as part of the student loan/aid package for the graduating year
Implementation “nice to haves” include:
• None specied
Pros:
• Cost saving for students
Cons:
• Schools on a quarter system often have a late graduation.
• Most educational loans only cover the time the student is specifically registered in school.
Specic examples on how this recommendation might be implemented:
1. Some programs providesigning bonuses” or other means to cover relocation expenses. We do not recommend
that all programs do this because of the additional costs to GME.
Research questions:
1. Do changes in the transition to internship process lessen the financial burden on learners?
2. How much does the transition to internship process cost applicants?
3. How much does the typical incoming intern invest in personal startup costs (obtaining and establishing housing, funding the move, funding licensing, etc.) after match and before internship launch?
4. How do current interns dependent on financial aid support cover these expenses?
Recommendation 34:
There should be a standardized process throughout the United States for initial licensing at entrance to residency to
streamline the process of credentialing for both residency training and continuing practice.
Narrative description of recommendation:
To benet the public good, costs to support the U.S. healthcare workforce should be minimized. To this end, all medical
students should be able to begin licensure earlier in their educational continuum to better distribute the work burden
and costs associated with this predictable process. When learners are applying to programs in many dierent states,
the varied requirements are unnecessarily cumbersome. Especially for states where a training license is required,
the time between the Match and the start of the rst year of residency is often inadequate for this purpose. This is a
potential cost saving measure.
This recommendation creates the ideal state for the UME-GME transition because:
In the ideal state, licensing and credentialing will be accomplished efficiently for all learner groups. This should include visa management as needed and be accomplished in a timely manner without excessive cost.
How this recommendation links to the shbone diagrams used to develop the ideal state:
The varied state licensure requirements force learners to wait until after match to advance the work of establishing
their credentials for work.
The varied state requirements set up unnecessary barriers to practice readiness.
Implementation “must haves” include:
• Single process for U.S. graduates
• Single process for international graduates
Implementation “nice to haves” include:
• Earlier start to the licensing process than Match Day
Pros:
• Simplicity
• Clarity
• Cost effectiveness
Cons:
• Political feasibility
Specic examples on how this recommendation might be implemented:
• The Federation of State Medical Boards could work to align state requirements and establish a process.
Research questions:
1. Does a single licensing system lessen the burden on learners in the post-match period before internship start?
2. What are the barriers to a single licensing system for incoming interns in U.S. training programs?
3. Which states currently require a training license?
Appendix D: UGRC Preliminary Recommendations Released on April 26, 2021
The Coalition for Physician Accountability recommends the following, organized around 12 themes:
Theme: Oversight
Recommendation:
1. Convene a national ongoing committee to manage continuous quality improvement of the entire process
of the UME-GME transition, including an evaluation of the intended and unintended impact of implemented
recommendations.
Narrative description of recommendation:
One of the challenges in creating alignment and making improvements is the lack of a single body with
broad perspective over the entire continuum. This creates a situation where organizations and institutions
are unnecessarily and counterproductively isolated, without a shared mental model or mission. A convened
committee, that includes learner and public representatives, should champion continuous improvement to the
UME-GME transition, with the focus on the public good.
Recommendation:
2. Educators should develop a best-practice curriculum for UME career advising, including guidelines for
equitable curriculum delivery and outcomes.
Narrative description of recommendation:
Guidelines are needed to inform U.S. allopathic, osteopathic, or international medical schools in developing their career advising programs. Standardized approaches to advising along with career advisor preparation (both general and specialty-specific) can enhance the quality and quantity of advising and improve student trust in the advice that is received. Educators can enhance medical student career advising by developing formal guidelines with key recommendations based upon professional development frameworks and competencies. Implementation of such guidelines will result in greater consistency, thoroughness, effectiveness, standardization, and equity of medical school career advising programs to better support students in making career decisions and will lay the foundation for career planning across the continuum.
Recommendation:
3. A single, comprehensive electronic professional development career planning resource for students will
provide universally accessible, reliable, up-to-date, and trustworthy information and guidance.
Narrative description of recommendation:
The AAMC’s Careers in Medicine (CiM) platform achieves some of the aims of this recommendation. It is
recommended to examine the strengths and limitations of CiM, expanding the content and broadening access
to this resource, including to all students (MD, DO, IMG) at no cost, throughout their medical school training, or at
a minimum, at key career decision-making points, in order to support students’ professional development. The
comprehensive, interactive resource should address both clinical and non-clinical career paths. The public good
can be prioritized within this resource with content emphasis on workforce strategies to address the needs
of the public, including specialty selection and practice location. Links to specialty-specific medical student advising resources should also be incorporated.
Recommendation:
4. Advising about alternative career pathways should be available for those individuals who choose not to
pursue clinical careers. National career awareness databases such as Careers in Medicine should include
information on these alternative pathways.
Narrative description of recommendation:
The nancial and educational burden on learners is signicant, and advising of learners should include
alternative career pathways. This advice should be available to all learners, including students who choose
not to pursue a career in clinical medicine, students who go unmatched, as well as the struggling student who
may not be able to graduate from medical school. Centralized resources to support these eorts should be
developed and should also include information available to international medical graduates.
Recommendation:
5. Evidence-informed, general career advising resources should be available for all medical school faculty and staff career advisors, both domestic and international. General career advising should focus on students’ professionalization; inclusive practices such as valuing diversity, equity, and belonging; clinical and alternate career pathways; and meeting the needs of the public.
Narrative description of recommendation:
Centralized advising resources should reflect a common core, with supplemental information as needed. General advising should be differentiated from specialty-specific match advising or specialty recruiting. Advising tools should incorporate strengths-based approaches to career selection. The resources should include the option of non-clinical careers without stigma. Basic advising information should be created for all faculty who interact with students to promote common understanding of career advising, professional development, specialty selection, and application procedures; introduce the role of specialty-specific advisors as distinct from other faculty teachers; and minimize the sharing of outdated or incorrect information with students.
All advisors, both faculty and staff, who routinely perform general career advising should undergo a training process created as part of this resource development. Completing training and demonstrating needed knowledge and skill could lead to a certification as a trained general career advisor.
Recommendation:
6. To support evidence-informed, student-focused, specialty-specific advising for all medical students, advising resources should be available for and used by advisors, both domestic and international.
Narrative description of recommendation:
Creation of evidence-informed, data-driven, specialty-specific resources for advisors will fill an information gap and increase the transparency and reliability of information shared with students. Guidance contained in the resources can support faculty in managing or eliminating conflicts of interest related to recruiting students to the specialty, advising for the Match, and advocating for students in the Match. Resources should also assist UME programs in supporting the unique needs of traditionally underrepresented, disadvantaged, and marginalized student groups. Basic advising information should be created for all faculty who interact with students to promote common understanding of career advising, professional development, specialty selection, and application procedures; emphasize the role of specialty-specific advisors as distinct from other faculty teachers; and minimize the sharing of outdated or incorrect information with students.
All advisors, both faculty and staff, who routinely perform specialty-specific advising should undergo a training process, created as part of this resource development, that includes equity in advising and mitigation of bias. Completing training and demonstrating needed knowledge and skill could lead to a certification as a trained specialty-specific advisor.
Theme: Competencies and Assessments
Recommendation:
7. UME and GME educators, along with representatives of the full educational continuum, should jointly define and implement a common framework and set of outcomes (competencies) to apply to learners across the continuum from UME to GME.
Narrative description of recommendation:
A shared mental model of competence facilitates agreement on the assessment strategies used to evaluate a learner’s progress in those competencies and the inferences that can be made from assessments. Shared outcomes language can convey information on learner competence with the patient/public trust in mind. For individual learners, defining these outcomes will facilitate learning and may promote a growth mindset. For faculty, defining outcomes will allow for the use of assessment tools aligned with performance expectations and faculty development. For residency programs, defining outcomes will be useful throughout resident selection and learner handovers from UME, resident training, and resident preparation for practice.
Recommendation:
8. The UME community, working in conjunction with partners across the continuum, must commit to using
robust assessment tools and strategies, improving upon existing tools, developing new tools where needed,
and gathering and reviewing additional evidence of validity.
Narrative description of recommendation:
Educators from across the education continuum should use the shared competency outcomes language to guide the development or use of assessment tools and strategies that can be used across schools to generate credible, equitable, value-added, competency-based information. Assessment information could be shared in residency applications and a post-match learner handover. Licensing examinations should be used for their intended purpose to ensure requisite competence.
Recommendation:
9. Using the shared mental model of competency and assessment tools and strategies, create and implement
faculty development materials for incorporating competency-based expectations into teaching and
assessment.
Narrative description of recommendation:
Faculty must understand the purpose of outcomes-focused education, the specific language used to define competence, and how to mitigate biases when assessing learners. They must understand the purpose and
use of each assessment tool. The intensity and depth of faculty development can be tailored to the amount
and type of contact that individual faculty have with students. Clerkship directors, academic progress
committees, student competency committee members, and other educational leaders require more in-
depth understanding of the assessment system and how determinations of readiness for advancement
are made. This faculty development requires centralized electronic resources and training for trainers within
institutions. Review of training materials, and completion of any required activities to document review and/or
understanding, should be required on a regular basis to be determined by the development group.
Recommendation:
10. A convened group including UME and GME educators should reconsider the content and structure of the MSPE as new information becomes available in order to improve access to longitudinal assessment data about applicants. Short-term improvements should include structured data entry fields with functionality to enable searching.
Narrative description of recommendation:
The development of UME competency outcomes that apply across learners and the continuum is essential to decreasing the reliance on board scores in the evaluation of residency applicants. These will take time to develop and implement and may be developed at different intervals. As new information becomes available to improve applicant data, the MSPE should be utilized to improve longitudinal applicant information. In addition, improvements in the MSPE, such as structured data entry fields with functionality to enable searching, should be explored.
Recommendation:
11. Meaningful assessment data based on performance after the MSPE must be collected and collated for each graduate, reflected on by the learner with an educator or coach, and utilized in the development of a specialty-specific individualized learning plan to be presented to the residency program for continued utilization during training. Guided self-assessment by the learner is an important component in this process and may be all that is available for some international medical graduates.
Narrative description of recommendation:
This recommendation provides meaning and importance for the assessment of experiences during the final
year of medical school (and possibly practice for some international graduates), helps to develop the habits
necessary for life-long learning, and holds students and schools accountable for quality senior experiences.
It also uses the resources of UME to prepare an individualized learning plan (ILP) for interns to be utilized in the
handover.
Recommendation:
12. Targeted coaching by qualified educators should begin in UME and continue during GME, focused on professional identity formation and moving from a performance to a growth mindset for effective lifelong learning as a physician. Educators should be attuned to the needs of the learner and be equipped to provide assistance to learners of all backgrounds.
Narrative description of recommendation:
Coaching can benet a student’s transition to become a master adaptive learner with a growth mindset.
While this transition should begin early in medical school, it should be complete by the time that the student
moves from UME to GME. If a learner does not transition to a growth mindset their wellness and success will be
compromised. Consider adding specic validated mentoring programs (e.g., Culturally Aware Mentoring) and
formation of anity groups to improve sense of belonging.
Recommendation:
13. Structured Evaluative Letters (SELs) should replace all Letters of Recommendation (LOR) as a universal tool in
the residency program application process.
Narrative description of recommendation:
A Structured Evaluative Letter, which would include specialty-specific questions, would provide knowledge from the evaluator on student performance that was directly observed, rather than a narrative recommendation.
The template should be based on an agreed upon set of core competencies and allow equitable access to
completion for all candidates. The SEL should be based on direct observation and must focus on content that
the evaluator can complete. Faculty resources should be developed to improve the quality of the standardized
evaluation template and decrease bias.
Recommendation:
14. Convene a workgroup of educators across the continuum to begin planning for a dashboard/portfolio to
collect assessment data in a standard format for use during medical school and in the residency application
process. This will enable consistent and equitable information presentation during the residency application
process and in a learner handover.
Narrative description of recommendation:
Key features of a dashboard/portfolio in the UME-GME transition, and across the continuum, should include
competency-based information that aligns with a shared mental model of outcomes, clarity about how
and when assessment data were collected, and narrative data that uses behavior-based and competency-
focused language. A mechanism for learner reflections and learning goals should be included. Dashboard
development will require careful attention to equity and minimizing harmful bias, as well as a focus on the
competencies and measurements that predict future performance with patients. Transparency with students
about the purpose, use, and reporting of assessments, as well as attention to data access and security, will be
essential.
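To make the idea of a standardized, competency-based record more concrete, the sketch below lists the kinds of fields a dashboard/portfolio entry might carry (competency, assessment tool, setting, date, narrative, learner reflection). The field names and structure are illustrative assumptions only, not a proposed specification.

```python
# Hypothetical sketch of one standardized assessment record for a learner
# dashboard/portfolio; field names and values are illustrative only.
from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class AssessmentRecord:
    learner_id: str           # de-identified where appropriate for data security
    competency: str           # drawn from the shared competency framework
    assessment_tool: str      # which tool generated the observation
    setting: str              # clinical context in which the data were collected
    observed_on: date         # when the assessment occurred
    entrustment_level: int    # ordinal rating defined by the framework
    narrative: str            # behavior-based, competency-focused language
    learner_reflection: str = ""   # learner's own reflection and learning goals

@dataclass
class LearnerPortfolio:
    learner_id: str
    records: List[AssessmentRecord] = field(default_factory=list)

    def by_competency(self, competency: str) -> List[AssessmentRecord]:
        """Return the longitudinal trail of observations for one competency."""
        return sorted(
            (r for r in self.records if r.competency == competency),
            key=lambda r: r.observed_on,
        )
```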
Theme: Away Rotations
Recommendation:
15. Convene a workgroup to explore the multiple functions and value of away rotations for applicants, medical schools, and residency programs. Specifically, consider the goals and utility of the experience, the impact of these rotations, and issues of equity, including accessibility, assessment, and opportunity for students from groups underrepresented in medicine and financially disadvantaged students.
Narrative description of recommendation:
Away rotations can be cost-prohibitive, yet they may allow a student to get to know a program, its health system, and surrounding community. Some programs rely on away rotations to showcase their unique strengths in order to attract candidates. Given the multifactorial and complex role that away rotations fulfill, a committee should be convened to conduct a thorough and comprehensive review of the costs versus benefits of away rotations, followed by recommendations from that review. Non-traditional methods of conducting and administering away rotations should be explored (e.g., offering virtual away rotations, waiving application fees, or offering away stipends, particularly for financially disadvantaged students).
Theme: Diversity, Equity, and Inclusion (DEI) in Medicine
Recommendation:
16. To raise awareness and facilitate adjustments that will promote equity and accountability, demographic
information of applicants (race, ethnicity, gender identity/expression, sexual identity/orientation, visa status, or
ability) should be measured and reported to key stakeholders, including programs and medical schools, in real
time throughout the UME-GME transition.
Narrative description of recommendation:
Inequitable distribution of applicants among specialties is not in the best interest of programs, applicants, or
the public good. Bias can be present at any level of the UME-GME transition. A decrease in diversity at any point
along the continuum provides an important opportunity to intervene and potentially serve the community in
more productive ways. An example of accountability and transparency in an inclusive environment across the
continuum is a diversity dashboard for residency applicants. A residency program that finds bias in its selection process (perhaps due to an Alpha Omega Alpha filter) could go back in real time to find qualified applicants who may have been missed, potentially improving outcomes.
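To illustrate how real-time demographic reporting could surface the kind of filter-related drop-off described above, the sketch below compares group representation in an applicant pool before and after a screening step and flags large relative declines. The group labels, counts, and the 20% alert threshold are hypothetical and are not drawn from the UGRC's work.

```python
# Illustrative sketch: flag selection stages where a demographic group's
# representation drops sharply relative to the applicant pool. All numbers
# and the 20% alert threshold are hypothetical.
def representation(counts):
    total = sum(counts.values())
    return {group: n / total for group, n in counts.items()}

def flag_dropoff(applicants, after_filter, threshold=0.20):
    """Return groups whose share shrank by more than `threshold` (relative)."""
    before = representation(applicants)
    after = representation(after_filter)
    return [
        group
        for group in before
        if before[group] > 0
        and (before[group] - after.get(group, 0.0)) / before[group] > threshold
    ]

applicant_pool = {"Group A": 120, "Group B": 60, "Group C": 20}
post_filter    = {"Group A": 75,  "Group B": 20, "Group C": 5}

print(flag_dropoff(applicant_pool, post_filter))  # ['Group B', 'Group C']
```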
Recommendation:
17. Specialty-specic best practices for recruitment to increase diversity across the educational continuum
should be developed and disseminated to program directors, residency programs, and institutions.
Narrative description of recommendation:
Recognizing that program directors, programs, and institutions have wide variability in goals, definitions, and community needs for increasing diversity, shared resources should be available for mission-aligned entities, with specialty-specific contributions including successful strategies and ongoing challenges. This recommendation is intended for specialty organizations to specifically address diversity, equity, and inclusion and specialty-specific disparities in recruitment.
Recommendation:
18. In order to eliminate systemic biases in grading, medical schools must perform initial and annual exploratory
reviews of clinical clerkship grading, including patterns of grade distribution based on race, ethnicity, gender
identity/expression, sexual identity/orientation, visa status, ability, and location (e.g., satellite or clinical site location),
and perform regular faculty development to mitigate bias. Programs across the UME-GME continuum should
explore the impact of bias on student and resident evaluations, match results, attrition, and selection to honor
societies, such as Alpha Omega Alpha and the Gold Humanism Honor Society.
Narrative description of recommendation:
Recognizing that inherent biases exist in clinical grading and assessment in the clinical learning environment,
each UME and GME program must have a continuous quality improvement process for evaluating bias in
clinical grading and assessment and the implications of these biases, including honor society selection. This
recommendation is intended to mitigate bias based on clinical grading, transcript notations, MSPE reflections of
remediation, and residency evaluations that may be influenced by bias.
Recommendation:
19. A committee must be formed to explore the growing number of unmatched physicians in the context of a
national physician shortage, including root causes, and disparities in unmatched students based on specialty,
demographic factors, and grading systems. The committee should report on data trends, implications, and
recommended interventions.
Narrative description of recommendation:
The growing number of unmatched physicians necessitates analysis and strategic planning to address root
causes. This analysis should include demographic data to examine diversity, specialty disparities in unmatched
students, number of applications, grading systems, participation in SOAP, post-SOAP unmatched candidates,
and match rate in subsequent years of re-entering the Match pool. This recommendation is intended to
urge UME programs and institutions to adopt a continuous quality improvement approach by reviewing unmatched graduates for specialties, demographics, number of programs applied to, and clinical grading; to offer alternative pathways; and to add faculty development for clinical advising. Ideally, shared resources and innovation across the continuum would be identified and disseminated.
Theme: Application Process
Recommendation:
20. A comprehensive database with verifiable residency program information should be available to all applicants, medical schools, and residency programs, at no cost to the applicants.
Narrative description of recommendation:
Veriable and trustworthy residency program information should be developed and made available in an
easily accessible database to all applicants. Information for the database should be directly collected and
sources should be transparent. Data must be searchable and allow for data analytics to help with program
decision making (e.g., allowing applicants to input components of their individual application to identify
programs with similar current residents).
Recommendation:
21. Create a widely accessible, authoritative, reliable, and searchable dataset of characteristics of individuals
who applied, interviewed, were ranked, and matched for each GME program/track to be used at no cost by
applicants, and by their advisors. Sort data according to medical degree, demographics, geography, and other
characteristics of interest.
Narrative description of recommendation:
The Residency Explorer tool currently allows applicants to compare their characteristics to those of recent
residents attending each GME program. These data could be made more robust by providing users with more detailed information about each program’s selection process. Each program’s interviewed or ranked applicants reflect the program’s desired characteristics more accurately than the small proportion of applicants the program matches. Applicants and advisors should be able to sort the information according to demographic and educational features that may significantly impact the likelihood of matching at a program (e.g., geography, scores, degree, visa status, etc.).
Recommendation:
22. To optimize utility, discrete fields should be available in the existing electronic application system for both narrative and ordinal information currently presented in the MSPE, personal statement, transcript, and letters. Fully using technology will reduce redundancy, improve comprehensibility, and highlight the unique characteristics of each applicant.
Narrative description of recommendation:
Optimally, each applicant will be reviewed individually and holistically to evaluate merit. However, some
circumstances may require rapid review. The 2020 NRMP program directors’ survey found that only 49% of
applications received an in-depth review. The application system should utilize modern technology to maximize
the likelihood that applications are evaluated in a way that is holistic, mission-based, and equitable.
Currently, applications are assessed based on the information that is readily available, which may place undue
emphasis on scores, geography, medical school, or other factors that perpetuate bias. Adding concrete data
gives an opportunity for applicants to demonstrate their strengths in a way that is user-friendly for program
directors. Maximizing the amount of accurate information readily available in the application will increase
capacity for holistic review of more applicants and improve trust during the UME to GME transition. Although
not all schools and programs will align on which information should be included, areas of agreement should be
found and emphasized.
Recommendation:
23. Filter options available to programs for sorting applicants within the application system should be carefully created and thoughtfully reviewed to ensure each one detects meaningful differences among applicants and promotes review based on mission alignment and likelihood of success at a program.
Narrative description of recommendation:
Residency programs receive more applications than they can meaningfully review, and applications may lack details that would help to differentiate between similar candidates. For this reason, filters are sometimes used to identify candidates that meet selection criteria. However, some commonly used filters may exclude applicants who are not meaningfully different from ones who are included. All applications should be evaluated fairly, independent of software idiosyncrasies. Each filter that is offered should align with the missions and requirements of residency programs. Filters with known bias (such as honor society and score filters) should be carefully monitored, especially as score reporting changes put some applicants at risk of inequitable consideration due to the timing of their test administration.
Recommendation:
24. To promote equitable treatment of applicants regardless of licensure examination requirements, comparable exams with different scales (COMLEX-USA and USMLE) should be reported within the ERAS filtering system in a single field.
Narrative description of recommendation:
Osteopathic medical students make up 25% of medical students in U.S. schools, and these students are required to complete the COMLEX-USA examination series for licensure. Residency programs may filter out applicants based on their USMLE score, leading many osteopathic medical students to sit for the USMLE series. This creates a substantial increase in cost, time, and stress for osteopathic students who believe duplicate testing is necessary to be competitive in the Match. A combined field should be created in ERAS that normalizes the scores between the two exams and allows programs to filter based only on the single normalized score. This will mitigate structural bias and reduce financial and other stress for applicants.
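The recommendation does not specify how the combined field would be computed. One commonly discussed approach is percentile linking of each score against that exam's own reference distribution; the sketch below illustrates that idea with invented cohort scores and is not an endorsed or official equating method for COMLEX-USA and USMLE.

```python
# Illustrative sketch only: one way a combined, normalized score field could be
# computed, using percentile linking. The cohort score distributions below are
# hypothetical placeholders, not actual COMLEX-USA or USMLE data.
from bisect import bisect_right

def percentile_rank(score, sorted_scores):
    """Fraction of the reference cohort scoring at or below `score`."""
    return bisect_right(sorted_scores, score) / len(sorted_scores)

# Hypothetical reference cohorts (in practice, national score distributions).
usmle_cohort = sorted([215, 225, 230, 238, 242, 248, 252, 255, 260, 265])
comlex_cohort = sorted([480, 510, 530, 555, 570, 590, 610, 630, 650, 680])

def normalized_score(score, exam):
    """Map either exam onto a common 0-100 percentile scale."""
    cohort = usmle_cohort if exam == "USMLE" else comlex_cohort
    return round(100 * percentile_rank(score, cohort), 1)

print(normalized_score(248, "USMLE"))    # 60.0 in this invented cohort
print(normalized_score(610, "COMLEX"))   # 70.0 in this invented cohort
```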
Theme: Interviewing
Recommendation:
25. Develop and implement standards for the interview offer and acceptance process, including timing and methods of communication, for both learners and programs, to improve equity and fairness, to minimize educational disruption, and to improve wellbeing.
Narrative description of recommendation:
The current process of extending interview offers and scheduling interviews is unnecessarily complex and onerous, and there is little to no regulation of this process. Applicant stress and loss of rotation education while attempting to conform to some processes (e.g., obsessively checking emails to accept short-timed interview offers) can be improved by implementing process improvements to the application platform, policies, and procedures. Developing a common interview offering/scheduling platform and setting policies for this platform, such as preventing residency programs from over-offering or over-scheduling interviews and from setting inappropriately short time-based applicant reply windows, would result in important improvements.
Recommendation:
26. Interviewing should be virtual for the 2021-2022 residency recruitment season. To ensure equity and fairness, there should be ongoing study of the impact and benefits of virtual interviewing as a permanent means of interviewing for residency.
Narrative description of recommendation:
Virtual interviewing has been a phenomenal change for controlling applicant expenses. With the elimination of travel, students have been able to dedicate more time to their clinical education. Due to the risk of inequity with hybrid interviewing (virtual and in-person interviews occurring in the same year or same program), all interviews should be conducted virtually for the 2021-2022 season. The committee also recommends a thorough exploration of the data around virtual interviewing. Candidate accessibility, equity, match rates, and attrition rates should be evaluated. Residency program feedback from multiple types of residencies should be explored. In addition, separating applicant and program rank order list deadlines in time should be explored, as this would allow students to visit programs without pressure and minimize influence on a program’s rank list.
Recommendation:
27. Implement a centralized process to facilitate evidence-based, specialty-specific limits on the number of interviews each applicant may attend.
Narrative description of recommendation:
Identify evidence-based, specialty-specific interview caps, envisioned as the number of interviews an applicant attends within a specialty above which further interviews are not associated with significantly increased match rates, across all core applicant types. Standardize the interview offer, acceptance, and scheduling workflow. Create a centralized process to operationalize interview caps, which could include an interview ticket system or a single scheduling platform.
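Because the cap is defined empirically (the interview count beyond which additional interviews no longer raise match rates meaningfully), one hedged illustration of how it might be estimated from historical outcomes is sketched below. The outcome data and the one-percentage-point threshold are invented for the example.

```python
# Illustrative sketch: estimate a specialty-specific interview cap as the
# smallest interview count after which the next marginal gain in match rate
# falls below a chosen threshold. Data and threshold are hypothetical.
def match_rate_by_interviews(outcomes):
    """outcomes: list of (num_interviews, matched_bool) for one specialty."""
    totals, matched = {}, {}
    for n, did_match in outcomes:
        totals[n] = totals.get(n, 0) + 1
        matched[n] = matched.get(n, 0) + int(did_match)
    return {n: matched[n] / totals[n] for n in sorted(totals)}

def suggest_cap(rates, min_gain=0.01):
    """Return the smallest count after which the next marginal gain < min_gain."""
    counts = sorted(rates)
    for prev, curr in zip(counts, counts[1:]):
        if rates[curr] - rates[prev] < min_gain:
            return prev
    return counts[-1]

# Hypothetical historical outcomes: (interviews attended, matched?)
history = [(4, False), (4, True), (6, True), (6, True), (8, True),
           (8, True), (10, True), (10, True), (12, True), (12, True)]
rates = match_rate_by_interviews(history)
print(rates)               # {4: 0.5, 6: 1.0, 8: 1.0, 10: 1.0, 12: 1.0}
print(suggest_cap(rates))  # 6
```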
Theme: Matching Process
Recommendation:
28. To promote holistic review and efficiency, utilize the best available modeling and data to redesign the mechanics of the residency application process. The redesigned process – such as an optional early decision application cycle and binding match – must reduce application numbers while concentrating applicants at programs where mutual interest is high.
Narrative description of recommendation:
Application inflation is a root cause of the current dysfunction in the UME-GME transition. The current high cost
of the application process (to applicants and program directors) does not serve the public good. The 2020
NRMP program director survey found that only 49% of applications received an in-depth review. An unread
application represents wasted cost to the applicants, and doubling the resources available for review is not practical. Optimal career advising may not be sufficient to reduce application numbers in the context of a very
high stakes process. Despite increased transparency in characteristics of matched applicants, the number of
applications per applicant continues to rise.
Following careful review of all available data and modeling information, one of several potential options must be
taken to reduce the number of applications submitted per position. Outcomes must be carefully monitored. For
example, a new optional “early decision” application cycle and binding match is envisioned where applicants
may apply in only one specialty, and application numbers and available positions are constrained. An iterative,
continuous quality improvement approach is envisioned that begins relatively conservatively, and is adjusted
annually as needed, based on process and outcome measures (i.e., stakeholder experience, match rate, rank
list position to match for both applicants and programs, equity for underrepresented groups and programs). An
early match may be preferable to other interventions, especially if a conservative initial approach is used, to limit
legal challenges and impact on special populations.
Theme: Faculty Support Resources
Recommendation:
29. Develop a portfolio of evidence-based resident support resources for program directors (PDs), designated institutional officials (DIOs), and residency programs. These will be identified as best practices and accessible through a centralized repository.
Narrative description of recommendation:
A centralized source of resident support resources will assist programs with effective approaches to address resident concerns. This will be especially relevant for competency-based remediation and resident wellbeing resources in the context of increased demand for support around the UME-GME transition. Access for programs and program directors will be low- or no-cost, confidential, and straightforward.
Recommendation:
30. Educators across the continuum must receive faculty development regarding anti-racism; avoiding bias;
and improving equity in student and resident recruitment, mentorship and advising, teaching, and assessment.
Narrative description of recommendation:
Avoiding bias and improving racial equity are essential skills for faculty in today’s teaching environment. Many faculty lack these skills, and that gap perpetuates health disparities, lack of diversity, and learner mistreatment. This faculty development must be longitudinal and repeated annually.
Theme: Post-Match Transition to Residency
Recommendation:
31. Anticipating the challenges of the UME-GME transition, schools and programs should protect time and put systems in place to ensure that individualized wellness resources – including health care, psychosocial supports, and communities of belonging – are available for each learner.
Narrative description of recommendation:
Given that the wellness of each learner significantly impacts learner performance, it is in the program’s and the public’s best interest to ensure the learner is optimally prepared to perform as a resident. This should be focused on applying resources that are already available and not dependent on the creation of new resources. Examples of wellness resources include: enrollment in insurance, establishing care with a primary care provider and dentist, securing a therapist if appropriate, identifying local communities of belonging, and other supports that optimize wellbeing. These resources may especially benefit the most vulnerable trainees.
Recommendation:
32. Using principles of inclusive excellence, program directors, programs, and institutions must incorporate activities in diversity, equity, and inclusion for faculty, residents, and staff, beginning in orientation and continuing on an ongoing basis, in order to promote belonging, eliminate bias, and provide social support.
Narrative description of recommendation:
Recognizing that the ACGME Common Program Requirements already have specific requirements in this area, this recommendation is intended to specifically state how important it is to address issues related to DEI for all members of the educational community.
Recommendation:
33. Specialty-specic, just-in-time training must be provided to all incoming rst-year residents, to support the
transition from the role of student to a physician ready to assume increased responsibility for patient care.
Narrative description of recommendation:
The intent of this recommendation is to level set incoming intern performance regardless of medical school
experience. Recent research has shown that residents reported greater preparedness for residency if they
participated in a medical school “boot camp,” and participation in longer residency preparedness courses
was associated with high perceived preparedness for residency. This training must incorporate all six specialty
milestone domains and be conducive to performing a baseline skills assessment. These curricula might be
developed by specialty boards, specialty societies, or other organized bodies. To minimize costs, specialty
societies could provide centralized recommendations and training could be executed regionally or through
online modules.
Recommendation:
34. Residents must be provided with robust orientation and ramp-up into their specific program at the start of internship. In addition to clinical skills and system utilization, content should include introduction to the patient population, known health disparities, community service and engagement, faculty, peers, and institutional culture.
Narrative description of recommendation:
Improved orientation to residency can enhance trainee well-being and improve patient safety. Residents should have orientation that includes not only employee policies but also education that optimizes their success in their specific clinical environment. Residents, like other employees, should be paid for attending orientation.
Recommendation:
35. A specialty-specic, formative, competency-based assessment that informs the learner’s individualized
learning plan (ILP) must be performed for all learners as a baseline at the start of internship.
Narrative description of recommendation:
An assessment of learner competence must be deployed at the start of internship to assess the competencies outside of medical knowledge in a specialty-specific manner. This assessment should be managed by the GME side to ensure authentic assessment and to provide feedback to UME agencies. This assessment must incorporate the five specialty milestone domains beyond medical knowledge. This assessment might be developed by specialty boards, specialty societies, or other organized bodies. Cost to students must be minimized.
This is envisioned as an “In-Training Examination” (ITE) experience early in internship that is based on the five specialty milestone domains beyond medical knowledge. The time for this experience should be protected in orientation, and the feedback should be formative, similar to how most programs manage the results of ITEs. This assessment might occur in the authentic workplace and be based on direct observation, or might be accomplished as an Objective Structured Clinical Exam using simulation. This assessment should inform the learner’s ILP and set the stage for the work of the clinical competency committee of the program.
Recommendation:
36. Early and ongoing specialty-specific resident assessment data should be automatically fed back to medical schools through a standardized process to enhance accountability and continuous improvement of UME programs and learner handovers.
Narrative description of recommendation:
Instruments for feedback from GME to UME should be standardized and utilized to identify curricular gaps and inform program improvement. UME institutions should respond to the GME feedback on their graduates’ performance in a manner that leads to quality improvement of the program.
Recommendation:
37. Adequate and appropriate time must be assured between graduation and the learner’s start of residency to facilitate this major life transition.
Narrative description of recommendation:
The transition from medical school to residency typically marks a concrete transition from paying for one’s education to becoming a full-time employee focused on one’s lifelong pursuit of improvement in one’s occupation. This transition is life changing for many. It often requires a move from one location to another, sometimes across the world. There must be time for licensing and, in some cases, visa attainment. Often this life transition is accompanied by other major life events such as partnering or child-bearing. Once residency starts, the learner may work many hours each week and may have little time to establish a home. Thus, it is important for wellness and readiness to practice that adequate time be provided to accomplish this major life transition.
The predictability of this transition must be recognized by both UME and GME institutions, and cooperation on both sides is required for this transition to be accomplished smoothly. There is a desire to better prepare learners overall for the start of residency, and an assured transition time would allow related recommendations to be more easily accomplished.
Recommendation:
38. All learners need equitable access to adequate funding and resources for the transition to residency prior to
internship launch.
Narrative description of recommendation:
As almost every learner graduating from medical school transitions to internship, the need to fund a geographic move and establishment of a new home is predictable. This financial planning should be incorporated into medical school expenses, for example through equitable, low-interest student loans. Options to support the transitional expenses of international medical graduates should also be identified. These costs should not be incurred by GME programs.
Theme: Policy Implications
Recommendation:
39. There should be a standardized process throughout the United States for initial licensing at entrance to
residency in order to streamline the process of credentialing for both residency training and continuing practice.
Narrative description of recommendation:
To benet the public good, costs to support the U.S. healthcare workforce should be minimized. To this end, all
medical students should be able to begin licensure earlier in their educational continuum to better distribute the
work burden and costs associated with this predictable process. When learners are applying to match in many
dierent states the varied requirements are unnecessarily cumbersome. Especially for states where a training
license is required, the time between Match Day and start of internship is often not long enough to manage this
process This is a potential cost saving measure.
Recommendation:
40. Recommend to the U.S. Centers for Medicare and Medicaid Services (CMS) that they change the current
GME funding structure so that the Initial Residency Period (IRP) is calculated starting with the second year of
postgraduate training. This will allow career choice reconsideration, leading to improved resident wellbeing and positive effects on the physician workforce.
Narrative description of recommendation:
Given the timing of the residency recruiting season and the Match, students have limited time to definitively
establish their specialty choice. If a resident decides to switch to another program or specialty after beginning
training, because of the IRP the hospital may not receive full funding and thus be far less likely to approve such a
change. The knowledge that residents usually only have one chance to choose a specialty path increases the
pressure on the entire UME-GME transition. Furthermore, educational innovation is limited without flexibility for
time-variable training.
Theme: Research Questions
Recommendation:
41. To guide future improvements in resident selection and transition, conduct research to understand which residency applicant characteristics, residency curriculum experiences, and learning environment factors are most likely to translate into physicians who fulfill the specialty-specific physician workforce needs of the public (e.g., primary care, demographics, geographic distribution).
Narrative description of recommendation:
Graduates of U.S. medical schools fill many residency positions, which means GME will be limited by the
decisions made by medical school admissions committees. However, non-U.S. graduates are also considered
at many programs, providing an opportunity to serve the public good. Additional research is needed to help
program directors understand which applicant characteristics are useful indicators to address on-going
medical workforce issues. Further changes to the transition should be informed by evidence whenever
possible.
Recommendation:
42. Build consensus around the components of a successful recruitment cycle, utilizing input from all
stakeholders. Identify which characteristics of applicants and programs predict a successful recruitment cycle
outcome.
Narrative description of recommendation:
Currently, the medical education community lacks a shared mental model of what constitutes a successful transition from UME to GME, and also of what factors predict that success. The lack of agreement leads to conflict over the content of applications as well as the resources required for a recruitment cycle. Success could include simple educational outcomes such as completing training, board certification, or lack of remediation. Alternatively, applicant-specific factors may be more important, such as the likelihood of picking the same program. Success may be defined solely on the public good, based on the fill rate of programs and how many physicians practice in underserved areas. Or, it may be that a successful match is institutionally specific, based on each institution’s mission and community served, with some institutions focused on research and others on rural communities. Regardless, the factors associated with success must be understood so they can be appropriately emphasized in the UME-GME transition, especially as changes are made to the process.
Appendix E: Analysis of the Public Comments
COALITION FOR PHYSICIAN ACCOUNTABILITY
Table of Contents
EXECUTIVE SUMMARY ..................................................................................................................................... 7
METHODS ............................................................................................................................................................. 9
DATA COLLECTION ......................................................................................................................................... 10
DATA ANALYSIS............................................................................................................................................... 11
RESULTS ............................................................................................................................................................ 12
LIMITATIONS ..................................................................................................................................................... 14
SUMMARY DATA .............................................................................................................................................. 15
Table 1: Comments by Group ............................................................................................................... 15
Table 2: Response Counts and Frequencies by Group and Recommendation Theme ........ 16
Table 3: Counts of Recommendation Numbers ............................................................................... 17
Table 3: Counts of Recommendation Numbers Continued .......................................................... 18
Table 4: Sentiment ................................................................................................................................... 19
Table 5: Tags ............................................................................................................................................. 20
OVERSIGHT ....................................................................................................................................................... 21
Table 6: Sentiment for Oversight ......................................................................................................... 21
Figure 1: Sentiment for Oversight ....................................................................................................... 21
Oversight: Selected Verbatim ............................................................................................................... 22
Table 7: Code Application Counts for Oversight ............................................................................. 23
Figure 2: Code Application for Oversight .......................................................................................... 24
Figure 3: Bigrams for Oversight ........................................................................................................... 25
ADVISING OF LEARNERS .............................................................................................................................. 26
Table 8: Sentiment for Advising of Learners .................................................................................... 26
Figure 4: Sentiment for Advising of Learners .................................................................................. 26
Advising of Learners: Selected Verbatims ........................................................................................ 27
Table 9: Code Application Counts for Advising of Learners ........................................................ 28
Table 9: Code Application Counts for Advising of Learners Continued ................................... 29
Figure 5: Code Application for Advising of Learners ..................................................................... 30
Figure 6: Bigrams for Advising of Learners ...................................................................................... 31
COMPETENCIES AND ASSESSMENTS ...................................................................................................... 32
Table 10: Sentiment for Competencies and Assessments ........................................................... 32
Figure 7: Sentiment for Competencies and Assessments ............................................................ 32
Competencies and Assessments: Selected Verbatims ................................................................. 33
Table 11: Code Application Counts for Competencies and Assessments ............................... 34
Table 11: Code Application Counts for Competencies and Assessments Continued .......... 35
Figure 8: Code Application for Competencies and Assessments .............................................. 36
Figure 9: Bigrams for Competencies and Assessments ............................................................... 37
AWAY ROTATIONS .......................................................................................................................................... 38
Table 12: Sentiment for Away Rotations ............................................................................................ 38
Figure 10: Sentiment for Away Rotations .......................................................................................... 38
Away Rotations: Selected Verbatims .................................................................................................. 39
Table 13: Code Application Counts for Away Rotations ............................................................... 40
Table 13: Code Application Counts for Away Rotations Continued .......................................... 41
Figure 11: Code Application for Away Rotations ............................................................................ 42
Figure 12: Bigrams for Away Rotations ............................................................................................. 43
DIVERSITY, EQUITY, AND INCLUSION (DEI) IN MEDICINE .................................................................... 44
Table 14: Sentiment for Diversity, Equity, and Inclusion (DEI) in Medicine ............................. 44
Figure 13: Sentiment for Diversity, Equity, and Inclusion (DEI) in Medicine ........................... 44
Diversity, Equity, and Inclusion (DEI) in Medicine: Selected Verbatims ................................... 45
Table 15: Code Application Counts for Diversity, Equity, and Inclusion (DEI) in Medicine . 46
Table 15: Code Application Counts for Diversity, Equity, and Inclusion (DEI) in Medicine Continued ......... 46
Figure 14: Code Application for Diversity, Equity, and Inclusion (DEI) in Medicine .............. 48
Figure 15: Bigrams for Diversity, Equity, and Inclusion (DEI) in Medicine ............................... 49
APPLICATION PROCESS ............................................................................................................................... 50
Table 16: Sentiment for Application Process ................................................................................... 50
Figure 16: Sentiment for Application Process ................................................................................. 50
Application Process: Selected Verbatims ......................................................................................... 51
Table 17: Code Application Counts for Application Process ....................................................... 54
Table 17: Code Application Counts for Application Process Continued .................................. 55
Figure 17: Code Application for Application Process .................................................................... 56
Figure 18: Bigrams for Application Process .............................................................................................. 57
INTERVIEWING ................................................................................................................................................. 58
Table 18: Sentiment for Interviewing .................................................................................................. 58
Figure 19: Sentiment for Interviewing ................................................................................................ 58
Interviewing: Selected Verbatims ........................................................................................................ 59
Table 19: Code Application Counts for Interviewing ...................................................................... 60
Table 19: Code Application Counts for Interviewing Continued ................................................. 61
Figure 20: Code Application for Interviewing ................................................................................... 62
Figure 21: Bigrams for Interviewing .................................................................................................... 63
MATCHING PROCESS ..................................................................................................................................... 64
Table 20: Sentiment for Matching Process ....................................................................................... 64
Figure 22: Sentiment for Matching Process ...................................................................................... 64
Matching Process: Selected Verbatims ............................................................................................. 65
Table 21: Code Application Counts for Matching Process ........................................................... 67
Table 21: Code Application Counts for Matching Process Continued ...................................... 68
Figure 23: Code Application for Matching Process ........................................................................ 69
Figure 24: Bigrams for Matching Process ......................................................................................... 70
FACULTY SUPPORT RESOURCES .............................................................................................................. 71
Table 22: Sentiment for Faculty Support Resources ...................................................................... 71
Figure 25: Sentiment for Faculty Support Resources .................................................................... 71
Faculty Support Resources: Selected Verbatims ............................................................................ 72
Table 23: Code Application Counts for Faculty Support Resources ......................................... 73
Figure 25: Code Application for Faculty Support Resources ...................................................... 74
Figure 26: Bigrams for Faculty Support Resources ....................................................................... 75
POST-MATCH TRANSITION TO RESIDENCY ............................................................................................ 76
Table 24: Sentiment for Post-Match Transition to Residency ...................................................... 76
Figure 27: Sentiment for Post-Match Transition to Residency .................................................... 76
Post-Match Transition to Residency: Selected Verbatims ............................................................ 77
Table 25: Code Application Counts for Post-Match Transition to Residency ......................... 78
Table 25: Code Application Counts for Post-Match Transition to Residency Continued ..... 79
Figure 28: Code Application for Post-Match Transition to Residency ...................................... 80
Figure 29: Bigrams for Post-Match Transition to Residency ....................................................... 81
POLICY IMPLICATIONS .................................................................................................................................. 82
Table 26: Sentiment for Policy Implications ..................................................................................... 82
Figure 30: Sentiment for Policy Implications .................................................................................... 82
Policy Implications: Selected Verbatims ........................................................................................... 83
Table 27: Code Application Counts for Policy Implications ......................................................... 84
Table 27: Code Application Counts for Policy Implications Continued .................................... 85
Figure 31: Code Application for Policy Implications ...................................................................... 86
Figure 32: Bigrams for Policy Implications ....................................................................................... 87
RESEARCH QUESTIONS ................................................................................................................................ 88
Table 28: Sentiment for Research Questions ................................................................................... 88
Figure 33: Sentiment for Research Questions ................................................................................. 88
Research Questions: Selected Verbatim ........................................................................................... 89
Table 29: Code Application Counts for Research Questions ....................................................... 90
Figure 34: Code Application for Research Questions .................................................................... 91
Figure 35: Bigrams for Research Questions .................................................................................... 92
OTHER COMMENTS ......................................................................................................................................... 93
Table 30: Sentiment for Other Comments ......................................................................................... 93
Figure 36: Sentiment for Research Questions ................................................................................. 93
Other Comments: Selected Verbatims ............................................................................................... 94
Table 31: Code Application Counts for Other Comments ............................................................. 95
Table 31: Code Application Counts for Other Comments Continued ......................................... 96
Figure 37: Code Application for Other Comments .......................................................................... 97
Figure 38: Bigrams for Other Comments ........................................................................................... 98
SURVEY DEMOGRAPHICS ............................................................................................................................ 99
Which of these choices best represents your reason for responding to the UGRC
recommendations survey? .................................................................................................................... 99
Which of the following describes your primary role? .................................................................. 100
Which of the following describes your primary role? – Other (please specify) .................... 101
Which of the following describes your primary role? – Other (please specify) Continued 102
Which of the following describes your primary role? – Other (please specify) Continued 103
In which type of medical school are you currently enrolled? .................................................... 104
Are you currently a practicing physician/clinician? ..................................................................... 105
Which of the following medical degrees do you have? ............................................................... 106
What is the location of the medical school from which you graduated? ................................ 107
Other Medical School Locations ........................................................................................................ 108
Other Medical School Locations Continued ................................................................................... 109
Other Medical School Locations Continued ................................................................... 110
In what year did you complete your residency? ............................................................................ 111
What is your core medical specialty? ............................................................................................... 112
Other Core Specialties .......................................................................................................................... 113
6
What is the location of the institution where your primary role is… ........................................ 114
What is the location of the institution where your primary role is… Other Locations ........ 115
What is the location of the institution where your primary role is… Other Locations
Continued ................................................................................................................................................. 116
Do you directly supervise residents? ............................................................................................... 117
What is your gender identity? ............................................................................................................. 118
Other Gender Identities ........................................................................................................................ 119
What is your race or ethnic identity? Select all that apply. ........................................................ 120
Other Race/Ethnic Identities................................................................................................................ 121
APPENDIX A: LIST OF CODES .................................................................................................................... 122
APPENDIX B: LIST OF TAGS ....................................................................................................................... 127
APPENDIX C: SURVEY INSTRUMENT ....................................................................................................... 128
7
EXECUTIVE SUMMARY
NBME is committed to developing and delivering assessments of the many expected competencies of
health professionals. NBME is committed to the work of the Coalition for Physician Accountability and
helping find solutions that will improve the UME-GME transition. NBME supported the work of the
Undergraduate Medical Education to Graduate Medical Education Review Committee (UGRC) by
developing, administering, and analyzing the results of a survey to collect feedback on the Preliminary
Recommendations on addressing the challenges that exist in the transition from medical school to
residency. The survey sought feedback from stakeholders during the public comment period (April 26,
2021 to May 28, 2021).
Respondent Information
The survey instrument collected 2,673 comments from 768 distinct respondents over the 32-day
administration window. A request to participate in the public comment period for the recommendations
was posted on the Coalition's website, and the UGRC also solicited responses from specific
organizations and groups. Respondents completing the survey on behalf of an organization or group in
an official capacity made up 13.7% of respondents and accounted for 21.2% of the overall comments;
comments provided by organizations or groups tended to be more thorough than those from individual
respondents. The largest groups of respondents not responding on behalf of an organization or group
were Medical School Students (26.6%), Residency Program Directors (16.3%), and Faculty Members of
Medical Schools (10.3%), which together accounted for 39.5% of the overall comments.
Result Highlights
Of the 12 Preliminary Recommendation themes, respondents commented most often on Interviewing
(N=464), Application Process (N=294), and Matching Process (N=262). Overall, respondents had varying
opinions regarding the specific recommendations. For instance, opinions differed among the Interviewing
comments that were assigned a sentiment (N=309). Responses within this theme focused mostly on
Recommendation 26: some respondents felt strongly that virtual interviews should become permanent,
while others described negative implications of this interview format, including concerns about equity
and inclusion.
Commentary on the Application Process theme was also wide-ranging. Standardized testing was
mentioned frequently; these comments often expressed concerns about differences between USMLE and
COMLEX scores or the need for a single licensing examination for allopathic and osteopathic students.
Data transparency was also a popular topic, specifically the need for better access to program
information and more transparency around the use of filters.
Comments regarding the recommendations within the Matching Process theme were also mixed.
Early decision was a popular topic in the comments, with some respondents expressing agreement,
while others cautioned against it. Respondents questioned how these recommendations could be
implemented, and they mentioned application caps and limits frequently.
8
Respondents expressed strong feelings relating to Away Rotations, specifically about issues of
inequity due to associated costs for the student learner.
The theme with the highest proportion of comments expressing agreement was Faculty Support
Resources (66.7%).
The anticipated use of this report is to aid subject matter experts in the review and interpretation of the
results of the open comment period survey. Please refer to the Limitations section of this report before
drawing conclusions based on the presented data.
9
METHODS
The survey instrument was programmed using Survey Monkey and included thirteen open-ended
questions. Twelve of the open-ended questions asked respondents to comment on each of the
recommendation themes; respondents indicated the themes for which they wished to provide
commentary. Participants were asked to include the specific recommendation number(s) they were
commenting on within their response. The final open-ended question solicited general comments
about the UGRC Preliminary Recommendations.
The survey instrument also included twelve background information-gathering questions. The initial
question asked whether the respondent was responding on behalf of an organization in an official
capacity or for themselves. Those responding on behalf of an organization or group were asked to
indicate its name. Respondents who were responding for themselves were asked their primary role,
which then led to a series of background questions that queried current physician-related activity,
medical degree, location of medical school, year of residency completion, medical specialty, location
of current institution, resident supervision status, gender, and race. Medical school students were
asked the location and type of medical school in which they were currently enrolled.
10
DATA COLLECTION
A request to participate in the public comment period for the recommendations of the Coalition for
Physician Accountability’s Undergraduate Medical Education to Graduate Medical Education Review
Committee (UGRC) was posted on the Coalition's website (https://physicianaccountability.org). In addition,
the UGRC solicited responses from specific organizations and groups.
The survey window spanned April 26, 2021 to May 28, 2021.
11
DATA ANALYSIS
Prior to the survey administration window, UGRC stakeholders were asked to provide a list of
potential codes or topics that would likely be discussed in the respondents’ comments. After the first
week of the survey administration window, 4 NBME staff members read portions of the response data
and identified a list of potential thematic codes. The list of codes was presented to UGRC
stakeholders for review and approval. The 4 NBME staff members then coded the first two weeks of
comments using the initial codebook.
Subsequently, through an iterative process, additional codes and tags were added, resulting in a final
set of agreed-upon codes and tags (see Appendices A and B). The final codebook was used by the 4
NBME staff members to code the remainder of responses in weekly batches. Two NBME staff members
reviewed 10% of all coded comments from the first two weeks of the survey window to ensure that codes
were being applied adequately and accurately; this review resulted in the application of additional codes
to the comments, but not in the deletion of previously applied codes. Through discussion, NBME staff
members also attended to their reactions to the responses, their backgrounds, and their potential biases.
To clarify relationships between associated codes, codes were organized using a parent-child code
structure in which a parent code could include any number of subcategories, or “children”. In all tables
and figures in the results section, an asterisk is used to indicate which of the codes are parent codes.
If a child code was applied to a free-text response, its parent code was also applied or “upcoded”. In
the Code Application tables, parent codes are also shaded in gray. A complete listing of parent and
child codes can be found in Appendix A.
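To make the upcoding rule concrete, the following is a minimal illustrative sketch in Python of how a parent code can be added automatically whenever one of its child codes is applied. The code names and the mapping shown are invented examples for illustration, not the actual NBME codebook.

```python
# Illustrative sketch only: the code names and parent-child mapping below are
# invented examples, not the actual NBME codebook described in this report.

PARENT_OF = {
    "DEI - Bias": "DEI",
    "DEI - Diversity": "DEI",
    "Implementation - Impact": "Implementation",
}

def upcode(applied_codes):
    """Return the applied codes plus any parent implied by a child code ("upcoding")."""
    full = set(applied_codes)
    for code in applied_codes:
        parent = PARENT_OF.get(code)
        if parent:
            full.add(parent)
    return full

# A comment coded only with child codes also counts toward the parent codes.
print(sorted(upcode({"DEI - Bias", "Implementation - Impact"})))
# ['DEI', 'DEI - Bias', 'Implementation', 'Implementation - Impact']
```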
All free-text responses were also assigned a sentiment (agree, disagree, or mixed) when a distinct
sentiment was expressed in the comment. Additionally, tags from a defined list were applied to free-text
responses when applicable. A complete list of tags can be found in Appendix B.
12
RESULTS
The survey instrument collected 2,673 comments from 768 distinct respondents over 32 days of the
administration. Because the survey was open to the public, it is not possible to calculate the overall
response rate. Prior to analysis, survey responses were reviewed and retained as valid as long as at
least one free-text response contained any text. If a free-text answer was a nonsensical placeholder
(e.g., "NA", "None", "N/A"), no codes were applied during qualitative coding. The remainder of this report
presents the results of the NBME staff analysis: first, summary data about the survey respondents,
themes, and associated codes; then results by theme; and finally, background and demographic
information about the respondents.
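As a rough illustration of the screening rule described above, the following Python sketch shows one way the validity and placeholder checks could be expressed. The placeholder list and function names are assumptions made for the example only, not the coding team's exact criteria.

```python
# Illustrative sketch only: the placeholder strings and function names are assumptions,
# not the exact screening criteria used by the coding team.

PLACEHOLDERS = {"", "na", "none", "n/a"}

def is_valid_response(free_text_answers):
    """A survey response was retained if at least one free-text field contained any text."""
    return any(answer.strip() for answer in free_text_answers)

def is_codable(text):
    """Placeholder-only answers (e.g., "NA", "None", "N/A") were not coded."""
    return text.strip().lower() not in PLACEHOLDERS

print(is_valid_response(["", "N/A", "Keep virtual interviews as an option."]))  # True
print(is_codable("N/A"))  # False
```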
Table 1 shows the number of comments, the percentage of comments, and the number of distinct
respondents by group, where group is defined as the primary role indicated by the survey respondent.
Respondents completing the survey on behalf of an organization or group in an official capacity made up
13.7% of respondents and accounted for 21.2% of the overall comments. The largest groups of
respondents not responding on behalf of an organization or group were Medical School Students (26.6%),
Residency Program Directors (16.3%), and Faculty Members of Medical Schools (10.3%), which together
accounted for 39.5% of the overall comments. Additional background information is included in this
report, beginning on page 99.
Table 2 (Response Counts and Frequencies by Group and Recommendation Theme) shows, for each
recommendation theme, the number of comments and the percentage of comments contributed by each
group.
Table 3 (Counts of Recommendation Numbers) shows counts associated with each of the 42
recommendations ordered by theme, for comments where the recommendation number was indicated
by the respondent. For recommendation themes containing a single recommendation, comments
were automatically coded to that recommendation.
Table 4 (Sentiment) shows the breakdown of sentiment by theme (Agree, Disagree, Mixed), as applied by
the coders for comments in which the respondent's sentiment was evident. Additionally, the coding team
applied tags.
Table 5 shows the frequency and percentage of these applied tags (Combine Potential, Concerning
Comment, Interesting Comment, Organizations, Personal Anecdote, Priority, Skepticism, Source
Cited, Suggestion, Unintended Consequences) for each of the themes.
Results by Theme. The remaining tables in the report show summary information about the codes
applied to the comments collected. The organization of tables and graphs is identical for each theme.
For each theme, a sentiment table and figure appear first, followed by a small selection of verbatim
comments, edited for brevity. Next, a table indicates the frequency and percentage of codes applied to
the comments, followed by a bar graph illustrating the most frequently applied codes in descending
order. The final graph for each theme is a bigram chart, which illustrates the most frequently occurring
pairs of words, in descending order, for two groups: individual respondents and respondents who
completed the survey on behalf of an organization or group.
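The bigram charts can be understood as counts of adjacent word pairs within the comments for a theme. The sketch below is a minimal Python illustration, with an assumed tokenization, of one way such counts could be derived; it is not the actual procedure used to produce the charts.

```python
# Illustrative sketch only: tokenization details are assumptions; the actual charts
# were produced by the NBME analysis team.
from collections import Counter
import re

def bigram_counts(comments):
    """Count adjacent word pairs (bigrams) across a list of comments."""
    counts = Counter()
    for comment in comments:
        words = re.findall(r"[a-z']+", comment.lower())
        counts.update(zip(words, words[1:]))
    return counts

sample = ["away rotations are costly", "limit away rotations to improve equity"]
print(bigram_counts(sample).most_common(3))
# [(('away', 'rotations'), 2), (('rotations', 'are'), 1), (('are', 'costly'), 1)]
```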
13
In all tables and figures, an asterisk is used to indicate parent codes.
An additional report showing comments by recommendation theme was provided to stakeholders on
June 1, 2021.
14
LIMITATIONS
- The survey instrument was not piloted prior to administration due to time constraints.
- The survey instrument was long, and there may have been respondent drop-off prior to survey completion.
- All data were self-reported by respondents.
- Certain respondent groups had a small number of respondents.
- Because some respondent groups were small, results are not generalizable and should be interpreted with caution.
- The survey was open to the public, so it is not possible to compute the overall response rate.
- Analysis: due to time constraints, there was only one coder per comment.
- Analysis: due to time constraints, the coding team was only able to QC 10% of the codes from the first two weeks of the survey administration.
- Analysis: due to time constraints, the coding team was not able to recode previously coded comments when new codes were added to the codebook.
- Analysis: due to time constraints, the coding team was not able to consolidate codes or confirm accurate interpretation of the coding results.
15
SUMMARY DATA
A total of 2,127 respondents clicked into the survey.
A total of 768 respondents left at least one comment.
A total of 2,673 comments were collected across the thirteen open-ended questions.
Table 1 provides data about the number of comments left by each respective primary role.
Table 1: Comments by Group
Group | # Comments | % Comments | Distinct Respondents
I am responding on behalf of an organization or group in an official capacity | 568 | 21.2% | 105
Medical School Student | 491 | 18.4% | 204
Residency Program Director | 400 | 15% | 125
Medical School Assistant/Associate Dean | 300 | 11.2% | 64
Faculty Member of a Medical School | 264 | 9.9% | 79
Other | 247 | 9.2% | 75
Non-Practicing Physician/Clinician | 78 | 2.9% | 17
Clerkship Director | 71 | 2.7% | 20
Intern/Resident/Fellow | 71 | 2.7% | 26
Designated Institutional Official (DIO) | 56 | 2.1% | 12
Practicing Physician/Clinician | 47 | 1.8% | 22
Medical School Dean | 36 | 1.3% | 6
General Public | 23 | 0.9% | 6
I serve, or have served, on a State Medical Board | 21 | 0.8% | 7
Total | 2,673 | 100% | 768
16
Table 2: Response Counts and Frequencies by Group and Recommendation Theme
Group | Oversight | Advising of Learners | Competencies and Assessments | Away Rotations | Diversity, Equity, and Inclusion | Application Process | Interviewing | Matching Process | Faculty Support Resources | Post-Match Transition to Residency | Policy Implications | Research Questions | Other | Total
Clerkship Director | 0.7% (N=1) | 5.6% (N=10) | 4% (N=10) | 1% (N=2) | 2.5% (N=5) | 2.4% (N=7) | 1.9% (N=9) | 2.3% (N=6) | 4.6% (N=4) | 3.2% (N=5) | 2.1% (N=2) | 3.2% (N=2) | 2.8% (N=8) | 2.7% (N=71)
Designated Institutional Official (DIO) | 4% (N=6) | 2.2% (N=4) | 1.6% (N=4) | 3.1% (N=6) | 2% (N=4) | 1.4% (N=4) | 2.2% (N=10) | 1.5% (N=4) | 1.1% (N=1) | 3.2% (N=5) | 4.2% (N=4) | 1.6% (N=1) | 1.1% (N=3) | 2.1% (N=56)
Faculty Member of a Medical School | 8.7% (N=13) | 13.3% (N=24) | 10.5% (N=26) | 9.9% (N=19) | 12.1% (N=24) | 8.5% (N=25) | 7.1% (N=33) | 7.6% (N=20) | 12.6% (N=11) | 8.9% (N=14) | 10.5% (N=10) | 19.4% (N=12) | 11.6% (N=33) | 9.9% (N=264)
General Public | 1.3% (N=2) | 0% (N=0) | 0.4% (N=1) | 0.5% (N=1) | 1% (N=2) | 1.4% (N=4) | 0.9% (N=4) | 0.8% (N=2) | 0% (N=0) | 0.6% (N=1) | 2.1% (N=2) | 0% (N=0) | 1.4% (N=4) | 0.9% (N=23)
I am responding on behalf of an organization or group in an official capacity | 25.5% (N=38) | 23.3% (N=42) | 21.8% (N=54) | 20.8% (N=40) | 21.2% (N=42) | 18.4% (N=54) | 15.3% (N=71) | 19.5% (N=51) | 32.2% (N=28) | 27.2% (N=43) | 28.4% (N=27) | 35.5% (N=22) | 19.7% (N=56) | 21.2% (N=568)
I serve, or have served, on a State Medical Board | 1.3% (N=2) | 0.6% (N=1) | 0.8% (N=2) | 0% (N=0) | 2% (N=4) | 0.3% (N=1) | 0.4% (N=2) | 0.4% (N=1) | 0% (N=0) | 1.3% (N=2) | 1.1% (N=1) | 0% (N=0) | 1.8% (N=5) | 0.8% (N=21)
Intern/Resident/Fellow | 1.3% (N=2) | 1.1% (N=2) | 3.2% (N=8) | 2.1% (N=4) | 3.5% (N=7) | 3.7% (N=11) | 1.9% (N=9) | 3.8% (N=10) | 0% (N=0) | 3.8% (N=6) | 2.1% (N=2) | 1.6% (N=1) | 3.2% (N=9) | 2.7% (N=71)
Medical School Assistant/Associate Dean | 16.1% (N=24) | 17.2% (N=31) | 14.9% (N=37) | 11.5% (N=22) | 8.6% (N=17) | 9.9% (N=29) | 6.5% (N=30) | 11.8% (N=31) | 13.8% (N=12) | 13.3% (N=21) | 8.4% (N=8) | 9.7% (N=6) | 11.3% (N=32) | 11.2% (N=300)
Medical School Dean | 2.7% (N=4) | 2.8% (N=5) | 2% (N=5) | 1.6% (N=3) | 1% (N=2) | 1% (N=3) | 0.4% (N=2) | 0.4% (N=1) | 2.3% (N=2) | 1.9% (N=3) | 1.1% (N=1) | 1.6% (N=1) | 1.4% (N=4) | 1.3% (N=36)
Medical School Student | 14.1% (N=21) | 12.8% (N=23) | 13.7% (N=34) | 24.5% (N=47) | 16.7% (N=33) | 24.5% (N=72) | 29.3% (N=136) | 18.3% (N=48) | 8% (N=7) | 8.9% (N=14) | 5.3% (N=5) | 6.5% (N=4) | 16.5% (N=47) | 18.4% (N=491)
Non-Practicing Physician/Clinician | 2.7% (N=4) | 2.2% (N=4) | 2.8% (N=7) | 1.6% (N=3) | 5.1% (N=10) | 3.1% (N=9) | 1.3% (N=6) | 2.7% (N=7) | 3.4% (N=3) | 3.8% (N=6) | 5.3% (N=5) | 6.5% (N=4) | 3.5% (N=10) | 2.9% (N=78)
Other | 7.4% (N=11) | 8.3% (N=15) | 9.7% (N=24) | 6.2% (N=12) | 10.6% (N=21) | 9.9% (N=29) | 9.7% (N=45) | 10.7% (N=28) | 6.9% (N=6) | 7.6% (N=12) | 11.6% (N=11) | 8.1% (N=5) | 9.9% (N=28) | 9.2% (N=247)
Practicing Physician/Clinician | 2% (N=3) | 1.7% (N=3) | 1.2% (N=3) | 2.1% (N=4) | 1.5% (N=3) | 2% (N=6) | 0.9% (N=4) | 1.9% (N=5) | 2.3% (N=2) | 1.3% (N=2) | 2.1% (N=2) | 0% (N=0) | 3.5% (N=10) | 1.8% (N=47)
Residency Program Director | 12.1% (N=18) | 8.9% (N=16) | 13.3% (N=33) | 15.1% (N=29) | 12.1% (N=24) | 13.6% (N=40) | 22.2% (N=103) | 18.3% (N=48) | 12.6% (N=11) | 15.2% (N=24) | 15.8% (N=15) | 6.5% (N=4) | 12.3% (N=35) | 15% (N=400)
Total | 100% (N=149) | 100% (N=180) | 100% (N=248) | 100% (N=192) | 100% (N=198) | 100% (N=294) | 100% (N=464) | 100% (N=262) | 100% (N=87) | 100% (N=158) | 100% (N=95) | 100% (N=62) | 100% (N=284) | 100% (N=2673)
17
Table 3: Counts of Recommendation Numbers
Note: As indicated in the Methods section, participants were asked to include the specific recommendation
number within their response. The counts in Table 3 represent comments in which participants followed these
instructions. If a number was not provided, no attempt was made to determine which recommendation the
participant was referring to, and such comments are therefore not represented in the data. For recommendation
themes containing a single recommendation, comments were automatically coded to that recommendation and
are therefore overrepresented relative to other recommendations.
Theme | Recommendation # | N | Percent
Oversight | 1 | 153 | 4.6%
Advising of Learners | 2 | 68 | 2%
Advising of Learners | 3 | 72 | 2.2%
Advising of Learners | 4 | 73 | 2.2%
Advising of Learners | 5 | 62 | 1.9%
Advising of Learners | 6 | 54 | 1.6%
Competencies and Assessments | 7 | 81 | 2.4%
Competencies and Assessments | 8 | 67 | 2%
Competencies and Assessments | 9 | 41 | 1.2%
Competencies and Assessments | 10 | 68 | 2%
Competencies and Assessments | 11 | 63 | 1.9%
Competencies and Assessments | 12 | 49 | 1.5%
Competencies and Assessments | 13 | 121 | 3.6%
Competencies and Assessments | 14 | 64 | 1.9%
Away Rotations | 15 | 174 | 5.2%
Diversity, Equity, and Inclusion (DEI) in Medicine | 16 | 78 | 2.3%
Diversity, Equity, and Inclusion (DEI) in Medicine | 17 | 57 | 1.7%
Diversity, Equity, and Inclusion (DEI) in Medicine | 18 | 87 | 2.6%
Diversity, Equity, and Inclusion (DEI) in Medicine | 19 | 76 | 2.3%
Application Process | 20 | 107 | 3.2%
Application Process | 21 | 116 | 3.5%
Application Process | 22 | 72 | 2.2%
Application Process | 23 | 89 | 2.7%
Application Process | 24 | 119 | 3.6%
Interviewing | 25 | 142 | 4.3%
Interviewing | 26 | 313 | 9.4%
Interviewing | 27 | 178 | 5.3%
Matching Process | 28 | 182 | 5.5%
Faculty Support Resources | 29 | 40 | 1.2%
Faculty Support Resources | 30 | 52 | 1.6%
Post-Match Transition to Residency | 31 | 38 | 1.1%
Post-Match Transition to Residency | 32 | 31 | 0.9%
Post-Match Transition to Residency | 33 | 47 | 1.4%
Post-Match Transition to Residency | 34 | 35 | 1.1%
Post-Match Transition to Residency | 35 | 50 | 1.5%
Post-Match Transition to Residency | 36 | 34 | 1%
Post-Match Transition to Residency | 37 | 48 | 1.4%
Post-Match Transition to Residency | 38 | 41 | 0.8%
Policy Implications | 39 | 48 | 0.9%
Policy Implications | 40 | 59 | 1.8%
Research Questions | 41 | 42 | 1.3%
Research Questions | 42 | 31 | 0.9%
Total | | 3,333 | 100%
19
Table 4: Sentiment
Recommendation Theme | Total Comments | Agree | Disagree | Mixed
*Advising of Learners | 180 | 43.3% (N=78) | 3.9% (N=7) | 7.2% (N=13)
*Application Process | 294 | 25.9% (N=76) | 4.4% (N=13) | 10.5% (N=31)
*Away Rotations | 192 | 25.5% (N=49) | 7.3% (N=14) | 3.1% (N=6)
*Competencies and Assessments | 248 | 47.2% (N=117) | 7.7% (N=19) | 14.1% (N=35)
*DEI | 198 | 22.2% (N=44) | 4.5% (N=9) | 9.6% (N=19)
*Faculty Support Resources | 87 | 66.7% (N=58) | 5.7% (N=5) | 2.3% (N=2)
*Interviewing | 464 | 23.5% (N=109) | 16.2% (N=75) | 23.7% (N=110)
*Matching Process | 262 | 32.1% (N=84) | 12.2% (N=32) | 4.2% (N=11)
*Other | 284 | 11.6% (N=33) | 5.3% (N=15) | 2.8% (N=8)
*Oversight | 149 | 57% (N=85) | 6% (N=9) | -- (N=0)
*Policy Implications | 95 | 44.2% (N=42) | 3.2% (N=3) | 12.6% (N=12)
*Post-Match Transition to Residency | 158 | 17.7% (N=28) | 4.4% (N=7) | 13.9% (N=22)
*Research Questions | 62 | 40.3% (N=25) | 3.2% (N=2) | 4.8% (N=3)
20
Table 5: Tags
Recommendation Theme | Total Comments | Combine Potential | Concerning Comment | Interesting Comment | Organizations | Personal Anecdote | Priority | Skepticism | Source Cited | Suggestion | Unintended Consequences
*Advising of Learners | 180 | 2.8% (N=5) | -- (N=0) | 5.6% (N=10) | 9.4% (N=17) | 2.2% (N=4) | 2.8% (N=5) | 13.3% (N=24) | 1.1% (N=2) | 36.7% (N=66) | 7.8% (N=14)
*Application Process | 294 | 1.4% (N=4) | 1% (N=3) | 10.5% (N=31) | 9.9% (N=29) | 1.4% (N=4) | 5.4% (N=16) | 8.2% (N=24) | 2.7% (N=8) | 34% (N=100) | 10.2% (N=30)
*Away Rotations | 192 | -- (N=0) | 3.6% (N=7) | 2.1% (N=4) | 6.8% (N=13) | 4.7% (N=9) | 3.1% (N=6) | 13.5% (N=26) | -- (N=0) | 25.5% (N=49) | 17.2% (N=33)
*Competencies and Assessments | 248 | 0.8% (N=2) | 0.8% (N=2) | 6% (N=15) | 12.9% (N=32) | 2.4% (N=6) | 2% (N=5) | 33.5% (N=83) | 1.6% (N=4) | 60.9% (N=151) | 24.2% (N=60)
*DEI | 198 | 1% (N=2) | 4.5% (N=9) | 11.1% (N=22) | 8.6% (N=17) | 5.1% (N=10) | 5.6% (N=11) | 13.1% (N=26) | 4.5% (N=9) | 41.9% (N=83) | 10.1% (N=20)
*Faculty Support Resources | 87 | -- (N=0) | -- (N=0) | 5.7% (N=5) | 12.6% (N=11) | -- (N=0) | 5.7% (N=5) | 17.2% (N=15) | 3.4% (N=3) | 42.5% (N=37) | 2.3% (N=2)
*Interviewing | 464 | 1.3% (N=6) | 1.3% (N=6) | 0.9% (N=4) | 7.3% (N=34) | 4.7% (N=22) | 3% (N=14) | 19% (N=88) | 0.4% (N=2) | 26.9% (N=125) | 18.8% (N=87)
*Matching Process | 262 | 0.4% (N=1) | 0.4% (N=1) | 5.3% (N=14) | 6.1% (N=16) | 3.4% (N=9) | 5.3% (N=14) | 13.4% (N=35) | 0.8% (N=2) | 29.8% (N=78) | 18.3% (N=48)
*Other | 284 | -- (N=0) | 4.6% (N=13) | 5.6% (N=16) | 13.4% (N=38) | 1.1% (N=3) | 0.7% (N=2) | 9.5% (N=27) | 0.4% (N=1) | 29.9% (N=85) | 4.9% (N=14)
*Oversight | 149 | 0.7% (N=1) | -- (N=0) | 2.7% (N=4) | 17.4% (N=26) | 0.7% (N=1) | 4% (N=6) | 18.1% (N=27) | 0.7% (N=1) | 36.2% (N=54) | 9.4% (N=14)
*Policy Implications | 95 | 1.1% (N=1) | -- (N=0) | 1.1% (N=1) | 9.5% (N=9) | 1.1% (N=1) | 5.3% (N=5) | 16.8% (N=16) | -- (N=0) | 22.1% (N=21) | 11.6% (N=11)
*Post-Match Transition to Residency | 158 | 3.8% (N=6) | 1.3% (N=2) | 4.4% (N=7) | 16.5% (N=26) | 3.2% (N=5) | 10.1% (N=16) | 17.7% (N=28) | 2.5% (N=4) | 38% (N=60) | 12% (N=19)
*Research Questions | 62 | 3.2% (N=2) | -- (N=0) | 11.3% (N=7) | 12.9% (N=8) | 3.2% (N=2) | 11.3% (N=7) | 17.7% (N=11) | 1.6% (N=1) | 48.4% (N=30) | 3.2% (N=2)
21
OVERSIGHT
Table 6: Sentiment for Oversight
Sentiment | N | Percent
Agree | 85 | 57%
Disagree | 9 | 6%
Total Comments | 149 | 100%
Figure 1: Sentiment for Oversight
22
Oversight: Selected Verbatim
This is a comment about the whole document--which is thorough, thoughtful, and comprehensive. I am
concerned that it is new paint on a decaying house. The crisis was exacerbated --or even caused--by a
misguided push to expand medical school classes without alignment with residency slots. This is on top of
chronic perverse specialization incentives in the US health care system. Some careers have good lifestyle and
high income--and some do not. THese wonderful recommendations will do nothing to produce the spectrum of
doctors needed by the patients of the future. I wonder if we would get more primary care physicians if we took a
whole different track. Maybe something like (a. improve income and working conditions of primary care
docs) b. consider a universal generalist hospitalist and ambulatory internship (with pay) instead of year 4 of
medical school interns would provide inpatient hospitalist care for 6-9 months and generalist ambulatory and
urgent care for 6 months (? alternating)--and would be eligible for licensure after this training. (12 months
minimum, extendable to assure competency). During this internship, learners could apply to specialties --with 2
years for IM or FM board certification, ? 2.5 years for peds, and full residency for other areas. Or something
really out of the box like that. The wonderful suggestions of this document reinforce the building of a workforce
that will continue to drive procedure oriented, fragmented care. (Role: Medical School Assistant/Associate
Dean, MD)
23
Table 7: Code Application Counts for Oversight
Code
N
Percent
*Applications
5
1.1%
Applications - MSPE (Medical School Performance Evaluation)
2
0.4%
*Assessment
7
1.5%
Assessment - Accurate assessments
4
0.9%
*Communication
2
0.4%
*Cost/Finances/Debt
5
1.1%
Cost/Finances/Debt - Program Cost
4
0.9%
*COVID Impact
4
0.9%
*Data Transparency & Availability
6
1.3%
Data Transparency & Availability - Data to Support Informed Decisions
4
0.9%
*DEI
41
8.9%
DEI – Bias
4
0.9%
DEI – Diversity
17
3.7%
DEI – Fairness
2
0.4%
DEI – Inclusion
28
6%
DEI - School Resource Availability
2
0.4%
DEI - Small Program(s)
4
0.9%
*DO/Osteopathy/Osteopathic
3
0.6%
*Funding
4
0.9%
*Implementation
89
19.2%
Implementation - Cohesive Policy
33
7.1%
Implementation - CQI (Continuous Quality Improvement)
13
2.8%
Implementation – Impact
46
9.9%
*Interviews
2
0.4%
*Matching Process
3
0.6%
*Non-US Trained Students
4
0.9%
*Oversight
65
14%
Oversight - Cohesive Oversight Committee
25
5.4%
*Physician Shortage
2
0.4%
*Public Health
5
1.1%
*Roles
4
0.9%
Roles - DIO (Designated Institutional Officer)
3
0.6%
*Specialties
7
1.5%
*Training
2
0.4%
*Transition to Residency
12
2.6%
Total
463
100%
24
Figure 2: Code Application for Oversight
25
Figure 3: Bigrams for Oversight
26
ADVISING OF LEARNERS
Table 8: Sentiment for Advising of Learners
Sentiment | N | Percent
Agree | 78 | 43.3%
Disagree | 7 | 3.9%
Mixed | 13 | 7.2%
Total Comments | 180 | 100%
Figure 4: Sentiment for Advising of Learners
27
Advising of Learners: Selected Verbatims
2. A good career advisor uses every resource available to gather information, but not all formal career advising
tools are appropriate for every student. New staff should be on-boarded to watch matches, edit MSPEs,
personal statements, etc. to understand the big picture , but the school should hire people willing to try to advise
students. I do not recommend many faculty members as general career advisors as they do not keep up to date
on the trends, matches, recommendations, or rotation changes spanning different departments. 3. CIM is a
good tool, but it favors Allopathic programs and schools. The assessment information and MD information isn't
pertinent to COM students. Non-clinical paths should be listed to help career advisors with post-graduates who
keep reapplying just to look at the Unfilled List for competitive specialties […]. This tool should be free to all, not
just Allopathic students. 4. Career Advisors need non-clinical pathways for post-grads who were terminated
from residency or have struggled with boards throughout their education. These students do not have access to
CIM since it has expired. 5. Residency faculty and staff need a general career advising resource working with
professionalism deficiencies as they will see issues on rotations from students. Issues need to be addressed
and corrected ASAP. […] (Role: Other: Career Advisor)
[…] 2. Students rely more on peer and near peer advise. Would not think the effort for a faculty "curriculum"
would be of long-term benefit, considering what it would likely cost. And, advising has some generic factors and
many specialty-specific factors; the curriculum would be huge and with many branch points- likely unwieldy. 3.
Students are more likely to access and respect peer/near peer input. Single resource managed by some central
group likely would age poorly and see limited use by students. 4. Critically important and should start in UME,
with options for sympathetic off ramps. 5. A reasonable concept, but who would fund such a resource and how
would there be certainty as to its currentness. 6. Not sure what is envisioned here. But still there remains the
issue re who will maintain such resources and how it can be kept current. GME programs lack resources - see
my comment for #1 - because they are not really allowed to tap into the CMS dollars they bring it. (Role:
Medical School Assistant/Associate Dean, MD)
2: Making a curriculum is one thing and training the counselors on how to counsel students is a different thing,
the latter being more important. When you say implementation of guidelines this should include training of
counselors (standardized training throughout the world). Not a big problem you have Zoom now. […] 4:
Advising on alternative pathways should be available for all medical students irrespective of whether they are
interested or not. Sometimes one develops interest after he/ she is given information about a new pathway. […]
(Role: Non-Practicing Physician/Clinician, MBBS)
[…] 4: For those choosing not to pursue a clinical career it will be important to determine what % of a
graduating course should pursue this pathway. Ideally a clinical medical school should be producing clinicians.
Centralized services to support those pursing a non-clinical career would be a significant help for international
medical graduates. 5: The availability of career advising resources for all Faculty would be welcomed.
Certification of those career advisors may be a useful way of ensuring all Faculty are up-to-date in relation to
career options. […] (Role: I am responding on behalf of an organization or group in an official capacity)
2. Care needs to be taken when distinguishing between “advising” and “leading”. Systemic racism in higher
education can lead advisors to recommend people of low socioeconomic status to not apply for medical school
in the United States. My premed program in the US actually refused to release my letters of recommendation
when I applied to DO schools until after secondary interviews were performed. The dean of the science
department told me that they did not think I would be financially successful even though I made the grades.[…]
(Role: Non-Practicing Physician/Clinician, MD)
[…] 3: A single professional development career planning resource sounds like an equitable manner to provide
for everyone entering into their residency application phase. A huge concern is how this single platform is
developed and who is at the table when it is created. (Role: Faculty Member of a Medical School)
28
Table 9: Code Application Counts for Advising of Learners
Code
N
Percent
*Advice & Coaching
104
14.2%
Advice & Coaching - Alternative Careers
42
5.7%
Advice & Coaching - Career Advising
68
9.3%
Advice & Coaching – Coaching
4
0.5%
Advice & Coaching - Specialty-specific Advising
24
3.3%
Advice & Coaching - Staff training to support students
29
4%
*Applications
7
1%
Applications - Application Process
3
0.4%
*Assessment
2
0.3%
*Competencies
4
0.5%
*Cost/Finances/Debt
30
4.1%
Cost/Finances/Debt - Implementation Cost
14
1.9%
Cost/Finances/Debt - Program Cost
3
0.4%
Cost/Finances/Debt - Student Cost
8
1.1%
Cost/Finances/Debt - Student Debt
5
0.7%
*Data Transparency & Availability
46
6.3%
Data Transparency & Availability - Dashboard or Portfolio
3
0.4%
Data Transparency & Availability - Data to Support Informed Decisions
40
5.4%
Data Transparency & Availability - Database of Program Info
11
1.5%
*DEI
34
4.6%
DEI – Bias
7
1%
DEI – Diversity
2
0.3%
DEI – Equity
17
2.3%
DEI – Fairness
3
0.4%
DEI – Inclusion
6
0.8%
DEI - School Resource Availability
8
1.1%
DEI – SES
2
0.3%
*DO/Osteopathy/Osteopathic
11
1.5%
*Faculty
13
1.8%
Faculty - Faculty Development
12
1.6%
*Funding
5
0.7%
Funding - Unfunded Mandate
3
0.4%
*Implementation
52
7.1%
Implementation - Cohesive Policy
11
1.5%
Implementation – Impact
12
1.6%
*Matching Process
6
0.8%
Matching Process – Unmatched
3
0.4%
*Non-US Trained Students
10
1.4%
Non-US Trained Students – IMG
5
0.7%
29
Table 9: Code Application Counts for Advising of Learners Continued
Code
N
Percent
*Oversight
8
1.1%
*Public Health
3
0.4%
*Roles
3
0.4%
Roles - Program Directors
2
0.3%
*Specialties
22
3%
Specialties - Competitive Specialties
2
0.3%
Specialties - Specialty Selection
10
1.4%
*Standardization of Requirements
4
0.5%
*Transition to Residency
3
0.4%
Transition to Residency – Timing
2
0.3%
*Wellness/Wellbeing
6
0.8%
Total
734
100%
30
Figure 5: Code Application for Advising of Learners
31
Figure 6: Bigrams for Advising of Learners
32
COMPETENCIES AND ASSESSMENTS
Table 10: Sentiment for Competencies and Assessments
Sentiment | N | Percent
Agree | 117 | 47.2%
Disagree | 19 | 7.7%
Mixed | 35 | 14.1%
Total Comments | 248 | 100%
Figure 7: Sentiment for Competencies and Assessments
33
Competencies and Assessments: Selected Verbatims
Regarding 8 and 9: […] It seems a competency based assessment model, to be accurate and effective, would
require significant contact between the same learners and attendings over the course of the year. Our medical
school is community-based and geographically dispersed. Often to get needed variety of clinical rotations
learners have to travel between health systems and campuses. A more structured competency assessment may
advantage academic centers where learners can stay on one campus and have more contact hours with the
same attendings. […] (Role: Faculty Member of a Medical School, MD)
This entire set of items reads as if a single entity is poised to take over the medical education enterprise, or at
least, to force compliance with a single way of doing things. […] I am fully in favor of suggestions of how to
improve. Programs and schools without internal experts may have substantial benefit from those suggestions.
But I cannot support making such things requirements. […] 14. […] this would appear to suggest that it will
require a school to take this standardized approach to data gathering and reporting. That has potential to be a
large demand, and one that can suppress local creative thinking. […] Please, let's not dictate every last detail to
our schools. (Role: Medical School Assistant/Associate Dean, MD)
7. This is an interesting concept that could lead the way for studying the possibility of “competency-based
curriculums”. I strongly believe if done correctly, can lead to a much better way of training and evaluating
students. It is well established that medicine is nothing like taking a test and the focus on exams is extremely
palpable in medical school. Shifting toward learning for the sake of treating patients and proving competencies
is crucial. I know many students that try to skimp on their practical learning because "they will learn it in
residency/the test is more important to focus on" but have also heard many stories about interns feeling
extremely unprepared with their responsibilities on their first few months. 8. YES. Exams are primarily for
determining competence, not comparing applicants.[…] (Role: Medical School Student)
[…] Having read 10 several times, I'm still confused about what exactly it means. (Role: Clerkship Director and
Assistant Residency Program Director, MD)
As a physician and past president of our state professional organization as well as clinical preceptor to MD and
DO students, I have been committed to maintaining our osteopathic distinctiveness throughout my career. I
applied to osteopathic medical school in order to learn, practice and teach osteopathic medicine. In these
proposed congruency changes to graduate medicine program alignments , it appears that our distinctive
osteopathic principles and practices are completely removed. So, the thousand or so hours of undergraduate
education devoted to osteopathic practices, above and beyond allopathic undergrad education, become
meaningless. Without support in graduate medical education , because programs must better conform to a
single standard to remain viable, whatever competencies an osteopathic medical student manages to achieve,
will wither and die outside the school walls. […] We must offer some path that recognizes not only the
convergence of ideas about practice of medicine, but the divergence as well […] (Role: Practicing
Physician/Clinician, DO)
The LCME and/or ACGME should provide the resources and training for educators on doing these evaluations.
[…]. It should not be left to each institution to figure out how to do this training. The biggest problem we have
now with MSPEs is the huge variability in quality and the lack of any true differentiating information on
candicates […].. I'm also surprised this group hasn't tackled the back end that drives use of markers such as
AOA, USMLE scores, etc in filtering candidates. Programs are held to task by the ACGME if we don't have
adequate first time board passage rates. The easiest way for us to judge test taking ability is the candidates
scores on other standardized tests. It's a bit disingenuous to say it doesn't matter for the USMLE, but it has
enormous stakes for the GME programs[….]. (Role: Residency Program Director, MD)
34
Table 11: Code Application Counts for Competencies and Assessments
Code
N
Percent
*Advice & Coaching
31
1.7%
Advice & Coaching - Career Advising
4
0.2%
Advice & Coaching – Coaching
20
1.1%
Advice & Coaching - Staff training to support students
6
0.3%
*Applications
135
7.3%
Applications - Application Process
5
0.3%
Applications - Biasing Applications
11
0.6%
Applications - LOR (Letters of Recommendation)
35
1.9%
Applications - MSPE (Medical School Performance Evaluation)
57
3.1%
Applications - Objective Metrics to Gauge Applicants
17
0.9%
Applications - Standardization of Application Process
25
1.4%
*Assessment
112
6.1%
Assessment - Accurate assessments
62
3.4%
Assessment - Standardized Exams
23
1.2%
Assessment - Inequality in Scaling
3
0.2%
*Assessment and Performance Data
56
3%
Assessment and Performance Data - Grades & Grading Pass Fail
20
1.1%
Assessment and Performance Data - Holistic Review
12
0.7%
Assessment and Performance Data - ILPs (Individualized Learning Plans)
9
0.5%
*Communication
7
0.4%
*Competencies
79
4.3%
Competencies - EPAs (Entrustable Professional Activities)
18
1%
Competencies – Milestones
14
0.8%
*Cost/Finances/Debt
32
1.7%
Cost/Finances/Debt - Implementation Cost
18
1%
Cost/Finances/Debt - Program Cost
17
0.9%
Cost/Finances/Debt - Student Cost
6
0.3%
*Data Transparency & Availability
73
4%
Data Transparency & Availability - Dashboard or Portfolio
23
1.2%
Data Transparency & Availability - Data to Support Informed Decisions
56
3%
*DEI
57
3.1%
DEI – Bias
20
1.1%
DEI - Bias - Racial Bias
2
0.1%
DEI – Diversity
4
0.2%
DEI – Equity
21
1.1%
DEI – Fairness
9
0.5%
DEI – Inclusion
10
0.5%
DEI - School Resource Availability
9
0.5%
DEI - Small Program(s)
5
0.3%
35
Table 11: Code Application Counts for Competencies and Assessments Continued
Code
N
Percent
DEI – URM
2
0.1%
*DO/Osteopathy/Osteopathic
10
0.5%
*Faculty
22
1.2%
Faculty - Faculty Development
21
1.1%
*Funding
10
0.5%
*Implementation
179
9.7%
Implementation - Change Management
4
0.2%
Implementation - Cohesive Policy
43
2.3%
Implementation - CQI (Continuous Quality Improvement)
4
0.2%
Implementation – Impact
136
7.4%
*Interviews
4
0.2%
*Matching Process
9
0.5%
*Medical School Prestige
3
0.2%
*Non-US Trained Students
15
0.8%
Non-US Trained Students - IMG
10
0.5%
*Oversight
23
1.2%
Oversight - Cohesive Oversight Committee
5
0.3%
*Public Health
10
0.5%
*Roles
18
1%
Roles - Other Roles
2
0.1%
Roles - Program Directors
13
0.7%
*Rotations
3
0.2%
Rotations - Away Rotations
3
0.2%
*Specialties
24
1.3%
*Standardization of Requirements
3
0.2%
*Training
8
0.4%
*Transition to Residency
17
0.9%
Transition to Residency - Learner Handover
6
0.3%
Transition to Residency - Timing
2
0.1%
*Wellness/Wellbeing
11
0.6%
Applications - SEL (Structured Evaluative Letters)
97
5.3%
Total
1,840
100%
36
Figure 8: Code Application for Competencies and Assessments
37
Figure 9: Bigrams for Competencies and Assessments
38
AWAY ROTATIONS
Table 12: Sentiment for Away Rotations
Sentiment | N | Percent
Agree | 49 | 25.5%
Disagree | 14 | 7.3%
Mixed | 6 | 3.1%
Total Comments | 192 | 100%
Figure 10: Sentiment for Away Rotations
39
Away Rotations: Selected Verbatims
20. A database where applicants can essentially view the application data of residents is a poor
recommendation for two reasons. 1. Doesn't respect the privacy of applicants. Even if you took aggregates of
matched applicants from previous years, it would be too easy to identify the personal information of individual
residents. 2. Such data would be of minimal use to applicants compared to the aggregate data of
interviewed/ranked applicants. Showing the data of matched applicants doesn't reflect the full diversity of the
applicant pool who had the potential to match at a given program. […] 23. While a well-meaning
recommendation, this recommendation isn't realistic for practical and technical reasons. Removing filters on
data in application portals such as ERAS doesn't stop program directors from filtering applications using that
data. It just adds extra steps. If applicant data can be viewed, it can be aggregated through web scraping and
filtered. If the coalition recommends that pieces of applicant data shouldn't be used to filter applications, that
data should not be viewable by program directors. (Role: General Public)
Most family medicine residency programs are community-based. Away rotations not only allow those programs
to build a more robust pipeline of interest in their program, but they also diversify training for medical students,
who otherwise experience most of their clinical training in medical school in large academic health centers. The
focus of this workgroup should be on exploration and research to help provide more opportunities and equitable
access to away rotations for medical students, not to limit them. With the inequities that currently exist for
students to do away rotations, there is the potential to create bias in GME educators who interview and rank
medical student applicants. This needs to be investigated further and addressed if it is felt to cause inequities in
student selection. (Role: I am responding on behalf of an organization or group in an official capacity)
15: From my advising experience, I've found that away rotations are exceptionally valuable for both students
and programs. I would encourage the workgroup to focus on how to facilitate away rotations for students of all
classes, including osteopathic students in allopathic institutions and culturally and financially disadvantaged
students. The away rotation allows a student to show a program a holistic view of their performance and
personality, regardless of the reputation of their home institution or their standardized test scores. It also allows
the student to get a feel for how they would fit into the culture of that residency and reduce burnout and
changing residencies over time. I would hope that the workgroup would not recommend limitations on the away
rotation, but rather how to make away rotations more robust and available to all students. (Role: I am
responding on behalf of an organization or group in an official capacity)
Currently IMGs for the most part are UNABLE to get rotations at hospitals with Emergency Medicine
residencies, making it impossible for them to get a Standardized Letter of Evaluation (SLOE) without which they
cannot apply for Emergency Medicine Residency. This is just one example of the issues facing IMGs. In
addition, IMGs cannot use VSAS (for the most part) further limiting their access to quality rotations and United
States Clinical Experience. (Role: Medical School Student)
40
Table 13: Code Application Counts for Away Rotations
Code
N
Percent
*Advice & Coaching
2
0.2%
Advice & Coaching - Career Advising
2
0.2%
*Applications
39
3.1%
Applications - Application Process
18
1.4%
Applications - Biasing Applications
5
0.4%
Applications - LOR (Letters of Recommendation)
4
0.3%
Applications - Standardization of Application Process
8
0.6%
*Assessment
3
0.2%
Assessment - Accurate assessments
3
0.2%
*Assessment and Performance Data
4
0.3%
*Cost/Finances/Debt
63
5%
Cost/Finances/Debt - Implementation Cost
7
0.6%
Cost/Finances/Debt - Implementation Cost - UME
2
0.2%
Cost/Finances/Debt - Program Cost
9
0.7%
Cost/Finances/Debt - Student Cost
61
4.8%
Cost/Finances/Debt - Student Debt
3
0.2%
*COVID Impact
16
1.3%
*Data Transparency & Availability
18
1.4%
Data Transparency & Availability - Data to Support Informed Decisions
4
0.3%
Data Transparency & Availability - Database of Program Info
6
0.5%
*DEI
101
8%
DEI - Bias
6
0.5%
DEI - Diversity
4
0.3%
DEI - Equity
68
5.4%
DEI - Fairness
16
1.3%
DEI - Inclusion
11
0.9%
DEI - Inclusion - Community outreach program(s)
7
0.6%
DEI - Reputation
7
0.6%
DEI - School Resource Availability
27
2.1%
DEI - SES
34
2.7%
DEI - Small Program(s)
23
1.8%
DEI - URM
19
1.5%
*DO/Osteopathy/Osteopathic
11
0.9%
*Funding
22
1.7%
Funding - GME Funding
2
0.2%
*Implementation
39
3.1%
Implementation - Change Management
3
0.2%
Implementation - Impact
34
2.7%
*Interviews
14
1.1%
41
Table 13: Code Application Counts for Away Rotations Continued
Code
N
Percent
Interviews - Virtual Interviews
6
0.5%
*Matching Process
25
2%
*Medical School Prestige
2
0.2%
*Non-US Trained Students
8
0.6%
Non-US Trained Students - IMG
8
0.6%
Non-US Trained Students - US IMG
2
0.2%
*Oversight
21
1.7%
Oversight - Cohesive Oversight Committee
11
0.9%
*Research
22
1.7%
*Roles
4
0.3%
Roles - Other Roles
2
0.2%
*Rotations
152
12%
Rotations - Audition Rotations
16
1.3%
Rotations - Away Rotations
150
11.8%
*Specialties
33
2.6%
Specialties - Competitive Specialties
14
1.1%
Specialties - Specialty Selection
10
0.8%
*Standardization of Requirements
16
1.3%
Standardization of Requirements - Cross Specialty Standardization
9
0.7%
*Training
18
1.4%
*Transition to Residency
3
0.2%
*Wellness/Wellbeing
11
0.9%
Wellness/Wellbeing - Life Changes
2
0.2%
Total
1,270
100%
42
Figure 11: Code Application for Away Rotations
43
Figure 12: Bigrams for Away Rotations
44
DIVERSITY, EQUITY, AND INCLUSION (DEI) IN MEDICINE
Table 14: Sentiment for Diversity, Equity, and Inclusion (DEI) in Medicine
Sentiment | N | Percent
Agree | 44 | 22.2%
Disagree | 9 | 4.5%
Mixed | 19 | 9.6%
Total Comments | 198 | 100%
Figure 13: Sentiment for Diversity, Equity, and Inclusion (DEI) in Medicine
45
Diversity, Equity, and Inclusion (DEI) in Medicine: Selected Verbatims
[…] 17. This statement lacks substance and a plan with education is necessary. DEI comprehension is lacking
in medicine, even after many years, and it is deadly for our patients and tragic for our learners (Role: Medical
School Assistant/Associate Dean, MD)
18: Applications are full of potential biases, but this comment doesn't include a primary source of bias: names.
Ample evidence suggests that just seeing a traditionally female vs male name or a stereotypically Black name
vs White name can bias a reviewer. We should find a way to remove names from applications so that we can
review them blindly without name bias. […] (Role: Faculty Member of a Medical School, MD)
16. I worry about "forcing" students to reveal this sensitive identity information. 18. Same concerns as for 16.
[…] students may feel a lot of jeopardy in having to disclose this information. At the school level, it will be hard to
keep this data truly anonymous. Student may feels it will be very easy to identify them if they possess enough
rare traits e.g. gender queer native student. […] (Role: Medical School Assistant/Associate Dean, MD)
18. To facilitate evaluation of performance and outcomes by race, ethnicity, gender, etc. standardized collection
tools to allow for comparison across SOM would be welcomed. The GQ allows for some analysis if students
self identify but uniform collection of data across schools perhaps mandated by the LCME would allow for
reporting, comparison and highlighting where differences exist allowing for and demanding deeper analysis. […]
Standard tools and assessments will allow educators to determine if disparities occur and to better understand
how outcomes are impacted by race, gender, etc. all in an effort to decrease structural inequities that are
pervasive in education and healthcare. (Role: Clerkship Director, MD)
16) I interviewed 200 applicants personally. My pool of interviewees contained many women, blacks, Hispanics
and international medical graduates. For the past two years, my top picks have been people of color. None of
them ranked us. Again, when you are a n economically disadvantaged program, you cannot compete to get
diversity unless you accept applicants who have failed their USMLEs. And because we are a small, new
program - we cannot afford people who cannot pass their USMLEs since there is research data to show that
those same applicants can't pass the ABA's BASIC exam. […] (Role: Residency Program Director, MD)
[…] 18. […] STAFF DEVELOPMENT IS CRUCIAL. I've found it difficult to find the right resources at my
institution for staff development even though DEI has become a big part of our institutional conversation. Also
getting Program Director buy-in is challenging but also just finding the time/resources in a busy academic
institution with increasing patient demands AND a pandemic. […] (Role: Other: Residency & fellowship
program administrator/coordinator)
16. […] More students from diverse socioeconomic groups and non urban communities must be identified early
in their education, mentored and encouraged to get education and preparation for medical school, and recruited
actively and proactively. Work with community colleges, community groups and do not discriminate against
students who have had to work to go to college. […] (Role: Faculty Member of a Medical School, MD)
16. This is an absolute necessity. There is no reason that as a nation where Black people make up
approximately 25% of the population, they make up far less of the medical faculty and providers at hospitals,
residency programs and clinics throughout this country. The lack of Black male medical doctors is a stain on the
integrity of the medical establishment in this country and must be addressed for the betterment of our patient
population (Role: Medical School Student)
46
Table 15: Code Application Counts for Diversity, Equity, and Inclusion (DEI) in
Medicine
Code
N
Percent
*Advice & Coaching
17
2.1%
Advice & Coaching - Alternative Careers
5
0.6%
Advice & Coaching - Career Advising
9
1.1%
Advice & Coaching - Coaching
3
0.4%
Advice & Coaching - Specialty-specific Advising
3
0.4%
Advice & Coaching - Staff training to support students
2
0.2%
*Applications
11
1.3%
Applications - Biasing Applications
4
0.5%
Applications - LOR (Letters of Recommendation)
2
0.2%
Applications - MSPE (Medical School Performance Evaluation)
2
0.2%
Applications - Objective Metrics to Gauge Applicants
3
0.4%
Applications - Standardization of Application Process
2
0.2%
*Assessment
9
1.1%
Assessment - Standardized Exams
8
1%
*Assessment and Performance Data
21
2.6%
Assessment and Performance Data - Grades & Grading Pass Fail
14
1.7%
Assessment and Performance Data - Holistic Review
5
0.6%
*Competencies
3
0.4%
*Cost/Finances/Debt
19
2.3%
Cost/Finances/Debt - Implementation Cost
12
1.5%
Cost/Finances/Debt - Implementation Cost - GME
2
0.2%
Cost/Finances/Debt - Implementation Cost - UME
2
0.2%
Cost/Finances/Debt - Student Cost
4
0.5%
Cost/Finances/Debt - Student Debt
3
0.4%
*COVID Impact
2
0.2%
*Data Transparency & Availability
18
2.2%
Data Transparency & Availability - Data to Support Informed Decisions
6
0.7%
Data Transparency & Availability - Database of Program Info
6
0.7%
Data Transparency & Availability - Filters
4
0.5%
*DEI
113
13.8%
DEI - Balance when it comes to DEI
10
1.2%
DEI - Bias
53
6.5%
DEI - Bias - Racial Bias
10
1.2%
DEI - Diversity
48
5.8%
DEI - Diversity - Diversity Monitoring of Programs
21
2.6%
DEI - Diversity - Diversity Quotas
5
0.6%
DEI - Diversity - Policy Implications
8
1%
DEI - Elimination of Honors
8
1%
Table 15: Code Application Counts for Diversity, Equity, and Inclusion (DEI) in Medicine Continued
Code
N
Percent
47
DEI - Equity
11
1.3%
DEI - Fairness
5
0.6%
DEI - First-gen Med Student Support
4
0.5%
DEI - Inclusion
11
1.3%
DEI - Inclusion - Community outreach program(s)
3
0.4%
DEI - School Resource Availability
5
0.6%
DEI - SES
13
1.6%
DEI - URM
27
3.3%
DEI - URM - Black Medical Students
3
0.4%
DEI - URM - Non-URMs being put at disadvantage
3
0.4%
*DO/Osteopathy/Osteopathic
6
0.7%
*Faculty
9
1.1%
Faculty - Faculty Development
7
0.9%
*Funding
8
1%
Funding - GME Funding
5
0.6%
Funding - Influence of Private Equity
2
0.2%
*Implementation
28
3.4%
Implementation - Impact
16
1.9%
*Interviews
2
0.2%
Interviews - Interview Selection Criteria
2
0.2%
*Matching Process
42
5.1%
Matching Process - Slots
13
1.6%
Matching Process - Unmatched
31
3.8%
*Medical School Prestige
2
0.2%
*Mid-level Practitioners
5
0.6%
*Non-US Trained Students
13
1.6%
Non-US Trained Students - IMG
9
1.1%
Non-US Trained Students - US IMG
2
0.2%
*Oversight
3
0.4%
*Physician Shortage
9
1.1%
*Research
3
0.4%
*Rotations
2
0.2%
*Specialties
20
2.4%
Specialties - Competitive Specialties
5
0.6%
Specialties - Specialty Selection
8
1%
*Standardization of Requirements
2
0.2%
*Transition to Residency
2
0.2%
*Wellness/Wellbeing
3
0.4%
Total
821
100%
48
Figure 14: Code Application for Diversity, Equity, and Inclusion (DEI) in Medicine
49
Figure 15: Bigrams for Diversity, Equity, and Inclusion (DEI) in Medicine
50
APPLICATION PROCESS
Table 16: Sentiment for Application Process
Sentiment | N | Percent
Agree | 76 | 25.9%
Disagree | 13 | 4.4%
Mixed | 31 | 10.5%
Total Comments | 294 | 100%
Figure 16: Sentiment for Application Process
51
Application Process: Selected Verbatims
Recommendation is NOT clear. Does it recommend to just mention overall characteristics of applicants
Interviewed, Matched/Ranked or recommends disclosing each and every applicant information like scores, year
of graduation, medical school, attempts etc? […] (Role: Medical School Student)
21. […] adding an application cap would encourage use of this information to inform their application choices.
The application numbers continue to rise every year as students decide to apply to more programs to improve
their chance of matching. Many times an unmatched student is unmatched because they applied to too many
programs where they were not a competitive applicant and not enough where they had a good chance of
matching. Having better information to make these decisions is a great improvement and an application cap
would incentivize it's use. (Role: Residency Program Director, MD)
[…] it is important to ensure that solutions mitigate potential bias. ERAS fees contribute to a significant portion
of the AAMC operating budget. This represents a conflict of interest for change in structure. It will be important
that these conflicts of interests are dealt with when devising any centralized systems to ensure that they are not
going to be an additional source of income/costs to students and that there is oversight to ensure that the data is
transparent and validated.[…]. (Role: Medical School Assistant/Associate Dean, MD)
23. FTM - Failing To Match, is rampant on both sides, applicants and program directors. If both sides had a
system to work with that would allow them to develop a short list of "good" possibilities the anxiety would
decrease. But "good" needs to be defined as both the right fit, but also a high probability of matching.[…]
(Role: Faculty Member of a Medical School, MD)
22 and 23. It is our view that recommendations in this category do not go far enough. Until we truly change
criteria used to filter applicants we will not overcome the problems associated with the current UME GME
transition. We believe that significant more work needs to be done to define the characteristics that define an
applicant who will succeed in residency and beyond. Those characteristics need to be included. All other
characteristics that do not lead directly to high performing physicians should be excluded. […] (Role: I am
responding on behalf of an organization or group in an official capacity)
24: […] Creating a false equivalency between USMLE and COMLEX would only damage DO applicants further
by bringing ambiguity into the source of their percentile score as M.D. applicants cannot take COMLEX. I am
appalled that a committee such as this cannot comprehend even the most basic statistical evidence of this
discrepancy which is so well documented. The business interests of the NBOME are secondary to insuring
rigorous standards and score reporting for all medical students. The time is now to relegate the NBOME to
administering an osteopathic principles specific test and for all medical students, M.D. and D.O., to take the
USMLE steps 1, 2 and 3. (Role: Medical School Student)
The three-digit scores I achieved as a medical school graduate absolutely do not define nor limit my ability to
learn, grow, improve, and be a competent physician. In fact, as an applicant who has achieved the minimal
requirements of hundreds of programs, it is incumbent upon residency programs to educate residents and me
on how to maximize my testing skills, challenge my foundation of medical knowledge , and push me to succeed
as a physician so that patients can receive the care they deserve. In a manner of speaking, it is a failure of the
medical education system that medical school graduates such as myself are underprepared and under-
supported to be successful on licensure eligibility exams. (Role: Other: Unmatched Doctor, MD)
21. As I update my program's website, my residents have told me they do not want to provide any personal data
about their culture, religion or sexuality for the residency recruitment process, even if stored in a password
protected server. They feel this is their personal protected information and should not be used in recruitment. I
asked if it would have been helpful for them to see this when they were choosing a program, to align
themselves, and they said NO. The thought is that it hurts diversity to try to pick a program where everyone
there is culturally or racially or genderbased - like you are. It is best to base these decisions on the program
curriculum and other hospital data and offerings. (Role, Residency Program Director, DO)
As an osteopathic student, I know I will have biases against me from program directors. I know I cannot change
that. However, I think osteopathic schools should do a better job with our school structures in order to show PDs
that MD and DO schools can be comparable institutions. For example, my DO school does not have chairs for
every specialty because we are (obviously) not part of a university hospital system. No chair of medicine,
anesthesiology, surgery, etc. We only have a department of FM (which all the faculty regardless of specialty are
technically under but most faculty are FM docs) and the department of clinical specialities which is very vague.
So for example for students applying IM, students have to be lucky enough to rotate at a hospital where the
chair of a medicine has experience with and is willing to write a LoR. Not all of our community hospitals have
residencies so there’s varying levels of comfort in these community doctors writing letters. That puts the burden
on us as 4th years to find different rotations where we can get LoRs to fulfill letter requirements. And since our
schools only have affiliations with hospitals, we are not guaranteed that our better 3rd year hospitals will take us
on for 4th year as well because our school has such a large class size and these hospitals accommodate
Caribbean students as well. So then this makes us use Away rotations as a backup option, but not all Away
rotations will take us because we are either DO students or because they have a USMLE requirement when not
every DO student takes the USMLE. There are many things that osteopathic schools can do better, but I hope I
highlighted the pertinent things that I don’t think as many people think about. It’s easy for osteopathic physicians
to forget about some of the things that made their education more difficult once they’re past medical school
because “they’ve made it” to residency. But I will continue to advocate and voice my opinions so that the
judgment of being a DO is diminished little by little every year. I really appreciate you taking the time to read my
comments and thank you for working on these changes in UME and GME. Stay safe! (Role: Medical School
Student)
20- I have not seen any way to quantitate the data about a resident in a way that would be meaningful in a
database-- quantity then counts more than quality. I would rather have a resident who was meaningfully
involved in 1 volunteer activity than one who did 5 with little effort- but a data base will say that one did 1 and the
other did 5. 21- Similar to my comment about 20- I do not see how such a database will actually be meaningful.
We interview and rank and match a wide variety of applicants and a data base will not be able to reflect that
nuance. (Role: Residency Program Director, MD)
I worry about having a public database of stats regarding who we matched vs. who we ranked. It would not help
those who matched with us to know they were perhaps not our top ranked applicants. We often fill in the middle
to third quartile of our list. But we match students at the top of the list as well. There are always stronger and
less strong interns-- and the stronger ones are critical to pulling up the less strong. It would be terrible to be
viewed as a program that only attracted one kind of applicant. (Role: Residency Program Director)
23. Filters can help programs from a demographics perspective, for internal accountability. For example, some
programs have a filter by gender NOT because they are seeking to exclude a particular gender, but because it
provides a quick look at what their applicant pool looks like. For example, a filter for female applicants can
quickly help programs determine what the percentage of female applicants to their program is, and from there
determine what the percentage of female applicants in their “selected to interview” group is. It allows programs
to see if they are over- or under-representing a particular gender. If such a filter is published (or excluded), a
potential beneficial use is removed or could subject a program using it for good intentions to be scrutinized
unnecessarily. […] 24. A standardization of the USMLE and COMLEX scores should be considered carefully,
as there are studies that suggest that a simple percentile comparison between tests is not likely sufficient. A
comparison of percentile only, without considering the test taken, likely disadvantages a 50th percentile USMLE
examinee compared to a 50th percentile COMLEX examinee. […] A move toward a single licensure
examination taken by all students, with an additional OMM/OMT examination for osteopathic students, would
better serve this goal […]. (Role: Other: Associate Program Director, MD)
20 - i have concerns with this concept overall. When you tell applicants to continue to only look for programs
where the current residents or current applicants match that applicants characteristics, this inhibits program
change and growth. For example, if a program is working hard on recruitment of URiM residents, but currently
have 0-2 URiM residents in their program, applicants may think this is a place where they do not have a good fit,
but then it will be a self-fulfilling prophecy for the program and this will inhibit their growth, no matter how hard
they dedicate themselves to this cause. In addition, the same applies to other metrics for applicants. If a
program is identified as taking applicants with higher board scores or from certain schools, then those with lower
scores may not apply. Alternatively, if a program is identified as having all residents with lower scores, this may
deter applicants with those higher and mark those programs as "not as good” […]. (Role: Residency
Program Director, MD)
20. […] Some great residents/physicians may struggle with the boards, which are less and less relevant in an
information age. Specialty board exams do not measure quality, nor to they measure the most important skill of
future physicians - communication. Any recent graduate should be able to use decision support resources to
identify a differential, proper testing/evaluation, and proper treatmtents. It's more important the the physician be
able to explain it all in a way that engages the patient and family in the treatment program. […] (Role: Medical
School Assistant/Associate Dean, MD)
20: Earlier comment about medicine, specialties, and programs having no idea what they actually desire as
outcomes and no way to inform their selection process outside of stratifying metrics that demonstrably (large
evidence base) do not predict future performance in a meaningful way and simply introduce bias. 21: Need to
do a deep dive on what metrics are actually meaningful. I am certain many are missed here. What can be
measured easily is often not what is of merit. 22: Machines are trained by people. Biased people make biased
machines. Training NLP to be thorough will reduce selection to arbitrary keyword searching. The way this is
proposed does not promote holistic recruitment. I doubt that is the intent. This is a hotly debated topic (see ICRE
May 2021 presentation on this topic). 23: Filters should only be created once we know what our outcomes are
and how to alter the selection process. Extensive scholarship is needed in this domain (mine and a select
handful of others is ongoing but will require support and buy-in from a broad stakeholder group and to move to a
co-productive model). […] (Role: Residency Program Director, MD)
Table 17: Code Application Counts for Application Process
Code | N | Percent
*Advice & Coaching | 6 | 0.5%
Advice & Coaching - Career Advising | 6 | 0.5%
*Applications | 79 | 6.8%
Applications - Caps and Limits | 22 | 1.9%
Applications - Application Process | 12 | 1%
Applications - Application Redundancy | 8 | 0.7%
Applications - Biasing Applications | 13 | 1.1%
Applications - LOR (Letters of Recommendation) | 5 | 0.4%
Applications - MSPE (Medical School Performance Evaluation) | 16 | 1.4%
Applications - Objective Metrics to Gauge Applicants | 7 | 0.6%
Applications - Standardization of Application Process | 12 | 1%
*Assessment | 99 | 8.5%
Assessment - Standardized Exams | 97 | 8.3%
Assessment - Inequality in Scaling | 36 | 3.1%
Assessment - Licensing exam quality differences | 14 | 1.2%
Assessment - Single Licensing Exam | 41 | 3.5%
*Assessment and Performance Data | 33 | 2.8%
Assessment and Performance Data - Grades & Grading Pass Fail | 4 | 0.3%
Assessment and Performance Data - Holistic Review | 29 | 2.5%
*Competencies | 2 | 0.2%
*Cost/Finances/Debt | 40 | 3.4%
Cost/Finances/Debt - Implementation Cost | 10 | 0.9%
Cost/Finances/Debt - Implementation Cost - GME | 5 | 0.4%
Cost/Finances/Debt - Implementation Cost - UME | 2 | 0.2%
Cost/Finances/Debt - Student Cost | 27 | 2.3%
Cost/Finances/Debt - Student Debt | 2 | 0.2%
*Data Transparency & Availability | 117 | 10%
Data Transparency & Availability - Data to Support Informed Decisions | 38 | 3.3%
Data Transparency & Availability - Database of Program Info | 67 | 5.7%
Data Transparency & Availability - Filters | 51 | 4.4%
*DEI | 34 | 2.9%
DEI - Bias | 7 | 0.6%
DEI - Bias - Racial Bias | 2 | 0.2%
DEI - Diversity | 5 | 0.4%
DEI - Equity | 16 | 1.4%
DEI - Fairness | 4 | 0.3%
DEI - SES | 2 | 0.2%
DEI - URM | 4 | 0.3%
*DO/Osteopathy/Osteopathic | 48 | 4.1%
*Funding | 3 | 0.3%
*Implementation | 33 | 2.8%
Implementation - Impact | 29 | 2.5%
*Interviews | 12 | 1%
Interviews - Interview Caps and Limits | 5 | 0.4%
Interviews - Interview Selection Criteria | 4 | 0.3%
*Matching Process | 11 | 0.9%
Matching Process - Slots | 2 | 0.2%
Matching Process - Unmatched | 7 | 0.6%
*Mid-level Practitioners | 2 | 0.2%
*Non-US Trained Students | 8 | 0.7%
Non-US Trained Students - IMG | 6 | 0.5%
*Specialties | 15 | 1.3%
Specialties - Competitive Specialties | 2 | 0.2%
Specialties - Specialty Selection | 4 | 0.3%
*Wellness/Wellbeing | 4 | 0.3%
Total | 1,169 | 100%
Figure 17: Code Application for Application Process
Figure 18: Bigrams for Application Process
INTERVIEWING
Table 18: Sentiment for Interviewing
Sentiment | N | Percent
Agree | 109 | 23.5%
Disagree | 75 | 16.2%
Mixed | 110 | 23.7%
Total Comments | 464 | 100%
Figure 19: Sentiment for Interviewing
Interviewing: Selected Verbatims
[…] The following delineates concerns regarding the recent UGRC report. The main concern is focused on
item 26 which produced a wide-reaching opinion regarding the conduct of the admissions process for 2021-
2022. The proposed guidance conflates public health policy with arguments regarding equitable process.
Process: • The report has not been inclusive of specialty societies and is focused on a UME constituency.
[…] The specific details of the construct of an interview day, specifically a determination between virtual or in
person interviewing, is outside the scope of the UGRC. […] Item 26 conflates public health policy with
assumptions regarding equity. The two issues should be separate. […] Forced conversion to virtual interviews
will create an environment whereby candidates interested in a specific program will find ways to visit the
program outside of the interview process. This will create an advantage for those who are able to afford or
arrange the opportunity. […] Programs will be burdened with adjudicating special interests regarding
“outside formal process” visitation. Special circumstances pertaining to personal advocacy or relationships
will be more prominent in the admissions process as “insider” influence is exerted to produce introductions
and/or special visits. • Insularity will be increased. Without on site introductions to programs, culture,
residency cohorts, and faculty, candidates will regress to what they know. The trend will be to stay at the home
program. • Home programs will regress to “safety”. Home students and people who accomplished their
single rotation opportunity at the program will be prioritized. This will breed insularity. […]. Programs in
less desirable areas will be severely disadvantaged. They will be stunted in their ability to attract learners
outside of their immediate sphere. Reputation of an institution or program will be artificially emphasized.
•Candidates will not be able to assess the culture of an institution or program in a personal and facile manner
via the virtual platform which result in limiting choice. […] (Role: Faculty Member of a Medical School,
MD)
Recommendation #26. Interviewing should be virtual for the 2021-2022 residency recruitment season. To
ensure equity and fairness, there should be ongoing study of the impact and benefits of virtual interviewing as a
permanent means of interviewing for residency. Just as programs were free to interview in-person or virtually,
elect to provide meals, elect to fund transportation/accommodations, elect to provide trinkets/souvenirs before
COVID-19, the same should be the case in the '21-'22 recruitment season unless the pandemic imposes
national restrictions on in-person interviewing. Programs are different and we should value and appreciate
those differences. […] A one-size fits all mandate will not achieve parity and fairness, but instead will leave
certain programs more or less advantaged. Recommendation #27. Implement a centralized process to
facilitate evidence-based, specialty-specific limits on the number of interviews each applicant may attend. This
is only part of the problem...if you limit the number of programs an applicant can apply to, but don't limit the
number of applicants a program can interview, then the subjectively ranked "top-tiered" programs will
undoubtedly see the greatest benefit while the subjectively ranked "lowest ranked" programs will struggle to find
applicants available for invitations. An early-decision program can only work if there are limits for both parties
as it will force applicants to only pick a few of their desired programs and force programs to only invite a
restricted number of their desired applicants, leaving more applicants and programs available to other programs
and applicants, respectively. (Role: Medical School Assistant/Associate Dean, MD)
Table 19: Code Application Counts for Interviewing
Code | N | Percent
*Advice & Coaching | 3 | 0.1%
Advice & Coaching - Career Advising | 2 | 0.1%
*Applications | 142 | 3.9%
Applications - Caps and Limits | 61 | 1.7%
Applications - Application Process | 81 | 2.2%
Applications - Biasing Applications | 9 | 0.2%
Applications - Objective Metrics to Gauge Applicants | 6 | 0.2%
Applications - Standardization of Application Process | 35 | 1%
*Assessment | 8 | 0.2%
Assessment - Accurate assessments | 3 | 0.1%
Assessment - Standardized Exams | 5 | 0.1%
*Assessment and Performance Data | 12 | 0.3%
Assessment and Performance Data - Holistic Review | 10 | 0.3%
*Communication | 23 | 0.6%
*Competencies | 2 | 0.1%
*Cost/Finances/Debt | 123 | 3.4%
Cost/Finances/Debt - Implementation Cost | 19 | 0.5%
Cost/Finances/Debt - Program Cost | 31 | 0.9%
Cost/Finances/Debt - Student Cost | 116 | 3.2%
Cost/Finances/Debt - Student Debt | 10 | 0.3%
*COVID Impact | 62 | 1.7%
*Data Transparency & Availability | 167 | 4.6%
Data Transparency & Availability - Data to Support Informed Decisions | 159 | 4.4%
Data Transparency & Availability - Database of Program Info | 27 | 0.7%
Data Transparency & Availability - Filters | 4 | 0.1%
*DEI | 194 | 5.3%
DEI - Bias | 19 | 0.5%
DEI - Diversity | 9 | 0.2%
DEI - Equity | 126 | 3.5%
DEI - Fairness | 66 | 1.8%
DEI - Inclusion | 20 | 0.6%
DEI - Inclusion - Community outreach program(s) | 11 | 0.3%
DEI - Reputation | 38 | 1%
DEI - School Resource Availability | 22 | 0.6%
DEI - SES | 37 | 1%
DEI - Small Program(s) | 48 | 1.3%
DEI - URM | 5 | 0.1%
*DO/Osteopathy/Osteopathic | 5 | 0.1%
*Funding | 17 | 0.5%
Funding - GME Funding | 3 | 0.1%
*Implementation | 126 | 3.5%
Implementation - Change Management | 63 | 1.7%
Implementation - Cohesive Policy | 23 | 0.6%
Implementation - Impact | 119 | 3.3%
*Interviews | 409 | 11.3%
Interviews - Interview Caps and Limits | 156 | 4.3%
Interviews - Interview Selection Criteria | 68 | 1.9%
Interviews - Virtual Interviews | 328 | 9%
*Matching Process | 124 | 3.4%
Matching Process - Couples | 7 | 0.2%
Matching Process - Early Decision/Matches | 4 | 0.1%
Matching Process - Matched | 6 | 0.2%
Matching Process - Second Looks | 21 | 0.6%
Matching Process - SOAP (Supplemental Offer and Acceptance Program) | 5 | 0.1%
Matching Process - Unmatched | 9 | 0.2%
*Medical School Prestige | 7 | 0.2%
*Non-US Trained Students | 12 | 0.3%
Non-US Trained Students - IMG | 10 | 0.3%
*Oversight | 32 | 0.9%
Oversight - Cohesive Oversight Committee | 2 | 0.1%
*Research | 70 | 1.9%
*Roles | 23 | 0.6%
Roles - Program Directors | 22 | 0.6%
*Rotations | 9 | 0.2%
Rotations - Away Rotations | 8 | 0.2%
*Specialties | 40 | 1.1%
Specialties - Competitive Specialties | 18 | 0.5%
Specialties - Specialty Selection | 4 | 0.1%
*Standardization of Requirements | 51 | 1.4%
Standardization of Requirements - Cross Specialty Standardization | 10 | 0.3%
*Training | 10 | 0.3%
*Transition to Residency | 17 | 0.5%
Transition to Residency - Timing | 4 | 0.1%
*Wellness/Wellbeing | 55 | 1.5%
Wellness/Wellbeing - Life Changes | 22 | 0.6%
Total | 3,634 | 100%
Figure 20: Code Application for Interviewing
Figure 21: Bigrams for Interviewing
MATCHING PROCESS
Table 20: Sentiment for Matching Process
Sentiment | N | Percent
Agree | 84 | 32.1%
Disagree | 32 | 12.2%
Mixed | 11 | 4.2%
Total Comments | 262 | 100%
Figure 22: Sentiment for Matching Process
Matching Process: Selected Verbatims
This is another idea that would create further diversity in programs. Many people apply broadly so as to avoid
not matching. Because of a recent "toxic" culture of residencies and applicants alike misrepresenting their desire
to match each other, many desirable applicants do not match at programs that would have loved to take them.
Early decision options would allow applicants the opportunity to show definite interest in a program. That said,
early decision options should be hidden from all other institutions. Applicants should not be punished by other
institutions should they not match early decision[…] Should an early decision process take place, all applicants
not selected should be let back into the regular match process without prejudice. (Role: Medical School
Student)
An early decision application cycle is a terrible idea. This just extends the recruitment season for program
directors which extends the time during which a program director is not actually doing anything with his or her
residency program. Recruitment is a time at which everything with the residency program is on standstill and an
early cycle would just lengthen that. Also, an early cycle would set up a dichotomy among residents in my
program. […] If we go to a double match cycle with an early application cycle, then I will quit being program
director because I do not want to deal with it. Please […] please do not do this!!!!! Early decision works well in
a non-match cycle for undergraduate education, but in a match process just sets up an "us versus them". It is
no way will help me more holistically review applications. I am convinced that there is no good way to reduce
applications other than better advising at the med school level and anything else that we do to try to reduce
applications will just cause more harm than good. THe only thing that might be beneficial is program signaling
although the evidence from ENT has not been published yet. (Role: Residency Program Director, MD)
The Matching process is unfortunately very discouraging to what they refer to as “old IMG”. Every year I see
younger and younger graduates coming from far away and my chance is fading. […] with the level of clinical
experience and familiarity with the health system in America, I consider myself to be qualified and ready to start
residency training regardless of my age “41” and years of graduation “15”. […] I suggest booster cycles for
Citizens old IMGs, where applications can be given more time for review and consideration. […] (Role: Non-
Practicing Physician/Clinician, MBBS)
[…] One question that arises is whether, in a competency based time variable educational system, (which
seems to be a newer model for students who might achieve competency earlier than the traditional model),
should there be an option for an early entry into residency. Not sure if this was discussed. I am not proposing a
3 year medical school (although my own school is considering this) but rather two entry points into residency.
This would also address the unmatched who might be ready for residency in January rather than waiting for
July. (Role: Clerkship Director, MD)
Process needs to be revamped. […] we have considered resident-funded residency positions, with a yearly
tuition, to help students who are eager to match and become licensed. As the process is often an impediment to
many qualified applicants, we have also discussed an "associate/extender" nomenclature, especially in the
primary care setting to create an extended "residency," particularly in the primary care/family medicine
speciality/setting. Perhaps a 4 to 5 year term in this role to qualify for board certification/licensure. Also, we
would like the ability to compare "oranges to oranges" and "apples to apples" by the creation of a universal
licensing exam for both Osteopathic and Allopathic applicants; as their residencies are universally accredited.
[…] (Role: Residency Program Director, MD)
We have got to have some way of reducing #of applications per applicant. I advise many students and my
advice on this falls on deaf ears. My best student this year applied to > 80 programs, received almost that many
interviews, interviewed at 34, matched at her number one which was our program - right where she started! The
fruitless ness of that effort is astounding. (Role: Residency Program Director, MD)
Recommendation #28. […] Sans a few exceptions (such as Urology and Ophthalmology), we must first accept
the inescapable truth that the AAMC and NRMP generate huge revenues due to application inflation (that they
welcome with open arms) and it's laughable that none of that profit goes back to the users (applicants and
programs) who are mandated to use this monopolized process. We must also appreciate that our GME
programs, funded in large part by the American public, are not in a position to ensure our U.S. MD/DO
graduates (with tremendous debt burdens often owed to the American public through federal loan programs) are
provided with a GME training opportunity before opening remaining spots to others as they are tasked with
competing with all applicant types in a one-time Match. […] (Role: Residency Program Director, MD)
28. There needs to be an enforced maximum number of applications per applicant. In emergency medicine this
year, our program essentially received applicants from half of the candidates applying to the specialty. It is
impossible to screen this number of applicants holistically and nearly impossible to screen them using
conventional USMLE cut-offs. The current situation is simply untenable. (Role: Residency Program
Director, MD)
Table 21: Code Application Counts for Matching Process
Code | N | Percent
*Advice & Coaching | 7 | 0.8%
Advice & Coaching - Career Advising | 4 | 0.4%
Advice & Coaching - Specialty-specific Advising | 2 | 0.2%
*Applications | 69 | 7.6%
Applications - Caps and Limits | 35 | 3.8%
Applications - Application Process | 17 | 1.9%
Applications - Application Redundancy | 14 | 1.5%
Applications - MSPE (Medical School Performance Evaluation) | 2 | 0.2%
Applications - Standardization of Application Process | 5 | 0.5%
*Assessment | 5 | 0.5%
Assessment - Standardized Exams | 5 | 0.5%
*Assessment and Performance Data | 15 | 1.6%
Assessment and Performance Data - Holistic Review | 15 | 1.6%
*Communication | 2 | 0.2%
*Competencies | 2 | 0.2%
*Cost/Finances/Debt | 33 | 3.6%
Cost/Finances/Debt - Implementation Cost | 8 | 0.9%
Cost/Finances/Debt - Implementation Cost - GME | 3 | 0.3%
Cost/Finances/Debt - Program Cost | 7 | 0.8%
Cost/Finances/Debt - Student Cost | 20 | 2.2%
Cost/Finances/Debt - Student Debt | 4 | 0.4%
*COVID Impact | 5 | 0.5%
*Data Transparency & Availability | 12 | 1.3%
Data Transparency & Availability - Data to Support Informed Decisions | 8 | 0.9%
Data Transparency & Availability - Database of Program Info | 4 | 0.4%
Data Transparency & Availability - Filters | 4 | 0.4%
*DEI | 39 | 4.3%
DEI - Bias | 6 | 0.7%
DEI - Diversity | 8 | 0.9%
DEI - Equity | 16 | 1.8%
DEI - Fairness | 5 | 0.5%
DEI - Inclusion | 3 | 0.3%
DEI - Small Program(s) | 8 | 0.9%
DEI - URM | 2 | 0.2%
*Funding | 5 | 0.5%
Funding - GME Funding | 2 | 0.2%
Funding - Unfunded Mandate | 2 | 0.2%
*Implementation | 57 | 6.3%
Implementation - CQI (Continuous Quality Improvement) | 2 | 0.2%
Implementation - Impact | 50 | 5.5%
*Interviews | 15 | 1.6%
Interviews - Interview Caps and Limits | 4 | 0.4%
Interviews - Interview Selection Criteria | 2 | 0.2%
Interviews - Virtual Interviews | 2 | 0.2%
*Matching Process | 130 | 14.3%
Matching Process - Couples | 2 | 0.2%
Matching Process - Early Decision/Matches | 108 | 11.9%
Matching Process - Matched | 3 | 0.3%
Matching Process - Slots | 9 | 1%
Matching Process - SOAP (Supplemental Offer and Acceptance Program) | 6 | 0.7%
Matching Process - Unmatched | 10 | 1.1%
*Medical School Prestige | 7 | 0.8%
*Non-US Trained Students | 9 | 1%
Non-US Trained Students - IMG | 6 | 0.7%
*Physician Shortage | 3 | 0.3%
*Research | 4 | 0.4%
*Roles | 2 | 0.2%
*Rotations | 5 | 0.5%
Rotations - Away Rotations | 3 | 0.3%
*Specialties | 27 | 3%
Specialties - Competitive Specialties | 3 | 0.3%
Specialties - Specialty Selection | 13 | 1.4%
*Transition to Residency | 5 | 0.5%
Transition to Residency - Timing | 4 | 0.4%
*Wellness/Wellbeing | 16 | 1.8%
Total | 910 | 100%
Figure 23: Code Application for Matching Process
Figure 24: Bigrams for Matching Process
FACULTY SUPPORT RESOURCES
Table 22: Sentiment for Faculty Support Resources
Sentiment | N | Percent
Agree | 58 | 66.7%
Disagree | 5 | 5.7%
Mixed | 2 | 2.3%
Total Comments | 87 | 100%
Figure 25: Sentiment for Faculty Support Resources
Faculty Support Resources: Selected Verbatims
29: Resources identified in this repository should include accredited and vetted remedial education courses that
comprehensively address lapses in professionalism, which are predictable at this transitionary phase in a
physician's career. Common cases of these behavioral lapses in the transition phase include impropriety or
cheating on exams due to immense pressures to succeed, or lapses of professional behavior with colleagues or
patients during their adjustment to their new professional role and identity. […] Education is a critical component
to longitudinal success of a remediation plan to prevent recidivism and future harm to the public. It is important
that the remedial education provider is a neutral third party, separate from the residency program, to eliminate
bias, and to foster an open safe space for disclosure and to mitigate feelings of mistrust on the part of the
resident. (Role: Intern/Resident/Fellow)
Comment on recommendation 29: We support this recommendation with clarification. Centralized resident
support resources will be invaluable to residency programs; however, we caution that evidenced based
resources may not adequately recognize the individual characteristics of each resident. Just as assessments
should be fair and equitable, our tools to assist with remediation and well-being must also be inclusive and
equitable. The ways in which individual identities inform the manner with which residents need to be supported
must be acknowledged and supported by the evidence. Comment on recommendation 30: We support this
recommendation with specific rephrasing. We recommend against use of use of ‘avoiding.’ Instead, we suggest
faculty should be trained to recognize their implicit bias and through faculty development, gain the tools to
appropriately address and mitigate those biases impacting their behaviors and decisions. The must is followed
by a list, however, faculty who are not involved in recruitment, for example, may not need the faculty
development focused on equity in recruitment. (Role: I am responding on behalf of an organization or group in
an official capacity)
Faculty are essential in a trainee’s pathway to independence, both influencing trainee professional identity
formation (PiF) (Recommendation 12) and providing meaningful feedback in skill development across the
continuum (Recommendations 9, 18, and 30). It is fundamental that the medical education community invests
in faculty development (FD). AAIM recommends the creation of FD tracks in teaching/learning, PiF,
evaluation/assessment, and instructional design/ curriculum development. To expand and evolve these tracks,
the medical education community must include non-physician educators and tap into their expertise to build a
cadre of competent physician educators. Residency programs and medical schools can leverage these experts
to assist with faculty development that could be beneficial at a national level. Institutions should be able to
access these shared resources so that they can build their own tracks or so that their faculty can easily access
and benefit from these national medical education programs. A shared approach will allow for greater
standardization of best practices, which will benefit the overall UME to GME transition. Since the focus is on
bolstering educator proficiency, the Alliance supports both didactic and peer-to-peer observations and feedback.
AAIM recognizes the financial impact this commitment entails and understands that budgets vary, and some
institutions would view it as an arduous undertaking; developing and sharing these tracks nationally is essential
to help ease that burden. […] (Role: I am responding on behalf of an organization or group in an official
capacity)
Table 23: Code Application Counts for Faculty Support Resources
Code | N | Percent
*Advice & Coaching | 6 | 1.7%
Advice & Coaching - Career Advising | 2 | 0.6%
Advice & Coaching - Staff training to support students | 3 | 0.9%
*Applications | 2 | 0.6%
*Assessment | 7 | 2%
Assessment - Standardized Exams | 2 | 0.6%
*Assessment and Performance Data | 2 | 0.6%
*Communication | 2 | 0.6%
*Cost/Finances/Debt | 14 | 4%
Cost/Finances/Debt - Implementation Cost | 8 | 2.3%
Cost/Finances/Debt - Program Cost | 5 | 1.4%
*Data Transparency & Availability | 15 | 4.3%
Data Transparency & Availability - Dashboard or Portfolio | 4 | 1.1%
Data Transparency & Availability - Data to Support Informed Decisions | 12 | 3.4%
*DEI | 30 | 8.6%
DEI - Bias | 18 | 5.1%
DEI - Bias - Racial Bias | 6 | 1.7%
DEI - Equity | 10 | 2.9%
DEI - Inclusion | 7 | 2%
DEI - Small Program(s) | 2 | 0.6%
*Faculty | 34 | 9.7%
Faculty - Faculty Development | 33 | 9.4%
*Funding | 5 | 1.4%
Funding - Unfunded Mandate | 2 | 0.6%
*Implementation | 45 | 12.9%
Implementation - Cohesive Policy | 9 | 2.6%
Implementation - CQI (Continuous Quality Improvement) | 2 | 0.6%
Implementation - Impact | 21 | 6%
*Non-US Trained Students | 2 | 0.6%
Non-US Trained Students - IMG | 2 | 0.6%
*Oversight | 3 | 0.9%
Oversight - Cohesive Oversight Committee | 2 | 0.6%
*Public Health | 2 | 0.6%
*Roles | 11 | 3.1%
Roles - DIO (Designated Institutional Officer) | 3 | 0.9%
Roles - Program Directors | 11 | 3.1%
*Wellness/Wellbeing | 6 | 1.7%
Total | 350 | 100%
Figure 25: Code Application for Faculty Support Resources
Figure 26: Bigrams for Faculty Support Resources
POST-MATCH TRANSITION TO RESIDENCY
Table 24: Sentiment for Post-Match Transition to Residency
Sentiment | N | Percent
Agree | 28 | 17.7%
Disagree | 7 | 4.4%
Mixed | 22 | 13.9%
Total Comments | 158 | 100%
Figure 27: Sentiment for Post-Match Transition to Residency
Post-Match Transition to Residency: Selected Verbatims
[…] 32: How are we determining these? Again, scholarship -> evidence informed best practices and policies ->
pilot innovations -> adoption 33: Mandatory orientations need to be supported with time and money. These are
not addressed. Additionally, scholarship -> evidence informed best practices and policies -> pilot innovations ->
adoption. This will be key for determining universal components and then those that belong to the specialties.
There also must be adaptable components to address what was learned in the handover and ILP process. 34:
There are a variety of innovations in curricula out there - a group should define best practices and ways to adapt
these to different environments 35: This will need to be designed, validity evidence gathered, a G study
performed, equity addressed, and how performance informs ILPs and coaching addressed. Again, scholarship -
> evidence informed best practices and policies -> pilot innovations -> adoption. […] (Role:
Residency Program Director, MD)
I believe that it must be emphasized that GME is a time of learning and that residents are learners. Too often,
there is a mismatch between expectations of new interns and their competency that is not due to a problem with
their competency, but rather generated by excessive expectations of residency programs -- especially in the first
3-4 months or internship -- that result from inadequate staffing, back-up, support, etc. Hospitals and other sites
of learning should be better supported in general, and especially in the summer months, to facilitate the
transition to residency. Interns are not ready for independent practice, and that should not be the expectation.
This is especially important to note since new interns are often transitioning to a different city, a different health
system, etc. at this time. (Role: I am responding on behalf of an organization or group in an official
capacity)
Table 25: Code Application Counts for Post-Match Transition to Residency
Code | N | Percent
*Advice & Coaching | 4 | 0.4%
*Assessment | 16 | 1.7%
Assessment - Accurate assessments | 12 | 1.2%
Assessment - Standardized Exams | 3 | 0.3%
*Assessment and Performance Data | 24 | 2.5%
Assessment and Performance Data - ILPs (Individualized Learning Plans) | 21 | 2.2%
*Communication | 2 | 0.2%
*Competencies | 9 | 0.9%
Competencies - EPAs (Entrustable Professional Activities) | 3 | 0.3%
Competencies - Milestones | 3 | 0.3%
*Cost/Finances/Debt | 48 | 5%
Cost/Finances/Debt - Implementation Cost | 24 | 2.5%
Cost/Finances/Debt - Implementation Cost - GME | 18 | 1.9%
Cost/Finances/Debt - Implementation Cost - UME | 9 | 0.9%
Cost/Finances/Debt - Program Cost | 16 | 1.7%
Cost/Finances/Debt - Student Cost | 34 | 3.5%
Cost/Finances/Debt - Student Debt | 13 | 1.4%
*Data Transparency & Availability | 22 | 2.3%
Data Transparency & Availability - Dashboard or Portfolio | 8 | 0.8%
Data Transparency & Availability - Data to Support Informed Decisions | 17 | 1.8%
Data Transparency & Availability - Database of Program Info | 13 | 1.4%
*DEI | 30 | 3.1%
DEI - Bias | 5 | 0.5%
DEI - Diversity | 5 | 0.5%
DEI - Equity | 8 | 0.8%
DEI - Fairness | 2 | 0.2%
DEI - Inclusion | 8 | 0.8%
DEI - School Resource Availability | 10 | 1%
DEI - SES | 3 | 0.3%
DEI - Small Program(s) | 4 | 0.4%
*Faculty | 7 | 0.7%
Faculty - Faculty Development | 7 | 0.7%
*Funding | 36 | 3.7%
Funding - GME Funding | 19 | 2%
*Implementation | 29 | 3%
Implementation - Change Management | 12 | 1.2%
Implementation - Cohesive Policy | 3 | 0.3%
Implementation - CQI (Continuous Quality Improvement) | 2 | 0.2%
Implementation - Impact | 22 | 2.3%
*Interviews | 2 | 0.2%
*Licensure | 2 | 0.2%
*Matching Process | 13 | 1.4%
Matching Process - Matched | 2 | 0.2%
Matching Process - Unmatched | 2 | 0.2%
*Non-US Trained Students | 9 | 0.9%
Non-US Trained Students - IMG | 4 | 0.4%
*Oversight | 7 | 0.7%
*Research | 16 | 1.7%
*Roles | 10 | 1%
Roles - Other Roles | 4 | 0.4%
Roles - Program Directors | 6 | 0.6%
*Specialties | 11 | 1.1%
Specialties - Specialty Selection | 4 | 0.4%
*Standardization of Requirements | 18 | 1.9%
Standardization of Requirements - Cross Specialty Standardization | 2 | 0.2%
Standardization of Requirements - Cross State Standardization | 2 | 0.2%
*Training | 34 | 3.5%
*Transition to Residency | 96 | 10%
Transition to Residency - Bootcamp | 11 | 1.1%
Transition to Residency - Learner Handover | 30 | 3.1%
Transition to Residency - Orientation | 45 | 4.7%
Transition to Residency - Timing | 47 | 4.9%
*Wellness/Wellbeing | 36 | 3.7%
Wellness/Wellbeing - Life Changes | 17 | 1.8%
Total | 961 | 100%
Figure 28: Code Application for Post-Match Transition to Residency
Figure 29: Bigrams for Post-Match Transition to Residency
POLICY IMPLICATIONS
Table 26: Sentiment for Policy Implications
Sentiment | N | Percent
Agree | 42 | 44.2%
Disagree | 3 | 3.2%
Mixed | 12 | 12.6%
Total Comments | 95 | 100%
Figure 30: Sentiment for Policy Implications
Policy Implications: Selected Verbatims
Please consider the consequences of the massive overexpansions of nurse practitioner, physician assistant,
osteopathic, and US MD annual graduates. They are clearly increasing at rates that are 6 to 10 times the annual
population growth rate of 0.6% or 3 to 5 times any increase in demand. Please note that the annual dollars
going to support these health professionals are not increasing at anything close to these 4 to 6% annual
increases in these 4 sources. To translate, responsible health professional leadership should be more
interested in a reduction in the annual graduates arising from US MD and DO schools as compared to an
expansion of GME. At a minimum some discussions of a moratorium should be initiated with PA and NP
leadership. The health professional leaders should avoid at all costs a massive glut of workforce although the
previous expansions guarantee this. Also important is understanding that the various deficits and shortages
such as half enough generalists and general specialists for 40% of the US population - is the result of the worst
financial design specific to basic, office, most needed, most prevalent services in settings where the worst public
and private health insurance are found along with the worst employers and populations lower in income. For
example primary care in these 2621 counties is about 60,000 physicians for this 130 million or about 46 per
100,000. These counties have about 45% of the complexity in this 40% of the population with only 25% of
primary care workforce and less than 20% of primary care spending. And the requirements of HITECH to value
based have eroded about 1 billion a year reducing what can be invested from 38 billion to less than 30 billion.
It is simply not possible to resolve shortages with training designs - as I charted in Nebraska with the most
successful pipelines - and 70 counties that remained just as short of health care workforce over 15 years. (Role:
Other: Most of the above, MD)
Table 27: Code Application Counts for Policy Implications
Code | N | Percent
*Advice & Coaching | 3 | 1%
Advice & Coaching - Career Advising | 3 | 1%
*Applications | 2 | 0.7%
Applications - Application Process | 2 | 0.7%
*Assessment | 2 | 0.7%
Assessment - Standardized Exams | 2 | 0.7%
*Cost/Finances/Debt | 15 | 5.2%
Cost/Finances/Debt - Implementation Cost | 8 | 2.8%
Cost/Finances/Debt - Implementation Cost - GME | 5 | 1.7%
Cost/Finances/Debt - Program Cost | 5 | 1.7%
Cost/Finances/Debt - Student Cost | 7 | 2.4%
Cost/Finances/Debt - Student Debt | 3 | 1%
*Data Transparency & Availability | 2 | 0.7%
*DEI | 3 | 1%
DEI - Equity | 2 | 0.7%
*DO/Osteopathy/Osteopathic | 2 | 0.7%
*Funding | 22 | 7.6%
Funding - GME Funding | 19 | 6.6%
*Implementation | 15 | 5.2%
Implementation - Change Management | 7 | 2.4%
Implementation - Cohesive Policy | 4 | 1.4%
Implementation - Impact | 10 | 3.4%
*Licensure | 8 | 2.8%
*Matching Process | 13 | 4.5%
Matching Process - Matched | 2 | 0.7%
Matching Process - Slots | 2 | 0.7%
Matching Process - Unmatched | 3 | 1%
*Non-US Trained Students | 5 | 1.7%
Non-US Trained Students - IMG | 3 | 1%
*Oversight | 5 | 1.7%
*Research | 3 | 1%
*Specialties | 18 | 6.2%
Specialties - Specialty Selection | 15 | 5.2%
*Standardization of Requirements | 25 | 8.6%
Standardization of Requirements - Cross State Standardization | 18 | 6.2%
*Training | 5 | 1.7%
*Transition to Residency | 6 | 2.1%
Transition to Residency - Timing | 2 | 0.7%
*Wellness/Wellbeing | 10 | 3.4%
Wellness/Wellbeing - Life Changes | 4 | 1.4%
Total | 290 | 100%
Figure 31: Code Application for Policy Implications
Figure 32: Bigrams for Policy Implications
RESEARCH QUESTIONS
Table 28: Sentiment for Research Questions
Sentiment | N | Percent
Agree | 25 | 40.3%
Disagree | 2 | 3.2%
Mixed | 3 | 4.8%
Total Comments | 62 | 100%
Figure 33: Sentiment for Research Questions
Research Questions: Selected Verbatim
Are you considering doing some hypothetical modeling to determine if you looked at student application patterns
then randomly assign them to a program how that might look? A lot of time and energy is spent on this and
maybe its more about programs providing quality information about their program, let students investigate that
and see what programs best fit their interests and apply based on that. Let the program just randomly decide.
We keep deluding ourselves into thinking this process somehow gives agency to students and programs, but in
the end it is still an algorithm that makes the decision. (Role: Faculty Member of a Medical School)
Table 29: Code Application Counts for Research Questions
Code | N | Percent
*Applications | 3 | 2.9%
*Assessment | 2 | 1.9%
Assessment - Standardized Exams | 2 | 1.9%
*Assessment and Performance Data | 2 | 1.9%
Assessment and Performance Data - Grades & Grading Pass Fail | 2 | 1.9%
*Competencies | 2 | 1.9%
Competencies - EPAs (Entrustable Professional Activities) | 2 | 1.9%
*Cost/Finances/Debt | 5 | 4.8%
Cost/Finances/Debt - Implementation Cost | 3 | 2.9%
*Data Transparency & Availability | 3 | 2.9%
Data Transparency & Availability - Data to Support Informed Decisions | 2 | 1.9%
Data Transparency & Availability - Database of Program Info | 2 | 1.9%
Data Transparency & Availability - Filters | 2 | 1.9%
*DEI | 9 | 8.7%
DEI - Bias | 3 | 2.9%
DEI - Fairness | 2 | 1.9%
DEI - Inclusion | 2 | 1.9%
*Funding | 6 | 5.8%
*Implementation | 9 | 8.7%
Implementation - Impact | 7 | 6.7%
*Matching Process | 4 | 3.8%
Matching Process - Unmatched | 2 | 1.9%
*Non-US Trained Students | 5 | 4.8%
Non-US Trained Students - IMG | 2 | 1.9%
*Physician Shortage | 3 | 2.9%
*Research | 10 | 9.6%
*Specialties | 6 | 5.8%
Specialties - Specialty Selection | 2 | 1.9%
Total | 104 | 100%
Figure 34: Code Application for Research Questions
Figure 35: Bigrams for Research Questions
OTHER COMMENTS
Table 30: Sentiment for Other Comments
Sentiment | N | Percent
Agree | 33 | 11.6%
Disagree | 15 | 5.3%
Mixed | 8 | 2.8%
Total Comments | 284 | 100%
Figure 36: Sentiment for Other Comments
Other Comments: Selected Verbatims
1. Residencies should be made to filter by score only after philosophical/program characteristic matches are
made. Scores are often used first, and they are a poor measure of ability and fit to many programs. 2.
Interviews should be offered virtually. But applicants should be limited on the amount of interviews they can
accept if they choose this option. 3. Non-US IMGs should not be allowed to participate in the Match. Many of
my colleagues think very highly of the candidates, but they feel that it is unfair that a candidate can come from a
program where all expenses were paid and they had 6+ months to prepare for their boards. Those advantages
are not fair to US graduates and US IMGs, who do not have those luxuries. I just wanted to say thank you
otherwise for allowing opinions on these matters. (Role: Medical School Student)
I am very concerned about the new Pathways rule restricting many IMG’s with Provisional Training Certificates
with limited supervision unable to apply for the residency matching process. Seems like a rule which singles out
the older IMG pool who might have tended to tasks like motherhood/fatherhood, other responsibilities including
raising families or helping out parents or going through a tedious immigration process to finally settle in the US
or busy nurturing their little ones after medical school, essentially coming back to their passion and given
USMLE’s, done rotations and now trying to apply in residency training programs. Essentially we have closed the
doors to such candidates and making it very hard for them to successfully apply for training even if they have
completed all prerequisites including medical schooling, wonderful USMLE scores, recent US rotations etc. I
think we should rethink the pathways a little bit more. I have personally trained and met many such older IMG’s
who have done a phenomenal job returning back to medicine finding their true passion and working so hard to
make a difference in the lives of our patients while working tirelessly at the front lines to help our communities.
Please reconsider the pathway and make the process more structured but at least possible for such wonderful
people who add value to medicine. Thank you. (Role: Residency Program Director, MD)
We appreciate the work that the Coalition did produce the recommendations […] Here we list more specific
reactions: * Many of the recommendations are so general that it is hard to imagine what the final product for
any recommendation might look like. Thoughtful feedback is difficult without implementation details. * It is
hard to overlay the framework of transactional, investigational and transformational actions with the 42
recommendations or even the 12 categories. * It was not always clear where the effort/redesign would be at
a local level versus a national level. * There are many instances of asking for analytics, CQI assessments
that many schools and small residency programs likely do not have the personnel/skills to accomplish. *
There seemed to be an assumption that any observed differences in metrics between subgroups (perhaps
defined by gender or race/ethnicity) are evidence of bias. That is, differences equal bias. * Many of the
recommendations, for example those within Advising of Learners and Competencies and Assessments will
require a lot of faculty time (and skills) to implement well. Any time away from clinical venues is expensive. *
Finally, we would welcome some thought on how one changes culture. For decades, through the former Dean’s
letter to iterations of the MSPE, there have been attempts to increase transparency of performance data
transmitted from UME to GME. We would love to hear some ideas on what is different now. * The collective
“we” learned a lot this year (or we could with more analyses) about what happens with virtual interviewing and
no/few away rotations. We should be agile in using this data while it is still fresh. Overall, it is a very good
document. It is thorough and thoughtful and very encouraging that diverse stakeholders are having collaborative
and productive discussions. We look forward to the next version, perhaps addressing priorities, budgets, and
timelines. (Role: I am responding on behalf of an organization or group in an official capacity)
Table 31: Code Application Counts for Other Comments
Code | N | Percent
*Advice & Coaching | 2 | 0.7%
*Applications | 13 | 4.8%
Applications - Caps and Limits | 8 | 2.9%
Applications - MSPE (Medical School Performance Evaluation) | 3 | 1.1%
Applications - Standardization of Application Process | 2 | 0.7%
*Assessment | 5 | 1.8%
*Assessment and Performance Data | 6 | 2.2%
Assessment and Performance Data - Grades & Grading Pass Fail | 3 | 1.1%
*Cost/Finances/Debt | 10 | 3.7%
Cost/Finances/Debt - Implementation Cost | 2 | 0.7%
Cost/Finances/Debt - Student Cost | 4 | 1.5%
Cost/Finances/Debt - Student Debt | 4 | 1.5%
*COVID Impact | 5 | 1.8%
*Data Transparency & Availability | 2 | 0.7%
*DEI | 19 | 7%
DEI - Diversity | 3 | 1.1%
DEI - Inclusion | 10 | 3.7%
*DO/Osteopathy/Osteopathic | 2 | 0.7%
*Faculty | 2 | 0.7%
Faculty - Faculty Development | 2 | 0.7%
*Funding | 12 | 4.4%
Funding - Influence of Private Equity | 5 | 1.8%
Funding - Unfunded Mandate | 3 | 1.1%
*Implementation | 12 | 4.4%
Implementation - Change Management | 2 | 0.7%
Implementation - CQI (Continuous Quality Improvement) | 3 | 1.1%
Implementation - Impact | 3 | 1.1%
*Interviews | 21 | 7.7%
Interviews - Interview Caps and Limits | 7 | 2.6%
Interviews - Virtual Interviews | 13 | 4.8%
*Matching Process | 16 | 5.9%
Matching Process - Slots | 2 | 0.7%
Matching Process - SOAP (Supplemental Offer and Acceptance Program) | 3 | 1.1%
Matching Process - Unmatched | 8 | 2.9%
*Non-US Trained Students | 5 | 1.8%
Non-US Trained Students - IMG | 2 | 0.7%
*Oversight | 7 | 2.6%
*Physician Shortage | 3 | 1.1%
*Research | 2 | 0.7%
*Roles | 5 | 1.8%
Roles - Program Directors | 4 | 1.5%
*Rotations | 6 | 2.2%
Rotations - Away Rotations | 6 | 2.2%
*Standardization of Requirements | 5 | 1.8%
*Transition to Residency | 4 | 1.5%
*Wellness/Wellbeing | 7 | 2.6%
Total | 273 | 100%
Figure 37: Code Application for Other Comments
Figure 38: Bigrams for Other Comments
SURVEY DEMOGRAPHICS
Which of these choices best represents your reason for responding to the UGRC recommendations survey?
Response | N | Percent
I am responding on behalf of an organization or group in an official capacity | 105 | 13.7%
I am responding for myself | 663 | 86.3%
Total | 768 | 100%
Which of the following describes your primary role?
Response | N | Percent
Clerkship Director | 20 | 3%
Designated Institutional Official (DIO) | 12 | 1.8%
Faculty Member of a Medical School | 79 | 11.9%
General Public | 6 | 0.9%
I serve, or have served, on a State Medical Board | 8 | 1.2%
Intern/Resident/Fellow | 27 | 4.1%
Medical School Assistant/Associate Dean | 64 | 9.6%
Medical School Dean | 6 | 0.9%
Medical School Student | 204 | 30.7%
Non-Practicing Physician/Clinician | 17 | 2.6%
Other | 75 | 11.3%
Practicing Physician/Clinician | 22 | 3.3%
Residency Program Director | 125 | 18.8%
Total | 665 | 100%
Which of the following describes your primary role? – Other (please specify)
Which of the following describes your primary role? - Other (please specify)
administrative staff
Assistant Residency Program Director, Emergency Medicine Subspecialty Advisor, Chair of Council of Residency Directors in EM Application Process Improvement Committee
Associate Chair of Education (Med School Faculty, UME and GME stakeholder)
Associate Fellowship Director, Director of Recruitment
Associate Program Director
Associate Program Director
Career Advising of Medical School
Career Advisor
Clerkship Director and Assistant Residency Program Director
Coordinator
Dean Emeritus; Membership NBME
Department Chair-Emergency Medicine
Director of Assessment
Director of Medical Education
EDI Leadership at institution
Educator
Fellowship APD
Fellowship Coordinator
Fellowship director, vice chair for education
Foreign medical graduate
former Dean and now Provost
Fully Qualified Medical Graduate
Institutional Director
Institutional GME Program Administrator (Accreditation Specialist)
International Medical graduate
International medical graduate US CITIZEN
International medical postgraduate
International Medical School Graduate
Leader of national group of transition to residency course educators (and faculty member)
Medical graduate
Medical graduate
Medical School Staff
Medical school staff member - Career Counselor
Medical School Student Affairs Staff
medical student (allopathic) on leave of absence
Most of the above, was a full professor with published research involving workforce and basic health access and rural health, rural medical education leader
Non-US International Medical Graduate
Post IMG-medical graduate
Postdoctoral fellow in search of finishing residency to be able to practice postdoc.
Practicing foreign medical graduate
Program Administrator
Program Administrator
Program Coordinator
Program coordinator
Program Manager
Recently retired Associate Dean
Recently retired physician engaged primarily in med ed
Residency & fellowship program administrator/coordinator
Residency associate program director
Residency Associate Program Director
Residency Associate Program Director
Residency Coordinator
Residency Coordinator
Residency Coordinator
Residency faculty and clinic lead
Residency Program Administrator
Residency Program Coordinator
Residency Program Coordinator
Residency Program Manager
residency/fellowship coordinator
retired former dean
Retired physician medical educator
School of GME senior associate dean
staff member of a medical school
State Medical Society Executive
unmatched
Unmatched doctor
Unmatched MD
US Citizen International Medical Graduate (IMG).
US IMG
Vice Chair for Academic Affairs
vice chair for education
Vice Chair of Education
Vice Chair of Education and Assistant Dean of GME
Vice Provost for Academic Programs (background in UME)
In which type of medical school are you currently enrolled?
Response | N | Percent
Allopathic | 170 | 83.7%
Osteopathic | 33 | 16.3%
Total | 203 | 100%
Are you currently a practicing physician/clinician?
Response | N | Percent
Yes | 329 | 78.5%
No | 90 | 21.5%
Total | 419 | 100%
Which of the following medical degrees do you have?
Response | N | Percent
None of the above | 52 | 11.5%
MD | 326 | 71.8%
DO | 50 | 11%
MBBS | 26 | 5.7%
Total | 454 | 100%
What is the location of the medical school from which you graduated?
United States or Canada: 337 (83.8%)
Other: 65 (16.2%)
Total: 402 (100%)
Other Medical School Locations
What is the location of the medical school from which you graduated? - Other (please specify)
Ateneo de Zamboanga School of Medicine, Philippines
Barbados
Caribbean island
Carribean
cuba
Cuba
Dominican Republic
Dominican Republic
Egypt
Egypt
Fatima jinnah medical University, Pakistan
Ghana
Grenada
Grenada
Grenada
Guatemala
Haiti
India
India
India
India
India
India
India
India
INDIA
International
Iran
Iran
Iraq / Baghdad
Iraq / Baghdad
Iraq / Baghdad
Kampala , UGANDA
Kazakhstan
Kharkov Medical University, Ukraine
Lahore, pakistan
Mexico
Nepal
Nepal
Nepal
Nepal
Nepal
New York
nigeria
Other
Pakistan
Pakistan
Pakistan
Pakistan
Pakistan, Karachi, DOW Medical College
Philippines
Philippines
S
Saudi
South America
Sudan
SUDAN
The Netherlands
Turkey
UAG, Guadalajara, Mexico
UK
University of Glasgow, Scotland
University of Science Arts & Technology School of Medicine
USAT
Xavier University School of Medicine
In what year did you complete your residency?
1960 - 1969: 2 (0.6%)
1970 - 1979: 13 (3.8%)
1980 - 1989: 57 (16.5%)
1990 - 1999: 67 (19.4%)
2000 - 2009: 121 (35%)
2010 - 2019: 85 (24.6%)
2020 - 2021: 1 (0.3%)
Total: 346 (100%)
What is your core medical specialty?
Allergy and Immunology: 1 (0.3%)
Anesthesiology: 14 (3.7%)
Dermatology: 4 (1%)
Emergency Medicine: 55 (14.4%)
Family Medicine: 55 (14.4%)
Internal Medicine: 74 (19.3%)
Neurological Surgery: 2 (0.5%)
Neurology: 10 (2.6%)
Obstetrics and Gynecology: 14 (3.7%)
Ophthalmology: 3 (0.8%)
Orthopaedic Surgery: 4 (1%)
Osteopathic Neuromusculoskeletal Medicine: 11 (2.9%)
Other: 14 (3.7%)
Otolaryngology - Head and Neck Surgery: 10 (2.6%)
Pathology: 8 (2.1%)
Pediatrics: 51 (13.3%)
Physical Medicine and Rehabilitation: 3 (0.8%)
Plastic Surgery: 2 (0.5%)
Preventive Medicine: 1 (0.3%)
Psychiatry: 24 (6.3%)
Radiology: 6 (1.6%)
Surgery: 14 (3.7%)
Thoracic Surgery: 1 (0.3%)
Transitional Year: 2 (0.5%)
Total: 383 (100%)
Other Core Specialties
What is your core medical specialty? - Other (please specify)
after I finish medical school , I start residency in internal medicine but could not finish because I escaped from Baghdad situation
Clinical Research
Critical care
Gastroenterology/Internal Medicine
General practitioner
I was unable to obtain residency or sit for my Board exams dues to ECFMG delaying my application as well as 50% of medical schools that they unaccredited in 2019. They still would not approve me to take my exam even though I graduated prior to the unaccreditation
Internal medicine-pediatrics
Laboratory Medicine (Clinical Pathology)
med/peds
Med/Peds
NA
osteopathic neuromusculoskeletal medicine
Unmatched
Unmatched
What is the location of the institution where your primary role is…
United States or Canada: 618 (94.1%)
Other: 39 (5.9%)
Total: 657 (100%)
What is the location of the institution where your primary role is… Other Locations
What is the location of the institution where your primary role is... - Other (please specify)
(I have not been granted a residency position)
AIIMS Mangalagiri
Ain shams university
Bahrain
BANGALORE,INDIA
Caribbean
carribean
Curacao
Dominica
Dominican Republic
Ghana
India
India
INDIA
Iraq / Baghdad
Iraq / Baghdad
Iraq / Baghdad
Kasturba Medical College Manipal India
Kazakhstan
Lahore, pakistan
NA
Nepal
Nepal
Nepal
Nepal
Nepal
nigeria
Nigeria
Omdurman Islamic University
Pakistan
Pakistan
Pakistan
Pakistan, Karachi
Philippines
Qatar
San Pedro Dominican Republic
Saudi
serbia
Southampton, UK
Do you directly supervise residents?
Yes: 299 (70%)
No: 128 (30%)
Total: 427 (100%)
What is your gender identity?
Woman: 309 (46.7%)
Man: 316 (47.8%)
Genderqueer or non-binary: 4 (0.6%)
Gender fluid: 0 (--)
Agender: 0 (--)
Prefer not to answer: 31 (4.7%)
Prefer to self describe: 1 (0.2%)
Total: 661 (100%)
Other Gender Identities
What is your gender identity? - Prefer to self describe
Dude.
What is your race or ethnic identity? Select all that apply.
White or Caucasian: 428 (55.7%)
Asian: 98 (12.8%)
Prefer not to answer: 65 (8.5%)
Black or African American: 42 (5.5%)
Hispanic, Latina/o/x, or of Spanish Origin: 38 (4.9%)
Other race/ethnicity not already specified: 11 (1.4%)
American Indian or Alaska Native: 8 (1.0%)
Native Hawaiian or Other Pacific Islander: 1 (0.1%)
Total Respondents: 768 (100%)
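Because this was a select-all-that-apply item, the percentages above do not describe mutually exclusive groups; they are consistent with dividing each selection count by the 768 respondents shown in the Total Respondents row. The short Python sketch below is an illustration of that arithmetic only and is not part of the UGRC's analysis.

```python
# Illustrative only: reproduces the percentages reported in the
# race/ethnic identity table above, assuming each percentage is the
# selection count divided by the 768 total respondents.
counts = {
    "White or Caucasian": 428,
    "Asian": 98,
    "Prefer not to answer": 65,
    "Black or African American": 42,
    "Hispanic, Latina/o/x, or of Spanish Origin": 38,
    "Other race/ethnicity not already specified": 11,
    "American Indian or Alaska Native": 8,
    "Native Hawaiian or Other Pacific Islander": 1,
}
total_respondents = 768  # "Total Respondents" reported for this multi-select item

for category, n in counts.items():
    print(f"{category}: {n} ({n / total_respondents:.1%})")

# Respondents could select more than one category (or none), so these
# percentages need not sum to 100%.
```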
Other Race/Ethnic Identities
What is your race or ethnic identity? Select all that apply. - Other race/ethnicity not already specified
Arab American
Coptic middle eastern
Cracker
East Indian
Egyptian
human race
I don't believe in this category. I am White or Caucasian but underrepresented minority.
Middle Eastern
Middle Eastern
Pakistani
West African
APPENDIX A: LIST OF CODES
Code
*Advice/Coaching
Alternative Careers
Career Advising
Coaching
Specialty-specific Advising
Staff training to support students
*Applications
Application Caps and Limits
Application Process
Application Redundancy
Biasing applications
LOR (Letters of Recommendation)
MSPE (Medical School Performance Evaluation)
Objective Metrics to Gauge Applicants
Personal Statements
School Enrollment Targets
SEL (Structured Evaluative Letters)
Standardization of Application Process
*Assessment
Accurate assessments
Standardized Exams
Inequality in Scaling
Licensing exam quality differences
Single Licensing Exam
Turnaround Time for USMLE Scores
*Assessment and Performance Data
Grades/Grading/Pass Fail
Holistic review
ILPs (Individualized Learning Plans)
*Communication
*Competencies
EPAs (Entrustable Professional Activities)
Milestones
*Cost/Finances/Debt
Implementation Cost
GME Cost
UME Cost
Program Cost
Student Cost
Student Debt
*COVID Impact
*Data Transparency & Availability
Dashboard or Portfolio
Data to Support Informed Decisions
Database of Program Info
Filters
*DEI
Balance when it comes to DEI
Bias
Racial Bias
Diversity
Diversity Monitoring of Programs
Diversity Quotas
Policy Implications
Elimination of Honors
Equity
Fairness
First-gen med student support
Inclusion
Community outreach program(s)
Reputation
School resource availability
SES
Small program(s)
URM
Black Medical Students
Non-URMs being put at disadvantage
*DO/Osteopathy/Osteopathic
*Faculty
Faculty Development
*Funding
GME Funding
Influence of Private Equity
Unfunded mandate
*Implementation
Change management
Cohesive Policy
CQI (Continuous Quality Improvement)
Impact
*Interviews
Interview Caps and Limits
Interview Selection Criteria
Tickets
Virtual Interviews
*Licensure
*Matching process
Couples
Early Decision/Matches
Matched
Second looks
Slots
SOAP (Supplemental Offer and Acceptance Program)
Unmatched
*Medical School Prestige
*Mid-Level Practitioners
*Non-US Trained Students
IMG
US IMG
*Oversight
Cohesive Oversight Committee
*Physician Shortage
*Public health
*Research
*Roles
DIO (Designated Institutional Officer)
Other Roles
Program Directors
*Rotations
Audition Rotations
Away Rotations
*Specialties
Competitive Specialties
Specialty Selection
*Standardization of Requirements
Cross Specialty Standardization
Cross State Standardization
*Training
*Transition to Residency
Bootcamp
Learner handover
Orientation
Timing
*Wellness/Wellbeing
Life Changes
* Denotes Parent Code
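To make the two-level structure of this codebook easier to see, the sketch below shows one way the parent codes (marked with *) and their child codes could be represented when tagging public comments. The dictionary layout, the excerpted codes, and the tag_comment helper are illustrative assumptions only, not the tooling the UGRC actually used.

```python
# Illustrative representation of the codebook: parent codes (marked with *
# in the list above) map to their child codes. Excerpt only.
codebook = {
    "Advice/Coaching": [
        "Alternative Careers",
        "Career Advising",
        "Coaching",
        "Specialty-specific Advising",
        "Staff training to support students",
    ],
    "Interviews": [
        "Interview Caps and Limits",
        "Interview Selection Criteria",
        "Tickets",
        "Virtual Interviews",
    ],
    "Wellness/Wellbeing": ["Life Changes"],
}

def tag_comment(comment: str, assigned_codes: list[str]) -> dict:
    """Attach child codes to a comment and record the parent codes they roll up to."""
    parents = [
        parent
        for parent, children in codebook.items()
        for code in assigned_codes
        if code in children
    ]
    return {"comment": comment, "codes": assigned_codes, "parents": sorted(set(parents))}

# Example: a comment coded "Virtual Interviews" also rolls up to the parent "Interviews".
print(tag_comment("Keep interviews virtual to reduce cost.", ["Virtual Interviews"]))
```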
APPENDIX B: LIST OF TAGS
Tag
Combine Potential
Concerning Comment
Interesting Comment
Organizations
Personal Anecdote
Priority/prioritize
Skepticism
Source Cited
Suggestion
Unintended Consequences
APPENDIX C: SURVEY INSTRUMENT
Dear Stakeholder,
Thank you for your interest in participating in the public comment period for the preliminary
recommendations of the Coalition for Physician Accountability’s Undergraduate Medical
Education to Graduate Medical Education Review Committee (UGRC).
Please review the Initial Summary Report and Preliminary Recommendations of the
UGRC before you begin the survey. We recommend keeping the report open throughout
the duration of the survey to provide you with additional background information and
context. A glossary of terms used in the survey is available for your consideration here.
The deadline to submit your feedback to the UGRC’s preliminary recommendations is May 26,
11:59PM EDT. If you have questions or need assistance, please email
* Which of these choices best represents your reason for responding to the UGRC
recommendations survey?
o I am responding on behalf of an organization or group in an official capacity
o I am responding for myself
*Please indicate the name of the organization or group for which you are responding.
[TEXT BOX]
The first section of the survey will gather background information to help us
understand the perspective you provide in your response to the UGRC Preliminary
Recommendations.
* Which of the following describes your primary role?
o Medical School Student
o Intern/Resident/Fellow
o Faculty Member of a Medical School
o Clerkship Director
o Residency Program Director
o I serve, or have served, on a State Medical Board
o Designated Institutional Official (DIO)
o Medical School Dean
o Medical School Assistant/Associate Dean
o Practicing Physician/Clinician
o Non-Practicing Physician/Clinician
o General Public
o Other (please specify) [TEXT BOX]
In which type of medical school are you currently enrolled?
o Allopathic
o Osteopathic
*Are you currently a practicing physician/clinician?
o Yes
o No
Which of the following medical degrees do you have?
o MD
o DO
o MBBS
o None of the above
* What is the location of the medical school from which you graduated?
o United States or Canada
o Other (please specify) [TEXT BOX]
In what year did you complete your residency?
[TEXT BOX]
What is your core medical specialty?
[TEXT BOX]
* What is the location of the institution where your primary role is: {Role Response}?
o United States or Canada
o Other (please specify) [TEXT BOX]
* Do you directly supervise residents?
o Yes
o No
What is your gender identity?
o Woman
o Man
o Genderqueer or non-binary
o Gender fluid
o Agender
o Prefer not to answer
o Prefer to self describe [TEXT BOX]
What is your race or ethnic identity? Select all that apply.
American Indian or Alaska Native
Asian
Black or African American
Hispanic, Latina/o/x, or of Spanish Origin
Native Hawaiian or Other Pacific Islander
White or Caucasian
Other race/ethnicity not already specified [TEXT BOX]
Prefer not to answer
In the next section of the survey you will be able to review the UGRC Preliminary
Recommendations and provide commentary on specific recommendations.
For your reference, you can find the Initial Summary Report and Preliminary
Recommendations here. A glossary of terms used in the survey is available for your
consideration here.
* Please indicate which recommendation theme(s) you wish to comment on?
Oversight: #1
Advising of Learners: #2 - #6
Competencies and Assessments: #7 - #14
Away Rotations: #15
Diversity, Equity and Inclusion (DEI) in Medicine: #16 - #19
Application Process: #20 - #24
Interviewing: #25 - #27
Matching Process: #28
Faculty Support Resources: #29 - #30
Post-Match Transition to Residency: #31 - #38
Policy Implications: #39 - #40
Research Questions: #41 - #42
I do not wish to comment on any of the recommendations
Oversight
1. Convene a national ongoing committee to manage continuous quality improvement of
the entire process of the UME-GME transition, including an evaluation of the intended
and unintended impact of implemented recommendations.
Please use the space below to comment on the recommendation relating to Oversight.
Advising of Learners
2. Educators should develop a best-practice curriculum for UME career advising,
including guidelines for equitable curriculum delivery and outcomes.
3. A single, comprehensive electronic professional development career planning
resource for students will provide universally accessible, reliable, up-to-date, and
trustworthy information and guidance.
4. Advising about alternative career pathways should be available for those individuals
who choose not to pursue clinical careers. National career awareness databases such
as Careers in Medicine should include information on these alternative pathways.
5. Evidence-informed, general career advising resources should be available for all
medical school faculty and staff career advisors, both domestic and international.
General career advising should focus on students’ professionalization; inclusive
practices such as valuing diversity, equity, and belonging; clinical and alternate career
pathways; and meeting the needs of the public.
6. To support evidence-informed, student focused, specialty-specific advising for all
medical students, advising resources should be available for and used by advisors, both
domestic and international.
Use the space below to comment on the recommendations relating to Advising of Learners.
Please reference the specific recommendation(s) in your comment, e.g.,
2: Your comment...
3: Your comment...
Competencies and Assessments
7. UME and GME educators, along with representatives of the full educational
continuum, should jointly define and implement a common framework and set of
outcomes (competencies) to apply to learners across the continuum from UME to GME.
8. The UME community, working in conjunction with partners across the continuum,
must commit to using robust assessment tools and strategies, improving upon existing
tools, developing new tools where needed, and gathering and reviewing additional
evidence of validity.
9. Using the shared mental model of competency and assessment tools and strategies,
create and implement faculty development materials for incorporating competency-
based expectations into teaching and assessment.
10. A convened group including UME and GME educators should reconsider the content
and structure of the MSPE as new information becomes available in order to improve
access to longitudinal assessment data about applicants. Short term improvements
should include structured data entry fields with functionality to enable searching.
11. Meaningful assessment data based on performance after the MSPE must be
collected and collated for each graduate, reflected on by the learner with an educator or
coach, and utilized in the development of a specialty-specific individualized learning plan
to be presented to the residency program for continued utilization during training.
Guided self-assessment by the learner is an important component in this process and
may be all that is available for some international medical graduates.
12. Targeted coaching by qualified educators should begin in UME and continue during
GME, focused on professional identity formation and moving from a performance to a
growth mindset for effective lifelong learning as a physician. Educators should be
astute to the needs of the learner and be equipped to provide assistance to all
backgrounds.
13. Structured Evaluative Letters (SELs) should replace all Letters of Recommendation
(LOR) as a universal tool in the residency program application process.
14. Convene a workgroup of educators across the continuum to begin planning for a
dashboard/portfolio to collect assessment data in a standard format for use during
medical school and in the residency application process. This will enable consistent and
equitable information presentation during the residency application process and in a
learner handover.
Use the space below to comment on the recommendations relating to Competencies and
Assessments.
Please reference the specific recommendation(s) in your comment, e.g.,
7: Your comment...
8: Your comment...
Away Rotations
15. Convene a workgroup to explore the multiple functions and value of away rotations
for applicants, medical schools, and residency programs. Specifically, consider the
goals and utility of the experience, the impact of these rotations, and issues of equity
including accessibility, assessment, and opportunity for students from groups
underrepresented in medicine and financially disadvantaged students.
Please use the space below to comment on the recommendation relating to Away Rotations.
Diversity, Equity and Inclusion (DEI) in Medicine
16. To raise awareness and facilitate adjustments that will promote equity and
accountability, demographic information of applicants (race, ethnicity, gender
identity/expression, sexual identity/orientation, visa status, or ability) should be
measured and reported to key stakeholders, including programs and medical schools, in
real time throughout the UME-GME transition.
17. Specialty-specific best practices for recruitment to increase diversity across the
educational continuum should be developed and disseminated to program directors,
residency programs, and institutions.
18. In order to eliminate systemic biases in grading, medical schools must perform initial
and annual exploratory reviews of clinical clerkship grading, including patterns of grade
distribution based on race, ethnicity, gender identity/expression, sexual
identity/orientation, visa status, ability, and location (e.g., satellite or clinical site
location), and perform regular faculty development to mitigate bias. Programs across the
UME-GME continuum should explore the impact of bias on student and resident
evaluations, match results, attrition, and selection to honor societies, such as Alpha
Omega Alpha and the Gold Humanism Honor Society.
19. A committee must be formed to explore the growing number of unmatched
physicians in the context of a national physician shortage, including root causes, and
disparities in unmatched students based on specialty, demographic factors, and
grading systems. The committee should report on data trends, implications, and
recommended interventions.
Use the space below to comment on the recommendations relating to
Diversity, Equity, and Inclusion (DEI) in Medicine.
Please reference the specific recommendation(s) in your comment, e.g.,
16: Your comment...
17: Your comment...
Application Process
20. A comprehensive database with verifiable residency program information should be
available to all applicants, medical schools, and residency programs and at no cost to
the applicants.
21. Create a widely accessible, authoritative, reliable, and searchable dataset of
characteristics of individuals who applied, interviewed, were ranked, and matched for
each GME program/track to be used at no cost by applicants, and by their advisors. Sort
data according to medical degree, demographics, geography, and other characteristics
of interest.
22. To optimize utility, discrete fields should be available in the existing electronic
application system for both narrative and ordinal information currently presented in the
MSPE, personal statement, transcript, and letters. Fully using technology will reduce
redundancy, improve comprehensibility, and highlight the unique characteristics of each
applicant.
23. Filter options available to programs for sorting applicants within the application
system should be carefully created and thoughtfully reviewed to ensure each one
detects meaningful differences among applicants and promotes review based on
mission alignment and likelihood of success at a program.
24. To promote equitable treatment of applicants regardless of licensure examination
requirements, comparable exams with different scales (COMLEX-USA and USMLE)
should be reported within the ERAS filtering system in a single field.
Use the space below to comment on the recommendations relating to Application Process.
Please reference the specific recommendation(s) in your comment, e.g.,
20: Your comment...
21: Your comment...
Interviewing
25. Develop and implement standards for the interview offer and acceptance process,
including timing and methods of communication, for both the learners and programs to
improve equity and fairness, to minimize educational disruption, and improve wellbeing.
26. Interviewing should be virtual for the 2021-2022 residency recruitment season. To
ensure equity and fairness, there should be ongoing study of the impact and benefits of
virtual interviewing as a permanent means of interviewing for residency.
27. Implement a centralized process to facilitate evidence-based, specialty-specific
limits on the number of interviews each applicant may attend.
Use the space below to comment on the recommendations relating to Interviewing.
Please reference the specific recommendation(s) in your comment, e.g.,
25: Your comment...
26: Your comment...
Matching Process
28. To promote holistic review and efficiency, utilize the best available modeling and
data to redesign the mechanics of the residency application process. The redesigned
process (such as an optional early decision application cycle and binding match)
must reduce application numbers while concentrating applicants at programs where
mutual interest is high.
Please use the space below to comment on the recommendation relating to Matching Process.
Faculty Support Resources
29. Develop a portfolio of evidence-based resident support resources for program
directors (PDs), designated institutional officials (DIOs), and residency programs. These
will be identified as best practices, and accessible through a centralized repository.
30. Educators across the continuum must receive faculty development regarding anti-
racism; avoiding bias; and improving equity in student and resident recruitment,
mentorship and advising, teaching, and assessment.
Use the space below to comment on the recommendations relating to Faculty Support
Resources.
Please reference the specific recommendation(s) in your comment, e.g.,
29: Your comment...
30: Your comment...
Post-Match Transition to Residency
31. Anticipating the challenges of the UME-GME transition, schools and programs
should ensure that time is protected, and systems are in place, to ensure that
individualized wellness resources – including health care, psychosocial supports, and
communities of belonging – are available for each learner.
32. Using principles of inclusive excellence, program directors, programs, and
institutions must incorporate activities in diversity, equity, and inclusion for faculty,
residents, and staff beginning in orientation and ongoing, in order to promote
belonging, eliminate bias, and provide social support.
33. Specialty-specific, just-in-time training must be provided to all incoming first-year
residents, to support the transition from the role of student to a physician ready to
assume increased responsibility for patient care.
34. Residents must be provided with robust orientation and ramp up into their specific
program at the start of internship. In addition to clinical skills and system utilization,
content should include introduction to the patient population, known health disparities,
community service and engagement, faculty, peers, and institutional culture.
35. A specialty-specific, formative, competency-based assessment that informs the
learner’s individualized learning plan (ILP) must be performed for all learners as a
baseline at the start of internship.
36. Early and ongoing specialty-specific resident assessment data should be
automatically fed back to medical schools through a standardized process to enhance
accountability and continuous improvement of UME programs and learner handovers.
37. Adequate and appropriate time must be assured between graduation and learner
start of residency to facilitate this major life transition.
38. All learners need equitable access to adequate funding and resources for the
transition to residency prior to internship launch.
Use the space below to comment on the recommendations relating to Post-Match Transition
to Residency.
Please reference the specific recommendation(s) in your comment, e.g.,
31: Your comment...
32: Your comment...
Policy Implications
39. There should be a standardized process throughout the United States for initial
licensing at entrance to residency in order to streamline the process of credentialing for
both residency training and continuing practice.
40. Recommend to the U.S. Centers for Medicare and Medicaid Services (CMS) that
they change the current GME funding structure so that the Initial Residency Period
(IRP) is calculated starting with the second year of postgraduate training. This will allow
career choice reconsideration, leading to resident wellbeing and positive effects on the
physician workforce.
Use the space below to comment on the recommendations relating to Policy Implications.
Please reference the specific recommendation(s) in your comment, e.g.,
39: Your comment...
40: Your comment...
Research Questions
41. To guide future improvements in resident selection and transition, conduct research
to understand which residency applicant characteristics, residency curriculum
experiences, and learning environment factors are most likely to translate into
physicians who fulfill the specialty specific physician workforce needs of the public (e.g.,
primary care, demographics, geographic distribution).
42. Build consensus around the components of a successful recruitment cycle, utilizing
input from all stakeholders. Identify which characteristics of applicants and programs
predict a successful recruitment cycle outcome.
Use the space below to comment on the recommendations relating to Research Questions.
Please reference the specific recommendation(s) in your comment, e.g.,
41: Your comment...
42: Your comment...
Please share any other comments you have about the UGRC Preliminary
Recommendations.
Thank you for your participation in this survey!
If you have any questions, please email Coal[email protected].