A number of factors influence the decisions that a sponsoring institution makes regarding which graduate medical education (GME) specialty and subspecialty programs to support, how many positions to support, and the sites where residency training occurs. One important factor is the financial impact of supporting a residency training program. The impact includes both the costs associated with operating the residency program and the benefits that the hospital and its educational partners derive from operating the programs. In determining program size, the incremental or marginal impact of each resident is more likely to affect a hospital's decisions than the average financial impact per resident. Other important considerations affecting decisions on program offerings include accreditation requirements, the interests of the other institutions affiliated with the residency training programs, community workforce needs, and faculty and medical school graduate preferences.
Medicare is the primary vehicle for federal support of teaching hospitals, through its direct graduate medical education (DGME) payments for the direct costs of operating residency training programs and additional payments for inpatient services associated with the indirect costs of operating these programs. Both types of payments are formula-driven and do not reflect the financial impact of operating different types of residency programs. In 1997, hospital-specific caps were placed on the number of residency positions that Medicare supports. Since then, the number of subspecialty programs has grown while the number of primary care residency programs has declined (Salsberg et al., 2008; Weida et al., 2010). Between 1996 and 2011, the number of primary care residents increased 8.4 percent, while residents in other specialties increased 10.3 percent and subspecialty residents increased 61.1 percent (RAND analysis of JAMA, Appendix II, 1996; Brotherton and Etzel, 2012).
In its June 2010 report, the Medicare Payment Advisory Commission (MedPAC) found that the increasing trend toward specialization is inconsistent with the needs of an efficient, high-quality, high-value health care delivery system, which depends on an adequate supply of primary care physicians (MedPAC, 2010). MedPAC expressed concern that the costs and benefits of sponsoring residency programs are likely to vary by specialty, and that some programs may be more financially attractive to sponsoring institutions than others. In particular, the trend toward subspecialization raises the question of whether Medicare funding should be restructured to differentiate between programs that are less costly or self-sustaining and those that are more costly to the supporting institution. Understanding the financial impact would allow Medicare to distribute its GME funds more efficiently (MedPAC, 2010).
MedPAC asked RAND to use available literature and data to summarize how both the costs and benefits of operating residency programs may differ by such program characteristics as size, specialty, type of sponsor, training venue, and geographic location. We focused on seven disciplines that play a major role in the care of Medicare patients, use different models for resident training, and have experienced different growth rates in residency positions over 2005–2010: general internal medicine (IM), cardiology, family medicine (FM), dermatology, general surgery, urology, and radiation oncology. We refer to these as specialty programs in this article, although cardiology is a fellowship or subspecialty program in internal medicine.
We developed the framework in Figure 1 to investigate how program characteristics such as size and where training occurs affect the following types of costs:
- direct GME costs—i.e., the educational resources and infrastructure required to operate GME programs
- patient care costs—i.e., the indirect impact of operating GME programs on patient care costs, including the value of resident services and the financial and nonmonetary benefits that hospitals and attending physicians derive from participating in GME activities
- the GME-related patient care revenues and funding that hospitals explicitly receive for participating in GME activities, such as Medicare DGME payments and indirect medical education (IME) payments.
The study was exploratory because it was uncertain at the outset whether sufficient data would be available to measure the financial impact of the factors affecting financial performance. We found data from different sources that could be used to assess whether a particular factor increases or decreases costs and to estimate the relative magnitude of the impact across specialties. However, we were unable to develop a consistent comparison metric—impact per resident—that could be used to quantify the results and generate an overall measure of financial impact for each type of program. Data limitations, particularly with respect to attending physician faculty-to-resident ratios, precluded us from doing so. Although the study does not quantify the variation in financial impact, it provides a framework for doing so and identifies the major factors that are likely to affect the financial performance of the sponsoring institution and influence program offerings and size.
Framework for Analyzing the Financial Impact of Operating GME Programs
Table 1 summarizes our overall findings regarding the relative impact that different program characteristics are likely to have on the financial performance of sponsoring institutions and their educational partners. Those program characteristics that are likely to increase costs per resident are shown with upward-pointing arrows, while those that are likely to reduce costs are shown with downward-pointing arrows. The relative magnitude of the impact across programs is reflected in the shading. If the impact across programs is in the same direction, the specialty program that is estimated to be most affected is shown with black arrows, the program likely to be the least affected is shown with white arrows, and those that fall in between are shown with gray arrows. If the impact is a cost for one or more programs and a benefit for others, the shading of the arrow depicts the magnitude of the impact in each direction. For example, the IM and FM faculty practice plans are estimated to operate at a loss, whereas the other specialties are estimated to operate at a profit, with the highest profit per resident estimated for urology and the lowest profit estimated for cardiology and general surgery. Sideways arrows indicate no significant impact on specialty program costs.
Relative Impact of Selected Program Characteristics on the Financial Performance of Sponsors and Educational Partners
NOTES: Program characteristics that are likely to increase costs per resident are shown with upward-pointing arrows, while those that are likely to reduce costs are shown with downward-pointing arrows. The relative magnitude of the impact is reflected in the shading. If the impact across programs is in the same direction, the specialty program that is estimated to be most affected is shown with black arrows, the program likely to be the least affected is shown with white arrows, and those that fall in between are shown with gray arrows. If the impact is a cost for one or more programs and a benefit for the remaining programs, the shading of the arrow depicts the magnitude of the impact in each direction. IME = indirect medical education, RRC = residency review committee.
Basis and sources for estimates:
- RAND-calculated national weighted average stipend by year of residency training derived from the Association of American Medical Colleges (AAMC) Survey of Resident/Fellow Stipends and Benefits, Autumn 2010 Report, and the number of years each training program requires determined from the ACGME Data Resource Book 2010–2011.
- Median faculty compensation levels reported in MGMA Academic Practice Compensation and Production Survey for Faculty and Management: 2011 Report Based on 2010 Data. Compensation for FM is based on non-obstetric faculty compensation; compensation for cardiology is based on compensation for noninvasive cardiology faculty.
- Percentage of time spent in teaching and “other” activities reported in MGMA (2011). The percentages of all activities were scaled so that the sum equaled 100 percent.
- Average program size derived from the total number of residents and programs in ACGME (2011) and analysis of RRC program-specific requirements (ACGME, 2007a–2007c; ACGME, 2009a–2009e).
- National normalized premium rates in 2008 (O'Brien-Strain et al., 2010).
- RAND analysis of single-program sponsors in ACGME (2011).
- Average number of participating institutions in ACGME (2011) and technical expert panel input on FM rotations.
- Average percentage of first-year training in hospital outpatient clinics (AMA, 2011).
- Average percentage of first-year training in nonhospital ambulatory care settings (AMA, 2011).
- Average percentage of international medical graduates (AMA, 2011).
- RAND-derived estimate based on analysis of a convenience sample of on-call schedules and the difference between what the resident is paid for on-call coverage and what the hospital would otherwise need to pay. Nonsurgical rate based on an estimated average hourly rate for academic hospitalists derived from MGMA (2011) and surgical rates based on 2010 MGMA on-call compensation survey.
- RAND-derived estimate based on estimated percentage of time spent in teaching medical students and junior residents, the difference between the hourly rates for residents and attending physicians, and the estimated impact of teaching on attending physician productivity and revenues.
- RAND analysis of California Office of Statewide Health Planning and Development inpatient discharge data and hospital financial data.
- Percentage of first-year training in ambulatory settings (AMA, 2011).
- RAND-derived ratio of academic physician billings at 100 percent clinical activity to private practice physician billings using MGMA (2011).
- RAND estimate constructed from MGMA (2011) based on percentage of time spent in billable clinical activities, estimated collections for time spent in clinical activity, and estimated compensation for clinical activity. Practice expenses estimated using CMS (2012) practice expense per hour data and median number of ambulatory encounters (MGMA, 2011). Billable hours derived by multiplying 40 hours per week by 48 weeks and the percentage of time in billable clinical activities.
- RAND analysis of RRC requirements (ACGME, 2007a–c; ACGME, 2009a–e) and Robertson (2009).
- Difference between compensation for academic and private practices in MGMA (2011).
- Based on average differences in payment by specialty. There would be no payment for some residents if the hospital were over its cap, but the limit is applied proportionately across all programs at the hospital and does not apply to individual residents or specialties.
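The billable-hours and practice-plan arithmetic described in the notes above can be sketched in a few lines. This is a minimal illustration of the calculation structure only; the function names and all dollar figures are hypothetical placeholders, not values from the study or from MGMA (2011).

```python
# Sketch of the billable-hours arithmetic described in the source notes.
# All numeric inputs below are hypothetical placeholders.

def billable_hours(pct_clinical):
    """Annual billable hours: 40 hours/week x 48 weeks x the share of
    time spent in billable clinical activities."""
    return 40 * 48 * pct_clinical

def practice_plan_margin(pct_clinical, collections_per_hour,
                         compensation, practice_expense_per_hour):
    """Estimated faculty practice plan margin per physician:
    clinical collections minus compensation and practice expenses."""
    hours = billable_hours(pct_clinical)
    collections = hours * collections_per_hour
    expenses = hours * practice_expense_per_hour
    return collections - compensation - expenses

# Hypothetical example: 60% clinical time, $250/hour collections,
# $180,000 compensation, $90/hour practice expense.
margin = practice_plan_margin(0.60, 250, 180_000, 90)
```

Under these placeholder inputs the sign of the margin, not its level, is the point: a specialty with a higher clinical-time share or higher collections per hour shows a surplus, while a primary care profile can show a loss, consistent with the pattern the table summarizes.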
Key factors affecting variation in the direct costs of GME programs include program size, attending physician compensation levels, and malpractice insurance. Economies of scale affect both variation across specialty programs and variation between large GME programs at academic health centers (AHCs) and smaller community-based programs. Smaller specialty programs with relatively high faculty compensation levels and malpractice insurance costs are likely to have higher direct GME costs per resident than other programs. However, residency programs also have a number of indirect effects on hospitals and attending physicians. These indirect effects are important from the perspective of the overall economics of operating GME programs and the marginal impact of changing program offerings, but they are difficult to measure. Despite the duty hour limitations and the growing emphasis of the accreditation requirements on education over service, our interviewees indicated that residents continue to serve as a relatively inexpensive source of labor. They identified attending physician patient care revenues and the share of outpatient clinic costs and other practice expenses covered by the faculty practice plan as key differences in the financial impact of training programs in different specialties. Attending physicians in specialties with relatively high compensation levels that also provide most services in hospital-operated facilities are better able to support resident clinical supervision activities through their patient care revenues. Primary care residency programs are disadvantaged relative to other specialties because of lower physician revenues and a higher proportion of training in ambulatory clinics. These programs are often subsidized by the hospital or cross-subsidized by faculty practice plan revenues from other departments.
While funding disparities can be addressed in AHCs through cross-subsidization, this opportunity does not exist in community hospitals with a single primary care program.
The public policy debate over GME financing often focuses on only one component of the cost and benefit equation and, by doing so, leads to the perplexing but commonly heard adage that "it costs to train residents and it costs to replace them." This seeming contradiction arises from looking at the average cost of residency training to determine that "it costs to train residents" and at the loss of benefits derived from having residents to determine that "it costs money to replace them." It is best resolved by examining the marginal financial impact of adding residents to, or removing them from, existing teaching programs. The marginal impacts are more likely to influence sponsor decisions on changes in GME program size and offerings and help explain why GME programs are expanding above the Medicare full-time equivalent (FTE) limit on funded positions. For existing programs, minor changes in residency program size are unlikely to affect either GME infrastructure costs or IME costs, so the major cost of adding a slot is the resident's stipend, fringe benefits, and resident-specific allowances. Marginal costs may be higher if adding the resident requires additional capacity or attending physicians. If the hospital has service needs that would otherwise need to be met by hiring alternative providers, there is a marginal benefit to adding a resident, particularly in a subspecialty program, before considering the additional benefits of any GME-related revenues.
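The marginal-impact reasoning above can be made concrete with a back-of-the-envelope sketch. The structure follows the text (stipend plus fringe benefits and allowances on the cost side; avoided alternative-provider cost and any GME-related revenue on the benefit side), but every dollar figure below is an illustrative assumption, not an estimate from the study.

```python
# Back-of-the-envelope sketch of the marginal impact of one added
# residency slot in an existing program. All figures are hypothetical.

def marginal_cost(stipend, fringe_rate, allowances):
    """Marginal cost of one added slot when GME infrastructure and
    attending physician capacity are unchanged."""
    return stipend * (1 + fringe_rate) + allowances

def marginal_benefit(coverage_hours, alt_provider_rate, gme_revenue):
    """Marginal benefit: avoided cost of alternative providers for
    service coverage, plus any GME-related revenue for the slot."""
    return coverage_hours * alt_provider_rate + gme_revenue

cost = marginal_cost(stipend=55_000, fringe_rate=0.25, allowances=5_000)
# For a hospital over its Medicare cap, gme_revenue for the added
# slot may be zero, so the benefit rests on service coverage alone.
benefit = marginal_benefit(coverage_hours=600, alt_provider_rate=120,
                           gme_revenue=0)
net = benefit - cost  # positive when service value exceeds stipend cost
```

The sketch illustrates why expansion above the funded-FTE cap can still be rational: when the service coverage a resident provides would otherwise be purchased at attending or hospitalist rates, the marginal benefit can approach or exceed the marginal cost even with no Medicare payment attached to the slot.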
One national GME expert whom we interviewed suggested that the limits on the number of positions that Medicare will fund provide a natural experiment that demonstrates the overall economics of operating residency programs. Since 1996, there has been a steady increase in the number of subspecialty programs and residents. While some subspecialty expansions, such as in hospice and palliative care, are consistent with physician workforce priorities, others have low priority relative to increasing the supply of primary care physicians. Unless workforce priorities are reinforced by the hospital's internal service needs, program expansions are more likely to occur in the more-lucrative specialty and subspecialty programs rather than in primary care. Medicare's GME-related payments should be realigned to be more consistent with the differences in financial impact of various specialty programs and to focus support on primary care residency programs. However, the difficulties that many primary care residency programs are experiencing in filling their slots with qualified candidates suggest that simply increasing payments for primary care programs relative to other specialty and subspecialty programs will not be sufficient. Significant investments are needed not only to enhance primary care training programs but also to attract future physicians into primary care.
Accreditation Council for Graduate Medical Education, ACGME Program Requirements for Graduate Medical Education in Cardiovascular Disease, 2007a.
Accreditation Council for Graduate Medical Education, ACGME Program Requirements for Graduate Medical Education in Dermatology, 2007b. As of September 4, 2013: http://www.acgme.org/acgmeweb/Portals/0/PFAssets/ProgramRequirements/080dermatology_07012007.pdf
Accreditation Council for Graduate Medical Education, ACGME Program Requirements for Graduate Medical Education in Family Medicine, 2007c.
Accreditation Council for Graduate Medical Education, ACGME Program Requirements for Graduate Medical Education in Internal Medicine, 2009a.
Accreditation Council for Graduate Medical Education, ACGME Program Requirements for Graduate Medical Education in Interventional Cardiology, 2009b.
Accreditation Council for Graduate Medical Education, ACGME Program Requirements for Graduate Medical Education in Radiation Oncology, 2009c. As of September 5, 2013: http://www.acgme.org/acgmeweb/Portals/0/PFAssets/ProgramRequirements/430_radiation_oncology_01012009_f07012011.pdf
Accreditation Council for Graduate Medical Education, ACGME Program Requirements for Graduate Medical Education in Surgery, 2009d.
Accreditation Council for Graduate Medical Education, ACGME Program Requirements for Graduate Medical Education in Urology, 2009e.
Accreditation Council for Graduate Medical Education, ACGME Data Resource Book Academic Year 2010–2011, 2011. As of August 20, 2013: http://www.acgme.org/acgmeweb/Portals/0/PFAssets/PublicationsBooks/2010-2011_ACGME_DATA_RESOURCE_BOOK.pdf
Association of American Medical Colleges, AAMC Survey of Resident/Fellow Stipends and Benefits, 2010. As of August 20, 2013: https://www.aamc.org/download/158738/data/2010_stipend_report.pdf
American Medical Association, “FREIDA Online,” 2011. As of November 22, 2011: https://freida.ama-assn.org/Freida/user/viewProgramSearch.do
Brotherton, S. E., and Etzel, S. I., “Graduate Medical Education, 2011–2012,” Journal of the American Medical Association, Vol. 308, No. 21, 2012, pp. 2264–2279.
Centers for Medicare and Medicaid Services, “PFS National Payment Amount File PFALL13A,” 2012. As of August 20, 2013: http://www.cms.gov/Medicare/Medicare-Fee-for-Service-Payment/PhysicianFeeSched/PFS-National-Payment-Amount-File.html
Medical Group Management Association, Academic Practice Compensation and Production Survey for Faculty and Management: 2011 Report Based on 2010 Data, 2011.
Medicare Payment Advisory Commission, Report to the Congress: Aligning Incentives in Medicare, Washington, D.C., June 2010.
MedPAC—see Medicare Payment Advisory Commission.
MGMA—see Medical Group Management Association.
O'Brien-Strain, M., McClellan, S., Frances, S., and Theobald, N., Final Report on GPCI Malpractice RVUs for the CY 2010 Medicare Physician Fee Schedule Rule, Burlingame, Calif.: Acumen, LLC, March 2010.
Robertson, C. M., Klingensmith, M. E., and Coopersmith, C. M., “Prevalence and Cost of Full-Time Research Fellowships During General Surgery Residency: A National Survey,” Annals of Surgery, Vol. 249, No. 1, 2009, pp. 155–161.
Salsberg, E., Rockey, P. H., Rivers, K. L., Brotherton, S. E., and Jackson, G. R., “U.S. Residency Training Before and After the 1997 Balanced Budget Act,” Journal of the American Medical Association, Vol. 300, No. 10, 2008, pp. 1174–1180.
Weida, N. A., Phillips, R. L., Bazemore, A. W., Dodoo, M. S., Petterson, S. N., Xierali, I., and Teevan, B., “Loss of Primary Care Residency Positions Amidst Growth in Other Specialties,” American Family Physician, Vol. 82, No. 2, 2010, p. 121.
The research described in this article was sponsored by the Medicare Payment Advisory Commission, and was produced within RAND Health, a division of the RAND Corporation.