Development of American Medical Education

Sadie Tristam is a third-year sociology major at Grinnell College. She is passionate about education and fascinated by medicine.

Changes in education usually correspond with changes in the broader society. We see this in our primary education system, and it is no different for medical education. Many scholars who study the development of medical education agree that schools and curricula must evolve alongside the medicine of their time. There is an added importance because medicine directly serves the community around it: quality medical schools frequently produce quality medical facilities, and their communities benefit greatly. Medicine itself has seen dramatic changes: “from a relatively weak, traditional profession of minor economic significance, medicine has become a sprawling system of hospitals, clinics, health plans, insurance companies, and myriad other organizations employing a vast labor force,” and that labor force needs an education. [1] I argue that medical schools and their curricula have developed in close step with broader historical changes. Focusing on the period between 1860 and 1950, I will discuss how aspects of medical education developed, how standards changed, and what changes we may see in the near future.

One of the fundamental pedagogical issues in medical education is how to balance teaching students the basic concepts of medical science against training them in the practical skills needed to practice medicine. [2] The two approaches are a practical medical education and a theoretically oriented one. A practical medical education focuses on aspects of medicine relevant to illness in the immediate community, bringing more sociology and psychology into the doctor-patient relationship. A theoretically oriented education focuses on the basic medical sciences and laboratory skills, and the doctor-patient relationship it fosters usually gives little consideration to the patient’s personal situation. Many scholars and educational leaders argued for a more practical medical education because it “instead emphasizes the need for focusing on practical and professional skills that allow for the maturation of reasoning and the development of professional values and compassion” in the students who would become doctors. [3]

Between 1860 and 1900, American medical education was in its infancy. The classical higher education curriculum did not complement medicine because it failed to recognize the development of the sciences and social sciences. Admissions standards were incredibly low; many medical schools did not even require a high school diploma. Many medical schools had fewer preliminary education requirements than theology or law schools, typically because students could not afford to complete an undergraduate degree before attending medical school. [4] The American Medical Association began influencing medical education in the 1880s with the introduction of state licensing laws, and it “formed a Committee on Medical Education (CME) as one of its first actions.” [5] It was also in the 1880s that formal entrance requirements appeared, including written and oral examinations in both clinical and classical subjects.

Medicine in this period was still struggling to develop: many hospitals were unsuitable for treating patients, while dispensaries were a more popular form of medical treatment. This influenced how clinical teaching happened in medical schools. Schools frequently preferred to take students to dispensaries because more patients were available there, and because dispensaries could and would cater more readily to students’ needs. Well-funded medical schools were few and far between, but students at those institutions had many opportunities to travel to Europe and learn from the specialty hospitals there. This mirrored the fact that many “leaders in medicine advocated for medical education standards that more closely reproduced the rigor of the study of medicine at the major European universities,” since medicine in Europe was more developed than in the United States at the time. [6]

Many of the changes we see in education and medicine happened at the turn of the century. Between 1900 and 1950, development was rapid, and most of this was because of the Flexner Report. Early in the period, doctors were still not very accessible to much of the population because they preferred to practice in cities and urban areas. However, medicine changed in these years because of the government’s focus on public health. Death rates were declining as public health measures raised the standard of living. Periodic health examinations, comparable to today’s common yearly physical, became an accepted part of medical care. Doctors became somewhat more invested in their patients because each “assumed a new responsibility for maintaining longitudinal medical records on his patients so that he could recognize changes in their state of health,” and this connection allowed for more personalized care. [7] Part of this change carried over into medical education, and much of that influence came from the Flexner Report of 1910.

Photo Credit Unknown. Source: Wikipedia

The Flexner Report, commissioned by the Carnegie Foundation and written by Abraham Flexner, examined the quality of facilities, entrance requirements, and number of qualified faculty members at American medical schools, holding the Johns Hopkins University Medical School up as the model. The study located and exposed many poor-quality medical schools, but its real legacy is that it “transformed medical school education to strictly adhere to protocols of mainstream science teaching (basic sciences, research, and clinical care).” [8] The Report emphasized academic education and research more heavily than the skills learned from professional training.

This emphasis was not fully reflected in medical education, however. With the rise of modern surgery and the degree of specialization such a new practice demanded, internship and residency programs were created. The AMA developed educational standards for these apprenticeships in 1919, which helped to monitor the programs’ progress. The purpose of an internship or residency was to provide hands-on experience for aspiring doctors, and it has become an integral part of medical education.

Within this period, the quality of hospitals turned around significantly. Coupled with the closing of most dispensaries, many wealthy medical schools opened their own hospitals, and standards changed for the better. Schools in the northeastern United States opened hospitals adjacent to their campuses both because doing so gave them a certain amount of control over how the hospital was run and because it provided their students with hands-on experience. This growth also allowed for successful internship and residency programs at these schools and for the development of clinical specialties, including the cardiac, neonatal, and orthopedic wings we take for granted today. [9] In turn, the specialization of education is evident as interns and residents usually pick a specialty to focus on before their training is complete. As the medical field expands, we also see specialization in the kinds of medicine people can practice. Now, for example, there are pathologists, anesthesiologists, and audiologists, some of whom attend medical school and some of whom do not, but all of whom are critical to the medical field.

Massachusetts Medical College circa 1824 (Author: J.R. Penniman) Source: Harvard University, Francis A. Countway

The importance of this discussion is becoming more pertinent today. Many politicians and academics claim our medical system is in crisis, and one way people are looking to change it is through medical education. The argument is that the four years students spend in medical school are integral to reshaping the perceptions those students have formed over their previous twenty years of experience, and that “the first two years of medical school (biochemistry, molecular biology, neuroscience, physiology) are unnecessary for the second two years” because they do not teach the importance of patient care and professional values. [10] Instead, they teach the value of research and clinical skills unrelated to the practice of being a doctor. Some medical programs, Harvard Medical School for example, see the early introduction of clinical rotations as pertinent to changing this trend. The argument is that clinical rotations in the first two years of medical school can establish for the hopeful doctor that patient care is the primary goal, while the important and necessary skills are still being learned. [11]

The value of a solid education cannot be contested. Whether the goal is to be a lawyer, run a successful business, or be a doctor, a solid foundation before practicing is crucial. Some argue that because doctors influence human lives, that foundation needs to be long, rigorous, and top-notch. Even a TV show like Grey’s Anatomy, whose surgical internship and residency program after the four years of medical school runs seven seasons, or about seven years, does not paint a breezy picture of the time commitment, rigor, and stress placed on doctors, even if its characters’ personal lives seem more difficult than the surgical program itself. The development of such an education comes from many decades of trial and error on the part of academics, faculty, and students. A mid-century study of medical schools agreed that “the better his grasp of the basic sciences, the broader his factual knowledge, and the more intelligently he organizes this knowledge, the more scientific he will be as a doctor. If, in addition, he has sympathy for and a broad understanding of people, he will be an excellent physician.” [12] That “if” is becoming a must, as medical education tries to find a balance between being the best scientifically and providing the best quality care.

[1] Paul Starr. The Social Transformation of American Medicine: The rise of a sovereign profession and the making of a vast industry. Basic Books, 1982, pg. 4.

[2] William Rothstein. American Medical Schools and the Practice of Medicine: A History. Oxford University Press, 1987.

[3] Alberto Armendi and Edmund Marek. “Pedagogical Shifts in Medical Health Education.” Creative Education 4 no. 6A (2013), pg. 20.

[4] Rothstein, pg. 89-93.

[5] Susan E. Skochelak. “Commentary: A Century of Progress in Medical Education: What About the Next 10 Years?” Academic Medicine 85 (2010), pg. 197.

[6] Skochelak, pg. 197.

[7] Rothstein, pg. 123.

[8] Armendi and Marek, pg. 20.

[9] Rothstein.

[10] Vinay Prasad. “Persistent Reservations Against the Premedical and Medical Curriculum.” Perspectives on Medical Education 2 no. 5 (2013), pg. 1.

[11] Armendi and Marek, pg. 20.

[12] Robert Berson and John Deitrick. Medical Schools in the United States at Mid-Century. McGraw-Hill, 1953, pg. 6.

Further Reading:

Ludmerer, Kenneth M. Time to Heal: American medical education from the turn of the century to the era of managed care. New York: Oxford University Press, 2005.

Ludmerer, Kenneth M. Learning to Heal: The Development of American Medical Education. New York: Basic Books, 1985.

Quintero, Gustavo. “Medical Education and the Healthcare System—Why Does the Curriculum Need to be Reformed?” BMC Medicine 12 no. 213 (2014).