A couple of conversations I’ve had with patients’ families over the past month have made me realize that many folks don’t know how our system produces a pediatrician, a radiologist, or a surgeon. And a lot of what people know is wrong. Physicians are so immersed in what we do that we forget that the process is a pretty arcane one. Just what are the mechanics of how doctors are trained? Understanding your physician’s educational journey should help you understand what makes him or her tick. As it turns out, a lot of standard physician behavior makes more sense when you know where we came from. This post covers some of the important history behind that.
Most physicians in the nineteenth century received their medical educations in what were called proprietary medical schools. These were schools started as business enterprises, often, though not always, by doctors. Anyone could start one, since there were no standards of any sort. The success of a school was not a matter of how good the school was, since that quality was then impossible to define anyway, but of how good those who ran it were at attracting paying students.
There were dozens of proprietary medical schools across America. Chicago alone, for example, had fourteen of them at the beginning of the twentieth century. Since these schools were the private property of their owners, who were usually physicians, the teaching curriculum varied enormously between schools. Virtually all the teachers were practicing physicians who taught part-time. Although being taught by actual practitioners is a good thing, at least for clinical subjects, the academic pedigrees and skills of these teachers varied as widely as the schools — some were excellent, some were terrible, and the majority were somewhere in between.
Whatever the merits of the teachers, students at these schools usually did not see or treat their first patient until after they had graduated, because the teaching consisted almost exclusively of lectures. Although they might see an occasional demonstration of something practical, in general students sat all day in a room listening to someone tell them about disease rather than showing it to them in actual sick people. There were no laboratories. Indeed, there was no need for them, because medicine was taught exclusively as a theoretical construct, and some of its theories dated back to Roman times. It lacked much scientific basis because the necessary science was itself largely unknown at the time.
As the nineteenth century progressed, many of the proprietary schools became affiliated with universities; often several would join to form the medical school of a new state university. The medical school of the University of Minnesota, for example, was established in 1888 when three proprietary schools in Minneapolis merged, with a fourth joining the union some years later. These associations gave medical students some access to aspects of new scientific knowledge, but overall the American medical schools at the beginning of the twentieth century were a hodgepodge of wildly varying quality.
Medical schools were not regulated in any way because medicine itself was largely unregulated. There was not even agreement on what the practice of medicine actually was; among physicians there prevailed several occasionally overlapping but generally distinct views of what really caused disease. All these views shared a basic fallacy: they regarded a symptom, such as fever, as a disease in itself. Thus they believed relieving the symptom was equivalent to curing the disease.
The fundamental problem was that all these warring medical factions had no idea what really caused most diseases; for example, bacteria were only just being discovered and their role in disease was still largely unknown, although this was rapidly changing. Human physiology — how the body works — was only beginning to be investigated. To America’s sick patients, none of this made much difference, because virtually none of the medical therapies available at the time did much good, and many of the treatments, such as large doses of mercury, were actually highly toxic.
There were also bitter arguments and rivalries among physicians for reasons beyond their warring theories of disease causation. In that era before experimental science, no single viewpoint could definitively prove another wrong. The chief reason for the rancor, however, was that there were more physicians than there was demand for their services. At a time when few people even went to the doctor, the number of physicians practicing primary care (which is what they all did back then) relative to the population was three times what it is today. Competition was tough, so tough that the majority of physicians did not even support themselves through the practice of medicine alone; they had some other occupation as well, quite a difference from today.
In sum, medicine a century ago consisted of an excess of physicians, many of them badly trained, who jealously squabbled with each other as each tried to gain an advantage. Two things changed that medical world into the one we know today: the explosion of scientific knowledge, which finally gave us some insight into how diseases actually behaved in the body, and a revolution in medical education, a revolution wrought by what is known as the Flexner Report.
In 1910 the Carnegie Foundation commissioned Abraham Flexner to visit all 155 medical schools in America (for comparison, there are only 125 today). What he found appalled him; only a few passed muster, principally the Johns Hopkins Medical School, which had been established on the model then prevailing in Germany. That model stressed rigorous training in the new biological sciences with hands-on laboratory experience for all medical students, followed by supervised bedside experience caring for actual sick people.
Flexner’s report changed the face of medical education profoundly; eighty-nine of the medical schools he visited closed over the next twenty years, and those remaining structured their curricula into what we have today—a combination of preclinical training in the relevant sciences followed by practical, patient-oriented instruction in clinical medicine. This standard has stood the test of time, meaning the way I was taught in 1974 was essentially unchanged from how my father was taught in 1942.
The advance of medical science had largely stopped the feuding between kinds of doctors; allopathic, homeopathic, and osteopathic schools adopted essentially the same curriculum. (Although the original homeopathic schools, such as Hahnemann in Philadelphia, joined the emerging medical mainstream, homeopathic practice similar to Samuel Hahnemann’s original theories continues to be taught in a number of places.) Osteopathy has kept its own identity: it continues to maintain its own schools, of which there are twenty-three in the United States, and to grant its own degree, the Doctor of Osteopathy (DO), rather than the Doctor of Medicine (MD). In virtually all respects, however, and most importantly in the view of state licensing boards, the skills, rights, and privileges of holders of the two degrees are equivalent.