The emergency room is the only type of medical facility in the U.S. where patients have a right to receive care, regardless of whether they carry insurance. Intended for quick treatment of people in crisis, emergency departments have become a safety net for all kinds of conditions—at a cost of $48 billion a year and rising.

While the health industry sounds the alarm about this price tag and scolds patients for over-use, crowded ERs are symptoms of a deeper and more complicated malady. It’s not just people without insurance who rely on the ER and can’t pay their bills. More commonly, people with health insurance seek emergency care, often because their physicians are too busy, they can’t get to a specialist, or the out-of-pocket costs of office visits have put them off until it’s too late. The more complicated, stressed, and tumultuous the health system becomes, the more people fall back on a simple, if expensive, safety net.

With today’s overburdened facilities, it’s hard to imagine a time when Americans avoided the hospital at all costs. However, before the ER emerged as a fortress of life-saving technology, sick people preferred to receive medical care in their homes, tended by family members or paid nurses. Doctors made house calls. Nineteenth-century hospitals were places of last resort for the poor, migrant laborers, and travelers who had nowhere to rest or no family to care for them. Adding to this social stigma was the fact that wards were plagued by hospital-borne bacterial infections. Doctors had very few cures; they mostly enforced rest and nourishment.

The busy emergency room at the Downey Regional Medical Center in Downey, California, 2006. (Gary Friedman/Los Angeles Times via Getty Images)

Revolutionary changes followed the embrace of germ theory in the 1870s and 1880s, which ushered in an age of militant sterilization. Procedures that used to occur on kitchen tables now required a disinfected operating theater—and this made hospitals indispensable for modern surgery. A cascade of innovations from the 1930s onward, including effective antibiotics, diagnostic imaging, dialysis, and corticosteroids, allowed the medical profession to achieve an unprecedented degree of efficacy.

In 1900, an accident victim would have been carried home to recover or die, and few hospitals had an “emergency ward”—the patient’s chances were just as good if they waited for a doctor to come to them. By 1945 all that had changed, and hospitals began to organize their critical care resources into emergency departments, staffed by specialists whose techniques developed from the battlefield medicine of WWII. With new blood banks and defibrillators, they achieved Lazarus-like effects. Crucially, the emergency entrance was open “night and day… round the clock,” as American Hospital Association ads proclaimed.

Such messaging taught people to equate hospitals with health, “the first line of defense instead of the last resort,” in a phrase repeated by many journalists. A Washington, D.C., hospital director rejoiced that “so-called ordinary people, and even the well-to-do,” were flocking to the ER because “they want service in a hurry.” The federal government met this popular demand for more and better hospitals with passage of the 1946 Hill-Burton Act, which funded the construction of over 6,800 facilities nationwide. Many communities used the funds to build their first emergency departments.

In the usually busy and overcrowded County USC emergency room, a sick young lady lies in an empty hospital bed near the entry. (Clarence Williams/Los Angeles Times via Getty Images)

With abundant new hospitals and faith in modern medicine at an all-time high, ER visits increased 400 percent between 1940 and 1955, and that growth did not reflect a bumper crop of emergencies. Rather, it revealed that Americans had higher expectations for health care but were only partially covered by a patchwork insurance system, creating a perfect storm that broke over the nation’s emergency rooms. Worried hospital administrators declared a “crisis of overutilization.” At first, the culprits in ER over-use were middle- and working-class Americans frustrated by the difficulty of accessing primary care.

Historian Beatrix Hoffman has analyzed three causes of over-use in the 1950s and 1960s. First, due to a doctor shortage and greater geographical mobility, many people lacked personal physicians and went to the ER for ordinary illnesses. Second, though many middle-class people carried partial health insurance, it was usually limited to hospital services. When they learned this (sometimes by receiving bills for things they thought were covered), patients defaulted to the ER to avoid the out-of-pocket expense of preventive care.

Finally, even if you had a personal physician and could pay, it was increasingly hard to get seen. A generation before, you could summon the town doctor at any time of the day or night for a house call. By the 1940s, doctors were keeping what seemed to patients like very leisurely hours—most clocked out at 5 p.m. and took Sundays and Wednesdays off. The sick might have to wait weeks for an appointment. When patients called them after hours, doctors often referred them to the emergency room, where they faced increasingly long waits.

Nikita Hendrix (in wheelchair) waits to be treated for a pressure sore on his foot, during a visit to the emergency room at Long Beach Memorial Hospital, 2014. In one of the first signs of the effects of Obamacare, most hospitals in Los Angeles County had an increase in visits to their emergency departments in the first part of 2014. (Mel Melcon/Los Angeles Times via Getty Images)

Slammed by an unprecedented demand for emergency care, hospitals did an about-face: instead of boasting that their doors were always open, they started distributing pamphlets that instructed the public on “proper use of the emergency room,” pleading with people to stay home and call their personal physician unless their condition was life-threatening. Unfortunately, “emergency” is a subjective term, and patients themselves often couldn’t tell the cause of their chest pain or trouble breathing—they only knew that the hospital, with its high-tech equipment, could give them an answer at any time of the day or night.

Patients were well aware that they might wait in the ER for hours, and that the experience could be unpleasant and costly. The fact that they kept lining up shows how frothy publicity combined with real scientific progress to cement the connection between rapid medical treatment and chances of survival. Administrators soon realized that many of these patients couldn’t pay their bills, but the public responded with outrage when hospitals turned people away, leading to a 1986 law, the Emergency Medical Treatment and Active Labor Act, that mandated emergency treatment. The industry’s messaging about the lifesaving power of medicine inadvertently created an ethical imperative—that no one deserved to die a preventable death.

There’s no single reason why the emergency room serves as first resort for so many Americans today, but its history of absorbing patients set adrift in a changing health care system suggests that it will bear the impact of changes to come.