Simply by virtue of being alive, each man, woman, and child has a history. And in this technology-enabled age of the Quantified Self, more and more people are taking an active interest in their personal history—downloading apps to track calories and mood swings, blogging about runs and test scores. But arguably the most important record is your medical record—and for people born in the past century, that record has advanced in both importance and technology.
Why Does It Matter?
The purpose of a medical record is to centralize information that helps patients and their doctors make well-informed decisions about the patient’s health and treatment options. Today, a medical record might include material on the patient’s genes, environment, diet, lifestyle, and treatment history. Doctors add to these records at every visit, and individuals need access to them to secure life insurance, provide proof of immunization, and so forth.
Low-tech and Low-info Beginnings
Prior to 1900, there was no standard method for keeping medical records. In fact, many doctors didn’t even touch their patients except to check a pulse; many of their observations centered on studying the patient’s complexion, urine, and other excretions. So there wasn’t much to write down.
Some more substantial narratives did exist; the ancient Greeks wrote down advice for patients, lessons for doctors, and stories of particularly notable diseases. The practice was revived in the 14th century, and the scientific revolution of the 16th century, marked by a growing scholarly interest in the natural world and the inner workings of the body, fueled its expansion and the publishing of medical “observations.” One of the most extensive surviving collections of medical records from this period was compiled by Simon Forman and Richard Napier; you can read more about their work at The Casebooks Project.
But such record keepers were the exception to the rule. Most doctors kept, at most, an account book: a list of patients along with their payments for treatments and prescriptions.
The Rise of Hospitals and Medical Education
In the second half of the 19th century, two developments drove the establishment of more official medical records in the Western world: public hospitals began to emerge, and medical knowledge grew exponentially.
In the United States, the migration from home health care to treatment in public hospitals was a product of urbanization and the rapid evolution of medicine. For example, the newfound ability to sterilize medical equipment, developed around 1880, opened new doors for surgeons.
Despite these developments, there still weren’t any standards dictating what information to record, so it was a challenge for doctors to compare cases or trace how a doctor arrived at a diagnosis.
The Modern Age
In the 1960s, the introduction of computers into the medical field paved the road for the standardization and sharing of medical records. The ability to log a consistent set of information allowed doctors to track their patients over time and provide evidence of their decision making and follow-through.
Beyond providing the patient with their medical history, these new modern medical records had a few other practical purposes:
- Coinciding with an increase in medical malpractice litigation, the thorough medical record became an important legal tool should the doctor be sued (especially if the patient had a poor outcome).
- In terms of insurance, the medical record began to be used to justify the bill sent to the insurer as well as to determine the patient’s rates and coverage denials.
This growing body of standardized medical records also allowed researchers to aggregate useful data in their study of disease and treatment, further advancing the field of medicine. In fact, the medical coding system is meant to help track the prevalence of illness and the efficacy of treatment across all sorts of patients.
The 1980s and 90s saw growth in the deployment of various computerized health care programs, including software for hospital admission registration and master patient indexes. But many of these programs couldn’t communicate across departments—they were tagged as “source” systems and unviewable to other departments, let alone other hospitals.
A New Revolution
In 1996, the U.S. Congress passed the Health Insurance Portability and Accountability Act (HIPAA), which required the establishment of national standards for electronic health records (EHR). In effect, HIPAA both furthered the digitization of patients’ diagnostic and treatment data and created a lot of complexity thanks to the thousands of codes used to track that data. The concept of a national centralized server model of health care data—which would allow many different providers to communicate and share patient records—is not always a popular one, as it carries risks involving privacy and security of information.
But the need for expanded EHRs was clear: they could potentially help avoid costly, life-and-death mistakes. In his January 2004 State of the Union address, President George W. Bush called out the issue directly: “By computerizing health records, we can avoid dangerous medical mistakes, reduce costs and improve care.”
The passage of President Barack Obama’s Affordable Care Act in 2010 introduced new reporting requirements for EHRs. Medical professionals are bracing themselves for a number of changes related to coding and electronic record keeping.
What You Can Do Now
Play an active role in your own health by keeping a copy of your health records. Some medical offices may charge you for a copy of your chart, but it is within your rights to ask for it. And wherever life leads you, make sure your chart follows: if you switch physicians or visit a specialist, ask your previous physician’s office to share your record.