The Bloodletting Paradox
What Phlebotomy’s History Can Teach Us About Common Sense & Runaway Harm
I love thee, bloodstain’d, faithful friend!
As warrior loves his sword or shield;
For how on thee did I depend
When foes of Life were in the field!
Those blood spots on thy visage, tell
That thou, thro horrid scenes, hast past.
O, thou hast served me long and well;
And I shall love thee to the Last!
Published in 1841 by Baltimore-based writer and physician Joseph Snodgrass (1813–1880), the above poem was written in praise of the spring lancet, a once popular and prestigious surgical tool. Although an instrument for human incision might seem an odd thing to compose a verse about, Snodgrass’s admiration was not out of place for his time.¹
“Ode to My Spring Lancet” reflected the attitude many physicians during the Industrial Revolution held toward the emerging medical innovations of the 18th and 19th centuries. This post-Enlightenment era elevated the status of medical practitioners by putting the field of medicine on more empirical footing. Developed taxonomies, shared standards, and newly engineered technologies made both new and seasoned therapeutic practices more efficient and accessible.
One such practice, given new esteem by the spring lancet’s innovative design, became one of the 19th century’s most popular health interventions. This was the inveterate art of bloodletting.
A Brief Overview of Intentional Bloodshed
Practiced for over 3000 years across multiple societies and cultures, bloodletting (or venesection) was a form of therapeutic phlebotomy: the removal of blood from the body through incision, leeching, or scarification, to cure an illness or prevent disease. Using an array of instruments that fell in and out of fashion over the millennia, the practitioner would pierce a vein at a particular site on the body (which might vary depending on the ailment being addressed) and drain the patient’s blood, sometimes until the patient passed out.
Spilling blood as a means of healing might sound backward to modern ears. And we might mistakenly assume it to have been used throughout history only as a last resort — to the degree that it was used at all. However, over the course of its reign, bloodletting was considered an immediate go-to treatment for a litany of different ailments, from serious infirmities like smallpox and pneumonia to irritations as common as fevers, acne, or headaches.²
Beyond addressing existing irritations, it was also a form of preventative therapy. Individuals might be bled every season to maintain their health or ostensibly improve other aspects of their being. One medieval medical handbook described bloodletting as having the ability to help strengthen the memory, develop the senses, or even produce a “musical voice”, to highlight just a few of its many purported perks.
As the practice flourished, patients who could afford it commissioned personalized bleeding bowls to catch their blood. And it wasn’t uncommon for physicians to procure lavishly decorated bleeding tools and ornate lancet cases.
The only problem with this popular medical craze was that bloodletting didn’t actually work.
An Irony in Medicine
In time, medical practitioners found bloodletting’s therapeutic claims to be untenable. It is one thing for a treatment to be ineffective, but another for it to cause harm, and bloodletting checked both boxes.
As it turns out, it does not give one a musical voice, improve one’s senses, or treat pneumonia. Additionally, with the exception of a few special cases,³ removing large amounts of blood from the body may not only be harmful — causing long-term damage to the oft-bled — but potentially fatal, especially when frequently employed for common maladies.
This may appear obvious today, but it was a reality that eluded our predecessors. Bloodletting helped alleviate common surface-level symptoms, giving it the appearance of effectiveness and leaving its potential harm undetected. Although the practice was eventually abandoned by the early 1900s, the century leading up to its fall from grace was also the era of its most persistent use.
It’s important to highlight that bloodletting was not some superstitious practice of back-alley apothecaries, but a customary and favored treatment in orthodox medicine. Bleeding patients became so prevalent during the 19th century that the ability to phlebotomize skillfully was as necessary to the Industrial-Age physician as the ability to use a stethoscope is for modern-day practitioners.
The historical irony here is twofold. A medical practice believed to improve patients’ health was paradoxically making them worse, and that same practice experienced the height of its popularity during a time of industrialization, forward thinking, and professed rational empiricism.
How an ironic state of runaway harm could flourish amid a culture of modernism and innovation is worth investigating.
When Sense Becomes Common
Bloodletting’s curious case could be described as — to use a fancy medical term — iatrogenic.
A compound of the Greek iatros (physician) and genesis (origin), iatrogenic harm is defined as harm brought forth by a healer, or any unintended adverse patient outcome resulting from a health care intervention.⁴ In other words, “treatments” that ironically make things worse.
Although the term is technically specific to medical error, practices that produce the opposite of what is intended are possible across all fields and disciplines. We may be surrounded by non-medical ‘iatrogenesis’, perhaps even producing it ourselves.
Of course, even with the greatest of care, mistakes are still possible, so adverse outcomes are bound to happen every so often. It’s when those adverse outcomes are not apparent to us that they become truly dangerous. Harm seen for what it is can be properly addressed, but harm mistaken for help, as was the case with bloodletting, will be further perpetuated, unknowingly turning a one-off adverse result into iterative abuse.
Oddly enough, a culture’s “common sense” is usually the driving force behind these states of perpetual harm. Noticing the difference between help and harm often requires deliberate interrogation of a behavior or practice. But when those practices are under the umbrella of convention, they are shielded from such interrogation and assumed to be benign.
Common sense is infrequently questioned, hard to protest, and even harder to abandon, making it difficult to notice when its inner workings have gone awry. The perfect seedbed for iatrogenesis to crop up undisturbed.
Many factors constitute a culture’s common sense, all of which have the potential to hide iatrogenesis. However, there are three worth highlighting that we can explore in turn — namely, paradigms, authority, and ubiquity. Bloodletting’s 19th-century tenure provides a case study to spotlight how these might go astray and possibly help mitigate runaway harm in our own times.
In Good Humor: Paradigms
The first common sense element to consider is the existence of paradigms; and more specifically, our tendency to ignore them. These are the theoretical frameworks lying behind our practices; the collection of underlying assumptions that justify our behaviors and give them their legitimacy. Once established, they fade unnoticed into the background of public consciousness, making both them and the practices they sanction harder to interrogate.
In the case of bloodletting, the practice’s use in orthodox medicine was based on a 2000-year-old medical theory popularized by the great arbiter of ancient Greek medicine, Hippocrates (460–370 BC).⁵
Hippocrates posited that mental and physical temperament was determined by four internal fluids (or humors): phlegm, black bile, yellow bile, and blood. Proper wellness was believed to result from all four humors being in balance; disease, conversely, was an imbalance among the humors, whether through deficiency or excess. Known today as humoral theory, or humorism, the framework gave bloodletting its prevailing rationale in the West.⁶
By the second century AD, Galen of Pergamon (129–216 AD), the Greek physician and strong proponent of both humorism and bloodletting, claimed that out of the four humors, blood was the most abundant and the most dominant, making phlebotomy the logical candidate for addressing common ailments. It was additionally believed that the other humors all made their way into the blood eventually, so their excess, too, could be addressed by bleeding the “plethora” away.
Bloodletting and humorism’s Western popularity grew in parallel, the former scaffolded by the latter. Public acceptance of humoral theory allowed it to slip quietly into the category of common sense, protecting it from large-scale questioning. And with the humoral paradigm guarded from public interrogation, there was little reason to question the efficacy of bloodletting either.
Even today, we rarely make time to assess the relative strengths and weaknesses of our dominant paradigms. Attempts to improve any aspect of society usually involve us attending solely to immediate practical concerns (“How can we improve test scores?”), while neglecting the paradigmatic assumptions underlying those concerns (“Are test scores a reliable reflection of intelligence?”). It’s ironically a paradigm’s obviousness that makes it go unnoticed and its amendment seem nonsensical.
Through Hippocrates, and with subsequent help from Galen, the humoral paradigm gave us the proper mental scaffolding to bleed our troubles away, highlighting the second contributing factor in phlebotomy’s long legacy.
Age Before Beauty: Authority
Galen and Hippocrates were brilliant physicians, both establishing many of the fundamentals still used in medicine to this day, but they were also products of their time. Galen was not permitted to perform human dissections, for example, and instead had to build most of his anatomical knowledge piecemeal from various animal dissections. Although he got many things right, he understandably got many things wrong.
Nonetheless, Galen was an abnormally prolific writer; his ideas were widely disseminated, and he soon became a household name and an unquestioned authority. His large output of medical treatises (an estimated 300–400 works) traveled throughout the known world, being preserved, copied, and translated from the Roman period to the Renaissance and beyond. For over a thousand years, if you were learning medicine, you were learning more or less directly from Galen.
As centuries passed, Galen’s observations morphed into orthodoxy and the man became historical legend. Observations that deviated from Galen’s original findings were thought to be mistaken, and physicians who stated his work to be in error were sometimes forced to recant or lose their credibility.⁷
Both bloodletting and humorism had their share of skeptics, even as early as the second century, but their protests went ignored by the larger medical establishment. Legitimized by the Galenic corpus, a challenge to either (or to any practice believed to be of Galenic origin for that matter) was not simply considered a challenge to the idea, but to the infallible man himself.
Authority takes many different forms — prominent figures, time-honored traditions, seminal texts, etc. — but all of them have similar effects. Voices that sing out of key with authority are marked as subversive, and any protests that might come from such voices (which, as we’ve seen, might be lifesaving) are rarely taken seriously.
Granted, there are times when that marginal voice should be dismissed. Deviation from authority might be due to willful ignorance, incompetence, or anarchy for anarchy’s sake, cases in which ignoring the variant voice might be appropriate. However, deviation from authority might also be due to innovative thinking, creativity, wisdom, or even genius. The problem is that, in the shadow of common sense, it is extremely difficult to discern whether a deviation is wise, malicious, or merely naive.
It takes time to properly appraise the marginal voice and its protests, but our third common sense contributor makes that time an ever-diminishing commodity.
“Bloodstain’d, Faithful Friend”: Technology & Ubiquity
By the 19th century, bloodletting’s résumé was extensive. It had been cradled across cultures by the humoral paradigm and bore Galen’s stamp of approval for over a millennium. But those caretakers were beginning to come under fire.
New observations began to better localize disease, slowly nudging humoral fluids out of favor. And although Galen remained highly respected, empirical autopsies were making his anatomical mistakes harder to ignore. Yet amid the growing scrutiny of the very frameworks that gave phlebotomy its legitimacy, bloodletting cases were, ironically, increasing, due in part to the influence of technology.
For the majority of its history, bloodletting was imprecise, painful for the patient, and difficult to control with consistency. As is often the case, the culprit behind its clunkiness was clunky tools. Bleeding instruments ranged from carved stones and animal parts to phlebotomes and fleams (early-period venipuncture devices). These instruments got the job done but lacked accuracy and required a practiced hand.
However, by the 1800s, our old friend the spring lancet gained popularity. Its miniature blade improved accuracy and the adjustable spring allowed for controlled tension. Prior to that, physicians had to use their own strength to apply the appropriate amount of pressure, and getting the right vein at the right angle was literally hit or miss. The spring lancet reduced initial pain for the patient and simultaneously made the physician’s attempts more successful.
On the surface, this was an innovation to celebrate, as the poem that introduced this article can attest. The new device afforded the practice greater ease and accessibility. Untutored individuals could now perform venesection with reasonable confidence, so more patients were being bled by more people with greater frequency. The lancet’s technological accessibility greatly contributed to bloodletting’s ubiquity.
As a major element of common sense, ubiquity is often the most likely culprit of its misguidance. Similar to paradigms, when a practice becomes ubiquitous — both frequent and widespread — it becomes normative and difficult to discard. A concentrated practice, gaining traction at a slow rate, has time to be examined and is easier to oppose if found harmful. But fast-spreading practices are normalized quickly and technology can turbocharge their rate of dissemination, reducing the window of time we have to interrogate them properly.
We often assume any innovation that makes a behavior easier should result in positive outcomes. Spring-lancet manufacturers, for example, likely crafted the instrument believing it would contribute to medical progress and, ultimately, public health. It was difficult for them to conceive they might have been doing the opposite. From their vantage point, the device made a popular medical intervention more efficient. But in retrospect, we can see it instead made a dangerous practice more ubiquitous. Even though bloodletting was being challenged by the mid-19th century, the practice was already so diffuse it remained entrenched well into the 20th.⁸
With so many new technologies and their accompanying behaviors being deployed in our own times, at a speed and scale unmatched in our species’ history, it is wise to think carefully about which exciting innovations might just be contemporary spring lancets.
To Bleed, Or Not To Bleed
It’s tempting to chalk up historical blunders such as the bloodletting paradox to bygone ignorance we are no longer prone to. But in reality, we are just as susceptible as our progenitors to the same pitfalls that enabled phlebotomania’s runaway harm. And with a false confidence, perhaps even more so.
Iatrogenesis — in medicine or otherwise — is an emergent phenomenon that can be quite hard to detect when we uncritically assume our common sense to be in good order. While paradigms, authority, and ubiquity are not problematic in and of themselves, we have to be mindful of how their cooperation — as some of the primary building blocks of common sense — might harbor iatrogenic practices.
We may be producing the wrong outcomes, despite believing we are doing the right things, in any of our modern systems, policies, or institutions. Corporate guidelines, pedagogical methods, social conventions: all could be unknowingly perpetuating ill effects rather than opposing them. Wolves we ourselves have dressed in sheep’s clothing.
We would do well to encourage the periodic appraisal of our practices and their justifications. Consistent inquiry into whether our actions are truly effective and aligned with our values is not pessimism, but prudence.
Given the complexity of our times, there is wisdom in paying attention to history and remembering that simply because a practice is considered common — and its intentions sincere — does not guarantee it isn’t enabling detriment.
*This article does not purport to provide any medical advice. For questions on health and medical practice, consult a licensed physician.*
Footnotes for my Fellow Pedants:
[1]: Snodgrass is arguably, and unfortunately, more famous for his role as the last physician to Edgar Allan Poe (more specifically, for his failure to save Poe’s life) than for his other contributions as a writer, activist, and abolitionist.
[2]: A number of famous deaths, such as those of George Washington and Charles II, are speculated to have been caused by excessive bloodletting at the hands of their personal physicians.
[3a]: Although bloodletting has been dismissed as a harmful practice for most ailments, a 2014 paper suggests controlled venesection can still be effective for a small number of blood-related diseases, namely hemochromatosis, polycythemia vera, and porphyria cutanea tarda.
[3b]: It should also be noted that some modern schools of medicine use controlled phlebotomy in their practice with updated methods. The danger associated with bloodletting referenced in this article is limited to the comparatively uncontrolled pre-20th-century venesection of the Western context (“uncontrolled” denoting the absence of standards for how much blood could safely be removed and for which diseases removing blood was actually effective).
[4]: The definition of iatrogenesis used above comes from the Hartford Institute for Geriatric Nursing; the World Health Organization offers an additional definition.
[5]: Although the Western rationale for bloodletting came from the Greeks, the oldest record we have of the practice comes from the ancient Egyptians in a medical text known as the Ebers Papyrus, dating to around 1550 BC. The manuscript recorded medical interventions that seem to have been in practice for some time, so it’s reasonable to assume bloodletting predates this era.
[6]: The four internal humors were said to correspond to the four elements that made up the world: water, earth, fire, and air. We also still harbor vestiges of humoral theory in our modern language. Personality descriptions such as sanguine, phlegmatic, or melancholy are derived from humoral imbalances.
[7]: In 1559, John Geynes, a physician from New Haven, had to stand trial before London’s Royal College of Physicians for claiming that Galen ‘had erred’. He eventually made a public retraction.
[8]: Bloodletting was not dismissed all at once but was gradually abandoned. There was no definitive nail-in-the-coffin paper or experiment. Rather, a slow shift in the overall direction of Western medicine through things like the acceptance of germ theory and a better understanding of the circulatory system eventually led to the practice’s decline.
Sources:
Craddock, P., 2021, Spare Parts: The Story of Medicine Through the History of Transplant Surgery, St. Martin’s Press
Davis, A., Appel, T., 1979, Bloodletting Instruments in the National Museum of History and Technology, Smithsonian Institution Press
Lagay, F., PhD, 2022, The Legacy of Humoral Medicine, AMA Journal of Ethics
This article was originally published in the Medium publication BrainLabs.