Darwin’s Rathtars

In honor of Star Wars week at the Athens Science Observer, I wrote a guest post on Rathtars and all they can teach you about evolution. Enjoy!

A rathtar roams free aboard Han Solo’s ship the Eravana. Image: Wookieepedia

When Kanjiklub and the Guavian Death Gang boarded the Eravana, the three most beloved characters in The Force Awakens suddenly found themselves in a tight spot. Thanks to a con that was risky even by Han Solo’s standards, our trio was left stranded while the film’s more charismatic heroes made a narrow escape on the Millennium Falcon. Yes, after eating a large number of the galaxy’s most dangerous gangsters and scaring off the rest, those poor, adorable rathtars were left to fend for themselves.

Thanks to the ship’s presumably heavy infestation of Corellian scavenge rats, combined with a rathtar’s ability to eat literally anything, the marooned rathtars would have had no trouble feeding and reproducing. But how would their newfound isolation affect them? Would they continue to be a colony of rathtars just like the ones on their home planet?

When a new population is founded by a few individuals from a larger group — like, for example, when three rathtars are stranded on a Baleen-class heavy freighter — a phenomenon called the “founder effect” is observed. The founder effect is the loss of genetic diversity that results when a new colony is started from a small, random sample of individuals that isn’t necessarily representative of the original population. What does that mean in Galactic Basic?

Say, for example, that most rathtars have tentacles ranging from eight to eleven horrifying feet in length and an appalling number of blood-red eyes. However, by sheer chance, or maybe because they were easier to catch, the rathtars Han Solo happened to capture had tentacles that were a merely terrifying eight and a half feet long and puce-colored eyes.

The growing rathtar colony on the Eravana was derived from those select few individuals Han Solo captured. Therefore, even though rathtars throughout the galaxy have a much wider variety of tentacle lengths and eye colors, it can be expected that after several generations, all of the Eravana rathtars will have eight-and-a-half-foot-long tentacles and puce-colored eyes. There was a loss of genetic diversity in the new group, meaning that fewer versions, or alleles, of the genes that determine tentacle length and eye color are available when a new rathtar is produced.
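The loss of diversity from a small founding sample is easy to see in a toy simulation. All the numbers below are invented for illustration (rathtar genetics being, understandably, understudied): a gene with ten equally common alleles in the source population, and three captured founders.

```python
import random

random.seed(19)  # reproducible run

# Hypothetical population: ten equally common versions (alleles)
# of a "tentacle length" gene, two copies per individual.
source_alleles = list(range(10))
population = [(random.choice(source_alleles), random.choice(source_alleles))
              for _ in range(10_000)]

# Han captures only three rathtars at random.
founders = random.sample(population, 3)

# Count distinct alleles carried by the founders vs. the whole population.
founder_alleles = {a for pair in founders for a in pair}
popn_alleles = {a for pair in population for a in pair}

print(f"alleles in source population: {len(popn_alleles)}")   # 10
print(f"alleles aboard the Eravana:   {len(founder_alleles)}")  # at most 6
```

Three diploid founders can carry at most six distinct alleles between them, so every future Eravana rathtar must draw from that reduced pool — the founder effect in miniature.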


Galapagos finches as illustrated in the 1845 edition of Darwin’s account of the voyage of H.M.S. Beagle. Finches have very little in common with rathtars, as rathtars are horrifying. Image: Wikimedia Commons

The founder effect was famously at work in the finches Darwin described on the Galapagos Islands. He noted that finches living on different Galapagos Islands appeared to share a common ancestry, but had developed different characteristics once isolated on their separate islands. As with the Eravana rathtars, the finch populations on the different islands were established by a small number of individuals from the larger mainland finch population.

Over time, the differences in genetic diversity between the finch populations weren’t the only reason why they no longer resembled each other. Darwin noted that the finches’ beaks were different sizes and shapes, based on whichever type of beak was best suited to eating the specific seeds and nuts available on their islands. Something very similar would have happened to the Eravana rathtars.

Not much is known about the rathtars’ home world, but it is likely quite different from the metallic confines of a freighter. As a result, the Eravana rathtars would have also gone through the same process as Darwin’s finches, known as adaptive radiation, in order to fit into their new environment.

With successive generations, rathtar traits would naturally be selected to help them thrive on the Eravana, like smaller body size to fit through the narrow corridors and more nimble tentacles to better snack on unsuspecting Corellian scavenge rats. Given enough time, the Eravana rathtars might even become so distinct from rathtars in general that they would form a completely new species!

So if you, like me, are concerned about the marooned rathtars, worry no more! Thanks to natural selection and a presumed rat infestation, they aren’t lost and lonely, drifting aimlessly in space; the Eravana rathtars are the proud founding fathers of a new species in a galaxy far, far away.

Nectar of the Gods


The daughters of Aegir, the Norse god of the sea and brewer to the gods of Asgard, brew ale in this image from a 19th century Swedish translation of the Poetic Edda. The frothy head on a pint of ale is reminiscent of sea foam over waves, likely leading to the connection between the god of the sea and beer.

Since antiquity, civilizations the world over have reveled in the ability to make alcohol, praising their gods for sending them the heavenly nectar that makes a man brave and his companions attractive. However, it wasn’t until some years after Antoni van Leeuwenhoek discovered his animalcules that people began to suspect that it was neither Aegir nor Mayahuel they had to thank for their mead and octli, but a microbe: Saccharomyces, or, as it’s more commonly known, yeast.

For millennia, people have taken advantage of the byproducts of a process called “fermentation.” The word fermentation has its origins in the Latin word fervere, “to boil,” as a result of the observation that bubbles form in fermenting substances, as if they were boiling. Alcoholic fermentation is one of several forms of a metabolic process called “anaerobic respiration.”

In cellular biology, respiration is the process by which nutrients, including sugars, are converted into molecules that are useful sources of energy for a cell. Anaerobic respiration is respiration that occurs when oxygen is absent.

While human cells also carry out anaerobic respiration, it is a different process from yeast’s alcoholic fermentation. In a massive blow to efforts to improve cardiovascular health, it turns out this form of fermentation, “lactic acid fermentation,” ends with muscle cells producing lactate, not alcohol. Thus, rather than getting progressively drunker during exercise, a person is left utterly sober and with a burning sensation in their muscles.

The hijacking of alcoholic fermentation by unwitting humans began as early as 7,000 years ago, likely accidentally. A careless farmer sealed an improperly cleaned jar containing fruit or grains, and resident microbes were happy to snack on what had been stored inside. When the jar was opened, the unsuspecting consumers would have been pleasantly surprised to find a fizzy, liquid mess that made their heads buzz.


A fermentation vat at the Woodford Reserve distillery. Fermentation is kick-started with the addition of yeast to the mash. The froth at the surface of the mash is composed of carbon dioxide bubbles produced during fermentation.
Image: Ken Thomas

These happy accidents have since been refined to an art form. Even before we had any awareness of microbes’ existence, much less of the critical role they play in fermentation, early brewers, vintners, and distillers discovered how to create an ideal environment for alcohol production through extensive trial and error.

In addition to aiding in the release of sugars, heating the sugar sources and water to high temperatures before fermentation kills microbes that could compete with the yeast. These competing microbes could digest the sugars into compounds that, at best, affect the alcohol’s flavor and, at worst, are toxic. Brewers also found the ideal fermentation temperatures at which yeast happily produces large amounts of “good” alcohol without gorging itself and producing off-flavors. Even the tradition of crushing grapes for wine by stomping on them may have helped along the fermentation process by transferring microbes, including yeast, from bare feet onto the crushed fruit.

The most commonly used brewer’s, distiller’s, and vintner’s yeasts are strains of Saccharomyces cerevisiae, the same species of yeast used as a leavening agent for bread. Different strains of S. cerevisiae have been cultivated for characteristics that suit them to different uses. Baker’s yeast produces carbon dioxide very quickly and very little alcohol, making it ideal for light and fluffy dough, while brewer’s yeast is less aggressive, producing gas at a slower rate and allowing greater amounts of alcohol to accumulate.


Laboratory Baker’s Yeast, a strain of Saccharomyces cerevisiae, grown on an agar plate.
Image: Rainis Venta

By the mid-eighteenth century, it was clear that the substance then simply called “ferment” was required to make both bread and alcoholic drinks. However, it wasn’t yet known that this substance was a living organism. Several of the most important minds in the foundation of modern science took up the quest to discover how fermentation takes place, leading to several key discoveries in the early days of biochemistry.

Antoine Lavoisier, a chemist widely regarded as the “father of modern chemistry,” was the first to describe fermentation quantitatively. He showed that roughly one-third of the sugar added to a fermentation reaction is oxidized into carbon dioxide (hence the bubbles found in fermenting liquids), while the remaining two-thirds is reduced to alcohol. Lavoisier further found that “ferment” was absolutely necessary to catalyze this reaction, and that it remained unchanged from start to finish.
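Lavoisier’s one-third/two-thirds split was an early measurement; the modern mass balance of the overall reaction (one glucose to two ethanol and two carbon dioxide) is easy to check from standard atomic weights, and comes out closer to an even split by mass. A quick sketch:

```python
# Overall alcoholic fermentation: C6H12O6 -> 2 C2H5OH + 2 CO2
# Mass balance from standard atomic weights (Lavoisier's own
# one-third / two-thirds figures were early, rougher estimates).
C, H, O = 12.011, 1.008, 15.999

glucose = 6 * C + 12 * H + 6 * O   # ~180.16 g/mol
ethanol = 2 * C + 6 * H + 1 * O    # ~46.07 g/mol
co2     = 1 * C + 2 * O            # ~44.01 g/mol

ethanol_share = 2 * ethanol / glucose
co2_share = 2 * co2 / glucose

print(f"ethanol: {100 * ethanol_share:.0f}% of sugar mass")  # ~51%
print(f"CO2:     {100 * co2_share:.0f}% of sugar mass")      # ~49%
```

That the two products account for essentially all of the sugar’s mass is exactly the kind of conservation-of-mass bookkeeping Lavoisier pioneered.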


Saccharomyces cerevisiae undergoing gemmation (upper left), as viewed by DIC microscopy.

From the microbe’s point of view, the alcohol and carbon dioxide produced during fermentation are actually harmful waste products to be discarded. As more alcohol is produced, it eventually reaches levels that are toxic to the yeast. This is why beer and wine have on average 5% and 12% alcohol contents, respectively; any higher and the yeast begins to die. (Spirits begin their lives at similar alcohol contents, but are then distilled to higher concentrations.)

In 1835, Charles Cagniard de la Tour took Lavoisier’s work a bit further by showing that yeast multiplies during fermentation through a process he called gemmation. This process, also called “budding,” is a form of asexual reproduction in which a cell grows a protrusion, or bud, that pinches off as a new, genetically identical cell.

Just over twenty years later, famed microbiologist Louis Pasteur showed that not only does yeast multiply during fermentation, but also that this multiplication and the production of alcohol occur in parallel. Additionally, he showed that the yeast had to be alive for this process to occur; if he boiled the yeast before adding it, fermentation never began.


Albert Edelfelt’s 1885 portrait of the father of microbiology, Louis Pasteur.

Though these observations seem obvious to those with a modern understanding of microbiology, they were important discoveries at the time. Together, these facts indicated that fermentation is a direct result of yeast replication and growth. Pasteur thus showed for the first time not only that fermentation is carried out by a microorganism, but that it is a process required for that microorganism to live and thrive.

Pasteur went on to refine our understanding of fermentation, elucidating the process step by step. He also coined the term “anaerobic” after establishing that this kind of fermentation occurs in the absence of oxygen.

Though years of study have taught us that alcohol wasn’t bestowed upon us by the great god Bibulous, we should still remember that it is a gift. The next time you meet your friends out for a drink, raise your glass to Saccharomyces cerevisiae, the microbial benefactor of booze lovers everywhere. 

You Know Plenty, John Snow


An 1856 autotype portrait of Dr. John Snow, the Father of Epidemiology, and his impressive sideburns.
Image: Wikimedia Commons

Even the most casual Game of Thrones fan knows Jon Snow, the illegitimate son of the late fan favorite, Ned Stark. As a man of the Night’s Watch, Jon Snow is charged with defending Westeros from the terrors beyond the Wall, including keeping Wildling invasions and the terrifying Others (White Walkers) at bay.

However, many of you have probably not heard of another John Snow who went down in history as a great defender of men. Now known as the Father of Epidemiology, this John Snow was a 19th century doctor and revolutionary thinker who bucked the established paradigm of then-modern thought.


In the mid-1800s, Soho in London’s West End was not exactly the height of fashion. It was a squalid, densely populated working-class neighborhood, largely neglected by the well-to-do of London. At this time, Soho’s residents depended on poorly maintained cesspits or the River Thames for waste disposal, rather than a well-planned sewage system.


The poor quality of Thames drinking water depicted in William Heath’s 1828 cartoon, A monster soup, commonly called Thames Water.
Image: Wellcome Images

This lack of basic water and sanitation infrastructure meant that the many residents of Soho were swimming in their own filth. Accounts from this time describing cellars three feet deep in human excrement and nearly opaque drinking water the color of green tea feel ripped from Martin’s pages on Flea Bottom, with the glaring difference that these accounts are not works of fiction.

A member of both the Royal College of Surgeons of England and the Royal College of Physicians, John Snow not-of-Winterfell fought to combat poor hygiene and disease. Though he was by training an anesthesiologist (he personally administered chloroform to Queen Victoria during the birth of her two youngest children), recent cholera outbreaks in Soho attracted his attention. The end of August 1854 had brought with it a particularly bad bout of this disease, killing 127 people within the first three days of the outbreak.


A transmission electron microscope image of the bacterium that causes cholera, Vibrio cholerae.
Image: Tom Kirn, Ron Taylor, Louisa Howard – Dartmouth Electron Microscope Facility


Today, we know that cholera is spread through food and water contaminated with feces containing the bacterium Vibrio cholerae. In 1854, however, the idea that living, microscopic organisms cause infectious diseases, called the “Germ Theory of Disease,” would not have been accepted by the medical profession, as Louis Pasteur did not come up with it until 1861.

The “Miasma Theory of Disease” that prevailed at the time had existed since antiquity. It held that foul, poisonous air made people sick; in fact, this idea is how the disease malaria got its name, from the medieval Italian for “bad air.”

Though John Snow knew as little about infectious microbes as anyone else at the time, his observations about the patterns of the spread of disease led him to believe that bad air had nothing to do with it. He traveled from house to house, collecting data on the sick and the deceased in order to map out the source of the outbreak.

By asking questions about people’s habits and behavior, he found that the common connection between many of the sick was that their main source of drinking water was a water pump that sat on the corner of Broad Street and Cambridge Street.


John Snow’s map of cholera cases in Soho. Each black bar represents one case of cholera. The Broad Street pump is marked in the center of the map.
Image: John Snow, M.D., On the Mode of Communication of Cholera

By the time the Board of Guardians of St. James’s Parish met on September 7th, 1854, nearly 500 people had died from the cholera outbreak that was sweeping through Soho. Firmly believing the Broad Street pump to be the center of the outbreak, John Snow testified in front of the Board, urging them to remove the pump’s handle in order to stop further spread of the disease.


Though the Board was skeptical of his claims, they took his advice and mandated that the handle be removed from the Broad Street pump to block the source of the disease. The outbreak waned shortly after the handle was removed, but few really believed Snow’s theory.

The fact was that nobody wanted to believe that the disease was being spread through drinking water contaminated with human feces; despite the obvious evidence that the water was filthy, this thought was too vile to be readily accepted. The end of the outbreak did not help his case, either – Snow was the first to admit that the outbreak had already been on the decline before they took action.

Despite this reluctance to believe Snow’s theory that disease is contagious, evidence kept piling up in his favor against the claims of those who believed in the Miasma Theory of Disease.


George Pinwell’s 1866 cartoon, Death’s Dispensary, depicting the Broad Street Pump as the source of the 1854 Soho Cholera outbreak.
Image: CDC


Snow set out to find more convincing evidence for the connection between contaminated water and cholera infection, and discovered that those who avoided drinking from the Broad Street Pump did not become infected.

Just one block away from the Broad Street pump, workers in a local brewery and monks from a nearby monastery did not become ill, despite being surrounded by the sick. It turned out these men took a page out of the Lannister playbook; rather than drinking water, they exclusively drank alcohol, specifically the beer they brewed within their own walls.

Further investigation found that of the 535 malnourished inmates living in the terrible living conditions of a Soho workhouse, only 5 became ill. This workhouse had its own, independent source of water.

With the help of fellow contagionist Dr. Joseph J. Whiting, Snow extended his inquiries by asking residents around South London which company supplied their drinking water. They were particularly suspicious of the Southwark and Vauxhall Water Company, which could have caused cases later in the outbreak by pulling water from the Thames downstream of where cholera-contaminated sewage was dumped into the river.


A table from John Snow’s 1856 paper, Cholera and the water supply in the south districts of London, 1854, showing the connection between the Southwark and Vauxhall company’s water and deaths due to cholera.

Comparing these findings to the incidence of cholera in the south districts, Snow found that 286 of the victims of the most recent outbreak drank Southwark and Vauxhall water, whereas only 14 drank water from another company that drew its water upstream of the sewage inputs. (It was unknown which company supplied the remaining 34.)
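The arithmetic behind Snow’s comparison is simple but stark; a quick tally, using only the figures quoted above:

```python
# Deaths in the outbreak, grouped by the victim's water supplier
# (figures as quoted above; the full table is in Snow's 1856 paper).
deaths = {
    "Southwark and Vauxhall (downstream intake)": 286,
    "competitor with upstream intake": 14,
    "supplier unknown": 34,
}

total = sum(deaths.values())
for supplier, n in deaths.items():
    print(f"{supplier}: {n} deaths ({100 * n / total:.0f}% of {total})")
```

Roughly six out of every seven victims whose supplier was known drank the downstream water, a disparity far too large to blame on bad air drifting evenly over the whole district.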

Despite the evidence in favor of John Snow’s theory, the Miasma Theory prevailed until Louis Pasteur’s development of the Germ Theory a decade later.


He may not have killed a White Walker, fought off a Wildling invasion, or even managed to convince people that disease is contagious. Still, John Snow’s impact was profound. Snow’s investigations established a new way of thinking about disease; by trying to understand human behavior and find patterns surrounding the outbreak, he laid the foundation for the field of epidemiology as we know it today.

And so – it seems it’s safe to say that in spite of the staunch opposition of the Miasmatists, you know plenty, John Snow.

The Columbian Exchange


The First Thanksgiving 1621, Jean Leon Gerome Ferris

When discussing some of the darker periods of human history, one event that stands out is the devastation wreaked on Native American populations following first contact with European explorers and colonists.

Though many of these first contacts were far from peaceful, the clashes between the indigenous peoples and European invaders were not directly responsible for the majority of the losses. Rather, it is estimated that up to 95% of the pre-Columbian Native American population of 50 million died of European diseases in the years following 1492, vastly outnumbering the lives lost to violence.

It makes intuitive sense that the Native Americans should have suffered so badly as a result of encountering diseases they had never been exposed to before, but why weren’t the Europeans equally affected? Weren’t they meeting the unknown, as well?


To answer this question, we’ll first need to go back about 12,000 years to the Neolithic Revolution. At this time, Old World hunter-gatherers made the transition to farming, establishing large, static communities for the first time in history.

What those new farmers didn’t realize was that they were also creating a microbe’s paradise. Within these new farming communities, the Eurasian people lived in very close contact with each other and their animals, both those they purposely domesticated and others, like rats and roaches, for which the new urban setting incidentally made a cozy new home.


Map of Paris, 1572

Flash-forward a few millennia, and those farming settlements had evolved into cities like Paris and Venice, which by the 15th century had populations numbering in the hundreds of thousands. Many diseases thrive in the close quarters of highly populated cities; it’s just easier to jump from one host to the next when hosts are constantly bumping into and sneezing on each other.


But weren’t there large, densely populated cities in the pre-Columbian Americas, too? Absolutely. Cahokia, near modern-day St. Louis, may have had a population that reached 20,000 people, and by the time the Spaniards arrived, Tenochtitlan is estimated to have had a population of around 200,000, making it a contender for the largest city in the world at that time.


Mural painted in the Palacio Nacional de México by Diego Rivera in 1945 depicting Tenochtitlan during Aztec times.


Of the world’s 14 species of large domesticated herd animals, 13 are Eurasian. The only one that called the pre-Columbian Americas home was the llama. Surprisingly, therein lies the answer to our question.

The New World civilizations domesticated fewer animals, and their contact with the ones they did domesticate was more distant than that seen in Eurasian communities. As a result, they were exposed to significantly fewer animal diseases that infect humans, or zoonoses. Many major diseases – including smallpox, measles, and influenza, which caused massive New World epidemics – are believed to have begun as zoonoses.

Due to the lack of widespread, close contact with domesticated animals, these diseases and those like them were not present in the New World. Disease didn’t wipe out the Europeans who came over to the Americas because there were very few diseases to infect them in the first place.


An illustration of Syphilis attributed to Albrecht Dürer. At this time, syphilis was called the “French Disease.” The French, in turn, called it the “Spanish Disease.” The regional recorded names for syphilis are a veritable “Who Hates Who” of that period in Europe’s history.


To say that pre-Columbian America was a disease-free utopia would be misleading. However, the most common diseases found in these populations appear to have been non-communicable diseases, those not caused by an infectious pathogen. Of the infectious diseases that were found in the Americas, the most common, like staphylococcus and streptococcus, were found worldwide, and thus were not new to the European colonists.

A devastating exception to this rule is the disease the Nahua called the huey cocoliztli, or the “Great Pestilence.” This epidemic, lasting from 1576 to 1580, is estimated to have killed anywhere between 7 and 17 million people across Mexico, native and Spanish alike.

Though the pathogen that caused the huey cocoliztli remains unknown, one likely candidate is a viral hemorrhagic fever native to the New World. This theory is strengthened by the fact that the Spanish were also highly susceptible to the disease, indicating that it had probably not been seen before in the Old World.

On a racier note, evidence is stacking up that Columbus’s crew brought syphilis back to Europe from their trip in the Americas. While…celebrating…their safe return home, they lit the spark that burned into the 1495 syphilis outbreak, the first recorded in Europe. Columbus’s crew and friends passed the torch to many famous artistic geniuses/playboys, who kept it burning (literally and figuratively) throughout Europe until the discovery of penicillin.


Several tropical diseases found in Central and South America today immediately come to mind as potentially harmful to the Europeans when they first arrived. After all, didn’t malaria and yellow fever infamously interfere with the digging of the Panama Canal?

The construction of the Panama Canal’s Pedro Miguel Locks, 1910.
Image courtesy of the Wellcome Trust

While these diseases are dangerous to New and Old World natives alike, they didn’t harm the conquistadors or American colonists for one very important reason: they weren’t there yet.

The Triangular Trade introduced malaria, yellow fever, schistosomiasis, elephantiasis, and river blindness to the Americas, all of which are still responsible for significant morbidity in South and Central America. Once there, these diseases found mosquito and snail vectors that were happy to help them become established in the New World.

Yet again, it comes down to the livestock. Blossom’s not saying she told you so, but she is looking a little smug.

As Pretty as a Milkmaid


Blossom the Cow.
Image: Edward Jenner Museum, Berkeley, UK

Whether you realize it or not, you owe a lot to a very special cow named Blossom. In 1796, Blossom was a humble dairy cow like any other on a Gloucestershire farm. Around this time, Blossom had a neighbor by the name of Edward Jenner, a well-respected physician and naturalist, who was beginning to consider an idea that first struck him some 35 years prior.

When he was 13 years old, Jenner was apprenticed to a country surgeon, where he overheard a dairymaid discussing her perfect complexion with a friend. “I shall never have smallpox,” she apparently said, “for I have had cowpox. I shall never have an ugly pockmarked face.”

Field Marshal The Right Honourable Jeffery Amherst, 1st Baron Amherst KCB, looking pensive. “To smallpox blanket, or not to smallpox blanket…”
Joshua Reynolds, 1765


The disease from which the vain dairymaid claimed protection, smallpox, is caused by Variola viruses, from the Latin varius or varus, meaning “stained” and “mark on the skin.” At that time, smallpox killed an average of one-third of those infected, and blinded or extensively scarred those that survived. It was also so contagious as to have been used in early forms of biological warfare, most infamously by General Jeffery Amherst during Pontiac’s Rebellion.

As its Latin name suggests, smallpox marks the skin, beginning with flat, discolored spots known as macules. In the most common progression of the disease, the macules develop into raised papules spread across the face, torso, and limbs, then into fluid-filled vesicles, and finally into large, painful pustules filled with debris from tissues destroyed by the virus.


Though no one yet understood why, it was well known that smallpox survivors never suffered from the disease again. People in Africa, Asia, and the Middle East independently developed techniques to take advantage of this fact, collectively called variolation.

The idea behind variolation is the same as that behind a chicken pox party – a grosser, even more dangerous chicken pox party. By inoculating a previously uninfected person with matter from the sores of an infected person, one could induce a (hopefully) mild smallpox infection. After recovering, the inoculated person would be protected from future, more severe bouts of smallpox.


Variolation as practiced in 15th-century China involved the ritualized administration of smallpox matter to the nostrils, blown through a silver pipe.

The Turks and eventually Europeans practiced variolation by sticking a lancet into an open sore of a person with a mild smallpox infection, then jabbing it into the arm of the person to whom they were trying to confer protection. The Chinese took a slightly different approach, first documented around the 15th century CE. Rather than administering the virus under the skin, the Chinese would take scabs from healing vesicles, grind them up, and snort them up their noses. Those Chinese sure knew how to party.

While variolation conferred effective immunity against smallpox, it was still less than ideal as a method of protection. Because it caused an actual smallpox infection, those who were variolated could spread smallpox to other people. The variolated were also at risk of secondary infections from the jab itself, as other bugs were glad to hitch a ride on the lancet (syphilis comes to mind). Worst of all, up to 2% of people who were variolated actually died from the resulting infection.

Despite the risks, the stark difference between a 2% fatality rate and one closer to 33% helped variolation become a widespread practice. It even made its way to America to play an important role at a critical moment of the American Revolution.
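To see why that gamble made sense, compare expected deaths for a hypothetical town of 10,000 people in which everyone is eventually exposed to smallpox, using the fatality rates quoted in this piece (the town and its universal exposure are assumptions for illustration):

```python
population = 10_000  # hypothetical town, everyone eventually exposed

smallpox_fatality = 1 / 3     # roughly one in three of the infected died
variolation_fatality = 0.02   # up to 2% of the variolated died

deaths_natural = population * smallpox_fatality
deaths_variolated = population * variolation_fatality

print(f"deaths if smallpox runs its course: {deaths_natural:.0f}")    # ~3333
print(f"deaths from variolating everyone:   {deaths_variolated:.0f}")  # 200
```

A grim calculation, but one that traded thousands of expected deaths for hundreds, which is why the practice spread despite its dangers.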


Jenner, the physician fortunate enough to be Blossom’s neighbor, realized that the cowpox-infected dairymaid bragging about her lovely skin was describing protection similar to that resulting from variolation. If protection could be conferred by purposely infecting a person with smallpox, couldn’t the same also be done with cowpox?


The Cowpox-infected hand of Sarah Nelmes.
Image: Wellcome Images

Inspired by the vain milkmaid and variolation, Jenner asked Blossom and her cowpox-infected dairymaid, Sarah Nelmes, for their assistance. Jenner took material from a sore on Nelmes’s hand and inoculated his gardener’s eight-year-old son. After a mild fever and discomfort, the boy recovered. When exposed to smallpox a few months later, he was found to have complete protection against the disease.

While today this experiment would be considered unethical at best and illegal at worst, it was the first documented case of what would eventually be called vaccination. Though Jenner did not know it at the time, cowpox is caused by the Vaccinia virus, a close relative of smallpox’s Variola. By infecting Sarah Nelmes with cowpox, Blossom had exposed her to a virus that causes a very mild disease, but one similar enough to Variola that Sarah’s immune system built up defenses against both viruses.

The key difference between vaccination and variolation is the use of a less virulent virus to induce immunity, rather than the actual, dangerous smallpox virus itself. As we have come to further understand disease and immune responses to it, vaccination has come to include the use of weakened or killed pathogens, or even just pieces of them, to stimulate a protective immune response.


Ali Maow Maalin, the world’s last known smallpox victim, in Merka, Somalia, 1977. After recovering from Smallpox, Maalin dedicated the rest of his life to the eradication of Polio.
Image: CDC


With the help of connections in London and abroad, Jenner’s vaccine had spread through most of Europe by 1800. Word of it even reached Thomas Jefferson through Harvard professor and vaccine proponent Benjamin Waterhouse, and Jefferson went on to begin the United States’ first vaccination program.

A century and a half of global vaccination campaigns later, in 1980, smallpox was officially declared by the World Health Organization to be the first, and so far only, human disease to be eradicated.


While most of the credit rightfully goes to Jenner, there should still be an International Blossom Day to celebrate the source of the Vaccinia that made up the first vaccine. Chick-fil-A would give out free sandwiches, and milk would flow like a stream down a mountainside. Alas, there’s no such love for poor Blossom.

At least our appreciation for the role she played in eradicating one of the world’s deadliest diseases is displayed in one small way: “vaccination” comes from the Latin word vacca – “cow.”