The Miracle of Penicillin

Few suspected that the quiet discovery made by Sir Alexander Fleming in his London laboratory in 1928 would snowball into an earth-shattering breakthrough. Nonetheless, snowball it did, becoming one of the most profound medical discoveries of the World War II era.

Fleming began experimenting with antibacterial substances in 1921, but it wasn’t until 1928 that he discovered the germ-killing properties of the liquid secreted by the Penicillium mold. At the time, however, he was unable to produce enough of this powerful secretion for it to be medically useful, and so his findings were set aside for several years. Fleming’s work was “rediscovered” almost a decade later by a team of scientists at Oxford, and additional research from Columbia University provided strong evidence that penicillin could effectively treat infections.

Ultimately the media’s interest in the miracle drug resulted in extensive popular press coverage of penicillin. Medical and scientific research had been increasingly popular subjects of press coverage since the 1880s. Source: ccat.sas.upenn.edu

After Japan’s attack on Pearl Harbor in 1941, the United States’ demand for penicillin was so high that it could only be met by mass production, something that Fleming had deemed impossible. That same year, the pharmaceutical company Pfizer agreed to develop a method of mass-producing penicillin and devoted the next three years to researching a solution. One of the scientists on the job, Dr. Jasper Kane, suggested in 1942 that Pfizer use the same deep-tank fermentation methods it employed in processing citric acid. The company decided to follow through with Kane’s plan and went out on a limb, investing several million dollars to convert a nearby plant for the complex production process that would be required. It took only four months to get the renovated plant into working order, and thanks to the new technology, Pfizer found itself producing over five times more penicillin than it had expected.

The miraculous antibiotic was first tested for the military in the spring of 1943, and it worked so well that by the fall, surgeons were using it on the battlefield to treat patients with life-threatening infections. Over the course of World War II, American troops received about 85% of the nation’s penicillin production, which by 1943 totaled 231 billion units. In fact, on D-Day in 1944, almost 300 billion units of penicillin went ashore with the Allied armed forces (270 billion of which were supplied by Pfizer, the nation’s leading producer at the time).

Source: ccat.sas.upenn.edu

Penicillin quickly proved itself to be one of the safest antibacterial substances on the market, an assertion that still holds true today. It helped save countless lives, earning its wartime nickname, the “wonder drug.” Indeed, the remarkable infection treatment that penicillin offered set World War II apart from any war that came before it.

The Battle Against Malaria during WWII

At the beginning of the Second World War, malaria proved heavily detrimental to the Allied forces and their fight against dictatorship. Malaria, which is transmitted by the bite of an infected mosquito and produces extremely high fevers and other flu-like symptoms, struck roughly 500,000 American soldiers over the course of the war. In addition to high fevers, excessive sweating and chills were among its common symptoms, and the disease had the power to put soldiers out of combat for a week or more.

Malaria is transmitted through the bite of a female Anopheles mosquito carrying a parasite of the genus Plasmodium. Although the American army endured several varieties of malaria, Plasmodium vivax (vivax malaria) and Plasmodium falciparum (falciparum malaria) were the most prevalent. Because vivax malaria could greatly weaken the immune systems of infected soldiers, those who contracted it were at greater risk of secondary infections that could prove fatal.

Combat medic tending to soldiers recovering from malaria in a Guadalcanal treatment facility. Source: www.nydailynews.com

As one would guess, a cure or preventative drug for malaria was desperately needed to treat the ever-increasing number of cases; by 1943, malaria was striking at a rate of roughly four cases for every Allied soldier serving in the Pacific.

Warning to American troops that the Japanese aren’t their only enemy to prepare for in the Pacific. Source: http://www.motherjones.com/photoessays/2011/08/racist-propaganda/malaria-japan-war

Following World War I but prior to World War II, controversy over how to approach malaria arose between the Malaria Commission (League of Nations) and the International Health Division (Rockefeller Foundation). The Malaria Commission believed that using drugs against malaria was the best way to combat the disease, whereas the International Health Division wanted to wipe out the vector that spread it, the Anopheles mosquito. Although the intention was to use both methods during World War II, it soon became clear that destroying Anopheles mosquitoes in the midst of combat would be very difficult. As a result, researchers focused solely on controlling malaria outbreaks through prophylactic drugs.

Due to the pressing health conditions of soldiers at war, American civilian and military scientists came together with their Allied counterparts to find a sound cure for malaria as quickly as possible. Toward the end of 1943, scientists proved that the German-developed drug atabrine could cure falciparum malaria. Atabrine was to replace quinine, the original treatment used to fight the disease. Quinine was derived from the bark of cinchona trees, and approximately 90% of its supply came from Java (an island in Indonesia); when the Japanese took over Java, that supply was swiftly cut off. However, neither atabrine nor any other drug had yet proven able to cure vivax malaria.

Propaganda used as a reminder to soldiers in the Pacific to take atabrine in order to combat malaria while fighting the Japanese. Source: http://www.microkhan.com

Determined to find an antimalarial drug that was superior to atabrine and could combat vivax malaria, American civilian and military scientists intensified their collaborative research. After numerous further trials and studies conducted by organizations throughout the nation, scientists had a breakthrough with a drug named chloroquine. In early 1945, the drug showed positive results in soldiers in combat: chloroquine not only cured falciparum malaria as effectively as atabrine did, but also successfully suppressed vivax malaria.

By the time World War II concluded, it was obvious that chloroquine had surpassed atabrine. Although malaria accounted for over a million American casualties during the war, the disease has since become far more controllable thanks to the extensive research and medical breakthroughs in antimalarial drugs made during World War II.

Blood Transfusions Save Lives

Although blood transfusion was not invented during the World War II era, its development and modernization revolutionized health care and helped save thousands of lives on the battlefield and at home, both in America and overseas.

Blood transfusions were still largely experimental during World War I, and refrigeration was not used to preserve blood supplies until 1920. The first blood bank was opened at Cook County Hospital in Chicago by Dr. Bernard Fantus in 1937. Banked blood and plasma soon made their way to battlefield surgical teams after the discovery that transfusions could help prevent wounded soldiers from going into shock.

An American medic administers plasma to a wounded soldier. Source: www.mtaofnj.org

In 1938, Dr. Charles Drew (a leading authority on mass transfusions) made the revolutionary discovery that blood plasma could successfully replace whole blood in transfusions, which was important because whole blood spoils quickly in storage. He developed a technique for separating and preserving blood plasma, a method that proved extremely helpful to World War II surgeons in countries where the death toll was high.

At first, liquid plasma was used rather than dried plasma, because time was of the essence, plasma took a long time to dry, and drying it was extremely expensive. As part of Drew’s method, the plasma was first separated from whole blood via sedimentation and diluted to a 50% concentration with a saline solution. The plasma was then vacuum-packed into a specialized glass bottle from which it would eventually be dispensed (blood would not be packaged in plastic bags until after the war).

Plasma was vacuum-sealed into glass bottles and shipped in boxes of 6. Source: history.amedd.army.mil

The American Red Cross organized its own Blood Donor Service in 1941, agreeing to sponsor a civilian blood donor program to support American soldiers overseas. It also developed a safe, sterile system for testing and storing plasma for overseas shipment. By the end of the war, the Red Cross had collected over 13 million units of blood through the program and converted almost all of it into plasma.

Propaganda from the American Red Cross concerning its blood donation services for the war effort.

After World War II, the United States continued its blood programs as a medical service available to hospitals across the nation. The system of blood banks grew rapidly, even though the Red Cross’s supply of volunteer blood was unable to meet national demand. This led to the emergence of for-profit blood collection centers, which picked up where the Red Cross left off, paying people for their blood donations and distributing the blood as plasma for both medical and research purposes. America soon became a global leader in blood and plasma services as its system grew and improved over the years.