Lessons from Atabrine
I keep a small collection of malaria-related artifacts from the World War II era, including original bottles of the synthetic quinoline derivative Atabrine (quinacrine). Atabrine was used to prevent and treat malaria among Allied soldiers during the war, particularly in the South Pacific theater, but also in North Africa, Italy, and elsewhere the disease posed a threat.
Facing shortages of quinine, military officials stockpiled Atabrine early in the war despite knowing of its serious side effects, which not uncommonly included disturbing psychiatric symptoms such as nightmares, panic, and hallucinations.
Eager to find a safer alternative, the U.S. government funded a massive, Manhattan Project-like drug development effort during the war years to synthesize and test alternative compounds. The Johns Hopkins University played a critical role in this effort, from which later emerged the related quinoline drug chloroquine, which became the mainstay of malaria prophylaxis and treatment for decades. In later years, other quinoline derivatives, including primaquine and Lariam (mefloquine), also emerged from the legacy of this Hopkins-affiliated wartime program. In the more than seven decades since, hundreds of millions of people worldwide have taken these quinoline-derivative drugs, and countless lives have been saved from malaria as a result.
Yet while these quinoline derivatives are undoubtedly highly efficacious against malaria, we now understand that they are also active in the human central nervous system, and in some cases they may act as idiosyncratic neurotoxicants, causing neurological and mental health sequelae that outlast the drug’s presence in the body. Mefloquine recently became the subject of an FDA “black box” labeling change warning of potentially “permanent” effects from the drug. U.S. military officials have also expressed concern that the drug’s effects could, in some cases, confound the diagnosis and management of military-related disorders such as posttraumatic stress disorder (PTSD) and traumatic brain injury (TBI). As a result, the U.S. military has gradually moved to eliminate the drug from routine use, replacing it with safer, better tolerated non-quinoline alternatives.
While Atabrine contributed directly to victory in World War II, there is evidence that, as with mefloquine, it may have left behind a hidden epidemic of psychiatric symptoms among veterans that is only now beginning to be understood. Although tragic, the difficult lessons we learn from our veterans may help inform future decisions governing the broader use of this class of drug. Because malaria remains such a threat worldwide, use of these drugs will likely continue, but their rational use will now need to balance their reduction of malaria morbidity and mortality against the increasingly well-understood risk of neurological and psychiatric sequelae, particularly where potentially safer alternatives exist.
Remington is a former Army public health physician now pursuing a DrPH degree in the Department of Mental Health. He researches various topics in military mental health, including the mental health effects of antimalarial drugs.