The Bright Side of Black Death

April 17, 2015

By Medical Discovery News


It’s easy to think that nothing good could come from a disease that killed millions of people. But Dr. Pat Shipman, an anthropologist at Pennsylvania State University, disputed that notion in her recent article in “American Scientist,” suggesting that the Black Death that ravaged Europe in the Middle Ages may have had some positive effects on the human population. Considering that we are in the midst of another significant plague (the Ebola virus outbreak in West Africa), we could certainly use more information about the effects of pandemics on human populations.

The Black Death, or bubonic plague, began in the mid-1300s and was caused by a bacterium called Yersinia pestis, which typically enters the body through the bite of a flea. Once inside, the bacteria concentrate in the lymph glands, which swell as the bacteria multiply and overwhelm the immune system; the swollen glands, called buboes, turn black. The bacteria can also make their way to the lungs, where they are expelled by coughing, infecting others who breathe them in. The rapid spread of the infection and its high mortality rate wiped out whole villages, causing death not only from disease but also from starvation as crops went unplanted and unharvested. The plague killed an estimated 25 to 50 million people in Europe alone, roughly one-third to one-half of the continent’s population at the time. It originated in the Far East and spread westward along the expanding trade routes linking Asia and Europe.

Today, global travel is easier than ever thanks to extensive international airline networks. Just as medieval trade routes spread the Black Death, modern transportation systems could accelerate the spread of a new plague. Of course, modern healthcare is far more sophisticated and effective, but as the latest Ebola outbreak has reminded us, a pandemic remains a realistic possibility.

Dr. Sharon DeWitte, a biological anthropologist at the University of South Carolina, recently made several discoveries by comparing the skeletal remains of people who died of the Black Death with those of people who died of other causes during the same era. First, she found that older and already frail individuals died at higher rates. By disproportionately killing the weakest members of the population, the plague acted as a strong source of natural selection.

After the plague years, she found that people in general lived longer. In medieval times, living to 50 was considered old age, yet the children and grandchildren of plague survivors lived longer, probably because their ancestors had survived long enough to pass on advantageous genes. Today, a genetic variant found in people of European descent, the CCR5-Δ32 allele, which is thought to have been favored by the natural selection the plague set in motion, is associated with increased resistance to HIV/AIDS.

Microbes have an intimate relationship with human populations and have shaped human evolution through the ages. We may see survivors of the Ebola virus passing on similarly advantageous genes through natural selection as well.


Artificial Blood

Nov. 7, 2014

By Medical Discovery News


In the series “True Blood,” the invention of artificial blood allows vampires to live among humans without inciting fear. In the real world, artificial blood would serve a more practical purpose: about 85 million units of blood are donated worldwide each year, and there is always demand for more. An artificial blood substitute that is free of infectious agents, can be stored at room temperature, and can be used on anyone regardless of blood type would be revolutionary.

That is exactly what a group of scientists at the University of Essex in England is working on, although the search for an artificial blood substitute began some 80 years ago. All red blood cells contain a molecule called hemoglobin, which picks up oxygen in the lungs and distributes it to cells throughout the body. The team’s plan is to make an artificial hemoglobin-based oxygen carrier (HBOC) that could be used in place of donated blood.

HBOCs are created using hemoglobin molecules derived from a variety of sources, including expired human blood, human placentas, cow blood, and genetically engineered bacteria. The problem is that free hemoglobin, which exists outside the protective environment of red blood cells, breaks down quickly and is quite toxic. Therefore, HBOCs are not approved for use in most of the world due to their ineffectiveness and toxicity.

The part of hemoglobin that binds oxygen, a chemical group called heme, can itself be quite toxic. Scientists have found a variety of ways to modify hemoglobin to increase its stability, but safety issues remain. When free hemoglobin, rather than whole red blood cells, is infused, the body’s natural system for handling and disposing of the molecule is overwhelmed, leading to toxicity. A person whose hemoglobin-processing system is overwhelmed may develop jaundice, which turns the skin and the whites of the eyes yellow, and too much free hemoglobin can also cause serious liver and kidney damage. That is why blood substitutes consisting of free hemoglobin have been plagued with problems, such as increased rates of death and heart attacks.

But scientists involved in this latest effort to produce a blood substitute have been reengineering the hemoglobin molecule. They are introducing specific amino acids, which are the building blocks of proteins, into hemoglobin in an effort to detoxify it. Preliminary results indicate that this approach may work. They have already created some hemoglobin molecules that are much less reactive and are predicted to be less toxic when used in animals or people.

If successful, this HBOC would be a universal product, meaning it could be used on anyone without wasting time on blood-type testing. It would also be sterile, free of the infectious agents that donated blood must be screened for. Instead of requiring refrigeration, it could have a long shelf life at room temperature, perhaps years, so it could be stockpiled for major emergencies and kept on board ambulances or at remote locations far from hospitals. An effective and nontoxic blood substitute is one of the medical field’s Holy Grails, and if this approach proves successful, these scientists may have finally found it.


How Clean is Too Clean?

Oct. 31, 2014

By Medical Discovery News


Common knowledge and previous studies generally agree that children who grow up in the inner city and are exposed to mouse allergens, roach allergens, and air pollutants are more likely to develop asthma and allergies. But a recent study adds a new twist: children exposed to these substances in their first year of life actually had lower rates of asthma and allergies. If the allergens were first encountered after age one, however, the protective effect disappeared.

Another study parallels this one, concluding that children growing up on farms also have lower allergy and asthma rates. Scientists argue that farm children are regularly exposed to microbes and allergens at an early age, leading to this same protective effect.

Asthma is the most common chronic condition among children. One in five Americans, or 60 million people, has asthma and allergies. In the industrialized world, allergic diseases have been on the rise for more than 50 years. Worldwide, 40-50 percent of school-age children are sensitive to one or more common allergens.

In this study, scientists enrolled 467 children from the inner cities of Baltimore, Boston, New York City, and St. Louis and followed their health from birth. The infants were tested for allergies and wheezing through periodic blood tests, skin-prick tests, and physical exams, and their parents were surveyed. The researchers also sampled and analyzed the allergens and dust in the homes of more than 100 of the subjects.

Children whose home environments included cat and mouse dander as well as cockroach droppings during their first year of life were much less likely to develop wheezing or allergies than children who were not exposed to these substances. The protective effect was additive: children exposed to all three allergens were less likely to develop wheezing than children exposed to two, who in turn were more protected than children exposed to only one. Only 17 percent of children who lived in homes with all three allergens experienced wheezing by age three, compared with 51 percent of children who lived in homes without them. Interestingly, dog dander had no protective effect against the development of allergies or wheezing.

The richness of the bacterial populations children were exposed to enhanced this protective effect, which suggests that household pests may be the source of some of the beneficial bacteria in the inner-city environment. Early exposure to allergens and certain bacteria together provided the greatest protection.

An infant’s microbiome, the total makeup of the bacteria in and on the body, develops during the first year of life. The bacteria colonizing an infant’s gastrointestinal system affect the immune system and influence the development of allergies. Scientists hypothesize that something similar may be happening in the airways and lungs, since children with asthma have altered bacterial populations in their respiratory systems.

There is mounting evidence that exposure to allergens and bacteria in the first few months of life helps shape the respiratory health of children. But we do not yet know how specific allergens and bacteria produce this protective effect, or how they could be used to treat children and reduce their chances of developing allergies and asthma.
