Friday, January 21, 2011

People Get Ready


On Sunday, October 30th of 1938, in honor of Halloween, Orson Welles famously narrated a radio adaptation of H.G. Wells’ The War of the Worlds. Not all listeners tuned in punctually enough to catch the disclaimer offered at the commencement of the show. Apparently, hearing tales of a Martian invasion on the radio told largely in the form of mock news bulletins left some of these people confused and frightened, believing that our planet was actually being attacked by space aliens.*

In our current century, it is even more challenging to tell the difference between reality and parody, and so when a friend at work presented me with a Guardian article entitled “Earth must prepare for close encounters with aliens, say scientists”, it took a few minutes of research to conclude that it was not a satire, or at least not a completely fabricated one. The British newspaper’s sensationalized headline was (rather loosely) based on a 2010 discussion meeting held by the Royal Society.† The meeting provided the content for a recent themed issue of the society’s journal, Philosophical Transactions of the Royal Society A, which bore the more demure title “The detection of extra-terrestrial life and the consequences for science and society”. Basically, it’s a very special what-if edition that speculates on the kind of effect knowledge of life on other planets might have on the societies, religions, etc. of our own planet. You see, it’s not that we’ve discovered aliens or think we’re about to discover aliens, it’s just that, hmm, it’s good to be prepared?

Does the Royal Society have anything newsworthy to say on the subject? It depends on your definition of newsworthy. The Guardian’s headline starts to make a bit more sense after reading Martin Dominik and John C. Zarnecki's introductory paper. Here we learn that there is a protocol for how to respond to the possible detection of extra-terrestrial life – approved by several international associations with lackluster acronyms who are somehow involved in astronomy – but that the protocol holds no legal power. The authors therefore suggest that the UN get involved.

Much debate revolves around whether attempting to contact extra-terrestrials is really such a good idea, with various other articles pondering the repercussions of such contact. The Search for Extra-Terrestrial Intelligence (SETI) has already spent half a century scouring the galaxy for electromagnetic signals, mostly radio waves. It hasn’t found anything. However, if it did, Earth would have to decide what to do about it. Furthermore, there is nothing stopping us from throwing out signals that might be detected by other faraway life-forms.

The paleontologist Simon Conway Morris devotes his article to speculating on what kind of life we might find on other worlds. Being rather fond of the idea of evolutionary convergence, Morris predicts that intelligent extra-terrestrials would be biologically similar to our own human species and that, given our propensity to violence, this is somewhat worrisome. The Guardian piece makes much of Morris’ suggestion that we “prepare for the worst”, without noting how often he states that it’s more likely we are alone in the universe.

As would be expected in an issue devoted to what might happen if something else were to happen, Morris is not the only one buffering his argument with caveats. Nobody is making any grand claims about the inevitability of visiting or being visited by intelligent life-forms from other planets in the next decade. The authors of the introduction begin their conclusion with the sentence, “So far, there is no scientific evidence for or against the existence of life beyond Earth.” While the journal’s special issue may have some interesting philosophical arguments, there’s little scientific information to be gleaned from it, and certainly no news (recall that the actual meeting took place last year).

But does that really matter? 2011 has already seen a generous serving of questionable, news-that’s-not-actually-new stories fueled by online social networking sites. I hadn’t even finished sorting through my post-holiday emails at work when I heard the shocking tale of bird deaths in Arkansas. The feathery corpses were attributed to everything from chemical pollutants to the coming apocalypse. Gradually the story that emerged was something conservationists have known for ages: lots of birds die, every year, often by such banal and non-menacing methods as flying into windows. Similar events probably occurred last year as well, it’s just that your mom wasn’t on Twitter back then.

And then there was the zodiac fiasco. Earlier this month, astronomy instructor Parke Kunkle caused an unexpected stir when he revealed, during an interview for the Minneapolis Star Tribune, that our view of the constellations during any given month has been gradually shifting throughout the several thousand years following the establishment of our current zodiac signs. Of course, this was only news to those with less knowledge of astronomy than a student taking an introductory community college course on the subject, which apparently is most of us. Much panic ensued. Suddenly everyone was having identity crises over their incorrectly-assigned, zodiac-based personality traits,‡ and those unfortunate individuals born between November 30th and December 17th were coming to grips with the possibility of being born under a sign named after a thirteenth constellation: Ophiuchus. With astronomers everywhere shrugging and saying, “What? What’s the problem here?” astrologers had to step in and do some speedy damage control to calm the distressed and disheartened public.
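The drift Kunkle described is ordinary axial precession, and the arithmetic behind the panic fits on a napkin. Here is a back-of-the-envelope sketch in Python (the ~25,772-year precession period is the standard figure; treating the zodiac as having been fixed roughly 2,500 years ago is my own round number, not a claim from the interview):

```python
# Earth's axis completes one precessional wobble in about 25,772 years,
# so the equinoxes drift through the constellations at ~1 degree per 72 years.
PRECESSION_PERIOD_YEARS = 25_772
degrees_per_year = 360 / PRECESSION_PERIOD_YEARS  # ~0.014 degrees/year

# Rough age of the twelve-sign Babylonian zodiac.
years_since_zodiac_fixed = 2_500
shift_degrees = degrees_per_year * years_since_zodiac_fixed

# Each zodiac sign spans 30 degrees (360 / 12), so the sky has slipped
# by more than a full sign since the dates were assigned.
print(round(shift_degrees))  # ~35 degrees, vs. a 30-degree sign
```

In other words, a shift of about one whole sign, which is exactly the mismatch that sent everyone scrambling for new horoscopes.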

Frankly, I’m a little surprised the Guardian’s space alien article didn’t cause more mayhem, or at least a few ripples of hype. It took me several searches to coax Twitter into telling me anything at all about the would-be scandal, but then I was using big words like “extra-terrestrial” and “Royal Society”. What finally did the job was “Guardian aliens”. The posts didn’t seem especially alarming. Most just provided a link to the article along with one of its more over-the-top quotes. Perhaps society is getting savvier in processing its non-news. Who knows, if we weather enough horoscope restructurings, maybe we’ll even be ready to handle extra-terrestrial life when/if the Royal Society has some actual news to deliver. Maybe. But probably we’ll freak out.


* The actual degree of panic caused by Welles’ show is now said to have been largely overstated by journalists of the time, but it’s an amusingly exaggerated tale so I won’t bother to cast any further doubt on its veracity.

† Short for The Royal Society of London for Improving Natural Knowledge, established 1660. Isaac Newton was their president for over 20 years, so they’re pretty legit. The group’s journal is divided into two publications: A, for the physical, mathematical and engineering sciences, and B, for lowly biology.

‡ Fear not, your horoscope is even less tethered to actual science than you previously believed. Unless you’re into something called sidereal astrology, your zodiac sign is based on sets of dates named after constellations rather than the periods in which those constellations are visible. I’m still an Aries. You’re still whatever it is you are. It’s going to be okay.

Friday, January 14, 2011

Hot Topic



For a city not known for its winters, Austin has been putting in a good effort this week. Daytime highs are struggling to get above 40°F and nighttime lows have dipped to temperatures I would prefer not to discuss. It’s cold out there. I plan to make soup later this evening, as few things in this world taste better than a big bowl of soup on a freezing January day. Probably I’ll opt for gazpacho, because nothing compensates for the brutality of winter like a heaping serving of raw puréed tomatoes mixed with seasonings and then chilled….Did I lose you? Were you hoping for hot soup? Well you’re not alone in your preferences. Though I’ve not yet been to Spain, I’m told that it’s difficult to find a bowl of gazpacho during frostier months, the reasoning of restaurateurs being “what lunatic would want to eat cold soup in the middle of winter?” But not everyone agrees that winter gazpacho is such folly. Raw foodists, loosely defined as those who consume 75-100% of their food uncooked, would readily choose cold soup over boiled minestrone, arguing that the former is nutritionally superior. Raw food is a growing and lucrative branch of the restaurant business (as demonstrated by the exorbitant price of my favorite Daily Juice smoothie), but supporters of broiling and barbecuing aren’t swayed by the sales pitch. It was one of these individuals who brought to my attention the belief that cooked food was instrumental in human evolution. “I don’t get these raw food people,” my source lamented. “Don’t they know that eating cooked food is what allowed us to develop larger brains than other animals?” This alleged knowledge was news to me, but as my boyfriend would say, the idea “Googled well”* and so here we are….

Heat Wave
Here is the claim in a (dry roasted) nutshell: humans have a larger brain, relative to body size, than other mammals. Large complex organs are energetically costly to maintain, and yet our species’ basal metabolic rate† is not significantly higher than that of similarly-sized animals with smaller brains. Since higher metabolisms are not fueling our enormous brains, the additional energy required for such stately organs must be coming from somewhere else, and that somewhere else, according to certain biologists, is a decreased gut size.‡ The human gastrointestinal tract is about 60% smaller than expected for a primate of our size. Supporters of the “expensive-tissue hypothesis” believe that the innovation of cooking, which increases the available calories in plant-source foods, drove this important change in our anatomy.§ Over time, they argue, with less effort needed to digest tough fibrous vegetation, the gut shrank and the brain grew, eventually yielding the dimensions of our current species, Homo sapiens. Furthermore, because of these changes, it is now difficult for us to obtain enough energy (calories) from raw food sources alone.

You Can’t Start a Fire Without a Spark
If you heard a supporter of this hypothesis speak on the subject, they might present it as though it were a given, possibly a launching point for another argument. But the idea is far from universally accepted. One problem is that scientists don’t agree on how long humans and their ancestors have been able to control fire. Some estimate that the technology arose about 250,000 years ago, some 600,000 and others have suggested dates as far back as 1.9 million years. This is a rather important detail to sort out. Homo sapiens as a species is purported to have been in existence for only about 200,000 years. Cooking, and thus fire, would need to predate this considerably if it is to be the accepted cause of our being the smartypants species we are today.

Previous connections between diet and physical proportions have focused on the adoption of meat consumption. While meat is more difficult to chew in its raw form (and more likely to be teeming with bacteria), the availability of its calories is not strongly affected by cooking, which makes this hypothesis less dependent on the ability to get a campfire going.

Other critics have cited non-human species that have smaller guts or higher metabolic rates yet have failed to develop large brains, as evidence that the connection between these traits is not an obvious one.

Smoke and Mirrors
Meanwhile proponents of raw foodism, avoiding all this evolutionary biology nonsense, maintain that cooking food removes valuable nutrients and replaces them with various toxins. They’re right, of course, to some extent. We’ve all heard before that the delicious charred stuff on grilled vegetables (and meat, if you’re into that) and toasted marshmallows contains carcinogens. Additionally, as any raw food website will cheerfully inform you, cooking creates dietary advanced glycation end products (dAGEs), which are believed to contribute to maladies such as diabetes and cardiovascular disease. However, the increase** in dAGEs varies among food types and cooking techniques. Meat is more affected by heating than are vegetables, and high-temperature/low-moisture cooking environments have the greatest potential for harming food items of either ilk. In terms of dAGEs, boiling is not such a big deal, and a crock pot is barely worse than a dehydrator (though I still refuse to purchase either of these silly devices).

It bears mentioning that certain vegetables are toxic in their raw form. Recently, I helped to prepare taro root, a tuber unfamiliar enough to me that cooking it required Internet research. It turns out that taro is not only inedible when raw (thanks to calcium oxalate) but that it really shouldn’t even be touched with bare hands until it has at least been microwaved, as it makes some people’s skin itchy. Other raw items to avoid adding to your salad in mass quantities include parsnips, kidney beans, buckwheat greens and, of course, raw chicken, tempting though it may sound.

Too Many Cooks
A major downside of trying to figure out what happened thousands and millions of years ago is that we can’t actually do experiments to confirm our hypotheses. We can argue about interpretations of the fossil record, but we can’t just retreat to the lab and subject animals to similar conditions for a few million years to observe what happens. Nobody has that kind of funding. Most educated people accept the theory of evolution by natural selection, but specifics of cause and effect along the road to the present are difficult/impossible to prove. It may be fun to speculate about possible explanations, but there’s no getting around the fact that we weren’t there. Multiple hypotheses exist for numerous anatomical quirks. For instance, the existence of lactose intolerance after weaning in some populations has been attributed to more than one possible causative factor. One argument states that the loss of the lactase enzyme is simply another limited-energy issue: why continue to produce the enzyme if there was no use for it in the pre-agricultural era? But another argument attributes the discarded enzyme to the “parent-offspring conflict” – the idea that parents, who wish to reserve enough resources to produce more than one child, have different goals than their children, each of which cares only for its own survival and would gladly postpone weaning indefinitely if it were enzymatically possible. Which hypothesis is correct? I have no idea. Just pick your favorite viewpoint and hope for the best.

What’s For Dinner?
One way of testing at least part of the expensive-tissue hypothesis, the idea that humans now lack the ability to get enough energy from uncooked food, would be to see if anyone can thrive on a raw food diet. This is harder to establish than you would think. Not a lot of studies have been done on the subject and, as with so many human health studies, they are correlation studies rather than laboratory experiments. The subjects in articles on raw foodism are people who chose this diet rather than having it randomly assigned to them in a double-blind controlled study. Some studies have found that certain nutritional deficiencies exist in those adhering to raw food diets. Low B-12 levels and low serum HDL cholesterol (the “good cholesterol” you often read about) were observed in one study. Another reported low body weight and lack of normal menstruation in women. Also bothersome is the fact that not many people eat a 100% raw diet (thus the 75-100% guideline for qualifying as a raw foodist). To my knowledge, a long-term study on humans consuming exclusively raw foods has yet to be done.

But would a diet of entirely cooked food be a good thing? While cooking may have helped our ancestors get enough calories in times of scarcity, most of us live in quite different conditions today. Easy calories abound and humans are now more likely to be malnourished than undernourished. Consumers of mostly cooked foods tend to be deficient in vitamins and fiber, a problem that has been linked to more diseases than I have the patience to list. The same article that reported low HDL cholesterol levels in raw foodists also reported low LDL levels (aka “bad cholesterol”) in the same subjects. Cooked starches and meats may be a fine way to avoid starvation, but they don’t necessarily promote longevity. Recall that for an adaptation to succeed it need only help its bearers live long enough to produce and raise offspring. Fitness in old age is a luxury of modernity.

So, as is often the case, the best route might be a compromise between two extremes. A varied non-partisan diet of cooked and raw foods may be the most sensible solution to our dining dilemmas. Soup and salad rather than soup or salad.


* More novel slang for you. To “Google well” means to garner enough hits when typed into one’s search engine to merit further investigation. It can also be used in the negative to express skepticism when receiving information second hand: “That doesn’t sound like it would Google well.”

† Basal metabolic rate is the minimum amount of energy required by an organism just to sit still and not die (running around costs extra). It is higher in mammals like ourselves than in reptiles due to our sophisticated physiological methods of thermoregulation.

‡ The crude term “gut” refers to the alimentary tract, which includes the stomach as well as the various portions of the intestines.

§ The cell walls of plants are made of the polysaccharide cellulose, which is a real pain to break into smaller molecules. Cows have bacteria-produced enzymes to accomplish this, but humans are less fortunate. However, heat can also break cellulose into smaller, more digestible units.

** dAGEs exist in raw foods as well and are especially high in meat.


Who told you this?

Wrangham, R. and Conklin-Brittain, N. 2003. “Cooking as a biological trait.” Comparative Biochemistry and Physiology 136: 35-46.

Aiello, L. and Wheeler, P. 1995. “The Expensive-Tissue Hypothesis: The Brain and the Digestive System in Human and Primate Evolution.” Current Anthropology 36: 199-221.

Krebs, J. R. 2009. “The gourmet ape: evolution and human food preferences.” American Journal of Clinical Nutrition 90: 700S-711S.

Pennisi, E. 1999. “Did Cooked Tubers Spur the Evolution of Big Brains?” Science 283: 2004-2005.

Gibbons, A. 1998. “Solving the Brain’s Energy Crisis.” Science 280: 1345-1347.

Uribarri, J. et al. 2010. “Advanced Glycation End Products in Foods and a Practical Guide to Their Reduction in the Diet.” Journal of the American Dietetic Association 110: 911-916.

Garcia, A. L. et al. 2008. “Long-term strict raw food diet is associated with favourable plasma β-carotene and low plasma lycopene concentrations in Germans.” British Journal of Nutrition 99: 1293–1300.

Koebnick, C. et al. 2005. “Long-Term Consumption of a Raw Food Diet Is Associated with Favorable Serum LDL Cholesterol and Triglycerides but Also with Elevated Plasma Homocysteine and Low Serum HDL Cholesterol in Humans.” The Journal of Nutrition 135: 2372–2378.

Koebnick, C. et al. 1999. “Consequences of a Long-Term Raw Food Diet on Body Weight and Menstruation: Results of a Questionnaire Survey.” Annals of Nutrition & Metabolism 43:69-79.

Sunday, January 2, 2011

Species of the Month: JANUARY

In some ways “Species of the Month” was not the best choice of titles for this column. I could have saved myself some headaches by opting instead for the term “life form” or “organism”. The decision wasn’t entirely arbitrary. After all, I wanted to be able to provide you with the exciting Latinized binomial nomenclature for these plants/animals/fungi/etc. But as a result, I now find myself discussing a creature that is essentially a subgroup of a common species.


Black is the New Gray
Black squirrels are melanistic versions of the species Sciurus carolinensis – the eastern gray squirrel. Aside from their dramatic pigmentation, they’re just like the fairer-haired members of their species. Their lifestyles are probably similar to the squirrels in your own neighborhood.* They live in trees, eat nuts and make those squirrel sounds at you when you’re leaving for work in the morning. They are “scatter-hoarders”, which means they bury food they wish to store for later in numerous locations rather than in one or two well-guarded caches. Sometimes they bury their leftovers in your outdoor potted plants, uprooting a perfectly good basil plant in the process. For this and other similar offenses, squirrels are often viewed as pests by humans. As with the eastern grays, black squirrels occupy portions of the Eastern and Midwestern United States, as well as southeastern Canada. I recall seeing them in Manhattan’s Union Square Park on more than one occasion (though it’s also possible that the animals I observed were merely gray squirrels with a heavy coating of soot). And, sometime during the past century, the black squirrel found its way to the British Isles.

Controversy From Across the Pond
Great Britain has its own native species of squirrel, the red squirrel (Sciurus vulgaris), and the British were not pleased when our gauche American gray squirrels escaped into the countryside (after being introduced in captivity by some thrill-seeking noble) and largely out-competed the dainty red squirrel. In recent years black squirrels, whose presence in the wilds of the UK also resulted from captive-living mishaps, have grown in population and now threaten to outnumber even the country’s detested gray squirrels. Much indignant speculation as to the cause of the demographic shift ensued in the British press. A 2008 article in The Daily Mail claimed that the “mutant pack of black squirrels” was nudging out grays as a result of higher levels of testosterone conferred by the pigment mutation (I’ve yet to find any documentation of hormonal variations between the differently colored squirrels). Black squirrels are often described as being more aggressive, though this is based on anecdotal observation, and perhaps a dash of xenophobia. The less sensationalistic BBC wrote that black squirrels were increasing in number simply because the mutant gene responsible for their pigmentation was dominant over the wild-type allele. This is partially accurate, or perhaps “incompletely” accurate.

Incomplete Dominance
In 2009, British scientists finally rounded up some squirrels and attempted to sort out what was going on in terms of both genotype (actual genes present) and phenotype (outward appearance of the animal). They concluded that there were three, not two, coat variations among Sciurus carolinensis. In addition to the black and gray squirrels there exists an in-between phenotype: a brown-black squirrel with an orange underbelly (as opposed to the white underbelly of the original gray squirrel). Genetic testing showed these phenotypes to correspond exactly to differences in a single pair of alleles.† Using E+ to indicate the wild-type allele and Eb for the melanic allele, the authors demonstrated that the mutant version was incompletely dominant over the wild-type. That is, squirrels with two E+ alleles had the familiar eastern gray coloring, squirrels with two Eb alleles were entirely black, and those with one of each allele exhibited the mixed, brown-black phenotype. Complete dominance of the mutant gene would not yield this in-between squirrel variant. If the Eb allele were dominant, even the heterozygous animal (E+ Eb) would be all black.

Genotypes left to right: E+ E+, E+ Eb and Eb Eb
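For readers who find code clearer than Punnett squares, the genotype-to-phenotype mapping described above can be sketched in a few lines of Python (the function name and labels are my own, not the study’s):

```python
def coat_phenotype(allele1: str, allele2: str) -> str:
    """Coat color of Sciurus carolinensis under incomplete dominance.

    'E+' is the wild-type allele and 'Eb' the melanic mutant. Because Eb
    is only incompletely dominant, the heterozygote shows a distinct third
    phenotype instead of being fully black.
    """
    melanic_count = [allele1, allele2].count("Eb")
    return {0: "gray", 1: "brown-black", 2: "black"}[melanic_count]

print(coat_phenotype("E+", "E+"))  # gray
print(coat_phenotype("E+", "Eb"))  # brown-black, not black: incomplete dominance
print(coat_phenotype("Eb", "Eb"))  # black
```

Under full dominance, the middle line would have printed “black” too, which is exactly the distinction that tripped up the BBC.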

Why All the Black Squirrels Then?
If black squirrels don’t possess a quickly-spreading dominant gene or any demonstrated hormonal/behavioral advantage, then what might explain their rising numbers in the UK? Again Britain’s favorite tabloid has a thought: sexual selection. Surely stirring discomfort amongst pale English readers throughout the country, The Daily Mail suggested that female gray squirrels prefer to mate with black squirrels. However, as with claims of the blacks’ bullying barbarism, rumors of their greater success with the ladies are thus far only hearsay. If one counts the brown-black mixed squirrels as black squirrels, then the combined animals could eventually outnumber gray squirrels by gene prevalence alone. But, of course, there are always other factors to consider. Even the folks who sequenced all that squirrel DNA don’t rule out the possibility that additional genes may be involved in making gray squirrels black. It’s frontier science. If you really want to know the answer, you’ll need to get yourself some squirrels, a microscope and a bunch of expensive DNA materials. Let me know if you find anything interesting.
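To see how gene prevalence alone could tip the balance, here is a toy calculation assuming a single locus and random mating (Hardy–Weinberg proportions, which is my simplifying assumption for illustration, not something the squirrel study claims):

```python
def phenotype_frequencies(p_melanic: float) -> dict:
    """Expected coat-color frequencies given melanic allele frequency p,
    assuming Hardy-Weinberg proportions at one incompletely dominant locus."""
    p, q = p_melanic, 1.0 - p_melanic
    return {"black": p * p, "brown-black": 2 * p * q, "gray": q * q}

# Black plus brown-black squirrels outnumber grays once 1 - (1-p)^2 > (1-p)^2,
# i.e. once the Eb allele frequency p exceeds 1 - 1/sqrt(2), about 0.293.
freqs = phenotype_frequencies(0.3)
print(freqs["black"] + freqs["brown-black"] > freqs["gray"])  # True
```

So even without testosterone-fueled bullying or discerning females, a modest melanic allele frequency would leave dark squirrels in the visible majority.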


* If you live in Austin, you’re probably in the company of fox squirrels (Sciurus niger). New Yorkers are more likely to see eastern gray squirrels. And the rest of you will have to do your own research if you wish to learn what rodents inhabit your vicinity.

† A quick introductory genetics recap: Alleles are different versions of a gene, each occupying the same spot (locus) on one of two chromosomes. One chromosome (and thus one allele for the gene of interest) is inherited from each parent.