Friday, June 8, 2012

We're moving! (Important links below)

These are not my kitties, but aren't they cute? Image: mava

Big news! As of June 2012, Blogus scientificus has taken up residence at EarthSky, already home to my "Lifeform of the Week" posts (formerly "Species of the Month" here at B. scientificus). Same great product, but now with 50% fewer typos (disclaimer: estimate not based on actual data).

The debut post is already up, and you can see all my EarthSky posts whenever you like simply by clicking this LINK. Isn't modern living easy? And, as if that weren't amazing enough, you can also follow me on Twitter (allow a few days for me to figure out what exactly one does on Twitter.)

Earlier posts will remain at this location, and you can always browse the archive if you're feeling lonely or bored. In case you're especially resistant to change, I've preemptively placated you with the above cat photo. Aww.

Thursday, May 24, 2012

Mycotoxicology smackdown: Death Cap Mushrooms vs. Milk Thistle


Images: Archenzo (L), and demott9 (R)

In the fall of 2011, four cases of death cap mushroom poisoning were successfully treated at Georgetown University Hospital (GUH) using a controversial remedy – an intravenously delivered chemical extracted from the milk thistle plant. Manufactured by the German pharmaceutical company Madaus (and sold under the somewhat ironic name Legalon), the drug has been available in Europe since the mid-1980s, but lacks FDA approval in the United States. Why are U.S. citizens being denied this wunder drug? Are we simply at the mercy of mushrooms? Can nothing be done?

As the Georgetown story unfolds, a few particularly striking points jump out. For one thing…

Poison control centers are very, very important.
When the first of the fungus-addled patients turned up at GUH – having eaten what he assumed were edible mushrooms picked from his yard – physicians quickly diagnosed him with amanitin poisoning. Amanitin, the principal toxin in the feared death cap mushroom, can be lethal. The ordeal begins with your standard food poisoning gastrointestinal woes, but can progress to organ failure, specifically in the liver and kidneys. In severe cases, organ transplants are required to save a patient’s life.

Having identified the problem, the medical team’s next move was to phone poison control. That’s right, the same people you would call if you found your child taking swigs from a bottle of laundry detergent. Poison control put the doctors in touch with Santa Cruz physician Dr. Todd Mitchell, who was conducting clinical trials of IV milk thistle (also called silibinin).

So if you’ve ever dismissed concerns about government funding cuts to poison control centers with a glib, “Ppfff, we don’t need those things, we’ve got emergency rooms,” then you might want to rethink your stance. Poison control doesn’t just handle calls from panicked civilians; it also advises the health care professionals tasked with treating panicked civilians.

But let’s get back to the story. Contacting Dr. Mitchell was far from the last step in the GUH patient’s road to recovery, because as it turns out…

Using non-approved drugs in the U.S. is NOT easy
The FDA allows experimental drugs to be tested if there is an established protocol and review board approval, such as in Dr. Mitchell’s clinical trial. This doesn’t mean that every other hospital out there can also start doling out these drugs. Luckily, the FDA does permit emergency one-time use of an Investigational New Drug (IND), which allowed the Georgetown team to procure and administer silibinin to their patient.

Mitchell himself went through the emergency IND process twice, first in 2007 and then again in 2009, to treat several mycotoxin-sickened patients (the first case was an entire family of six), before finally managing to set up his clinical trial, sponsored by Madaus.

It having been an unusually rainy season in the DC area (prime conditions for wild mushrooms), the Georgetown doctors got to work immediately writing up their protocol for any potential future poisonings. Still, an emergency meeting of the approval committee had to be called when a second patient materialized before everything was in place, soon to be followed by patients three and four. In the end, all four received the coveted silibinin and recovered without major complications (or liver transplants), and GUH is now an approved referral center for the drug.

So why isn’t every hospital in the country running silibinin trials? What are we waiting for? Well, mushroom poisoning isn’t especially common in the U.S. While Europeans have a long tradition of strolling through the woods patrolling for tasty fungi, most Americans are content to buy their mushrooms at the store. According to the journal Nature Medicine, only about 50 cases of amanitin poisoning crop up in the U.S. annually, which isn’t enough to motivate hospitals to plan ahead on the off chance that one of these unfortunate victims walks into their ER. But while we’re on the subject of clinical trials…

Does anyone know if this stuff actually works?
With all the hustling for silibinin going on, there must be some pretty strong evidence of its efficacy against amanitin poisoning, right? Well… not entirely. The anecdotal tales are certainly impressive. Patients on the brink of liver failure have reportedly perked up soon after the IV treatment began. But in the world of science, anecdotes aren’t worth much (unless they’re anecdotes about the insane lives of famous scientists, which, of course, are pure gold). The small number of people turning up with these maladies limits the scope of any trial, and denying available treatment to a control group is, well, kinda unethical.

Additionally, even in the absence of IV milk thistle, patients do receive some treatment for death cap poisoning – including intense hydration, penicillin, and activated charcoal (don’t worry, silibinin patients can get these too; it’s not an either/or deal) – and many of them survive. It’s thus difficult to determine what portion of any success story can be attributed solely to milk thistle. Silibinin supposedly works by blocking absorption of the poison by liver cells. It has also been tested against non-mushroom-related liver issues, but the data thus far are underwhelming.

Perhaps Mitchell’s trial will shed some light (it’s scheduled to wrap in late 2012), but in the meantime we can at least acknowledge that milk thistle doesn’t seem to be making anyone sicker (which is more than can be said for penicillin, which gives me hives and triggers anaphylactic shock in certain unluckier individuals.) And finally...

Are mushrooms a recipe for disaster?
Given that wild mushrooms are so potentially lethal, it seems reasonable to suggest that we leave their harvesting to the pros. But what fun would that be? My mother (who grew up in Russia, so it may be inaccurate to call her an amateur) has been picking and cooking mushrooms for decades with no reported fatalities. I’ve had them, and they’re thoroughly delicious. (Though I did spend a wee bit more time contemplating my mortality during that meal than I normally do.)

Roughly 100 species of poisonous mushrooms reside in the U.S. (out of a total of 5,000 species), so if you’re going to try your hand at the art of mushroom hunting, please do some research first. And also note that taste and smell are not indicators of whether you’ve picked an edible mushroom or a toxic toadstool. In fact, the death cap is said to be rather tasty. As aptly summarized in this Croatian proverb, “All mushrooms are edible; but some only once.”

Friday, May 18, 2012

Sore muscles? Don’t blame lactic acid.


Image Credit: mrflip

With Tim Burton’s film adaptation of Dark Shadows* currently in theaters, it seems fitting to begin this post with a classic trope from vampire comedy, “I just flew in from Transylvania…and boy are my arms tired!” Get it? Arms? Oh, never mind. For me it’s presently more the legs anyway. And it’s not so much fatigue as an excruciating soreness and stiffness of the muscles. An unsolicited preview of old age. Superficially, my suffering was created by an activity called “yard work”, which I discovered only recently and which resulted in several hours crawling around on the ground obsessively uprooting every weed in the vicinity, all in a configuration to which my limbs were apparently unaccustomed.

But what is the actual physiological cause of such aches? If you’d asked me this question a week ago, I would have answered that lactic acid was the culprit, thus making myself look like an imbecile. In case you’re laboring under similar misconceptions, let’s remedy this before any of us has the opportunity to embarrass ourselves in public.

Somewhere along the line many of us learned that lactic acid builds up in the muscles during strenuous exercise and that this causes muscle aches. The basic idea is that lactate (the predecessor of lactic acid) is a byproduct of anaerobic respiration, which is the kind of cellular metabolism that occurs when you’re pushing yourself hard enough to run out of oxygen (on a cellular level, that is, if you’re actually hyperventilating that’s a separate issue). But while lactic acid may be to blame for immediate pain, the proverbial “burn” felt during an extreme workout, it’s long gone from your system by the time the real muscle soreness sets in. This second wave pain typically shows up the next day, reaching the apex of ouch somewhere between 24 and 72 hours after you overdid it at the gym.

The phenomenon is well documented enough to be christened with the badass acronym DOMS (Delayed Onset Muscle Soreness), and it doles out pain as efficiently as its leather-clad namesakes. Science is still working to get a handle on what exactly is transpiring on a molecular level, but the most popular explanation is that DOMS is caused by damage to muscle cells. Sort of like when you sprain your ankle, except instead of one big injury you’re incurring a slew of teeny tiny injuries. As with any other injury, this triggers an inflammatory response, in which your body sends various repair-performing metabolites to the site of the problem, creating a sea of swelling, stiffness and soreness in the process.

DOMS occurs when a person is using their muscles in a way that somehow deviates from the normal routine, either by engaging muscles that typically don’t see much action (as with painting a ceiling or moving a lot of unwieldy boxes) or by ramping up the intensity of one’s existing exercise regimen (e.g., being a competitive jackass in the weight room). But you probably already noticed this trend from personal experience. What you may not be aware of is that some types of muscle movements are more likely to cause DOMS than others. So-called “eccentric” muscle contractions make for the most aches. Eccentric contractions are those in which a muscle lengthens while bracing against an opposing force. This is the converse of concentric contractions, in which shortening of muscles does the work. Imagine that the arm protruding from that building in the photograph suddenly came to life. If it continued curling that weight toward the roof, its biceps muscle would shorten (concentric contraction), but if it lowered the weight toward the sidewalk (in a controlled way, without dropping it on pedestrians), its biceps would elongate (eccentric contraction). Running downhill is a form of exercise notably rife with eccentric contractions.

The amount of lactic acid produced during the activity does not predict the severity of DOMS, so just because you aren’t in pain while in the midst of novel physical exertion, don’t assume that you won’t feel crappy the next day. The best way to avoid the late-blooming agony of DOMS is to make only incremental increases in muscle usage, allowing your body to acclimate to new demands before pushing onward. (Pretty useless advice if your main sources of soreness are all-or-nothing activity binges that you’re unlikely to repeat… weeding a lawn, for instance.)

In any event, lactic acid is the least of your problems. In fact, the human body can even use it as an energy source, at least according to this New York Times article, which explains how certain methods of exercise condition the body to use lactic acid more efficiently. Apparently, if you train like a professional athlete, you can grow massive mitochondria that suck up lactic acid like it's Gatorade, while other folks’ cells are just sitting there gasping for breath. You’ll be the envy of the marathon. Though you might still want to avoid running downhill.

* I haven’t seen it. I probably won’t see it. And I don’t need to know any additional details. Just let me have my fantasy that Tim Burton actually managed to successfully adapt one of my favorite TV shows.

A note on muscle motion: muscles only actively shorten; the lengthening is passive. When the biceps elongates, it’s due to the shortening of a complementary muscle (the triceps). So you always have both shortening and elongation happening simultaneously. But the important factor is which of the muscles is under tension. This may be one case where feeling beats thinking in terms of comprehension. Grab something heavy, then raise and lower it a few times and note which muscles feel most engaged. See what I mean?


Saturday, May 5, 2012

There's more than one way to make a blond


Image Credit: deanwissing.


Typically, if you want a look that combines dark skin with light hair there are two options. Depending on your starting point, you can either brighten your hair with chemicals like Beyoncé, or darken your skin with UV radiation à la New Jersey’s “tanning mom”. Yet on the South Pacific nation of the Solomon Islands, 5-10% of the population is just born that way. And now, a group of researchers believe they have traced the genetic cause of this unexpected blondness. And, well, big deal, because we’ve known for ages that hair color was genetically determined. Eye color too. It’s in our biology textbooks even. Nice going, science. But wait, it’s actually more interesting than you might think. It turns out that the Solomon Island blond results from a different, and simpler, genetic variation than the more familiar European brand of blond. This means that fair hair evolved separately at least two times in human history.

Prior to this recent study, which appeared in the latest issue of Science, the golden-haired inhabitants of the Pacific Islands had been the cause of much speculation. Perhaps their blondness resulted from some environmental factor, such as diet or sun exposure. Or maybe fair hair was simply imported to the region by European visitors. To solve the mystery, scientists from several universities (including Stanford, located in blond-friendly California) scrutinized DNA samples from 43 blond and 42 dark-haired Solomon Islanders. They found that the blonds did indeed have something different in their genes – a single nucleotide missense* mutation on an allele associated with pigmentation. Basically, there was a T (Thymine) where normally there would be a C (Cytosine). Further genotyping of 918 Solomon Islanders and 941 individuals from elsewhere around the globe revealed that about 26% of the Solomon Islands population carried such an altered allele, but that it was essentially absent outside of the South Pacific, including in European nations.

The findings suggest that South Pacific blondness is produced by a discrete recessive gene. It’s classic Mendelian genetics: individuals carrying two mutated recessive alleles (TT) will be blond, whereas those with two standard-issue alleles (CC) or a mixed set (CT) will be dark-haired. European hair pigmentation, on the other hand, is determined by a bunch of different genes, yielding a variety of shades like platinum blond, golden blond and dirty blond. (Or “iced champagne”, “golden sunset” and the like, if you’re browsing the hair dye aisle.)
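For the code-inclined, here’s a minimal Python sketch of that Mendelian logic (my own toy illustration, not anything from the paper). It maps the C/T genotypes above to hair color and, as a hedged aside, checks what Hardy–Weinberg would predict if the 26% figure were the variant allele’s frequency rather than the carrier rate.

```python
# Toy sketch of the recessive inheritance described above.
# "C" = common allele, "T" = Solomon Islands variant; only TT is blond.

def hair_color(genotype: str) -> str:
    """Map a two-letter genotype (e.g. 'CT') to the predicted phenotype."""
    return "blond" if sorted(genotype.upper()) == ["T", "T"] else "dark"

for g in ["CC", "CT", "TT"]:
    print(g, "->", hair_color(g))   # CC -> dark, CT -> dark, TT -> blond

# If (and only if) the 26% figure is the allele frequency rather than the
# carrier rate, Hardy-Weinberg predicts roughly 0.26**2 = ~7% blond,
# which sits comfortably in the 5-10% range quoted above.
print(f"predicted blond fraction: {0.26 ** 2:.1%}")
```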

Globally, blond hair in adults is rare, and it tends to pair with fair skin. The Solomon Islands study indicates that human evolution has generated this hair pigmentation at least twice now, and seemingly under rather different environmental conditions. Whether the flaxen-haired phenotype confers any benefits on South Pacific individuals is unknown. It seems that light hair might help keep one’s head cool in hot, sunny regions. But then you also have to hear dumb blond jokes all day. Probably it just about evens out.

* DNA single nucleotide mutations (aka point mutations) come in a few flavors. Missense mutations result in a different amino acid being produced (think accidentally typing “tap” when you meant “cap” – it’s still a word), whereas nonsense mutations create a premature “stop” signal that cuts protein assembly short (more like “ctp” instead of “cap”; spell check does not approve). There’s also something called a silent mutation, which just results in the originally scheduled amino acid, but you don’t need to worry about those for today.
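If you’d like to see those flavors side by side, here’s a quick Python toy using a few entries from the standard codon table. The specific codons are my own illustrative picks; they have nothing to do with the pigmentation gene in the study.

```python
# Toy illustration of the three point-mutation flavors described above,
# using a few entries from the standard genetic code.

CODONS = {
    "GAA": "Glu",   # glutamate
    "GAG": "Glu",   # glutamate again (different codon, same amino acid)
    "GTA": "Val",   # valine
    "TAA": "STOP",  # stop codon - translation halts here
}

def classify(original: str, mutated: str) -> str:
    before, after = CODONS[original], CODONS[mutated]
    if after == before:
        return "silent"      # the originally scheduled amino acid
    if after == "STOP":
        return "nonsense"    # premature stop, protein cut short
    return "missense"        # a different amino acid gets built in

print(classify("GAA", "GAG"))  # silent
print(classify("GAA", "GTA"))  # missense ("tap" instead of "cap")
print(classify("GAA", "TAA"))  # nonsense ("ctp" - spell check does not approve)
```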

Sunday, April 22, 2012

List-server: 5 reasons to go vegan for a year


Yep, I’m doing lists now. Why? Because it has come to my attention that humans find lists irresistible. Actually, I noticed this phenomenon some time ago whilst compulsively purchasing yet another periodical offering “The 20 Best British Singles from 1974 with the word ‘blue’ in the chorus” (or something similarly useful.) But it only recently occurred to me to incorporate the format into this blog. Lists are fun, and they help us remember which items we need to purchase at Home Depot. Trust me, you don’t want to have to go back there a second time. So are you ready to discuss veganism with the aid of numeric headers? Let’s do it then…


This past week’s New York Times Science section kicked off with a front-page article about the hardships of espousing a vegan diet. With the best intentions I’m sure, the author sympathetically cataloged the “social, physical and economic challenges” of veganism, exaggerating the difficulties while underemphasizing the many benefits. Having myself spent a couple years living la vida vegan, I can tell you that it’s not so damn hard. (And my stint sans animal products occurred in the 1990s, when products like coconut milk creamer had yet to be invented. Nowadays, it should be an Almond Breeze by comparison.) While no longer a vegan for reasons we’ll get to a bit later, I consider myself better off for having tried it. Here are a few of the potential perks awaiting those willing to temporarily part ways with cheese and eggs.

1) You’ll learn what’s in your food
While meat is relatively easy to spot, other animal products, like butter and eggs, can sneak into food unnoticed. Thus, being a vegan means mastering the art of reading food labels. Ingredient lists, particularly on processed foods, often read like chemistry lab manuals, so their comprehension necessitates doing some research. You’ll ask important questions like, “What the hell is casein?” (It’s a milk-derived protein used to make cheese, including many soy cheeses. Not vegan.) And, “What the hell is xanthan gum?” (It’s a polysaccharide used as a thickening agent. Vegan.) Eventually, you may even ask, “Why is there so much weird, unpronounceable crap in my food?”

Along the way you’ll discover other interesting details. A simple can of beans or jar of peanut butter can contain added sugar. Something as seemingly benign as hummus can harbor flavor enhancers. Who knew? Reading labels cultivates a healthy sense of outrage about the volume of superfluous nonsense added to the things we eat daily. Ultimately, this might make you opt for less synthetic purchases, leaving more room on your plate for real food.

2) You’ll cook more often, and more interesting dishes
When I began college I was a lousy cook. The most complex meals I prepared involved boiling a box of pasta or rice and tearing open the accompanying “seasoning packet.” On less ambitious days, I’d open a can of soup, throw some grated cheese on it, and heat and serve. But once cheese was off limits, canned soup tasted pretty bland and I was forced to learn to cook for real.

With many restaurants offering lackluster vegan options and processed foods often containing animal ingredients (“ugh, whey powder, I can’t use this”), the logical reaction is to cook more from scratch, using easy to manage items like vegetables, grains, beans and maybe even tofu once you pass the beginner stage. In doing so, one quickly discovers that beans and tofu don’t have a ton of flavor on their own. Thus the next step in vegan cooking is learning how to use spices. As a vegan, I acquired cumin and coriander and their ilk. I learned to make curries and peanut sauces. (Which, btw, are excellent options even for those who re-incorporate animal products into their diet.)

A lot of people approach veganism by trying to replicate their favorite animal-based foods. But the best vegan foods are alternatives rather than mere substitutes. They’re things that were vegan all along. Pseudo meat-textured veggie burgers don’t taste like beef and vegan cheese doesn’t melt. But falafel and tahini dressing are delicious and surprisingly easy to make.

3) You’ll save money
The Science Times piece makes several mentions of the financial strains imposed by a vegan diet. For instance, “…vegan specialty and convenience foods can cost two to three times what their meat and dairy equivalents do.”  But “convenience foods” are processed or pre-fab foods, which shouldn’t be the bulk of anyone’s diet. Fresh produce is expensive only when compared to fast food burritos and Walmart frozen pizza.

If you look at the raw ingredients used to make vegan and non-vegan meals, the opposite pattern emerges. At my local grocery store, a 1 lb bag of 100% vegan carrots sells for about a buck (give or take depending on whether or not you opt for the organic ones). Meat and cheese, on the other hand, can be pricey. And even the lower-quality versions of animal products are still more expensive than dried beans. I mean, check out this recipe for dahl. Main ingredient: lentils. It doesn’t get much cheaper than that.* Here’s another one for Middle Eastern chickpea stew (pro tip: throw in some green vegetables and raisins for extra awesomeness.) You’re welcome. Those hard-to-find “vegan specialty foods” are hardly essential to good vegan cooking. † And if you’re longing for the novelty of seitan (it is great for stir-fries) but live in the sticks, you can always make it yourself.

If switching to a vegan diet is driving you into debt, it’s probably an indication that there’s too much junk food in your life.

4) You’ll be thinner and healthier
In case you haven’t heard, America (along with much of the developed world) has a colossal weight problem, and with it a slew of obesity-related ailments. Also old news are the many studies finding correlations between plant based diets and lower body mass index (BMI), as well as lower LDL cholesterol levels (that’s the bad kind), lower risk of heart disease and type 2 diabetes, and lower overall cancer rates. If you don’t believe me, have a look at this American Dietetic Association (ADA) paper about vegan and vegetarian diets. In addition to pointing out the above-mentioned benefits, they conclude that such diets, when properly planned, “are appropriate for individuals during all stages of the life cycle, including pregnancy, lactation, infancy, childhood, and adolescence, and for athletes.” Yes, you read that correctly, even pregnant women and athletes can be vegans without keeling over from anemia.

However, it’s important to note that the ADA also claims that there is no sufficiently bioavailable plant source of vitamin B-12. So total vegans need to obtain this nutrient through a supplement or in B-12 fortified foods (fortified cereals, for instance). There seems to be a bit of debate over this, with some folks still insisting that dark green vegetables, or at least seaweed, can provide enough B-12. Personally, I would (and did) play it safe by taking a B supplement. ‡

5) You’ll change the way you think about food
Despite nixing the strict vegan diet ages ago, I still eat mostly plant-based foods. Having animal products off limits for a spell taught me to view them more as garnishes than necessities. Remember, meat and cheese and the like aren’t just rough on your body, they also require more resources to farm than do plants. Given the environmental impact of animal products, using them in moderation is advisable. This handy graphic from Environmental Working Group (EWG), which visualizes the carbon footprint of various protein sources by comparing them to miles driven in a car, may help put things in perspective. Note that cheese is even higher up on the shit list than pork or chicken.

So if being a vegan is freaking fabulous, why did I ever stop? Mostly, it was the “social” component of the Times’ triad of terrors. It’s enough of a pain telling friends who invite you to dinner that you don’t eat meat. Explaining that you also don’t eat eggs, butter, milk, cheese, sour cream, yogurt, and pretty much everything else they were planning to serve can easily disqualify you from future invites. It’s not ideal for travel either. While I somehow successfully navigated both the Scottish Highlands and parts of Eastern Europe as a vegan, it was rough going at times. Basically, humanity eats a boat load of animal products, so unless you have a strong moral objection to their consumption, the “When in Rome” approach gives you a lot more flexibility. Plus the occasional dash of feta cheese or fresh mozzarella is a fine, if non-essential, addition to the menu. Though I do sometimes wish I still had an easy excuse to avoid queso. §

Perhaps you’re already five for five on the above qualities. Not everyone needs a major dietary upheaval to learn to cook decent meals. I always suspected that my readership was composed of an elite group of health- and environment-conscious individuals with well-stocked spice racks. But do realize that you’re in the minority, and consider suggesting an animal product sabbatical for any of your less fortunate friends, relatives and coworkers.


* Some people will argue that all these spices are expensive. But you don’t have to buy them every time you cook. Once you assemble a starter set of seasonings, you’re good for a while. Also, if you do live somewhere with a health food store or a well-stocked supermarket, check to see if they sell spices in the bulk food section. Bulk spices are insanely cheap.

† One poor soul quoted in the Times, a resident of the tiny town of Phoenix, Arizona (population 1.5 million), complained of having to drive 20 miles to obtain such delicacies.

‡ You, wise reader, are probably asking, “Wait, if there’s no plant source of B-12, then where the hell do these vitamin pills come from?” That occurred to me too. (Great minds think alike.) It turns out B-12 is made through bacterial fermentation.

§ For those unfamiliar with this item, queso – short for chile con queso – is a Tex-Mex chip-dipping favorite consisting of melted cheese and chile peppers. If you happen to express an aversion to the stuff, someone will immediately argue that you just haven’t tried the “good queso” and attempt to introduce you to this superior product at the next opportunity. From what I’ve experienced, there is no good queso. It’s just a bad idea. Order the salsa instead.

Friday, April 13, 2012

Climbing the ranks: social status changes gene expression in monkeys


Red carpet by chadmagiera.


Having low social status may suck for reasons beyond not getting invited to the swankiest parties; it could also be making you ill. A correlation between socioeconomic status and health in humans is well established. Most famously, the Whitehall studies of British civil servants found that workers in low status positions had worse health and earlier deaths than their higher-ranking managers.

But the reasons behind this relationship are not clear. Are low-status individuals sicklier because their jobs expose them to grueling physical labor and dangerous chemicals? Do low wages make preventative medical care and good nutrition harder to obtain? Does the demoralizing experience of being ordered around all day stress the mind and body? Or perhaps we’re looking at it backward? Maybe healthier, fitter individuals naturally rise to the rank of CEO while less robust workers languish in the mailroom. Such are the chicken and egg conundrums facing human correlation studies, even large cohort studies like Whitehall. Sometimes, when things get too muddled, it's best to grab some monkeys and head to the laboratory.

That’s what a group of scientists did in a study recently published in the Proceedings of the National Academy of Sciences. Their monkey of choice was the rhesus macaque – a species whose lowest-ranking members, like humans, also exhibit poorer health. Working with 49 monkeys divided into 10 social groups, the researchers demonstrated that the animals’ social status affected gene expression, specifically in genes relating to immune function. The effect was so pronounced, in fact, that it could even be used to predict status. Gene expression data from blood samples identified with 80% accuracy the relative rank of the individual from which they were taken.

In case your recollection of genetics is a bit hazy, this might be a good time to clarify what we mean by “gene expression”. With the exception of gametes, every nucleated cell in your body contains a full set of genes (46 chromosomes’ worth, assuming you’re human), but not every inch of DNA in those cells is constantly expressed (that is, transcribed into RNA and eventually translated into the proteins that run our bodies).* So while an individual’s genome is set in stone, gene expression varies between cell types and can also be impacted by environmental conditions.

To sort out the connection between the environmental factor of social rank and gene expression, the authors of the study took medium-ranking female monkeys and assigned them to new hierarchical social groups comprised of five individuals. Rank in these experimental groups could be manipulated by the order in which each member was introduced – the first ones in generally ranked the highest, while latecomers were stuck with increasingly lower statuses. 

Looking at thousands of genes, the authors found greater gene expression related to rank in 987 of them. (535 genes were expressed more in high rankers, 452 in low rankers). Additionally, monkeys that switched ranks during the experiment experienced changes in their gene expression shortly thereafter. This suggests not only that rank drives gene expression, rather than the other way around, but also that the negative effects of low status on health might be reversible through changes in the social environment.
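If you’re curious what that sort of rank prediction looks like in practice, here’s a hedged Python sketch using scikit-learn. The expression matrix and rank labels below are random placeholders, so the cross-validated accuracy hovers around chance by design; the real study worked from actual blood-sample expression profiles and reported roughly 80% accuracy. None of this is the authors’ pipeline.

```python
# Hedged sketch: classify high vs. low rank from gene expression profiles.
# All data here are random stand-ins, not the study's measurements.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_monkeys, n_genes = 49, 987                          # counts borrowed from the post
expression = rng.normal(size=(n_monkeys, n_genes))    # placeholder expression matrix
rank = rng.integers(0, 2, size=n_monkeys)             # 0 = low rank, 1 = high rank (fake)

model = LogisticRegression(max_iter=1000)
scores = cross_val_score(model, expression, rank, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")  # ~0.5 here, by construction
```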

How are we to interpret these results? Very carefully. Monkeys and humans differ genetically and socially, so we shouldn’t just assume the results apply to our own species and call it a day. Human civilization is a complicated affair, and status can’t easily be reduced to resource access and grooming privileges. What determines our status within society? Is it just economic? Does it extend to race and gender? To high school cliques and Hollywood A thru D lists? As they say in the science biz, more research is needed, but it’s an intriguing start.


* If that’s still too vague, or you’re just needing a break from your job, I found this website that lets you build your own virtual protein. Weeee! Internet!

 In the wild, female rhesus macaques typically stay with the social group and rank they’re born into.

Thursday, March 29, 2012

Whales, trees wish you would shut up already


Image Credit: Thing Three.

The noisiest place I ever lived was an apartment at the corner of Sixth Avenue and 15th Street in Manhattan. It was the nexus of several bustling neighborhoods and a hub of public transportation, with subway stops mere blocks in each cardinal direction and even a New Jersey PATH train station within easy walking distance. New York City’s endless stream of vehicles gushed down the avenue day and night, and an array of 24-hour fast food establishments and markets ensured a steady flow of both gregarious revelers and ranting vagrants. On weekends, street festivals awoke us with their blaring music and electric generators. On rainy nights, the din of hydroplaning taxis made watching a DVD with the windows open impossible. It was effing loud.

As a species, humans excel at making noise. Not content to stick with the howling and growling of other animals, we’ve created machines to augment our collective clamor. And, thanks to our ever-increasing transformation of the planet’s landscape, we’re sharing that noise with other organisms. Everywhere we go, we bring sirens and jets and jackhammers. How is the rest of nature faring with this parade of sound? Sometimes not too well. And a pair of recent studies, each published in the Proceedings of the Royal Society B, highlights how the effects of our auditory intrusions are not limited to land-dwelling animals, or even animals.

Right place, right time
In the weeks following the September 11th, 2001 World Trade Center attacks, many New Yorkers were on edge. They jumped at the sound of car alarms, fretted over riding the subway, and had anxiety dreams filled with airplanes. Meanwhile, North Atlantic right whales (Eubalaena glacialis) were finally able to relax a bit. Why? Because for once they could hear each other without yelling. Ship traffic in the Bay of Fundy, Canada, where whale studies were underway, dropped following the attacks, and with it so did underwater noise levels. Ships produce low frequency sounds that interfere with right whales’ communication vocalizations. So while life on land was newly terrifying and chaotic, underwater an invisible wave of tranquility was sweeping through.

As you can imagine, the exact course of the research wasn’t planned. In the weeks prior to September 11th, scientists were plodding along measuring underwater noise levels and also collecting samples of whale crap, unaware of the tragedy-borne opportunity lurking on the horizon. The audio data were collected from August to September of 2001, but the fecal sample study would continue through 2005, from the months of late July to early October (the whales aren’t in town year round).*

Consistent with the scientists’ empirical observations, underwater recordings made after September 11th showed a decrease in noise. But perhaps most interestingly, analysis of the fecal samples found that the post-9/11 poo had lower levels of glucocorticoid metabolites. Glucocorticoids are secreted in response to various kinds of stress, so the lower levels suggest that quieter waters resulted in calmer whales. Depressingly, this might also indicate that, under normal, non-catastrophic circumstances, whales in the region are chronically stressed.

Soothing sounds of the forest
While I myself have never attended a grade school science fair (my childhood was sparse on extracurricular activities) I’m told that a popular experiment for such things is testing whether plants grow better in the presence of soothing classical music versus abrasive rock ’n’ roll. If you’ve read this far hoping to learn about laboratory scientists subjecting potted ferns to daily doses of death metal, I’m sorry to disappoint you. The study at hand dealt with the indirect effects of noise on plant pollination and seed dispersal. Both of these services are sometimes provided by animals, so while the plants themselves may be indifferent to manmade noise, animal reactions to it can influence the plants’ reproductive success.

The authors conducted their research in New Mexico in the Bureau of Land Management's Rattlesnake Canyon Wildlife Area, which houses not just wildlife, but also natural gas wells and the noise-making compressors used for extraction and transportation of the resource. This provided an ideal setting for the study, as the location had both quiet control areas (without compressors) and loud experimental areas (with compressors) and yet none of the confounding variables typically found in noisy spots (i.e., the various urban indignities discussed back in the opening paragraph).

To examine the effect of noise on seed dispersal, the scientists focused on the piñon pine (Pinus edulis), tracking the volume of seeds taken by different animals from both the loud and quiet spots. While some animals seemed to prefer snatching seeds in the noisy areas, the one most likely to help the seeds take root, the western scrub-jay (Aphelocoma californica), carried away more seeds in quieter locales. † This suggests that pines residing in noise-polluted districts may have less luck producing offspring.

Yet the effects of increased noise weren’t always negative. The pollination experiment looked at the auditory preferences of the black-chinned hummingbird (Archilochus alexandri). Unlike the scrub-jays, these birds visited flowers in the noisy sites more often, potentially conferring a reproductive advantage to hummingbird-pollinated plants in loud regions. However, the authors note that the cause of this was probably not the hummingbirds’ fondness for generator sounds, but their aversion to predators that stake out the quieter areas. Like humans choosing an inferior restaurant because it’s easier to get a table, hummingbirds likely hang out in less desirable neighborhoods to avoid the hassles of more popular ones.

So while the exact effect may be difficult to predict, noise pollution – like other more publicized forms of pollution – does have an impact on both flora and fauna, on land and in water. Repercussions of the increasing cacophony of daily life on our own species might also merit examination. Anecdotally, I can tell you that I felt less stressed out after vacating the Manhattan apartment. Though perhaps the toxic fumes emitted by the Cheesesteak Factory (yes, such an establishment exists) were the bigger problem.

* If you’re wondering how one goes about finding whale droppings, as with drugs and bombs, it’s done with the help of trained dogs.

† The scrub-jay stores some of the seeds it swipes, and not all these are eaten later. Thus the unconsumed seeds have a shot at becoming new trees.

Sunday, March 11, 2012

Paint it White: How New York City is getting cooler

New York vista by Tim Pearce.

It’s official: white roofs are cool. The declaration comes from the tastemakers at NASA, and, as always, New York City is at the vanguard of the new trend. In fact, the city’s recently brightened rooftops were found to be over 40 degrees Fahrenheit cooler than traditional dark roofs at the height of summer heat waves.

If you’ve weathered a NYC summer and were lucky enough to escape to the country for a weekend, you probably noticed that it feels several degrees cooler once outside the city – and consequently several degrees insufferably hotter upon reluctant reentry. Having myself spent over a decade in the city, I’d assumed this temperature gradient was caused by the fact that New York summers were literally Hell and the months of July and August a period of non-eternal damnation, but it turns out there’s a simpler explanation.

New York City is crowded, not just with people, but with buildings. Lots of buildings, built close together and typically topped with dark roofs that excel at absorbing the sun’s rays. Outside of the city, manmade structures are interspersed with these things called trees. The green of suburban and rural foliage reflects back some of that solar radiation, resulting in cooler ground temperatures. Meanwhile cities bake in their black asphalt casing. The phenomenon is called the “urban heat island” effect, and, scientifically speaking, it thoroughly and unequivocally sucks.

Higher temperatures not only make city dwellers miserable and irritable, they also result in greater energy usage and higher greenhouse gas emissions, which ultimately makes things even hotter. Hoping to rein in this unfortunate positive feedback loop, city planners have been working to replace traditional dark roofs – made of conveniently waterproof and durable asphalt and tar – with newfangled white materials (household staples like ethylene–propylene–diene monomer (EPDM) and thermoplastic polyolefin (TPO)). Since not every building owner is willing and/or able to replace an otherwise functional black roof, the NYC CoolRoofs program is also promoting a less involved option – “retrofitting” existing dark roofs with old-fashioned white paint.

So is it working? A multi-year study recently published online in the journal Environmental Research Letters examined how the high-tech and DIY brightened roofs fared against each other, as well as against standard dark roofs. Initially, both the two professionally installed membranes and the civilian-applied white paint performed admirably – with white surfaces measuring an average of 43 degrees Fahrenheit cooler on hot summer days than dark samples. (Those asphalt rooftops can reach a disheartening 170 degrees Fahrenheit.) However, while the fancy whites were resistant to the ravages of time, the humble white paint lost some of its luster (and, more measurably, its reflectance) by the second year… not unlike the interior paint jobs in Manhattan apartments. Still, even the two-year-old paint was an improvement over the dark roofs, and you can’t beat the price (about 50 cents per square foot, whereas the pro roofs ran between $15 and $28 per square foot).

So that’s all pretty impressive, but I seem to recall that New York has not just one but two problem seasons, in terms of both human suffering and energy usage. What about winter? Is white after Labor Day as gauche on rooftops as it is in ensembles?* The EPA, whose standards for roof reflectivity these white materials are striving to meet, acknowledges that brightening roofs in colder climates may come with a “winter heat penalty”. That is, in reducing unwanted summer heat, the more welcome heat that would have been generated by sunlight-absorbing dark roofs in winter is also forfeited, and gas or electric heaters must work harder to make up the deficit. However, since fewer hours of sunlight are available during the coldest months, it’s not a major loss (relative to the improved efficiency in summer) unless you’re dealing with frigid locations where heaters run 9 months out of the year. †

Additionally, the authors of the current study found that at least one of their measured materials (our friend EPDM) seemed to avoid the winter heat penalty entirely. They attribute this to the material’s emissivity level. Here’s the deal (in as much detail as I’m willing to tackle): white roof initiatives are looking only at a material’s reflectivity. This is the amount of sunlight reflected back instead of absorbed. To qualify for the EPA’s “energy star” rating, a material must have a solar reflectance of 0.65 or greater upon initial installation, and 0.50 or greater after three years. (It’s a zero to one scale, with 0.0 absorbing all light, and 1.0 reflecting all light.)

But materials also have an emissivity factor. Emissivity is the amount of heat something emits after absorbing solar radiation. Since close to half of that sunlight is being absorbed by the white surfaces (still far better than black, which is typically around 0.05 in reflectivity), how much of it they emit will also affect a building’s surface temperature. The white materials tested were assumed to have a high emissivity in addition to their high reflectivity (i.e., whatever sunlight got in would quickly be booted back out), but the EPDM seemed to cling a bit harder to its absorbed heat, thus keeping its surface toastier in the winter.

The authors therefore suggest that materials with high reflectivity but middling emissivity may be the best fit for colder climates. Got all that? Don’t worry, I’m not sure I understand it either. Emissivity is a tricky concept to get a handle on. Apparently even the folks manufacturing the EPDM couldn’t figure out how much heat their product would emit.‡
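For anyone who wants to see how the two numbers interact, here’s a bare-bones Python sketch of a roof surface’s radiative balance. It’s deliberately crude – it ignores convection, the surrounding air, and heat flow into the building – so treat the temperatures as illustrations of the trend (more reflectance cools the surface, less emissivity warms it), not as predictions. The 1000 W/m² solar figure is my own round-number assumption.

```python
# Crude radiative balance for a roof surface: absorbed sunlight = emitted heat.
# Ignores convection and conduction, so the numbers are illustrative only.
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W/m^2/K^4
SUN = 1000.0      # assumed midday solar irradiance, W/m^2

def surface_temp_c(reflectance: float, emissivity: float) -> float:
    absorbed = (1 - reflectance) * SUN                   # sunlight not bounced away
    temp_k = (absorbed / (emissivity * SIGMA)) ** 0.25   # radiative equilibrium
    return temp_k - 273.15

print(round(surface_temp_c(0.05, 0.90)), "C - dark roof")
print(round(surface_temp_c(0.65, 0.90)), "C - white roof, high emissivity")
print(round(surface_temp_c(0.65, 0.48)), "C - white roof, lower emissivity (runs warmer)")
```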

In any event, it’s nice to see that New York City is getting its environmental act together. When I left the place in July of 2008, it was a sweltering cesspool with insufficient bike lanes and a surfeit of Sex and the City-spawned shopping zombies. Less than four years later and, well, at least two out of those three problems are being addressed. While the world could do without some East Coast fads (I had to endure a second round of Brooklyn’s ironic mustaches when I moved to Austin), one can only hope that the trend of going green by way of white will soon fan out to other cities. Especially those in the south, where I currently reside. It's totally uncool here.

* Yes, I’m aware that no white after Labor Day is an outmoded fashion rule. Austin isn’t that behind the curve.

† And if you’re living somewhere truly freezing, isn’t your roof covered with snow most of the time anyway? Now if we could just get it to snow in summer instead of winter, then all our problems would be solved.

‡ The product was rated as having an emissivity of 0.90 (high), but the authors estimated it to be closer to 0.48 (not so high).

Tuesday, March 6, 2012

America's grossest invasive species

As if 12-foot pythons and lizards capable of biting off limbs weren’t enough for Americans (especially Floridians) to worry about, now comes a nonnative species that is not only harmful to indigenous flora and fauna, but also thoroughly disgusting. Meet the European earthworm (species of the genus Lumbricus), a grotesquely long, squirming, squishy blob of an animal that is responsible for the decline in populations of the completely non-revolting ovenbird (Seiurus aurocapilla). It’s a bit backward sounding, I know, as birds typically eat worms, but the problem is more complex than ordinary predation or resource competition.

Happy, earthworm-free woods. Image: Duane Burdick.
The worms aren’t attacking the birds (they’re not that big) or even nosing in on their food supply. You see, the issue is that ovenbirds – a migratory species that nests in North America and flies south during the colder months – build their nests on the ground, and they do so in what was previously earthworm-free hardwood forest, which normally has a thick layer of understory plants. The abundance of low plants helps conceal ovenbird nests from predators, but now these nauseating euro-worms are ruining the delicate environmental balance.

Personally, I feel a titch misled. Being none too keen on earthworms from day one, I’d been told many times that we must appreciate (or at least tolerate) the slimy bastards because of all the good they do for plants. You’ve surely heard the same spiel. Worms’ subterranean writhing tills the soil, distributes nutrients, and makes gardens flourish. But such pro-worm propaganda, while true, is only part of the story. And what’s good for the garden isn’t necessarily good for the forest. That important understory foliage grows from the slow decomposition of leaf litter on the forest floor. Earthworms, which happily eat all sorts of rotting vegetation, feed on this litter and hasten the decomposition process. As a result, there is less fertilizer for the understory plants and subsequently less protective cover for ovenbird nests.

How did these clammy wriggling beasts from abroad worm their way into American soil? Like pretty much every other invasive species, they were delivered here courtesy of human sloppiness and/or cluelessness. European earthworms have been in the country as long as European humans (you know, pilgrims and founding fathers and the like), brought in accidentally by boat or deliberately by ambitious gardeners. More recent activities, logging and the dumping of fishing bait, delivered the pests into hardwood forests, where they’re currently running amok. Yuck.

I don’t claim that earthworms have no place on our planet, or even in North America. Nor would I profess that any one animal was superior to another. I can only present to you the facts as I have uncovered them....

Images: U.S. Fish and Wildlife Service (L) and Michael Linnenbach (R)
I rest my case.

Tuesday, February 21, 2012

How to be a better liar

Rice by babbagecabbage, Photoshoppery by yours truly.

Listen up, liars. Society is onto you. Well, its machines are anyway. Computers can spot your fake online hotel reviews, and they know when you’re shaving ten pounds off your weight on your Match.com profile. They’re sharpening their skills daily and, with their help, even the human brain (a device notoriously terrible at truth detection tasks) might get wise to your chicanery. So if you’re not going to change your wicked ways, you can at least try to improve your technique a bit.

But how? Researchers say the telltale signs of deception are difficult to hide. Even in written form – where the author has the opportunity to edit his or her appalling untruths – liars still leave behind linguistic clues. But I say, what kind of defeatist attitude is that? We can make our résumés more appealing to employers by replacing passive phrasing with exciting ACTION words. Why should lying about the rest of our lives be any different? Before we throw in the towel and resort to honesty, shouldn’t we at least try to apply what science has learned to our own deceitful endeavors? Of course we should. Let’s have a go at it.

Mind your language
Certain outward signs can draw attention to a poorly executed lie: fidgeting, stammering, sweating, shaking, heartbeats audible from two rooms away. Humans took note of this and have concocted lie-detecting technologies that attempt to exploit such physiological cues. The earliest example usually given is China, circa 1000 B.C., where suspected criminals were asked to place rice powder (or rice, by some accounts) in their mouths and then spit it out during interrogations. The idea behind this test was that the stress of lying dried out one’s mouth. Thus those answering honestly should be able to spit out more than liars, whose mouths the starchy stuff would stick to like feathers on tar.

The modern polygraph, which measures pulse, blood pressure, and other indicators while suspects answer questions of varying stakes (“Is today Tuesday?” vs. “Did you kill your wife?”), is basically a twentieth century upgrade of the rice test.* Both rely on finding signs of nervousness exhibited by guilty liars. Problematically, both also risk merely capturing the anxiety of innocent people freaking out over being accused of a crime.

A more novel way of discerning between factual and fabricated statements involves not measuring the body, but analyzing the words used by the speaker (or writer). While not initially conceived as lie detectors, computer algorithms that examine linguistic patterns have been used experimentally to search for hallmarks of deception. Linguistic Inquiry and Word Count (LIWC), software designed by James Pennebaker, Roger Booth, and Martha Francis, is one such tool. It sifts through documents (written text or transcribed speech) and tallies the instances of various word categories, including significant but harder-to-control “function words” – pronouns, articles, and the like.

LIWC was most recently unloosed in the jungle of online dating and proved superior to human judges at assessing “trustworthiness” from the content of daters’ profiles.

Thinking cues and feeling cues
Having spent some time tackling this idea of liars’ linguistic cues, scientists have come up with a few observations. The giveaways can be divided into two types: cognitive (thinking) and emotional (feeling). These distinctions are based on the proposed causes of the cues. The idea is that lying is 1) morally troubling and 2) intellectually difficult. Let’s tackle the second one first.

Because liars have the challenge of creating and managing the details of a fictional tale, their accounts should theoretically reflect this by being shorter and simpler. Experimental analysis has found that false statements generally do have lower overall word counts. They also contain fewer exclusion words. These are words like “except” and “but”, which are used to make the kind of fine distinctions in stories that can be a headache to keep track of when they’re not actually true. For instance, when playing hooky from work, the cognitively-taxed liar might claim, “I have a cold.” or even, “I have a cough and a sore throat.” But you can improve upon this by throwing in an exclusion, “I have a cough and a sore throat, but I’m not feeling feverish.” Just make sure you have more symptoms than exclusions, else your employers will think you’re being lazy.

Due to ease of handling, motion verbs are also common in dishonest accounts. “I fell and sprained my ankle; I won’t be coming in today.” “The car ran over a nail and got a flat; I won’t be coming in today.” And so on. You may not want to lean too heavily on motion verbs. Try adding a little detail about what it was that caused you to fall in the first place, or the crummy pot-hole-strewn road you were driving on. Be careful though. As a child, my instinct when lying was to construct elaborate narratives with well-developed characters and lengthy passages of dialogue designed to answer questions no reasonable person would think to ask. It was perhaps overkill, as I can report anecdotally that my results were not stellar.

Emotional cues stem from feelings of guilt and fear related to lying, rather than its cognitive demands. These are especially important in our modern computer-driven world, because they’re harder for the liar to filter out, even when given the chance to edit their work (the online dating study found more emotional than cognitive giveaways of deceit). Unhappy words and negations both crop up more frequently in the ramblings of the untruthful. Though you might have a hard time eliminating these entirely when you’re lying about a dour subject. I mean, is it better to say, “I feel lousy” or “I don’t feel well”?

Perhaps we should focus instead on paring down instances of “distancing”. Apparently, liars want to detach as much as possible from their ghastly falsehoods and often do so by avoiding first-person pronouns. So make sure to use “I” and “me” whenever possible.
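To pull the cognitive and emotional cues together, here’s a toy Python tally of the markers discussed above – word count, exclusion words, negations, and first-person pronouns. It’s nothing like the real LIWC (the word lists here are tiny and of my own invention); it just shows the general word-counting idea.

```python
# Toy tally of the deception cues discussed above. Not LIWC - just a
# back-of-the-envelope illustration with tiny, made-up word lists.
import re

EXCLUSIONS = {"but", "except", "without", "unless"}
NEGATIONS = {"not", "no", "never", "don't", "can't", "won't"}
FIRST_PERSON = {"i", "me", "my", "mine"}

def cue_tally(text: str) -> dict:
    words = re.findall(r"[a-z']+", text.lower())
    return {
        "word_count": len(words),          # longer stories tend to read as more truthful
        "exclusions": sum(w in EXCLUSIONS for w in words),
        "negations": sum(w in NEGATIONS for w in words),
        "first_person": sum(w in FIRST_PERSON for w in words),
    }

print(cue_tally("I have a cough and a sore throat, but I'm not feeling feverish."))
```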

Of course, as with physiological measures, none of the above pitfalls are likely to affect an evil genius (clinical definition: high functioning antisocial personality with IQ of 140 or greater). But, given that you’re sitting around reading this rather than out devising a kryptonite trap for Superman, you’re probably not in that category.

Man vs. machine
Something else you might want to keep in mind in all this wordsmithing is who or what you are trying to dupe. So far, everything we’ve discussed relates to outwitting computer algorithms. But humans and machines rely on different cues to decide whether or not to trust you. And even though humans suck at spotting a lie (we tend to fare at about the rate of chance), that shouldn’t stop you from peppering your fictions with the very elements we incorrectly perceive as signs of honesty.

The only linguistic trait that both evokes distrust in humans and actually correlates to lying is word count. Shorter stories are more likely to be false and more likely to be perceived as false by a human audience. † But from that point on, our poor species gets lost and gravitates toward linguistic patterns that have nothing to do with honesty. While we like long descriptions with plenty of details, we also want individual sentences to be on the short side. Got it? Use lots of words. Make short sentences.

Also popular is concrete language. Abstract or convoluted sentences inspire suspicion (not to mention boredom). And do try to use the word “we”. The plural first person pronoun “we” makes listeners feel included, whereas the second person “you” or third person “they” makes us feel like outsiders. To test this, I made a point of using “we” in describing human lie-catching ineptitude, so that you wouldn’t think me a snob accusing you of being a bumbling simpleton. Did it help? Were you filled with trust? Perhaps an urge to lend me money?

And Now the Caveats
WARNING: Don’t try this in a language other than English. All the studies I looked at were conducted in English and, as anyone who has struggled to learn a new language knows, grammar varies considerably between languages. Culture likely has an effect too. Who knows if first person pronouns have the same appeal outside of the egocentric U.S.

Also, results may vary with the degree of the lie being told. A lie created to conceal a major transgression will likely leave more clues in its wake than one told to praise an unremarkable meal.

And while we’re on the subject, those big lies about where we were and who we did or didn’t kill aren’t the most common type of dishonesty. The majority of our deceptions work to mask our socially unacceptable opinions and feelings, or to hide our perceived shortcomings. Given the banality of our lies and the effort required to tell them convincingly, it might be easiest to just fess up to those unpopular attitudes and lackluster achievements.

* Bonus trivia: The man who invented the blood pressure measuring component of the lie detector, William Moulton Marston, was also the creator of Lasso-of-Truth-wielding comic book heroine Wonder Woman (under the name Charles Moulton).

† This is not to say that any short sentence is false. It’s all relative. And, relative to a truthful account on a similar topic, an untruthful one is likely to be shorter.