“The Designer’s Detritus” – my latest Nature Education post on ENCODE, junk DNA and intelligent design

I said I’d tell you when it was published – and it’s been published! Somewhat surprisingly, I was asked by my friend Khalil to write up my thoughts on the whole ENCODE project/junk DNA/the-human-genome-is-80%-functional fiasco for the Student Voices blog, but from the perspective of what intelligent design proponents were taking from it all. If you’ve been following the pro-ID blogs Evolution News and Views and Uncommon Descent lately, there’s been no end to the victorious proclamations – because, as we all know, the more functional the genome is, the more likely ID is to be true, right?

Uh, yeah. Right. Sure. This is one of the issues I discuss in my Nature Education post (and don’t forget that I’ve discussed it on Homologous Legs before too), along with some other things.

Here’s a taste to get you in the mood:

Where to start? It’s unlikely that you’ve missed any of the extensive media coverage the Encyclopedia of DNA Elements (ENCODE) project has received over the past two weeks, after the international team of scientists responsible published 30 papers in high-profile journals detailing their efforts in mapping the activity of the human genome in over 100 cell types. Nearly every newspaper, online news site and vaguely science-y blog has written something about the accomplishment, with most focusing on what has become, unarguably, ENCODE’s most controversial “finding”: that over 80% of the genome is functional.

A good proportion of the biology community hit back at the claim, with the blogs of biochemist Larry Moran and evolutionary geneticist T. Ryan Gregory providing some of the more comprehensive critiques (which are spread across multiple posts on their blogs – have a look through their archives if you’re interested in the details). The consensus amongst the critics is that the leaders of ENCODE have a very loose definition of “functional”, under which most possible DNA sequences are likely to fall, including those uncontroversially deemed “junk DNA”, simply due to the noise and imprecision inherent in various biological processes like transcription. Some critics have even gone so far as to propose the Random Genome Project, essentially a null hypothesis test for the “80% functional” claim based around the question “What proportion of a randomly generated genome, on average, would be assigned function based on ENCODE’s criteria?” If the answer turned out to be around the 60-80% mark, it would cast serious doubt on the claim that most of our genome is functional.
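
To make the logic of that null test a bit more concrete, here’s a minimal toy sketch in Python – nothing resembling ENCODE’s actual analysis pipeline or the Random Genome Project proposal itself, just an illustration of the idea that a permissive, coverage-based definition of “function” plus randomly placed biochemical noise can end up covering a lot of sequence. The event rate and footprint size are numbers I’ve made up purely for the sake of the example.

# Toy illustration only – the event rate and footprint below are invented,
# not taken from ENCODE or from the Random Genome Project proposal.
import random

GENOME_LENGTH = 1_000_000   # toy "genome" size, in bases
EVENT_RATE = 1 / 200        # assumed chance per base that a spurious event starts there
EVENT_FOOTPRINT = 200       # assumed number of bases each event "covers"

def fraction_called_functional(seed=0):
    """Fraction of a random toy genome overlapped by at least one noisy event."""
    rng = random.Random(seed)
    covered = [False] * GENOME_LENGTH
    for pos in range(GENOME_LENGTH):
        if rng.random() < EVENT_RATE:
            for i in range(pos, min(pos + EVENT_FOOTPRINT, GENOME_LENGTH)):
                covered[i] = True
    return sum(covered) / GENOME_LENGTH

print(f"{fraction_called_functional():.0%} of the toy genome gets flagged as 'functional'")

With those made-up numbers, roughly two-thirds of the random toy genome ends up flagged – the point being not the exact figure, but that a loose enough criterion will assign “function” to a large fraction of any sequence, random or not.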

I have good reason to think that there might be a response to this piece forthcoming from ENV in the next few days – as always, I’ll keep you all posted…
