News & Views item - December 2006

 

 

Deciphering the Information Coded in DNA -- Sequencing Was the Easy Part. (December 4, 2006)

     Helen Pearson's News Feature "Codes and Enigmas", in Nature's November 16 issue, points out that DNA sequencing has now reached the stage where new complete genome sequences, from mammals to viruses, are no longer front-page news.

 

In fact, as the data pour in and the "thinkers" and theorists cogitate, it is becoming ever more apparent that locating the structural coding regions was the easy bit. As Pearson summarises it, "The code that is currently most exercising the minds of geneticists is the 'regulatory code' that directs the production of suites of proteins tailored to specific cell types and used at specific times. The idea is that many of the genes switched on in DNA contain signature sequences in 'promoter' regions nearby and 'enhancer' regions that may be millions of base pairs away."
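
To make the promoter idea concrete, here is a minimal sketch (in Python) of what looking for a "signature sequence" in a promoter region can mean computationally: scan the sequence for a short motif and report where it occurs. The toy promoter and the TATA-box-like motif below are invented purely for illustration; real regulatory analysis uses probabilistic motif models and far longer stretches of sequence.

# Minimal sketch: scanning a hypothetical promoter sequence for a short
# signature motif. The sequence and motif are toy examples, not real data.

def find_motif(promoter: str, motif: str) -> list[int]:
    """Return every start position at which `motif` occurs in `promoter`."""
    hits = []
    for i in range(len(promoter) - len(motif) + 1):
        if promoter[i:i + len(motif)] == motif:
            hits.append(i)
    return hits

if __name__ == "__main__":
    promoter = "GGCTATAAAGGCCAATCGTATAAACC"   # toy sequence
    motif = "TATAAA"                          # TATA-box-like signature
    print(find_motif(promoter, motif))        # -> [3, 18]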

 

The subtleties of the puzzle are such that some doubt whether anything even approaching a full understanding of the system will ever be achieved.

 

So, for example, she cites the work of Gene Stanley and his colleagues, who in 1992 suggested that there were patterns in DNA spanning hundreds and even thousands of base pairs.

 

The statistical tools they used for their analyses are ones used to "identify correlations in climate and financial data".
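
For readers curious what such an analysis looks like, here is a minimal sketch assuming a detrended-fluctuation-style calculation of the kind later formalised by Peng, Stanley and colleagues, together with a simple purine/pyrimidine mapping of the sequence. It illustrates the general approach only; it is not the group's actual code or data, and the random sequence stands in for real genomic DNA.

# Sketch of a detrended-fluctuation-style analysis of a nucleotide sequence:
# map purines/pyrimidines to +1/-1, integrate the resulting "DNA walk",
# and measure how the detrended fluctuation F(n) scales with window size n.
# A scaling exponent near 0.5 indicates no long-range correlation; larger
# exponents indicate correlations spanning many base pairs.

import numpy as np

def dna_walk(seq: str) -> np.ndarray:
    """Purine (A/G) -> +1, pyrimidine (C/T) -> -1, then cumulative sum."""
    steps = np.array([1 if b in "AG" else -1 for b in seq], dtype=float)
    return np.cumsum(steps - steps.mean())

def dfa_fluctuation(walk: np.ndarray, n: int) -> float:
    """Root-mean-square deviation from a per-window linear trend (window size n)."""
    n_windows = len(walk) // n
    resid = []
    for w in range(n_windows):
        y = walk[w * n:(w + 1) * n]
        x = np.arange(n)
        trend = np.polyval(np.polyfit(x, y, 1), x)
        resid.append(np.mean((y - trend) ** 2))
    return float(np.sqrt(np.mean(resid)))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    seq = "".join(rng.choice(list("ACGT"), size=20000))  # toy, uncorrelated sequence
    walk = dna_walk(seq)
    sizes = [16, 32, 64, 128, 256, 512]
    fluct = [dfa_fluctuation(walk, n) for n in sizes]
    alpha = np.polyfit(np.log(sizes), np.log(fluct), 1)[0]  # scaling exponent
    print(f"scaling exponent alpha = {alpha:.2f} (about 0.5 for an uncorrelated sequence)")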

 

Other groups attempting to repeat the work ran into difficulties, and the field remained sceptical. Mathematicians and physicists became interested, however, and much of the scepticism has since become muted, though that does not mean the fading of scepticism has been accompanied by flashes of understanding.

 

So where is this leading? Well, if there was ever an area of research that cries out for an interdisciplinary attack, this is it. But it should also be recognised that, at a minimum, it requires a thorough knowledge of modern physics and of mathematical and statistical tools, along with an understanding of the molecular biology of the living organism.

 

That's not going to be accomplished by trying to produce interdisciplinary individuals; rather, what is required is to bring together brilliant workers trained in the separate disciplines who are also able to understand one another. To take what is virtually an ancient example: the 1969 Nobel Laureates Max Delbrück, trained as a physicist, and Salvador Luria, who learned the techniques of bacteriophage research at the Pasteur Institute, collaborated and in 1943 published a paper showing that, contrary to the then-current view, viruses undergo permanent changes in their hereditary material. That same year they devised the fluctuation test, which provided experimental evidence that phage-resistant bacteria were the result of spontaneous mutations rather than a direct response to changes in the environment.
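
As an aside, the logic of the fluctuation test lends itself to a small simulation: if resistance were induced only on exposure to the phage, the number of resistant colonies per culture should follow a Poisson-like distribution (variance roughly equal to the mean); if mutations arise spontaneously during growth, occasional early mutants produce "jackpot" cultures and the variance greatly exceeds the mean. The sketch below uses arbitrary parameters and a deliberately simplified growth model, purely for illustration, not Luria and Delbrück's actual numbers.

# Toy simulation of the Luria-Delbrück fluctuation test: compare the spread of
# resistant-colony counts under the "induced" hypothesis (resistance acquired
# only on exposure) versus spontaneous mutation during growth.

import numpy as np

rng = np.random.default_rng(1)
n_cultures = 500
generations = 20                  # each culture grows from one cell to 2**20 cells
final_cells = 2 ** generations
mutation_rate = 2e-7              # per cell division (arbitrary)

def induced_counts() -> np.ndarray:
    """Resistance acquired independently at plating: Poisson-like counts."""
    return rng.binomial(final_cells, mutation_rate, size=n_cultures)

def spontaneous_counts() -> np.ndarray:
    """Mutations during growth; early mutants found large resistant clones."""
    counts = np.zeros(n_cultures, dtype=np.int64)
    for g in range(generations):
        cells_born = 2 ** g                               # cells appearing at generation g
        mutants = rng.binomial(cells_born, mutation_rate, size=n_cultures)
        counts += mutants * 2 ** (generations - g - 1)    # clone size at plating
    return counts

for name, c in [("induced", induced_counts()), ("spontaneous", spontaneous_counts())]:
    print(f"{name:11s} mean = {c.mean():7.2f}   variance/mean = {c.var() / c.mean():9.1f}")
# Spontaneous mutation gives a variance far larger than the mean (occasional
# "jackpot" cultures), which is the signature Luria and Delbrück observed.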

 

In short, you need the best practitioners of the separate scientific disciplines, not second-rank, under-taught amalgams. But they do need to be able to understand each other.

 

Are there exceptions?

 

    Yes, of course. Here's Terry Sejnowski, a Howard Hughes Medical Institute investigator who joined the Salk Institute and UCSD in 1988. He took his PhD in theoretical physics (not as an interdisciplinary student) and then became interested in neuroscience.

As Karen Hopkin writes in The Scientist:

"He seems to be one of very few people in the world who knows enough to work at every one of these levels," says Jack Cowan of the University of Chicago. "Terry goes from molecules to sleep. You don't get many people who have that range. I think he's one of the really remarkable scientists around at the moment."
 
By weaving together theory and experimentation, Sejnowski effectively launched the field of computational neuroscience. "Prior to his arrival on the scene, every generation of neurobiologists felt that computation had done nothing in the previous 10 years, but that it would be extremely important in the next 10 years. And after that 10 years was over, one was exactly where one started," notes Columbia University's Eric Kandel. "The problem was that computation was independent of experimentation. Terry was the first major person to come along who had both training in experimental science and in computation."