Human prehistory has descended into a state of chaos which can only be described as farcical. New research, summarized in an October 2012 review by Aylwyn Scally and Richard Durbin (“Revising the human mutation rate: implications for understanding human evolution” in Nature Reviews Genetics 13:745-753, doi:10.1038/nrg3295), suggests that the molecular clock used to date events in hominid prehistory may run more slowly than previously thought, and at variable speeds, throwing the timetable of evolutionary events into confusion.
The new research has staggering implications for the date of the split between the lineage leading to orangutans in Asia and the line leading to humans, chimps and gorillas in Africa: it’s been revised from 13-14 million years ago to anywhere from 34 to 46 million years ago – an impossible result that has researchers scratching their heads.
A report by Ann Gibbons (“Turning back the clock: slowing the pace of prehistory.” Science 338:189-191) exposes the massive uncertainty that now reigns in the field of physical anthropology:
“The mutation rates are so up in the air,” said paleogeneticist Svante Paabo of the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, in August, when his team published big margins of error — from 170,000 to 700,000 years ago — for the date when our ancestors split from Neandertals and their close cousins, the Denisovans. As a result, the timing of some events in human origins is now “very murky,” says paleoanthropologist Chris Stringer of the Natural History Museum in London. The ambiguity in the mutation rate affects a host of evolutionary and disease-related analyses, says paleoanthropologist John Hawks of the University of Wisconsin, Madison: “We can’t figure out how things happened if we don’t know when they happened.”
Before we go any further, let’s review the science underlying the molecular clock, which is used by paleoanthropologists to date events in our past. Matthew Cobb provides a handy summary in his post, Putting our DNA clock back, over at Why Evolution Is True:
The basic assumption behind the molecular clock is that mutations – changes in DNA – occur at a constant rate over time, and that the number of differences between two groups can therefore be turned into a figure based on the time since the two diverged. This phenomenon was first noticed in 1962 by Linus Pauling and Emile Zuckerkandl looking at differences in haemoglobin genes, then explicitly turned into a hypothesis the following year by Margoliash, before being fully developed in the 1970s by Allan Wilson. (It is in fact a bit more complicated, as the average generation time of a species has to be taken into account – the shorter the generation time, the higher the mutation rate.)
There are some important provisos to the clock – any stretch of DNA that is subject to selection, for example, is not going to be a very useful source of clock data, as genetic differences will tend to be removed by selection; many genes that are vital to organismal function are therefore highly conserved, showing few differences between groups. For this reason, scientists tend to use either ‘synonymous changes’ in DNA – these are ‘silent’ differences that do not cause any change in gene function (protein structure, gene regulation, or whatever) – or to use stretches of non-coding DNA, which appear to be not subject to natural selection and to evolve ‘neutrally’, just accumulating mutations with time.
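The core calculation Cobb describes can be sketched in a few lines of Python. This is purely an illustration of the arithmetic, with invented numbers rather than figures from any study; the factor of 2 reflects the fact that, after a split, mutations accumulate independently along both lineages.

```python
# Illustrative sketch of the basic molecular-clock arithmetic.
# All numbers below are invented for the example, not from any study.

def divergence_time_years(num_differences, seq_length, rate_per_site_per_year):
    """Estimate the time since two lineages split, assuming a constant
    neutral mutation rate. Differences accumulate along BOTH lineages
    after the split, hence the factor of 2 in the denominator."""
    return num_differences / (2 * seq_length * rate_per_site_per_year)

# 10 million neutral differences across a 1-billion-bp alignment, at the
# 'old' rate of ~1 mutation per billion base pairs per year:
t = divergence_time_years(10_000_000, 1_000_000_000, 1e-9)
print(f"Estimated split: {t / 1e6:.1f} million years ago")  # → 5.0 million years ago
```

Halve the assumed rate and the inferred date doubles, which is exactly why a slower clock pushes every divergence event deeper into the past.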
As Ann Gibbons points out, the science behind molecular clock dating relied on highly questionable assumptions, until very recently:
For the past 15 years, researchers have estimated the speed of the molecular clock by counting the mutational differences between humans and primates in matching segments of DNA, then using different species’ first appearances in the fossil record to estimate how long it took those mutations to accumulate. For example, the fossils of the oldest known orangutan ancestor are about 13 million years old, so DNA differences between humans and orangutans had about that long to accumulate. By doing similar calculations in many segments of DNA in various primates, researchers calculated an average rate of about one mutation per billion base pairs per year for humans and other apes…
But this method of calculating the mutation rate has drawbacks. For starters, it assumes that the fossil dates accurately record the first appearance of a species, but that can change with a new find. Second, there are no fossils of our closest living relatives: chimps and gorillas. Third, the method assumes that species split at the same time as their genes diverged, but in fact, genetic separation can be millions of years earlier than species divergence. Finally, the method assumes that mutation rates are similar across apes, although factors such as generation time—the average number of years between generations — affect the rate.
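The fossil-calibration step Gibbons describes is simply the inverse of the dating calculation: divide the observed per-site divergence by twice the fossil age. A hedged sketch with illustrative numbers (the ~2.6% human-orangutan divergence figure here is an assumption chosen for the example, not a value from the article):

```python
# Fossil calibration: infer a mutation rate from a known fossil date.
# The 2.6% divergence figure is an assumption for illustration only.

def calibrated_rate(per_site_divergence, fossil_age_years):
    """Per-site, per-year mutation rate implied by a fossil calibration.
    Divergence accrues along both lineages, hence the factor of 2."""
    return per_site_divergence / (2 * fossil_age_years)

# If humans and orangutans differed at ~2.6% of sites, and the oldest
# known orangutan-ancestor fossils are ~13 million years old:
rate = calibrated_rate(0.026, 13_000_000)
print(f"{rate:.1e} mutations per site per year")  # → 1.0e-09
```

Note how every weakness Gibbons lists enters through one of these two inputs: a misdated or missing fossil changes the denominator, while ancestral polymorphism inflates the numerator.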
Now all that has changed:
With the recent advent of high-throughput sequencing methods, geneticists finally have been able to sequence enough whole genomes to calculate directly the number of mutations between trios of two parents and their child in large numbers of families. Eight studies in the past 3 years (and the 2003 study) have estimated a slower mutation rate, according to a review published online on 11 September in Nature Reviews Genetics by geneticists Aylwyn Scally and Richard Durbin of the Wellcome Trust Sanger Institute in Hinxton, U.K…
Remarkably, all the studies got about the same rate: 1.2 × 10^-8 mutations per generation at any given nucleotide site. That’s about 1 in 2.4 billion mutations per site per year (assuming an average generation time of 29 years) — and that’s less than half of the old, fossil-calibrated rate.
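The conversion in the passage above is simple arithmetic: divide the per-generation rate by the assumed generation time. A quick check reproduces Gibbons’ figures:

```python
# Convert the per-generation mutation rate quoted above into a per-year
# rate, using the same assumed generation time of 29 years.
per_generation = 1.2e-8   # mutations per site per generation
generation_time = 29      # assumed average years per generation

per_year = per_generation / generation_time
print(f"{per_year:.2e} mutations per site per year")                 # ≈ 4.14e-10
print(f"i.e. about 1 in {1 / per_year / 1e9:.1f} billion per year")  # ≈ 1 in 2.4 billion
```

Note how sensitive the per-year figure is to the generation-time assumption: plug in 20 years instead of 29 and the rate rises to 6e-10 per year, which is one reason generation time looms so large in the debate below.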
The new research has created an upheaval in the field of human evolution. The following table (adapted from Gibbons’ report) summarizes the old and the new molecular clock dates for some key events in human prehistory. As the reader can see, both the old and the new molecular clock are at odds with the fossil evidence on vital points.
Human-orangutan split
Fossil evidence: 9 million–13 million years ago (Sivapithecus)
Old mutation rate: 13 million–14 million years ago
New mutation rate: 34 million–46 million years ago (Mismatch between fossils and molecular date)

Human-chimpanzee split
Fossil evidence: 4.1 million–7 million years ago (Sahelanthropus, Orrorin, Ardipithecus, and Australopithecus)
Old mutation rate: 4 million–7 million years ago
New mutation rate: 8 million–10 million years ago (Mismatch between fossils and molecular date)

Homo sapiens-Neandertal split
Fossil evidence: 350,000–600,000 years ago (Homo heidelbergensis), 200,000 years ago (Neandertals)
Old mutation rate: 250,000–350,000 years ago (Mismatch between fossils and molecular date)
New mutation rate: 400,000–600,000 years ago

Migration of modern humans out of Africa
Fossil evidence: 80,000–125,000 years ago (archaic Homo sapiens)
Old mutation rate: Less than 70,000 years ago (Mismatch between fossils and molecular date)
New mutation rate: 90,000–130,000 years ago
While the new molecular clock seems to give more accurate dates than the old clock for the more recent events in human prehistory, such as the split between Homo sapiens and Neandertal man and the migration of our ancestors out of Africa, it goes wildly astray for earlier events in our past. For instance, it places the human-orangutan split at 34-46 million years ago – which is a lot earlier than the date when apes diverged from monkeys, and about the same time as when Old World and New World monkeys diverged. Paleoanthropologists are not pleased. “A human-orangutan split at 40 million years is absolutely crazy,” says David Begun of the University of Toronto.
So how are scientists explaining the awkward dates implied by the new molecular clock? Scally and Durbin, in their report in Nature Reviews Genetics, suggest that the mutation rate was faster early on in primate evolution. Then, they say, it slowed in the African apes. After that, it may have slowed down even more in human evolution. Commenting on Scally and Durbin’s proposal, Harvard University population geneticist David Reich agrees that some slowdown did occur in the great apes. Nevertheless, in a supplement to a recent paper he and his research team published (Nature Genetics 44, 1161-1165 (2012), doi:10.1038/ng.2398), he argues that the scenario proposed by Scally and Durbin is extremely unlikely:
However, this scenario also requires us to hypothesize a combination of unlikely events: (a) the slowdown would need to have been coincidental in both lineages to explain the observations, and (b) the slowdown would also have to have been extraordinarily dramatic: about 3-fold in both lineages in the period ancestral to human-chimpanzee divergence to produce as extreme an effect as is observed. (page 66) (Emphases mine – VJT.)
Reich himself believes that the new molecular clock methods aren’t picking up all the mutations, and that they are therefore yielding an artificially slow mutation rate. Reich, Stefansson, graduate student James Sun of the Massachusetts Institute of Technology, and several other researchers have recently co-authored a study of their own (Nature Genetics 44, 1161-1165 (2012), doi:10.1038/ng.2398) which used a different method, based on microsatellite DNA: small pieces of DNA which vary in the number of times they repeat, and which mutate more often than single nucleotides, making it easier to catch all their new mutations. After converting the microsatellite mutation rates back to a base-pair mutation rate, Reich and his team came up with a figure of 1 in 1.2 billion to 1 in 2.0 billion mutations per site per year, compared with the figure of 1 in 2.4 billion yielded by the new molecular clock studies. The rates proposed by Reich’s team would put the split between humans and chimpanzees at about 3.7 million to 6.6 million years ago, compared with the date of 8 to 10 million years ago indicated by the new molecular clock. If Reich and his team are correct, the human-orangutan split would have occurred between 9.8 and 17.5 million years ago – about 2.65 times as old as the human-chimp split. But Reich’s proposal faces problems of its own: it would imply that Sahelanthropus, and possibly also Orrorin and Ardipithecus, are not on the line leading to human beings, contrary to what anthropologists currently believe.
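A small sanity check on the figures above: under any constant clock, the ratio between two split dates equals the ratio of the corresponding sequence divergences and is independent of the assumed rate. The dates quoted here are indeed consistent with a fixed orangutan-to-chimp ratio of about 2.65:

```python
# Check that scaling Reich et al.'s human-chimp date range by a constant
# 2.65 divergence ratio reproduces the 9.8-17.5 Myr human-orangutan
# range quoted above. The ratio itself is taken from the text.
chimp_split = (3.7, 6.6)   # million years ago (Reich et al. range)
ratio = 2.65               # orangutan divergence / chimp divergence

orang_split = tuple(round(t * ratio, 1) for t in chimp_split)
print(orang_split)  # → (9.8, 17.5)
```

This is why changing the assumed rate slides all the dates up or down together, but cannot fix a mismatch that sits between two fossil calibrations.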
As Gibbons wryly observes in her article: “So no matter how researchers calculate the mutation rate directly, they can’t accommodate all the fossil dates.”
A lot of unknowns remain – in particular the difficulty of estimating generation times in prehistoric populations, as well as the lack of population-level data for prehistoric groups (e.g. Neanderthals or Denisovans). But the increasing richness of molecular data is producing ever more refined estimates of our past. And that is the power of science – nothing is taken as fixed, knowledge changes and increases, in a uniquely progressive way, enabling us to revise and refine our understanding, and even to reject what we previously thought to be true. Indeed, there is grandeur in this view of life.
All I can say is: if this is what Cobb calls good news, what would he consider bad news? I’d invite him to consider again the four-fold uncertainty in the date of the split between Homo sapiens and Neandertal man: anywhere from 170,000 to 700,000 years ago. Or let him consider the near six-fold uncertainty in the date of the human-orangutan split: anywhere from 8 to 46 million years ago. Is this what Cobb calls progress?
And while I’m writing on the subject of orangutans, I’d also like to ask my readers to ponder the following question: why is it that humans share at least 28 unique physical characteristics with orangutans, but only two with chimps and seven with gorillas, despite the fact that we’re genetically closer to chimps and gorillas?
I’d like to close with a quote from Chesterton:
“Merely having an open mind is nothing. The object of opening the mind, as of opening the mouth, is to shut it again on something solid.” (Autobiography. Collected Works Vol. 16, p. 212)