A team of researchers led by geoscience professor John Valley of the University of Wisconsin originally determined the age of the crystals by looking at a small sample and measuring how much of the element uranium had decayed into lead. (This decay happens at a slow, well-known rate, so it can be used as a kind of geological clock.) Some scientists, however, suggested a potential issue with this dating method, pointing out that lead atoms might move around in the crystal over time, which could have led Valley and his colleagues to read falsely old ages in the places where the lead was concentrated.
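The uranium-lead "clock" rests on simple decay arithmetic: from the measured ratio of daughter lead to remaining parent uranium, the elapsed time follows from the known half-life. This is a minimal, hypothetical sketch of that calculation, assuming a closed system in which all the measured lead-206 came from uranium-238 decay (the function name and the example ratio are illustrative, not from the study):

```python
import math

# Assumed closed system: every Pb-206 atom measured came from U-238 decay.
U238_HALF_LIFE_YR = 4.468e9               # half-life of uranium-238, in years
LAMBDA = math.log(2) / U238_HALF_LIFE_YR  # decay constant (per year)

def u_pb_age(pb206_per_u238: float) -> float:
    """Age in years implied by the measured Pb-206 / U-238 atom ratio."""
    return math.log(1.0 + pb206_per_u238) / LAMBDA

# A Pb/U ratio near 0.98 corresponds to an age of roughly 4.4 billion years.
print(round(u_pb_age(0.98) / 1e9, 2))  # → 4.4
```

The worry the critics raised maps directly onto this formula: if lead migrates and pools locally, the ratio measured at that spot is inflated, and the computed age comes out too old.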
To address those concerns, Valley and his team recently verified their conclusions using a second, more sophisticated dating technique known as atom-probe tomography, which allowed them to locate and identify individual lead atoms in the crystal. Using this method, they determined that lead atoms did move around within the crystal, but not enough to affect the age calculation. Their findings, published this weekend in the journal Nature Geoscience, confirmed that the crystals were in fact formed some 4.4 billion years ago, only 100 million years after Earth itself formed in a molten ball of rock.
Measuring only 200 by 400 microns (about twice the diameter of a human hair), the crystals might not look like much to the naked eye, but their advanced age strongly suggests that Earth formed a continental crust much earlier than scientists previously believed. If so, and if temperatures were low enough once the crust formed, the planet might have been able to sustain liquid water at its surface, and perhaps even life, far earlier than previously thought. As Valley told Reuters: “We have no evidence that life existed then. We have no evidence that it didn’t. But there is no reason why life could not have existed on Earth 4.3 billion years ago.”