Radiometric dating proved wrong
The question of the ages of the Earth and its rock formations and features has fascinated philosophers, theologians, and scientists for centuries, primarily because the answers put our lives in temporal perspective.
Until the 18th century, this question was principally in the hands of theologians, who based their calculations on biblical chronology.
Comparing ancient rocks with the products of present-day erosion, sedimentation, and earth movements, the earliest geologists soon concluded that the time required to form and sculpt the present Earth was immeasurably longer than had previously been thought.
Lord Kelvin and Clarence King calculated the length of time required for the Earth to cool from a white-hot liquid state; they eventually settled on 24 million years.
John Joly calculated that the Earth's age was 89 million years on the basis of the time required for salt to accumulate in the oceans.
The discovery of radioactivity in 1896 by Henri Becquerel, the isolation of radium by Marie Curie shortly thereafter, the discovery of the radioactive decay laws in 1902 by Ernest Rutherford and Frederick Soddy, the discovery of isotopes in 1910 by Soddy, and the development of the quantitative mass spectrograph in 1914 by J. J. Thomson all formed the foundation of modern isotopic dating methods.
But it was not until the late 1950s that all the pieces were in place: by then the phenomenon of radioactivity was understood, most of the naturally occurring isotopes had been identified and their abundances determined, instrumentation of the necessary sensitivity had been developed, isotopic tracers were available in the required quantities and purity, and the half-lives of the long-lived radioactive isotopes were reasonably well known.
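The reasoning behind Joly's salt clock is simple division: if the ocean started fresh and rivers add salt at a constant rate, its age is the total dissolved sodium divided by the annual input. The sketch below uses illustrative round numbers chosen only to reproduce an estimate of Joly's order of magnitude; they are assumptions, not Joly's own figures.

```python
# Back-of-the-envelope version of Joly's "salt clock": divide the ocean's
# total dissolved sodium by the annual sodium delivered to it by rivers.
# Both constants below are illustrative assumptions, not Joly's values.

OCEAN_SODIUM_TONNES = 1.4e16           # assumed sodium dissolved in the oceans
RIVER_INPUT_TONNES_PER_YEAR = 1.57e8   # assumed annual river input of sodium

def salt_clock_age(total_sodium: float, annual_input: float) -> float:
    """Age estimate in years, assuming an initially salt-free ocean, a
    constant input rate, and no removal of sodium -- assumptions now
    known to be false, which is why the method underestimates badly."""
    return total_sodium / annual_input

age_years = salt_clock_age(OCEAN_SODIUM_TONNES, RIVER_INPUT_TONNES_PER_YEAR)
print(f"{age_years / 1e6:.0f} million years")  # roughly 89 on these inputs
```

The calculation fails not in its arithmetic but in its premises: sodium is continuously removed from seawater, so the ocean's salt content reflects a steady state, not an accumulating total.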
My purpose here is not to review and discuss all of the dating methods in use.