Although carbon-14 production is greater at high geomagnetic latitudes, the carbon-14 spreads evenly throughout the atmosphere and reacts with oxygen to form radioactive carbon dioxide.
This carbon dioxide also dissolves in the oceans and is taken up by plants via photosynthesis.
The amount of carbon-14 gradually decreases through radioactive beta decay with a half-life of 5,730 years.
Scientists can therefore estimate the age of organic remains by measuring how much of the carbon-14 has decayed.
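The age estimate follows directly from the exponential decay law with the 5,730-year half-life given above. A minimal sketch (the function name and interface are illustrative, not from the source):

```python
import math

HALF_LIFE = 5730.0  # carbon-14 half-life in years

def radiocarbon_age(remaining_fraction):
    """Estimate age in years from the fraction of carbon-14 remaining.

    Uses the decay law N/N0 = (1/2) ** (t / HALF_LIFE), solved for t.
    """
    if not 0 < remaining_fraction <= 1:
        raise ValueError("fraction must be in (0, 1]")
    return -HALF_LIFE * math.log2(remaining_fraction)

# A sample retaining half its original carbon-14 is one half-life old:
print(radiocarbon_age(0.5))          # 5730.0
print(round(radiocarbon_age(0.25)))  # 11460 (two half-lives)
```

Because the remaining fraction shrinks geometrically, the method becomes unreliable beyond roughly ten half-lives, when too little carbon-14 survives to measure.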
The method was developed by Willard Libby in the late 1940s and soon became a standard tool for archaeologists.
Radiocarbon dating has allowed key transitions in prehistory to be dated, such as the end of the last ice age and the beginning of the Neolithic and Bronze Age in different regions. Research has been ongoing since the 1960s to determine what the proportion of carbon-14 in the atmosphere has been over the past fifty thousand years. The resulting data, in the form of a calibration curve, is now used to convert a given measurement of radiocarbon in a sample into an estimate of the sample's calendar age.

Libby and James Arnold tested the radiocarbon dating theory by analyzing samples with known ages. For example, two samples taken from the tombs of two Egyptian kings, Zoser and Sneferu, independently dated to 2625 BC plus or minus 75 years, were dated by radiocarbon measurement to an average of 2800 BC plus or minus 250 years.
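In essence, applying a calibration curve means looking up a measured radiocarbon age against tabulated (radiocarbon age, calendar age) pairs. The sketch below uses a tiny, invented table and simple linear interpolation; real curves such as IntCal are far denser, non-linear, and carry uncertainty ranges:

```python
# Hypothetical calibration points: (radiocarbon age, calendar age),
# both in years before present. These values are illustrative only.
CALIBRATION = [(1000, 930), (2000, 1950), (3000, 3210), (4000, 4480)]

def calibrate(radiocarbon_age):
    """Linearly interpolate a calendar age from a sparse calibration table."""
    pts = sorted(CALIBRATION)
    if not pts[0][0] <= radiocarbon_age <= pts[-1][0]:
        raise ValueError("age outside calibration range")
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x0 <= radiocarbon_age <= x1:
            t = (radiocarbon_age - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)

print(calibrate(2500))  # midway between 1950 and 3210 -> 2580.0
```

The lookup is needed because the atmospheric carbon-14 proportion has varied over time, so a raw radiocarbon age rarely equals the true calendar age.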
One of the most frequent uses of radiocarbon dating is to estimate the age of organic remains from archaeological sites.