I understand how radioactive dating works, but something about it concerns me. If we have a rock and assume that it was 100% carbon-14 at formation, and we now measure it to be 25% carbon-14 and 75% nitrogen-14 (I know nitrogen is a gas, but bear with me), then we can calculate that the rock has been around long enough to pass through two half-lives (2 × 5,730 years = 11,460 years). If, in fact, the rock was 50% carbon-14 and 50% nitrogen-14 at its formation, then it would actually be only 5,730 years old (half the originally calculated age). This measurement seems to hinge on the assumption that the rock was originally 100% carbon-14.

One of the most widely used and well-known absolute dating techniques is carbon-14 (or radiocarbon) dating, which is used to date organic remains. How much carbon-14 is left can then be compared to how much carbon-14 would have been in the environment when the thing was living and absorbing carbon from its surroundings.
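The arithmetic in the question can be sketched in a few lines. This is a minimal illustration, not a real dating procedure; the function name is mine, and it assumes (as the question does) that the sample started at 100% of its initial carbon-14:

```python
import math

HALF_LIFE_C14 = 5730  # half-life of carbon-14, in years

def age_from_fraction(remaining_fraction):
    """Age implied by the fraction of the original carbon-14 still present,
    assuming the sample began with 100% of its carbon-14 intact."""
    # Number of half-lives elapsed: fraction = (1/2)^n, so n = log2(1/fraction)
    half_lives = math.log(1 / remaining_fraction, 2)
    return half_lives * HALF_LIFE_C14

print(age_from_fraction(0.25))  # 25% left -> 2 half-lives -> 11460.0 years
print(age_from_fraction(0.50))  # 50% left -> 1 half-life -> 5730.0 years
```

This makes the question's point concrete: the computed age depends entirely on what fraction you assume was there at the start.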
Carbon-14 moves up the food chain as animals eat plants and as predators eat other animals. This unstable isotope starts to break down into nitrogen-14.
It takes 5,730 years for half the carbon-14 to change to nitrogen; this is the half-life of carbon-14.
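The half-life relation above can be written as a simple decay formula, N(t) = N₀ · (1/2)^(t / half-life). A small sketch (function name is mine):

```python
def remaining_fraction(years, half_life=5730):
    """Fraction of the original carbon-14 remaining after `years`,
    given carbon-14's half-life of 5,730 years."""
    return 0.5 ** (years / half_life)

print(remaining_fraction(5730))   # one half-life  -> 0.5
print(remaining_fraction(11460))  # two half-lives -> 0.25
```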
In the case of living organisms, there is a constant ratio of carbon-14 to carbon-12 in the environment, and this is because carbon-14 is constantly created anew by cosmic radiation impacting on nitrogen-14 (I can imagine there might be variations to take into account, as historic cosmic radiation levels would vary).
Once a living thing dies, it cannot take up any fresh carbon-14, so the carbon-14 that is contained within its body will decay, and will not be renewed.
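Putting the last two points together: the age of a once-living sample comes from comparing its measured C-14/C-12 ratio with the (assumed constant) environmental ratio it had while alive. A hedged sketch, with names of my own choosing:

```python
import math

HALF_LIFE_C14 = 5730  # years

def radiocarbon_age(sample_ratio, living_ratio):
    """Estimated years since death, from the sample's C-14/C-12 ratio
    compared to the ratio a living organism would have (assumed constant).
    This ignores real-world corrections such as cosmic-ray variation."""
    half_lives = math.log(living_ratio / sample_ratio, 2)
    return half_lives * HALF_LIFE_C14

# A sample with half the living ratio has been dead for one half-life.
print(radiocarbon_age(0.5, 1.0))  # 5730.0
```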
Firstly, we don't use carbon dating for measuring the age of rocks; carbon dating would only be used for the remains of living matter (e.g. bone, wood, or other organic material). Other radioactive isotopes might be used for various other objects.