have a read. http://www.christiananswers.net/q-aig/aig-c007.html. radioactive dating is proven to be inaccurate.
Your source for scientific fact is on a website specifically designed for Christians who might be losing their faith?
Well, let's look at the differences between a Christian source and an unbiased source (for educational purposes only; I don't want to start anything here).
In http://www.christiananswers.net/q-aig/aig-c007.html, the first half is dedicated to showing how carbon dating works and how it is inaccurate. Why? Because it is the least accurate of all the scientific dating methods: with a half-life of a little less than 6,000 years, it can only measure relatively recent things, and changes in atmospheric CO2 (and therefore in carbon-14 levels) could have affected it as well. Strange that they didn't go into as much depth on the other methods, which are now more commonly used?
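To put a number on that "recent things" limit, here's a quick back-of-the-envelope sketch in Python (the 5,730-year half-life is the commonly cited figure, not something from the article). After roughly ten half-lives there's almost no carbon-14 left to measure, which is why the method tops out around 50,000-60,000 years:

    import math

    # Commonly cited half-life of carbon-14 in years (assumed, not from the article)
    C14_HALF_LIFE = 5730.0

    def remaining_fraction(age_years, half_life=C14_HALF_LIFE):
        """Fraction of the original carbon-14 still present after age_years."""
        return 0.5 ** (age_years / half_life)

    for age in (5730, 20000, 60000):
        print(f"{age:>6} years: {remaining_fraction(age):.6f} of the C-14 remains")

At 60,000 years, well under a tenth of a percent of the original carbon-14 is left, so even tiny contamination swamps the signal.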
Proceeding... They get into the other methods and their inaccuracies... Who really wants to spend half their life learning about radioactive dating and carbon/strontium mumbo jumbo? That's alright... Christian Answers saves you some time and eases your pain...
The isotope concentrations can be measured very accurately, but isotope concentrations are not dates. To derive ages from such measurements, unprovable assumptions have to be made such as:
1. The starting conditions are known (for example, that there was no daughter isotope present at the start, or that we know how much was there).
2. Decay rates have always been constant.
3. Systems were closed or isolated so that no parent or daughter isotopes were lost or added.
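Just to make the jargon concrete: what gets measured is a daughter-to-parent isotope ratio D/P, and the date comes out of the textbook decay law, t = ln(1 + D/P) / λ, where λ = ln(2) / half-life. Here's a minimal sketch showing exactly where each of those three assumptions enters (the ratio and half-life below are made-up example numbers, roughly in potassium-argon territory):

    import math

    def radiometric_age(daughter_parent_ratio, half_life_years):
        """Age implied by a measured daughter/parent ratio, given:
        1. no daughter isotope at the start (or a known starting amount),
        2. a constant decay rate,
        3. a closed system (nothing added or lost)."""
        decay_constant = math.log(2) / half_life_years
        return math.log(1.0 + daughter_parent_ratio) / decay_constant

    # Hypothetical measurement: ratio 0.1 with a 1.25-billion-year half-life
    # (roughly the potassium-40 -> argon-40 system)
    print(f"{radiometric_age(0.1, 1.25e9):.3e} years")  # ~1.72e8

So yes, the assumptions are real; the question is whether they actually hold for the methods in use.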
Or you can read here:
http://en.wikipedia.org/wiki/Radiometric_dating (a reliable encyclopedia, for those of you who aren't aware)
Let's start with number 2...
You're right, decay rates have not always been constant (but for what?). Our favorite buddy carbon!! Except with carbon, the thing that actually varies is the atmosphere's carbon-14 concentration, which messes with the starting conditions, not the decay rate itself. And according to Wikipedia:
The decay rate is not always constant for electron capture, as occurs in nuclides such as 7Be, 85Sr, and 89Zr. For this type of decay, the decay rate may be affected by local electron density. These isotopes are not used, however, for radiometric dating.
http://en.wikipedia.org/wiki/Half_life ... Nobody is simply assuming decay rates are constant. If an atom's decay rate has been measured and shown to be unaffected by anything in the natural universe, then its decay rate is constant, and as the quote above admits, we can measure the concentrations very accurately. Of course, statement number 2 is technically valid because it refers to some undisclosed set of inaccurate methods. So it is true, but not for everything, and not for the isotopes actually used in dating.
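"Constant decay rate" isn't hand-waving, either; it just means every atom of a given nuclide has the same fixed chance of decaying per unit time, which is exactly what produces the exponential decay curve labs observe. A toy simulation (all the numbers here are arbitrary) makes the point:

    import numpy as np

    # Every atom gets the same fixed decay probability per step; the
    # surviving count then follows the exponential law, which is the
    # testable signature of a constant decay rate.
    rng = np.random.default_rng(0)
    p = 0.001        # per-step decay probability (arbitrary toy value)
    n = 100_000      # starting atom count
    for step in range(0, 3001, 1000):
        predicted = 100_000 * (1 - p) ** step
        print(f"step {step:>4}: simulated {n:>6}, predicted {predicted:>8.0f}")
        for _ in range(1000):            # advance the simulation 1,000 steps
            n = rng.binomial(n, 1 - p)

If the rate were drifting, the simulated and predicted columns would diverge; for the nuclides used in dating, measurements keep agreeing with the constant-rate curve.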
Now for number 1.
Rubidium-strontium dating is based on the beta decay of rubidium-87 to strontium-87, with a half-life of 50 billion years. This scheme is used to date old igneous and metamorphic rocks, and has also been used to date lunar samples.
50 billion years? If that really is the half-life, then even over Earth's entire history less than a tenth of a half-life has elapsed: at 4.5 billion years, only about 6 percent of the original rubidium-87 has decayed. The concentrations can still be measured accurately, and the starting amount doesn't have to be guessed at all; see the sketch below. (Otherwise, why would we use something with a 50-billion-year half-life when the current estimate is that Earth is about 4.5 billion years old?)
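That's in fact the standard answer to assumption number 1 for rubidium-strontium: the isochron method. You plot Sr-87/Sr-86 against Rb-87/Sr-86 for several minerals from the same rock; the slope of the line gives the age and the intercept gives the initial strontium ratio, so nobody has to guess the starting conditions. A toy sketch with made-up mineral ratios, using the quoted 50-billion-year half-life:

    import math
    import numpy as np

    LAMBDA = math.log(2) / 50e9   # Rb-87 decay constant from the quoted half-life
    TRUE_AGE = 4.0e9              # pretend age of the rock (made-up test value)

    # Minerals from one rock share the same initial Sr-87/Sr-86 but have
    # different Rb/Sr ratios -- that spread is what makes the method work.
    initial_sr87_sr86 = 0.700                        # "unknown" in real life
    rb87_sr86 = np.array([0.5, 1.0, 2.0, 4.0, 8.0])  # made-up measurements

    # Decayed Rb-87 turns into Sr-87.
    growth = math.exp(LAMBDA * TRUE_AGE) - 1.0
    sr87_sr86 = initial_sr87_sr86 + rb87_sr86 * growth

    # Fit a straight line through the points: the isochron.
    slope, intercept = np.polyfit(rb87_sr86, sr87_sr86, 1)
    recovered_age = math.log(1.0 + slope) / LAMBDA

    print(f"recovered age:           {recovered_age:.3e} years")  # ~4.0e9
    print(f"recovered initial ratio: {intercept:.3f}")            # ~0.700

The fit recovers both the age and the 0.700 starting ratio we pretended not to know. A slow half-life just means the slope is small, not unmeasurable.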
And number 3... an atom decays on its own, and that decay is what is measured. Number 3 is a real assumption, but it's about the geology of the sample (whether parent or daughter isotopes leaked in or out), not about the decay of the radioactive isotope itself.
Sorry this is sloppy, I was in a rush.....
Here is a source biased toward the science, if you want to compare:
http://www.gate.net/~rwms/AgeEarth.html