Oh Mr. Man of Faith, I'm still waiting for some refutation of my earlier points. I'm also waiting for an explanation for how light can travel billions of light-years in only 6,000 years...
While I'm waiting, I feel that I will bring up another issue: radiometric dating techniques. A lot of people have heard of them and understand some of the basics (measuring the ratio of a radioisotope to its decay products), but they do not understand the specifics. Because of this lack of in-depth understanding, a few common arguments get made against them. Three such arguments are as follows:
(1) We cannot know how much of the decay product was present when the rock first formed.
(2) Rocks are not closed systems.
(3) We cannot know that the decay rate has been constant.
Now let me refute each of these points using my favorite dating technique: uranium-lead dating of zircon crystals. Uranium-238 decays through a long chain to stable lead-206 (and uranium-235 to lead-207), each at a precisely measured rate, so measuring the ratio of radiogenic lead to remaining uranium tells you how long the uranium has been decaying. The arithmetic is shown in the sketch below; then I'll explain why zircon in particular makes it work.
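If it helps, here is a minimal Python sketch of that calculation. The decay constants are the standard published values; the Pb/U ratio in the example is made up for illustration:

```python
import math

# Decay constants (per year) for the two uranium-lead decay chains.
LAMBDA_U238 = 1.55125e-10  # U-238 -> Pb-206, half-life ~4.47 billion years
LAMBDA_U235 = 9.8485e-10   # U-235 -> Pb-207, half-life ~0.70 billion years

def u_pb_age(pb_u_ratio: float, decay_constant: float) -> float:
    """Age in years from a measured radiogenic Pb/U atom ratio.

    Exponential decay of the parent gives Pb/U = e^(lambda*t) - 1,
    which we solve for t."""
    return math.log(1.0 + pb_u_ratio) / decay_constant

# Example: a zircon whose Pb-206/U-238 atom ratio is 0.5
print(u_pb_age(0.5, LAMBDA_U238) / 1e9)  # ~2.61 billion years
```

With the arithmetic out of the way, here is how the mineral side of it works: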
Zircon is another name for zirconium silicate, which has the formula ZrSiO4. As the mineral crystallizes from molten rock, trace elements can substitute for zirconium and get locked into the crystal lattice. Uranium is known to do this, taking zirconium's place at certain sites. What is interesting about zircon is that it will not accept lead into its lattice during formation: the lead ion has the wrong charge and is too large for the zirconium site, so it is strongly excluded, like oil refusing to mix with water. This is how we know there was essentially no lead present when the zircon crystal first formed. Point one has been addressed.
Zircon is durable both chemically and physically, which allows it to survive long stretches of time undisturbed. Does this mean cracks never form that might allow lead to leak out? Of course not. But scientists are aware of this, which is why many separate zircon grains from a rock layer are used to establish a date for that layer. If some outside event has disturbed the layer, some grains will be affected more than others: some will be badly fractured, others will have only tiny cracks, and still others will be unharmed. The ones with obvious damage are thrown out and never measured in the first place. By plotting the measurements from all the good-quality grains together (forming what are called concordia and discordia lines), we can extrapolate the age of the layer. Grains that have lost lead look too young, so the grains that have lost the least lead give the oldest apparent ages and sit closest to the true age; the upper intercept of the discordia line with the concordia curve recovers the original crystallization age (see the sketch below).
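For the curious, here is a toy Python simulation of that extrapolation. The age, the lead-loss fractions, and the simple line fit are all illustrative; real labs use proper error-weighted regression:

```python
import numpy as np

L238, L235 = 1.55125e-10, 9.8485e-10  # decay constants, per year

def concordia(t):
    """Concordia curve point for age t (years):
    x = Pb-207/U-235, y = Pb-206/U-238, each equal to e^(lambda*t) - 1."""
    return np.expm1(L235 * t), np.expm1(L238 * t)

true_age = 3.0e9                       # pretend crystallization age
x0, y0 = concordia(true_age)

# Recent lead loss removes the same fraction of both lead isotopes,
# so disturbed grains slide from the concordia point toward the origin
# along a straight chord -- the discordia line.
loss = np.array([0.1, 0.3, 0.5, 0.7])  # fraction of lead lost per grain
xs, ys = x0 * (1 - loss), y0 * (1 - loss)

# Fit the discordia through the discordant analyses...
slope, intercept = np.polyfit(xs, ys, 1)

# ...and find its upper intercept with the concordia curve by bisection.
def miss(t):
    x, y = concordia(t)
    return y - (slope * x + intercept)

lo, hi = 1.0e9, 4.5e9                  # bracket the intercept
for _ in range(100):
    mid = 0.5 * (lo + hi)
    if miss(lo) * miss(mid) <= 0:
        hi = mid
    else:
        lo = mid

print(f"recovered age: {lo / 1e9:.2f} billion years")  # ~3.00
```

Even though every simulated grain lost lead and reads too young on its own, the chord they fall along points straight back to the true crystallization age.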
Now imagine that some lead had leaked out of all of the grains studied in a way we could not detect. The ages we measure for that layer would then be underestimates. That is a mistake in the exact opposite direction from what creationists would want, as it would mean that little 4.2 billion year-old rock we measured over there is actually even older than we think it is.
Now consider what might occur if lead leached into cracks in the rocks through groundwater. Would we be able to tell? Yes. When the various isotopes of uranium decay, they can only produce particular lead isotopes: uranium-238 ends at lead-206 and uranium-235 at lead-207. One isotope that does not result from uranium decay (or from the decay of any other naturally occurring isotope) is lead-204, which makes up roughly 1.4% of ordinary lead. So if any lead-204 is found inside a zircon crystal, we know that grain has been contaminated and can throw it out. Point two has been addressed.
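A contamination screen of this kind is conceptually trivial. Here is a hypothetical Python sketch; the function name, count rates, and threshold are all made up, since real screening compares Pb-204 signals against procedural blanks with proper statistics:

```python
def contaminated(pb204_signal: float, blank_signal: float,
                 threshold: float = 3.0) -> bool:
    """Flag an analysis whose Pb-204 signal rises well above the blank.

    No natural decay chain produces Pb-204, so any excess must be
    ordinary ('common') lead that got in from outside."""
    return pb204_signal > threshold * blank_signal

# Hypothetical count rates from three zircon grains vs. a blank of 10:
grains = {"grain_1": 12.0, "grain_2": 450.0, "grain_3": 9.5}
for name, signal in grains.items():
    verdict = "reject" if contaminated(signal, 10.0) else "keep"
    print(name, verdict)  # only grain_2 gets rejected
```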
Now we have point three: the idea that decay rates can change over time. We do know that nuclear reactions can be driven faster under certain conditions. One example is sustained nuclear fission, which is, in essence, nuclei being split at an enormously accelerated rate. However, a fission chain reaction requires a very specific set of circumstances (not exactly common in nature), and not every isotope of uranium will sustain one. The ratio of uranium-238 to uranium-235 tells us whether fission has occurred: U-238 is not fissile but U-235 is, so fission depletes U-235 relative to U-238. Indeed, a natural fission reactor really did operate roughly two billion years ago at Oklo, Gabon, and we know precisely because the uranium there is depleted in U-235.
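The check itself is simple arithmetic. In this illustrative Python sketch, the natural U-235 fraction is the real present-day value (about 0.72%), while the sample atom counts and the 5% depletion cutoff are made up:

```python
NATURAL_U235 = 0.0072  # present-day atom fraction of U-235 in natural uranium

def u235_fraction(u235_atoms: float, u238_atoms: float) -> float:
    return u235_atoms / (u235_atoms + u238_atoms)

# Ordinary uranium ore vs. an Oklo-like sample (atom counts illustrative):
samples = {
    "ordinary_ore": u235_fraction(720, 99_280),  # ~0.72%
    "oklo_like":    u235_fraction(600, 99_400),  # ~0.60%
}
for name, frac in samples.items():
    fission_suspected = frac < 0.95 * NATURAL_U235  # 5% depletion cutoff
    print(f"{name}: U-235 = {frac:.4%}, fission suspected: {fission_suspected}")
```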
The ordinary factors that affect rocks, such as heat, pressure, weathering, and humidity, have no measurable effect on nuclear decay rates. I have seen one suggestion that background neutrino radiation could change the decay rates of radioisotopes, but there are problems with that idea. Firstly, neutrino interactions with matter are extraordinarily rare; a neutrino could pass through light-years of lead without being absorbed. Secondly, even if background neutrinos did affect decay rates, our measured decay rates would automatically include that effect, since neutrinos are zipping through the Earth (and through every laboratory sample) every moment of every day, so tested samples would decay exactly as fast as their counterparts locked in the rocks. If one were to suggest that the neutrino background was significantly higher in the past than it is now, the burden would be on that individual to provide evidence for it. As of now, we have no reason to believe it has changed enough to matter.
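To put "enough to matter" into numbers, here is a back-of-the-envelope Python calculation (the 4.2-billion-year figure is just the example rock from earlier) showing how much faster uranium decay would have to have run to squeeze that much lead production into 6,000 years:

```python
LAMBDA_U238 = 1.55125e-10   # per year, as measured today
apparent_age = 4.2e9        # years, from the Pb/U ratio
available_time = 6.0e3      # years, on a young-Earth timeline

# The measured Pb/U ratio pins down the product lambda * t, so
# shrinking t by some factor requires inflating lambda by the same factor.
speedup = apparent_age / available_time
print(f"decay would need to run {speedup:,.0f}x faster")  # 700,000x
```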
So there you have it. That's why the rocks we have dated simply cannot be only 6,000 years old (unless you want to argue that God made the rocks look old when they are in fact young, which would not only present an "author of confusion" problem but would also raise the question of why He would want us to believe in a young Earth if He went to the trouble of making it look old).