Truth revisited
In the first essay in this series I concluded that absolute certainty is impossible. A good scientific hypothesis is just a good approximation of the real world; in other words, the next time we use it, we are likely to get a result it would predict. But this tells us nothing about whether truth itself actually exists. It only tells us what we can know.

Karl Popper claimed that all hypotheses have zero likelihood of being true. But even if that is the case for any given hypothesis, there are infinitely many possible hypotheses. If we ask whether any of them are true, the answer is infinity times zero, which is undefined. Thus there is no way we can argue from evidence either for or against the existence of truth.
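
To see why "infinity times zero" settles nothing, note that a product of this form can come out to anything at all, depending on how the two factors approach their limits (a standard limit illustration of my own, not an argument Popper himself makes):

\[
\lim_{n\to\infty} n\cdot\tfrac{1}{n} = 1, \qquad
\lim_{n\to\infty} n\cdot\tfrac{1}{n^{2}} = 0, \qquad
\lim_{n\to\infty} n^{2}\cdot\tfrac{1}{n} = \infty .
\]

So the arithmetic alone does not fix the answer, which is exactly why the evidence leaves the question open.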

As I argue in my essay on faith and evidence, such a situation allows us to believe what we choose based on other criteria like optimism, or faith. We can take it on faith that the True exists, but then use observation and reason to try to approximate it. Similarly, we can have faith that the Good exists.

Sometimes epistemological realism and empiricism are presented as opposing ideas. My position is that knowledge is empirical, but we can accept on faith that we are also approximating what is Real. However, it may now be possible to argue for that position, based on evidence.

The Copenhagen interpretation of Quantum Mechanics and Empiricism

I think there is an emerging standard interpretation of quantum mechanics today.
1) It accepts that non-locality is real, based on recent experiments.
http://www.upscale.utoronto.ca/GeneralInterest/Harrison/BellsTheorem/BellsTheorem.html
http://www.aip.org/enews/physnews/1998/split/pnu399-1.htm
2) It also accepts that quantum uncertainty is an objective uncertainty, not just a limitation on our ability to know. The probabilities involved are not subjective or epistemological, but a description of a real property of nature.

This emerging standard is the interpretation I agree with. There are other ways QM can be understood. For example, there is the many-worlds hypothesis, which violates Ockham’s razor with a vengeance. It has also been proposed that we could reject the logical law of the excluded middle, so that something could be neither “A” nor “not A”. Another proposal has particles traveling backward in time.

But here I want to focus on two interpretations that have received a lot of attention. First, the Bohm interpretation proposes that the particle always "really" is somewhere, and that the wave function can be thought of as pushing the particle. Bohm showed that older, non-relativistic QM could be formulated this way, but the approach has not been successful with newer quantum field theory. My intuition is that it cannot succeed there. Bohm may have extended our ability to view some phenomena in a semi-classical way, but eventually this kind of thinking breaks down.

Bohm’s interpretation is sometimes presented as allowing a deterministic universe, but I would say this is not the case. Bohm’s interpretation involves non-locality: in order to describe the future evolution of the wave function, we may need information that does not lie in the strict past. And since the particle is guided by the wave, even if the particle “really” is somewhere, its future movements cannot be predicted without information that does not exist in the past. If the past is not enough to predict the future, that is not determinism, at least in my view.

So, to finish my comments on the Bohm interpretation: while non-locality is becoming more accepted, I don’t think Bohm’s interpretation can be fully adopted, and in addition it is not really deterministic.

Next I’d like to address the original Copenhagen interpretation. The claim here is that there is no objective reality independent of the observer. It is fully empirical and makes no distinction between subjective uncertainty and quantum uncertainty. This leads to paradoxes like Schrödinger's cat, which could be in a superposition of “alive” and “dead” until we open the box to look.

Of course, if one starts with an empirical philosophy, one will end with an empirical philosophy. But more than that, some have claimed that QM demonstrated that pure empiricism was the correct view. That argument was weakened once non-locality was experimentally demonstrated. Before that, pure empiricists could claim that those who wished to preserve objective reality had to resort to undemonstrated violations of the principles of special relativity.

However, I think a new set of experiments has done even more damage to the pure empiricist argument. Experiments on quantum decoherence have, since the late 1990s, probed the boundary between the quantum world and the classical world.
http://www.aip.org/pnu/1996/split/pnu297-3.htm
In one experiment, an atom in a superposed state is placed in a metal sphere. The atom bounces inside the sphere, and at some time t later its state of superposition can be observed. It is found that with each interaction some of the quantum effects disappear. In other words, to the question "What counts as an observation?" the answer is "Any interaction is at least a partial observation."
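
To make the idea of a "partial observation" concrete, here is a toy numerical sketch of my own (not the actual experiment; the damping factor and the number of bounces are invented for illustration). The atom's superposition is represented by a 2x2 density matrix; each interaction with the sphere shrinks the off-diagonal "quantum" terms while the 50/50 classical probabilities on the diagonal stay fixed.

# Toy decoherence model: each interaction removes part of the quantum
# coherence but leaves the 50/50 classical probabilities untouched.
# The damping factor 0.8 is an arbitrary assumption for illustration.
import numpy as np

rho = np.array([[0.5, 0.5],
                [0.5, 0.5]])      # pure 50/50 superposition, full coherence

DAMPING = 0.8                     # fraction of coherence surviving each bounce

for bounce in range(1, 11):
    rho[0, 1] *= DAMPING          # off-diagonal terms carry the quantum effects
    rho[1, 0] *= DAMPING
    coherence = rho[0, 1] / 0.5   # 1.0 = fully quantum, 0.0 = classical mixture
    print(f"after bounce {bounce:2d}: remaining quantum coherence = {coherence:.3f}")

After enough bounces the surviving coherence is negligible: the sphere has, in effect, already "observed" the atom even though no experimenter has looked.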

So, consider the following: suppose we send a single photon through a half-silvered mirror. QM says the “particle” really goes both directions; that is, it is 50% reflected “R” and 50% transmitted “T”. But some interaction with the mirror was involved, so we have to modify the description. I’ll exaggerate the effect here, just to demonstrate the point. We have to introduce a distinction between objective, quantum uncertainty and subjective, epistemological uncertainty. Since some small bit of the quantum uncertainty is eliminated by the mirror, we can say that there is now a subjective probability of 50% that the photon is 51%R/49%T, and a 50% subjective probability that it is 49%R/51%T.

This may seem like a strange distinction, but if the photon gets far enough away from a 50%R/50%T state, we will be able to detect that in the interference pattern we observe when we recombine the two beams. This can also explain what happens to Schrödinger's cat. When the atom decays there is a 50/50 superposition of states, but very quickly there are enough interactions with the cat to eliminate virtually all quantum uncertainty. We end up with a 50% subjective probability that the cat is 0% alive/100% dead, and a 50% subjective probability that it is 100% alive/0% dead.
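
Here is a rough numerical sketch of why the distinction shows up in practice (my own illustration, treating the recombination as a second ideal half-silvered mirror; the bias values are invented, just as the 51/49 figures above were exaggerated). A pure 50/50 superposition sends every photon to one output port, while a subjective mixture of slightly biased states falls just short of that, and the shortfall grows with the bias.

# Sketch: compare a pure 50/50 superposition with a subjective mixture of
# biased states by recombining the two beams at a second half-silvered
# mirror.  All bias values are invented for illustration.
import numpy as np

def bright_port_probability(p_reflected):
    """Probability of finding the photon at the 'bright' output port when
    the input is the pure superposition sqrt(p)|R> + sqrt(1-p)|T>."""
    a_r, a_t = np.sqrt(p_reflected), np.sqrt(1.0 - p_reflected)
    return abs((a_r + a_t) / np.sqrt(2)) ** 2

# Pure, fully quantum 50/50 state: perfect interference.
print(f"pure 50/50 state:        {bright_port_probability(0.50):.4f}")

# The mixture from the text: a 50% subjective chance of a 51R/49T state
# and a 50% subjective chance of a 49R/51T state.
mix_small = 0.5 * bright_port_probability(0.51) + 0.5 * bright_port_probability(0.49)
print(f"51/49 vs 49/51 mixture:  {mix_small:.4f}")

# A more strongly biased mixture drifts further from perfect interference,
# which is what we could detect in the recombined pattern.
mix_large = 0.5 * bright_port_probability(0.90) + 0.5 * bright_port_probability(0.10)
print(f"90/10 vs 10/90 mixture:  {mix_large:.4f}")

The 51/49 mixture is nearly indistinguishable from the pure state, but as the subjective split drifts further from 50/50 the recombined pattern increasingly gives it away, which is all the argument here needs.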

What these new experiments show is that the distinction between objective and subjective uncertainty is a useful one in describing experimental observations. Pure empiricists will agree that we have useful concepts for describing reality. The idea of an electron or an atom may be useful, but that does not mean it is reality. They could say the same here: our distinction between objective and subjective uncertainty may be useful, but that does not mean it is real. But this does strike me as an odd argument. If our hypothesis "x" is "objective reality does not exist, and we only have our observations", and we then find that the hypothesis "not x", "objective reality does exist apart from our observations", is useful in explaining our observations, we seem to have something of a contradiction.

The argument for empiricism has been that since we only have our senses, there is no way we can provide evidence for objective reality apart from our senses. But these experiments, by showing that an objective/subjective distinction is useful, may provide that evidence. There is an objective unknown. If "what we don’t know" and "what is unknowable" are distinct, then it follows logically that "what we know" and "what is knowable" are distinct too.

Comments?
