Matt Leifer doesn’t blog all that often, but what he posts is very good. It tends to be extremely high-level stuff about foundational problems in quantum theory, mind you, so it’s not for the faint of heart, but if you’re into that sort of thing, it’s fascinating. Wednesday’s post on decoherence is no exception:
[L]et me start by defining two problems that I take to be at the heart of understanding quantum theory:
1) The Emergence of Classicality: Our most fundamental theories of the world are quantum mechanical, but the world appears classical to us at the everyday level. Explain why we do not find ourselves making mistakes in using classical theories to make predictions about the everyday world of experience. By this I mean not only classical dynamics, but also classical probability theory, information theory, computer science, etc.
2) The ontology problem: The mathematical formalism of quantum theory provides an algorithm for computing the probabilities of outcomes of measurements made in experiments. Explain what things exist in reality and what laws they obey in such a way as to account for the correctness of the predictions of the theory.
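(A quick gloss of my own, not Leifer’s: the “algorithm” in problem 2 is, at bottom, just the Born rule. If the system is in a state described by a density matrix $\rho$ and a measurement has outcomes labeled by projectors $\Pi_i$, the probability of getting outcome $i$ is

$$ p(i) = \mathrm{Tr}(\rho \, \Pi_i), $$

which for a pure state $|\psi\rangle$ reduces to the more familiar $p(i) = |\langle i | \psi \rangle|^2$. That’s the standard textbook statement; the whole dispute is over what, if anything, it describes.)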
If that’s where you’re starting, you know it’s going to get deep…
I’m not going to attempt to explain decoherence this morning, or quite possibly ever. It’s a slippery and abstract topic, and it’s hard to imagine a world in which it would fit into a Basic Concepts sort of category. This post does sort of make me think that a “Basic Concepts” post on “Measurement” might be interesting, though.