The Paper of Record, unafraid to tackle the really important questions, today addresses the perennial favorite: Is it dangerous to stand near a microwave oven?
You’ll be happy to know that the answer is still “No.” I would’ve preferred “No, you dolt,” but you take what you can get:
Although microwave ovens can in fact leak radiation, the levels that might be released are fairly minute.
According to the Center for Devices and Radiological Health, a unit of the Food and Drug Administration that regulates microwave oven safety, every microwave that reaches the market must meet a requirement limiting the amount of radiation it can leak in its lifetime to five milliwatts per square centimeter at roughly two inches away from the oven. According to the center, that is far below the levels of radiation that have been shown to harm humans.
My only real objection to this is that it gives the false impression that the “radiation” from microwaves is the same as the “radiation” from atomic bombs and the like. I mean, technically, they are the same, in that both microwaves and gamma rays are forms of electromagnetic radiation, but then, so is visible light, and you never hear people asking whether they’re getting “irradiated” by standing too close to a light bulb.
The “radiation” that people worry about causing cancer and the like is generally gamma radiation, with photon energies on the order of an MeV, or 1,000,000 electron volts (the electron volt being a small but convenient unit of energy). These photons have wavelengths of about a thousandth of a nanometer, which is smaller than the size of an atom. They’re a significant health hazard because they can pass through large quantities of ordinary matter without stopping, and when they do interact with atoms and molecules, they tend to knock electrons loose from whatever they hit, which triggers all sorts of unpleasant chemistry. When gamma rays ionize atoms and molecules inside cells, they can do a great deal of damage, sometimes killing the cell outright, sometimes causing mutations that can turn the cells cancerous.
The “radiation” in a microwave, on the other hand, is, well, microwave radiation. Microwave photons have an energy of around 10 μeV, or 0.00001 electron volts. For those playing at home, that’s eleven orders of magnitude less than the energy of a gamma-ray photon.
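If you want to check those numbers at home, here’s a quick back-of-the-envelope script (Python is my choice here, not anything from the Times piece) that converts the two photon energies to wavelengths via λ = hc/E and counts the orders of magnitude between them:

```python
import math

# Back-of-the-envelope check on the photon numbers in the text.
# hc ~ 1240 eV*nm is the usual shortcut for lambda = hc / E.
HC_EV_NM = 1239.84          # Planck's constant times c, in eV*nm

e_gamma = 1.0e6             # ~1 MeV gamma-ray photon, in eV
e_micro = 1.0e-5            # ~10 microelectronvolt microwave photon, in eV

lam_gamma = HC_EV_NM / e_gamma   # wavelength in nm
lam_micro = HC_EV_NM / e_micro   # wavelength in nm

orders = math.log10(e_gamma / e_micro)

print(f"gamma-ray wavelength: {lam_gamma:.2e} nm (smaller than an atom)")
print(f"microwave wavelength: {lam_micro * 1e-7:.1f} cm")
print(f"energy ratio: 10^{orders:.0f}")
```

The microwave number comes out around 12 cm, which is reassuring: that’s the wavelength of the ~2.45 GHz radiation an actual microwave oven produces.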
These are so far apart that they’re hardly comparable. Microwaves are easy to contain and direct where you want them, while gamma rays pretty much go where they damn well please. Gamma-ray photons can blast atoms apart, while microwave photons just sort of jiggle molecules around (which is how they heat food).
High levels of microwaves can lead to damage, basically by cooking flesh. “High levels,” in this case, means something on the order of 100 W, roughly what’s in your microwave oven. High levels of gamma radiation can cause cellular damage leading to an increase in cancer risk, and this occurs at much lower power levels: this handy radiation FAQ suggests a 5% increase in cancer risk from a dose of 1 Sv worth of radiation exposure, which this page suggests would require something like 10^11 gamma-ray photons at 1 MeV. If you got that entire dose in one second, that would correspond to about 0.03 W.
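That last bit of arithmetic is easy to redo yourself. This little sketch uses the same figures quoted above (10^11 photons at 1 MeV each, delivered in one second) and lands at a couple hundredths of a watt, the same ballpark:

```python
# Redoing the dose arithmetic from the text: 1e11 photons at 1 MeV
# each, with the whole dose delivered over one second.
EV_TO_J = 1.602e-19          # one electron volt, in joules

n_photons = 1.0e11
e_photon_ev = 1.0e6          # 1 MeV, in eV

total_energy_j = n_photons * e_photon_ev * EV_TO_J   # ~0.016 J
power_w = total_energy_j / 1.0                       # spread over one second

print(f"total energy: {total_energy_j:.3f} J")
print(f"equivalent power: {power_w:.3f} W")
```

Either way you round it, the point stands: hundredths of a watt of gamma rays versus a hundred watts of microwaves.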
These types of radiation are not remotely comparable in terms of health risk. While I’m glad that the Times took the time to tell people that microwave ovens are no threat, I wish they had spent a paragraph or so explaining that all radiation is not created equal.