Quantum Diaries survivor Tommaso Dorigo offers an inside look at experimental particle physics, describing new results from combing through CDF data to look for rare events producing two leptons with the same charge:
Indeed, 44 events were found when 33.7 were expected, plus or minus 3.5. That corresponds to a roughly 2-sigma fluctuation of expected counts. The picture on the left shows the leading lepton spectrum for the 44 events in the signal region.
What he leaves out is that this is 44 events out of a total number that’s almost certainly in the millions. Particle physicists are sort of inured to this, and rarely have the total size of the dataset to hand, but I find it absolutely fascinating. The events that reveal interesting physics are such a vanishingly small fraction of the data they collect that it’s just amazing we ever learn anything. And yet, they’ve done amazing things in particle physics over the years.
The particular excess he’s describing probably doesn’t mean anything (two-sigma deviations are a dime a dozen), and Tommaso has bet money that it doesn’t mean anything, but it’s a fascinating look at how particle physicists work.
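For a back-of-the-envelope feel for why a two-sigma deviation is unremarkable, here is a sketch of the naive significance of the quoted counts. This is my own illustration, not anything from the CDF analysis; it uses only the Poisson (statistical) uncertainty, and folding in the quoted ±3.5 systematic would pull the significance down further.

```python
import math

# Numbers quoted in the post: 44 events observed, 33.7 expected.
observed = 44
expected = 33.7

# Naive significance: excess divided by the Poisson uncertainty
# on the expected count, sqrt(N). Ignores the +/- 3.5 systematic.
significance = (observed - expected) / math.sqrt(expected)
print(f"{significance:.1f} sigma")
```

This comes out to roughly 1.8 sigma, consistent with the post’s “roughly 2-sigma” characterization; with many independent search channels, fluctuations of this size are expected regularly.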
(Which reminds me, I really ought to Classic Edition my old posts about the pentaquark…)
A good diary. What is fascinating about this is the unsung hero of these accelerators: the selector programs that look at events and decide whether to save the data from the big detectors. They throw out thousands of events to keep a hundred or so per second, then filter those to pick out a few dozen out of thousands. So these selector filters have a millisecond or so to decide whether to capture an event. Yikes.
That is what is so amazing about current physics models – they get the probabilities right. You are smashing all this stuff together and getting all this crud out and you have the models that tell you what to expect – and you get it! That is amazing.
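The cascade described above can be sketched numerically. The intermediate stage rates below are made-up placeholders for illustration; only the megahertz-scale input and the roughly 100 Hz written to tape come from this thread.

```python
# Illustrative rate budget for a multi-stage trigger.
# Stage rates (Hz) are assumptions for the sketch, not real CDF numbers.
stages = [
    ("collisions", 2.5e6),   # bunch crossings per second
    ("level 1",    2.0e4),   # hardware trigger output (assumed)
    ("level 2",    3.0e2),   # software filter output (assumed)
    ("to tape",    1.0e2),   # ~100 Hz actually written out
]

for (name_in, rate_in), (name_out, rate_out) in zip(stages, stages[1:]):
    print(f"{name_in} -> {name_out}: keep 1 in {rate_in / rate_out:,.0f}")

overall = stages[0][1] / stages[-1][1]
print(f"overall rejection: 1 in {overall:,.0f}")
```

With these placeholder rates the overall rejection is 1 in 25,000, in the same ballpark as the factor-of-30,000 online reduction quoted later in the thread.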
It’s actually worse than that, Markk... at the Tevatron, bunch crossings (events, really) happen at a rate of 1.7 MHz. The experiments can only write to tape at rates of 50-100 Hz. Experiments live and die by their triggers.
Hi Uncertain,
thank you for your kind words about my post. It is indeed fascinating to think that those 44 events are the survivors of a selection from a total of 7000 trillion events (I don’t even have a name for that number).
Indeed, as someone notes above, the first hero in the chain is the trigger system. Those 7000 trillion have been reduced online by a factor of 30,000, writing fewer than 100 hertz of them to tape – a total of 150 billion surviving.
Anyway, just because I am a darn fussy person: the Tevatron bunch crossing time is 396 nanoseconds, so the rate is 2.5 MHz.
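The fussiness checks out: inverting the quoted 396 ns bunch-crossing time is a one-liner (this assumes only that figure, nothing else).

```python
# Rate implied by a 396 nanosecond bunch-crossing time.
crossing_time_s = 396e-9
rate_hz = 1.0 / crossing_time_s
print(f"{rate_hz / 1e6:.2f} MHz")  # ~2.53 MHz, i.e. roughly 2.5 MHz
```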
News about the Tevatron can be found at http://www.fnal.gov.
Cheers!
T.
Hmmm, after posting this I had a nightmare. Could I have mistaken the number of accepted events? Wait a moment. 150 billion is too much for two or three years of continuous data taking. Indeed, it is 15 billion, damnit. But it is not the x30,000 factor that’s wrong, it is the very first figure: not 7000 trillion. Let me compute it again from first principles (I had done the computation in my head last time, a sure recipe for failure):
The proton-antiproton cross section at 2 TeV is s = 70 millibarns = 7E-26 cm^2. The data collected is L = 1 inverse femtobarn = 1E39 cm^-2. Since N = s L, N = 7E-26 x 1E39 = 7E13, so “only” 70 trillion collisions. That makes more sense.
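Redoing that first-principles arithmetic with explicit unit conversions (1 mb = 1e-27 cm^2, 1 fb = 1e-39 cm^2) is a useful check; a sketch, starting from the 70 mb and 1 fb^-1 figures quoted above:

```python
# N = sigma * L, with everything converted to cm^2 and cm^-2.
MB_TO_CM2 = 1e-27      # 1 millibarn = 1e-27 cm^2
FB_TO_CM2 = 1e-39      # 1 femtobarn = 1e-39 cm^2

sigma_cm2 = 70 * MB_TO_CM2          # 70 mb = 7e-26 cm^2
lumi_inv_cm2 = 1.0 / FB_TO_CM2      # 1 fb^-1 = 1e39 cm^-2

n_collisions = sigma_cm2 * lumi_inv_cm2
print(f"{n_collisions:.0e} collisions")  # on the order of 7e13
```

With these standard conversions the count lands at 7e13 (70 trillion) collisions rather than 7e14; either way, the 44 surviving events are a spectacularly small fraction of the total.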
cheers,
T.