Science

Matter-Antimatter Bias Seen In Fermilab Collisions

ubermiester writes "The New York Times is reporting that scientists at Fermilab have found evidence of a very small (about 1%) average difference between the amounts of matter and antimatter produced in a series of particle collisions. Quoting: '[T]he team, known as the DZero collaboration, found that the fireballs produced pairs of ... muons ... slightly more often than they produced pairs of anti-muons. So the miniature universe inside the accelerator went from being neutral to being about 1 percent more matter than antimatter.' This finding invites theorists to explain why there is so much more matter than antimatter in the universe, when the Standard Model suggests that there should be equal amounts of each." Here is the paper as submitted to Physical Review (PDF). The DZero team is looking forward to getting detailed data from the LHC once it ramps up to full operation.
  • by valros ( 1741778 ) on Tuesday May 18, 2010 @02:29AM (#32249568)
    Wasn't this the previously supposed hypothesis, that the Big Bang held a slight matter bias? It's great that we can recreate it now. Also, how has antimatter responded to this bias?
  • Is 1% significant? (Score:5, Interesting)

    by gman003 ( 1693318 ) on Tuesday May 18, 2010 @02:34AM (#32249594)

    For some experiments, 1% might be attributable to error. I've never done practical particle physics, though. Does this fall under experimental error, or is stuff like this usually re-creatable to seventeen decimal places?

    I may not know much science, but I do know that margin of error is important.

    • by crescente ( 1334029 ) on Tuesday May 18, 2010 @02:47AM (#32249666)
      Their error, as stated in the linked abstract, is less than 0.3%. So, if you believe they're doing statistics correctly, yes, the signal is greater than the noise. More importantly, even, say 1.0 - 0.3 = 0.7% is HUGE: the common estimate of matter-antimatter asymmetry at the big bang was merely a billion-and-one to a billion. (linky: http://livefromcern.web.cern.ch/livefromcern/antimatter/academy/AM-travel02c.html [web.cern.ch]). And that extra one in a billion is all the matter we have today.
      • Re: (Score:3, Informative)

        by Smallpond ( 221300 )

        Their error, as stated in the linked abstract, is less than 0.3%. So, if you believe they're doing statistics correctly, yes, the signal is greater than the noise. More importantly, even, say 1.0 - 0.3 = 0.7% is HUGE: the common estimate of matter-antimatter asymmetry at the big bang was merely a billion-and-one to a billion. (linky: http://livefromcern.web.cern.ch/livefromcern/antimatter/academy/AM-travel02c.html [web.cern.ch]). And that extra one in a billion is all the matter we have today.

        That ratio would mean the energy of the Big Bang was much less (by a factor of roughly 100 / 1,000,000,000) than what was previously estimated to produce the matter we see today. Kind of a large difference.

    • by FrangoAssado ( 561740 ) on Tuesday May 18, 2010 @02:56AM (#32249704)

      Well, if they wrote a paper and submitted it to Phys Rev, you can rest assured they considered this (and it will be checked by many other physicists).

      The abstract in the linked paper says the result they got differs by 3.2 standard deviations from the prediction given by the Standard Model. That's not conclusive, but it's significant. Surely they (or someone else) will keep looking in other data (from LHC, for example) to see if they can increase confidence.

      • Re: (Score:3, Insightful)

        by Steve Max ( 1235710 )

        They submitted a paper saying that they see a difference of around three sigma from what the SM predicts; they claim nothing more than that. Besides, we need to see the whole picture here: previous experiments agree with the SM prediction within 1-sigma, which is as good as it gets, while their result is a bit off. Their best fit disagrees more with the current combined BaBar/Belle best result than the SM prediction does with the BaBar/Belle numbers. This, combined with the fact that we've seen even bigger si

    • by Trepidity ( 597 ) <delirium-slashdot@@@hackish...org> on Tuesday May 18, 2010 @02:57AM (#32249708)

      Assuming that what the conclusion (p. 21) reports as "like-sign dimuon charge asymmetry of semileptonic b-hadron decays" is the number we're looking for, they do give a margin of error that's smaller than the asymmetry observed. They report the asymmetry as:

      A = -0.00957 +/- 0.00251 (stat) +/- 0.00146 (syst)

      I believe the two errors are there because they're breaking out the statistical margin of error (due to sampling) and the systematic margin of error (due to the accuracy of the apparatus and setup).
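      A quick back-of-the-envelope check (just a sketch, assuming the statistical and systematic uncertainties are independent and add in quadrature, and treating the Standard Model expectation as negligible next to the measured value):

      ```python
      import math

      # D0's quoted like-sign dimuon charge asymmetry (from the paper's conclusion)
      A = -0.00957
      stat = 0.00251   # statistical uncertainty
      syst = 0.00146   # systematic uncertainty

      # Assume the two uncertainties are independent and add in quadrature
      sigma_total = math.sqrt(stat**2 + syst**2)

      print(f"total uncertainty ~ {sigma_total:.5f}")             # ~0.00290
      print(f"significance ~ {abs(A) / sigma_total:.1f} sigma")   # ~3.3, close to the 3.2 sigma quoted upthread
      ```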

    • Re: (Score:3, Informative)

      by tylersoze ( 789256 )

      Given the calculated ratio of photons to fermions during baryogenesis, the asymmetry is supposed to be even smaller than that, something like 1 extra particle of matter per 100 million if I remember correctly.

    • Re: (Score:2, Informative)

      by Anonymous Coward

      For some experiments, 1% might be attributable to error. I've never done practical particle physics, though. Does this fall under experimental error, or is stuff like this usually re-creatable to seventeen decimal places?

      I may not know much science, but I do know that margin of error is important.

      It's extremely significant, given that some models show a 1% bias would account for the Universe as we know it. Dead even, no Universe. A 1% bias and we get our Universe. 1% may not seem like much, but it's massive when you are talking about the origin of the Universe. As far as experimental error goes, 1% is a pretty large error in particle physics. Add a few zeroes, 0.0001%, and it'd still be interesting, but a full 1% is pretty massive on the scale we are talking about.

    • Particle physicists deal almost entirely in probabilistic measurements, so they get pretty good at understanding their error bars.

      1% could be enormous, or tiny depending on the sample size.

      The fact that this has been published in Physical Review gives a strong implication that they have done and checked their sums.

    • by nedlohs ( 1335013 ) on Tuesday May 18, 2010 @08:39AM (#32251406)

      I'm sure the scientists who wrote the paper never even considered that before submitting the article for peer review.

  • by oldhack ( 1037484 ) on Tuesday May 18, 2010 @02:41AM (#32249636)
    Your expensive tube is doing a fat lot of good, eh?! You go Fermilab! LHC can suck it!
  • so can this help us map the antimatter in the universe?

  • Budget (Score:3, Interesting)

    by MichaelSmith ( 789609 ) on Tuesday May 18, 2010 @02:57AM (#32249706) Homepage Journal

    So presumably 99% of the mass-energy in the universe is currently energy, much of which must be potential and kinetic energy: the momentum of the Big Bang, the energy we will get back in the eventual collapse, light elements which will eventually fuse, and heavy elements which will eventually undergo fission.

    • Re:Budget (Score:5, Funny)

      by ShakaUVM ( 157947 ) on Tuesday May 18, 2010 @03:33AM (#32249850) Homepage Journal

      >>The momentum of the Big Bang, the energy we will get back in the eventual collapse...

      Eventual collapse?

      Haven't kept up with physics, eh? =)

      • Re: (Score:3, Informative)

        by Trepidity ( 597 )

        As far as I can tell, the "big crunch" hypothesis isn't yet totally [arxiv.org] ruled out, though majority opinion is probably against it.

      • Well, okay, but even if we don't rewind, the energy pushing us out to infinity must have come out of that 99%.

      • Re: (Score:3, Insightful)

        by BitZtream ( 692029 )

        Yea, it just depends on which day of the week you subscribe to which theory.

        The reality of it is ... they are theories, and they continually keep finding new data that says the theories are at least partially wrong or in some cases bear no relation to reality.

        Stop pretending you (or anyone else) understands the universe.

    • Re: (Score:2, Interesting)

      by Graff ( 532189 )

      So presumably 99% of the mass-energy in the universe is currently energy, much of which must be potential and kinetic energy.

      Not necessarily, it depends on how many iterations of annihilation-recombination took place.

      For example, say we start with 100% matter and antimatter; it interacts and annihilates, leaving 1% matter. The remaining energy recombines back into matter and antimatter (through processes like vacuum fluctuation and virtual particles [wikipedia.org]), and now 99% of that annihilates, leaving last iteration's 1% plus this iteration's 99% x 1% = 0.99%, for a total of 1.99%. Next will be 98.01% x 1% = 0.9801%, and so on.

      Thus the formula is:
      0.9

      • I wasn't sure about my assumption that the leftover energy would be all our energy, including the energy which would ultimately be released by fusing everything down to iron. But if that is the case, we can't be 100% matter, because then we would be at maximum entropy.

      • Re: (Score:3, Insightful)

        Your summation doesn't make sense. We have 1/0.01 = 100 = 10000%, so the total energy is 10000% of what it started as? The very first term in the series should be a clue that something is wrong: 0.99^0 = 1 = 100% already.
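        For reference, here is the iteration as described in the grandparent's prose (not its truncated formula), just a toy sketch of the bookkeeping:

        ```python
        # Each round, 1% of the remaining mass-energy "freezes out" as matter
        # and the other 99% annihilates and recombines into new pairs.
        surviving = 0.0
        remaining = 1.0                # start with 100% of the mass-energy
        for n in range(5):
            gained = 0.01 * remaining  # this round's 1% of what's left
            surviving += gained
            remaining -= gained
            print(f"after round {n + 1}: surviving matter = {surviving:.4%}")

        # The cumulative total is the geometric series 0.01 * sum(0.99**n),
        # which converges to 100% -- i.e. all of the original mass-energy would
        # end up as matter, which is the sort of accounting problem noted above.
        ```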
  • Uneven laws (Score:5, Interesting)

    by Thanshin ( 1188877 ) on Tuesday May 18, 2010 @03:05AM (#32249750)

    It would be so funny to discover now that the laws of physics are uneven in space...

    That the same experiment gets you different results depending on which side of the Milky Way you are on...

    Or they could be uneven in time. Maybe every 54.12 years the relation between produced matter/antimatter switches from 1:1.01 to 1.01:1.

    • Re:Uneven laws (Score:5, Informative)

      by cc1984_ ( 1096355 ) on Tuesday May 18, 2010 @03:20AM (#32249808)

      It would be so funny to discover now that the laws of physics ... be uneven in time. Maybe every 54.12 years the relation between produced matter/antimatter switches from 1:1.01 to 1.01:1.

      You're not the first to think this (specifically, that fundamental constants like the speed of light might be changing over time):

      http://www.space.com/scienceastronomy/generalscience/constant_changing_010815.html [space.com]

    • Re:Uneven laws (Score:5, Interesting)

      by silentcoder ( 1241496 ) on Tuesday May 18, 2010 @03:24AM (#32249814)

      That would not be a "discovery" but a confirmation. Many physicists have suggested such hypotheses in the past. Even more have suggested asymmetry in time, i.e. that at various ages of the universe the fundamental constants may have been different to what they are now.

      There are a few pieces of evidence suggesting this (the rate of decay of Oklo's uranium COULD be explained that way, though a natural fission reactor is a more plausible explanation), and several physicists have conjectured that the fine-structure constant may have changed over time, which would be an explanation for the anomalous speeds of galaxies that wouldn't require cold dark matter.

      Our estimates of the age of the universe have changed 4 times in the past 2 decades; generally it has gotten younger, with the current consensus at about 13 billion years.
      Of course, if any of the fundamental constants had changed over time or in different regions of space (in the end, it's simply a matter of how you travel through space-time), then that means all bets are off. The fundamental constants determine the laws of physics. Thus far, outside of singularities like the big bang or black holes (and Stephen Hawking thinks we don't even need THOSE to be singularities), there is no really strong evidence for it. It's possible, but unlikely, and if true, it means it's mathematically impossible for us to understand the universe.

      • by Anonymous Coward on Tuesday May 18, 2010 @03:38AM (#32249864)

        So maybe Dragons really did exist once upon a time, when the laws of physics were different.

        Oh.. the creationists will love this.

        • Well, define dragon.

          Flying lizard-like creature? I give you the Pterosaur [wikipedia.org].

          Fire breathing creature? Not quite, but the bombardier beetle [wikipedia.org] comes somewhat close. It's not real fire, but getting hit by a liquid close to 100 C is going to feel like being burned. And if that compound is also acidic or caustic, it gets even worse, and anyone hit by a decent amount of it would certainly feel like they're on fire.

          These two aren't exactly along the same evolutionary branches, but a combination of the two aren't beyond t

      • Anyone can propose a theory. There is no shortage of speculative theories out there. Creating a theory is easy; even philosophers can do that. 'Proving' that the theory is not false is the hard bit - something that sets the physicists apart from the philosophers. I'm not trying to bash philosophers here, since they have their place, merely trying to say that theories are pretty much 'a dime a dozen' these days - but experimental verification of them is a much more rare and precious thing. That's why I stopped re
        • Re:Uneven laws (Score:5, Interesting)

          by silentcoder ( 1241496 ) on Tuesday May 18, 2010 @06:58AM (#32250762)

          Hypotheses are a dime a dozen; theories are supposed to be hypotheses that have stood the test of time for a while, but the terminology often gets mixed up, to the detriment of science (even by scientists).

          That said, in this case the people who made these hypotheses are highly respected physicists who had genuine puzzles they were attempting to explain. In most cases so far there ended up being other, more plausible explanations, but I just don't imagine serious physicists proposing an alteration to a fundamental constant lightly.

          Right now there are some puzzles in cosmology that suggest the fine structure constant may have been slightly lower in the past, and there is further strong evidence that supports the possibility (notably, the energy of the cosmic background radiation is slightly lower, by almost exactly the amount it would be if this were true).
          BUT - and this is a big but - in the meantime, two other explanations for the cosmic radiation difference have been proposed. In both cases they don't rely on a different fine structure constant shortly after the big bang. But their supporting evidence is still being tested. In the meantime, neither explains the puzzles that led to the proposal in the first place, so if either is shown to be accurate, cosmology still can't answer those.
          That puts the weight of evidence currently on the side of a change over time in the FSC, if only because it explains more observations than any other available hypothesis.
          Downside: if the FSC was different, that means a LOT of other differences, because the FSC is an amalgamation of several other fundamental constants, including Planck's. Change it in the past, and it means the physics of the early universe was slightly different to ours; such a difference is a mathematical singularity, and it's impossible, from our side of it, to predict what was on the other side.

    • Re:Uneven laws (Score:5, Interesting)

      by Black Gold Alchemist ( 1747136 ) on Tuesday May 18, 2010 @03:32AM (#32249846)
      If the laws are uneven in time, that could lead to perpetual motion among other interesting consequences.

      For example, pretend that the speed of light is variable over time, and remember that E=mc^2. On earth, we build a matter-antimatter annihilation laser and point it at a base in space. When the speed of light speeds up to 1.1 times the normal value, we fire off the laser, converting 10 g of matter into 1.08749377 petajoules. The light energy travels for a time, during which the speed of light slows back down to c. It hits a setup in the space base that converts the light back into matter. We divide by normal c, and are left with 12.1 grams of matter. We mail it back to earth, and send 10 g back to the laser (to repeat the process). The other 2.1 g is used as starship fuel, worth over 180 terajoules. Don't rinse, but repeat.
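      The numbers in that thought experiment, as a sketch (purely illustrative, assuming E = mc^2 with a hypothetically time-varying speed of light):

      ```python
      c = 299_792_458.0          # m/s, the ordinary speed of light
      m = 0.010                  # kg, the 10 g annihilated on Earth

      E = m * (1.1 * c) ** 2     # energy released while c is 10% higher
      print(f"energy sent: {E:.4g} J")                    # ~1.087e15 J, about 1.09 PJ

      m_back = E / c ** 2        # reconverted after c has dropped back to normal
      print(f"mass recovered: {m_back * 1000:.2f} g")     # 12.10 g

      profit = m_back - m        # the 2.1 g of "free" mass per cycle
      print(f"profit: {profit * 1000:.2f} g, worth {profit * c ** 2:.3g} J")  # ~1.89e14 J, over 180 TJ
      ```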
    • Re: (Score:3, Interesting)

      by Joce640k ( 829181 )

      That would just mean that the "laws" aren't laws - in the same way that Newton's "laws" turned out to be not quite right when you're moving quickly.

      And science would be cool about it. Excited about it, even.

      Public imagination aside, scientists tend to celebrate when they find out they were wrong (especially if it took big/expensive machines to do it...)

      • Re: (Score:2, Interesting)

        by jenik ( 1030872 )
        I'd suggest you read Kuhn's Structure of Scientific Revolutions to see that science appears to work very differently than you (and many others) think. Scientists tend to resist any substantial change in their paradigm for as long as possible - for example by "tweaking" theories or devising auxiliary hypotheses.
    • by Joshua Fan ( 1733100 ) on Tuesday May 18, 2010 @03:54AM (#32249924) Homepage
      The real problem facing physicists right now is the lack of a Fermilab in Australia to confirm such a possibility.
    • Re: (Score:2, Interesting)

      by cfc-12 ( 1195347 )

      I can heartily recommend Vernor Vinge's "A Fire Upon the Deep", which is set in a galaxy where the laws of physics do indeed vary widely depending on your distance from the centre of the galaxy.

      Probably not what you'd call hard science fiction, but definitely one of the best "what if" books I've ever read.

      • > Probably not what you'd call hard science fiction...

        It's certainly what I call hard science fiction.

        > ...but definitely one of the best "what if" books I've ever read.

        Everything Vinge has written is excellent.

    • by JamesP ( 688957 )

      Or they could be uneven in time. Maybe every 54.12 years the relation between produced matter/antimatter switches from 1:1.01 to 1.01:1.

      Funny you mention that, there are some theories that say the speed of light changed in time, so in the past (like, billions of years ago) it was slower, IIRC

  • new matter? (Score:4, Interesting)

    by kix ( 24024 ) on Tuesday May 18, 2010 @03:17AM (#32249788) Homepage

    I'm probably misunderstanding something here, but it seems that they have discovered that when the big bang happened, because of this property a bit more matter than anti-matter was created out of wherever they came from in the first place; the rest annihilated with each other, and everything else is made up from the "extra bits". This seems fairly reasonable.

    Now, it is also known that new matter-antimatter particle pairs are being created and annihilated all the time, everywhere; this is where Hawking radiation comes from.

    Does this new discovery mean, that it would be possible, that instead of an antimatter-matter pair a matter-matter pair is created sometimes instead and therefore the amount of matter in the universe is increasing (even if by a tiny amount)? Or are the conditions needed for this to happen too extreme to ever take place outside of big bangs and accelerators? Although as I understand some cosmic rays have far greater energies than accelerators.

    Real physicists - please help me make sense of it all!

    • Re:new matter? (Score:4, Informative)

      by chichilalescu ( 1647065 ) on Tuesday May 18, 2010 @04:14AM (#32250022) Homepage Journal

      No, this doesn't fit the physics I know of.
      In quantum field theory, you can describe the phenomenon of a photon splitting into a particle-antiparticle pair that then annihilates to recreate the initial photon. These are the pairs that appear and disappear all the time (because of virtual photons that appear and disappear). However, a photon splitting into a particle-particle pair doesn't fit QFT.

      • Re: (Score:3, Informative)

        by kix ( 24024 )

        Right, of course you are correct. After having read http://en.wikipedia.org/wiki/Virtual_particle [wikipedia.org] I actually understand that the question was rather silly. sorry about that. Although, if everyone read the correct wikipedia entries before asking things, there would be very few questions indeed ;)

        thanks.

    • Does this new discovery mean, that it would be possible, that instead of an antimatter-matter pair a matter-matter pair is created sometimes instead and therefore the amount of matter in the universe is increasing (even if by a tiny amount)? Or are the conditions needed for this to happen too extreme to ever take place outside of big bangs and accelerators? Although as I understand some cosmic rays have far greater energies than accelerators.

      I think it's a bit more subtle than that -- things like a particle with a magnetic dipole decaying and tending to send the matter particle towards its North pole and the antimatter towards its South pole, but I'm not certain.

    • > Now, it is also known that new matter-antimatter element pairs are being created and annihilated all the time everywhere, this is where Hawking radiation comes from.

      Wait, what? We "know" this because of a theoretical radiation that has never been observed?

  • LHC can't contribute (Score:3, Informative)

    by Bananenrepublik ( 49759 ) on Tuesday May 18, 2010 @03:38AM (#32249868)

    The LHC is a proton-proton collider; the Tevatron (where D0 is situated) is an antiproton-proton collider. Therefore the Tevatron provides a situation which is symmetric between matter and antimatter, while the LHC doesn't. The conclusion of the paper is that there is a 1% excess of matter in a situation that started with no preference for matter or antimatter. I don't see how the LHC could contribute to this, given that it always starts with two matter particles.

    • by Shillo ( 64681 ) on Tuesday May 18, 2010 @05:26AM (#32250284)

      Had you read the abstract, you'd know that Fermilab's result is about b+anti-b decay, not p+anti-p, so the LHC is fine as long as they can specifically track which muons came from b quark decays.

      As a matter of fact, they have a special detector just for that (it's not general-purpose, because b+anti-b pairs decay within centimetres of their creation point, so they actually place a particle tracker 5 mm from the beam). See the LHCb experiment.

  • Well (Score:4, Funny)

    by Daath ( 225404 ) <(kd.redoc) (ta) (pl)> on Tuesday May 18, 2010 @03:49AM (#32249898) Homepage Journal

    It doesn't matter. But it doesn't anti-matter, less.
    Or something.

  • by Laxator2 ( 973549 ) on Tuesday May 18, 2010 @03:56AM (#32249930)
    The Tevatron is so thoroughly outclassed by the LHC that they have to take advantage of every opportunity to make a press release and show that they are still relevant. Once the LHC starts producing science data, it will be impossible to justify funding for the Tevatron. The whole of Fermilab (which uses about half the science money given by the D.O.E.) will be in danger of being closed, so they are fighting for survival. During the Bush administration they had to get private funding to avoid lay-offs. http://tierneylab.blogs.nytimes.com/2008/07/02/good-news-or-less-bad-news-for-american-science/ [nytimes.com]
    • Re: (Score:2, Insightful)

      by DMiax ( 915735 )

      OTOH this is what happened to the LHC predecessor at CERN when Fermilab was bleeding edge. I suspect that in 20 years the #1 accelerator will be our fellow Americans' one. (unless they win the race to have short-sighted politicians...)

      And I think it is probably better to have only one "best accelerator" at a time. The LHC will be able to confirm the data from the Tevatron *and* do something more. And the next Tevatron will do the same with LHC data.

      • Re: (Score:3, Interesting)

        by Laxator2 ( 973549 )
        Honestly, I don't know much about what happened at CERN before the LHC; I only remember that they had LEP, which was an electron-positron collider, while the Tevatron is proton-anti-proton. The "scooping" of experiments happens all the time; for example, Cornell's collider was the main place to study B mesons for about 20 years, before SLAC built the BaBar machine that accumulated in one year as much data as the Cornell machine had accumulated in 20 years. Luckily, the people at Cornell were able to move to K
    • I don't know anything about particle accelerators but...

      It's a machine that does something that no other machines can do. So, I imagine there could be an industrial process that such a machine could be used for.

      Is there? Can the old accelerators be transformed from science labs into industrial tools?

      • by zmooc ( 33175 )

        Particle accelerators are all about the destruction of minute amounts of material. Industry is about construction of massive amounts of material. Therefore particle accelerators are the exact opposite of industry and I doubt an industrial application for particle accelerators can be made up.

    • In terms of raw energy levels, the LHC eats everyone's lunch, but the LHC does different work than the Tevatron. The LHC wouldn't be able to do any of the stuff they're doing in this instance, as the LHC doesn't deal with antimatter. It's a proton collider; the Tevatron is a proton-anti-proton collider.

    • Of course, to real scientists there is no point in NOT running both in parallel. There's plenty of stuff the Tevatron can still do.
      Oh, and maybe they should go big, say that the US must be first again, and build an XLHC (extremely large hadron collider) crossing the whole country from coast to coast. ^^

    • http://en.wikipedia.org/wiki/Superconducting_Super_Collider [wikipedia.org]

      it was canceled in 1993; now it's a data center

      it was going to be 40 TeV (the LHC is only 14 TeV). we would already have been running it for years now, and the discussion topics here on slashdot could have been equivalent to discussions about Columbus sighting land, in terms of amazing new discoveries by mankind

      and to make it incredibly freaky, this thing apparently was going to be in Texas, way back when in 1993 when Texas still believed in scie

      • Re: (Score:3, Funny)

        by wtbname ( 926051 )

        I consulted my daughter's "Jesus and You, and Science Too" textbook, and it confirms that your post is bunk. Texas has never believed in science.

    • by students ( 763488 ) on Tuesday May 18, 2010 @12:15PM (#32254048) Journal

      The Tevatron has to be partially removed to allow the construction of Project X [fnal.gov], which is an accelerator that complements the LHC but does not compete with it. Fermilab is in no danger of being closed due to obsolescence. Many of the people who work there are working on the LHC, and there are many other experiments located at Fermilab.

      After Congress canceled the Superconducting Super Collider, Europe focused on exploring the "Energy Frontier" while American scientists have focused on the "Intensity Frontier." There are also lots of collaborations and experiments that do not fit into either category. Of course, the rate at which the "Intensity Frontier" is explored does depend on the federal budget, but it will get done eventually.

  • I know modern science is meant to be collaborative, but this paper has more than a page of authors! I note that they are listed alphabetically -- remind me to change my name to Aarons before taking up particle physics.

    • Re: (Score:3, Interesting)

      by mathfeel ( 937008 )

      I know modern science is meant to be collaborative, but this paper has more than a page of authors! I note that they are listed alphabetically -- remind me to change my name to Aarons before taking up particle physics.

      This is typical of "big science" fields that involve tons of people, like experimental high-energy physics, as opposed to the "bench science" or "desk science" everyone else does.

  • by IBitOBear ( 410965 ) on Tuesday May 18, 2010 @04:39AM (#32250128) Homepage Journal

    What is, "there used to be a lot more matter and antimatter before they started canceling each other out and now we live amongst the debris"?

    or, from my safety fifth-grader...

    What is "the standard model is wrong"?

    And I don't mean that in a bad way. The "flat earth" hypothesis was an _amazing_ deduction at its inception. It was only off by eight inches of declination for every mile. This was a _tiny_ margin of error. But error compounds, and so does any other form of tiny, so eight inches per mile, an error of ~0.0126% (i.e. 8/63360), was enough to make the earth round.

    Ta dah! 8-)
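    A quick check of those two figures (a sketch; the drop-below-tangent formula d^2 / (2R) is the usual small-angle approximation for a sphere):

    ```python
    R = 6_371_000.0    # mean Earth radius, metres
    mile = 1609.344    # metres
    inch = 0.0254      # metres

    # How far a sphere of radius R drops below the flat-earth tangent plane
    # over the first mile:
    drop = mile**2 / (2 * R)
    print(f"curvature drop over one mile: {drop / inch:.1f} inches")  # ~8.0

    # The fractional error quoted above: 8 inches out of the 63,360 inches in a mile
    print(f"fractional error: {8 / 63360:.4%}")                       # ~0.0126%
    ```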

    • by queazocotal ( 915608 ) on Tuesday May 18, 2010 @05:58AM (#32250452)

      It's been known for a long time that the standard model has problems.
      To continue your analogy.

      "The earth is flat" works really well as a model. If you're in hilly terrain, you might suspect early on that the flat earth model isn't quite right.

      To find out that the earth is actually a slightly distorted sphere with a radius of some 6000 km means that you have to go quite far (distance-wise) to realise that the errors in the flat-earth model actually add up to a coherent alternative theory - a spherical earth.

      It's much like this in physics.

      Saying 'the standard model is wrong' - and giving plausible arguments - doesn't give much for alternative theorists to get their teeth into.

      If however, you can produce a concrete measurement that can say 'The standard model is off by 0.3% here, 0.6% here, 1.2% here, and this looks _really_ like a curve of 0.5x+x^2 in the energy/bias ratio' - this can eliminate whole classes of alternate theories.

      At the moment, string theory (and the descendant fields) suffer from an embarrassment of possibilities.
      There are people arguing that the world is flat, round, toroidal, duck-shaped, ...

      These theories are generally internally consistent, and can only be proved wrong with measurements of the real world. Without these measurements, the theories are interesting maths that you can build a career in maths on, but they don't predict the world in a useful way.

      • Re: (Score:3, Funny)

        by Hurricane78 ( 562437 )

        It's been known for a long time that the standard model has problems.

        Well, of course. With them all being anorexic and on drugs, you can see their problems when looking at their bodies. ;)

  • by Anonymous Coward on Tuesday May 18, 2010 @07:55AM (#32251070)

    This is yet another reason why you shouldn't read mainstream media to get your physics news. Just reading the article summary makes me shiver all over.
    Please, there are no fireballs at a particle collider and we are many many orders of magnitude in energy away from recreating the conditions after the Big Bang.
    There is no miniature universe anywhere. Nothing went from being neutral to more matter than antimatter. Given that the (anti)matter in question here are (anti)muons
    that would imply violation of charge conservation, which is not what they observed. This has nothing (well almost nothing, I'll explain in a sec) to do with why there is
    so much more matter than antimatter in the universe, and the Standard Model does not suggest that there should be equal amounts either. The only correct
    representation of facts in there is that the paper is indeed from the D0 collaboration and it has to do with seeing 1% more muons than antimuons.

    Okay, so what did they do? They looked at decays of neutral B-mesons. These are curious mesons, because they oscillate back and forth between being a
    B and an anti-B. If you ever took quantum mechanics: the propagating energy eigenstates are |B> +/- |anti-B>, while |B> and |anti-B> are the flavor eigenstates, which charge-conjugation+parity (CP) maps into each other.
    The B can decay into a mu+ (antimuon) + other stuff, the anti-B can decay into a mu- (muon) + other stuff. (In both cases the other stuff has the opposite charge, so total
    charge is conserved.) They saw a 1% asymmetry in the amount of mu+ vs. mu- which means that during the oscillation back and forth they end up 1% more often in one
    than the other state which means there is a matter-anti-matter asymmetry in their behavior (technically there is CP violation in the mixing). The newsworthy fact is that in
    the Standard Model this particular asymmetry (CP violation in mixing) is predicted to be about 25 times smaller. With the uncertainties they quote, that makes a 3-sigma discrepancy,
    which is regarded as enough to claim "evidence of something" (you need 5 sigma to claim "observation of ..."), in this case direct evidence of new physics beyond the
    Standard Model, which is what particle physicists have eagerly been looking for for the last decades. Personally, I'm holding my breath until I see the same measurement
    from CDF (the other experiment at Fermilab). There have been many 3-sigma discrepancies in the past ...

    As far as the universe is concerned, today we only have matter (forget about particle colliders, the point is there are no stars or huge clouds of anti-hydrogen out there).
    As the theory goes after the Big Bang there were equal amounts of matter and antimatter, which would eventually have all annihilated into radiation and we wouldn't be here.
    The matter we see today is from a tiny, 1 in 10^9, asymmetry in the amount of matter vs. anti-matter that was generated dynamically by particle reactions after the Big Bang.
    When the universe cooled down and all the anti-matter got annihilated, the tiny excess of matter was left over, which is the matter we see today. To generate this asymmetry one
    needs (among other things) CP violation. There is CP violation in the Standard Model, it's just not nearly enough (several orders of magnitude) to generate the required asymmetry in the early
    universe. It is totally not straightforward what the 1% asymmetry in the B-anti-B mixing from above translates into in the early universe, although I'm quite sure people are looking at
    it right as I speak. I would be very surprised if it was enough though.
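    For anyone who wants to see the arithmetic, a toy illustration of the like-sign dimuon charge asymmetry described above, using the usual definition A = (N++ - N--) / (N++ + N--) and completely made-up event counts (not D0's actual yields):

    ```python
    # Hypothetical numbers of like-sign dimuon events, chosen only to show
    # what a ~1% asymmetry looks like; mu+ is the antimuon, mu- the muon.
    n_plus_plus = 1_000_000      # events with two positive muons (mu+ mu+)
    n_minus_minus = 1_019_300    # events with two negative muons (mu- mu-)

    A = (n_plus_plus - n_minus_minus) / (n_plus_plus + n_minus_minus)
    print(f"A = {A:.4f}")        # ~ -0.0096: slightly more mu- mu- pairs, i.e. a ~1% excess of muons
    ```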

    • by Lord Ender ( 156273 ) on Tuesday May 18, 2010 @10:29AM (#32252588) Homepage

      I can picture you reaching for the nonexistent typewriter lever at the end of each line, then realizing it isn't there, then hitting the enter key to advance to the next line as a substitute.

    • Re: (Score:3, Interesting)

      There have been many 3-sigma descrepancies in the past ...

      I have just three words on this:

      "Alternating Neutral Currents".

      (For those confused, Neutral Currents are interactions mediated by the Z boson. In the early 1970s, there was a race on to provide evidence for these, and there were press releases that had to be retracted because somebody jumped the gun and reported finding a Z before it was verified. This jokingly became known as "alternating neutral currents", and several physicists had their credibili

  • by RealErmine ( 621439 ) <commerce@nOspaM.wordhole.net> on Tuesday May 18, 2010 @09:33AM (#32251934)
    Humour Bot: "I says, super collider? I just met her! And then they made a super collider 2, thank you, you've been a great audience"
