Friday, 23 November 2012


  The decay of a neutral Bs meson into a muon pair is a very rare process whose rate could, in principle, be severely affected by new physics beyond the standard model. We now know it is not: given the rate measured by the LHCb experiment, any new contribution to the decay amplitude has to be smaller than the standard model one. There's a medical discussion going on and on about the interpretation of this result in the context of supersymmetry. Indeed, statements describing the LHCb result as "a blow to supersymmetry" or "putting SUSY into hospital" are silly (if you think it's the most spectacular change of loyalties since Terminator 2, read on till the end ;-) But what is the true meaning of this result?

To answer this question at a quantitative level it pays to start with a model-independent approach (and a technical one too, to filter the audience ;-) B-meson decays are low-energy processes, properly described within a low-energy effective theory with the heavy particles, like the W/Z bosons or new physics, integrated out. That is to say, one can think of the Bs→μμ decay as caused by effective 4-fermion operators with 1 b-quark, 1 s-quark, and 2 muons: 
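The operators appeared as an image in the original post; a plausible basis consistent with the labels ML, MR, MS, MP used below (the chirality assignments and normalization conventions here are my assumption, not necessarily those of the original figure) is:

```latex
\mathcal{L}_{\rm eff} \supset
  \frac{(\bar b_L \gamma^\mu s_L)(\bar \mu_L \gamma_\mu \mu_L)}{M_L^2}
+ \frac{(\bar b_R \gamma^\mu s_R)(\bar \mu_R \gamma_\mu \mu_R)}{M_R^2}
+ \frac{(\bar b_R s_L)(\bar \mu \mu)}{M_S^2}
+ \frac{(\bar b_R s_L)(\bar \mu \gamma_5 \mu)}{M_P^2}
+ {\rm h.c.}
```

The first two are the vector operators, suppressed by the scales ML and MR; the last two are the scalar and pseudoscalar ones, suppressed by MS and MP.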

Naively, integrating out a mediator with mass M generates a 4-fermion operator suppressed by M^2. In the standard model, only the first operator is generated, with ML,SM≈17 TeV, dominantly by the diagram with Z-boson exchange pictured here. That scale is much bigger than the Z mass because the diagram is suppressed by a 1-loop factor, and furthermore it is proportional to the CKM matrix element V_ts, whose value is 0.04. The remaining operators do not arise in the SM; in particular, there are no scalars that could generate the operators suppressed by MS or MP (the Higgs boson couples to mass, thus by construction it has no flavor violating couplings to quarks). 
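As a rough sanity check (my own back-of-envelope, not the actual SM calculation), one can see where a scale of this order comes from by dressing the electroweak scale with the loop and CKM suppressions just mentioned:

```python
# Order-of-magnitude estimate of the effective scale M_L,SM suppressing the
# SM vector operator. The coefficient is ~ (g^2/16pi^2) * V_ts / v^2, so
# M ~ v / sqrt(loop * V_ts). The prefactor and inputs are illustrative
# assumptions; the full calculation quoted in the text gives ~17 TeV.
import math

v = 246.0                        # GeV, electroweak vev
g2 = 0.65**2                     # SU(2) gauge coupling squared
loop = g2 / (16 * math.pi**2)    # one-loop suppression, ~3e-3
V_ts = 0.04                      # CKM element

M_eff = v / math.sqrt(loop * V_ts)   # GeV
print(f"M_L,SM ~ {M_eff / 1000:.0f} TeV")  # same ballpark as the quoted 17 TeV
```

The estimate lands within a factor of two of the quoted 17 TeV, which is all one can ask of power counting.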

In terms of the coefficients of these operators, the Bs→μμ branching fraction relative to the SM one is given by
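The formula appeared as an image in the original post; schematically, and with prefactor conventions that are my assumption, it takes the form

```latex
\frac{{\rm BR}(B_s\to\mu\mu)}{{\rm BR}(B_s\to\mu\mu)_{\rm SM}}
\simeq \left| 1 + \frac{M_{L,{\rm SM}}^2}{M_L^2} - \frac{M_{L,{\rm SM}}^2}{M_R^2}
 + \frac{m_{B_s}^2}{2 m_\mu (m_b+m_s)}\,\frac{M_{L,{\rm SM}}^2}{M_P^2} \right|^2
 + \left| \frac{m_{B_s}^2}{2 m_\mu (m_b+m_s)}\,\frac{M_{L,{\rm SM}}^2}{M_S^2} \right|^2
```

The key features are that ML and MR enter with opposite signs (so in principle their effects could cancel against each other), and that the scalar and pseudoscalar contributions come with a large enhancement factor, of which more below.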

LHCb says that this ratio should not be larger than 2 or smaller than 1/3. This leads to model-independent constraints on the mass scales suppressing the 4-fermion operators. And so, the lower bound on ML and MR is about 30 TeV, similar in size to the standard model contribution. The bound on the scalar and pseudoscalar operators is much stronger: MS,MP≳150,200 TeV. \begin{digression} The reason is that the contribution of the vector operators to the Bs→μμ decay is suppressed by the small ratio of the muon and Bs masses, which goes under the name of helicity suppression. The Bs has spin zero, and a vector particle mediating the decay always couples to 2 muons of the same chirality. In the limit mμ=0, when chirality=helicity, the muon spins add up, which forbids the decay by angular momentum conservation. \end{digression}
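A quick back-of-envelope shows how the helicity suppression translates into the stronger scalar bound. The scalar and pseudoscalar amplitudes evade the suppression afflicting the vector ones, which strengthens the bound on MS and MP by roughly the square root of the enhancement factor (its exact form is my assumption; masses are approximate PDG values):

```python
# Why the scalar/pseudoscalar bounds are ~5-6x stronger than the vector ones.
m_Bs = 5.37          # GeV, Bs meson mass
m_mu = 0.106         # GeV, muon mass
m_b, m_s = 4.2, 0.1  # GeV, quark masses entering the scalar matrix element

# Amplitude-level enhancement of the scalar operator over the
# helicity-suppressed vector one (assumed form, see lead-in).
enhancement = m_Bs**2 / (2 * m_mu * (m_b + m_s))

M_vector = 30.0                          # TeV, quoted bound on M_L, M_R
M_scalar = M_vector * enhancement**0.5   # amplitude ~ 1/M^2, so bound ~ sqrt

print(f"enhancement ~ {enhancement:.0f}, scalar bound ~ {M_scalar:.0f} TeV")
```

The result lands in the 150-200 TeV range quoted above.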

Consequently, the LHCb result can be interpreted as a constraint on new physics capable of generating the 4-fermion operators listed above. For example, a generic pseudoscalar with order-1, flavor-violating couplings to quarks and leptons must be heavier than about 100 TeV. It may sound surprising that the LHC can probe physics above 100 TeV, even if indirectly. But this is in fact typical for B-physics: observables related to CP violation and mixing of B-mesons are sensitive to similar energy scales (see e.g. Table I of this paper). Notice, however, that 100 TeV is not a hard bound on new pseudoscalars. If the new physics has a built-in mechanism suppressing the flavor violating couplings, then even weak-scale masses may be allowed.

Now, what happens in SUSY? The bitch always comes in a package with an extended Higgs sector, and the exchange of the heavier cousins of the Higgs boson can generate the scalar and pseudoscalar operators above. However, the bounds on the heavy Higgs masses from Bs→μμ will always be much weaker than the 100 TeV quoted above. Firstly, the Higgses couple to mass, thus the Yukawa couplings relevant for this decay are much smaller than one. Secondly, the Higgses have flavor conserving couplings at tree level, and flavor violation is generated only at 1 loop. Finally, models of low-energy SUSY always assume some mechanism to suppress flavor violation (otherwise all hell breaks loose); in typical realizations flavor violating amplitudes are suppressed by the CKM matrix elements, much as in the standard model. All in all, SUSY appears less interesting in this context than other new physics models, and SUSY contributions to Bs→μμ are typically smaller than the standard model ones.

But then SUSY has many knobs and buttons. The one called tanβ -- the ratio of the vacuum expectation values of the two Higgs fields -- is useful here because the Yukawa couplings of the heavy Higgses to down-type quarks and leptons happen to be proportional to tanβ. Some SUSY contributions to the branching fraction are proportional to the 6th power of tanβ. It is then possible to pump up tanβ such that the SUSY contribution to Bs→μμ exceeds the standard model one and becomes observable. For this reason, Bs→μμ was hailed as a probe of SUSY. But, at the end of the day, the bound from Bs→μμ on the heavy Higgs masses is relevant only in a specific corner of the parameter space (large tanβ), and even then the SUSY contribution crucially depends on other tunable parameters: the Higgsino and gaugino masses, the mass splittings in the squark sector, the size of the A-terms, etc. This is illustrated by the plot on the right, where the bounds (red) change significantly for different assumptions about the μ-term and the sign of the A-term. Thus, the bound may be an issue in some (artificially) constrained SUSY scenarios like mSUGRA, but it can be easily dodged in the more general case.
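To see how dramatic the tanβ knob is, here is a toy illustration of the scaling just described (the schematic tan⁶β/M_A⁴ form is from the text; the overall constant, folded into `c`, stands in for all the spectrum-dependent factors and is a made-up number):

```python
# Toy illustration of the tan(beta)^6 scaling of the SUSY contribution
# to BR(Bs -> mu mu). Schematic only: `c` hides all the spectrum-dependent
# factors (Higgsino/gaugino masses, squark splittings, A-terms, ...).
def susy_contribution(tan_beta, M_A, c=1.0):
    """Relative size of the SUSY contribution, in arbitrary units."""
    return c * tan_beta**6 / M_A**4

# Raising tan(beta) from 10 to 50 at fixed heavy-Higgs mass M_A boosts the
# contribution by 5**6 = 15625 -- which is why only the large-tan(beta)
# corner of the parameter space is actually probed.
boost = susy_contribution(50, 500) / susy_contribution(10, 500)
print(f"boost from tan(beta) 10 -> 50: {boost:.0f}x")
```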

To conclude, you should interpret the LHCb measurement of the Bs→μμ branching fraction as a strong bound on theories of new physics coupled to leptons and, in a flavor violating way, to quarks. In the context of SUSY, however, there are far better reasons to believe her dead (flavor and CP, the little hierarchy problem, direct searches). So one should not view Bs→μμ as the SUSY killer, but as just another handful of earth upon the coffin ;-)

Some pictures borrowed from Mathieu Perrin-Terrin's talk


Anonymous said...

typical elaborate explanation boiling down to the fact that a qft with 100+ parameters is simply not falsifiable?

Anonymous said...

Nice article, thanks :-)

and forget the first scornful comment by the other anonymous ;-)

muon said...

Nice informative post.

I guess people were hoping to measure a rate above the SM prediction - that would have provided useful constraints on parameters of a new physics model like SUSY (imagine a band in the tanβ-MA plane); you're right that observing a rate consistent with the SM is less useful.

Are there other quark-sector opportunities like this one? As far as I know, forbidden tau decays can also probe very high mass scales and people talk about using LHC data to try to extend the stringent bounds already coming from the B factories.

Jester said...

Right, an excess would favor large tanbeta and low mA within the MSSM.

As for what's the next big thing, that's an interesting question. It seems that Bd-to-mu-mu is not too far away, so I guess LHCb could pinpoint it when more luminosity is collected after the upgrade. But I don't know if, apart from that, there's another clean flavor observable at the LHC with a large sensitivity to BSM, maybe someone who knows will comment... I know nothing about the tau decays.

Luboš Motl said...

It is absolutely not true that a theory with 100 parameters is unfalsifiable.

Loop quantum gravity has infinitely many undetermined parameters but it is still perfectly falsifiable. The infinite-dimensional manifold simply avoids the truth and one may show it.

On the other hand, the truth does belong to many manifolds, including many manifolds with a high dimension.

So these two things have nothing to do with each other. The only correlation here is that the confirmation of a theory with a small number of parameters is a "stronger confirmation". But we only have it for the Standard Model as a partial theory of Nature now.

What I find remarkable is that the claims about "theories not being falsifiable" are often being screamed by the very same people who also permanently scream that the theories were just falsified.

It's nonsense that the muon decay of B-mesons has anything to do with the "grave" of supersymmetry. If one actually knows the subclasses of SUSY models and doesn't view SUSY as a monolith (one that must be destroyed), he must know that many of these subclasses don't have even a tiny tension with the LHCb data. It's perhaps inconvenient for someone but it's true.

Tim Preece said...

In the expression for the branching fraction relative to the SM it looks like the contributions from ML and MR come in with opposite sign.

Would that imply that the contributions could cancel? So how does the 30 TeV limit arise?

And apologies in advance if this is a dumb question.

Jester said...

When people quote this kind of bound they usually assume only one operator is turned on, with the remaining ones set to zero. Strictly speaking, if you don't care about fine-tuning, they could all be large (that is to say, the M's could be smaller) but cancel against each other so that you don't see any effect.

Robert L. Oldershaw said...

Regarding the fin de siecle heralded by the impending demise of string theory, supersymmetry and related lost causes, the comments of a former visionary physicist seem especially appropriate.

"How can physics live up to its true greatness except by a new revolution in outlook which dwarfs all its past revolutions? And when it comes, will we not say to each other, 'Oh, how beautiful and simple it all is! How could we ever have missed it for so long!'." John Archibald Wheeler

Perhaps what is needed is not a fix, or fixes, for the old paradigm, but rather a completely new and unifying paradigm for physics. Do we have the courage, honesty, dedication, perseverance and humility that will be needed to realize such a paradigmatic change?

Anonymous said...

Please change "all hell breaks lose" to "all hell breaks loose". I won't be able to sleep until you do.

Anonymous said...

@Robert L. Oldershaw:

I don't know why this is, but I've gotten the impression that you keep copy-pasting the same trolling about SUSY, ST, and any BSM and even experimentally established physics (sometimes enriched with vigorous promotions of your own "alternative ideas") below every article written in every physics blog, independent of the topic of the blog post ...

I mean, somebody could write in principle about the first snow this winter, which would not prevent you from writing always the same boring stuff dismissing everything physics has achieved since the beginning of the 20th century.

It is somewhat amusing to observe that the renormalization group flow of your commenting habit reached an attractive nontrivial fixed point quite some time ago; what you keep repeating everywhere is exactly self-similar :-D.
But note that repeating a thing below any article of every physics blog a thousand times does not make it more true or relevant than it was the first time ;-)

At first I was annoyed by you, but now seeing you popping up everywhere like this seems simply ridiculous to me and it just makes me LOL :-D.



Robert L. Oldershaw said...

Morning Dilaton,

You say: "always the same boring stuff dismissing everything physics has achieved since the beginning of the 20th century."

This is patently false. I have the greatest respect for General Relativity, Quantum Mechanics, and most of the well-tested physics from before the pseudo-science era, which began in earnest in the late 1970s and has been gaining strength ever since.

If you think what I am proposing is boring, would you like to see 14 definitive predictions (remember what they are?), of which 5 are already verified or have considerable and growing empirical support?

Here you go:

Discrete Scale Relativity

Anonymous said...

@Oldershaw: Different anon here. Not sure what you mean by "pseudoscience era", but hopefully you understand that the Standard Model has been ludicrously successful, again and again, in predicting the results of experiment, out to many, many significant digits. In fact, it has turned out to be far more accurate than anyone expected. Disappointingly accurate, even. Physicists are always hoping to find some slight discrepancy between measured numbers and the numbers calculated from the Standard Model.

Ervin Goldfain said...

Quoting Weinberg, "SUSY’s plausibility is reduced, but not to zero."

This is where we stand today. Claiming knowledge about where the future LHC results might take us is nothing but sheer speculation.

Anonymous said...

Susy: The BS is strong with this one ;))