Is New Physics Running Out of Corners?

Friday was the last occasion for Moriond participants to see new results on specific physics topics since Saturday is reserved for summary talks.  The topic was ‘Beyond the Standard Model’ — a very large subject, which covers an incredible number of theoretical models, from Supersymmetry to Two-Higgs-Doublet Models, two of the most discussed topics of the day.

Schema of top decays, ranging from unmerged to fully merged (boosted) (talk presented by Patrizia Azzi at Moriond 2014).

Each talk addressed more than one theoretical model, as the experiments prefer to focus on model-independent results. With each talk, however, the space left for new physics by the latest measurements appeared smaller and smaller. In fact, Jean Iliopoulos highlighted in his summary talk that it is becoming harder to say “new physics must be around the corner,” as we are “running out of corners!” So, I’ll focus on a topic that appears less theoretical, even if it is treated in close collaboration with theorists: searches for new physics with boosted topologies. This topic was presented by Patrizia Azzi on behalf of the ATLAS and CMS collaborations.

What is a boosted topology? The term, which derives from “Lorentz boost”, is applied when a particle has energy equal to or above twice its mass. Due to their light masses, this is pretty much always the case for electrons and muons; they are considered “ultra-relativistic” and are not classified in this category. Rather, the term is reserved for much heavier particles, like W, Z or H bosons or top quarks – that is, particles that need much more energy to be boosted. These particles are unstable and are only observed by their decay products and, as a consequence of the boost, their decay products end up collimated in a single jet.

As an example of a boosted topology, consider a top quark decaying into a W boson and a b quark.  If the W boson decays hadronically, it produces two light jets. So, in the non-boosted case, we could expect to reconstruct the top quark from three jets: two light jets and one b-jet. In the boosted case, however, we only observe one collimated jet containing, in its substructure, the two light jets and the b jet. The challenge is to identify such a jet and recognize its components.
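
As a rough rule of thumb (my own back-of-the-envelope illustration, not from the talk), the decay products of a heavy particle are spread over an angular distance of roughly ΔR ≈ 2m/pT, so a top quark only becomes “boosted” into a single fat jet once its transverse momentum reaches several hundred GeV:

```python
def delta_r_estimate(mass_gev, pt_gev):
    """Rule-of-thumb angular spread of a heavy particle's decay products:
    Delta-R ~ 2 * m / pT (valid when pT is well above the mass)."""
    return 2.0 * mass_gev / pt_gev

M_TOP = 173.0  # top-quark mass in GeV

# Moderate pT: the three decay jets are well separated and resolved
print(round(delta_r_estimate(M_TOP, 200.0), 2))  # 1.73
# High pT: everything collapses inside one fat jet of radius ~1.0 or less
print(round(delta_r_estimate(M_TOP, 700.0), 2))  # 0.49
```

The 2m/pT estimate and the pT values here are purely illustrative; the experiments use full jet-substructure algorithms rather than this simple formula.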

Boosted topologies are also studied in searches for a Z’ boson (a heavy Z boson predicted in some new theories) decaying into a top – anti-top pair. The top quarks are boosted for a Z’ with mass above 1 TeV and, at the moment, Z’ bosons are excluded below 1.65 TeV (at 95% confidence level), depending on the model. Such searches represent possible new “corners” for finding new physics, especially as the LHC centre-of-mass energy increases (from 8 TeV to 13–14 TeV) in Run 2.

Reconstructed mass of tau leptons coming from Higgs decays (from talk presented by Riccardo Manzoni at Moriond 2014).

At these energies, boosted topologies will also be important for Higgs boson decays to b quarks or τ leptons.

Another very important topic — concerning the Standard Model and the Higgs boson — was brought up during the Young Scientist Forum. This session features PhD students, who are each given five minutes to present a topic and answer one question — an excellent opportunity to showcase their work. And this topic, evidence of Higgs boson decays into τ leptons, was covered by two students: Nils Ruthmann for ATLAS and Riccardo Manzoni for CMS.

This is a major result. Until the end of 2013, the Higgs boson had only been observed decaying into bosons (γγ, ZZ and WW), although the Standard Model predicts that it should also decay into fermions (ττ, bb, …). Decays in these channels are difficult to identify due to high background rates and final states that are harder to extract (jets versus leptons or photons). Both analyses used multivariate techniques to achieve the goal.

One of the more difficult challenges is to identify the tau leptons: a ττ pair decays fully leptonically in 12% of the cases, leptonically-hadronically in 46% of the cases and fully hadronically in the rest (42%). The plot at the right illustrates the mass of the ττ system, as reconstructed in the hadronic decay mode. The final results present evidence of H → ττ at a significance of 4.1 σ for ATLAS and 3.2 σ for CMS. No new “corner” here, but more key support for the Standard Model, and a very important measurement.
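
For the curious, the 12/46/42 split follows from simple combinatorics, given that a single tau decays leptonically about 35% of the time (a well-known figure, not quoted in the talks). A minimal sketch:

```python
BR_LEP = 0.352            # single-tau leptonic branching ratio (to e or mu)
BR_HAD = 1.0 - BR_LEP     # single-tau hadronic branching ratio

both_lep = BR_LEP ** 2            # both taus decay leptonically
mixed    = 2 * BR_LEP * BR_HAD    # one leptonic, one hadronic (factor 2: either tau)
both_had = BR_HAD ** 2            # both taus decay hadronically

print(f"fully leptonic: {both_lep:.0%}")   # 12%
print(f"lepton+hadron:  {mixed:.0%}")      # 46%
print(f"fully hadronic: {both_had:.0%}")   # 42%
```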

Eve Le Ménédeu is currently a postdoctoral physicist at IFAE (Barcelona), working on the ttH, H → bb analysis and some b-tagging studies. She wrote her thesis at CEA-Saclay on muon spectrometer performance and studies of WZ dibosons. In her spare time, Eve plays the flute and guides underground visits of the ATLAS detector.

Dark Matters

The winter conference season is well under way, and what better way to fill my first blog post than with a report from one of the premier conferences in particle and astroparticle physics: the Rencontres de Moriond.

One of the nice things I like about attending a conference is that it lets me step away from my day-to-day work and think again about the wider context of what we do as physicists. In this conference, it was the progress being made in our understanding of dark matter that best seemed to bring together work from many different areas of investigation. (Note that some of the results I will mention were already included in Jessica Levêque’s post No Matter How Hard You Try… Standard Is Standard).

Artist’s impression of dark matter (in blue) surrounding the Milky Way. Credit: ESO/L. Calçada

Dark matter is the material that holds galaxies and clusters of galaxies together – the evidence for its existence from astronomical measurements is overwhelming. The problem: no one knows what dark matter actually is. None of the particles we know will do the job, not even the elusive neutrinos. All we do know is that it must be electrically neutral, very weakly interacting, and stable over billions of years. But that’s pretty much it.

What to do? Well, we could try to detect collisions between dark matter particles and ordinary atoms. At the conference, the LUX and CDMS collaborations reported on their searches for this mysterious substance. Neither group saw any evidence of a signal, more or less ruling out potential hints seen by other groups over the last few years. In addition, several searches for dark matter production at the ATLAS and CMS experiments were reported, also with null results.

But a lack of positive signals is not the end of the story. Far from it – three talks in particular showed how our models for dark matter are evolving, with constraints from many directions.

The first of these addressed so-called supersymmetric dark matter. Supersymmetry is a group of models that predict new particles, as yet undiscovered, to explain several mysteries in particle physics (see So where is all the SUSY?, by Zach Marshall). In many supersymmetric models, one of the new particles is the dark matter particle, but there is no solid prediction of its mass to guide searches. Lorenzo Calibbi, however, combined astrophysical observations, searches for supersymmetry at the LHC and a few educated assumptions to argue that the dark matter particle should be at least 24 times as massive as the proton – if it is supersymmetric.

New results from LUX and CDMS (solid lines) contradict previous hints of dark matter signals (shaded blobs).

Another possibility for dark matter is a particle called an axion. This would be very difficult to detect; in fact, the searches I mentioned would have no chance of observing it. A surprise constraint arrived during the week, with the announcement by BICEP-2 of the detection of particular patterns in the cosmic microwave background (CMB) radiation. This radiation was emitted early in the history of the universe, when it was about 380,000 years old – for comparison, the current age of the universe is 13.8 billion years. Their results are very fresh, and many groups will be seeking to replicate the observation, but if confirmed it is essentially the last major piece of evidence for the cosmic inflation that occurred shortly after the Big Bang.

What does this have to do with axion dark matter? Quite a lot, actually. There were big uncertainties in the possible properties of axions, depending on whether they were created before or after inflation. The observation by BICEP-2 rules out the creation of axion dark matter before inflation, giving a more precise target for future axion searches.

Finally, there was a proposal by Marco Drewes that perhaps we don’t need exotic new theories at all. The very fact that neutrinos have mass implies that they should have partner particles – one each for the electron, muon and tau neutrinos. He showed that, if these partners – right-handed neutrinos – are arranged in just the right way, one of them could be the dark matter particle. Even better, the other two could explain why we live in a universe of matter rather than antimatter. By conventional standards, this proposal is artificial, without any solid theoretical motivation, but it’s testable, and that is music to an experimentalist’s ears.

Mike Flowerdew is a post-doctoral researcher at the Max Planck Institute of Physics in Munich, Germany. He is currently searching for evidence of supersymmetric particles in the ATLAS data, and was also responsible for the calibration of the muon systems during data-taking.

The Neutrino Puzzle

Having explored the latest results on what we call ‘heavy flavour’ physics – the physics of particles containing a b-quark (see The Penguin Domination by Jessica Levêque) – we embarked on a much lighter subject: neutrinos.

It was as if a fresh breeze swept through the audience. Partly because we are surrounded by snow-capped mountains but mostly because of the topic — neutrino physics has been bubbling with activity these past few years. Many new measurements were shown, adding several pieces to the neutrino puzzle. But we are still far from having a clear idea of the picture we are trying to build, piece by piece.

Neutrinos are special particles. They are at the heart of some of the most exciting fundamental problems that particle physicists are trying to solve. But neutrinos are elusive, a characteristic that makes them difficult to study. Physicists must use their ingenuity to compete in developing new kinds of detectors capable of measuring neutrinos coming from different sources.

Neutrino sources studied by experiments

There are a few things we know about neutrinos. In the Standard Model, neutrinos are neutral leptons that were long thought to be massless. There are three neutrino species — electron, muon and tau neutrinos — each associated with one of the three charged leptons in the Model: the electron, muon and tau. They are the second most common particle in the universe after the photon, but are not well known to the public. They interact with matter only through the weak interaction, which makes them difficult to catch. But physicists like challenges, and build experiments to detect and measure the flux of neutrinos coming from sources outside our solar system or from the Sun, through the atmosphere, or produced by terrestrial nuclear power plants and particle accelerators.

Most of these experiments were only sensitive to one neutrino species, and at first all these measurements appeared to be inconsistent. The picture got clearer when the Super-Kamiokande experiment in Japan established in 1998 that neutrinos can oscillate from one species to another, which means that an electron-neutrino can transform itself into a muon-neutrino and vice versa. This explains, for instance, why the measured solar electron-neutrino flux is well below the one predicted by the solar model: a fraction of them oscillate into muon-neutrinos that were not detected. The important consequence of oscillation is that it can only occur if neutrinos have mass!
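
In the simplest two-flavour picture, the oscillation probability has a compact closed form that makes the mass dependence explicit: if the mass-squared difference Δm² is zero, no oscillation can occur. A minimal sketch (the parameter values are illustrative, not from the talks):

```python
import math

def survival_probability(theta, dm2_ev2, baseline_km, energy_gev):
    """Two-flavour neutrino survival probability:
    P = 1 - sin^2(2*theta) * sin^2(1.27 * dm2 * L / E),
    with dm2 in eV^2, baseline L in km and energy E in GeV.
    With dm2 = 0 (massless or degenerate neutrinos), P is always 1."""
    phase = 1.27 * dm2_ev2 * baseline_km / energy_gev
    return 1.0 - math.sin(2.0 * theta) ** 2 * math.sin(phase) ** 2

# Atmospheric-like parameters: near-maximal mixing, dm2 ~ 2.4e-3 eV^2
print(survival_probability(math.pi / 4.0, 2.4e-3, 500.0, 1.0))
# No mass difference -> no oscillation, whatever the mixing angle
print(survival_probability(math.pi / 4.0, 0.0, 500.0, 1.0))  # 1.0
```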

Neutrino masses with respect to the other Standard Model particles (fermions)

Since then, new experiments have been built to measure the probability of oscillation between different neutrino species and to infer a measurement of their mass. At the conference, several measurements of these parameters were shown, and we now know with fair precision the different oscillation probabilities as well as the mass differences between neutrino species. However, we still don’t know the masses themselves, although cosmological experiments allow us to set an upper limit on the sum of the masses of the three neutrino species, which is below an eV (electronvolt). Moreover, new experimental inconsistencies have appeared: some experiments do not observe the expected number of neutrinos, even with oscillations taken into account.

So now, new questions have arisen: Where does the neutrino mass come from? Why is it so far from the other lepton masses? As it is massive and weakly interacting, could the neutrino be part of the dark matter of the universe? Is the neutrino its own anti-particle? Are there more than three neutrino species? Where do the most energetic neutrinos come from?

Some experiments like IceCube are now able to map neutrinos coming from the universe and this is like doing astronomy with neutrinos!

Neutrino skymap as measured by the IceCube experiment

During the session, several theorists proposed models that try to reconcile the different observations and answer the above questions: Couldn’t there be a new species of neutrino into which the others could oscillate? Is the neutrino description in the Standard Model complete: couldn’t neutrinos have (like the other leptons) right-handed partners? This last option is interesting since it could explain why the standard neutrino mass is so small, and perhaps also part of the universe’s dark matter, as the right-handed neutrinos could be very massive.

Theoretical talks alternated with experimental ones describing future experiments that are currently being developed to help solve the puzzle. These experiments are being built by smaller collaborations compared to the LHC teams. The experiments can be located at the South Pole, to take advantage of the ice as an interacting medium for the detector, or in the depths of a disused mine, to fight efficiently against the cosmic-ray background. The proposed technologies are also very different depending on the aim of the measurement, but all experiments need a very low and well-controlled background, as the number of observed neutrinos is always small.

Stay tuned! There is no doubt that new results on neutrinos will come soon but in the meantime, my colleagues and I will catch some fresh air during a long lunch break up on the snowy mountains. After all, it is important to rest our brains in order to prepare for presentations of the top quark, the Higgs boson and other new results from the LHC in the next sessions.

So, what does a particle physicist, with her brain at rest, see in the surrounding mountains?



Higgs boson decaying into two photons: the bump over the background, as seen by the ATLAS experiment

the Higgs boson of course!!!

Sabine Crépé-Renaudin is a French researcher at CNRS. She is involved in Grid Computing (the grid is like a worldwide distributed computing centre used to reconstruct and analyse LHC experiment data). Her main analysis activity is the search for new phenomena beyond the Standard Model in top-antitop quark final states. She also devotes part of her time to outreach activities.

No Matter How Hard You Try… Standard is Standard.

The past two days of the Rencontres de Moriond 2014 Electroweak conference have been very intense, with many new experimental results, many insightful theoretical talks and many lively discussions. Since the topics cover neutrino experiments, astrophysical observations and Standard Model precision measurements, giving a summary is not an easy task. But I will try.


Fig. 1 – Is the Higgs boson the last missing piece of the Standard Model or part of a much bigger puzzle? (image courtesy of minutephysics)

The discovery of the long-sought Higgs boson, the last missing piece of the Standard Model of particle physics, was announced in July 2012 by both the ATLAS and CMS collaborations at CERN, and the Nobel prize was awarded in October 2013 to Peter Higgs and François Englert, for proposing the mechanism responsible for breaking the electroweak symmetry and giving mass to the Z and W bosons.

In less than two years, the experimental paradigm shifted from the search for a new particle to precise measurements of its properties. The newly discovered boson has to be perfectly characterized to make sure it is exactly the one predicted by the Standard Model and not an impostor. The first results published in 2012 by both ATLAS and CMS established the bosonic nature of the new particle as well as its couplings to the W and Z bosons and to photons. But this was not enough. Because the Higgs boson is not only responsible for the W and Z masses but also for all the Standard Model particle masses, we have to establish that it directly couples to fermions.
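
Concretely, in the Standard Model the Higgs coupling to each fermion (the Yukawa coupling) is fixed entirely by the fermion’s mass, y_f = √2·m_f/v with v ≈ 246 GeV. A quick illustration (approximate masses; my own numbers, not anything shown at the conference):

```python
import math

HIGGS_VEV = 246.0  # GeV, Higgs field vacuum expectation value

def yukawa_coupling(mass_gev):
    """Standard Model Yukawa coupling: y_f = sqrt(2) * m_f / v."""
    return math.sqrt(2.0) * mass_gev / HIGGS_VEV

# Approximate fermion masses in GeV
for name, mass in [("top quark", 173.0), ("b quark", 4.18), ("tau lepton", 1.777)]:
    print(f"{name:10s} y_f = {yukawa_coupling(mass):.4f}")
```

This is also why the heavy fermions (t, b, τ) are the first fermionic couplings within experimental reach: the lighter the fermion, the weaker its coupling to the Higgs boson.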


Fig. 2 – From Moriond talk of Eilam Gross presenting measurements of Higgs boson coupling to fermions.

One of the important results of this conference is that both ATLAS and CMS showed very strong evidence of the Higgs boson decaying into b quarks and tau leptons (see Fig. 2). The official combination is expected to be published later this year. We can already see from the individual points that the final combination will provide the required statistical significance to claim observation of the Higgs coupling to fermions, with a strength compatible with Standard Model predictions. Standard Model 1 : New Physics 0.


Fig. 3 – From Moriond talk of Eilam Gross presenting measurements of Higgs boson production and decay rates.

In addition, both ATLAS and CMS released a large number of measurements of the Higgs boson’s production and decay rates over a significant number of final states during the past year, which were summarized today in clear and comprehensive talks. A difficult channel, the associated ttH production, is just at the edge of our sensitivity, but the ATLAS and CMS data clearly show a hint of this production mode. The combination of all these measurements allows one to probe the deviation of the entire set of Higgs coupling constants from the Standard Model. And no matter how hard you try, everything beautifully aligns to 1 (as shown in Fig. 3). Standard Model 2 : New Physics 0.


Fig. 4 – From Roberto Covarelli’s Moriond talk presenting measurement of Higgs boson decay width.

One of the last highlights of the conference was the measurement of the Higgs boson width. The “width” of a particle depends on its lifetime, or in other words, its decay probability. For the 126 GeV Higgs boson, the Standard Model predicted width is 4 MeV. The previously established limit, obtained from the width of the reconstructed mass peak, could only constrain the Higgs boson width to values smaller than 3.4 GeV. The analysis presented by CMS today uses a new method. (I won’t enter into technical details here, but the idea is that the production rate of the Higgs boson decaying into ZZ* at high energy depends on the Higgs boson width.) This new measurement shows a remarkable sensitivity and constrains the Higgs boson width to be below 17 MeV, more than two orders of magnitude better than the previous limits! Standard Model 2.5 : New Physics 0. Only half a point here, as the Higgs boson is still allowed to decay into invisible new particles less than 50% of the time, but this still leaves enough room for new physics to sneak in. It may be the only place, actually.
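
The connection between width and lifetime comes from the uncertainty relation τ = ħ/Γ. A quick back-of-the-envelope conversion (my own arithmetic, using the values quoted above):

```python
HBAR_MEV_S = 6.582e-22  # reduced Planck constant, in MeV * seconds

def lifetime_seconds(width_mev):
    """Mean lifetime from a particle's decay width: tau = hbar / Gamma."""
    return HBAR_MEV_S / width_mev

print(f"{lifetime_seconds(4.0):.2e}")   # SM width of 4 MeV   -> ~1.6e-22 s
print(f"{lifetime_seconds(17.0):.2e}")  # CMS limit of 17 MeV -> lifetime above ~3.9e-23 s
```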

Could this be the end for new physics models? It’s becoming a serious question for theorists, since no hint of a deviation from the Standard Model predictions has been found yet, despite the huge amount of data analyzed so far. From all that I’ve heard from theorists today, here are a few phrases that grabbed my attention:

  • “We are lucky that experiments find anomalies from time to time, as it allows us to publish papers”.
  • “the Standard Model may after all not be an effective theory at low energy of a more fundamental theory, but might very well be the fundamental theory itself.”
  • “Any new physics model must only be a small fluctuation around the Standard Model predictions”

So, do we really need more than the Standard Model? What are the questions that are not answered so far? There seem to be only a handful of them remaining: neutrino oscillations (or mass), baryon asymmetry, dark matter, dark energy and inflation.


Fig. 5 – A model and his model.

The inflation problem is actually in the spotlight right now. A “guest” talk was added at the last minute to our conference agenda, to present the recent observation published by the BICEP-2 experiment at the South Pole. The measurement of the polarisation of the cosmic microwave background shows a signal that could (under minimal assumptions) very well be compatible with the inflation model (which is needed to expand the universe at a very high rate in its early stages). This polarisation signal could also be the experimental proof of gravitational waves, the last of Einstein’s predictions that remains to be validated. It also looks like one of the simplest inflation models is sufficient to explain the BICEP-2 observations.

The same “minimalistic” trend is being considered in our field. For example, one of the new models presented by Marco Drewes proposes the addition of only three right-handed neutrinos to the Standard Model to solve all the remaining issues, in particular dark matter, baryon asymmetry and neutrino masses.

An interesting suggestion was made today towards a minimalistic model choice. Besides the obvious need to accurately describe all the experimental results gathered so far in particle physics, its equations must fit on a medium-sized T-shirt that physicists can wear. And today, it seems that only the Standard Model successfully fulfills both criteria.

Jessica Levêque is a French researcher at CNRS. Her main activities are the measurement of the Higgs boson properties in the di-photon channel, improvement of photon and electron performance, data quality monitoring and the tracker upgrade for LHC Phase II in 2022. She also devotes part of her time to outreach activities. And when she’s not at work, she climbs mountains.

The Penguin Domination

This week features the 2014 Moriond Electroweak conference at La Thuile, Italy. More than 100 particle physicists have gathered from all around the world. Started 50 years ago, this conference is still highly valued, year after year, due to the high quality of its talks. The Moriond winter conference is one of the most exciting conferences, as all the particle physics experiments present their brand new results, but it is also appealing because of the mountains and the great Italian food.

Example “penguin diagram”, in which a Bs decays through a loop (image courtesy of Symmetry).

Standard Model “penguin” diagram for the process in the LHCb study (image courtesy of the Belle Experiment).

Being a particle physicist is an amazing job. In theory. But not always in practice. Understanding the fundamental laws ruling the universe is indeed exciting, but our daily work usually includes endless fights with reluctant computers, digging into the cryptic world of C++ coding and GRID error messages, and working countless hours for months or even years to come up with new ideas and new analyses to challenge the theoretical predictions – only to open the box of data and find that there is actually nothing to be found. Even if not finding anything is sometimes a scientific result worth publishing, it can be quite frustrating at times.

That’s why taking a few days off from the computer and a few steps back is an absolute necessity for many of us. It’s a great way to return to the original excitement, an opportunity to look at our field with fresh eyes, a rested mind, to see only the beauty behind the fights. It’s also a way to meet new people, outside of a confined meeting room, in a more relaxed environment, discuss and challenge the new results on a ski-lift, elaborate new projects, and go back home with renewed inspiration and motivation.

A conference is an opportunity to broaden our views. Because each step of a physics analysis is so technical, many of us end up working in a tiny area of specialization. But our field is wide, and we need to know about all the orthogonal paths explored by other people, with different experiments, different particles and different methods – first, to learn about them out of curiosity, but also to gain perspective on our own work and open our minds to new ideas.

Today was the first session of the conference, about heavy flavoured particles, also known as ‘B-physics’. Today was also the first ski session, during the lunch break, and the first associated sunburns. But no broken leg to report so far. B-physics is quite far from my field of expertise, and is a complex topic, both from the theoretical and the experimental side. What I know is that it is one of the first places where we found that matter and anti-matter behave slightly differently, which was the first hint on the path to understanding why our universe today is only made up of matter, although our equations tell us that matter and anti-matter must have been produced in equal quantities in the early universe. So where did all the anti-matter go?

From the presentation of Wolfgang Altmannshofer at Moriond EW 2014

Heavy flavour physics has taught us that particles and anti-particles can behave slightly differently in their decays. Today, it’s also teaching us that some of the anti-particles partly break the most intuitive symmetry rules (charge-parity conservation). But antimatter breaks the rules in a precisely predictable way. That’s the beauty of physics. By studying all the possible behaviours and decays predicted by the Standard Model, and by looking for possible deviations, we can discover or exclude the contribution of new physics models, in which additional particles could interfere.

The B-physics measurements are extremely sensitive to such new particles, especially those rare decays that are dominated by “penguin diagrams”. These are complex processes that include a loop of intermediate particles which, when viewed with the right imagination, can appear as penguins1. By studying the rates of these rare decays, one can infer the contribution of even very heavy particles that would have escaped direct detection at the LHC so far, because they are too heavy to have been produced with the 8 TeV of collision energy that has been available up until now. We still have to wait a few more months until the LHC restarts with its upgraded magnets to explore the 13 TeV energy frontier.

The summary of the B-physics session today is that pretty much everything behaves as predicted with incredible accuracy, showing once again how well our Standard Model describes the physics of the particles and their interactions. But tensions exist between the measurements and the predictions, for example in the B to K*μμ channel, as shown in the histogram. More data and theoretical explorations are needed to validate or exclude them further, but…

I must admit I’ve never met anyone, besides physicists, who is so excited when proven wrong and almost disappointed when everything behaves as expected. Because ‘wrong’, for a physicist, either means there is something we did not measure properly and could certainly do better, or that there is a hidden underlying mechanism, something new yet to be discovered. And that’s what we live for.

1 An amusing description of penguin diagrams, from the person who coined the term, appeared in Symmetry Magazine, June 2013.

Jessica Levêque is a French researcher at CNRS. Her main activities are the measurement of the Higgs boson properties in the di-photon channel, improvement of photon and electron performance, data quality monitoring and the tracker upgrade for LHC Phase II in 2022. She also devotes part of her time to outreach activities. And when she’s not at work, she climbs mountains.

Letters from the Road

I’ve been lucky to get to make two workshop / conference stops on a trip that started at the very beginning of October. The first was at Kinematic Variables for New Physics, hosted at Caltech. Now I’m up at the Computing in High Energy Physics conference in Amsterdam. Going to conferences and workshops is a big part of what we do, in order to explain our work to others and share what great things we’re doing, in order to hear the latest on other people’s work, and – and this one is important – in order to get to talk with colleagues about what we should do next.

The first workshop, KVNP, was a small ATLAS-CMS-Theory discussion of what new variables we should use for our searches when the LHC restarts. A lot of searches that we do use pretty simple variables like the amount of energy in the calorimeter and the momentum imbalance transverse to the beam. But there are some very clever variables that you can put together that estimate some interesting things, like the mass of a new particle that you might have just created under certain assumptions. That’s a really interesting thing to be able to do, and if we discover something in 2015, then it’s going to be very important to be able to estimate that sort of mass. For now, however, I left with the impression that we really should be doing simple things at first, trying to understand very carefully what we’re doing, and avoiding adding much complexity to our searches. We’ll have plenty of time for complexity later on!
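
To make “simple variables” concrete: the momentum imbalance transverse to the beam is just the (negated) vector sum of the visible objects’ transverse momenta, and a typical search variable adds its magnitude to the scalar sum of jet pT. A minimal sketch with made-up numbers (not any specific ATLAS or CMS definition):

```python
import math

def missing_et(visible_pxpy):
    """Magnitude of the transverse momentum imbalance: the invisible recoil
    is minus the vector sum of all visible (px, py), so its size is |sum|."""
    sum_px = sum(px for px, _ in visible_pxpy)
    sum_py = sum(py for _, py in visible_pxpy)
    return math.hypot(sum_px, sum_py)

def effective_mass(jet_pts, met):
    """A common simple search variable: scalar sum of jet pT plus MET."""
    return sum(jet_pts) + met

# Hypothetical event: three jets, transverse momenta in GeV
jets = [(120.0, 10.0), (-80.0, -15.0), (-10.0, 30.0)]
met = missing_et(jets)
print(round(met, 1))  # 39.1
print(round(effective_mass([math.hypot(px, py) for px, py in jets], met), 1))
```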

Now, up at CHEP, I’m listening to some interesting discussions about the future of computing in our field. There are a lot of difficult problems in modern computing, not the least of which is how our software should change now that we know that Moore’s law continues to be correct but processor clock speed has capped out.

Moore’s law describing the number of transistors on a chip. Clock speed, however, has peaked out at around a few GHz and not grown in the last 10 years or so.
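
One reason flat clock speeds force a software rethink: adding cores only helps with the part of a program that actually runs in parallel, as Amdahl’s law makes plain. A small generic illustration (not tied to any specific HEP workload):

```python
def amdahl_speedup(parallel_fraction, n_cores):
    """Amdahl's law: overall speedup on n cores when only a given
    fraction of the running time can be parallelised."""
    serial_fraction = 1.0 - parallel_fraction
    return 1.0 / (serial_fraction + parallel_fraction / n_cores)

# Even with 95% of the running time parallelised, 64 cores give ~15x, not 64x
print(round(amdahl_speedup(0.95, 64), 1))  # 15.4
```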

These talks really require a crystal ball, as there are some technologies that will soon be mainstream, and when betting one runs the risk of being on the losing side, à la Betamax tapes, LaserDiscs, HD-DVD and others. Many of the people here don’t think as much about physics – I would be interested to see a poll of the number of them actively working on a physics analysis, for example. But it’s very interesting to hear what they have to say about computing issues, where they are far better equipped than the average physicist. The conference is a bit strange – already two talks I’ve wanted to hear have been cancelled, and one has been given remotely by a video connection, thanks to the US government shutdown that was affecting the US labs until Thursday. But it’s been fun all the same. I’ll leave you with a few discussion questions that I found really quite interesting this week:

  • Can ATLAS or CMS run without a trigger by 2022? The rate off of the detector is something that we might be able to handle (hint: the tracker seems to be the biggest problem!)
  • How do you define “Big Data”? If you define it simply by the volume (in GB) of data, then you’re actually just describing the budget!
  • Are we heading for a world where there is custom code running on different system architectures, or will language and compiler development catch back up and provide homogeneity?


Zach Marshall is a Division Fellow at the Lawrence Berkeley National Laboratory in California. His research is focused on searches for supersymmetry and jet physics, with a significant amount of time spent working on software and trying to help students with physics and life in ATLAS.

So where is all the SUSY?

Supersymmetry (SUSY) is one of the most loved, and most hated, extensions of our beloved Standard Model. It’s loved because it has some very nice features: it can explain dark matter, it has some very suggestive features when it comes to the possibility of unifying the forces (a “grand unified theory”), it can explain why the Higgs mass is so light (though the Higgs is a bit on the heavy side for some versions of SUSY), and it has some other very nice theoretical features that are perhaps a bit technical. It’s hated, for the most part, because the full version has around 300 free parameters. That means SUSY can predict just about anything that the LHC might see, and it also means that SUSY is almost impossible to rule out – even the simple versions!

Limits from all of the different SUSY searches ATLAS has already made public.

There’s been quite a bit of ink spilled lately over where SUSY might be hiding in the LHC data. Tommaso Dorigo has written about the slowly dissolving faith in finding SUSY at the LHC and how an increasing number of theorists think that some of the apparent problems that SUSY solves might not be problems at all, but just our misunderstanding of fundamental physics. Peter Woit has written quite a bit about the trouble with SUSY at the moment and the apparent crisis in theory (quite interesting, but I would quickly be out of my depth in a discussion of that, so I leave it to the professionals).

Nevertheless, the SUSY group in ATLAS hasn’t given up! Quite the opposite: we’ve produced 22 Conference Notes (notes describing physics analyses that aren’t quite ready for publication, often because for publication we like to test a lot of additional signal models that are not decisive in answering the question of “was there a new particle in the data?”) and 2 papers on the 2012 dataset, and have a large number of new searches and papers waiting in the wings. Why bother, and what are all those analyses doing?

Well, as I said, SUSY is a theory that can give a lot of different results. One version might give us heavy (1000-1500 times heavier than the proton) objects called “squarks” and “gluinos” (yes, all SUSY particles have funny names) that decay to quarks, which show up in our detector as things called “jets.” So one set of searches should look for jets coming from heavy objects. In their decays they can also produce electrons, muons, and taus (collectively called leptons), so we also want a search with one, or two, or three, or even four leptons in the final state. Each one of those is a separate analysis, so now we’re starting to get a longer list. Another version of SUSY might give us lighter (300-600 times heavier than the proton) objects that decay to bosons like the W, the Higgs, or the Z. Those give us events with lots of leptons, but without lots of jets – so we should have a different set of searches for those. There is a lot of interest in SUSY making extra top and bottom quarks right now (for reasons that go under the heading “naturalness”). So we also want a set of searches looking for traces of bottom and top quarks in the events. That’s a lot of variety!
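The sorting of events into search channels described above can be caricatured in a few lines. This is purely my own toy illustration — the function, labels, and thresholds are invented, not how any real ATLAS analysis is organized:

```python
# Toy caricature (not an ATLAS analysis): route an event to a rough
# search channel based on its lepton, jet, and b-jet counts.
def channel(n_leptons, n_jets, n_bjets=0):
    """Return an invented search-channel label for an event."""
    if n_bjets >= 2:
        # Extra bottom/top quarks: the "naturalness"-motivated searches.
        return "third-generation (b/top) search"
    if n_leptons == 0 and n_jets >= 2:
        # Lots of jets, no leptons: heavy squark/gluino decays to quarks.
        return "all-hadronic squark/gluino search"
    if n_leptons >= 3:
        # Lots of leptons, few jets: lighter states decaying to W/Z/H.
        return "multilepton (electroweak) search"
    if n_leptons in (1, 2):
        return f"{n_leptons}-lepton search"
    return "other"

print(channel(0, 4))             # all-hadronic squark/gluino search
print(channel(3, 1))             # multilepton (electroweak) search
print(channel(1, 3, n_bjets=2))  # third-generation (b/top) search
```

Real analyses, of course, define each channel with far more care (momentum thresholds, object quality, overlapping control regions), but the basic idea — one dedicated search per final-state signature — is exactly why the list of analyses gets so long.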

Of course, the nice thing about having such a diverse range of searches is that they aren’t only sensitive to SUSY. There are a number of other theories out there that could predict things that our SUSY searches are sensitive to. So even though they’re carried out by the SUSY group, and even though we talk about SUSY models that we set limits on with each of these analyses, they are a great way to cover a huge range of models of new physics. There are a lot of strange models out there (Hephalons?), so searching for things that SUSY predicts provides a nice starting point for covering that enormous set of models (of course, the exotics group on ATLAS is doing some nice work to cover other models!! I’m just a bit biased).

Even if SUSY doesn’t show up in our current data, I still think there’s a decent chance that we will find it – or something like it – in the high energy run set to start in 2015. The SUSY searches could find the first bump, but that doesn’t mean it’s SUSY – we’d have our work cut out for us to understand what we’d found! In the meantime, we can look through strange SUSY models to make sure that we haven’t potentially missed something interesting in our results.



Snowmass from Afar

There’s a (potentially) really big deal in physics that’s just ended: the Snowmass conference. Ken over at the USLHC blog has already mentioned it, and I’ve been watching with interest from here in Geneva as well. The meeting, and its reports, are trying to walk an extraordinarily delicate line that’s interesting for both the physics and the sociology involved. A really nice summary is here.

The Snowmass conferences have a great history in particle physics, including some quite detailed discussions about jet physics that drove the way we talked about jets for much of the last 20 years. It’s only really in the last five that we, as a field, have been able to move beyond what was decided at Snowmass in the early 1990s, in fact. So these can be quite important decision-making meetings, where we (again, as a field) get together to decide how we want to standardize things, where our priorities are as a community, what the important outstanding issues are, and so on.

Snowmass on the Mississippi

The meeting is fairly US-centric. That comes with the good and the bad. It allows us to judge our programmes fairly against those in other countries, see if our trends are the same, and perhaps adjust the way we do things if we see a better model somewhere else. Of course, if we’re trying to make big decisions, it risks making those decisions without the explicit agreement of our friends in Europe – and, of course, without much discussion with colleagues like me who can’t attend — which means they may not be adopted by the full community of physicists. In other words, you get the chance to make important, broad, and sweeping decisions, but you risk no one taking those decisions seriously. It’s quite a challenge!

It’s also a bit of a challenge following the talks from abroad. In ATLAS we have trained people pretty well to make presentation slides as though the audience will have to read the talk without getting to see it presented. That’s critical when a large number of collaborators will be either asleep (in other time zones, not in the room with you, we hope) or in other meetings when you’re giving a talk. At Snowmass there’s a huge mixture between people who work that way and people who don’t; people who treat the slides as notes and people who treat them as a document to be read. Thanks to those who tried to make their notes understandable, though!

This year, money has come up over and over at these meetings, in ways that I’ve not seen it discussed before. Some of these seem to be really quite adult conversations. Part of the charge of the theory working group seems to be to discuss what a reasonable summer salary and budget is for a professor of theoretical physics, so that they might standardize the grant awards a bit more. I don’t know if that’s going to work, but it is a very interesting idea – and to my knowledge, things like that have not been openly discussed at meetings like this before. People are raising the interesting issues and not shying away from the painful ones: are we training students well? Are we just training quantitative analysts for Wall Street? Is the distribution of the community among fields correct? Should we try to influence it? How do we deal with no longer being such a big deal in terms of computing needs? The painful part about watching from here is that I can’t follow the discussions in much detail, and I want to know the answers people are presenting, not just the questions they are asking!

In any case, the various working groups inside of the conference will be producing “white papers” (long reports, basically) over the next few weeks and months. They should contain the first conclusions and answers to some of these questions. Only time will tell if these will be the answers we’ve hoped for that the community can really get behind, or if they will be another voice trying to bring reason to chaos, with only limited success.

Good luck to the working groups — we’re pulling for you!



A Few Missing Steps

After a long hiatus from US ATLAS, I recently started a new job at the Lawrence Berkeley National Laboratory. It’s one of the few remaining labs in the US funded by the Department of Energy that does basic science research. It’s the fourth job I’ve had in four years, all working on ATLAS, and all working on similar projects. This one is different, though: if I pass a performance review a few years from now, I’ll have the lab-equivalent of tenure. I’ve had reactions ranging from “who did you have to kill to get that job?” to “so who did you actually talk to to land that?”

“Piled Higher and Deeper” by Jorge Cham

At the same time, some friends recently pointed out this article breaking down becoming a professor into a few short steps. Well, not so short. From the looks of things, the average time to complete all the steps is 15-20 years. I feel like they may have missed a step or two, as well as a few features of the system, which might be new to some of you. So here, remembering that I’m not an expert in the field of getting a professorship, are a few small steps that they left out:

-) Be in the right place at the right time. I was lucky enough to be one of the first graduates from the LHC with real collision data in my thesis. A few years earlier and I would have either been working on another experiment or not have had the opportunity to help get the detector and software up and running – the latter of which is quite important for job hunts. It’s important to have done some physics analysis (either a search or a measurement), but it is critical to have also done some work on the detector itself!

-) Have the right person retire at the right moment. These days in our field, new positions only open because someone left. Some groups are wise enough to try to hire a new person a few years before their oldest professor is going to retire, so that the new person can be well trained in the operation of the group and the experience can be passed on, but very few have a truly “new” job open up. In my case, a very good guy at Berkeley decided it was time to retire and build a kit car for a few years. And so, a position was born.

-) Happen to have spent the last five years becoming an expert in the areas that the group most wants an expert in. Some groups really want a person who will work on a particular piece of the detector, or a particular physics topic (even more specific than just “working on ATLAS”, for example, some groups want someone who will help measure the Higgs boson properties). In my case, the person who retired was one of the few physicists who had mostly worked on software at the end of his career, and our experiment’s software is what I’ve worked on since I started on ATLAS years ago.

-) Compete against your friends. There are simply not enough jobs in our field for all the good candidates. And I really mean good. Some of the people that I’ve gotten to know here I have enormous respect for as physicists. And, of course, it happens that the people applying for jobs at the same time are all about the same age – and are all the friends you’ve been making in the field.

If it sounds like there’s a lot of luck involved in getting a job, that’s because there seems to be a lot of it these days. It’s an unpredictable challenge, and there are a lot of good physicists who end up getting overlooked by the process. I was very lucky, though there’s an old saying: “Luck is the residue of design.” The design wasn’t my own in this case, so this is my chance to thank all the people who helped get me here!

Now that I’m back, who’s got a question I can help answer?



Want a small scale LEGO® version of the ATLAS detector?

A small scale version of the ATLAS detector could become an official LEGO® product, but I need people to vote for it at LEGO Cuusoo. We need 10,000 votes to be considered by LEGO®.

ATLAS LEGO® mini model

Why a LEGO® box?

Since I presented the original ATLAS LEGO® model in 2011, it has been featured in many outreach events all over the world. The original model has 9517 pieces in total and resembles the real detector in high detail at a scale of roughly 1:50, matching a little plastic man. A smaller version, approximately 22 cm in length, has been designed and shipped 111 times to more than 20 institutes. However, I want to make the smaller model accessible to a wider audience, enabling us to use it for educational purposes, as prizes in competitions, as interactive exhibits and for sale in the ATLAS souvenir shop.

There is huge interest in the LEGO® models. Already, 35 ATLAS institutes have received and built their own versions of the large model and used it on various occasions. Some have used it simply to explain the look and function of the ATLAS detector, and others as a means of getting students involved in constructing the model while discussing the real experiment. It is also always an attractive display at physics conferences.



This June, for the inaugural event of Passport Big Bang, the LEGO® models were used as inspiration for a competition called “Build Your Own Particle Detector” in which both children and adults were invited to be creative and build their own design from the 15 kilos of bricks I provided at the site. More than 300 children along with their parents filled up the tent. The contest lasted for four hours and at the end of the day, we had 82 designs! The event turned out to be a much bigger success than we expected.

We are hoping to get more votes than required. After all, the LEGO® plastic building blocks can come in handy when trying to explain the search for the real building blocks of the universe. Please vote and share.

The ATLAS model,
The “Build your own particle detector” competition,
Vote for the small model

 How to vote (with your Facebook account):

  1. Go to , click ‘Support’ button
  2. Choose your Facebook account (You may skip when asked if LEGO can post for you)
  3. Fill the little two-page questionnaire (tick on ‘Support’)

How to vote (creating a Cuusoo account):

  1. Go to , click ‘Support’ button
  2. Choose “Sign up now” and register
  3. Wait for confirmation email and visit the link therein
  4. Go to and click ‘Support’ button (yes, again)
  5. Fill the little two-page questionnaire

Voting through Twitter is similar to voting through Facebook.


Sascha Mehlhase is a postdoc at the Niels Bohr Institute in Copenhagen. While his current physics interests focus mainly on silicon detectors, searches for stable massive particles (R-Hadrons) and the W-boson mass measurement, he spends a lot of his spare time on outreach activities (not just those involving plastic bricks).
You can find more information at