Nobel Prize-Winning Physicist Peter Higgs Dies Aged 94

The professor, who proposed the existence of the so-called “God particle” that helped explain how matter formed after the Big Bang, died on Monday following a short illness.

Peter Higgs, the physicist who theorised the existence of what is commonly called the “God particle”, which helped explain how matter formed after the Big Bang, has died at the age of 94, the University of Edinburgh said on Tuesday.

The physicist gained global recognition in 2012, when the particle he had predicted almost 50 years earlier – which came to be known as the Higgs boson, or the “God particle” – was finally discovered; he was awarded the Nobel Prize in physics the following year.

The university where Higgs was an emeritus professor said he died Monday after a short illness.

Higgs’ work helped scientists understand one of the most fundamental questions of how the universe was formed: how the Big Bang created something out of nothing some 13.8 billion years ago. Without the mass imparted by the Higgs field, particles could not coalesce into the matter we interact with every day, Higgs proposed in 1964.

But it would be almost half a century before the particle’s existence could be confirmed. In 2012, in one of the biggest breakthroughs in physics in decades, scientists at CERN, the European Organization for Nuclear Research, announced that they had finally found a Higgs boson using the Large Hadron Collider (LHC).

Experiments at the LHC confirmed a central prediction of the Standard Model of particle physics by detecting the Higgs boson – a particle that had long proved elusive.

What is the Higgs boson?

Higgs’ work showed how the boson helps bind the universe together by giving fundamental particles their mass, without which the atoms that make up everything around us could not exist.

The nickname “God particle” comes from the Nobel Prize-winning physicist Leon Lederman’s 1993 book The God Particle: If the Universe Is the Answer, What Is the Question?. Lederman had reportedly wanted to call it the “Goddamn Particle” because it was so difficult to find, but his publisher settled on the catchier title.

Professor Peter Mathieson, principal of the University of Edinburgh, where Higgs was an emeritus professor, called him a “truly gifted scientist” and said his legacy would continue to inspire generations to come.

This article was originally published on DW.

Roadshow Brings Mega Science Projects to the People

The show focuses on seven mega-science projects supported by countries around the world, including India.

Mumbai: An 11-month, nationwide roadshow to create a buzz around various international mega-science projects devoted to understanding the workings of the universe, from the atomic to the astronomical level, rolled out in Mumbai on May 8.

The programme is named Vigyan Samagam and is being executed in the form of a travelling exhibition that features galleries of posters, working models, exhibits, audio-visual informational materials, electronic displays and interactive kiosks.

The first destination was Mumbai’s Nehru Science Centre, where the show plans to stay for two months, until July 7. It will then move to Bengaluru, where it will be open to the public from July 29 to September 28 at the Visvesvaraya Industrial and Technological Museum.

The next stop will be Kolkata, where it will be open at the Science City from November 4 to December 31. The final stop will be at the National Science Centre in New Delhi from January 21 to March 30 next year.

The exhibitions will be open on weekends and holidays as well, from 10 am to 6 pm.

The show focuses on seven mega-science projects supported by countries around the world, including India. They are the European Organisation for Nuclear Research, Switzerland; the India-based Neutrino Observatory, Tamil Nadu; the Facility for Antiproton and Ion Research, Germany; the International Thermonuclear Experimental Reactor, a fusion reactor in France; the Laser Interferometer Gravitational Wave Observatories in the US and the one being planned in India; the Thirty Meter Telescope; and the Square Kilometer Array, South Africa.

Also read: Two Very Similar Projects in Tamil Nadu – but Only One Is Opposed. Why?

These projects throw light on crucial questions related to the origin of the universe and its evolution through its various stages. They relate to particle physics, including properties of particles like the Higgs boson and neutrinos, the detection of gravitational waves from colliding black holes and merging neutron stars, and precision engineering challenges involving the containment of extremely energetic plasmas.

Among other things, the programme will highlight India’s contributions in these research and development activities.

The inaugural event at each venue will be followed by two days of events comprising talks and lectures by eminent speakers from research and industry. These will be live-streamed through social media platforms.

Its organisers – the Departments of Science and Technology and Atomic Energy and the National Council of Science Museums (NCSM) – will also conduct quizzes, essay-writing contests, drawing contests and science-awareness cyclothons to engage schoolchildren and other students in attendance.

V.K. Saraswat, a member of NITI Aayog, emphasised the importance of science and technology for the country’s economic growth in his launch speech, and hoped that the programme would help inspire the youth to take up scientific work as a career.

K. VijayRaghavan, the principal scientific adviser to the Government of India, spoke of the need to make scientific and technological developments available in local languages to ensure they are accessible to more people (an issue he has discussed before in some detail).

A.D. Choudhary, the director-general of the NCSM, said the organisation was working on a plan to expand the show’s footprint to include smaller cities as well.

Sunderarajan Padmanabhan writes for India Science Wire and tweets at @ndpsr.

Science Communication Will Help Decide When the Next Particle Smasher Will Be Built

An ongoing debate among physicists demonstrates the rewards of science communication.

The particle physics community is not too happy right now. About a month ago, its members found themselves yanked into an unexpected debate. The point of contention was whether to build a $22-billion giant collider, a project overwhelmingly supported by particle physicists.

It all started with an op-ed by Sabine Hossenfelder, a theoretical physicist who had started her career as a particle physicist, in the New York Times.

Physicists typically present a united front on questions of funding, but Hossenfelder’s write-up was a surprising departure from this norm. Hossenfelder contended that a new collider was simply not a good investment for particle physics at this stage, and that particle physicists were misleading the public into believing that new discoveries were around the corner.

An explainer appeared shortly after. Since then, particle physicists have been presenting their rebuttals on other news sites, blogs and, of course, social media. Hossenfelder’s Facebook page has been debate central for the past few weeks.

Also read: CERN’s Concept Design for Next-Gen ‘Supercollider’ Mirrors China’s Plans

An important lesson physicists have learned is that the laws of nature are structured like a Matryoshka doll. The key to unraveling them is energy. At higher and higher energies, more and more hidden rules of nature come tumbling out. This is what colliders are used for: they smash particles together at very high energies to reveal these otherwise inaccessible rules.

The Large Hadron Collider (LHC) in Geneva will soon reach the limit of its ability to amp up energy and CERN has already released the plans for a bigger, better machine called the Future Circular Collider (FCC). If approved by policymakers, it will be built by 2050.

On the theory side, our entire knowledge of particle physics can be summed up within a single, 40-year-old rulebook called – simply and stolidly – the Standard Model. Everything we have observed in the LHC had already been predicted by the Standard Model.

In the years since the Standard Model was completed, particle physicists have come up with a plethora of models that try to go beyond it, to uncover the laws of ‘new physics’ deeper than the model itself. The physicists have postulated new particles, new symmetries of nature and even new dimensions of varying sizes. However, none of them have turned up at the LHC.

We do know that more discoveries wait to be made at enormously high energies, far too high for any collider on Earth. But for energies that a collider on Earth can reach, is the Standard Model the last of the Matryoshka dolls? Or is there anything more to find?

This is where opinions vary and the debate begins.

Hossenfelder has alleged that the particle physics community hasn’t presented an honest case to the public for the FCC. The advertisements always emphasise the new particles and exotic dimensions – exactly what they’ve failed to find.

She’s also maintained that there’s no compelling theoretical reason to expect anything new will turn up in the FCC. This is especially so since none of the models that physicists came up with after the Standard Model have made a single correct prediction.

One could counter-argue that new particles and/or dimensions could be hiding at energies just out of the LHC’s reach, and it’s possible that the FCC will find them. But the same argument could also be made if the FCC does not find them. So should we keep building colliders until we find something new?

Hossenfelder believes that it is time to stop.

She advances an alternative: instead of spending $22 billion on a single experiment, conduct a range of less expensive experiments that could give us a better idea of what to look for. We should start planning the next collider when we know where to look.

Several other physicists have penned multiple rebuttals. One of their points is that even if nothing new is found, the experiment will still be of value: it will help rule out unviable models – which in itself would be an important discovery. For another: there’s more juice to be extracted from the Standard Model yet. Many physicists are working on using the Standard Model to make predictions that can be confirmed or ruled out only with an FCC-type machine.

Finally, while $22 billion is indeed a lot, it is not prohibitive when spent over 30 years and by several countries together.

Overall, particle physicists feel it’s not time to throw in the towel yet.

As this debate plays out in the public eye, it highlights an important lesson for the scientific community: that communicating science to the people is now more important than ever.

When Hossenfelder had been making the same arguments on her blog, she was largely ignored. It was her New York Times op-ed that forced prominent particle physicists to respond.

In fact, many particle physicists are also unhappy with the airtime Hossenfelder has been getting. They feel that a non-particle physicist does not deserve a public platform to criticise particle physics. But Hossenfelder has built a public presence by investing time and effort in science communication, on Twitter and Facebook, through her longtime blog and a book last year.

It is also true that particle physicists had not presented their best case for the FCC before the public till this debate forced their hands.

Irrespective of which way the debate swings, it has already shown that the few who communicate science can have a lopsided influence on the public perception of an entire field – even if they’re not from that field. The distinction between a particle physicist and, say, a condensed-matter physicist is not as meaningful to most people reading the New York Times or any other mainstream publication as it is to physicists. Readers have no reason to exclude Hossenfelder as an expert.

However, very few physicists engage in science communication. The extreme ‘publish or perish’ culture that prevails in the sciences means that spending time on any activity other than research carries a large risk. In some places, in fact, junior scientists who popularise science are frowned upon because they’re seen to be doing something unproductive.

Also read: China, Japan Prepare to Transform Asia Into Hub of Particle Physics Research

But debates like this demonstrate the rewards of science communication.

This presents some serious questions for the physics community. Should science communication be encouraged? If so, should it count for tenure? And how do we reward scientists who communicate without punishing those who don’t?

If physicists are to control the public perception of their fields, they will have to decide on these questions sooner rather than later.

And if the way this debate has proceeded is any indication, communication has helped clarify the positions of both sides and brought out some points of agreement.

All physicists agree that we can’t keep building colliders ad infinitum. They differ on when to quit. Now would be a good time, according to Hossenfelder. Most particle physicists don’t think so. But how will we know when we’ve reached that point? What are the objective parameters here? These are complex questions, and the final call will be made by our ultimate sponsors: the people.

So it’s a good thing that this debate is playing out before the public eye. In the days to come, physicists and non-physicists must continue this dialogue and find mutually agreeable answers. Extensive, honest science communication will be key.

Nirmalya Kajuri is a theoretical physicist. He is currently a postdoctoral fellow in the Chennai Mathematical Institute.

CERN’s Concept Design for Next-Gen ‘Supercollider’ Mirrors China’s Plans

The Future Circular Collider will ramp up its energy over 15 years, and be able to produce millions of Higgs bosons for precision studies.

The world’s largest particle physics laboratory has unveiled its design options for the Large Hadron Collider’s successor – what is expected to be a 100-km long next generation ‘supercollider’.

The European Organisation for Nuclear Research (CERN) submitted the conceptual design report for what it is calling the Future Circular Collider (FCC). The FCC is expected to be able to smash particles together at even higher intensities and push the boundaries of the study of elementary particles. CERN expects it can come online by 2040, when the Large Hadron Collider’s (LHC’s) final run will come to a close.

The LHC switched on in 2008. Its first primary goal was to look for the Higgs boson, a fundamental particle that gives all other fundamental particles their masses. The LHC found it in four years. After that, physicists expected it would be able to find other particles they’ve been looking for to make sense of the universe. The LHC has not.

This forced physicists to confront alternative possibilities about where and how they could find these other hypothetical particles, or whether they existed at all. The FCC is expected to help by enabling a deeper and more precise examination of the world of particles. It will also help study the Higgs boson in much greater detail than the LHC allows, and in the process improve our understanding of its underlying theory.

The CERN report on what the FCC could look like comes at an interesting time – when two supercollider designs are being considered in Asia. In November 2018, China unveiled plans for its Circular Electron Positron Collider (CEPC), a particle accelerator seven times wider than the LHC.

Also read: China, Japan Prepare to Transform Asia Into Hub of Particle Physics Research

The FCC, CEPC and the LHC are all circular machines – whereas a third proposed design is slightly different. Also in November, Japan said it would announce its final decision on support for the International Linear Collider (ILC) in a month. As the name suggests, the ILC’s acceleration tunnel is a straight tube 30-50 km long, and parallels CERN’s own idea for a similar machine.

But in December, a council of scientists wrote to Japan’s science minister saying they opposed the ILC because of a lack of clarity on how Japan would share its costs with other participating nations.

In fact, cost has been the principal criticism directed against these projects. The LHC itself cost $13 billion. The FCC is expected to cost $15 billion, the CEPC $5 billion and the ILC, $6.2 billion. ($1 billion is about Rs 7,100 crore.)

They are all focused on studying the Higgs boson more thoroughly as well. This is because the energy field that the particle represents, called the Higgs field, pervades the entire universe and interacts with almost all fundamental particles. However, these attributes give rise to properties that are incompatible with the universe’s behaviour at the largest scales.

‘The world may not be able to accommodate two circular colliders’

Scientists believe that studying the Higgs boson closely could unravel these tensions and maybe expose some ‘new physics’. This means generating collisions to produce millions of Higgs bosons – a feat that the LHC wasn’t designed for. Hence the need for newer accelerators.

The FCC, the CEPC and the ILC all accelerate and collide electrons and positrons, whereas the LHC does the same with protons. Because electrons and positrons are fundamental particles, their collisions are much cleaner. When composite particles like protons are smashed together, the collision energy is much higher but there’s too much background noise that interferes with observations.

These differences lend themselves to different abilities. According to Sudhir Raniwala, a physicist at the University of Rajasthan, the CEPC will be able to “search for rare processes and make precision measurements”. The FCC will be able to do that as well as explore signs of ‘new physics’ at higher collision energies.

§

According to CERN’s conceptual design report, the FCC will have four phases over 15 years.

I – For the first four years, it will operate with a centre-of-mass collision energy of 90 GeV (i.e. the total energy carried by two particles colliding head-on) and produce 10 trillion Z bosons.

II – For the next two years, it will operate at 160 GeV and produce 100 million W bosons.

III – For three years, the FCC will run at 240 GeV and produce a million Higgs bosons.

IV – Finally, after a year-long shutdown for upgrades, the beast will reawaken to run at 360 GeV for five years, producing a million top quarks and anti-top quarks. (The top quark is the most massive fundamental particle known.)

After this, the report states that the FCC tunnel could be repurposed to smash protons together the way the LHC does, but at higher energies. And after that, to smash protons against electrons to better probe protons themselves.

The first part of this operational scheme is similar to that of China’s CEPC. To quote The Wire‘s report from November 2018:

[Its] highest centre-of-mass collision energy will be 240 GeV. At this energy, the CEPC will function as a Higgs factory, producing about 1 million Higgs bosons. At a collision energy of 160 GeV, it will produce 15 million W bosons and at 91 GeV, over one trillion Z bosons.

Michael Benedikt, the CERN physicist leading the FCC project, has called this a validation of CERN’s idea. He told Physics World, “The considerable effort by China confirms that this is a valid option and there is wide interest in such a machine.”

Also read: Hopes for ‘New Physics’ Pave the Road to Rencontres

However, all these projects have been envisaged as international efforts, with funds, people and technology coming from multiple national governments. In this scenario, it’s unclear how many of them will be interested in participating in two projects with similar goals.

Benedikt did not respond to a request for comment. But Wang Yifang, director of the institute leading the CEPC, told The Wire that “the world may not be able to accommodate two circular colliders”.

When asked about the way forward, he only added, “This issue can be solved later.”

Moreover, “different people have different interests” among the FCC’s and CEPC’s abilities, Raniwala said, “so there is no easy answer to where India should invest or participate.” India is currently an associate member at CERN and has no plans for a high-energy accelerator of its own.

To the FCC’s credit, it goes up to a higher energy, is backed by a lab experienced in operating large colliders and already has a working international collaboration.

Additionally, many Chinese physicists working in the country and abroad have reservations about China’s ability to pull it off. They’re led in their criticism by Chen-Ning Yang, a Nobel laureate.

But in the CEPC’s defence, the cost Yang is opposed to – a sum of $20 billion – is for the CEPC as well as its upgrade. The CEPC’s construction will also begin sooner, in around 2022, and it’s possible China will be looking for the first-mover advantage.

China, Japan Prepare to Transform Asia Into Hub of Particle Physics Research

China has taken an important step towards realising a next-generation particle collider to address physics’s unsolved mysteries, with Japan close behind.

Chinese scientists have released the full design report of a major future particle collider they plan to build in the next decade. Pegged at $5 billion, the Circular Electron Positron Collider (CEPC) is expected to begin construction in 2022 and operation in 2030 if the Chinese government agrees to fund it. Once running, the CEPC will function as a ‘Higgs factory’ – a collider adept at producing Higgs bosons for detailed study.

The Institute for High Energy Physics (IHEP), which prepared the report, has also said that once the CEPC completes its physics goals in 10 years, the collider complex can be upgraded to a proton-proton collider complementing the Large Hadron Collider in Europe.

Paralleling China’s announcement was one from Japan. Japanese physicists have been considering hosting another $5 billion machine called the International Linear Collider, also to produce and study Higgs bosons. They said they would announce their final plans by the end of December.

Considering the US doesn’t have plans for Higgs factories in the near future and Europe’s plans are decades away, Asia could be the new home of Higgs boson studies for most of this century.

The CEPC design report is the last of three reports, the first two published in 2015 and 2017. It details what the particle accelerator and collider will be capable of, as well as the abilities and goals of the physics experiments that will be run with it. According to the report, the CEPC will consist of an underground tunnel 100 km in circumference. It will receive electrons and positrons (a.k.a. anti-electrons) at an energy of 10 GeV from a pre-accelerator located on the surface, followed by a booster. The particles will then be circulated within two concentric rings in the CEPC tunnel to a higher energy and collided head-on.

The report states that the highest centre-of-mass collision energy will be 240 GeV – i.e. the total energy carried by the electrons and positrons together at the moment of collision. At this energy, the CEPC will function as a Higgs factory, producing about 1 million Higgs bosons. At a collision energy of 160 GeV, it will produce 15 million W bosons and at 91 GeV, over one trillion Z bosons. The Higgs, W and Z bosons are all force-carrier particles within the overarching framework called the Standard Model of particle physics, used to understand the types and interactions of elementary particles.

Also read: All You Need to Know to Get Started on Particle Physics

The Large Hadron Collider (LHC) in Europe has a much higher collision energy than the CEPC, with one-fourth the circumference, so why are the Chinese trying to build a ‘weaker’ collider? The answer lies in the purposes of the two machines. The LHC operates at the energy frontier, probing higher and higher energies looking for new particles. The CEPC operates at the intensity frontier, producing copious amounts of known particles for precision studies. For example, the LHC first elucidated that the Higgs boson weighs about 125 GeV by producing a few of them at that energy. The CEPC will now operate at this beam energy (approx. half the collision energy) to produce more Higgs bosons and study them in further detail.

According to the report, “The tunnel hosting the collider and booster will be mostly in hard rock so there is a strong and stable foundation to support the accelerators.” The tunnel will also be big enough to accommodate a future proton-proton collider.

There are other crucial differences between the LHC and the CEPC. For example, in particle physics experiments where a charged particle is accelerated through a magnetic field that bends its path, the particle emits so-called synchrotron radiation. The lighter the particle, the more synchrotron radiation it emits. This means the electrons and positrons in the CEPC will lose more energy as they traverse the ring than protons do at the LHC. The Chinese scientists plan to extract this radiation for use in other experiments, especially in the study of crystals. Synchrotron radiation has many desirable properties, such as high brilliance and stability, that make it easy to work with.

A general view of the LHC experiment as seen at CERN, near Geneva, Switzerland. Credit: Reuters/Pierre Albouy/File Photo

The IHEP expects the Chinese government will fund 75% of the project while 25% will be sourced from an international collaboration. The project has currently entered the R&D and prototyping phase, which will end in 2022. Then, construction on the CEPC will begin, followed by operationalisation in 2030. The CEPC will run in its Higgs mode for seven years, in the Z boson mode for two years and in the W boson mode for one. After that, in 2040, the IHEP expects the superconducting magnets needed to upgrade the CEPC to the SPPC – super proton-proton collider – will be ready to install.

The current plan is for the SPPC to operate with a collision energy of 75 TeV, over five times higher than the LHC’s current level. The forecast factors in the availability of a new class of iron-based superconducting magnets at a lower price.

In this period, the LHC will not be dormant. There have been proposals from scientists at CERN, the European lab for nuclear research that runs the LHC, to use the machine in different ways to accommodate the questions physicists need answered. For example, according to one pitch, the LHC can be modified to include a large device called the Energy Recovery Linac (ERL). The ERL will accelerate electrons and collide them with protons accelerated by the LHC.

This configuration of the machine will be called the LHeC – large hadron-electron collider. It will be used to study the strong nuclear force in greater detail. By increasing the energy of the ERL, the LHeC will also be able to operate as a high-luminosity collider by the 2030s, in conjunction with the CEPC. Luminosity is a measure of the rate at which a collider produces collisions, and so of how many particles are available for study.

Also read: Standard Model – the Absolutely Amazing Theory of Almost Everything

Another proposal suggests that by the late 2030s, when the CEPC is preparing to become the SPPC, the high-luminosity LHeC should be succeeded by the Future Circular Collider (FCC). CERN scientists working on this idea have conceived of a 100-km long tunnel accommodating a circular collider with collision energies comparable to the CEPC.

Multiple scientists have written in favour of the CEPC in the past, and more generally of a post-LHC collider. However, Chen-Ning Yang, a physicist and Nobel laureate, has been a notable detractor. Yang has argued that the CEPC will extract too great a price from China and that the money can be put to better use. Steven Weinberg, another Nobel laureate, countered this argument, saying, “The fundamental character of elementary particle physics makes it very attractive to bright young men and women, who then provide a technically sophisticated cadre available to deal with problems of society.”

The CEPC’s prospects will become clearer by 2021, when China’s next five-year plan will be introduced. Japan will know of the ILC’s fate by next year, when the European Union will finalise a deal to fund the project. Apart from moving the focus of particle physics research to Asia, it will be important for Japan to keep pace with China. If it doesn’t, then the CEPC will get a head-start that the ILC might never be able to catch up with.

A Mystery Particle That Seems to Threaten Our Understanding of Physical Reality

Scientists at Cern’s Large Hadron Collider have seen something that may force us to abandon everything we thought we knew about the world on the level of particles.

There was a huge amount of excitement when the Higgs boson was first spotted back in 2012 – a discovery that bagged the Nobel Prize for Physics in 2013. The particle completed the so-called standard model, our current best theory of understanding nature at the level of particles.

Now scientists at the Large Hadron Collider (LHC) at Cern think they may have seen another particle, detected as a peak at a certain energy in the data, although the finding is yet to be confirmed. Again there’s a lot of excitement among particle physicists, but this time it is mixed with a sense of anxiety. Unlike the Higgs particle, which confirmed our understanding of physical reality, this new particle seems to threaten it.

The new result – consisting of a mysterious bump in the data at 28 GeV (a unit of energy) – has been published as a preprint on ArXiv. It is not yet in a peer-reviewed journal – but that’s not a big issue. The LHC collaborations have very tight internal review procedures, and we can be confident that the authors have done the sums correctly when they report a “4.2 standard deviation significance”. That means that the probability of getting a peak this big by chance – created by random noise in the data rather than a real particle – is only 0.0013%. That’s tiny – 13 in a million. So it seems like it must be a real event rather than random noise – but nobody’s opening the champagne yet.
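As a quick check of those numbers, here is a minimal sketch in Python (assuming the one-sided Gaussian-tail convention particle physicists typically use for significances):

```python
# Convert a significance in standard deviations into a one-sided tail
# probability: the chance of random noise producing a peak this large
# at one fixed place in the data.
from scipy.stats import norm

p_local = norm.sf(4.2)            # P(Z > 4.2) for a standard normal
print(f"{p_local:.2e}")           # ~1.33e-05, i.e. ~0.0013%, or ~13 in a million

# For comparison, the 5-sigma discovery threshold:
print(f"{norm.sf(5.0):.2e}")      # ~2.87e-07
```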

Also Read: Do Hydrogen Anti-Atoms Dance Like Hydrogen Atoms? Let’s Check With Lasers

What the data says

Many LHC experiments, which smash beams of protons (particles in the atomic nucleus) together, find evidence for new and exotic particles by looking for an unusual build up of known particles, such as photons (particles of light) or electrons. That’s because heavy and “invisible” particles such as the Higgs are often unstable and tend to fall apart (decay) into lighter particles that are easier to detect. We can therefore look for these particles in experimental data to work out whether they are the result of a heavier particle decay. The LHC has found many new particles by such techniques, and they have all fitted into the standard model.

The new finding comes from an experiment involving the CMS detector, which recorded a number of pairs of muons – well known and easily identified particles that are similar to electrons, but heavier. It analysed their energies and directions and asked: if this pair came from the decay of a single parent particle, what would the mass of that parent be?

In most cases, pairs of muons come from different sources – originating from two different events rather than the decay of one particle. If you try to calculate a parent mass in such cases, the values would therefore spread out over a wide range of energies rather than creating a narrow peak specifically at 28 GeV (or some other energy) in the data. But in this case it certainly looks like there’s a peak. Perhaps. You can look at the figure and judge for yourself.
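To illustrate the reconstruction described above, here is a minimal sketch that computes the mass a parent particle would need in order to decay into a given pair of muons; the four-momenta below are invented for illustration, not CMS data:

```python
# Invariant mass of a two-particle system: M^2 = (E1+E2)^2 - |p1+p2|^2
# (in natural units).
import math

def invariant_mass(e1, p1, e2, p2):
    """e1, e2: muon energies (GeV); p1, p2: momentum 3-vectors (GeV)."""
    e = e1 + e2
    px, py, pz = (a + b for a, b in zip(p1, p2))
    m_squared = e**2 - (px**2 + py**2 + pz**2)
    return math.sqrt(max(m_squared, 0.0))

# Two back-to-back 14-GeV muons (muon masses neglected) reconstruct
# to a 28-GeV parent -- right where the reported bump sits:
print(invariant_mass(14.0, (14.0, 0.0, 0.0), 14.0, (-14.0, 0.0, 0.0)))  # 28.0
```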

New data. Credit: CMS Collaboration

Is this a real peak or is it just a statistical fluctuation due to the random scatter of the points about the background (the dashed curve)? If it’s real, that means that a few of these muon pairs did indeed come from a single large parent particle that decayed by emitting muons – and no such 28 GeV particle has ever been seen before.

So it is all looking rather intriguing, but history has taught us caution. Effects this significant have appeared in the past, only to vanish when more data is taken. The Digamma(750) anomaly is a recent example from a long succession of false alarms – spurious “discoveries” due to equipment glitches, over-enthusiastic analysis or just bad luck.

Also Read: Weak Signal Suggests Neutrino Sector May Upset Cherished Particle Physics Idea

This is partly due to something called the “look elsewhere effect”: although the probability of random noise producing a peak if you look specifically at a value of 28 GeV may be 13 in a million, such noise could give a peak somewhere else in the plot, maybe at 29 GeV or 16 GeV. The probabilities of these being due to chance are also tiny when considered individually, but the sum of these tiny probabilities is not so tiny (though still pretty small). That means it is not impossible for a peak to be created by random noise.
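A toy calculation makes this concrete. Assuming, purely for illustration, that there are N independent energy bins in which a peak could have appeared:

```python
# Look-elsewhere effect, toy version: the local fluke probability stays
# tiny, but the global probability of a fluke in *any* of n_bins places
# grows roughly linearly with the number of places you look.
from scipy.stats import norm

p_local = norm.sf(4.2)   # chance of a 4.2-sigma fluke at one fixed energy
for n_bins in (1, 10, 100, 1000):
    p_global = 1 - (1 - p_local) ** n_bins
    print(n_bins, f"{p_global:.2e}")
```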

And there are some puzzling aspects. For example, the bump appeared in one LHC run but not in another, when the energy was doubled. One would expect any new phenomena to get bigger when the energy is higher. It may be that there are reasons for this, but at the moment it’s an uncomfortable fact.

New physical reality?

The theory is even more incongruous. Just as experimental particle physicists spend their time looking for new particles, theorists spend their time thinking of new particles that it would make sense to look for: particles that would fill in the missing pieces of the standard model, or explain dark matter (a type of invisible matter), or both. But no one has suggested anything like this.

CMS model of a Higgs boson decaying into two jets of hadrons and two electrons. Credit: Lucas Taylor/CERN, CC BY-SA

For example, theorists suggest we could find a lighter version of the Higgs particle. But anything of that ilk would not decay to muons. A light Z boson or a heavy photon have also been talked about, but they would interact with electrons. That means we should probably have discovered them already, as electrons are easy to detect. The potential new particle does not match the properties of any of those proposed.

If this particle really exists, then it is not just outside the standard model but outside it in a way that nobody anticipated. Just as Newtonian gravity gave way to Einstein’s general relativity, the standard model will be superseded. But the replacement will not be any of the favoured candidates that have already been proposed to extend the standard model, including supersymmetry, extra dimensions and grand unification theories. These all propose new particles, but none with properties like the one we might have just seen. It will have to be something so weird that nobody has suggested it yet.

Also Read: The Universe’s Rate of Expansion Is in Dispute – We May Need New Physics to Solve It

Luckily, the other big LHC experiment, ATLAS, has similar data from its experiments. The team is still analysing it, and will report in due course. Cynical experience says that they will report a null signal, and this result will join the gallery of statistical fluctuations. But maybe – just maybe – they will see something. And then life for experimentalists and theorists will suddenly get very busy and very interesting.

Roger Barlow is a Research Professor and the Director of the International Institute for Accelerator Applications at the University of Huddersfield

This article is republished from The Conversation under a Creative Commons license. Read the original article.

The Role of Science Journalism in a World With Preprints

Attempting to call out science journalists who recognise that all stories evolve constantly will not work – because doing so also passively condones hype.

This article is the first of a new column called ‘Infinite in All Directions’. Formerly our science newsletter, the now-defunct IIAD is being revived in the form of this blog-style weekly column written by Vasudevan Mukunth, science editor.

The recent conversation about preprints, motivated by Tom Sheldon’s article in Nature News, focused on whether access to preprint manuscripts is precipitating bad or wrong articles in the popular science journalism press. The crux of Sheldon’s argument was that preprints aren’t peer-reviewed, which leaves journalists with the onerous task of validating their results when, in fact, that has been the traditional responsibility of independent scientists hired by journals to which the papers have been submitted. I contested this view because it is in essence a power struggle, with the scientific journal assuming the role of a knowledge-hegemon.

An interesting example surfaced in relation to this debate quite recently, when two researchers from the Indian Institute of Science, Bengaluru, uploaded a preprint paper to the arXiv repository claiming they had detected signs of superconductivity at room temperature in a silver-gold nanostructure. They simultaneously submitted their paper to Nature, where it remains under embargo; in the meantime, public discussions of the paper’s results have been centred on information available in the preprint. Science journalists around the world have been reserved in their coverage of this development, sensational though it seems to be, likely because, as Sheldon noted, it hasn’t been peer-reviewed yet.

At the same time, The Hindu published an article highlighting the study. For its part, The Wire commissioned an article – since published here on August 6 – discussing the preprint paper in greater detail, with comments from various materials scientists around the country. The article’s overwhelming conclusion seemed to be that the results looked neat to theorists but needed more work in the eyes of experimentalists, and that we should wait for Nature‘s ‘verdict’ before passing judgment. Nonetheless, the article found it fit, and rightly so based on the people quoted, to be optimistic.

A few days later, there emerged a twist in the plot. Brian Skinner, a physicist at the Massachusetts Institute of Technology, uploaded a public comment to the arXiv repository discussing, in brief, a curious feature of the IISc preprint. He had found that two plots representing independent measurements displayed in the manuscript showed very similar, if not identical, noise patterns. Noise is supposed to be random; if two measurements are really independent, their respective noise patterns cannot, must not, look the same. However, the IISc preprint showed the exact opposite. To Skinner – or in fact to any observer engaged in experimental studies – this suggests that the data in one of the two plots, or both, was fabricated.

An image from Skinner’s preprint paper showing the similar noise patterns from two different plots (overlapped as blue and green dots for comparison). Source: arXiv:1808.02929v1
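To make the intuition concrete, here is a minimal sketch of this kind of check – not Skinner’s actual analysis, and with entirely made-up numbers – that extracts the noise in two measurement series as residuals about a smooth trend and asks how correlated they are:

```python
# Independent measurements should have uncorrelated noise (correlation
# near 0); reused or duplicated data shows up as correlation near 1.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)
trend = 1.0 - 0.5 * x                             # a smooth underlying signal

shared_noise = rng.normal(0.0, 0.02, x.size)
series_a = trend + shared_noise                   # first measurement
series_b = trend + shared_noise                   # second series reusing the same noise
series_c = trend + rng.normal(0.0, 0.02, x.size)  # genuinely independent series

def residuals(series):
    return series - trend

print(np.corrcoef(residuals(series_a), residuals(series_b))[0, 1])  # ~1.0: suspicious
print(np.corrcoef(residuals(series_a), residuals(series_c))[0, 1])  # ~0.0: expected
```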

This is obviously a serious allegation. Skinner himself has not attempted to reach any conclusion and has stopped at pointing out the anomaly. At this juncture, let’s reintroduce the science journalist: what should she do?

In a world without preprints, this paper would not have fed a story until after a legitimate journal had published it, in which case the legitimacy of the science journalist’s article would bank on the peer-reviewers’ word. More importantly, in a world without preprints, this would have been a single story – à la the discovery of the Higgs boson or gravitational waves from colliding neutron stars. In a world with preprints, this has become an evolving story even though, excluding the “has been submitted to a journal for review” component, the study itself is not dynamic. (Contrast this, for example, to the search for dark matter: it is ongoing, etc.)

Against this context, the arguments Sheldon et al have put forth assume a new clarity. What they’re saying is that the story is not supposed to be evolving, and that science journalists forced to write their stories based only on peer-reviewed papers would have produced a single narrative of an event fixed at one point in space and time. Overall: that if journalists had waited for the paper to be peer-reviewed, they would have been able to deliver to the people a more finished tale, whose substance and contours enjoy greater consensus within the scientific community.

This may seem like a compelling reason to not allow journalists to write articles based on preprints until you stop to consider some implicit assumptions that favour peer-review.

First off, peer-review is always viewed as a monolithic institution whereas the people quoted in an article are viewed as individuals – despite the fact that both groups are (supposed to be) composed of peers acting independently. As a result, the former appears to be indemnified. In The Wire article, the people quoted were Vijay Shenoy, T.V. Ramakrishnan, Ganapathy Baskaran, Pushan Ayyub and an unnamed experimentalist. The author, R. Ramachandran, (together with the editor – me) also cited multiple studies and historical events for the necessary context and reminded the reader on two occasions that the analysis was preliminary. What the people get out of peer-review, on the other hand, is a ‘yes’/’no’ answer that, in the journal’s interests, is to be considered final.

In fact, should review – peer- or journalistic – fail, journalism affords various ways to deal with the fallout. The scientists quoted may have spoken on the record, and their contact details will be easily findable; the publication’s editor can be contacted and a correction or retraction asked for; in some cases (including The Wire‘s), a reader’s editor acting independently of the editorial staff can be petitioned to set the public record straight. With a journal, however, the peer-reviewers are protected behind a curtain of secrecy, and the people and the scientists alike will have to await a decision that is often difficult to negotiate with. The numerous articles published by Retraction Watch are ready examples.

Second, it is believed that peer-reviewers perform checks that science journalists never can. But where do you draw the line? Do peer-reviewers check for all potential problems with a paper before green-flagging it? More pertinently, are they always more thorough in their checks than good science journalists can be? In fact, there is another group of actors here that science journalists can depend on: scientists who are publicly critiquing studies on their Twitter or Facebook pages and their blogs. I mention this here to quote the examples of Katie Mack, Adam Falkowski, Emily Lakdawalla, etc. – and, most of all, of Elizabeth Bik, a microbiologist. Bik has been carefully documenting the incidence of duplicated or manipulated images in published papers.

Circling back to peer-review’s being viewed as a monolith: many of the papers Bik has identified were published by journals after they were declared ‘good to go’ by review panels. So by casting their verdict as final, by describing each scientific paper as being fixed at a point in time and space, journals are effectively proclaiming that what they have published need not be revisited or revised. This is a questionable position. On the other hand, by casting the journalistic enterprise as the documentation of a present that is being constantly reshaped, journalists have access to a storytelling space that many scientific journals don’t afford the very scientists that they profit from.

Where this enterprise turns risky, or even potentially unreliable, is when it becomes dishonest about its intentions – rather, isn’t explicitly honest enough. That is, to effect change in what journalism stands for, we also have to change a little bit of how journalism does what it does. For example, in The Wire‘s article, the author was careful to note that (i) only the paper’s publication (or rejection) can answer some questions and perhaps even settle the ongoing debate, (ii) some crucial details of the IISc experiment are missing from the preprint (and likely will be from the submitted manuscript as well), (iii) the article’s discussion is based on conversations with materials scientists in India and (iv) the paper’s original authors have refused to speak until they have heard from Nature. Most of all, the article itself does not editorialise.

These elements, together with an informed readership, are necessary to stave off hype cycles – unnecessary news cycles typically composed of two stories, one making a mountain of a molehill and the next declaring that the matter at hand has been found to be a molehill. The simplest way to sidestep this fallacy is to remember at all stages of the editorial process that all stories will evolve irrespective of what those promoting it have to say. Of course, facts don’t evolve, but what conclusion a collection of facts lends itself to will. And so will opinions, implications, suggestions and whatnot. This is why attempting to call out science journalists who respect these terms of their enterprise will not work – because doing so also passively condones hype. What will work is to knock on the doors of those unquestioning journalists who pander to hype above all else.

This prescription is tied to one for ourselves: as much as science journalists want to reform the depiction of the scientific enterprise, moving it away from the idea that scientists find remarkable results with 100% confidence all the time (which is the impression journals give), they – rather, we – should also work towards reforming what journalism stands for in the people’s eyes. Inasmuch as science as well as journalism are bound by the pursuit of truth(s), it is important for all stakeholders to remember, and to be reminded, that – to adapt what historian of science Patrick McCray tweeted – it’s about consensus, not certainty. Should they have a problem with journalists running a story based on a preprint instead of a published paper, journals can provide a way out (for reasons described here) by being more open about peer-review, what kind of issues reviewers check for and how journalists can perform the same checks.

This article was originally published on the author’s blog and has been republished here with edits for style.

Beauty Is Truth, Truth Is Beauty, and Other Lies of Physics

In the last 40 years, while there has been no major breakthrough in the foundations of physics, aesthetic arguments have flourished into research programmes and society has spent billions of dollars finding no evidence to support these beautiful ideas.

Who doesn’t like a pretty idea? Physicists certainly do. In the foundations of physics, it has become accepted practice to prefer hypotheses that are aesthetically pleasing. Physicists believe that their motivations don’t matter because hypotheses, after all, must be tested. But most of their beautiful ideas are hard or impossible to test. And whenever an experiment comes back empty-handed, physicists can amend their theories to accommodate the null results.

This has been going on for about 40 years. In these 40 years, aesthetic arguments have flourished into research programmes such as supersymmetry, the multiverse and grand unification – that now occupy thousands of scientists. In these 40 years, society spent billions of dollars on experiments that found no evidence to support the beautiful ideas. And in these 40 years, there has not been a major breakthrough in the foundations of physics.

My colleagues argue that criteria of beauty are experience-based. The most fundamental theories we currently have – the standard model of particle physics and Albert Einstein’s general relativity – are beautiful in specific ways. I agree it was worth a try to assume that more fundamental theories are beautiful in similar ways. But, well, we tried, and it didn’t work. Nevertheless, physicists continue to select theories based on the same three criteria of beauty: simplicity, naturalness and elegance.

With simplicity I don’t mean Occam’s razor, which demands that among two theories that achieve the same thing, you pick the one that’s simpler. No, I mean absolute simplicity: a theory should be simple, period. When theories are not simple enough for my colleagues’ tastes, they try to make them simpler – by unifying several forces or by postulating new symmetries that combine particles in orderly sets.

The second criterion is naturalness. Naturalness is an attempt to get rid of the human element by requiring that a theory should not use assumptions that appear hand-picked. This criterion is most often applied to the values of constants without units, such as the ratios of elementary particles’ masses. Naturalness demands that such numbers be close to one or, if that’s not the case, that the theory explain why they aren’t.
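As a concrete example of such a unitless constant, take the ratio of the top quark’s mass to the electron’s (approximate measured values; only the order of magnitude matters here):

```python
# One dimensionless constant of nature: the top quark's mass divided by
# the electron's (both in GeV, approximate measured values).
m_top = 173.0
m_electron = 0.000511

print(f"{m_top / m_electron:,.0f}")   # ~338,552 -- nowhere near 1, hence 'unnatural'
```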

Then there’s elegance, the third and most elusive aspect of beauty. It’s often described as a combination of simplicity and surprise that, taken together, reveals new connections. We find elegance in the ‘Aha effect’, the moment of insight when things fall into place.

Physicists currently consider a theory promising if it’s beautiful according to these three criteria. This led them to predict, for example, that protons should be able to decay. Experiments have looked for this since the 1980s, but so far nobody has seen a proton decay. Theorists also predicted that we should be able to detect dark matter particles, such as axions or weakly interacting massive particles (WIMPs). We have commissioned dozens of experiments but haven’t found any of the hypothetical particles – at least not so far. The same criteria of symmetry and naturalness led many physicists to believe that the Large Hadron Collider (LHC) should see something new besides the Higgs boson, for example so-called ‘supersymmetric’ particles or additional dimensions of space. But none have been found so far.

How far can you push this programme before it becomes absurd? Well, if you make a theory simpler and simpler it will eventually become unpredictable, because the theory no longer contains enough information to even carry through calculations. What you get then is what theorists now call a ‘multiverse’ – an infinite collection of universes with different laws of nature.

For example, if you use the law of gravity without fixing the value of Newton’s constant by measurement, you could say that your theory contains a universe for any value of the constant. Of course, you then have to postulate that we live in the one universe that has the value of Newton’s constant that we happen to measure. So it might look like you haven’t gained much. Except that theorists can now write papers about that large number of new universes. Even better, the other universes aren’t observable, hence multiverse theories are safe from experimental test.

Many beautiful hypotheses were just wrong

I think it’s time we take a lesson from the history of science. Beauty does not have a good track record as a guide for theory-development. Many beautiful hypotheses were just wrong, like Johannes Kepler’s idea that planetary orbits are stacked in regular polyhedrons known as ‘Platonic solids’, or that atoms are knots in an invisible aether, or that the Universe is in a ‘steady state’ rather than undergoing expansion.

And other theories that were once considered ugly have stood the test of time. When Kepler suggested that the planets move on ellipses rather than circles, that struck his contemporaries as too ugly to be true. And the physicist James Clerk Maxwell balked at his own theory involving electric and magnetic fields, because in his day the beauty standard involved gears and bolts. Paul Dirac chided a later version of Maxwell’s theory as ugly, because it required complicated mathematical gymnastics to remove infinities. Nevertheless, those supposedly ugly ideas were correct. They are still in use today. And we no longer find them ugly.

History has a second lesson. Even though beauty was arguably a strong personal motivator for many physicists, the problems that led to breakthroughs were not merely aesthetic misgivings – they were mathematical contradictions. Einstein, for example, abolished absolute time because it was in contradiction with Maxwell’s electromagnetism, thereby creating special relativity. He then resolved the conflict between special relativity and Newtonian gravity, which gave him general relativity. Dirac later removed the disagreement between special relativity and quantum mechanics, which led to the development of the quantum field theories which we still use in particle physics today.

A general view of the LHC experiment as seen at CERN, near Geneva, Switzerland. Credit: Reuters/Pierre Albouy/File Photo

The Higgs boson, too, was born out of a need for logical consistency. Found at the LHC in 2012, the Higgs boson is necessary to make the standard model work. Without the Higgs, particle physicists’ calculations return probabilities larger than one, mathematical nonsense that cannot describe reality. Granted, the mathematics didn’t tell us it had to be the Higgs boson; it could have been something else. But we knew that something new had to happen at the LHC, before it was even built. This was reasoning built on solid mathematical ground.

Pretty but not necessary

Supersymmetric particles, on the other hand, are pretty but not necessary. They were included to fix an aesthetic shortcoming of the current theory: a lack of naturalness. There’s nothing mathematically wrong with a theory that’s not supersymmetric; it’s just not particularly pretty. Particle physicists used supersymmetry to remedy this perceived shortfall, thereby making the theory much more beautiful. The predictions that supersymmetric particles should be seen at the LHC, therefore, were based on hope rather than sound logic. And the particles have not been found.

My conclusion from this long line of null results is that when physics tries to rectify a perceived lack of beauty, we waste time on problems that aren’t really problems. Physicists must rethink their methods, now – before we start discussing whether the world needs a next larger particle collider or yet another dark matter search.

The answer can’t be that anything goes, of course. The idea that new theories should solve existing problems is good in principle – it’s just that, currently, the problems themselves aren’t sharply formulated enough for that criterion to be useful. The conceptual and philosophical basis of reasoning in the foundations of physics is weak, and this must improve.

It’s no use, and not good scientific practice, to demand that nature conform to our ideals of beauty. We should let evidence lead the way to new laws of nature. I am pretty sure beauty will await us there.

This article was originally published at Aeon and has been republished under Creative Commons.

Long-Sought Higgs Boson Detail Finally Confirmed

The reason for the delay was noise.

The Higgs boson has finally been observed decaying into the particles it most often decays to – six years after it was discovered. The reason for this delay was noise.

The Large Hadron Collider smashes protons head-on at nearly the speed of light to crack them open, releasing energy and other particles that then combine and condense into yet other particles. The rules of particle physics and quantum mechanics specify the probability that some particles will form in this mess. So by smashing numerous protons together repeatedly, millions and millions of times, the collider can maximise the probability of producing particles that physicists can study to understand our universe.

One of them is the Higgs boson. The announcement of its discovery was made in July 2012. By knowing that the Higgs boson exists, physicists were able to ascertain that the theory pegged to this particle, which describes how elementary particles get their mass, is valid. This is why Peter Higgs and Francois Englert were awarded the physics Nobel Prize in 2013.

The Higgs boson was discovered indirectly. The detectors didn’t spot it directly so much as they spotted a few particles in the collision fray that could only have come from a decaying Higgs boson. These were pairs of photons and the particles called W and Z bosons.

However, the Higgs boson decays into another kind of particle, called a bottom quark, far more often. Why didn’t physicists simply look for such bottom quarks in the collision debris to say whether or not they came from a Higgs boson?

The Higgs boson has a tendency to decay into pairs of a fermion and an anti-fermion (e.g. an electron and a positron, or a quark and an antiquark). Moreover, it is likelier to decay into heavier fermions than into lighter ones because it interacts more strongly with the former.

The heaviest fermion is the top quark, but it is itself too heavy for the Higgs to decay into. The next is the bottom quark – and theoretical calculations suggest the Higgs should decay into a bottom quark and an anti-bottom quark about 60% of the time. The third is the tau particle, which weighs almost half as much as the bottom quark and to which the boson decays only about 6% of the time.
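
Those percentages follow a simple pattern: the Higgs couples to each fermion in proportion to its mass, so the decay rate scales with the mass squared. Schematically – a rough rule of thumb, since precise predictions require quantum corrections:

    % Partial decay width of the Higgs into a fermion-antifermion pair,
    % with N_c = 3 for quarks (colour) and N_c = 1 for leptons:
    \Gamma(H \to f\bar{f}) \propto N_c\, m_f^2.
    % Rule-of-thumb comparison, bottom quarks versus tau leptons:
    %   3 \times (4.2)^2 : 1 \times (1.8)^2 \approx 53 : 3,
    % the right ballpark for the quoted 60% versus 6% branching fractions.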

But the problem is that hadron colliders – Large or not – also produce an abundance of bottom quarks through other mechanisms. The rules of quantum chromodynamics (QCD, the theory of quarks and gluons) allow several processes in particular that produce lots of bottom quarks.

As a result, for a long time, physicists were able to tell if photons and W/Z bosons detected in a collision were from a decaying Higgs boson – but couldn’t tell if bottom quarks detected in a collision were from decaying Higgs bosons or due to QCD processes… until now.

Since 2012, physicists have found ways to better isolate, in the data, the bottom quarks produced by decaying Higgs bosons, and, according to an official blog post, to compare the values of “other kinematic variables that show distinct differences between the signal and the various backgrounds”.

When the calculations were combined across all the data collected by the machine in 2015, 2016 and 2017, the result had a statistical significance of 4.9σ – just shy of the 5σ threshold needed to claim a discovery. The significance does breach that threshold when other data-filtering and analytical techniques are applied, and so the ATLAS and CMS detector collaborations at the Large Hadron Collider have made their official announcements: we have finally observed Higgs bosons decaying into two bottom quarks.
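
As a toy illustration of what a figure like 4.9σ means – the event counts below are invented for demonstration, not the collaborations’ actual numbers – significance measures how far a signal excess stands above the statistical fluctuations of the background:

    import math

    # Toy significance estimate; both counts are invented for illustration.
    background = 10_000.0   # expected background events (QCD processes etc.)
    signal = 490.0          # hypothetical excess attributed to H -> bb

    # Simple approximation, valid when the signal is much smaller than
    # the background: significance ~ signal / sqrt(background).
    significance = signal / math.sqrt(background)
    print(f"{significance:.1f} sigma")   # 4.9 sigma, just below the
    # 5-sigma convention physicists use before claiming a discovery.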

Though it is a veritable technological achievement, its scientific consequences are modest. It would have been mighty exciting had this ‘channel’ of decay not been detected; the fact that it has been means we still haven’t discovered any of the signs of ‘new physics’ we have been looking for.

This is so because the theory that predicted the Higgs boson, as well as how it would decay into various particles – the Standard Model – is airtight, yet clueless about why the Higgs boson’s mass is what it is or where dark matter comes from. The key is to find a mistake in the Model, a natural event or process that doesn’t play out as predicted; then we’ll know there’s something new to explore there, something that might answer our as-yet-unanswered questions.

Standard Model – the Absolutely Amazing Theory of Almost Everything

Every attempt to overturn this theory and demonstrate in the laboratory that it must be substantially reworked – and there have been many over the past 50 years – has failed.

The Standard Model. What a dull name for the most accurate scientific theory known to human beings.

More than a quarter of the Nobel Prizes in physics of the last century are direct inputs to or direct results of the Standard Model. Yet its name suggests that if you can afford a few extra dollars a month you should buy the upgrade. As a theoretical physicist, I would prefer ‘The Absolutely Amazing Theory of Almost Everything’. That’s what the Standard Model really is.

Many recall the excitement among scientists and media over the 2012 discovery of the Higgs boson. But that much-ballyhooed event didn’t come out of the blue – it capped a five-decade undefeated streak for the Standard Model. Every fundamental force but gravity is included in it. Every attempt to overturn it and demonstrate in the laboratory that it must be substantially reworked – and there have been many over the past 50 years – has failed.

In short, the Standard Model answers this question: What is everything made of, and how does it hold together?

The smallest building blocks

You know, of course, that the world around us is made of molecules, and molecules are made of atoms. Chemist Dmitri Mendeleev figured that out in the 1860s and organized all atoms – that is, the elements – into the periodic table that you probably studied in middle school. But there are 118 different chemical elements. There’s antimony, arsenic, aluminum, selenium and 114 more.

Physicists like things simple. We want to boil things down to their essence, a few basic building blocks. Over 100 chemical elements is not simple. The ancients believed that everything is made of just five elements – earth, water, fire, air and aether. Five is much simpler than 118. It’s also wrong.

By 1932, scientists knew that all those atoms are made of just three particles – neutrons, protons and electrons. The neutrons and protons are bound together tightly into the nucleus. The electrons, thousands of times lighter, whirl around the nucleus at speeds approaching that of light. Physicists Planck, Bohr, Schroedinger, Heisenberg and friends had invented a new science – quantum mechanics – to explain this motion.

That would have been a satisfying place to stop. Just three particles. Three is even simpler than five. But held together how? The negatively charged electrons and positively charged protons are bound together by electromagnetism. But the protons are all huddled together in the nucleus and their positive charges should be pushing them powerfully apart. The neutral neutrons can’t help.

What binds these protons and neutrons together? “Divine intervention,” a man on a Toronto street corner told me; he had a pamphlet, and I could read all about it. But this scenario seemed like a lot of trouble even for a divine being – keeping tabs on every single one of the universe’s 10⁸⁰ protons and neutrons and bending them to its will.

Expanding the zoo of particles

Meanwhile, nature cruelly declined to keep its zoo of particles to just three. Really four, because we should count the photon, the particle of light that Einstein described. Four grew to five when Anderson measured electrons with positive charge – positrons – striking the Earth from outer space. At least Dirac had predicted these first anti-matter particles. Five became six when the pion, which Yukawa predicted would hold the nucleus together, was found.

Then came the muon – 200 times heavier than the electron, but otherwise a twin. “Who ordered that?” I.I. Rabi quipped. That sums it up. Number seven. Not only was the list no longer simple, it was redundant.

By the 1960s there were hundreds of “fundamental” particles. In place of the well-organized periodic table, there were just long lists of baryons (heavy particles like protons and neutrons), mesons (like Yukawa’s pions) and leptons (light particles like the electron, and the elusive neutrinos) – with no organisation and no guiding principles.

The Standard Model of elementary particles provides an ingredients list for everything around us. Credit: Fermi National Accelerator Laboratory, CC BY

Into this breach sidled the Standard Model. It was not an overnight flash of brilliance. No Archimedes leapt out of a bathtub shouting “eureka.” Instead, there was a series of crucial insights by a few key individuals in the mid-1960s that transformed this quagmire into a simple theory, and then five decades of experimental verification and theoretical elaboration.

Quarks. They come in six varieties we call flavours. Like ice cream, except not as tasty. Instead of vanilla, chocolate and so on, we have up, down, strange, charm, bottom and top. In 1964, Gell-Mann and Zweig taught us the recipes: Mix and match any three quarks to get a baryon. Protons are two ups and a down quark bound together; neutrons are two downs and an up. Choose one quark and one antiquark to get a meson. A pion is an up or a down quark bound to an anti-up or an anti-down. All the material of our daily lives is made of just up and down quarks and anti-quarks and electrons.
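
One can check these recipes by adding up electric charges, which for quarks come in thirds of the proton charge. A small illustrative script:

    from fractions import Fraction

    # Electric charges of the light quarks, in units of the proton charge.
    charge = {"up": Fraction(2, 3), "down": Fraction(-1, 3)}

    # Gell-Mann and Zweig's recipes: proton = up + up + down,
    # neutron = down + down + up.
    proton = 2 * charge["up"] + charge["down"]
    neutron = 2 * charge["down"] + charge["up"]
    print(f"proton charge: {proton}, neutron charge: {neutron}")
    # prints: proton charge: 1, neutron charge: 0, as observed.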

Simple. Well, simple-ish, because keeping those quarks bound is a feat. They are tied to one another so tightly that you never ever find a quark or anti-quark on its own. The theory of that binding, and the particles called gluons (chuckle) that are responsible, is called quantum chromodynamics. It’s a vital piece of the Standard Model, but mathematically difficult, even posing an unsolved problem of basic mathematics. We physicists do our best to calculate with it, but we’re still learning how.

The other aspect of the Standard Model is “A Model of Leptons.” That’s the name of the landmark 1967 paper by Steven Weinberg that pulled together quantum mechanics with the vital pieces of knowledge of how particles interact and organised the two into a single theory. It incorporated the familiar electromagnetism, joined it with what physicists called “the weak force” that causes certain radioactive decays, and explained that they were different aspects of the same force. It incorporated the Higgs mechanism for giving mass to fundamental particles.

Since then, the Standard Model has predicted the results of experiment after experiment, including the discovery of several varieties of quarks and of the W and Z bosons – heavy particles that are for weak interactions what the photon is for electromagnetism. The possibility that neutrinos aren’t massless was overlooked in the 1960s, but slipped easily into the Standard Model in the 1990s, a few decades late to the party.

3D view of an event recorded at the CERN particle accelerator showing characteristics expected from the decay of the SM Higgs boson to a pair of photons (dashed yellow lines and green towers). Credit: McCauley, Thomas; Taylor, Lucas; for the CMS Collaboration CERN, CC BY-SA

Discovering the Higgs boson in 2012, long predicted by the Standard Model and long sought after, was a thrill but not a surprise. It was yet another crucial victory for the Standard Model over the dark forces that particle physicists have repeatedly warned loomed over the horizon. Concerned that the Standard Model didn’t adequately embody their expectations of simplicity, worried about its mathematical self-consistency, or looking ahead to the eventual necessity to bring the force of gravity into the fold, physicists have made numerous proposals for theories beyond the Standard Model. These bear exciting names like Grand Unified Theories, Supersymmetry, Technicolor and String Theory.

Sadly, at least for their proponents, beyond-the-Standard-Model theories have not yet successfully predicted any new experimental phenomenon or any experimental discrepancy with the Standard Model.

After five decades, far from requiring an upgrade, the Standard Model is worthy of celebration as the Absolutely Amazing Theory of Almost Everything.

Glenn Starkman, Distinguished University Professor of Physics, Case Western Reserve University

This article was originally published on The Conversation. Read the original article.