Particle Physics Turns to Quantum Computing for Big-Data Solutions

January 29, 2020


Giant physics experiments increasingly rely on big data and complex algorithms running on powerful computers, and managing this ever-growing mass of data presents its own challenges.

To prepare for the data deluge posed by next-generation upgrades and new experiments, physicists are turning to the fledgling field of quantum computing in search of faster ways to analyze the incoming information.

In a conventional computer, memory takes the form of a large collection of bits, each of which has only two possible values: one or zero, akin to a switch's on or off position. In a quantum computer, meanwhile, data is stored in quantum bits, or qubits. A qubit can represent a one, a zero, or a superposition – a mixed state in which it is both a one and a zero at the same time.
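
To make the idea concrete, a qubit's state can be written as two complex amplitudes whose squared magnitudes give the probabilities of reading out a zero or a one. The short Python sketch below is only an illustration, not tied to any particular quantum-computing platform:

```python
import numpy as np

# A qubit's state is a unit vector of two complex amplitudes over the
# basis states |0> and |1>. The probability of each measurement outcome
# is the squared magnitude of the corresponding amplitude.
zero = np.array([1, 0], dtype=complex)        # definitely a zero
one = np.array([0, 1], dtype=complex)         # definitely a one
mixed = (zero + one) / np.sqrt(2)             # equal superposition of both

for name, state in [("zero", zero), ("one", one), ("mixed", mixed)]:
    p0, p1 = np.abs(state) ** 2
    print(f"{name}: P(0)={p0:.2f}, P(1)={p1:.2f}")
```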

By tapping into this and other quantum properties, quantum computers hold the potential to handle larger datasets and quickly work through some problems that would trip up even the world’s fastest supercomputers. For other types of problems, though, conventional computers will continue to outperform quantum machines.

The High Luminosity Large Hadron Collider (HL-LHC) Project, a planned upgrade of the world's largest particle accelerator at the CERN laboratory in Europe, will come online in 2026. It will produce billions of particle events per second – five to seven times the machine's current maximum data rate – and CERN is seeking new approaches to analyze this data rapidly and accurately.

In these particle events, positively charged subatomic particles called protons collide, producing sprays of other particles, including quarks and gluons, from the energy of the collision. The interactions of particles can also cause other particles – like the Higgs boson – to pop into existence.

Tracking the creation and precise paths (called "tracks") of these particles as they travel through layers of a particle detector – while excluding the unwanted mess, or "noise," produced in these events – is key to analyzing the collision data.

The data will be like a giant 3D connect-the-dots puzzle that contains many separate fragments, with little guidance on how to connect the dots.

To address this next-generation problem, a group of student researchers and other scientists at the U.S. Department of Energy's Lawrence Berkeley National Laboratory (Berkeley Lab) has been exploring a wide range of new solutions.

One such approach is to develop and test a variety of algorithms tailored to different types of quantum-computing systems. Their aim: Explore whether these technologies and techniques hold promise for reconstructing these particle tracks better and faster than conventional computers can.

Particle detectors work by detecting energy deposited in their layered detector materials. In analyzing the detector data, researchers work to reconstruct the trajectories of individual particles traveling through the detector array. Computer algorithms aid this process through pattern recognition: by connecting the dots of the individual "hits" recorded by the detector and correctly grouping them into distinct particle trajectories, researchers can pin down each particle's properties.
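
The toy Python sketch below illustrates this connect-the-dots task in miniature: a handful of straight-line "tracks" are sampled, with a little noise, at successive detector layers, and a naive pattern-recognition step extrapolates each candidate track to pick out the nearest hit on the next layer. This is a simplified illustration, not the algorithm any experiment actually uses:

```python
import numpy as np

# Each true track is a straight line from the origin, sampled (with
# noise) at five detector layers. The naive pattern-recognition step
# extrapolates each candidate track and grabs the nearest hit on the
# next layer. Real trackers use far more elaborate algorithms, but
# the underlying task is the same.
rng = np.random.default_rng(1)
slopes = np.array([-0.4, 0.1, 0.5])            # three true tracks
layers = np.arange(1, 6)                       # detector layer radii 1..5
hits = [slopes * r + rng.normal(0, 0.02, slopes.size) for r in layers]

# Seed each candidate track with the origin plus one first-layer hit.
tracks = [[0.0, y] for y in hits[0]]
for layer_hits in hits[1:]:
    for track in tracks:
        predicted = 2 * track[-1] - track[-2]  # linear extrapolation
        track.append(layer_hits[np.argmin(np.abs(layer_hits - predicted))])

for slope, track in zip(slopes, tracks):
    print(f"true slope {slope:+.1f} -> hits {np.round(track[1:], 2)}")
```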

Heather Gray, an experimental particle physicist at Berkeley Lab and a UC Berkeley physics professor, leads the Berkeley Lab-based R&D effort – Quantum Pattern Recognition for High-Energy Physics (HEP.QPR) – that seeks to identify quantum technologies to rapidly perform this pattern-recognition process in very-high-volume collision data. This R&D effort is funded as part of the DOE’s QuantISED (Quantum Information Science Enabled Discovery for High Energy Physics) portfolio.

The HEP.QPR project is also part of a broader initiative to boost quantum information science research at Berkeley Lab and across U.S. national laboratories.

Other members of the HEP.QPR group are: Wahid Bhimji, Paolo Calafiura, Wim Lavrijsen, and former postdoctoral researcher Illya Shapoval, who explored quantum algorithms for associative memory. Bhimji is a big data architect at Berkeley Lab’s National Energy Research Scientific Computing Center (NERSC). Calafiura is chief software architect of CERN’s ATLAS experiment and a member of Berkeley Lab’s Computational Research Division (CRD). And Lavrijsen is a CRD software engineer who is also involved in CERN’s ATLAS experiment.

Members of the HEP.QPR project have collaborated with researchers at the University of Tokyo and in Canada on the development of quantum algorithms in high-energy physics, and jointly organized a Quantum Computing Mini-Workshop at Berkeley Lab in October 2019.

Gray and Calafiura were also involved in a CERN-sponsored competition, launched in mid-2018, that challenged computer scientists to develop machine-learning-based techniques to accurately reconstruct particle tracks using a simulated set of HL-LHC data known as TrackML. Machine learning is a form of artificial intelligence in which algorithms can become more efficient and accurate through a gradual training process akin to human learning. Berkeley Lab’s quantum-computing effort in particle-track reconstruction also utilizes this TrackML set of simulated data.

Berkeley Lab and UC Berkeley are playing important roles in the rapidly evolving field of quantum computing through their participation in several quantum-focused efforts, including The Quantum Information Edge, a research alliance announced in December 2019.

The Quantum Information Edge is a nationwide alliance of national labs, universities, and industry advancing the frontiers of quantum computing systems to address scientific challenges and maintain U.S. leadership in next-generation information technology. It is led by the DOE’s Berkeley Lab and Sandia National Laboratories.

The series of articles below profiles three student researchers who have participated in Berkeley Lab-led efforts to apply quantum computing to the pattern-recognition problem in particle physics:

Lucy Linder, while working as a researcher at Berkeley Lab, wrote her master's thesis – supervised by Berkeley Lab staff scientist Paolo Calafiura – on the potential application of a quantum-computing technique called quantum annealing to finding particle tracks. She remotely accessed quantum-computing machines at D-Wave Systems Inc. in Canada and at Los Alamos National Laboratory in New Mexico.

Linder’s approach was to first cast the simulated particle-track data as something known as a QUBO (quadratic unconstrained binary optimization) problem, which expresses the task as an equation over binary values: each variable is either a one or a zero. This QUBO format also prepared the data for analysis by a quantum annealer, which uses qubits to seek out the best available solution by exploiting a physics principle that describes how objects naturally settle into the lowest-possible energy state.
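
For illustration, the sketch below builds a tiny, made-up QUBO of the general kind described above – binary variables marking candidate track segments as kept (1) or discarded (0), with an energy function that rewards good segments and compatible pairs – and finds its lowest-energy assignment by brute force, the same search a quantum annealer performs in hardware. The matrix values are invented for the example and are not the HEP.QPR formulation:

```python
import itertools
import numpy as np

# Schematic QUBO: x_i = 1 keeps candidate segment i, x_i = 0 discards
# it. Diagonal entries are per-segment costs; off-diagonal entries are
# pairwise couplings (negative = the two segments connect smoothly).
# These numbers are purely illustrative.
Q = np.array([
    [-1.0,  0.5,  0.0],
    [ 0.0, -1.0, -0.8],
    [ 0.0,  0.0, -1.0],
])

def energy(x: np.ndarray) -> float:
    """QUBO objective E(x) = x^T Q x for a binary vector x."""
    return float(x @ Q @ x)

# Brute-force enumeration stands in for the annealer, which searches
# the same energy landscape for the lowest-energy assignment.
best = min((np.array(bits) for bits in itertools.product([0, 1], repeat=3)),
           key=energy)
print("lowest-energy selection:", best, "energy:", energy(best))
```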

Eric Rohm, an undergraduate student working on a contract at Berkeley Lab as part of the DOE’s Science Undergraduate Laboratory Internship program, developed a quantum approximate optimization algorithm (QAOA) using quantum-computing resources at Rigetti Computing in Berkeley, California. He was supervised by Berkeley Lab physicist Heather Gray.

This approach blended conventional and quantum-computing techniques in a custom algorithm. The algorithm, still being refined, has been tested on the Rigetti Quantum Virtual Machine, a conventional computer that simulates a small quantum computer, and may eventually be tested on a Rigetti quantum processing unit equipped with actual qubits.
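
The sketch below shows the general shape of such a hybrid loop: a single-layer QAOA circuit for a toy two-qubit cost function is simulated as a plain state vector (roughly what a quantum virtual machine does), while a classical grid search tunes the circuit angles. The cost function and parameters are invented for illustration and do not represent Rohm's actual algorithm:

```python
import numpy as np

n = 2                                     # number of qubits
# Toy cost of each basis state 00, 01, 10, 11 (a MaxCut-like objective,
# invented for illustration; QAOA tries to maximize the expected cost).
costs = np.array([0.0, 1.0, 1.0, 0.0])

def rx(theta):
    """Single-qubit X-rotation gate."""
    c, s = np.cos(theta / 2), -1j * np.sin(theta / 2)
    return np.array([[c, s], [s, c]])

def apply_single_qubit(gate, state, qubit):
    """Apply a 2x2 gate to one qubit of an n-qubit state vector."""
    psi = np.moveaxis(state.reshape([2] * n), qubit, 0)
    psi = np.tensordot(gate, psi, axes=1)
    return np.moveaxis(psi, 0, qubit).reshape(-1)

def qaoa_expectation(gamma, beta):
    """Expected cost after one QAOA layer with angles (gamma, beta)."""
    state = np.full(2 ** n, 1 / np.sqrt(2 ** n), dtype=complex)  # H on all
    state = state * np.exp(-1j * gamma * costs)                  # cost unitary
    for q in range(n):                                           # mixer unitary
        state = apply_single_qubit(rx(2 * beta), state, q)
    return float(np.real(np.sum(np.abs(state) ** 2 * costs)))

# The "conventional" half of the hybrid loop: a classical optimizer
# (here just a grid search) tunes the quantum circuit's angles.
grid = np.linspace(0, np.pi, 50)
gamma, beta = max(((g, b) for g in grid for b in grid),
                  key=lambda p: qaoa_expectation(*p))
print(f"best angles gamma={gamma:.2f}, beta={beta:.2f}, "
      f"expected cost={qaoa_expectation(gamma, beta):.3f}")
```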

Amitabh Yadav, a student research associate at Berkeley Lab since November who is supervised by Gray and Berkeley Lab software engineer Wim Lavrijsen, is working to apply a quantum version of a conventional technique called the Hough transform to identify and reconstruct particle tracks using IBM’s Quantum Experience, a cloud-based quantum-computing platform.

The classical Hough transform can be used to detect specific features such as lines, curves, and circles in complex patterns, and a quantum Hough transform could potentially pick out more complex shapes from exponentially larger datasets.
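
As a point of reference, the sketch below implements the classical Hough transform for straight lines: each hit "votes" for every line that could pass through it, and hits lying on a common line pile up votes in one cell of an accumulator array. The data and grid here are invented for illustration:

```python
import numpy as np

# Classical Hough transform for line detection. A line is parameterized
# as rho = x*cos(theta) + y*sin(theta); each hit (x, y) votes for all
# (theta, rho) cells consistent with it, and collinear hits produce a
# peak in the accumulator.
points = [(x, 0.5 * x + 1.0) for x in range(10)]   # hits on y = 0.5x + 1

thetas = np.linspace(0, np.pi, 180, endpoint=False)
rhos = np.linspace(-15, 15, 200)
accumulator = np.zeros((len(thetas), len(rhos)), dtype=int)

for x, y in points:
    for i, t in enumerate(thetas):
        rho = x * np.cos(t) + y * np.sin(t)
        accumulator[i, np.argmin(np.abs(rhos - rho))] += 1

# The strongest accumulator cell identifies the dominant line.
i, j = np.unravel_index(accumulator.argmax(), accumulator.shape)
print(f"strongest line: theta={np.degrees(thetas[i]):.1f} deg, rho={rhos[j]:.2f}")
```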

Source: Berkeley Lab News

Research Area: Particle Physics