Random thoughts about random subjects… From science to literature and between manga and watercolours, passing by data science and rugby; including film, physics and fiction, programming, pictures and puns.
This is a reblog of an article in ScienceDaily. See the original here.
A research team has for the first time experimentally proved a century old quantum theory that relativistic particles can pass through a barrier with 100% transmission.
The perfect transmission of sound through a barrier is difficult to achieve, if not impossible based on our existing knowledge. This is also true with other energy forms such as light and heat.
A research team led by Professor Xiang Zhang, now President of the University of Hong Kong (HKU), carried out the work while he was a professor at the University of California, Berkeley (UC Berkeley). The team has for the first time experimentally proved a century-old quantum theory that relativistic particles can pass through a barrier with 100% transmission. The findings have been published in the journal Science.
It would be difficult for us to jump over a thick, high wall without enough accumulated energy. In contrast, a microscopic particle in the quantum world is predicted to pass through a barrier well beyond its energy, regardless of the barrier’s height or width, as if the barrier were “transparent.”
As early as 1929, theoretical physicist Oscar Klein proposed that a relativistic particle can penetrate a potential barrier with 100% transmission upon normal incidence on the barrier. Scientists called this exotic and counterintuitive phenomenon “Klein tunneling.” In the nine decades since, scientists tried various approaches to test Klein tunneling experimentally, but the attempts were unsuccessful and direct experimental evidence remained lacking.
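Klein’s 100%-transmission prediction has a compact angular form in the two-dimensional Dirac systems used to emulate it. The sketch below evaluates the standard graphene-style result in the high-barrier limit (Katsnelson, Novoselov and Geim, 2006); it illustrates the angular dependence only, it is not the paper’s phononic-crystal calculation, and the value of qx·D is an arbitrary illustrative choice.

```python
import numpy as np

# Transmission of a massless Dirac quasiparticle through a square barrier,
# in the high-barrier limit: T(phi) = cos^2(phi) / (1 - cos^2(qx*D) sin^2(phi)),
# where phi is the angle of incidence, D the barrier width, and qx the
# wavevector inside the barrier (qx*D = 1.3 is an arbitrary illustrative value).

def klein_transmission(phi, qx_D):
    c, s = np.cos(phi), np.sin(phi)
    return c**2 / (1.0 - np.cos(qx_D)**2 * s**2)

for phi in np.linspace(-np.pi / 2.2, np.pi / 2.2, 9):
    print(f"phi = {np.degrees(phi):6.1f} deg   T = {klein_transmission(phi, 1.3):.3f}")

# At normal incidence the barrier is perfectly "transparent",
# regardless of its height or width:
print(klein_transmission(0.0, 1.3))  # 1.0
```

At any other angle the transmission dips below one, which is why normal incidence is the crucial test of Klein’s prediction.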
Professor Zhang’s team conducted the experiment in artificially designed phononic crystals with a triangular lattice. The lattice’s linear dispersion properties make it possible to mimic relativistic Dirac quasiparticles with sound excitations, which led to the successful experimental observation of Klein tunneling.
“This is an exciting discovery. Quantum physicists have always tried to observe Klein tunneling in elementary particle experiments, but it is a very difficult task. We designed a phononic crystal similar to graphene that can excite relativistic quasiparticles, but unlike natural graphene, the geometry of the human-made phononic crystal can be adjusted freely to precisely achieve the ideal conditions, which made possible the first direct observation of Klein tunneling,” said Professor Zhang.
The achievement not only represents a breakthrough in fundamental physics, but also presents a new platform for exploring emerging macroscale systems to be used in applications such as on-chip logic devices for sound manipulation, acoustic signal processing, and sound energy harvesting.
“In current acoustic communications, the transmission loss of acoustic energy at an interface is unavoidable. If the transmittance at the interface can be increased to nearly 100%, the efficiency of acoustic communications can be greatly improved, opening up cutting-edge applications. This is especially important when a surface or interface hinders accurate acoustic detection, as in underwater exploration. The experimental measurement is also conducive to the future study of quasiparticles with topological properties in phononic crystals, which might be difficult to perform in other systems,” said Dr. Xue Jiang, a former member of Zhang’s team and currently an Associate Researcher at the Department of Electronic Engineering at Fudan University.
Dr. Jiang pointed out that the research findings might also benefit biomedical devices. They may help ultrasound penetrate obstacles and reach designated targets such as tissues or organs, improving the precision of ultrasound for better diagnosis and treatment.
On the basis of the current experiments, researchers can control the mass and dispersion of the quasiparticles by exciting the phononic crystals at different frequencies, achieving flexible experimental configuration and on/off control of Klein tunneling. This approach can be extended to other artificial structures for the study of optics and thermotics. It allows unprecedented control of quasiparticles and wavefronts, and contributes to the exploration of other complex quantum physical phenomena.
This is a reblog of an article in ScienceDaily. See the original here.
Quantum computers have already managed to surpass ordinary computers in solving certain tasks — unfortunately, totally useless ones. The next milestone is to get them to do useful things. Researchers at Chalmers University of Technology, Sweden, have now shown that they can solve a small part of a real logistics problem with their small, but well-functioning quantum computer.
Interest in building quantum computers has gained considerable momentum in recent years, and feverish work is underway in many parts of the world. In 2019, Google’s research team made a major breakthrough when their quantum computer managed to solve a task far more quickly than the world’s best supercomputer. The downside is that the solved task had no practical use whatsoever — it was chosen because it was judged to be easy to solve for a quantum computer, yet very difficult for a conventional computer.
Therefore, an important task is now to find useful, relevant problems that are beyond the reach of ordinary computers, but which a relatively small quantum computer could solve.
“We want to be sure that the quantum computer we are developing can help solve relevant problems early on. Therefore, we work in close collaboration with industrial companies,” says theoretical physicist Giulia Ferrini, one of the leaders of Chalmers University of Technology’s quantum computer project, which began in 2018.
Together with Göran Johansson, Giulia Ferrini led the theoretical work when a team of researchers at Chalmers, including an industrial doctoral student from the aviation logistics company Jeppesen, recently showed that a quantum computer can solve an instance of a real problem in the aviation industry.
The algorithm proven on two qubits

All airlines are faced with scheduling problems. For example, assigning individual aircraft to different routes represents an optimisation problem, one that grows very rapidly in size and complexity as the number of routes and aircraft increases.
Researchers hope that quantum computers will eventually be better at handling such problems than today’s computers. The basic building block of the quantum computer — the qubit — is based on completely different principles than the building blocks of today’s computers, allowing them to handle enormous amounts of information with relatively few qubits.
However, due to their different structure and function, quantum computers must be programmed in other ways than conventional computers. One proposed algorithm that is believed to be useful on early quantum computers is the so-called Quantum Approximate Optimization Algorithm (QAOA).
The Chalmers research team has now successfully executed the algorithm on their quantum computer — a processor with two qubits — and showed that it can solve the problem of assigning aircraft to routes. In this first demonstration, the result could be easily verified as the scale was very small — it involved only two airplanes.
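The structure of depth-1 QAOA can be sketched on two qubits with plain numpy. The cost Hamiltonian below is a toy Ising term standing in for a tiny two-option assignment problem; it is not the paper’s tail-assignment instance, and the grid search stands in for the classical optimiser a real implementation would use.

```python
import numpy as np

# Depth-1 QAOA on two qubits, simulated with dense linear algebra.
I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.diag([1.0, -1.0])

C = np.kron(Z, Z)                        # toy cost Hamiltonian (diagonal)
plus = np.ones(4, dtype=complex) / 2.0   # |+>|+> starting state

def qaoa_state(gamma, beta):
    psi = np.exp(-1j * gamma * np.diag(C)) * plus     # cost ("phase") layer
    rx = np.cos(beta) * I2 - 1j * np.sin(beta) * X    # mixer exp(-i*beta*X)
    return np.kron(rx, rx) @ psi

def cost_expectation(gamma, beta):
    psi = qaoa_state(gamma, beta)
    return float(np.real(psi.conj() @ (C @ psi)))

# Classical outer loop: grid-search the two variational angles.
grid = np.linspace(0.0, np.pi, 60)
best = min((cost_expectation(g, b), g, b) for g in grid for b in grid)
print(f"best <C> found: {best[0]:.4f} (exact ground-state energy: -1)")
```

For this single-term cost, one layer already reaches essentially the exact minimum; realistic instances need more qubits, more layers, and a smarter optimiser.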
Potential to handle many aircraft

With this feat, the researchers were the first to show that the QAOA algorithm can solve the problem of assigning aircraft to routes in practice. They also managed to run the algorithm one level further than anyone before, an achievement that requires very good hardware and accurate control.
“We have shown that we have the ability to map relevant problems onto our quantum processor. We still have a small number of qubits, but they work well. Our plan has been to first make everything work very well on a small scale, before scaling up,” says Jonas Bylander, senior researcher responsible for the experimental design, and one of the leaders of the project of building a quantum computer at Chalmers.
The theorists in the research team also simulated solving the same optimisation problem for up to 278 aircraft, which would require a quantum computer with 25 qubits.
“The results remained good as we scaled up. This suggests that the QAOA algorithm has the potential to solve this type of problem at even larger scales,” says Giulia Ferrini.
Surpassing today’s best computers would, however, require much larger devices. The researchers at Chalmers have now begun scaling up and are currently working with five quantum bits. The plan is to reach at least 20 qubits by 2021 while maintaining the high quality.
Applying the Quantum Approximate Optimization Algorithm to the Tail-Assignment Problem. Physical Review Applied, 2020; 14 (3) DOI: 10.1103/PhysRevApplied.14.034009
Optimised quantum algorithms present solution to Fermi-Hubbard model on near-term hardware
This is a reblog of an article in ScienceDaily. See the original here.
The team, led by Bristol researcher and Phasecraft co-founder, Dr. Ashley Montanaro, has discovered algorithms and analysis which significantly lessen the quantum hardware capability needed to solve problems which go beyond the realm of classical computing, even supercomputers.
In the paper, published in Physical Review B, the team demonstrates how optimised quantum algorithms can solve the notorious Fermi-Hubbard model on near-term hardware.
The Fermi-Hubbard model is of fundamental importance in condensed-matter physics as a model for strongly correlated materials and a route to understanding high-temperature superconductivity.
Finding the ground state of the Fermi-Hubbard model has been predicted to be one of the first applications of near-term quantum computers, and one that offers a pathway to understanding and developing novel materials.
Dr. Ashley Montanaro, research lead and cofounder of Phasecraft: “Quantum computing has critically important applications in materials science and other domains. Despite the major quantum hardware advances recently, we may still be several years from having the right software and hardware to solve meaningful problems with quantum computing. Our research focuses on algorithms and software optimisations to maximise the quantum hardware’s capacity, and bring quantum computing closer to reality.
“Near-term quantum hardware will have limited device and computation size. Phasecraft applied new theoretical ideas and numerical experiments to put together a very comprehensive study on different strategies for solving the Fermi-Hubbard model, zeroing in on strategies that are most likely to have the best results and impact in the near future.
“The results suggest that optimising over quantum circuits with a gate depth substantially less than a thousand could be sufficient to solve instances of the Fermi-Hubbard model beyond the capacity of a supercomputer. This new research shows significant promise for the capabilities of near-term quantum devices, improving on previous research findings by around a factor of 10.”
Physical Review B, published by the American Physical Society, is the top specialist journal in condensed-matter physics. The peer-reviewed research paper was also chosen as the Editors’ Suggestion and to appear in Physics magazine.
Andrew Childs, Professor in the Department of Computer Science and Institute for Advanced Computer Studies at the University of Maryland: “The Fermi-Hubbard model is a major challenge in condensed-matter physics, and the Phasecraft team has made impressive steps in showing how quantum computers could solve it. Their work suggests that surprisingly low-depth circuits could provide useful information about this model, making it more accessible to realistic quantum hardware.”
Hartmut Neven, Head of Quantum Artificial Intelligence Lab, Google: “Sooner or later, quantum computing is coming. Developing the algorithms and technology to power the first commercial applications of early quantum computing hardware is the toughest challenge facing the field, which few are willing to take on. We are proud to be partners with Phasecraft, a team that are developing advances in quantum software that could shorten that timeframe by years.”
Phasecraft Founder Dr. Toby Cubitt: “At Phasecraft, our team of leading quantum theorists have been researching and applying quantum theory for decades, leading some of the top global academic teams and research in the field. Today, Ashley and his team have demonstrated ways to get closer to achieving new possibilities that exist just beyond today’s technological bounds.”
Phasecraft has closed a record seed round for a quantum company in the UK with £3.7m in funding from private-sector VC investors, led by LocalGlobe with Episode1 along with previous investors. Former Songkick founder Ian Hogarth has also joined as board chair for Phasecraft. Phasecraft previously raised a £750,000 pre-seed round led by UCL Technology Fund with Parkwalk Advisors and London Co-investment Fund and has earned several grants facilitated by InnovateUK. Between equity funding and research grants, Phasecraft has raised more than £5.5m.
Dr Toby Cubitt: “With new funding and support, we are able to continue our pioneering research and industry collaborations to develop the quantum computing industry and find useful applications faster.”
Although Bose-Einstein condensation has been observed in several systems, the limits of the phenomenon need to be pushed further: to faster timescales, higher temperatures, and smaller sizes. The easier creating these condensates gets, the more exciting routes open for new technological applications. New light sources, for example, could be extremely small in size and allow fast information processing.
In experiments by Aalto researchers, the condensed particles were mixtures of light and electrons in motion in gold nanorods arranged into a periodic array. Unlike most previous Bose-Einstein condensates created experimentally, the new condensate does not need to be cooled down to temperatures near absolute zero. Because the particles are mostly light, the condensation could be induced at room temperature.
‘The gold nanoparticle array is easy to create with modern nanofabrication methods. Near the nanorods, light can be focused into tiny volumes, even below the wavelength of light in vacuum. These features offer interesting prospects for fundamental studies and applications of the new condensate,’ says Academy Professor Päivi Törmä.
The main hurdle in acquiring proof of the new kind of condensate is that it comes into being extremely quickly. ‘According to our theoretical calculations, the condensate forms in only a picosecond,’ says doctoral student Antti Moilanen. ‘How could we ever verify the existence of something that only lasts one trillionth of a second?’
Turning distance into time
A key idea was to initiate the condensation process with a kick so that the particles forming the condensate would start to move.
‘As the condensate takes form, it will emit light throughout the gold nanorod array. By observing the light, we can monitor how the condensation proceeds in time. This is how we can turn distance into time,’ explains staff scientist Tommi Hakala.
The light that the condensate emits is similar to laser light. ‘We can alter the distance between each nanorod to control whether Bose-Einstein condensation or the formation of ordinary laser light occurs. The two are closely related phenomena, and being able to distinguish between them is crucial for fundamental research. They also promise different kinds of technological applications,’ explains Professor Törmä.
Both lasing and Bose-Einstein condensation provide bright beams, but the coherences of the light they offer have different properties. These, in turn, affect the ways the light can be tuned to meet the requirements of a specific application. The new condensate can produce light pulses that are extremely short and may offer faster speeds for information processing and imaging applications. Academy Professor Törmä has already obtained a Proof of Concept grant from the European Research Council to explore such prospects.
Tommi K. Hakala, Antti J. Moilanen, Aaro I. Väkeväinen, Rui Guo, Jani-Petri Martikainen, Konstantinos S. Daskalakis, Heikki T. Rekola, Aleksi Julku, Päivi Törmä. Bose–Einstein condensation in a plasmonic lattice. Nature Physics, 2018; DOI: 10.1038/s41567-018-0109-9
New quantum method generates really random numbers
Researchers at the National Institute of Standards and Technology (NIST) have developed a method for generating numbers guaranteed to be random by quantum mechanics. Described in the April 12 issue of Nature, the experimental technique surpasses all previous methods for ensuring the unpredictability of its random numbers and may enhance security and trust in cryptographic systems.
The new NIST method generates digital bits (1s and 0s) with photons, or particles of light, using data generated in an improved version of a landmark 2015 NIST physics experiment. That experiment showed conclusively that what Einstein derided as “spooky action at a distance” is real. In the new work, researchers process the spooky output to certify and quantify the randomness available in the data and generate a string of much more random bits.
Random numbers are used hundreds of billions of times a day to encrypt data in electronic networks. But these numbers are not certifiably random in an absolute sense. That’s because they are generated by software formulas or physical devices whose supposedly random output could be undermined by factors such as predictable sources of noise. Running statistical tests can help, but no statistical test on the output alone can absolutely guarantee that the output was unpredictable, especially if an adversary has tampered with the device.
“It’s hard to guarantee that a given classical source is really unpredictable,” NIST mathematician Peter Bierhorst said. “Our quantum source and protocol is like a fail-safe. We’re sure that no one can predict our numbers.”
“Something like a coin flip may seem random, but its outcome could be predicted if one could see the exact path of the coin as it tumbles. Quantum randomness, on the other hand, is real randomness. We’re very sure we’re seeing quantum randomness because only a quantum system could produce these statistical correlations between our measurement choices and outcomes.”
The new quantum-based method is part of an ongoing effort to enhance NIST’s public randomness beacon, which broadcasts random bits for applications such as secure multiparty computation. The NIST beacon currently relies on commercial sources.
Quantum mechanics provides a superior source of randomness because measurements of some quantum particles (those in a “superposition” of both 0 and 1 at the same time) have fundamentally unpredictable results. Researchers can easily measure a quantum system. But it’s hard to prove that measurements are being made of a quantum system and not a classical system in disguise.
In NIST’s experiment, that proof comes from observing the spooky quantum correlations between pairs of distant photons while closing the “loopholes” that might otherwise allow non-random bits to appear to be random. For example, the two measurement stations are positioned too far apart to allow hidden communications between them; by the laws of physics any such exchanges would be limited to the speed of light.
Random numbers are generated in two steps. First, the spooky action experiment generates a long string of bits through a “Bell test,” in which researchers measure correlations between the properties of the pairs of photons. The timing of the measurements ensures that the correlations cannot be explained by classical processes such as pre-existing conditions or exchanges of information at, or slower than, the speed of light. Statistical tests of the correlations demonstrate that quantum mechanics is at work, and these data allow the researchers to quantify the amount of randomness present in the long string of bits.
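The correlation statistics behind a Bell test are usually summarised by a Bell inequality. In the common CHSH form (shown here as an illustration; the loophole-free test NIST builds on uses a closely related inequality), with measurement settings $a, a'$ and $b, b'$ and measured correlators $E$:

```latex
S = E(a,b) - E(a,b') + E(a',b) + E(a',b'), \qquad |S| \le 2
```

for any local classical model, while quantum mechanics allows values up to Tsirelson's bound $2\sqrt{2} \approx 2.83$. A measured $|S| > 2$ is what certifies that the bits could not have been predetermined.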
That randomness may be spread very thin throughout the long string of bits. For example, nearly every bit might be 0 with only a few being 1. To obtain a short, uniform string with concentrated randomness such that each bit has a 50/50 chance of being 0 or 1, a second step called “extraction” is performed. NIST researchers developed software to process the Bell test data into a shorter string of bits that are nearly uniform; that is, with 0s and 1s equally likely. The full process requires the input of two independent strings of random bits to select measurement settings for the Bell tests and to “seed” the software to help extract the randomness from the original data. NIST researchers used a conventional random number generator to generate these input strings.
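The two-step structure, thin randomness in and nearly uniform bits out, can be sketched with a seeded Toeplitz-hashing extractor, a standard two-universal construction. The bias, block sizes and seed handling below are invented for illustration; the NIST protocol’s actual extractor and parameters may differ.

```python
import numpy as np

rng = np.random.default_rng(0)

# Raw bits with randomness "spread thin": heavily biased toward 0.
n_raw, n_out = 4096, 256
raw = (rng.random(n_raw) < 0.05).astype(np.uint8)

# A random seed defines a Toeplitz matrix: entry (i, j) = seed[i + n_raw - 1 - j],
# so every diagonal is constant. The output is the matrix-vector product over GF(2).
seed = rng.integers(0, 2, size=n_raw + n_out - 1, dtype=np.uint8)
rows = np.stack([seed[i:i + n_raw][::-1] for i in range(n_out)])
out = (rows.astype(np.int64) @ raw.astype(np.int64)) % 2

print(f"raw bias:    {raw.mean():.3f}")   # near the planted 0.05
print(f"output bias: {out.mean():.3f}")   # typically close to 0.5
```

Note the trade: 4096 weak bits are compressed into 256 strong ones, and an independent random seed is consumed along the way, mirroring the seed requirement described above.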
From 55,110,210 trials of the Bell test, each of which produces two bits, researchers extracted 1,024 bits certified to be uniform to within one trillionth of 1 percent.
“A perfect coin toss would be uniform, and we made 1,024 bits almost perfectly uniform, each extremely close to equally likely to be 0 or 1,” Bierhorst said.
Other researchers have previously used Bell tests to generate random numbers, but the NIST method is the first to use a loophole-free Bell test and to process the resulting data through extraction. Extractors and seeds are already used in classical random number generators; in fact, random seeds are essential in computer security and can be used as encryption keys.
In the new NIST method, the final numbers are certified to be random even if the measurement settings and seed are publicly known; the only requirement is that the Bell test experiment be physically isolated from customers and hackers. “The idea is you get something better out (private randomness) than what you put in (public randomness),” Bierhorst said.
Peter Bierhorst, Emanuel Knill, Scott Glancy, Yanbao Zhang, Alan Mink, Stephen Jordan, Andrea Rommal, Yi-Kai Liu, Bradley Christensen, Sae Woo Nam, Martin J. Stevens, Lynden K. Shalm. Experimentally Generated Randomness Certified by the Impossibility of Superluminal Signals. Nature, 2018 DOI: 10.1038/s41586-018-0019-0
Yesterday the 2016 Nobel Prize in Physics was announced. I immediately got a few tweets asking what these “exotic” states of matter were and for more information about them… Well, in short, the prize was awarded for theoretical discoveries that help scientists understand unusual properties of materials, such as superconductivity and superfluidity, that arise at low temperatures.
The prize was awarded jointly to David J. Thouless of the University of Washington in Seattle, F. Duncan M. Haldane of Princeton University in New Jersey, and J. Michael Kosterlitz of Brown University in Rhode Island. The citation from the Swedish Academy reads: “for theoretical discoveries of topological phase transitions and topological phases of matter.”
“Topo…what?” – I hear you cry… well let us start at the beginning…
Thouless, Haldane and Kosterlitz work in a field known as condensed matter physics, which studies the physical properties of “condensed” phases of matter such as solids and liquids. You may not know it, but results from research in condensed matter physics are what make it possible for you to store so much data on your computer’s hard drive, thanks to the discovery of giant magnetoresistance.
The discoveries that the Nobel Committee is highlighting with the prize provide a better understanding of phases of matter such as superconductors, superfluids and thin magnetic films, and they are now guiding the quest for next-generation materials for electronics, quantum computing and more. The laureates developed mathematical models that describe the topological properties of materials in relation to phenomena such as superconductivity, superfluidity and other peculiar magnetic properties.
Once again that word: “topology”…
So, we know that all matter is formed of atoms. Nonetheless, matter can have different properties and appear in different forms, such as solid, liquid, superfluid, magnet and so on. These various forms of matter are often called states of matter, or phases. According to condensed matter physics, the different properties of materials originate from the different ways in which their atoms are organised; those organisations of the atoms (or other particles) are formally called the orders in the materials. Topological order is a type of order in zero-temperature phases of matter (also known as quantum matter). In general, topology is the study of geometrical properties and spatial relations that are unaffected by continuous changes in the shape or size of figures. In our case, we are talking about properties of matter that remain unchanged when the object is flattened, stretched or expanded.
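A concrete example of such an invariant is the genus $g$, the number of holes in a closed surface, which is fixed by the Euler characteristic (a standard result, included here for illustration):

```latex
\chi = V - E + F = 2 - 2g
```

for any mesh with $V$ vertices, $E$ edges and $F$ faces drawn on the surface. A sphere has $\chi = 2$ ($g = 0$); a torus has $\chi = 0$ ($g = 1$), which is why topologists joke that a coffee mug and a doughnut are the same object. Since $g$ is an integer, no amount of smooth stretching or flattening can change it, and quantities tied to such invariants can only change in discrete jumps.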
Although research originally focused on topological properties in 1-D and 2-D materials, researchers have since discovered them in 3-D materials as well. These results are particularly important as they enable us to understand “exotic” phenomena such as superconductivity, the property of matter that lets electrons travel through materials with zero resistance, and superfluidity, which lets fluids flow with zero loss of kinetic energy. Currently, one of the most actively researched topics in the area is the study of topological insulators, superconductors and metals.
Here is a report from Physics Today about the Nobel Prize announcement:
David Thouless, Duncan Haldane, and Michael Kosterlitz are to be awarded the 2016 Nobel Prize in Physics for their work on topological phases and phase transitions, the Royal Swedish Academy of Sciences announced on Tuesday. Thouless, of the University of Washington in Seattle, will receive half the 8 million Swedish krona (roughly $925 000) prize; Haldane, of Princeton University, and Kosterlitz, of Brown University, will split the other half.
This year’s laureates used the mathematical branch of topology to make revolutionary contributions to their field of condensed-matter physics. In 1972 Thouless and Kosterlitz identified a phase transition that opened up two-dimensional systems as a playground for observing superconductivity, superfluidity, and other exotic phenomena. A decade later Haldane showed that topology is important in considering the properties of 1D chains of magnetic atoms. Then in the 1980s Thouless and Haldane demonstrated that the unusual behavior exhibited in the quantum Hall effect can emerge without a magnetic field.
From early on it was clear that the laureates’ work would have important implications for condensed-matter theory. Today experimenters are studying 2D superconductors and topological insulators, which are insulating in the bulk yet channel spin-polarized currents on their surfaces without resistance (see Physics Today, January 2010, page 33). The research could lead to improved electronics, robust qubits for quantum computers, and even an improved understanding of the standard model of particle physics.
Vortices and the KT transition
When Thouless and Kosterlitz first collaborated in the early 1970s, the conventional wisdom was that thermal fluctuations in 2D materials precluded the emergence of ordered phases such as superconductivity. The researchers, then at the University of Birmingham in England, dismantled that argument by investigating the interactions within a 2D lattice.
Thouless and Kosterlitz considered an idealized array of spins that is cooled to nearly absolute zero. At first the system lacks enough thermal energy to create defects, which in the model take the form of localized swirling vortices. Raising the temperature spurs the development of tightly bound pairs of oppositely rotating vortices. The coherence of the entire system depends logarithmically on the separation between vortices. As the temperature rises further, more vortex pairs pop up, and the separation between partners grows.
The two scientists’ major insight came when they realized they could model the clockwise and counterclockwise vortices as positive and negative electric charges. The more pairs that form, the more interactions are disturbed by narrowly spaced vortices sitting between widely spaced ones. “Eventually, the whole thing will fly apart and you’ll get spontaneous ‘ionization,’ ” Thouless told Physics Today in 2006.
That analog to ionization, in which the coherence suddenly falls off in an exponential rather than logarithmic dependence with distance, is known as the Kosterlitz–Thouless (KT) transition. (The late Russian physicist Vadim Berezinskii made a similar observation in 1970, which led some researchers to add a “B” to the transition name, but the Nobel committee notes that Berezinskii did not theorize the existence of the transition at finite temperature.)
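That sudden “ionization” can be made quantitative with the textbook single-vortex free-energy estimate (a heuristic argument, not the full renormalization-group treatment). For a system of linear size $R$, vortex-core size $a$ and spin stiffness $J$, the energy of one free vortex and the entropy of the $\sim (R/a)^2$ places to put it give

```latex
E \approx \pi J \ln(R/a), \qquad
S = 2 k_B \ln(R/a), \qquad
F = E - TS = (\pi J - 2 k_B T)\,\ln(R/a).
```

Below $T_{KT} = \pi J / (2 k_B)$, $F$ grows with system size, so free vortices are suppressed and survive only in bound pairs; above it, $F$ becomes large and negative, free vortices proliferate, and the order falls apart, exactly the transition Kosterlitz and Thouless described.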
Unlike some other phase transitions, such as the onset of ferromagnetism, no symmetry is broken. The sudden shift between order and disorder also demonstrates that superconductivity could indeed subsist in the 2D realm at temperatures below that of the KT transition. Experimenters observed the KT transition in superfluid helium-4 in 1978 and in superconducting thin films in 1981. More recently, the transition was reproduced in a flattened cloud of ultracold rubidium atoms (see Physics Today, August 2006, page 17).
A topological answer for the quantum Hall effect
Thouless then turned his attention to the quantum foundations of conductors and insulators. In 1980 German physicist Klaus von Klitzing had applied a strong magnetic field to a thin conducting film sandwiched between semiconductors. The electrons traveling within the film separated into well-organized opposing lanes of traffic along the edges (see Physics Today, June 1981, page 17). Von Klitzing had discovered the quantum Hall effect, for which he would earn the Nobel five years later.
Crucially, von Klitzing found that adjusting the strength of the magnetic field changed the conductance of his thin film only in fixed steps; the conductance was always an integer multiple of a fixed value, e²/h. That discovery proved the key for Thouless to relate the quantum Hall effect to topology, which is also based on integer steps—objects are often distinguished from each other topologically by the number of holes or nodes they possess, which is always an integer. In 1982 Thouless and collaborators proposed that the electrons in von Klitzing’s experiment had formed a topological quantum fluid; the electrons’ collective behavior in that fluid, as measured by conductance, must vary in steps.
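In modern language, Thouless’s integer is the Chern number of the occupied bands, the TKNN invariant. The standard statement (textbook form, not the original paper’s notation) is

```latex
\sigma_{xy} = \frac{e^2}{h}\, C, \qquad
C = \frac{1}{2\pi} \int_{\mathrm{BZ}} d^2k \,
\left( \partial_{k_x} A_{y} - \partial_{k_y} A_{x} \right) \in \mathbb{Z},
```

where $A_i = i \langle u(\mathbf{k}) \,|\, \partial_{k_i} u(\mathbf{k}) \rangle$ is the Berry connection of the Bloch states and the integral runs over the Brillouin zone. Because $C$ is an integer, the Hall conductance cannot vary smoothly; it moves only in steps of $e^2/h$, which is exactly what von Klitzing measured.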
Not only did Thouless’s work explain the integer nature of the quantum Hall effect, but it also pointed the way to reproducing the phenomenon’s exotic behavior under less extreme conditions. In 1988 Haldane proposed a means for electrons to form a topological quantum fluid in the absence of a magnetic field. Twenty-five years later, researchers reported such behavior in chromium-doped (Bi,Sb)2Te3, the first observation of what is known as the quantum anomalous Hall effect.
Exploring topological materials
Around 2005, physicists began exploring the possibility of realizing topological insulators, a large family of new topological phases of matter that would exhibit the best of multiple worlds: They would robustly conduct electricity on their edges or surfaces without a magnetic field and as a bonus would divide electron traffic into lanes determined by spin. Since then experimenters have identified topological insulators in two and three dimensions, which may lead to improved electronics. Other physicists have created topological insulators that conduct sound or light, rather than electrons, on their surfaces (see Physics Today, May 2014, page 68).
Haldane’s work in the 1980s on the fractional quantum Hall effect was among the theoretical building blocks for proposals to use topologically protected excitations to build a fault-tolerant quantum computer (see Physics Today, October 2005, page 21). And his 1982 paper on magnetic chains serves as the foundation for efforts to create topologically protected excitations that behave like Majorana fermions, which are their own antiparticle. The work could lead to robust qubits for preserving the coherence of quantum information and perhaps provide particle physicists with clues as to the properties of fundamental Majorana fermions, which may or may not exist in nature.
From gene mapping to space exploration, humanity continues to generate ever-larger sets of data — far more information than people can actually process, manage, or understand.
Machine learning systems can help researchers deal with this ever-growing flood of information. Some of the most powerful of these analytical tools are based on a strange branch of geometry called topology, which deals with properties that stay the same even when something is bent and stretched every which way.
Such topological systems are especially useful for analyzing the connections in complex networks, such as the internal wiring of the brain, the U.S. power grid, or the global interconnections of the Internet. But even with the most powerful modern supercomputers, such problems remain daunting and impractical to solve. Now, a new approach that would use quantum computers to streamline these problems has been developed by researchers at MIT, the University of Waterloo, and the University of Southern California.
The team describes their theoretical proposal this week in the journal Nature Communications. Seth Lloyd, the paper’s lead author and the Nam P. Suh Professor of Mechanical Engineering, explains that algebraic topology is key to the new method. This approach, he says, helps to reduce the impact of the inevitable distortions that arise every time someone collects data about the real world.
In a topological description, basic features of the data (How many holes does it have? How are the different parts connected?) are considered the same no matter how much they are stretched, compressed, or distorted. Lloyd explains that it is often these fundamental topological attributes “that are important in trying to reconstruct the underlying patterns in the real world that the data are supposed to represent.”
It doesn’t matter what kind of dataset is being analyzed, he says. The topological approach to looking for connections and holes “works whether it’s an actual physical hole, or the data represents a logical argument and there’s a hole in the argument. This will find both kinds of holes.”
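To make "counting holes" a little more concrete: for the simplest topological objects, graphs, the number of connected components and the number of independent loops (the "holes") can be computed from nothing more than vertex, edge, and component counts. This is a toy sketch of those two Betti numbers, not the full algebraic-topology machinery Lloyd's method addresses:

```python
def betti_numbers(num_vertices, edges):
    """Betti numbers of a graph (a 1-D simplicial complex):
    b0 = number of connected components ("how are the parts connected?"),
    b1 = number of independent cycles    ("how many holes does it have?")
       = E - V + b0."""
    parent = list(range(num_vertices))

    def find(x):
        # union-find with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    for u, v in edges:
        parent[find(u)] = find(v)

    b0 = len({find(v) for v in range(num_vertices)})
    b1 = len(edges) - num_vertices + b0
    return b0, b1

# A triangle (one hole) plus a disconnected edge (no holes):
print(betti_numbers(5, [(0, 1), (1, 2), (2, 0), (3, 4)]))  # (2, 1)
```

Stretching or bending the graph changes none of these numbers, which is exactly the invariance the article describes; the hard computational problem arises when the same idea is applied to higher-dimensional structures built from large datasets.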
Using conventional computers, that approach is too demanding for all but the simplest situations. Topological analysis “represents a crucial way of getting at the significant features of the data, but it’s computationally very expensive,” Lloyd says. “This is where quantum mechanics kicks in.” The new quantum-based approach, he says, could exponentially speed up such calculations.
Lloyd offers an example to illustrate that potential speedup: If you have a dataset with 300 points, a conventional approach to analyzing all the topological features in that system would require “a computer the size of the universe,” he says. That is, it would take 2^300 (two to the 300th power) processing units — approximately the number of all the particles in the universe. In other words, the problem is simply not solvable in that way.
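The arithmetic behind that claim is easy to check for yourself: 2^300 is a 91-digit number, comfortably larger than the oft-quoted rough estimate of 10^80 particles in the observable universe. A quick sanity check in Python (the 10^80 figure is the usual order-of-magnitude estimate, not an exact count):

```python
# Classical cost of enumerating all topological features of n data points
# scales like 2^n; for n = 300 that is hopeless.
classical_units = 2 ** 300
particles_in_universe = 10 ** 80  # common rough estimate

print(classical_units > particles_in_universe)  # True
print(len(str(classical_units)))                # 91 decimal digits
```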
“That’s where our algorithm kicks in,” he says. Solving the same problem with the new system, using a quantum computer, would require just 300 quantum bits — and a device this size may be achieved in the next few years, according to Lloyd.
“Our algorithm shows that you don’t need a big quantum computer to kick some serious topological butt,” he says.
There are many important kinds of huge datasets where the quantum-topological approach could be useful, Lloyd says, for example understanding interconnections in the brain. “By applying topological analysis to datasets gleaned by electroencephalography or functional MRI, you can reveal the complex connectivity and topology of the sequences of firing neurons that underlie our thought processes,” he says.
The same approach could be used for analyzing many other kinds of information. “You could apply it to the world’s economy, or to social networks, or almost any system that involves long-range transport of goods or information,” Lloyd says. But the limits of classical computation have prevented such approaches from being applied before.
While this work is theoretical, “experimentalists have already contacted us about trying prototypes,” he says. “You could find the topology of simple structures on a very simple quantum computer. People are trying proof-of-concept experiments.”
The team also included Silvano Garnerone of the University of Waterloo in Ontario, Canada, and Paolo Zanardi of the Center for Quantum Information Science and Technology at the University of Southern California.
This time it is not really a question that has arrived in the Quantum Tunnel mailbox, but rather an observation and some cheers. Let’s take a look:
Dear Quantum Tunnel,
I have listened to all the available Quantum Tunnel podcasts in Spanish; the content is great and the news items are cool. I am interested in understanding more about quantum theory, but in my experience there is not a lot of information at my level that does not make it all sound like philosophy, or rely on a bad analogy. In most cases the explanations start by assuming that one already understands the “quantum concepts.” With those limitations, I am afraid to admit that I actually fail to see the genius of Einstein. Having said that, I refuse to believe that I am unable to understand ideas that are taught in universities. Surely some explanations do not start with “time is relative.” If thousands can understand it, so can I.
Hello again, Pablo. I agree with you that there is a lot of information out there that either assumes too much, or simply exploits the concepts for non-scientific purposes. You are right: I am sure you can understand the intricacies of quantum-mechanical phenomena, but bear in mind the words of Richard Feynman: “I think I can safely say that nobody understands quantum mechanics.” I would not expect someone to become a quantum physicist without the appropriate training, in the same way that we cannot all perform a heart transplant without studying medicine and practising. That doesn’t mean we can’t change careers though!
If you want to learn quantum theory in ten minutes, take a look at the post on the Quantum Pontiff blog from a few years back. Yes, there are ducks and turkeys, but then again they promised to explain it in 10 minutes. There are nonetheless a few things that can serve as building blocks to achieve your goal:
Learn about classical physics (yes, the courses on mechanics that you probably took in high school, exactly those). A good understanding of this will highlight those non-intuitive results from the quantum world.
Understand how to describe the behaviour of particles and of waves (I guess this is part of number 1 above, so just stressing the point!)
Make sure you are well versed in the use of probability (yes, I am saying that you need to revise some mathematics!)
If all that works, perhaps consider enrolling at your local university to read physics; you never know, you might make the next discovery in physics. Incidentally, within your revision make sure you understand that general relativity has not yet been reconciled with quantum theory. As a matter of fact, joining the two is one of the biggest challenges in physics today.
If you want to ask a question to Quantum Tunnel use the form here.
I have been meaning to write this post for a while, but for one reason or another (or rather many reasons…) I had not been able to. Right, so what has triggered this post? Well, I was having a look at the BFI website, as they usually have some very good films and events to attend, and I happened to come across some news about Film Nation’s new programme on film education. You can have a look at the website here. Did you click on the link? Have you seen the title of the news item? If not, please take a look at the screenshot I include in this post.
That is right! They describe the new programme as a “quantum leap for film education.” I believe they want to imply that the programme is a great advancement, but I am not sure that describing it as a “quantum leap” conveys what they want. It is rather sad to see this sort of misuse, and that is why I am writing this post.
So, a quantum is indeed a unit: it is the smallest amount of energy that a system can gain or lose, which actually contradicts the message they want to communicate. The term “quantum” came into use in the early 1900s through Max Planck, as part of a theory to explain the physics of the sub-atomic world. In this picture, light can be thought of as a tiny packet of energy (as well as a wave…) that can be emitted or absorbed by, for instance, an electron in an atom. A quantum leap is therefore the smallest possible change in the energy level of that electron, and one that can take place at random.
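To get a feeling for just how small a real quantum leap is, one can compute the energy carried by a single quantum of visible light using Planck's relation E = hf. A quick sketch (the 500 nm wavelength is simply an illustrative choice for green light):

```python
h = 6.62607015e-34    # Planck constant, J s
c = 2.99792458e8      # speed of light, m/s
eV = 1.602176634e-19  # joules per electronvolt

wavelength = 500e-9   # green light, 500 nm
f = c / wavelength    # frequency of the light
E = h * f             # energy of one quantum

print(E)       # ~4e-19 J: a genuine "quantum leap" is minuscule
print(E / eV)  # ~2.5 eV, the same number in electronvolts
```

Hardly the great stride the marketing department had in mind.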
So, who knows, perhaps the BFI (as well as others out there) do indeed mean to use “quantum leap” to describe these achievements… Or what do you think? Let me know, and if you have any similar terms that get misused, get in touch.
We have seen how light can be described in terms of a wave, as demonstrated by the double-slit experiment. Nonetheless, that is not the whole story. For instance, in 1888 Wilhelm Hallwachs described an experiment using a circular zinc plate mounted on an insulating stand and attached by a wire to a gold leaf electroscope, which was then charged negatively. The electroscope lost its charge very slowly. However, if the zinc plate was exposed to ultraviolet light, charge leaked away quickly. The leakage did not occur if the plate was positively charged.
By 1899, J. J. Thomson had established that the ultraviolet light caused electrons to be emitted, the same particles found in cathode rays: atoms in the cathode contained electrons, which were shaken and caused to vibrate by the oscillating electric field of the incident radiation. In 1902, Philipp Lenard described how the energy of the emitted photoelectrons varied with the intensity of the light: doubling the light intensity doubled the number of electrons emitted, but did not affect the energies of the emitted electrons. The more powerful oscillating field ejected more electrons, but the maximum individual energy of the ejected electrons was the same as for the weaker field.
In 1905 Einstein proposed a way to explain these observations: he assumed that the incoming radiation should be thought of as quanta of energy hf, with f the frequency and h Planck’s constant. In photoemission, one such quantum is absorbed by one electron. If the electron is some distance into the material of the cathode, some energy will be lost as it moves towards the surface. There is always some electrostatic cost as the electron leaves the surface; this is usually called the work function, W. The most energetic electrons emitted will be those very close to the surface, and they will leave the cathode with kinetic energy hf − W. This explanation was successful and validated the interpretation of the behaviour of light as particles. In 1921, Einstein was awarded the Nobel Prize in Physics “for his services to Theoretical Physics, and especially for his discovery of the law of the photoelectric effect”.
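Einstein's relation can be put to work numerically. The sketch below assumes a work function of about 4.3 eV for zinc (a typical tabulated value, used here only for illustration) and reproduces Hallwachs's observation: ultraviolet light ejects electrons from the plate, while visible light does not:

```python
h = 6.62607015e-34    # Planck constant, J s
c = 2.99792458e8      # speed of light, m/s
eV = 1.602176634e-19  # joules per electronvolt

def max_kinetic_energy_eV(wavelength_m, work_function_eV):
    """Einstein's photoelectric relation, E_max = hf - W.
    Returns the maximum kinetic energy of the ejected electrons in eV,
    or None if the photon energy is below the work function (no emission)."""
    photon_eV = h * c / wavelength_m / eV
    excess = photon_eV - work_function_eV
    return excess if excess > 0 else None

W_zinc = 4.3  # assumed work function of zinc, eV

print(max_kinetic_energy_eV(250e-9, W_zinc))  # UV: ~0.66 eV, electrons emitted
print(max_kinetic_energy_eV(500e-9, W_zinc))  # visible: None, no emission
```

Notice that, as Lenard found, making the visible light brighter would change nothing: intensity controls how many quanta arrive, not the energy each one carries.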
One very prominent application of the photoelectric effect is solar energy produced by photovoltaic cells. These are made of semiconducting materials which produce electricity when exposed to sunlight.