A new Bose-Einstein condensate

Originally published here.

Although Bose-Einstein condensation has been observed in several systems, the limits of the phenomenon need to be pushed further: to faster timescales, higher temperatures, and smaller sizes. The easier these condensates become to create, the more exciting routes open up for new technological applications. New light sources, for example, could be extremely small in size and allow fast information processing.

In experiments by Aalto researchers, the condensed particles were mixtures of light and electrons moving in gold nanorods arranged into a periodic array. Unlike most previous Bose-Einstein condensates created experimentally, the new condensate does not need to be cooled down to temperatures near absolute zero. Because the particles are mostly light, the condensation could be induced at room temperature.

‘The gold nanoparticle array is easy to create with modern nanofabrication methods. Near the nanorods, light can be focused into tiny volumes, even below the wavelength of light in vacuum. These features offer interesting prospects for fundamental studies and applications of the new condensate,’ says Academy Professor Päivi Törmä.

The main hurdle in acquiring proof of the new kind of condensate is that it comes into being extremely quickly. ‘According to our theoretical calculations, the condensate forms in only a picosecond,’ says doctoral student Antti Moilanen. ‘How could we ever verify the existence of something that only lasts one trillionth of a second?’

Turning distance into time

A key idea was to initiate the condensation process with a kick so that the particles forming the condensate would start to move.

‘As the condensate takes form, it will emit light throughout the gold nanorod array. By observing the light, we can monitor how the condensation proceeds in time. This is how we can turn distance into time,’ explains staff scientist Tommi Hakala.

The light that the condensate emits is similar to laser light. ‘We can alter the distance between each nanorod to control whether Bose-Einstein condensation or the formation of ordinary laser light occurs. The two are closely related phenomena, and being able to distinguish between them is crucial for fundamental research. They also promise different kinds of technological applications,’ explains Professor Törmä.

Both lasing and Bose-Einstein condensation provide bright beams, but the coherences of the light they offer have different properties. These, in turn, affect the ways the light can be tuned to meet the requirements of a specific application. The new condensate can produce light pulses that are extremely short and may offer faster speeds for information processing and imaging applications. Academy Professor Törmä has already obtained a Proof of Concept grant from the European Research Council to explore such prospects.

Materials provided by Aalto University. Note: Content may be edited for style and length.

Journal Reference:

1. Tommi K. Hakala, Antti J. Moilanen, Aaro I. Väkeväinen, Rui Guo, Jani-Petri Martikainen, Konstantinos S. Daskalakis, Heikki T. Rekola, Aleksi Julku, Päivi Törmä. Bose–Einstein condensation in a plasmonic lattice. Nature Physics, 2018; DOI: 10.1038/s41567-018-0109-9

New quantum method generates really random numbers

Originally appeared in ScienceDaily, 11 April 2018.

Researchers at the National Institute of Standards and Technology (NIST) have developed a method for generating numbers guaranteed to be random by quantum mechanics. Described in the April 12 issue of Nature, the experimental technique surpasses all previous methods for ensuring the unpredictability of its random numbers and may enhance security and trust in cryptographic systems.

The new NIST method generates digital bits (1s and 0s) with photons, or particles of light, using data generated in an improved version of a landmark 2015 NIST physics experiment. That experiment showed conclusively that what Einstein derided as “spooky action at a distance” is real. In the new work, researchers process the spooky output to certify and quantify the randomness available in the data and generate a string of much more random bits.

Random numbers are used hundreds of billions of times a day to encrypt data in electronic networks. But these numbers are not certifiably random in an absolute sense. That’s because they are generated by software formulas or physical devices whose supposedly random output could be undermined by factors such as predictable sources of noise. Running statistical tests can help, but no statistical test on the output alone can absolutely guarantee that the output was unpredictable, especially if an adversary has tampered with the device.

“It’s hard to guarantee that a given classical source is really unpredictable,” NIST mathematician Peter Bierhorst said. “Our quantum source and protocol is like a fail-safe. We’re sure that no one can predict our numbers.”

“Something like a coin flip may seem random, but its outcome could be predicted if one could see the exact path of the coin as it tumbles. Quantum randomness, on the other hand, is real randomness. We’re very sure we’re seeing quantum randomness because only a quantum system could produce these statistical correlations between our measurement choices and outcomes.”

The new quantum-based method is part of an ongoing effort to enhance NIST’s public randomness beacon, which broadcasts random bits for applications such as secure multiparty computation. The NIST beacon currently relies on commercial sources.

Quantum mechanics provides a superior source of randomness because measurements of some quantum particles (those in a “superposition” of both 0 and 1 at the same time) have fundamentally unpredictable results. Researchers can easily measure a quantum system. But it’s hard to prove that measurements are being made of a quantum system and not a classical system in disguise.
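As a textbook illustration (my addition, not part of the NIST result): a qubit prepared in an equal superposition yields either outcome with probability one half, regardless of what anyone knows beforehand,

```latex
|\psi\rangle = \tfrac{1}{\sqrt{2}}\left(|0\rangle + |1\rangle\right), \qquad
P(0) = P(1) = \left|\tfrac{1}{\sqrt{2}}\right|^{2} = \tfrac{1}{2}.
```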

In NIST’s experiment, that proof comes from observing the spooky quantum correlations between pairs of distant photons while closing the “loopholes” that might otherwise allow non-random bits to appear to be random. For example, the two measurement stations are positioned too far apart to allow hidden communications between them; by the laws of physics any such exchanges would be limited to the speed of light.

Random numbers are generated in two steps. First, the spooky action experiment generates a long string of bits through a “Bell test,” in which researchers measure correlations between the properties of the pairs of photons. The timing of the measurements ensures that the correlations cannot be explained by classical processes such as pre-existing conditions or exchanges of information at, or slower than, the speed of light. Statistical tests of the correlations demonstrate that quantum mechanics is at work, and these data allow the researchers to quantify the amount of randomness present in the long string of bits.
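To give a flavour of what ‘statistical tests of the correlations’ involve, here is a minimal sketch of the CHSH combination that a Bell test evaluates. The correlation function and measurement angles below are the textbook choices for an entangled singlet pair, used purely for illustration; they are not the actual NIST settings or analysis code.

```python
import numpy as np

# Minimal CHSH sketch (illustrative only; not the NIST analysis).
# For an entangled singlet pair, quantum mechanics predicts the
# correlation E(a, b) = -cos(a - b) between measurement angles a and b.
def correlation(a, b):
    return -np.cos(a - b)

a1, a2 = 0.0, np.pi / 2            # Alice's two possible settings
b1, b2 = np.pi / 4, 3 * np.pi / 4  # Bob's two possible settings

S = (correlation(a1, b1) - correlation(a1, b2)
     + correlation(a2, b1) + correlation(a2, b2))

# Any classical (locally causal) model is bounded by |S| <= 2;
# the quantum prediction for these settings is 2*sqrt(2) ~ 2.83.
print(abs(S))
```

Measuring a value above 2 in the recorded data, with the loopholes closed, is what rules out every classical explanation for the bits.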

That randomness may be spread very thin throughout the long string of bits. For example, nearly every bit might be 0 with only a few being 1. To obtain a short, uniform string with concentrated randomness such that each bit has a 50/50 chance of being 0 or 1, a second step called “extraction” is performed. NIST researchers developed software to process the Bell test data into a shorter string of bits that are nearly uniform; that is, with 0s and 1s equally likely. The full process requires the input of two independent strings of random bits to select measurement settings for the Bell tests and to “seed” the software to help extract the randomness from the original data. NIST researchers used a conventional random number generator to generate these input strings.
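As a rough sketch of what the seeded ‘extraction’ step does (a toy example; NIST’s actual software is more sophisticated and comes with formal security proofs), a Toeplitz-hashing extractor turns a long, weakly random bit string plus an independent random seed into a short, nearly uniform output:

```python
import numpy as np

def toeplitz_extract(raw_bits, seed_bits, n_out):
    """Toy seeded extractor: multiply the raw bits by a Toeplitz matrix
    defined by the seed, working over GF(2). Illustrative only."""
    n_in = len(raw_bits)
    assert len(seed_bits) == n_in + n_out - 1, "seed defines the Toeplitz matrix"
    out = np.zeros(n_out, dtype=np.uint8)
    for i in range(n_out):
        row = seed_bits[i:i + n_in][::-1]    # i-th row of the Toeplitz matrix
        out[i] = (row & raw_bits).sum() % 2  # parity of the masked raw bits
    return out

rng = np.random.default_rng(0)
raw = (rng.random(10_000) < 0.01).astype(np.uint8)   # weak source: almost all 0s
seed = rng.integers(0, 2, size=10_000 + 128 - 1, dtype=np.uint8)
print(toeplitz_extract(raw, seed, 128))              # 128 near-uniform bits
```

In the real protocol, the number of bits that can safely be extracted is set by the amount of randomness (min-entropy) certified by the Bell test, which is where the quantum guarantee enters.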

From 55,110,210 trials of the Bell test, each of which produces two bits, researchers extracted 1,024 bits certified to be uniform to within one trillionth of 1 percent.

“A perfect coin toss would be uniform, and we made 1,024 bits almost perfectly uniform, each extremely close to equally likely to be 0 or 1,” Bierhorst said.

Other researchers have previously used Bell tests to generate random numbers, but the NIST method is the first to use a loophole-free Bell test and to process the resulting data through extraction. Extractors and seeds are already used in classical random number generators; in fact, random seeds are essential in computer security and can be used as encryption keys.

In the new NIST method, the final numbers are certified to be random even if the measurement settings and seed are publicly known; the only requirement is that the Bell test experiment be physically isolated from customers and hackers. “The idea is you get something better out (private randomness) than what you put in (public randomness),” Bierhorst said.

Story Source:

Materials provided by the National Institute of Standards and Technology (NIST). Note: Content may be edited for style and length.


Journal Reference:

  1. Peter Bierhorst, Emanuel Knill, Scott Glancy, Yanbao Zhang, Alan Mink, Stephen Jordan, Andrea Rommal, Yi-Kai Liu, Bradley Christensen, Sae Woo Nam, Martin J. Stevens, Lynden K. Shalm. Experimentally Generated Randomness Certified by the Impossibility of Superluminal Signals. Nature, 2018; DOI: 10.1038/s41586-018-0019-0

Nobel Prize in Physics 2016: Exotic States of Matter

Yesterday the 2016 Nobel Prize in Physics was announced. I immediately got a few tweets asking for more information about what these “exotic” states of matter are and for some explanation of them… Well, in short, the prize was awarded for theoretical discoveries that help scientists understand unusual properties of materials, such as superconductivity and superfluidity, that arise at low temperatures.

Physics Nobel 2016

The prize was awarded jointly to David J. Thouless of the University of Washington in Seattle, F. Duncan M. Haldane of Princeton University in New Jersey, and J. Michael Kosterlitz of Brown University in Rhode Island. The citation from the Swedish Academy reads: “for theoretical discoveries of topological phase transitions and topological phases of matter.”

“Topo…what?” – I hear you cry… well let us start at the beginning…

Thouless, Haldane and Kosterlitz work in a field of physics known as Condensed Matter Physics, which is concerned with the physical properties of “condensed” materials such as solids and liquids. You may not know it, but results from research in condensed matter physics are what allow you to store so much data on your computer’s hard drive: the discovery of giant magnetoresistance made it possible.

The discoveries that the Nobel Committee is highlighting with the prize provide a better understanding of phases of matter such as superconductors, superfluids and thin magnetic films, and they are now guiding the quest for next-generation materials for electronics, quantum computing and more. The laureates developed mathematical models that describe the topological properties of materials and relate them to phenomena such as superconductivity, superfluidity and other peculiar magnetic properties.

Once again that word: “topology”…

So, we know that all matter is formed by atoms. Nonetheless, matter can have different properties and appear in different forms, such as solid, liquid, superfluid, magnet, etc. These various forms of matter are often called states of matter or phases. According to condensed matter physics, the different properties of materials originate from the different ways in which the atoms are organised in them. Those different organisations of the atoms (or other particles) are formally called the orders in the materials. Topological order is a type of order in zero-temperature phases of matter (also known as quantum matter). In general, topology is the study of geometrical properties and spatial relations that are unaffected by continuous changes in the shape or size of figures. In our case, we are talking about properties of matter that remain unchanged when the object is flattened or expanded.

Although research originally focused on topological properties in 1-D and 2-D materials, researchers have since discovered them in 3-D materials as well. These results are particularly important because they enable us to understand “exotic” phenomena such as superconductivity, the property of matter that lets electrons travel through materials with zero resistance, and superfluidity, which lets fluids flow with zero loss of kinetic energy. Currently, one of the most active research topics in the area is the study of topological insulators, superconductors and metals.

Here is a report from Physics Today about the Nobel Prize announcement:

Thouless, Haldane, and Kosterlitz share 2016 Nobel Prize in Physics

David Thouless, Duncan Haldane, and Michael Kosterlitz are to be awarded the 2016 Nobel Prize in Physics for their work on topological phases and phase transitions, the Royal Swedish Academy of Sciences announced on Tuesday. Thouless, of the University of Washington in Seattle, will receive half the 8 million Swedish krona (roughly $925 000) prize; Haldane, of Princeton University, and Kosterlitz, of Brown University, will split the other half.

This year’s laureates used the mathematical branch of topology to make revolutionary contributions to their field of condensed-matter physics. In 1972 Thouless and Kosterlitz identified a phase transition that opened up two-dimensional systems as a playground for observing superconductivity, superfluidity, and other exotic phenomena. A decade later Haldane showed that topology is important in considering the properties of 1D chains of magnetic atoms. Then in the 1980s Thouless and Haldane demonstrated that the unusual behavior exhibited in the quantum Hall effect can emerge without a magnetic field.

From early on it was clear that the laureates’ work would have important implications for condensed-matter theory. Today experimenters are studying 2D superconductors and topological insulators, which are insulating in the bulk yet channel spin-polarized currents on their surfaces without resistance (see Physics Today, January 2010, page 33). The research could lead to improved electronics, robust qubits for quantum computers, and even an improved understanding of the standard model of particle physics.

Vortices and the KT transition

When Thouless and Kosterlitz first collaborated in the early 1970s, the conventional wisdom was that thermal fluctuations in 2D materials precluded the emergence of ordered phases such as superconductivity. The researchers, then at the University of Birmingham in England, dismantled that argument by investigating the interactions within a 2D lattice.

Thouless and Kosterlitz considered an idealized array of spins that is cooled to nearly absolute zero. At first the system lacks enough thermal energy to create defects, which in the model take the form of localized swirling vortices. Raising the temperature spurs the development of tightly bound pairs of oppositely rotating vortices. The coherence of the entire system depends logarithmically on the separation between vortices. As the temperature rises further, more vortex pairs pop up, and the separation between partners grows.

The two scientists’ major insight came when they realized they could model the clockwise and counterclockwise vortices as positive and negative electric charges. The more pairs that form, the more interactions are disturbed by narrowly spaced vortices sitting between widely spaced ones. “Eventually, the whole thing will fly apart and you’ll get spontaneous ‘ionization,’ ” Thouless told Physics Today in 2006.

That analog to ionization, in which the coherence suddenly falls off in an exponential rather than logarithmic dependence with distance, is known as the Kosterlitz–Thouless (KT) transition. (The late Russian physicist Vadim Berezinskii made a similar observation in 1970, which led some researchers to add a “B” to the transition name, but the Nobel committee notes that Berezinskii did not theorize the existence of the transition at finite temperature.)
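As a textbook aside (a standard sketch, not taken from the report): for an idealised 2D array of spins with coupling J, a single free vortex in a system of linear size L, with core size a, costs energy but gains entropy from the many places it can sit,

```latex
E \simeq \pi J \ln\frac{L}{a}, \qquad
S \simeq 2 k_{B} \ln\frac{L}{a}, \qquad
F = E - TS \simeq \left(\pi J - 2 k_{B} T\right)\ln\frac{L}{a},
```

so free vortices proliferate, and the pairs unbind, once k_B T exceeds roughly πJ/2; below that temperature the vortices stay bound and the quasi-ordered phase survives.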

Unlike some other phase transitions, such as the onset of ferromagnetism, no symmetry is broken. The sudden shift between order and disorder also demonstrates that superconductivity could indeed subsist in the 2D realm at temperatures below that of the KT transition. Experimenters observed the KT transition in superfluid helium-4 in 1978 and in superconducting thin films in 1981. More recently, the transition was reproduced in a flattened cloud of ultracold rubidium atoms (see Physics Today, August 2006, page 17).

A topological answer for the quantum Hall effect

Thouless then turned his attention to the quantum foundations of conductors and insulators. In 1980 German physicist Klaus von Klitzing had applied a strong magnetic field to a thin conducting film sandwiched between semiconductors. The electrons traveling within the film separated into well-organized opposing lanes of traffic along the edges (see Physics Today, June 1981, page 17). Von Klitzing had discovered the quantum Hall effect, for which he would earn the Nobel five years later.

Crucially, von Klitzing found that adjusting the strength of the magnetic field changed the conductance of his thin film only in fixed steps; the conductance was always an integer multiple of a fixed value, e²/h. That discovery proved the key for Thouless to relate the quantum Hall effect to topology, which is also based on integer steps—objects are often distinguished from each other topologically by the number of holes or nodes they possess, which is always an integer. In 1983 Thouless proposed that the electrons in von Klitzing’s experiment had formed a topological quantum fluid; the electrons’ collective behavior in that fluid, as measured by conductance, must vary in steps.
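In the usual notation (my addition; the report does not spell it out), the quantized Hall conductance reads

```latex
\sigma_{xy} = \nu\,\frac{e^{2}}{h}, \qquad \nu \in \mathbb{Z}, \qquad
\frac{e^{2}}{h} \approx 3.87 \times 10^{-5}\ \mathrm{S},
```

where the integer ν is the topological invariant behind the fixed steps that von Klitzing measured.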

Not only did Thouless’s work explain the integer nature of the quantum Hall effect, but it also pointed the way to reproducing the phenomenon’s exotic behavior under less extreme conditions. In 1988 Haldane proposed a means for electrons to form a topological quantum fluid in the absence of a magnetic field. Twenty-five years later, researchers reported such behavior in chromium-doped (Bi,Sb)₂Te₃, the first observation of what is known as the quantum anomalous Hall effect.

Exploring topological materials

Around 2005, physicists began exploring the possibility of realizing topological insulators, a large family of new topological phases of matter that would exhibit the best of multiple worlds: They would robustly conduct electricity on their edges or surfaces without a magnetic field and as a bonus would divide electron traffic into lanes determined by spin. Since then experimenters have identified topological insulators in two and three dimensions, which may lead to improved electronics. Other physicists have created topological insulators that conduct sound or light, rather than electrons, on their surfaces (see Physics Today, May 2014, page 68).

Haldane’s work in the 1980s on the fractional quantum Hall effect was among the theoretical building blocks for proposals to use topologically protected excitations to build a fault-tolerant quantum computer (see Physics Today, October 2005, page 21). And his 1982 paper on magnetic chains serves as the foundation for efforts to create topologically protected excitations that behave like Majorana fermions, which are their own antiparticle. The work could lead to robust qubits for preserving the coherence of quantum information and perhaps provide particle physicists with clues as to the properties of fundamental Majorana fermions, which may or may not exist in nature.

—Andrew Grant

 

Quantum algorithms for topological and geometric analysis of data

Story Source:

The above post is reprinted from materials provided by Massachusetts Institute of Technology. The original item was written by David L. Chandler. Note: Materials may be edited for content and length.

Quantum Data Algos

From gene mapping to space exploration, humanity continues to generate ever-larger sets of data — far more information than people can actually process, manage, or understand.

Machine learning systems can help researchers deal with this ever-growing flood of information. Some of the most powerful of these analytical tools are based on a strange branch of geometry called topology, which deals with properties that stay the same even when something is bent and stretched every which way.

Such topological systems are especially useful for analyzing the connections in complex networks, such as the internal wiring of the brain, the U.S. power grid, or the global interconnections of the Internet. But even with the most powerful modern supercomputers, such problems remain daunting and impractical to solve. Now, a new approach that would use quantum computers to streamline these problems has been developed by researchers at MIT, the University of Waterloo, and the University of Southern California.

The team describes their theoretical proposal this week in the journal Nature Communications. Seth Lloyd, the paper’s lead author and the Nam P. Suh Professor of Mechanical Engineering, explains that algebraic topology is key to the new method. This approach, he says, helps to reduce the impact of the inevitable distortions that arise every time someone collects data about the real world.

In a topological description, basic features of the data (How many holes does it have? How are the different parts connected?) are considered the same no matter how much they are stretched, compressed, or distorted. Lloyd explains that it is often these fundamental topological attributes “that are important in trying to reconstruct the underlying patterns in the real world that the data are supposed to represent.”

It doesn’t matter what kind of dataset is being analyzed, he says. The topological approach to looking for connections and holes “works whether it’s an actual physical hole, or the data represents a logical argument and there’s a hole in the argument. This will find both kinds of holes.”
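To make the idea of counting holes and connections concrete, here is a minimal classical sketch (my own illustration, unrelated to the quantum algorithm itself) that computes the simplest topological feature of a point cloud: the number of connected components, Betti zero, at a chosen length scale. It is the higher-dimensional analogues of this count, tracked across every scale, that become exponentially expensive classically.

```python
import numpy as np

def betti_0(points, scale):
    """Count connected components when points closer than `scale` are linked."""
    n = len(points)
    parent = list(range(n))

    def find(i):                      # union-find with path compression
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    for i in range(n):
        for j in range(i + 1, n):
            if np.linalg.norm(points[i] - points[j]) < scale:
                parent[find(i)] = find(j)     # merge the two clusters

    return len({find(i) for i in range(n)})

pts = np.array([[0, 0], [0.1, 0], [5, 5], [5.1, 5.0]])
print(betti_0(pts, scale=1.0))   # -> 2 components at this scale
```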

Using conventional computers, that approach is too demanding for all but the simplest situations. Topological analysis “represents a crucial way of getting at the significant features of the data, but it’s computationally very expensive,” Lloyd says. “This is where quantum mechanics kicks in.” The new quantum-based approach, he says, could exponentially speed up such calculations.

Lloyd offers an example to illustrate that potential speedup: If you have a dataset with 300 points, a conventional approach to analyzing all the topological features in that system would require “a computer the size of the universe,” he says. That is, it would take 2^300 (two to the 300th power) processing units — approximately the number of all the particles in the universe. In other words, the problem is simply not solvable in that way.

“That’s where our algorithm kicks in,” he says. Solving the same problem with the new system, using a quantum computer, would require just 300 quantum bits — and a device this size may be achieved in the next few years, according to Lloyd.

“Our algorithm shows that you don’t need a big quantum computer to kick some serious topological butt,” he says.

There are many important kinds of huge datasets where the quantum-topological approach could be useful, Lloyd says, for example understanding interconnections in the brain. “By applying topological analysis to datasets gleaned by electroencephalography or functional MRI, you can reveal the complex connectivity and topology of the sequences of firing neurons that underlie our thought processes,” he says.

The same approach could be used for analyzing many other kinds of information. “You could apply it to the world’s economy, or to social networks, or almost any system that involves long-range transport of goods or information,” Lloyd says. But the limits of classical computation have prevented such approaches from being applied before.

While this work is theoretical, “experimentalists have already contacted us about trying prototypes,” he says. “You could find the topology of simple structures on a very simple quantum computer. People are trying proof-of-concept experiments.”

The team also included Silvano Garnerone of the University of Waterloo in Ontario, Canada, and Paolo Zanardi of the Center for Quantum Information Science and Technology at the University of Southern California.

Quantum Tunnel Answers – Interest in Quantum Physics

This time it is not really a question that has arrived in the Quantum Tunnel mailbox; rather, it is an observation and some cheers. Let’s take a look:

Dear Quantum Tunnel,

I have listened to all the available Quantum Tunnel podcasts in Spanish; the content is great and the news items are cool. I am interested in understanding more about quantum theory, and in my experience there is not a lot of information at my level that does not make it all sound like philosophy or a bad analogy. In most cases the explanations start by assuming that one already understands the “quantum concepts”. With those limitations, I am afraid to admit that I actually fail to see the genius of Einstein. Having said that, I refuse to think that I am unable to understand ideas that are taught in universities. Surely some explanations do not start with “time is relative”. If thousands can understand it, so can I.

Pablo Mitlanian

Hello again Pablo. I agree with you that there is a lot of information out there that either assumes too much or simply exploits the concepts for non-scientific purposes. You are right: I am sure you can understand the intricacies of quantum-mechanical phenomena, but bear in mind the words of Richard Feynman: “I think I can safely say that nobody understands quantum mechanics”. I would not expect someone to become a quantum physicist without the appropriate training, in the same way that we cannot all perform a heart transplant without studying medicine and practising. That doesn’t mean we can’t change careers though!

If you want to learn quantum theory in ten minutes, take a look at the blog post that the Quantum Pontiff blog posted a few years back. Yes, there are ducks and turkeys, but then again they promised to explain in 10 minutes. There are nonetheless a few things that can serve as building blocks to achieve your goal:

  1. Learn about classical physics (yes, the courses on mechanics that you probably took in high school, exactly those). A good understanding of this will highlight those non-intuitive results from the quantum world.
  2. Understand how to describe the behaviour of particles and of waves (I guess this is part of number 1 above, so just stressing the point!)
  3. Make sure you are well versed in the use of probability (yes, I am saying that you need to revise some mathematics!)
  4. Be patient!

If all that works, perhaps consider enrolling at your local university to read physics; you never know, you might make the next discovery in physics. Incidentally, as part of your revision make sure you understand that general relativity is a theory quite separate from quantum theory; as a matter of fact, joining the two is one of the biggest challenges in physics today.

If you want to ask a question to Quantum Tunnel use the form here.

Quantum Leap… Are you sure you mean that?

I have been meaning to write this post for a while, but for one reason or another (or rather many reasons…) I had not been able to. Right, so what has triggered this post? Well, I was having a look at the BFI website, as they usually have some very good films and events to attend, and I happened to come across some news about Film Nation’s new programme on film education. You can have a look at the website here. Did you click on the link? Have you seen the title of the news item? If not, please take a look at the screenshot I include in this post.

Quantum Leap BFI

That is right! They describe the new programme as a “quantum leap for film education”. I believe they want to imply that the programme is a great advancement, but I am not sure that describing it as a “quantum leap” conveys what they want. It is rather sad to see this sort of misuse, and that is why I am writing this post.

So, a quantum is indeed a unit: it is the smallest amount of energy that a system can gain or lose, which actually contradicts the message they want to communicate. The term “quantum” started being used in the early 1900s by Max Planck as part of a theory to explain the physics of the sub-atomic world. In that picture, light could be thought of as a tiny packet of energy (as well as a wave…) that could be emitted or absorbed, for instance, by an electron in an atom. A quantum leap is therefore the smallest possible change in the energy level of that electron, and one that can take place at random.
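To put a number on just how small a quantum is (a back-of-the-envelope figure of my own, not from the BFI piece): a single quantum of visible light carries an energy of only about

```latex
E = h f \approx \left(6.63 \times 10^{-34}\ \mathrm{J\,s}\right) \times \left(5 \times 10^{14}\ \mathrm{Hz}\right)
  \approx 3.3 \times 10^{-19}\ \mathrm{J} \approx 2\ \mathrm{eV},
```

that is, less than a billionth of a billionth of a joule, hardly the giant stride the phrase is usually meant to convey.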

So, who knows, perhaps the BFI (as well as others out there) do indeed mean to use “quantum leap” to describe this achievement… Or what do you think? Let me know, and if you have any similar terms that get misused, get in touch.

Photoelectric Effect – Sci-advent – Day 21

We have seen how light can be described in terms of a wave, as demonstrated by the double-slit experiment. Nonetheless, that is not the whole story. In 1888, for instance, Wilhelm Hallwachs described an experiment using a circular zinc plate mounted on an insulating stand and attached by a wire to a gold leaf electroscope, which was then charged negatively. The electroscope lost its charge very slowly. However, if the zinc plate was exposed to ultraviolet light, the charge leaked away quickly. The leakage did not occur if the plate was positively charged.

By 1899, J. J. Thomson established that the ultraviolet light caused electrons to be emitted, the same particles found in cathode rays: atoms in the cathode contained electrons, which were shaken and caused to vibrate by the oscillating electric field of the incident radiation. In 1902, Philipp Lenard described how the energy of the emitted photoelectrons varied with the intensity of the light: doubling the light intensity doubled the number of electrons emitted, but did not affect the energies of the emitted electrons. The more powerful oscillating field ejected more electrons, but the maximum individual energy of the ejected electrons was the same as for the weaker field.

In 1905 Einstein proposed a way to explain these observations: he assumed that the incoming radiation should be thought of as quanta of energy hf, where h is Planck’s constant and f the frequency. In photoemission, one such quantum is absorbed by one electron. If the electron is some distance into the material of the cathode, some energy will be lost as it moves towards the surface. There is also an electrostatic cost as the electron leaves the surface, usually called the work function, W. The most energetic electrons emitted will be those very close to the surface, and they will leave the cathode with kinetic energy hf − W. This explanation was successful and validated the interpretation of the behaviour of light as particles. In 1921, Einstein was awarded the Nobel Prize in Physics “for his services to Theoretical Physics, and especially for his discovery of the law of the photoelectric effect”.
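Written out as an equation (the standard form of Einstein’s photoelectric law, implicit in the paragraph above), the maximum kinetic energy of the emitted electrons is

```latex
K_{\max} = h f - W,
```

so no electrons are emitted at all when hf < W, no matter how intense the light, while above that threshold the maximum energy grows with frequency rather than with intensity.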

One very prominent application of the photoelectric effect is solar energy produced by photovoltaic cells. These are made of semiconducting materials that produce electricity when exposed to sunlight.

 

 

Double Slit Experiment – Sci-advent – Day 19

Double Slit Experiment

 

The double-slit experiment is one of the most famous experiments in physics and one with great implications for our understanding of Nature. Although the experiment was originally realised with light, it can be done with any other type of wave.

Thomas Young conducted the experiment in the early 1800s. The aim was to let light pass through a pair of slits in an opaque screen. Each slit diffracts the light and thus acts as an individual light source. When a single slit was open, the light hit the screen with maximum intensity in the centre, fading away from it. But when two slits were open, the light produced an interference pattern on the screen – a result that would not be expected if light consisted strictly of particles. Although the experiment favours the wave-like description of light, that is not the whole story. This interpretation is at odds with phenomena in which light behaves as if it were composed of discrete particles, such as the photoelectric effect. Light exhibits properties of both waves and particles, giving rise to the concept of wave-particle duality used in quantum mechanics.
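For reference, the bright fringes in Young’s arrangement appear where the path difference between the two slits is a whole number of wavelengths (the standard condition from any optics text, not stated explicitly above):

```latex
d \sin\theta = m\lambda, \qquad m = 0, \pm 1, \pm 2, \ldots,
```

where d is the slit separation, λ the wavelength and θ the angle from the central axis to the m-th bright fringe.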