Let there be light: Florence Nightingale

This year, 2020, the word Nightingale has acquired new connotations. It is no longer just the name of a passerine bird with a beautiful and powerful song; it is also the name that NHS England has given to the temporary hospitals set up for the COVID-19 pandemic. In normal circumstances it would indeed be a very good name for a hospital, but given the circumstances it becomes more poignant, all the more so considering that this year, 2020, is the bicentenary of Florence Nightingale’s birth.

Florence Nightingale was born on 12th May 1820 in Florence, Italy (hence the name!) and became a social reformer, statistician, and the founder of modern nursing. In 1858 she became the first female member of the Royal Statistical Society, and in 1874 an honorary member of the American Statistical Association.

With the power of data, Nightingale was able to save lives and change policy. Her analysis of data from the Crimean War was compelling and persuasive in its simplicity. It allowed her and her team to pay attention to time – tracking admissions to hospital and, crucially, deaths – on a month-by-month basis. We must remember that the statistical tests we know today were not yet established tools, and the workhorse of statistics, regression, was still decades in the future. The data analysis, presented in columns and rows, was supported by powerful graphics that many of us still admire today.

In 2014 I had an opportunity to admire her Nightingale Roses, or to use their formal name, polar area charts, in the exhibition Beautiful Science: Picturing Data, Inspiring Insight at the British Library.

Florence Nightingale’s “rose diagram”, showing the Causes of Mortality in the Army in the East, 1858. Photograph: British Library

These and other charts were used in the report that she later published in 1858 under the title “Notes on Matters Affecting the Health, Efficiency, and Hospital Administration of the British Army”. The report included charts of deaths by barometric pressure and temperature, showing that deaths were higher in hotter months than in cooler ones. In the polar area charts shown above, Nightingale presents the decrease in death rates that had been achieved. Let’s read it from her own hand; here is the note accompanying the chart above:

The areas of the blue, red & black wedges are each measured from the centre as the common vertex.

The blue wedges measured from the centre of the circle represent area for area the deaths from Preventible or Mitigable Zymotic diseases, the red wedges measured from the centre the deaths from wounds, & the black wedges measured from the centre the deaths from all other causes.

The black line across the red triangle in Nov. 1854 marks the boundary of the deaths from all other causes during the month.

In October 1854, & April 1855, the black area coincides with the red, in January & February 1855, the blue area coincides with the black.

The entire areas may be compared by following the blue, the red & the black lines enclosing them.
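
To make the construction concrete, here is a minimal sketch of a polar area (“rose”) chart in Python with matplotlib. The monthly figures are invented for illustration – they are not Nightingale’s data – and the key detail is that each wedge’s area, not its radius, encodes the count, so the radius must be the square root of the value.

```python
import numpy as np
import matplotlib.pyplot as plt

# Illustrative monthly death counts (NOT Nightingale's actual figures).
months = ["Apr", "May", "Jun", "Jul", "Aug", "Sep",
          "Oct", "Nov", "Dec", "Jan", "Feb", "Mar"]
zymotic = np.array([110, 95, 140, 230, 320, 410, 510, 470, 520, 380, 240, 160])
wounds  = np.array([  5,  9,  12,  25,  30,  80, 120, 100,  50,  40,  20,  10])
other   = np.array([ 30,  25,  35,  50,  70,  90, 110, 100,  95,  80,  60,  40])

theta = np.linspace(0.0, 2 * np.pi, len(months), endpoint=False)
width = 2 * np.pi / len(months)

ax = plt.subplot(projection="polar")
ax.set_theta_zero_location("N")   # start at the top, like a clock face
ax.set_theta_direction(-1)        # run clockwise

# Each wedge is measured from the centre, and its AREA encodes the count,
# so the radius is the square root of the value.
for values, colour, label in [(zymotic, "tab:blue", "zymotic diseases"),
                              (wounds, "tab:red", "wounds"),
                              (other, "black", "all other causes")]:
    ax.bar(theta, np.sqrt(values), width=width, bottom=0.0,
           color=colour, alpha=0.5, label=label)

ax.set_xticks(theta)
ax.set_xticklabels(months)
ax.set_yticklabels([])
ax.legend(loc="lower right", bbox_to_anchor=(1.3, 0.0))
plt.show()
```

Plotting the square root is what keeps the chart honest: doubling a count doubles a wedge’s area rather than quadrupling it, exactly the “area for area” comparison Nightingale describes in her note.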

Nightingale recognised that soldiers were dying from causes other than their wounds: malnutrition, poor sanitation, and lack of activity. Her aim was to improve the conditions of wounded soldiers and improve their chances of survival. This evidence later helped focus attention on the importance of patient welfare.

Once the war was over, Florence Nightingale returned home, but her quest did not end there. She continued her work to improve conditions in hospitals. She became a star in her own time, and with time the legend of “The Lady with the Lamp” solidified in the national and international consciousness. You may have heard of her in the 1857 poem by Henry Wadsworth Longfellow called “Santa Filomena”:

Lo! in that house of misery
A lady with a lamp I see
Pass through the glimmering gloom,
And flit from room to room

Today, Nightingale’s lamp continues to bring hope: not just to those working and being treated in the NHS Nightingale hospitals, but to all of us, through the metaphorical light of rational optimism. Let there be light.

Screencasting with Macs and PCs

The videos below were made a few years ago to support a Science Communication and Group Project module at the School of Physics, Astronomy and Mathematics at the University of Hertfordshire. The work was supported by the Institute of Physics and the HE STEM programme. I also got support from the Institute of Mathematics and its Applications. The tools are probably a bit dated now, but I hope the principles still help some students trying to get their work seen.

Students were asked to prepare a short video to present the results of their project and share it with the world. To support them, the videos below were prepared.

Students were also encouraged to prepare technical documentation, and the videos on using LaTeX and structuring documents with LaTeX were very useful.

Screencasting with a Mac

In this video we will see some tools to capture video from your screen using a Mac. The tools are QuickTime Player, MPEG Streamclip and iMovie.

Screencasting with a PC

In this video we will see some tools to capture video from your screen using a PC. The tools are CamStudio and Freemake Video Converter.
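
The tools above may be hard to find today. As a rough modern alternative (my suggestion, not part of the original module), screen capture can also be scripted with the free ffmpeg tool. A minimal Python sketch, assuming ffmpeg is installed and on the PATH, and that the device names below match your machine:

```python
import platform
import subprocess

def record_screen(outfile: str = "screencast.mp4", seconds: int = 30) -> None:
    """Record the primary screen for a fixed time using ffmpeg.

    Device identifiers are illustrative and vary per machine; on a Mac,
    list capture devices with: ffmpeg -f avfoundation -list_devices true -i ""
    """
    system = platform.system()
    if system == "Darwin":      # macOS screen capture via AVFoundation
        grab = ["-f", "avfoundation", "-framerate", "30", "-i", "1:0"]
    elif system == "Windows":   # Windows desktop capture via gdigrab
        grab = ["-f", "gdigrab", "-framerate", "30", "-i", "desktop"]
    else:                       # Linux/X11 capture via x11grab
        grab = ["-f", "x11grab", "-framerate", "30", "-i", ":0.0"]

    # -y overwrites the output file; -t limits the recording length.
    subprocess.run(["ffmpeg", "-y", *grab, "-t", str(seconds), outfile], check=True)

record_screen(seconds=10)
```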

Uploading a Video to Vimeo

In this tutorial we will see how to set up an account on Vimeo and how to upload your screencast. You will also be able to send a link to your video to your friends and other people.

2019 Nobel Prize in Chemistry

From left: John Goodenough, M. Stanley Whittingham, and Akira Yoshino. Credits: University of Texas at Austin; Binghamton University; the Japan Prize Foundation

Originally published in Physics Today by Alex Lopatka

John Goodenough, M. Stanley Whittingham, and Akira Yoshino will receive the 2019 Nobel Prize in Chemistry for developing lithium-ion batteries, the Royal Swedish Academy of Sciences announced on Wednesday. Goodenough (University of Texas at Austin), Whittingham (Binghamton University in New York), and Yoshino (Asahi Kasei Corp and Meijo University in Japan) will each receive one-third of the 9 million Swedish krona (roughly $900 000) prize. Their research not only allowed for the commercial-scale manufacture of lithium-ion batteries, but it also has supercharged research into all sorts of new technology, including wind and solar power.

At the heart of any battery is a redox reaction. During the discharge phase, the oxidation reaction at the anode frees ions to travel through a liquid electrolyte solution to the cathode, which is undergoing a reduction reaction. Meanwhile, electrons hum through a circuit to power a connected electronic device. For the recharge phase, the redox processes reverse, and the ions go back to the anode so that it’s ready for another discharge cycle.

The now ubiquitous lithium-ion battery that powers smartphones, electric vehicles, and more got its start shortly before the 1973 oil crisis. The American Energy Commission asked Goodenough, who was then at MIT’s Lincoln Laboratory, to evaluate a project by battery scientists at the Ford Motor Company. They were looking into the feasibility of molten-salt batteries, which used sodium and sulfur, to replace the standard but outdated lead–acid batteries developed about a century earlier. But by the late 1960s, it became clear that high operating temperatures and corrosion problems made those batteries impractical (see the article by Matthew Eisler, Physics Today, September 2016, page 30).

Whittingham, then a research scientist at Exxon, instead considered low-temperature, high-energy batteries that could not only power electric vehicles but also store solar energy during off-peak hours. To that end he developed a battery in 1976 with a titanium disulfide cathode paired with a lithium metal anode. Lithium’s low standard reduction potential of −3.05 V makes it especially attractive for high-density and high-voltage battery cells. Critically, Whittingham’s design employed lithium ions that were intercalated—that is, inserted between layers of the TiS2 structure—and provided a means to reversibly store the lithium during the redox reactions.
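
In textbook terms (a sketch from standard electrochemistry, not taken from the article), the discharge half-reactions of Whittingham’s cell can be written as:

```latex
% Discharge half-reactions for the Li / TiS2 cell (textbook form)
\begin{aligned}
\text{anode (oxidation):}\quad   & x\,\mathrm{Li} \;\longrightarrow\; x\,\mathrm{Li}^{+} + x\,e^{-} \\
\text{cathode (reduction):}\quad & \mathrm{TiS}_{2} + x\,\mathrm{Li}^{+} + x\,e^{-} \;\longrightarrow\; \mathrm{Li}_{x}\mathrm{TiS}_{2}
\end{aligned}
```

Charging reverses both reactions, pulling the intercalated lithium back out of the TiS2 layers.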

The lithium-ion battery designed by M. Stanley Whittingham had a titanium disulfide cathode and a lithium metal anode, as illustrated here. John Goodenough and Akira Yoshino improved on the technology by replacing the cathode and anode with lithium cobalt oxide and graphite, respectively. Credit: Johan Jarnestad/The Royal Swedish Academy of Sciences

Lithium’s high reactivity, however, means that it must be isolated from air and water to avoid dangerous reactions. Whittingham solved that problem by using nonaqueous electrolyte solutions that had been carefully designed and tested by other researchers in lithium electrochemistry experiments conducted a few years earlier. The proof of concept was a substantial improvement: Whittingham’s lithium-ion battery had a higher cell potential than the lead–acid battery’s—2.5 V compared with 2 V.

Whittingham’s lithium-ion battery, though, wasn’t particularly stable. After repeated discharging and recharging, whisker-like crystals of lithium would grow on the anode. Eventually the wispy threads would grow large enough to breach the barrier separating the anode from the cathode, and the battery would short-circuit or even explode.

In 1980 Goodenough didn’t solve that problem, but he did come up with a much better material for the cathode. Along with Koichi Mizushima and colleagues at Oxford University, he found that lithium cobalt oxide could be used for the cathode. As with the TiS2, the cobalt oxide structure was tightly intercalated with lithium and could thus provide the cathode with sufficient energy density. Goodenough’s insight into the relationship between the cobalt oxide structure and voltage potential resulted in better battery performance; the voltage increased from 2.5 V to 4 V. Although the new battery was an improvement over Whittingham’s design, the system still used highly reactive lithium metal as the anode, so companies couldn’t safely manufacture the batteries on a commercial scale.

The final piece of the puzzle fell into place in 1985 when Yoshino, working at the Asahi Kasei Corp, replaced the anode material with graphite. It was stable in the required electrochemical conditions and accommodated many lithium ions in graphite’s crystal structure. With Goodenough’s lithium cobalt oxide cathode and the graphite anode, Yoshino “came up with two materials you could put together without a glove box” in a chemistry laboratory, says Clare Grey, a chemist at the University of Cambridge. Importantly, the graphite anode is lightweight and capable of being recharged hundreds of times before its performance deteriorates. Soon after, Sony teamed up with Asahi Kasei and replaced all the nickel–cadmium batteries in its consumer electronics with lithium-ion ones.
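
For the chemistry that became the commercial standard, the corresponding textbook discharge half-reactions (again my sketch, not from the article) are:

```latex
% Discharge half-reactions for the graphite / LiCoO2 cell (textbook form)
\begin{aligned}
\text{anode:}\quad   & \mathrm{Li}_{x}\mathrm{C}_{6} \;\longrightarrow\; \mathrm{C}_{6} + x\,\mathrm{Li}^{+} + x\,e^{-} \\
\text{cathode:}\quad & \mathrm{Li}_{1-x}\mathrm{CoO}_{2} + x\,\mathrm{Li}^{+} + x\,e^{-} \;\longrightarrow\; \mathrm{LiCoO}_{2}
\end{aligned}
```

On charge the lithium ions shuttle back from the cobalt oxide to the graphite, which is why this design is sometimes called a “rocking-chair” battery.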

“The story of the lithium-ion battery, like so many stories about innovation, is about contributions from many sources over many years, conditioned by changing economic and social circumstances,” says Matthew Eisler, a historian of science at the University of Strathclyde in Glasgow, UK. When the 1979 oil crisis ended, the automotive industry’s interest in batteries drained, but in 1991 lithium-ion batteries were commercialized for use in cameras, laptops, smartphones, and other handheld electronics, enabled by advancements in microprocessor technology.

To develop transportation that doesn’t rely on fossil fuels, the US Department of Energy in 2013 set an ambitious goal for its Joint Center for Energy Storage Research: Make a battery for electric vehicles that has five times the energy density and is one-fifth the cost of currently available batteries. DOE’s goal hasn’t been reached yet, but the program was renewed in September 2018, with dedicated funding of $120 million over the next five years. In a story on the center, Goodenough told Physics Today (June 2013, page 26), “People are working hard, and I believe the problem is solvable, but to get to the next stage, it’s going to take a little luck and some cleverness.”

Editor’s note: This post was updated at 7:15pm EDT from an earlier summary.

The Year in Math and Computer Science

A reblog from Quanta Magazine:

https://www.quantamagazine.org/quantas-year-in-math-and-computer-science-2018-20181221/

Several mathematicians under the age of 30, and amateur problem-solvers of all ages, made significant contributions to some of the most difficult questions in math and theoretical computer science.

Youth ruled the year in mathematics. The Fields Medals — awarded every four years to the top mathematicians no older than 40 — went out to four individuals who have left their marks all over the mathematical landscape. This year one of the awards went to Peter Scholze, who at 30 became one of the youngest ever to win. But at times in 2018, even 30 could feel old.

Two students, one in graduate school and the other just 18, in two separate discoveries, remapped the borders that separate quantum computers from ordinary classical computation. Another graduate student proved a decades-old conjecture about elliptic curves, a type of object that has fascinated mathematicians for centuries. And amateur mathematicians of all ages rose up to make significant contributions to long-dormant problems.

But perhaps the most significant sign of youth’s rise was when Scholze, not a month after the Fields Medal ceremony, made public (along with a collaborator) his map pointing to a hole in a purported proof of the famous abc conjecture. The proof, put forward six years ago by a mathematical luminary, has baffled most mathematicians ever since.

A new Bose-Einstein condensate

Originally published here.

Although Bose-Einstein condensation has been observed in several systems, the limits of the phenomenon need to be pushed further: to faster timescales, higher temperatures, and smaller sizes. The easier it becomes to create these condensates, the more routes open up for new technological applications. New light sources, for example, could be extremely small in size and allow fast information processing.

In experiments by Aalto researchers, the condensed particles were mixtures of light and electrons in motion in gold nanorods arranged into a periodic array. Unlike most previous Bose-Einstein condensates created experimentally, the new condensate does not need to be cooled down to temperatures near absolute zero. Because the particles are mostly light, the condensation could be induced at room temperature.

‘The gold nanoparticle array is easy to create with modern nanofabrication methods. Near the nanorods, light can be focused into tiny volumes, even below the wavelength of light in vacuum. These features offer interesting prospects for fundamental studies and applications of the new condensate,’ says Academy Professor Päivi Törmä.

The main hurdle in acquiring proof of the new kind of condensate is that it comes into being extremely quickly. ‘According to our theoretical calculations, the condensate forms in only a picosecond,’ says doctoral student Antti Moilanen. ‘How could we ever verify the existence of something that only lasts one trillionth of a second?’

Turning distance into time

A key idea was to initiate the condensation process with a kick so that the particles forming the condensate would start to move.

‘As the condensate takes form, it will emit light throughout the gold nanorod array. By observing the light, we can monitor how the condensation proceeds in time. This is how we can turn distance into time,’ explains staff scientist Tommi Hakala.
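
A rough order-of-magnitude estimate (mine, using the vacuum speed of light as an upper bound; light in the array actually propagates more slowly) shows why position along the array can serve as a clock:

```latex
d = c\,t \approx \left(3\times 10^{8}\ \mathrm{m\,s^{-1}}\right) \times \left(10^{-12}\ \mathrm{s}\right) = 0.3\ \mathrm{mm}
```

So light emitted from points a few hundred micrometres apart along the array corresponds to moments separated by fractions of a picosecond.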

The light that the condensate emits is similar to laser light. ‘We can alter the distance between each nanorod to control whether Bose-Einstein condensation or the formation of ordinary laser light occurs. The two are closely related phenomena, and being able to distinguish between them is crucial for fundamental research. They also promise different kinds of technological applications,’ explains Professor Törmä.

Both lasing and Bose-Einstein condensation provide bright beams, but the coherences of the light they offer have different properties. These, in turn, affect the ways the light can be tuned to meet the requirements of a specific application. The new condensate can produce light pulses that are extremely short and may offer faster speeds for information processing and imaging applications. Academy Professor Törmä has already obtained a Proof of Concept grant from the European Research Council to explore such prospects.

Materials provided by Aalto University. Note: Content may be edited for style and length.

Journal Reference:

1. Tommi K. Hakala, Antti J. Moilanen, Aaro I. Väkeväinen, Rui Guo, Jani-Petri Martikainen, Konstantinos S. Daskalakis, Heikki T. Rekola, Aleksi Julku, Päivi Törmä. Bose–Einstein condensation in a plasmonic lattice. Nature Physics, 2018. DOI: 10.1038/s41567-018-0109-9

New quantum method generates really random numbers

Originally appeared in ScienceDaily, 11 April 2018.

Researchers at the National Institute of Standards and Technology (NIST) have developed a method for generating numbers guaranteed to be random by quantum mechanics. Described in the April 12 issue of Nature, the experimental technique surpasses all previous methods for ensuring the unpredictability of its random numbers and may enhance security and trust in cryptographic systems.

The new NIST method generates digital bits (1s and 0s) with photons, or particles of light, using data generated in an improved version of a landmark 2015 NIST physics experiment. That experiment showed conclusively that what Einstein derided as “spooky action at a distance” is real. In the new work, researchers process the spooky output to certify and quantify the randomness available in the data and generate a string of much more random bits.

Random numbers are used hundreds of billions of times a day to encrypt data in electronic networks. But these numbers are not certifiably random in an absolute sense. That’s because they are generated by software formulas or physical devices whose supposedly random output could be undermined by factors such as predictable sources of noise. Running statistical tests can help, but no statistical test on the output alone can absolutely guarantee that the output was unpredictable, especially if an adversary has tampered with the device.

“It’s hard to guarantee that a given classical source is really unpredictable,” NIST mathematician Peter Bierhorst said. “Our quantum source and protocol is like a fail-safe. We’re sure that no one can predict our numbers.”

“Something like a coin flip may seem random, but its outcome could be predicted if one could see the exact path of the coin as it tumbles. Quantum randomness, on the other hand, is real randomness. We’re very sure we’re seeing quantum randomness because only a quantum system could produce these statistical correlations between our measurement choices and outcomes.”

The new quantum-based method is part of an ongoing effort to enhance NIST’s public randomness beacon, which broadcasts random bits for applications such as secure multiparty computation. The NIST beacon currently relies on commercial sources.

Quantum mechanics provides a superior source of randomness because measurements of some quantum particles (those in a “superposition” of both 0 and 1 at the same time) have fundamentally unpredictable results. Researchers can easily measure a quantum system. But it’s hard to prove that measurements are being made of a quantum system and not a classical system in disguise.

In NIST’s experiment, that proof comes from observing the spooky quantum correlations between pairs of distant photons while closing the “loopholes” that might otherwise allow non-random bits to appear to be random. For example, the two measurement stations are positioned too far apart to allow hidden communications between them; by the laws of physics any such exchanges would be limited to the speed of light.

Random numbers are generated in two steps. First, the spooky action experiment generates a long string of bits through a “Bell test,” in which researchers measure correlations between the properties of the pairs of photons. The timing of the measurements ensures that the correlations cannot be explained by classical processes such as pre-existing conditions or exchanges of information at, or slower than, the speed of light. Statistical tests of the correlations demonstrate that quantum mechanics is at work, and these data allow the researchers to quantify the amount of randomness present in the long string of bits.
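
As a toy illustration of the kind of correlation statistic a Bell test produces (an idealized CHSH simulation under my own assumptions, not NIST’s loophole-free protocol):

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_products(a: float, b: float, n: int) -> np.ndarray:
    """Sample n products of +/-1 outcomes for polarizer angles a, b (radians)
    on the entangled state |Phi+>, for which E(a, b) = cos(2(a - b))."""
    corr = np.cos(2 * (a - b))
    p_same = (1 + corr) / 2          # probability the two outcomes agree
    same = rng.random(n) < p_same
    return np.where(same, 1, -1)

n = 100_000
a, a2 = 0.0, np.pi / 4               # Alice's two settings (0 and 45 degrees)
b, b2 = np.pi / 8, 3 * np.pi / 8     # Bob's two settings (22.5 and 67.5 degrees)

def E(x, y):
    return sample_products(x, y, n).mean()

S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(f"CHSH S = {S:.3f}  (classical bound 2, quantum bound 2*sqrt(2) ~ 2.83)")
```

Any classical, locally realistic model obeys |S| ≤ 2, so a measured S near 2.83 is the statistical signature that quantum mechanics, and hence genuine unpredictability, is at work.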

That randomness may be spread very thin throughout the long string of bits. For example, nearly every bit might be 0 with only a few being 1. To obtain a short, uniform string with concentrated randomness such that each bit has a 50/50 chance of being 0 or 1, a second step called “extraction” is performed. NIST researchers developed software to process the Bell test data into a shorter string of bits that are nearly uniform; that is, with 0s and 1s equally likely. The full process requires the input of two independent strings of random bits to select measurement settings for the Bell tests and to “seed” the software to help extract the randomness from the original data. NIST researchers used a conventional random number generator to generate these input strings.
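
To sketch what seeded extraction can look like, here is a toy Toeplitz-hashing extractor, a standard construction in the randomness-extraction literature; I am assuming this form purely for illustration, and NIST’s actual extractor may differ:

```python
import numpy as np

def toeplitz_extract(raw_bits: np.ndarray, seed_bits: np.ndarray, m: int) -> np.ndarray:
    """Compress n weakly random bits into m nearly uniform bits.

    The seed (length n + m - 1) defines a random m-by-n Toeplitz matrix T,
    constant along each diagonal; the output is (T @ raw) mod 2.
    """
    n = raw_bits.size
    assert seed_bits.size == n + m - 1, "seed must have n + m - 1 bits"
    # T[i, j] = seed[i - j + n - 1], so T is constant along each diagonal.
    idx = np.arange(m)[:, None] - np.arange(n)[None, :] + (n - 1)
    T = seed_bits[idx]
    return (T @ raw_bits) % 2

rng = np.random.default_rng(1)
n, m = 1024, 64
# A heavily biased raw string: mostly 0s, its randomness "spread thin",
# but with total min-entropy still well above the m bits we extract.
raw = (rng.random(n) < 0.1).astype(int)
seed = rng.integers(0, 2, n + m - 1)   # independent uniform seed
out = toeplitz_extract(raw, seed, m)
print(out.mean())   # close to 0.5: the bias has been smoothed away
```

Each output bit is the parity of a large random subset of the input bits, which is why a strong bias in the raw string washes out into nearly uniform output.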

From 55,110,210 trials of the Bell test, each of which produces two bits, researchers extracted 1,024 bits certified to be uniform to within one trillionth of 1 percent.

“A perfect coin toss would be uniform, and we made 1,024 bits almost perfectly uniform, each extremely close to equally likely to be 0 or 1,” Bierhorst said.

Other researchers have previously used Bell tests to generate random numbers, but the NIST method is the first to use a loophole-free Bell test and to process the resulting data through extraction. Extractors and seeds are already used in classical random number generators; in fact, random seeds are essential in computer security and can be used as encryption keys.

In the new NIST method, the final numbers are certified to be random even if the measurement settings and seed are publicly known; the only requirement is that the Bell test experiment be physically isolated from customers and hackers. “The idea is you get something better out (private randomness) than what you put in (public randomness),” Bierhorst said.

Story Source:

Materials provided by National Institute of Standards and Technology (NIST). Note: Content may be edited for style and length.


Journal Reference:

  1. Peter Bierhorst, Emanuel Knill, Scott Glancy, Yanbao Zhang, Alan Mink, Stephen Jordan, Andrea Rommal, Yi-Kai Liu, Bradley Christensen, Sae Woo Nam, Martin J. Stevens, Lynden K. Shalm. Experimentally Generated Randomness Certified by the Impossibility of Superluminal Signals. Nature, 2018. DOI: 10.1038/s41586-018-0019-0