Let there be light: Florence Nightingale

This year, 2020, the word Nightingale has acquired new connotations. It is no longer just the name of a passerine bird with beautiful and powerful birdsong; it is the name that NHS England has given to the temporary hospitals set up for the COVID-19 pandemic. In normal circumstances it is indeed a very good name for a hospital, but given the circumstances it becomes more poignant, even more so considering that this year, 2020, is the bicentenary of Florence Nightingale’s birth.

Florence Nightingale was born on 12th May 1820 in Florence, Italy (hence the name!) and became a social reformer, statistician, and the founder of modern nursing. In 1858 she became the first female member of the Royal Statistical Society, and in 1874 an honorary member of the American Statistical Association.

With the power of data, Nightingale was able to save lives and change policy. Her analysis of data from the Crimean War was compelling and persuasive in its simplicity. It allowed her and her team to pay attention to time – tracking admissions to hospital and, crucially, deaths – on a month-by-month basis. We must remember that the statistical tests we know today were not yet established tools, and the workhorse of statistics, regression, was still decades in the future. The data analysis, presented in columns and rows, was supported by the powerful graphics that many of us admire today.

In 2014 I had an opportunity to admire her Nightingale roses or, to use their formal name, polar area charts, in the exhibition Science is Beautiful at the British Library.

Florence Nightingale’s “rose diagram”, showing the Causes of Mortality in the Army in the East, 1858. Photograph: /British Library

These and other charts were used in the report that she later published in 1858 under the title “Notes on Matters Affecting the Health, Efficiency, and Hospital Administration of the British Army”. The report included charts of deaths by barometric pressure and temperature, showing that deaths were higher in hotter months than in cooler ones. In the polar charts shown above, Nightingale presents the decrease in death rates that had been achieved. Let’s read it in her own hand; here is the note accompanying the chart above:

The areas of the blue, red & black wedges are each measured from the centre as the common vertex.

The blue wedges measured from the centre of the circle represent area for area the deaths from Preventible or Mitigable Zymotic diseases, the red wedges measured from the centre the deaths from wounds, & the black wedges measured from the centre the deaths from all other causes.

The black line across the red triangle in Nov. 1854 marks the boundary of the deaths from all other causes during the month.

In October 1854, & April 1855, the black area coincides with the red, in January & February 1855, the blue area coincides with the black.

The entire areas may be compared by following the blue, the red & the black lines enclosing them.
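The construction Nightingale describes – equal angular sectors whose *areas*, not radii, encode the death counts – can be sketched in a few lines. The counts below are hypothetical, not Nightingale’s data; the point is only the square-root scaling of the radius:

```python
import math

# Each month occupies an equal angular sector (30 degrees for 12 months).
# Nightingale's key design choice: the AREA of a wedge is proportional to
# the death count, so the radius must scale with the square root of it.
def wedge_radius(deaths, n_sectors=12, scale=1.0):
    """Radius r such that the sector area pi * r**2 / n_sectors equals scale * deaths."""
    return math.sqrt(deaths * scale * n_sectors / math.pi)

# Hypothetical counts for one month (illustrative, not historical data).
zymotic, wounds = 2500, 300
r_zymotic = wedge_radius(zymotic)
r_wounds = wedge_radius(wounds)

# Doubling the deaths doubles the area, not the radius:
assert math.isclose(wedge_radius(2 * wounds) ** 2, 2 * r_wounds ** 2)
```

Encoding counts in area rather than radius is what keeps the chart honest: a linear radius would exaggerate large counts quadratically.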

Nightingale recognised that soldiers were dying from other causes: malnutrition, poor sanitation, and lack of activity. Her aim was to improve the conditions of wounded soldiers and improve their chances of survival. This was evidence that later helped put focus on the importance of patient welfare.

Once the war was over, Florence Nightingale returned home, but her quest did not finish there. She continued her work to improve conditions in hospitals. She became a star in her own time, and with time the legend of “The Lady with the Lamp” solidified in the national and international consciousness. You may have heard of her in the 1857 poem by Henry Wadsworth Longfellow called “Santa Filomena”:

Lo! in that house of misery
A lady with a lamp I see
Pass through the glimmering gloom,
And flit from room to room

Today, Nightingale’s lamp continues to bring hope: not just to those working and being treated in the NHS Nightingale hospitals, but to all of us through the metaphorical light of rational optimism. Let there be light.

Science Communication – Technical Writing and Presentation Advice

The two videos below were made a few years ago to support a Science Communication and Group Project module at the School of Physics, Astronomy and Mathematics at the University of Hertfordshire. The work was supported by the Institute of Physics and the HE STEM programme. I also got support from the Institute of Mathematics and its Applications. The tools are probably a bit dated now, but I hope the principles still help some students trying to get their work seen.

The students were encouraged to share and communicate the results of their projects via a video and they were supported by tutorials on how to do screencasts.

Students were also encouraged to prepare technical documentation, and the videos on using LaTeX and structuring their documents with LaTeX were very useful.

Technical Writing

This presentation addresses some issues we should take into account when writing for technical purposes.

Presentation Advice

In this tutorial we will address some of the points that can help you make a better presentation, either for a live talk or for a recording.

Screencasting with Macs and PCs

The videos below were made a few years ago to support a Science Communication and Group Project module at the School of Physics, Astronomy and Mathematics at the University of Hertfordshire. The work was supported by the Institute of Physics and the HE STEM programme. I also got support from the Institute of Mathematics and its Applications. The tools are probably a bit dated now, but I hope the principles still help some students trying to get their work seen.

Students were asked to prepare a short video to present the results of their project and share it with the world. To support them, the videos below were prepared.

Students were also encouraged to prepare technical documentation, and the videos on using LaTeX and structuring their documents with LaTeX were very useful.

Screencasting with a Mac

In this video we will see some tools to capture video from your screen using a Mac. The tools are QuickTime Player, MPEG Streamclip and iMovie.

Screencasting with a PC

In this video we will see some tools to capture video from your screen using a PC. The tools are CamStudio and Freemake Video Converter.

Uploading a Video to Vimeo

In this tutorial we will see how to set up an account on Vimeo and how to upload your screencast. You will also be able to send a link to your video to your friends and other people.

Structured Documents in LaTeX

This is a video I made a few years ago to encourage my students to use better tools to write dissertations, theses and reports that include the use of mathematics. The principles stand, although the tools may have moved on since then. I am reposting them as requested by a colleague of mine, Dr Catarina Carvalho, who I hope will still find this useful.

In this video we continue explaining how to use LaTeX. Here we will see how to use a master document in order to build a thesis or dissertation.
We assume that you have already had a look at the tutorial entitled: LaTeX for writing mathematics – An introduction
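The heart of the approach is a master file that pulls each chapter in from its own file. A minimal sketch (the file names here are illustrative, not the ones used in the video):

```latex
% thesis.tex -- master document; each chapter lives in its own .tex file
\documentclass[12pt]{report}

\begin{document}

\tableofcontents

\include{introduction}   % compiles introduction.tex as a chapter
\include{methods}        % each \include starts on a new page
\include{conclusions}

\end{document}
```

While drafting, `\includeonly{methods}` in the preamble recompiles just one chapter while keeping the page numbering and cross-references of the full document.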

Structured Documents in LaTeX

LaTeX for writing mathematics – An introduction

This is a video I made a few years ago to encourage my students to use better tools to write dissertations, theses and reports that include the use of mathematics. The principles stand, although the tools may have moved on since then. I am reposting them as requested by a colleague of mine, Dr Catarina Carvalho, who I hope will still find this useful.

In this video we explore the LaTeX document preparation system. We start by explaining an example document. We use TeXmaker as the editor, given its flexibility and the fact that it is available for different platforms.
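For readers who want to try it straight away, a minimal complete document with a displayed equation looks like this (not the exact example from the video, but the same ingredients):

```latex
% example.tex -- a minimal article with mathematics
\documentclass{article}
\usepackage{amsmath}  % better equation environments

\begin{document}

The quadratic formula is
\begin{equation}
  x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}.
\end{equation}

\end{document}
```

Compiling it (for example with `pdflatex example.tex`) produces a typeset PDF with the equation numbered automatically.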

LaTeX for writing mathematics – An introduction

2019 Nobel Prize in Chemistry

From left: John Goodenough, M. Stanley Whittingham, and Akira Yoshino. Credits: University of Texas at Austin; Binghamton University; the Japan Prize Foundation

Originally published in Physics Today by Alex Lopatka

John Goodenough, M. Stanley Whittingham, and Akira Yoshino will receive the 2019 Nobel Prize in Chemistry for developing lithium-ion batteries, the Royal Swedish Academy of Sciences announced on Wednesday. Goodenough (University of Texas at Austin), Whittingham (Binghamton University in New York), and Yoshino (Asahi Kasei Corp and Meijo University in Japan) will each receive one-third of the 9 million Swedish krona (roughly $900 000) prize. Their research not only allowed for the commercial-scale manufacture of lithium-ion batteries, but it also has supercharged research into all sorts of new technology, including wind and solar power.

At the heart of any battery is a redox reaction. During the discharge phase, the oxidation reaction at the anode frees ions to travel through a liquid electrolyte solution to the cathode, which is undergoing a reduction reaction. Meanwhile, electrons hum through a circuit to power a connected electronic device. For the recharge phase, the redox processes reverse, and the ions go back to the anode so that it’s ready for another discharge cycle.

The now ubiquitous lithium-ion battery that powers smartphones, electric vehicles, and more got its start shortly before the 1973 oil crisis. The American Energy Commission asked Goodenough, who was then at MIT’s Lincoln Laboratory, to evaluate a project by battery scientists at the Ford Motor Company. They were looking into the feasibility of molten-salt batteries, which used sodium and sulfur, to replace the standard but outdated lead–acid batteries developed about a century earlier. But by the late 1960s, it became clear that high operating temperatures and corrosion problems made those batteries impractical (see the article by Matthew Eisler, Physics Today, September 2016, page 30).

Whittingham, then a research scientist at Exxon, instead considered low-temperature, high-energy batteries that could not only power electric vehicles but also store solar energy during off-peak hours. To that end he developed a battery in 1976 with a titanium disulfide cathode paired with a lithium metal anode. Lithium’s low standard reduction potential of −3.05 V makes it especially attractive for high-density and high-voltage battery cells. Critically, Whittingham’s design employed lithium ions that were intercalated—that is, inserted between layers of the TiS2 structure—and provided a means to reversibly store the lithium during the redox reactions.
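Why does that large negative reduction potential matter? As a rough sketch (standard-state values only, ignoring overpotentials and concentration effects), the open-circuit cell voltage is the difference of the two electrode potentials:

```latex
E_{\text{cell}} = E^{\circ}_{\text{cathode}} - E^{\circ}_{\text{anode}}
```

With lithium at the anode contributing $E^{\circ} = -3.05\,\mathrm{V}$, even a cathode of modest potential yields a large $E_{\text{cell}}$, which is why lithium anodes make high-voltage, high-energy-density cells possible.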

Illustration of Whittingham's battery.
The lithium-ion battery designed by M. Stanley Whittingham had a titanium disulfide cathode and a lithium metal anode, as illustrated here. John Goodenough and Akira Yoshino improved on the technology by replacing the cathode and anode with lithium cobalt oxide and graphite, respectively. Credit: Johan Jarnestad/The Royal Swedish Academy of Sciences

Lithium’s high reactivity, however, means that it must be isolated from air and water to avoid dangerous reactions. Whittingham solved that problem by using nonaqueous electrolyte solutions that had been carefully designed and tested by other researchers in lithium electrochemistry experiments conducted a few years earlier. The proof of concept was a substantial improvement: Whittingham’s lithium-ion battery had a higher cell potential than the lead–acid battery’s—2.5 V compared with 2 V.

Whittingham’s lithium-ion battery, though, wasn’t particularly stable. After repeated discharging and recharging, whisker-like crystals of lithium would grow on the anode. Eventually the wispy threads would grow large enough to breach the barrier separating the anode from the cathode, and the battery would short-circuit or even explode.

In 1980 Goodenough didn’t solve that problem, but he did come up with a much better material for the cathode. Along with Koichi Mizushima and colleagues at Oxford University, he found that lithium cobalt oxide could be used for the cathode. As with the TiS2, the cobalt oxide structure was tightly intercalated with lithium and could thus provide the cathode with sufficient energy density. Goodenough’s insight into the relationship between the cobalt oxide structure and voltage potential resulted in better battery performance; the voltage increased from 2.5 V to 4 V. Although the new battery was an improvement over Whittingham’s design, the system still used highly reactive lithium metal as the anode, so companies couldn’t safely manufacture the batteries on a commercial scale.

The final piece of the puzzle fell into place in 1985 when Yoshino, working at the Asahi Kasei Corp, replaced the anode material with graphite. It was stable in the required electrochemical conditions and accommodated many lithium ions in graphite’s crystal structure. With Goodenough’s lithium cobalt oxide cathode and the graphite anode, Yoshino “came up with two materials you could put together without a glove box” in a chemistry laboratory, says Clare Grey, a chemist at the University of Cambridge. Importantly, the graphite anode is lightweight and capable of being recharged hundreds of times before its performance deteriorates. Soon after, Sony teamed up with Asahi Kasei and replaced all the nickel–cadmium batteries in its consumer electronics with lithium-ion ones.

“The story of the lithium-ion battery, like so many stories about innovation, is about contributions from many sources over many years, conditioned by changing economic and social circumstances,” says Matthew Eisler, a historian of science at the University of Strathclyde in Glasgow, UK. When the 1979 oil crisis ended, the automotive industry’s interest in batteries drained, but in 1991 they were commercialized for use in cameras, laptops, smartphones, and other handheld electronics enabled by advancements in microprocessor technology.

To develop transportation that doesn’t rely on fossil fuels, the US Department of Energy in 2013 set an ambitious goal for its Joint Center for Energy Storage Research: Make a battery for electric vehicles that has five times the energy density and is one-fifth the cost of currently available batteries. DOE’s goal hasn’t been reached yet, but the program was renewed in September 2018, with dedicated funding of $120 million over the next five years. In a story on the center, Goodenough told Physics Today (June 2013, page 26), “People are working hard, and I believe the problem is solvable, but to get to the next stage, it’s going to take a little luck and some cleverness.”

Editor’s note: This post was updated at 7:15pm EDT from an earlier summary.

2019 Nobel Prize in Physics

Left to right: James Peebles, Michel Mayor, and Didier Queloz. Credit: Royal Swedish Academy of Sciences; University of Geneva

This is a reblog of the post in Physics Today, written by Andrew Grant.

The researchers are recognized for their contributions to theoretical cosmology and the study of extrasolar planets.

James Peebles, Michel Mayor, and Didier Queloz will receive the 2019 Nobel Prize in Physics for helping to understand our place in the universe through advances in theoretical cosmology and the detection of extrasolar planets, the Royal Swedish Academy of Sciences announced on Tuesday. Peebles is a theoretical cosmologist at Princeton University who helped predict and then interpret the cosmic microwave background (CMB) and later worked to integrate dark matter and dark energy into the cosmological framework. Mayor and Queloz are observational astronomers at the University of Geneva who in 1995 discovered 51 Pegasi b, the first known exoplanet to orbit a Sunlike star. Peebles will receive half of the 9 million Swedish krona (roughly $900 000) prize; Mayor and Queloz (who also has an appointment at the University of Cambridge) will share the other half.

The contributions of Peebles and of Mayor and Queloz helped jumpstart their respective fields. Over the past few decades, researchers have developed the successful standard model of cosmology, Lambda CDM, though the nature of both dark energy and dark matter remains an open question. Meanwhile, astronomers have used the radial velocity technique employed by Mayor and Queloz, along with the transit method and even direct imaging, to discover and characterize a diverse population of thousands of exoplanets. Data from NASA’s Kepler telescope suggest that the Milky Way harbors more planets than stars.

Connecting past with present

“More than any other person,” writes Caltech theoretical physicist Sean Carroll on Twitter, Peebles “made physical cosmology into a quantitative science.” His contributions began even before Arno Penzias and Robert Wilson’s 20-foot antenna at Bell Labs picked up the unexpected hum of 7.35 cm microwave noise that would come to be known as the CMB. Working as a postdoc with Robert Dicke at Princeton, Peebles predicted in a 1965 paper that the remnant radiation from a hot Big Bang, after eons of propagating through an expanding universe, would have a temperature of about 10 K. In a subsequent paper Peebles connected the temperature of the CMB, measured by Penzias and Wilson at 3.5 K (now known to be 2.7 K), to the density of matter in the early universe and the formation of light elements such as helium.

In 1970 Peebles and graduate student Jer Yu predicted a set of temperature fluctuations imprinted in the CMB due to the propagation of acoustic waves in the hot plasma of the infant universe. Decades later, the Cosmic Background Explorer (COBE), the Wilkinson Microwave Anisotropy Probe (WMAP), and, most recently, the Planck satellite would measure a similar power spectrum in the CMB. “The theoretical framework that he helped create made testable predictions,” says Priyamvada Natarajan, a Yale theoretical astrophysicist. “They still inform a lot of the observational tests of cosmology.”

Peebles also considered the connection between those fluctuations and the large-scale structure of the universe we observe today, as measured through galaxy clusters in sky surveys. “His idea that you can see the initial conditions and dynamics of the universe in the clustering of galaxies transformed what we could do as a community,” says New York University astrophysicist David W. Hogg.

Peebles’s view of the CMB and what it embodies proved especially important in the early 1980s, when cosmologists struggled to reconcile the deduced densities of matter in the infant universe with the large-scale structure that ultimately emerged. In a 1982 paper, Peebles proposed a solution in the form of nonrelativistic dark matter. Long after escaping the dense confines of the infant cosmos, that cold dark matter (CDM) would form the cocoons in which ordinary matter clumped into galaxies and then galaxy clusters. His paper built on the work of Vera Rubin, whose measurements with Kent Ford of the rotation curves of the Andromeda galaxy were critical toward demonstrating that dark matter must be the dominant component of galactic halos, to keep disks of stars and gas from flying apart. Subsequent satellite measurements have revealed that collectively dark matter has about five times the mass of ordinary matter.

By the 1990s it was becoming clear that a model containing just CDM, ordinary matter, and photons couldn’t account for all the observed properties of the universe, notably the value of the Hubble constant. The result is Lambda CDM, the cosmological model that describes the universe with six precisely measured parameters and accounts for the 1998 discovery that the universe’s expansion is accelerating. Peebles was one of the theorists to propose resurrecting Albert Einstein’s once-discarded cosmological constant to describe the newly discovered dark energy, which makes up more than two-thirds of the mass–energy content of the universe.

Ushering in the exoplanet era

To appreciate the contribution of Mayor and Queloz, consider that in 1995 the least massive known object outside the solar system was a star of 0.08 solar masses; Jupiter, for comparison, is about 0.001 solar masses. Mayor was part of a team that in 1989 reported the probable detection of an object 11 times as massive as Jupiter that could be classified as either a very large planet or a brown dwarf. Pennsylvania State University astronomer Jason Wright says that other teams amassed preliminary evidence of extrasolar planets, but it was unconvincing and led planetary scientist William Cochran to declare, “Thou shalt not embarrass thyself and thy colleagues by claiming false planets.”

In 1992 Alexander Wolszczan and his colleagues discovered two planets orbiting the pulsar PSR B1257+12 via timing variations in the dead star’s radio beacon. (A third later found around the same pulsar remains the lowest-mass exoplanet yet discovered.) The discovery showed that exoplanets are out there, but the question remained of how common they are around stars like the Sun, where well-placed ones would presumably have the potential to support life.

At the Haute-Provence Observatory in southeastern France, Mayor and his graduate student Queloz conducted a survey of 142 stars using a spectrograph called ELODIE, which they designed to enable the observation of fainter stars than had previously been surveyed. The researchers’ approach, first proposed in 1952 by Otto Struve, was to detect the Doppler shift in the stellar spectrum due to the star’s motion as it is pushed and pulled by an orbiting planet. The expected stellar wobble due to a planet’s tug was on the order of 10 m/s; even now, the best spectrometers have a resolution of about 1000 m/s, Hogg says. Mayor and Queloz needed to be able to pinpoint a shift that accounted for a hundredth, or even a thousandth, of a pixel.

That’s exactly what they did through analysis of the signal from 51 Pegasi, a star located about 50 light-years away in the constellation Pegasus. The Doppler shift was consistent with the motion of a Jupiter-mass planet in a four-day orbit at 0.05 astronomical units, far shorter than the distance between Mercury and the Sun. The discovery of a “hot Jupiter” was surprising but also helpful, as the short period enabled Mayor and Queloz, and competing groups, to easily conduct follow-up observations. The astronomers announced their discovery at a conference in Italy almost exactly 24 years ago, on 6 October 1995, and soon published their result in Nature. Another group promptly confirmed the finding.
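The order of magnitude of the stellar wobble quoted above can be checked with the standard radial-velocity semi-amplitude formula for a circular orbit with a planet much lighter than its star. A quick sketch (textbook formula; the numerical values are standard constants, not figures from the discovery paper):

```python
import math

# Semi-amplitude of the star's reflex motion, circular orbit, m_p << M_star:
#   K = (2*pi*G / P)**(1/3) * m_p * sin(i) / M_star**(2/3)
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30   # kg
M_JUP = 1.898e27   # kg
YEAR = 3.156e7     # s
DAY = 8.64e4       # s

def rv_semi_amplitude(period_s, planet_mass_kg, star_mass_kg, sin_i=1.0):
    """Stellar wobble speed in m/s induced by an orbiting planet."""
    return ((2 * math.pi * G / period_s) ** (1 / 3)
            * planet_mass_kg * sin_i / star_mass_kg ** (2 / 3))

# Jupiter's tug on the Sun: roughly 12-13 m/s, i.e. "on the order of 10 m/s".
k_jupiter = rv_semi_amplitude(11.86 * YEAR, M_JUP, M_SUN)

# 51 Pegasi b's short, tight orbit gives a much larger signal (tens of m/s),
# which is part of why hot Jupiters were the first planets found this way.
k_51peg = rv_semi_amplitude(4.23 * DAY, 0.47 * M_JUP, 1.06 * M_SUN)
```

The short period enters as $P^{-1/3}$, so close-in giants like 51 Pegasi b produce signals several times stronger than Jupiter's, despite being less massive.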

“It’s a discovery that has completely changed our view of who we are,” says Yale University astronomer Debra Fischer. “And it came at a time when we thought that maybe there weren’t many planets around other stars.”

However, the astronomy community wasn’t yet convinced by Mayor and Queloz’s claim. Many researchers didn’t think it was possible for such a massive planet to either form so close to the star or migrate inward without getting incinerated. Theorists proposed that the observed stellar wobbles might not be caused by an exoplanet at all, but rather by phenomena such as stellar brightness oscillations. But even the most skeptical came around in 1999, with discoveries of the first multi-exoplanet system by Fischer and colleagues, and of HD 209458 b. That planet was detected via the drop in brightness it caused when it passed in front of its star.

The early planet confirmations convinced observatory directors to build and install spectrographs. They also ultimately helped coax NASA to greenlight the development of a space telescope proposal that had been languishing for decades, a mission called Kepler. That satellite, which was launched in 2009, and instruments such as the Transiting Exoplanet Survey Satellite have detected thousands of planets and planet candidates.

Nearly a quarter century after Mayor and Queloz’s discovery, exoplanet science is a powerhouse endeavor that engages a significant percentage of the astrophysics community. Researchers join the field to study not only the planets but also the stars they orbit, which in turn has led to new insights in stellar astrophysics. By pairing transit measurements, which determine planets’ radii, with radial velocity, which provides masses, researchers have determined that many of the galaxy’s planets don’t resemble those in our solar system. The lack of resemblance challenges theories of planet formation and extends the range of planetary types that theories have to accommodate.

The most tantalizing goal of the field set in motion by Mayor and Queloz is to find planets that resemble Earth and to detect biosignatures. Researchers are already probing the atmospheres of individual worlds using the Hubble Space Telescope and other tools. Next-generation instruments, particularly the James Webb Space Telescope and the Wide Field Infrared Survey Telescope, will aid in that effort.

Orion at the Institute of Physics

via Instagram http://bit.ly/2DGSPaI

It was great to have been able to attend a lecture at the new home of the Institute of Physics. I have been a member for almost two decades and I have even served as an officer for one of the interest groups, the Computational Physics Group, if you must know.

The event was a talk by Stephen Hilton from the School of Pharmacy, UCL, on 3D printing and its applications in chemistry and pharmacy. It was a very useful talk, covering applications ranging from teaching and cost saving in chemistry labs to personalised medicine and chemistry itself.

As for the building, it was nice to finally see the end result, with a hint of brutalist architecture and some nice details, such as the electromagnetic wave diagram in some of the windows and Orion on the ceiling!