Science Communication – Technical Writing and Presentation Advice

The two videos below were made a few years ago to support a Science Communication and Group Project module at the School of Physics, Astronomy and Mathematics at the University of Hertfordshire. The work was supported by the Institute of Physics and the HE STEM programme, with additional support from the Institute of Mathematics and its Applications. The tools are probably a bit dated now, but I hope the principles still help students trying to get their work seen.

The students were encouraged to share and communicate the results of their projects via a video and they were supported by tutorials on how to do screencasts.

Students were also encouraged to prepare technical documentation, and the videos on using LaTeX and on structuring their documents with it were very useful.
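For students new to LaTeX, a minimal report skeleton along the lines covered in those videos might look like the sketch below. It is only an illustrative example, with placeholder section names and bibliography file, not a template issued as part of the module.

```latex
\documentclass[11pt]{article}

\usepackage{graphicx} % for figures
\usepackage{amsmath}  % for equations

\title{Group Project Report}
\author{Student One \and Student Two}
\date{\today}

\begin{document}
\maketitle

\begin{abstract}
A short summary of the project and its main results.
\end{abstract}

\section{Introduction}
Why the problem matters and what the report covers.

\section{Method}

\section{Results}

\section{Conclusions}

\bibliographystyle{plain}
\bibliography{references} % expects a references.bib file alongside the .tex source

\end{document}
```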

Technical Writing

This presentation addresses some issues we should take into account when writing for technical purposes.

Presentation Advice

In this tutorial we will address some of the points that can help you give a better presentation, whether for a live talk or for a recording.

Screencasting with Macs and PCs

The videos below were made a few years ago to support a Science Communication and Group Project module at the School of Physics, Astronomy and Mathematics at the University of Hertfordshire. The work was supported by the Institute of Physics and the HE STEM programme, with additional support from the Institute of Mathematics and its Applications. The tools are probably a bit dated now, but I hope the principles still help students trying to get their work seen.

Students were asked to prepare a short video to present the results of their project and share it with the world. To support them, the videos below were prepared.

Students were also encouraged to prepare technical documentation, and the videos on using LaTeX and on structuring their documents with it were very useful.

Screencasting with a Mac

In this video we will see some tools to capture video from your screen using a Mac. The tools are QuickTime Player, MPEG Streamclip and iMovie.

Screencasting with a PC

In this video we will see some tools to capture video from your screen using a PC. The tools are CamStudio and Freemake Video Converter.

Uploading a Video to Vimeo

In this tutorial we will see how to set up an account on Vimeo and how to upload your screencast. You will also be able to send a link to your video to your friends and other people.

2019 Nobel Prize in Chemistry

From left: John Goodenough, M. Stanley Whittingham, and Akira Yoshino. Credits: University of Texas at Austin; Binghamton University; the Japan Prize Foundation

Originally published in Physics Today by Alex Lopatka

John Goodenough, M. Stanley Whittingham, and Akira Yoshino will receive the 2019 Nobel Prize in Chemistry for developing lithium-ion batteries, the Royal Swedish Academy of Sciences announced on Wednesday. Goodenough (University of Texas at Austin), Whittingham (Binghamton University in New York), and Yoshino (Asahi Kasei Corp and Meijo University in Japan) will each receive one-third of the 9 million Swedish krona (roughly $900 000) prize. Their research not only allowed for the commercial-scale manufacture of lithium-ion batteries, but it also has supercharged research into all sorts of new technology, including wind and solar power.

At the heart of any battery is a redox reaction. During the discharge phase, the oxidation reaction at the anode frees ions to travel through a liquid electrolyte solution to the cathode, which is undergoing a reduction reaction. Meanwhile, electrons hum through a circuit to power a connected electronic device. For the recharge phase, the redox processes reverse, and the ions go back to the anode so that it’s ready for another discharge cycle.

The now ubiquitous lithium-ion battery that powers smartphones, electric vehicles, and more got its start shortly before the 1973 oil crisis. The American Energy Commission asked Goodenough, who was then at MIT’s Lincoln Laboratory, to evaluate a project by battery scientists at the Ford Motor Company. They were looking into the feasibility of molten-salt batteries, which used sodium and sulfur, to replace the standard but outdated lead–acid batteries developed about a century earlier. But by the late 1960s, it became clear that high operating temperatures and corrosion problems made those batteries impractical (see the article by Matthew Eisler, Physics Today, September 2016, page 30).

Whittingham, then a research scientist at Exxon, instead considered low-temperature, high-energy batteries that could not only power electric vehicles but also store solar energy during off-peak hours. To that end he developed a battery in 1976 with a titanium disulfide cathode paired with a lithium metal anode. Lithium’s low standard reduction potential of −3.05 V makes it especially attractive for high-density and high-voltage battery cells. Critically, Whittingham’s design employed lithium ions that were intercalated—that is, inserted between layers of the TiS2 structure—and provided a means to reversibly store the lithium during the redox reactions.

Illustration of Whittingham's battery.
The lithium-ion battery designed by M. Stanley Whittingham had a titanium disulfide cathode and a lithium metal anode, as illustrated here. John Goodenough and Akira Yoshino improved on the technology by replacing the cathode and anode with lithium cobalt oxide and graphite, respectively. Credit: Johan Jarnestad/The Royal Swedish Academy of Sciences

Lithium’s high reactivity, however, means that it must be isolated from air and water to avoid dangerous reactions. Whittingham solved that problem by using nonaqueous electrolyte solutions that had been carefully designed and tested by other researchers in lithium electrochemistry experiments conducted a few years earlier. The proof of concept was a substantial improvement: Whittingham’s lithium-ion battery had a higher cell potential than the lead–acid battery’s—2.5 V compared with 2 V.
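As a rough illustration of where those voltages come from: the open-circuit voltage of a cell is the difference between the cathode and anode potentials. Taking lithium's standard reduction potential of −3.05 V from above, a cathode potential of about −0.55 V (a back-of-the-envelope value implied by the quoted 2.5 V cell, not a figure from the article) gives

```latex
E_{\text{cell}} = E_{\text{cathode}} - E_{\text{anode}}
\approx (-0.55\ \mathrm{V}) - (-3.05\ \mathrm{V}) = 2.5\ \mathrm{V}.
```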

Whittingham’s lithium-ion battery, though, wasn’t particularly stable. After repeated discharging and recharging, whisker-like crystals of lithium would grow on the anode. Eventually the wispy threads would grow large enough to breach the barrier separating the anode from the cathode, and the battery would short-circuit or even explode.

In 1980 Goodenough didn’t solve that problem, but he did come up with a much better material for the cathode. Along with Koichi Mizushima and colleagues at Oxford University, he found that lithium cobalt oxide could be used for the cathode. As with the TiS2, the cobalt oxide structure was tightly intercalated with lithium and could thus provide the cathode with sufficient energy density. Goodenough’s insight into the relationship between the cobalt oxide structure and voltage potential resulted in better battery performance; the voltage increased from 2.5 V to 4 V. Although the new battery was an improvement over Whittingham’s design, the system still used highly reactive lithium metal as the anode, so companies couldn’t safely manufacture the batteries on a commercial scale.

The final piece of the puzzle fell into place in 1985 when Yoshino, working at the Asahi Kasei Corp, replaced the anode material with graphite. It was stable in the required electrochemical conditions and accommodated many lithium ions in graphite’s crystal structure. With Goodenough’s lithium cobalt oxide cathode and the graphite anode, Yoshino “came up with two materials you could put together without a glove box” in a chemistry laboratory, says Clare Grey, a chemist at the University of Cambridge. Importantly, the graphite anode is lightweight and capable of being recharged hundreds of times before its performance deteriorates. Soon after, Sony teamed up with Asahi Kasei and replaced all the nickel–cadmium batteries in its consumer electronics with lithium-ion ones.

“The story of the lithium-ion battery, like so many stories about innovation, is about contributions from many sources over many years, conditioned by changing economic and social circumstances,” says Matthew Eisler, a historian of science at the University of Strathclyde in Glasgow, UK. When the 1979 oil crisis ended, the automotive industry’s interest in batteries drained, but in 1991 lithium-ion batteries were commercialized for use in cameras, laptops, smartphones, and other handheld electronics enabled by advancements in microprocessor technology.

To develop transportation that doesn’t rely on fossil fuels, the US Department of Energy in 2013 set an ambitious goal for its Joint Center for Energy Storage Research: Make a battery for electric vehicles that has five times the energy density and is one-fifth the cost of currently available batteries. DOE’s goal hasn’t been reached yet, but the program was renewed in September 2018, with dedicated funding of $120 million over the next five years. In a story on the center, Goodenough told Physics Today (June 2013, page 26), “People are working hard, and I believe the problem is solvable, but to get to the next stage, it’s going to take a little luck and some cleverness.”

Editor’s note: This post was updated at 7:15pm EDT from an earlier summary.

2019 Nobel Prize in Physics

Left to right: James Peebles, Michel Mayor, and Didier Queloz. Credit: Royal Swedish Academy of Sciences; University of Geneva

This is a reblog of the post in Physics Today, written by Andrew Grant.

The researchers are recognized for their contributions to theoretical cosmology and the study of extrasolar planets.

James Peebles, Michel Mayor, and Didier Queloz will receive the 2019 Nobel Prize in Physics for helping to understand our place in the universe through advances in theoretical cosmology and the detection of extrasolar planets, the Royal Swedish Academy of Sciences announced on Tuesday. Peebles is a theoretical cosmologist at Princeton University who helped predict and then interpret the cosmic microwave background (CMB) and later worked to integrate dark matter and dark energy into the cosmological framework. Mayor and Queloz are observational astronomers at the University of Geneva who in 1995 discovered 51 Pegasi b, the first known exoplanet to orbit a Sunlike star. Peebles will receive half of the 9 million Swedish krona (roughly $900 000) prize; Mayor and Queloz (who also has an appointment at the University of Cambridge) will share the other half.

The contributions of Peebles and of Mayor and Queloz helped jumpstart their respective fields. Over the past few decades, researchers have developed the successful standard model of cosmology, Lambda CDM, though the nature of both dark energy and dark matter remains an open question. Meanwhile, astronomers have used the radial velocity technique employed by Mayor and Queloz, along with the transit method and even direct imaging, to discover and characterize a diverse population of thousands of exoplanets. Data from NASA’s Kepler telescope suggest that the Milky Way harbors more planets than stars.

Connecting past with present

“More than any other person,” writes Caltech theoretical physicist Sean Carroll on Twitter, Peebles “made physical cosmology into a quantitative science.” His contributions began even before Arno Penzias and Robert Wilson’s 20-foot antenna at Bell Labs picked up the unexpected hum of 7.35 cm microwave noise that would come to be known as the CMB. Working as a postdoc with Robert Dicke at Princeton, Peebles predicted in a 1965 paper that the remnant radiation from a hot Big Bang, after eons of propagating through an expanding universe, would have a temperature of about 10 K. In a subsequent paper Peebles connected the temperature of the CMB, measured by Penzias and Wilson at 3.5 K (now known to be 2.7 K), to the density of matter in the early universe and the formation of light elements such as helium.

In 1970 Peebles and graduate student Jer Yu predicted a set of temperature fluctuations imprinted in the CMB due to the propagation of acoustic waves in the hot plasma of the infant universe. Decades later, the Cosmic Background Explorer (COBE), the Wilkinson Microwave Anisotropy Probe (WMAP), and, most recently, the Planck satellite would measure a similar power spectrum in the CMB. “The theoretical framework that he helped create made testable predictions,” says Priyamvada Natarajan, a Yale theoretical astrophysicist. “They still inform a lot of the observational tests of cosmology.”

Peebles also considered the connection between those fluctuations and the large-scale structure of the universe we observe today, as measured through galaxy clusters in sky surveys. “His idea that you can see the initial conditions and dynamics of the universe in the clustering of galaxies transformed what we could do as a community,” says New York University astrophysicist David W. Hogg.

Peebles’s view of the CMB and what it embodies proved especially important in the early 1980s, when cosmologists struggled to reconcile the deduced densities of matter in the infant universe with the large-scale structure that ultimately emerged. In a 1982 paper, Peebles proposed a solution in the form of nonrelativistic dark matter. Long after escaping the dense confines of the infant cosmos, that cold dark matter (CDM) would form the cocoons in which ordinary matter clumped into galaxies and then galaxy clusters. His paper built on the work of Vera Rubin, whose measurements with Kent Ford of the rotation curves of the Andromeda galaxy were critical toward demonstrating that dark matter must be the dominant component of galactic halos, to keep disks of stars and gas from flying apart. Subsequent satellite measurements have revealed that collectively dark matter has about five times the mass of ordinary matter.
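The reasoning behind the rotation-curve argument can be made concrete with a back-of-the-envelope estimate: for a roughly circular orbit, the enclosed mass follows from balancing gravity against the centripetal acceleration, M(r) ≈ v²r/G. The sketch below uses illustrative numbers (a flat 220 km/s rotation curve out to 30 kpc), not Rubin and Ford's actual Andromeda measurements.

```python
# Back-of-the-envelope enclosed-mass estimate from a flat rotation curve.
# Illustrative numbers only, not data from Rubin and Ford's Andromeda study.

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30     # solar mass, kg
KPC = 3.086e19       # kiloparsec, m

v = 220e3            # assumed flat rotation speed, m/s
r = 30 * KPC         # assumed galactocentric radius, m

# Circular orbit: v^2 / r = G M(<r) / r^2  =>  M(<r) = v^2 r / G
m_enclosed = v**2 * r / G
print(f"Enclosed mass within 30 kpc: {m_enclosed / M_SUN:.1e} solar masses")
# ~3e11 solar masses -- far more than the luminous matter alone accounts for,
# which is the kind of discrepancy that points to a dark matter halo.
```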

By the 1990s it was becoming clear that a model containing just CDM, ordinary matter, and photons couldn’t account for all the observed properties of the universe, notably the value of the Hubble constant. The result is Lambda CDM, the cosmological model that describes the universe with six precisely measured parameters and accounts for the 1998 discovery that the universe’s expansion is accelerating. Peebles was one of the theorists to propose resurrecting Albert Einstein’s once-discarded cosmological constant to describe the newly discovered dark energy, which makes up more than two-thirds of the mass–energy content of the universe.

Ushering in the exoplanet era

To appreciate the contribution of Mayor and Queloz, consider that in 1995 the least massive known object outside the solar system was a star of 0.08 solar masses; Jupiter, for comparison, is about 0.001 solar masses. Mayor was part of a team that in 1989 reported the probable detection of an object 11 times as massive as Jupiter that could be classified as either a very large planet or a brown dwarf. Pennsylvania State University astronomer Jason Wright says that other teams amassed preliminary evidence of extrasolar planets, but it was unconvincing and led planetary scientist William Cochran to declare, “Thou shalt not embarrass thyself and thy colleagues by claiming false planets.”

In 1992 Alexander Wolszczan and his colleagues discovered two planets orbiting the pulsar PSR B1257+12 via timing variations in the dead star’s radio beacon. (A third later found around the same pulsar remains the lowest-mass exoplanet yet discovered.) The discovery showed that exoplanets are out there, but the question remained of how common they are around stars like the Sun, where well-placed ones would presumably have the potential to support life.

At the Haute-Provence Observatory in southeastern France, Mayor and his graduate student Queloz conducted a survey of 142 stars using a spectrograph called ELODIE, which they designed to enable the observation of fainter stars than had previously been surveyed. The researchers’ approach, first proposed in 1952 by Otto Struve, was to detect the Doppler shift in the stellar spectrum due to the star’s motion as it is pushed and pulled by an orbiting planet. The expected stellar wobble due to a planet’s tug was on the order of 10 m/s; even now, the best spectrometers have a resolution of about 1000 m/s, Hogg says. Mayor and Queloz needed to be able to pinpoint a shift that accounted for a hundredth, or even a thousandth, of a pixel.
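To get a feel for those numbers, the non-relativistic Doppler relation Δλ/λ = v/c turns a 10 m/s stellar wobble into a tiny wavelength shift. The sketch below assumes an observing wavelength of 500 nm and treats "1000 m/s" as the velocity span of one detector pixel; both are illustrative choices, not ELODIE's actual specifications.

```python
# Rough size of the radial-velocity signal Mayor and Queloz had to detect.
# Assumed numbers: 500 nm observing wavelength, 1000 m/s per detector pixel.

C = 2.998e8          # speed of light, m/s

v_star = 10.0        # stellar reflex velocity, m/s (order of magnitude quoted above)
wavelength = 500e-9  # assumed observing wavelength, m

# Non-relativistic Doppler shift: delta_lambda / lambda = v / c
delta_lambda = wavelength * v_star / C
print(f"Wavelength shift: {delta_lambda * 1e9:.2e} nm")   # ~1.7e-5 nm

pixel_velocity_span = 1000.0  # assumed velocity width of one pixel, m/s
shift_in_pixels = v_star / pixel_velocity_span
print(f"Shift in pixels: {shift_in_pixels:.3f}")          # ~0.01 of a pixel
```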

That’s exactly what they did through analysis of the signal from 51 Pegasi, a star located about 50 light-years away in the constellation Pegasus. The Doppler shift was consistent with the motion of a Jupiter-mass planet in a four-day orbit at 0.05 astronomical units, far shorter than the distance between Mercury and the Sun. The discovery of a “hot Jupiter” was surprising but also helpful, as the short period enabled Mayor and Queloz, and competing groups, to easily conduct follow-up observations. The astronomers announced their discovery at a conference in Italy almost exactly 24 years ago, on 6 October 1995, and soon published their result in Nature. Another group promptly confirmed the finding.
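The quoted orbit is internally consistent: Kepler's third law for a planet around a one-solar-mass star gives a period of roughly four days at 0.05 astronomical units. A quick check, assuming a host star of one solar mass and neglecting the planet's own mass:

```python
# Kepler's third law check for 51 Pegasi b's reported orbit.
# Assumes a host star of one solar mass and neglects the planet's own mass.

import math

G = 6.674e-11        # m^3 kg^-1 s^-2
M_SUN = 1.989e30     # kg
AU = 1.496e11        # m
DAY = 86400.0        # s

a = 0.05 * AU        # reported semimajor axis

# P = 2 * pi * sqrt(a^3 / (G * M))
period = 2 * math.pi * math.sqrt(a**3 / (G * M_SUN))
print(f"Orbital period: {period / DAY:.1f} days")   # ~4 days
```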

“It’s a discovery that has completely changed our view of who we are,” says Yale University astronomer Debra Fischer. “And it came at a time when we thought that maybe there weren’t many planets around other stars.”

However, the astronomy community wasn’t yet convinced by Mayor and Queloz’s claim. Many researchers didn’t think it was possible for such a massive planet to either form so close to the star or migrate inward without getting incinerated. Theorists proposed that the observed stellar wobbles might not be caused by an exoplanet at all, but rather by phenomena such as stellar brightness oscillations. But even the most skeptical came around in 1999, with discoveries of the first multi-exoplanet system by Fischer and colleagues, and of HD 209458 b. That planet was detected via the drop in brightness it caused when it passed in front of its star.

The early planet confirmations convinced observatory directors to build and install spectrographs. They also ultimately helped coax NASA to greenlight the development of a space telescope proposal that had been languishing for decades, a mission called Kepler. That satellite, which was launched in 2009, and instruments such as the Transiting Exoplanet Survey Satellite have detected thousands of planets and planet candidates.

Nearly a quarter century after Mayor and Queloz’s discovery, exoplanet science is a powerhouse endeavor that engages a significant percentage of the astrophysics community. Researchers join the field to study not only the planets but also the stars they orbit, which in turn has led to new insights in stellar astrophysics. By pairing transit measurements, which determine planets’ radii, with radial velocity, which provides masses, researchers have determined that many of the galaxy’s planets don’t resemble those in our solar system. The lack of resemblance challenges theories of planet formation and extends the range of planetary types that theories have to accommodate.
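A simple example of what that pairing gives you: a planet's mean density follows directly from the radial-velocity mass and the transit radius, and density is what separates rocky worlds from gas-rich ones. The numbers below are hypothetical, chosen only to illustrate the calculation.

```python
# Bulk density from a hypothetical transit radius and radial-velocity mass.

import math

M_EARTH = 5.972e24   # kg
R_EARTH = 6.371e6    # m

mass = 5.0 * M_EARTH      # hypothetical radial-velocity mass
radius = 1.8 * R_EARTH    # hypothetical transit radius

density = mass / (4.0 / 3.0 * math.pi * radius**3)
print(f"Mean density: {density / 1000:.2f} g/cm^3")   # Earth is ~5.5 g/cm^3

# A density below Earth's despite the larger mass hints at a volatile-rich
# envelope rather than a purely rocky composition (a rocky planet this massive
# would be compressed to a density higher than Earth's).
```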

The most tantalizing goal of the field set in motion by Mayor and Queloz is to find planets that resemble Earth and to detect biosignatures. Researchers are already probing the atmospheres of individual worlds using the Hubble Space Telescope and other tools. Next-generation instruments, particularly the James Webb Space Telescope and the Wide Field Infrared Survey Telescope, will aid in that effort.

Orion at the Institute of Physics

via Instagram http://bit.ly/2DGSPaI

It was great to have been able to attend a lecture at the new home of the Institute of Physics. I have been a member for almost two decades, and I have even served as an officer for one of the interest groups (the Computational Physics Group, if you must know).

The event was a talk by Stephen Hilton from the School of Pharmacy, UCL, titled ‘3D Printing and its Application in Chemistry and Pharmacy’. It was a very useful talk, covering applications ranging from teaching and cost savings in chemistry labs to personalised medicine and chemistry itself.

As for the building, it was nice to finally see the end result, with a hint of brutalist architecture and some nice details, such as the electromagnetic wave diagram in some of the windows and Orion on the ceiling!

A new Bose-Einstein condensate

Originally published here.


Although Bose-Einstein condensation has been observed in several systems, the limits of the phenomenon need to be pushed further: to faster timescales, higher temperatures, and smaller sizes. The easier it becomes to create these condensates, the more exciting routes open up for new technological applications. New light sources, for example, could be extremely small in size and allow fast information processing.

In experiments by Aalto researchers, the condensed particles were mixtures of light and electrons in motion in gold nanorods arranged into a periodic array. Unlike most previous Bose-Einstein condensates created experimentally, the new condensate does not need to be cooled down to temperatures near absolute zero. Because the particles are mostly light, the condensation could be induced at room temperature.

‘The gold nanoparticle array is easy to create with modern nanofabrication methods. Near the nanorods, light can be focused into tiny volumes, even below the wavelength of light in vacuum. These features offer interesting prospects for fundamental studies and applications of the new condensate,’ says Academy Professor Päivi Törmä.

The main hurdle in acquiring proof of the new kind of condensate is that it comes into being extremely quickly. ‘According to our theoretical calculations, the condensate forms in only a picosecond,’ says doctoral student Antti Moilanen. ‘How could we ever verify the existence of something that only lasts one trillionth of a second?’

Turning distance into time

A key idea was to initiate the condensation process with a kick so that the particles forming the condensate would start to move.

‘As the condensate takes form, it will emit light throughout the gold nanorod array. By observing the light, we can monitor how the condensation proceeds in time. This is how we can turn distance into time,’ explains staff scientist Tommi Hakala.

The light that the condensate emits is similar to laser light. ‘We can alter the distance between each nanorod to control whether Bose-Einstein condensation or the formation of ordinary laser light occurs. The two are closely related phenomena, and being able to distinguish between them is crucial for fundamental research. They also promise different kinds of technological applications,’ explains Professor Törmä.

Both lasing and Bose-Einstein condensation provide bright beams, but the coherences of the light they offer have different properties. These, in turn, affect the ways the light can be tuned to meet the requirements of a specific application. The new condensate can produce light pulses that are extremely short and may offer faster speeds for information processing and imaging applications. Academy Professor Törmä has already obtained a Proof of Concept grant from the European Research Council to explore such prospects.

Materials provided by Aalto University. Note: Content may be edited for style and length.

Journal Reference:

1. Tommi K. Hakala, Antti J. Moilanen, Aaro I. Väkeväinen, Rui Guo, Jani-Petri Martikainen, Konstantinos S. Daskalakis, Heikki T. Rekola, Aleksi Julku, Päivi Törmä. Bose–Einstein condensation in a plasmonic lattice. Nature Physics, 2018; DOI: 10.1038/s41567-018-0109-9

New quantum method generates really random numbers

Originally appeared in ScienceDaily, 11 April 2018.

Researchers at the National Institute of Standards and Technology (NIST) have developed a method for generating numbers guaranteed to be random by quantum mechanics. Described in the April 12 issue of Nature, the experimental technique surpasses all previous methods for ensuring the unpredictability of its random numbers and may enhance security and trust in cryptographic systems.

The new NIST method generates digital bits (1s and 0s) with photons, or particles of light, using data generated in an improved version of a landmark 2015 NIST physics experiment. That experiment showed conclusively that what Einstein derided as “spooky action at a distance” is real. In the new work, researchers process the spooky output to certify and quantify the randomness available in the data and generate a string of much more random bits.

Random numbers are used hundreds of billions of times a day to encrypt data in electronic networks. But these numbers are not certifiably random in an absolute sense. That’s because they are generated by software formulas or physical devices whose supposedly random output could be undermined by factors such as predictable sources of noise. Running statistical tests can help, but no statistical test on the output alone can absolutely guarantee that the output was unpredictable, especially if an adversary has tampered with the device.

“It’s hard to guarantee that a given classical source is really unpredictable,” NIST mathematician Peter Bierhorst said. “Our quantum source and protocol is like a fail-safe. We’re sure that no one can predict our numbers.”

“Something like a coin flip may seem random, but its outcome could be predicted if one could see the exact path of the coin as it tumbles. Quantum randomness, on the other hand, is real randomness. We’re very sure we’re seeing quantum randomness because only a quantum system could produce these statistical correlations between our measurement choices and outcomes.”

The new quantum-based method is part of an ongoing effort to enhance NIST’s public randomness beacon, which broadcasts random bits for applications such as secure multiparty computation. The NIST beacon currently relies on commercial sources.

Quantum mechanics provides a superior source of randomness because measurements of some quantum particles (those in a “superposition” of both 0 and 1 at the same time) have fundamentally unpredictable results. Researchers can easily measure a quantum system. But it’s hard to prove that measurements are being made of a quantum system and not a classical system in disguise.

In NIST’s experiment, that proof comes from observing the spooky quantum correlations between pairs of distant photons while closing the “loopholes” that might otherwise allow non-random bits to appear to be random. For example, the two measurement stations are positioned too far apart to allow hidden communications between them; by the laws of physics any such exchanges would be limited to the speed of light.

Random numbers are generated in two steps. First, the spooky action experiment generates a long string of bits through a “Bell test,” in which researchers measure correlations between the properties of the pairs of photons. The timing of the measurements ensures that the correlations cannot be explained by classical processes such as pre-existing conditions or exchanges of information at, or slower than, the speed of light. Statistical tests of the correlations demonstrate that quantum mechanics is at work, and these data allow the researchers to quantify the amount of randomness present in the long string of bits.
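One common way to quantify such correlations is the CHSH form of the Bell test, in which a combination S of correlation functions satisfies |S| ≤ 2 for any classical (local hidden variable) model but can reach 2√2 quantum mechanically. The sketch below evaluates S for ideal polarization-entangled photons at textbook measurement angles; it is an idealized illustration, not the statistical analysis NIST actually performed.

```python
# Idealized CHSH value for maximally entangled photon pairs.
# For analyzer angles a and b, quantum mechanics predicts the correlation
# E(a, b) = cos(2 * (a - b)) for polarization-entangled photons.

import math

def E(a, b):
    """Quantum correlation function for ideal entangled photon pairs."""
    return math.cos(2.0 * (a - b))

# Textbook angle choices (radians) that maximize the violation.
a, a_prime = 0.0, math.pi / 4
b, b_prime = math.pi / 8, 3 * math.pi / 8

S = E(a, b) - E(a, b_prime) + E(a_prime, b) + E(a_prime, b_prime)
print(f"CHSH S = {S:.3f}")   # ~2.828 = 2*sqrt(2), above the classical bound of 2
```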

That randomness may be spread very thin throughout the long string of bits. For example, nearly every bit might be 0 with only a few being 1. To obtain a short, uniform string with concentrated randomness such that each bit has a 50/50 chance of being 0 or 1, a second step called “extraction” is performed. NIST researchers developed software to process the Bell test data into a shorter string of bits that are nearly uniform; that is, with 0s and 1s equally likely. The full process requires the input of two independent strings of random bits to select measurement settings for the Bell tests and to “seed” the software to help extract the randomness from the original data. NIST researchers used a conventional random number generator to generate these input strings.
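The extraction step can be illustrated with a standard construction from the randomness-extraction literature: Toeplitz hashing, in which a seed defines a binary Toeplitz matrix that maps a long, weakly random bit string to a much shorter, nearly uniform one. This is only a generic sketch of the idea, not the specific extractor NIST implemented.

```python
# Generic seeded randomness extractor based on Toeplitz hashing over GF(2).
# Sketch of the idea only; not the extractor used in the NIST experiment.

import numpy as np

def toeplitz_extract(raw_bits, seed_bits, out_len):
    """Map n weakly random bits to out_len nearly uniform bits.

    The seed supplies the n + out_len - 1 entries of a binary Toeplitz
    matrix T; the output is T @ raw_bits, with arithmetic modulo 2.
    """
    n = len(raw_bits)
    assert len(seed_bits) == n + out_len - 1
    out = np.zeros(out_len, dtype=np.uint8)
    for i in range(out_len):
        # Row i of the Toeplitz matrix is a shifted window of the seed.
        row = seed_bits[i : i + n][::-1]
        out[i] = np.bitwise_xor.reduce(row & raw_bits)
    return out

# Toy example: heavily biased raw bits, independently chosen seed.
rng = np.random.default_rng(0)
raw = (rng.random(4096) < 0.05).astype(np.uint8)          # mostly 0s, a few 1s
seed = rng.integers(0, 2, size=4096 + 128 - 1, dtype=np.uint8)
out = toeplitz_extract(raw, seed, 128)
print(out[:32], "fraction of 1s:", out.mean())             # close to 0.5
```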

From 55,110,210 trials of the Bell test, each of which produces two bits, researchers extracted 1,024 bits certified to be uniform to within one trillionth of 1 percent.

“A perfect coin toss would be uniform, and we made 1,024 bits almost perfectly uniform, each extremely close to equally likely to be 0 or 1,” Bierhorst said.

Other researchers have previously used Bell tests to generate random numbers, but the NIST method is the first to use a loophole-free Bell test and to process the resulting data through extraction. Extractors and seeds are already used in classical random number generators; in fact, random seeds are essential in computer security and can be used as encryption keys.

In the new NIST method, the final numbers are certified to be random even if the measurement settings and seed are publicly known; the only requirement is that the Bell test experiment be physically isolated from customers and hackers. “The idea is you get something better out (private randomness) than what you put in (public randomness),” Bierhorst said.

Story Source:

Materials provided by the National Institute of Standards and Technology (NIST). Note: Content may be edited for style and length.


Journal Reference:

1. Peter Bierhorst, Emanuel Knill, Scott Glancy, Yanbao Zhang, Alan Mink, Stephen Jordan, Andrea Rommal, Yi-Kai Liu, Bradley Christensen, Sae Woo Nam, Martin J. Stevens, Lynden K. Shalm. Experimentally Generated Randomness Certified by the Impossibility of Superluminal Signals. Nature, 2018; DOI: 10.1038/s41586-018-0019-0