Working remotely…

Empty offices seem to be the new norm.

The current situation with #COVID has definitely accelerated a trend that was slowly being embraced. There are some pros and cons.

The chief executive of Barclays, Jes Staley, has suggested that piling 7,000 employees into a tall building “may be a thing of the past”. Large numbers of businesses have reported that productivity has been maintained, or even enhanced, as people work from home. Firms’ accountants will have taken note.

However, the normalisation of remote working will also have an impact on younger workers who, though digitally native, are also more likely to live in cramped circumstances and miss the kind of formative experiences that happen in a physical workplace.

Let there be light: Florence Nightingale

This year, 2020, the word Nightingale has acquired new connotations. It no longer refers just to a passerine bird with a beautiful and powerful song; it is also the name that NHS England has given to the temporary hospitals set up for the COVID-19 pandemic. In normal circumstances it would be a very good name for a hospital, but given the situation it becomes more poignant, all the more so because 2020 is the bicentenary of Florence Nightingale’s birth.

Florence Nightingale was born on 12th May 1820 in Florence, Italy (hence the name!) and became a social reformer, statistician, and the founder of modern nursing. She was the first woman to be elected a member of the Royal Statistical Society and, in 1874, was made an honorary member of the American Statistical Association.

With the power of data, Nightingale was able to save lives and change policy. Her analysis of data from the Crimean War was compelling and persuasive in its simplicity. It allowed her and her team to pay attention to time – tracking admissions to hospital and, crucially, deaths – on a month-by-month basis. We must remember that the statistical tests we know today were not yet established tools, and the workhorse of statistics, regression, was still decades in the future. The data analysis, presented in columns and rows, was supported by powerful graphics that many of us still admire today.
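
To make that month-by-month tabulation concrete, here is a minimal Python sketch; the admissions and death figures are invented purely for illustration and are not Nightingale’s actual numbers.

```python
import pandas as pd

# Invented figures, purely illustrative -- not Nightingale's actual numbers.
df = pd.DataFrame({
    "month":      ["1854-10", "1854-11", "1854-12", "1855-01"],
    "admissions": [1500, 2100, 2400, 2800],
    "deaths":     [120, 290, 480, 670],
})

# Month-by-month mortality as a share of admissions: the kind of simple
# rows-and-columns comparison that made her case so persuasive.
df["mortality_rate"] = (df["deaths"] / df["admissions"]).round(3)
print(df)
```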

In 2014 I had an opportunity to admire her Nightingale Roses or, to use their formal name, polar area charts, in the exhibition Science is Beautiful at the British Library.

Florence Nightingale’s “rose diagram”, showing the Causes of Mortality in the Army in the East, 1858. Photograph: British Library

These and other charts were used in the report she later published in 1858 under the title “Notes on Matters Affecting the Health, Efficiency, and Hospital Administration of the British Army”. The report included charts of deaths by barometric pressure and temperature, showing that deaths were higher in hotter months than in cooler ones. In the polar charts shown above, Nightingale presents the decrease in death rates that had been achieved. Let’s read it in her own hand; here is the note accompanying the chart above:

The areas of the blue, red & black wedges are each measured from the centre as the common vertex.

The blue wedges measured from the centre of the circle represent area for area the deaths from Preventible or Mitigable Zymotic diseases, the red wedges measured from the centre the deaths from wounds, & the black wedges measured from the centre the deaths from all other causes.

The black line across the red triangle in Nov. 1854 marks the boundary of the deaths from all other causes during the month.

In October 1854, & April 1855, the black area coincides with the red, in January & February 1855, the blue area coincides with the black.

The entire areas may be compared by following the blue, the red & the black lines enclosing them.
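
For readers who would like to recreate the style of her chart, here is a minimal matplotlib sketch of a polar area (“rose”) chart in Python. The monthly counts below are invented purely for illustration (they are not Nightingale’s data) and, as in her note above, each wedge’s area rather than its radius is made proportional to the count.

```python
import numpy as np
import matplotlib.pyplot as plt

# Invented monthly death counts, purely illustrative -- not Nightingale's data.
months = ["Apr", "May", "Jun", "Jul", "Aug", "Sep",
          "Oct", "Nov", "Dec", "Jan", "Feb", "Mar"]
zymotic = np.array([105, 95, 80, 140, 230, 280, 300, 480, 520, 380, 240, 160])
wounds = np.array([5, 10, 12, 25, 30, 35, 45, 80, 50, 40, 30, 20])
other = np.array([30, 25, 25, 35, 45, 50, 70, 100, 90, 70, 55, 40])

# One wedge per month, all drawn from the common centre (vertex).
theta = np.linspace(0, 2 * np.pi, len(months), endpoint=False)
width = 2 * np.pi / len(months)

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
for counts, colour, label in [(zymotic, "tab:blue", "Zymotic diseases"),
                              (wounds, "tab:red", "Wounds"),
                              (other, "black", "All other causes")]:
    # For a wedge of fixed angle, area grows with the radius squared, so taking
    # the square root of the count keeps each wedge's area proportional to it.
    ax.bar(theta, np.sqrt(counts), width=width, bottom=0.0,
           color=colour, alpha=0.5, label=label)

ax.set_xticks(theta)
ax.set_xticklabels(months)
ax.set_yticklabels([])
ax.legend(loc="upper right", bbox_to_anchor=(1.3, 1.1))
plt.show()
```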

Nightingale recognised that soldiers were dying from other causes: malnutrition, poor sanitation, and lack of activity. Her aim was to improve the conditions of wounded soldiers and improve their chances of survival. This evidence later helped put the focus on the importance of patient welfare.

Once the war was over, Florence Nightingale returned home, but her quest did not finish there. She continued her work to improve conditions in hospitals. She became a star in her own time, and with time the legend of “The Lady with the Lamp” solidified in the national and international consciousness. You may have heard of her in the 1857 poem by Henry Wadsworth Longfellow called “Santa Filomena”:

Lo! in that house of misery
A lady with a lamp I see
Pass through the glimmering gloom,
And flit from room to room

Today, Nightingale’s lamp continues to bring hope to her patients: not just to those working and being treated in the NHS Nightingale hospitals, but to all of us through the metaphorical light of rational optimism. Let there be light.

2018 – A review

It is that time of year when we get to look back at what we have achieved and to consider what the next year will bring. This may be of interest only to me, so please accept my apologies… Here we go:

In no particular order:

  • I signed up with my publisher Taylor & Francis to write volume 2 of my “Data Science and Analytics with Python” book
  • During the year I had opportunities to attend some great events, such as the EGG Conference by Dataiku and the BBC Machine Learning Fireside Chats, as well as multiple events with the Turing Institute
  • I continued delivering training at General Assembly, reaching out to people interested in learning more about Python and Data Science. It has been an interesting year and it is great to see what former students are currently doing with the skills learnt
  • The work delivered for companies such as Louis Vuitton, Volvo, Foster & Partners, and others was fantastic. I am also very proud to have tackled some strategy work for the Mayo Clinic and to have delivered a presentation in a lecture theatre at Mayo
  • I contributed to some open source software projects
  • It was a busy year in terms of speaking engagements: I delivered keynotes at Entrepares 2018 and the IV Seminario de Periodismo Iberoamericano de Ciencia Tecnología e Innovación, both in Puebla, Mexico. I also ran an Introduction to Data Science workshop at ODSC18 in London and an Introduction to Python workshop at Entrepares 2018, and gave a talk about Data Science Practices at Google Campus in London. The interactive Q&A session was a fun way to answer queries from the audience. I was also a member of various debate panels
  • I rekindled playing board games with a couple of good friends of mine, and it has been a geeky blast!
  • I started a new role and am still looking to get my foot in the door with Apple
  • I’ve been delving more into Machine Learning systems and platforms, learning about interpretability, reliability, monitoring, and more. There is still plenty more to learn
  • I met Chris Robshaw and attended a bunch of rugby matches through the year

Looking forward to 2019, learning and developing more.

New quantum method generates really random numbers

Originally appeared in ScienceDaily, 11 April 2018.

Researchers at the National Institute of Standards and Technology (NIST) have developed a method for generating numbers guaranteed to be random by quantum mechanics. Described in the April 12 issue of Nature, the experimental technique surpasses all previous methods for ensuring the unpredictability of its random numbers and may enhance security and trust in cryptographic systems.

The new NIST method generates digital bits (1s and 0s) with photons, or particles of light, using data generated in an improved version of a landmark 2015 NIST physics experiment. That experiment showed conclusively that what Einstein derided as “spooky action at a distance” is real. In the new work, researchers process the spooky output to certify and quantify the randomness available in the data and generate a string of much more random bits.

Random numbers are used hundreds of billions of times a day to encrypt data in electronic networks. But these numbers are not certifiably random in an absolute sense. That’s because they are generated by software formulas or physical devices whose supposedly random output could be undermined by factors such as predictable sources of noise. Running statistical tests can help, but no statistical test on the output alone can absolutely guarantee that the output was unpredictable, especially if an adversary has tampered with the device.

“It’s hard to guarantee that a given classical source is really unpredictable,” NIST mathematician Peter Bierhorst said. “Our quantum source and protocol is like a fail-safe. We’re sure that no one can predict our numbers.”

“Something like a coin flip may seem random, but its outcome could be predicted if one could see the exact path of the coin as it tumbles. Quantum randomness, on the other hand, is real randomness. We’re very sure we’re seeing quantum randomness because only a quantum system could produce these statistical correlations between our measurement choices and outcomes.”

The new quantum-based method is part of an ongoing effort to enhance NIST’s public randomness beacon, which broadcasts random bits for applications such as secure multiparty computation. The NIST beacon currently relies on commercial sources.

Quantum mechanics provides a superior source of randomness because measurements of some quantum particles (those in a “superposition” of both 0 and 1 at the same time) have fundamentally unpredictable results. Researchers can easily measure a quantum system. But it’s hard to prove that measurements are being made of a quantum system and not a classical system in disguise.

In NIST’s experiment, that proof comes from observing the spooky quantum correlations between pairs of distant photons while closing the “loopholes” that might otherwise allow non-random bits to appear to be random. For example, the two measurement stations are positioned too far apart to allow hidden communications between them; by the laws of physics any such exchanges would be limited to the speed of light.

Random numbers are generated in two steps. First, the spooky action experiment generates a long string of bits through a “Bell test,” in which researchers measure correlations between the properties of the pairs of photons. The timing of the measurements ensures that the correlations cannot be explained by classical processes such as pre-existing conditions or exchanges of information at, or slower than, the speed of light. Statistical tests of the correlations demonstrate that quantum mechanics is at work, and these data allow the researchers to quantify the amount of randomness present in the long string of bits.
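
NIST’s loophole-free Bell test involves far more statistical care than can be shown here, but the flavour of the correlation check can be sketched with the standard CHSH statistic. The trial format below is an illustrative assumption, not NIST’s actual data format.

```python
import numpy as np

def chsh_statistic(trials):
    """Estimate the CHSH value S from a list of Bell-test trials.

    Each trial is a tuple (a, b, x, y): the measurement settings a, b
    (each 0 or 1) chosen at the two stations, and the outcomes x, y
    (each +1 or -1) recorded there."""
    E = {}
    for a in (0, 1):
        for b in (0, 1):
            products = [x * y for (ai, bi, x, y) in trials if ai == a and bi == b]
            E[(a, b)] = np.mean(products) if products else 0.0
    # Any local (classical) model satisfies |S| <= 2; quantum correlations can
    # push S up to 2*sqrt(2) ~= 2.83, which is what certifies that genuine
    # randomness is present in the outcomes once the loopholes are closed.
    return E[(0, 0)] + E[(0, 1)] + E[(1, 0)] - E[(1, 1)]
```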

That randomness may be spread very thin throughout the long string of bits. For example, nearly every bit might be 0 with only a few being 1. To obtain a short, uniform string with concentrated randomness such that each bit has a 50/50 chance of being 0 or 1, a second step called “extraction” is performed. NIST researchers developed software to process the Bell test data into a shorter string of bits that are nearly uniform; that is, with 0s and 1s equally likely. The full process requires the input of two independent strings of random bits to select measurement settings for the Bell tests and to “seed” the software to help extract the randomness from the original data. NIST researchers used a conventional random number generator to generate these input strings.
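
NIST’s extraction software is considerably more sophisticated and comes with formal guarantees, but the general idea of seeded extraction can be sketched with Toeplitz hashing. Everything below (the sizes, the bias of the raw bits, the use of scipy) is an illustrative assumption rather than the NIST implementation, and a real extractor would also need a certified estimate of how much randomness the raw bits actually contain.

```python
import numpy as np
from scipy.linalg import toeplitz

def toeplitz_extract(raw_bits, seed_bits, out_len):
    """Condense weakly random raw_bits into out_len nearly uniform bits by
    multiplying them with a random Toeplitz matrix built from an independent
    uniform seed (Toeplitz hashing)."""
    n = len(raw_bits)
    # An (out_len x n) Toeplitz matrix is defined by its first column and first
    # row, which share one entry, so it needs out_len + n - 1 seed bits.
    assert len(seed_bits) == out_len + n - 1
    col = seed_bits[:out_len]
    row = np.concatenate(([seed_bits[0]], seed_bits[out_len:]))
    T = toeplitz(col, row)            # entries are 0s and 1s
    return T.dot(raw_bits) % 2        # matrix-vector product over GF(2)

# Toy usage: heavily biased raw bits in, a shorter nearly uniform string out.
rng = np.random.default_rng(0)
raw = (rng.random(1000) < 0.05).astype(int)          # mostly 0s, a few 1s
seed = rng.integers(0, 2, size=64 + len(raw) - 1)    # independent seed bits
out = toeplitz_extract(raw, seed, out_len=64)
```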

From 55,110,210 trials of the Bell test, each of which produces two bits, researchers extracted 1,024 bits certified to be uniform to within one trillionth of 1 percent.

“A perfect coin toss would be uniform, and we made 1,024 bits almost perfectly uniform, each extremely close to equally likely to be 0 or 1,” Bierhorst said.

Other researchers have previously used Bell tests to generate random numbers, but the NIST method is the first to use a loophole-free Bell test and to process the resulting data through extraction. Extractors and seeds are already used in classical random number generators; in fact, random seeds are essential in computer security and can be used as encryption keys.

In the new NIST method, the final numbers are certified to be random even if the measurement settings and seed are publicly known; the only requirement is that the Bell test experiment be physically isolated from customers and hackers. “The idea is you get something better out (private randomness) than what you put in (public randomness),” Bierhorst said.

Story Source:

Materials provided by the National Institute of Standards and Technology (NIST). Note: Content may be edited for style and length.


Journal Reference:

  1. Peter Bierhorst, Emanuel Knill, Scott Glancy, Yanbao Zhang, Alan Mink, Stephen Jordan, Andrea Rommal, Yi-Kai Liu, Bradley Christensen, Sae Woo Nam, Martin J. Stevens, Lynden K. Shalm. Experimentally Generated Randomness Certified by the Impossibility of Superluminal Signals. Nature, 2018. DOI: 10.1038/s41586-018-0019-0

16 things we learned in 2016

1. It’s not human versus machine, it’s humans and machines. As the fourth industrial revolution unfolds, experts reckon artificial intelligence and robotics have the greatest potential, but need the most human oversight.

2. Our brains label someone as an ‘outsider’ or part of ‘our group’ within 170 thousandths of a second. The neuroscience of populism runs deep, but advances in understanding the brain could drive huge progress.

3. Young people are more comfortable with globalization than the old. Our Global Shapers Survey of over 26,000 young people revealed that they are most concerned with upholding the open ideals of global citizenship, and worried about corruption, climate change and a lack of opportunity.

4. Social media is completely transforming politics. But we don’t yet know whether it is a threat to democracy or not.

5. Mediocre is the new normal. At least as far as economic growth is concerned. That was the view from top economists, looking at the world from China, just after the UK voted to leave the EU.

6. Economists’ tool boxes are emptying. According to the Global Competitiveness Report, monetary stimulus doesn’t work if economies are not competitive, and innovation is increasingly important.

7. Some countries are more innovative than others: Singapore, Finland, and Sweden amongst them.

8. But the best countries for living both well and sustainably are neither rich nor European.

9. There may be some easy ways to boost growth. Like giving eyeglasses to those who need them.

10. We won’t have gender equality in the workplace until 2186. Yes, 2186: 170 years from now as progress slips backwards.

11. The global economy is failing 35% of the world’s talent. Our Human Capital Report found that only 65% of people are fulfilling their potential through education, skills and work.

12. Skills are changing. Fast. Many of the 10 most in-demand skills didn’t even exist a decade ago, but some very basic human traits, like sharing and negotiating, will never go out of fashion.

13. The future of finance is blockchain. This technology, perhaps the buzzword of the year, is set to revolutionise how money flows around the globe. Other emerging technologies set to shake up the world include the Internet of Nanothings.

14. Emerging markets will power global growth next year and beyond. China’s economy is gliding smoothly off its peaks; Africa’s growth story is only just beginning; India’s progress will be powered by manufacturing; and innovation is transforming Latin America.

15. In 2016, and probably for all eternity, people wanted to know what would make them more successful, happier, and better leaders.

16. And… being bored is good for you.

Nobel Prize in Physics 2016: Exotic States of Matter

Yesterday the 2016 Nobel Prize in Physics was announced. I immediately got a few tweets asking for more information about these “exotic” states of matter and for an explanation of what they are… Well, in short, the prize was awarded for theoretical discoveries that help scientists understand unusual properties of materials, such as superconductivity and superfluidity, that arise at low temperatures.

Physics Nobel 2016

The prize was awarded jointly to David J. Thouless of the University of Washington in Seattle, F. Duncan M. Haldane of Princeton University in New Jersey, and J. Michael Kosterlitz of Brown University in Rhode Island. The citation from the Swedish Academy reads: “for theoretical discoveries of topological phase transitions and topological phases of matter.”

“Topo…what?” – I hear you cry… well let us start at the beginning…

Thouless, Haldane and Kosterlitz work in a field of physics known as condensed matter physics, which is concerned with the physical properties of “condensed” phases of matter such as solids and liquids. You may not know it, but results from research in condensed matter physics are what let you store so much data on your computer’s hard drive: the discovery of giant magnetoresistance made it possible.

The discoveries that the Nobel Committee are highlighting with the prize provide a better understanding of phases of matter such as superconductors, superfluids and thin magnetic films, and are now guiding the quest for next-generation materials for electronics, quantum computing and more. The laureates developed mathematical models to describe the topological properties of materials in relation to phenomena such as superconductivity, superfluidity and peculiar magnetic behaviour.

Once again that word: “topology”…

So, we know that all matter is formed by atoms. Nonetheless, matter can have different properties and appear in different forms, such as solid, liquid, superfluid, magnet, etc. These various forms of matter are often called states of matter or phases. According to condensed matter physics, the different properties of materials originate from the different ways in which the atoms are organised in the materials. Those different organisations of the atoms (or other particles) are formally called the orders in the materials. Topological order is a type of order in zero-temperature phases of matter (also known as quantum matter). In general, topology is the study of geometrical properties and spatial relations that are unaffected by the continuous change of shape or size of figures. In our case, we are talking about properties of matter that remain unchanged when the object is flattened or expanded.

Although research originally focused on topological properties in 1-D and 2-D materials, researchers have discovered them in 3-D materials as well. These results are particularly important as they enable us to understand “exotic” phenomena such as superconductivity, the property of matter that lets electrons travel through materials with zero resistance, and superfluidity, which lets fluids flow with zero loss of kinetic energy. Currently, one of the most researched topics in the area is the study of topological insulators, superconductors and metals.

Here is a report from Physics Today about the Nobel Prize announcement:

Thouless, Haldane, and Kosterlitz share 2016 Nobel Prize in Physics

David Thouless, Duncan Haldane, and Michael Kosterlitz are to be awarded the 2016 Nobel Prize in Physics for their work on topological phases and phase transitions, the Royal Swedish Academy of Sciences announced on Tuesday. Thouless, of the University of Washington in Seattle, will receive half the 8 million Swedish krona (roughly $925 000) prize; Haldane, of Princeton University, and Kosterlitz, of Brown University, will split the other half.

This year’s laureates used the mathematical branch of topology to make revolutionary contributions to their field of condensed-matter physics. In 1972 Thouless and Kosterlitz identified a phase transition that opened up two-dimensional systems as a playground for observing superconductivity, superfluidity, and other exotic phenomena. A decade later Haldane showed that topology is important in considering the properties of 1D chains of magnetic atoms. Then in the 1980s Thouless and Haldane demonstrated that the unusual behavior exhibited in the quantum Hall effect can emerge without a magnetic field.

From early on it was clear that the laureates’ work would have important implications for condensed-matter theory. Today experimenters are studying 2D superconductors and topological insulators, which are insulating in the bulk yet channel spin-polarized currents on their surfaces without resistance (see Physics Today, January 2010, page 33). The research could lead to improved electronics, robust qubits for quantum computers, and even an improved understanding of the standard model of particle physics.

Vortices and the KT transition

When Thouless and Kosterlitz first collaborated in the early 1970s, the conventional wisdom was that thermal fluctuations in 2D materials precluded the emergence of ordered phases such as superconductivity. The researchers, then at the University of Birmingham in England, dismantled that argument by investigating the interactions within a 2D lattice.

Thouless and Kosterlitz considered an idealized array of spins that is cooled to nearly absolute zero. At first the system lacks enough thermal energy to create defects, which in the model take the form of localized swirling vortices. Raising the temperature spurs the development of tightly bound pairs of oppositely rotating vortices. The coherence of the entire system depends logarithmically on the separation between vortices. As the temperature rises further, more vortex pairs pop up, and the separation between partners grows.

The two scientists’ major insight came when they realized they could model the clockwise and counterclockwise vortices as positive and negative electric charges. The more pairs that form, the more interactions are disturbed by narrowly spaced vortices sitting between widely spaced ones. “Eventually, the whole thing will fly apart and you’ll get spontaneous ‘ionization,’ ” Thouless told Physics Today in 2006.

That analog to ionization, in which the coherence suddenly falls off in an exponential rather than logarithmic dependence with distance, is known as the Kosterlitz–Thouless (KT) transition. (The late Russian physicist Vadim Berezinskii made a similar observation in 1970, which led some researchers to add a “B” to the transition name, but the Nobel committee notes that Berezinskii did not theorize the existence of the transition at finite temperature.)

Unlike some other phase transitions, such as the onset of ferromagnetism, no symmetry is broken. The sudden shift between order and disorder also demonstrates that superconductivity could indeed subsist in the 2D realm at temperatures below that of the KT transition. Experimenters observed the KT transition in superfluid helium-4 in 1978 and in superconducting thin films in 1981. More recently, the transition was reproduced in a flattened cloud of ultracold rubidium atoms (see Physics Today, August 2006, page 17).

A topological answer for the quantum Hall effect

Thouless then turned his attention to the quantum foundations of conductors and insulators. In 1980 German physicist Klaus von Klitzing had applied a strong magnetic field to a thin conducting film sandwiched between semiconductors. The electrons traveling within the film separated into well-organized opposing lanes of traffic along the edges (see Physics Today, June 1981, page 17). Von Klitzing had discovered the quantum Hall effect, for which he would earn the Nobel five years later.

Crucially, von Klitzing found that adjusting the strength of the magnetic field changed the conductance of his thin film only in fixed steps; the conductance was always an integer multiple of a fixed value, e2/h. That discovery proved the key for Thouless to relate the quantum Hall effect to topology, which is also based on integer steps—objects are often distinguished from each other topologically by the number of holes or nodes they possess, which is always an integer. In 1983 Thouless proposed that the electrons in von Klitzing’s experiment had formed a topological quantum fluid; the electrons’ collective behavior in that fluid, as measured by conductance, must vary in steps.
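
In the standard notation (not spelled out in the article), that quantisation is usually written as

    σ_xy = ν · e² / h,   with ν = 1, 2, 3, …

so the fixed steps von Klitzing observed correspond to integer values of ν, and it is this integer that Thouless connected to a topological property of the electron fluid.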

Not only did Thouless’s work explain the integer nature of the quantum Hall effect, but it also pointed the way to reproducing the phenomenon’s exotic behavior under less extreme conditions. In 1988 Haldane proposed a means for electrons to form a topological quantum fluid in the absence of a magnetic field. Twenty-five years later, researchers reported such behavior in chromium-doped (Bi,Sb)2Te3, the first observation of what is known as the quantum anomalous Hall effect.

Exploring topological materials

Around 2005, physicists began exploring the possibility of realizing topological insulators, a large family of new topological phases of matter that would exhibit the best of multiple worlds: They would robustly conduct electricity on their edges or surfaces without a magnetic field and as a bonus would divide electron traffic into lanes determined by spin. Since then experimenters have identified topological insulators in two and three dimensions, which may lead to improved electronics. Other physicists have created topological insulators that conduct sound or light, rather than electrons, on their surfaces (see Physics Today, May 2014, page 68).

Haldane’s work in the 1980s on the fractional quantum Hall effect was among the theoretical building blocks for proposals to use topologically protected excitations to build a fault-tolerant quantum computer (see Physics Today, October 2005, page 21). And his 1982 paper on magnetic chains serves as the foundation for efforts to create topologically protected excitations that behave like Majorana fermions, which are their own antiparticle. The work could lead to robust qubits for preserving the coherence of quantum information and perhaps provide particle physicists with clues as to the properties of fundamental Majorana fermions, which may or may not exist in nature.

—Andrew Grant

Rosetta’s Farewell

After closely following comet 67P/Churyumov-Gerasimenko for 786 days as it rounded the Sun, the Rosetta spacecraft’s controlled impact with the comet’s surface was confirmed by the loss of signal from the spacecraft on September 30, 2016. One of the images taken during its final descent, this high-resolution view looks across the comet’s stark landscape. The scene spans just over 600 meters (2,000 feet), captured when Rosetta was about 16 kilometers from the comet’s surface. Rosetta’s descent to the comet brought to an end the operational phase of an inspirational mission of space exploration. Rosetta deployed a lander to the surface of one of the Solar System’s most primordial worlds and witnessed first-hand how a comet changes when subjected to the increasing intensity of the Sun’s radiation. The decision to end the mission on the surface was a result of the comet’s orbit taking it out to the dim reaches beyond Jupiter, where there would not be enough power to operate the spacecraft. Mission operators also faced an approaching period when the Sun would be close to the line of sight between Earth and Rosetta, making radio communications increasingly difficult.
