Sci-Advent – ‘Electronic amoeba’ finds approximate solution to traveling salesman problem in linear time

Researchers at Hokkaido University and Amoeba Energy in Japan have, inspired by the efficient foraging behavior of a single-celled amoeba, developed an analog computer for finding a reliable and swift solution to the traveling salesman problem — a representative combinatorial optimization problem.

Amoeba-inspired analog electronic computing system integrating resistance crossbar for solving the travelling salesman problem. Scientific Reports, 2020; 10 (1) DOI: 10.1038/s41598-020-77617-7

Many real-world tasks, such as planning and scheduling in logistics and automation, are mathematically formulated as combinatorial optimization problems. Conventional digital computers, including supercomputers, cannot solve these complex problems in a practically acceptable time because the number of candidate solutions they need to evaluate increases exponentially with the problem size — a phenomenon known as combinatorial explosion. New computers called “Ising machines,” including “quantum annealers,” have therefore been actively developed in recent years. These machines, however, require complicated pre-processing to convert each task into a form they can handle, and they risk presenting illegal solutions that violate some of the constraints and requests, which are major obstacles to practical application.

These obstacles can be avoided using the newly developed “electronic amoeba,” an analog computer inspired by a single-celled amoeboid organism. The amoeba is known to maximize nutrient acquisition efficiently by deforming its body. It has been shown to find an approximate solution to the traveling salesman problem (TSP): given a map of a certain number of cities, find the shortest route that visits each city exactly once and returns to the starting city. This finding inspired Professor Seiya Kasai at Hokkaido University to mimic the dynamics of the amoeba electronically using an analog circuit, as described in the journal Scientific Reports. “The amoeba core searches for a solution under the electronic environment where resistance values at intersections of crossbars represent constraints and requests of the TSP,” says Kasai. Using the crossbars, the city layout can be easily altered by updating the resistance values without complicated pre-processing.

Kenta Saito, a PhD student in Kasai’s lab, fabricated the circuit on a breadboard and succeeded in finding the shortest route for a 4-city TSP. He then evaluated the performance on larger problems using a circuit simulator. The circuit reliably found high-quality legal solutions with route lengths significantly shorter than the average obtained by random sampling. Moreover, the time required to find a high-quality legal solution grew only linearly with the number of cities. When the search time is compared with that of “2-opt,” a representative TSP heuristic, the electronic amoeba becomes more advantageous as the number of cities increases. “The analog circuit reproduces well the unique and efficient optimization capability of the amoeba, which the organism has acquired through natural selection,” says Kasai.
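
For readers unfamiliar with 2-opt, the baseline heuristic mentioned above, here is a minimal sketch of the idea: keep reversing segments of the tour as long as doing so shortens it. This is a generic textbook implementation with random city coordinates purely for illustration, not the authors' code and not the amoeba circuit itself.

```python
# Minimal 2-opt heuristic for the TSP (illustrative only; random cities).
import math
import random

def tour_length(tour, cities):
    # total length of the closed tour, including the return to the start
    return sum(math.dist(cities[tour[i]], cities[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def two_opt(cities):
    tour = list(range(len(cities)))
    improved = True
    while improved:
        improved = False
        for i in range(1, len(tour) - 1):
            for j in range(i + 1, len(tour)):
                # reverse the segment between i and j and keep it if shorter
                candidate = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
                if tour_length(candidate, cities) < tour_length(tour, cities):
                    tour, improved = candidate, True
    return tour

random.seed(0)
cities = [(random.random(), random.random()) for _ in range(12)]
best = two_opt(cities)
print(best, round(tour_length(best, cities), 3))
```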

“As the analog computer consists of a simple and compact circuit, it can tackle many real-world problems in which inputs, constraints, and requests dynamically change and can be embedded into IoT devices as a power-saving microchip,” says Masashi Aono who leads Amoeba Energy to promote the practical use of the amoeba-inspired computers.

This is a Joint Release between Hokkaido University and Amoeba Energy Co., Ltd.

Sci-Advent – New superhighway system discovered in the Solar System

Researchers have discovered a new superhighway network to travel through the Solar System much faster than was previously possible. Such routes can drive comets and asteroids near Jupiter to Neptune’s distance in under a decade and to 100 astronomical units in less than a century. They could be used to send spacecraft to the far reaches of our planetary system relatively fast, and to monitor and understand near-Earth objects that might collide with our planet.

The arches of chaos in the Solar System. Science Advances, 2020; 6 (48): eabd1313 DOI: 10.1126/sciadv.abd1313

In their paper, published in the Nov. 25 issue of Science Advances, the researchers observed the dynamical structure of these routes, forming a connected series of arches inside what are known as space manifolds, which extend from the asteroid belt to Uranus and beyond. This newly discovered “celestial autobahn” or “celestial highway” acts over several decades, as opposed to the hundreds of thousands or millions of years that usually characterize Solar System dynamics.

The most conspicuous arch structures are linked to Jupiter and the strong gravitational forces it exerts. The population of Jupiter-family comets (comets with orbital periods of less than 20 years), as well as the small Solar System bodies known as Centaurs, is controlled by such manifolds on unprecedented time scales. Some of these bodies will end up colliding with Jupiter or being ejected from the Solar System.

The structures were resolved by gathering numerical data on millions of orbits in our Solar System and computing how these orbits fit within already-known space manifolds. The results need to be studied further, both to determine how they could be used by spacecraft and to understand how such manifolds behave in the vicinity of the Earth, where they control asteroid and meteorite encounters as well as the growing population of human-made objects in the Earth-Moon system.
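
To give a flavour of the kind of orbit computation involved (this is our own illustration, not the authors' method), the sketch below propagates a single test particle in the planar circular restricted three-body problem for the Sun-Jupiter system, the standard setting in which such manifolds are defined. The initial conditions are arbitrary demonstration values.

```python
# Planar circular restricted three-body problem, Sun-Jupiter, rotating frame,
# nondimensional units. Illustrative sketch only.
import numpy as np
from scipy.integrate import solve_ivp

mu = 9.54e-4   # Sun-Jupiter mass parameter (Jupiter mass / total mass)

def cr3bp(t, s):
    x, y, vx, vy = s
    r1 = np.hypot(x + mu, y)          # distance to the Sun
    r2 = np.hypot(x - 1 + mu, y)      # distance to Jupiter
    ax = 2 * vy + x - (1 - mu) * (x + mu) / r1**3 - mu * (x - 1 + mu) / r2**3
    ay = -2 * vx + y - (1 - mu) * y / r1**3 - mu * y / r2**3
    return [vx, vy, ax, ay]

# a test particle started near Jupiter's orbit (arbitrary demonstration values)
sol = solve_ivp(cr3bp, (0, 50), [0.95, 0.0, 0.0, 0.3], rtol=1e-9, atol=1e-12)
print(sol.y[:2, -1])   # final position in the rotating frame
```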

Sci-Advent – Trends in prevalence of blindness and distance and near vision impairment over 30 years

Keeping up with the Sci-advent post from yesterday about vision and optics, this report from the University of Michigan is relevant news. Researchers say eye care accessibility around the globe isn’t keeping up with an aging population, posing challenges for eye care professionals over the next 30 years.

As the global population grows and ages, so does their need for eye care. But according to two new studies published in The Lancet Global Health, these needs aren’t being met relative to international targets to reduce avoidable vision loss.

As 2020 comes to a close, an international group of researchers set out to provide updated estimates on the number of people that are blind or visually impaired across the globe, to identify the predominant causes, and to illustrate epidemiological trends over the last 30 years.

“This is important because when we think about setting a public health agenda, knowing the prevalence of an impairment, what causes it, and where in the world it’s most common informs the actions that key decision makers like the WHO and ministries of health take to allocate limited resources,” says Joshua Ehrlich, M.D., M.P.H., a study author and ophthalmologist at Kellogg Eye Center.

The study team assesses a collection of secondary data every five years, undertaking a meta-analysis of population-based surveys of eye disease assembled by the Vision Loss Expert Group and spanning from 1980 to 2018.

Creating a blueprint

A study like this poses challenges since regional populations vary in age.

“For example, the population in some Asian and European countries is much older on average than the population in many African nations. Many populations are also growing older over time. A direct comparison of the percentage of the population with blindness or vision impairment wouldn’t paint a complete picture,” explains Ehrlich, who is also a member of the University of Michigan’s Institute for Healthcare Policy and Innovation.

To address this issue, the study looked at age-standardized prevalence, accomplished by adjusting regional populations to fit a standard age structure.
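
As a rough illustration of what age standardization does (the numbers below are invented, not taken from the study), each age band's prevalence is weighted by a shared standard age structure, so that regions with very different age mixes can be compared on the same footing:

```python
# Illustrative age standardization with made-up numbers (not the study's data).
standard_weights = {"0-39": 0.55, "40-69": 0.35, "70+": 0.10}   # hypothetical standard population

region_prevalence = {   # hypothetical prevalence of vision impairment by age band
    "0-39": 0.005,
    "40-69": 0.04,
    "70+": 0.20,
}

# weight each band's prevalence by the standard age structure and sum
age_standardized = sum(standard_weights[band] * region_prevalence[band]
                       for band in standard_weights)
print(f"Age-standardized prevalence: {age_standardized:.1%}")   # about 3.7% with these numbers
```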

“We found that the age-standardized prevalence is decreasing around the world, which tells us eye care systems and quality of care are getting better,” says study author Monte A. Del Monte, M.D., a pediatric ophthalmologist at Kellogg Eye Center. “However, as populations age, a larger number of people are being affected by serious vision impairment, suggesting we need to improve accessibility to care and further develop human resources to provide care.”

In fact, the researchers found no significant reduction in the number of people with treatable vision loss over the last ten years, falling far short of the World Health Assembly Global Action Plan target of a 25% global reduction in avoidable vision loss over the same time frame.

Although findings varied by region globally, cataracts and the unmet need for glasses were the most prevalent causes of moderate to severe vision impairment. Approximately 45% of the 33.6 million cases of global blindness were caused by cataracts, which can be treated with surgery.

Refractive error, in which an abnormally shaped cornea or lens fails to bend light correctly and blurs the image, accounted for vision loss in 86 million people across the globe. This largest contributor to moderate or severe vision impairment can be easily treated with glasses.

Also important, vision impairment due to diabetic retinopathy, a complication of diabetes that affects eyesight, was found to have increased in global prevalence.

“This is another condition in which we can prevent vision loss with early screenings and intervention,” says study author Alan L. Robin, M.D., a collaborating ophthalmologist at Kellogg Eye Center and professor at Johns Hopkins Medicine. “As diabetes becomes more common across the globe, this condition may begin to affect younger populations, as well.”

Looking to 2050

“Working as a global eye care community, we need to now look at the next 30 years,” Ehrlich says. “We hope to take these findings and create implementable strategies with our global partners through our Kellogg Eye Center for International Ophthalmology so fewer people go blind unnecessarily.”

In an effort to contribute to the WHO initiative VISION 2020: The Right to Sight, the researchers updated estimates of the global burden of vision loss and provided predictions for what the year 2050 may look like.

They found that the majority of the 43.9 million people blind globally are women. Women also make up the majority of the 295 million people who have moderate to severe vision loss, the 163 million who have mild vision loss and the 510 million who have visual impairments related to the unmet need for glasses, specifically poor near vision.

By 2050, Ehrlich, Del Monte, and Robin predict 61 million people will be blind, 474 million will have moderate and severe vision loss, 360 million will have mild vision loss and 866 million will have visual impairments related to farsightedness.

“Eliminating preventable blindness globally isn’t keeping pace with the global population’s needs,” Ehrlich says. “We face enormous challenges in treating and preventing vision impairment as the global population grows and ages, but I’m optimistic of a future where we will succeed because of the measures we take now to make a difference.”

Both studies were funded by Brien Holden Vision Institute, Fondation Théa, Fred Hollows Foundation, Bill & Melinda Gates Foundation, Lions Clubs International Foundation, Sightsavers International and the University of Heidelberg.

GBD 2019 Blindness and Vision Impairment Collaborators, on behalf of the Vision Loss Expert Group of the Global Burden of Disease Study. Causes of blindness and vision impairment in 2020 and trends over 30 years, and prevalence of avoidable blindness in relation to VISION 2020: the Right to Sight: an analysis for the Global Burden of Disease Study. The Lancet Global Health, 2020; DOI: 10.1016/S2214-109X(20)30489-7

Sci-Advent – Physicists Nail Down the ‘Magic Number’ That Shapes the Universe

This is a reblog of the article in Nautilus by Natalie Wolchover. See the original here.

A team in Paris has made the most precise measurement yet of the fine-structure constant, killing hopes for a new force of nature.

As fundamental constants go, the speed of light, c, enjoys all the fame, yet c’s numerical value says nothing about nature; it differs depending on whether it’s measured in meters per second or miles per hour. The fine-structure constant, by contrast, has no dimensions or units. It’s a pure number that shapes the universe to an astonishing degree — “a magic number that comes to us with no understanding,” as Richard Feynman described it. Paul Dirac considered the origin of the number “the most fundamental unsolved problem of physics.”

Numerically, the fine-structure constant, denoted by the Greek letter α (alpha), comes very close to the ratio 1/137. It commonly appears in formulas governing light and matter. “It’s like in architecture, there’s the golden ratio,” said Eric Cornell, a Nobel Prize-winning physicist at the University of Colorado, Boulder and the National Institute of Standards and Technology. “In the physics of low-energy matter — atoms, molecules, chemistry, biology — there’s always a ratio” of bigger things to smaller things, he said. “Those ratios tend to be powers of the fine-structure constant.”

The constant is everywhere because it characterizes the strength of the electromagnetic force affecting charged particles such as electrons and protons. “In our everyday world, everything is either gravity or electromagnetism. And that’s why alpha is so important,” said Holger Müller, a physicist at the University of California, Berkeley. Because 1/137 is small, electromagnetism is weak; as a consequence, charged particles form airy atoms whose electrons orbit at a distance and easily hop away, enabling chemical bonds. On the other hand, the constant is also just big enough: Physicists have argued that if it were something like 1/138, stars would not be able to create carbon, and life as we know it wouldn’t exist.
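
For reference, the standard definition (not spelled out in the article) combines the electron charge, the vacuum permittivity, the reduced Planck constant and the speed of light:

\alpha = \frac{e^{2}}{4\pi \varepsilon_{0} \hbar c} \approx \frac{1}{137.036}

Because the units on the right-hand side cancel completely, α is a pure number, which is why its value is the same no matter what system of units you measure it in.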

Physicists have more or less given up on a century-old obsession over where alpha’s particular value comes from; they now acknowledge that the fundamental constants could be random, decided in cosmic dice rolls during the universe’s birth. But a new goal has taken over.

Physicists want to measure the fine-structure constant as precisely as possible. Because it’s so ubiquitous, measuring it precisely allows them to test their theory of the interrelationships between elementary particles — the majestic set of equations known as the Standard Model of particle physics. Any discrepancy between ultra-precise measurements of related quantities could point to novel particles or effects not accounted for by the standard equations. Cornell calls these kinds of precision measurements a third way of experimentally discovering the fundamental workings of the universe, along with particle colliders and telescopes.

Today, in a new paper in the journal Nature, a team of four physicists led by Saïda Guellati-Khélifa at the Kastler Brossel Laboratory in Paris reported the most precise measurement yet of the fine-structure constant. The team measured the constant’s value to the 11th decimal place, reporting that α = 1/137.03599920611. (The last two digits are uncertain.)

With a margin of error of just 81 parts per trillion, the new measurement is nearly three times more precise than the previous best measurement in 2018 by Müller’s group at Berkeley, the main competition. (Guellati-Khélifa made the most precise measurement before Müller’s in 2011.) Müller said of his rival’s new measurement of alpha, “A factor of three is a big deal. Let’s not be shy about calling this a big accomplishment.”

Guellati-Khélifa has been improving her experiment for the past 22 years. She gauges the fine-structure constant by measuring how strongly rubidium atoms recoil when they absorb a photon. (Müller does the same with cesium atoms.) The recoil velocity reveals how heavy rubidium atoms are — the hardest factor to gauge in a simple formula for the fine-structure constant. “It’s always the least accurate measurement that’s the bottleneck, so any improvement in that leads to an improvement in the fine-structure constant,” Müller explained.
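
The “simple formula” alluded to here is, in standard notation (our addition, not quoted in the article),

\alpha^{2} = \frac{2R_{\infty}}{c}\cdot\frac{m_{\mathrm{Rb}}}{m_{e}}\cdot\frac{h}{m_{\mathrm{Rb}}}

where the Rydberg constant R_{\infty} and the rubidium-to-electron mass ratio are already known to extremely high precision, so the ratio h/m_{\mathrm{Rb}} extracted from the photon-recoil measurement sets the final uncertainty.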

The Paris experimenters begin by cooling the rubidium atoms almost to absolute zero, then dropping them in a vacuum chamber. As the cloud of atoms falls, the researchers use laser pulses to put the atoms in a quantum superposition of two states — kicked by a photon and not kicked. The two possible versions of each atom travel on separate trajectories until more laser pulses bring the halves of the superposition back together. The more an atom recoils when kicked by light, the more out of phase it is with the unkicked version of itself. The researchers measure this difference to reveal the atoms’ recoil velocity. “From the recoil velocity, we extract the mass of the atom, and the mass of the atom is directly involved in the determination of the fine-structure constant,” Guellati-Khélifa said.

In such precise experiments, every detail matters. Table 1 of the new paper is an “error budget” listing 16 sources of error and uncertainty that affect the final measurement. These include gravity and the Coriolis force created by Earth’s rotation — both painstakingly quantified and compensated for. Much of the error budget comes from foibles of the laser, which the researchers have spent years perfecting.

For Guellati-Khélifa, the hardest part is knowing when to stop and publish. She and her team stopped the week of February 17, 2020, just as the coronavirus was gaining a foothold in France. Asked whether deciding to publish is like an artist deciding that a painting is finished, Guellati-Khélifa said, “Exactly. Exactly. Exactly.”

Surprisingly, her new measurement differs from Müller’s 2018 result in the seventh digit, a bigger discrepancy than the margin of error of either measurement. This means — barring some fundamental difference between rubidium and cesium — that one or both of the measurements has an unaccounted-for error. The Paris group’s measurement is the more precise, so it takes precedence for now, but both groups will improve their setups and try again.

Though the two measurements differ, they closely match the value of alpha inferred from precise measurements of the electron’s g-factor, a constant related to its magnetic moment, or the torque that the electron experiences in a magnetic field. “You can connect the fine-structure constant to the g-factor with a hell of a lot of math,” said Cornell. “If there are any physical effects missing from the equations, we would be getting the answer wrong.”

Instead, the measurements match beautifully, largely ruling out some proposals for new particles. The agreement between the best g-factor measurements and Müller’s 2018 measurement was hailed as the Standard Model’s greatest triumph. Guellati-Khélifa’s new result is an even better match. “It’s the most precise agreement between theory and experiment,” she said.

And yet she and Müller have both set about making further improvements. The Berkeley team has switched to a new laser with a broader beam (allowing it to strike their cloud of cesium atoms more evenly), while the Paris team plans to replace their vacuum chamber, among other things.

What kind of person puts such a vast effort into such scant improvements? Guellati-Khélifa named three traits: “You have to be rigorous, passionate and honest with yourself.” Müller said in response to the same question, “I think it’s exciting because I love building shiny nice machines. And I love applying them to something important.” He noted that no one can single-handedly build a high-energy collider like Europe’s Large Hadron Collider. But by constructing an ultra-precise instrument rather than a super-energetic one, Müller said, “you can do measurements relevant to fundamental physics, but with three or four people.”

Quantum magic squares

In a new paper in the Journal of Mathematical Physics, Tim Netzer and Tom Drescher from the Department of Mathematics and Gemma De las Cuevas from the Department of Theoretical Physics at the University of Innsbruck have introduced the notion of the quantum magic square: a magic square in which the entries are matrices instead of numbers.

This is a non-commutative, and thus quantum, generalization of a magic square. The authors show that quantum magic squares cannot be as easily characterized as their “classical” cousins. More precisely, quantum magic squares are not convex combinations of quantum permutation matrices. “They are richer and more complicated to understand,” explains Tom Drescher. “This is the general theme when generalizations to the non-commutative case are studied.” Check out the paper!
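
As a toy illustration of the definition (our own example, not one from the paper), the snippet below builds a 3×3 array of 2×2 matrices, patterned on a permutation matrix, and checks the two defining properties: every entry is positive semidefinite, and every row and column of blocks sums to the identity.

```python
# Toy "quantum magic square" check: PSD entries, rows and columns summing to I.
import numpy as np

I = np.eye(2)
P = np.array([[0.5, 0.5], [0.5, 0.5]])   # a rank-one projector
Z = np.zeros((2, 2))

# 3x3 array of 2x2 matrices, patterned on the cyclic permutation (1 2 3) -> (2 3 1)
M = [[P,     I - P, Z],
     [Z,     P,     I - P],
     [I - P, Z,     P]]

# every entry must be positive semidefinite
for row in M:
    for entry in row:
        assert np.all(np.linalg.eigvalsh(entry) >= -1e-12)

# every row and every column of blocks must sum to the identity
for i in range(3):
    assert np.allclose(sum(M[i]), I)                          # row i
    assert np.allclose(sum(M[j][i] for j in range(3)), I)     # column i
print("valid quantum magic square")
```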

Quantum magic squares: Dilations and their limitations. Journal of Mathematical Physics, 2020; 61 (11). DOI: 10.1063/5.0022344

2020 Nobel Prize in Physics – Black holes

I had intended to post this much earlier on, and certainly closer to the actual announcement of the Nobel Prizes in early October. It has, however, been a very busy period. Better late than never, right?

I was very pleased to see that the winners of the 2020 Nobel Prize in Physics were a group that combined the observational with the theoretical. Sir Roger Penrose, Reinhard Genzel, and Andrea Ghez are the recipients of the 2020 Nobel Prize in Physics. Penrose receives half the 10 million Swedish krona while Ghez and Genzel will share the other half.

Penrose’s work has taken the concept of black holes from the realm of speculation to a sound theoretical idea underpinning modern astrophysics. Using topology and general relativity, Penrose provided an explanation of how the collapse of matter under gravity leads to the singularity at the centre of a black hole.

A few decades after Penrose’s work in the 1960s come Genzel and Ghez, whose independent work using adaptive optics and speckle imaging enabled them to analyse the motion of stars tightly orbiting Sagittarius A*. Their work led to the conclusion that the only explanation for the radio source at the centre of the Milky Way was a black hole.

Ghez is the fourth woman to be named a Nobel physics laureate, after Donna Strickland (2018), Maria Goeppert Mayer (1963), and Marie Curie (1903).

From an Oddity to an Observation

In 1916 Karl Schwarzschild described a solution to Einstein’s field equations for the curved spacetime around a spherically symmetric mass, expressed in terms of the radial coordinate r. Some terms in the solution either diverged or vanished for r=\frac{2GM}{c^{2}} or r=0. A couple of decades later, Oppenheimer and his student Hartland Snyder realised that the former value corresponded to the radius within which light, under the influence of gravity, would no longer be able to reach outside observers – the so-called event horizon. Their work would need more than mathematical assumptions to be accepted.
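
As a quick back-of-the-envelope illustration of that radius (our numbers, not from the article):

```python
# Schwarzschild radius r_s = 2GM/c^2 for the Sun and for a 4-million-solar-mass
# object like Sagittarius A*. Rough constants; illustration only.
G = 6.674e-11        # m^3 kg^-1 s^-2
c = 2.998e8          # m/s
M_sun = 1.989e30     # kg

def schwarzschild_radius(mass_kg):
    return 2 * G * mass_kg / c**2

print(f"Sun:            {schwarzschild_radius(M_sun) / 1e3:.1f} km")                 # about 3 km
print(f"Sagittarius A*: {schwarzschild_radius(4.0e6 * M_sun) / 1e9:.0f} million km")  # about 12 million km
```

For the Sun this gives roughly 3 km; for a four-million-solar-mass object like Sagittarius A*, about 12 million kilometres, or 0.08 astronomical units.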

By 1964 Penrose had come up with a topological picture of the gravitational collapse described above, crucially without the assumptions made by Oppenheimer and Snyder. His work required instead the idea of a trapped surface: a 2D surface in which all light orthogonal to it converges. Penrose’s work showed that inside the event horizon the radial direction becomes time-like. It is impossible to reverse out of the black hole, and the implication is that all matter ends up at the singularity. Penrose’s research established black holes as a plausible explanation for objects such as quasars and other active galactic nuclei.

Closer to Home

Although our own galaxy is by no means spewing energy like your average quasar, it still emits X-rays and other radio signals. Could it be that there is a black hole-like object at the heart of the Milky Way? This was a question that Genzel and Ghez would come to answer in time.

With the use of infrared (IR) spectroscopy, studies of gas clouds near the galactic centre showed rising velocities with decreasing distances to the centre, suggesting the presence of a massive, compact source of gravitation. These studies in the 1980s were not definitive but provided a tantalising possibility.

In the mid-1990s, both Genzel and Ghez set out to obtain better evidence with the help of large telescopes operating in the near-IR to detect photons escaping the galactic centre. Genzel and colleagues began observing from Chile, whereas Ghez and her team observed from Hawaii.

Their independent development of speckle imaging, a technique that corrects for the distortions caused by Earth’s atmosphere, enabled them to make the crucial observations. The technique improves the images by stacking a series of short exposures, bringing the smeared light of individual stars into alignment. In 1997, both groups published measurements of stellar motions that strongly favoured the black hole explanation.
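
The shift-and-add idea at the heart of speckle imaging can be sketched in a few lines. This is a deliberately crude toy on synthetic data, not the laureates' pipelines, which are far more sophisticated: each short exposure is re-centred on its brightest speckle before the frames are averaged.

```python
# Toy shift-and-add stacking on synthetic frames of a single jittering "star".
import numpy as np

rng = np.random.default_rng(0)
size, n_frames = 64, 200
frames = []
for _ in range(n_frames):
    frame = rng.normal(0.0, 0.1, (size, size))        # background noise
    dy, dx = rng.integers(-5, 6, size=2)               # random atmospheric shift
    frame[size // 2 + dy, size // 2 + dx] += 5.0       # the "star" for this exposure
    frames.append(frame)

stacked = np.zeros((size, size))
for frame in frames:
    y, x = np.unravel_index(np.argmax(frame), frame.shape)           # brightest speckle
    stacked += np.roll(frame, (size // 2 - y, size // 2 - x), axis=(0, 1))
stacked /= n_frames

print("peak at:", np.unravel_index(np.argmax(stacked), stacked.shape))  # (32, 32)
```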

Further to that work, the use of adaptive optics by both laureates not only improved the resolutions obtained, but also provided the possibility of carrying out spectroscopic analyses which enabled them to get velocities in 3D and therefore obtain precise orbits.

The “star” object in this saga is the so-called S0-2 (Ghez’s group) or S2 (Genzel’s group) star. It approaches within about 17 light-hours of Sagittarius A* every 16 years in a highly elliptical orbit.
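
Those orbital parameters are what pin down the central mass. As a rough sketch using Kepler's third law, with the 16-year period quoted above and a semi-major axis of about 1,000 AU taken from the published orbital solution (it is not stated in this post):

```python
# Rough mass estimate for Sagittarius A* from S2's orbit via Kepler's third law.
import math

G = 6.674e-11       # m^3 kg^-1 s^-2
AU = 1.496e11       # m
year = 3.156e7      # s
M_sun = 1.989e30    # kg

a = 1.0e3 * AU      # assumed semi-major axis (~1,000 AU, from the published orbit)
T = 16 * year       # orbital period quoted above

M = 4 * math.pi**2 * a**3 / (G * T**2)
print(f"Enclosed mass ~ {M / M_sun:.1e} solar masses")   # roughly 4e6
```

The result lands near the accepted value of roughly four million solar masses for the black hole at the galactic centre.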

Congratulations to Ghez and Genzel, and Penrose.

AI as a Catalyst for Health

Artificial Intelligence, or AI, is no longer the mysterious technological unknown that it once was. In fact, it is now arguably woven into the fabric of our daily lives. Alexa tells us the weather and puts together playlists based on our music choices. Tesco or Ocado personalise our shopping discounts using data from previous purchasing behaviours. Apps predict menstrual cycles and Facebook even serves us ads informed by our browsing habits.

AI in Health

Through effective automation of processes and problem solving, AI is pushing the boundaries of innovation across practically every industry imaginable. Nowhere is this more evident than in healthcare, where its use is already relatively widespread. Spanning drug discovery, automated diagnosis, and support for patient engagement and adherence, the potential for impact in the sector is enormous.

With new proven use cases springing up on a regular basis, one area that has shown real promise is screening and diagnostics. A study published in Nature earlier this year highlighted the high potential for human error in the identification of breast cancer via mammogram screening. When AI was introduced as a direct comparator, the technology demonstrated accuracy that surpassed human experts. At a time when the health system is already bursting at the seams and trained professionals are in short supply, integrating AI into the diagnostic journey would seem to make good sense.

It is also important to note that AI should not be viewed as a replacement for the work carried out by medical professionals. Instead, it should become an additional tool to enhance their capabilities. In the case of medicine, an AI diagnostic model can serve as an additional layer of support and validation for qualified doctors or nurses. This in turn leaves them free to focus on other aspects of their roles that an AI system cannot provide, like quality patient care.

Improving Outcomes for Patients

A collaboration between Moorfields Eye Hospital NHS Foundation Trust and DeepMind Health set out to find new ways to utilise the power of AI to support clinicians in their care for patients. The resulting programme is now able to recommend the correct referral decision for over 50 eye diseases with 94% accuracy, matching world-leading eye experts. Here again, AI continues to demonstrate its potential to revolutionise eye care diagnosis, enabling conditions to be spotted earlier and patients with the most serious conditions to be prioritised.

At TympaHealth, our focus is on the transformation of ear and hearing care. As with many specialisms, the journey to diagnosis for patients can be long, requiring consultations with numerous doctors across a variety of specialities. Our mission, therefore, is to bring ear and hearing care into the community, which includes providing better and faster access to diagnostic and treatment services. AI evidently has an important role in facilitating that.

There have been limited studies on the use of AI in the context of hearing and ear care. The use of smartphone otoscopes promises to help develop this sector further. Recent studies, including a collaboration involving the author, use smartphone otoscopes to improve the medical learning environment. In many cases, ENT experience in medical school is limited to just one week as a special study block; the advent of artificial intelligence would certainly help in recognising conditions of the ear and streamlining referrals.

Another study, published in Otology & Neurotology last month, showed that machine learning helped to predict the post-operative performance of a cochlear implant, as well as identifying the influencing factors. This shows the sector is receptive to this change, much like specialties such as dermatology and ophthalmology.

Building on these studies, at TympaHealth we have a team to help us embed the technology into our own processes and platforms effectively. In turn this helps us improve diagnostic capabilities and ultimately patient outcomes. Much like the Moorfields and DeepMind collaboration for eye health, we aim to use machine learning to assist us in identifying ear conditions. However, that is only the very start. We’re also exploring new ways to harness the power of AI to develop a fully integrated machine learning platform, as well as using algorithms as predictors of future ear health deterioration.

A Vision For the Future

In the recent Topol Review, which set out a vision for the future of digital health, Eric Topol reinforced the importance of preparing our health and care workforces for a digital future.

There is clearly a bright future for AI in healthcare, which has proven itself time and time again, especially now that technology and digital innovation are moving front and centre in the mission to provide better and faster care. In the UK, the NHS is already overstretched in normal times, and with the additional pressures caused by the pandemic it is imperative that we find new solutions that enable people to access the care they need whilst relieving the burden on healthcare professionals.