Quantum imaging with undetected photons

Phase image of an object opaque to 810-nm light


Physicists have devised a way to take pictures using light that has not interacted with the object being photographed.

This form of imaging uses pairs of photons, twins that are ‘entangled’ in such a way that the quantum state of one is inextricably linked to the other. While one photon has the potential to travel through the subject of a photo and then be lost, the other goes to a detector but nonetheless ‘knows’ about its twin’s life and can be used to build up an image.

“Normally, you have to collect particles that come from the object to image it,” says Anton Zeilinger, a physicist at the Austrian Academy of Sciences in Vienna who led the work. “Now, for the first time, you don’t have to do that.”

One advantage of the technique is that the two photons need not be of the same energy, Zeilinger says, meaning that the light that touches the object can be of a different colour than the light that is detected. For example, a quantum imager could probe delicate biological samples by sending low-energy photons through them while building up the image using visible-range photons and a conventional camera. The work is published in the August 28 issue of Nature.

Zeilinger and his colleagues based the technique on an idea first outlined in 1991, in which there are two paths down which a photon can travel. Each contains a crystal that turns the particle into a pair of entangled photons. But only one path contains the object to be imaged.

According to the laws of quantum physics, if no one detects which path a photon took, the particle effectively has taken both routes, and a photon pair is created in each path at once, says Gabriela Barreto Lemos, a physicist at Austrian Academy of Sciences and a co-author on the latest paper.

In the first path, one photon in the pair passes through the object to be imaged, and the other does not. The photon that passed through the object is then recombined with its other ‘possible self’ — which travelled down the second path and not through the object — and is thrown away. The remaining photon from the second path is also reunited with itself from the first path and directed towards a camera, where it is used to build the image, despite having never interacted with the object.

The researchers imaged a cut-out of a cat, a few millimeters wide, as well as other shapes etched into silicon. The team probed the cat cut-out using a wavelength of light that they knew could not be detected by their camera. “That’s important; it’s the proof that it’s working,” says Zeilinger.

The cat was picked in honor of a thought experiment, proposed in 1935 by the Austrian physicist Erwin Schrödinger, in which a hypothetical cat in a box is both alive and dead, as long as no one knows whether or not a poison in the box has been released. In a similar way, in the latest experiment, as long as there is nothing to say which path the photon took, one of the photons in the pair that is subsequently created has both gone and not gone through the object, Lemos adds.

Previous experiments have tried to do something similar in a process known as ghost imaging. But the latest method is simpler, says Mary Jacquiline Romero, a physicist at the University of Glasgow, UK. In ghost imaging, even though only one photon interacts with the object, both photons need to be collected to reconstruct the image, whereas in the Vienna team’s work only one photon needs to be detected. As ghost imaging needs both photons to produce the image, some physicists have questioned whether the effect is truly quantum or could be explained by classical physics – an argument Zeilinger says would be difficult to make with this experiment.

Robert Boyd, a physicist at the University of Rochester in New York, says that the experiment is so intriguing he wishes he had thought of it first. “That’s the greatest compliment that a scientist can give,” he says.
Read more at www.scientificamerican.com

Van der Pol and the history of relaxation oscillations

Jean-Marc Ginoux, Christophe Letellier

Relaxation oscillations are commonly associated with the name of Balthazar van der Pol via his eponymous paper (Philosophical Magazine, 1926) in which he apparently introduced this terminology to describe the nonlinear oscillations produced by self-sustained oscillating systems such as a triode circuit.
Our aim is to investigate how relaxation oscillations were actually discovered. Browsing the literature from the late 19th century, we identified four self-oscillating systems in which relaxation oscillations have been observed:
i) the series dynamo machine conducted by Gerard-Lescuyer (1880),
ii) the musical arc discovered by Duddell (1901) and investigated by Blondel (1905),
iii) the triode invented by de Forest (1907)
and, iv) the multivibrator elaborated by Abraham and Bloch (1917).
The differential equation describing such a self-oscillating system was proposed by Poincaré for the musical arc (1908), by Janet for the series dynamo machine (1919), and by Blondel for the triode (1919). Once Janet (1919) established that these three self-oscillating systems can be described by the same equation, van der Pol proposed (1926) a generic dimensionless equation which captures the relevant dynamical properties shared by these systems. Van der Pol’s contributions during the period 1926–1930 were investigated to show how, with Le Corbeiller’s help, he popularized “relaxation oscillations” using the previous experiments as examples and turned them into a concept…
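For reference, the generic dimensionless equation in question, now universally known as the van der Pol equation, is

    ẍ − ε(1 − x²)ẋ + x = 0,

where ẋ and ẍ denote first and second time derivatives of x; large values of the parameter ε produce the slow-fast limit cycles that van der Pol named relaxation oscillations.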
… Read more at http://arxiv.org/pdf/1408.4890v1.pdf

Do we live in a 2-D hologram?

New Fermilab experiment will test the nature of the universe

A Fermilab scientist works on the laser beams at the heart of the Holometer experiment. The Holometer will use twin laser interferometers to test whether the universe is a 2-D hologram. Credit: Fermilab


A unique experiment at the U.S. Department of Energy’s Fermi National Accelerator Laboratory called the Holometer has started collecting data that will answer some mind-bending questions about our universe – including whether we live in a hologram.

Much like characters on a television show would not know that their seemingly 3-D world exists only on a 2-D screen, we could be clueless that our 3-D space is just an illusion. The information about everything in our universe could actually be encoded in tiny packets in two dimensions.

Get close enough to your TV screen and you’ll see pixels, small points of data that make a seamless image if you stand back. Scientists think that the universe’s information may be contained in the same way and that the natural “pixel size” of space is roughly 10 trillion trillion times smaller than an atom, a distance that physicists refer to as the Planck scale.
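The “pixel size” comparison can be checked with back-of-the-envelope arithmetic; the Planck length and atomic radius below are standard reference values, not figures taken from the article:

```python
# Rough check of the claim that the Planck scale is "roughly 10 trillion
# trillion times smaller than an atom" (10 trillion trillion = 1e25).
planck_length = 1.616e-35  # metres (Planck length)
atomic_radius = 5.3e-11    # metres (Bohr radius, a common atomic yardstick)

ratio = atomic_radius / planck_length
print(f"atom size / Planck length ~ {ratio:.1e}")  # order of 1e24-1e25
```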

“We want to find out whether space-time is a quantum system just like matter is,” said Craig Hogan, director of Fermilab’s Center for Particle Astrophysics and the developer of the holographic noise theory. “If we see something, it will completely change ideas about space we’ve used for thousands of years.”

Quantum theory suggests that it is impossible to know both the exact location and the exact speed of subatomic particles. If space comes in 2-D bits with limited information about the precise location of objects, then space itself would fall under the same theory of uncertainty. Just as matter continues to jiggle (as quantum waves) even when cooled to absolute zero, this digitized space should have built-in vibrations even in its lowest energy state.

Essentially, the experiment probes the limits of the universe’s ability to store information. If there is a set number of bits that tell you where something is, it eventually becomes impossible to find more specific information about the location – even in principle. The instrument testing these limits is Fermilab’s Holometer, or holographic interferometer, the most sensitive device ever created to measure the quantum jitter of space itself.

Now operating at full power, the Holometer uses a pair of interferometers placed close to one another. Each one sends a one-kilowatt laser beam (the equivalent of 200,000 laser pointers) at a beam splitter and down two perpendicular 40-meter arms. The light is then reflected back to the beam splitter where the two beams recombine, creating fluctuations in brightness if there is motion. Researchers analyze these fluctuations in the returning light to see if the beam splitter is moving in a certain way – being carried along on a jitter of space itself.
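The recombination step described above can be illustrated with a toy model; this is a minimal sketch of two-beam interference, not the Holometer’s actual data analysis, and the laser wavelength is an assumed illustrative value:

```python
import math

# Toy two-beam interferometer readout: when the recombined beams differ in
# path length by dL, the detected brightness varies as cos^2 of half the
# phase difference between them.
wavelength = 1.064e-6  # metres (assumed illustrative laser wavelength)

def output_intensity(dL, I0=1.0):
    """Relative brightness at the detector for a path-length difference dL."""
    phase = 2 * math.pi * dL / wavelength  # phase difference between the beams
    return I0 * math.cos(phase / 2) ** 2

print(output_intensity(0.0))             # equal arms: fully bright, 1.0
print(output_intensity(wavelength / 2))  # half-wave difference: effectively dark
```

A jitter of the beam splitter shows up as a time-varying path difference dL, and hence as the brightness fluctuations the researchers analyze.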

“Holographic noise” is expected to be present at all frequencies, but the scientists’ challenge is not to be fooled by other sources of vibrations. The Holometer is testing a frequency so high – millions of cycles per second – that motions of normal matter are not likely to cause problems. Rather, the dominant background noise is more often due to radio waves emitted by nearby electronics. The Holometer experiment is designed to identify and eliminate noise from such conventional sources.

“If we find a noise we can’t get rid of, we might be detecting something fundamental about nature – a noise that is intrinsic to space-time,” said Fermilab physicist Aaron Chou, lead scientist and project manager for the Holometer. “It’s an exciting moment for physics. A positive result will open a whole new avenue of questioning about how space works.”

The Holometer experiment, funded by the U.S. Department of Energy Office of Science and other sources, is expected to gather data over the coming year.
Read more at www.fnal.gov

Astrosociology: Interviews about an infinite universe

visible universe


Erik Høg

If the universe is infinite now, it has always been infinite. This is the opinion of many astronomers today, as can be concluded from the following series of interviews, but the opinions differ much more than I had expected.
Many astronomers do not have a clear opinion on this matter. Others have a clear opinion, but very different from the majority. Detailed arguments by two experts on general relativity are also included.
Observations show that the universe is flat, i.e. the curvature is zero within the small uncertainty of measurements.
This implies an infinite universe, though most probably we will never know that for certain. For comparison with the recent interviews, opinions from the past 2300 years, since Aristotle, about whether the universe is finite or infinite have been collected from the literature, and it appears that scientists often held quite definite opinions…

… Read more at http://arxiv.org/ftp/arxiv/papers/1408/1408.4795.pdf

Elementary particles may be thought of as small black holes

Ashoke Sen, String Theory expert from Harish Chandra Research Institute, Allahabad, is one of the three recipients of the Dirac Medal awarded by the International Centre for Theoretical Physics (ICTP) this year. Professor Sen, who receives this prize for his work on black holes and symmetries of string theory, communicates his thoughts in an e-mail interview with Shubashree Desikan - thehindu.com

How do you feel on receiving this prize?

I certainly feel very honoured and happy to receive this prize. This prize is particularly important for me because it is given by ICTP, which has played a significant role in the development of science in developing countries. My own association with ICTP goes back about 30 years.

Can you explain your work on symmetries of string theory?

My work on symmetries of string theory is on what is known as strong-weak coupling duality or S-duality.

A symmetry refers to a transformation under which an object looks the same. For example, a square has 90-degree rotational symmetry: if we rotate it by 90 degrees about its centre, it looks the same. Before my work, it had been suspected for a while by a few people (Montonen and Olive; Osborn; Duff and his collaborators; Font, Quevedo, Lust and Rey; myself; Schwarz and others) that string theory, and its close cousins, quantum field theories, have some unusual symmetries which are not easily visible. However, because such symmetries were not easily visible, it was hard to decide whether they were really there, and very few people took this idea seriously.

In my work in 1994, I showed how one can do concrete calculations to test whether such symmetries are really there and I worked out an explicit example which gave non-trivial evidence for such a symmetry. Later this technique was used in many other cases, and led to the discovery of many such symmetries. Eventually based on these results string theorists came to realise that what people thought earlier as different theories are all different ways of describing the same underlying theory. This unified all string theories into one theory.

The citation also mentions your work on black holes, what was this work?

My work on black holes was on the connection between black holes and elementary particles.

Normally we think of elementary particles as tiny objects. On the other hand, black holes can come in all sizes but normally we think of them as big objects from which even light cannot escape. My work indicated that if we consider smaller and smaller black holes, at some stage the properties of black holes become indistinguishable from those of elementary particles. Thus elementary particles may be thought of as small black holes and vice versa.

This work was preceded and followed by many important developments. The suggestion that black holes may behave like elementary particles had been there before my work — notably by ‘t Hooft and a little later by Susskind and his collaborators. But the calculations based on black holes and those based on elementary particles did not match, and people attributed this to the fact that the calculations for black holes were not reliable when the black holes are small.

As a result one could neither verify nor refute their proposal reliably. My contribution (in 1995) was to identify a specific system with large amount of symmetry that allowed us to do this calculation reliably. The results from black holes indeed matched those of the elementary particles in that system, giving concrete evidence that small black holes indeed describe elementary particles.

But string theory often faces a lot of flak as being a theory that can never be probed by experiment. So what is the justification for studying it?

String theory tries to give a unified description of all particles and forces operating between them.

One of the main successes of string theory is that it has been able to unify the general theory of relativity, which describes gravity, and quantum mechanics.

Unfortunately a direct test of string theory requires colliding extremely high energy particles and observing the result of this collision. It is impossible to achieve this with present technology.

This problem is not unique to string theory.

Any direct experimental test of quantum nature of gravity will require such high energy collisions. Given that such energies are not available today, we have two choices: either give up attempts to find a quantum theory of gravity or try to use existing knowledge to do the best we can. String theory follows the second path.

Requiring that the theory be mathematically consistent has led to many new results in mathematics. Without these mathematical relations, string theory would fail to be a consistent theory.

But so far all such new relations found in string theory have proved to be correct, providing further evidence for the underlying consistency of the theory.

Metamaterial Superconductor Hints At New Era Of High Temperature Superconductivity

Experiment hints at a new way to engineer high temperature superconductors

Superconductors are among the wonders of modern science. They allow a current to flow with zero resistance in materials cooled below some critical temperature. Superconductors are the crucial ingredients in everything from high-power magnets and MRI machines to highly sensitive magnetometers and magnetic levitation devices.

One problem, though, is that superconductors work only at very low temperatures. So one of the great challenges in this area of science is to find materials that superconduct at higher temperatures, perhaps even at room temperature. That won’t be easy given that the current record is around 150 kelvin (about -120 degrees Celsius). Nevertheless, a way of increasing the critical temperature of existing superconducting materials would be hugely useful.

Today, a group of physicists and engineers say they have worked out how to do this. The trick is to think of a superconductor as a special kind of metamaterial and then to manipulate its structure in a way that increases its critical temperature.

Vera Smolyaninova at Towson University in Maryland and colleagues from the University of Maryland and the Naval Research Laboratory in Washington DC, have even demonstrated this idea by increasing the critical superconducting temperature of tin.

First some background about metamaterials. Until relatively recently, physicists had always treated bulk materials as homogeneous lumps of the same stuff. These lumps have bulk properties such as the ability to bend light in a certain way.

But in recent years they have begun to think about constructing artificial materials made of periodic patterns of structures that themselves interact with electromagnetic waves – things like wires, c-shaped conductors and so on. If these structures are much smaller than the wavelength of the light passing by, then they act like a homogeneous lump, at least as far as the light is concerned.

By toying with this periodic structure, physicists can create artificial materials with all kinds of exotic properties. The most famous of these is the invisibility cloak, a metamaterial designed to steer light around an object as if it were not there.

Superconductivity can be thought of in a similar way, say Smolyaninova and co. Conventional superconductors made of a single metal are homogeneous lumps of the same stuff that have zero resistance at some critical temperature…..
… read more at medium.com

Building the Universe Pixel by Pixel

Three ‘mapmakers’ of the universe – Ralf Kaehler, Stuart Levy and Dylan Nelson – discuss how their dramatically intricate 3-D universes can tell important stories about the cosmos.

Recently, the Harvard-Smithsonian Center for Astrophysics unveiled an unprecedented simulation of the universe’s development. Called the Illustris project, the simulation depicts more than 13 billion years of cosmic evolution across a cube of the universe 350 million light-years on each side. The goal was to view the formation of galaxies and other large-scale structure we see around us today, and to test our understanding of what makes up the universe – including dark matter and dark energy – as well as how those components interact. It was a massive undertaking, one that took more than five years to complete. But why was it important to conduct such a simulation?

To better understand the science and art of astrophysics visualizations, three experts came together in late July to discuss the ways in which their work benefits both science and the public’s perception of science. The participants:
RALF KAEHLER – is a physicist and computer scientist by training who now runs the visualization facilities at the Kavli Institute for Particle Astrophysics and Cosmology, located at SLAC National Accelerator Laboratory and Stanford University.
STUART LEVY – is a research programmer and member of the National Center for Supercomputing Applications’ Advanced Visualization Lab team, which creates high-resolution data-driven scientific visualizations for public outreach.
DYLAN NELSON – is a graduate student at the Harvard-Smithsonian Center for Astrophysics and a member of the Illustris collaboration, which recently completed a large cosmological simulation of galaxy formation.

… you can read the whole discussion here: www.kavlifoundation.org