Monday, February 24, 2020

The Virtual Image: Keeping It Real

When recently asked to explain the operation of a magnifying glass, one of us (MHB) revisited an old conundrum from secondary school: the virtual image. When the viewed object is within the focal length of the lens, the image that forms is not inverted relative to the object. Far more mystifying, it lies on the same side of the lens as the light source, not on the side toward which the light actually travels. Yet we see virtual images all the time.

You can see the traditional picture of a virtual-image situation in the part of Fig. 1 that includes the “eyepiece” lens and everything to the left of it. The “object” is a short, downward-pointing black arrow. The “virtual image” is the longer, dashed black arrow at the far left. The bases of the object and of the image lie on the axis of the lens. The tips of the arrows in object and image are connected by two rays (dashed black lines delimiting a pink area). One of these rays passes through the center of the lens, and in the ideal (thin-lens) case this ray is undeflected by the lens. The other ray starts out parallel to the lens axis and is bent by the lens so that it passes through the focal point (not shown) to the right of the lens. (A collection of such rays from the Sun could be concentrated enough to start a fire.) If the object had been to the left of the left-hand focal point of the lens, this bent axial ray would have intersected the undeflected ray so as to form a real image: inverted relative to the object, on the side of the lens toward which the light propagates, and generally understandable. But when, as in Fig. 1, the object lies to the right of the left-hand focal point, the two rays diverge to the right of the lens, and instead, we are told, we must backtrack both rays until they intersect on the left-hand side; that intersection is part of the virtual image.
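
To put numbers on this picture, here is a minimal Python sketch of the thin-lens equation, 1/s_o + 1/s_i = 1/f, with the usual sign convention (positive image distance means a real image on the far side of the lens; negative means a virtual image on the object’s side). The focal length and object distances are invented for illustration and are not taken from Fig. 1.

```python
# Thin-lens equation: 1/s_o + 1/s_i = 1/f
# Sign convention: s_i > 0 -> real image (opposite side of the lens),
#                  s_i < 0 -> virtual image (same side as the object).
# Numbers are illustrative only, not taken from Fig. 1.

def image_distance(s_o, f):
    """Image distance for object distance s_o and focal length f."""
    return 1.0 / (1.0 / f - 1.0 / s_o)

f = 10.0                          # eyepiece focal length, cm
for s_o in (15.0, 6.0):           # object outside, then inside, the focal length
    s_i = image_distance(s_o, f)
    m = -s_i / s_o                # transverse magnification (negative = inverted)
    kind = "real, inverted" if s_i > 0 else "virtual, upright"
    print(f"s_o = {s_o:4.1f} cm -> s_i = {s_i:6.1f} cm, m = {m:+.2f} ({kind})")

# -> s_o = 15 gives s_i = +30 (real, inverted); s_o = 6 gives s_i = -15 (virtual, upright)
```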

How can an image form that is coincident neither with the eye nor with the light that should be generating that image? Virtual or not, this deserves an explanation.

As optical engineers will tell you (if pressed), the virtual image is a way of expressing a geometric relationship without involving the actual eye.  As soon as you include the eye in the explanation (see Fig. 1), the paradox is resolved.  The rays that diverge as they pass to the right of the eyepiece are brought back together in focus by the lens of the eye.  Where these rays meet, one finds a real image on the retina, shown in Fig. 1 as a white arrow. It is a real image: it is where the light actually ends up and where vision takes over, and it is in an inverted configuration that the visual system interprets correctly. (That is a subject for another essay.)
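
The resolution can also be checked numerically by treating the eye as a second thin lens whose object is the eyepiece’s virtual image. The sketch below re-defines the same thin-lens helper as before; the eyepiece focal length, lens separation, and effective eye focal length are illustrative assumptions, not anatomical values.

```python
# Two thin lenses in series: eyepiece, then the eye's lens.
# The eyepiece's (virtual) image becomes the object for the eye.
# All distances in cm; the values are illustrative assumptions, not anatomy.

def image_distance(s_o, f):
    return 1.0 / (1.0 / f - 1.0 / s_o)

f_eyepiece = 10.0      # eyepiece focal length
f_eye      = 1.8       # effective focal length assumed for the accommodated eye
separation = 4.0       # eyepiece-to-eye distance

s_o1 = 6.0                                  # object inside the eyepiece's focal length
s_i1 = image_distance(s_o1, f_eyepiece)     # negative: virtual image on the object side
print(f"Eyepiece image distance: {s_i1:.1f} cm (virtual)")

# Object distance for the eye = lens separation + |virtual image distance|
s_o2 = separation - s_i1                    # s_i1 is negative, so this adds
s_i2 = image_distance(s_o2, f_eye)          # positive: a real image behind the eye lens
print(f"Eye's object distance:   {s_o2:.1f} cm")
print(f"Real image forms {s_i2:.2f} cm behind the eye's lens (ideally, on the retina)")
```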

The important thing to remember about the virtual image is that it is a shorthand for a fairly intricate relationship among the rays that form the real image once the explanation includes the eye. That relationship is only partially apparent in Fig. 1.  One gets the false impression that the bent axial ray from the eyepiece lens becomes the undeflected ray of the eye’s lens, and vice versa.  Actually, the bent axial ray from the eyepiece lens passes through the right-hand focal point of the eyepiece, which only coincidentally happens to be near the center of the eye’s lens.

A clearer picture of the eye’s focusing of the diverging rays is shown in Fig. 2, a ray-tracing simulation.  Here the optical elements are more stylized than in Fig. 1: the two lenses are red-tipped line segments, the object consists of a pair of radiating green dots, and rays from the dots proceed through both lenses and meet at the right-hand convergence points that depict the real image on the retina.  The virtual image is too far to the left to be captured by the figure.  We think that both Figs. 1 and 2 are needed to show how the eye acts in a virtual-image situation.
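
For readers who would like to reproduce something like Fig. 2 without the online simulator, here is a minimal paraxial ray trace in the ray-transfer (“ABCD”) style, under the same thin-lens idealization. Each ray is a (height, slope) pair: free space changes the height, and a thin lens kinks the slope. The focal lengths and spacings are invented for illustration and are not the values used in Fig. 2.

```python
# Minimal paraxial ray trace through two thin lenses (ray-transfer / ABCD style).
# A ray is (y, u): height above the axis and slope. Distances in cm.
# Focal lengths and spacings are illustrative assumptions, not Fig. 2's values.

def propagate(ray, d):
    y, u = ray
    return (y + d * u, u)          # free space: height changes, slope does not

def thin_lens(ray, f):
    y, u = ray
    return (y, u - y / f)          # thin lens: slope changes, height does not

f_eyepiece, f_eye = 10.0, 1.8      # focal lengths
d_obj_to_eyepiece = 6.0            # object inside the eyepiece's focal length
d_eyepiece_to_eye = 4.0
d_eye_to_retina = 1.99             # image distance found in the previous sketch

y_obj = 1.0                        # object point 1 cm off the axis
for u0 in (-0.3, -0.1, 0.0, 0.1, 0.3):        # a fan of rays leaving that point
    ray = (y_obj, u0)
    ray = propagate(ray, d_obj_to_eyepiece)
    ray = thin_lens(ray, f_eyepiece)
    ray = propagate(ray, d_eyepiece_to_eye)
    ray = thin_lens(ray, f_eye)
    ray = propagate(ray, d_eye_to_retina)
    print(f"initial slope {u0:+.1f} -> height at retina {ray[0]:+.4f} cm")
# All rays land at (nearly) the same negative height: a real, inverted image point.
```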

In short, experts know the virtual image and how to “keep it real.” Now you know too.
 
Michael H. Brill and Nilesh Dhote
Datacolor


Fig. 1. Drastically abbreviated depiction of a virtual image and the real counterpart that emerges when the eye is included. Please ignore the yellow region. The figure in context can be found at https://micro.magnet.fsu.edu/primer/anatomy/components.html.


Fig. 2.  Simulation of formation of a real image using two lenses. The divergence of the rays through the left-hand (eyepiece) lens is evident from the rays that proceed outside the lens aperture of the eye.  (See https://ricktu288.github.io/ray-optics/simulator/.)


Friday, November 15, 2019

Black to the Future Redux


A bit more than a decade ago, Hue Angles presented an article called “Black to the Future” [1]. To further darken a black carbon surface, investigators at Rensselaer Polytechnic Institute and Rice University roughened the surface with a carpet-like arrangement of carbon nanotubes (0.01 inch long and 1/30,000 as wide) standing on their ends. The result was a surface with a reflectance as low as 0.045 percent (three times darker than any previous material) and a refractive index that could theoretically be as low as 1.01 [2] (see the rough estimate after the list below). We proposed possible uses for such a material in spectrophotometry:
 
1. Black surfaces for minimizing stray light in optical instruments

2. Light traps for suppressing unwanted diffraction orders

3. Gloss traps for removing specular reflection

4. Black calibration standards
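
As an aside, the link between a near-unity refractive index and very low reflectance can be made concrete with a back-of-the-envelope Fresnel calculation at normal incidence. This is only a rough sketch: the nanotube “carpet” also traps light geometrically, so its measured reflectance is not set by this formula alone.

```python
# Fresnel reflectance at normal incidence from air (n = 1) into a medium of index n:
#   R = ((n - 1) / (n + 1))**2
# A rough illustration only; real nanotube forests also trap light geometrically.

def fresnel_reflectance(n):
    return ((n - 1.0) / (n + 1.0)) ** 2

for label, n in [("ordinary glass (n ~ 1.5)", 1.5),
                 ("nanotube array, effective n ~ 1.01", 1.01)]:
    print(f"{label}: R = {100 * fresnel_reflectance(n):.4f} %")

# -> ordinary glass: R = 4.0000 %; nanotube array: R = 0.0025 %
```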
 
Now, after more than a decade, how well did our wish list work out? Not well, at least for commercial applications. At the ISCC topical meeting on black and white, held that same year, it became clear that carbon-nanotube technology was too delicate and too expensive for our purposes. But the technology evolved and improved anyway, and new uses were found.

Beginning in 2014, Surrey NanoSystems offered a product called Vantablack, which reflects only 0.04 percent of the UV, visible, and IR radiation falling on it. Vantablack had the same mechanical vulnerability as its predecessors, so it did not find many applications on Earth. In space, however, the substance could remain undisturbed, and starting in 2015 it helped capture stray light to enhance spaceborne imagery (e.g., star tracking) without a large payload penalty.

Back on Earth, Vantablack achieved an effect coveted by artists: three-dimensional objects covered with it appeared to be flat surfaces, because not enough light was reflected to reveal the 3D topography. BMW even painted a car with Vantablack. By 2017, a version of Vantablack (S-VIS) became available in a spray-on form. The reflectance was understandably not quite so low in this form: 0.2 percent. But the material still served its various functions. Although Vantablack is not commercially available off the shelf, Surrey NanoSystems has licensed the product.
 
Now there is a material that is blacker still. It emerged from laboratories in Shanghai and at the Massachusetts Institute of Technology [3], and it has a reflectance of 0.004 percent. The discovery was accidental, made during attempts to grow carbon nanotubes on aluminum foil. To avoid the formation of oxides between the nanotubes and the foil, the investigators soaked the foil in salt water and then moved it into a small oven where the nanotubes could grow without interference from oxygen.
 
A popular article by Brandon Specktor [4] describes two implications of the new black technology. A $2 million diamond on exhibit at the New York Stock Exchange has been covered with the material and is invisible against a background of similar black material. Specktor speaks poetically of the black material “eating” the diamond and calls the coating a “veritable black hole.” Indeed, as he suggests, we may soon be able to see real black holes if the new black material is deployed in optical instruments in space. But I don’t expect to see signs reading “Schwarzschild radius” in any photos, though we have seen cartoons of other human-created follies (such as Pluto bedecked with the sign “am too a planet”). Enough about black holes…
 
In summary, the last ten years have brought a factor-of-10 decrease in the reflectance of the blackest black, from 0.045 percent to 0.004 percent. We’ve achieved a decade in a decade. Stay tuned for the next decade.
 
[1] M. H. Brill, A. Ingleson, and C. McLellan, Black to the future. ISCC News #434 (2008), 3-4.

[2] Z.-P. Yang, L. Ci, J. A. Bur, S.-Y. Lin, and P. M. Ajayan, Experimental observation of an extremely dark material made by a low-density nanotube array. Nano Letters 8, No. 2 (Feb. 2008), 446-451.

[3] K. Cui and B. L. Wardle, Breakdown of native oxide enables multifunctional, free-form carbon nanotube–metal hierarchical architectures. ACS Appl. Mater. Interfaces (published September 12, 2019); https://doi.org/10.1021/acsami.9b08290

[4] B. Specktor, There’s a new blackest material ever, and it’s eating a diamond as we speak. Live Science, Sept. 16, 2019, https://www.livescience.com/blackest-black-devours-diamond.…

Michael H. Brill
Datacolor