SPIDER shrinks telescopes with far-out design
In the space business, weight and size are what run up the bills. So imagine the appeal of a telescope that’s a tenth to as little as a hundredth as heavy, bulky and power hungry as the conventional instruments that NASA and other government agencies now send into space. Especially alluring is the notion of marrying the time-tested technology called interferometry, used in traditional observatories, with the new industrial field of photonics and its almost unimaginably tiny optical circuits.
Say hello to SPIDER, or Segmented Planar Imaging Detector for Electro-optical Reconnaissance.
Still at an early stage of development, SPIDER so far exists mainly as computer simulations and a few prototype components. But its inventors believe that, once demonstrated at full scale, SPIDER will replace standard telescopes and long-range cameras in settings where room is scarce, such as on planetary probes and reconnaissance satellites.
Researchers at the Lockheed Martin Advanced Technology Center in Palo Alto, Calif., with partners in a photonics lab at the University of California, Davis, have described work on SPIDER for several years at specialty conferences. In January, they revealed their progress with a splash to the public in a press release and polished video.
Somewhat like a visible-light version of a vast field of radio telescopes, but at a radically smaller scale, a SPIDER scope’s surface would sparkle with hundreds to thousands of lenses about the size found on point-and-shoot cameras. The instrument might be a foot or two across and only as thick as a flat-screen TV.
Transit system for light
SPIDER probably won’t be equivalent to a large instrument such as the Hubble Space Telescope, but it could be a smaller, lighter alternative to modest telescopes and long-range cameras. Experts tend to rank telescopes by their aperture — the size of the bucket that catches light or other such radiation. The wider the bucket’s mouth, the higher the resolution. Ordinarily, behind the bucket’s maw is an extensive framework for massive lenses, mirrors and heating or cooling systems. Hubble’s aperture spans 2.4 meters; its power-generating solar panels enlarge it to the size and weight of a winged city bus. Even a compact telescope with a saucer-sized lens might have more than a kilogram of equipment stretched behind its face for a third of a meter or so.
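The link between aperture and resolution that the paragraph above describes can be put into numbers with the Rayleigh criterion, which gives the smallest angle a circular aperture can resolve. As a rough, back-of-envelope sketch (the 550 nm wavelength is an assumed value for visible green light, not a figure from the article):

```python
import math

def rayleigh_limit_rad(wavelength_m, aperture_m):
    """Rayleigh diffraction limit: smallest resolvable angle, in radians."""
    return 1.22 * wavelength_m / aperture_m

# Hubble's 2.4-meter mirror observing green light (550 nm, an assumed value)
theta = rayleigh_limit_rad(550e-9, 2.4)
arcsec = math.degrees(theta) * 3600  # convert radians to arcseconds
```

For Hubble this works out to a few hundredths of an arcsecond, which is why a wider "bucket" means sharper pictures: doubling the aperture halves the smallest resolvable angle.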
Alan Duncan, a senior fellow at Lockheed Martin’s Advanced Technology Center, has devoted much of his career to space and reconnaissance imaging. He often focuses on interferometry, a method astronomers have long used to combine electromagnetic waves — both radio and visible — from several different telescopes. The results, with the help of computers, are images more sharply focused than from any of the smaller telescopes or radio dishes. Yet even with the leverage of conventional interferometry, Duncan struggled to slash the SWaP: size, weight and power demand.
His ambitions leapt at the Photonics West 2010 meeting in San Francisco. He learned that IBM researchers had a supercomputer design that would need relatively little energy to cool its electronic innards. They proposed finely laced channels through which data-filled beams of light would travel to deliver the computer’s output data. The setup would require a fraction of the energy of standard, integrated electronic chips that use metal wiring.
Duncan stared at the skeins of optical channels and the millions of junctions portrayed on the screen during the IBM talk. He recalls seeing “about as many optical interconnects as a digital camera has pixels.” (A point-and-shoot camera’s pictures can have several megapixels, or millions of individual dots.) He imagined turning IBM’s tactic on its head. “They create photons in the chip, impose information on them and send them out to be decoded. What if you captured the light waves on the outside?” Duncan says. “The photons already have the [image] information you want.… You have to decode it inside the device. The decoder is the interferometer.”
The IBM people had not designed an interferometer, of course, but their optical circuitry seemed sophisticated enough to be adaptable to interferometry. Duncan figured that the fast-growing photonics industry already had or would soon invent fabrication solutions that his suddenly imagined telescope could use. Already, photonics companies were selling machines to create transparent channels or waveguides only a few millionths of a meter wide.
Considerably smaller than the fibers bundled into fiber-optic cables that carry data across continents and under oceans, photonic waveguides are made by finely focused, pulsating laser beams. As the beams scan along inside silicon-based photonic integrated circuits, or PICs, they leave behind close-packed strings in molten silicon that swiftly merge and cool. The resulting trails of transparency are superb transit systems for light, and they can be laser-incised in any pattern desired. Similar wizardry can shrink the scale of other optical gadgetry, such as filters that sort the signals by color, or the interferometric optics that mix signals from different lenses in a SPIDER scope.
Decoding fringes
Interferometry does not produce pictures the way a conventional telescope does. Telescopes refract a scene’s incoming light through lenses or bounce it off mirrors. The lenses or mirrors are shaped so that light from a given part of a scene converges on a corresponding place on a photosensitive surface, such as the image chip of a digital camera, much as it does on the retina of an eye.
Interferometry, instead, gathers signals from pairs of receivers — sometimes many pairs — all aimed at the same scene. It combines the signals to reveal the slight differences in the phases and strengths of the radio, light or other waves. The separate wave trains, or signals, are projected on a screen in an interferometry chamber as patterns of light and dark fringes where the signals from the paired receivers reinforce or counteract each other. The fringes, somewhat resembling checkout counter bar codes, carry a distinct, encoded hint of the difference in the viewed object as seen from the receivers’ offset positions in the aperture. With enough measurements of fringes from enough pairs of waves gathered by enough small receivers, a computer can deduce a picture as sharp as one from a telescope with a lens as wide as the distance between the most widely spaced receivers, which on a SPIDER’s face would be the outermost lenslets.
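The fringes described above come from a simple geometric fact: for a source slightly off the optical axis, light reaches one receiver of a pair a little later than the other, and that extra path length sets the phase when the two signals are combined. A minimal sketch of a two-receiver fringe pattern, with an assumed 30 cm baseline and green light (neither figure is from the article):

```python
import numpy as np

def fringe_intensity(baseline_m, theta_rad, wavelength_m):
    """Combined intensity of two equal signals from receivers a baseline apart.

    A point source offset by angle theta adds an extra path length of
    baseline * theta to one receiver, which sets the fringe phase.
    """
    phase = 2 * np.pi * baseline_m * theta_rad / wavelength_m
    return 2 * (1 + np.cos(phase))  # swings between bright (4) and dark (0)

B, lam = 0.30, 550e-9                 # assumed: 30 cm baseline, 550 nm light
thetas = np.linspace(0, 5e-6, 1000)   # microradian-scale viewing angles
pattern = fringe_intensity(B, thetas, lam)
```

The fringes repeat every λ/B radians, so a longer baseline packs the fringes closer together and encodes finer detail about the scene, which is why resolution tracks the widest receiver spacing.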
Building a tiny version of this using photonics requires separate sets of waveguides for different colors or “spectral bins.” The more bins used, the more accurately an object can be portrayed. But each such layer of complexity aggravates the chore of fabrication.
So even a bare-bones SPIDER may need thousands of waveguides. Advanced SPIDERs may have millions of them. As far as Duncan knows, SPIDER would be the most complicated interferometer ever made.
Spycraft and space views
After his epiphany, Duncan began working with Lockheed colleagues, chiefly technology expert Richard L. Kendrick. Computer simulations convinced them that their mini-interferometer should work. In 2012, Lockheed Martin filed for a patent — granted in late 2014 — naming the two men as the inventors. Reflecting the company’s defense ties, the document provides a hypothetical application: SPIDER in a proposed, high-altitude Pentagon recon drone called Vulture, perhaps built into the curved bottom of a wing.
Initial simulations showed how SPIDER’s pictures of one satellite taken from another, or of buildings as seen from space, compare with pictures by standard long-range cameras. Interferometric images, reconstructed through the complex calculations of Fourier transforms, often show extra flares and streaks. Nonetheless, to a layman’s eye, the simulated SPIDER images look about the same as equivalent ones from standard lens or mirror telescopes.
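Those flares and streaks have a simple origin: an interferometer measures only some of a scene's spatial-frequency (Fourier) components, and reconstructing an image from an incomplete Fourier plane leaves artifacts. A toy illustration of the effect, using a random sampling mask rather than a real interferometer's baseline pattern:

```python
import numpy as np

# A toy scene: a bright square "target" on a dark background.
rng = np.random.default_rng(0)
img = np.zeros((64, 64))
img[28:36, 28:36] = 1.0

F = np.fft.fft2(img)                # the full spatial-frequency plane
mask = rng.random(F.shape) < 0.3    # keep only ~30% of the components
recon = np.fft.ifft2(F * mask).real # rebuild from the incomplete set

# The missing frequencies corrupt the reconstruction:
residual = np.abs(recon - img).max()
```

A real interferometer samples frequencies along its specific receiver baselines rather than at random, which is why its artifacts take the form of structured flares and streaks instead of diffuse noise, but the underlying cause, missing Fourier components, is the same.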
If SPIDER pans out, its inventors imagine uses beyond spycraft. NASA is planning a mission to orbit Jupiter’s moon Europa (SN Online: 5/26/15). The SPIDER team calculates that, given the same space already assigned to a conventional imager, SPIDER could inspect 10 times the terrain at 17 times better resolution. With its wider array of lenslets, or receivers, SPIDER should also be able to take pictures from points farther out on the craft’s elliptical orbit and should have a wider field of view.
One proposed design for the first fully operable, if spartan, SPIDER has 37 radial blades, each backed by a single photonic chip with 14 lenslets along one edge. The whole model would be about the size of a dinner plate. Eventually, a SPIDER might be built on the face of a single chip of similar or larger size. That would allow more lenslets to be fitted, and permit waveguides to pair them up from anywhere in the aperture. Upshot: more “eyes” packed into the same space.
The Lockheed group has begun to fabricate test components in partnership with a photonics laboratory led by Ben Yoo, professor of electrical and computer engineering at UC Davis. DARPA, the Department of Defense’s agency for funding advanced research, granted about $2 million for prototype photonic integrated circuits and other gear to test the idea’s feasibility.
The technical challenges are extreme. Each tiny lenslet could need 200 or more separate waveguides leading from its focal area to the interferometers. For a fairly simple SPIDER scope, that would mean tens of thousands of waveguides coursing through the chips’ insides — perhaps fabricated many layers deep. So far, the researchers have built prototype components with only four lenslets, too few to get images.
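The scale of the plumbing problem follows directly from the numbers quoted above. A back-of-envelope tally for the 37-blade design, taking the article's figure of 200 waveguides per lenslet as a rough upper bound:

```python
blades = 37               # radial photonic-chip blades in the proposed design
lenslets_per_blade = 14   # lenslets along one edge of each chip
waveguides_per_lenslet = 200  # the article's rough upper figure

lenslets = blades * lenslets_per_blade          # total receivers
waveguides = lenslets * waveguides_per_lenslet  # total optical channels
```

That comes to a bit over 500 lenslets and on the order of a hundred thousand waveguides at the upper bound, which makes plain why the prototypes built so far, with only four lenslets, are still a long way from an imaging instrument.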
Skeptics and a crusader
At least one top authority says the scheme is nonsense. Others are more amused than critical. Michael Shao, an MIT-trained astronomer and project scientist at NASA’s Jet Propulsion Laboratory in Pasadena, Calif., has extensive experience with interferometry. He calls the concept of SPIDER “fundamentally sound,” but adds that it will require such extensive optical plumbing on a photonic scale that the sheer complexity “would scare a lot of folks away.” If the SPIDER team makes it work, great. “But it is a lot of work to save a little space.”
Peter Tuthill, an astronomer at the University of Sydney in Australia, leads one of the world’s busiest interferometry groups. His team has augmented such large conventional ground-based telescopes as the Keck Observatory in Hawaii with auxiliary interferometers. His group also designed an interferometer to be included on the James Webb Space Telescope, planned successor to the Hubble. After looking over the SPIDER proposal, he declared by e-mail, “I think the argument made that this can be somehow cheaper, simpler, lower mass and higher performance than conventional optics appears not to pass the laugh test.”
The extremely large number of waveguides in the SPIDER design, he added, would leave the signal strength per waveguide too feeble — hence vulnerable to swamping by noise in the system. “In short, I don’t think [the SPIDER team members] are waiting for technology to enable their platform. I think they are waiting for a miracle that defies physics.”
Duncan just smiles when he hears Tuthill’s opinion. Even if technical difficulties delay or quash this initial SPIDER project, he is confident somebody will step in and surmount any barriers. “It will happen,” he says.