
Seeing in the Dark

The largest and most densely populated habitat on Earth is the ocean realm. As Claire Nouvian observes, “The deep sea, which has been immersed in total darkness since the dawn of time. . . forms the planet’s largest habitat.” Zoologists and oceanographers are far from documenting the welter of sea life. “Current estimates about the number of species yet to be discovered vary between 10 and 30 million,” writes Nouvian. “By comparison, the number of known species populating the planet today, whether terrestrial, aerial, or marine, is estimated at about 1.4 million.”39 Down-welling light, whether sunlight or moonlight, in the upper reaches of the ocean’s water column can reveal an animal’s location or hideout. To avoid predators, marine animals largely forage for food at night, under cover of darkness. The result is that the ocean’s water column is the site of the largest daily animal migration known, with marine animals migrating vertically anywhere from tens of meters to a kilometer. Harbor Branch Oceanographic Institute scientist Marsh Youngbluth explains:

Each evening and morning all the oceans and even the lakes of the world are a theater to mass movements involving billions and billions of creatures swimming from deep waters to the surface and then back again to a colder, darker world. . . . Sixty years ago, when active sonar was first available, captains of fishing vessels thought that the bottom was rising under the boat. This phenomenon, called the “vertical migration,” is the largest synchronized animal movement on Earth. . . . Like nomads in deserts, they move to and from the surface waters in search of an oasis: the first 100m of the sea, where the sunlight still penetrates, the “photic” zone, abounding with food. . . . The signal that triggers and resets this ritual is the daily waning and waxing of sunlight, so travel frequently starts at dusk and ends at dawn.40

Although the deep sea is one place sunlight cannot reach, light and vision are no less critical there than for landlocked creatures. The Spookfish, for instance, lives at 1,000 meters’ depth in cold, dark Pacific Ocean waters and has reflective telescopes for eyes.41 Spookfish weren’t exactly the first astronomers, but unlike any other vertebrates, their eyes include crystal mirrors that focus light on the retina. And some marine animals have evolved the equivalent of wearing Polaroid sunglasses—they can see polarized light, which helps them discern jellyfish and other nearly transparent species moving through the water column.

Deep below the coral reefs, ocean animals produce their own light to see and lure prey, signal distress or otherwise communicate, attract a mate, repel and stun predators, and countershade themselves.42 Nouvian contends that bioluminescence is the most common mode of communication on the planet. Approximately 90 percent of all ocean animals produce or deploy bioluminescence in some capacity. Bioluminescence is also referred to as cold body radiation or “cold light” because nearly 98 percent of the energy used in generating this living light goes into light rather than heat, presumably protecting the luminescent organism from being detected by predators that sense infrared.

Bioluminescence has evolved independently some forty to fifty times as a result of its clear survival advantages. While fireflies provide the most familiar example, a handful of land-based organisms luminesce, including fly larvae as well as some earthworms, snails, mushrooms, and centipedes. In the ocean realm, bioluminescence is the rule rather than the exception. Ranking among the bioluminaries are bacteria, snails, krill, squid, eel, jellies, sponges, corals, clams, sea worms, Green Lanternsharks, Megamouth Sharks, and an array of fish. Inhabitants of mid-water habitats luminesce at varying intensities to countershade themselves and blend in with the down-welling light so informative in the ecology of the water column. Lanternfishes, of which there are at least two hundred species, can illuminate photophores along the entire length of their bodies to provide countershading. “Photophores are often highly evolved structures, consisting not only of a light-generating cell, but also a reflector, lens, and color filter,” explains Scripps Institution oceanographer Tony Koslow. “Most photophores emit light in the blue end of the spectrum to match the wavelength of down-welling light.”43

Viperfish and deep-sea anglerfishes carry a kind of fishing rod with a lighted lure that attracts sizable prey directly into their toothy mouths. Marine biologist Edith Widder notes other deployments of bioluminescence in the ocean depths:

There are many fishes, shrimps, and squids that use headlights to search for prey and to signal to mates. . . . And as with some cars, some headlights can be rolled down and out of sight when they are not in use—a handy way of hiding that reflective surface and allowing the fish to better blend into the darkness. Most headlights in the ocean are blue, which is the color that travels furthest through seawater and the only color that most deep-sea animals can see. But there are some very interesting exceptions like dragonfish with red headlights that are invisible to most other animals but that the dragonfish can see and use like a sniper scope to sneak up on unsuspecting, unseeing prey.44

Deep-sea prawns and some squid emit bioluminescent fluids to startle predators and evade being eaten. Koslow reports, “Some jellyfishes jettison bioluminescing tentacles before making their escape, much as a lizard may let go its still-writhing tail to occupy its predator while effecting its escape.”45

Plate 1. More than a century after Percival Lowell mapped Mars, the Hubble Space Telescope turned its best camera onto the red planet, producing this true color image in March 1997, just after the last day of Martian spring in the northern hemisphere. Mars was near its closest approach to Earth, 60 million kilometers away (NASA/David Crisp and the WFPC2 Camera Team).

Plate 2. Mars as seen by the Viking 2 lander. Images like this dashed the more fevered speculation of Mars as a living planet stemming from Percival Lowell and subsequent science fiction; it was revealed as a frigid and arid desert with a tenuous atmosphere, where water cannot exist for more than a moment on the surface before evaporating (NASA/Mary Dale-Bannister, Washington University in St. Louis).

Plate 3. Artist’s conception of one of the Mars Exploration Rovers on Mars. Spirit and Opportunity each far exceeded their life expectancy and have performed at a very high level in the unforgiving conditions on the surface of Mars. The rovers were identical as pieces of hardware, yet had quite different experiences and adventures. Opportunity has exceeded its design lifetime by a factor of more than 35 (NASA/Jet Propulsion Laboratory).

Plate 4. Victoria Crater at Meridiani Planum on Mars, which is about 730 meters in diameter. Opportunity spent over two years exploring the rim of the crater, occasionally venturing inside, and by mid-2011 had traveled over 20 miles across the Martian terrain. Opportunity had a lucky bounce on landing and extricated itself from the kind of sand dune that disabled Spirit (NASA/Jet Propulsion Laboratory).

Plate 5. The four Galilean moons of Jupiter, visited by the Voyager spacecraft in 1979. In decreasing size order, the moons are Ganymede, Callisto, Io, and Europa; Jupiter is not shown at the same scale. Ganymede is 5300 km in diameter. The spacecraft made many discoveries in the Jovian moon system, including volcanism on Io and a subsurface ocean on Europa (NASA Planetary Photojournal).

Plate 6. Attached to the body of each Voyager spacecraft is a gold-plated, copper phonograph record. The record contains musical selections, images, and audio greetings in many world languages, along with instructions on how to retrieve the information. The analog technology will be very durable in the far reaches of space (NASA/Jet Propulsion Laboratory).

Plate 7. An artist’s impression of a close-up view of Saturn’s rings. The rings are thought to be made of material unable to form a moon because of Saturn’s tidal forces, or to be debris from a moon that broke up due to tidal forces. The particles are made of ice and rock and range in size from less than a millimeter to tens of meters across (NASA/Marshall Image Exchange).

Plate 8. A panoramic view of Saturn’s moon Titan, from the Huygens lander during its descent to the surface in late 2005. In this fish-eye view from an altitude of three miles, a dark, sandy basin is surrounded by pale colored hills and a surface laced with stream beds and shallow bodies of liquid composed of methane and ethane (NASA/ESA/Descent Imager Team).

Plate 9. Stardust collected samples of a comet and interstellar dust samples using a particle collector with cells containing aerogel, which is an amorphous, silica-based material that is strong yet exceptionally light. Particles entered this solid foam at high speed and were decelerated and trapped. The spacecraft returned its samples to Earth in 2006 (NASA/Jet Propulsion Laboratory).

Plate 10. This composite image was taken of comet Wild 2 by Stardust during its close approach in early 2004. The comet is about 3 miles in diameter. Its surface is intensely active, with jets of gas and dust spewing millions of miles into space. This image is a hybrid: a short exposure captures the jet while a long exposure captures the surface features (NASA/Jet Propulsion Laboratory).

Plate 11. SOHO took this image of the Sun in January 2000. The relatively placid optical appearance belies the intense activity seen in this ultraviolet image, where a huge, twisting prominence has escaped the Sun’s surface. When these events are pointed at the Earth, telecommunications and power grids can be affected (NASA/SOHO).

Plate 12. This computer representation shows one of millions of modes of sound wave oscillations of the Sun, where receding regions are colored red and approaching regions are colored blue. The Sun “rings” like a bell, with many complex harmonics, and study of the surface motions can be used to diagnose the interior regions (NSO/AURA/NSF).


Plate 13. People have used the sky as a map, a clock, a calendar, and as a cultural and spiritual backdrop since antiquity. This celestial map was produced in the seventeenth century by the Dutch cartographer Frederick de Wit. Constellations and star patterns are unchanging from generation to generation, so the stars were seen as being eternal (Wikimedia Commons/ Frederick de Wit).

Plate 14. We live in a city of stars, seen here in a full-sky panorama of the Milky Way photographed from Death Valley in California. The ragged band of light represents a view through the disk of the spiral galaxy we inhabit, and Hipparcos has mapped out the nearby regions of the disk and parts of the extended halo (U. S. National Park Service/Dan Duriscoe).

Plate 15. This Spitzer Space Telescope image shows star formation around the Omega Nebula, M17. This Messier object is a nebulosity around an open cluster of three dozen hot, young stars, about 5,000 light-years away. Spitzer records information at invisibly long wavelengths, and the difference between the view in the infrared (top) and the optical (bottom) can be dramatic. Colors in the infrared view represent different temperature regimes, with red coolest and blue hottest (NASA/JPL-Caltech/M. Povich).

Plate 16. The Spitzer Space Telescope detected molecules of buckminsterfullerene, or “buckyballs,” in a nearby galaxy, the Small Magellanic Cloud. The first zoom shows the type of planetary nebula where the molecules were found, and the second shows the molecule structure, where sixty carbon atoms are arranged like a tiny soccer ball (NASA/SSC/Kris Sellgren).

Plate 17. NASA’s Great Observatories are multi-billion-dollar missions with complex instrument suites, designed to answer fundamental questions in all areas of astrophysics. Including the ground-based Atacama Large Millimeter Array (ALMA), they can diagnose the universe at temperatures ranging from tens to tens of billions of Kelvin (NASA/CXC/M. Weiss).

Plate 18. The dying star that produced this great bubble of hot glowing gas was first noted by Tycho Brahe in 1572. A white dwarf detonated as a supernova when mass falling in from a companion triggered its collapse; the shock wave from the subsequent explosion led to the blue arc. The surrounding material is iron-rich and highly excited iron atoms create spectral lines detectable at X-ray wavelengths (NASA/CXC/Chinese Academy of Sciences/F. Lu).

Plate 19. Two images of a pillar of star birth, three light-years high, in the Carina nebula, about 7,500 light-years away. Images taken through different filters select different wavelength ranges, which are combined into “true color” composites, where the colors convey astrophysical information in either visible light or infrared waves. Images like this have turned nebulae, galaxies, and clusters into “places” that resonate in the popular imagination (NASA/ESA/ STScI/M. Livio).


Plate 20. This image of the towering gas columns and bright knots of young stars seen in the Eagle Nebula (M16) was probably the first Hubble Space Telescope image to achieve widespread public recognition. It was part of the inspiration for the Hubble Heritage project, which showcases a different high-impact color image on the web each month. New worlds are being born at the tips of these fingers of hot gas (NASA/ESA/STScI/J. Hester/P. Scowen).

Plate 21. This exquisitely accurate map of the microwave sky, a projection of the celestial sphere onto a plane, shows the universe when it was a tiny fraction of its present age. The temperature variation between red and blue “speckles” is about a hundredth of a percent. The tiny variations, on angular scales of about a degree, represent the seeds for galaxy formation. It took a hundred million years or so for gravity to form the first galaxies (NASA/WMAP Science Team).

Plate 22. WMAP has played a major role in pushing the big bang model to the limit. The current model of the expanding universe posits an early epoch of inflation or exponential expansion, and subsequent expansion governed in turn by dark matter, causing deceleration, and more recently, dark energy, which is causing acceleration. WMAP has ushered in an era of “precision” cosmology (NASA/WMAP Science Team).

Plate 23. The Mars Science Laboratory, named Curiosity, will be exploring Mars for at least two years, starting with its landing in August 2012. The rover is the size of an SUV, compared to the Mars Exploration Rovers, which are the size of a golf cart, and the earlier Pathfinder, which is the size of a go-kart. Curiosity will study the past and present habitability of Mars by a detailed geochemical analysis of its rocks and atmosphere (NASA/JPL-Caltech).

Plate 24. This montage of 1,235 exoplanet candidates from Kepler shows the planets projected against their parent stars, giving an idea of how they are detected by the slight dimming of the star’s light. By the end of its mission, Kepler will have collected enough data to be sensitive to Earth-like planets in Earth-like orbits of their stars, many of which are expected to be habitable (NASA/ Kepler Science Team).

Other jellyfish can spray their predators with a luminescent substance that makes the predator in turn visible to fish looking for a quick meal. Medical researchers are only beginning to realize what we might learn from deep-sea bioluminaries. Off Puget Sound in the Pacific Northwest live the Aequorea victoria jellyfish, from which Green Fluorescent Protein (GFP) was first derived and used to generate other fluorescing marker proteins crucial in cancer and brain research and invaluable to cell biology and genetic engineering.46 What zoologists discover about the variety of species deploying bioluminescence in the deep ocean can help astrobiologists anticipate the life-forms that might illuminate the icy oceans of Europa, or alien seas on exoplanets orbiting other stars. To that end, Spitzer and other telescopes nightly scour the skies in search of other worlds.

A Long and Bumpy Road

In 1946, Yale astronomy professor Lyman Spitzer wrote a paper detailing the advantages of an Earth-orbiting telescope for deep observations of the universe.4 The concept had been floated even earlier, in 1923, by Hermann Oberth, one of the pioneers of modern rocketry. In 1962, the U. S. National Academy of Sciences gave its imprimatur to the idea, and a few years later Spitzer was appointed chair of a committee to flesh out the scientific motivation for a space observatory. The young space agency NASA was to provide the launch vehicle and support for the mission. NASA cut its teeth with the Orbiting Astronomical Observatory missions from 1966 to 1972.5 They demonstrated the great potential of space astronomy, but also the risks—two of the four missions failed. We’ve already encountered Spitzer since he gave his name to NASA’s infrared Great Observatory. Spitzer worked diligently to convince his colleagues around the country of the benefits of such a risky and expensive undertaking as an orbiting telescope.

After the National Academy of Sciences reiterated its support of a 3-meter telescope in space in 1969, NASA started design studies. But the estimated costs were $400–500 million and Congress balked, denying funding in 1975. Astronomers regrouped, NASA enlisted the European Space Agency as a partner, and the telescope shrank to 2.4 meters. With these changes, and a price tag of $200 million, Congress approved funding in 1977 and the launch was set for 1983. More delays followed. Making the primary mirror was very challenging and the entire optical assembly wasn’t put together until 1984, by which time launch had been pushed back to 1986. The whole project was thrown into limbo by the tragic loss of the Challenger Space Shuttle in January 1986. When the shuttle flights finally resumed, there was a logjam of missions so another couple of years slipped by.6

Hubble was launched on April 24, 1990, by the shuttle Discovery. A few weeks after the systems went live and were checked out, euphoria turned to dismay as scientists examined the first images and saw they were slightly blurred. The telescope could still do science but some of the original goals were compromised. Instead of being focused into a sharp point, some of the light was smeared into a large and ugly halo. This symptom indicated spherical aberration, and further in-flight tests confirmed that the primary mirror had an incorrect shape. It was too flat near the edges by a tiny amount, about one-fiftieth of the width of a human hair. Such was the intended precision of Hubble’s optics that this tiny flaw made for poor images.7 Hubble’s mirror was still the most precise mirror ever made, but it was precisely wrong.
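
To put that fraction in perspective, a quick back-of-the-envelope check (assuming a typical human hair is roughly 100 microns wide, a number not given in the text) shows the flaw amounted to only a couple of microns at the mirror's edge:

    # Rough size of Hubble's mirror error: "one-fiftieth of the width of a human hair"
    # Assumes a hair width of ~100 microns; actual hairs vary, so this is only an estimate.
    hair_width_microns = 100.0
    error_microns = hair_width_microns / 50.0
    print(f"Edge error ~ {error_microns:.0f} microns ({error_microns / 1000:.3f} mm)")
    # ~2 microns of error on a 2.4-meter mirror figured to far tighter tolerances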

The spherical aberration problem may be ancient history and in the rearview mirror now, but at the time it was a public relations nightmare for NASA. Its flagship mission could only take blurry images. Commentators and talk show hosts lampooned the telescope, and David Letterman presented a Top Ten list of “excuses” for the problem on his late-night show.8 More seriously, the episode became fodder for case studies in business schools around the country. The fundamental error was the result of poor management, not poor engineering. The Space Telescope project had two primary contractors: Perkin-Elmer, who built the optical telescope assembly, and Lockheed, who built the support systems for the telescope. There was also a network of two dozen secondary contractors from the aerospace industry. The mission was jointly executed by Marshall Space Flight Center and Goddard Space Flight Center, whose relationship involved rivalry and was not always harmonious, with overall supervision from NASA Headquarters. Complexity of this degree can be a recipe for disaster without tight and transparent management, and clear communication among the best technical experts.

When the primary mirror was being ground and polished in the lab by Perkin-Elmer, they used a small optical device to test the shape of the mirror. Because two of the elements in this device were mis-positioned by 1.3 millimeters, the mirror was made with the wrong shape. This mistake was then compounded. Two additional tests carried out by Perkin-Elmer gave an indication of the problem, but those results were discounted as being flawed! No completely independent test of the primary mirror was required by NASA, and the entire assembled telescope was not tested before launch, because the project was under budget pressure. Also, NASA managers didn’t have their best optical scientists and engineers looking at the test results as they were collected. The agency was embarrassed and humbled by the failure. Their official investigation put it succinctly: “Reliance on a single test method was a process which was clearly vulnerable to simple error.”9 In this way a multi-billion-dollar mission was hamstrung by a millimeter-level mistake and the failure to do some relatively cheap tests. In the old English idiom: penny wise, pound foolish. The propagation of a small problem into a huge one recalls another aphorism from England, where a lost horseshoe stops the transmission of a message and the result affects a critical battle: for the want of a nail, the war was lost.

Wilkinson Microwave Anisotropy Probe

Enter the Wilkinson Microwave Anisotropy Probe. WMAP was conceived as a way of pushing to a new level of precision and a new set of tests of the big bang theory. Most of those tests involve looking at anisotropies in the radiation, small variations in temperature from one part of the sky to another.

The all-sky map of microwave radiation has to have foreground emission from the Milky Way removed before it can be interpreted. There is a temperature gradient across the sky caused by the motion of the Solar System relative to the universe as a whole at a speed of about 360 kilometers per second. The microwave sky is 0.00335 K warmer toward the direction of our motion and the same amount cooler in the direction opposite to our motion. That small signal is also modeled and subtracted out.28 What’s left is a mottled pattern of very low-level variations. COBE had enough sensitivity to detect the variations statistically, but with an angular resolution of 7 degrees (the angle of your outstretched fingers at arm’s length) it could not say much about the detailed structure of the radiation.
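
The size of that dipole follows directly from our speed: to first order, the temperature shift is the mean temperature of the microwave background (about 2.725 K) scaled by the ratio of our velocity to the speed of light. A few lines of Python, using only the numbers quoted above, come out close to the 0.00335 K figure (the small difference reflects the rounded speed):

    # Dipole anisotropy from the Solar System's motion: delta_T ~ T_cmb * (v / c)
    T_cmb = 2.725              # mean temperature of the microwave background, in kelvin
    v = 360.0e3                # speed quoted in the text, in meters per second
    c = 299_792_458.0          # speed of light, in meters per second

    delta_T = T_cmb * v / c
    print(f"Dipole amplitude ~ {delta_T:.5f} K")   # ~0.0033 K, close to the 0.00335 K quoted above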

COBE was a small satellite that traveled in a 900-kilometer-high orbit of the Earth. The instrument that measured temperature variations had two horn receivers pointing in different directions, with the satellite rotating every 70 seconds so they could sweep across the sky. WMAP was a much larger and more sophisticated satellite, even though it was a third the mass of COBE. It collected microwaves with a pair of 1.5-m dishes and its receivers detected the radiation in five frequency bands. It rotated every 130 seconds and made a complete map of the sky every six months. The satellite was launched in 2001 and sent to a Lagrange point (where the gravitational pulls of the Sun and the Earth balance) 1.5 million kilometers from Earth, where the contaminating radiation is much lower than in low Earth orbit. As a result, WMAP was forty-five times more sensitive than COBE and it was able to resolve regions on the sky thirty-five times smaller, less than half the angular size of the full Moon (plate 21). The large difference is comparable to the gain of the Hubble Space Telescope over a one-foot-diameter telescope on the ground.29
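
Both the resolution comparison and the distance to the Lagrange point can be checked with round numbers. The short sketch below uses a standard approximation for the Sun-Earth L2 distance; the values are illustrative, not mission specifications:

    # Angular resolution: COBE's 7-degree beam improved ~35x by WMAP, versus the ~0.5-degree Moon
    cobe_res_deg = 7.0
    wmap_res_deg = cobe_res_deg / 35.0                      # ~0.2 degrees
    print(f"WMAP resolution ~ {wmap_res_deg:.1f} degrees, "
          f"~{0.5 / wmap_res_deg:.1f}x finer than the full Moon")

    # Distance to the Sun-Earth L2 point: r ~ a * (M_earth / (3 * M_sun))**(1/3)
    a_km = 1.496e8                                          # Earth-Sun distance in km
    earth_to_sun_mass = 3.0e-6                              # Earth mass / Sun mass
    r_km = a_km * (earth_to_sun_mass / 3.0) ** (1.0 / 3.0)
    print(f"L2 distance ~ {r_km / 1e6:.1f} million km")     # ~1.5 million km, as quoted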

WMAP operated flawlessly for ten years, and the exciting results of COBE and WMAP generated the momentum for a third-generation microwave satellite called Planck, named after the Nobel Prize-winning German physicist. Planck is primarily a European mission. Launched by ESA in 2009 to a location at the same Lagrange point as WMAP (L2), it improves on WMAP in both sensitivity and angular resolution.30

Forming New Solar Systems

The cold and dusty nebulae in which stars form are also places where planets form, and Spitzer has helped to tell this story. Newborn stars are embedded in a circumstellar disk of dust and grit coalescing around the star as a result of angular momentum conservation in the collapsing debris cloud. Just as an ice skater spins faster as they bring their arms to their chest, a diffuse cloud with slight rotation shrinks to a rapidly spinning compact disk.47 Protoplanetary disks, like those in the Orion Nebula, are heated by their central star and reradiate all that energy in the infrared. Gas giant planets like Jupiter or Saturn form within the disk in as little as a few million years, the blink of a cosmic eye. The disk then dissipates and leaves behind a sparse gruel of debris made of dust particles that have been recycled through collisions of larger chunks of rock and evaporation (figure 9.4). Spitzer is perfectly suited to studying debris disks because the cool material easily outshines the star in infrared radiation. By comparing disks at various stages of their development, the evolutionary history of how solar systems are born can be pieced together.48
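
The ice-skater analogy can be made quantitative: the angular momentum of a parcel of gas scales as its rotation speed times its distance from the spin axis, so shrinking the radius by some factor speeds the rotation up by the same factor. The sketch below uses made-up but plausible numbers for a collapsing cloud (a tenth of a light-year shrinking to a 100 AU disk), purely to illustrate the scaling:

    # Conservation of angular momentum for a shrinking, rotating cloud: v * r = constant
    # Illustrative numbers only; not a model of any particular star-forming region.
    LY_KM = 9.46e12
    AU_KM = 1.496e8

    r_cloud_km = 0.1 * LY_KM          # initial cloud radius, ~0.1 light-year
    r_disk_km = 100.0 * AU_KM         # final protoplanetary disk radius, ~100 AU
    v_cloud_kms = 0.1                 # slow initial rotation speed, km/s

    spin_up = r_cloud_km / r_disk_km
    print(f"Radius shrinks by a factor of ~{spin_up:.0f}, so rotation speeds up by the same factor")
    print(f"Final rotation speed ~ {v_cloud_kms * spin_up:.0f} km/s")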

Spitzer’s data indicates that the extremely dense, early phase of disk evolution lasts a couple of million years.49 The process begins as microscopic particles aggregate into larger particles the size of dust grains.


Figure 9.4. An artist’s concept of a young star in its dense disk of gas and dust. Light cannot escape from the interior of the disk, but infrared radiation emerges unaltered and reveals the process of planet formation. Within the disk, theory predicts that matter can accrete from dust grains into several thousand Moon-mass objects in just a few million years (NASA/JPL-Caltech).

Rocky planets like Earth and Mars form by accretion, as grit grows into boulders, then mountains, then planets—a process that takes 10 million years. Spectra show that the dust is mixed with gaseous organic molecules, including water (steam), carbon monoxide, carbon dioxide, and methane. Soot or pure carbon grains are an important ingredient of planets, and carbon-based life-forms. So it was with great excitement that astronomers announced the discovery, a few years ago, of buckminsterfullerenes, or buckyballs, in deep space. Buckyballs are spherical molecules made of sixty carbon atoms arranged in hexagons and pentagons like the panels of a soccer ball; the name alludes to Buckminster Fuller, who designed geodesic domes that look like these carbon molecules. Observations made with Spitzer have shown that buckyballs are common in hydrogen-rich regions of space such as gaseous nebulae where stars are born.50 Buckyballs are surprisingly robust and may be linked to the emergence of life by forming stable cages for concentrating other molecules and so accelerating interaction rates and chemical complexity (plate 16).

Spitzer is well-suited to diagnose the chemistry of planet formation because molecules have most of their spectral transitions at infrared wavelengths. The spectra of dozens of planet-forming disks have revealed interesting anomalies. Hydrogen cyanide (HCN), which is a major repository of interstellar nitrogen, is rarer around stars less than half the Sun’s mass than around solar-type stars. This implies that planets around low-mass stars are nitrogen-poor. Since nitrogen is a key biological ingredient, it also suggests that life might be rare, or fundamentally different, on planets of low-mass stars. Spitzer has also shown that rocky silicate material in the space between stars is amorphous in form, while silicates in the debris disks associated with planet formation have the sharper spectral features associated with crystalline silicates.51 In the lab, amorphous silicates can only be converted into the crystalline forms by annealing at a temperature of 1000 K, which allows the molecules to gently reorient themselves. In proto-planetary disks, annealing requires recurrent flaring of the star during the first million years of disk evolution. In these cool regions that can only be observed with an infrared telescope, we can observe the early steps along the path to planets and life.

A Telescope Rejuvenated

In fact, NASA had lost a big battle, but they weren’t yet ready to concede the war. With Hubble working at part strength, the agency immediately began planning to diagnose and correct the problem with the optics. Although a backup mirror was available, the cost of bringing the telescope down to Earth and re-launching it would have been exorbitant. It was just as well that the telescope had been built to be visited by the Space Shuttle and serviced by astronauts. The instruments sitting behind the telescope fit snugly into bays like dresser drawers and they could be pulled out and replaced with others of the same size.

It also helped that the mirror flaw was profound but relatively simple, and the challenge was reduced to designing components with exactly the same mistake but in the opposite sense, essentially giving the telescope prescription eyeglasses. The system designed to correct the optics was the brainchild of electrical engineer James Crocker, and it was called COSTAR, or Corrective Optical Space Telescope Axial Replacement.10 COSTAR was a delicate and complicated apparatus with more than five thousand parts that had to be positioned in the optical path to within a hair’s breadth. Installing COSTAR raised the difficulty level of an already challenging first servicing mission that was planned for late 1993. Seven astronauts spent thousands of hours training for the mission, learning to use nearly a hundred tools that had never been used in space before. They did a record five back-to-back space walks, each one grueling and dangerous, during which they replaced two instruments, installed new solar arrays, and replaced four gyros. This last fix was needed because of the disconcerting tendency of Hubble’s gyros to fail at a high rate, in a way never seen in lab testing. Without working gyros, the telescope could not point at or lock on a target. Before leaving, the crew boosted Hubble’s altitude, since three years of drag in Earth’s tenuous upper atmosphere had started to degrade the orbit.

The first servicing mission was a stunning success. It restored the imaging capability to the design spec and added new capabilities with a more modern camera. It also played significantly into the vigorous debate in the astronomy and space science community over the role of humans in space. NASA had always placed a strong bet that the public would be engaged by the idea of space as a place for us to work and eventually live. But after the success of Apollo, public interest and enthusiasm waned. (It even waned during the program; the final three planned Moon landings were scrapped.) Also, most scientists thought that it was cheaper and less dangerous to fly automated or robotic missions than to send astronauts to service them. NASA’s amazingly successful planetary missions from the 1970s to the 1990s were seen as evidence of the primacy of robotic spacecraft. Hubble was of course designed to be serviced by astronauts, but the often-unanticipated problems they were able to solve in orbit, coupled with the positive public response (and high TV ratings during the space walks, at least the early ones), persuaded many that the human presence was essential and inspirational.

More servicing missions followed. Each one rejuvenated the facility and kept it near the cutting edge of astronomy research. The second mission in 1997 installed a sensitive new spectrograph and Hubble’s first instrument designed to work in the infrared. Astronauts also upgraded the archaic onboard computers. The third mission in 1999 was moved forward to deal with the vexing problem of failing gyros. Before the mission flew, the telescope lost a fourth gyro, essentially leaving it dead in the water.11 All six gyros were replaced and the computer was upgraded to one with a blistering speed of 25 MHz and a capacious two megabytes of RAM (that’s fifty times slower and thousands of times less storage than the average smartphone). The fourth mission in 2002 replaced the last of the original instruments, leaving COSTAR with nothing left to correct—each subsequent instrument has had its optics designed to compensate for the aberration of the primary mirror, and the now-redundant COSTAR was finally removed on the last servicing mission. The infrared instrument was revived, having run out of coolant two years early. New and better solar arrays were installed, along with a new power system, which caused some anxiety since to install it the telescope had to be completely powered down for the first time since launch. Except for its mirror, Hubble is reminiscent of the ship of Theseus, a story from antiquity where every plank and piece of wood of a ship is replaced as it plies the seas.

The fifth and last servicing mission almost didn’t happen. Once again, the mission was affected by a Shuttle disaster, in this case the catastrophic loss of Columbia and crew in February 2003. NASA Administrator Sean O’Keefe decided that human repairs of the telescope were too risky and future Shuttle flights could only go to the safe harbor of the International Space Station. He also studied engineering reports and concluded that a robotic servicing mission was so difficult that it would most likely fail. At that point, Hubble was destined to die a natural death as gyros and data links failed.

When in January 2004 it was announced that the Hubble Space Telescope would be scrapped, the public’s response was overwhelmingly in support of refurbishing the instrument in orbit. David DeVorkin and Robert Smith report that a “‘Save the Hubble’ movement sprang into life.”12 Robert Zimmerman similarly comments: “The public response. . . was, to put it mildly, loud. Very soon, NASA was getting four hundred emails a day in protest, and over the next few weeks editorials and op-eds in dozens of newspapers across the United States came out against the decision.”13 Zimmerman details the history, development, and deployment of the telescope as well as its more significant findings. He suggests that Hubble, more than any other telescope or major scientific instrument, has allowed humankind to explore what lies at the furthest depths of space and that this is one reason for the widespread public sense of ownership of what some newspapers have called the “people’s telescope.”14 Not since Edwin Hubble did his pioneering work on cosmology with the 100-inch reflector at Mount Wilson Observatory has public interest in a particular instrument been so intense or pervasive.15

O’Keefe and other NASA officials were taken aback by the public response. They’d expected astronomers to lobby hard for another servicing mission, and they weren’t surprised that astronauts weighed in, saying they’d signed up knowing the risks and they wanted to service the telescope. But it turned out that a large number of people felt attached to Hubble through its spectacular pictures and newsworthy discoveries. They felt clear affection for the facility, and since the taxpayer had indeed paid for it, the agency took notice and O’Keefe first reevaluated and then reversed his decision.16 The fifth servicing mission went off without a hitch in May 2009, leaving the telescope in better shape overall and with certain capabilities a hundred times more effective than its original configuration.17 Two new instruments were installed, two others were fixed, and many other repairs were carried out since it was agreed by everyone that this was the last time the telescope would be serviced (figure 11.1).


Figure 11.1. The Hubble Space Telescope is still in high demand as astronomy’s flagship facility, over two decades after it was launched. In large part, this is due to five servicing missions with the Space Shuttle, where astronauts performed extremely challenging space walks to replace and upgrade vital components, and install new state-of-the-art instruments (NASA/Johnson Space Center).

Hubble’s continual rejuvenation is a major part of its scientific impact. The instruments built for the telescope are state-of-the-art, and competition for time on the telescope has consistently been so intense that only one in eight proposals gets approved. All this comes with a hefty price tag. Estimating the cost of Hubble is difficult because it is unclear how much of the cost of Shuttle launches and astronaut activities to assign to it, but a twenty-two-year price tag of $6 billion is probably not far from the mark. For reference, the budget was $400 million when construction started and the cost at launch was $2.5 billion. Compared to slightly larger 4-meter telescopes on the ground, Hubble generates fifteen times as many scientific citations (one crude measure of impact on a field) but costs one hundred times as much to operate and maintain.18 Regardless of its cost, the facility sets a very high bar for any subsequent space telescope. As Malcolm Longair, Emeritus Professor of Physics at the University of Cambridge and former chairman of the Space Telescope Science Institute Council, has observed: “The Hubble Space Telescope has undoubtedly had a greater public impact than any other space astronomy mission ever.” He also notes that this small telescope’s “images are not only beautiful, but are full of spectacular new science,” much of which was unimagined by the astronomers who conceived and launched the instrument.19

Piper at the Gates of Dawn

Variations in the microwave radiation capture important information about conditions in the early universe. To describe these variations, astronomers like to think about the power spectrum of the variations, or how much of the variation is on a particular angular scale. In this formalism, l is the angular frequency of variations. For example, l = 2 corresponds to two cycles across the sky or variations over 100 degrees, showing the dipole of the Local Group motion mentioned earlier. The 7-degree limit of COBE corresponded to l = 30 and the much better 0.3-degree resolution of WMAP reaches almost to l = 1000. Think of l as the number of waves to go all around the sky in a circle and l² as the number of “tiles” needed to cover the sky. Having more tiles means each one covers a smaller area of sky.

Figure 12.3. A graph showing the amount of temperature variations on the vertical scale, versus the angular scale of those variations on the horizontal scale. The quantity l is like a harmonic: it represents how many waves can fit across the sky, so larger l means smaller features on the microwave map of the sky. The peak in microwave power on scales of a degree dates from 400,000 years after the big bang (NASA/WMAP Science Team/S. Larson).

The shape of the angular power spectrum is compared to predictions from the big bang model (figure 12.3). It’s fair to think of l as characterizing the “harmonics” of variation of the radiation.31
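
A useful rule of thumb connects the multipole moment to an angle on the sky: the angular scale is roughly 180 degrees divided by l. The sketch below applies that approximation to the values mentioned in the text; it is a rough conversion, not the exact spherical-harmonic relation:

    # Rough conversion from multipole moment l to angular scale on the sky
    def angular_scale_deg(l):
        """Rule of thumb: theta ~ 180 degrees / l."""
        return 180.0 / l

    for l in (2, 30, 200, 1000):
        print(f"l = {l:4d}  ->  ~{angular_scale_deg(l):6.2f} degrees")
    # l = 2 corresponds to features roughly 90-100 degrees across, l ~ 30 to COBE's
    # 7-degree limit, l ~ 200 to the degree-scale peak, and l ~ 1000 to the finest
    # scales probed by WMAP.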

The physics of the early universe is esoteric, but we can gain insight by the analogy with sound as long as we don’t stretch the analogy to the breaking point. Before 380,000 years, while radiation was still trapped by matter, the electrons and photons acted like a gas, with the photons ricocheting off the electrons like little bullets. As in any gas, density disturbances moved at the speed of sound as a wave, a series of compressions and rarefactions. Compression would heat the gas and rarefaction would cool it, so the sound waves would manifest as a shifting series of temperature fluctuations. When electrons combined with protons to become neutral atoms, the photons from slightly hotter and cooler regions traveled unimpeded through the universe. So the temperature variations that we see now are a “frozen” record of fluctuations from that time.

If there was a “piper at the gates of dawn,” then who or what was the piper?32 Inflation is presumed to be the mechanism in the
very early universe that rendered space flat and smooth, so it also must have been the source of the initial tiny variations. Those variations from inflation were hugely expanded quantum fluctuations, with the special property that the strengths of disturbances on all different scales were about equal. The disturbances are all produced at once from the very moment of creation, so they make sound waves that are synchronized. The result is sound waves with a series of harmonics or overtones, like the sounds from a flute with holes at regular intervals. Other models for the origin of the disturbances tend to be more chaotic or random, so they predict sound waves like those from a flute with holes at irregular or random intervals. Inflation is the music of the spheres dreamed of by Pythagoras.

In the flute analogy, the fundamental tone is a wave with its largest amplitude at either end of the tube and its minimum amplitude in the middle. The overtones are whole number fractions of the fundamental tone, so one half its wavelength, one third, one quarter, and so on. In the early universe, however, the waves are oscillating in time as well as space, so the waves originate in the first iota of time at inflation and they end at the time the universe becomes transparent about 380,000 years later. The fundamental tone is a wave that has maximum positive displacement (or equivalently, maximum temperature) at inflation and has oscillated to maximum negative displacement (or minimum temperature) at the time of transparency. The overtones oscillate two, three, four, or more times faster and so cause successively smaller regions of space to reach their maximum amplitude 380,000 years later.

Thus, we have all the ingredients to interpret the graph of amount of temperature variation versus angular frequency, l, measured by WMAP. There’s one more subtlety. Inflation predicts that all the harmonics should have the same strength. But sound with very short waves dissipates, because it’s carried by the collisions between particles and when the wavelength is less than the distance a particle travels before hitting another particle, the wave dissipates. In the air, this is just 10⁻⁵ cm. But in the “empty space” of the universe before recombination, photons travel 10,000 light-years before colliding. So the high harmonics are reduced or damped out. After a thousand-fold expansion, those scales are now 10 million light-years. Thus, we don’t expect to see significant structure in the local universe on scales much more than ten times that size, and the clustering of galaxies is indeed weak on scales that large, chalking up another success for the big bang model. The piper at the gates of dawn is playing a strong fundamental note, with faint echoes from the higher harmonics, and a steady descent into high frequency “hiss.”
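
The damping arithmetic is easy to verify with the round numbers given above (a back-of-the-envelope check, not a calculation from the WMAP data):

    # Photon free-streaming scale at recombination, stretched by the expansion since then
    mean_free_path_ly = 10_000        # light-years, the text's round figure before recombination
    expansion_factor = 1_000          # the universe has expanded about a thousand-fold since
    damping_scale_mly = mean_free_path_ly * expansion_factor / 1e6
    print(f"Damping scale today ~ {damping_scale_mly:.0f} million light-years")
    print(f"Little large-scale structure expected beyond ~ {10 * damping_scale_mly:.0f} million light-years")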

Diagnosing Distant Worlds

The discovery of planets orbiting stars other than the Sun was one of the most dramatic events of twentieth-century science. Until 1995, our Solar System was the only planetary system known. Since then an accelerating pace of discovery has confirmed more than 850 exoplanets, with 2,700 more good candidates.52 Many of these are Jupiter-mass planets, orbiting Sun-like stars within one hundred light-years.53 But exoplanet detection limits are steadily approaching Earth-mass, and systems with as many as seven planets have been discovered. The majority of these exoplanets were discovered by an indirect method, where the orbiting planet induces a “reflex motion” or a wobble in the parent star, and the wobble is detected by a periodic Doppler shift in the spectrum of the star. Some of the early discoveries were “hot Jupiters,” or large planets on close orbits around their parent stars. But the Doppler detection method reveals nothing more than a planet’s mass and orbital distance.
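
The size of the reflex motion can be estimated from momentum balance: star and planet orbit their common center of mass, so the star's speed is roughly the planet's orbital speed scaled down by the planet-to-star mass ratio. The sketch below uses round Solar System values as stand-ins (it is not tied to any particular discovery) and shows why massive planets on close orbits were found first:

    import math

    # Reflex ("wobble") speed of a star from momentum balance:
    # M_star * v_star ~ m_planet * v_planet, with v_planet = sqrt(G * M_star / a)
    G = 6.674e-11            # gravitational constant, SI units
    M_SUN = 1.989e30         # kg
    M_JUPITER = 1.898e27     # kg
    M_EARTH = 5.972e24       # kg
    AU = 1.496e11            # meters

    def reflex_speed(m_planet, a, m_star=M_SUN):
        v_planet = math.sqrt(G * m_star / a)       # circular-orbit speed of the planet
        return (m_planet / m_star) * v_planet      # the star's speed about the center of mass

    print(f"Jupiter at 5.2 AU: star wobbles at ~{reflex_speed(M_JUPITER, 5.2 * AU):.1f} m/s")
    print(f"Earth at 1.0 AU:   star wobbles at ~{reflex_speed(M_EARTH, 1.0 * AU):.2f} m/s")
    # Roughly 12 m/s versus 0.09 m/s: a giant planet tugs its star hard enough to show up
    # as a periodic Doppler shift, while an Earth-mass planet barely registers.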

Spitzer, by contrast, can observe exoplanets as they pass in front of or behind their star. This is an eclipse or a transit. Only the small fraction of systems where the orbital plane is lined up with our sight line can show eclipses. The detectors can gather infrared light from large exoplanets that orbit close to their star and that are heated to at least 1000 Kelvin, if the star is within two hundred light-years of our Sun. As a planet traverses the face of its star it blocks some of the starlight. There’s a temporary dip in the infrared signal of the combined system due to the planet blocking a portion of the stellar disk. This eclipse allows the size of the planet to be measured. Then when the planet passes behind the star (this is called a secondary eclipse), the infrared signal again dips because the planet’s contribution is missing from the system’s heat signal. The size of the drop is the amount of infrared emission coming from the planet. Spitzer’s extended orbit allows continuous observations over an entire exoplanet orbit, and the telescope’s ability to measure small changes of just 0.1 percent in brightness makes these observations possible (figure 9.5).
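
The transit geometry lends itself to a quick estimate: the fractional dip in light is just the planet's disk area divided by the star's, (R_planet/R_star) squared. The sketch below plugs in Solar System radii as stand-ins to show why 0.1 percent photometric precision matters:

    # Transit depth = fraction of the stellar disk blocked = (R_planet / R_star)**2
    R_SUN_KM = 696_000.0
    R_JUPITER_KM = 71_492.0
    R_EARTH_KM = 6_371.0

    def transit_depth(r_planet_km, r_star_km=R_SUN_KM):
        return (r_planet_km / r_star_km) ** 2

    print(f"Jupiter-size planet: {transit_depth(R_JUPITER_KM) * 100:.2f}% dip")   # ~1%
    print(f"Earth-size planet:   {transit_depth(R_EARTH_KM) * 100:.4f}% dip")     # ~0.008%
    # A hot Jupiter's ~1% dip sits comfortably above a 0.1% precision threshold;
    # an Earth-size transit calls for the still finer photometry of a mission like Kepler.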

NASA’s Kepler spacecraft, launched in March 2009, is also looking for transits caused by exoplanets by staring at a region in the Cygnus constellation. Kepler scientists want to determine whether Earth-like planets, small rocky worlds with long-period orbits, are common. The exoplanet Kepler-22b, which is 2.4 times larger than Earth, was the first planet found by the telescope to orbit in the habitable zone of its Sun-like star. As with the Milky Way Project that uses data from Spitzer’s GLIMPSE survey, Zooniverse’s Planet Hunters Project invites volunteers to assist NASA in digging through Kepler’s data. Kepler sends data on more than 150,000 stars to Earth at regular intervals, and volunteers for Planet Hunters survey the stars’ light-curves to identify possible transits.


Figure 9.5. A distant exoplanet is partially eclipsed by its parent star in this artist’s impression. Spitzer uses observations like this to “taste” the atmosphere of an exoplanet by seeing which atoms or molecules are absorbed by the atmosphere. This particular planet is about 1000°F and the atmosphere has substantial amounts of carbon monoxide but surprisingly little methane (NASA/SSC/Joseph Harrington).

“Planet Hunters is an online experiment that taps into the power of human pattern recognition,” organizers explain on the website. So far, two likely planets have been detected by citizen scientists in the Kepler data.54 As has been the history of science from its earliest days, nonspecialists are making serious contributions to Kepler’s scientific outcomes.

While Kepler can predict statistically the ubiquity of planets in our galaxy, it takes the sensitive instruments of an infrared telescope to characterize their composition. Spitzer’s ability to diagnose the atmospheric composition of remote exoplanets was a big surprise. Observations have been made on more than two dozen exoplanets, with dozens more expected during the ongoing “warm” mission. Spitzer project scientist Michael Werner explains the telescope’s contribution to this effort: “Because hot Jupiters are rich in gas, different wavelengths arise from different levels in the atmosphere of different chemical constituents. Spitzer data have allowed the determination of planetary temperatures and of constraints on chemical composition (including the identification of water vapor), atmospheric structure and atmospheric dynamics.”55 Though Spitzer’s primary mission was to detect objects hidden by interstellar dust, characterizing exoplanets may become its greatest contribution.

In 2007, scientists first measured weather on a planet beyond the Solar System.56 The planet HD 189733b races around its parent star in just over two days. Temperatures across the planet range from 970 K to 1220 K, which means winds of 6,000 mph must rage to keep the temperature variation that modest. By comparison, the Earth’s jet stream sails along at only 200 mph. The exoplanet Upsilon Andromedae b is even more extreme; day-to-night temperature variations are 1400°C (2550°F), a hundred times larger than is typical on Earth!57 In the young, dynamic field of exoplanets, there have been many surprises, and theorists’ expectations are often confounded. One hot Jupiter-like planet was found to have carbon monoxide but almost no methane. This violates models of hot gas giants, where most of the carbon should be in the form of methane. Another two planets are shrouded by dry, dusty clouds unlike anything seen in our Solar System. These planets show little of the expected water or steam, meaning that, if present, water might be hidden under the clouds in the form of a scalding ocean.
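
The Fahrenheit figure is just the Celsius swing rescaled; a temperature difference converts by a factor of 9/5, with no offset. A quick check using only the number in the text:

    # Converting a temperature *difference* from Celsius to Fahrenheit: multiply by 9/5 (no +32 offset)
    day_night_swing_c = 1400.0
    day_night_swing_f = day_night_swing_c * 9.0 / 5.0
    print(f"{day_night_swing_c:.0f} C swing ~ {day_night_swing_f:.0f} F")   # ~2520 F, quoted as roughly 2550 F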

The People’s Telescope

The Hubble Space Telescope has contributed to the identification of exoplanets, the dark energy that permeates the universe, and massive black holes that lurk in nearby galaxies. Probably no other science facility has left its mark in so many homes or done more to advance the general public’s understanding of the structure, age, and size of the universe. Breathtaking photographs of regions of star formation, stunning spiral galaxies, exploding planetary nebulae, and the most distant galaxies in the visible universe spill out of coffee table books, adorn the walls of children’s bedrooms, and serve as computer screensavers. Why is that? The answer is not simply that the Hubble images pervade popular culture, though of course they do. There are multiple factors that might account for the telescope’s tremendous popularity and an increasing public awareness of, and affection for, the telescope.

While there are telescopes on Earth with twenty-five times the light-gathering power, Hubble remains the premier tool of astronomers due to its exquisite sensitivity, and the ubiquity of the Hubble photographs has been unprecedented. From newspapers and magazine covers, to planetarium and museum programs and displays, popular science books, posters, calendars, and postage stamps, Hubble images pervade popular culture. Moreover, the emergence of the Internet has afforded global access to the telescope’s photos and scientific results. In July 1994, when Comet Shoemaker-Levy 9 slammed into Jupiter with an estimated explosive force of six hundred times humanity’s entire nuclear arsenal, Hubble offered up-close views of the devastating planetary impacts. Astronomer David Levy, co-discoverer of the comet, reported that millions of people around the world watched on television or via the Internet as the comet’s line of fragments bombarded the massive planet.20

Art historian Elizabeth Kessler offers several less obvious reasons why the Hubble Space Telescope is so cherished worldwide. She contends that Hubble’s spectacular images have become “interwoven into our larger visual culture” in part because of their very deliberate construction along the lines of sublime art, which often seeks to evoke grandeur, great height and breadth of field, and an overwhelming awe of the power of nature (plate 19). She points out that many of the Hubble images released by the Space Telescope Science Institute (STScI) and the associated Hubble Heritage Project reflect the aesthetics of nineteenth-century paintings of the American West. “Light streams from above” in the Hubble images as in landscape paintings, explains Kessler,21 despite the fact that there is no up or down in space. Through such framing strategies Hubble’s photos are often configured like landscape paintings or photographs.

Kessler argues that what also endears Hubble to so many is that its photos provide a means for public audiences “to imagine the possibility of seeing such spacescapes with our own eyes.”22 But Kessler is careful to clarify that all published Hubble photos are interpretations, usually composites of multiple images captured at various wavelengths of light. Even if we could travel at many times the speed of light across the galaxy to observe astrophysical objects, because of the rods and cones in the human eye, we would likely be unable to see color in faint light sources. The reason is that the cones in the human eye allow us to see color, but cones need lots of light to do so. The rods require less light, but do not pick up color. That’s why photographs of the Milky Way arced across the night sky often include color that we cannot see with the naked eye.23

Kessler points out that the Hubble Heritage images are extensively processed and represent not what the human eye would see, “but a careful series of steps that translate numeric data into picture.”24 Hubble collects data in visible light but also supports a suite of instruments that “see” at wavelengths not visible to the human eye, such as infrared and ultraviolet light that can reveal additional details. In actuality, HST’s electronic detectors see only intensity, which can be represented in grayscale, a range of shades from black through gray to white. A set of filters made of colored glass, such as red, green, and blue, is rotated in front of the detector as images are captured.25 With spectroscopy, different wavelengths of light are dispersed with a grating and linearly arrayed on the detector. Once the data have been collected, color is added in processing by combining the images taken through different filters with the appropriate weights to reproduce “true color”26 that can be used to distinguish gases within a nebula or their temperatures, or to distinguish young, hot, newly formed stars from older, cooler stars.
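
The compositing step Kessler describes can be illustrated in a few lines of Python: three grayscale exposures taken through different filters are weighted and stacked into the red, green, and blue channels of one image. This is a minimal sketch of the general idea using synthetic arrays and arbitrary weights, not the actual STScI pipeline:

    import numpy as np

    # Minimal illustration of building a color composite from three grayscale filter images.
    # Real processing also involves calibration, alignment, intensity stretching, and cleanup.
    rng = np.random.default_rng(0)
    red_filter = rng.random((256, 256))      # stand-ins for exposures through red,
    green_filter = rng.random((256, 256))    # green, and blue glass filters
    blue_filter = rng.random((256, 256))

    weights = (1.0, 0.8, 1.2)                # chosen to balance the three channels

    composite = np.stack(
        [w * img for w, img in zip(weights, (red_filter, green_filter, blue_filter))],
        axis=-1,
    )
    composite = np.clip(composite / composite.max(), 0.0, 1.0)   # normalize to a displayable 0-1 range
    print(composite.shape)                   # (256, 256, 3): an RGB image ready to display or save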

Space Music

While astronomers were tuning their high-fidelity radio antennas to outer space, electronic artists began turning electromagnetic radiation into music.33 Penzias and Wilson accidentally discovered the cosmic microwave background in 1965 just as high-fidelity equipment necessary for radio astronomy, as well as synthesizers and electronic instruments suitable for rock music, were being developed. Both fields were poised for an unprecedented exploration of the cosmos.

Bell Laboratories was the perfect place for this seemingly unrelated exploration of outer space. Having produced the first high-fidelity phonograph recording in 1925, Bell Labs advanced the technology used in radio astronomy and in generating electronic music. Karl Jansky, while investigating the causes of static in long-distance shortwave communications at Bell Labs, laid the foundations for radio astronomy in 1931 by detecting naturally occurring radio sources at the center of the Milky Way. A quarter of a century later, Max Mathews, also at Bell Labs, began using a computer to synthesize sounds. Often credited as the father of computer music, Mathews wrote the code for the computer program Music 1, which in later iterations became a widely used programming language for computer-generated music.34 The lab’s simultaneous interest in radio technologies and electronic music is hardly surprising, since from its founding by Alexander Graham Bell, the company that became Bell Labs was dedicated to developing electronic media such as the phonograph, telephone, radio, and other communication technologies.

Music historian Pietro Scaruffi contends that it was not the electric guitar, as one might expect, that would “revolutionize rock music down to the deepest fiber of its nature,” but the emergence of electronic synthesizers and computer programs designed for musical composition. By 1966, Robert Moog had developed the synthesizer, “the first instrument that could play more than one ‘voice’ and even imitate the voices of all the other instruments.”35 Within four years, Moog was marketing the Minimoog, a portable version that allowed for live performance with the synthesizer. And in the decade that followed, high fidelity or hi-fi technology emerged as turntables, synthesizers, oscillators, and other electronic devices became more widely used precision instruments.

Even as Penzias and Wilson at Bell Laboratories were measuring the exact temperature of the cosmic microwave background, avant-garde electronic artists like the German group Tangerine Dream began exploring the new music genres made possible by electronic instruments, keyboards, and synthesizers. The group is credited with launching what was referred to as kosmische musik, cosmic or space music that later evolved into disco, ambient, techno, trance, and other new age genres.36 Their album Alpha Centauri (1971) is purportedly the first electronic rock space album in history. Scaruffi writes, “Tangerine Dream’s music is the perfect soundtrack for the mythology of the space age. . . . They were contemporaries with the moon landing. The world was caught in a collective dream of the infinite. Tangerine Dream gave that dream a sound.”37 In 1972, they produced the album Zeit (Time) that included among other tracks “Birth of Liquid Plejades” and “Nebulous Dawn.” Other space-themed selections by the group were titled “Sunrise in the Third System,” “Astral Voyager,” and “Abyss.” Edgar Froese, one of the group’s founding musicians, wrote “NGC 891” for his solo album Aqua (1974) in reference to the crisp, nearly edge-on galaxy in Andromeda oriented so that its dust lanes sharply highlight the galaxy’s outer spiral arms. English astronomer James Jeans included an image of NGC 891 in his popular astronomy books of the 1930s, and astronomers repeatedly cited this seemingly perfect spiral galaxy to illustrate what the Milky Way would look like from 30 million light-years away.38

In 1920, Leon Theremin invented an electronic instrument that produced sound as the performer moved his or her hands near two antennas controlling pitch and volume. The theremin’s eerie tones were often used to generate sequences meant to evoke space-like themes, and the instrument served as a mainstay in both avant-garde and rock music, particularly in the Beach Boys’ “Good Vibrations.” Such electronic instruments, along with synthesizers, modulators, and amplifiers, became invaluable to sound designers working in the film industry. Ben Burtt, for instance, widely known for the “synthesized sound worlds” of the Star Wars films, developed the now easily recognized laser-gun sounds by recording and manipulating the twang of a guy-wire from a radio tower after accidentally hooking his backpack on one.39

Just as the first generation of astronauts was walking in space and on the lunar surface, the related genre of “space rock” began its own exploration of the cosmos. The genre emerged as an art form during the late 1960s via the British psychedelic movement. The title track of David Bowie’s album Space Oddity (1969), which opened with the memorable phrase “Ground Control to Major Tom,” shaped much of the space rock to come. Pink Floyd’s The Dark Side of the Moon (1973), with its closing track “Eclipse,” is one of the best-selling rock albums in history. American songwriter Gary Wright’s “Dream Weaver” (1975) was composed using only keyboards, synthesizers, and drums and, according to Wright, was intended as a kind of fantastical train ride through the cosmos. Innovators like Brian Eno and Steve Roach contributed to space-themed ambient music with their evocative soundscapes. Eno’s Apollo: Atmospheres & Soundtracks (1983), the soundtrack for Al Reinert’s space documentary For All Mankind, was intended to capture “the grandeur and strangeness” of the Apollo missions. Eno characterizes the soundtrack as evoking the astronauts’ somewhat disorienting experience of “looking back to a little blue planet drifting alone in space, looking out into the endless darkness beyond, and finally stepping onto another planet.” He adds that the score was an attempt to extend “the vocabulary of human feeling just as those missions had expanded the boundaries of our universe.”40

But the intersection of music and cosmogony goes back to prehistory, when nomadic people would literally “sing the place” to recreate and remember a physical landscape in the form of song. Songlines in Australia provide the Aborigines with unerring navigation over harsh terrain, along routes that can extend for thousands of miles.41 Language may have started as song, and in the Aboriginal Dreamtime the world is sung into existence. Bringing the concept into the digital age, the eclectic singer-songwriter Björk released an album in 2011 titled Biophilia, in which each track is a sensory experience rooted in sound but extending into an iPhone app.42 With songs ranging from “Cosmogony” to “Dark Matter,” Björk explores inner and outer space with her ethereal, electronic sonic environments.

On Planets and Dwarfs

Somewhere in the twilight between stars and planets lie objects called brown dwarfs. Below about 8 percent of the Sun’s mass, physical conditions never allow the sustained fusion of hydrogen into helium, as happens in the Sun. There may be a brief flickering of energy from the fusion of deuterium, but lower-mass objects don’t have a sustainable source of energy. When they’re young, brown dwarfs are easy to observe in the infrared because they generate a lot of heat during their gravitational collapse. As they age, they grow cooler and fainter. For example, a puny star with 10 percent of the mass of the Sun would have a temperature of 3000 K and a luminosity 1/10,000 that of the Sun. By contrast, a brown dwarf with 5 percent of the mass of the Sun would be three times cooler and a further 100 times dimmer. It would take a million of these feeble objects to equal the light of the Sun. Some astronomers consider Jupiter a failed star or a brown dwarf. Models indicate that brown dwarfs likely host moons like those of Jupiter and Saturn, worlds replete with weather systems, geysers, volcanoes, mountain ranges, and oceans, even if, as on Europa, the oceans are frozen over.
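
To make the scaling concrete, here is a minimal sketch in Python of the arithmetic, using only the luminosity ratios quoted above; it is not a physical model of brown dwarf interiors, and the factor-of-three drop in temperature alone accounts for most of the extra dimming, since a body’s output falls roughly as the fourth power of its temperature at fixed size.

```python
# A minimal sketch of the arithmetic above; the ratios are those quoted in the
# text, not the output of a stellar model.

L_SUN = 1.0                          # luminosity of the Sun, in solar units
L_SMALL_STAR = L_SUN / 10_000        # star of ~10% the Sun's mass, per the text
L_BROWN_DWARF = L_SMALL_STAR / 100   # brown dwarf of ~5% the Sun's mass, a further 100x dimmer

# How many such brown dwarfs would it take to equal the Sun's light?
n_needed = L_SUN / L_BROWN_DWARF
print(f"Brown dwarf luminosity: {L_BROWN_DWARF:.0e} solar luminosities")
print(f"Objects needed to match the Sun: {n_needed:,.0f}")   # -> 1,000,000
```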

Set on a moon in the nearby Alpha Centauri star system, James Cameron’s film Avatar (2009) mesmerized audiences with its rendering of Pandora orbiting a Jupiter-like world whose swirled and rippled clouds resemble those of Jupiter and its Great Red Spot. Cameron’s Pandora (one of Saturn’s moons also happens to be named Pandora) abounds in colorful flora and fauna evocative of Earth’s ocean environments, like the giant Christmas Tree Worms that suddenly retract when touched, the seeds of the sacred tree that float with the pulses of a jellyfish, and the photophores that dot the plants, animals, insects, and human-like inhabitants of Cameron’s adventure tale.58 That the Na’vi have blue skin seems no accident or random artistic choice. In ocean waters, blue light is least absorbed and penetrates deepest, so that most deep-sea animals see solely in blue light. With the announcement in October 2012 of an exoplanet with a mass similar to Earth’s detected in the Alpha Centauri triple star system, Cameron’s fictional depiction seems even more plausible.59

In part, what captivated film audiences was the bioluminescence with which Cameron painted his Pandoran forests. Having explored and documented deep-ocean fauna in the film Aliens of the Deep (2005), Cameron is familiar with myriad bioluminescent sea life and readily projects similar plants and animals onto his imaginary world. Draped with waterfalls alight with bioluminescence, Pandora’s forests teem with glowing fireflies and whirling fan lizards. The forest floor is carpeted with illuminating moss, while its streambeds are lit with the equivalent of sea anemones.

In effect, Cameron anticipates how life might adapt on an exomoon of a gas giant planet or a brown dwarf star. On such worlds, we might expect to find luminous biota highlighted with photophores, as Cameron predicts, or species with eyes better adapted to night or low-light conditions. Just as felines have dark-adapted vision superior to that of humans, the Na’vi have feline features and navigate the nighttime forest with far greater facility than the character Jake. Cameron’s bioluminescent world may have been inspired by Jules Verne’s 20,000 Leagues Under the Sea, in which the characters, strolling on the ocean floor, encounter bioluminescent jellyfish and note their “phosphorescent glimmers.” In fact, nearly all jellies in the deep sea are luminescent. Verne’s voyagers likewise chance upon corals whose tips glow. In stark contrast to the nineteenth-century perception that the ocean floor was devoid of life, Verne imagined a seafloor abounding in bioluminescence. An engraved illustration for 20,000 Leagues titled “On the ocean floor” depicts a kelp forest with large corals, crustaceans, and a flotilla of giant jellyfish whose bodies and tentacles radiate light (figure 9.6). Cameron’s vision of bioluminescent organisms flourishing on a nearby exomoon is similarly prescient.

Back on Earth, the first brown dwarf was discovered in 1995. Spitzer has contributed to this research by detecting some of the coolest and faintest examples known, including eighteen in one small region of sky, sifted from among a million sources detected.60 Despite their extreme faintness, Spitzer can detect the coolest brown dwarfs out to a distance of one hundred light-years. The outer layers of these substellar objects are cool enough to be rich in molecules. The composition of the gas, and so the appearance of the narrow lines that act as chemical “fingerprints” in the spectrum, changes as the brown dwarf evolves. Most of the eight hundred cataloged brown dwarfs have atmospheres with temperatures in the range 1200°C to 2000°C. The next category, with temperatures from 1200°C down to 250°C, shows strong methane absorption in its atmosphere. There’s overlap (and often confusion) between planets and brown dwarfs because some giant exoplanets are larger and hotter than some brown dwarfs. The very coolest category of brown dwarf, the end point of their evolution, has only recently been discovered.
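
As a rough illustration of these temperature categories, here is a toy classifier in Python; the boundaries are the ones quoted above, and the category labels are descriptive placeholders rather than formal spectral classes.

```python
# Toy classifier built only from the temperature ranges quoted in the text.
# Labels are descriptive placeholders, not official spectral types.

def brown_dwarf_category(temp_c: float) -> str:
    """Assign a rough category from atmospheric temperature in degrees Celsius."""
    if temp_c > 2000:
        return "hotter than the brown dwarfs described here"
    if temp_c >= 1200:
        return "warm: most of the ~800 cataloged brown dwarfs"
    if temp_c >= 250:
        return "cooler: strong methane absorption"
    return "coolest: the end point of brown dwarf evolution"

# A few illustrative temperatures, the last near room temperature,
# comparable to the coolest objects later found by WISE.
for t in (1800, 700, 25):
    print(f"{t:>5} C -> {brown_dwarf_category(t)}")
```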

Figure 9.6. An original illustration from Jules Verne’s 20,000 Leagues Under the Sea depicting bioluminescence. A more recent media representation appears in James Cameron’s 2009 motion picture Avatar, in which bioluminescent creatures inhabit the exomoon Pandora, where the action takes place. On Earth, thousands of species of sea creatures of all sizes employ bioluminescence (Jules Verne’s 20,000 Leagues Under the Sea).

NASA’s Wide-field Infrared Survey Explorer (WISE), launched in 2009, has finished a scan of the entire sky at infrared wavelengths. In 2011, astronomers reported six of the coolest star-like objects ever found, one of which has a surface temperature no higher than room temperature.61 The WISE data have the potential to reveal brown dwarfs closer than Proxima Centauri, the nearest star to our Sun.

Another promising project in the search for dwarf stars and transiting exoplanets is headed by Harvard astronomer David Charbonneau and was designed largely by Philip Nutzman, an astronomer at the University of California, Santa Cruz. The MEarth Project focuses eight small robotic telescopes on 2,000 M dwarf stars.62 The project particularly targets M dwarfs because they are smaller than the Sun, so a transiting planet blocks a greater fraction of the star’s light and is easier to detect. Studying nearby transiting exoplanets could give astronomers a better sense of whether Earth-like planets are common and additionally allow them to discern the chemistry of their atmospheres. Within six months of launching the project, the team detected their first transiting planet, a super-Earth named GJ 1214b in orbit around a star 13 parsecs from Earth.63 In 2010, the journal Nature reported that the atmosphere of GJ 1214b is composed either of water vapor or of thick clouds or haze, as on Saturn’s moon Titan.64 Infrared investigations are planned to determine which of these possibilities holds. With resources like Zooniverse.org, the public will likely contribute in the coming decade to the search for biomarkers such as ozone or water vapor in the atmospheres of these other worlds.
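
To see why a smaller star helps, here is a minimal sketch of the transit-depth arithmetic in Python; the assumption of an M dwarf one-fifth the Sun’s radius is illustrative, not a value taken from the MEarth survey itself.

```python
# Minimal sketch: a transit dims a star by roughly the ratio of the planet's
# disk area to the star's, i.e. (planet radius / star radius)^2.
# The radii below are round, illustrative values.

R_SUN_KM = 696_000        # solar radius
R_EARTH_KM = 6_371        # Earth radius

def transit_depth(planet_radius_km: float, star_radius_km: float) -> float:
    """Fractional dip in brightness during a central transit (ignoring limb darkening)."""
    return (planet_radius_km / star_radius_km) ** 2

depth_sun = transit_depth(R_EARTH_KM, R_SUN_KM)           # Earth-size planet, Sun-like star
depth_mdwarf = transit_depth(R_EARTH_KM, 0.2 * R_SUN_KM)  # same planet, M dwarf ~1/5 the Sun's radius

print(f"Sun-like star: {depth_sun:.5%} dip")    # ~0.008 percent
print(f"M dwarf:       {depth_mdwarf:.3%} dip") # ~0.2 percent, about 25 times deeper
```

Under these assumptions, the same Earth-sized planet produces a dip roughly twenty-five times deeper in front of an M dwarf than in front of a Sun-like star, which is what brings such transits within reach of small ground-based telescopes.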

Awash as it is in newborn stars and exoplanets, the universe may be teeming with life. Perhaps in some far future humans will have physical, robotic, or other means of virtual presence on a nearby exoplanet or one of its moons that, like Cameron’s Pandora, teems with bioluminescent life. Astronomer Carole Haswell cautions, “If any of this is to happen, however, we need to use our collective ingenuity to understand and repair the effects that our industrial activity and our burgeoning population are having on our own planet.”65 What we learn from exoplanets, Haswell suggests, might be invaluable in understanding Earth’s climate evolution and in ensuring our own survival and that of our companion species. In the meantime, NASA and the astronomical community welcome the public’s contribution to one of humankind’s greatest adventures: locating potentially habitable planets and, in time, the signatures of life in some dark and overlooked corner of the vast and silent wastes of interstellar space.
