The unveiling by U.S. President Joe Biden Monday of the first full-color image from the James Webb Space Telescope is already astounding — and delighting — humans around the globe.
“We can see possibilities nobody has ever seen before, we can go places nobody has ever gone before,” Biden said during a White House press event. “These images are going to remind the world that America can do big things.”
But humans aren’t the only audience for these images. Data from what Biden described as the “miraculous telescope” are also being soaked up by a new generation of GPU-accelerated AI created at the University of California, Santa Cruz.
And Morpheus, as the team at UC Santa Cruz has dubbed the AI, won’t just be helping humans make sense of what we’re seeing. It will also use images from the $10 billion space telescope to better understand what it’s looking for.
The image released by the National Aeronautics and Space Administration Monday is the deepest and sharpest infrared view of the distant universe to date. Dubbed “Webb’s First Deep Field,” the image of galaxy cluster SMACS 0723 is overflowing with detail.
NASA reported that thousands of galaxies — including the faintest objects ever observed in the infrared — have appeared in Webb’s view for the first time.
And Monday’s image represents just a tiny piece of what’s out there, with the image covering a patch of sky roughly the size of a grain of sand held at arm’s length by someone on the ground, explained NASA Administrator Bill Nelson.
The telescope’s iconic array of 18 interlocking hexagonal mirrors, which spans a total of 21 feet 4 inches, is peering farther into the universe, and further back into its past, than any tool to date.
“We are going to be able to answer questions that we don’t even know what the questions are yet,” Nelson said.
Strange New Worlds
The telescope won’t just see further back in time than any scientific instrument before it — almost to the beginning of the universe — it may also help us determine whether planets outside our solar system are habitable, Nelson said.
Morpheus — which played a key role in helping scientists understand images taken by NASA’s Hubble Space Telescope — will help scientists ask, and answer, these questions.
Working with Ryan Hausen, a Ph.D. student in UC Santa Cruz’s computer science department, UC Santa Cruz Astronomy and Astrophysics Professor Brant Robertson helped create a deep learning framework that classifies astronomical objects, such as galaxies, based on the raw data streaming out of telescopes on a pixel-by-pixel basis.
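Morpheus's actual architecture isn't detailed here, but the pixel-by-pixel idea can be illustrated with a toy model: every pixel in an image gets its own probability distribution over object classes, based on its local neighborhood. The sketch below uses random weights, a made-up class list, and a simple linear model in place of Morpheus's real convolutional network; everything in it is a stand-in for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a telescope exposure: a 32x32 single-band image.
image = rng.normal(size=(32, 32))

# Hypothetical class labels in the spirit of a per-pixel classifier.
CLASSES = ["spheroid", "disk", "irregular", "point_source", "background"]

def classify_pixels(img, weights, bias):
    """Assign class probabilities to every pixel from its 3x3 neighborhood.

    An illustrative linear model with a per-pixel softmax, not
    Morpheus's actual architecture.
    """
    padded = np.pad(img, 1, mode="reflect")
    h, w = img.shape
    # Gather each pixel's 3x3 neighborhood into a (h*w, 9) matrix.
    patches = np.stack(
        [padded[i:i + h, j:j + w] for i in range(3) for j in range(3)],
        axis=-1,
    ).reshape(-1, 9)
    logits = patches @ weights + bias          # (h*w, n_classes)
    logits -= logits.max(axis=1, keepdims=True)
    probs = np.exp(logits)
    probs /= probs.sum(axis=1, keepdims=True)  # softmax per pixel
    return probs.reshape(h, w, len(CLASSES))

weights = rng.normal(size=(9, len(CLASSES)))
bias = np.zeros(len(CLASSES))
probs = classify_pixels(image, weights, bias)

# Every pixel ends up with a full probability distribution over classes.
assert probs.shape == (32, 32, len(CLASSES))
assert np.allclose(probs.sum(axis=-1), 1.0)
```

The output is a per-class probability map the same size as the input image — the key difference from whole-image classifiers, which emit a single label per image.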
“The JWST will really enable us to see the universe in a new way that we’ve never seen before,” Robertson said. “So it’s really exciting.”
Eventually, Morpheus will use the new images to learn, too. Not only are the JWST’s optics unique, but the telescope will also be collecting light from galaxies that are farther away — and thus redder — than those visible to Hubble.
Morpheus is trained on UC Santa Cruz’s Lux supercomputer. The machine includes 28 GPU nodes with two NVIDIA V100 GPUs each.
In other words, while we’re all feasting our eyes on these images for years to come, scientists will be feeding data from the JWST to AI.
Tune in: NASA and its partners will release the full series of Webb’s first full-color images and data, known as spectra, Tuesday, July 12, during a live NASA TV broadcast.