[Image: Synthetic volcanoes modeled by Jeff Clune, from “Plug & Play Generative Networks,” via Nature].
Various teams of astronomers have been using “deep-learning neural networks” to generate realistic images of hypothetical stars and galaxies, but their work also implies that these same tools could be used to model the surfaces of unknown planets. Alien geology as dreamed by machines.
The Square Kilometre Array in South Africa, for example, “will produce such vast amounts of data that its images will need to be compressed into low-noise but patchy data.” Turning that compressed, patchy data back into readable imagery is where artificial intelligence comes in: “Generative AI models will help to reconstruct and fill in blank parts of those data, producing the images of the sky that astronomers will examine.”
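To make that pipeline a little more concrete, here is a deliberately toy Python sketch, nothing from the actual SKA software: a synthetic “sky” is reduced to sparse, patchy samples, and the missing pixels are then filled back in. Simple neighbor averaging stands in for the trained generative model, which is precisely the part that would hallucinate plausible structure rather than a smooth blur.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "sky": a smooth background plus a scattering of bright point sources.
size = 128
yy, xx = np.mgrid[0:size, 0:size]
sky = np.exp(-((xx - 64) ** 2 + (yy - 64) ** 2) / 2000.0)
for _ in range(20):
    cy, cx = rng.integers(0, size, 2)
    sky += 0.5 * np.exp(-((xx - cx) ** 2 + (yy - cy) ** 2) / 8.0)

# "Compression" step: keep only a random 30% of pixels, zero out the rest.
mask = rng.random(sky.shape) < 0.3          # True where data survives
patchy = np.where(mask, sky, 0.0)

# "Reconstruction" step: iteratively fill the blank pixels from their neighbors,
# keeping the observed pixels fixed. A generative model would invent plausible
# detail here; diffusion-style averaging only recovers a smooth estimate.
recon = patchy.copy()
for _ in range(200):
    padded = np.pad(recon, 1, mode="edge")
    neighbors = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
                 padded[1:-1, :-2] + padded[1:-1, 2:]) / 4.0
    recon = np.where(mask, patchy, neighbors)

print("mean absolute error of the fill-in:", np.abs(recon - sky).mean())
```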
The results are not photographs, in other words; they are computer-generated models that are nonetheless considered scientifically valid for the potential insights they offer into how regions of space are structured.
What interests me about this, though, is that one of the scientists involved, Jeff Clune, uses these same algorithmic processes to generate believable imagery of terrestrial landscape features, such as volcanoes. The same techniques could then be used to model the topography of other planets, producing informed visual guesstimates of mountain ranges, ancient ocean basins, vast plains, valleys, even landscape features we might not yet have words to describe.
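As a loose illustration of what “generating believable imagery of volcanoes” involves computationally, here is a hypothetical, untrained class-conditional generator in PyTorch. It shows only the interface, a random latent vector plus a label mapped to an image; every name below is invented for the sketch, and Plug & Play Generative Networks themselves work differently, steering a pretrained image generator with a separate classifier.

```python
import torch
import torch.nn as nn

# Toy, untrained generator: it emits noise, but the shape of the idea is here.
CLASSES = ["volcano", "valley", "ocean_basin", "mountain_range"]
LATENT_DIM = 64

class ToyLandscapeGenerator(nn.Module):
    def __init__(self, latent_dim=LATENT_DIM, n_classes=len(CLASSES), image_size=32):
        super().__init__()
        self.embed = nn.Embedding(n_classes, latent_dim)   # label -> vector
        self.net = nn.Sequential(
            nn.Linear(latent_dim * 2, 256),
            nn.ReLU(),
            nn.Linear(256, image_size * image_size),
            nn.Sigmoid(),                                   # pixel intensities in [0, 1]
        )
        self.image_size = image_size

    def forward(self, z, labels):
        # Concatenate the latent code with the class embedding and decode to pixels.
        x = torch.cat([z, self.embed(labels)], dim=1)
        return self.net(x).view(-1, self.image_size, self.image_size)

generator = ToyLandscapeGenerator()
z = torch.randn(4, LATENT_DIM)                        # four random latent codes
labels = torch.tensor([CLASSES.index("volcano")] * 4)
images = generator(z, labels)                         # four "volcanoes" (noise, since untrained)
print(images.shape)  # torch.Size([4, 32, 32])
```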
The notion that we would thus be seeing what AI thinks other worlds should look like—that, to view this in terms of art history, we are looking at the projective landscape paintings of machine intelligence—is a haunting one, as if discovering images of alien worlds in the daydreams of desktop computers.
(Spotted via Sean Lally; vaguely related, “We don’t have an algorithm for this”).