[Image: Via @CrookedCosmos].
More or less following on from the previous post, @CrookedCosmos is a Twitter bot programmed by Zach Whalen, based on an idea by Adam Ferriss, that digitally manipulates astronomical photography.
It describes itself as “pixel sorting the cosmos”: skipping image by image through the heavens and leaving behind its own idiosyncratic scratches, context-aware blurs, stutters, and displacements.
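For anyone curious about the mechanics, "pixel sorting" is simple enough to sketch in a few lines. The Python fragment below (using Pillow and NumPy) is only a generic illustration of the technique, not Whalen's actual code; the brightness threshold, the row-wise pass, and the filenames are all placeholder choices. The idea is just to find runs of bright pixels and reorder them, which is what produces those smeared, stuttering streaks.

```python
# Minimal pixel-sorting sketch in the spirit of @CrookedCosmos (not the bot's actual code).
# Assumes Pillow and NumPy; "nebula.jpg" and the threshold are placeholder choices.
import numpy as np
from PIL import Image

THRESHOLD = 60  # brightness cutoff separating "sortable" runs from background

img = np.asarray(Image.open("nebula.jpg").convert("RGB")).copy()
brightness = img.mean(axis=2)

for y in range(img.shape[0]):
    row, bright = img[y], brightness[y]
    mask = bright > THRESHOLD
    # find contiguous runs of above-threshold pixels in this row
    edges = np.flatnonzero(np.diff(np.concatenate(([0], mask.astype(np.int8), [0]))))
    for start, end in zip(edges[::2], edges[1::2]):
        order = np.argsort(bright[start:end])  # reorder each run by brightness
        row[start:end] = row[start:end][order]

Image.fromarray(img).save("nebula_sorted.jpg")
```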
[Image: Via @CrookedCosmos].
While the results are frequently quite gorgeous, suggesting some sort of strange, machine-filtered view of the cosmos, the irony is that, in many ways, @CrookedCosmos is simply returning to an earlier state in the data.
After all, so-called “images” of exotic celestial phenomena often come to Earth not in the form of polished, full-color imagery, ready for framing, but as low-res numerical data sets that often require quite drastic cosmetic manipulation. Only then, after extensive processing, do they become legible—or, we might say, art-historically recognizable as “photography.”
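To make that "cosmetic manipulation" concrete: a raw frame typically arrives as an array of detector counts, and only becomes a viewable picture after steps like outlier clipping and a nonlinear stretch. The sketch below assumes astropy and matplotlib; the percentile clip, square-root stretch, and filename are generic assumptions, not any observatory's actual pipeline.

```python
# Turning raw counts into something that reads as a "photograph".
# A generic sketch, not a real mission pipeline; "frame.fits" is a hypothetical 2D image.
import numpy as np
from astropy.io import fits
import matplotlib.pyplot as plt

data = fits.getdata("frame.fits").astype(float)  # raw counts, often noisy and low-contrast

# clip extreme outliers, then apply a nonlinear stretch so faint structure becomes visible
lo, hi = np.percentile(data, (1, 99))
stretched = np.sqrt(np.clip((data - lo) / (hi - lo), 0, 1))

plt.imshow(stretched, cmap="gray", origin="lower")
plt.axis("off")
plt.savefig("frame.png", bbox_inches="tight")
```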
Consider, for example, what the data really look like when astronomers discover an exoplanet: an almost Cubist level of abstraction, constructed from rough areas of light and shadow, has to be dramatically cleaned up before it yields any evidence that a “planet” might really be depicted. Prior to that act of visual interpretation, these alien worlds “only show up in data as tiny blips.”
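Those "tiny blips" are often shallow dips in a star's light curve: as a planet crosses the star's disk, the measured brightness drops by a fraction of a percent, barely above the noise. The toy example below uses entirely synthetic, made-up numbers simply to convey the scale of the signal; it is not a real detection method.

```python
# Toy transit "blip": synthetic data only, to show how small the signal is relative to the noise.
import numpy as np

rng = np.random.default_rng(0)
time = np.arange(0, 30, 0.02)                        # ~30 days of observations (made up)
flux = 1 + rng.normal(0, 5e-4, time.size)            # simulated photometric noise
in_transit = (time % 10 > 4.9) & (time % 10 < 5.1)   # a transit roughly every 10 days
flux[in_transit] -= 1e-3                             # a 0.1% dip (an arbitrary illustrative depth)

# a crude flagging pass: points well below the overall baseline
baseline = np.median(flux)
sigma = np.std(flux)
candidates = time[flux < baseline - 2 * sigma]
print(f"dip depth ~{1e-3:.0e}, noise ~{5e-4:.0e}, flagged {candidates.size} points")
```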
In fact, it seems somewhat justifiable to say that exoplanets are not discovered by astronomers at all; they are discovered by computer scientists peering deep into data, not into space.
[Image: Via @CrookedCosmos].
Deliberately or not, then, @CrookedCosmos seems to take us back one step, to when the data are still incompletely sorted; its artistically manipulated images offer, in that sense, a more accurate glimpse of how machines truly see.
(Spotted via Martin Isaac. Earlier on BLDGBLOG: “We don’t have an algorithm for this.”)