Just popping in to say hello.
While processing an image using AI-assisted software, a photographer named Jomppe Vaarakallio unexpectedly found actor Ryan Gosling’s face in the resulting image file. The software apparently mistook some window curtains, featuring just the right geometry of shade and folding, for the Canadian actor and thus inserted his face.
According to PetaPixel, this “shows you what happens when computer vision gets tripped up by what looks like a blurry face”—but, of course, it is also what happens when we put too much faith in pattern recognition as a viable form of analysis, whether it’s visual, textual, or otherwise.
Like playing Led Zeppelin records backward in the 1970s and straining to hear subliminal messages pledging allegiance to Satan in the noise, we could feed all our photos through AI programs and see what secret scenes of celebrity rendezvous they uncover—famous faces hidden in tree leaves, carpets, and window shades, in clothes hanging inside closets and in the fur of distant animals. Use it to generate scenes in films and novels, like Blow-Up or The Conversation, for an age of post-human interpretation.
In fact, I’m sure we’ll see the rise and widespread use of authoritarian AI analytics, fed a constant stream of images and audio recordings, finding crimes that never happened in the blur of a street scene or hearing things that were never said in a citywide wiretap—call it the Gosling Effect—resulting in people going to prison for the evidential equivalent of faces that were never really there.
(Spotted via @kottke.)
[Image: Caspar David Friedrich, “Wanderer Above the Sea of Fog” (c. 1818).]
[NOTE: I have dozens of posts—from book reviews to news items—sitting in my drafts folder that I never published for some reason. In re-reading this, from summer 2019, I thought I’d publish it.]
The tone of a new piece by Matthew Walther in The Week hits just shy of satire, but it makes an interesting—and, I would suggest, valid—point: that the Apollo moon landing was more of an art-historical event, a direct extension of European Romanticism, than a scientific one.
“What Goethe began at Weimar in 1789 ended on July 20, 1969,” Walther writes. “Apollo 11 was the culmination of the Romantic cult of the sublime prefigured in the speculations of Burke and Kant, an artistic juxtaposition of man against a brutal environment upon which he could project his fears, his sympathies, his feelings of transcendence.” Note the use of the word man.
The primarily aesthetic nature of the first Apollo mission becomes clearer when one considers it from the perspective of both the participants and the spectators. The lunar landing was not a scientific announcement or a political press conference; it was a performance, a literal space opera, a Wagnerian Gesamtkunstwerk that brought together the efforts of more than 400,000 people, performed before an audience of some 650 million.
I’m reminded of Kathleen Jamie’s critique of Robert Macfarlane’s work, published in the London Review of Books several years ago. There, Jamie mocked the state of nature writing as a genre of the “lone enraptured male,” in her words—a figure she portrays as one of mythic delusion, seeking self-aggrandizing solitude amongst inhuman things.
“What’s that coming over the hill?” Jamie asks. “A white, middle-class Englishman! A Lone Enraptured Male! From Cambridge! Here to boldly go, ‘discovering’, then quelling our harsh and lovely and sometimes difficult land with his civilised lyrical words.” To boldly go is, of course, a sarcastic reference to a particular kind of explorer’s impulse: “to boldly go where no man has gone before.”
In this context, consider Walther’s own comment that much of modern expeditionary literature—such as Antarctic diaries or the records of long ship journeys—was often written by “hard men” who “put their will in the service of a literary mania for feelings of remoteness, hugeness, and brooding oceanic emptiness. What a shame that we have been able to produce no great lunar literature to succeed the writings by Byron, the Shelleys, Tennyson, and Melville that both immortalized and inspired the great hypothermic pioneers.”
Men sending themselves to the moon, men climbing mountains, men staring down glacial landscapes alone or moving into mountain huts and man-caves.
I remember waking up from a dream once, maybe ten or twelve years ago*, in which I had been the author of a novel called Space. In that dream, my (purely imaginary) novel was about a man who abandons his family—leaving his wife and kids without notice—to find “space” for himself elsewhere. It was an ill-considered quest for solitude, emotionally and financially devastating for the people he left behind, but the feeling of “space” was one he valued so much that he could not bring himself to worry about the pain it caused others. In other words, it was space refigured as a kind of masculine cruelty, a weapon for solitude-seeking men who can afford to walk away—space as freedom from consequences masquerading as self-sufficiency. (I wrote a description of the dream down the next morning and thus still remember it.)
In any case, I will say—perhaps because I am blinded by my own demography—that the idea of expeditionary solitude as a kind of landscape project, something that can lure hikers into remote national forests or pull unaccompanied astronauts into deep valleys on the dark sides of distant worlds, needn’t necessarily be gendered and needn’t necessarily have any scientific value at all. A human alone in an experience of awe, surrounded by geology, might take away nothing more than aesthetic value—of course, whether it’s $152 billion worth of aesthetic value is another question entirely.
*UPDATE: Looking back at my notes, I actually had this dream in September 2014.
[Image: Photographer unknown; spotted via Medium.]
A design constraint I would sometimes use while teaching was to throw in an unexpected change to the project brief: this cluster of buildings you’re designing is now sponsored by Netflix, REI, Philips, etc. The point would be to think about how this might affect the resulting project—its streets designed as an open-air prototype of smart-lighting techniques, say, or an office campus now featuring climbing walls, artificial rivers, or small-group cinema projection booths. (In turn, the purpose of this was simply to remain flexible as one pushes ahead on a particular assignment.)
The prospect that always seemed most interesting to me, though, was a company such as Dolby Laboratories: an audio services firm that might sponsor or commission an entire building or suburb, a new community somewhere designed for how it sounds. Six new houses pop up down the street from you next year and they’re a cross-platform collaboration not in high-end embedded speakers and the like, but in actual structural audio, like Joel Sanders’s Mix House scaled up.
For example, recall Nate Berg’s piece on the design history of roadside noise barriers. Although there is an almost Coen Brothers-like comical subplot to Berg’s story—as industries throughout Los Angeles, from homebuilders to classical music performers to Hollywood film studios, confronted the deafening and ever-growing roar of all the damn freeways being constructed everywhere, like some urban-scale act of self-inflicted hearing impairment, people screaming on telephones, What?!, no one sleeping at night, a city gone insane—the primary takeaway is simply that overwhelming sound sources inspire structural changes elsewhere. You build a freeway, in other words, then someone will build that freeway’s acoustic opposite, a shield or dampener.
In any case, it was thus interesting to read about what the New York Times calls “a pair of giant noise-canceling headphones for your apartment” designed by researchers in Singapore.
The system uses a microphone outside the window to detect the repeating sound waves of the offending noise source, which is registered by a computer controller. That in turn deciphers the proper wave frequency needed to neutralize the sound, which is transmitted to the array of speakers on the inside of the window frame.
The speakers then emit the proper “anti” waves, which cancel out the incoming waves, and there you have it: near blissful silence.
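The principle at work is destructive interference: a wave summed with a copy of itself, flipped half a cycle out of phase, cancels to silence. Here is a minimal sketch of that principle in Python (all parameters are illustrative, not taken from the Singapore system), including why even a slight timing error defeats it:

```python
import numpy as np

# A minimal sketch of the destructive-interference principle behind the
# window system: a periodic noise source is sampled, inverted, and
# re-emitted, so that noise + anti-noise sums to (near) silence.

fs = 44_100                                  # sample rate, Hz
t = np.arange(fs) / fs                       # one second of audio
noise = 0.8 * np.sin(2 * np.pi * 120 * t)    # a 120 Hz traffic-like drone

anti_noise = -noise                          # same amplitude, half a cycle out of phase
residual = noise + anti_noise                # what a listener inside would hear

print(f"peak before: {np.max(np.abs(noise)):.3f}")
print(f"peak after:  {np.max(np.abs(residual)):.3f}")

# In practice the anti-noise is never perfectly aligned; even a small
# timing error leaves audible residue, one reason such systems struggle
# with unpredictable sources like human voices.
delay = 20                                   # samples of misalignment (~0.45 ms)
imperfect = noise + np.roll(anti_noise, delay)
print(f"peak with 0.45 ms error: {np.max(np.abs(imperfect)):.3f}")
```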
If you read the full New York Times piece, it seems clear that the system currently has several drawbacks: it is visually ungainly, it cannot counter human voices, and it still lets in a lot of sound.
Nevertheless, the idea of a new building, town, or entire city offering its residents sonic amenities beyond just Bang & Olufsen speakers or similar seems long overdue. For that matter, combine luxury frequency-reduction techniques with seismic wave-mitigation and perhaps you’ve just designed the future of architecture in global earthquake zones. At the very least, someone’s living room will sound better at night.
(Related: Body Sonic / Coronavirus Surroundsound.)
After tweeting a link to a recent story about a Connecticut man who fell through a patch of weak floorboards into a previously unknown well hidden beneath the house, someone replied with the story, above.
I’m always a fan of undiscovered architectural spaces coming to light in a mysterious manner—whether through secret passages, old floorplans, forgotten maps, trapdoors, or even dreams—but this suggests a new method: deducing, from the state of one’s own moldy clothing, that there might be hidden rooms nearby, wells and cellars unknown to you by any other means. Architectural detection garments.
There’s a great moment in a recent article by Jace Clayton, who reviews an installation by DJ and musician Carl Craig for Artforum, where Clayton talks about music’s relationship to empty space.
There is something of “a sonic axiom,” Clayton writes: “Amplified music sounds terrible in empty rooms. The less stuff there is in any given space, the more sound waves will bounce around the walls and ceiling and glass, losing definition as they both interrupt and double themselves. The resulting audio is smeary, muffled, and diffuse. However, when the same space fills with bodies moving around, those waves are absorbed, dampening those irksome reflections and allowing us to hear the sound more powerfully and in far greater detail.”
The effect is such that “the only thing that could make [music] sound better is people.” Bodies make music better—a second sonic axiom, as well as an optimist’s call for more social listening. In other words, your music will sound better the more people you experience it with. Hang out with others. Be bodies. Share.
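Clayton’s axiom has a textbook counterpart in Sabine’s reverberation formula, in which every absorbing body in a room shortens the time sound takes to decay. A back-of-envelope sketch, using commonly cited ballpark absorption figures rather than measurements from any real venue:

```python
# Sabine's formula: RT60 = 0.161 * V / A, where V is room volume (m^3)
# and A is total absorption (m^2 sabins). The numbers below are rough,
# commonly cited ballpark values, purely for illustration.

def rt60(volume_m3: float, absorption_sabins: float) -> float:
    """Reverberation time: seconds for sound to decay by 60 dB."""
    return 0.161 * volume_m3 / absorption_sabins

club_volume = 2_000      # a mid-sized venue, m^3 (illustrative)
hard_surfaces = 120      # absorption of bare walls, ceiling, glass
per_person = 0.45        # approximate absorption of one human body

empty = rt60(club_volume, hard_surfaces)
packed = rt60(club_volume, hard_surfaces + 400 * per_person)

print(f"empty room:  RT60 ≈ {empty:.1f} s   (smeary, diffuse)")
print(f"400 bodies:  RT60 ≈ {packed:.1f} s  (tighter, more detail)")
```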
In any case, Clayton’s piece went online a couple weeks ago but I find myself thinking about it almost daily, as the acoustic effects of the coronavirus lockdown become clear in cities around the world.
“As the pandemic brought much of the crush of daily life to a halt,” the New York Times reported, “microphones listening to cities around the world have captured human-made environments suddenly stripped of human sounds.” To put this in Clayton’s terms, cities are now spaces without bodies.
Think, for example, of Francesca Marciano describing “the new silences of Rome” in an age of coronavirus, or the New York Times itself pointing out how, in Manhattan, “the usual chaos of sounds—car horns, idle chatter and the rumble of subways passing frequently below—[has] been replaced by the low hum of wind and birds. Sound levels there fell by about five decibels, enough to make daytime sound more like a quiet night.”
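Because decibels are logarithmic, that five-decibel figure is bigger than it sounds; a quick check:

```python
# A 5 dB drop cuts acoustic power to roughly a third of its original level.
drop_db = 5
power_ratio = 10 ** (-drop_db / 10)
print(f"-{drop_db} dB -> {power_ratio:.2f}x the original sound power")
# -5 dB -> 0.32x the original sound power
```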
There is an interesting paradox at work here, though, in terms of a widely reported belief that birds appear to be singing louder than ever before: birds are actually quieting down now, as they have less competition to out-sing. As the NYT writes, this is “because they no longer have to sing louder to be heard over the racket of the city, a behavior, known as the Lombard effect, that has been observed in other animals, too.”
[Image: Gowanus, Brooklyn; photograph by Geoff Manaugh.]
I’ve written at length about sound and the city elsewhere, but one of my favorite pieces on this was a short profile of acoustic engineer Neill Woodger, then-head of Arup’s SoundLab, published in Dwell way back in June 2008.
There, Woodger made the point that, as we transition to electric vehicles, which will remove the sound of the internal combustion engine from our cities, we are being given a seemingly once-in-a-lifetime acoustic opportunity: to redesign urban space for sound, highlighting noises we might want to hear—birdsong, bells, distant train whistles—and helping to excise those we do not.
The coronavirus, it seems, has inadvertently set the stage for another such sonic opportunity. Our global urban lockdowns have all but stripped our cities of “bodies moving around,” in Clayton’s words, such that our streets now sound quite eerie, as if replaced by uncanny muted versions of themselves, or what Marciano calls “an atmosphere of peaceful suspension, as when it snows and everything is wrapped in cotton wool.”
Much has been made of how temporary design interventions in response to COVID-19—things like wider sidewalks, outdoor cafes, streets liberated from cars and opened up to children, families, and the elderly—might become permanent.
In this context, what permanent acoustic shifts might we hear coming from all this, as well?
(Consider picking up a copy of Jace Clayton’s book, Uproot: Travels in 21st-Century Music and Digital Culture.)
[Image: Diagram from The Stones of Venice by John Ruskin.]
There are at least two interesting moments in John Ruskin’s book The Stones of Venice.
One is his description of buttresses.
Buttresses, Ruskin writes, are structures against pressure: a cathedral’s walls want to fall outward, for example, pushed aside by the relentless weight of the roof. But this gravitational pressure can be stabilized by an exoskeleton: a sequence of buttresses that will prevent those walls from collapsing outward.
However, Ruskin points out, there is a similar kind of pressure from the waves of the sea. Think of the curved hull of a ship, he writes, which is internally buttressed against the “crushing force” of the ocean around it. It is a kind of inside-out cathedral.
Consider other high-pressure environments where architecture can thrive—resting in the benthic abyss or twirling through the vacuum of outer space, where offworld stations rotate and spin through exotic gravitational scenarios—and you’ve perhaps envisioned what John Ruskin would be writing about today. Ship-buildings, buttressed against the void.
In any case, for Ruskin, buttresses perform a kind of gravitational judo: he describes “buttresses of peculiar forms, cunning buttresses, which do not attempt to sustain the weight, but parry it, and throw it off in directions clear of the wall.” They shed the load, so to speak, flipping it elsewhere, as if taking advantage of an opponent’s slow and graceless momentum.
…as science advances, the weight to be borne is designedly and decisively thrown upon certain points; the direction and degree of the forces which are then received are exactly calculated, and met by conducting buttresses of the smallest possible dimensions; themselves, in their turn, supported by vertical buttresses acting by weight, and these perhaps, in their turn, by another set of conducting buttresses: so that, in the best examples of such arrangements, the weight to be borne may be considered as the shock of an electric fluid, which, by a hundred different rods and channels, is divided and carried away into the ground.
It’s buttresses buttressing buttresses—or buttresses all the way down.
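Ruskin’s “parrying” also has a simple statics reading. In the rigid-block model of masonry, a buttress succeeds when the combined line of thrust stays within the middle third of its base, and extra dead weight, such as a pinnacle, pulls that line back toward vertical, which is exactly Ruskin’s “vertical buttresses acting by weight.” A minimal sketch with illustrative numbers, not anything derived from Ruskin:

```python
# Rigid-block masonry model: a horizontal roof thrust H, landing at
# height h on a buttress of weight W, shifts the resultant force at
# the base by e = H * h / W. The classic rule of thumb is that e must
# stay within the middle third of the base depth b (e <= b / 6), or
# the masonry joint opens in tension.

def eccentricity(thrust_kn: float, height_m: float, weight_kn: float) -> float:
    """Offset of the base resultant from the buttress centreline, m."""
    return thrust_kn * height_m / weight_kn

H, h = 100.0, 8.0    # thrust (kN) and the height at which it lands (m)
b = 2.4              # buttress depth in the direction of thrust, m
W = 1200.0           # self-weight of the buttress, kN

e = eccentricity(H, h, W)
print(f"bare buttress: e = {e:.2f} m vs limit {b / 6:.2f} m")   # 0.67 > 0.40, fails

# Adding dead load above -- e.g. a pinnacle -- pulls the thrust line
# back toward the centre of the base:
e2 = eccentricity(H, h, W + 800.0)
print(f"with pinnacle: e = {e2:.2f} m vs limit {b / 6:.2f} m")  # 0.40 <= 0.40, holds
```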
Ruskin reminds his readers, however, that a buttress’s function can even be seen outdoors, where he specifically cites Swiss landscape defenses. There, Ruskin writes, horizontal buttresses like defensive walls “are often built round churches, heading up hill, to divide and throw off the avalanches.” Again, it’s a question of parrying an oppositional force, deflecting it elsewhere.
[Image: “Profile of a buttress with vertical internal line, when the line of thrust coincides with the axis of the buttress,” taken from a paper called “Milankovitch’s Theorie der Druckkurven: Good mechanics for masonry architecture” by Federico Foce, in Nexus Network Journal.]
From an architectural point of view, you might say that a landscape is stationary until it buckles, shudders, or moves, becoming oceanic, heaving like the sea.
Or, to be pretentious and quote myself from an op-ed in the New York Times, “the ground itself is a kind of ocean in waiting. We might say that [the Earth] is a marine landscape, not a terrestrial one, a slow ocean buffeted by underground waves occasionally strong enough to flatten whole cities. We do not, in fact, live on solid ground: We are mariners, rolling on the peaks and troughs of a planet we’re still learning to navigate. This is both deeply vertiginous and oddly invigorating.”
For Ruskin, the buttress is an architectural technology—a spatial tool—that can be built to anticipate this act of marine transformation, a device that can prepare our buildings and cities to resist violent events in the landscape they are built upon.
With this in mind, it’s worth recalling a recent experiment that showed buildings can be partially shielded from the effects of earthquakes. An “invisibility cloak,” as researchers somewhat hyperbolically described it back in 2013, would use a “regular grid of cylindrical and empty boreholes” drilled into the earth to absorb and deflect seismic waves and thus protect certain structures from damage.
They would “parry it,” as Ruskin once wrote, “and throw it off in directions clear” of the city. In Ruskin’s terms, in other words, they would be buttresses: empty void-silos in the earth that nevertheless function like the exoskeletal cage of a cathedral or the internal ribs of a ship at sea.
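The physics, roughly, is that seismic surface waves in soil are tens or hundreds of metres long, so a grid of holes spaced at a small fraction of a wavelength behaves, from the wave’s point of view, like a continuous material with altered properties. A sketch using typical-order values, not the figures from the 2013 experiments:

```python
# Wavelength of a seismic surface wave: lambda = v / f. A borehole grid
# must be spaced well below one wavelength to act as a "seismic
# metamaterial." Both values below are typical orders of magnitude,
# not measurements from the cited study.

shear_velocity = 300.0   # m/s, loose sediment (illustrative)
frequency = 5.0          # Hz, damaging earthquake energy (illustrative)

wavelength = shear_velocity / frequency
spacing = wavelength / 10          # rule of thumb: a small fraction of lambda

print(f"wavelength ≈ {wavelength:.0f} m")        # ≈ 60 m
print(f"borehole grid spacing ≈ {spacing:.0f} m")
```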
[Image: Glacial logics diagrammed in The Stones of Venice by John Ruskin.]
The second interesting thing from The Stones of Venice—among many others, to be sure, but I will only focus on two here—is that, amazingly, for a book published back in 1853, Ruskin scales his analysis up to the point of suggesting that glaciers should be considered as complex architectural objects.
Ruskin describes “a curve about three quarters of a mile long,” for example, “formed by the surface of a small glacier of the second order.” This curve, he writes, is “the most beautiful simple curve I have ever seen in my life.” So, he wonders, how could it be applied to architecture? How could we learn from glaciers?
At this point, Ruskin draws a diagram—the one I’ve scanned, above—to highlight a variety of nested curves that he believes are hiding inside a particular glacier. These are organizational systems that extend for many miles at a time through the ice and that allegedly entail geometric lessons for architects.
The idea here—that Ruskin was trying to extract architectural lessons from glaciers more than a century and a half ago—is incredible to me.
After all, if the Gothic is an architectural language that, as writers such as Lars Spuybroek have compellingly shown, draws from the natural vocabulary of leaves, plants, tree roots, and so on, then this means that Ruskin is suggesting—in 1853!—a kind of Glacial Gothic, an architectural lesson drawn from continent-spanning masses of ice.
I’m reminded of an old t-shirt produced by the band Godflesh that described their music as an “Avalanche On Pause.”
This is a very Ruskinian description, we might say in the present context.
An avalanche on pause brings together Ruskin’s interests in landscape-scale structural events—such as glaciers and landslides—with his attention to the mechanics of cathedrals built to resist such imposing pressures. To freeze them in place. To press pause.
(Thanks to Marc Weidenbaum for reminding me of that Godflesh shirt many years ago.)
[Image: “Clouds, Sun and Sea” (1952) by Max Ernst, courtesy Phillips.]
There’s an interesting space where early modern, mostly 19th-century earth sciences overlap with armchair conjectures about the origins of human civilization. It’s a mix of pure pseudo-science, science-adjacent speculation, and something more like theology, as writers of the time tried to adjust new geological hypotheses and emerging biological evidence—Charles Lyell, Charles Darwin, etc.—to fit with Biblical creation myths and cosmogonic legends borrowed from other cultures. Was there really a Flood? If humans are separate from the animal kingdom, how did we first arrive or appear on Earth?
It is not those particular questions that interest me—although, if I’m being honest, I will happily stay at the table for hours talking with you about the Black Sea deluge hypothesis or the history of Doggerland, two of the most interesting things I’ve ever read about, and whether or not they might have influenced early human legends of a Flood.
Instead, there are at least two things worth pointing out here. One is that these sorts of people never really went away; they just got jobs at the History Channel.
The other is that impossibly long celestial cycles, ancient astronomical records, the precession of the Earth’s poles, and weird, racist ideas about the “fall of Man” all came together into a series of speculations that seem straight out of H.P. Lovecraft.
Take, for example, Sampson Arnold Mackey and his “Age of Horror.”
[Image: Diagram from The Mythological Astronomy in Three Parts by Sampson Arnold Mackey.]
As Joscelyn Godwin writes in a book called The Theosophical Enlightenment, Mackey—a shoemaker, not an astronomer—was fascinated by “the inclination of the earth’s axis and its changes over long spans of time. Astronomers have known at least since classical times that the Earth’s axis rotates once in about 25,920 years, pointing successively at different stars, of which the current one is Polaris, the North Star. One result of this cycle is the ‘precession of the equinoxes,’ according to which the spring-point of the sun moves around the twelve signs of the zodiac, spending about 2160 years in each sign.”
Of course, the assumption that these signs and stars might somehow influence life on Earth is the point at which astronomy morphs into astrology.
Godwin goes on to explain that—contrary to “most astronomers” of his time—Mackey assumed the Earth’s precession was dramatic and irregular, to the extent that, as Mackey speculated, “the earth’s axis describes not a circle but an alternately expanding and contracting spiral, each turn comprising one cycle of the precession of the equinoxes, and at the same time altering the angle of inclination by four degrees.”
The upshot of this is that, at various points in the history of our planet, Mackey believed that the Earth’s “inclination was much greater, to the point at which it lay in the same plane as the earth’s orbit around the sun.”
This sounds inconsequential, but it would have had huge seasonal and climatic effects. For example, Godwin explains, “At the maximum angle, each hemisphere would be pointed directly at the sun day and night during the summer, and pointed away for weeks on end during the winter. These extremes of light and dark, of heat and cold, would be virtually insupportable for life as we know it. In Mackey’s words, it was an ‘age of horror’ for the planet.”
[Image: Diagram from The Mythological Astronomy in Three Parts by Sampson Arnold Mackey.]
The flipside of this, for Mackey, is that the Earth would have gone back and forth, over titanic gulfs of time, between two angular extremes. Specifically, his model required an opposite extreme of axial tilt in which “there would be no seasons on earth, but a perpetual spring and a ‘golden age.’ Then the cycle would begin again.”
None of this would have been recent: “Mackey dates the Age of Horror at 425,000 years in the past, the Golden Age about a million years ago, and its recurrence 150,000 years from now.”
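Oddly, those dates hang together on Mackey’s own terms: if the tilt changes four degrees per precessional cycle, his chronology is internally consistent. A quick back-calculation (this is Mackey’s pseudo-science, of course, not real astronomy; only the present-day obliquity is a real figure):

```python
# Mackey's model, as Godwin summarizes it: the axial tilt changes by
# 4 degrees per precessional cycle of 25,920 years, swinging between
# 0 degrees (Golden Age) and 90 degrees (Age of Horror).

CYCLE_YEARS = 25_920
DEG_PER_CYCLE = 4
TILT_TODAY = 23.4           # current obliquity, degrees (the one real number here)

def mackey_tilt(years_ago: float) -> float:
    """Tilt implied by Mackey's spiral; positive = past, negative = future."""
    t = TILT_TODAY + DEG_PER_CYCLE * years_ago / CYCLE_YEARS
    t = abs(t) % 180
    return 180 - t if t > 90 else t     # fold back at the 90-degree extreme

print(f"425,000 years ago:   {mackey_tilt(425_000):4.1f}°  (Age of Horror ≈ 90°)")
print(f"1,000,000 years ago: {mackey_tilt(1_000_000):4.1f}°  (Golden Age ≈ 0°)")
print(f"150,000 years hence: {mackey_tilt(-150_000):4.1f}°  (Golden Age returns)")
```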
Nevertheless, Godwin writes, “It was essential to [Mackey’s] system of mythography that the Age of Horror should have been witnessed and survived by a few human beings, its dreadful memory passing into the mythology of every land.”
For Mackey, the implications of this wobble—this dramatic precession between a Golden Age and an Age of Horror, between the darkness of Hell and the sunlight of Paradise—would have been highly significant for the evolution of human civilization.
In other words, either we are coming out of an age of eternal winter and emerging slowly, every minute of the day, every year of the century, into a time of endless sunlight and terrestrial calm, or we are inevitably falling, tipping, losing our planetary balance as we pass into near-permanent night, a frozen Hell of ruined continents and dead seas buried beneath plates of ice.
[Image: The August 2017 total eclipse of the sun, via NASA.]
One of the weirder aspects of all this—something Godwin himself documents in another book, called Arktos—is that these sorts of ideas eventually informed, among other things, Nazi political ideology and even some of today’s reactionary alt-right.
The idea that there was once a Hyperborean super-civilization, a lost Aryan race once at home in the Arctic north, lives on. It’s what we might call the cult of the fallen Northerner.
[Image: “Cairn in Snow” (1807) by Caspar David Friedrich.]
What actually interests me here, though, is the suggestion that planetary mega-cycles far too long for any individual human life to experience might be slowly influencing our myths, our cultures, our consciousness (such as it is).
My point is not to suggest that this is somehow true—to say that astrologers and precession-truthers are right—but simply to say that it is a fascinating idea, one with nearly limitless potential for new films, novels, and myths, stories where entirely different ways of thinking emerge on planets with extreme seasonal inclinations or unusual polar relationships to the stars.
Think of the only good scene in an otherwise bad movie, 2000’s Pitch Black, where the survivors of a crash on a remote planet discover an orrery—a model of the world they’re standing on—inside an abandoned building at a human outpost.
Playing with the model, the survivors realize that the world they’ve just crashed on is about to be eclipsed by a nearby super-planet, plunging them into a night that will last several months (or weeks or years—I saw the film 20 years ago and don’t remember).
Just imagine the sorts of horrors this might inspire—an entire planet going dark perhaps for centuries, doomed by its passage through space.
[Image: Adolph Gottlieb, courtesy Hollis Taggart.]
In any case, the idea that the earliest human beings lived through something like this hundreds of thousands of years ago—an imminent night, a looming darkness, an Age of Horror that imprinted itself upon the human imagination with effects lasting to this day—would mean that what we think of as human psychology is just an angular epiphenomenon of planetary tilt. Call it orbital determinism.
(Very vaguely related: a planet without a sun.)
[Image: Nevada test site, Google Maps, filtered through Instagram.]
There’s a great line in Tom Zoellner’s book Uranium: War, Energy, and the Rock That Shaped the World where he describes the after-effects of underground nuclear tests. Zoellner writes that, during these tests, “a nuclear bomb buried in a deep shaft underneath a mountain would vaporize the surrounding rock and make a huge cathedral-like space inside the earth, ablaze with radioactivity.”
I thought of Zoellner’s vision of a “huge cathedral-like space inside the earth” recently while reading a paper by Colin N. Waters et al., called “Recognising anthropogenic modification of the subsurface in the geological record.” Among other things, the authors describe the long-term “structural effects of subsurface weapon detonations.”
[Image: Nevada test site, Google Maps, filtered through Instagram.]
They suggest that these detonations produce spaces—such as collapse cones and debris fields—that have “no direct natural analogue,” although they do helpfully contrast weapon-test craters with meteor-impact sites. (The authors also break underground nuclear test sites down into “zones,” which include a “zone of irreversible strain,” which is an amazing phrase.)
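For a sense of scale, containment studies typically relate cavity size to the cube root of explosive yield. A rough sketch, where the constant of proportionality is an assumed order-of-magnitude figure for hard rock, not a measured value:

```python
# Cube-root scaling of an underground detonation cavity: R ≈ C * Y^(1/3).
# The constant C varies with rock type and depth of burial; the value
# used here is an assumed ballpark, purely for illustration.

C_METRES_PER_KT_CUBE_ROOT = 10.0   # assumed order-of-magnitude constant

def cavity_radius_m(yield_kt: float) -> float:
    """Rough radius of the vaporized void, in metres."""
    return C_METRES_PER_KT_CUBE_ROOT * yield_kt ** (1 / 3)

for y in (1, 10, 150):             # 150 kt: the post-1976 test-ban ceiling
    print(f"{y:>4} kt  ->  radius ≈ {cavity_radius_m(y):.0f} m")
```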
The larger purpose of their paper, though, is to look at long-term “signatures” that humans might leave behind in our underground activity, from nuclear tests to mineralogical carbon-capture to deep boreholes to coal mines. Will these signatures still be legible or detectable for humans of the far future? On the whole, their conclusion is not optimistic, suggesting instead that even vast subterranean mines and sites of underground nuclear weapons tests will fade from the terrestrial archive.
“Many of the physical and chemical products of human subsurface intrusion either do not extend far from the source of intrusion, lack long-term persistence as a signal or are not sufficiently distinctive from the products of natural processes to make them uniquely recognisable as of anthropogenic origin,” they write. “But the scope and complexity of the signals have increased greatly over recent decades, both in areal extent and with increasing depths, and seem set to be a fundamental component of our technological expansion. There will be some clues to the geologist of the far-future, when historical knowledge records may not be preserved, that will help resolve the origin.”
[Image: Nevada test site craters, courtesy of the National Nuclear Security Administration Nevada Site Office Photo Library.]
Nevertheless, it is totally fascinating to imagine what future archaeologists might make of Zoellner’s “huge cathedral-like space[s] inside the earth, ablaze with radioactivity,” long after they’ve collapsed, and where sand has been fused into unnatural glass and anomalous traces of radiation can still be found with no reasonable explanation for how they got there.
Could future archaeologists deduce the existence of nuclear weapons from such a landscape? And, if so, would such a suggestion—ancient weapons modeled on the physics of stars—sound rational or vaguely insane?
(Vaguely related: “fossil reactors” underground in Gabon.)
Bruce Sterling has thrown in the towel on his long-running blog, “Beyond the Beyond,” with an interesting farewell note. I’ve read his blog for ages; in fact, the fourth post I ever wrote here—a weird and, in retrospect, not particularly interesting riff on the possibilities of lunar 3D-printing—was in response to one of his posts.
I will quote his goodbye note at length here, but please feel free to read the whole thing.
Unlike most WIRED blogs, my blog never had any “beat”—it didn’t cover any subject matter in particular. It wasn’t even “journalism,” but more of a novelist’s “commonplace book,” sometimes almost a designer mood board. (…) Posting on the blog was a form of psychic relief, a stream of consciousness that had moved from my eyes to my fingertips; by blogging, I removed things from the fog of vague interest and I oriented them toward possible creative use. (…) I used to toss a lot of stuff into the blog that looked “funny,” but a lot of it was testing the very idea of significance. “Does this odd thing I found matter to anyone in any way whatsoever?” Will there be a public response of some kind to this? You can never get that response from a diary, a notebook, a studio corkboard. A blog, though, has an alternating current; so maybe some little meme will catch on and glow. (…) Throwing spaghetti at the wall of a blog to see if anything would stick, that actually kept my interest up, it was motivating. It wasn’t drudgery; I was willing to get up in the morning and do that, it seemed fun, life-enhancing. (…) I knew from the beginning that my weblog would surely cease some day, and I frequently warned readers that “blogs,” the “internet,” desktop computers, browser software and so forth, were all passing phenomena. They were indeed period artifacts, some with the lifespan of hamsters. The content of my blog “rotted” quickly too, since most things I talked about, or linked to, are long gone. (…) I was spreading myself thin, acting the dilettante, and commonly sticking my nose into scenes and situations that were none of my business. Often, I had little to offer, too, other than some quip and a link. But that was my good fortune; I chose the bohemian downsides, the life of archaic niches and avant-garde clutter; I preferred the dead factory and the palace attic.
And now he’s chosen other media entirely.
As the slower pace of posting here on BLDGBLOG over the last two or three years has no doubt clearly indicated, the pull of regular blogging—the urgency of it, the personal routine and daily discipline of writing online, the sense of audience, the faith that other people out there share these interests—has changed dramatically with the new internet, today’s cramped and disappointing version of online life that is now nothing but reaction GIFs and Donald Trump.
I remember hearing a story once when I was a kid about a guy who crashed his car out on a remote country road somewhere. He got pinned in place somehow, unable to move or call for help; his car’s tape deck was the kind that would auto-flip to the other side of the tape, play through to the end, then flip back over and do it all over again, in an endless loop. The guy allegedly spent like seven hours pinned in his still-running car, listening to Wham! the entire time, over and over and over again, with no way to turn it off. That’s what the internet feels like now, only it’s not George Michael, it’s Donald J. Trump and the Hydroxychloroquine Cure, and it’s enough to make anyone quit blogging.
Anyway, good luck, Bruce. Thanks for the nearly two decades of “Beyond the Beyond.”
(Update: Incredibly, that Wham! story seems to be true.)
I posted these on social media the other day, but I thought I’d include them here simply because of how much I love the casually jaw-dropping caption used for these over at the Library of Congress. This eerie pile of bricks looming over the desert, photographed back in 1932?
It’s nothing other than “Possibly the Tower of Babel,” or the “So-called Tower of Babel.” No biggie.
[Images: “Possibly the Tower of Babel” photographed in 1932; courtesy Library of Congress.]
As novelist Paul M.M. Cooper responded on Twitter, the site is still extant today. Iraqi-Dutch filmmaker Mohamed Al-Daradji, Cooper wrote, “used it as a backdrop for a memorable scene in his movie Son of Babylon.”
Here it is on Google Maps.
[Image: The “so-called Tower of Babel,” photographed in 1932; courtesy Library of Congress.]
The Library of Congress also refers to the site as an “extinct city,” which is a fabulous phrase, complete with its own “Watchman of the Ruins,” only adding to the mythic weight of the place.
[Image: “Possibly the Tower of Babel,” photographed in 1932; courtesy Library of Congress.]
Even better, I now have an excuse to post some paintings of the Tower of Babel, as seen through the lens of European art history…
[Image: “The Tower of Babel” (1595) by Abel Grimmer, via Wikimedia Commons.]
[Image: “The Tower of Babel” (1563) by Pieter Bruegel the Elder, via Fine Art America.]
Check out several more photos—including a later, color version—over at the Library of Congress.