The best far-looking SF is always rooted in the conflicts of the moment -- especially the things we think we will someday outgrow.
... the reason that it is important to include diverse characters and diverse voices in speculative fiction would be because the assertion “we’re all in this together” is not, in fact, a pure, shining, unimpeachable truth, handed down by the gods of speculative fiction for our enlightenment. The statement “we’re all in this together” is, instead, an ideological presumption which is not supported by most of the extant facts.
I'd put it more this way: "we're all in this together" is a dormant truth, one which can emerge in one of two ways: either as an evident fact of life, because we are all in fact in the same boat and pulling together; or as a grim specter, in which the connectivity of each to all is expressed despite us and not because of us.
The best bad-sounding record you'll ever hear.
The more polished and capable our artistic tools become, the harder it becomes to go for the gut -- or the throat. A black-and-white movie from the black-and-white era has a grit that isn't possible today, a grit that only comes from having no choice but to use black-and-white. Likewise, many of the albums from the early years of electronics in popular music somehow hit harder and cut deeper than their present-day counterparts, by dint of being the most made from the least. Bit-depth-reduction plugins and analog-synth recreation modules are one kind of sound, but they always bulk tiny next to the epic hiss and rumble of a Farfisa organ and a low-end drum machine fed through a plate echo. (See: the first Suicide album.)
As The Veneer Of Democracy Starts To Fade might well be the absolute first and last word in the power of low-fi as an artistic statement. It's not low-fi because it sounds cool; it's low-fi because that was all they could get their hands on, and also because any record whose mission is to tell you that the earth has been sold out from under your own feet can't afford to sound like a studio confection. The second attribute may be no more than a by-product of the first, but such by-products are part of how art becomes more than a technical affair. Tape hiss, leakage, overloaded vocals, overloaded keyboards -- overloaded everything, really; the meters probably all got stuck in the red the minute they hit PLAY -- it's one of the best bad-sounding records you'll ever hear.
Great writing and great cuisine, compared.
This week and the next one are likely to be slow, due to the holidays and some other stuff going on, but here's some (metaphorical) food for thought.
One of Paul Krugman's more interesting essays -- one I've cited before -- was about English cuisine ("Supply, Demand, and English Food"). His thesis was, in part, that a free-market economy is great at breaking down barriers and introducing things to places where they weren't before, but not so great at preserving the diversities introduced by such a system. "You may say that people have the right to eat what they want," Krugman wrote, "but by thinning the market for traditional fare [as opposed to mass-market foods], their choices may make it harder to find -- and thus harder to learn to appreciate -- and everyone may end up worse off."
On why both giving and interpreting good criticism are dying arts.
Criticism and advice are often wrong, but that merely shows heeding them is not a sufficient condition for becoming a better person, only a necessary one. Life is too short to learn everything by trial and error, so watching and listening to others is essential. A bias that critics are cretins leads to a life guided only by errors so great they cannot be ignored, an inefficient path to enlightenment.
If Diogenes were out and about today, lamp in hand, I have a feeling he'd not say "I'm looking for an honest man," but rather "I'm looking for a sensible critic."
Do your own thing, but don't let it fence you in.
If you enjoyed making a thing, and you’re proud of the thing you made, that’s enough. Not everyone is going to like it, and that’s okay. And sometimes, a person who likes your work and a person who doesn’t will show up within milliseconds of each other to let you know how they feel. One does not need to cancel out the other, positively or negatively; if you’re proud of the work, and you enjoyed the work, that is what’s important.
Don’t let the fear of not pleasing someone stop you from being creative. The goal isn’t to make something everyone will love; the goal is to get excited, and make a thing where something wasn’t before.
Two sides to this.
Is franchise-driven storytelling the default mode of storytelling from now on?
Is it enough to just make one movie anymore? In the wake of Marvel’s audacious world-building in an assembly line of completely indistinguishable adventure movies, the studios would answer no. What used to be one series of movies has become a web, one that involves various other series’ and offshoots of one particular brand. ... But no one should be surprised: every studio has been headed in this direction for quite a while now.
Observation 1: Only someone not looking very closely would think that The Wolverine (which I liked) is "indistinguishable" from Captain America (which I also liked), but they're completely alike in the sense that they're both a product of the same exude-by-the-yard-and-shrinkwrap-to-order mentality. The individual projects, and the individual creators, may not believe themselves to be part of such a thing, but once enough of their output clogs up the multiplex like so much foliage blocking a storm drain, it hardly matters. It's the long-term effects on diversity that are most deadly -- the stuff you never know is being crowded out in the first place because it doesn't have a chance to manifest on the same level.
You have to take yourself seriously enough to know when not to take yourself seriously at all.
When writing a novel (greatly overrated as a romantic and enjoyable activity, by the way) I always hit the buffers at some point and think the book is utter rubbish. The trick when this happens is to get less serious about what you're doing and recognise that one less novel in the world is not going to make a heap of difference. You're there to have fun and you hope that will communicate itself to the reader. So you take your eye off the ball for a bit, go for a walk, see friends or simply play.
One surefire way to enrage most any artist is to say to them, "Hey, I think you should take what you're doing a little less seriously." This, in the artist's mind, is tantamount to telling a mother, "You should give less of a damn about your kids." POW! Right in the jaw, or in the nethers if you're really unlucky.
We need dystopia to know how things can fall apart.
The gap between what we are and what we can be is also the space in which utopias are conceived. Utopian literature, at its best, may document in detail our struggle with personal and societal failure. While often constructed in worlds of excess and plenitude, utopias are a reaction to the deficits and precariousness of existence; they are the best expression of what we lack most. Thomas More’s book is not so much about some imaginary island, but about the England of his time. Utopias may look like celebrations of human perfection, but read in reverse they are just spectacular admissions of failure, imperfection and embarrassment.
... it is crucial that we keep dreaming and weaving utopias. If it weren’t for some dreamers, we would live in a much uglier world today. But above all, without dreams and utopias we would dry out as a species.
I'm big on the idea of the space we create with our imaginations as a kind of sandbox -- in both the sense of "playpen" and the sense of "protected code zone" -- where we can try out different things and see how they fly. Most of the time, though, such thought experiments seem to work best when they deal with how things fall apart. We read and think about our dystopias far more often than our utopias.
Man of Steel understands Superman well enough to know he should be taken seriously, even if it doesn't always quite know how to make that understanding real.
The problem with comic book movies is that it's too easy to give people what they think they want, instead of what they need. Man of Steel is a case study in such a contradiction. For the great majority of its running time, it's a dazzling and thought-provoking exploration of the Superman mythos, where tough questions are raised about what it would mean to be Clark Kent in a world that would almost certainly mistrust and fear him. Then it turns into a PlayStation game, mostly as a way to shut up all the fans who wanted to see Superman punch things, and while the fun didn't end there for me, it did get dialed down a whole lot.
Maybe it's just the Superman fan in me talking when I say the movie still works despite all that. But if there's one thing I've learned in my time as both a fan and a critic, it's that some of my favorite films are not the perfect ones, but the ones that struggle against their own very evident flaws and still somehow deliver. Man of Steel is flawed in ways that draw me back to it, because when seen in the right light those flaws are also revelatory. They say at least as much about our attitudes towards this sort of material as they speak of any shortcomings in the film itself.
The mysticism of the future by way of technology is no improvement over the mysticisms of the past.
... science fiction often leads the way in science, but that's surely compatible with keeping clear the distinction between serious theoretical inquiry and fantasy, and recognizing that the singularity theories exemplify the latter and not the former.
... I believe that the rise in popularity of singularity mysticism is symptomatic of our uncertainty with respect to the nature and future of artificial intelligence, and the fear that it has become increasingly important to our lives and yet beyond our control. Singularity theory has become popular in these conditions partly because there is no real alternative theory in the popular discussion for thinking about our technological condition, and insofar as it helps people understand their circumstances at all it is preferable to treating technological change as wanton and chaotic.
Plainer English: The default mode for thinking about where all our technological progress will take us is the Techno-Rapture, because it seems faintly silly to think of it any other way. Isn't it better to believe we'll just all go to heaven, and (in the words of Steven Spielberg) be handed a laser gun and a hovercar?
The cruel cost of the samurai code, across the generations.
"The problem with Japanese movies today," I said to someone else not long ago, "is that all their teeth have been pulled." The samurai movies of the 1960s and 1970s -- Goyokin, Shura, Samurai Rebellion, Hara-kiri, most any of Kurosawa's films -- were bold and nondoctrinaire, daring Japan to challenge its own image of itself, and with an unsparing view of how the samurai code of old was not only inhumane but counterproductive. Today, the mood is far more sentimental (When the Last Sword is Drawn) than it is confrontational. Only rarely do maverick productions like Gojoe, or Battle Royale for that matter, come along.
Tadashi Imai's Bushido, from 1963, belongs alongside all the rest of those cage-rattling samurai productions, and for more than one reason. It's a creative look at how the samurai code destroyed and stunted the lives of those who practiced it, following several generations of men from the same family down from samurai days of yore to the present day, where such a code incarnates itself as a deference to authority that takes away with both hands what it bestows with one. The film also works as a vehicle for one of Japan's most flamboyant and commanding actors of that period, Kinnosuke Nakamura, and one of the natural end results of watching this film ought to be seeking out most everything else he's been in.
On "I don't want to have to follow an artist that I have to lead."
People tend to want artists to do the same thing, and it is incumbent upon artists to do something that the audience doesn't want -- yet. I'll tell you this. I won't follow an artist who will be led by his audience. Because I don't want to have to follow an artist that I have to lead.
The comments about Silicon Valley aside (I use and make a living off this technology, but I see more and more every day why many creative people are embittered about it; that's another essay), it was this comment -- courtesy of Marc McKenzie, hat tip -- which caught my attention.
A new site opens up under my provenance; video games make a comeback in my life; and I ponder the future of my publishing system.
tl;dr: Interesting things afoot avec mi casa.
Busy week, not much time for blogging, and I'm heading into an even busier weekend -- big, big changes at Chez Genji in store -- but a few things worth discussing.
On appreciating the new without wearing the blinders of the old.
More from Professor Dijkstra. Pardon the longish quote:
It is the most common way of trying to cope with novelty: by means of metaphors and analogies we try to link the new to the old, the novel to the familiar. Under sufficiently slow and gradual change, it works reasonably well; in the case of a sharp discontinuity, however, the method breaks down: though we may glorify it with the name "common sense", our past experience is no longer relevant, the analogies become too shallow, and the metaphors become more misleading than illuminating. This is the situation that is characteristic for the "radical" novelty.
... Coming to grips with a radical novelty amounts to creating and learning a new foreign language that can not be translated into one's mother tongue. (Any one who has learned quantum mechanics knows what I am talking about.) Needless to say, adjusting to radical novelties is not a very popular activity, for it requires hard work. For the same reason, the radical novelties themselves are unwelcome.
Another case of yesterday's tomorrow, today.
I'll start with a quote, from A.K. Dewdney's The Magic Machine:
I can readily imagine the first full-fledged, feature-length motion picture generated by computer. The year is 2001. I stumble down the aisle while carrying an oversize bucket of synthetic popcorn and a soft drink containing a few additives that make all the usual ingredients unnecessary. The house lights dim, the curtains part, and the silver screen comes alive with an adaptation of J. R. R. Tolkien's The Lord of the Rings trilogy. Frodo the Hobbit strolls through an open glen. In the distance jagged, snow-capped mountain peaks thrust into the sky. In the foreground exotic trees and plants of unknown species shimmer in the sunlight. The scene changes to a wizard gazing into a crystal ball. In the center of the sphere a fortress appears, flames leaping from its battlements.
Although it is hard to say just how convincingly Frodo will walk and talk in such a film, I am convinced that the mountains, the plants, the crystal ball, and the flames will all come off magnificently. The success will be due largely to the pioneering software and hardware of a company called Pixar, formerly the Lucasfilm Computer Graphics Laboratory.
Dewdney wrote this in December of 1986.
He was only off by two years -- and if you count 2001 as part of the release window for the whole film cycle, and you substitute Gollum for Frodo (I guess he figured the whole thing would be CGI), then he was pretty close.
The big difference, of course, is that the movie we got is a mixture of live-action and CGI, rather than the all-digital PIXAR spectacle Dewdney imagined. Now there was something to mull over: what if it had been PIXAR and Disney that had booted up this project, rather than New Line, Warner Brothers, Peter Jackson, and WETA? Would a PIXAR / Disney(-fied) version of the project have been very different, tonally, from the version we did get?
Based on what Disney has since become -- I mean, come on, they just bought Indiana Jones, for goodness' sake -- I'm not sure it would have been all that different in the long run.
I wonder now what Dewdney thought about all this in retrospect.
E.W. Dijkstra Archive: On the cruelty of really teaching computing science (EWD 1036)
The usual way in which we plan today for tomorrow is in yesterday's vocabulary. We do so, because we try to get away with the concepts we are familiar with and that have acquired their meanings in our past experience.
This insight is a big part of why I'm convinced most any attempt to talk about "the future", especially in SF, is always going to be some form of talking about the here and now. When I wrote Flight of the Vajra I didn't really think the future I was imagining was the future we were going to have, or even a future we were likely to inhabit. It was a future, one I used more as a way to muse about where we're headed or even where we are right now. Such is the way of skiffy.
What I don't think we should ever do, though, is settle for only that. Today's tomorrow shouldn't look like yesterday's tomorrow if we can help it.
On the merits of talking critical smack.
On BuzzFeed's ban on negative reviews:
The usual insufferable tweedwads argue that literary criticism is a genre unto itself, its value residing not in the appraisal of the book so much as the context, scholarship and thematic exploration offered by the critic. Uh-huh. Sure. Go ahead, Margaret Atwood — make this about you.
The other silly argument is that a positive review is rendered meaningless if there is no possibility for a negative one. Oh, really? Ever see a hyperlink?
The single kernel of truth in their justification -- "Why waste breath talking smack about something?" -- is surrounded by so many acres of idiocy I'd need hip boots to wade over there. And it turns out that one kernel is, on closer inspection, a withered husk.
Me and my MT. (Is it "blog" or "bleargh"?)
I just upgraded to the latest version of Movable Type that I'm allowed to use, 5.2.9, and reconfigured it to run under FastCGI. The results are pleasantly snappy, enough that any difference in UI performance between this and WordPress is pretty much nitpicking.
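For the curious, here's roughly what "reconfigured it to run under FastCGI" amounts to: a minimal sketch of an Apache setup using mod_fcgid. The paths and directory names below are placeholders, not my actual setup; check MT's own documentation for the specifics of your install.

```apache
# Hypothetical Apache (mod_fcgid) snippet for running Movable Type's
# application scripts as FastCGI instead of plain CGI.
# "/var/www/cgi-bin/mt" is a placeholder path.
<Directory "/var/www/cgi-bin/mt">
    Options +ExecCGI
    AddHandler fcgid-script .fcgi
</Directory>
```

The general idea is that copies of MT's .cgi applications with .fcgi extensions get served through the fcgid handler, so the Perl interpreter stays resident between requests instead of being relaunched on every click -- which is where the snappiness comes from.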
On the whole, though, the long-term plan is to wean myself off Movable Type as soon as Ghost becomes a viable option. It's not that I dislike MT, but the most recent version of the program (and everything to come after it, apparently) really isn't designed for, or marketed to, individual bloggers like yrz trly. This used to be a major component of MT's sales pitch, but not any more. No more single-user or open source versions of the program, either: the least I'd have to pay is $595 for a 5-user license, four seats of which would most likely never get used! (Open can of paint, paint one door, throw out the rest?)
This page contains an archive of posts for the month of December 2013.
New York City
Other Lives Of The Mind