On things being their own reward.
Rudolf Flesch is the man who wrote Why Johnny Can't Read; some thirty years later he wrote a follow-up, Why Johnny Still Can't Read. His premise was simple: teaching reading by phonics, rather than what he called the "look-and-say" method, is valid and appropriate even for a language as allegedly inconsistent as English. In the second book, he mentioned his dismay over a motivational technique that involved essentially bribing kids for reading well. Reading, he argued, should be (and is) its own reward.
The idea that anything should be its own reward is actually a cornerstone of Zen as well. You don't sit zazen to get some nebulous reward sometime far off in the future; you do it because sitting zazen is its own accomplishment. Practicing zazen encourages you to believe that much more in what the present moment offers, which is really all there is. The past is fiction; the future is a nebulous promise. The more equipped you are to accept what is right in front of you now, the better.
Artists don't need to be damaged to be profound.
There's also this strange, unhealthy love of sick artists, not just in video games but in the wider art world: this idea that artists need to be insane or addicted or otherwise impaired to create great art. That's not to say those artists can't create great art (clearly, they can), or that their art is somehow lesser, but that more often than not those self-destructive tendencies interfere with their ability to complete projects that take more than one sitting.
Eric is, apart from being a good friend, one of the better writers I know on the subject of video games as culture -- and also video games as forum (as opposed to just form).
I've long wrestled with, and rejected, the idea that damage or sickness is a prerequisite of good art -- that the artist needs to be a screwed-up person in order for his art to be "genuine". The most obvious problem with this formulation is how it leads us to believe the reverse: that in order to become an artist, you have to get screwed up.
Sorry, no AnimeFest this year. "Flight of the Vajra" will come out, though.
Due to a whole bunch of stuff beyond my control, I've had to bow out of making my appearance at AnimeFest this year. That was where I had planned to debut Flight of the Vajra, but there's nothing that says it has to happen there. It's just what my plan was, and life is what happens while you're off making plans.
The book itself will be released in the next week or so, and you will hear all kinds of screaming about it right here when that does happen. I'm also going to be aggressively exploring some other, and what I hope to be far more directly fruitful, avenues of promotion both for that book and the rest of my oeuvre.
This isn't a slap at AnimeFest or any convention at all, really, but I think I'm learning a hard lesson about the ways to promote one's own work. Cons are great if people already know you; if you aren't a known quantity, then they all too easily become a place to get lost in the crowd. I thought that by building a consistent presence at a particular con I could get a foothold of sorts, but it doesn't work that way: the turnover in the crowd is too great, for one.
Also, my posts may continue to be slow for the next couple of weeks. I'm about to start a new job (more on that later), and I need to get caught up before that officially kicks in.
On criticism vs. reviewing.
I have been reading Literature of the Lost Home by Hideo Kobayashi, a literary critic of Shōwa-era Japan who has no reputation worth speaking of outside his country. It's a shame, because this little volume has more quotable lines and more nuggets of insight worth mulling over per page than most anything I've encountered since my last tangle with Edmund Wilson.
There's a lot to mine from this book, and I plan to do that over a succession of posts, but one of the main insights gleaned from it is how most people no longer think of criticism as being distinct from reviewing. To review a movie means to assess its likely appeal to a given audience; to criticize a movie means to look at it in a penetrating and thoughtful way. Roger Ebert was the rare sort to write both kinds of texts. His first look at a movie was a review; his "Great Movies" columns were criticism.
Music for the inner cinema, and all that.
We've put up a playlist for the music that goes with Flight of the Vajra.
On why too much advice to writers is mere marketing advice.
Such was the advice for success given to the young Dustin Hoffman in The Graduate. I mentioned before a best-selling SF&F author (I won't name him here) who has given writer's advice of the same flavor, and which in the end was indistinguishable from marketer's advice.
Among his tips was something that amounted to "don't be literary" -- use small words, words that everyone knows, so others will want to read you. It's difficult to dismiss the immediate flush of annoyance -- no, disgust -- that I felt when reading that.
Okay, sure -- you write to be read, and you write for the audience that's likely to read you. But there's no mistaking the reek of anti-intellectualism there, not least because it's not the length of your words that makes what you write literature, but the insight and the perspective behind them. (See: Hemingway.)
What's more, such advice is a slap in the face to the impulse that drives many people to become a writer in the first place. The power of words is what swept us up, and what compels us to sweep others up as well -- and yes, you can do that without being pretentious. Telling a writer not to be literary is tantamount to telling a musician not to be musical.
I'd rather have a scrupulous intellectual opponent than an ally with dodgy thinking.
The other day, I took a site I read frequently off my list of RSS feeds. It's not something anyone is likely to know about -- it's this little blog curated by a guy who's a self-styled SF&F author and whose opinions about a number of things are more or less congruent with mine. (No, I won't link to it here.)
I didn't delist him because I vehemently disagreed with what he had to say. I took him off my list because while I'm technically on the same side as he is on a number of issues, I couldn't for the life of me stand the way he defended them. I'd rather have someone who is an honest, well-defended, and scrupulous proponent of a viewpoint I don't agree with than someone who is a shabby proponent for something I do agree with.
On the fallacies of attention-getting in the "going viral" age.
What every economist, and for that matter every writer on any subject, needs to realize is that unless you are a powerful person and people are looking for clues about what you’ll do next, nobody has to read what you write — and lecturing them about what they’re missing doesn’t help. You have to provide the hook, the pitch, whatever you want to call it, that pulls them in. It’s part of the job.
I'm going to risk putting my foot through my keyboard and suggest, at the risk of oversimplification, that there are two kinds of writers: those who have a knack or an inclination for being a vigorous promoter of their work, and those who don't.
"We have to take human behavior the way it is, not the way we would wish it to be."
Avoid the engineer's and economist's fallacy: don't reason your way to a solution -- observe real people. We have to take human behavior the way it is, not the way we would wish it to be.
The author is Donald A. Norman (he of The Design of Everyday Things), who has written extensively about and done tons of research on human-computer interaction. He knows his material the way Linus knows the Linux kernel.
A car is not just for sitting in, and SF&F aren't just chewing gum for the mind.
In Daniel M. Pinkwater's Alan Mendelsohn, the Boy from Mars (it's a great book, go read it already), there is a wonderful moment when one of the characters explains why the use of psychic powers for boorish mischief does such terrible harm. I don't have the book here in front of me, so I'll have to paraphrase.
Imagine (he says) you lived in a world where there was no such thing as an automobile, and then one day you stumbled across a fully-restored Studebaker Lark, all gassed up and ready to go. You'd invite all your friends to come and marvel at this strange wonder, but you wouldn't know it was a machine with the power to take you from place to place. Instead, you'd think the function of the Studebaker Lark was to sit in the front seat and play the radio. You'd pretty much have missed the point, right?
I've come to call this attitude Studebaker Sitting; I bothered to coin a label for it only because I see it so often. Mostly I see it in creative terms, and specifically, I see it in SF&F shirking its birthright.
What we call "the market" reflects more the behaviors of a few, not many.
... what does it say about our society that it seems to generate an extremely limited demand for talented poet-musicians, but an apparently infinite demand for specialists in corporate law? (Answer: if 1% of the population controls most of the disposable wealth, what we call “the market” reflects what they think is useful or important, not anybody else.) ... A world without teachers or dock-workers would soon be in trouble, and even one without science fiction writers or ska musicians would clearly be a lesser place.
Emph. mine. The article is absolutely worth a read for its main focus -- why do so many of us work jobs we not only hate but subtly sense are worthless? -- but these couple of sentences especially caught my attention because of what I bolded there.
Regular readers of these pages know I've long suspected that a major reason why media enterprises tend to be so unadventurous is that it's always easier to sell people an incremental variant on yesterday's experience than it is to do the real work of selling them something new. Human psychology is at work on both ends of the equation: the readers pay lip service to the idea of the new more than they do the new thing itself, and the publishers are just manifesting their own incarnation of that behavior.
Why DC and Marvel are stuck in a taste trap of their own making.
The insular mentality [of comic book companies] remains. By and large the philosophy is still to create almost exclusively for the audience that’s already here or the one that used to be here. Women couldn’t possibly like superheroes (despite the gads of evidence to the contrary). Children would never buy superhero comics (despite the booming kids and all-ages comics market and kids’ almost-unanimous love of superheroes). When they’re asked why they don’t try harder in these areas, they say that they’ve tried in the past and they just never work out. Why don’t they work out? Because, no matter how well-meaning, they have usually ended up being sabotaged on some level. Budgets are miniscule, or start off reasonable and then vanish when there isn’t instant success. Almost always, the marketing is done to the same audience who has steadfastly resisted reading anything beyond superheroes or similar male-targeted fantasy/adventure. Why expect anything beyond a small percentage of crossover? ... DC and Marvel largely don’t know how to market outside the superhero audience, and when they do usually give it such a miniscule budget that penetration is minimal. Conventional wisdom would say to hire a marketing firm that does know how to reach the target demographic, but of course that requires money.
Does this sound familiar? To my ears, it does: it's the same problem I've touched on before about how it's easier to sell what you know to the people you know than it is to sell something new to anyone at all. Why? Because the latter requires, oh my gosh, work and money. More than that, though, it requires risk, and that's the one word you never want to speak out loud in front of a publisher lest you have them whip out the garlic and crosses.
My space opera "... Vajra" has finally been put to bed, and some things were learned in the process.
Four drafts and over 360,000 words later, Flight of the Vajra is finally done and off to the printer's. (The e-book version will need some more work and will debut later in September.)
I do not think I have ever worked this long or this hard on any one project that I have actually finished, and it has left me with a couple of convictions. First is that I was right for sticking to my guns about not doing sequels: as much as I loved writing about this universe and its people, it's now a closed book for me, literally and figuratively. I don't think I could go back into it even if I wanted to.
Why, as a fan, sometimes it's best not to get just what you want.
One of the best comments for this article:
My fervent wish as a fan: give me something I didn't know I wanted in the first place.
This, to me, is the reason to appreciate most anything creative. Not to get more of what you already know you want, but to be introduced to something you never knew you wanted, and to want more of it.
It happened to me with Kurosawa -- I knew almost nothing about Japan or Kurosawa when I saw Ran, but after I walked out of it I wanted to know every damn thing there was to know about both the country and the director. I had never known I could have such curiosity about something, anything.
The same thing happened with Koyaanisqatsi. If I hadn't encountered that movie at an age when my understanding of films was still relatively plastic -- that is, that they didn't have to be any particular kind of thing, that they could be as open-ended and all-encompassing as
One of the things I've pounded on ceaselessly in these posts is the need for people to get out of their experiential and prejudicial bubbles -- especially if they consider themselves creators. It's too easy to get comfortable, and worse, to never know just how much that comfort is holding you back.
Not likely to be posting much for the next couple of weeks, as work and the final preparations for Flight of the Vajra have devoured my attention. Look for me on the other side of September.
More on the general avoidance of discussions about spirituality in futurism.
While on the lead-up to the release of Flight of the Vajra, I've talked a lot about the way most of our talk about the future is technological and not social or personal -- that we don't feel like the people we'll be in a hundred years will be appreciably different from the people we are right now. Vajra suffers, I think, from much of that as well, but at least I tried to take that awareness with me into the heart of the book and do something with it, instead of just letting it overwhelm me or lay traps.
The only way we know what a future humanity can be like is by becoming something else, even if only a little at a time. The people we are now as opposed to a hundred years ago are strikingly different -- not just in terms of what we know, but what we're inclined to do, what we want out of our world, and what we're prepared to do to get it. I would like to think we've become incrementally more humane over time, even if there is still violence in our lives, and that we are on the whole less beholden to superstition and compulsive stupidity. How we go about getting to that also makes a lot of difference.
What happened to the cool future we all imagined? Maybe it wasn't all of us that imagined it, or wanted it.
My good friend Steven Savage riffed on The Loss of Cool Futurism: Disunity after a discussion on our part about how something like the sciento-optimism of OMNI Magazine would be a no-show today.
Like Steven and a lot of other nerdy kids of the '80s, I was an OMNI reader, and I read it for just about every damned thing that was jammed between its gorgeous covers: the fiction, the speculative pieces, the journalism about the way science touched everything from art to human behavior. OMNI, by the way, is now being rebooted, and is available on the Internet Archive until someone yells at them to take it down. I suspect that's one of the fastest ways to find out who actually owns it, since there's some dispute there.
And like Steven, I dug the way OMNI posited a future that was by default better than the one we had. We were getting our first little whiffs of the future courtesy of the personal computers only just then poking their digital little noses into households. Never once back then would I have entertained the idea that all this would be, could be, seen as a terrible pain in the ass. The biggest shock that came to my naïve little self back then regarding anything in OMNI was discovering the same folks who put out OMNI also put out Penthouse.
This can-do attitude stood in stark contrast to the cultural skepticism many people had manifested about science for some time -- since at least the Fifties and Sixties, which was the last stretch of time when it was possible to take seriously slogans like "Better Living Through Chemistry".
On the ongoing publishpocalypse.
... a healthy book industry is a diverse one, in which it’s possible for a talented author to knock on several doors before resorting to self-publishing. The more gatekeepers, the better the odds ...
Too bad the book industry is going in the opposite direction, as the editorial points out. Penguin and Random House have since merged, and the combined company now accounts for roughly a quarter of the global trade-publishing market.
It's tempting to suggest these big outfits will create sub-imprints that target specific markets, in essence recapitulating the way the big labels acquired indies and let them run their own A&R, simply handling the distribution. But I suspect it'll be more akin to the way George Lucas's predictions about the multiplex went terribly wrong. Instead of having art films and blockbusters cheek-by-jowl in the same building, we ended up with back-to-back showings of the same lousy movies -- except in 2D, 3D, and glorious IMAX.
How to be out standing (sic) in your field.
Krugman's column the other day quoted Raymond Chandler, in a way that brought back to mind what I've been thinking about re: the profundity problem.
Other things being equal, which they never are, a more powerful theme will provoke a more powerful performance. Yet some very dull books have been written about God, and some very fine ones about how to make a living and stay fairly honest.
For those who missed it earlier, I once came across Tibor Fischer talking about the problem of profundity in the arts. Or rather, the problem of profundity in certain artists. It's something you either have or you don't, and striving for profundity just makes you look belabored and silly.
And a big part of why you either have it or you don't, I think, is how you look at the things you're drawn to. As Chandler hinted, you can be drawn to very grand subjects but find that you have little to say about them, because you simply don't see anything that isn't already visible to innumerable others. On top of that, you haven't found a container for expressing what you see that is also compelling by itself.
I run into this problem a lot with fledgling writers, who "just want to tell a good story". And while that by itself is fine, they always seem to believe that thinking too deeply about what they're doing will ruin it -- an analogue of what I've called Cutting The Drum Open To See What Makes It Go Bang. But the other side of that is not engaging at all with what makes a story resonant in the first place -- what compels people to go back and read it again, or even better, stand in line for when your next production comes out.
I'll have more to say about this later, as I have the final bit of Flight of the Vajra edits to put to bed this weekend.
A culture of free cannot be sustained by an economy of free.
... what I’m proposing is that finance, and indeed consumer Internet companies and all kinds of other people using giant computers, are trying to become Maxwell’s demons in an information network. ... [W]ith big computing and the ability to compute huge correlations with big data, it becomes irresistible. ... [But] what’s wrong with that is that you can’t ever really get ahead. What you’re really doing then is you’re radiating waste heat. I mean, for yourself you’ve created this perfect little business, but you’ve radiated all the risk, basically, to the society at large. And if the society was infinitely large and could absorb it, it would work. There’s nothing intrinsically faulty about your scheme except for the assumption that the society can absorb the risk.
Lanier's views have inspired a lot of ire from people I know -- they react to him in the ways that, say, talking heads on TV react to bad news about anthropogenic climate change: by getting angry, or attacking him personally, or just claiming his ideas don't make any sense. And even while I disagree with the way he proposes to solve some of the problems, I agree with his diagnoses of the underlying issues. The more you build an economy around diffusing risk instead of creating things, the more you shift costs into places where they are not detected until they become impossible to sustain.
Why citing "data" as your justification can be no less arbitrary than "Because I said so."
If you want a classic example of the Folly of the Quants, look no further than Microsoft and Windows 8.
For those of you who don't know the story, Microsoft ditched the classic Start Menu from Windows 8 and replaced it with a full-screen menu that makes people want to put their fists through their monitors. Me included. This was bad enough, but they then tried to justify this nonsensical decision by citing user-behavior telemetry that allegedly showed the Start Menu just wasn't used all that much.
Lies, damned lies, and statistics, said I. Sure, you could say that nobody uses the Start Menu, therefore let's ditch it -- which sounds to me more like a justification in hindsight than anything else. But what about the few people who do use it? I know I get tons of use out of it, pinning to the Taskbar notwithstanding. I'm reminded of similar stats about readers: not a lot of people actually read, but those that do read, read voraciously.
I've written more neutrally about this issue elsewhere, but this is my personal opinion: Using all this to justify changing over to a UI that is predominantly organized for touch was a bad idea. Touch UIs are not solely the problem; lousy motives are. (And Windows 8.1 goes some distance towards ameliorating these issues.)
How something classifies as "original" for us may be just as arbitrary as whether or not we like it in the first place.
With a special "long trailer" in theaters and out on the web for Elysium (a tactic I'm coming to associate with films that are an uneasy sell for mainstream audiences, but more on that later), there's some rather weird fan backlash circulating. Not to say that fan backlash is by itself some exotic circumstance -- look at how much of that we got for Man of Steel -- but the way it manifests in each case is weird. In this case, it's something to the effect of there being nothing original here, that director Neill Blomkamp is just repeating himself anyway, and that he isn't all that and a bag of chips in the first place.
I find this attitude bizarrely parochial, the kind of popularity-contest and my-dad's-bigger-than-yours thinking that dominates too many online forums.
If Blomkamp makes a good movie, a genuinely good movie -- he has before, and I hope he has again -- what difference does it make if he's doing so by incrementally refining a set of ideas he's been interested in all along? Granted, it helps for any creator to broaden their focus over time and try different things, but we're only talking about the second feature film in a director's entire career here. (Plus which, shouldn't we judge the quality of the movie by, you know, seeing the movie, and not by looking at the advertising?)
This page contains an archive of posts for the month of August 2013.