Just got these in the mail….
Very exciting, in the way that only a vanishingly small number of grinding, attention-demanding tasks can be.
Here's the cover for the contemplative computing book:
Little, Brown spent a lot of time on it, and I think they've managed to communicate a lot in a very small, challenging medium. They were also really good about explaining the design choices, making clear what they thought worked, and accommodating those changes I thought would improve it (or explaining why they would be hard to implement).
So the machine chugs along, and we get one step closer to having a finished book on the shelves!
Evgeny Morozov's review of several new TED books-- pamphlets, really-- is one of the greatest things I've read in a long time. You know you're in for a wild ride when the opening paragraph starts like this--
Only the rare reader would finish this piece of digito-futuristic nonsense unconvinced that technology is—to borrow a term of art from the philosopher Harry Frankfurt—bullshit. No, not technology itself; just much of today’s discourse about technology, of which this little e-book is a succinct and mind-numbing example.
--and then gets vicious.
Most of the review focuses on Parag and Ayesha Khanna's ebook Hybrid Reality. Apparently the Khannas once accidentally ran over Morozov's dog in their Range Rover, and didn't stop because they were too busy dishing dirt to News of the World about Morozov's mother. Or so I gather, because nothing less would explain the review.
Remember the creatures in Aliens who bleed concentrated acid? That's what comes to mind when you read this.
[A]ll the features that the Khannas invoke to emphasize the uniqueness of our era have long been claimed by other commentators for their own unique eras.... What the Khannas’ project illustrates so well is that the defining feature of today’s techno-aggrandizing is its utter ignorance of all the techno-aggrandizing that has come before it. The fantasy of technology as an autonomous force is a century-old delusion that no serious contemporary theorist of technology would defend.
What's it say about TED? Nothing good, I'm afraid:
I spoke at a TED Global Conference in Oxford in 2009, and I admit that my appearance there certainly helped to expose my argument to a much wider audience, for which I remain grateful. So I take no pleasure in declaring what has been obvious for some time: that TED is no longer a responsible curator of ideas “worth spreading.” Instead it has become something ludicrous, and a little sinister.
Though I have to confess that it felt like he was getting dangerously close to describing some of the work I've done with this paragraph:
[O]ne can continue fooling the public with slick ahistorical jeremiads on geopolitics by serving them with the coarse but tasty sauce that is the Cyber-Whig theory of history. The recipe is simple. Find some peculiar global trend—the more arcane, the better. Draw a straight line connecting it to the world of apps, electric cars, and Bay Area venture capital. Mention robots, Japan, and cyberwar. Use shiny slides that contain incomprehensible but impressive maps and visualizations. Stir well. Serve on multiple platforms.
And the bit about how the Khannas and Tofflers are both "fast-talking tech-addled couple[s] who thrived on selling cookie-cutter visions of the future one paperback, slogan, and consulting gig at a time" sounds like kind of a good gig. If you can do it in a more intellectually responsible way, of course.
This week I read John Lanchester's new novel Capital, about life in London during the great financial collapse of 2008.
I thought it was a great read, though not because of its great pacing or high drama or characters you're cheering for. It's more like an Impressionist crowd painting, a set of brilliantly-rendered scenes and personalities and moments, not a story that drives to a decisive conclusion. About 200 pages into it, I started thinking, This is great, but with all this buildup, it had better end with Queen Elizabeth on a velociraptor, on top of Big Ben, striking down zombies with nunchuks.
Not to spoil it, but no Queen Elizabeth, no zombies, no velociraptor. (Though one of the characters does like dinosaurs.)
Still, if you want a book that paints a picture of one of the world's great cities sans velociraptors-- and especially if you've spent time there, and perhaps intersected very peripherally with the sorts of characters that populate the book-- Capital is terrific.
I sent off the revised draft of my book last Friday, and celebrated this weekend by watching the end of the Tour de France.
the book is back, via flickr
It was great to see an Englishman win the tour (Britain's investment in cycling is paying off, as John Kay notes), and it was also cool to see someone win who was so clear about how much his victory was a team achievement. Yes, Wiggins gets to wear the yellow jersey, but as he himself acknowledges, he stands on the shoulders of his teammates.
I found myself juxtaposing this with Penelope Trunk's recent essay about self-publishing her book. The piece, a long post on her Brazen Careerist blog, argues that traditional publishers don't know anything about their markets, that they take too long to get stuff out, and that you're better off doing it yourself. It was really striking to me because both in scope and substance it's so different from my recent (or current) experience.
home office, california style, via flickr
First of all, Trunk's account of the publishing industry is all about production and distribution; the work of shaping and editing books is invisible. To me, though, this is about 90% of the value that the publishing industry offers. Fourteen months ago, give or take, I had a very very different idea for a book about contemplative computing. That book might have fit well with an academic press, but it wasn't the book I really wanted to write. I was lucky to have an agent who pushed me to think more commercially without giving up my intellectual bona fides or the ambition of explaining to ordinary users how our deep entanglement with technology shapes us. I was also really lucky, once I'd produced a manuscript, to have an editor who could work with me to tune it up, and who insisted (in that totally self-effacing way most book editors have) on making it more accessible and useful.
Another important way in which our experiences contrast is that Trunk describes books as calling-cards, as a way of introducing to the public who you are and what services you have to offer. Now, this is totally in keeping with the Tom Peters "Brand of Me" way of seeing the world, and I had professors at Wharton who talked about how their books were really just ways of attracting clients, so clearly there are authors who either genuinely feel that a book can play this role, or see reasons to talk about it this way. For me, though, writing this book has been pretty transformative, and I have a hard time imagining starting something this hard with the assumption that there won't be a big personal payout at the end.
it's about ME! via flickr
I'm probably going to experiment with some digital self-publishing in the coming year, though I wouldn't call what I'm going to create electronic books-- more like electronic pamphleteering, or digital broadsheeting. A "book" feels like a different proposition than a highly illustrated, expanded version of a talk. Indeed, it's not just a different proposition, but a promise to readers that the object they're getting has been through a more rigorous kind of review and publishing process.
bytes, via flickr
Indeed, the only way I would self-publish a "book" would be if I could hire editorial talent as strong as Zoë and John, and I'm not sure I'd want to take on the risk of investing that much in a book. It's possible that I could find equivalent talent in the freelance editorial market, but I quite like the idea that lots of other people at Little, Brown share the risk with me, and have an incentive to help the book be a success.
Just as important, I don't want my relationship with an editor to become more transactional. As John Kay recently pointed out, the financial services industry worked best for investors and companies when it was more trust-based; in today's world of super-fast transactions and massive bets, there's less interest in building trust, because you tend to assume that you'll be rich and retired within a couple years. I don't need intellectual relationships that are more transactional. Indeed, I think those two things are polar opposites. Frictionless, transactional relationships are mindless (in Ellen Langer's use of the term), and can just as easily succeed as win-lose games; meaningful relationships involve trust and struggle, and only succeed when both parties succeed.
stay, via flickr
I see tremendous benefit in having a team of people who are invested in your victory, like Team Sky was invested in Wiggins' taking home the yellow jersey. If all you're doing is a straight-on transaction, something you know how to do and really can do on your own, then maybe the self-publishing model works; but the way I write books requires a team.
Charles Pierce's review of Ross Douthat's Bad Religion (shorter version: the Sixties sucked) is a master class in how to take apart a book in a manner that respects the subject, but gives the author the flogging they deserve. This may be my favorite part:
[N]owhere does Douthat so clearly punch above his weight class as when he decides to correct the damage he sees as having been done by the historical Jesus movement, the work of Elaine Pagels and Bart Ehrman and, ultimately, Dan Brown's novels. Even speaking through Mark Lilla, it takes no little chutzpah for a New York Times op-ed golden child to imply that someone of Pagels's obvious accomplishments is a "half-educated evangelical guru." Simply put, Elaine Pagels has forgotten more about the events surrounding the founding of Christianity, including the spectacular multiplicity of sects that exploded in the deserts of the Middle East at the same time, than Ross Douthat will ever know, and to lump her work in with the popular fiction of The Da Vinci Code is to attempt to blame Galileo for Lost in Space.
Fantastic. As good as Adam Gopnik's epic takedown of The Matrix, Reloaded. It's made more impressive by the fact that you get the sense that Pierce really knows what he's talking about. Here are two very different lines that each in their way are quite illuminating:
He describes the eventual calcification of the sprawling Jesus movement into the Nicene Creed as "an intellectual effort that spanned generations" without even taking into account the political and imperial imperatives that drove the process of defining Christian doctrine in such a way as to not disturb the shaky remnants of the Roman empire. The First Council of Nicaea, after all, was called by the Emperor Constantine, not by the bishops of the Church. Constantine — whose adoption of the Christianity that Douthat so celebrates would later be condemned by James Madison as the worst thing that ever happened to both religion and government — demanded religious peace. The council did its damndest to give it to him. The Holy Spirit works in mysterious ways, but Constantine was a doozy. Douthat is perfectly willing to agree that early Christianity was a series of boisterous theological arguments as long as you're willing to believe that he and St. Paul won them all....
[Douthat is] yearning for a Catholic Christianity triumphant, the one that existed long before he was born, the Catholicism of meatless Fridays, one parish, and no singing with the Methodists. I lived those days, Ross. That wasn't religion. It was ward-heeling with incense.
So wonders David Frum in his great review of Charles Murray's new book Coming Apart.
The book, as far as Frum is concerned, has several problems. It starts in the wrong place, with the bugaboo 1960s rather than earlier, which Frum argues would yield a rather different and more accurate perspective. It says nothing about the rise of manufacturing in Asia, which is a very big thing if you want to understand what happened to working classes in America.
It also blames the 1960s for our current social and cultural ills, which in turn are responsible for the white working class' decline. But Frum replies, "once you spell out the implied case here, it collapses of its own obvious ludicrousness."
Let me try my hand:
You are a white man aged 30 without a college degree. Your grandfather returned from World War II, got a cheap mortgage courtesy of the GI bill, married his sweetheart and went to work in a factory job that paid him something like $50,000 in today's money plus health benefits and pension. Your father started at that same factory in 1972. He was laid off in 1981, and has never had anything like as good a job ever since. He's working now at a big-box store, making $40,000 a year, and waiting for his Medicare to kick in.
Now look at you. Yes, unemployment is high right now. But if you keep pounding the pavements, you'll eventually find a job that pays $28,000 a year. That's not poverty! Yet you seem to waste a lot of time playing video games, watching porn, and sleeping in. You aren't married, and you don't go to church. I blame Frances Fox Piven.
How you can tell a story about the moral decay of the working class with the "work" part left out is hard to fathom.
As he explains, the gaps aren't just details: they go right to the heart of Murray's argument.
To understand what Murray does in Coming Apart, imagine this analogy:
A social scientist visits a Gulf Coast town. He notices that the houses near the water have all been smashed and shattered. The former occupants now live in tents and FEMA trailers. The social scientist writes a report:
The evidence strongly shows that living in houses is better for children and families than living in tents and trailers. The people on the waterfront are irresponsibly subjecting their children to unacceptable conditions.
When he publishes his report, somebody points out: "You know, there was a hurricane here last week." The social scientist shrugs off the criticism with the reply, "I'm writing about housing, not weather."
This is the kind of book review I love to read, and never want to be the subject of!
Probably the only time in history I'll be mentioned with Lady Gaga and Glee: Publishers Weekly's article on what books American publishers are selling at next week's Frankfurt Book Fair.
Look under the listings of Hachette, the conglomerate that owns my publisher, Little, Brown.
(And so long as I live, I swear I'll never get tired of saying "my publisher, Little, Brown"!)
No longer will readers be able to chuck a third free book onto their pile of purchases as they head to the till at Waterstone's: the UK's biggest bookseller is bringing its long-running three-for-two offer to an end.
The Bookseller reports that staff were told of the move yesterday, with the current three-for-two promotion across all paperback fiction to come to an end today. The demise of the famous offer, which has been running for more than a decade, follows the sale of the chain by HMV to Russian billionaire Alexander Mamut, and the appointment of independent bookseller James Daunt as managing director in June.
In the course of reorganizing my home office (I'm starting serious work on my next book, on contemplative computing), I've made another cull of my book collection. Like last year, I've got several large boxes of books that are duplicates, books I read long ago, or books I'm honestly never going to read. Some are in pristine condition (alas), while others are annotated. I'm looking to give them away to someone who can use them-- preferably a grad student in history of science / STS or some related field, but that's highly negotiable.
Many of them are really outstanding, but the trajectory of my professional work is such that they're no longer really relevant for me, and I've got more books coming in the door every day. Here's a picture of many, but not quite all, the books I'm giving away.
Go to flickr for a full size version
The terms of the giveaway are:
If you're interested, contact me at askpang at gmail dot com.
I've spent the last few weeks working on a book proposal around contemplative computing. It's been a great, absorbing experience, so naturally an article on the growing respectability of self-publishing would catch my eye.
With Bowker reporting an "explosive growth" of 169% last month in "non-traditional" publishing, it's not just vanity projects that are taking the self-publishing route these days. Amazon announced last week that John Locke had sold 1,010,370 Kindle books using Kindle Direct Publishing, making him the first self-published author to join the "Kindle Million Club", alongside the likes of Stieg Larsson and James Patterson. Meanwhile, self-published authors Louise Voss and Mark Edwards currently top Amazon.co.uk's Kindle bestseller list, and say they're selling up to 1,900 copies a day of their jointly-written thriller, Catch Your Death. Faulkner award-winning author John Edgar Wideman last year chose to publish his new collection of short stories through Lulu.com; the site, offering authors an 80/20 revenue split, has published over 1.1 million authors to date, adding 20,000 titles to its catalogue a month. Writers around the world are getting their books to readers – and getting paid for it – without a publisher standing in between. Self-publishing, it seems, is becoming respectable.
Many of the authors this Guardian article talks about are established writers with fan bases: their name recognition means that they're going to be sought out by readers, and don't have to compete as hard as first-time authors.
So what's changed recently? According to one author who's selling a lot online,
"Two major developments have had a hugely beneficial impact on self-publishing. Firstly, changes in technology, in particular the adoption of ebooks by the mainstream thanks to Amazon's Kindle, the iPad, etc... If you're a self-publishing author today, you have a vast audience waiting, and a decent number of professional channels through which you can easily make your work available. I personally know authors who are doing this to great effect – some are making over $10,000 every month! Secondly, the advent of social networking has had an incredible effect."
Word of mouth matters a lot for both printed and electronic books. And as another author makes clear, for professional writers, this isn't just about disintermediation, or being free of the shackles of the editorial process: to be a success in the self-publishing market, you need to
"Write the best book you can, hire a real editor to make it better. Have it professionally copy-edited to remove typos. Get a real cover artist – if you're not a professional artist, don't do your own cover. Get that book into ebook form. Start promoting, and start on your next book. Repeat, repeat, repeat."
So essentially self-publishing is "create your own virtual publishing house."
A couple weeks ago, I picked up a copy of Michael Crichton's Next, and on a whim read it tonight. (I should have been reviewing proofs of an article, but so be it.)
Rarely do I think to myself, "I need to finish this book so I can throw it away," but with about fifty pages to go, when it was clear that the only remaining drama in the book was whether the Dog the Bounty Hunter-style character who was trying to get the DNA from an 8 year-old boy would be thwarted by a super-smart African grey parrot whose gusto for one-liners and movie quotes rivals the combined talents of the judges on Project Runway, or the human-chimp mutant boy who had previously defeated a group of skateboarding middle school thugs by throwing poop at them, I thought, "This is not staying in the house." Even though we spent much of the weekend rearranging the living room to fit a fourth bookcase, thus creating a genuinely Wagnerian Wall of Texts (or is it Phil Spectorian?), I knew I'd never keep it.
Indeed, given that the book has about six different story lines that only manage to come together in the most absurd ways possible, one might say that the book itself is an illustration of the dangers of breeding different kinds of species: in this case, it manages to combine the narrative coherence of Pirates of the Caribbean: At World's End with the structure and logic of a Dave Barry novel. With the Pirates movie, there came a point where I thought to myself, I'm watching a movie featuring a guy with an octopus face, and that last plot twist made me think, 'Boy, is that improbable.' That should never have happened. With Next, I just couldn't believe that the guy who wrote Jurassic Park could publish such a mess. It's the worst final performance since Peter Sellers' unwatchable Fiendish Plot of Dr. Fu Manchu.
Though I love this description of Crichton from the New York Times review of Next:
Though the moment may lack the inherent gravitas of Harriet Beecher Stowe’s encounter with Abraham Lincoln, or even Elvis Presley’s private audience with Richard Nixon, surely history should reserve a special place for the day in 2005 when Michael Crichton was invited to the White House to meet with George W. Bush. Imagine: the modern era’s leading purveyor of alarmist fiction, seated side by side with Michael Crichton.
Ah, the New York Times.
Yesterday was the 150th anniversary of the publication of Darwin's Origin of Species. Happy birthday, evolution! (Alfred Russel Wallace, meanwhile, is nowhere to be found.)
And yes, this is the world's worst Stephen Colbert impression.
From The Long Goodbye:
There are blondes and blondes and it is almost a joke word nowadays. All blondes have their points, except perhaps the metallic ones who are as blonde as a Zulu under the bleach and as to disposition as soft as a sidewalk. There is the small cute blonde who cheeps and twitters, and the big statuesque blonde who straight-arms you with an ice-blue glare. There is the blonde who gives you the up-from-under look and smells lovely and shimmers and hangs on your arm and is always very very tired when you take her home....
There is the soft and willing and alcoholic blonde who doesn't care what she wears as long as it is mink or where she goes as long as it is the Starlight Roof and there is plenty of dry champagne. There is the small perky blonde who is a little pal and wants to pay her own way and is full of sunshine and common sense and knows judo from the ground up.... There is the pale, pale blonde with anemia of some non-fatal but incurable type. She is very languid and very shadowy and she speaks softly out of nowhere and you can't lay a finger on her because in the first place you don't want to and in the second place she is reading The Waste Land or Dante in the original....
And lastly there is the gorgeous show piece who will outlast three kingpin racketeers and marry a couple of millionaires at a million a head and end up with a pale rose villa at Cap Antibes, an Alfa Romeo town car complete with pilot and copilot, and a stable of shopworn aristocrats....
The dream across the way was none of these, not even of that kind of world. She was unclassifiable, as remote and clear as mountain water, as elusive as its color.
Something tells me she's going to cause Philip Marlowe some trouble.
Actually, this list would make a brilliant Facebook quiz: What Kind of Raymond Chandler Blonde Are You?
This is a good example of what a book review should be: an essay that's worth reading for itself, not just for what it tells you about the book in question.
You may not have noticed, but our cities are changing. As Anna Minton shows in her excellent new study, Ground Control: Fear and Happiness in the 21st-century City, the development of Canary Wharf in the 1990s blazed a trail that is now being followed in cities across the UK, creating privatized, personality-free zones stripped of any historical or cultural uniqueness. These hi-tech “defensible spaces” are promoted as being “clean and safe”. But they are also sterile and soulless. Pat, a hairdresser who has lived on the Isle of Dogs for 37 years, says of Canary Wharf today: “I don’t like going there. It always gives me the fear.”
"I know that history is going to be dominated by an improbable event. I just don't know what that event will be." (Nassim Taleb, The Black Swan, p. 154)
It's conventional wisdom that groups generate ideas and plans more moderate than those of individuals. Groups and discussion encourage compromise, smooth out extremes, and guarantee moderation. It is also one of the unspoken assumptions of facilitation and group-oriented scenario work. Facilitation and scenario-building, the thinking goes, builds a sense of collective spirit by helping groups develop a shared vision of the future.
Going to Extremes... finds that sitting people down to deliberate does not necessarily lead them to compromise or to converge on their mean opinion. They tend to radicalize in the direction of whatever bias they had to begin with. Teams of doctors, deciding collectively, are more likely to support the "extreme" strategy of heroic efforts to save terminally ill patients than the average individual doctor among them. Juries tend to vote, after discussion, for much more "extreme" monetary awards than the average individual juror among them would. Talking things over isn't necessarily wrong. But it doesn't lead reliably to moderation, either....
Much of Sunstein's evidence about how people drift to extremes comes from his studies of groups that already have a bias to begin with. Individual Democrats and Republicans on three-judge panels cast more "extreme" votes when they are in the majority than when they are not. A group of conservative Republicans in Colorado Springs will move sharply rightward when they discuss global warming among themselves, and a group of liberal Democrats from Boulder will move sharply leftward.
These homogeneous groups are not the special cases they would appear. They tell us something about what happens in more heterogeneous groups, too. If you bring the two clashing sides together, they don't find middle ground any more than like-minded people do. Each side digs in. If you give "a set of balanced, substantive readings" to a group that is at loggerheads over abortion or affirmative action, Sunstein shows, each side simply mines the readings for support of its own position. Ideology, it turns out, is not just a matter of opinions or positions—it is a predisposition to receive some kinds of evidence and not others. Compounding the problem, certain kinds of extremist arguments have an "automatic rhetorical advantage" in deliberation. Me, too, but less is harder to rally behind than In for a penny, in for a pound.
The question this raises is whether the facilitation methods that futurists use tend to encourage moderation, or exacerbate this problem. Do scenarios tend to force people to think together, and recognize that complex issues can't be solved through simple means? Or does the intellectual and imaginative freedom that thinking about the future provides encourage groups to project their own extremes?
Add this to the list of insights from psychology-- along with the work of Daniel Gilbert, Daniel Kahneman, Philip Tetlock, et al-- that futurists need to consider when thinking about how to improve their work.
Maciej Ceglowski, creator of Wrong Tomorrow (our motto: time vs. pundits), may be my new favorite writer. Here he is on Kundera's Unbearable Lightness of Being:
One of the terrors of dating is Milan Kundera, and specifically, The Unbearable Lightness of Being, the sexually-transmitted book that this Czech-born author has inflicted on a generation of American youth.
I fully recognize the important role of the dating book, that is, the carefully selected work you lend a prospective lover sometime in the golden honeymoon period between your second cup of coffee together and the first time you spend a night in the same bed without touching. In that short window of time, your partner is still a delicious mystery to you, an enigmatic and discerning being, and to her you are a dark continent of adventure and excitement, waiting to be explored. And so you lend her books that are funny, playful, and good subway reading, but also complex enough to hint at your Hidden Depths. Something unusual is a plus, as are lots of sexy bits, to serve as a reminder of the animal fires that burn within. And since you don't yet know one another too well, you try to choose a shotgun of a book that fires a wide pattern, thematically speaking. Like an early physicist studying the atom, you will hurl little bits of culture at your new love and collect valuable data about her inner life by observing the way they bounce off.
Given these requirements, it's not surprising that many people have gravitated towards The Unbearable Lightness of Being.... The problem, though, is that The Unbearable Lightness of Being is a really bad book. Milan Kundera is the Dave Matthews of Slavic letters, a talented hack, certainly a hack who's paid his dues, but a hack nonetheless. And by his own admission, this is his worst book.
The idea of new people being like atoms at the Cavendish, to be understood through indirect and oblique probes (Ernest Rutherford was widely acknowledged as the sexiest of the early 20th century's experimental physicists); the Dave Matthews comparison; the assault on a book so well-regarded that Daniel Day-Lewis was in the movie. Gold.
Actually, Wrong Tomorrow would be a great motto for a futurist: "Right Today, Wrong Tomorrow."
From Metropolis, an essay on "Tracking the Future" that describes a recent book on new urban infrastructures.
The 50-year arc of engines and batteries puts us right on the cusp of viable clean-power transit. The computation and flexibility necessary to make better use of the energy feeding the electric grid are already available; they’re the same technologies keeping cell phones going for days on a single charge. And telecommunications itself is slowly but steadily having a noticeable effect on how and when we use energy, whether through the reduced need for office space because of flexible work locations, the creeping advance of videoconferencing, or even the use of online social networking to buttress face-to-face interactions. It’s not as if we can’t imagine what a viable future might look like (even if it is just as easy to summon a picture of total collapse).
What’s harder to grasp is the inherent flexibility of this new infrastructure. With The Infrastructural City, Varnelis, an architectural historian and the director of Columbia University’s Network Architecture Lab, set out to update Reyner Banham’s 1971 book, Los Angeles: The Architecture of Four Ecologies. The major difference is that where Banham saw in Los Angeles’s unplanned urbanism a logic that could be instructive, Varnelis views it as a city in perpetual crisis—a victim of its own infrastructure. The freeways are perpetually clogged. The wildfires burn faster the more they are suppressed. “Infrastructure is no longer a solution,” Varnelis writes. But he really means the old infrastructure, those masterworks built according to a plan....
The emerging infrastructure is different. Varnelis describes it as something multiple and shifting: “networked ecologies,” plural “infrastructures” that are “hypercomplex” and as likely to consist of legal mechanisms and barely visible cell-phone networks as the heavy stuff of tunnels and bridges. Inherently less apparent than the infrastructure that came before, they’re also as likely to be owned by corporations as by governments—meaning these networks can’t really be controlled, only “appropriated” according to their own logic. With traditional planning made impotent by capitalism and NIMBYism, rebuilding the city now requires a “new type of urbanist,” a designer Varnelis compares to a computer hacker who reimagines a new use for the underlying rules and codes.
A fellow Moleskine enthusiast points me to a fabulous Moleskine art project-- the beautiful Alchemy Notebook, an imagined medieval notebook. Kind of a visual hybrid of Calvino's Invisible Cities and Neal Stephenson's Baroque Cycle.
Since we moved into our house in 2001, we've used part of the garage as a home office. Actually, functionally speaking much of the house is a home office at one time or another, but my desk and books are in the garage. Some of my books, at least: I've long had more books than is good for me, and not enough space for them, so at least half of them have been in a storage shed or the Institute. (An occupational hazard: my father and stepmother have a two-story octagonal library in their house, and have also filled the basement with books!)
I've long dreamed of having enough space for all my books. A couple weekends ago, we went to Ikea and bought some shelving. We bought it right before I went to Europe, so we didn't get it assembled before I left; but on Saturday we got it built. Finally, I've got space for all my books. I've got to put two rows on each of the shelves, but I've had to do that since Berkeley, so I'm used to it.
my daughter alphabetizing books, via flickr
So now I have bookcases and working space on three sides: the armoire, the new tall bookcases beside those, and the short white bookcases forming the other arm of the U. Heaven.
my son in my new intellectual control center, via flickr
I'll spend the next few days happily alphabetizing the books, then figuring out the ideal way to arrange them around me. Actually, I'm not likely to ever find an ideal system; I'll keep reorganizing them forever, as projects come and go.
Update: A Finnish friend informs me that the design for the Ikea bookcases I just bought is, shall we say, an homage to bookcases long sold by a Finnish company, Lundia. Their Web site doesn't seem to have an English section, but their designs-- particularly their chairs-- look edgier than most Ikea furniture these days. Maybe the difference is that Ikea design, for all its Swedish origins, is now a generic global modern, manufactured in and designed to appeal to buyers in China and Copenhagen alike, while Lundia's is more purely Finnish.
Last night I finished the new Alan Furst book, The Spies of Warsaw. Furst is one of my favorite living authors: I choose his books as dinner companions when I travel, and his work is something of a reference point for me. (For those who don't know Furst and his work, this is still a good introduction.)
I thought his last book, The Foreign Correspondent, was very entertaining, but had a bit too much of familiar characters and places for my taste. The problem is that Furst has built up a remarkably rich fictional universe-- imagine JRR Tolkien or Terry Pratchett without magic-- in which places have a lot of resonance and meaning, and part of the pleasure of reading his work is learning more about it. Imagine going to a city you already like and discovering a new cool neighborhood, another excellent restaurant, and becoming a bit more comfortable with the subway: a trip in which you see only familiar sights can be very nice, but lack the pleasure of surprise. (Now that I think about it, the books of his that I reread the least, Dark Voyage and Blood of Victory, take place on the periphery of that world-- maybe too far.) So the challenge is to keep expanding that universe, while throwing new light on the familiar parts of it.
Spies of Warsaw manages to hit a very nice balance between familiarity and novelty. There are a couple secondary characters who we meet originally in The Polish Officer or The World at Night, whose back-stories are fleshed out. The main characters are new, and most of the action takes place in Warsaw (where Furst's earlier books haven't spent much time), or Germany; Paris makes an appearance, but it isn't as big a character as it is in some of his other books.
The stakes are also clearer and higher in this book. Without giving too much away, the central character becomes aware that the Wehrmacht is trying to figure out how to conduct blitzkrieg operations through forests-- which suggests that Germany is going to try to attack France not by throwing itself against the Maginot Line, but by going through the Ardennes. Normally, Furst's characters risk their lives for very uncertain stakes: unless they're trying to save a loved one, they rarely know if the operations they're involved in will make any difference at all to the war. (The recycling of Furst's characters runs the risk of making World War II seem like something that was fought by about fifty people; but having his characters operate in worlds that have completely uncertain, and often very ambiguous, outcomes helps create a sense that you're watching just one of a million little parts of the war, not the central figures whose actions secretly determine its course.) They're also more war-weary in this book. Maybe it's because several of them are French veterans of the trenches of World War I; or maybe it's harder to write a book about war these days without thinking that your characters would be more scarred, and simultaneously more hardened and fearful.
Furst has to write a book set mainly in Budapest now.
If you're in Philadelphia in the next couple months, check out the "Textual Spaces: an Architecture of Reading" exhibit at the University of Pennsylvania:
As we seek to understand the way in which the act of reading is defined by its material constraints, our line of questioning necessarily extends to the spaces in which reading takes place. Where do we read? And how do those places affect our reading? To answer these questions is to move toward an architecture of reading. To place a book within the rooms of a house or public space shifts the significance of historical context from background to foreground. Just as the material constraints involved in the process of printing, binding, and selling books arguably shape the attitudes of readers, so do their physical surroundings add to the shape of their reading experiences.
20 years ago, Russell Jacoby published The Last Intellectuals, on the rise and fall of the public intellectual in 20th-century America. He has an op-ed piece in the Chronicle of Higher Education that reviews the book, public (or at least academic) reaction to it, and how the argument has stood up.
The piece is worth reading, if only because it nicely lays out his argument:
I offered a generational explanation for what I saw as the eclipse of younger intellectuals. Why in 1987 had the same intellectuals dominated for more than 20 years, with few new faces among them? Why was it that the Daniel Bells or Gore Vidals or Kenneth Galbraiths seemed to lack successors? Professionalization and academization appeared to be the reason. Younger intellectuals were retreating into specialized and cloistered environments.
Earlier 20th-century thinkers like Lewis Mumford and Edmund Wilson kept the university and its apparatus at arm's length. Indeed, they often disdained it. They oriented themselves toward an educated public, and, as a result, they developed a straightforward prose and gained a nonprofessional audience. As his reputation grew, Wilson printed up a postcard that he sent to those who requested his services. On it he checked the appropriate box: Edmund Wilson does not write articles or books on order; he does not write forewords or introductions, does not give interviews or appear on television, and does not participate in symposia.
Later intellectual generations, including, paradoxically, the rebellious 60s cohort, do give interviews; do write articles on demand; and most evidently do participate in symposia. They grew up in a much-expanded campus universe and never left its safety. Younger intellectuals became professors who geared their work toward their colleagues and specialized journals. If this generation — my generation! — advanced into postmodernism, post-Marxism, and postcolonialism, where the Daniel Bells and Lewis Mumfords never trod, it did so by surrendering a public profile.
The book is still well worth reading, I think.
[To the tune of Bill Evans Trio, "What Is This Thing Called Love?," from the album "Portrait in Jazz".]
Commenting on recent court cases over whether bookstores need to turn over records of book purchases to law enforcement authorities, Peter Brantley at O'Reilly Radar makes an interesting point:
[R]eading is tuning into a series of digital transactions, transitioning from a private matter of solitary, silent reading into an inherently social act suitable for data mining. Indeed, the fascinating historical work of Paul Saenger demonstrates how the revolutionary change wrought in the early Medieval Ages by the Arabs and the Irish of separating words with spaces and punctuation to ease the understanding of translated Latin texts enabled silent reading, which in turn created modern expectations for privacy in the matter of what we read and think....
So we all must then inquire of publishers building online digital text libraries, and Microsoft and Google with their online books corpora: what happens when the police and courts of the state come to you? : Are you prepared to respect and reassert in a digital age -- an age in which the act of reading is inherently recordable -- the individual's control of privacy that has been maintained over the last 700 years? The alternative is to begin a retreat to the sunken expectations for the disclosure of our thoughts and writing that echo with eerie fidelity the cloistered labyrinths of the oral culture of 1200 AD -- a world far more inimical to free expression.
[To the tune of The Church, "Under the Milky Way," from the album "Starfish".]
To say that Anthony Grafton has a "brilliant essay" in the latest New Yorker is a bit like saying that John Woo has directed an "action-packed movie:" in both cases, the adjective is superfluous, because their work is always like that. Grafton, a professor at Princeton, is unquestionably one of the smartest historians practicing today, and writes mainly on Renaissance and early modern intellectual history.
His New Yorker piece is on digitization and the quest for the universal library, and it nicely shows how a deep knowledge of the history of books and ideas can be used to help understand the future of new media.
Google’s [book scanning and library] projects, together with rival initiatives by Microsoft and Amazon, have elicited millenarian prophecies about the possibilities of digitized knowledge and the end of the book as we know it. Last year, Kevin Kelly, the self-styled “senior maverick” of Wired, predicted, in a piece in the Times, that “all the books in the world” would “become a single liquid fabric of interconnected words and ideas.” The user of the electronic library would be able to bring together “all texts—past and present, multilingual—on a particular subject,” and, by doing so, gain “a clearer sense of what we as a civilization, a species, do know and don’t know.” Others have evoked even more utopian prospects, such as a universal archive that will contain not only all books and articles but all documents anywhere—the basis for a total history of the human race.
In fact, the Internet will not bring us a universal library, much less an encyclopedic record of human experience. None of the firms now engaged in digitization projects claim that it will create anything of the kind. The hype and rhetoric make it hard to grasp what Google and Microsoft and their partner libraries are actually doing. We have clearly reached a new point in the history of text production. On many fronts, traditional periodicals and books are making way for blogs and other electronic formats. But magazines and books still sell a lot of copies. The rush to digitize the written record is one of a number of critical moments in the long saga of our drive to accumulate, store, and retrieve information efficiently. It will result not in the infotopia that the prophets conjure up but in one in a long series of new information ecologies, all of them challenging, in which readers, writers, and producers of text have learned to survive.
Grafton argues that efforts to create universal libraries, and efforts to create personal tools for working with and making sense of ever-larger bodies of information, are as old as the written word itself. Further, as big as the projects that Google, Amazon and Microsoft have undertaken, they're still not likely to create a "universal library" that includes all the kinds of physical media-- from early books to letters to architectural models-- that make up the world of knowledge. Finally, though, Grafton argues that the future isn't one in which databases replace books and archives, but one in which they coexist:
these streams of [digital] data, rich as they are, will illuminate, rather than eliminate, books and prints and manuscripts that only the library can put in front of you. The narrow path still leads, as it must, to crowded public rooms where the sunlight gleams on varnished tables, and knowledge is embodied in millions of dusty, crumbling, smelly, irreplaceable documents and books....
For now and for the foreseeable future, any serious reader will have to know how to travel down two very different roads simultaneously. No one should avoid the broad, smooth, and open road that leads through the screen. But if you want to know what one of Coleridge’s annotated books or an early “Spider-Man” comic really looks and feels like, or if you just want to read one of those millions of books which are being digitized, you still have to do it the old way, and you will have to for decades to come. At the New York Public Library, the staff loves electronic media. The library has made hundreds of thousands of images from its collections accessible on the Web, but it has done so in the knowledge that its collection comprises fifty-three million items.
In a way, this isn't a new argument: the "books and electronic resources will complement each other, not compete" vision isn't unique to Grafton, though he does do an especially good job making it. (I suppose you might call the piece unoriginal, but if it is, it's unoriginal the way a Gil Evans Orchestra cover of Jimi Hendrix's "Little Wing" is unoriginal: Evans didn't write it, but he definitely took it places Jimi never imagined.)
My review of Stuart Clark's The Sun Kings: The Unexpected Tragedy of Richard Carrington and the Tale of How Modern Astronomy Began is in the latest issue of American Scientist.
It was a good book, but to be perfectly honest, it was one of those reviews that the editor took apart, rearranged, and greatly improved. So equal credit on this one should go to Flora Lewis.
Thanks to Bill C. for letting me know it was out!
SQUIDPUNK: THE MANIFESTO
Fiction that unlike New Weird, Steampunk, or Slipstream, is at its core not only about squid, but about the symbolism of squid as color-changing, highly-mobile, alien-looking, intelligent ocean-goers. As a powerful ecosystem indicator, the squid is a potent symbol for environmental rejuvenation. Squidpunk is almost exclusively set at sea and must contain some reference to either cephalopods or to anything that thematically relates to squid, in terms of world iconography and tropes. Squidpunk is never escapist or whimsical. It is always serious and edgy. This combination of a hard punk aesthetic with the fluid propulsion system common to the squid has produced a unique literary hybrid beloved by Mundanes and Surrealists alike.
[To the tune of Russ Ballard, "Voices," from the album "Anthology".]
From Eamon Duffy's essay in the New York Review of Books, March 29:
Early Christianity was more than a new religion: it brought with it a revolutionary shift in the information technology of the ancient world. That shift was to have implications for the cultural history of the world over the next two millennia at least as momentous as the invention of the Internet seems likely to have for the future. Like Judaism before it and Islam after it, Christianity is often described as "a religion of the Book." The phrase asserts both an abstraction—the centrality of authoritative sacred texts and their interpretation within the three Abrahamic religions—and also a simple concrete fact—the importance of a material object, the book, in the history and practice of all three traditions....
Our modern book form, the codex, in fact evolved from the ancient equivalent of the stenographer's pad, bundles of wooden tablets linked with string hinges and coated with wax, on which information could be jotted with a stylus (often in shorthand). When the information was no longer needed, the wax could be heated and smoothed, and the tablets reused. The first papyrus and (especially) parchment books of pages were recyclable in just the same way, folded and stitched bundles written on with soluble ink that could be washed off to leave the pages blank again. To inscribe the words of Holy Scripture on such jotting pads would demean its sacred character and authority....
Why should the new religion have adopted this down-market and unfashionable book technology? The codex, it is true, has obvious practical advantages. Being written on both sides of the page, it is more economical than the roll, it can be readily indexed, it can be leafed through quickly to find a particular place, and it is more robustly portable. But these practical advantages, which certainly contributed to its eventual adoption as the normative form of the book, do not adequately explain the early Christians' exclusive preference for the form, even for their copies of the Jewish scriptures, which must of course have been transcribed from rolls. Historians have speculated that difference from Judaism may have been the point—that the codex was adopted to distance the emergent Church from its origins within the religion of Israel, or perhaps in an attempt to signal that its foundational texts were indeed a sort of sacred stenography, the living transcript of apostolic experience, taken from the mouths of the first witnesses.
My wife is now on her way from Cambridge to Hamburg, to spend the weekend with friends before flying home next week. Before she left, though, she got copies of Harry Potter and the Deathly Hallows.
Apparently, it was quite the scene.
Most of the people here seem to be adults and teen-age girls. There are a few little kids, some look under 10, and I am not quite sure what the point of that is, although the fellow who just came by had a brilliant Harry Potter costume on, he looked just like the young Harry Potter – but should he be up this late getting the last book?
It should be no surprise that many adults have academic robes to use for this in Cambridge. What is a surprise is how many children have them. Did they get them just for this?
The family that dresses up together, stays together
[There's] a large group of very small boys, they look like they can't be older than 8. They are dressed as a Quidditch team, they look very cute, but they will be so tired tomorrow.
12:50.... I walked past Waterstones. The line went out the door, and all the way down the street past the gates to Sidney Sussex College. It was amazing.
Also, one of the exchange programs had a bunch of students who wanted copies, but the program has a strict curfew; so they agreed to send some of the tutors out to buy copies for all the kids, and bring them back to the college.
I really need to reread volume 6 before too long. I hardly remember any of it.
[To the tune of Keith Jarrett, "Vienna, Pt. 1," from the album "Vienna Concert".]
A few days ago, I got Dan Simmons' new book, The Terror: A Novel. It's based on the ill-fated Franklin Expedition, which set out from England in 1845 in search of the fabled North-West Passage linking the Atlantic and Pacific Oceans. Despite having two of the better cold-weather ships in the Royal Navy, and years of provisions, the party disappeared into the Arctic. The ships were trapped in ice, Franklin himself died, and after two years, the ships were abandoned. So far as we know, all 120-odd members of the expedition died.
Nothing was ever heard from the expedition (by Europeans, anyway-- apparently, the ships were almost a tourist attraction for the Inuit, and David Woodman's Unravelling the Franklin Mystery: Inuit Testimony gathers voluminous material documenting local knowledge of the expedition), which makes it a great platform for an historical thriller. And when one of the ships is actually called Terror (the other was Erebus), it's inevitable that someone writes about them.
I find much of Simmons' work quite compelling, both in an emotional sense-- reading Song of Kali made for a very disturbing afternoon-- and an intellectual one-- his Ilium and Olympos, which replay the Homeric epics on Mars, on the base of Mount Olympus (with a big dose of The Tempest stirred in), are wonderfully audacious. (Imagine Stephen King rewriting the Old Testament, with an emphasis on all the really gory stuff.) The Terror is a bit like The Difference Engine or Stephenson's Baroque Cycle, in that all begin with real historical events and people, but then spin off into these weird alternate universes.
Some of the books' recurring themes are a bit disturbing. A few of Simmons' books have, shall we say, a complicated relationship with Catholicism. Characters you spend a lot of time with have a way of getting horribly mutilated or killed. The societies he conjures are defined principally by their relationships with some awful monster or threat: to paraphrase Freud, the discontents of the civilizations in Simmons' books tend to be things that decapitate their victims before sucking the still-warm marrow out of their bones. The result is a world-view that's equal parts Thomas Hobbes and H. P. Lovecraft.
Still, Simmons is unquestionably a brilliant writer, and I have to respect anyone who thinks on such a big scale.
[To the tune of Ludwig van Beethoven, "Sonate no. 22 op. 54 in f gr.t., ," from the album "Piano Sonatas Vol. 2".]
Technorati Tags: science_fiction
I didn't realize, but it's Towel Day:
Towel Day is a day when you carry around a towel all day to commemorate the late, great Douglas Adams, author of The Hitchhiker's Guide to the Galaxy.
[To the tune of Earth, Wind & Fire, "Can't Hide Love," from the album "Earth Wind & Fire: Greatest Hits".]
My daughter and I just started Terry Pratchett's Wintersmith, the third in his series of young adult novels featuring witch-in-training Tiffany Aching. We'd already read The Wee Free Men and A Hat Full of Sky, the first two books about Tiffany, and my daughter enjoyed them both.
At first, she mainly enjoyed the Nac Mac Feegle, creatures who are a cross between fairies and the Mark Wahlberg character in The Departed, with heavy Scottish accents thrown in for good measure. She still likes them, as do I; but I think she's also becoming much more interested in the character of Tiffany, and her development. I love the Feegle, but Tiffany is the most interesting character Pratchett has created since Sam Vimes, the policeman who figures prominently in a number of the Discworld books.
Doubtless there are groups who decry the books as Bad For Children, but in Pratchett's world, witchcraft is about 5% supernatural, and 95% work, responsibility and social networking. The witches who dress like Stevie Nicks ca. 1978 and spend lots of time on the occult are always bested by the witches who wear boots and listen carefully to village gossip. So in the long run, I think reading the books probably drives down the odds of girls eventually joining a coven or getting into Wicca.
And for me, reading these books is a pleasure because Pratchett is one of my favorite authors, and one of the few I'm likely to be able to share with my children. He's a real pleasure to read aloud, and it'll be years before my daughter is old enough to read Alan Furst or William Gibson (much less Neal Stephenson or Dan Simmons). For the foreseeable future, Pratchett will be a common literary reference point for us, and a genuinely literary one: you don't read a Discworld novel with the movie adaptation superimposed on your imagination, as you do when you pick up a J. K. Rowling book.
[To the tune of Elton John, "Crocodile Rock," from the album "Greatest Hits".]
At long last, there's a Britannica Firefox search plugin. Good for them.
On the other hand, Britannica's short contributor biographies used to note when the author had died; those notes have been taken out. Granted, encyclopedias like to think of themselves as existing in a kind of timeless, Platonic dimension of Truth, and writing for Britannica is a form of literary immortality. But noting whether an author had passed on to the Great Library in the Sky used to provide a rough sense of how old the article was; and the only reason to remove the notation is to take away that measure. And the only reason to do that is to make it harder to figure out how old (and perhaps how out of date) an article might be.
I understand the instinct, but when it's so easy to find information about just about anyone, and when every Wikipedia article carries a nauseating amount of metadata, it's a dumb thing to do.
Technorati Tags: encyclopedia
While I was soaking in the bath, I realized something: I've gotten into the habit of taking long, hot baths when I travel, largely because at the beginning of Alan Furst's Kingdom of Shadows, the main character, the Hungarian Nicholas Morath (traveling under a diplomatic passport), has a long soak in his girlfriend's tub after returning to Paris from a trip to Budapest.
How many habits do we have that are partly literary references? When I go hang out and work in cafes, I often think of a line in Point Counter Point in which a character is described as renting a cafe table for the price of tea and a sticky bun. I wonder if there are others.
[To the tune of Paul McCartney & Wings, "The Long And Winding Road," from the album "Wings Over America (Disc 1)".]
One of the most gripping stories about the end of cyberspace involved the overthrow of books, and more generally of print culture, by the Internet and e-books. Depending on what side you were on, this was either a technological inevitability, or a sign of the end of all things Great and Good.
There are lots of ways you can measure how wrong these predictions turned out to be-- the book industry has certainly had its share of structural adjustments, and some high-profile closures of independent bookstores-- but one suggestive measure is John Miller's study of America's most literate cities.
The thing that grabs my attention is that Seattle and San Francisco, two of the centers of software and new media in the United States, rank among the top 10 most literate cities in America. They're also in the top 10 cities for magazine publishers. Finally, they rank #2 and #1, respectively, in per capita concentration of bookstores. Of course, both cities have a long tradition of serious literary endeavors, strengthened by the presence of large universities and student populations, and a (now almost-defunct) combination of cultural richness and relatively low cost of living that attracted all kinds of interesting countercultural types (a phenomenon dissected in John Markoff's really great book, What the Dormouse Said). So it's not entirely surprising that there would be a correlation between high literary ranking and tech concentration; arguably, the former is an (at least indirect) attractor for the latter.
Christopher Buckley is one of my favorite writers. Having gone to college on a scholarship from tobacco giant Philip Morris, Thank You for Smoking, Buckley's story of moral triangulation and defense of the rights of minors to get lung cancer, really spoke to me. His review of Tom Clancy's Debt of Honor (his Japanese characters aren't "one dimensional, they're half dimensional") was a work of genius.
So naturally his essay in Washington Monthly is really great.
[To the tune of Gipsy Kings, "Hotel California (Spanish Mix)," from the album "¡Volare! - The Very Best of the Gipsy Kings".]
This morning I finished reading Alan Furst's latest book, The Foreign Correspondent. Furst is one of my all-time favorite writers (along with Terry Pratchett), and so when I was packing for the trip, his new book was an obvious thing to take along.
I discovered Alan Furst a couple years ago in the Penn bookstore, when I was in Philadelphia for a conference, and needed something to read in the evening. The covers caught my eye, and I bought The World at Night and The Polish Officer more or less on a whim, and I still reread both. I often travel with one of his books now: they're easy to slip into, and just the thing to read when I'm having dinner at some cool Polish restaurant in London, or soaking in the tub at the end of the day.
For those who don't know him, Furst's novels are all set in Europe, in the 1930s and early 1940s. He rarely ventures past 1942 or so, in part because the outcome of the war was still uncertain then, and Furst's novels are studies in decent but not superheroic people-- journalists, petty nobility, tramp steamer captains, waiters-- trying to keep their balance in universes defined by uncertainty. Often they're on the losing side of the struggle against fascism-- they're emigres who've been lucky enough to make it to Paris, or officers or diplomats from the smaller countries unlucky enough to be located between Germany and Russia-- and are forced by circumstances into intelligence work. The personal is political, as they say.
But while all of them eventually end up covertly fighting the Axis, none of them is ever responsible for the key act that turns the tide in World War II: they smuggle weapons, carry secret plans, do the occasional exfiltration or rescue, but it's often not clear what difference their actions make. In part, the struggle itself is the point in Furst's books; but I think he also is making the point that modern history, and especially a titanic event like World War II, doesn't boil down to such turning-points. Total war pits economies and societies against each other, and his characters are but small cogs in those machines. At first I found this confusing, and a bit irritating; now, however, I find it completely convincing, and rather like the fact that the reader has to share the characters' confusion about whether the tide will turn, and their sacrifices will matter.
If Furst's characters never really know whether what they've risked their lives to do makes a difference to the war, however, you (the reader) know that sooner or later, they're going to end up in Paris, in one or another setting familiar from one of the earlier books. Furst clearly loves Paris; he can't stop writing about it. Most of The Foreign Correspondent takes place there, in the tangled world of Italian emigres and anti-fascist exiles. Paris is at once the City of Lights that beckons to everyone, and a thousand little mutually exclusive worlds of Balkan emigres and exiles, Soviet emigres, Polish emigres, and so on. It's at once universal (both in its attractions and irritations) and highly particular. And after about ten books covering the same territory, he's now created a whole little alternative Paris-- or rather, a real Paris with an extra layer of fiction on top-- so I can understand why he wouldn't want to tear it up.
The recycling of characters is a bit more problematic. Almost every new book now includes cameos by characters who were in earlier books. In the earlier books, this made a lot of sense, because it gave Furst a chance to fill in back-stories; but now I have mixed feelings about it. In The Foreign Correspondent, Carlo Weisz spends an evening with several people who were central in The World at Night and Dark Star; it's not a bad or unbelievable scene (Furst specializes in characters who travel in very wide circles, so it's not surprising that people from different books would meet up), but you learn nothing new about Nicholas Morath or Andre Szara or anyone else, and you do have to put up with the story of the bullet-hole in the mirror behind Table 14 yet again.
This is a particular shame because there are some characters-- major ones like Count von Polyani, and minor ones like Lady Angela Hope-- who are fascinating and opaque, and could do with some fleshing out. If these scenes revealed a new aspect of a familiar character, they'd be very cool; but often they feel more like attempts to extend the franchise, making sure that the inventory in the House of Furst is put to good use.
Interestingly, the people who see the world in black and white are among the most dangerous: namely, the Communists. Furst's books make the appeal of Communism in the 1930s comprehensible: all the other major political parties and figures-- Churchill excepted, but he's only a quote in the newspaper-- are constitutionally incapable of standing up to fascism, even as they see Hitler marching through central Europe. The Communists, in contrast, are willing to take action and take losses: they know what they're up against, and don't screw around. But Furst isn't romantic about them, especially in Dark Star and Night Soldiers, the two books featuring prominent Soviet characters. The Communists aren't noble, just every bit as ruthless as Hitler and Mussolini. (Americans are almost completely absent in these books, but America comes out looking really good, given the horrors of the Axis and the cultured ambivalence of Western Europe.)
As much fun as the Paris books are, I'd love for Furst to broaden out more. Dark Voyage was off the map, and an excellent book as a result. I'd like to see him write a book set mainly in Berlin-- his style is well-suited to the menacing yet cultivated atmosphere of that city in the 1930s. Or write something set mainly in Vienna or Copenhagen or London, for that matter.
I recently read Vincent Mosco's The Digital Sublime: Myth, Power, and Cyberspace. It's an interesting book, and it does a good job of ground-clearing of the "I read all these books so you don't have to" variety, but I have some reservations about it.
The book has several big ideas. First, ideas about cyberspace and its impact are myths. Not myths in the sense of ideas that are "delusional and completely wrong," but myths in the sense used by religious scholars-- concepts that order our understanding of the world, that, as Alasdair MacIntyre put it, "are neither true nor false, but living or dead." (29) Myths of cyberspace, promulgated by figures as varied as Al Gore, Thomas Friedman, and Nicholas Negroponte, helped drive the dot-com boom, the belief that the Internet would transform modern life, and predictions about the end of history, politics, and space. The digital library, information highway, e-commerce, and virtual community were all, in one way or another, representations of the myth.
Myths of cyberspace were also part of a broader discourse that developed in the years before Y2K, characterized by "a general willingness to entertain the prospect of a fundamental turning point in society and culture" (55-56). The Internet was assigned the role of driver of changes that were already under discussion. Most prominent among them were arguments about the end of history; the death of distance (something that's been happening since at least the telegraph and railroad); and the end of conventional politics (exemplified by the arguments of the Progress and Freedom Foundation).
But it turns out that such technological myths aren't new. When they were new, the telegraph, electric light, radio, and television all seemed to some to herald a new age in which war would be obsolete, economies would prosper, and the lion would lie down with the lamb. In each case, those predictions turned out to be false. Just as Brian Arthur argues that it's after the boom that technologies like the railroad and telegraph really start to matter, so too does Mosco argue that "it is when technologies... cease to be sublime icons of mythology and enter the prosaic world of banality... that they become important forces for social and economic change." (6)
It's had some positive reviews in Technology and Society, First Monday (scroll down to the second review), SCRIPT-ed, Culture Machine, and University Affairs, among other places. Yet I find myself less impressed by the book. What's there to object to? I think there are a couple small things, and one big one.
The Guardian reports on a, ahem, novel writing exercise:
Author turns to eBay in search for collaborators
A first-time author has bypassed the traditional route of getting an agent, and is publishing a collaborative thriller on eBay. The novel is being written one page at a time, one writer to a page. As each installment is finished, the chance to create the next is offered for auction on eBay. So far, 17 pages have been completed, with 234 to go....
Novel Twists is the brainchild of 31-year-old Phil McArthur, who got the idea while recovering from cancer. "I'd had extensive chemotherapy and I had a lot of time on my hands to recover," he says. "I found I was reading a lot more, and that inspired me to think maybe I could write something myself."
He had planned to write the book alone, but then realised there must be many more budding novelists out there. Within days of posting the first page online, he had contributors from all over the world. "The fact that it's an auction means that if people are really keen to write the next page they have a good chance of getting it," he says.
When I was a postdoc, I spent a fair amount of time-- and more than a responsible portion of my disposable income-- at Cody's Books, one of several great bookstores within a couple blocks of my apartment. (I spent even more at Moe's, where my father had taken me when I was a kid and he a grad student in the history department.)
Today, I happened across word that Cody's is closing.
Like Kepler's, Cody's was founded in the 1950s, was for decades a local institution (its founders helped create the Berkeley Free Clinic), and had no small global reputation: it continued carrying Salman Rushdie's The Satanic Verses when most bookstores pulled it. But mainly I think of it as one of the centerpieces of Berkeley public intellectual culture: I'd regularly stop there on the way home from campus and catch up on the latest history titles (Moe's is the place to go for used books); to get the Sunday Times (along with a couple croissants from their cafe); to hear a wonderful assortment of speakers; and generally to feel the pulse of literary life. Goodbye to all that.
I've been working, off and on, for a couple months on an Encyclopedia of the 21st Century. The work has now reached the point where I'm devoting more than a few processor cycles to it, so naturally I've done what anyone would do at this point: started blogging about it.
Not that I don't have enough to do, but writing the history of the future was just too good an opportunity to pass up.
[To the tune of Sarah McLachlan, "Building a Mystery," from the album "Mirrorball".]
I've started reading Everyware: The Dawning Age of Ubiquitous Computing. I'm around thesis 10, with about 75 to go. So far, we're still in fairly basic territory, and I can't quite tell who the book is aimed at: interaction designers and other professionals, or a more general public?
On one hand, the book's strong on the intellectual history of ubicomp-- almost stronger on the history than you'd expect a book for practitioners to be. But the organization into theses makes it a less accessible read, and more prescriptive.
As usual, Gene Becker beat me to the punch, this time getting the Everyware-Martin Luther comparison out before I could.
The book also makes clear just how valuable a biography of Mark Weiser, or at least an article that talks about the origins and development of his concept of ubiquitous computing, would be. Weiser keeps showing up in the story, as the Man With the (Original) Plan, the guy who first imagined what a ubicomp world could be like. Another book or two like this, and he'll be the Buddy Holly of computing.
At the end of Shaping Things, Bruce Sterling lays out what the post-spime world might look like.
The step after the Spime Wrangler-- tomorrow's tomorrow-- is neither an object nor a person. It's a Biot, which we can define as an entity which is both object and person.
A Biot would be the logical intermeshing, the blurring of the boundary between Wrangler and spime. This is happening now, but we can't perceive and measure it.
Today, every human being... carries a load of industrial effluent.... A human body can be understood as a sponge of warm salt water within a shell of skin; so everything we emit [or manufacture or consume] ends up partially within ourselves.
Some artificial substances are bioaccumulative; our metabolisms preferentially suck them out of the biosphere and try to make structure out of them. These processes are involuntary and take place beneath our awareness. (134)
A Biot is somebody who knows about this and can deal with the consequences. He's in a position to micromanage and design the processes that shape his own anatomy. (135)
When will we get to the Biot Age? Sterling guesses around 2070. What kinds of technologies will a Biot technosociety create?
In a Biot world, the leading industries are not artifacts, machines, products, gizmos, or spimes, but technologies for shaping human beings.... The driving technologies of a Biot technosociety would be cybernetics, biotechnology, and cognition. (135)
Because some of the most advanced, valuable technologies will be incorporated into the body, or lived with every day (with full awareness of the biological impacts of that contact), and because of the need for more environmentally sustainable design and manufacturing, a Biot technosociety would prefer
technology that can eventually rot and go away all by itself. Its materials and processes are biodegradable, so it's an auto-recycling technology.... It means room-temperature industrial assembly without toxins. (143)
But there will still exist two other kinds of technologies. One will be
artifacts deliberately built to outlast the passage of time. This is very hard to do and much over-estimated. Many objects we consider timeless monuments, such as the Great Pyramid and the Roman Colosseum, are in fact ruins. (143-4)
The other will be
the kind [of technology] I have tried to haltingly describe here. It's a fully documented, trackable, searchable technology. This whirring, ultra-buzzy technology can keep track of all its moving parts and, when its time inevitably comes, it would have the grace and power to turn itself in at the gates of the junkyard and suffer itself to be mindfully pulled apart. It's a toybox for inventive, meddlesome humankind that can put its own toys neatly and safely away. (144-5)
How will spimes help save the world? Bruce Sterling lays out a scenario in Shaping Things. Essentially, it's the first book in which metadata is a superhero.
The fact that objects are divorced from information about them encourages us to focus on and take responsibility for only a tiny part of any object's life, and makes it far harder to perceive the consequences of our encouraging the creation of that object, our consumption of it, or our disposal of it.
Consider a bottle of wine (see chap. 9). Today, our interactions with it are reduced to consulting the price tag, drinking the wine, then throwing away the bottle. But
there must be a mountain of externalities, currently obscured and invisible to me, that involved this object. That growing and fermenting of grapes... topsoil loss, chemical fertilizer, insecticide sprays, the fuels involved in heating and distilling all that liquid.... [Were the workers] suntanned Italian peasantry in the full healthful glow of EU agricultural regulations... [or] illegal African or Albanian immigrants? If that's the case, then I've been inveigled into oppressing these people under a veil of my own ignorance.... Why do I collaborate with someone who forces me, through obscurantism, to do that against my will?...
This bottle sure came a long way. How'd it get here to me? How much carbon dioxide got spewed into the planet's air in order to ship this object into my hands?...
I'm not supposed to worry my head about all of that, but you know something? I know I am paying for it somehow....
What goes around, comes around. If I ignore distant consequences merely because they seem distant, then distant people will similarly inflict their consequences on me. That's a beggar-your-neighbor situation, a race to the bottom.
But suppose I show them how the object came to be, and I link that information to the object. That would be "transparent production."
So a spime is a moral entanglement with a built-in decoder ring. It's no less a savior or destroyer of worlds than any manufactured object that came before; but by laying bare its composition, history, and real costs, it lets you make better decisions about whether buying and using it will be good for you-- by which I mean good for you, the world, and the future.
Right now, if these externalities are dealt with at all, they're handled by markets or governments: the price might include a little extra for better labor practices (or it might not), and our taxes cover the costs of disposal and environmental cleanup (or they might not). Our capacity to deal with them independently is pretty limited: knowledge about what companies are socially or environmentally responsible is separated from the point of sale, while detailed information about the composition and history of things is often simply unavailable. Today, how do you know you're making the consumption choice you'd make if you were fully informed? You don't.
This bottle arrived in my possession seemingly stripped of consequences, but those consequences exist.... My relationship to this bottle of wine is a parable of my human relationship to all objects....
My own single-handed effort is entirely unequal to that challenge of discovering all those relationships. I can't simply know enough... but I can't Wrangle all the world's technosocial issues all the time.
It follows that much of this activity should be done for me by other people.
Who would do that? "Designers."
Just as John Markoff argued that the idea of personal computing was invented before the personal computer itself-- that the PC embodied an already-extant notion of how people and computers should relate-- so too does Sterling suggest that fifty years from now, we'll see concepts like the triple bottom line, environmentally aware consumption, and social investing as anticipating the things we'd be able to do, easily and with greater consequence, with spimes.
In laying out his vision of the future in Shaping Things, Bruce Sterling employs two concepts that require a little decoding: metahistory and synchronic society.
Every civilization has a metahistory, a kind of internal cultural logic. One great flaw is that metahistories tend to see themselves as permanent; a contingent metahistory that allowed for the possibility of its own end-- and was more thoughtful about how to avoid that end-- would work better.
Our own current metahistory is damaging in its short-sightedness and has yielded "slow crises cheerfully generated by people rationally pursuing their short-term interests." (41) As Sterling puts it,
The 20th century's industrial infrastructure has run out of time. It can't go on; it's antiquated, dangerous and not sustainable. It's based on a finite amount of ice in our ice caps, of air in our atmosphere, of free room for highways and transmission lines, of room in the dumps, and of combustible filth underground. This is a gathering crisis gloomily manifesting itself in the realm of bad weather and resource warfare. It is the legacy we received from world-shaping industrial titans such as Thomas Edison, and Henry Ford, and John D. Rockefeller-- basically, the three 20th century guys who got us into the Greenhouse Effect. (131)
It's no use starting from the top by ideologically re-educating the consumer to become some bizarre kind of rigid, hairshirt Green.... The only sane way out of a technosociety is through it, into a newer one that knows everything the older one knew.... That means revolutionizing the interplay of human and object. It means bringing more attention and analysis to bear on objects than they have undergone. It also means engaging with the human body and our affordances. (131-132)
The fact that we can insulate ourselves from the histories and consequences of our decisions, and that markets can assist us in that process (by reducing our relationships to things to price, and treating everything from the social consequences of abusive labor practices to the environmental costs of disposal of packaging as an "externality" that neither you nor the manufacturer has to think about), means that we can live in a state of blissful, deadly innocence.
Ironically, in the artifact era, when most humans grew their own food and made their own things-- or were related to those who did-- we knew a lot more about where stuff came from, and the consequences of making things poorly (of using unsustainable farming practices or building a shoddy furnace); but there were also few enough of us so that anything we did was likely to have very little impact on the world.
Our ability to change the world, intentionally or unintentionally, has far outstripped our ability to make sense of those changes. (Will history regard the internal combustion engine, and not nuclear weapons, as the greatest technological terror of the 20th century?)
To deal with this, "[w]e need a designed metahistory," (42) and Sterling thinks it will
combine the computational power of an INFORMATION SOCIETY with the stark interventionist need for a SUSTAINABLE SOCIETY. The one is happening anyway; the other has to happen. (42)
It would be a synchronic society.
If we design that metahistory to exploit the power of spimes, which are "information melded with sustainability," (43) we can create a dynamic by which we can preserve and learn from our history, thus giving us the chance to evolve our way out of the current mess. Spimes are especially important because they exist at:
the intersection of two vectors of technosocial development. They have the capacity to change the human relationship to time and material processes, by making those processes blatant. Every spime is a little metahistorical generator.
History is this technoculture's primary source of wealth. As it transits through time, due to the principles of its organization, it will increase in knowledge, capability, wealth, and power.
But wait, there's more....
I write about people, technology, and the worlds they make.
I'm a senior consultant at Strategic Business Insights, a Menlo Park, CA consulting and research firm. I also have two academic appointments: I'm a visitor at the Peace Innovation Lab at Stanford University, and an Associate Fellow at Oxford University's Saïd Business School. (I also have profiles on LinkedIn, Google Scholar and Academia.edu.)
I began thinking seriously about contemplative computing in the winter of 2011 while a Visiting Researcher in the Socio-Digital Systems Group at Microsoft Research, Cambridge. I wanted to figure out how to design information technologies and user experiences that promote concentration and deep focused thinking, rather than distract you, fracture your attention, and make you feel dumb. You can read about it on my Contemplative Computing Blog.
My book on contemplative computing, The Distraction Addiction, will be published by Little, Brown and Company in 2013. (It will also appear in Dutch and Russian.)
My latest book, and the first from the contemplative computing project, The Distraction Addiction, will appear in summer 2013, published by Little, Brown and Co. (You can pre-order it through Amazon or IndieBound now, though!)
My first book, Empire and the Sun: Victorian Solar Eclipse Expeditions, was published by Stanford University Press in 2002 (order via Amazon).