"We practitioners and quants aren't too fazed by remarks on the part of academics – it would be like prostitutes listening to technical commentary by nuns." (From his new book Antifragile, rather negatively reviewed in the Guardian)
I had two cats die this spring and summer, and after they were gone, I really had no interest in replacing them. They had been with me for seventeen years, since they were kittens, and I'd always thought of myself as a cat person; yet, with their passing, I felt like that part of my life was now done.
In contrast, a few days after Christopher died, after I'd cleared out his dog bed and packed away his food and toys-- indeed, the afternoon I got his ashes back from the crematorium-- I realized: I want another dog. After my wife and I talked it over, we agreed that it would be good to get another dog.
We decided to get a rescue, mainly because there are so many dogs in the Bay Area who need homes. Christopher, so far as anyone could guess, was a Carolina or American Dingo, and that's a pretty distinctive breed; you don't see many of them. There's a Carolina breeder here in California, and a couple places in the Rockies that specialize in Carolina rescues, but they're not a breed that shows up on Petfinder or the adoption Web sites; so I quickly gave up the idea of getting another one. (I also wasn't 100% sure getting the same breed was the smart thing for me.)
The Bay Area dog adoption market, it turns out, has a couple weird quirks. First of all, there are tons of chihuahuas and pit bulls, or mixes involving one of those breeds. Second, we import unwanted dogs, from as far away as Taiwan (which has several native breeds, but where it's very tough to survive as a stray). Apparently the Bay Area can't produce enough unwanted dogs, and has to import them. Who knew. I filled out a long form, had a phone interview, and got set up to see a couple dogs.
So on Sunday we went to a pet store deep in Campbell to meet two dogs: a five-year-old border collie-husky mix, and a two-year-old lab. The scene was crazy: a pen full of adorable puppies, crates with adult dogs in back, and people everywhere. If the wedding dresses in Filene's Basement could bark… you get the idea. We tried out both dogs, and were really split: the collie-husky was great, calm, and watchful without being too eager, but she could jump tall fences. The two-year-old was more compact and energetic, but also more kid-oriented, so naturally the children gravitated to him.
Eventually, we went with the two-year-old, took care of the paperwork, bought an inordinately large amount of hardware, toys, etc., and brought him home. We renamed him Davis: my wife and I met there, and while he had been called Dallas by his foster family, he didn't recognize the name.
We're not really sure what breed he is, and probably never will be. I decided that he's a "labramuddle," because he's smaller than a traditional yellow lab and his face is a bit squarer, but friends suggest he could be an English Labrador.
It's been less than a week, and Davis is settling in nicely. He has a crate that he sleeps in at night, and we're still working on how to manage him during the day.
He's very much of the "I'll follow whichever human is doing something" model, but he's more into following the children than Christopher was: I think he likes my son's manic energy, and certainly enjoys the attention the kids lavish on him.
We've taken him to the dog park a couple times, and fortunately he enjoys spending time there.
In the last couple days I've discovered that he's an absolute fiend for chasing balls, which is hardly surprising in a dog that's bred to be a retriever. For me, though, knowing as little about dogs as I do, everything is still a revelation. (It also means exploring the world of dog toys.)
However, for all his crazed energy, he's also good at just hanging out under the desk while I work.
We're quite happy with him, but frankly, we got lucky. Choosing a dog after less than an hour, in a crowded, exciting, and slightly frantic environment, hardly guarantees good results. If I had to do it again, I'd go to one of these adoption fairs first, with absolutely no plans to get a dog, so I wouldn't be overwhelmed by the energy and emotionalism of the event; then I'd go back a second time, and start looking at the dogs.
After all, a dog could be with you for years (if he's a lab, Davis should live 10-12 years), and while we made a great choice, I've spent more time researching which movie to go to on the weekend.
But we've got him now, and he's been great.
A couple weeks ago Christopher, the dog we inherited in January, died. My wife took him for a morning walk, he went to sleep in the backyard, and never woke up.
He was 14, so we knew when we took him in that he was more or less on loan. Still, it was a shock, even if it wasn't really a surprise.
I hadn't lived with a dog since I was a kid, and when we took him in I didn't really know what to expect. But he proved to be very smart, and great at communicating his needs. I quickly realized that if I just paid attention to what he was doing, I could decode what he wanted-- though sometimes it was especially easy.
It was also instructive living with a creature who really just wanted to belong, to be part of the family, and was happy so long as he could be with us. As someone who lives among highly analytical, calculating people, I'm constantly trying to figure out what clients want, what readers want, what funders want to hear, etc. Being with someone whose mental model of himself and others was really straightforward and guileless was instructive.
At the same time, it was also cool that he was a dog, and did dog things. While he took pleasure in being with us, he also enjoyed having his own, very different, incredibly physical life, one where smells and dirt were really fascinating.
His life intersected with ours; it didn't overlap completely. I found that cool.
We went to the dog park pretty regularly, and he had several friends there, including one dog he would follow around and just drool on. They were both quite elderly, so it was a charming sight.
He also made me a lot more familiar with my neighborhood. Taking him on walks twice a day meant I developed an intimate sense of my surroundings, albeit from a somewhat canine point of view. (I never knew things smelled so interesting around here.)
A friend-- one of the many Peninsula people who had contact with him over the years-- said that he was such a good dog he was sure to come back as a human. I'm not so sure he needs to; if it's possible for a dog to achieve canine nirvana, I think Christopher managed it.
I was in Seattle this weekend at the POD Network conference, a conference of academic technology and professional development types.
I've not been in Seattle in a while, so it was cool to be there. And the crowd at the conference was terrific: very technically savvy, so they knew what I was talking about, but also engaged enough to ask interesting questions. Especially impressive for a crowd that had already been at the conference for three days and hadn't yet had lunch.
It was the first time I'd given a big talk since finishing the book, and it was good to see that it seems to hold up in public.
After my talk I spent the afternoon on the monorail (how often as a futurist do you get to ride on an artifact from the future?) and visiting the Experience Music Project and Seattle Public Library, two of the cooler pieces of architecture... well, anywhere in the world.
The Experience Music Project is said to look like a melted Jimi Hendrix guitar from above; that could well be urban legend, but what I do know is that it's really cool on the ground.
Here's the cover for the contemplative computing book:
Little, Brown spent a lot of time on it, and I think they've managed to communicate a lot in a very small, challenging medium. They were also really good about explaining the design choices, making clear what they thought worked, and accommodating those changes I thought would improve it (or explaining why they would be hard to implement).
So the machine chugs along, and we get one step closer to having a finished book on the shelves!
The one problem with writing a book for users, taking a Buddhist-inflected approach to information technologies that emphasizes how people can take back control of their minds, is that I'm less likely to get onto this kind of gravy train:
Ferguson's critics have simply misunderstood for whom Ferguson was writing that piece. They imagine that he is working as a professor or as a journalist, and that his standards slipped below those of academia or the media. Neither is right. Look at his speaking agent's Web site. The fee: 50 to 75 grand per appearance. That number means that the entire economics of Ferguson's writing career, and many other writing careers, has been permanently altered. Nonfiction writers can and do make vastly more, and more easily, than they could ever make any other way, including by writing bestselling books or being a Harvard professor. Articles and ideas are only as good as the fees you can get for talking about them. They are merely billboards for the messengers.
That number means that Ferguson doesn't have to please his publishers; he doesn't have to please his editors; he sure as hell doesn't have to please scholars. He has to please corporations and high-net-worth individuals, the people who can pay 50 to 75K to hear him talk. That incredibly sloppy article was a way of communicating to them: I am one of you. I can give a great rousing talk about Obama's failures at any event you want to have me at.
What's so worrying about this trend is that Niall Ferguson, once upon a time, was the best. I'm one of the few people who has actually read his history of the Rothschilds, The World's Banker, all 1,040 pages of the thing, and it is brilliant, a model of archival research. I find it fantastically depressing that the man who could write that book could end up writing a book like Civilization or an article with just as much naked silliness as the Newsweek cover.
I feel very much the same way about Victor Davis Hanson, a man whose military history is really absolutely first-rate, whose The Other Greeks fairly exploded with insight into Greek society and philosophy, but who's been mailing in sloppy, thoughtless pieces ever since he left the farm for The Farm. Sad.
We know too much and feel too little. At least we feel too little of those creative emotions from which a good life springs. In regard to what is important we are passive; where we are active it is over trivialities.
Evgeny Morozov's review of several new TED books-- pamphlets, really-- is one of the greatest things I've read in a long time. You know you're in for a wild ride when the opening paragraph starts like this--
Only the rare reader would finish this piece of digito-futuristic nonsense unconvinced that technology is—to borrow a term of art from the philosopher Harry Frankfurt—bullshit. No, not technology itself; just much of today’s discourse about technology, of which this little e-book is a succinct and mind-numbing example.
--and then gets vicious.
Most of the review focuses on Parag and Ayesha Khanna's ebook Hybrid Reality. Apparently the Khannas accidentally once ran over Morozov's dog in their Range Rover, and didn't stop because they were too busy dishing dirt to News of the World about Morozov's mother. Or so I gather, because nothing less would explain the review.
Remember the creatures in Aliens who bleed concentrated acid? That's what comes to mind when you read this.
[A]ll the features that the Khannas invoke to emphasize the uniqueness of our era have long been claimed by other commentators for their own unique eras.... What the Khannas’ project illustrates so well is that the defining feature of today’s techno-aggrandizing is its utter ignorance of all the techno-aggrandizing that has come before it. The fantasy of technology as an autonomous force is a century-old delusion that no serious contemporary theorist of technology would defend.
What's it say about TED? Nothing good, I'm afraid:
I spoke at a TED Global Conference in Oxford in 2009, and I admit that my appearance there certainly helped to expose my argument to a much wider audience, for which I remain grateful. So I take no pleasure in declaring what has been obvious for some time: that TED is no longer a responsible curator of ideas “worth spreading.” Instead it has become something ludicrous, and a little sinister.
Though I have to confess that it felt like he was getting dangerously close to describing some of the work I've done with this paragraph:
[O]ne can continue fooling the public with slick ahistorical jeremiads on geopolitics by serving them with the coarse but tasty sauce that is the Cyber-Whig theory of history. The recipe is simple. Find some peculiar global trend—the more arcane, the better. Draw a straight line connecting it to the world of apps, electric cars, and Bay Area venture capital. Mention robots, Japan, and cyberwar. Use shiny slides that contain incomprehensible but impressive maps and visualizations. Stir well. Serve on multiple platforms.
And the bit about how the Khannas and Tofflers are both "fast-talking tech-addled couple[s] who thrived on selling cookie-cutter visions of the future one paperback, slogan, and consulting gig at a time" sounds like kind of a good gig. If you can do it in a more intellectually responsible way, of course.
An article about my friend Jim Fadiman and his LSD research includes this awesome bit about his last experiment in the 1960s (conducted on a group of "an architect and three senior scientists—two from Stanford, the other from Hewlett-Packard" who had "each brought along three highly technical problems from their respective fields that they’d been unable to solve for at least several months") before the government shut down all LSD work:
LSD absolutely had helped them solve their complex, seemingly intractable problems. And the establishment agreed. The 26 men unleashed a slew of widely embraced innovations shortly after their LSD experiences, including a mathematical theorem for NOR gate circuits, a conceptual model of a photon, a linear electron accelerator beam-steering device, a new design for the vibratory microtome, a technical improvement of the magnetic tape recorder, blueprints for a private residency and an arts-and-crafts shopping plaza, and a space probe experiment designed to measure solar properties.
Ah yes, those crazy druggies.
Though the one time Herman Kahn took psychedelics he supposedly spent two hours saying "Oh, wow," and claimed afterwards to have come up with a new system for prioritizing nuclear targets. I never believed that story. Maybe I should.
George Monbiot calls publishers like Elsevier and Springer "the most ruthless capitalists in the Western world":
What we see here is pure rentier capitalism: monopolising a public resource then charging exorbitant fees to use it. Another term for it is economic parasitism. To obtain the knowledge for which we have already paid, we must surrender our feu to the lairds of learning.
Open-access publishing, despite its promise, and some excellent resources such as the Public Library of Science and the physics database arxiv.org, has failed to displace the monopolists…. The reason is that the big publishers have rounded up the journals with the highest academic impact factors, in which publication is essential for researchers trying to secure grants and advance their careers. You can start reading open-access journals, but you can’t stop reading the closed ones.
Interesting article in Jezebel/io9 (choose your Gawker media outlet) about autism and its growing influence in high-tech culture:
autism has played a significant role in crafting much of what we consider to be modern culture — from the music and books we read, to the technological devices we all take for granted. The acceptance of radically different ways of thinking, it turns out, can be seen as an integral part of a rich and diverse overarching culture….
The signs of autism's reach are beginning to be seen virtually everywhere. People on the spectrum are driving the creation of alternative forms of expression, new businesses and institutions, and cutting-edge technologies. "And not only do they make these things comfortable for themselves," noted [Wired author Steve] Silberman, "they're useful for all of us."
Whole piece is worth reading.
This week I read John Lanchester's new novel Capital, about life in London during the great financial collapse of 2008.
I thought it was a great read, though not because of its great pacing or high drama or characters you're cheering for. It's more like an Impressionist crowd painting, a set of brilliantly-rendered scenes and personalities and moments, not a story that drives to a decisive conclusion. About 200 pages into it, I started thinking, This is great, but with all this buildup, it had better end with Queen Elizabeth on a velociraptor, on top of Big Ben, striking down zombies with nunchuks.
Not to spoil it, but no Queen Elizabeth, no zombies, no velociraptor. (Though one of the characters does like dinosaurs.)
Still, if you want a book that paints a picture of one of the world's great cities sans velociraptors-- and especially if you've spent time there, and perhaps intersected very peripherally with the sorts of characters that populate the book-- Capital is terrific.
I sent off the revised draft of my book last Friday, and celebrated this weekend by watching the end of the Tour de France.
the book is back, via flickr
It was great to see an Englishman win the tour (Britain's investment in cycling is paying off, as John Kay notes), and it was also cool to see someone win who was so clear about how much his victory was a team achievement. Yes, Wiggins gets to wear the yellow jersey, but as he himself acknowledges, he stands on the shoulders of his teammates.
I was juxtaposing this with Penelope Trunk's recent essay about self-publishing her book. The piece, a long post on her Brazen Careerist blog, is about how traditional publishers don't know anything about their markets, they take too long to get stuff out, and you're better off doing it yourself. The piece was really striking to me because both in scope and substance it's so different from my recent (or current) experience.
home office, california style, via flickr
First of all, Trunk's account of the publishing industry is all about production and distribution; the work of shaping and editing books is invisible. To me, though, this is about 90% of the value that the publishing industry offers. Fourteen months ago, give or take, I had a very very different idea for a book about contemplative computing. That book might have fit well with an academic press, but it wasn't the book I really wanted to write. I was lucky to have an agent who pushed me to think more commercially without giving up my intellectual bona fides or the ambition of explaining to ordinary users how our deep entanglement with technology shapes us. I was also really lucky, once I'd produced a manuscript, to have an editor who could work with me to tune it up, and who insisted (in that totally self-effacing way most book editors have) on making it more accessible and useful.
Another important way in which our experiences contrast is that Trunk describes books as calling-cards, as a way of introducing to the public who you are and what services you have to offer. Now, this is totally in keeping with the Tom Peters "Brand of Me" way of seeing the world, and I had professors at Wharton who talked about how their books were really just ways of attracting clients, so clearly there are authors who either genuinely feel that a book can play this role, or see reasons to talk about it this way. For me, though, writing this book has been pretty transformative, and I have a hard time imagining starting something this hard with the assumption that there won't be a big personal payout at the end.
it's about ME! via flickr
I'm probably going to experiment with some digital self-publishing in the coming year, though I wouldn't call what I'm going to create electronic books-- more like electronic pamphleteering, or digital broadsheeting. A "book" feels like a different proposition than a highly illustrated, expanded version of a talk. Indeed, it's not just a different proposition, but a promise to readers that the object they're getting has been through a more rigorous kind of review and publishing process.
bytes, via flickr
Indeed, the only way I would self-publish a "book" would be if I could hire editorial talent as strong as Zoë and John, and I'm not sure I'd want to take on the risk of investing that much in a book. It's possible that I could find equivalent talent in the freelance editorial market, but I quite like the idea that lots of other people at Little, Brown share the risk with me, and have an incentive to help the book be a success.
Just as important, I don't want my relationship with an editor to become more transactional. As John Kay recently pointed out, the financial services industry worked best for investors and companies when it was more trust-based; in today's world of super-fast transactions and massive bets, there's less interest in building trust, because you tend to assume that you'll be rich and retired within a couple years. I don't need intellectual relationships that are more transactional. Indeed, I think those two things are polar opposites. Frictionless, transactional relationships are mindless (in Ellen Langer's use of the term), and can just as easily succeed as win-lose games; meaningful relationships involve trust and struggle, and only succeed when both parties succeed.
stay, via flickr
I see tremendous benefit in having a team of people who are invested in your victory, like Team Sky was invested in Wiggins' taking home the yellow jersey. If all you're doing is a straight-on transaction, something you know how to do and really can do on your own, then maybe the self-publishing model works; but the way I write books requires a team.
John Kay's latest essay on the current state of the financial sector, published on the heels of a report he just released for the British government on state of financial services, is well worth reading:
In the equity investment chain, asset holders and asset managers need to be trusted stewards of savers’ money. Company directors need to be trusted stewards of the assets and activities of the corporations they manage. In the absence of such trust, intermediaries become no more than toll collectors.
It is hard to see how trust can be sustained in an environment characterised by increasingly hyperactive trading, and it has not been. Trust is essentially personal and cannot easily be found in a dark pool. Impersonal trust can be established only in a rigidly disciplined organisation – the kind that retail banks were once but are no longer – or by regulation of a ferocity that has not been achieved and is probably not achievable.
He also has this great observation about the ways analysts and regulators are naturally captured by complicated industries that rely on
behavioural regulation, designed to combat inappropriate incentives by detailed prescriptive rules. The outcome is regulation that is at once extensive and intrusive, yet ineffective and largely captured by financial sector interests.
Such capture is sometimes crudely corrupt, as in the US where politics is in thrall to Wall Street money. The European position is better described as intellectual capture. Regulators come to see the industry through the eyes of market participants rather than the end users they exist to serve, because market participants are the only source of the detailed information and expertise this type of regulation requires. This complexity has created a financial regulation industry – an army of compliance officers, regulators, consultants and advisers – with a vested interest in the regulation industry’s expansion.
I think you can see variations on this in all kinds of policy worlds (foreign and military policy especially), and in technology and futures research. Futurists don't regulate the future in any meaningful way, but they and industry analysts do have a close relationship with their objects of study and clients, and it's "natural" that a kind of regulatory capture occurs in these relationships.
I can only hope that he's correct that more people now recognize that "the sector's problems are not the byproduct of unpredictable events but arise from a wrong turning in the culture of an industry that has come to prioritise transactions and trading over trust relationships."
Man, Charles Pierce knows how to write.
I remain convinced that American conservative thought is now not a philosophy but, rather, a book of spells, a series of conjuring words that have meaning only to the initiates.
"Troubled" companies have a particular meaning on Wall Street. Sure, sometimes they refer to companies that are just muddled, have over-expanded, and are badly managed. But more often, what they are talking about is companies that do not seem to providing a large enough return to shareholders—a stagnating stock price in particular. But that does not mean a company is "troubled." It can be quite profitable, have productive and loyal employees, have satisfied customers, and cash on hand.
What players like Bain do is enforce a Wall Street preference. There is a bias against companies that seek a "quiet life." They are shunned by institutional investors, which depresses stock prices and makes these companies “troubled” in the first place. It isn’t that they are not profitable, but rather that institutional investors don’t like them, and as a result they trade at dramatically lower P/E ratios. Indeed, it isn’t even clear that takeover targets do have weaker stock performance if you look at total returns, including dividends.
Once a company goes public, it is essentially subject to "disciplinary" takeovers if it fails to act in accordance with financial sector preferences. This is often phrased as "poorly performing managers," but what does that really mean? That is really just about enforcing a certain conventional wisdom about what a company ought to do. But these preferences are socially problematic. Consider some of the things that seem to contribute to being a takeover target: slow growth, stable revenues, cash on hand rather than debt, generous employee compensation, conservatively-funded pension or insurance plans.
Dylan Matthews has a short post on equality of outcomes versus equality of opportunity. Political rhetoric claims that you can choose one or the other. In the real world, though, it turns out that
the distinction between equality of opportunity (usually phrased in terms of upward income mobility) and equality of outcomes (the raw distribution of income or wealth in an economy) is not as big as it sometimes appears. More specifically, countries with high inequality of outcomes (as measured by the Gini index of economic inequality) tend to have low social mobility (as measured by the association between parents’ and children’s incomes) as well....
The distinction between equality of outcomes and opportunity has some theoretical appeal, but in practice, you get both or neither.
This is why I read John Kay:
In the 20th century political frontiers became a central influence on economic life. Old Kaspar’s work presumably consisted of providing food, fuel and shelter for his family. But with complex products, varied consumer tastes and low degrees of personal sufficiency, resource allocation became less of an individual enterprise, more one of the social and political environment.
That observation is evident on the Finnish-Russian border. The razor wire kept Russian citizens in when the living standards of planned societies and market economies diverged. But now the border is easy to cross and the gap in per capita income has narrowed, though not by much. The very different income distributions of egalitarian Finland and inegalitarian Russia can be seen in the car parks and designer shops of Lappeenranta.
In the Soviet era, Finland produced Marimekko; Russia made no clothes any fashion-conscious woman would want to buy. Post-Communist but still autocratic Russia made surveillance equipment; democratic Finland led the world in mobile phones. Today Russia’s geeks hack into your bank account, while those of Finland develop Angry Birds.
Michael Lewis' Princeton commencement address is terrific. After the obligatory opening joke ("Members of the Princeton Class of 2012. Give yourself a round of applause. The next time you look around a church and see everyone dressed in black it’ll be awkward to cheer. Enjoy the moment"), he talks about writing Liar's Poker and the role of luck in making that book possible:
I was 28 years old. I had a career, a little fame, a small fortune and a new life narrative. All of a sudden people were telling me I was born to be a writer. This was absurd. Even I could see there was another, truer narrative, with luck as its theme. What were the odds of being seated at that dinner next to that Salomon Brothers lady? Of landing inside the best Wall Street firm from which to write the story of an age? Of landing in the seat with the best view of the business? Of having parents who didn’t disinherit me but instead sighed and said “do it if you must?” Of having had that sense of must kindled inside me by a professor of art history at Princeton? Of having been let into Princeton in the first place?
This isn’t just false humility. It’s false humility with a point. My case illustrates how success is always rationalized. People really don’t like to hear success explained away as luck — especially successful people. As they age, and succeed, people feel their success was somehow inevitable. They don’t want to acknowledge the role played by accident in their lives. There is a reason for this: the world does not want to acknowledge it either.
Read the whole thing. It's worth it.
My daughter left this morning on a week-long camping trip with her class. Camping is a big thing at Peninsula. The youngest elementary school classes start with overnight stays in their classrooms, and by 8th grade the students are planning a couple weeks' worth of trips.
With twenty kids and about five teachers, there's a lot of gear.
Camping has been a big part of the school experience for years, and alumni talk about it as one of the highlights of their time here.
This year they're going up to some park in the far north of the state. So in addition to all the usual stuff, they filled a trailer with firewood, and made up a convoy of four or five cars, vans, and trucks. It was hard to keep track.
I took Christopher with me to school, and turned him loose. He loved being able to run around off-leash for once.
Though I think he was a little disappointed when he wasn't able to go with the kids. I'm sure he would have loved it.
Derek Thompson has a short interview with Steve Blank about Facebook's IPO and what it means for Silicon Valley:
Facebook's success has the unintended consequence of leading to the demise of Silicon Valley as a place where investors take big risks on advanced science and tech that helps the world. The golden age of Silicon Valley is over and we're dancing on its grave. On the other hand, Facebook is a great company. I feel bittersweet.
Why is that?
Silicon Valley historically would invest in science, and technology, and, you know, actual silicon. If you were a good VC you could make $100 million. Now there's a new pattern created by two big ideas. First, for the first time ever, you have computer devices, mobile and tablet especially, in the hands of billions of people. Second is that we are moving all the social needs that we used to do face-to-face, and we're doing them on a computer.
And this trend has just begun.
In other words, opportunities to make lots of money quickly on bubblicious things tend to draw attention away from hard things that offer more enduring value. Makes perfect sense. Blank also makes this interesting observation:
I see my students trying to commercialize really hard stuff. But the VCs are only going to be interested in chasing the billions on their smart phones. Thank God we have small business research grants from the federal government, otherwise the Chinese would just grab them….
The four most interesting projects in the last five years are Tesla, SpaceX, Google Driving, and Google Goggles. That is one individual, Elon Musk, and one company, Google, doing all four things that are truly Silicon Valley-class disruptive…. Thank God for federal government grants, and the NIH, and Musk, and Google.
I think TED talks are the worst example of modern faux-intellectualism. Audience flattering, based on ego and personality, dripping with self-congratulation, they contribute to one of the great lies of our time, which is that the truth is entertaining and can be contained in bite-sized, ready-for-television aphorisms. The reality is that progress is hard, that knowledge making is a long and dispiriting slog, and that when ideas and solutions appear pat, cute, easy, or triumphant, they’re almost certainly wrong.
Mainly this is an excuse to trot out my favorite Bart Simpson quote: To those who say there are no easy answers I say you're not looking hard enough!
As someone who's given TEDx talks, yet is occasionally put off by just how much buzz these talks generate (or really, how ready some speakers are to point to hit counts as proof that They Are Taken Seriously), I can understand the criticism.
Yet there can be value in struggling to take a complex project and at the very least, show people enough of it to make them think that it would be worth investing their time and attention to see the whole thing. Good TED talks aren't like music videos; they're like movie previews.
Inside Higher Ed has yet another in the never-ending series of "rethinking the humanities Ph.D." articles. But for once, it's not just about "rethinking" (which too often is regarded as an end in itself), but actually making changes to it: a proposal at Stanford
where students decide on a career plan -- academic or nonacademic -- they want to embark on by the end of their second-year of graduate study, file the plan with their department, and then prepare projects and dissertation work that would support that career…. This would represent a dramatic shift from the current norm, whereby many humanities grad students say that their entire program is designed for an academic career, and that they only start to consider other options when they are going on the job market -- a bit late to shape their preparation for nonacademic options.
You like Tastykake Butterscotch Krimpets? They're makin' you dumb. You like Gushers, red and blue and all flavors, including "mystery"? They're makin' you dumb. You like Arizona Green Tea? It's makin' you dumb.
At least, that seems to be the case for rats, the humans of the rat world.
Working on job applications and book proposals this morning, I set Olafur Arnalds' ...And They Have Escaped the Weight of Darkness on repeat, and have been plowing away. It's really great music, not quite as dense as his fellow Icelandic composer Johann Johannsson, but still very good-- simpler and more romantic.
About three months ago, we took in a new member of the household: a 13 year-old dog named Christopher. A friend of ours just turned 90 and is moving, and couldn't take him with her; my son knew Christopher for a couple years, so we agreed to try him out.
I haven't owned a dog in ages, and so I had no real idea what I was getting into. But with two cats (at the time), and birds who've established nests all around the house, I was feeling like, what's one more animal?
He was, of course, somewhat guarded at first, and had some health issues, but over time has become more comfortable, both socially and physically.
One thing that concerned me was that I'd have to drive everywhere with him, as he's too old, and I'm too smart, to have him run beside me on the leash while I bike. So I found an old Burley trailer, ripped out the seats, and put in a dog bed. With enough dog treats he'll hop right in, and now happily rides.
Most days we go to Peninsula in the morning, and he has a circuit he likes to make, visiting different classrooms and saying hi to different kids, and to a cage of guinea pigs. I can't tell if he thinks they're friends or food.
Some of the kids knew him from his previous life; like my son they had tutored with his old owner, who herself was a teacher at Peninsula for a long time. So he's quite the celebrity at school. And he's made a couple canine friends, too.
Being thirteen, he has a variety of chronic ailments, and so he takes as many allergy pills and vitamins as I do. But peanut butter seems able to disguise just about everything.
And despite his age, or perhaps because of it, he's quite cheerful, yet generally pretty mellow. He sleeps like a rock, and like Marlowe has a genius for finding strategically inconvenient places to bed down: getting to the coffee maker in the morning is now like the maze scene in Raiders of the Lost Ark.
Indeed, his example, along with my dad's heading off to Singapore for two years after his retirement, has started me rethinking the nature of aging. To the degree I've thought about it at all, I've tended to assume that getting older is mainly about declining faculties, managing chronic health problems, and fighting social irrelevance-- telling kids to get the hell off your lawn, but not knowing which kids they are because you don't have your glasses on.
But maybe there's more to the story. Maybe the other stuff can be epiphenomena, the friction or froth that comes with every stage of life.
I've been thinking about this particularly in relation to technology use. There's a tendency to think of elders as 1) incapable of understanding computers, 2) a set of engineering challenges (decreased mobility, reduced short-term memory) that need to be solved using technical means, and 3) something a bit less than free agents. But Steve Jobs was something like 18 months away from being eligible for Medicaid when he died; was he too old to "get" Apple's products? Do the guys (and they're largely guys) who built Silicon Valley in the 1960s and 1970s, who spent their careers in the computer industry and now are retired, somehow lose the ability to think about technical stuff when they get their gold watches?
I suspect that, for important segments of the population at least, the conventional narrative about computers and aging is completely wrong. That there are things we can learn from elders about technology choice and use-- about how much to let devices into our lives, about how to use them, about what things really matter. Sure, there are things people my age can do to help our elderly parents make sense of technology; but there are things we can learn from them, too.
This is our best hope.
I love Felix Salmon's work. He's one of the best financial reporters in the business, someone who's got tons of insight and an ability to explain complicated, obscure but important things to a general audience.
So I was pained to see him get a small but significant detail wrong in his piece about Marc Andreessen. Among Andreessen's achievements, Felix writes, is that "he's dragging Silicon Valley into the world of philanthropy, where it’s historically been very weak."
Umm. No. Absolutely wrong.
Bill Hewlett and Dave Packard, and their families, have been philanthropists for decades. The value of their gifts to Stanford University exceeds the Stanford family's original endowment (or so I was told by some development folks there). The Lucile Packard Children's Hospital (where both my children were born), the open space trusts that have kept a significant part of Silicon Valley from turning into places to park Range Rovers in front of McMansions… not to mention a variety of locally-famous schools, charitable foundations, etc. etc. ad infinitum-- have all benefitted from the work of Hewlett, Packard, the Varian family, and many others.
Too elitist? Fundraisers for schools sound too self-serving? Maybe Santa Clara U.'s social innovation prizes, and its goal of improving the lives of a billion poor by 2020, are a bit more to one's liking.
Indeed, you might make the case that Marc learned about the value of philanthropy from his spouse, whose family real estate business shaped Silicon Valley, and whose family foundation has shaped Silicon Valley in different ways.
I think the idea that the Valley isn't interested in philanthropy comes from extrapolating the example of Steve Jobs, who famously was uninterested in it. However, what you have to realize is how much Steve was the exception to the rule; indeed, you'd have to be someone of Steve Jobs' caliber (in other words, only Steve Jobs) to get away with it.
So Marc’s not inventing a new tradition. If anything, he’s doing a great job of showing how new money legitimates itself by imitating old money.
And yes, here in the Valley money made selling klystrons and calculators– anything before about 1990– is Old Money. Those dollars might as well have been printed by Rembrandt.
Charles Pierce's review of Ross Douthat's Bad Religion (shorter version: the Sixties sucked) is a master class in how to take apart a book in a manner that respects the subject, but gives the author the flogging they deserve. This may be my favorite part:
[N]owhere does Douthat so clearly punch above his weight class as when he decides to correct the damage he sees as having been done by the historical Jesus movement, the work of Elaine Pagels and Bart Ehrman and, ultimately, Dan Brown's novels. Even speaking through Mark Lilla, it takes no little chutzpah for a New York Times op-ed golden child to imply that someone of Pagels's obvious accomplishments is a "half-educated evangelical guru." Simply put, Elaine Pagels has forgotten more about the events surrounding the founding of Christianity, including the spectacular multiplicity of sects that exploded in the deserts of the Middle East at the same time, than Ross Douthat will ever know, and to lump her work in with the popular fiction of The Da Vinci Code is to attempt to blame Galileo for Lost in Space.
Fantastic. As good as Adam Gopnik's epic takedown of The Matrix, Reloaded. It's made more impressive by the fact that you get the sense that Pierce really knows what he's talking about. Here are two very different lines that each in their way are quite illuminating:
He describes the eventual calcification of the sprawling Jesus movement into the Nicene Creed as "an intellectual effort that spanned generations" without even taking into account the political and imperial imperatives that drove the process of defining Christian doctrine in such a way as to not disturb the shaky remnants of the Roman empire. The First Council of Nicaea, after all, was called by the Emperor Constantine, not by the bishops of the Church. Constantine — whose adoption of the Christianity that Douthat so celebrates would later be condemned by James Madison as the worst thing that ever happened to both religion and government — demanded religious peace. The council did its damndest to give it to him. The Holy Spirit works in mysterious ways, but Constantine was a doozy. Douthat is perfectly willing to agree that early Christianity was a series of boisterous theological arguments as long as you're willing to believe that he and St. Paul won them all....
[Douthat is] yearning for a Catholic Christianity triumphant, the one that existed long before he was born, the Catholicism of meatless Fridays, one parish, and no singing with the Methodists. I lived those days, Ross. That wasn't religion. It was ward-heeling with incense.
This 1968 profile of The Band is great. The last two lines of this Robbie Robertson quote really speak to me.
"We were so exhausted that everybody said this was a time of rest. When we went up to Woodstock, we stopped listening to music for a year. We didn't listen to anything but what you didn't have to listen to, like opera. That's why we couldn't play things like the Monterey Pop festival. We weren't – and we aren't – looking for blood any longer. We're just looking for music."...
The Band sings in the rough-hewn harmonies of honest mountain air. The music from Big Pink has the taste of Red River cereal. It has the consistency of King Biscuit flour. It rings with the now-ancient echo of John R, broadcasting from Nashville over Radio Station WLAC, 1510 on the dial, its signal faintly received but eagerly listened to by an audience that took root in Stratford, Ontario, and Elaine, Arkansas, all with the same passion.... If it sounds traditional, the reason is that it has nothing to do with fads. If it sounds gritty, the reason is that it's full of road dust. If it sounds real, the reason is that it is....
It is music which comes from a band that has nothing but music to offer. The Band doesn't even have a name.
Thomas Frank has a terrific essay in The Baffler about the failure of experts in the dot-com bubble, the war in Iraq, and the housing and credit crisis, and about
our failure, after each of these disasters, to come to terms with how we were played. Each separate catastrophe should have been followed by a wave of apologies and resignations; taken together—and given that a good percentage of the pundit corps signed on to two or even three of these idiotic storylines—they mandated mass firings in the newsrooms and op-ed pages of the nation. Quicker than you could say “Ahmed Chalabi,” an entire generation of newsroom fools should have lost their jobs….
What I didn’t understand was that these were moral failures, mistakes that were hardwired into the belief systems of the organizations and professions and social classes in question. As such they were mistakes that—from the point of view of those organizations or professions or classes—shed no discredit on the individual chowderheads who made them. Holding them accountable was out of the question, and it remains off the table today. These people ignored every flashing red signal, refused to listen to the whistleblowers, blew off the obvious screaming indicators that something was going wrong in the boardrooms of the nation, even talked us into an unnecessary war, for chrissake, and the bailout apparatus still stands ready should they fuck things up again.
I'm afraid any book about the future and prediction has to take into account the transformation, on a large and very public scale, of being wrong into a badge of honor, and the world-view that has been created around it.
Michael Lewis interviews himself about the Occupy movement.
The chief cause of the financial crisis was what the government didn’t do (regulate) rather than what it did (subsidize homeownership), and so it seemed strange to me that, until now, the most potent political reaction to the financial crisis has been an antigovernment backlash. It was as if, after some infectious disease killed a million people, the only political reaction was a popular uprising to prevent the manufacture of antibiotics.
The man is a genius.
on the bed, via flickr
Last night our cat Tennison died. She was 17.
with my son, via flickr
A couple weeks ago she started eating less, then stopped altogether; I fed her by hand for a few days, but after a while she lost interest in that as well.
I thought about taking her to the vet, but she didn't show any signs of being uncomfortable or in any pain. I couldn't feel any tumors or see anything wrong. She didn't even seem particularly hungry. She just seemed ready to be done.
under the patio furniture, via flickr
I would not have imagined seventeen years ago that she and her brother would be with me this long, or play such a role in my life. I got them at the Berkeley pet shelter, in the summer of 1995, when they were just a few weeks old. I had been house-sitting some friends' cats, and discovered that I liked their company, and became desensitized to them after a while, so I went looking for my own. There they were.
After she stopped eating she just did what she'd done for the last couple years, which is sleep pretty much constantly. Sometimes I thought she was spending more time in her box, or on the couch, awake. Though I could never be quite sure if her behavior was changing, or I was paying more attention to her.
reading my books, via flickr
A couple months after getting them, I asked out a fellow Davis historian I'd known for the past year. We were engaged a couple months later. I don't think it's a coincidence. Getting the cats made clear how much better life could be when you're not alone.
A few days ago I noticed that her voice was hoarser, and she had trouble speaking. Not that she was ever a great talker. I started spending more time carrying her around. She was moving very slowly, and I wanted her close by.
Occasionally I (rather selfishly) thought, I'm almost finished with the book, so don't mess it up by dying now. Perhaps it's wrong to think that she could sense that I was too focused on the book, and had enough energy to wait for me to finish.
taking over the computer, via flickr
After I sent it off on Sunday, she seemed to decline more rapidly.
Yesterday we spent part of the day in the garage (she walked in while I was working there), then, as we'd done for the last few days, I put her up on a patio chair in the afternoon, and sat beside her while I read. In the evening I brought her inside, and tried to feed her something. She wasn't interested.
film noir cat, via flickr
She went over to the water dish, and spent a few minutes there. I could swear at one point she was staring at herself in the bottom of the metal bowl.
the last picture, via flickr
I kept her with me on the couch, then when she was tired of that, I put her in the box with her brother. Occasionally I stroked her fur. Sometimes she purred a bit, but mainly she just lay there. She didn't seem to have the strength or desire to move. It felt to me like she was winding down.
When I went to bed, I put her on the foot of the bed, where the cats have slept for years.
At some point during the night she got down, went into the hallway, stretched out, and died.
A while ago I wrote a piece about writing for the trades. As someone who'd written for academic audiences, and for corporate and government clients, it was interesting to take on the challenge of writing a book for a popular audience.
I just finished the first draft of the manuscript-- as in, sent it off to my editor and agent a couple hours ago-- and while it's all still fresh, thought I'd spend a little more time on what I've learned about writing.
The single most important thing is, be organized. The reason I was able to write this draft in a year was that I started the process with a strong, well-organized outline-- an outline that I took very seriously, because it was the basis of my book contract. So that short-circuited all that screwing around you do trying to find the perfect structure. I had one that the publisher liked, and so I was damn well going to stick with it.
Then on a daily basis, this means: organizing your goals for the week, listing out the sections you're going to write, and generally spending as much time as you need to be clear about what you're going to write-- so long as you actually write it. There's always the danger that this kind of prep becomes a substitute for actual word production. Watch out for that.
It also meant always setting up the coffee the night before, and organizing your workspace before bed so you could just sit down and be ready to go.
These are little things, but they make tangible your commitment to the project.
Another is to seek solitude. Turn on Freedom, or LeechBlock, or whatever. Put on the headphones. Before they exist on paper, good words live in a very quiet space, that you can only really reach in solitude. Of course you need to share your work in writing groups, with editors, and (you hope) a very big public. But in order to have ideas good enough to share, you need to seal yourself from everything but the words.
It's like how monastics describe the role of silence in contemplation of the divine. A common theme in monastic practice is that you cannot hear the voice of God, or achieve Enlightenment or satori, or see the ultimate truth, until your soul is quiet and ready. God does not make himself heard over the din. You have to listen for Him.
Another thing that I found really helpful was to stop for the day in mid-thought, or with one more sentence in the paragraph. It had to be something I knew I wouldn't forget, but having that as the first thing I did the next morning really helped me get started. The beginning is always the hardest part, and so if you can make the start of each writing day easier by actually knowing exactly what you're going to say, you'll make your life easier.
I now think that after years of writing, there's a more direct connection between whatever parts of my mind generate good ideas, and the part of my mind that controls my hands. There's a relationship between the physical act of writing and the "mental" act of creating that is not merely linear: I don't have ideas and then write them down. I have ideas because I am writing.
So it's absolutely essential that I spend time at the keyboard.
And I needed something to make it easier, because I was getting up in the pre-dawn hours and writing for an hour or two before anyone else woke up. (Even the dog stayed asleep and didn't follow me out to the living room.) I am absolutely NOT a morning person, but it made a big difference to have that time to myself, and to write in a state where I was actually too tired to distract myself. My semi-conscious brain was better able to stay on target, and whatever good ideas were bubbling up from my subconscious had an easier time reaching the calmer surface of my mind.
This was a complete change from the way I normally write and live. I'm naturally a late sleeper, and so it took real will for me to get up early. But it really did work. I was actually taken by surprise. I figured that having some words under my belt before I took the kids to school would be a psychological boost. What I didn't expect was that the very early morning would actually be a good time to write. But it turns out it was. Everyone should experiment with writing on a different schedule, or in a different way, to see if there are things that work better for them.
Thus endeth the lesson. For now.
I decided to download Blogsy, a blog editor for the iPad, and give it a try. I've lamented the apparent absence of decent Typepad editors (indeed, I still pine for the old days of Ecto), but this one looks pretty promising.
I spent yesterday at the Being Human conference in San Francisco, about which I'll have more to say shortly. It was a very interesting time, and quite well-done.
From his essay "On Walking":
At present, in this vicinity, the best part of the land is not private property; the landscape is not owned, and the walker enjoys comparative freedom. But possibly the day will come when it will be partitioned off into so-called pleasure grounds, in which a few will take a narrow and exclusive pleasure only, — when fences shall be multiplied, and man traps and other engines invented to confine men to the public road; and walking over the surface of God’s earth, shall be construed to mean trespassing on some gentleman’s grounds.
I'm going to blow through this quickly, so I can get back to real stuff, but I couldn't let this awfulness go unremarked: Gary Olson's latest essay in the Chronicle of Higher Education, on "How Not to Reform Humanities Scholarship." The piece starts by noting "the growing number of commentators" at the recent Modern Language Association meeting "who were recommending changes in how the discipline conceives scholarly work." I suspect if you went to any MLA between, say, 1960 and today, you could print that sentence and it would ring true, but let's take Olson's word that such calls are becoming more frequent and confident.
Certainly, he says, the number of people contacting him to say how terrible such things would be is on the rise. Whatever their good intentions,
Such recommendations, my callers unanimously agreed, would damage not only the careers of aspiring and new professors but also the reputation of the humanities. The proposed changes would also present substantial challenges to academic administrators charged with evaluating scholarship for tenure and promotion.
I'll just note four huge problems with the essay.
The first is the clumsy use of "some people worry about something, so that's evidence" as a form of argument. (One might argue that in a soft field like the humanities perception is reality, but given that this is an essay arguing for the strength of humanistic thinking and scholarship, I think Olson doesn't want to go there.) So you get claims like this:
Some veteran faculty members worry that graduate students and young faculty members—all members of the fast-paced digital world—are losing... their capacity for deep concentration—the type of cognitive absorption essential to close, meditative reading and to sustained, richly complex writing.
[A]llowing doctoral students to produce alternative projects may well disadvantage them on the job market, as hiring committees—or at least some members of them—may not be as receptive to experimental forms and may favor candidates who have, in fact, produced a monograph…. "I can just imagine how my colleagues in our very traditional department would respond to a colleague's tenure application if most of the work were digital," said one department chair. "We would have a clash of cultures and values, and, sadly, I know who would win."
And finally this:
It is true that more and more online journals are claiming to employ a peer-review process. That could be a positive development if we can arrive at a point where the community of scholars has confidence that the review process in online venues is as rigorous as it is in top-tier print journals. At the present, however, many scholars are still skeptical that the processes are equivalent.
Now, the argument that people don't concentrate any more in the digital age is one worth having; my book contends that while plenty of people feel like their faculties of concentration and memory are under assault, it absolutely doesn't have to be this way. Connection is inevitable, but distraction is a choice. But "some people say" is not proof.
Nor, I think, is the argument that "we shouldn't do it because the old fogies would shake their canes and yell, you kids get off my lawn" particularly convincing. It's an unfortunate reality that some people don't like new stuff. But that's not a reason to not do new things that are good.
The second problem is that, tragically and not surprisingly, the assumption is that humanities Ph.D.s are all bound for academic jobs, and that training for other professions is more or less unthinkable.
Hence the equivalence of "job market" and "hiring committees," even though 1) only a fraction of humanities Ph.D.s are ever going to get tenure-track jobs, and 2) other industries are much more likely to be able to see the value of an innovative piece of work than the search committee chair whose last book was published by Yale UP in 1977. Google's HR people won't care if you haven't produced a monograph, but rather have created something else that displays imagination, an ability to think deeply, and a capacity for focus.
More generally, the essay betrays an unwillingness, shared by far too many members of this generation of scholars, to admit that their field is not in some temporary crisis from which they're going to soon recover, and that good people are ground up and denied futures for structural reasons. Instead, you get things like this:
Besides, the typical rationale for abandoning the traditional dissertation—that the time-to-degree for the humanities doctorate is too long—is not a function of the monograph as a genre; it is a function of some dissertators' personal lives, as they attempt to juggle numerous priorities along with completing a dissertation.
Well, yes, personal lives can play a role. But… could the academy's well-documented reliance on temporary, itinerant, and graduate student labor also play a role here? Might the fact that too many students are under-funded while they write be a contributing factor? This reminds me of Charles Murray's argument that poor whites don't work because their culture has eroded, not that they don't work because the labor market for working class whites is a shadow of its former self.
So what recommendations does the essay embrace? How do we move forward to improve humanities scholarship?
This is the third problem with the essay: for the life of me, I cannot tell.
Olson doesn't seem to say, except to imply that we need more of the same, only better funded. Like too many academics, he seems to believe that if we wait long enough, the fairies will come and sprinkle gold dust on everything. There's no effort to distinguish good reform proposals from bad, to suggest how the rigor of traditional peer-review could be brought to electronic journals, to say how we might use other Web-based metrics (trackbacks, hits, number of comments, and other updated bibliometrics, for example) to help make informed judgments about digital scholarship.
Fourth and finally, I think this gives very short shrift to older faculty. As the son of someone who retired after twenty years at CSM, and then immediately went to Singapore for two years, I've seen at first hand that the relationship between age and personal conservatism is only as strong as you want it to be. He ends up constructing two sets of straw men, digital Panglosses and aging Cassandras, and thus doing justice to neither.
Okay, back to real work.
So wonders David Frum in his great review of Charles Murray's new book Coming Apart.
The book, as far as Frum is concerned, has several problems. It starts in the wrong place, with the bugaboo 1960s rather than earlier, which Frum argues would yield a rather different and more accurate perspective. It says nothing about the rise of manufacturing in Asia, which is a very big thing if you want to understand what happened to working classes in America.
It also blames the 1960s for our current social and cultural ills, which in turn are responsible for the white working class' decline. But Frum replies, "once you spell out the implied case here, it collapses of its own obvious ludicrousness."
Let me try my hand:
You are a white man aged 30 without a college degree. Your grandfather returned from World War II, got a cheap mortgage courtesy of the GI bill, married his sweetheart and went to work in a factory job that paid him something like $50,000 in today's money plus health benefits and pension. Your father started at that same factory in 1972. He was laid off in 1981, and has never had anything like as good a job ever since. He's working now at a big-box store, making $40,000 a year, and waiting for his Medicare to kick in.
Now look at you. Yes, unemployment is high right now. But if you keep pounding the pavements, you'll eventually find a job that pays $28,000 a year. That's not poverty! Yet you seem to waste a lot of time playing video games, watching porn, and sleeping in. You aren't married, and you don't go to church. I blame Frances Fox Piven.
How you can tell a story about the moral decay of the working class with the "work" part left out is hard to fathom.
As he explains, the gaps aren't just details: they go right to the heart of Murray's argument.
To understand what Murray does in Coming Apart, imagine this analogy:
A social scientist visits a Gulf Coast town. He notices that the houses near the water have all been smashed and shattered. The former occupants now live in tents and FEMA trailers. The social scientist writes a report:
The evidence strongly shows that living in houses is better for children and families than living in tents and trailers. The people on the waterfront are irresponsibly subjecting their children to unacceptable conditions.
When he publishes his report, somebody points out: "You know, there was a hurricane here last week." The social scientist shrugs off the criticism with the reply, "I'm writing about housing, not weather."
This is the kind of book review I love to read, and never want to be the subject of!
I'd love to have someone say this about me one day:
In most ways, the auction house is unshackled from intellectual pretense by its pure attention to the marketplace…
Sotheby’s felt detached from the posturing that happens in Chelsea galleries and the gnomic garbage that counts for art-world conversation. Auction house employees don’t invoke half-remembered poststructuralism or make inapt analogies. They don’t have to. The prices speak for themselves.
(via Felix Salmon)
David Brooks has a great idea for solving America's cultural problems (and for him, all problems reduce to culture):
We need a program that would force members of the upper tribe and the lower tribe to live together, if only for a few years. We need a program in which people from both tribes work together to spread out the values, practices and institutions that lead to achievement.
If we could jam the tribes together, we’d have a better elite and a better mass.
There's a name for such a thing: the English manor. The great and the good, their virtues observed by an army of footmen and maids. Everyone living together in a strict hierarchy that teaches the value of place and work.
Come to think of it, there was another name for it. The conscript army.
Clearly Brooks has been watching too much Downton Abbey, and learning the wrong lessons.
Umair Haque has been hanging out in hip New York hotels,
overhearing more than my fair share of Very Serious Conversations* from the movers and shakers of the world.
And boy, have they been tedious.
Haque uses this as a jumping-off point to talk about the "lethally serious" work of "doing stuff that actually matters." He suggests three criteria:
Does it stand the test of time? Ponder this for a moment: the vast majority spend the vast majority of our lives sweating, suffering, and slogging mightily over stuff that's forgotten by next quarter, let alone next year or next century. Call me crazy, but I'd suggest: mattering means building stuff that's awesome enough to last…. Of course, all that really means is that since nearly everyone seems to suck at standing the test of time, you've got a tremendous opportunity not to.
Does it stand the test of excellence?... Mattering means recognizing that everyone's opinion is not created equal — some count more than others, for the simple reason that some opinions are more nuanced, educated, sophisticated, historically grounded, and self-aware than others.
Does it stand the test of you?.... It's one thing to work on stuff that seems sexy because it's socially cool and financially rewarding. But fulfillment doesn't come much from money or cool-power — all the money in the world can't buy you a searing sense of accomplishment.
And I love this conclusion:
Being human is never easy. But that's the point. Perhaps as an unintended consequence of our relentless quest for more, bigger, faster, cheaper, now, we've comfortably acceded to something akin to a minor-league contempt for the richness and grandeur of life unquenchably meaningfully well lived. Hence, call this post my tiny statement of rebellion. Hex me with all the bland management jargon in the world, zap me with all the perfect theories and models you like, but I'll never, ever accept the idea that triviality, mediocrity, and futility are appropriate goals for any human being, much less our grand, splintering systems of human organization.
* I love how Very Serious Conversations, or "Very Serious [insert thing here]" is evolving into an insult. When those two words appear together in a Paul Krugman piece, you know the big guns are being trained on a new target.
A rather unfortunate, though doubtless algorithmically generated, juxtaposition:
A good, lengthy piece in the New York Times uses iPhone manufacturing as a window into the growth of Asian factories, and the relative decline of American manufacturing capabilities. Well worth a read.
Apple had redesigned the iPhone’s screen at the last minute, forcing an assembly line overhaul. New screens began arriving at the plant near midnight.
A foreman immediately roused 8,000 workers inside the company’s dormitories, according to the executive. Each employee was given a biscuit and a cup of tea, guided to a workstation and within half an hour started a 12-hour shift fitting glass screens into beveled frames. Within 96 hours, the plant was producing over 10,000 iPhones a day.
“The speed and flexibility is breathtaking,” the executive said. “There’s no American plant that can match that.”
Similar stories could be told about almost any electronics company — and outsourcing has also become common in hundreds of industries, including accounting, legal services, banking, auto manufacturing and pharmaceuticals.
But while Apple is far from alone, it offers a window into why the success of some prominent companies has not translated into large numbers of domestic jobs. What’s more, the company’s decisions pose broader questions about what corporate America owes Americans as the global and national economies are increasingly intertwined.
“Companies once felt an obligation to support American workers, even when it wasn’t the best financial choice,” said Betsey Stevenson, the chief economist at the Labor Department until last September. “That’s disappeared. Profits and efficiency have trumped generosity.”
At least that's the underlying argument behind a move being considered by hedge funds that have bought Greece's debt:
Hedge funds... [are considering] suing Greece in a human rights court to make good on its bond payments.
The novel approach would have the funds arguing in the European Court of Human Rights that Greece had violated bondholder rights, though that could be a multiyear project with no guarantee of a payoff.
You almost have to admire the absolute, graceless cynicism and depravity of a strategy like this. There's a refreshing honesty about a move this openly crass.
On the other hand, it IS a logical extension of the idea that corporations are people.
Dominique Strauss-Kahn did not know he was sleeping with prostitutes 'because they were all naked'
Nice light at school this morning.
This is an excellent little essay:
The Republican candidate Newt Gingrich and the cable channel History have both followed the same formula for success, by elevating fantasy over actual history. The difference, however, is that Newt wants to carry his sensational vision of a bygone age into office.
Vannevar Bush's Memex, Ted Nelson's concept of hypertext, Paul Baran's invention of packet-switching, the TCP/IP protocol stack, Tim Berners-Lee's HTML... What do we do with all this? Do we create magnificent works of art? Finally achieve world peace and global happiness?
No, we create Virtual Bikini Wrestling.
Humans. Are. Doomed.
[Don't ask me how I got to it. Actually, I was buying an ebook about Twitter and churches, and the same online publisher was selling "pictures" from recent VBW matches. It's such a weird concept, I couldn't resist clicking through.]
I write about people, technology, and the worlds they make.
I'm a senior consultant at Strategic Business Insights, a Menlo Park, CA consulting and research firm. I also have two academic appointments: I'm a visitor at the Peace Innovation Lab at Stanford University, and an Associate Fellow at Oxford University's Saïd Business School. (I also have profiles on LinkedIn, Google Scholar and Academia.edu.)
I began thinking seriously about contemplative computing in the winter of 2011 while a Visiting Researcher in the Socio-Digital Systems Group at Microsoft Research, Cambridge. I wanted to figure out how to design information technologies and user experiences that promote concentration and deep focused thinking, rather than distract you, fracture your attention, and make you feel dumb. You can read about it on my Contemplative Computing Blog.
My book on contemplative computing, The Distraction Addiction, will be published by Little, Brown and Company in 2013. (It will also appear in Dutch and Russian.)
My latest book, and the first book from the contemplative computing project. The Distraction Addiction is published by Little, Brown and Co. It's been widely reviewed and garnered lots of good press. You can find your own copy at your local bookstore, or order it through Amazon, Barnes & Noble or IndieBound.
My first book, Empire and the Sun: Victorian Solar Eclipse Expeditions, was published with Stanford University Press in 2002 (order via Amazon).