Attributed to Kahlil Gibran:
I have learnt silence from the talkative, toleration from the intolerant, and kindness from the unkind; yet strange, I am ungrateful to these teachers.
A few weeks ago I spoke at a memorial service for one of my thesis advisors, Riki Kuklick. While I was at Penn I also gave a couple of other talks, on postacademic careers and contemplative computing; but all three turned out, one way or another, to touch on Riki and her influence on me.
After I returned home, I noodled around with the talks, and eventually put them together. The result wouldn't have been appropriate in any of the three venues, but it better reflects what I was struggling to say in separate places on different days.
In September 2013 I returned to Philadelphia to speak at a memorial service for one of my favorite professors, Henrika Kuklick. Exactly thirty Septembers earlier, I stepped into my first classroom with Riki, and her course on the sociology of knowledge. It was the beginning of an association that would shape the next eight years of my life at Penn, and beyond.
Even though my father was a professor, and I was lucky to have some great teachers and role models at Penn, Riki lived the life of the mind in a way that was especially vivid and accessible. It goes without saying that she was as brilliant as the other professors who most deeply influenced me at Penn-- her colleagues Rob Kohler and Thomas Hughes; art historian David Brownlee; and strategist and systems thinker Russ Ackoff-- but beyond her brilliance, she was a great model for aspiring scholars.
The Problem of the Real World
The importance of academic models like Riki for aspiring scholars shouldn't be underestimated, because academic life is often looked at skeptically by people who see themselves as firmly rooted in the "real world."
As my years at Penn drew on, some of my old friends and relatives expressed the opinion that all this education was just a way of avoiding the real world. The real world was the place where people DID things, made money, got stuff done. The university was fine if it helped you get a job, but otherwise there was little point to it. Well, if the university was NOT the real world, then I wanted no part of it. I wanted to be a professor; the campus would be MY real world.
That didn't work out: I graduated into a terrible job market, and after finishing my first book and a couple of postdocs, I became a consultant. But then I made a surprising discovery: the "real world" was actually a great place to pursue the life of the mind.
Working as a futurist means grappling constantly with epistemological issues around the possibility of predicting the future, your professional credibility, and the standards by which your work should be judged-- all familiar themes in the sociology of science. In the mid-1990s, thanks to the growth of the Internet, the rising importance of the service economy, the ferocious pace of technological and global change, and other factors, the boundary between the world of ideas and the "real world" was collapsing. In order to survive in today's economy, organizations have to think seriously about what they're doing and why, and have models that explain how the world works and how it's changing. In their worldly impact, ideas are more real than ever.
One reason I was able to continue my own intellectual life was that I had Riki's pursuit of it as a model. There was nothing unreal about the life of the mind the way she lived it, or her love of the craft of scholarship. Her own professional life was lived in the ivory tower; she would have regarded the prospect of working with C-suite executives with horror. Despite this, she gave me the means to see the life of the mind as a devotion rather than just a profession, as an internal discipline as well as an academic one.
In a sense, I was also applying to my own life another lesson Riki taught me: that we should question what others believe is inevitable and inescapable, because what appears fixed may in fact be contingent and changeable. The expertise that may seem unassailable, the assumptions that seem self-evident, the truths that claim to be eternal, all may not be as real as they seem-- or like a great movie, their greatness may be a blend of hard work, clever staging, and a willing suspension of disbelief.
Seeing that the boundaries between the academic world and "real world" could be more porous than I'd believed helped me create a life that borrowed from both worlds. It let me uproot my own well-cultivated prejudice against corporate life. It freed me to reimagine academic life as something more portable and useful than I'd previously imagined. It let me see that one could make a life that combined the vita activa and vita contemplativa.
Another Real World: IRL
That experience of moving between worlds had a subtle but important resonance in my latest book. While writing The Distraction Addiction, I ran up against the sensibility that Facebook, text messaging, the Web, and the other things that make up the digital world can ONLY be distractions from a well-lived life; that proximate physical interactions are naturally superior to anything we can experience online; and that the best solution to our electronic troubles is simply to turn technologies off. We should get offline in order to spend more time in the real world, where we can have a real life. The simple and apparently innocuous acronym "IRL" turns out to be a kind of intellectual virus. It packs a lot of unexpected information and moral judgment in a very small package.
This claim is one side of an argument that's into its third decade. In the 1990s and the early days of the World Wide Web, figures like John Perry Barlow and Esther Dyson declared that cyberspace was a new world separate from and superior to the physical world; critics answered that the Internet was a threat to literature, social development, even our memory and cognitive abilities. To me this debate had a ring of familiarity. If the distinction between the academic world and real world doesn't make a lot of sense, I wondered, could the same be true of the apparently huge gap between digital life and real life?
Once I dug deeper, I saw that just as the distance between academic life and real life was overhyped, so too was the distance between digital life and real life. Technologies like smartphones, locative services, and wireless Internet access have erased the functional boundary between bits and atoms, while ecommerce, email, and social media have woven the digital world into our everyday lives.
Even more profoundly, I realized, using technologies is not something that makes us less human, or takes us away from our natural selves. Since the invention of stone tools two million years ago, human bodies have co-evolved with our physical tools, while our minds have co-evolved with our cognitive tools. We are, as philosopher Andy Clark puts it, natural-born cyborgs. At its best, this entanglement of person and technology extends our cognitive and physical abilities, gives us great pleasure, and makes us more human.
The challenge with smartphones and social media, then, is not to learn to give them up, but to learn to use them wisely. We need to practice what I call contemplative computing, developing ways of working and interacting with information technologies that help us be more mindful and focused-- and thus better people-- rather than be endlessly distracted and frustrated.
By better understanding the nature of attention and distraction, by studying how our interactions with technologies go bad, and by experimenting with new ways of using them, we can resolve the paradoxes these technologies seem to bring into our lives. Using them wisely helps us become wiser about ourselves. Being more mindful about HOW we use technologies helps us be more mindful WHILE using them.
This leads me to argue that we should push back against the moral distinction between academic life or digital life on one hand, and real life on the other. We shouldn't think in terms of a "real life" versus a "digital life" any more than we should think of our lives in the library or laboratory as unreal.
IRL = In Richer Life
To put it another way, we should redefine what the acronym IRL means. When people talk about "going IRL," one of the things they're doing is expressing a desire for self-improvement: turning off the devices, going camping, or spending time with family and friends. The impulse is laudable, but the assumption that it can only happen when you hit the off switch is incorrect.
Instead, we should think of RL as a richer life, one that isn't driven mainly by distractions, but reflects a serious attempt to create meaning in the world, to do things that matter with our lives, to build and extend our selves. This is an effort in which the thoughtful, judicious, mindful use of technology can play a role-- and in which those habits of mind that we think of as "academic" can also be intensely useful. We can build lives that aren't merely real, but richer, using tools that take form in silicon and electrons, or tools that are encoded in words and ideas.
Practicing contemplative computing requires taking a more critical, ethnographic approach to how we use technology: asking basic questions about why we use technologies, noticing our unconscious habits, examining how we think about our devices, and observing how they affect the way we think about ourselves. All these ideas could have come from one of Riki's classes, even though they're applied in an area that seems outside her scholarly interests.
Riki and the Richer Life
But that ability to follow ideas wherever they lead, to pursue diversions until they reveal something unexpected yet connected to your original interests, is just me channeling another of Riki's habits.
Riki was an astonishing conversationalist-- indeed it was hard to get a word in edgewise. If you didn't know her you might listen to her monologues and think she was just free associating. But if you listened carefully, you discovered that she would start a sentence, interrupt herself and veer off onto another subject, then do it again, and again-- and then, systematically work her way back, until twenty minutes later she finished that first sentence. That ability to draw together a dozen different subjects in a single conversation, to weave between and weave together different ideas, never failed to amaze her students, and I suspect there's an echo of it in my writing even today.
But in a sense the questions I'm working on now are not outside her area at all. What Riki showed me, through her work and her life, is that far from being an escape from real life, the life of the mind can serve as a model for how to build richer lives.
The categories of "real world" on one hand, and "digital world" or "academic world" on the other, can be remade, and in the course of doing so, we can make better, richer lives for ourselves. A more thoughtful understanding of our everyday engagements with technology can make our lives better. It's an attempt to make sense of what it means to be human, to think about the divide between people and technologies, and to see that the challenge and the opportunity we face is not to learn how to live in real life, but to learn how better to use tools and time to have a richer life.
Author and World War II RAF veteran Harry Leslie Smith has an eloquent explanation of why "I will wear a poppy for the last time:"
I am from that last generation who remember a war that encompassed the entire world. I wear the poppy because I can recall when Britain was actually threatened with a real invasion and how its citizens stood at the ready to defend her shores. But most importantly, I wear the poppy to commemorate those of my childhood friends and comrades who did not survive the second world war and those who came home physically and emotionally wounded from horrific battles that no poet or journalist could describe.
However, I am afraid it will be the last time that I will bear witness to those soldiers, airmen and sailors who are no more, at my local cenotaph. From now on, I will lament their passing in private because my despair is for those who live in this present world. I will no longer allow my obligation as a veteran to remember those who died in the great wars to be co-opted by current or former politicians to justify our folly in Iraq, our morally dubious war on terror and our elimination of one's right to privacy....
I find the government's intention to spend £50m to dress the slaughter of close to a million British soldiers in the 1914-18 conflict as a fight for freedom and democracy profane. Too many of the dead, from that horrendous war, didn't know real freedom because they were poor and were never truly represented by their members of parliament.
My uncle and many of my relatives died in that war and they weren't officers or NCOs; they were simple Tommies. They were like the hundreds of thousands of other boys who were sent to their slaughter by a government that didn't care to represent their citizens if they were working poor and under-educated. My family members took the king's shilling because they had little choice, whereas many others from similar economic backgrounds were strong-armed into enlisting by war propaganda or press-ganged into military service by their employers.
Smith is an interesting character in part because he only started writing seriously late in life, after a career in the Oriental carpet business.
Deanna Day, a grad student at my alma mater, wrote a nice little piece on "Harry Potter, Wizards, and How We Let Technology Create Who We Are." It gets seriously into the weeds of the Harry Potter universe, but it makes a serious point about how magic and technology can shape their users:
Muggles and wizards alike are mystified by the mechanisms of objects like iPads and Sorting Hats, and this ignorance can often, ironically, create a deep sense of trust in these objects. We create stories that explain their behavior, and when our tools work, it cements the validity of those stories. How else to explain their mechanism, be it magical or mechanical? But when we allow our technologies to remain opaque, we also prevent ourselves from seeing the crucial ways they make us who we are....
Most of the piece is about how wands work, who can use them, and their relationship to their users (e.g., the whole "wand chooses the wizard" thing). She concludes:
[T]he stories that wizards tell about their tools don’t match up with how they’re used in practice. The wand chooses the wizard, because that’s what wizards want to believe about their type of magic. In this story, wizards are special, and wands are objective proof. In another example, the Sorting Hat is believed to reveal one’s true identity, until an arguing student reveals that the Hat’s interpretation — and its social consequences — are much more negotiable than its song would imply.
In this (and many other) ways, the wizarding world exists in parallel with the muggle world.... By pointing out some of the ways that the technologies of the wizarding world are constructed — and the kinds of wizards they construct — we might also be better able to see the workings of our own muggle magic. As we go about our lives using our mysterious technologies, what kinds of people are we enabling them to make up?
Jessica Francis Kane, writing in The Atlantic, talks about a Marcus Aurelius quotation that she took to heart:
Book 8, #36
Do not disturb yourself by picturing your life as a whole; do not assemble in your mind the many and varied troubles which have come to you in the past and will come again in the future, but ask yourself with regard to every present difficulty: 'What is there in this that is unbearable and beyond endurance?' You would be ashamed to confess it! And then remind yourself that it is not the future or what has passed that afflicts you, but always the present, and the power of this is much diminished if you take it in isolation and call your mind to task if it thinks that it cannot stand up to it when taken on its own.
I thought about how Marcus Aurelius's concerns and mine differed, but I was inspired by the idea that the spirit of them, separated by so many centuries, was similar. His words helped me get to the desk, and stay there, during all the years it took me to write my first good story. Writing is hard, but is it unbearable? Who would say that it is? Even asking the question, I'm reminded of the one exclamation in the passage: "You would be ashamed to confess it!" His words helped me navigate rejection, which is certainly no fun, but if you ask yourself if it's unbearable, you find yourself preparing the next self-addressed stamped envelope pretty quickly. The words helped me survive the protracted sale of my first novel, and they reminded me to start writing again after a long hiatus after the birth of my first child. I wasn't sure how to make room for writing with a baby. It is difficult, but beyond endurance? I got myself back to the desk.
Personally, I think nothing prepared me for writing as well as studying the Victorians. Not because they invented the world as we know it (in many ways they did), or because their work was awesome (though it was), but rather because they got so much done. Tomes, multivolume histories, three-decker novels. Theories, scientific discoveries, expeditions, surveys. Buildings, massive urban redesigns, vast public buildings, and more than a few dark Satanic mills. New ways of seeing the world, of traveling it, of recording it.
And they still managed to take month-long vacations, or at least have high tea. The older I get, the more impressive that part is-- and, I begin to suspect, the more important it is for understanding why they were able to get so much done. It wasn't just the absence of television or Facebook. My intuition now is that they were productive because they had a better sense of when to quit for the day. They could be more productive because they were more measured.
Granted, I have absolutely no real evidence for this, and I'm sure it'll be years before I can really chase it down, but their lives were about as well-documented as you can get without FitBit and SenseCam, so I'll bet you really could study their work habits, how much time they spent at work and play, how they saw the differences between the two, and how it made them great.
Yesterday I found out that one of my mentors from college and graduate school, Henrika Kuklick, died.
Riki was one of the professors who got me hooked on the history of science, and along with Rob Kohler helped make me who I am. In the fall of my freshman year I had taken a seminar with Tom Hughes, mainly because it sounded interesting and he had a Ph.D. from the University of Virginia, and then in the spring had a class with Hughes and Rob, who would go on to be my undergraduate and graduate advisor. In my sophomore year I took Riki's sociology of science class, and from then on hardly a semester went by when I wasn't taking something with her.
Riki was a kind of intellectual performer I'd never encountered before. I never knew anyone who could keep track of so many thoughts: I marveled at how she could start a sentence, divert herself, then go off on something else, but then work her way back up and finish the sentence 20 minutes later. She had a kind of unreserved enthusiasm for life and ideas that really resonated with me; my decision to work on Victorian science was influenced in no small part by her description of living in England and working in the archives there. When I was a bit older and had more of a critical sensibility, I found her scholarship to be really outstanding, erudite without being purposely complicated: I taught her Great Zimbabwe Ruins article in several of my classes, and it always went over well.
She was also a great person and teacher, always supportive and generous, great at helping you think through arguments. Not the closest reader, though; lots of chapters came back with "Good work" scrawled at the end, and little more. (That's why you needed Rob Kohler on your committee. That man could line edit a diffraction grating.)
There are lots of people who can hardly remember classes from college, or the professors they had. Riki, in contrast, introduced me to a set of questions about the ways people, ideas, and technologies interact that I'm still dealing with. It's why I dedicated my first book to her and Rob. And I think I'll spend the rest of my life working on things that we talked about. Fortunately they're very big questions.
I find as I close in on 50, I don't particularly notice my age: I've had some grey hair since I was in graduate school (it'll do that to you), and aside from bifocals, I'm not in worse physical shape (though that's not the highest bar ever set), and more important, I'm a better writer and thinker than I've ever been in my life. But what I can't comprehend is other people getting older, too: my parents are in their 70s, which I find weird, and Riki was 70, which to me is inconceivable: my memory of her was fixed in the 1980s.
It's one of life's ironies that the gap a person leaves when they're gone is as large as the impact they made when they were alive. By that standard, Riki's passing leaves a very large gap indeed.
Hollee (really?) Actman Becker has a great, heartfelt and smart piece about Instagram beauty contests (as terrible as they sound, read about them yourself), and the responsibility parents have to help their kids use technologies in ways that are smart:
we are failing our children by not giving them the tools they need to properly navigate this scary new world, and by not monitoring their interactions in this world closely enough once we do….
Because the minute we give our kids an iphone or ipod or any other gadget that puts technology quite literally in the palms of their hands, we become responsible for whatever happens next….
We potty train our kids, teach them good table manners, spend 10 minutes deciphering the food label on a candy bar before we let them eat it. And yet, we set our kids up on social media, and then for all intents and purposes, we hang them out to dry.
Checking our kids’ news feeds to see what they are viewing, scrolling through their profiles to see what they’re posting, investigating the people who want to follow them, finding out who they’ve given their password to and monitoring all of their accounts (because most kids have more than one instagram account in case you didn’t know) doesn’t make us helicopter parents.
It makes us smart parents.
As the father of a girl who just turned 14 yesterday, I say: Read the whole thing.
This week I took my daughter and a friend of hers to the Maroon 5 concert. In the Virginia of my childhood, concerts were raucous and druggy affairs; you could get a contact high taking a deep breath at a Doobie Brothers show (before Michael McDonald joined, anyway). Here, things are different. When we reached the HP Pavilion, the entrance was cordoned off; instead of walking right in, we had to go through a long barricade that diverted us away from the door, only to dissolve back into chaos once we got up to the entrance.
"Typical crappy HP interface," someone behind me said.
Welcome to a rock concert in Silicon Valley, where you can overhear offhand UX slams.
For those of you who have never heard them: take the pop sensibility of any single-named star, sonic elements of disco, Motown, and 1980s pop, contagiously singable hooks and some wonderfully unexpected bridges and transitions, and you have Maroon 5. (Check out "Makes Me Wonder" as a perfectly-crafted song illustrating all of the above.)
As for the lyrics, well, they're not Shakespeare. Most of their songs are about sex, and the rest sound like it. Adam Levine, like Amy Winehouse, is one of those singers who could make "pass the salt please" sound like a double entendre; fortunately his lyrical sensibilities are much closer to Cole Porter than Dr. Dre, so I'm more sanguine about my kids listening to them.
Part of their charm is that even their songs about heartbreak aren't sad: the melodies and rhythms are as peppy as their songs about seduction and sex. If Taylor Swift sounds like she's always surprised and hurt when a relationship fails, Adam Levine treats it as just another phase; he's not at all cavalier or callous, but in songs like "Misery" and the fantastic "This Love" the downsides of relationships are-- not to be enjoyed, exactly, but as much a part of romance as the good parts. You can be as passionate about the "she's driving me crazy" or "I'm driving myself crazy because I drove her away by being so thoughtless" phase as the rest of it.
Maybe that's why a significant proportion of the crowd on Wednesday night was, as my daughter noted, recently-divorced middle-aged women: this is music that takes an easy attitude to the consequences of poor judgment, while also tapping into middle school memories of listening to their older sisters' copies of Off the Wall and Songs in the Key of Life.
I'm glad I got to take my daughter and her friend. It suggests that maybe, as they get older, it isn't inevitable that I become completely alienated from them.
And as you can tell from the pictures, the show was awesome. Just incredible. And the only thing smoking was the dry ice.
"The rich get rich and the poor get nothing
In the meantime
In between time
Ain't we got fun....
It's funny, or merely ironic, to hear Alma Cogan's "Ain't We Got Fun" while on hold with a credit card company.
Yesterday I bought a new camera, a Fujifilm X-E1. I've been coveting it since it was announced: it looks like the rangefinder cameras my dad had when we lived in Brazil in the late 1960s and early 1970s, the specs are fabulous, and the reviews have been pretty ecstatic. My wife and I went to the camera store, checked out a couple different models, and after some deliberation, we took the plunge.
We thought about a Nikon D7000, because we already have a D5000 and are quite happy with it. But while the D7000 gets great reviews, I felt that the X-E1 would be better for the kinds of professional uses I expect to put a camera to in the coming years-- lots of street photography and observations of people using devices-- and it'll be very easy to travel with. The D7000 is fabulous, and feels equally professional, but it's a much heavier camera, both physically and visually. This one will be less obtrusive.
Though I've had it for about 18 hours (8 of which I've been asleep), and have mainly taken pictures of the dog (who I don't photograph enough) and my son and his friend (who are having a sleepover), I think it's going to be a camera I can spend years working with.
As you can see, it's got a very retro, Leica rangefinder aesthetic, though it has an electronic viewfinder rather than an optical one (or the cool hybrid that the X-Pro1 has). Of course, you can set everything to adjust itself automatically; but exposure speed, aperture, and focus all have dedicated manual controls on the camera or lens, and the ISO can be accessed from the Fn button just beside the shutter button.
Dive into the options menus, and there are tons of other things you can adjust, custom profiles you can create (that'll be next on my to-do list), and special effects-- simulators that mimic the distinctive color profiles of different Fuji films, a couple black-and-white films, and so on.
The other two things about it that I think I'm going to love are that it's very light, and it's surprisingly small.
The pictures don't really give you a good sense of how small the camera is. The body is about a quarter inch longer than an iPhone, and perhaps a quarter inch taller, so it's Not Large At All. And the body weighs about 12 ounces (350 g), which is Really Light.
So while it's meant to be a two-handed camera, you can comfortably carry it in one hand.
My talks feature all my own pictures, so having a good camera is a professional necessity; it's an important part of the Brand of Me, and helps me get my ideas across to my audiences.
More than that, though, I feel like this is the kind of device I could spend a decade working with. These days, as specs constantly improve and costs drop, it's easy to convince yourself that the Next Cool Thing will make you a better photographer, or writer, or golfer, or guitarist. Of course, there is a marginal truth to that, but it's a lot more important to learn how to use a device to improve your own ability to see, or your voice.
That doesn't mean NOT taking advantage of technology. It means not relying on its improvement alone, and being thoughtful about how you can both exploit it and improve yourself. (There are things I've almost completely outsourced to devices. In the last ten years I've memorized the phone numbers of my wife and kids, but entrust all the others to my iPhone.)
There's one other calculation for me. As I get older and more reflective, I think less about how many more turns of Moore's Law I can consume, and how many cool devices I can acquire. The challenge isn't to get the Next Great Thing, but the Last Great Thing: as much as possible, to choose things that, whether I live another five years or another fifty, will last; serve me well; constantly give me pleasure; and help me consciously extend or augment my own abilities. This requires a level of thoughtfulness and self-understanding, and frankly a certain amount of money: a $1400 camera is a lot more likely to fall into this category than a $300 one.
So we'll see if I made the right choice.
Yesterday the family went to see the new film version of Les Miserables, with Hugh Jackman and Russell Crowe et al. I can see why critics dislike it, but I found it quite engaging. Yes, Russell Crowe approaches singing the way some Americans approach making themselves understood in a foreign country-- if the cabbie doesn't understand you, just yell your destination, and your words will be magically translated-- and the close-up style that worked so well in The King's Speech takes a little getting used to.
But the movie is every bit as manipulative and heart-tugging as the play, Tom Hooper can be sweeping and bold when the scene calls for it (some of his shots of Paris and Javert singing from rooftops reminded me of Gore Verbinski), Amanda Seyfried and Samantha Barks were terrific, and the film did the musical justice.
Just got these in the mail….
Very exciting, in the way that only a vanishingly small number of grinding, attention-demanding tasks can be.
The apocalypse is surely near when Ramzan Kadyrov emerges as the voice of reason. The ruthless leader of Chechnya is among dozens of Russians officials, priests, doctors and psychiatrists aiming to calm an anxious populace frantically preparing for the end of the world later this week.
"People are buying candles saying the end of the world is coming," Kadyrov said in comments published on his official website last week. "Does no one realise that once the end of the world comes, candles won't help them?"
For more than a month, Russians around the country have been buying up candles and matches, salt and torches in an effort to outsmart the apocalypse some believe will come when the Mayan calendar runs out on Friday.
I had two cats die this spring and summer, and after they were gone, I really had no interest in replacing them. They had been with me for seventeen years, since they were kittens, and I'd always thought of myself as a cat person; yet, with their passing, I felt like that part of my life was now done.
In contrast, a few days after Christopher died, after I'd cleared out his dog bed and packed away his food and toys-- indeed, the afternoon I got his ashes back from the crematorium-- I realized: I want another dog. After my wife and I talked it over, we agreed that it would be good to get another dog.
We decided to get a rescue, mainly because there are so many dogs in the Bay Area who need homes. Christopher, so far as anyone could guess, was a Carolina or American Dingo, and that's a pretty distinctive breed; you don't see many of them. There's a Carolina breeder here in California, and a couple places in the Rockies that specialize in Carolina rescues, but they're not a breed that shows up on Petfinder or the adoption Web sites; so I quickly gave up the idea of getting another one. (I also wasn't 100% sure getting the same breed was the smart thing for me.)
The Bay Area dog adoption market, it turns out, has a couple weird quirks. First of all, there are tons of chihuahuas and pit bulls, or mixes involving one of those breeds. Second, we import unwanted dogs, from as far away as Taiwan (which has several native breeds, but where it's very tough to survive as a stray). Apparently the Bay Area can't produce enough unwanted dogs, and has to import them. Who knew. I filled out a long form, had a phone interview, and got set up to see a couple dogs.
So on Sunday we went to a pet store deep in Campbell to meet two dogs: a five year-old border collie-husky mix, and a two year-old lab. The scene was crazy: a pen full of adorable puppies, crates with adult dogs in back, and people everywhere. If the wedding dresses in Filene's Basement could bark… you get the idea. We tried out both dogs, and were really split: collie-husky was great, calm, and watchful without being too eager, but she could jump tall fences. The two year-old was more compact and energetic, but also more kid-oriented, so naturally the children gravitated to him.
Eventually, we went with the two year-old, took care of the paperwork, bought an inordinately large amount of hardware, toys, etc., and brought him home. We renamed him Davis: my wife and I met there, and while he had been called Dallas by his foster family, he didn't recognize the name.
We're not really sure what breed he is, and probably never will be. I decided that he's a "labramuddle," because he's smaller than a traditional yellow lab and his face is a bit squarer, but friends suggest he could be an English Labrador.
It's been less than a week, and Davis is settling in nicely. He has a crate that he sleeps in at night, and we're still working on how to manage him during the day.
He's very much of the "I'll follow whichever human is doing something" model, but he's more into following the children than Christopher was: I think he likes my son's manic energy, and certainly enjoys the attention the kids lavish on him.
We've taken him to the dog park a couple times, and fortunately he enjoys spending time there.
In the last couple days I've discovered that he's an absolute fiend for chasing balls, which is hardly surprising in a dog that's bred to be a retriever. For me, though, knowing as little about dogs as I do, everything is still a revelation. (It also means exploring the world of dog toys.)
However, for all his crazed energy, he's also good at just hanging out under the desk while I work.
We're quite happy with him, but frankly, we got lucky. Choosing a dog after less than an hour, in a crowded, exciting, and slightly frantic environment, hardly guarantees good results. If I had to do it again, I'd go to one of these adoption fairs first, with absolutely no plans to get a dog, so I wouldn't be overwhelmed by the energy and emotionalism of the event; then I'd go back a second time, and start looking at the dogs.
After all, a dog could be with you for years (if he's a lab, Davis should live 10-12 years), and while we made a great choice, I've spent more time researching which movie to go to on the weekend.
But we've got him now, and he's been great.
A couple weeks ago Christopher, the dog we inherited in January, died. My wife took him for a morning walk, he went to sleep in the backyard, and never woke up.
He was 14, so we knew when we took him in that he was more or less on loan. Still, it was a shock, even if it wasn't really a surprise.
I hadn't lived with a dog since I was a kid, and when we took him in I didn't really know what to expect. But he proved to be very smart, and great at communicating his needs. I quickly realized that if I just paid attention to what he was doing, I could decode what he wanted-- though sometimes it was especially easy.
It was also instructive living with a creature who really just wanted to belong, to be part of the family, and was happy so long as he could be with us. As someone who lives among highly analytical, calculating people, I'm constantly trying to figure out what clients want, what readers want, what funders want to hear, etc. Being with someone whose mental model of himself and others was straightforward and guileless was a welcome change.
At the same time, it was also cool that he was a dog, and did dog things. While he took pleasure in being with us, he also enjoyed having his own, very different, incredibly physical life, one where smells and dirt were really fascinating.
His life intersected with ours; it didn't overlap completely. I found that cool.
We went to the dog park pretty regularly, and he had several friends there, including one dog he would follow around and just drool on. They were both quite elderly, so it was a charming sight.
He also made me a lot more familiar with my neighborhood. Taking him on walks twice a day meant I developed an intimate sense of my surroundings, albeit from a somewhat canine point of view. (I never knew things smelled so interesting around here.)
A friend-- one of the many Peninsula people who had contact with him over the years-- said that he was such a good dog he was sure to come back as a human. I'm not so sure he needs to; if it's possible for a dog to achieve canine nirvana, I think Christopher managed it.
I was in Seattle this weekend at the POD Network conference, a conference of academic technology and professional development types.
I've not been in Seattle in a while, so it was cool to be there. And the crowd at the conference was terrific: very technically savvy, so they knew what I was talking about, but also very engaged, and full of interesting questions. Especially impressive for a crowd that had already been at the conference for three days and hadn't yet had lunch.
It was the first time I'd given a big talk since finishing the book, and it was good to see that it seems to hold up in public.
After my talk I spent the afternoon on the monorail (how often as a futurist do you get to ride on an artifact from the future?) and visiting the Experience Music Project and Seattle Public Library, two of the cooler pieces of architecture... well, anywhere in the world.
The Experience Music Project is said to look like a melted Jimi Hendrix guitar from above; that could well be urban legend, but what I do know is that it's really cool on the ground.
Here's the cover for the contemplative computing book:
Little, Brown spent a lot of time on it, and I think they've managed to communicate a lot in a very small, challenging medium. They were also really good about explaining the design choices, making clear what they thought worked, and accommodating those changes I thought would improve it (or explaining why they would be hard to implement).
So the machine chugs along, and we get one step closer to having a finished book on the shelves!
The one problem with writing a book for users, taking a Buddhist-inflected approach to information technologies that emphasizes how people can take back control of their minds, is that I'm less likely to get onto this kind of gravy train:
Ferguson's critics have simply misunderstood for whom Ferguson was writing that piece. They imagine that he is working as a professor or as a journalist, and that his standards slipped below those of academia or the media. Neither is right. Look at his speaking agent's Web site. The fee: 50 to 75 grand per appearance. That number means that the entire economics of Ferguson's writing career, and many other writing careers, has been permanently altered. Nonfiction writers can and do make vastly more, and more easily, than they could ever make any other way, including by writing bestselling books or being a Harvard professor. Articles and ideas are only as good as the fees you can get for talking about them. They are merely billboards for the messengers.
That number means that Ferguson doesn't have to please his publishers; he doesn't have to please his editors; he sure as hell doesn't have to please scholars. He has to please corporations and high-net-worth individuals, the people who can pay 50 to 75K to hear him talk. That incredibly sloppy article was a way of communicating to them: I am one of you. I can give a great rousing talk about Obama's failures at any event you want to have me at.
What's so worrying about this trend is that Niall Ferguson, once upon a time, was the best. I'm one of the few people who has actually read his history of the Rothschilds, The World's Banker, all 1,040 pages of the thing, and it is brilliant, a model of archival research. I find it fantastically depressing that the man who could write that book could end up writing a book like Civilization or an article with just as much naked silliness as the Newsweek cover.
I feel very much the same way about Victor Davis Hanson, a man whose military history is really absolutely first-rate, whose The Other Greeks fairly exploded with insight into Greek society and philosophy, but who's been mailing in sloppy, thoughtless pieces ever since he left the farm for The Farm. Sad.
We know too much and feel too little. At least we feel too little of those creative emotions from which a good life springs. In regard to what is important we are passive; where we are active it is over trivialities.
Evgeny Morozov's review of several new TED books-- pamphlets, really-- is one of the greatest things I've read in a long time. You know you're in for a wild ride when the opening paragraph starts like this--
Only the rare reader would finish this piece of digito-futuristic nonsense unconvinced that technology is—to borrow a term of art from the philosopher Harry Frankfurt—bullshit. No, not technology itself; just much of today’s discourse about technology, of which this little e-book is a succinct and mind-numbing example.
--and then gets vicious.
Most of the review focuses on Parag and Ayesha Khanna's ebook Hybrid Reality. Apparently the Khannas accidentally once ran over Morozov's dog in their Range Rover, and didn't stop because they were too busy dishing dirt to News of the World about Morozov's mother. Or so I gather, because nothing less would explain the review.
Remember the creatures in Aliens who bleed concentrated acid? That's what comes to mind when you read this.
[A]ll the features that the Khannas invoke to emphasize the uniqueness of our era have long been claimed by other commentators for their own unique eras.... What the Khannas’ project illustrates so well is that the defining feature of today’s techno-aggrandizing is its utter ignorance of all the techno-aggrandizing that has come before it. The fantasy of technology as an autonomous force is a century-old delusion that no serious contemporary theorist of technology would defend.
What's it say about TED? Nothing good, I'm afraid:
I spoke at a TED Global Conference in Oxford in 2009, and I admit that my appearance there certainly helped to expose my argument to a much wider audience, for which I remain grateful. So I take no pleasure in declaring what has been obvious for some time: that TED is no longer a responsible curator of ideas “worth spreading.” Instead it has become something ludicrous, and a little sinister.
Though I have to confess that it felt like he was getting dangerously close to describing some of the work I've done with this paragraph:
[O]ne can continue fooling the public with slick ahistorical jeremiads on geopolitics by serving them with the coarse but tasty sauce that is the Cyber-Whig theory of history. The recipe is simple. Find some peculiar global trend—the more arcane, the better. Draw a straight line connecting it to the world of apps, electric cars, and Bay Area venture capital. Mention robots, Japan, and cyberwar. Use shiny slides that contain incomprehensible but impressive maps and visualizations. Stir well. Serve on multiple platforms.
And the bit about how the Khannas and the Tofflers are both "fast-talking tech-addled couple[s] who thrived on selling cookie-cutter visions of the future one paperback, slogan, and consulting gig at a time" sounds like kind of a good gig. If you can do it in a more intellectually responsible way, of course.
An article about my friend Jim Fadiman and his LSD research includes this awesome bit about his last experiment in the 1960s (conducted on a group of "an architect and three senior scientists—two from Stanford, the other from Hewlett-Packard" who had "each brought along three highly technical problems from their respective fields that they’d been unable to solve for at least several months") before the government shut down all LSD work:
LSD absolutely had helped them solve their complex, seemingly intractable problems. And the establishment agreed. The 26 men unleashed a slew of widely embraced innovations shortly after their LSD experiences, including a mathematical theorem for NOR gate circuits, a conceptual model of a photon, a linear electron accelerator beam-steering device, a new design for the vibratory microtome, a technical improvement of the magnetic tape recorder, blueprints for a private residency and an arts-and-crafts shopping plaza, and a space probe experiment designed to measure solar properties.
Ah yes, those crazy druggies.
Though the one time Herman Kahn took psychedelics he supposedly spent two hours saying "Oh, wow," and claimed afterwards to have come up with a new system for prioritizing nuclear targets. I never believed that story. Maybe I should.
George Monbiot calls publishers like Elsevier and Springer "the most ruthless capitalists in the Western world":
What we see here is pure rentier capitalism: monopolising a public resource then charging exorbitant fees to use it. Another term for it is economic parasitism. To obtain the knowledge for which we have already paid, we must surrender our feu to the lairds of learning.
Open-access publishing, despite its promise, and some excellent resources such as the Public Library of Science and the physics database arxiv.org, has failed to displace the monopolists…. The reason is that the big publishers have rounded up the journals with the highest academic impact factors, in which publication is essential for researchers trying to secure grants and advance their careers. You can start reading open-access journals, but you can’t stop reading the closed ones.
Interesting article in Jezebel/io9 (choose your Gawker media outlet) about autism and its growing influence in high-tech culture:
autism has played a significant role in crafting much of what we consider to be modern culture — from the music and books we read, to the technological devices we all take for granted. The acceptance of radically different ways of thinking, it turns out, can be seen as an integral part of a rich and diverse overarching culture….
The signs of autism's reach are beginning to be seen virtually everywhere. People on the spectrum are driving the creation of alternative forms of expression, new businesses and institutions, and cutting-edge technologies. "And not only do they make these things comfortable for themselves," noted [Wired author Steve] Silberman, "they're useful for all of us."
Whole piece is worth reading.
This week I read John Lanchester's new novel Capital, about life in London during the great financial collapse of 2008.
I thought it was a great read, though not because of its great pacing or high drama or characters you're cheering for. It's more like an Impressionist crowd painting, a set of brilliantly-rendered scenes and personalities and moments, not a story that drives to a decisive conclusion. About 200 pages into it, I started thinking, This is great, but with all this buildup, it had better end with Queen Elizabeth on a velociraptor, on top of Big Ben, striking down zombies with nunchuks.
Not to spoil it, but no Queen Elizabeth, no zombies, no velociraptor. (Though one of the characters does like dinosaurs.)
Still, if you want a book that paints a picture of one of the world's great cities sans velociraptors-- and especially if you've spent time there, and perhaps intersected very peripherally with the sorts of characters that populate the book-- Capital is terrific.
I sent off the revised draft of my book last Friday, and celebrated this weekend by watching the end of the Tour de France.
the book is back, via flickr
It was great to see an Englishman win the Tour (Britain's investment in cycling is paying off, as John Kay notes), and it was also cool to see someone win who was so clear about how much his victory was a team achievement. Yes, Wiggins gets to wear the yellow jersey, but as he himself acknowledges, he stands on the shoulders of his teammates.
I found myself juxtaposing this with Penelope Trunk's recent essay about self-publishing her book. The piece, a long post on her Brazen Careerist blog, argues that traditional publishers don't know anything about their markets, that they take too long to get stuff out, and that you're better off doing it yourself. It was really striking to me because, both in scope and substance, it's so different from my recent (or current) experience.
home office, california style, via flickr
First of all, Trunk's account of the publishing industry is all about production and distribution; the work of shaping and editing books is invisible. To me, though, this is about 90% of the value that the publishing industry offers. Fourteen months ago, give or take, I had a very, very different idea for a book about contemplative computing. That book might have fit well with an academic press, but it wasn't the book I really wanted to write. I was lucky to have an agent who pushed me to think more commercially without giving up my intellectual bona fides or the ambition of explaining to ordinary users how our deep entanglement with technology shapes us. I was also really lucky, once I'd produced a manuscript, to have an editor who could work with me to tune it up, and who insisted (in that totally self-effacing way most book editors have) on making it more accessible and useful.
Another important way in which our experiences contrast is that Trunk describes books as calling-cards, as a way of introducing to the public who you are and what services you have to offer. Now, this is totally in keeping with the Tom Peters "Brand of Me" way of seeing the world, and I had professors at Wharton who talked about how their books were really just ways of attracting clients, so clearly there are authors who either genuinely feel that a book can play this role, or see reasons to talk about it this way. For me, though, writing this book has been pretty transformative, and I have a hard time imagining starting something this hard with the assumption that there won't be a big personal payout at the end.
it's about ME! via flickr
I'm probably going to experiment with some digital self-publishing in the coming year, though I wouldn't call what I'm going to create electronic books-- more like electronic pamphleteering, or digital broadsheeting. A "book" feels like a different proposition than a highly illustrated, expanded version of a talk. Indeed, it's not just a different proposition, but a promise to readers that the object they're getting has been through a more rigorous kind of review and publishing process.
bytes, via flickr
Indeed, the only way I would self-publish a "book" would be if I could hire editorial talent as strong as Zoë and John, and I'm not sure I'd want to take on the risk of investing that much in a book. It's possible that I could find equivalent talent in the freelance editorial market, but I quite like the idea that lots of other people at Little, Brown share the risk with me, and have an incentive to help the book be a success.
Just as important, I don't want my relationship with an editor to become more transactional. As John Kay recently pointed out, the financial services industry worked best for investors and companies when it was more trust-based; in today's world of super-fast transactions and massive bets, there's less interest in building trust, because you tend to assume that you'll be rich and retired within a couple years. I don't need intellectual relationships that are more transactional. Indeed, I think those two things are polar opposites. Frictionless, transactional relationships are mindless (in Ellen Langer's use of the term), and can just as easily succeed as win-lose games; meaningful relationships involve trust and struggle, and only succeed when both parties succeed.
stay, via flickr
I see tremendous benefit in having a team of people who are invested in your victory, like Team Sky was invested in Wiggins' taking home the yellow jersey. If all you're doing is a straight-on transaction, something you know how to do and really can do on your own, then maybe the self-publishing model works; but the way I write books requires a team.
John Kay's latest essay on the current state of the financial sector, published on the heels of a report he just released for the British government on the state of financial services, is well worth reading:
In the equity investment chain, asset holders and asset managers need to be trusted stewards of savers’ money. Company directors need to be trusted stewards of the assets and activities of the corporations they manage. In the absence of such trust, intermediaries become no more than toll collectors.
It is hard to see how trust can be sustained in an environment characterised by increasingly hyperactive trading, and it has not been. Trust is essentially personal and cannot easily be found in a dark pool. Impersonal trust can be established only in a rigidly disciplined organisation – the kind that retail banks were once but are no longer – or by regulation of a ferocity that has not been achieved and is probably not achievable.
He also has this great observation of the ways analysts and regulators are naturally captured by complicated industries that rely on
behavioural regulation, designed to combat inappropriate incentives by detailed prescriptive rules. The outcome is regulation that is at once extensive and intrusive, yet ineffective and largely captured by financial sector interests.
Such capture is sometimes crudely corrupt, as in the US where politics is in thrall to Wall Street money. The European position is better described as intellectual capture. Regulators come to see the industry through the eyes of market participants rather than the end users they exist to serve, because market participants are the only source of the detailed information and expertise this type of regulation requires. This complexity has created a financial regulation industry – an army of compliance officers, regulators, consultants and advisers – with a vested interest in the regulation industry’s expansion.
I think you can see variations on this in all kinds of policy worlds (foreign and military policy especially), and in technology and futures research. Futurists don't regulate the future in any meaningful way, but they and industry analysts do have a close relationship with their objects of study and clients, and it's "natural" that a kind of regulatory capture occurs in these relationships.
I can only hope that he's correct that more people now recognize that "the sector's problems are not the byproduct of unpredictable events but arise from a wrong turning in the culture of an industry that has come to prioritise transactions and trading over trust relationships."
Man, Charles Pierce knows how to write.
I remain convinced that American conservative thought is now not a philosophy but, rather, a book of spells, a series of conjuring words that have meaning only to the initiates.
"Troubled" companies have a particular meaning on Wall Street. Sure, sometimes they refer to companies that are just muddled, have over-expanded, and are badly managed. But more often, what they are talking about is companies that do not seem to be providing a large enough return to shareholders—a stagnating stock price in particular. But that does not mean a company is "troubled." It can be quite profitable, have productive and loyal employees, have satisfied customers, and cash on hand.
What players like Bain do is enforce a Wall Street preference. There is a bias against companies that seek a "quiet life." They are shunned by institutional investors, which depresses stock prices and makes these companies "troubled" in the first place. It isn't that they are not profitable, but rather that institutional investors don't like them, and as a result they trade at dramatically lower P/E ratios. Indeed, it isn't even clear that takeover targets do have weaker stock performance if you look at total returns, including dividends.
Once a company goes public, it is essentially subject to "disciplinary" takeovers if it fails to act in accordance with financial sector preferences. This is often phrased as "poorly performing managers," but what does that really mean? That is really just about enforcing a certain conventional wisdom about what a company ought to do. But these preferences are socially problematic. Consider some of the things that seem to contribute to being a takeover target: slow growth, stable revenues, cash on hand rather than debt, generous employee compensation, conservatively-funded pension or insurance plans.
Dylan Matthews has a short post on equality of outcomes versus equality of opportunity. Political rhetoric claims that you can choose one or the other. In the real world, though, it turns out that
the distinction between equality of opportunity (usually phrased in terms of upward income mobility) and equality of outcomes (the raw distribution of income or wealth in an economy) is not as big as it sometimes appears. More specifically, countries with high inequality of outcomes (as measured by the Gini index of economic inequality) tend to have low social mobility (as measured by the association between parents’ and childrens’ incomes) as well....
The distinction between equality of outcomes and opportunity has some theoretical appeal, but in practice, you get both or neither.
This is why I read John Kay:
In the 20th century political frontiers became a central influence on economic life. Old Kaspar’s work presumably consisted of providing food, fuel and shelter for his family. But with complex products, varied consumer tastes and low degrees of personal sufficiency, resource allocation became less of an individual enterprise, more one of the social and political environment.
That observation is evident on the Finnish-Russian border. The razor wire kept Russian citizens in when the living standards of planned societies and market economies diverged. But now the border is easy to cross and the gap in per capita income has narrowed, though not by much. The very different income distributions of egalitarian Finland and inegalitarian Russia can be seen in the car parks and designer shops of Lappeenranta.
In the Soviet era, Finland produced Marimekko; Russia made no clothes any fashion-conscious woman would want to buy. Post-Communist but still autocratic Russia made surveillance equipment; democratic Finland led the world in mobile phones. Today Russia’s geeks hack into your bank account, while those of Finland develop Angry Birds.
Michael Lewis' Princeton commencement address is terrific. After the obligatory opening joke ("Members of the Princeton Class of 2012. Give yourself a round of applause. The next time you look around a church and see everyone dressed in black it’ll be awkward to cheer. Enjoy the moment"), he talks about writing Liar's Poker and the role of luck in making that book possible:
I was 28 years old. I had a career, a little fame, a small fortune and a new life narrative. All of a sudden people were telling me I was born to be a writer. This was absurd. Even I could see there was another, truer narrative, with luck as its theme. What were the odds of being seated at that dinner next to that Salomon Brothers lady? Of landing inside the best Wall Street firm from which to write the story of an age? Of landing in the seat with the best view of the business? Of having parents who didn’t disinherit me but instead sighed and said “do it if you must?” Of having had that sense of must kindled inside me by a professor of art history at Princeton? Of having been let into Princeton in the first place?
This isn’t just false humility. It’s false humility with a point. My case illustrates how success is always rationalized. People really don’t like to hear success explained away as luck — especially successful people. As they age, and succeed, people feel their success was somehow inevitable. They don’t want to acknowledge the role played by accident in their lives. There is a reason for this: the world does not want to acknowledge it either.
Read the whole thing. It's worth it.
My daughter left this morning on a week-long camping trip with her class. Camping is a big thing at Peninsula. The youngest elementary school classes start with overnight stays in their classrooms, and by 8th grade the students are planning a couple weeks' worth of trips.
With twenty kids and about five teachers, there's a lot of gear.
Camping has been a big part of the school experience for years, and alumni talk about it as one of the highlights of their time here.
This year they're going up to some park in the far north of the state. So in addition to all the usual stuff, they filled a trailer with firewood, and made up a convoy of four or five cars, vans, and trucks. It was hard to keep track.
I took Christopher with me to school, and turned him loose. He loved being able to run around off-leash for once.
Though I think he was a little disappointed when he wasn't able to go with the kids. I'm sure he would have loved it.
Derek Thompson has a short interview with Steve Blank about Facebook's IPO and what it means for Silicon Valley:
Facebook's success has the unintended consequence of leading to the demise of Silicon Valley as a place where investors take big risks on advanced science and tech that helps the world. The golden age of Silicon valley is over and we're dancing on its grave. On the other hand, Facebook is a great company. I feel bittersweet.
Why is that?
Silicon Valley historically would invest in science, and technology, and, you know, actual silicon. If you were a good VC you could make $100 million. Now there's a new pattern created by two big ideas. First, for the first time ever, you have computer devices, mobile and tablet especially, in the hands of billions of people. Second is that we are moving all the social needs that we used to do face-to-face, and we're doing them on a computer.
And this trend has just begun.
In other words, opportunities to make lots of money quickly on bubblicious things tend to draw attention away from hard things that offer more enduring value. Makes perfect sense. Blank also makes this interesting observation:
I see my students trying to commercialize really hard stuff. But the VCs are only going to be interested in chasing the billions on their smart phones. Thank God we have small business research grants from the federal government, otherwise the Chinese would just grab them….
The four most interesting projects in the last five years are Tesla, SpaceX, Google Driving, and Google Goggles. That is one individual, Elon Musk, and one company, Google, doing all four things that are truly Silicon Valley-class disruptive…. Thank God for federal government grants, and the NIH, and Musk, and Google.
I think TED talks are the worst example of modern faux-intellectualism. Audience flattering, based on ego and personality, dripping with self-congratulation, they contribute to one of the great lies of our time, which is that the truth is entertaining and can be contained in bite-sized, ready-for-television aphorisms. The reality is that progress is hard, that knowledge making is a long and dispiriting slog, and that when ideas and solutions appear pat, cute, easy, or triumphant, they’re almost certainly wrong.
Mainly this is an excuse to trot out my favorite Bart Simpson quote: To those who say there are no easy answers I say you're not looking hard enough!
As someone who's given TEDx talks, yet is occasionally put off by just how much buzz these talks generate (or really, how ready some speakers are to point to hit counts as proof that They Are Taken Seriously), I can understand the criticism.
Yet there can be value in struggling to take a complex project and at the very least, show people enough of it to make them think that it would be worth investing their time and attention to see the whole thing. Good TED talks aren't like music videos; they're like movie previews.
Inside Higher Ed has yet another in the never-ending series of "rethinking the humanities Ph.D." articles. But for once, it's not just about "rethinking" (which too often is regarded as an end in itself), but about actually changing it: a proposal at Stanford
where students decide on a career plan -- academic or nonacademic -- they want to embark on by the end of their second-year of graduate study, file the plan with their department, and then prepare projects and dissertation work that would support that career…. This would represent a dramatic shift from the current norm, whereby many humanities grad students say that their entire program is designed for an academic career, and that they only start to consider other options when they are going on the job market -- a bit late to shape their preparation for nonacademic options.
You like Tastykake Butterscotch Krimpets? They're makin' you dumb. You like Gushers, red and blue and all flavors, including "mystery"? They're makin' you dumb. You like Arizona Green Tea? It's makin' you dumb.
At least, that seems to be the case for rats, the humans of the rat world.
Working on job applications and book proposals this morning, I set Olafur Arnalds' ...And They Have Escaped the Weight of Darkness on repeat, and have been plowing away. It's really great music, not quite as dense as his fellow Icelandic composer Johann Johannsson, but still very good-- simpler and more romantic.
About three months ago, we took in a new member of the household: a 13 year-old dog named Christopher. A friend of ours just turned 90 and is moving, and couldn't take him with her; my son knew Christopher for a couple years, so we agreed to try him out.
I haven't owned a dog in ages, and so I had no real idea what I was getting into. But with two cats (at the time), and birds who've established nests all around the house, I was feeling like, what's one more animal?
He was, of course, somewhat guarded at first, and had some health issues, but over time has become more comfortable, both socially and physically.
One thing that concerned me was that I'd have to drive everywhere with him, as he's too old, and I'm too smart, to have him run beside me on the leash while I bike. So I found an old Burley trailer, ripped out the seats, and put in a dog bed. With enough dog treats he'll hop right in, and now happily rides.
Most days we go to Peninsula in the morning, and he has a circuit he likes to make, visiting different classrooms and saying hi to different kids, and to a cage of guinea pigs. I can't tell if he thinks they're friends or food.
Some of the kids knew him from his previous life; like my son, they had tutored with his old owner, who herself was a teacher at Peninsula for a long time. So he's quite the celebrity at school. And he's made a couple of canine friends, too.
Being thirteen, he has a variety of chronic ailments, and so he takes as many allergy pills and vitamins as I do. But peanut butter seems able to disguise just about everything.
And despite his age, or perhaps because of it, he's quite cheerful, yet generally pretty mellow. He sleeps like a rock, and like Marlowe has a genius for finding strategically inconvenient places to bed down: getting to the coffee maker in the morning is now like the maze scene in Raiders of the Lost Ark.
Indeed, his example, along with my dad's heading off to Singapore for two years after his retirement, has started me rethinking the nature of aging. To the degree I've thought about it at all, I've tended to assume that getting older is mainly about declining faculties, managing chronic health problems, and fighting social irrelevance-- telling kids to get the hell off your lawn, but not knowing which kids they are because you don't have your glasses on.
But maybe there's more to the story. Maybe the other stuff is epiphenomenal, the friction or froth that comes with every stage of life.
I've been thinking about this particularly in relation to technology use. There's a tendency to think of elders as 1) incapable of understanding computers, 2) a set of engineering challenges (decreased mobility, reduced short-term memory) that need to be solved using technical means, and 3) something a bit less than free agents. But Steve Jobs was something like 18 months away from being eligible for Medicaid when he died; was he too old to "get" Apple's products? Do the guys (and they're largely guys) who built Silicon Valley in the 1960s and 1970s, who spent their careers in the computer industry and now are retired, somehow lose the ability to think about technical stuff when they get their gold watches?
I suspect that, for important segments of the population at least, the conventional narrative about computers and aging is completely wrong. That there are things we can learn from elders about technology choice and use-- about how much to let devices into our lives, about how to use them, about what things really matter. Sure, there are things people my age can do to help our elderly parents make sense of technology; but there are things we can learn from them, too.
I love Felix Salmon's work. He's one of the best financial reporters in the business, someone who's got tons of insight and an ability to explain complicated, obscure but important things to a general audience.
So I was pained to see him get a small but significant detail wrong in his piece about Marc Andreessen. Among Andreessen's achievements, Felix writes, is that "he's dragging Silicon Valley into the world of philanthropy, where it’s historically been very weak."
Umm. No. Absolutely wrong.
Bill Hewlett and Dave Packard, and their families, have been philanthropists for decades. The value of their gifts to Stanford University exceeds the Stanford family’s original endowment (or so I was told by some development folks there). The Lucile Packard Children’s Hospital (where both my children were born), the open space trusts that have kept a significant part of Silicon Valley from turning into places to park Range Rovers in front of McMansions… not to mention a variety of locally-famous schools, charitable foundations, etc. etc. ad infinitum-- have all benefitted from the work of Hewlett, Packard, the Varian family, and many others.
Too elitist? Fundraisers for schools sound too self-serving? Maybe Santa Clara U.’s social innovation prizes, and its goal of improving the lives of a billion poor by 2020, are a bit more to one’s liking.
Indeed, you might make the case that Marc learned about the value of philanthropy from his spouse, whose family's real estate business shaped Silicon Valley, and whose family foundation has shaped it in other ways.
I think the idea that the Valley isn't interested in philanthropy comes from extrapolating the example of Steve Jobs, who famously was uninterested in it. However, what you have to realize is how much Steve was the exception to the rule; indeed, you'd have to be someone of Steve Jobs' caliber (in other words, only Steve Jobs) to get away with it.
So Marc’s not inventing a new tradition. If anything, he’s doing a great job of showing how new money legitimates itself by imitating old money.
And yes, here in the Valley money made selling klystrons and calculators-- anything before about 1990-- is Old Money. Those dollars might as well have been printed by Rembrandt.
Charles Pierce's review of Ross Douthat's Bad Religion (shorter version: the Sixties sucked) is a master class in how to take apart a book in a manner that respects the subject, but gives the author the flogging they deserve. This may be my favorite part:
[N]owhere does Douthat so clearly punch above his weight class as when he decides to correct the damage he sees as having been done by the historical Jesus movement, the work of Elaine Pagels and Bart Ehrman and, ultimately, Dan Brown's novels. Even speaking through Mark Lilla, it takes no little chutzpah for a New York Times op-ed golden child to imply that someone of Pagels's obvious accomplishments is a "half-educated evangelical guru." Simply put, Elaine Pagels has forgotten more about the events surrounding the founding of Christianity, including the spectacular multiplicity of sects that exploded in the deserts of the Middle East at the same time, than Ross Douthat will ever know, and to lump her work in with the popular fiction of The Da Vinci Code is to attempt to blame Galileo for Lost in Space.
Fantastic. As good as Adam Gopnik's epic takedown of The Matrix, Reloaded. It's made more impressive by the fact that you get the sense that Pierce really knows what he's talking about. Here are two very different lines that each in their way are quite illuminating:
He describes the eventual calcification of the sprawling Jesus movement into the Nicene Creed as "an intellectual effort that spanned generations" without even taking into account the political and imperial imperatives that drove the process of defining Christian doctrine in such a way as to not disturb the shaky remnants of the Roman empire. The First Council of Nicaea, after all, was called by the Emperor Constantine, not by the bishops of the Church. Constantine — whose adoption of the Christianity that Douthat so celebrates would later be condemned by James Madison as the worst thing that ever happened to both religion and government — demanded religious peace. The council did its damndest to give it to him. The Holy Spirit works in mysterious ways, but Constantine was a doozy. Douthat is perfectly willing to agree that early Christianity was a series of boisterous theological arguments as long as you're willing to believe that he and St. Paul won them all....
[Douthat is] yearning for a Catholic Christianity triumphant, the one that existed long before he was born, the Catholicism of meatless Fridays, one parish, and no singing with the Methodists. I lived those days, Ross. That wasn't religion. It was ward-heeling with incense.
This 1968 profile of The Band is great. The last two lines of this Robbie Robertson quote really speak to me.
"We were so exhausted that everybody said this was a time of rest. When we went up to Woodstock, we stopped listening to music for a year. We didn't listen to anything but what you didn't have to listen to, like opera. That's why we couldn't play things like the Monterey Pop festival. We weren't – and we aren't – looking for blood any longer. We're just looking for music."...
The Band sings in the rough-hewn harmonies of honest mountain air. The music from Big Pink has the taste of Red River cereal. It has the consistency of King Biscuit flour. It rings with the now-ancient echo of John R, broadcasting from Nashville over Radio Station WLAC, 1510 on the dial, its signal faintly received but eagerly listened to by an audience that took root in Stratford, Ontario, and Elaine, Arkansas, all with the same passion.... If it sounds traditional, the reason is that it has nothing to do with fads. If it sounds gritty, the reason is that it's full of road dust. If it sounds real, the reason is that it is....
It is music which comes from a band that has nothing but music to offer. The Band doesn't even have a name.
Thomas Frank has a terrific essay in The Baffler about the failure of experts in the dot-com bubble, the war in Iraq, and the housing and credit crisis, and about
our failure, after each of these disasters, to come to terms with how we were played. Each separate catastrophe should have been followed by a wave of apologies and resignations; taken together—and given that a good percentage of the pundit corps signed on to two or even three of these idiotic storylines—they mandated mass firings in the newsrooms and op-ed pages of the nation. Quicker than you could say “Ahmed Chalabi,” an entire generation of newsroom fools should have lost their jobs….
What I didn’t understand was that these were moral failures, mistakes that were hardwired into the belief systems of the organizations and professions and social classes in question. As such they were mistakes that—from the point of view of those organizations or professions or classes—shed no discredit on the individual chowderheads who made them. Holding them accountable was out of the question, and it remains off the table today. These people ignored every flashing red signal, refused to listen to the whistleblowers, blew off the obvious screaming indicators that something was going wrong in the boardrooms of the nation, even talked us into an unnecessary war, for chrissake, and the bailout apparatus still stands ready should they fuck things up again.
I'm afraid any book about the future and prediction has to take into account the transformation, on a large and very public scale, of being wrong into a badge of honor, and the world-view that has been created around it.
Michael Lewis interviews himself about the Occupy movement.
The chief cause of the financial crisis was what the government didn’t do (regulate) rather than what it did (subsidize homeownership), and so it seemed strange to me that, until now, the most potent political reaction to the financial crisis has been an antigovernment backlash. It was as if, after some infectious disease killed a million people, the only political reaction was a popular uprising to prevent the manufacture of antibiotics.
The man is a genius.
I write about people, technology, and the worlds they make.
I'm a senior consultant at Strategic Business Insights, a Menlo Park, CA consulting and research firm. I'm also a visitor at the Peace Innovation Lab at Stanford University. (I also have profiles on LinkedIn, Google Scholar, and Academia.edu.)
I began thinking seriously about contemplative computing in the winter of 2011 while a Visiting Researcher in the Socio-Digital Systems Group at Microsoft Research, Cambridge. I wanted to figure out how to design information technologies and user experiences that promote concentration and deep focused thinking, rather than distract you, fracture your attention, and make you feel dumb. You can read about it on my Contemplative Computing Blog.
My book on contemplative computing, The Distraction Addiction, was published by Little, Brown and Company in 2013.
The Distraction Addiction
My latest book, and the first book from the contemplative computing project. The Distraction Addiction is published by Little, Brown and Co. It's been widely reviewed and garnered lots of good press. You can find your own copy at your local bookstore, or order it through Barnes & Noble, Amazon (check B&N first, as it's usually cheaper there), or IndieBound.
The Spanish edition
The Dutch edition
Empire and the Sun
My first book, Empire and the Sun: Victorian Solar Eclipse Expeditions, was published by Stanford University Press in 2002 (order via Amazon).