Just got these in the mail….
Very exciting, in the way that only a vanishingly small number of grinding, attention-demanding tasks can be.
The one problem with writing a book for users, taking a Buddhist-inflected approach to information technologies that emphasizes how people can take back control of their minds, is that I'm less likely to get onto this kind of gravy train:
Ferguson's critics have simply misunderstood for whom Ferguson was writing that piece. They imagine that he is working as a professor or as a journalist, and that his standards slipped below those of academia or the media. Neither is right. Look at his speaking agent's Web site. The fee: 50 to 75 grand per appearance. That number means that the entire economics of Ferguson's writing career, and many other writing careers, has been permanently altered. Nonfiction writers can and do make vastly more, and more easily, than they could ever make any other way, including by writing bestselling books or being a Harvard professor. Articles and ideas are only as good as the fees you can get for talking about them. They are merely billboards for the messengers.
That number means that Ferguson doesn't have to please his publishers; he doesn't have to please his editors; he sure as hell doesn't have to please scholars. He has to please corporations and high-net-worth individuals, the people who can pay 50 to 75K to hear him talk. That incredibly sloppy article was a way of communicating to them: I am one of you. I can give a great rousing talk about Obama's failures at any event you want to have me at.
What's so worrying about this trend is that Niall Ferguson, once upon a time, was the best. I'm one of the few people who has actually read his history of the Rothschilds, The World's Banker, all 1,040 pages of the thing, and it is brilliant, a model of archival research. I find it fantastically depressing that the man who could write that book could end up writing a book like Civilization or an article with just as much naked silliness as the Newsweek cover.
I feel very much the same way about Victor Davis Hanson, a man whose military history is really absolutely first-rate, whose The Other Greeks fairly exploded with insight into Greek society and philosophy, but who's been mailing in sloppy, thoughtless pieces ever since he left the farm for The Farm. Sad.
George Monbiot calls publishers like Elsevier and Springer "the most ruthless capitalists in the Western world":
What we see here is pure rentier capitalism: monopolising a public resource then charging exorbitant fees to use it. Another term for it is economic parasitism. To obtain the knowledge for which we have already paid, we must surrender our feu to the lairds of learning.
Open-access publishing, despite its promise, and some excellent resources such as the Public Library of Science and the physics database arxiv.org, has failed to displace the monopolists…. The reason is that the big publishers have rounded up the journals with the highest academic impact factors, in which publication is essential for researchers trying to secure grants and advance their careers. You can start reading open-access journals, but you can’t stop reading the closed ones.
Michael Lewis' Princeton commencement address is terrific. After the obligatory opening joke ("Members of the Princeton Class of 2012. Give yourself a round of applause. The next time you look around a church and see everyone dressed in black it’ll be awkward to cheer. Enjoy the moment"), he talks about writing Liar's Poker and the role of luck in making that book possible:
I was 28 years old. I had a career, a little fame, a small fortune and a new life narrative. All of a sudden people were telling me I was born to be a writer. This was absurd. Even I could see there was another, truer narrative, with luck as its theme. What were the odds of being seated at that dinner next to that Salomon Brothers lady? Of landing inside the best Wall Street firm from which to write the story of an age? Of landing in the seat with the best view of the business? Of having parents who didn’t disinherit me but instead sighed and said “do it if you must?” Of having had that sense of must kindled inside me by a professor of art history at Princeton? Of having been let into Princeton in the first place?
This isn’t just false humility. It’s false humility with a point. My case illustrates how success is always rationalized. People really don’t like to hear success explained away as luck — especially successful people. As they age, and succeed, people feel their success was somehow inevitable. They don’t want to acknowledge the role played by accident in their lives. There is a reason for this: the world does not want to acknowledge it either.
Read the whole thing. It's worth it.
Inside Higher Ed has yet another in the never-ending series of "rethinking the humanities Ph.D." articles. But for once, it's not just about "rethinking" (which too often is regarded as an end in itself), but actually making changes to it: a proposal at Stanford
where students decide on a career plan -- academic or nonacademic -- they want to embark on by the end of their second-year of graduate study, file the plan with their department, and then prepare projects and dissertation work that would support that career…. This would represent a dramatic shift from the current norm, whereby many humanities grad students say that their entire program is designed for an academic career, and that they only start to consider other options when they are going on the job market -- a bit late to shape their preparation for nonacademic options.
I'm going to blow through this quickly, so I can get back to real stuff, but I couldn't let this awfulness go unremarked: Gary Olson's latest essay in the Chronicle of Higher Education, on "How Not to Reform Humanities Scholarship." The piece starts by noting "the growing number of commentators" at the recent Modern Language Association meeting "who were recommending changes in how the discipline conceives scholarly work." I suspect if you went to any MLA between, say, 1960 and today, you could print that sentence and it would ring true, but let's take Olson's word that such calls are becoming more frequent and confident.
Certainly, he says, the number of people contacting him to say how terrible such things would be is on the rise. Whatever their good intentions,
Such recommendations, my callers unanimously agreed, would damage not only the careers of aspiring and new professors but also the reputation of the humanities. The proposed changes would also present substantial challenges to academic administrators charged with evaluating scholarship for tenure and promotion.
I'll just note four huge problems with the essay.
The first is the clumsy use of "some people worry about something, so that's evidence" as a form of argument. (One might argue that in a soft field like the humanities perception is reality, but given that this is an essay arguing for the strength of humanistic thinking and scholarship, I think Olson doesn't want to go there.) So you get claims like this:
Some veteran faculty members worry that graduate students and young faculty members—all members of the fast-paced digital world—are losing... their capacity for deep concentration—the type of cognitive absorption essential to close, meditative reading and to sustained, richly complex writing.
[A]llowing doctoral students to produce alternative projects may well disadvantage them on the job market, as hiring committees—or at least some members of them—may not be as receptive to experimental forms and may favor candidates who have, in fact, produced a monograph…. "I can just imagine how my colleagues in our very traditional department would respond to a colleague's tenure application if most of the work were digital," said one department chair. "We would have a clash of cultures and values, and, sadly, I know who would win."
And finally this:
It is true that more and more online journals are claiming to employ a peer-review process. That could be a positive development if we can arrive at a point where the community of scholars has confidence that the review process in online venues is as rigorous as it is in top-tier print journals. At the present, however, many scholars are still skeptical that the processes are equivalent.
Now, the argument that people don't concentrate any more in the digital age is one worth having; my book contends that while plenty of people feel like their faculties of concentration and memory are under assault, it absolutely doesn't have to be this way. Connection is inevitable, but distraction is a choice. But "some people say" is not proof.
Nor, I think, is the argument that "we shouldn't do it because the old fogies would shake their canes and yell, you kids get off my lawn" particularly convincing. It's an unfortunate reality that some people don't like new stuff. But that's not a reason to not do new things that are good.
The second problem is that, tragically and not surprisingly, the assumption is that humanities Ph.D.s are all bound for academic jobs, and that training for other professions is more or less unthinkable.
Hence the equivalence of "job market" and "hiring committees," even though 1) only a fraction of humanities Ph.D.s are ever going to get tenure-track jobs, and 2) other industries are much more likely to see the value of an innovative piece of work than the search committee chair whose last book was published by Yale UP in 1977. Google's HR people won't care that you haven't produced a monograph, so long as you've created something else that displays imagination, an ability to think deeply, and a capacity for focus.
More generally, the essay betrays an unwillingness, shared by far too many members of this generation of scholars, to admit that their field is not in some temporary crisis from which they're going to soon recover, and that good people are ground up and denied futures for structural reasons. Instead, you get things like this:
Besides, the typical rationale for abandoning the traditional dissertation—that the time-to-degree for the humanities doctorate is too long—is not a function of the monograph as a genre; it is a function of some dissertators' personal lives, as they attempt to juggle numerous priorities along with completing a dissertation.
Well, yes, personal lives can play a role. But… could the academy's well-documented reliance on temporary, itinerant, and graduate student labor also play a role here? Might the fact that too many students are under-funded while they write be a contributing factor? This reminds me of Charles Murray's argument that poor whites don't work because their culture has eroded, not that they don't work because the labor market for working class whites is a shadow of its former self.
So what recommendations does the essay embrace? How do we move forward to improve humanities scholarship?
This is the third problem with the essay: for the life of me, I cannot tell.
Olson doesn't seem to say, except to imply that we need more of the same, only better funded. Like too many academics, he seems to believe that if we wait long enough, the fairies will come and sprinkle gold dust on everything. There's no effort to distinguish good reform proposals from bad, to suggest how the rigor of traditional peer review could be brought to electronic journals, to say how we might use other Web-based metrics (trackbacks, hits, number of comments, and other updated bibliometrics, for example) to help make informed judgments about digital scholarship.
Fourth and finally, I think this gives very short shrift to older faculty. As the son of someone who retired after twenty years at CSM, and then immediately went to Singapore for two years, I've seen at first hand that the relationship between age and personal conservatism is only as strong as you want it to be. He ends up constructing two sets of straw men, digital Panglosses and aging Cassandras, and thus doing justice to neither.
Okay, back to real work.
This is an excellent little essay:
The Republican candidate Newt Gingrich and the cable channel History have both followed the same formula for success, by elevating fantasy over actual history. The difference, however, is that Newt wants to carry his sensational vision of a bygone age into office.
The scholar in me finds this article title hilarious, in a good way--
--I think because it so perfectly expresses the thing it promises to analyze.
Today I stole my wife's copy of AHA Perspectives and Anthony Grafton and Jim Grossman's essay "No More Plan B," on the need to reform history graduate programs to train people for non-academic jobs. Having written about post-academic life myself, I'm of course interested in the subject.
I think the Grafton and Grossman essay points in the right direction, and it inspires two suggestions and a caveat.
First, for students in the early stages of the dissertation, it could be tremendously helpful for a department to bring in a literary agent for a day. There are agents who specialize in academic-to-trade crossover projects, and the business is competitive enough for there to be some younger agents who'd find the prospect of representing an entire department interesting. In an afternoon, the agent could explain how the whole selling books for money thing works, and interested students can pitch their dissertations as book proposals.
It wouldn't be the end of the process of turning a thesis into a trade book, but just the beginning; but you have to start somewhere, and if it's possible to craft a Ph.D. with an eye to immediately converting it into a trade press manuscript-- preferably by just stripping out the footnotes and some of the academic framing in chapter 1-- that would do a lot to acculturate young Ph.D.s to the idea that they don't have to make Faustian bargains to make a living writing. (Of course you can if you want, but the academic vs. trade route is not a choice between freedom and serfdom: it's a choice between two different sets of pressures and constraints.)
This would do several things: help demystify the world of trade publishing, give students a sense of how their projects could be crafted for a broader audience, and, for at least some, get some funding for the writing. Not every dissertation is the next "Longitude," but I'll bet a surprising number could be crafted for the trades. My agent was phenomenally valuable in shaping my current book, and without her I'd still be trying to get MIT Press to return my phone calls. Instead, I'm in a very different position.
This might also help deal with a second issue. The biggest thing I had to deal with after finishing my dissertation was a sense of narrowed professional horizons. The cruel irony is that newly-minted history Ph.D.s tend to have a sense that they're LESS able to survive in the world than when they graduated from college, and often less interested in doing so. I'm not really sure there's a whole lot anyone can do to reduce this. It can help to bring in people like me who've had intellectually interesting lives (interesting to me at least) outside academia, but I think graduate school requires internalizing the cultural norms in order to survive-- not to mention justify the intense focus on a narrow subject, deferred income, etc..
At the same time, there's a critical thing that must be maintained in graduate school at all costs. Spaces for contemplation are being torn up faster than rain forests: just look at the mania for collaborative spaces in library architecture, the assumption that knowledge work is all about networking and idea-sharing, the arguments among (both evangelical and liberal) Protestant ministers over bringing social media into church services ("RT Luke 3:16 LOL #atchurch"), etc. etc.
If there is one great thing I got from graduate school that has sustained me in all my professional endeavors, it's the capacity not just to write and produce knowledge-- scholarly knowledge, popular pieces, even slightly disreputable consulting "product" with what Stephen Colbert might call "knowledginess"-- but an understanding that serious thinking really requires time and sustained, slightly manic, attention. There are precious few places outside universities-- and fewer and fewer places within the academic "marketplace of ideas" (kill me now)-- that take the vita contemplativa seriously; one of the best things you can do for students is help them learn how to live that life, and to make it portable.
Inside Higher Ed reports that the American Historical Association has just released a position paper, co-authored by AHA president Anthony Grafton and AHA executive director James Grossman, arguing that non-academic careers for history Ph.D.s shouldn't be thought of as some kind of aberration or "plan B," but recognized as the new normal.
For years now, humanities and other disciplines have promoted "alternative" careers for new Ph.D.s, trying both to increase the range of opportunities available to new graduates and to ease the competition just a bit in the academic job market.
The president and executive director of the American Historical Association have just released a statement calling for their field to abandon the idea that any career path -- including those paths outside of academe -- be classified as "alternative." It is time, they argue, to admit that the academic job market is not coming back anytime soon, that many new Ph.D.s who find jobs outside academe find rewarding work (both financially and intellectually), and that the doctoral experience needs to change in some ways so that new Ph.D.s have more options.
It's only taken 20+ years to recognize that graduate training promotes an outmoded, unrealistic (and, I would argue, unnecessarily narrow) set of career expectations. But maybe attitudes will actually start to change. Or perhaps graduate programs will just reduce their enrollments by 50%, to reflect the permanently reduced size of the academic job market, and to keep from having to change their way of working.
After a couple months working on it, I've been thinking about the experience of writing a serious non-fiction book. It's been a stretch for me, in quite a good way so far: I'm writing about something big that I'm passionate about, but in a manner that I find new and very challenging.
my backyard office with dog, via flickr
First, writing without footnotes is a pretty liberating experience. I'm the sort of scholarly writer who likes to recapitulate his entire intellectual history in the first five footnotes, and construct a dense thicket of citations to support my main text. If anybody doesn't know, this is part of an academic game that has several goals: creating a defensive barrier below your work that keeps it from being undermined ("well, yes, you would think that's a flaw in my argument if you haven't read these sixteen other things"), creating a place for your work in the literature, and sending little mash notes to people whose work you like. The downside is that this is an enormously time-consuming game, and it's a great way to procrastinate; if you get too caught up in it, it gets harder to actually write.
working at cafe zoë, via flickr
Little, Brown doesn't do footnotes; instead, their books have bibliographic essays at the end. This means that I don't have to document every claim I make as I write it; I need to keep track of what I'm doing and where things come from, of course, but there's a whole slice of literary labor that I can forget about. The standard in the industry is also to quote other people sparingly, unless they're Shakespeare or Yogi Berra; as my editor explained, they're buying MY ideas, not my gloss on someone else's.
working at cafe zoë, via flickr
The result of all this has been that I'm writing faster; it also means that I'm constructing a different kind of relationship between this work and my sources, and between my authorial self and other writers in the field.
Put most simply, knowing that I can't impress readers with spectacular acts of citation jujitsu means that I have to make the work itself more compelling, and my own voice more authoritative. A footnote citing half a dozen books can be the intellectual version of an incomplete sentence, an erudite way of saying, "Well, you know..." With this, I have to actually FINISH the thoughts, and make them mine.
reading abraham heschel's The Sabbath, via flickr
The practice of quoting sparingly has one big benefit: it means I'm doing more interviews with people. Even when I could pull a quote from something a person has written, it's better to quote from an interview. This is, in effect, a great excuse to have conversations with interesting people, which is something I always enjoy. And fortunately they're quite forgiving when we go over things they've already written about; few people actively dislike talking about their work, and most of them know how this game works.
What I find really unexpected is that this kind of authority-- writing that depends more on what the AUTHOR does, than on who the author cites-- is, for me at least, truer to the ideal of scholarly authority. It forces you to take complete responsibility for your ideas. (Even at the Institute, while we didn't use footnotes, we often supported ideas that were challenged by readers (usually clients or prospective clients) by saying, in effect, we're just telling you what our expert sources told us.) You can argue that some writers abuse this, by appropriating other people's ideas, or not sufficiently acknowledging their debts; I hope to avoid that, but I can now see how it happens.
getting pretty deep, via flickr
I've also been struck at how much writing is a business, albeit one that requires a high degree of focus and creativity. Even after editing the Encyclopaedia Britannica, publishing an academic monograph, turning out articles in newspapers, Scientific American, and lots of academic journals, I'm learning a LOT about how the trade book market works, and it's pretty different from everything else.
microsoft research cambridge, via flickr
With my ambitious 1,000 word/day writing schedule, I'm also having to be very ruthless about my time and avoiding distractions. I'm not always successful (WILL JACK EVER ESCAPE FROM THE OTHERS? WHAT THE HELL IS THAT SMOKE MONSTER?), but this kind of writing requires starting early (the days when I'm up before 6, and get some writing done before I have to take the kids to school, are the most satisfying), and not giving up. People who think you get inspired, then rush to the keyboard and write in a creative frenzy, have it exactly backwards: you sit at the keyboard, and hope you can get to that state.
cafe milano, berkeley, via flickr
At the same time, while you need to hit your deadlines, you also need to be creative: anyone can tell the difference between what I've written when I'm really engaged and passionate, and what I write when I'm turning out Product. People don't tell their friends that they have to read this Product; Terry Gross doesn't interview writers about Product. They want strong, passionate writing, and creating it is... a challenge.
stimulant, distraction, caffeine, via flickr
Paradoxically, I think setting a 1,000 word/day pace for myself turns out to be a good way to bring on that more creative state, that feeling of being entangled with the work. The more you're able to write to a schedule, the more likely you are to hit those great moments when you feel like you're transcribing ideas that come from somewhere other than your own mind. It can take at least a day to get to that mental state where the ideas really flow well; inspiration doesn't come in a flash, but after a long run-up. Put another way, those states can, to some degree, be induced: you can start wordsmithing and end up doing something really creative. This helps explain Frans Johansson's observation (in The Medici Effect) that creative people do some of their best, most memorable work when they're doing a LOT of work. We assume that masterpieces are the result of long solitary focus on a single problem, but they're more usually part of a bigger enterprise.
my office in microsoft research cambridge, via flickr
Now back to real writing.
working at cafe zoë, via flickr
This echoes feelings I've had, and I've heard plenty of other people express:
I have given up my secure academic job as Reader at the University of the West of England for the vagaries of life as a freelance. And why? Because I want to work - really work - and my job made that impossible.
Am I mad? The losses include a reliable salary, a pension, sick pay, a heated room, and a computer that someone comes to mend when it breaks down. But I can do without those (I think). The gain is a true academic life at last. I can devote my time to thinking, and reading and writing; to sharing ideas with others; to asking questions of the universe and trying to find the answers. The simple fact is that I could not do these things and the job.
I'm reading Blackmore's Zen and the Art of Consciousness, which I think does a brilliant job of communicating how difficult meditation and mindfulness exercises are.
Think all you can get are reprints? Think again! I could get an "eye-catching, full-color, poster of your article on the cover of the journal," or "an attractive color poster" of my article, "Perfect for your lab or office," or a "Certificate of Publication... in a high-quality frame, dark brown wood with gold trim."
Just in time for the holidays! Except probably not.
I wonder in which countries, or which disciplines, these things sell? Academic life has lots of well-worn rules about display and status, and the book-lined office, piles of paper on the desk (and floor and extra chair), and harried yet abstracted expression are all signifiers of The Life and how well you play it. (Few things mark the boundary between tenured faculty and adjuncts more powerfully than their control of space: the bare office shared with two other people fairly screams, "I'm just here temporarily, pay no attention to me.")
But having a poster advertising an article... that seems over the top, at least in the places I taught. But maybe in places that are very status- and publication-conscious, it's actually useful to have such in-your-face markers of accomplishment?
...even when they're published "in a little-read journal," as Ezra Klein points out.
He's talking about Elizabeth Warren's 2007 article proposing a Financial Products Safety Commission-- which she is now the front-runner to lead.
From the debut issue of postmedieval:
Nicola Masciandaro (English, Brooklyn College, CUNY), Individuation: This Stupidity
The problem of individuation exposes the insuperable stupidity of human being and guarantees the groundlessness and illegitimacy of any systematic understanding of it.
I have a bunch of books-- probably a couple hundred-- from my professional/scholarly collection that I want to give away. Most are history (with an emphasis on European and British history), history of science (largely modern, but a respectable smattering of early modern), STS, and contemporary technology and business. Many are duplicates (how did I get three copies of Rheingold's Virtual Communities and Benedict Anderson's Imagined Communities?); others are books I've carried around for years and realize I will never read again (holla, Renaissance Self-Fashioning!); and various others no longer match my current or likely future interests (Bernal's 3-volume history and David Lindberg's Rise of Western Science are both great, but I'm not likely to teach intro history of science again).
I would prefer they go to someone in the field-- ideally a history or STS grad student or postdoc-- rather than just be donated to my local library's book sale; I don't want to go through the trouble of putting them up on eBay. Is there an academic equivalent to Freecycle that I can use to connect with some worthy soul (who will agree to pay shipping)?
Steve Eisman, "the outspoken investor whose huge wager against the subprime mortgage market was chronicled by author Michael Lewis in his bestselling book The Big Short," talking about the for-profit education industry:
Until recently, I thought that there would never again be an opportunity to be involved with an industry as socially destructive and morally bankrupt as the subprime mortgage industry. I was wrong. The for-profit education industry has proven equal to the task.
As Mother Jones elaborates,
Driving much of the growth, Eisman explained, was the sector's easy access to federally guaranteed debt through Title IV student loans. In 2009, he said, for-profit educators raked in almost one-quarter of the $89 billion in available Title IV loans and grants, despite having only 10 percent of the nation's postsecondary students.
Eisman attributes the industry's success to a Bush administration that stripped away regulations and increased the private sector's access to public funds. "The government, the students, and the taxpayer bear all the risk and the for-profit industry reaps all the rewards," Eisman said. "This is similar to the subprime mortgage sector in that the subprime originators bore far less risk than the investors in their mortgage paper."...
Another similarity between subprime lending and for-profit education is this, Eisman said: Both push low-income Americans into something they can't afford.... [Finally], the industry's era of massive profits—ITT is more profitable on a margin basis than Apple, he notes—are about to end, thanks to new government regulations in the pipeline.
A while ago I wrote about reinventing academic talks. It got me thinking about how to better design workshops or conferences that bring together scholars or scientists (who, broadly speaking, like to think about stuff) with policy people, corporate strategists, and military people (who, broadly speaking, also like to think, but really like to DO).
It's a space I've been exploring in my consulting practice this past year, and I just posted a piece on Future2 on the opportunities we now have to reinvent events on the academic / real-world boundary.
I noticed a traffic spike on the blog, thanks to Lexi Lord's essay on post-academic life in the recent Chronicle of Higher Education (thanks, Lexi!). She talks about how she decided to leave a tenure-track position, and her discovery of the fact that you don't have to be an academic to have an interesting intellectual life (and indeed, can have a more interesting one if you're not a professor):
Because I live in a large city, as opposed to the small college towns where I was a professor, I live in a world of museums, lectures, public seminars, extraordinary bookstores, fantastic archives, and libraries. I live in a place that has racial as well as ethnic diversity. All of those factors encourage me to think about historical problems in a rigorous albeit different fashion from how I saw them in academe....
I live where a lot of archives are—which makes research easier than it was in academe. I write and publish. My new book, researched and written completely outside academe, was just published by Johns Hopkins University Press.
Since leaving academe, I have continued to endorse the belief that being an intellectual entails analyzing and understanding issues from multiple angles. I hope that in advising their undergraduates, academics will encourage their students to share that view. More important, I hope faculty members will encourage students to do informational interviews and extensive research on career options—before entering a Ph.D. program, which is, after all, only one path to the life of the mind.
This is always good advice, but it's especially timely, given that last night I had an experience that reminded me of the increased feasibility of pursuing academic projects outside the university.
I recently became interested in the concept of unintended consequences, and how the term is used to either describe or excuse the unexpected. It would be obvious to start such an essay with "a Raymond Williams Keywords-like analysis of its history," and last night I decided to poke around a little bit and see if I could find some early uses.
A little time on Google Scholar turned up the fact that Robert Merton wrote an article about the term in 1936, and died with a book on unintended consequences still unfinished-- a warning that I should be very tactical in how I approach the subject. (The fact that Google Scholar has "Stand on the shoulders of giants" as its motto warms my heart, since Merton wrote a book on the phrase.) That took a few minutes.
I then jumped over to the Stanford Library Web site, to see if Poole's Index of 19th Century Periodicals was online. When I was writing my dissertation, I spent a LOT OF TIME with Poole's-- it was an invaluable resource, and I remember many hours in the Penn and UC Berkeley libraries, looking for article citations, then tracking them down in the stacks. Instead, I quickly found the 19C Index, an online repository / directory that includes Poole's, but also a number of other 19th century indexes, publications, scanned magazines, etc.
For the next couple hours, I tracked down various combinations of unintended, unexpected, and unanticipated, and effects or consequences; by bedtime, I had a couple pages' worth of material written (most of it is footnotes and quotations, of course).
All this happened on my couch, with the "Biggest Loser" finale in the background.
I wouldn't give up those days spent in the library for anything; and I still really enjoy going to libraries to read and write. But the point of the story is this: that while fifteen years ago (when I did it) successfully leaving academia but remaining intellectual required geography and attitude-- I could do it because I was living in Chicago, Lexi was in DC, and we both were willing to keep a growth mindset about the next phase of our lives-- today, resources like 19C make it even easier to do serious scholarly work-- at least preliminary scholarly writing-- without being close to libraries. I'm about three miles away from Green Library, but with kids, work, and other stuff, it's hard to get there, and impossible to just dash over to the reference section to check up on something (as I could do when I was single and living a mile from the Berkeley campus).
So what Lexi argues in her recent piece, and what I argued years ago, is more true than ever: the raw resources for pursuing academic projects are more accessible and portable than ever. It still often requires maintaining some kind of connection with an academic institution-- my Stanford affiliation gets me access to the online databases like 19C and JSTOR-- and you still have to manage all the logistical stuff required to carve out time for yourself, but the Web at least seriously lowers the barriers to getting access to the resources necessary to support a real intellectual life.
I know that projects like JSTOR are intended to support academics, but I think they're even more valuable for people who are doing serious intellectual work but who aren't academics. These services were designed to support scholarship, and they do... but the most profound benefits aren't going to the people they were originally designed for.
Hey. That's an unintended consequence.
Interesting article in the New York Times on the use of brain science in literature:
Literature, like other fields including history and political science, has looked to the technology of brain imaging and the principles of evolution to provide empirical evidence for unprovable theories.
Interest has bloomed during the last decade. Elaine Scarry, a professor of English at Harvard, has since 2000 hosted a seminar on cognitive theory and the arts. Over the years participants have explored, for example, how the visual cortex works in order to explain why Impressionist paintings give the appearance of shimmering. In a few weeks Stephen Kosslyn, a psychologist at Harvard, will give a talk about mental imagery and memory, both of which are invoked while reading.
While this is very interesting, the practice of drawing on the sciences (particularly cognitive science) to inform the humanities is less new than the article suggests: E. H. Gombrich's classic Art and Illusion opens with a discussion of the latest findings on perception and cognition (from the 1950s, obviously) and how they should be applied to art history and criticism.
From the latest by Thomas Benton:
Graduate school in the humanities is a trap. It is designed that way. It is structurally based on limiting the options of students and socializing them into believing that it is shameful to abandon "the life of the mind." That's why most graduate programs resist reducing the numbers of admitted students or providing them with skills and networks that could enable them to do anything but join the ever-growing ranks of impoverished, demoralized, and damaged graduate students and adjuncts for whom most of academe denies any responsibility.
Of course, I'm predisposed to appreciate the piece.
The Times reports that
When France’s most dashing philosopher took aim at Immanuel Kant in his latest book, calling him “raving mad” and a “fake”, his observations were greeted with the usual adulation. To support his attack, Bernard-Henri Lévy — a showman-penseur known simply by his initials, BHL — cited the little-known 20th-century thinker Jean-Baptiste Botul.
There was one problem: Botul was invented by a journalist in 1999 as an elaborate joke, and BHL has become the laughing stock of the Left Bank....
Mr Lévy admitted last night that he had been fooled by Botul, the creation of a literary journalist, Frédéric Pages, but he was not exactly contrite.
Appearing on Canal+ television, he said he had always admired The Sex Life of Immanuel Kant and that its arguments were solid, whether written by Botul or Pages. “I salute the artist [Pages],” he said, adding with a philosophical flourish: “Hats off for this invented-but-more-real-than-real Kant, whose portrait, whether signed Botul, Pages or John Smith, seems to be in harmony with my idea of a Kant who was tormented by demons that were less theoretical than it seemed.”
Granted, I haven't had any coffee this morning, but it sounds like Lévy's argument is, "Yes, the work I cite is fiction, but it says what I think, so I'll continue to reference it." Which sounds rather like an appeal to truthiness: it's not true, but it kind of looks true, and it confirms my own beliefs, so I'm going to find it convincing.
This piece of complete non-news in the New York Times caught my eye this morning:
With colleges and universities cutting back because of the recession, the job outlook for graduate students in language and literature is bleaker than ever before.
According to the Modern Language Association’s forecast of job listings, released Thursday, faculty positions will decline 37 percent, the biggest drop since the group began tracking its job listings 35 years ago.
The projection, based on a comparison between the number of jobs listed in October 2008 and October 2009, follows a 26 percent drop the previous year.
I read this, and wondered what bothered me about it. Obviously the news itself is bad, but not surprising: the academic job market has been a disaster area for a generation now, it's not going to get better, and anyone who thinks it will is delusional. When I was in grad school in the late 1980s, the conventional wisdom was that we were hitting the job market at exactly the right time: the generation that was hired during the Great Expansion in the 1950s and 1960s would retire, and we'd cruise into those positions.
Needless to say, that didn't happen, and the fact that many of those jobs were converted into short-term positions should have been a clear signal that The Market Had Changed.
But this is old news. What gets me about this piece, I realized, is how it's framed. It equates "the job outlook for graduate students in language and literature" with "the academic job market": there's no sense that Ph.D.s might be capable of doing SOMETHING ELSE with all that knowledge. Demonstrably wrong, guys.
It was 2003, and Warren, an earnest-sounding and ever enthusiastic Harvard law professor who specializes in bankruptcy, was on the set of Dr. Phil. She had written a book with her daughter called The Two-Income Trap: Why Middle-Class Mothers & Fathers Are Going Broke, and she'd expected to sit next to the host and explain its key points. Instead, Dr. Phil was interviewing a stressed-out couple with serious medical and financial troubles. After they mentioned they had obtained a second mortgage to pay off their credit card debt, the lights went up on Warren, and Dr. Phil asked her if this had been a smart step. No, she declared, because now they could lose their home if they defaulted.
As soon as her turn was over, Warren found herself thinking, "You've been doing this work for 20 years now, and it is unlikely that any of it has had as direct an impact as these 45 seconds." She had reached millions, some of whom might actually pay attention to her advice. "So here you are, Miss Fancy-Pants Professor at Harvard. What do you plan to do now? Is it all about writing more academic articles, or is it about making a difference for the families you study? I made a decision right then: It was for the families, not the self-aggrandizement of scholarship."
Since then she's proved herself to be surprisingly mediagenic, in a very understated, just-drove-the-minivan-to-the-office kind of way. At the same time, she's not given to oversimplification or jargon: she's really good at explaining the stakes in TARP (she's part of the Congressional office that tries to oversee TARP), where the money's going, and why we don't know where the money's going. (Check out her appearance-- in two parts-- on The Daily Show.)
Yet despite, or more accurately because of, her willingness to choose "families" over "the self-aggrandizement of scholarship,"
Harvard economists... dismiss Warren as insufficiently theoretical. "They think she shouldn't be talking about bankruptcy except as someone in the economics department would—that is, with formulas and theorems, not about how it affects real people."
I suppose this drives me around the bend for two reasons.
First, my work has sometimes been accused of being insufficiently theoretical (usually in reader's reports), as if theory is the sine qua non of importance. As Taibbi would put it, first of all, few kinds of scholarly work are both harder and less likely to stand the test of time than theory; and second, what the fuck? When did we all turn into mini-Derridas? Isn't theory a tool? I mean, we all use word processors, but I don't see many of my colleagues rushing to create their own versions of Microsoft Word.
Second, probably the single greatest personal intellectual epiphany I've had since leaving academia is that the real world actually has interesting problems: not just problems that you ought to deal with because life as we know it could get pretty screwed up if we don't, but problems that are actually intellectually engaging, make use of the cognitive muscles you developed in academia, force you to develop new abilities, and expose you to interesting questions you would never have discovered otherwise. The assumption that academia is where people grapple with interesting questions, and the business world is where stupid things happen, is just wrong.
I kind of like Matt Taibbi's argument that she should be drafted to run for president, and if
someone like Elizabeth Warren doesn’t want that responsibility, well, she shouldn’t have gone into office and gone on TV making all that sense and shit. She’s pushed for transparency in the Fed, is openly furious about the misuse of bailout money, and seems to take personally the chicanery that credit card companies and banks use to game the suckers out there. I simply cannot see her suddenly flipping and holding $2000-a-plate fundraisers with Lloyd Blankfein and Jamie Dimon.
This is the sort of thing that shouldn't happen:
Fights broke out as law students queued for up to 11 hours last night to secure the dissertation supervisor of their choice at Brunel University.
More than 100 students queued outside Brunel Law School overnight in the hope of working with their preferred academic, after the school introduced a first-come, first-served supervisor-allocation system.
I love the University's utterly tone-deaf response.
A spokesman for Brunel said the university was “very concerned” that law students had queued overnight and was “disappointed to see the lengths to which some feel they have had to go”.
“In preparing for their dissertation, students are informed that neither their choice of topic nor their first choice of supervisor can be guaranteed. It seems that they have done all they can to try to achieve their first topics and supervisors.”
Ummm.... why should that be disappointing, or any kind of surprise?
Philip Gerrans on bubbles in higher education:
The cause of the meltdown in global financial markets is obvious: leveraged trading in financial instruments that bear no relationship to the things they are supposed to be secured against.... The academy, too, is a market - a large one in which the value of any piece of research is ultimately secured against the world. If the world is not as described or predicted in the article or book, the research is worthless. A paper that claims that autism is caused by vaccination or terrorism by poverty is valuable only if it turns out to be a good explanation of autism or terrorism. That is why an original and true explanation is the gold standard of academic markets....
The academic market is also like the financial market in another way. Stocks trade above their value, which leads to bubbles and crashes. Brain-imaging studies, for example, are a current bubble, not because they don't tell us anything about the brain, but because the claims made for them so vastly exceed the information they actually provide.... [E]very week we read in the science pages that brain-imaging studies prove X, where X is what the readers or columnists already believe. Women can't read maps! Men like sex! Childhood trauma affects brain development! There is an Angelina Jolie neuron!...
[Much scholarship by] [h]istorians, anthropologists, linguists and even philosophers... is unsecured and highly leveraged. By this I mean that people in the humanities often do not write about the world or the people in it. Rather, they write about what somebody wrote about what somebody else wrote about what somebody else wrote. This is called erudition (not free association), and scholars sell it to their audience as a valuable insight about the nature of terrorism or globalisation or the influence of the internet (preferably all three). Almost every grant application in the humanities mentions these three topics, but the relationship between them and the names and concepts dropped en route are utterly obscure.
None of this would matter if the market were basically self-correcting like the science market, or erratic but brutally self-correcting like the financial markets.... [But] the main corrective mechanism in the humanities is reputation built on publication and, since publication is often based on reputation, the danger of a bubble is extreme. Someone who takes a supervisor's advice to base a career on writing about Slavoj Zizek is in the position of an investor deciding to invest in Bear Stearns on the advice of Lehman Brothers. The price is high and predicted - by those who have a vested interest - to rise further....
Compare the citation for a Nobel prizewinner in chemistry or physics with the way humanities research is evaluated. The Nobel citations are accessible to any intelligent reader.... Things sometimes seem to go the other way with the big names in the humanities. A problem (eg, terrorism) is misdescribed (eg, as an expression of subaltern response to modernity) and a raft of pseudo-explanation is recruited to leave everyone baffled.
From an essay in the Times Higher Education on the seven deadly sins of academia: when I first read it, this piece on lust made my eyeballs hurt, and not in a good way:
When Willie Sutton was asked why he robbed banks, he is famously said to have replied, "because that's where the money is". Equally, the universities are where the male scholars and the female acolytes are. Separate the acolytes from the scholars by prohibiting intimacy between staff and students (thus confirming that sex between them is indeed transgressive - the best sex being transgressive, as any married person will soulfully confirm) and the consequences are inevitable.
The fault lies with the females. The myth is that an affair between a student and her academic lover represents an abuse of his power. What power? Thanks to the accountability imposed by the Quality Assurance Agency and other intrusive bodies, the days are gone when a scholar could trade sex for upgrades....
Normal girls - more interested in abs than in labs, more interested in pecs than specs, more interested in triceps than tripos - will abjure their lecturers for the company of their peers, but nonetheless, most male lecturers know that, most years, there will be a girl in class who flashes her admiration and who asks for advice on her essays. What to do?
Enjoy her! She's a perk. She doesn't yet know that you are only Casaubon to her Dorothea, Howard Kirk to her Felicity Phee, and she will flaunt you her curves. Which you should admire daily to spice up your sex, nightly, with the wife.... And in any case, you should have learnt by now that all cats are grey in the dark.
So, sow your oats while you are young but enjoy the views - and only the views - when you are older.
Crooked Timber comments that this is a "classic example of the sort of thing where having shown a draft to a single close female friend might have saved the day, and in the process offered a useful insight into the distinction between the concept 'refreshingly un-PC' and the concept 'creepy'."
However, the author answers his critics this way:
This is a moral piece that says that middle aged male academics and young female undergraduates should not sleep together. Rather, people should exercise self-restraint. Because transgressional sex is inappropriate, the piece uses inappropriate and transgressional language to underscore the point - a conventional literary device. At a couple of places, the piece confounds expectations, another conventional literary device, designed to maintain the reader's interest. Sex between academics and students is not funny, and should not be a source of humour. But employing humour to highlight the ways by which people try to resolve the dissonance between what is publicly expected of them and how they actually feel - not just in this context - reaches back to the origins of humour itself. In his introduction, [editor] Matthew [Reisz] wondered how many of his contributors would enter into the spirit of levity that inspired the idea of the seven deadly academic sins (submitting a piece on prevarication late, etc) and I suspected that one could get to the heart of all that is wrong with sex between scholars and students by employing the good ol' boy language of middle aged male collusion. I'm not sure I'm wrong.
If it's intended to be a piece whose style and tone exemplify its subject, I have to admit it does a decent job. Naturally the piece has generated a huge number of comments, though this one takes the prize:
Professor Kealey assumes that every male academic’s wife mustn’t be that attractive. How wrong! I, for example, I am far hotter than any of my husband’s young and inexperienced students could ever (unfortunately for them) hope to be.
Hear hear. When I was in Oxford, walking around in the evening and trying to navigate around the crowds of students in high heels and skirts, I'd sometimes think, "They might be interesting in twenty years." I can't be the only man who reacts like that.
Via Crooked Timber, this Inside Higher Ed review of Diego Gambetta's Codes of the Underworld: How Criminals Communicate has a great comparison of projected incompetence among mafiosi, who, according to Gambetta, cheerfully "let the professionals and the entrepreneurs take care of the actual business operations" and admit that they're only good at shaking people down, and a certain brand of Italian academic, the "baroni (barons) who oversee the selection committees involved in Italian academic promotions."
While some fields are more meritocratic than others, the struggle for advancement often involves a great deal of horse trading. "The barons operate on the basis of a pact of reciprocity, which requires a lot of trust, for debts are repaid years later. Debts and credits are even passed on from generation to generation within a professor's 'lineage,' and professors close to retirement are excluded from the current deals, for they will not be around long enough to return favors."
The most powerful figures in this system, says Gambetta, tend to be the least intellectually distinguished. They do little research, publish rarely, and at best are derivative of "some foreign author on whose fame they hope to ride.... Also, and this is what is the most intriguing, they do not try to hide their weakness. One has the impression that they almost flaunt it in personal contacts."
Well, one also has the impression that the author is here on the verge of writing a satirical novel. But a friend who is interested in both the politics and academic life of Italy tells me that this account is all too recognizably accurate, in some fields anyway. Gambetta calls the system "an academic kakistocracy, or government by the worst," which is definitely an expression I can see catching on.
Author and creative writing teacher Rachel Toor writes in the latest Chronicle of Higher Education (subscription required) about the problems of either dashing off talks the night before, or just reading papers:
More often than I can believe, someone will preface a reading by saying, "I just wrote this last night." Why on earth, I wonder, would you read something that raw? Generally public readings are set up months in advance. It's not like the speakers don't know they're going to have to have something ready.... But then I remembered that arrogance is often the conjoined twin of insecurity. What those writers wanted us to know, perhaps, was that this new work was the result of pure talent: Just think, audience, how good this would be if it were coupled with labor? If the piece stinks, it's simply a matter of timing. It's not my fault. I could do better, really, I could. I just didn't have the time....
Most academics don't present hastily written papers. But they do something almost as bad. They read their papers aloud. Some professors read their lectures. It's common practice, I know, but frankly, it bugs me. It's hard enough for an audience to follow a short story, where, presumably, some attention is being paid to crafting narrative tension. Having to track audibly an argument written in long, convoluted sentences and leaden, jargon-ridden prose can feel like a forced drowning.... Reading instead of presenting is, I think, the academic equivalent of "I just dashed this off last night." It's an act borne out of (choose as many as apply): fear, insecurity, arrogance, procrastination, habit, poor training, or lack of regard for the audience. It's also just plain lazy. It's a lot of work to think something through and then write it out as a conference paper. Taking the next step—understanding what you've done and figuring out how to summarize it extemporaneously—seems to be one that many are willing to forsake.
The piece is a reminder of just how different the kinds of talks I've done for the last few years, and the sorts of intellectual events I'm usually involved in, are from conventional academic presentations. I spend huge amounts of time preparing for the workshops I facilitate: I go over every activity, every breakout session, think about the posters I need to create, the instructions I should give, what I should and shouldn't say, and what outcomes the client and I want.
All this preparation generates one of two things: artifacts and other materials that help organize an event (or that help participants stay self-organized and -aware of what they're supposed to be doing); and a clearer understanding of what I need to do for the day to succeed. What that preparation doesn't generate is a perfectly planned day: all that planning, I know, is to prepare me to succeed despite the fact that something is going to happen that requires me to adapt and adjust.
What you absolutely cannot do in an environment like this is throw something together the night before; nor can you write it all out and assume you can just follow the script mindlessly-- the two options Toor describes.
Why are these events so different? Two reasons. First, the facilitated workshop, much more than the academic conference, is explicitly about the production of shared meaning. The aim after a day or two is to have a common vision of the future, a common roadmap, and a common understanding of what an organization's strategy should be. You don't necessarily have that as an outcome of a scholarly conference. Second, workshops are a means to an end, not an end in themselves: they're supposed to catalyze action, not be the end of action.
With the proliferation of interesting kinds of workshops, novel forms of meetings, and now the rise of the unconference, I think it's high time we thought about how we could reinvent academic (maybe mainly humanities) conferences. There's no reason we can't create a better model that satisfies conference speakers' professional needs (e.g., the line on the c.v., the publicity, the chance to interview for jobs) and personal ones (e.g., the opportunity for subsidized travel to see your friends), as well as the needs of conference organizers and the profession/discipline as a whole-- and is a lot more interesting and engaging. So many academic events I go to end on an optimistic note, or generate lots of interest in moving on to actually doing something... but then dissipate, and at best yield an edited volume. Sitting in a stuffy (or over air-conditioned) hotel conference room, listening to someone read a talk, and feeling the collective interest and enthusiasm generated by the event evaporate days afterward-- aren't there better ways we could all spend our time?
Seriously, I'd really like to do this.
This bit from the TierneyLab:
“Academics, like teenagers, sometimes don’t have any sense regarding the degree to which they are conformists.”
So says Thomas Bouchard, the Minnesota psychologist known for his study of twins raised apart, in a retirement interview with Constance Holden in the journal Science....
The strength of this urge to conform can silence even those who have good reason to think the majority is wrong. You’re an expert because all your peers recognize you as such. But if you start to get too far out of line with what your peers believe, they will look at you askance and start to withdraw the informal title of “expert” they have implicitly bestowed on you. Then you’ll bear the less comfortable label of “maverick,” which is only a few stops short of “scapegoat” or “pariah.”
A remarkable first-hand description of this phenomenon was provided a few months ago by the economist Robert Shiller, co-inventor of the Case-Shiller house price index. Dr. Shiller was concerned about what he saw as an impending house price bubble when he served as an adviser to the Federal Reserve Bank of New York up until 2004.
So why didn’t he burst his lungs warning about the impending collapse of the housing market? “In my position on the panel, I felt the need to use restraint,” he relates. “While I warned about the bubbles I believed were developing in the stock and housing markets, I did so very gently, and felt vulnerable expressing such quirky views. Deviating too far from consensus leaves one feeling potentially ostracized from the group, with the risk that one may be terminated.”
I've been in Bloomington, Indiana for a conference on visualization and the history and philosophy of science. It's one of those events that brings together my old life as an historian, and my new life as a futurist: on one hand we're mainly talking about how visualizations of scientific communities and social dynamics can be used by historians and philosophers; on the other I suspect that there are cool things I could do with these maps to forecast the future of science.
the official conference picture, via flickr
There's one other think-tank person here, which saves me from being the one non-academic Ph.D. in the room, the scholarly equivalent of Stephen Colbert's one black friend.
There have been some efforts to use scientometric (or "science of science") maps in the history of science, but so far as I know, most of this work has followed fairly conventional historiographic paths: for example, mapping the Darwin or Mersenne correspondence, or asking questions about the growth of scholarly networks. We've not yet used them to do something radically new, like using geographical coding to calculate the speed of the transmission of ideas or instruments, or constructing agent-based models of scientific communities and seeing how they evolve over time. But that's why we're here-- to think about how we could create such things, and what benefit they might bring.
I quite like Bloomington, or the few blocks of Bloomington that I've seen.
The place is enormous. It has roughly the same number of students as Berkeley, but physically it's much larger. It also takes collegiate Gothic (a somewhat stripped-down, modernized version) to a scale I don't think I've ever seen before. If you took Princeton or Bryn Mawr, put it on a balloon, then blew up the balloon to five times its previous size, you'd get the IU campus. Yale and University of Chicago bear some family resemblance to Oxford or Cambridge, thanks to their small scale; IU takes Gothic where it's never gone before.
It's also pretty heavily wooded. There are a couple streams that flow through the campus, and they're surrounded by forest and crisscrossed with little footbridges.
campus tuesday night, via flickr
the same location, wednesday afternoon, via flickr
The town has a lot of restaurants, and a lot of foreign food, for a place its size. Tuesday night I had dinner at an Ethiopian restaurant, and last night it was Thai at Siam House. (Both are a serious challenge to dieting!) One local attributed this to the long presence of foreign students at IU, some of whom brought spouses or other relatives who went into the restaurant business. I have no way of knowing if this is true, but for whatever reason, there's good food here.
There's a bit of a restaurant row, small places in old houses. That's cool, as it gives the restaurants a more informal character.
restaurant row, via flickr
There are also rabbits that come out in the evening, which adds one more little (furry and bouncy) note of whimsy to the place.
insouciant bunny, via flickr
In the last few days I've been doing a lot of stuff: biking, organizing a Memorial Day dinner, preparing for a week-long trip to the East Coast, thinking about the craft and design of workshops. (These are the expert workshops that I organize all over the place.)
In many ways these are very different activities, but I really enjoy them all. I recently realized that despite their differences, they actually share a few qualities.
1) They're active, embodied knowledge.
Obviously bicycling is physical, but cooking is a nice combination of fine motor skill and lifting big heavy things (or in my case, avoiding setting myself on fire); you're always on your feet in a workshop; and travel is pretty physically strenuous, for good and bad reasons. Maybe I'm getting older, I'm less of a couch potato, or my ADD is increasing (and I know these are somewhat mutually exclusive explanations), but I find my patience with sitting for long hours and just reading is decreasing. I can do it, but I'm happier engaging my body. And nothing is better than activities where you're involving your body, but you have to think about what you're doing. (Gregg Zachary had a great piece last year on the rediscovery of the virtues of manual work. I'm part of a movement.)
cycling hunter's point, via flickr
Like Richard Sennett's craftsman (and I really recommend his book), I enjoy things that are physical or tangible, but also engage the mind. Thoughtful action is where it's at.
gestural interface missile command, via flickr
2) There are real deadlines.
My capacity for finishing things that have open-ended deadlines, or fake deadlines ("so we all agree that we'll finish our tasks by next week, right? right?"), is plummeting to near zero. I have too much other stuff in my life that absolutely has to get done.
hard deadlines: flames don't wait, via flickr
So hard deadlines are good for me now. Essential even. The workshop starts at exactly this time, the plane leaves at exactly that time, the guests are arriving now.
Hard deadlines also put a nice bound on craftwork, by preventing you from tinkering forever with something. A paragraph could always be better, but as Sennett writes, the demands of the trade force craftsmen to accept limits, to do the best job they can within the time they have, and to learn to be satisfied with that. As graphic designers say, "Finished is Good."
3) They require preparation.
The day of the cookout, I spent hours chopping vegetables, checking marinades, cleaning off platters (you can never have too many platters at a BBQ), locating plates and cups, setting up staging areas for food and drinks, laying out tools, etc. (I noticed, though, that this wasn't tedious, it was pleasant. It was a classic example of what Csíkszentmihályi calls flow.)
Likewise, when you travel, you've got to think a lot about what to pack, how to structure your time, how to get between different places, etc. A bike won't work with a flat tire, nor will a cyclist work if he's dehydrated, so you'd better be prepared for those possibilities. Every ride requires some kind of adjustment: technical climbs mess up gears; thorns flatten tires; I get hungry. Having the resources to deal with those things lets me keep riding.
With workshops, you have to think in advance about everything, and I mean everything: you have to go over the agenda minute-by-minute, think about the flow of the day, tinker with questions and exercises to eliminate ambiguity and focus people, lay out materials, move the furniture around, make sure the caterers know when to appear, etc., etc. (Indeed, there are things that we normally don't think about that I'd like to start experimenting with, like lighting and ambient sound, making some activities more embodied and physical-- sitting is exhausting-- and playing with the day's menu to keep people from getting weighed down by muffins and too much coffee.)
Good preparation doesn't require you to think just about one thing. It requires you to think about a lot of different things, big and small; to think about timing and process; about division of labor; about contingencies and strategies. That's part of what makes it pleasant.
future of science workshop, malaysia, via flickr
But here's the important thing.
Some of that preparation is meant to help you keep things on track, and do things exactly the right way. But most serious preparation isn't about scripting. Rather, it's about making it possible for you to adapt to whatever actually happens. I've never had a workshop run exactly the way I imagined it would: more people show up, they turn out to be interested in other things than we'd discussed before, the room isn't laid out the way we expected-- a thousand different things can go awry.
I used to think that the point of planning workshops in such great detail was so I'd have more control over them. Wrong. You never have control. You have whatever you have when you get in the room. The point of doing all that planning is to deeply understand the intentionality and philosophy behind the workshop, so you can improvise your way to the same end-point, and you have the tools at hand to do so.
perimeter institute, waterloo, via flickr
[Update: I've realized that this is my complaint about humanities graduate training: it socializes you to believe that you possess skills that are useful only in a very specific future-- namely tenure track jobs in your field-- and trains you to believe that you're less qualified to succeed at a different future, and that any other future is a failure.]
If you know that you're going to go off the map-- if events are going to conspire to send you in another direction, and they will-- the best that you can do is have the right gear, and a clear picture of where you want to go.
4) They have serendipity.
The upside of plans not working out the way you expect is that they can work out better. Sometimes the very coolest thing isn't on the map, and the only way to find it is to venture into the unknown.
One of the great pleasures of having a big party is that mixing up friends who don't know each other can have pleasant results for everyone. The best rides are ones that have a brilliant hill and view that you didn't know about. The best trips are the ones that expose you to something you've never seen before, or didn't even know was cool. I fell in love with Budapest not because I'd always wanted to go there, but because it's an amazing, complicated, Old World post-socialist place that I find alternately fascinating and frustrating. I love London because it rewards walking: I know it well enough to be able to navigate by Tube or on foot, but every time I go out in the evening I discover something-- a little square, a park, a row of businesses-- that charms and captivates, and that I'd never heard of.
surprise in the london underground, via flickr
Workshops have serendipity too. Tons of it. You want to build connections between ideas or fields that even experts hadn't seen before, or explore the cross-impact of trends that people normally think about separately. When that works, the results are awesome-- and the amazing thing is, the results are awesome a lot more often than you'd expect. You never know what the outcome of a workshop is going to be-- and if you do, there's really no point in having it in the first place. This doesn't mean that a workshop shouldn't have certain goals or deliverables; far from it. But it's like an evening walk in London: you know where you're going to end up, you know that there are certain landmarks you'll pass, but you don't know what else you're going to see along the way. Your job is to be open to the serendipity, so you can take advantage of it.
5) They draw out people.
I mean this in two senses. First, they can push you to do things you didn't know you could. Good rides challenge you to do things you didn't think you were capable of, or leave you exhausted but happy with your performance.
Second, they open up a space for people to contribute. My wife used the cookout as an opportunity to repot a bunch of flowers in the backyard, dig out and repot some aging bamboo, and do other things on her gardening/home improvement list. Once kids started arriving, my daughter made (or taught the kids how to make) balloon swords, which they then played with all evening. I hadn't thought of either of these, but people commented on how nice the backyard looked, and the kids all left exhausted and uninjured. Win.
perimeter institute, waterloo, via flickr
Workshops require both kinds of drawing out. Running a workshop isn't an exercise in controlling other people; it's the hard task of creating a venue in which everyone can think seriously, think differently, and think together.
It's also not about getting a certain result, but about creating the conditions out of which interesting new things will emerge. Of course, workshops have objectives, but as a facilitator, you have to approach them obliquely, and recognize that the actual work and thinking will be done by participants: you're just ("just" isn't quite the right word!) there to help make it happen.
workshop in laxenburg, austria via flickr
6) Sometimes you can push, but mainly you have to flow.
You can challenge people, but you can't order them to be innovative. You can try to get guests to mingle or introduce them to each other, but you can't make them be chatty and friendly. You can also push yourself, but you must recognize that pushing doesn't get you everything: you can get to the airport on time, but you can't control the weather and need to be able to go with whatever the situation presents.
my son on a happier ride
This morning I got an unexpected lesson on pushing versus flow from my son. We were biking to school, and he has the habit of standing up while pedaling. I can't get him to stop (he's seven, after all), so I was trying to teach him how to do it in a way that maintains his balance. He got frustrated and mad, which made him distracted; and so he took a spill. Bad enough to break the mirror on his bike, add a couple nicks to the brakes or handlebars, and require some ice and band-aids when he got to school. Fortunately nothing on him was broken, and he'll be fine.
As I try to tell the kids, biking is one of those things that demands mindfulness: you have to watch the road, know what gear you're in, know where the cars are, know how tired you are. You can push yourself, but if you lose your concentration-- if you lose the flow-- you're likely to crash. In the course of pushing him, I made him lose what little flow he had.
Still, any spill that doesn't send you to urgent care is a learning opportunity, not an accident. And as a friend of mine wrote after hearing about the crash,
But falling is an essential part of growth. It teaches you where the boundaries are. If you never push hard enough to fall, you will never know if you could grow twice as much or twice as fast-- because you are playing it safe.
So across all these activities-- and maybe across everything you do-- hitting that mix of pushing and flow, planning but staying open to serendipity, and being active is key.
Nothing in it about penis-shaped helicopters, but this Anthony Grafton piece about going to graduate school is pretty good-- the kind of combination of encouragement about the inherent (if quirky) rewards of academic apprenticeship, combined with some (maybe too gentle) warnings about the downside. I particularly like this little "then-and-now" gem:
In the ’60s, as universities expanded around the country and the world, job offers strewed the desks of bright Ph.D. candidates like autumn leaves in Vallombrosa. [ed: This is the kind of thing that separates writers like Tony from us mere mortals. I have no idea what it means, but I feel more erudite just reading that reference.] One friend of mine opened an envelope that had been buried under detritus on his desk and discovered that he’d been offered a job two years before and never even answered.
Not very likely to happen these days, but my father (who got his Ph.D. in 1970, and his first tenure-track job three or four years before) confirms that yes, that's what it was like back then.
Tony advises readers that they shouldn't "jump [into grad school] before you find out exactly what lies below," though I wonder if it's really possible to "find out" what it's like, or what it'll do for (or to) you, with anything approaching exactitude.
Of course you should talk to lots of people, but for most prospective students that universe will only include current students and faculty. The students will be alternately glowing about grad school and their prospects, or will try to give you the scary "real" story. The faculty will be pretty useless as advisors about the realities of grad school: life looks very different at the head of the seminar table.
On the face of it, talking to students and faculty is a pretty logical decision, but the problem is this: odds are, you're not going to get a Ph.D. and then be a professor at the kind of university you aspire to attend. Further, while they're helpful about the day-to-day reality of school, graduate students are going to be useless sources about the long-term effects of going to graduate school-- either in economic or career terms, or in psychological terms. At the same time, other people who could be very informative-- people who've been ABD for 15 years; people who finished their Ph.D.s and then went to Wall Street, the World Bank, or think tanks; students who dropped out before their orals-- are much harder to track down.
So there's an inverse relationship between the availability of experts to consult, and the likelihood that their expertise is actually going to be useful in your own life.
When I was an undergrad (I was one of those nerdy kids who went straight from college to grad school-- actually, I started taking graduate classes as a sophomore), I never thought about talking to people who'd almost finished the programs I was looking at but dropped out, or people who didn't become academics. It turns out, of course, that it would have been far more useful for me to talk to Ph.D.s who'd gone into business. But those people aren't as easy to find as the ones in the faculty lounge or TA offices.
This is actually an example of a bigger problem that people and organizations face when thinking about the future: we tend to confine our research to cases that are relatively easy to find, and look only at successes (successful cases, organizations, or people), and not at failures. Getting a handle on that space-- or at least a more realistic appreciation of the likelihood of the unexpected happening-- is one of the toughest things you can do as a forecaster, or parent, or human. After all, success is what we want, and it's easy to understand; failure is what we want to avoid, and people fail for all sorts of unpredictable reasons. Success is what a strategy or good decision or first-rate school can bring you; failure is what'll happen if you don't get those things. We don't explore the possibility that we could get those things, execute properly, and still not reach our goal; but that happens all the time. Success, we think, is comprehensible and predictable (and not largely determined by the economic state of universities and how expansive faculty hiring is allowed to be in any given season); failure is random, or something that'll happen to other people. But in reality, we're probably going to end up one of those other people. We're better off if we know that in advance.
And if we know that the definition of "failure" is sometimes as arbitrary as the forces that determine whether it happens to us or not. I can testify that it's possible to have an interesting intellectual life without being an academic (though having a library card does help). As Grafton notes,
Even if you don’t finish, or finish and don’t wind up as a professor, the skills you learn in grad school can be of value in a range of other venues. Some of my most successful former students work as scholars, teachers or writers outside the academy. But as you might expect, few follow this path without some bitterness. And no wonder. A fair number of professors treat students who leave the academy, even after experiencing terrible difficulties, as renegades and wash their hands of them. Be prepared.
Be prepared, indeed.
Having spent so much time thinking about young Ph.D.s developing postacademic lives, it never really occurred to me that there would be similar problems of professional marginality at the end of one's career. But Siris makes an argument that the failure of philosophy-- which, one imagines, would be second only to history as a scholarly activity in which age is a virtue rather than a disadvantage-- to find a place for emeritus scholars in the profession represents "the second failure of academia:"
[E]veryone assumes that retirement is and must be the end of the road: that the only reason you'd retire is because you've become dead wood. And no one has recognized that this is a symptom of a profound failure on our part, one almost as profound as the failure to prevent 'adjunctification'.
It is utterly absurd that we have no standard options after retirement for senior philosophers who still want to be actively involved in philosophy. If anything, retirement should standardly be the next stage after tenure, not an exit from the field but another kind of removal of constraints.
Perhaps we get something vaguely like this in how some departments treat emeritus professors; but only vaguely, and only like. We are failing people at the end as we are at the beginning.
But what gets me is that everyone takes it for granted: suggest retirement and it is assumed you are suggesting uselessness -- and, given the way the system's set up, that's a not unreasonable assumption. But it needs to be brought to consciousness that this is a failure that needs to be overcome, not a reasonable feature of the landscape.
How many fields are like this? Most of them, I'll bet. And it reflects our somewhat schizophrenic attitudes towards age, experience, and work: we alternately talk about experience and skill being the most valuable things an organization can have, but at the same time sometimes imagine real innovation only coming from twentysomethings who sleep under their desks. Even in academia, some fields-- mathematics and theoretical physics, for example-- assume that the really brilliant work is done by the young, and if you don't have a major discovery by the time you're 30, you never will.
This idea struck a chord for personal reasons. My father just retired from his professorship at the Colorado School of Mines, to take advantage of some new professional opportunities, and to give himself more time to work on writing projects. His impulse to see retirement not as a chance to kick back, but to do the work he really wants, is hardly unusual. And I expect if I ever get to that age, I'll approach retirement the same way. Assuming retirement, or something like it, still exists.
Actually, Theodore Roszak (whom I visited a few years ago, and whose work I've talked about) makes a really good point in his book The Longevity Revolution: the concept of retirement as a period of time that you could do something with is a very modern invention. It used to be that you were likely to die within a few years of retirement (assuming you made it that far), and for part of that time were likely to be an invalid. In contrast, now people regularly face years or decades of life in retirement, and fewer and fewer of them are content with the idea of just running out the clock in Florida (and more and more can't afford it anyway). So if academia is behind the curve in recognizing post-retirement as a productive time, that's probably not a surprise-- though anything that wastes talent is always a shame.
[Via Sympoze, which itself looks like an interesting data-point.]
Thomas Benton, writing on (really warning against) graduate school:
It's hard to tell young people that universities recognize that their idealism and energy — and lack of information — are an exploitable resource. For universities, the impact of graduate programs on the lives of those students is an acceptable externality, like dumping toxins into a river. If you cannot find a tenure-track position, your university will no longer court you; it will pretend you do not exist and will act as if your unemployability is entirely your fault. It will make you feel ashamed, and you will probably just disappear, convinced it's right rather than that the game was rigged from the beginning.
Professor forgets to attend his own sell-out lecture about duty
A highly respected literary and academic figure failed to attend a talk he was due to give for a literature festival, after getting confused over the date.
Professor of philosophy Anthony Grayling, of Birkbeck College, University of London, had arranged to speak about his latest book, The Choice of Hercules.
His talk was part of Richmond’s 17th annual literature festival.
The book reflects on the challenges of duty versus pleasure.
[Seen on Not Exactly Rocket Science]
Frank Rich's New York Times piece cautioning that "the brightest are not always the best" is very good.
IN 1992, David Halberstam wrote a new introduction for the 20th-anniversary edition of “The Best and the Brightest,” his classic history of the hubristic J.F.K. team that would ultimately mire America in Vietnam. He noted that the book’s title had entered the language, but not quite as he had hoped. “It is often misused,” he wrote, “failing to carry the tone or irony that the original intended.”...
The stewards of the Vietnam fiasco had pedigrees uncannily reminiscent of some major Obama appointees. McGeorge Bundy, the national security adviser, was, as Halberstam put it, “a legend in his time at Groton, the brightest boy at Yale, dean of Harvard College at a precocious age.” His deputy, Walt Rostow, “had always been a prodigy, always the youngest to do something,” whether at Yale, M.I.T. or as a Rhodes scholar. Robert McNamara, the defense secretary, was the youngest and highest paid Harvard Business School assistant professor of his era before making a mark as a World War II Army analyst, and, at age 44, becoming the first non-Ford to lead the Ford Motor Company.
The rest is history that would destroy the presidency of Lyndon Johnson and inflict grave national wounds that only now are healing.
For those of us who come out of this kind of world, or at least have been influenced by and wanted to emulate these kinds of people, it's a nice little reminder that brains-- or the particular forms of intelligence that are bred in the hothouses of academia and think-tanks-- aren't everything.
I've been thinking about this for a while, because I've recently become aware of how formative the experience of graduate school was for me (or perhaps I've allowed it to become), and how much I've had to unlearn-- and still am unlearning-- some of the habits that I developed there and as a young academic.
In my current incarnation (as a mortgage owner, to say nothing of someone who lives at the interface of marketplaces and ideas), the contempt for money that I learned as a young professor-to-be is definitely a maladaptation. It's good to not be motivated primarily by money (unless you're in a job like banking, where that makes sense), but it's always bad to be careless about it, or to be uncomfortable talking about it-- something that as a consultant you can NEVER get away with. (Academic contempt for money is also to some degree a product of two other things: the fact that you're likely never to see much of it anyway, and that once you're tenured, you never really have to worry about it again. Your income is not large but extremely secure.)
Likewise, the assumption that you have to rewrite things a dozen times, worry over them for months, and get as much of your argument exactly right before you can let someone else see it, is definitely not attuned to the way the rest of the world works. The tiniest fraction of ideas are meant to be timeless; a slightly larger sliver might last for years; but the fact is, most ideas are perishable goods that need to be churned out, circulated, and monetized before times change. (Actually, a lot of scientific and scholarly ideas are like this, too.) Timely goodness is better than obsolete excellence.
I think of myself as living mainly in my own mind: aside from family and friends, most of my conversations take place with books and words. This has tended to translate into a mild (or maybe not so mild) disregard for the world. But recently I realized that I need to be hyper-effective in the world to be effective in my own world. The demands of the everyday don't go away; instead of ignoring them, you need to be able to deal with them with ruthless efficiency, so you have time and bandwidth for what really matters.
It makes me appreciate Sam Rayburn's words in an anecdote Halberstam tells in The Best and the Brightest:
Johnson, after his first Kennedy cabinet meeting, raved to his mentor, the speaker of the House, Sam Rayburn, about all the president’s brilliant men. “You may be right, and they may be every bit as intelligent as you say,” Rayburn responded, “but I’d feel a whole lot better about them if just one of them had run for sheriff once.”
Up at the Carnegie Foundation for the Advancement of Teaching this evening, for the opening of a conference on tinkering. It looks like it's going to be a really fascinating event. There are lots of cool people, it's a wonderful subject, and the venue is really nice.
In the Chronicle:
Wal-Mart, the nation’s largest private employer, long criticized for its workplace policies, is a “more-honest employer” of part-time workers than colleges that employ thousands of adjunct faculty members. That was the harsh message delivered to a group of college human-resources officials here on Monday by one of their own: Angelo-Gene Monaco, associate vice president for human resources and employee relations at the University of Akron....
“We helped create a highly educated part of the working poor, and it’s starting to get attention from outsiders,” he said, noting that unions are trying to organize part-timers, and lawmakers in nearly a dozen states are examining the issue.... “We rely on them for a very important function, and we assume that they will continue to accept mistreatment in return.”
Well, I made it. I may fall asleep, since my body thinks it's 2 a.m., but at least I'll fall asleep at the conference, rather than some random place in England.
I got here about an hour late-- not only did the bus take a little while, but I was dropped off about 10 minutes' walk from SBS-- but I got checked in, dropped my bag, and came to the lecture hall. Of course, in classic 19th century fashion, the doors to the lecture hall are at the front, so if you're late everyone can see you. (There are doors in the back, but the young lady who was doing registration didn't tell me how to get to those doors. I think she was punishing me.) So everyone knows I'm here. Not that more than a handful of people might recognize me, of course....
Walking up High Street, I went past a vast number of teenagers with their parents, all holding maps or slender catalogs. Is a summer school session starting? Or is this what Oxford is like all summer?
Terry Eagleton has a piece on Raymond Williams and the challenge of culture in the 21st century:
"Culture is ordinary," [Raymond] Williams wrote in a pioneering essay, and his own life was a case in point. He saw his transition from Black Mountains to Cambridge spires as in no sense untypical. Right to the end, he regarded the politically conscious rural community in which he was reared, with its neighbourliness and cooperative spirit, as far more of a genuine culture than the Cambridge in which he held a professorial chair and that he once acidly described as "one of the rudest places on earth". Working-class Britain may not have produced its quota of Miltons and Jane Austens; but in Williams's view it had given birth to a culture that was at least as valuable: the dearly won institutions of the labour, union and cooperative movements....
The real sense in which culture since Williams's death has become more ordinary has little to do with Dante or Mozart. One of Williams's key moves was to insist that culture meant not just eminent works of art, but a whole way of life in common; and culture in this sense - language, inheritance, identity, religion - has become important enough to kill for. Dante and Mozart may be elitist, but they have never blown the limbs off small children....
Ever since the early 19th century, culture or civilisation has been the opposite of barbarism. Behind this opposition lay a kind of narrative: first you had barbarism, then civilisation was dredged out of its murky depths.... Civilisation needs to be wrested from nature by violence, but the violence lives on in the coercion used to protect civilisation - a coercion known among other things as the political state.
These days the conflict between civilisation and barbarism has taken an ominous turn. We face a conflict between civilisation and culture, which used to be on the same side. Civilisation means rational reflection, material wellbeing, individual autonomy and ironic self-doubt; culture means a form of life that is customary, collective, passionate, spontaneous, unreflective and arational. It is no surprise, then, to find that we have civilisation whereas they have culture. Culture is the new barbarism.
There's also this great line: "In a rare moment of disillusion, he told me that the difference between teaching adults and students in the 1950s was like 'teaching doctors' daughters rather than doctors' sons'."
I don't think this was a very well-kept secret, but now it's official: in addition to my day job, and my work on the end of cyberspace book, I'm now officially an Associate Fellow at the Saïd Business School at Oxford University. It's a two-year appointment, which runs through the spring of 2010 (through Hilary Term, for those of you keeping track across the pond). I don't teach any courses, but I do work with students, and am on call to do things with SBS groups visiting Silicon Valley.
The appointment was initially approved in March, but they only got me up on the Web site this week. Such is the pace of things there. (And as one friend said, "My god, your picture on the SBS website is so Californian!" It was taken in the garden of Howard Rheingold's house. You don't get more California than that.)
I've still got my affiliation with Stanford, and thank heavens for that: having access to the Stanford library has been critical to my continued viability as a thinker. But I've got a couple executive MBAs I'm working with at Oxford, and have had a good time collaborating with people at the James Martin Institute. And in the last few years I've been to more conferences there than Stanford.
Strange to have closer intellectual ties to a university in England than to one three miles away, but such is life these days. Or my life, anyway.
Needless to say, this is a real thrill. Not because it represents some prospective return to academia, but because it's an interesting hybrid position. SBS is one of several business schools that are real intellectual hot-houses these days. Some of the best B-schools are no longer places that just train people to crank out exotic formulas or spout jargon, but are seriously thinking about what it will mean to do business in this century. Oxford has the added virtue of the James Martin Institute, which in the next few years will-- if it has any sense at all-- become the global epicenter for serious futures work. So this is a good time to get connected to this little world.
I've already promised several people that I won't start speaking like a character out of P. G. Wodehouse, as tempting as that would be.
[To the tune of Drew Barrymore & Hugh Grant, "Way Back Into Love [Demo Version]," from the album "Music & Lyrics".]
Using the title "Dr." if you have a doctorate from the U.S. can get you into trouble:
Americans with PhDs beware: Telling people in Germany that you're a doctor could land you in jail.
At least seven U.S. citizens working as researchers in Germany have faced criminal probes in recent months for using the title "Dr." on their business cards, Web sites and resumes. They all hold doctoral degrees from elite universities back home.
Under a little-known Nazi-era law, only people who earn PhDs or medical degrees in Germany are allowed to use "Dr." as a courtesy title.
The law was modified in 2001 to extend the privilege to degree-holders from any country in the European Union. But docs from the United States and anywhere else outside Europe are still forbidden to use the honorific. Violators can face a year behind bars.
[To the tune of Blue Öyster Cult, "Godzilla," from the album "Don't Fear the Reaper: The Best of Blue Öyster Cult".]
I've gotten a slew of Facebook and LinkedIn requests these last few days. Friend requests come now and then, but what's unusual right now is how many of them are from people I haven't been in touch with for a very long time.
This past weekend I got a friend request on Facebook from a high school classmate who I haven't seen since graduation, more than 25 years ago. He's now a pastor, and from what I hear a pretty good one.
I also reconnected with one of my high school music teachers. This is someone I haven't spoken to in a couple of decades, but she was one of my favorite teachers. It turns out that she was also one of the most influential. I've not sung in any organized venue since college, but I think singing gave me a valuable familiarity with public performance and an awareness (in a good way) of the craft and artifice of self-presentation.
This is not an impact either of us could have predicted, and it illustrates two things.
The first is that education is rarely wasted... but it doesn't always pay off where you expect. When my children were babies and waking up in the middle of the night, I was getting very little sustained sleep, and often thought to myself, this is like studying for my orals. I didn't read all that Joseph Ben-David, Margaret Rossiter and Andy Pickering in order to be more effective at baby-wrangling; but it turns out that the experience of having to plow through vast amounts of stuff, and not having enough hours to both read and sleep, paid off in unexpected ways. Nor did I study STS to become a futurist; but the value of STS as a conceptual toolkit and way of thinking is pretty self-evident to my colleagues.
The second is that if it's hard for us to predict how what we learn will pay off, it's almost impossible for our teachers to know. For me, one of the hardest things about teaching was the sense that I didn't know-- indeed, couldn't know-- what kind of impact I was having on my students, or would have on them. It might be that the enthusiastic ones would never find a use for anything I taught them, or that the smart but slightly jaded one would have a career-defining moment that turned on something she learned in class. All of that was unknowable to me, and I would have to take on faith that, after all was said and done, my impact would be more positive than negative (or maybe neutral was the worst you could reasonably expect-- a history teacher is going to have a hard time ruining anyone's life).
Of course, there are a few students you hear about, and if you're old enough you might merit some kind of formal recognition, which is an occasion for people to come and say nice things about you. But those kinds of events are pretty scripted, and come pretty late in one's professional life.
I wonder, though, if in the future teachers will find it a little easier to know how their former students are doing, and what kind of effect they might have had on them. My wife, who teaches eighth graders, is connected to some of her former students through Facebook; and while they may not talk regularly, those weak ties are easier to maintain than my connections to my teachers, and it's probably a little harder for them to decay to the point of being useless. (After a couple moves, I found that not only had I shed myself of things I wanted to get rid of, I'd also inadvertently thrown out things like address books, old letters, and the like. So much for going home again.) I suspect that in the future these links may make it easier for teachers to have a sense of how they've affected students. Which would be nice for everyone.
[To the tune of Perpetual Groove, "March of Gibbles Army," from the album "Live at The Music Farm, 31 December 2006".]
I write about people, technology, and the worlds they make.
I'm a senior consultant at Strategic Business Insights, a Menlo Park, CA consulting and research firm. I'm also a visitor at the Peace Innovation Lab at Stanford University. (I also have profiles on LinkedIn, Google Scholar and Academia.edu.)
I began thinking seriously about contemplative computing in the winter of 2011 while a Visiting Researcher in the Socio-Digital Systems Group at Microsoft Research, Cambridge. I wanted to figure out how to design information technologies and user experiences that promote concentration and deep focused thinking, rather than distract you, fracture your attention, and make you feel dumb. You can read about it on my Contemplative Computing Blog.
My book on contemplative computing, The Distraction Addiction, was published by Little, Brown and Company in 2013.
My next book, Rest: Why Working Less Gets More Done, is under contract with Basic Books. Until it's out, you can follow my thinking about deliberate rest, creativity, and productivity on the project Web site.
The Distraction Addiction
My latest book, and the first book from the contemplative computing project. The Distraction Addiction is published by Little, Brown and Co. It's been widely reviewed and garnered lots of good press. You can find your own copy at your local bookstore, or order it through Barnes & Noble, Amazon (check B&N first, as it's usually cheaper there), or IndieBound.
The Spanish edition
The Dutch edition
The Chinese edition
The Korean edition
Empire and the Sun
My first book, Empire and the Sun: Victorian Solar Eclipse Expeditions, was published by Stanford University Press in 2002 (order via Amazon).