"I happen to believe that you can’t study men, you can only get to know them, which is quite a different thing.” (C. S. Lewis in That Hideous Strength, quoted in Humphrey Carpenter’s J. R. R. Tolkien: A Biography)
"I happen to believe that you can’t study men, you can only get to know them, which is quite a different thing.” (C. S. Lewis in That Hideous Strength, quoted in Humphrey Carpenter’s J. R. R. Tolkien: A Biography)
Forgive me if I've referenced this before, but I find this new Wellcome Collection project too interesting not to share:
The urge to be busy defines modern life. Rest can seem hard to find, whether in relation to an exhausted body, a racing mind or a hectic city. Should we slow down, or should we embrace intense activity? What effects do each of these states have on the health of our bodies and minds? Such questions frequently find their way into media reports and everyday conversations, but there has never been any sustained interdisciplinary attempt to answer them. The Hub will gather international experts investigating hubbub and rest at different scales, to breathe new life into the questions we ask about rest and busyness.
It's not really clear to me what The Hub is, other than yet another converted-loft, Ikea-filled team-collaboration buzzspace, but hey, those aren't bad, and you can always escape them when you need solitude. Still, you can't fault the idea of a project on rest and busyness.
There’s a famous quote by Albert Einstein that “The whole of science is nothing more than a refinement of everyday thinking.” I’d heard it a number of times, but today I ran across a note by Carnegie Mellon psychologist David Klahr pointing out that Einstein’s aim was not to make science seem simple, but to call attention to the complexity of normal thinking. For Einstein continues:
It is for this reason that the critical thinking of the physicist cannot possibly be restricted to the examination of concepts of his own specific field. He cannot proceed without considering critically a much more difficult problem, the problem of analyzing the nature of everyday thinking.
The idea that “analyzing the nature of everyday thinking” is a “difficult problem” rather changes the meaning of the first line.
A few weeks ago I spoke at a memorial service for one of my thesis advisors, Riki Kuklick. While I was at Penn I also gave a couple of other talks, on postacademic careers and contemplative computing; but all three turned out, one way or another, to touch on Riki and her influence on me.
After I returned home, I noodled around with the talks, and eventually put them together. The result wouldn't have been appropriate in any of the three venues, but it better reflects what I was struggling to say in separate places on different days.
In September 2013 I returned to Philadelphia to speak at a memorial service for one of my favorite professors, Henrika Kuklick. Exactly thirty Septembers earlier, I stepped into my first classroom with Riki, and her course on the sociology of knowledge. It was the beginning of an association that would shape the next eight years of my life at Penn, and beyond.
Even though my father was a professor, and I was lucky to have some great teachers and role models at Penn, Riki lived the life of the mind in a way that was especially vivid and accessible. It goes without saying that she was as brilliant as the other professors who most deeply influenced me at Penn-- her colleagues Rob Kohler and Thomas Hughes; art historian David Brownlee; and strategist and systems thinker Russ Ackoff-- but she was, above all, a great model for aspiring scholars.
The Problem of the Real World
The importance of academic models like Riki for aspiring scholars shouldn't be underestimated, because academic life is often looked at skeptically by people who see themselves as firmly rooted in the "real world."
As my years at Penn drew on, some of my old friends and relatives expressed the opinion that all this education was just a way of avoiding the real world. The real world was the place where people DID things, made money, got stuff done. The university was fine if it helped you get a job, but otherwise there was little point to it. Well, if the university was NOT the real world, then I wanted no part of it. I wanted to be a professor; the campus would be MY real world.
That didn't work out: I graduated into a terrible job market, and after finishing my first book and a couple of postdocs, I became a consultant. But then I made a surprising discovery: the "real world" was actually a great place to pursue the life of the mind.
Working as a futurist means grappling constantly with epistemological issues around the possibility of predicting the future, your professional credibility, and the standards by which your work should be judged-- all familiar themes in the sociology of science. In the mid-1990s, thanks to the growth of the Internet, the rising importance of the service economy, the ferocious pace of technological and global change, and other factors, the boundary between the world of ideas and the "real world" was collapsing. To survive in today's economy, organizations have to think seriously about what they're doing and why, and to have models that explain how the world works and how it's changing. In their worldly impact, ideas are more real than ever.
One reason I was able to continue my own intellectual life was that I had Riki's pursuit of it as a model. There was nothing unreal about the life of the mind the way she lived it, or her love of the craft of scholarship. Her own professional life was lived in the ivory tower; she would have regarded the prospect of working with C-suite executives with horror. Despite this, she gave me the means to see the life of the mind as a devotion rather than just a profession, as an internal discipline as well as an academic one.
In a sense, I was also applying to my own life another lesson Riki taught me: that we should question what others believe is inevitable and inescapable, because what appears fixed may in fact be contingent and changeable. The expertise that may seem unassailable, the assumptions that seem self-evident, the truths that claim to be eternal, all may not be as real as they seem-- or like a great movie, their greatness may be a blend of hard work, clever staging, and a willing suspension of disbelief.
Seeing that the boundaries between the academic world and "real world" could be more porous than I'd believed helped me create a life that borrowed from both worlds. It let me uproot my own well-cultivated prejudice against corporate life. It freed me to reimagine academic life as something more portable and useful than I'd previously imagined. It let me see that one could make a life that combined the vita activa and vita contemplativa.
Another Real World: IRL
That experience of moving between worlds had a subtle but important resonance in my latest book. While writing The Distraction Addiction, I ran up against the sensibility that Facebook, text messaging, the Web, and the other things that make up the digital world can ONLY be distractions from a well-lived life; that proximate physical interactions are naturally superior to anything we can experience online; and that the best solution to our electronic troubles is simply to turn technologies off. We should get offline in order to spend more time in the real world, where we can have a real life. The simple and apparently innocuous acronym "IRL" turns out to be a kind of intellectual virus. It packs a lot of unexpected information and moral judgment in a very small package.
This claim is one side of an argument that's into its third decade. In the 1990s and the early days of the World Wide Web, figures like John Perry Barlow and Esther Dyson declared that cyberspace was a new world separate from and superior to the physical world; critics answered that the Internet was a threat to literature, social development, even our memory and cognitive abilities. To me this debate had a ring of familiarity. If the distinction between the academic world and real world doesn't make a lot of sense, I wondered, could the same be true of the apparently huge gap between digital life and real life?
Once I dug deeper, I saw that just as the distance between academic life and real life was overhyped, so too was the distance between digital life and real life. Technologies like smartphones, locative services, and wireless Internet access have erased the functional boundary between bits and atoms, while ecommerce, email, and social media have woven the digital world into our everyday lives.
Even more profoundly, I realized, using technologies is not something that makes us less human, or takes us away from our natural selves. Since the invention of stone tools two million years ago, human bodies have co-evolved with our physical tools, while our minds have co-evolved with our cognitive tools. We are, as philosopher Andy Clark puts it, natural-born cyborgs. At its best, this entanglement of person and technology extends our cognitive and physical abilities, gives us great pleasure, and makes us more human.
The challenge with smartphones and social media, then, is not to learn to give them up, but to learn to use them wisely. We need to practice what I call contemplative computing, developing ways of working and interacting with information technologies that help us be more mindful and focused-- and thus better people-- rather than be endlessly distracted and frustrated.
By better understanding the nature of attention and distraction, by studying how our interactions with technologies go bad, and by experimenting with new ways of using them, we can resolve the paradoxes these technologies seem to bring into our lives. Using them wisely helps us become wiser about ourselves. Being more mindful about HOW we use technologies helps us be more mindful WHILE using them.
This leads me to argue that we should push back against the moral distinction between academic life or digital life on one hand, and real life on the other. We shouldn't think in terms of a "real life" versus a "digital life" any more than we should think of our lives in the library or laboratory as unreal.
IRL = In Richer Life
To put it another way, we should redefine what the acronym IRL means. When people talk about "going IRL," one of the things they're doing is expressing a desire for self-improvement: turning off the devices, going camping, or spending time with family and friends. The impulse is laudable, but the assumption that it can only happen when you hit the off switch is incorrect.
Instead, we should think of RL as a richer life, one that isn't driven mainly by distractions, but reflects a serious attempt to create meaning in the world, to do things that matter with our lives, to build and extend our selves. This is an effort in which the thoughtful, judicious, mindful use of technology can play a role-- and in which those habits of mind that we think of as "academic" can also be intensely useful. We can build lives that aren't merely real, but richer, using tools that take form in silicon and electrons, or tools that are encoded in words and ideas.
Practicing contemplative computing requires taking a more critical, ethnographic approach to how we use technology: asking basic questions about why we use technologies, noticing our unconscious habits, examining how we think about our devices, and seeing how they affect the way we think about ourselves. All these ideas could have come from one of Riki's classes, even though they're applied in an area that seems outside her scholarly interests.
Riki and the Richer Life
But that ability to follow ideas wherever they lead, to pursue diversions until they reveal something unexpected yet connected to your original interests, is just me channeling another of Riki's habits.
Riki was an astonishing conversationalist-- indeed it was hard to get a word in edgewise. If you didn't know her you might listen to her monologues and think she was just free associating. But if you listened carefully, you discovered that she would start a sentence, interrupt herself and veer off onto another subject, then do it again, and again-- and then, systematically work her way back, until twenty minutes later she finished that first sentence. That ability to draw together a dozen different subjects in a single conversation, to weave between and weave together different ideas, never failed to amaze her students, and I suspect there's an echo of it in my writing even today.
But in a sense the questions I'm working on now are not outside her area at all. What Riki showed me, through her work and her life, is that far from being an escape from real life, the life of the mind can serve as a model for how to build richer lives.
The categories of "real world" on one hand, and "digital world" or "academic world" on the other, can be remade, and in remaking them we can make better, richer lives for ourselves. A more thoughtful understanding of our everyday engagements with technology can improve our lives. Contemplative computing is an attempt to make sense of what it means to be human, how to think about the divide between people and technologies, and to see that the challenge and opportunity we face is not to learn how to live in real life, but to learn how better to use tools and time to have a richer life.
Deanna Day, a grad student at my alma mater, wrote a nice little piece on "Harry Potter, Wizards, and How We Let Technology Create Who We Are." It gets deep into the weeds of the Harry Potter universe, but it makes a serious point about how magic and technology can shape their users:
Muggles and wizards alike are mystified by the mechanisms of objects like iPads and Sorting Hats, and this ignorance can often, ironically, create a deep sense of trust in these objects. We create stories that explain their behavior, and when our tools work, it cements the validity of those stories. How else to explain their mechanism, be it magical or mechanical? But when we allow our technologies to remain opaque, we also prevent ourselves from seeing the crucial ways they make us who we are....
Most of the piece is about how wands work, who can use them, and their relationship to their users (e.g., the whole "wand chooses the wizard" thing). She concludes:
[T]he stories that wizards tell about their tools don’t match up with how they’re used in practice. The wand chooses the wizard, because that’s what wizards want to believe about their type of magic. In this story, wizards are special, and wands are objective proof. In another example, the Sorting Hat is believed to reveal one’s true identity, until an arguing student reveals that the Hat’s interpretation — and its social consequences — are much more negotiable than its song would imply.
In this (and many other) ways, the wizarding world exists in parallel with the muggle world.... By pointing out some of the ways that the technologies of the wizarding world are constructed — and the kinds of wizards they construct — we might also be better able to see the workings of our own muggle magic. As we go about our lives using our mysterious technologies, what kinds of people are we enabling them to make up?
Jessica Francis Kane, writing in The Atlantic, talks about a Marcus Aurelius quotation that she took to heart:
Book 8, #36
Do not disturb yourself by picturing your life as a whole; do not assemble in your mind the many and varied troubles which have come to you in the past and will come again in the future, but ask yourself with regard to every present difficulty: 'What is there in this that is unbearable and beyond endurance?' You would be ashamed to confess it! And then remind yourself that it is not the future or what has passed that afflicts you, but always the present, and the power of this is much diminished if you take it in isolation and call your mind to task if it thinks that it cannot stand up to it when taken on its own.
I thought about how Marcus Aurelius's concerns and mine differed, but I was inspired by the idea that the spirit of them, separated by so many centuries, was similar. His words helped me get to the desk, and stay there, during all the years it took me to write my first good story. Writing is hard, but is it unbearable? Who would say that it is? Even asking the question, I'm reminded of the one exclamation in the passage: "You would be ashamed to confess it!" His words helped me navigate rejection, which is certainly no fun, but if you ask yourself if it's unbearable, you find yourself preparing the next self-addressed stamped envelope pretty quickly. The words helped me survive the protracted sale of my first novel, and they reminded me to start writing again after a long hiatus after the birth of my first child. I wasn't sure how to make room for writing with a baby. It is difficult, but beyond endurance? I got myself back to the desk.
Personally, I think nothing prepared me for writing as well as studying the Victorians. Not because they invented the world as we know it (in many ways they did), or because their work was awesome (though it was), but rather because they got so much done. Tomes, multivolume histories, three-decker novels. Theories, scientific discoveries, expeditions, surveys. Buildings, massive urban redesigns, vast public works, and more than a few dark Satanic mills. New ways of seeing the world, of traveling it, of recording it.
And they still managed to take month-long vacations, or at least have high tea. The older I get, the more impressive that part is-- and, I begin to suspect, the more important it is for understanding why they were able to get so much done. It wasn't just the absence of television or Facebook. My intuition now is that they were productive because they had a better sense of when to quit for the day. They could be more productive because they were more measured.
Granted, I have absolutely no real evidence for this, and I'm sure it'll be years before I can really chase it down, but their lives were about as well-documented as you can get without FitBit and SenseCam, so I'll bet you really could study their work habits, how much time they spent at work and play, how they saw the differences between the two, and how it made them great.
Yesterday I found out that one of my mentors from college and graduate school, Henrika Kuklick, died.
Riki was one of the professors who got me hooked on the history of science, and along with Rob Kohler helped make me who I am. In the fall of my freshman year I had taken a seminar with Tom Hughes, mainly because it sounded interesting and he had a Ph.D. from the University of Virginia, and then in the spring had a class with Hughes and Rob, who would go on to be my undergraduate and graduate advisor. In my sophomore year I took Riki's sociology of science class, and from then on hardly a semester went by when I wasn't taking something with her.
Riki was a kind of intellectual performer I'd never encountered before. I never knew anyone who could keep track of so many thoughts: I marveled at how she could start a sentence, divert herself, then go off on something else, but then work her way back up and finish the sentence 20 minutes later. She had a kind of unreserved enthusiasm for life and ideas that really resonated with me; my decision to work on Victorian science was influenced in no small part by her description of living in England and working in the archives there. When I was a bit older and had more of a critical sensibility, I found her scholarship to be really outstanding, erudite without being purposely complicated: I taught her Great Zimbabwe Ruins article in several of my classes, and it always went over well.
She was also a great person and teacher, always supportive and generous, great at helping you think through arguments. Not the closest reader, though; lots of chapters came back with "Good work" scrawled at the end, and little more. (That's why you needed Rob Kohler on your committee. That man could line edit a diffraction grating.)
There are lots of people who can hardly remember classes from college, or the professors they had. Riki, in contrast, introduced me to a set of questions about the ways people, ideas, and technologies interact that I'm still dealing with. It's why I dedicated my first book to her and Rob. And I think I'll spend the rest of my life working on things that we talked about. Fortunately they're very big questions.
I find as I close in on 50 that I don't particularly notice my age: I've had some grey hair since I was in graduate school (it'll do that to you), and aside from bifocals, I'm not in worse physical shape than I was then (though that's not the highest bar ever set); more important, I'm a better writer and thinker than I've ever been in my life. But what I can't comprehend is other people getting older, too: my parents are in their 70s, which I find weird, and Riki was 70, which to me is inconceivable: my memory of her was fixed in the 1980s.
It's one of life's ironies that the gap a person leaves when they're gone is as large as the impact they made when they were alive. By that standard, Riki's passing leaves a very large gap indeed.
Everything must be made into a sweeping analogy. And his instinct for grandiosity is so pronounced that only a small group of recurring subjects are fit for the comparisons that he offers. Would anyone other than Newt Gingrich respond to failing to get on a ballot by asking, "Okay, what's the historical analogy?" Even the "the" is perfect -- as if there is one definitive analogy that fits….
Knowing Gingrich regards Pearl Harbor as "the analogy" for effectively losing Virginia, what comparison will he think appropriate if he loses the entire campaign? The Bataan Death March? Lincoln's tragic night at the theater? The fall of Rome? Or maybe he'll win, and God help us if so. Has there ever been a politician more inclined to make history? In a president of the United States, that is a dangerous impulse.
There's actually a serious point here, as Conor observes. For all his love of history-- or his love of himself as an historian-- Gingrich's use of history is remarkably shallow: a kind of Mad Libs built from the Revolutionary War, the Civil War (how many people has Gingrich challenged to "a series of Lincoln-Douglas debates"?), and the Cold War; comparisons of himself to Washington, Reagan, and Thatcher; and apocalyptic language that would have warmed the hearts of Leo Strauss or Nostradamus.
He truly is a stupid person's idea of what a smart person is like.
Also, this handy flow chart.
Has anyone written an article on the term "real time"-- where it comes from, how it's been used in the last few decades, and what it means today? In particular, I'm interested in how and when the "realness" of things like computer networks and information flows became more "real" than that of more "natural" everyday human life.
Theodore Roszak, author of The Making of a Counter Culture and The Longevity Revolution-- and another two dozen books, more or less-- has died. I interviewed Roszak a few years ago for a project on the future of aging, and loved the passion and energy of his first book, on the counterculture. In effect, in those two books Roszak chronicled the Baby Boomer generation's great impact on American society and culture. As I explained a while ago,
Roszak spent his career at Cal State Hayward, and retired a few years ago. A decade ago, he went through a medical crisis "that would have killed our parents or grandparents. I came through this realizing that… I might live another twenty or thirty years," he recalled to me last fall.
His experience revealed two things. First, surviving an event like this is "a profoundly transformative, spiritual experience" that makes you "wonder what you're going to do with what you now see as a gift." Second, he wasn't alone: for more and more people, events like this are becoming a passage into a new, as-yet undefined phase of life.
I really became a fan of his when I came upon his review of Buckminster Fuller's Operating Manual for Spaceship Earth, one of the smartest, most critical attacks on Fuller's worldview I've read.
My other substantive encounter with Roszak happened a decade ago, around his book From Satori to Silicon Valley, which he was kind enough to let me republish on my Making the Macintosh project. He was quite supportive of my interest in the book (which makes an argument about the link between counterculture and computing later explored by Fred Turner and John Markoff, and more recently applied to theoretical physics by David Kaiser), and gracious in letting me reprint the book.
Finally, I recently rediscovered a book he published in 1986, The Cult of Information, which I'd picked up at a book sale and never actually looked at in detail. However, it prefigures some of what I talk about in contemplative computing, particularly in its insistence on the difference between human and computer intelligence and memory:
Two distinct elements come together in the computer: the ability to store information in vast amounts, the ability to process that information in obedience to strict logical procedures. Each of these will be taken up in turn in Chapters 5 and 6 and explored for its relationship to thought. There we will see how the cult of information fixes upon one or the other of these elements (sometimes both) and construes its intellectual value. Because the ability to store data somewhat corresponds to what we call memory in human beings, and because the ability to follow logical procedures somewhat corresponds to what we call reasoning in human beings, many members of the cult have concluded that what computers do somewhat corresponds to what we call thinking.... The burden of my argument is to insist that there is a vital distinction between what machines do when they process information and what minds do when they think.
In the course of reorganizing my home office (I'm starting serious work on my next book, on contemplative computing), I've made another cull of my book collection. Like last year, I've got several large boxes of books that are duplicates, books I read long ago, or books I'm honestly never going to read. Some are in pristine condition (alas), while others are annotated. I'm looking to give them away to someone who can use them-- preferably a grad student in history of science / STS or some related field, but that's highly negotiable.
Many of them are really outstanding, but the trajectory of my professional work is such that they're no longer really relevant for me, and I've got more books coming in the door every day. Here's a picture of many, but not quite all, the books I'm giving away.
Go to flickr for a full size version
The terms of the giveaway are simple: if you're interested, contact me at askpang at gmail dot com.
Edward Tenner has a piece in The Atlantic about the Titanic's continued importance in Ireland as "an icon of the vanished glories of Belfast shipbuilding at its peak." This makes sense: when I was in Cambridge, I saw someone wearing a t-shirt that said,
The Titanic: Built by 15,000 Irishmen. Sunk by one Englishman.
As someone said, the past isn't a different country. It isn't even past.
Robert Darnton challenges "five myths about the information age" that, taken together, "constitute a font of proverbial nonwisdom."
It used to be said that the difference between God and Robert Darnton was that God was everywhere, while Darnton was everywhere but Princeton. Now that he's Harvard's university librarian, I wonder if the joke has migrated and been updated.
An Ohio congressional candidate turns out to have been a Nazi reenactor, though he claims he only did it as "a father-son bonding thing." (WTF?) I grew up in Civil War reenactment territory, but this strikes even me as really weird.
I spent ten years of my childhood in Tennessee and Virginia. For most of that time I lived in Richmond, the once and current capital of the Confederacy. The city's beautiful Monument Avenue has statues to Confederate generals (and one recent addition, of Arthur Ashe), and the city is dotted with other memorials or institutions devoted to Confederate history. I knew a few people in high school who were into Civil War reenactment. For those unfamiliar with it, this doesn't involve buying black people, attacking federal institutions, dying of dysentery, or coming home to find your modest pre-war life reduced to ashes; rather it means dressing up in grey uniforms and pretending to fight Civil War battles.
For the people I knew, it wasn't a matter of keeping ancient grievances alive; Civil War reenactments were more like hunting or fishing, a hobby that got you outdoors and away from the women. And it wasn't particularly political: it wasn't a gateway drug to joining the Klan or agitating for reparations for Southern slave-owners, any more than going to Renaissance faires leads to a belief in monarchy or droit du seigneur. (Though New Republic writer Ed Kilgore, "a white southerner old enough to remember the final years of Jim Crow, when every month was Confederate History Month," once suggested organizing "a Neo-Confederate History Month that draws attention to the endless commemorations of the Lost Cause that have wrought nearly as much damage as the Confederacy itself.")
There are lots of other kinds of reenactors around: there are even hilarious reenactments of the American Revolution.
But this strikes me as pretty different from dressing up in a Waffen SS uniform and marching around. Even Germans in Germany don't do this; I haven't seen arguments that participants here in the States tend to be German-American. It also seems to me a lot harder to disentangle tacit endorsement of the politics of National Socialism from reenactment, particularly when you choose to imitate members of an especially fanatical and murderous component of the Nazi machine. (Granted, their uniforms were fabulous, but still, what's wrong with the Wehrmacht?) While you can make some exculpatory argument that at least some Confederate soldiers and officers fought to defend their homeland, and weren't mainly motivated by a desire to preserve slavery (an argument that really cannot be made about the leaders of the Confederacy-- for all the whitewash about states' rights and so forth, they really were about slavery first and foremost), it's harder to make a similar argument about the Waffen SS: they thought of themselves as a vicious elite. Imitating them says something.
Finally, for those for whom World War II games on Xbox just aren't enough, there are lots of alternatives to pretending to be part of a regiment that cleared the Lodz ghetto. The WWII Historical Reenactment Society has a list of reenactment units around the country. There are a lot of them, and they reveal some interesting national differences. The British reenactors are mainly commando units. The Americans are all over the place, in terms of the branches and types of groups-- heck, there's even a homefront reenactment group. There don't seem to be any Japanese reenactment groups (which tells us something about the degree to which wartime Japanese are still regarded as more of an Other than the Nazis). Given that you could join the pretend 82nd Airborne or the Royal Marines or the SAS, it's especially interesting that most of the German reenactors are Panzer divisions, and in a vaguely sinister twist, a non-trivial number of them are SS.
And this bit of comparison didn't make me feel any better:
Local Tea Party organizer Bill Zouhary told The Blade Saturday that it was important to focus on what Mr. Iott stands for, which is smaller government and a move away from the types of big-spending policies that could result in hyperinflation, or a revisit of 1930s Germany.
"I thank God there are people like him to run. This isn't an issue about Nazis; this is an issue about what's going on in Washington," he said. "The re-enactments, if anything, would bring to light what happens when you end up with an economy which is very similar to the economy we have now, which opened the door for a dictator like Hitler."
Ah, historical comparisons.
Update: Hitler responds.
This observation from The League of Ordinary Gentlemen about the expert witnesses in Perry is worth quoting:
in the findings of fact Judge Walker ended up relying almost exclusively on the plaintiffs’ witnesses, above all Nancy Cott, whose work is indeed excellent. Gays and lesbians should thank her, George Chauncey, Hendrik Hartog, and indeed the entire American historical profession that supported their work. Historians have made very clear what marriage has and has not been throughout American history, even if it doesn’t necessarily square with many of our received understandings about the way things always were. The truth is that marriage has been a continuously changing institution, not one settled for all time, and that same-sex marriage is simply one more change in a direction that aligns marriage more closely with our ideals and values. Historians have performed an invaluable service to the cause of liberty and human dignity in this case.
Ruth Evans takes an historical perspective on Andy Clark's natural-born cyborgs argument, and on his claim that "human cognition is not just embodied but embedded: not mind in body, but both mind and body enmeshed in a wider environment of ever-growing complexity that we create and exploit to make ourselves smarter."
From the abstract:
The philosopher and cognitive scientist Andy Clark has argued that humans have always been ‘natural-born cyborgs,’ that is, they have always collaborated and merged with non-biological props and aids in order to find better environments for thinking. These ‘mindware’ upgrades (I borrow the term ‘mindware’ from Clark, 2001) extend beyond the fusions of the organic and technological that posthumanist theory imagines as our future. Moreover, these external aids do not remain external to our minds; they interact with them to effect profound changes in their internal architecture. Medieval artificial memory systems provide evidence for just this kind of cognitive interaction. But because medieval people conceived of their relationship to technology in fundamentally different ways, we need also to attend to larger epistemic frameworks when we analyze historically contingent forms of mindware upgrade. What cultural history adds to our understanding of embedded cognition is not only a recognition of our cyborg past but a historicized understanding of human reality.
This reminds me some of the work of the cognitive anthropology crowd, which I find necessarily speculative but extremely ambitious and interesting.
Because I'm a whore for attention, or perhaps want to put all my friends to sleep (only I know for sure), I've posted my latest article, "Paper Spaces: Visualizing the Future," on Future2.
Oddly, I managed to dress to match the Cray supercomputer in the exhibit area.
Tony Perrottet found something pretty interesting:
A couple of years ago, while researching a treatise on salacious European history, I discovered the phantasmagoric wonderland of sex that was Georgian Britain, the era from 1714 to 1837. Long before the heyday of Austin Powers, debauchery proliferated up and down the rain-soaked land, fueled by riotous boozing and self-indulgence. "There was a gusto about 18th century vice unmatched before or since," writes historian Fergus Linnane with tangible nostalgia, in London: The Wicked City. A flood of wealth from the budding empire allowed the leisured classes to fulfill their carnal fantasies without restraint. And perhaps the most striking feature of the age was the explosion of British sex clubs.
Yesterday was the 150th anniversary of the publication of Darwin's Origin of Species. Happy birthday, evolution! (Alfred Russel Wallace, meanwhile, is nowhere to be found.)
And yes, this is the world's worst Stephen Colbert impression.
Just over four years ago, Apple came out with the Mighty Mouse, its now-standard multi-button mouse with a scroll ball. I talked about the mouse as a canonical example of a device that is "easy to use," but its evolution shows that the definition of "easy to use" changes a lot over time. The release of the Magic Mouse shows the same pattern: it's a device that Apple is presenting as easy to use (and at the same time revolutionary), but its ease of use depends not on the permanent revolutionary genius of Steve & Co., but on the changing repertoire of user skills.
As I explained in a 2005 post, when it first appeared in the early 1980s,
the mouse was a total novelty, and anything with more than one button required users to think and make decisions. In contrast, in an age of Game Boy, Playstation, Treo, Blackberry, and the cell phone (not to mention multibutton mice on Wintel machines), kids can look at a device with four buttons and a scroll ball and think, "Hey, that's easy to use." To me, the best indicator of just how far the goal-posts that define ease of use have moved is the now-pervasive use of thumb buttons on mice. Doug Engelbart wanted to put more than three buttons on his mouse, but couldn't figure out how; apparently they didn't think of putting them on the side of the mouse, under the thumb.
What this tells us is that while the concept of "ease of use" is wonderful, and to be encouraged at all times, just what constitutes ease of use will change over time. It's not some unchanging Platonic ideal; it varies and evolves, and is defined by a community's exposure to earlier technologies, levels of mechanical or physical skill, and a bunch of other factors.
The Magic Mouse represents another example of practices that now define "easy to use" but were once obscure. The top surface of the mouse is one big multitouch area, like an iPhone or iPod Touch: there's no physical button; instead the surface treats certain zones as the equivalent of a left button, right button, etc. You can also do two-fingered swiping.
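To make the zone idea concrete, here's a quick sketch in Python. Everything in it-- the coordinate system, the 50/50 zone boundary, the gesture names-- is my own invention for illustration, not Apple's actual firmware logic:

```python
# Hypothetical sketch: interpreting touches on a buttonless, zoned surface.
# Coordinates are normalized (x, y in [0, 1], origin at the top-left);
# the left/right split and gesture names are illustrative assumptions.

def interpret_touch(points):
    """Map raw touch points to a button or gesture event."""
    if len(points) == 2:
        return "two-finger swipe"   # e.g., scrolling or page navigation
    if len(points) == 1:
        x, _ = points[0]
        return "left click" if x < 0.5 else "right click"
    return "no event"

print(interpret_touch([(0.2, 0.1)]))              # left click
print(interpret_touch([(0.8, 0.1)]))              # right click
print(interpret_touch([(0.3, 0.5), (0.6, 0.5)]))  # two-finger swipe
```

The point of the sketch is that the hardware reports only touches; "buttons" exist purely as software conventions that users have learned to expect.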
Before the appearance of the iPhone and other devices with multitouch interfaces, the physical vocabulary of swiping, of having zones on a device that change their function depending on what program is being used or what mode you're in, would have been alien and confusing. Now, no longer. It's another section in the secret history of physical skill that's part of the history of computing (and one we still don't pay enough attention to).
Yale University astrophysics professor Kevin Schawinski studies how galaxies form. But his most valuable tool isn't a telescope or arcane theory. It's Galaxy Zoo, a project that has enlisted the help of more than 150,000 "citizen scientists" to classify a million galaxies.
Why use people rather than computers for such an undertaking? At least for now, humans with a little training are more accurate than expensive software. And when you have a million galaxies to classify, you want all the help you can get.
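Here's a minimal sketch of why the crowd's answers are usable at all: combine many independent classifications, and only accept a label when enough volunteers agree. The labels, the threshold, and the simple majority-vote rule are my own assumptions-- the real Galaxy Zoo pipeline weighs individual classifiers far more carefully:

```python
from collections import Counter

def consensus(labels, threshold=0.8):
    """Return the majority label if enough volunteers agree,
    otherwise flag the object for expert review."""
    label, count = Counter(labels).most_common(1)[0]
    if count / len(labels) >= threshold:
        return label
    return "needs expert review"

# 20 hypothetical volunteer classifications of one galaxy:
votes = ["spiral"] * 17 + ["elliptical"] * 2 + ["merger"]
print(consensus(votes))  # -> spiral
```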
Not so long ago, "citizen scientist" would have seemed to be a contradiction in terms. Science is traditionally something done by people in lab coats who hold PhDs. As with classical music or acting, amateurs might be able to appreciate science, but they could not contribute to it. Today, however, enabled by technology and empowered by social change, science-interested laypeople are transforming the way science gets done.
This won't be exciting to anyone but historians of science, but yesterday when I was at the Natural History Museum, I noticed something new. The statue of Richard Owen that had been on the landing of the grand staircase had been replaced by Charles Darwin.
Here's Owen in 2005:
And Darwin now:
Nothing at all symbolic about the swap, of course.
This afternoon I went to the Jewish Museum Vienna. The museum is located near Stephansplatz, in the center of the city, in a building that was once the residence of a Hungarian count, and later an art gallery. I had heard good things about it, but I was completely unprepared for how brilliant and moving it is. Of course, anyone who knows anything about the history of Europe should have some passing familiarity with the brilliant and tragic history of the Jewish community in Vienna. But the experience of the museum isn't powerful because it plays on your emotions in obvious ways: it's not a kind of material version of Schindler's List. It's an amazing experience for other reasons.
The bottom floor is dominated by a large room with selections from the Max Berger Collection of Judaica. The room itself has a curved wall that stretches to the top floor.
It reminded me of Adolf Loos' Steiner House, one of the great milestones of early Viennese modernism.
steiner house, vienna
The exhibit is in one long case, with selections from the Torah etched on the glass in front of the objects.
The placement of the words is unconventional to say the least, and the intention is twofold. First, it's meant to remind you that the objects were used in rituals that were communal and spoken, and were themselves representations of the divine: in a way, the words are what you should focus on, and the objects are their mere physical expression. Second, the words present a challenge: you have to choose to look past them to see the objects. Seeing objects and their history actually isn't easy, and the design of the exhibit highlights this fact.
Now if this all sounds kind of postmodern, you're exactly right. Indeed, the whole museum is saturated with an awareness of the complex nature of representation: the current exhibit on stereotypes, for example, talks about this a lot (no surprise).
It's hard to talk about stereotypes and not sound kind of saccharine or predictable: even if you try to apply some reflexivity or rigor, people who study stereotypes tend to conclude that "stereotypes are bad." I haven't seen someone (say) deconstruct Amos and Andy and conclude that black people really are awfully funny when they shuffle.
But even the stereotypes exhibit-- inevitably titled "typical!"-- is saved by the amazingly good design. Everything about the exhibits is carefully constructed: the lighting is terrific, the artifacts are well-placed, the rooms flow into each other smoothly, and there are all kinds of interesting choices-- the scrims that hold the captions and gently divide rooms, most notably-- that make the experience a lot more compelling than I expected. Too often we associate this kind of perfectionism with a lack of passion and creativity: real creativity is messy and ragged and inspired, grunge rock rather than Chopin. But that's not at all true: perfectionism can communicate a level of reverence for craft and content that forces you to take its subject more seriously.
That experience of consciously looking through-- looking through words etched in glass, through cloth, through one's own perceptions and limitations-- in the Berger Collection and typical! reaches a climax in the visible storage room in the archive, on the top floor.
Actually, that's not quite true. It's more complicated than that. In a normal museum, there are objects that are stored away and inaccessible to the public. The visible storage room, as the name suggests, makes the storage of objects... visible. There are map drawers, climate-controlled storage for textiles, and movable stacks, all the kinds of things that historians are familiar with. Even if you're not someone who's spent time in archives, you might find that infrastructure kind of interesting; but if, like me, you've literally grown up in archives (when I was a kid I went with my father to the National Archives in Rio, and sat in the reading room and read science fiction while he read notarial documents... or whatever), the experience is really striking.
But the more I took in the storage area, the more remarkable it seemed. Some of the artifacts had been damaged in the attacks on Jews in 1938, and they'd never been repaired: you could still see the burn marks on some of the Torah covers. And the reason this collection exists is that the civilization that produced it was very nearly completely destroyed: Vienna in particular had one of the most prosperous and accomplished Jewish communities anywhere, and that world was eradicated almost overnight. The collection is a witness to that tragedy... and it exists in part because of it.
While it's described as a visible storage area, you can only see things on the outer shelves; the next shelves are faintly visible... but after that, objects recede into obscurity. Visibility shades into invisibility. You're aware of the objects, but just as aware of everything you can't see: tens of thousands of artifacts, boxes and boxes of material.
So the collection is visible and accessible, but at the same time what you can see reminds you of everything that's invisible and inaccessible; it's dedicated to preservation, but some of what's being preserved is a record of destruction. Like the practice of history itself, it demonstrates how we can examine and ponder the records of the past, but never really get to the people and age that made them. It's like a beloved person who's physically close, but firmly and permanently out of reach. All these ideas are stock in trade among historians trained in postmodernism, and I've known them for years; but I've never seen postmodernism put to so serious a purpose, nor its ideas used with such gravity and grace.
There are times when you're confronted with something so brilliant and devastating it's almost overwhelming. Maybe it was the combination of jet lag and overwork, but the brilliance of the idea of the visible archive, the amazing care that went into it, the seriousness with which it uses concepts about history and memory that too often are used for merely clever and sophistic purposes, and the history of the artifacts themselves, made this one of those times for me. For minutes I stood there, not daring to move, not wanting it to end, amazed at how perfect and piercing it all was.
My friend (and, if the editors smile upon our efforts, soon-to-be coauthor) Darlene Cavalier pointed me to an article by Dan Schultz on Media Shift Idea Lab about what journalists can learn from the citizen scientist movement. Essentially, the piece argues that citizen and professional scientists have developed a division of labor and authority that journalists could emulate.
First, such a division of labor and authority has emerged in science before. In the early nineteenth century, the scientific world in Britain (and in somewhat similar measure the U.S.) consisted of a small elite that ran the Royal Society, was considered (or considered itself) competent to deal with big theoretical issues, and set the agenda for science; and a mass of local observers, ranging from country parsons and skilled artisans to teachers and soldiers. Members of this second group could become notable for masterful knowledge of a narrow slice of the universe-- the natural history of their parish, the habits of large mammals in eastern Kenya, Jupiter's moons, etc.-- and could make meaningful contributions to science within their area of expertise.
These boundaries weren't entirely hard and fast-- there was always the possibility of moving up from the category of local expert to scientific eminence (Charles Darwin might never have made that jump if he hadn't gone on the Beagle), or of reaching beyond one's place-- but people generally (to use an outmoded phrase) knew their place.
The existence of these well-understood boundaries, and the resulting symbiotic relationship that is emerging between professional and citizen scientists, gives Schultz hope that journalists could create something similar:
If you buy my claim that scientists and journalists all care about informational integrity and the quest for truth, then several things can be extrapolated:
- If professional journalists take the lead by clearly defining expectations, explaining best practices, and implementing an accessible infrastructure, then amateurs can contribute without disrupting the industry.
- If amateur journalists do a good job of covering a smaller scope of topics or areas (e.g. the hyperlocal), then professionals can focus on the deeper, otherwise inaccessible issues.
- Professional journalists are responsible for creating and maintaining the citizen network if they want it to meet their standards.
- Citizen networks need more than a host. In order to reach full potential, they need to be explicitly empowered through tools and guidance.
- A symbiotic relationship between the professional, the amateur, and the crowd is not just possible, it's socially optimal.
And there we have it: If the journalism industry can create an infrastructure that allows amateurs to contribute reliable information, then professionals will be able to dedicate more resources to epic reporting. If local papers can find the capacity to set up and empower meaningful citizen networks, they will establish a major foothold in the evolving domains of community and information.
But this leads to my second point, which is that this division of labor and authority is exactly what some bloggers argue is unnecessary today-- and which is more at issue in contentious scientific fields like climate change (or alas, evolution) than it should be. Proponents of intelligent design, for example, have quite brilliantly appropriated the language of democracy to suggest that people should be allowed to make up their own minds about evolution, and could easily make a similar appeal using the citizen science movement. Journalists, it seems to me, are likely to have a tougher time differentiating what they do from "citizen journalists," particularly in an age in which the boundary between reporting and opinion has been eroded, and the professional status of journalists is under assault.
Still, it's a good model to follow.
Paul Graham has a nice post on the different ways managers and "makers" divide up time:
There are two types of schedule, which I'll call the manager's schedule and the maker's schedule. The manager's schedule is for bosses. It's embodied in the traditional appointment book, with each day cut into one hour intervals. You can block off several hours for a single task if you need to, but by default you change what you're doing every hour.
When you use time that way, it's merely a practical problem to meet with someone. Find an open slot in your schedule, book them, and you're done.
Most powerful people are on the manager's schedule. It's the schedule of command. But there's another way of using time that's common among people who make things, like programmers and writers. They generally prefer to use time in units of half a day at least. You can't write or program well in units of an hour. That's barely enough time to get started.
When you're operating on the maker's schedule, meetings are a disaster. A single meeting can blow a whole afternoon, by breaking it into two pieces each too small to do anything hard in. Plus you have to remember to go to the meeting.
When I read this, I thought: this explains why I find meetings so disruptive to my days. I'm a pretty social person, but I find myself increasingly aware of the need to create large blocks of time during which I can really get into a subject, and to plan my days so all calls and meetings are loaded into a single period, rather than spread throughout the day.
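Graham's point can almost be made arithmetically. Here's a toy sketch, with invented hours, of how a single mid-afternoon meeting shrinks the longest uninterrupted block far more than its nominal length suggests:

```python
def longest_free_block(day_start, day_end, meetings):
    """Given non-overlapping (start, end) meetings in hours,
    return the longest uninterrupted stretch of the day."""
    gaps, cursor = [], day_start
    for start, end in sorted(meetings):
        gaps.append(start - cursor)
        cursor = end
    gaps.append(day_end - cursor)
    return max(gaps)

# A 9-to-6 day: empty, vs. one 1-hour meeting at 2pm.
print(longest_free_block(9, 18, []))          # 9 hours for deep work
print(longest_free_block(9, 18, [(14, 15)]))  # just 5 hours
```

One hour of meeting costs four hours of the largest block-- which is exactly why batching calls into a single period preserves maker time.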
Graham's essay also echoes the distinction E. P. Thompson made in "Time, Work-Discipline, and Industrial Capitalism," his classic article on time consciousness in the early Industrial Revolution, between task-oriented and time-oriented work. Pre-industrial work, Thompson argued, was task-oriented: whether you worked in the fields or in town, the rhythm of your working day wasn't determined by a clock, but by Nature and the work you needed to get done. With the rise of the factory system, and the growing specialization of labor within factories, the rhythms of work were defined not by organic tasks, but by machines and the factory itself: you worked a certain number of hours a day, and then you stopped. Work was no longer task-oriented, but time-oriented.
Of course, there are types of work that have always remained task-oriented, even when we're measuring or regulating or standardizing them using time. Cooking is one. Parenting is another. Babies are as demanding as any factory-owner, but as any new parent will tell you, they run very much on their own clocks. But today, when the two are at odds, task-orientation loses out to time-orientation: managers set meeting times for subordinates, some of whom are likely to be young mothers. As Judith Shulevitz argues,
The politics of time are hugely significant for women because the temporality of motherhood is strictly at odds with the temporality of work... Motherhood follows not just a pre-industrial schedule but a biological one as well. (The two are related.) Women have to have their babies before they become infertile, and once their children are born, they have to meet their needs then, not later. As we learn more about the psychological and physiological benefits to a baby of being soundly attached to a mother or father figure, the importance of love for brain development, not just personality formation, we get an ever clearer sense of the cost to children of depriving their parents of the means to spend time with them, especially when they’re young. Under current social arrangements, however, motherhood and fatherhood clocks clash with most career clocks, so parents who spend that time often pay a high price for doing so.
One of the things I think I'm going to have to do more ruthlessly is control my time: not just "manage" it better, but think more clearly about what kinds of time I need. I've done this pretty well for space and other resources, but time is something that I've tended to think of merely as a scarce but relatively undifferentiated resource. High time, as it were, to figure out how I can better balance tasks and time, and the different kinds of discipline required to satisfy each.
I've been in Bloomington, Indiana for a conference on visualization and the history and philosophy of science. It's one of those events that brings together my old life as an historian, and my new life as a futurist: on one hand we're mainly talking about how visualizations of scientific communities and social dynamics can be used by historians and philosophers; on the other I suspect that there are cool things I could do with these maps to forecast the future of science.
the official conference picture, via flickr
There's one other think-tank person here, which saves me from being the one non-academic Ph.D. in the room, the scholarly equivalent of Stephen Colbert's one black friend.
There have been some efforts to use scientometric (or "science of science") maps in the history of science, but so far as I know, most of this work has followed fairly conventional historiographic paths: for example, mapping the Darwin or Mersenne correspondence, or asking questions about the growth of scholarly networks. We've not yet used them to do something radically new, like using geographical coding to calculate the speed of the transmission of ideas or instruments, or constructing agent-based models of scientific communities and seeing how they evolve over time. But that's why we're here-- to think about how we could create such things, and what benefit they might bring.
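To give a flavor of what such a model might look like, here's a minimal sketch with entirely invented parameters: scientists sit on a random acquaintance network, and each one has some probability of adopting an idea once an acquaintance holds it. Nothing here reflects a real historical dataset; it only shows how cheaply such an experiment can be set up:

```python
import random

def simulate(n=100, k=4, p_adopt=0.3, steps=20, seed=42):
    """Toy agent-based model of an idea spreading through a community."""
    rng = random.Random(seed)
    # Give each scientist k randomly chosen acquaintances.
    neighbors = {i: rng.sample([j for j in range(n) if j != i], k)
                 for i in range(n)}
    adopted = {0}  # the idea starts with a single scientist
    history = [len(adopted)]
    for _ in range(steps):
        new = set()
        for i in range(n):
            if i not in adopted and any(j in adopted for j in neighbors[i]):
                if rng.random() < p_adopt:
                    new.add(i)
        adopted |= new
        history.append(len(adopted))
    return history  # adopters per step: typically an S-curve

print(simulate())
```

Swap the random network for one built from correspondence records, and the toy becomes a testable historical question.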
I quite like Bloomington, or the few blocks of Bloomington that I've seen.
The place is enormous. It has roughly the same number of students as Berkeley, but physically it's much larger. It also takes collegiate Gothic (a somewhat stripped-down, modernized version) to a scale I don't think I've ever seen before. If you took Princeton or Bryn Mawr, put it on a balloon, then blew up the balloon to five times its previous size, you'd get the IU campus. Yale and the University of Chicago bear some family resemblance to Oxford or Cambridge, thanks to their small scale; IU takes Gothic where it's never gone before.
It's also pretty heavily wooded. There are a couple streams that flow through the campus, and they're surrounded by forest and crisscrossed with little footbridges.
campus tuesday night, via flickr
the same location, wednesday afternoon, via flickr
The town has a lot of restaurants, and a lot of foreign food, for a place its size. Tuesday night I had dinner at an Ethiopian restaurant, and last night it was Thai at Siam House. (Both are a serious challenge to dieting!) One local attributed this to the long presence of foreign students at IU, some of whom brought spouses or other relatives who went into the restaurant business. I have no way of knowing if this is true, but for whatever reason, there's good food here.
There's a bit of a restaurant row, small places in old houses. That's cool, as it gives the restaurants a more informal character.
restaurant row, via flickr
There are also rabbits that come out in the evening, which adds one more little (furry and bouncy) note of whimsy to the place.
insouciant bunny, via flickr
My latest article, on tinkering and the future, has been published in the latest issue of Vodafone's Receiver Magazine. The piece is an effort to draw together a couple of my research and personal interests (though the boundaries between those two categories are pretty blurry), and to see the tinkering / DIY movement as one piece in an emerging strategy for creating better futures. It's also an attempt to think more deeply about things we talked about at the conference on tinkering that Anne Balsamo organized last year (and that I continued thinking about in other venues).

Almost forty years ago, the Whole Earth Catalog published its last issue. For the American counterculture, it was like the closing of a really great café: the Catalog had brought together the voices of contributors, readers and editors, all unified by a kind of tech-savvy, hands-on, thoughtful optimism. Don't reject technology, the Catalog urged: make it your own. Don't drop out of the world: change it, using the tools we and your fellow readers have found. Some technologies were environmentally destructive or made you stupid, others were empowering and trod softly on the earth; together we could learn which were which.
Millions found the Catalog's message inspirational. In promoting an attitude toward technology that emphasized experimentation, re-use and re-invention, seeing the deeper consequences of your choices, appreciating the power of learning to do it yourself and sharing your ideas, the Whole Earth Catalog helped create the modern tinkering movement. Today, tinkering is growing in importance as a social movement, as a way of relating to technology and as a source of innovation. Tinkering is about seizing the moment: it is about ad-hoc learning, getting things done, innovation and novelty, all in a highly social, networked environment.
What is interesting is that at its best, tinkering has an almost Zen-like sense of the present: its 'now' is timeless. It is neither heedless of the past or future, nor is it in headlong pursuit of immediate gratification. Tinkering offers a way of engaging with today's needs while also keeping an eye on the future consequences of our choices. And the same technological and social trends that have made tinkering appealing seem poised to make it even more pervasive and powerful in the future. Today we tinker with things; tomorrow, we will tinker with the world.
I've got a new short article at Seedmagazine.com, on automated scientific discovery and the sociology of knowledge. Sounds fascinating, I know, but it really is a better read than I make it sound.
In a recent article in Science, Cornell professor Hod Lipson and graduate student Michael Schmidt described a new computer system that can discover scientific laws. At first glance, it looks like a fulfillment of the dreams of “computational scientific discovery,” a small field at the intersection of philosophy and artificial intelligence (AI) that seeks to reverse-engineer scientific imagination and create a computer as skilled as we are at constructing theories. But if you look closer, it turns out that the system’s success at analyzing large, complicated data sets, formulating initial theories, and discarding trivial patterns in favor of interesting ones comes not from imitating people, but from allowing a very different kind of intelligence to grow in silico — one that doesn’t compete with humans, but works with us....
Older AI projects in scientific discovery tried to model the way scientists think. This approach doesn't try to imitate an individual scientist's cognitive processes — you don't need intuition when you have processor cycles to burn — but it bears an interesting similarity to the way scientific communities work.
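For a sense of how this kind of system searches, here's a toy sketch of the basic loop: propose candidate laws, score them against data, and prefer the simple ones that fit. The measurements and candidate forms below are invented, and the real system (which, as I understand it, evolved free-form symbolic expressions and tested them against the data's derivative relationships) is far more sophisticated; this is just the flavor.

```python
# Toy "propose laws, score against data, prefer parsimony" search.
# Invented data; only illustrates the flavor of the approach.
import random

# Synthetic "measurements" of a system that actually obeys y = 9.8 * x.
xs = [0.5, 1.0, 1.5, 2.0, 2.5]
ys = [9.8 * x for x in xs]

def make_candidate():
    """Randomly propose a candidate law with one free parameter."""
    a = random.uniform(0, 20)
    if random.random() < 0.5:
        return (f"y = {a:.2f} * x", lambda x, a=a: a * x, 1)
    return (f"y = {a:.2f} * x^2", lambda x, a=a: a * x * x, 2)

def score(candidate):
    """Squared error on the data, plus a small complexity penalty."""
    _, fn, complexity = candidate
    sse = sum((fn(x) - y) ** 2 for x, y in zip(xs, ys))
    return sse + 0.1 * complexity

random.seed(0)
best = min((make_candidate() for _ in range(5000)), key=score)
print("Best law found:", best[0])   # converges on y = 9.80 * x
```

Nothing here "thinks" like a scientist; the system simply burns cycles on hypotheses no human would bother writing down, and the penalty term does the work of discarding trivial fits.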
Though I have to give credit where it's due: if it turned out well, it's because it's a great project, and several people were very generous with their time, talking me through its details, and speculating on what the project and this approach to automated scientific discovery could mean for the future of science. I should never be amazed that people are almost always willing to talk about their work and what makes it interesting, but I never fail to be. Remember that when I call you!
I'm on the Caltrain from San Francisco to Palo Alto. I spent the day here getting a visa to go to China at the end of the month. I'm planning to be there for about a week, mainly in Beijing.
The Chinese consulate is located on the edge of Japantown. Today it had police barricades and several fairly bored-looking cops, and a couple Falun Gong demonstrators. I don't know if this is normal (I suspect the demonstrators kind of come with the place), or whether the police were there because of the Tibetan anniversary; however, the only excitement was inside, and caused by a few irate people yelling about their service. I got there at about 10:45, and joined a long line of people; initially I was seriously worried about whether I'd be able to get a visa today, but soon realized that the line was moving relatively quickly.
Once inside I was struck by something else: that while my first impression was that it was pretty chaotic-- lots of people, several very long lines, a certain number of raised voices, an intercom that didn't work THAT well-- after a few minutes I could see that it was, in its way, pretty speedy. There were hundreds of people there, and all things considered, everything moved rather fast. Whatever social cues I follow that tell me that things are going well or poorly didn't quite apply here.
It put me in mind of Harry Collins' description of tacit knowledge. As he explains it, there's contingent tacit knowledge, which is stuff we don't talk about for various reasons but could. Somatic tacit knowledge, in contrast, is physical: putting on clothes is a good example (if you watch young children, they're figuring out how to interpret various kinds of resistance, and figure out that THIS means the sleeve is turned inside-out, THAT means the collar is just to the left, etc.). Finally, there's collective tacit knowledge, which you can only get by immersing yourself in a society. Riding a bike requires somatic tacit knowledge; riding a bike in traffic requires collective tacit knowledge; riding a bike in Copenhagen, Davis CA, and Mumbai requires the same somatic knowledge, but verrry different collective knowledge.
There are social signs we learn that tell us whether a place or situation is safe or unsafe, chaotic or orderly, quiet or tense: how people stand, how they speak, how frustrated or angry they act, whether there are kids or old people present, how friendly the guards or soldiers are, etc. etc. ad infinitum. The ability to read those signs is one thing that distinguishes insiders from outsiders, because they vary from culture to culture. (Not knowing them is one of the things that can get you into trouble in a strange place; and their comforting presence is one of the things that you pay for when you stay in places like business hotels.) Making sense of the consulate, I realized, required a slightly different body of collective knowledge than I apply when I'm in downtown Palo Alto. Once I got that, I was able to see that it was actually a smoother operation than I'd first realized, the presence of loudly angry aged Chinese women aside.
And indeed, I got in right under the wire: I was the last visa applicant they took before breaking for lunch. The whole process took about 45 seconds: I handed in my paperwork, picture, and passport, had a brief discussion about whether I wanted rush service (I did), and was told to come back in a few hours. Sure enough, that afternoon (after lunch in Japantown, which seems now to largely consist of Korean restaurants, and some work at Cafe Murano, a very cool little place on Steiner) they had it ready.
The visa takes up a full page in my passport, for some reason. Whenever I go to the EU, I'm lucky to get stamped; Singapore and Malaysia have nice-looking entry and exit stamps; but China and South Africa take up whole pages. There's probably some contingent social knowledge that explains why this is.
The Institute's new future of science Web site is now live. For the last couple years we've been running the project under the name X2-- an historical reference to the X Club, a group I've long found fascinating-- but we've updated the name to Signtific, and rolled out a new, much more user-friendly Web site.
No time to stop and relax, though. We've also nearly finished development of a custom version of the online mapping tool that I started using last year (here are copies of my paper spaces and end of cyberspace maps, for example), which promises to be pretty amazing. So no rest for the wicked.
I spent a really stimulating day yesterday at the Tinkering as a Mode of Knowledge conference, listening and talking to people like Dale Dougherty (founder of Make Magazine, the Maker Faire, etc.), Mitch Resnick (MIT Media Lab), Rick Prelinger (the Prelinger Library and online film collection), Anne Balsamo, and others. We're meeting for part of today, but I wanted to start reflecting on yesterday's discussion; in particular, I want to get at the question of what tinkering is. Is it a unified body of practices? Is it a distinct set of skills? Is it an historical moment? Is it just a trendy name? This is something we spent a fair amount of time discussing, either formally or informally, and the answer is: it's all of those. I also think there are a couple of other important things that define tinkering.
What is Tinkering?
You can define tinkering in part in contrast to other activities. Mitch Resnick, for example, talks about how traditional technology-related planning is top-down, linear, structured, abstract, and rules-based, while tinkering is bottom-up, iterative, experimental, concrete, and object-oriented. (Resnick is very big on creating toys that invite tinkering.)
Anne Balsamo and Perry Hoberman have looked at a wide variety of tinkering activities, ranging from circuit bending to paper prototyping to open source to blogging. They argue that these varied activities are unified by a common set of principles or practices. (The following are just highlights.)
Tinkering isn't defined by a specific set of technical skills: tinkerers tend to take a pretty instrumental view of knowledge. You pick up just enough knowledge about electronics, textiles, metals, programming, or paper-folding to figure out how to do what you want. Tinkering certainly respects skill, but skills are a means, not an end: mastery isn't the point, as it is for professionals. Competence and completion are.
Is Tinkering Shallow or Deep?
One of the things I talked with several people (Mike Kuniavsky in particular) about was how historically specific tinkering is. The deeper question is, is this just a flash in the pan, a trendy name without any substance underneath? The answer we came up with is that this is like a musical style, both the product of specific historical forces, and an expression of something deeper and more fundamental. (Think of jazz: you can talk about how it emerges in the early 20th century out of blues, ragtime, and other previous musical forms, reflects particular sociological and historical trends, and is guided by certain assumptions about beauty and what music is; but at the same time, it definitely expresses a deeper impulse to create music.)
Think of the historically contingent forces shaping tinkering first. I see several things influencing it:
No doubt there are other sources you could point to-- microentrepreneurship or the growth of "jobbies," the presence of an infrastructure that supports the sharing and tracking of unique handmade things (from eBay to ThingLink).
Does Tinkering Matter?
That's a pretty varied list. And it suggests that tinkering is more than a local, Valley, geek leisure thing.
First, tinkering is a powerful form of learning. Even if it doesn't stress mastery of skills, tinkering does emphasize learning how to use your hands, how to work with materials, and how to engage with the physical world rather than the world of software or Second Life-- though tinkering does share a sensibility that lots of kids bring to programs and virtual worlds: you just get in there, hit buttons, and see what happens.
This really matters because you can be creative with stuff in ways you can't with bits, and because the more you understand the possibilities and limitations of materials-- or more abstractly, the more you learn how to develop that knowledge-- the smarter you become. In this respect, it dovetails with "a little-noticed movement in the world of professional design and engineering" that Gregg Zachary wrote about a few weeks ago: "a renewed appreciation for manual labor, or innovating with the aid of human hands." (I write about this at greater length on End of Cyberspace.)
Second, tinkering is forward-looking. It's partly about how we'll use and interact with technologies in the future. As much as any loose movement can be described this way, tinkering is a set of anticipatory practices, aimed at developing a sensibility about the future. It's a way to develop skills that are going to matter in the Conceptual Age, in the ubiquitous computing world. As we move into a world in which we can manufacture things as cheaply as we print them, the skills that tinkerers develop-- not just their ability to play with stuff, or to use particular tools, but to share their ideas and improve on the ideas of others-- will be huge. (I talk about this some in an article in Samsung's DigitAll Magazine.)
Finally, tinkering is an expression of the nature of our engagement with technology. If you buy the argument of Andy Clark that we are natural-born cyborgs, you can see tinkering as a form of co-evolution with technology, or a kind of symbiotic activity.
William Saletan has a piece in Slate looking at the (not very strong) dispute over whether Michael Phelps won the 100-meter butterfly, in which he edged out Serbian Milorad Cavic by 1/100 second. (Cavic, incidentally, was born in California, and went to UC-Berkeley. Like lots of Olympians, he seems to be as much a product of the U.S. as members of the U.S. Olympic squad.)
The problem, Saletan argues, is that in a race this tight, the uncertainties created by the way the scoreboard records times may make it impossible to determine who really won. The scoreboard, he contends,
doesn't tell you which swimmer arrived, touched, or got his hand on the wall first. It tells you which swimmer, in the milliseconds after touching the wall, applied enough force to trigger an electronic touch pad.
[Cornel Marculescu, head of the world swimming federation, FINA] says there's "absolutely no doubt" who won, because the clock registered Phelps' arrival first, and "the touch stops the clock." Not true. A touch doesn't stop the clock. The touch pad is designed to require a certain degree of force, because otherwise, slight pressure from the water would trigger it. "You can't just put your fingertips on the pad, you really have to push it," the race timekeeper explains. A FINA vice president says the crucial moment is "the instant of depression, of activation of the touch pad, not contact with the pad."...
Technically, the question of who touched first doesn't matter. FINA and the Olympics honchos agreed beforehand to use the touch pads; the touch pads require pressure; all swimmers and their coaches should know this.... I'm not saying the touch-pad system is fishy. It beats the heck out of the old stopwatch method, not to mention the mysteries of judging gymnastics. It's the fairest, most precise system around. And that's the point: Even the most precise system leaves a gray area. In this case, it's the area between touching and pressing. Did Phelps beat Cavic to the wall? We'll never know.
This is the kind of thing that sociologists of science are familiar with. Experiments, they argue, aren't simply direct engagements with Nature, but with things that are proxies for natural phenomena. A neutrino experiment, to paraphrase Trevor Pinch's book Confronting Nature, doesn't generate a bowlful of neutrinos; it generates a set of signals that are translated into graphs that conform (or don't) to theories about how neutrinos ought to behave.
We saw in the 2000 election that even something apparently as straightforward as counting votes was pretty complex, and that we normally weren't aware of the complexity not because it didn't exist, but because it didn't seem to matter. And Saletan points to another example of how an instrument-- in this case a touch pad-- that's intended to measure something in a straightforward way and eliminate ambiguity can, under certain circumstances, be revealed to be another proxy.
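Out of curiosity, here's a back-of-the-envelope simulation of that gray area between touching and pressing. Every number is invented for illustration-- the threshold, the ramp rates, the race times:

```python
# A minimal sketch of the proxy problem: the pad records not when a
# hand arrives, but when applied force crosses a threshold.
# All numbers below are invented for illustration.

THRESHOLD_N = 30.0  # hypothetical force needed to trigger the pad

# (swimmer, time hand reaches wall in seconds, force ramp rate in N/ms)
swimmers = [("Cavic", 50.580, 0.8), ("Phelps", 50.585, 8.0)]

for name, touch_s, ramp_n_per_ms in swimmers:
    # Time to build enough force to trip the pad, converted ms -> s.
    trigger_s = touch_s + (THRESHOLD_N / ramp_n_per_ms) / 1000
    print(f"{name}: touched at {touch_s:.3f}s, triggered at {trigger_s:.3f}s")

# In this toy scenario Cavic's hand arrives first, but his gentler
# touch crosses the threshold later -- so the clock, a proxy for
# "arrival," records Phelps as the winner.
```

The instrument isn't broken; it's just measuring force-over-threshold, which is almost always, but not quite, the same thing as arrival.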
Wonkette reports that a Dartmouth professor
is suing her class for discrimination, as she revealed in a series of regrettable and bizarre emails that promptly ended up all over Dartmouth blogs. Priya Venkatesan (Dartmouth '90, MS in Genetics, PhD in literature) emailed members of her Winter '08 Writing 5 class Saturday night to announce her intention to seek damages from them for their being mean to her.
Looking at that academic pedigree, I immediately started to worry. Sure enough, she was teaching STS. Her book, Molecular Biology in Narrative Form "is a groundbreaking, interdisciplinary study that shows a connection between molecular biology and French narrative theory."
With many new insights on the link between science (in the form of DNA, a set of codes) and literature (in the form of language, another set of codes), this book looks at modern experimental science within the framework of semiotics. Priya Venkatesan reveals the extraordinary parallel between the work of scientists and the work of narratologists who develop narrative paradigms and analyze literary texts. Molecular Biology in Narrative Form will be a useful resource for scientists and literary theorists interested in the epistemological workings of science, as well as, anyone that desires to explore the linkages between scientific theory and literary analysis.
Two things come to mind. First, didn't Lily Kay and Tim Lenoir do exactly this about 15 years ago? Or does the project just bear a strong resemblance to George Landow's Hypertext, with its argument for unexpected parallels between computer science and literary theory?
And... suing her students? Huh?
[To the tune of Times Online, "The Bugle - Episode 16 - Afghanistan in a zen state of chaos," from the album "The Bugle - Audio Newspaper For A Visual World".]
I've been in Malaysia and Singapore this week, conducting workshops on the future of science and innovation. It's been a very interesting week, talking to scientists in Penang and Kuala Lumpur about the future of science, and what role they see Malaysia playing in that future. The people I've been talking to are pretty convinced that Malaysia, which has a respectable but not world-class scientific community, can evolve into a global player in science in the next couple decades. They don't want to emulate American and European institutions: you won't see multi-billion dollar particle accelerators here any time soon. But they're pretty aware that cloud computing, cheap genomics, and other inexpensive research tools will lower the economic barriers to developing world-class competence in some important fields.
So I was especially struck by Gregg Zachary's latest column in the New York Times, which asks, "might cheap science from low-wage countries help keep American innovators humming?" At least a few policy analysts and scholars studying global trends in science think that the United States can profit from the growth of scientific excellence in the developing world.
Americans have long profited from low-cost manufactured goods, especially from Asia. The cost of those material “inputs” is now rising. But because of growing numbers of scientists in China, India and other lower-wage countries, “the cost of producing a new scientific discovery is dropping around the world,” says Christopher T. Hill, a professor of public policy and technology at George Mason University.
American innovators — with their world-class strengths in product design, marketing and finance — may have a historic opportunity to convert the scientific know-how from abroad into market gains and profits. Mr. Hill views the transition to “the postscientific society” as an unrecognized bonus for American creators of new products and services.
Mr. Hill’s insight, which he first described in a National Academy of Sciences journal article last fall, runs counter to the notion that the United States fails to educate enough of its own scientists and that “shortages” of them hamper American competitiveness.
The opposite may actually be true. By tapping relatively low-cost scientists around the world, American innovators may actually strengthen their market positions....
Precisely because the gap between basic science and commercial innovations is large, Mr. Hill’s postscientific society makes sense to innovators on the front lines. One implication for the future is that the United States “won’t have to import so many scientists,” says Stephen D. Nelson, associate director of policy programs at the American Association for the Advancement of Science.
The association, which for decades has generally favored policies to expand the ranks of American scientists, is devoting a portion of its annual policy seminar next month to talk about the “postscience” situation.
Industry, meanwhile, is adapting to a world where scientific goods can come from anywhere — and fewer scientists work on abstract problems unrelated to the market. “It is no accident that many corporate labs have fallen apart,” Sean M. Maloney, executive vice president of Intel, says. “They were science farms looking for problems.”
What is this post-scientific society that Hill writes about? As he explains it,
A post-scientific society will have several key characteristics, the most important of which is that innovation leading to wealth generation and productivity growth will be based principally not on world leadership in fundamental research in the natural sciences and engineering, but on world-leading mastery of the creative powers of, and the basic sciences of, individual human beings, their societies, and their cultures.
Just as the post-industrial society continues to require the products of agriculture and manufacturing for its effective functioning, so too will the post-scientific society continue to require the results of advanced scientific and engineering research. Nevertheless, the leading edge of innovation in the post-scientific society, whether for business, industrial, consumer, or public purposes, will move from the workshop, the laboratory, and the office to the studio, the think tank, the atelier, and cyberspace.
There are growing indications that new innovation-based wealth in the United States is arising from something other than organized research in science and engineering. Companies based on radical innovations, exemplified by network firms such as Google, YouTube, eBay, and Yahoo, create billions in new wealth with only modest contributions from industrial research as it has traditionally been understood. Huge and successful firms like Wal-Mart, FedEx, Dell, Amazon.com, and Cisco have grown to be among the largest in the world, not as much by mastering the intricacies of physics, chemistry, or molecular biology as by structuring human work and organizational practices in radical new ways. The new ideas and concepts that support these post-scientific society companies are every bit as subtle and important as the fundamental natural science and engineering research findings that supported the growth of firms such as General Motors, DuPont, and General Electric in the past half century. But innovation in these two generations of firms is fundamentally different.
The piece is well worth reading, as it has a number of provocative implications for science policy, innovation policy, and education. Essentially, Hill is arguing that a decline in America's monopoly on science-- even if that does happen-- is not to be lamented any more than the shrinking of the agricultural workforce: it doesn't reflect a weakness, but a more fundamental shift to a different kind of economy, in which the sources of value aren't facts, but what you do with them.
Sure, the Enigma was cracked in World War II, but it's still a pretty cool device. Did you know you can make a (very simple) paper version?
[via Bruce Schneier]
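And if paper isn't your medium, a drastically simplified single-rotor version fits in a few lines of Python. (The real machine had three rotors, a reflector, and a plugboard; this toy borrows only the rotor wiring and the step-per-keypress idea, so treat it as flavor, not cryptography.)

```python
# A toy single-rotor "Enigma-style" cipher -- a drastic simplification
# of the real machine, for flavor only.
import string

ALPHABET = string.ascii_uppercase
ROTOR = "EKMFLGDQVZNTOWYHXUSPAIBRCJ"  # wiring of the historical rotor I

def encipher(text, start=0):
    """Substitute each letter through the rotor, stepping it per keypress."""
    out, pos = [], start
    for ch in text.upper():
        if ch not in ALPHABET:
            continue
        shifted = (ALPHABET.index(ch) + pos) % 26  # apply rotor offset
        out.append(ROTOR[shifted])
        pos = (pos + 1) % 26                       # rotor steps forward
    return "".join(out)

print(encipher("ENIGMA"))
```

Even this stripped-down version shows the key trick: because the rotor steps with every keypress, the same plaintext letter enciphers differently each time it appears.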
[To the tune of Perpetual Groove, "Get Down Tonight," from the album "Live at The Music Farm, 31 December 2006".]
Got a pleasant surprise today: a package of reprints for my latest article, a piece on "The Industrialization of Vision in Victorian Astronomy" in Bildwelten des Wissens. It's always nice to get these. I'll have to send them off to various academic friends, for whom the ritual of receiving reprints holds some cultural meaning.
The article is one I wrote a while ago, but never quite got around to publishing; so when the chance came last year to contribute to this issue, I figured, why not make good use of it? I'm not doing much work on Victorian science now, but still it's a subject that never ceases to be interesting.
And in an ironic twist, last night I was up late answering queries from an editor who's working on a piece of mine on mobility and the end of cyberspace. My old and new intellectual lives overlapping.... Though actually I think that's not quite correct: you don't really have old and new intellectual lives, unless you completely change fields and go from, say, string theory to eschatology; you just mobilize your interests and intellectual skills around different subjects.
[To the tune of Ben Folds & William Shatner, "In Love," from the album "Fear of Pop, Vol. 1".]
John Boudreau, unable to get anyone credible to comment on the Deep Meaning of the Microsoft! bid for Yahoo!, quotes me! in today's Mercury News!:
After just 14 years in which it helped launch the Internet age, Yahoo has hit "middle age" and faces the fate of many other iconic Silicon Valley companies - takeover bait.
Microsoft's $44.6 billion unsolicited bid for Yahoo is yet another indication of a common valley axiom: Innovate or face unwanted suitors.
"Yahoo may join the long list of distinguished companies going back to Fairchild Semiconductor known in their time for doing great stuff that couldn't keep up with the times," said Alex Soojung-Kim Pang, a research director for the non-profit Institute for the Future.
Actually, I love you John-- you do great work, and help me look more impressive to my in-laws.
[To the tune of Ella Fitzgerald, "It's Only A Paper Moon," from the album "Ella Fitzgerald Sings the Cole Porter Songbook".]
This forthcoming issue of Convergence looks really interesting.
Special Issue on ‘Digital Cultures of California'
Guest editor: Julian Bleecker (Near Future Laboratory and University of Southern California)
This call invites submissions for a special issue related to digital cultures of California. Internationally, California is a phenomenon in terms of its relationship to creating, consuming and analyzing the era of digital technologies. From the legendary garage entrepreneurs, to the multi-billion dollar culture of venture capital, to stock back-dating scandals, to the epic exodus of California’s IT support staff during the Burning Man festival, this territory plays an important role in the political, cultural and economic underpinnings of digitally and network-mediated lives on a global scale.
Half of my brain is trying to figure out if there's some piece of my end of cyberspace project that I can carve out and submit, and the other half is more sensibly telling me to get the Hell back to work on the book ms. This damn essay on paper spaces-- on how some interactions with paper are more architectural and spatial than merely personal (obviously I need to work on the language a little)-- is the last distraction I should allow myself.
[To the tune of Billy Idol, "Flesh for Fantasy," from the album "Rebel Yell".]
Has anyone written about the history of those boards that stock exchanges use to show information to traders-- the ones that in the movies always have guys running around them, frantically updating prices?
[To the tune of Ratatat, "Wildcat," from the album "Classics".]
Thomas Vander Wal has a great post about "the elements in the social software stack." Besides offering some great advice and a nice, clear way to think about social software, it contains a bit that jumped out at me:
It was through reading Jyri Engeström's blog post about "Why some social network services work and others don't — Or: the case for object-centered sociality" that I came to have familiarity with Karin Knorr Cetina's object-centered sociality. It was through the repeated mentioning of this Knorr Cetina concept by Rashmi Sinha in her presentations and from personal conversations with Rashmi that the idea's deep value sunk in (it is a concept central to Rashmi and Jon Boutelle's product SlideShare).
Interaction designers have long been reading anthropology-- Chris Espinosa once told me that when they were designing the first Mac interface, he and the other designers had copies of George Lakoff's work on metaphor in the office-- and I've been aware for a while of more academic interface design types being familiar with STS and history of technology. But it's good to see that people who are actually doing serious products-- Jaiku, SlideShare, etc.-- are using it, too.
[To the tune of Cocteau Twins, "Pur," from the album "Four-Calendar Cafe".]
For those of you old enough to have played video games in the late 1970s or 1980s-- the halcyon days of Defender, Xevious, and Tron, not to mention a Pac Man franchise that rivaled CSI-- the terrific retro arcade photset on Flickr is not to be missed.
Perhaps I'm just over-generalizing from my own over-excited teenage reactions to these kinds of spaces, but I think these arcades, with their spaceship or Buck Rogers interiors, darkness lit only by the neon and the light of the games, played an underappreciated role in creating a psychological association between computers and space-- or alternate spaces.
The arcade I haunted as a teenager was called Station Break. It sat on the edge of the Virginia Commonwealth University campus, near student eateries, bookstores, and the city's only independent movie theatre. For a teenager, it was a neighborhood that spoke of leisure, freedom, and escape. The arcade itself was like another world.
The appeal of these spaces hasn't disappeared entirely, though most arcades are gone. The memory of the old arcade model was compelling enough to inspire MAME developers to create a virtual arcade, and there's a pretty clear lineage from Station Break to Chuck E Cheese to the Pizza Planet in Toy Story. For those who really want the old experience, a Springfield, MO arcade, 1984, is a nostalgic re-creation of arcades from the era, right down to the 50+ classic games.
My review of Stuart Clark's The Sun Kings: The Unexpected Tragedy of Richard Carrington and the Tale of How Modern Astronomy Began is in the latest issue of American Scientist.
It was a good book, but to be perfectly honest, it was one of those reviews that the editor took apart, rearranged, and greatly improved. So equal credit on this one should go to Flora Lewis.
Thanks to Bill C. for letting me know it was out!
When events move very fast and possible worlds swing around them, something happens to the quality of thinking. Some men repeat formulae; some men become reporters. To time observation with thought so as to mate a decent level of abstraction with crucial happenings is a difficult problem. Its solution lies in the using of intellectual residues of social-history, not jettisoning them except in precise confrontation with events... (C. Wright Mills, on Franz Neumann's Behemoth: the Structure and Practice of National Socialism 1933-1944)
[via Daily Kos]
From Eamon Duffy's essay in the New York Review of Books, March 29:
Early Christianity was more than a new religion: it brought with it a revolutionary shift in the information technology of the ancient world. That shift was to have implications for the cultural history of the world over the next two millennia at least as momentous as the invention of the Internet seems likely to have for the future. Like Judaism before it and Islam after it, Christianity is often described as "a religion of the Book." The phrase asserts both an abstraction—the centrality of authoritative sacred texts and their interpretation within the three Abrahamic religions—and also a simple concrete fact—the importance of a material object, the book, in the history and practice of all three traditions....
Our modern book form, the codex, in fact evolved from the ancient equivalent of the stenographer's pad, bundles of wooden tablets linked with string hinges and coated with wax, on which information could be jotted with a stylus (often in shorthand). When the information was no longer needed, the wax could be heated and smoothed, and the tablets reused. The first papyrus and (especially) parchment books of pages were recyclable in just the same way, folded and stitched bundles written on with soluble ink that could be washed off to leave the pages blank again. To inscribe the words of Holy Scripture on such jotting pads would demean its sacred character and authority....
Why should the new religion have adopted this down-market and unfashionable book technology? The codex, it is true, has obvious practical advantages. Being written on both sides of the page, it is more economical than the roll, it can be readily indexed, it can be leafed through quickly to find a particular place, and it is more robustly portable. But these practical advantages, which certainly contributed to its eventual adoption as the normative form of the book, do not adequately explain the early Christians' exclusive preference for the form, even for their copies of the Jewish scriptures, which must of course have been transcribed from rolls. Historians have speculated that difference from Judaism may have been the point—that the codex was adopted to distance the emergent Church from its origins within the religion of Israel, or perhaps in an attempt to signal that its foundational texts were indeed a sort of sacred stenography, the living transcript of apostolic experience, taken from the mouths of the first witnesses.
My article on the industrialization of visualization in 19th century astronomy is off to the journal, that is, not me. I hope the editors think well of it. It's a shorter piece, but that's what the journal publishes.
I still need to root around to see if I have any pictures I can run with the piece.
The article is actually one I started ten years ago, but set aside to write a couple of other things; in the intervening years, there have been a number of articles on photography, drawing, and observing practices in Victorian science, so I was able to fill in a couple of gaps, and send it off. It's a relatively short piece. We'll see.
Now back to other things, most particularly the end of cyberspace.
[To the tune of The Beatles, "Sgt. Pepper's Lonely Hearts Club Band," from the album "Sgt. Pepper's Lonely Hearts Club Band".]
We're on UAL 931, heading back to San Francisco. Dinner has been served, curtains are drawn, overhead lights turned off-- though it's only early evening London time and we're flying in sunlight all the way to California-- so it's time to put on the noise-cancelling headphones and take stock.
The big thing on my personal plate is a piece for a German history of science volume on visualization in science. I'm adapting a chapter from my eclipses book, which means cutting it down by about 70%, and maybe working in some new stuff (if there's any room at all) drawing comparisons between 19th century challenges in fieldwork and representation (in particular the issues around reproducing delicate images for publication) and current issues in simulation or computer visualization. There probably won't be room for the latter.
I'm also supposed to audition for a little column in an Asian culture magazine. I'll throw together something based on a couple posts to my kids' blog, but I think I've got too much on my plate to do a regular gig.
At work, there's more going on than I really want to think about at the moment, but I can get it all together.
A Flickr photoset of pictures from this weekend. No captions or geolocations yet; I'm mainly focused on getting them up.
Edmund Halley plaque at Westminster Abbey, via flickr
It's a lovely morning, though my wife's luggage is still missing. Now off to breakfast. Doubtless I'll post pictures of that, too.
I write about people, technology, and the worlds they make.
I'm a senior consultant at Strategic Business Insights, a Menlo Park, CA consulting and research firm. I also have two academic appointments: I'm a visitor at the Peace Innovation Lab at Stanford University, and an Associate Fellow at Oxford University's Saïd Business School. (I also have profiles on LinkedIn, Google Scholar and Academia.edu.)
I began thinking seriously about contemplative computing in the winter of 2011 while a Visiting Researcher in the Socio-Digital Systems Group at Microsoft Research, Cambridge. I wanted to figure out how to design information technologies and user experiences that promote concentration and deep focused thinking, rather than distract you, fracture your attention, and make you feel dumb. You can read about it on my Contemplative Computing Blog.
My book on contemplative computing, The Distraction Addiction, will be published by Little, Brown and Company in 2013. (It will also appear in Dutch and Russian.)
The Distraction Addiction
My latest book, and the first book from the contemplative computing project. The Distraction Addiction is published by Little, Brown and Co.. It's been widely reviewed and garnered lots of good press. You can find your own copy at your local bookstore, or order it through Barnes & Noble, Amazon (check B&N first, as it's usually cheaper there), or IndieBound.
The Spanish edition
The Dutch edition
Empire and the Sun
My first book, Empire and the Sun: Victorian Solar Eclipse Expeditions, was published by Stanford University Press in 2002 (order via Amazon).