"We’re now on the cusp of an even more dramatic change, as we enter the age of the global positioning system, which is well on its way to being a standard feature in every car and on every cellphone. At the same time, neuroscientists are starting to uncover a two-way street: our brains determine how we navigate, but our navigational efforts also shape our brains. The experts are picking up some worrying signs about the changes that will occur as we grow accustomed to the brain-free navigation of the gps era."
Hutchinson says that with the digital navigational tool well on its way to becoming standard in every car and on every cellphone, “experts are picking up some worrying signs” about brain atrophy “once we lose the habit of forming cognitive maps.” Research shows that people whose heads are in abstract spatial realms get flummoxed finding their way around in the real world.
Yet as the dust settles on the last of these projects, what begins to emerge is a more complex image of America’s cultural values at the birth of a new century. The formal dazzle masks a deeper struggle by cities and architects to create accessible public space in an age of shrinking government revenue and privatization. At their most ambitious, they are an effort to rethink the two great urban planning movements that gave shape to the civic and cultural identity of the American city.
She's normally aloof to the point of being antisocial, but for some reason my cat decided to spend the morning on my lap.
I've had better writing surfaces, frankly.
Just found an online reprint of Ellen Ullman's wonderful 2003 essay "Memory and Megabytes," originally published in American Scholar. It's one of my favorite short pieces ever, and started me thinking about the differences between human and machine memory.
Though her recent New York Times op-ed on adoption and knowing your family history is great, too:
I am not against... the trend... toward openness, a growing “right” to know. I simply want to give not-knowing its due.
I like mysteries. I like the sense of uniqueness that comes from having unknown origins (however false that sense may be).
The whole idea of contrarianism is that you’re “attacking the conventional wisdom”, you’re “telling people that their most cherished beliefs are wrong”, you’re “turning the world upside down”. In other words, you’re setting out to annoy people. Now opinions may differ on whether this is a laudable thing to do – I think it’s fantastic – but if annoying people is what you’re trying to do, then you can hardly complain when annoying people is what you actually do....
The other point of contrarianism is that, if it’s well done, you assemble a whole load of points which are individually uncontroversial (or at least, solidly substantiated) and put them together to support a conclusion which is surprising and counterintuitive. In other words, the aim of the thing is the overall impression you give. Because of this, if you’re writing a contrarian piece properly, you ought to be well aware of what point it looks like you’re making, because the entire point is to make a defensible argument which strongly resembles a controversial one.
So having done this intentionally, you don’t get to complain that people have “misinterpreted” your piece by taking you to be saying exactly what you carefully constructed the argument to look like you were saying.
About a year ago I wrote about Web 2.0 as a time machine for my generation, and my suspicion that "mine may be the last generation that has the experience of losing touch with friends." This concerned me because
when it comes to shaping identity, the ability to forget can be as important as the ability to remember. It's easy to implore people not to forget who they are; but sometimes, in order to become someone better, you need to forget a little bit.
Forgetting insults and painful events, we all recognize, is a pretty healthy thing for individuals: a well-adjusted person just doesn't feel the same shock over a breakup after ten years (if they can even remember the name of Whoever They Were), nor do they regard a fight from their childhood with anything but clinical detachment. Collectively, societies can also be said to make decisions about what they choose to remember, and how to act toward the past. Sometimes this happens informally, but for practical reasons: think of national decisions to avoid deep reflection on wars or civil strife, in the interest of promoting national unity and moving forward.
The idea that digital and human memory work differently, and that we fail to recognize the difference between the two at our peril, is something I've been writing about for a while. So I was very interested to see a review by Henry Farrell in Times Higher Education of Viktor Mayer-Schoenberger's new book Delete: The Virtue of Forgetting in the Digital Age. It sounds like a book I need to read... or at least footnote!
At its heart, his case against digital memory is humanist. He worries that it will not only change the way we organise society, but it will damage our identities. Identity and memory interact in complicated ways. Our ability to forget may be as important to our social relationships as our ability to remember. To forgive may be to forget; when we forgive someone for serious transgressions we in effect forget how angry we once were at them.
Delete argues that digital memory has the capacity both to trap us in the past and to damage our trust in our own memories. When I read an old email describing how angry I once was at someone, I am likely to find myself becoming angry again, even if I have since forgiven the person. I may trust digital records over my own memory, even when these records are partial or positively misleading. Forgetting, in contrast, not only serves as a valuable social lubricant, but also as a bulwark of good judgment, allowing us to give appropriate weight to past events that are important, and to discard things that are not. Digital memory - which traps us in the past - may weaken our ability to judge by distorting what we remember.
[H]aving paraded their daring contrarianism, the freakonomists are trying to wiggle out of the consequences when it turns out that they were wrong.
They just need to take a page from the evil futurists' guide.
Newt Gingrich-- or his tech people, or someone-- is holding a Twitter reenactment-- a Twitternactment, natch!-- of the Battle of Trenton on Sunday.
Despite the fact that a few teachers have used Twitter to "reenact" historic events, it still makes my head hurt, mainly because "Twitternachtment" sounds like a bunch of Web 2.0 fanatics going on an anti-semitic rampage.
I think I'll stick with making dinner for everyone.
[To the tune of Bruce Springsteen & The E Street Band, "Spirit In The Night," from the album Live 1975-85 (I give it 2 stars).]
It was 2003, and Warren, an earnest-sounding and ever enthusiastic Harvard law professor who specializes in bankruptcy, was on the set of Dr. Phil. She had written a book with her daughter called The Two-Income Trap: Why Middle-Class Mothers & Fathers Are Going Broke, and she'd expected to sit next to the host and explain its key points. Instead, Dr. Phil was interviewing a stressed-out couple with serious medical and financial troubles. After they mentioned they had obtained a second mortgage to pay off their credit card debt, the lights went up on Warren, and Dr. Phil asked her if this had been a smart step. No, she declared, because now they could lose their home if they defaulted.
As soon as her turn was over, Warren found herself thinking, "You've been doing this work for 20 years now, and it is unlikely that any of it has had as direct an impact as these 45 seconds." She had reached millions, some of whom might actually pay attention to her advice. "So here you are, Miss Fancy-Pants Professor at Harvard. What do you plan to do now? Is it all about writing more academic articles, or is it about making a difference for the families you study? I made a decision right then: It was for the families, not the self-aggrandizement of scholarship."
Since then she's proved herself to be surprisingly mediagenic, in a very understated, just-drove-the-minivan-to-the-office kind of way. At the same time, she's not given to oversimplification or jargon: she's really good at explaining the stakes in TARP (she chairs the congressional panel that tries to oversee it), where the money's going, and why we don't know where the money's going. (Check out her appearance-- in two parts-- on The Daily Show.)
Yet despite, or more accurately because of, her willingness to choose "families" over "the self-aggrandizement of scholarship,"
Harvard economists... dismiss Warren as insufficiently theoretical. "They think she shouldn't be talking about bankruptcy except as someone in the economics department would—that is, with formulas and theorems, not about how it affects real people."
I suppose this drives me around the bend for two reasons.
First, my work has sometimes been accused of being insufficiently theoretical (usually in readers' reports), as if theory were the sine qua non of importance. As Taibbi would put it, first of all, few kinds of scholarly work are harder, or less likely to stand the test of time, than theory; and second, what the fuck? When did we all turn into mini-Derridas? Isn't theory a tool? I mean, we all use word processors, but I don't see many of my colleagues rushing to create their own versions of Microsoft Word.
Second, probably the single greatest personal intellectual epiphany I've had since leaving academia is that the real world actually has interesting problems: not just problems that you ought to deal with because life as we know it could get pretty screwed up if we don't, but problems that are actually intellectually engaging, make use of the cognitive muscles you developed in academia, force you to develop new abilities, and expose you to interesting questions you would never have discovered otherwise. The assumption that academia is where people grapple with interesting questions, and the business world is where stupid things happen, is just wrong.
I kind of like Matt Taibbi's argument that she should be drafted to run for president, and if
someone like Elizabeth Warren doesn’t want that responsibility, well, she shouldn’t have gone into office and gone on TV making all that sense and shit. She’s pushed for transparency in the Fed, is openly furious about the misuse of bailout money, and seems to take personally the chicanery that credit card companies and banks use to game the suckers out there. I simply cannot see her suddenly flipping and holding $2000-a-plate fundraisers with Lloyd Blankfein and Jamie Dimon.
Is there a difference in the way the brain takes in or absorbs information when it is presented electronically versus on paper? Does the reading experience change, from retention to comprehension, depending on the medium?
For those of us who carry iPhones, this shift to a persistent Internet has already happened, and it's really profound. The Internet is no longer a destination, someplace you "go to." You don't "get on the Internet." You're always on it. It's just there, like the air you breathe.
[To the tune of Future Sound of London, "Room 208," from the album Lifeforms (I give it 2 stars).]
Just over four years ago, Apple came out with the Mighty Mouse, its now-standard multi-button mouse with a scroll ball. I talked about the mouse as a canonical example of a device that is "easy to use," but its evolution shows that the definition of "easy to use" changes a lot over time. The release of the Magic Mouse shows the same pattern: it's a device that Apple is presenting as easy to use (and at the same time revolutionary), but its ease of use depends not on the permanent revolutionary genius of Steve & Co., but on the changing repertoire of user skills.
As I explained in a 2005 post, when it first appeared in the early 1980s,
the mouse was a total novelty, and anything with more than one button required users to think and make decisions. In contrast, in an age of Game Boy, Playstation, Treo, Blackberry, and the cell phone (not to mention multibutton mice on Wintel machines), kids can look at a device with four buttons and a scroll ball and think, "Hey, that's easy to use." To me, the best indicator of just how far the goal-posts that define ease of use have moved is the now-pervasive use of thumb buttons on mice. Doug Engelbart wanted to put more than three buttons on his mouse, but couldn't figure out how; apparently they didn't think of putting them on the side of the mouse, under the thumb.
What this tells us is that while the concept of "ease of use" is wonderful, and to be encouraged at all times, just what constitutes ease of use will change over time. It's not some unchanging Platonic ideal; it varies and evolves over time, and is defined by a community's exposure to earlier technologies, levels of mechanical or physical skill, and a bunch of other factors.
The Magic Mouse is another example of practices that now define "easy to use" but were once obscure. The top surface of the mouse is one big multitouch area, like an iPhone or iPod Touch: there's no physical button; instead, the surface treats certain zones as the equivalent of a left button, a right button, and so on. You can also do two-fingered swiping.
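Under the hood, that kind of zone-based behavior is conceptually simple. Here's a minimal sketch in Python of how a buttonless touch surface might map touches to "virtual buttons"; the coordinates, zone boundary, and function names are mine for illustration, not Apple's actual implementation:

```python
# Toy sketch of zone-based "virtual buttons" on a buttonless multitouch
# surface: classify a touch by where it lands and how many fingers are down.
# Surface size and zone boundaries are invented for illustration.

SURFACE_WIDTH = 60.0  # hypothetical width in mm

def classify_touch(x, y, fingers=1):
    """Return the gesture a touch at (x, y) should count as."""
    if fingers == 2:
        return "two-finger swipe/scroll"
    return "left click" if x < SURFACE_WIDTH / 2 else "right click"

print(classify_touch(12.0, 30.0))      # left half  -> left click
print(classify_touch(48.0, 30.0))      # right half -> right click
print(classify_touch(30.0, 30.0, 2))   # two fingers -> swipe
```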
Before the appearance of the iPhone and other devices with multitouch interfaces, the physical vocabulary of swiping, of having zones on a device that change their function depending on what program is running or what mode you're in, would have been alien and confusing. Now, no longer. It's another chapter in the secret history of physical skill that's part of the history of computing (and which we still don't pay enough attention to).
HealthDay News reports on a study of the impact of Internet use on the brains of elders:
Surfing the Internet just might be a way to preserve your mental skills as you age.
Researchers found that older adults who started browsing the Web experienced improved brain function after only a few days.
"You can teach an old brain new technology tricks," said Dr. Gary Small, a psychiatry professor at the Semel Institute for Neuroscience and Human Behavior at the University of California, Los Angeles, and the author of iBrain. With people who had little Internet experience, "we found that after just a week of practice, there was a much greater extent of activity particularly in the areas of the brain that make decisions, the thinking brain -- which makes sense because, when you're searching online, you're making a lot of decisions," he said. "It's interactive."...
"We found a number of years ago that people who engaged in cognitive activities had better functioning and perspective than those who did not," said Dr. Richard Lipton, a professor of neurology and epidemiology at Albert Einstein College of Medicine in New York City and director of the Einstein Aging Study. "Our study is often referenced as the crossword-puzzle study -- that doing puzzles, writing for pleasure, playing chess and engaging in a broader array of cognitive activities seem to protect against age-related decline in cognitive function and also dementia."...
For the research, 24 neurologically normal adults, aged 55 to 78, were asked to surf the Internet while hooked up to an MRI machine. Before the study began, half the participants had used the Internet daily, and the other half had little experience with it.
After an initial MRI scan, the participants were instructed to do Internet searches for an hour on each of seven days in the next two weeks. They then returned to the clinic for more brain scans.
"At baseline, those with prior Internet experience showed a much greater extent of brain activation," Small said.
Doubtless some readers will recognize this as an updated version of the Proust and the Squid argument, which relies in part on fMRI studies indicating that the brains of literate people have specialized sections for quickly recognizing letters. What's interesting here is that you get a similar kind of stimulation with the elderly.
Yale University astrophysics professor Kevin Schawinski studies how galaxies form. But his most valuable tool isn't a telescope or arcane theory. It's Galaxy Zoo, a project that has enlisted the help of more than 150,000 "citizen scientists" to classify a million galaxies.
Why use people rather than computers for such an undertaking? At least for now, humans with a little training are more accurate than expensive software. And when you have a million galaxies to classify, you want all the help you can get.
Not so long ago, "citizen scientist" would have seemed to be a contradiction in terms. Science is traditionally something done by people in lab coats who hold PhDs. As with classical music or acting, amateurs might be able to appreciate science, but they could not contribute to it. Today, however, enabled by technology and empowered by social change, science-interested laypeople are transforming the way science gets done.
The Guardian has a good piece on the 1979 movie Alien, and its introduction of the first female horror/SF heroine, Sigourney Weaver's Ellen Ripley. I find the whole series fascinating, and in fact I just put a copy of the quartet-- or quadrilogy, as the box set's creators unfortunately call it-- on my computer, so I'd be able to fall asleep to it in foreign hotels.
Watching the scene now, at a 30-year lag, you find yourself drawn as much to the reactions of the other actors as to the creature itself. Scott famously shot the film in one take with four cameras, and purposely kept the actors in the dark as to what, exactly, they were about to witness. It is safe to assume that none of them were as startled as Veronica Cartwright (playing the Nostromo's navigator), who is shown recoiling in genuine horror from a spray of blood. "What you saw on camera was the real response," recalls co-star Tom Skerritt. "She had no idea what the hell happened. All of a sudden this thing just came up."
Cartwright's shock would be mirrored in cinemas around the world. "Everybody remembers the moment when the creature comes out, because it was such a staggering event; totally beyond prediction," says Thomson. "I remember seeing the film at the time with my wife and she was so horrified that she stood up and walked right out of the theatre. Afterwards she admitted that it was a very well-made film and all of that. But she could not take it; could not live with that possibility. It was as though she thought: if that can happen, anything can."
Then there's this unforgettable bit of prose:
Giger's alien features the requisite razor-blade teeth and unreadable, implacable air. Sometimes it is limpid and wet, fashioned on the set out of oysters and clams brought in from a local fishmongers. Sometimes it is hard and blunt. Not to put too fine a point on it, the alien in Alien comes in two guises: vaginal and phallic.
"Alien is a rape movie with male victims," explains David McIntee, author of the Alien study Beautiful Monsters. "And it also shows the consequences of that rape: the pregnancy and birth. It is a film that plays, very deliberately, with male fears of female reproduction."
In response to press reports saying that the health care reform train is leaving the station with President Obama at the wheel (or whatever you use to run a train), Michael Steele just told Fox to look out because he is [the] "cow on the tracks." In other words, in addition to his other shortcomings, Steele is apparently unschooled on the history of train/cow confrontations, though I'm not sure it's a metaphor Democrats will necessarily want to dispute. Later, in a new strike in his ongoing war with his dignity, Steele pleaded for a "Rodney King moment" on health care.
The Fox anchor had noted that Democrats are saying the health care reform train has "already left the station" and "Republicans better jump on board."
"Well, I'm the cow on the tracks. You're gonna have to stop that train to get this cow off the track to move forward," Steele said.
It's like the man can't not be entertaining. That's worth something in today's world.
Though "I'm the cow on the tracks" would make an awesome meme.
The last several days I've been laid up with the flu. I started getting a sore throat on Thursday, when I was in Berkeley, and by Friday morning I could barely do anything. I spent the next two days sleeping. It's a very strange thing for me, given how hyperactive I usually am, to not want to do anything at all, and to succeed at it.
Fortunately, I got back some of my energy on Sunday, which was good because I was going to chew up the bed if I spent any more time in it (why does lying in bed hurt if you do it too long?), and now am not quite back to normal, but can definitely see normal from here.
The one good thing I discovered is that Lost In Translation turns out to be the perfect movie to watch when you're sick. That sense of physical and cultural dislocation, of being thrown out of time (and Tokyo is a perfect backdrop for that feeling, given how bright it stays at night), nicely parallels the feeling of being sick-- of drifting outside of normal time, of being simultaneously trapped in and alienated from your body.
Back to work now.
Sometimes I find behavioral economics kind of amusing-- an emotion I don't normally associate with academic research. This piece (via IdeaFestival) in Scientific American is a great example: it reports on a study that asks "Why does the act of falling in love – or at least thinking about love – lead to such a spur of creative productivity?"
One possibility is that when we’re in love we actually think differently. This romantic hypothesis was recently tested by the psychologists Jens Förster, Kai Epstude, and Amina Özelsel at the University of Amsterdam. The researchers found that love really does alter our thoughts, and that this profound emotion affects us in a way that is different than simply thinking about sex.
The clever experiments demonstrated that love makes us think differently in that it triggers global processing, which in turn promotes creative thinking and interferes with analytic thinking. Thinking about sex, however, has the opposite effect: it triggers local processing, which in turn promotes analytic thinking and interferes with creativity.
Why does love make us think more globally? The researchers suggest that romantic love induces a long-term perspective, whereas sexual desire induces a short-term perspective. This is because love typically entails wishes and goals of prolonged attachment with a person, whereas sexual desire is typically focused on engaging in sexual activities in the "here and now". Consistent with this idea, when the researchers asked people to imagine a romantic date or a casual sex encounter, they found that those who imagined dates imagined them as occurring farther into the future than those who imagined casual sex.
According to construal level theory (CLT), thinking about events that are farther into the future or past - or any kind of psychological distancing (such as considering things or people that are physically farther away, or considering remote, unlikely alternatives to reality) - triggers a more global processing style. In other words, psychological distancing makes us see the forest rather than the individual trees.... [B]ecause love activates a long-term perspective that elicits global processing, it should also promote creativity and impede analytic thinking. [Ed: It certainly succeeds in the latter, in my experience.] In contrast, inasmuch as sex activates a short-term perspective that elicits local processing, it should also promote analytic thinking and impede creative thinking. [Ed: Promote analytical thinking? Does "How the hell do these snaps work?" count as analytical thinking?]
The "think about love but not sex" part is a little tricky, though.
And it's still around.
Perimeter ensures the ability to strike back, but it's no hair-trigger device. It was designed to lie semi-dormant until switched on by a high official in a crisis. Then it would begin monitoring a network of seismic, radiation, and air pressure sensors for signs of nuclear explosions. Before launching any retaliatory strike, the system had to check off four if/then propositions: If it was turned on, then it would try to determine that a nuclear weapon had hit Soviet soil. If it seemed that one had, the system would check to see if any communication links to the war room of the Soviet General Staff remained. If they did, and if some amount of time—likely ranging from 15 minutes to an hour—passed without further indications of attack, the machine would assume officials were still living who could order the counterattack and shut down. But if the line to the General Staff went dead, then Perimeter would infer that apocalypse had arrived. It would immediately transfer launch authority to whoever was manning the system at that moment deep inside a protected bunker—bypassing layers and layers of normal command authority. At that point, the ability to destroy the world would fall to whoever was on duty: maybe a high minister sent in during the crisis, maybe a 25-year-old junior officer fresh out of military academy. And if that person decided to press the button ... If/then. If/then. If/then. If/then.
Once initiated, the counterattack would be controlled by so-called command missiles. Hidden in hardened silos designed to withstand the massive blast and electromagnetic pulses of a nuclear explosion, these missiles would launch first and then radio down coded orders to whatever Soviet weapons had survived the first strike. At that point, the machines will have taken over the war. Soaring over the smoldering, radioactive ruins of the motherland, and with all ground communications destroyed, the command missiles would lead the destruction of the US.
Sounds just a wee tiny bit like Skynet.
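The if/then chain in that description reads almost like pseudocode, so here's a minimal sketch of the decision logic as Wired describes it; the function and condition names, and the one-hour confirmation window, are my own illustrative assumptions, not anything from a real specification:

```python
import time

# Illustrative sketch of Perimeter's described decision chain. All names and
# the confirmation window are assumptions based on the description above.

CONFIRMATION_WINDOW = 60 * 60  # "likely ranging from 15 minutes to an hour"

def perimeter_decision(system_armed, nuclear_detonation_detected,
                       general_staff_link_alive, first_detection_time):
    """Walk the four if/then propositions described in the article."""
    if not system_armed:                       # 1. switched on by a high official?
        return "stand down"
    if not nuclear_detonation_detected():      # 2. has a weapon hit Soviet soil?
        return "stand down"
    if general_staff_link_alive():             # 3. command links survive: wait it out
        if time.time() - first_detection_time > CONFIRMATION_WINDOW:
            return "shut down"                 #    leaders presumed alive
        return "keep monitoring"
    # 4. links dead: hand launch authority to whoever is in the bunker
    return "transfer launch authority to duty officer"

# Demo with fabricated inputs.
print(perimeter_decision(
    system_armed=True,
    nuclear_detonation_detected=lambda: True,
    general_staff_link_alive=lambda: False,
    first_detection_time=time.time(),
))  # -> "transfer launch authority to duty officer"
```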
But there's a mystery: as Dr. Strangelove put it, a doomsday device is useless if you don't tell anyone about it. So why was it kept a secret?
Perimeter was never meant as a traditional doomsday machine. The Soviets had taken game theory one step further than Kubrick, Szilard, and everyone else: They built a system to deter themselves.
By guaranteeing that Moscow could hit back, Perimeter was actually designed to keep an overeager Soviet military or civilian leader from launching prematurely during a crisis. The point, Zheleznyakov says, was "to cool down all these hotheads and extremists. No matter what was going to happen, there still would be revenge. Those who attack us will be punished."
[via Overcoming Bias]
california academy of sciences, via flickr
Yes, it has a kind of Teletubbies thing going on, but it's still cool.
california academy of sciences, via flickr
So I was intrigued by this article on green roofs, in part because, while it's generally positive about their growing popularity, it ends on a worried note: that they will end up reduced to an architectural gimmick.
At some point a few of these are going to get built, and people are going to realise that green roofs are being used as the new equivalent of mirrored glass. Architects used to show renderings of towers reflecting birds and clouds to somehow make them disappear and be more palatable to residents and planning officials. In the end, people got a big box covered in mirrors. We should be careful that we don’t get sold a whole lot of big boxes with green boxtops, always shown from above. When you are at street level it will be a very different picture.
This is the sort of thing that shouldn't happen:
Fights broke out as law students queued for up to 11 hours last night to secure the dissertation supervisor of their choice at Brunel University.
More than 100 students queued outside Brunel Law School overnight in the hope of working with their preferred academic, after the school introduced a first-come, first-served supervisor-allocation system.
I love the University's utterly tone-deaf response.
A spokesman for Brunel said the university was “very concerned” that law students had queued overnight and was “disappointed to see the lengths to which some feel they have had to go”.
“In preparing for their dissertation, students are informed that neither their choice of topic nor their first choice of supervisor can be guaranteed. It seems that they have done all they can to try to achieve their first topics and supervisors.”
Ummm.... why should that be disappointing, or any kind of surprise?
A while ago I created a Prezi for an end of cyberspace talk. Prezi has a cool functionality that lets you create a path or trail through a presentation (very Vannevar Bush).
I've realized that because of the trail feature, this presentation isn't just a single talk, or it doesn't need to be. Rather, I can use it as a kind of online studio for displaying everything I talk about in this project, and just create different paths through the Prezi for different talks.
I think I'm going to make this a persistent post, so it always stays on the front page. If you want to go directly to the Prezi, here it is.
In William Gibson's Pattern Recognition, a burglar looks up a porn site during a break-in. Apparently an Italian burglar remembered this, but forgot that it was how the main character knew her apartment had been broken into....
A burglar was caught by Italian police after logging on to Facebook while carrying out a break-in.
The 26 year old, who has not been named, was traced by detectives after the owner of the house reported the crime.
Officers noticed the computer was still on and when the 52 year old owner touched the keyboard, the social network site's homepage flashed up.
The man, from Albano Laziale near Rome, told police he was not a member, and they quickly realised the last person to use the computer had been the burglar.
He had written several messages on his wall - but not revealed he was carrying out a crime - and police were quickly able to trace him and recover cash and jewellery that had been taken.
Major Ivo Di Blasio, of the carabinieri paramilitary police, said: "He was tempted to log on during the break-in and it led to his arrest - it was a silly mistake to make and we were onto him very quickly."
When I've spoken about the end of cyberspace, and the displacement of the idea of cyberspace as a Platonic plane of information, separate from and superior to the real world, someone almost always asks, "But what about Second Life?" (or World of Warcraft, or Everquest, depending on what year we're talking about). The idea is that these kinds of games and game-worlds represent a continuation of the vision of cyberspace as an alternate world.
My response has been twofold. First, despite claims about the utility (or potential utility) of Second Life to business, or the number of hours devoted players spend in World of Warcraft, so far as I can tell nobody argues that these constitute alternatives to physical reality that will lead to the death of the office or the transformation of travel. They have their appeal, but their appearance is not a sign that the tectonic plates of reality are starting to rumble. Second, it looks more likely that, with the coming of ubiquitous computing, some of the kinds of interactions and feedback that make games compelling are going to migrate into the real world, with serious social and economic implications.
This evening I ran across a piece by Brett McCallon on the growing pervasiveness of games in everyday life that echoes this last point:
"Lexulous", and the game's incredible popularity on Facebook, does say something about the way that gaming is infiltrating the experience of seemingly non-gaming-related activities. As gaming becomes more mainstream, and as designers learn to use gaming mechanics to enhance our work, education and relaxation, we can envision a time in which nearly every experience offers the possibility, if not the requirement, for play....
Exercise is only one of the non-gaming areas into which gaming has intruded in recent years. Games that teach foreign languages, cooking and other skills are also becoming increasingly popular.... Even such mundane activities as household chores can be made less onerous through the addition of gaming mechanics. A free, web-based game called "Chore Wars" lets players apply traditional role-play game rules to their laundry, dishwashing and vacuuming duties. For each completed task, players are granted "experience", "gold", etc, which helps their characters advance through imagined quests. It's a fairly basic system, but as a means of motivating lazy spouses and housemates to pull their weight, it could be quite helpful.
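Mechanically, that kind of system is just a points ledger bolted onto a task list. A toy sketch, with chore names, point values, and level thresholds invented for illustration (not Chore Wars' actual rules):

```python
# Toy sketch of a Chore Wars-style mechanic: award "experience" and "gold"
# for completed household tasks. All values here are invented placeholders.

CHORES = {"dishes": 10, "laundry": 15, "vacuuming": 20}  # XP per task

class Player:
    def __init__(self, name):
        self.name, self.xp, self.gold = name, 0, 0

    def complete(self, chore):
        """Log a finished chore and report the reward and current level."""
        xp = CHORES.get(chore, 5)       # unknown chores earn a token amount
        self.xp += xp
        self.gold += xp // 2
        level = 1 + self.xp // 100      # arbitrary level threshold
        return f"{self.name}: +{xp} XP for {chore} (level {level}, {self.gold} gold)"

p = Player("lazy housemate")
print(p.complete("dishes"))
print(p.complete("vacuuming"))
```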
I think McCallon's argument is inaccurate but in a revealing way. It's inaccurate in the sense that while we are going to see the growth of feedback and incentive systems around everyday activities, they're not really going to be games. They may borrow some bits and pieces from games-- familiar visual tropes, rewards, and the like-- but they won't turn housework into a game, any more than my offering my son a quarter to clean his room turns my family into a labor market.
But what's revealing about the piece is that it suggests how likely we are to embrace the language of games when thinking about, and interacting with, these technologies. I saw something of this when I was interviewing people about the impact of the Prius MPG estimator on driver behavior. As I wrote in 2008,
Interestingly, many drivers describe efforts to boost their fuel efficiency as a kind of game. One driver, a former Silicon Valley tech executive and car aficionado, recalls that "When I got my Prius, it absolutely felt like I was piloting a large, rolling video game, seeing how to optimize the mileage." Another, a Valley educator, reports that driving her Prius has "become a game for me. I always try to improve the mpg over the last trip." When I gave my end of cyberspace talk at IDEO last week, I brought up the Prius MPG estimator, and one person immediately said, "It's like a game!" Game designer Amy Jo Kim recalled, "When I first got my Prius 4 years ago, I was completely transfixed by the real-time MPG display. Multi-scale feedback! I could see my mileage per tank, in 5-minute increments, and moment-to-moment. I experimented with my driving style, trying to beat my "high score" each day." A 2006 Cnet article described the Prius as "a mobilized video game... surely the most expensive, biggest gaming machine built... so far."
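That "multi-scale feedback" is easy to picture in code: one stream of distance-and-fuel samples, averaged over windows of different sizes. A rough sketch, with made-up sample data and window lengths (not how the Prius actually computes its display):

```python
# Rough sketch of multi-scale MPG feedback: one stream of (miles, gallons)
# samples, averaged over an instantaneous, a five-minute, and a per-tank
# window. Sample data and window sizes are invented for illustration.

def mpg(samples):
    miles = sum(m for m, g in samples)
    gallons = sum(g for m, g in samples)
    return miles / gallons if gallons else float("inf")

# one sample per minute: (miles driven, gallons burned)
trip = [(0.8, 0.020), (0.9, 0.015), (0.7, 0.020), (1.0, 0.018),
        (0.6, 0.025), (0.9, 0.016), (1.1, 0.014), (0.5, 0.022)]

print("moment-to-moment:", round(mpg(trip[-1:]), 1))   # latest sample only
print("last 5 minutes:  ", round(mpg(trip[-5:]), 1))   # rolling window
print("whole tank:      ", round(mpg(trip), 1))        # everything so far
```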
This may sound like a distinction without a difference, but think about how many times we borrow bits and pieces of phrasing from one realm and apply it to another, and how those borrowings have but a limited influence. We talk about business as war, or coworkers as teams, but we understand that these metaphors don't mean we should bomb a competitor's offices. Doubtless we'll be able to learn some things from game designers about how to improve the interfaces for, say, home energy monitoring systems, but it's not clear that creating an entire game-- complete with characters, more elaborate rules, goals, etc.-- would be necessary or even desirable to achieve substantial energy savings.
Pay no attention to this.
The email edition of the British Psychological Society's Research Digest has reached the milestone of its 150th issue.... To mark the occasion, the Digest editor has invited some of the world's leading psychologists to look inwards and share, in 150 words, one nagging thing they still don't understand about themselves. Their responses are by turns honest, witty and thought-provoking.
I may have to try this before too long: running Mac OS X on a netbook. Doesn't look that hard to do.
From the New York Times, in an article about reactions to the Roman Polanski arrest:
The president of the German Film Academy... spoke about the need for "solidarity among prominent people" and bemoaned how Mr. Polanski had been arrested on his way to a film festival, as if film festivals were embassies or churches.
What a brilliant line-- solidarity among prominent people. Damned Regular People applying their standards to us!
Having followed the story back and forth, read about the documentary, about how one of the prosecutors has recanted his story about trying to influence the judge, etc., I come down on the side of thinking that Polanski ought to be returned to Los Angeles. The argument that he's suffered enough strikes me as dumb, for several reasons. First, he knew the risks when he fled, and he managed to give himself another 30 years. (And are you more deserving of release if you've been on the lam for 5 years or 50? Have you suffered more if you've spent decades looking over your shoulder, or if your caper only worked for a short period?)
Second, lots of people lose parents or spouses under horrible circumstances yet don't go on to commit crimes against 13-year-old girls.
Finally, Polish artists and intellectuals have been moving to France for at least the last 150 years: it's where you go when you make the big time. It's hard to say that Polanski "suffered long enough" if he's imitating Chopin.
Tom Standage has the best line about a video game... ever:
“Grand Theft Auto: Chinatown Wars”, the latest instalment of the most notorious of video games, is available on the DS. It’s like putting vodka in a baby’s bottle....
The action, set in the murky underworld of Liberty City, a stylised version of New York, is rendered in a cartoonish graphic style which cleverly remains true to previous versions of the game without overtaxing the processing power of the DS. And it’s great fun. Better still, you can play it on the train—there’s an extra illicit thrill in doing all this on the 8.39 to Charing Cross.
The player takes the role of Huang Lee, the son of a murdered Triad boss, who heads to Liberty City to deliver a family heirloom to his uncle and finds himself drawn into the conflict between rival gangs. The script, as usual with “GTA”, is cynical, witty and well written. The game world manages to be large and immersive even when squeezed into the tiny DS. As unlikely as it sounds, given the sorts of game usually found on the DS, this is the genuine, controversial, ultra-violent article, with no corners cut.
I've been on a Raymond Chandler kick recently, plowing through my two-volume Library of America collection of his work. Chandler was the quintessential hard-drinking writer-- it took him a long time, but he basically drank himself to death-- and of course alcohol figures pretty prominently in his work: Philip Marlowe drinks a lot, people meet at cocktail parties and bars, characters have shady pasts because of alcoholism.
So I was interested to see an article in More Intelligent Life about authors who sober up. The sad conclusion is that with a few exceptions-- Cheever, Carver, Stephen King-- going sober is a professional error: "It may seem a little impertinent to gauge the literary merits of sobriety—you cannot write books of any discernible quality if you are dead—but clearly, sobering up is one of the more devastating acts of literary criticism an author can face."
Seen on a friend's page:
Just got a note from a JDater. And I'm not kidding: The photo shows him wearing a tool belt, stooped by some appliance, perhaps on the verge of installing something I want or should want installed. Bet he could screen in that porch of mine. But he doesn't want children... I shouldn't have to pick between kids and a screened-in porch.
[To the tune of Moby, "Extreme Ways," from the album 18 (I give it 4 stars).]
Slate's Sharon Lerner writes about "The Real Reason American Women Are So Unhappy" (go read the whole piece):
While women in rich countries around the world may be becoming generally sadder... American women are still probably the gloomiest. Only 3 percent of people in Japan experience major depression in their lifetime, for instance, compared with about 17 percent of Americans, according to the most recent cross-national comparison of depression rates.... [According to the] online World Database of Happiness... the family-friendly (or at least family-friendlier) nations of Sweden (in eighth place), Denmark (second), Finland (seventh), and Holland (13th) [rank] as happier than we are. For what it’s worth, the United States, birthplace of both “happy hour” and “the Happy Meal,” ranked only 31st in overall happiness....
So why are American women so particularly blue? For women, two of the most potentially life- (and mood-) altering factors are family size and work hours. American women have notable distinctions on both fronts. First, we have more babies than women in most any other developed country. While an American woman still typically has around 2.1 children over her lifetime, in other rich countries, family size has dropped significantly as women have gained access to jobs and education. More than 90 nations throughout Europe and Asia now have fertility rates well below ours. Second, even while we’ve continued to raise sizable families, American women have achieved the very highest rate of full-time employment in the world, with 75 percent of employed women working full-time....
[A] bizarre, punishing disregard for the impact of work stress on mothers of very young children permeates our culture. How else can one explain the U.S. Army’s policy of sending female soldiers back to work full-time just six weeks after giving birth and back into war zones just two-and-a-half months after that? Welfare policy reflects a similar disconnect from the reality of motherhood, with some welfare recipients now guaranteed no leave at all from their work assignments after having babies, which can mean being separated from newborns just days after giving birth. Together, these factors may help explain why, at least in the United States, parenthood now tends to be a downer, with both male and female parents more depressed than their childless peers.
In many ways, the pressures mount as women age and continue to feel the unalleviated pulls of working and parenting. Even though they may start out in the same schools and land in the same jobs, as their careers typically don’t offer the flexibility necessary to care for children, women often have to watch the income gap between themselves and their male counterparts grow—a gap that, given the lack of re-entry points onto career tracks, seems to widen even after children are grown. So, while many women, particularly those who can’t afford to “opt-out,” wind up overwhelmed and exhausted by the combination of full-time careers and motherhood, others wind up nudged out of their professions. Some leave the workforce altogether, but many just wind up in lower-paying, lower-status work that accommodates their schedules. Often neither option is what they wanted, which helps explain the gradual dwindling of women’s happiness.
[To the tune of Zero, "These Blues," from the album 1993-02-06 - Great American Music Hall (I give it 4 stars).]
I write about people, technology, and the worlds they make.
I'm a senior consultant at Strategic Business Insights, a Menlo Park, CA, consulting and research firm. I also have two academic appointments: I'm a visitor at the Peace Innovation Lab at Stanford University, and an Associate Fellow at Oxford University's Saïd Business School. (I also have profiles on LinkedIn, Google Scholar, and Academia.edu.)
I began thinking seriously about contemplative computing in the winter of 2011 while a Visiting Researcher in the Socio-Digital Systems Group at Microsoft Research, Cambridge. I wanted to figure out how to design information technologies and user experiences that promote concentration and deep focused thinking, rather than distract you, fracture your attention, and make you feel dumb. You can read about it on my Contemplative Computing Blog.
My book on contemplative computing, The Distraction Addiction, will be published by Little, Brown and Company in 2013. (It will also appear in Dutch and Russian.)
The Distraction Addiction, my latest book and the first from the contemplative computing project, is published by Little, Brown and Co. It's been widely reviewed and has garnered lots of good press. You can find a copy at your local bookstore, or order it through Amazon, Barnes & Noble or IndieBound.
My first book, Empire and the Sun: Victorian Solar Eclipse Expeditions, was published by Stanford University Press in 2002 (order via Amazon).