Sound Unbound is now available! I recently served as Assistant Editor to Paul D. Miller a.k.a. DJ Spooky on his essay collection, Sound Unbound: Sampling Digital Music and Culture. Contributors include Erik Davis, Manuel De Landa, Cory Doctorow, Chuck D, Brian Eno, Dick Hebdige, Vijay Iyer, Jaron Lanier, Jonathan Lethem, Moby, Steve Reich, Simon Reynolds, Scanner aka Robin Rimbaud, Bruce Sterling, Lucy Walker, and Saul Williams, among many others — and now it’s out. Continue reading “Sound Unbound is out!”
Camera Obscura: Cloverfield and the Myth of Transparency
The scariest moment of Bowling for Columbine (2002) was watching the security camera footage of the shootings. Something about seeing that black and white representation of Eric Harris’ and Dylan Klebold’s grainy forms stalking the cafeteria that morning was just plain eerie. Continue reading “Camera Obscura: Cloverfield and the Myth of Transparency”
Building a Mystery: Taxonomies for Creativity
In a 2005 Daniel Robert Epstein interview, Pi director Darren Aronofsky likened writing to making a tapestry: “I’ll take different threads from different ideas and weave a carpet of cool ideas together.” In the same interview, he described the way those ideas hang together in his films, saying, “every story has its own film grammar so you have to sort of figure out what the story is about and then figure out what each scene is about and then that tells you where to put the camera.” Continue reading “Building a Mystery: Taxonomies for Creativity”
An Inconvenient Youth, Part Two
Remember when music was good — when bands stood for something and the music they created was from the heart? Remember when music was real?
I remember a college professor trying to tell me that Nine Inch Nails’ Pretty Hate Machine was “fake, plastic music” while Jimi Hendrix’s Are You Experienced? was “real.” I recently heard the same argument about the fakeness of My Chemical Romance, with NIN as the “real” example.
Since writing the last entry, I attended a skateboarding session where there were several skaters much older than I am. One of these skaters couldn’t seem to get his head in the present. All he talked about was “how things used to be” — the tricks, the ramps, the attitude, the music — everything. Needless to say, this grew tiresome very quickly, and I was glad when the younger crew finally showed up to session.
Some cultural artifacts get “grandfathered” in before our critical filters develop — shows that you remember loving that would probably annoy you now. Others, however, are chosen by your newly discerning pre-teen mind. Be it Bad Brains, The Wipers, The Sex Pistols, Dead Kennedys, Fugazi, Nirvana, Nine Inch Nails, or My Chemical Romance, everyone has that “punk rock moment” of realizing that the shit on the radio or the shit that your dad likes is wack. This does not make the stuff that you used to like better than the stuff your daughter likes. This does not make Nine Inch Nails “better” than My Chemical Romance (there are plenty of other reasons for that).
As Doug Stanhope would put it, Nine Inch Nails is good to you because being young is good. Everything was better then not because it was 1991 (or 1968) but because you were young then. The same can be said for the Jimi Hendrix example and my college professor above. Sorry, everyone, “Three’s Company” was not necessarily better than “The King of Queens.”
Part of this is cognitive. Our brains’ ability to create and store new memories simply slows down, nearly to a stop, making our most cherished memories those of a bygone era: those of our youth. And when we remember those times, we reify them, making them stronger (Freud called the process “Nachträglichkeit,” meaning “retroactivity”).
So, the aging skateboarder lamenting the olden days when skateboarding was more about gnar than fashion (Ed. note: it’s always been about both) might be suffering from cognitive deceleration, but most likely he’s just being a nostalgic boor. Far be it from me to quote Bob Dylan, but he once said, “Nostalgia is death.”
My college professor (who’d probably be proud of me for quoting Dylan, even if I’m using it against him) was just being nostalgic as well. Nostalgia is not inherently bad, but when it comes from a sad place (as in our lamenting skateboarder above), then it indicates a dissatisfaction with the present. This, I believe, is when it becomes death.
We should all always be working toward making these the good ol’ days. The day I’m looking back, lamenting the now, is the day I want to cease.
Sources:
Johnson, S. Mind Wide Open. New York: Scribner, 2004.
Watson, J. D. Avoid Boring People. New York: Alfred A. Knopf, 2007.
Watson, J. D. “On Enduring Memories.” Seed Magazine, April/May 2006, p. 45.
Thanks to Reggie for sending me the Ruben Bolling comic.
I Check The Mail Only When Certain It Has Arrived
The mailbox at my parents’ house in Alabama is at the end of a winding asphalt strip, the only interruption in acres of sporadic deciduous trees, save their house, of course. One day almost exactly twenty years ago, some of the best mail I ever received happened upon that mailbox. Continue reading “I Check The Mail Only When Certain It Has Arrived”
The Just Noticeable Difference
Marié Digby was lauded as the internet’s next big find, a phenomenon that had grown organically through digital word-of-mouth, but the media’s multi-roomed echo chamber told on itself. Maybe it was too much, too quickly, but just after a couple of Digby’s homey, simple YouTube videos started spreading online, she was featured on radio stations, MTV, and iTunes, and it was announced that she’d been signed to Disney’s Hollywood Records. The official press release headline read, “Breakthrough YouTube Phenomenon Marié Digby Signs With Hollywood Records.” What didn’t come out until later was that her name had appeared on that dotted line a year and a half earlier. Her online “discovery” was orchestrated from around long conference-room tables.
Digby wasn’t unlimited bandwidth’s first phony phenom. YouTube’s avidly watched Lonelygirl15, a high school anygirl with a webcam, turned out to be nineteen-year-old aspiring actress Jessica Rose. She’d answered a Craigslist ad for an independent film, landed the part, and — after the “directors” did a bit of explaining — became Lonelygirl15. There was no product attached to the project, but all involved made names for themselves and are now well-represented in Hollywood.
These two stories are postmodern-day examples of what it takes to break through our media-mad all-at-once-ness and get noticed, to float some semblance of signal in a sea of noise. To experience the new is really just to notice a difference. In psychophysics it’s called the just noticeable difference (the “jnd”). Creating that difference is becoming more and more difficult as the tide of noise rises higher and higher.
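As a rough aside (a sketch of the textbook formulation, not anything specific to these examples), the classic rule of thumb here is Weber’s law: the smallest change we can detect in a stimulus scales with the intensity of the background it sits against.

\[ \frac{\Delta I}{I} \approx k \qquad \text{so} \qquad \Delta I \approx k\,I \]

Here I is the background intensity, ΔI is the just noticeable difference, and k is a constant measured separately for each sense. In other words, the louder the surrounding noise, the bigger a signal has to be before anyone notices it at all.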
Where Hollywood Records and the Lonelygirl15 crew manipulated an emerging media channel, Miralus Healthcare took the opposite tack with their HeadOn headache remedy. They took one look at new media and ran in the opposite direction. The original HeadOn television spot, which some ironically claim induces headaches, looks like a print ad and sounds like a broken record. But it worked. The commercial stood in such stark contrast to everything else on TV that the product is known worldwide.
It is sometimes claimed that technology makes it so that anyone can perform a certain task, as if Photoshop made everyone an artist or Pro Tools made everyone a record producer. We make our tools and our tools make us (as Marshall McLuhan once said), but our tools do not make us great.
The idea that the internet, Pro Tools, and whatever else the advent and proliferation of the computer hath wrought enable anyone to be an artist is both true and false. True, everyone has the tools to do so, but very few people have the talent. The latter is and always will be the case.
New technologies are normalizing events. Think of it like a crosstown street race in which the traffic signals are the normalizing events: one might be in the lead for a good bit of the ride, but as soon as everyone is stopped at a light, the race effectively starts over. By way of convoluted analogy, one might be “ahead” in the home production process until a new version of Sound Forge hits the scene.
Sure, there are people making money producing music who are not that good, but that doesn’t mean that anyone can compete with Dr. Dre just because he or she sets up a MySpace page and posts some loops from Acid. I’ve heard this argument so often lately: that anyone can cut-and-paste a record together and become a producer. If that were true, then why does Dr. Dre even have a career? Simple: because he’s good at what he does. Let everyone try it!
Yes, building a name is a huge part of this, and one person’s bloated name can overshadow someone else’s immense talents, but the proliferation of tools and channels doesn’t change the fact that it takes talent, skills, work, and chance — as an artist and as a marketer — to get noticed. Computers, the internet, weblogs, and everything else haven’t made everyone a great writer and killed authors’ careers.
DJ Scratch nailed it when he said, “The reason we respect something as an art is because it’s hard as fuck to do.” Good production, good writing, and good marketing are still hard to do — and it’s getting harder and harder to get them noticed. New tools and new channels don’t change the talent and effort it takes to capture the attention and the imagination of the masses, but a new twist here or there can make the just noticeable difference, and that can be all the difference in the world.
Jonah Lehrer: The Fourth Culture
In 1959, C. P. Snow lamented a chasm between what he called the Two Cultures: artsy types on one side and stuffy science folks on the other. Well, Jonah Lehrer has been trying to bring them back together. His book Proust Was a Neuroscientist (Houghton Mifflin, 2007) makes large strides toward their reconciliation by showing how the insights of several artists, musicians, writers, and one chef were a step ahead of the science of their time. In spite of Sir Karl Popper’s insistence that “real” science be falsifiable (though even he respected the authority of the artist), art often tells us more about ourselves than science does.
Noam Chomsky once said, “It is possible — overwhelmingly probable, one might guess — that we will always learn more about human life and personality from novels than from scientific psychology.” Examples of the overlap between art and science are not difficult to unearth. World-renowned physicist Richard Feynman was known to draw, Philip K. Dick’s A Scanner Darkly (for one example from Dick’s vast canon) explores possible effects of a corpus callosotomy, and Lehrer himself reveals many more in his article “The Future of Science is Art” from Seed Magazine, where he is Editor at Large.
Up against Lehrer, with his post at Seed, his oft-updated blog (The Frontal Cortex), and his well-written, intriguing book, the rift between the Two Cultures doesn’t stand a chance.
Roy Christopher: How did the people in Proust Was a Neuroscientist come together? Was James Joyce too easy an example? How about Philip K. Dick?
Jonah Lehrer: I’m always a little embarrassed to admit just how idiosyncratic my selection process was for the eight artists in the book. Once I had this idea about artists anticipating the discoveries of modern neuroscience — and I got that idea when I started reading Proust in a lab — I began to see connections everywhere. I’d mutter about the visual cortex while looking at a Cezanne painting, or think about the somatosensory areas while reading Whitman on the “body electric.” Needless to say, my labmates mocked me mercilessly. But, in general, my selection process could be boiled down to this: I began with my favorite artists and tried to see what they had to say about the mind. The first thing that surprised me was just how much they had to say. Virginia Woolf, for instance, is always going on and on about her brain. “Nerves” has to be one of her favorite words.
Joyce makes a few appearances in the book, but so much ink has already been spilt on Joyce and “consciousness” that I wanted to find something a little more surprising. And Philip K. Dick will definitely appear in the sequel, when I get around to writing it.
RC: In light of all of the parallels between the Two Cultures that you’ve documented, do you think that C. P. Snow’s insight was a fallacy?
JL: Of course, there are real differences between our Two Cultures. Artists speak with metaphors, brushstrokes and plot, while scientists rely on acronyms, experiments and control variables. Sometimes, the languages of art and science can be so different that it’s hard to imagine a consilience ever taking place. But I think that cheap and easy binary distinction is also a little misleading. For starters, artists often rely on experimentation while making art — they’ll try out different approaches and see what “works” — while scientists often depend on their imagination.
Finally, I’d add that you don’t have to go very far back in time before this cultural distinction disappears. George Eliot, for instance, famously described her novels as “a set of experiments in life.” Virginia Woolf, before she wrote Mrs. Dalloway, said that in her new novel the “psychology should be done very realistically.” Or look at Coleridge. When the poet was asked why he attended so many lectures on chemistry, he gave a great answer: “To improve my stock of metaphors.” In other words, the poet didn’t believe that art and science needed to be separated.
RC: Snow’s Third Culture has given way to John Brockman’s Third Culture. Do you think the latter will inspire a proper version of the former?
JL: They’re fundamentally different enterprises. I believe that a third culture should ultimately be about re-creating a dialogue between our two cultures, which is what C. P. Snow was referring to. John Brockman, on the other hand, believes that the job of a third culture is to translate science for the masses. (As he puts it, “Science is the only news”.) That’s certainly a worthy endeavor — educating the public about science is really, really important — but it’s not a Third Culture.
RC: Is there a cultural divide between East and West? I ask because it seems to me that Eastern cultures — specifically Japan — are more open to what we would consider noise. Your chapter on Stravinsky got me thinking about this.
JL: That’s an interesting idea. I’m not aware of any research on that subject, but it’s certainly a testable hypothesis. I’d only add that I think neuroscience is really beginning to discover the importance of culture. We’re slowly beginning to learn all of the different ways the inputs of the arts — from “American Idol” to Wagner — can literally shape the brain. In other words, ideas are powerful things.
RC: What are you working on next?
JL: I’m currently hard at work on a book that should be published next year. (I just knocked on wood, in case you couldn’t tell.) The book is still coming together, but it won’t involve Proust, unfortunately.
The Interface and the Algorithm: Four Recent Books
The much-discussed, much-explored interface between humans and machines is seemingly our final frontier. Comparing the interface to the Victorian novel and the 1950s television show (both of which shaped society’s understanding at the time), Steven Johnson wrote, “There are few creative acts in modern life more significant than this one, and few with such broad social consequences.” The graphical user interface has come to represent all of the many processes going on inside the computer — and the way we interact with each other through them.
The machine is not the environment for the person; the person is the environment for the machine. — Aviv Bergman
With Beyond the Desktop Metaphor: Designing Integrated Digital Work Environments (MIT Press), editors Victor Kaptelinin and Mary Czerwinski have compiled essays probing the limits of the current, widespread user interface and imagining a post-desktop one. Studies have found that our virtual desktop doesn’t afford adequate support for the growing areas of computer-supported collaborative work (CSCW), the ever-expanding diversity of technologies, or the multiple roles and tasks we find ourselves filling. Beyond the Desktop Metaphor is a compendium that does just that — reaches beyond the desktop.
Looking back to look ahead, Thomas Erickson and David W. McDonald compiled HCI Remixed: Reflections on Works That Have Influenced the HCI Community (MIT Press). Erickson and McDonald asked fifty-one designers to reflect on one work — something at least ten years old — that influenced their approach to human-computer interface design. The result is fifty-one brief essays covering everything from books like Everett Rogers’ Diffusion of Innovations (The Free Press) and Ted Nelson’s Computer Lib/Dream Machines, to early innovations like Douglas Engelbart’s mouse and Ivan Sutherland’s Sketchpad, to influential people like Edward Tufte and Jane Jacobs. In a field where the research and results are cutting-edge and exciting, but where the literature is often bogged down in minutiae and, well, boring, HCI Remixed exhibits a novel approach and is actually fun to read.
It is all just an algorithm with enough unknowns to make a game of it. — McKenzie Wark
Nowhere has HCI been more “remixed” than in computer gaming. A simmering subculture for decades, the gaming industry has supposedly overtaken Hollywood in size, money, and attention. Making sense of this rapid growth and its influence on our culture has spawned confusion, reckless theorizing, and a whole new field of study. Fortunately for us, people like Alexander Galloway and McKenzie Wark have taken up the task of keeping things in perspective. Galloway’s Gaming: Essays on Algorithmic Culture (University of Minnesota Press) draws from over fifty video games, from PONG and Space Invaders to Half-Life and Halo (as well as his keen critical eye and l33t gamer skills), to deliver a holistic and seasoned approach to gaming studies.
Wark’s Gamer Theory (Harvard University Press), which was originally published in-progress online as “G4M3R 7H30RY,” is written in the aphoristic style of Guy Debord’s The Society of the Spectacle (not unlike Wark’s previous book, A Hacker Manifesto). While its being published online has gotten more attention than the book itself, this should not be the case. Like Wark’s previous work, this is an important text for anyone interested in progressive thought on media and technology — and our relationships with them. Gamer Theory is less about the avatars, images, and interface, and more about the philosophy that drives them. It’s the algorithm as allegory, the formula as form, the rules as rubrics, and what all of it might mean to the culture they’re shaping.
Depending on which end of the human-computer spectrum you’re interested in — from haptics and CSCW to gaming and philosophy — these four books take the pulse of the melding of humans and machines.
Daniel H. Pink: 9-to-5ers Anthem
Daniel H. Pink has been exploring the way we work for over a decade now. From Free Agent Nation (Warner Books, 2001) to A Whole New Mind (Riverhead, 2005), he’s been unearthing the intricacies of the working world from the abstract to the concrete. His latest book, The Adventures of Johnny Bunko (Riverhead, 2008), is a career guide written in the Japanese graphic-novel style of manga (a trailer for which is embedded below). As the world of work continues to get more and more confusing, we need all the help we can get.
Roy Christopher: Your work has made an interesting shift from the nomothetic to the idiographic, from the working trends of the masses to the career of the individual. Has this been an intentional change in focus?
Daniel Pink: A little bit. I’ve tried to write all my books from the perspective of the individual. As I write, I really think about an individual reader going through the pages and trying to glean some information and guidance. The big change with Johnny B. is that I decided to do a pure narrative — and, of course, I decided to tell that story in the picture-based form that is manga.
RC: Tell me about Johnny Bunko, “the first American business manga.” Where’d you get the idea to present your business writing through the Japanese graphic novel format?
DP: It was a bunch of factors. I spent a couple of months last year in Japan studying the manga industry. One of the things I discovered is that manga is ubiquitous in Japan — 22% of all printed material is in comics — in part because it’s a form that’s for adults as well as kids. In any Japanese bookstore, you can find manga to help you manage your time, learn about Japanese history, find a mate, etc. But as popular as manga has become in the U.S., we still don’t have that genre of manga for adults. What’s more, when it comes to career information, people today get their tactical information online — what keywords to include in a résumé, info about what a company does, etc. What they want from a book is what they can’t get from Google: strategic, big-picture advice. That’s why I’ve organized this book around the six broad principles about satisfaction and impact at work that I wish I’d known 25 years ago.
RC: A Whole New Mind goes a long way toward correcting the brain-hemisphere bias we’ve all been trained to accept. The book is definitely full of solid insights, but did you ever feel like you were reaching a bit?
DP: No. If anything, people have told me that I went overboard in the book repeating that both hemispheres — or, more accurately, both left-brain and right-brain style thinking — are important.
RC: There seem to be two sides to the whole-mind concept: one is an opening up to new ideas and influences so that one doesn’t become stale, and the other is a narrowing of stimuli just so one can get one’s work done. How do we find a balance in this?
DP: That’s one of the most central questions of personal productivity. And there’s no simple answer. It seems like it’s less about balance than about being able to toggle back and forth between those two modes. The key, of course, is figuring out when to shift. I have a tough time with that myself.
RC: I’ve been primarily freelancing for most of the last decade. Looking back, how prescient were your ideas in Free Agent Nation?
DP: I let others decide the prescience. What I see is that this form of working has become more prevalent and more socially acceptable. And perhaps more interesting, Corporate America itself is becoming more free agent-like. Job tenures are shorter; companies hire people without extending any expectation that new hires will be there for a long time. Organizations are extending much greater flexibility over time and work style. And, of course, they’re shifting responsibilities like health care and pensions to the individual. In some sense, whether we’re getting a 1099 or a W-2, we’re all free agents now.
——
Here is the trailer for Daniel Pink’s Johnny Bunko (runtime: 1:46):
Haxploitation: Three Recent Books
Networks and network protocols are often seen as sites of control, but extreme connectivity eludes control. Diseases, worms, and viruses all spread because of connectivity — networks — beyond our control.
When networks cause problems, it is because they work too well, not because they are broken. Continue reading “Haxploitation: Three Recent Books”