Social Media Fatigue

The closer we get to each other, the less likely we are to have things in common. The more we know about each other, the more likely we are to fundamentally disagree on how the world should work. The more intimate the details we share, the more likely one of us has done something unforgivable in the eyes of the other. Dig deep enough inside anyone and you’re going to find something you don’t like. As my friend Lucas Molandes puts it, the only reason you’re with the person you’re with right now is because all of your previous relationships failed.

I Can't Believe I'm Not Bitter!

Human relationships are messy. We get involved only when we have to. We skim across the surfaces of each other. We give and get only what is needed in each situation: filling out forms, ID numbers, driver’s licenses, log-ins, passwords, online presences, social networks. We inconvenience ourselves for institutions and one another. Even our personal opinions and comments have migrated from scattered sites and blogs to social media silos, soon to be replaced by Likes and Re-Tweets. The illusion of being in touch. Spam disguised as social interaction.

It’s global. It’s local.
It’s the next thing in Social.
Hip-hop, rockin’, or microbloggin’ —
You get updates every time you log in.
So, come on in, we’re open,
And we’re hopin’ to rope in
All your Facebook friends and MySpace memories.
There’s a brand new place for all of your enemies.
You don’t really care about piracy or privacy.
You just want music and friends as far as the eye can see.
So, sign up, sign in, put in your information.
It’s the new online destination for a bored, boring nation.
Tell your friends, your sister, and your mom.
It’s time for something-something-something.com

The numbers say that social media doesn’t replace face-to-face communication, it enhances and encourages it. The numbers say that older people are uncooling social media and driving the youth to other means of interaction. The numbers tell them what we’ve bought in the past, what we’re buying now, and predict what we will buy later. The numbers tell them where we’ve been, where we are, and where we’re going. The numbers know who we are and who we’re likely to become. We are the products of social media. We are what it buys and sells.

And when time like the pyramids has worn away
All the mountains and the valleys of the words that we say
We have got to make sure that something remains
If we lose each other we’ve got no one to blame
— Oingo Boingo, “My Life”

The numbers can’t tell them what it’s like to hold her hand. How nice it is when she’s here or how empty it is when she’s not. They can’t quantify the unashamed laughs of children or the smiles in the eyes of parents. There’s no database for the barely perceivable daydream-driven smirk, no pivot table for the way that curl hits that curve in her neck just so. Big data seems so small in the face of real human detail.

Getting close to someone else is a sloppy, risky mess. The things you love most can quickly become the things you loathe. Taking that chance is the best thing in the world though. And there is no app for that.

Dispatches from Digital Dystopia

David Hoffman once summarized George Orwell’s 1984, writing that “during times of universal deceit, telling the truth becomes a revolutionary act.” Aaron Swartz, Chelsea (née Bradley) Manning, Adrian Lamo, Aaron Barr, and Edward Snowden have all been pawns and prisoners of information warfare. As the surveillance has expanded from mounted cameras to wireless taps, hackers have evolved from phone phreaking to secret leaking. It’s a ratcheting up of tactics and attacks on both sides. Andy Greenberg quotes Hunter S. Thompson, saying that the weird are turning pro. It’s a thought that evokes the last line of Bruce Sterling’s The Hacker Crackdown (1991), which, after deftly chronicling the early history of computer hacker activity, investigation, and incarceration, states ominously, “It is the End of the Amateurs” (p. 301).

These quips can be applied to either side.

Sousveillance device via Steve Mann, 1998.

The Hacker Ethic — as popularized by Steven Levy’s Hackers (Anchor, 1984) — states that access to computers “and anything which might teach you something about the way the world works should be unlimited and total” (p. 40). Hackers seek to understand, not to undermine. And they tolerate no constraints. Tactical media, so-called to avoid the semiotic baggage of related labels, exploits the asymmetry of knowledge gained via hacking (Branwyn, 1994; Lievrouw, 2011; Lovink, 2002; Raley, 2009). In a passage that reads like recent headlines, Geert Lovink (2002), a purveyor of the term, writes, “Tactical networks are all about an imaginary exchange of concepts outbidding and overlaying each other. Necessary illusions. What circulates are models and rumors, arguments and experiences of how to organize cultural and political activities, get projects financed, infrastructure up and running and create informal networks of trust which make living in Babylon bearable” (p. 254). It sounds like a description of the tumult behind Wikileaks and Anonymous.

In This Machine Kills Secrets (Dutton, 2012), Andy Greenberg explores the infighting and odd cooperation among those out to break and build boundaries around certain strains of information. It’s a tale of rogues gone straight, straights gone rogue, and the weird gone pro. It’s a battle over stiffly defined contexts, lines drawn and defended. He writes of the leakers, “They take an immoral act out of some special, secret culture where it seems acceptable and expose it to the world of moral human relationships, where it’s exposed as obviously horrific” (p. 311). Theirs are easy acts to defend when the extremes are so evident, but what about the more subtle contexts? As danah boyd puts it, “Privacy isn’t a binary that can be turned on or off. It’s about context, social situations, and control.” Privacy is not secrecy, but the two are so closely related that the former seems to be lost in the fight against the latter, and so close as to be constantly conflated when debated.

We Are Anonymous

Following Matt Blaze, Neal Stephenson (2012) states “it’s best in the long run, for all concerned, if vulnerabilities are exposed in public” (p. 27). Informal groups of information insurgents like the crews behind Wikileaks and Anonymous keep open tabs on the powers that would be. After a cameo in This Machine Kills Secrets, Aaron Barr takes a more central role in We Are Anonymous (Little, Brown, 2012) by Parmy Olson. A high-end security consultant, Barr set out to expose Anonymous unprovoked and quickly found himself on the wrong side of the line. Again, hackers are easy to defend when they’re on your side. Wires may be wormholes (Stephenson, 1996), but that can be dangerous when they flow both ways. Once you get locked out of all your accounts and the contents of your hard drive end up on the wrong screen, hackers aren’t your friends anymore, academic or otherwise. The recent DDoS attacks on several major torrent trackers should be raising more eyebrows on both sides.

Hackers of every kind behave as if they understand that “[p]ostmodernity is no longer a strategy or style, it is the natural condition of today’s network society” (Lovink, 2002, p. 259). In a hyper-connected world, disconnection is power. The ability to become untraceable is the ability to become invisible (Kluitenberg, 2008). We need to unite and become hackers ourselves now more than ever against what Kevin DeLuca (2007) calls “the acronyms of the apocalypse” (e.g., WTO, NAFTA, GATT, etc.; p. 47). The original Hacker Ethic isn’t enough when Shit is Fucked-Up and Bullshit (Wark, 2012). We need more of those nameless nerds, nodes in undulating networks of cyber disobedience. “Information moves, or we move to it,” writes Neal Stephenson (1996), like a hacker motto of “digital micro-politics” (Lovink, 2002, p. 254). Hackers need to appear, swarm, attack, and then disappear again into the dark fiber of the Deep Web.

Lovink (2002) continues: “The world is crazy enough. There is not much reason to opt for the illusion” (p. 259). Who was it that said Orwell was 30 years off? Tactical media is where we watch the ones watching us.

References:

Branwyn, Gareth. (1994). Introduction: Hackers: Heroes or Villains? In The Knightmare, Secrets of a Super Hacker. Port Townsend, WA: Loompanics Unlimited.

DeLuca, Kevin M. (2007). A Wilderness Environmentalism Manifesto: Contesting the Infinite Self-Absorption of Humans. In R. Sandler & P. C. Pezzullo (Eds.), Environmental Justice and Environmentalism: The Social Justice Challenge to the Environmental Movement. Cambridge, MA: MIT Press, pp. 27-55.

Greenberg, Andy. (2012). This Machine Kills Secrets. New York: Dutton Adult.

Kluitenberg, Eric. (2008). Delusive Spaces: Essays on Culture, Media, and Technology. Rotterdam: NAi Publishers.

Levy, Steven. (1984). Hackers: Heroes of the Computer Revolution. New York: Anchor Press/Doubleday.

Lievrouw, Leah A. (2011). Alternative and Activist New Media. Cambridge, UK: Polity.

Lovink, Geert. (2002). Dark Fiber: Tracking Critical Internet Culture. Cambridge, MA: MIT Press.

Olson, Parmy. (2012). We Are Anonymous. New York: Little, Brown, and Co.

Raley, Rita. (2009). Tactical Media. Minneapolis, MN: University of Minnesota Press.

Stephenson, Neal. (1996, December). Mother Earth, Mother Board. WIRED, 04.12.

Stephenson, Neal. (2012). Some Remarks: Essays and Other Writing. New York: William Morrow.

Sterling, Bruce. (1991). The Hacker Crackdown: Law and Disorder on the Electronic Frontier. New York: Bantam.

Wark, McKenzie. (2012). Telesthesia: Communication, Culture, & Class. Cambridge, UK: Polity.

It Toggles the Mind

Twenty years ago, Arthur Kroker described the predominant spirit of the times as a “spasm” (1993). It’s what Bruce Sterling (1998) describes as “that violently oscillating 1990s state when you feel totally hyper and nauseatingly bored. That gnawing sense that we’re on the road to nowhere at a million miles an hour.” The feeling has expanded to the point where detached irony is our default emotional setting. David Foster Wallace called it “Total Noise” (quoted in Gleick, 2011, p. 403): an all-consuming cultural state that “tends to level everything out into an undifferentiated mass of high-quality description and trenchant reflection that becomes both numbing and euphoric” (Wallace, 2012, p. 301). It’s information anxiety coupled with complete boredom (Gleick, 2011). What happened to the chasm between those two extremes?

Always two things
switching.
Current runs through bodies
and then it doesn’t.
It was a language of sounds,
of noise,
of switching,
of signals.

On again.
Off again.
Always two things
switching.
One thing instantly replaces
another.

It was the language
of the Future.

— Laurie Anderson, United States

Constructing sameness is an essential intellectual activity that goes unobserved. — Mary Douglas, How Institutions Think

A skeuomorph is a design element that remains only as an allusion to a previous form, like a digital recording that includes the clicks and pops of a record player, woodgrain wallpaper, the desktop metaphor, or even the digital “page.” It’s obsolete except in signifying what it supplants. N. Katherine Hayles (1999) describes the concept, writing, “It calls into play a psychodynamic that finds the new more acceptable when it recalls the old that it is in the process of displacing and finds the traditional more comfortable when it is presented in a context that reminds us we can escape from it into the new” (p. 17; cf. Tenner, 2003, p. xii). Skeuomorphs mediate the liminal space between uncomfortable shifts and an uncertain future, translating the unknown into the terms of the known.

Translation is always an amalgam of hope and nostalgia, combining the yearning for home with the urge to press forward into new territories. — Matthew Battles, The Sovereignties of Invention

As with a cramped muscle, the solution to Kroker’s metaphorical spasm is to stretch it out. In the most general sense, my central research question concerns the process by which we mediate our lives with our technologies. What I call The Medium Picture is that process: what it helps, hides, and hinders. A medium is literally a “middle, intermediary state” (Gleick, 2011, p. 153), and that is the place I’ve been investigating. Skeuomorphs bridge the threshold, obscuring the transition, and that is their purpose when it comes to adapting people to new technologies. They soften the blow of the inevitable upgrade. But every new contrivance augments some choices at the expense of others, and what we lose often goes unnoticed.

… multifunctional lidless eyes watching, outside-in and inside-out; our technology has produced the vision of microscopic giants and intergalactic midgets, freezing time out of the picture, contracting space to a spasm. — Rosi Braidotti, Nomadic Subjects

With his finger ever on the flickering pulse, William Gibson (2012) writes, parenthetically, “(This perpetual toggling between nothing being new under the sun, and everything having very recently changed, absolutely, is perhaps the central driving tension of my work)” (p. 51). That binary belies a bulging, unexplored midsection. The space between the two extremes of that switch is what The Medium Picture is about.

References:

Anderson, Laurie. (1984). United States. New York: Harper & Row, p. 22.

Battles, Matthew. (2012). The Sovereignties of Invention. New York: Red Lemonade, p. 84.

Braidotti, Rosi. (1994). Nomadic Subjects. New York: Columbia University Press, p. 43.

Douglas, Mary. (1986). How Institutions Think. Syracuse, NY: Syracuse University Press, p. 60.

Gibson, William. (2012). Distrust That Particular Flavor. New York: Putnam.

Gleick, James. (2011). The Information: A History, a Theory, a Flood. New York: Pantheon.

Hayles, N. Katherine. (1999). How We Became Posthuman. Chicago: University of Chicago Press.

Kroker, Arthur. (1993). Spasm: Virtual Reality, Android Music and Electric Flesh. Montreal: New World Perspectives.

Sterling, Bruce. (1998, October 4). Viridian Design. San Francisco: Yerba Buena Center for the Arts.

Tenner, Edward. (2003). Our Own Devices. New York: Knopf.

Flyover Culture: The Death of the Mainstream

We’re all home for the holidays. Looking around the living room today at the family assembled there, I saw most of them clicking around on laptops, two also wearing headphones, and one fingering a smartphone. The television was noticeably dark and silent, each of us engrossed in his or her own digital experience, be it a game, a TV show, or some social metamedium. Jaron Lanier (2008) writes, “…pop culture is important. It drags us all along with it; it is our shared fate. We can’t simply remain aloof” (p. 385). But what happens when we don’t share any of it anymore? Narrowcasting and narrowcatching, each of us burrows further into our own interests, and we have fewer of them in common as a whole. The mainstream has become less of a stream and more of a mist.

What We Share

A friend of mine noted recently that The Long Tail has gotten so long and so thick that there’s not much left in the Big Head. As the internet-enabled market has supported a wider and wider variety of cultural artifacts with less and less depth of interest, the big blockbuster hits have drawn ever-smaller audiences. This wasn’t the case just a decade ago. The audiences seem to decrease in proportion to the size of the screens. I have found this splintering more and more in the classroom as I try to pick somewhat universal media artifacts to use as examples. Even the biggest shows and movies I brought up this semester left nearly half of my students out, and if I ever got into the stuff I actually like, I was greeted with little more than cricket sounds. The postmodern promise of individual viewpoints and infinite fragmentation is coming closer to fruition.

Cultural divisions as such used to be framed as high versus low culture. New Yorker writer John Seabrook (2000) argues that we have evolved past such hierarchies into what he calls “nobrow culture.” Definitely erring on the high side, Seabrook doesn’t know Stormtroopers from Sand People. Depending on which side of the high/low fence you stand, he and his ilk have “condescended and/or pandered,” in the words of Hal Foster, to you for far too long. The mixing of high culture’s concerns with low culture’s lack thereof only makes sense if there’s a market in the middle. The mainstreaming of anything requires a middle class.

Middle Class, R.I.P.

The middle class is traditionally thought of as the masses of people who are above “working” class but also not quite “upper” class. By definition, membership in the middle class requires a certain amount of discretionary income. Mainstream pop culture relies on that. As that income diminishes and less of the extant money is spent on media due to an increasingly tech-savvy populace, the funding for frivolous entertainment decreases. Art and commerce have always been odd bedfellows, but their offspring are the least interesting children in history. Focus groups, product placement, and everything “brought to you by” a brand are not cool conventions. Mix that division and decline with pop culture’s obsession with its own past, what Simon Reynolds (2011) calls “retromania,” and we get reality television, ubiquitous advertising, and endless remakes and remixes. Reynolds likens the state of the culture industry to global economics, predicting an inevitable crash: “The world economy was brought down by derivatives and bad debt; music has been depleted of meaning through derivatives and indebtedness” (pp. 410-420). If the rest of pop culture ends up like the demonetized music industry, then we can bury the middle class next to the mainstream.

None of this is to say that underground culture is inherently better. It’s never made much sense to describe something aesthetically in terms of the mainstream, and now it makes less sense than ever. Working the ends against the middle, trying to get the best of both worlds, so-called “nobrow culture” ends up with the bad of both without any of the good. Watered-down, diluted, widely disseminated, what’s left of the mainstream is the cultural equivalent of the muddy, middle heartland, viewed from an airplane window. It’s flyover culture.

Wittgenstein (1953) once said there was no such thing as a private language, the presumption being that a language only works if it is shared. The same can be said of culture: it only works if it is shared. Here’s hoping we can continue to find some overlapping dirt to dig.

References:

Anderson, Chris. (2006). The Long Tail: Why the Future of Business Is Selling More of Less. New York: Hyperion.

Lanier, Jaron. (2008). Where Did the Music Go? In Paul D. Miller (Ed.), Sound Unbound: Sampling Digital Music and Culture. Cambridge, MA: The MIT Press, pp. 385-390.

Reynolds, Simon. (2011). Retromania: Pop Culture’s Addiction to Its Own Past. New York: faber & faber.

Seabrook, John. (2000). Nobrow: The Marketing of Culture and the Culture of Marketing. New York: Knopf.

Wittgenstein, Ludwig. (1953). Philosophical Investigations. Hoboken, NJ: Blackwell Publishing.

————

This post benefited greatly from discussion and correspondence with Mark Wieman and Tim Baker.

Shooting Starlets: Girls Gone Wildin’

The transition from adolescence to adulthood is rarely an easy one. As we watch Miley Cyrus shed her youth in real-time, I am reminded of a young Drew Barrymore, coming out of rehab for the first time at age 13. The movies Spring Breakers and The Bling Ring represent the grown-up debuts of beloved childhood Hollywood princesses, Selena Gomez and Emma Watson respectively. The two films are also similar for their adult themes and media commentary. No one would say that a refusal to grow up is endearing, but resistance is fertile. There’s nothing quite as cool as youthful nihilism — especially when wielded by young women. Live fast, die young: Bad girls do it well.

Spring Breakers

The similarities here remind me of when in 2007 the Coen Brothers and Paul Thomas Anderson both did adaptations—both camps tend to write their own scripts—of stories set in West Texas. No Country for Old Men and There Will Be Blood are companion pieces in the same way that Spring Breakers and The Bling Ring are, but here the ladies are the ones with the guns.

Spring Breakers’ heist scene might be the best few minutes of cinema I’ve seen in years. Brit (Ashley Benson) and Candy (Vanessa Hudgens) rob the Chicken Shack restaurant with a hammer and a squirt gun while Cotty (Rachel Korine) circles the building in the getaway car with the camera (and us) riding shotgun. Our limited vantage point gives the scene an added tension because though we are at a distance, it feels far from safe. Much like the security camera footage of Columbine and Chronicle, and the camera-as-character of Chronicle and Cloverfield, we receive a crippled information flow while experiencing total exposure. Their mantra: “Just pretend it’s a fucking video game. Act like you’re in a movie or something.”

Alien (James Franco) arrives as the girls’ douche ex machina, an entity somewhere between True Romance’s Drexl Spivey (1993), Kevin Federline, and Riff Raff, the last of whom is supposedly suing over the similarities. He bails them out of jail after a party gone astray and takes them home to his arsenal. What could possibly go wrong?

Spring Breakers' Alien

Selena Gomez does the least behaving badly, but her role as Faith is still a long way from Alex Russo or Beezus. As she tells her grandmother over the phone,

I think we found ourselves here. We finally got to see some other parts of the world. We saw some beautiful things here. Things we’ll never forget. We got to let loose. God, I can’t believe how many new friends we made. Friends from all over the place. I mean everyone was so sweet here. So warm and friendly. I know we made friends that will last us a lifetime. We met people who are just like us. People the same as us. Everyone was just trying to find themselves. It was way more than just having a good time. We see things different now. More colors, more love, more understanding… I know we have to go back to school, but we’ll always remember this trip. Something so amazing, magical. Something so beautiful. Feels as if the world is perfect. Like it’s never gonna end.

Spring break is heavy, y’all. “I grew up in Nashville, but I was a skater, so I was skateboarding during spring break,” writer/director Harmony Korine told Interview. “Everyone I knew would go to Daytona Beach and the Redneck Riviera and just fuck and get drunk — you know, as a rite of passage. I never went. I guess this is my way of going.” Ultimately the movie illustrates Douglas Adams’ dictum that the problem with a party that never ends is that ideas that only seem good at parties continue to seem like good ideas.

Speaking of bad ideas, Sofia Coppola’s The Bling Ring, which is based on a real group of fame-obsessed teenagers, is full of them. Not since Catherine Hardwicke’s Thirteen (2003; which features Spring Breakers’ Hudgens) has a group of teens been so overtaken by expensive clothes, handbags, and bad behavior. This crew of underage criminals uses internet maps and celebrity news to find out where their targets (e.g., Paris Hilton, Audrina Patridge, Megan Fox, Orlando Bloom, et al.) live and when they will be out of town. Once caught, they seem more concerned with what their famous victims think than with the charges brought against them [trailer runtime: 1:46]:

https://www.youtube.com/watch?v=Q4LzhgExvrc

It would be remiss of me not to note that two of my favorite composers, Cliff Martinez and Brian Reitzell respectively, put the music together for these movies. The mood of Spring Breakers is mostly set by Martinez in collaboration with Skrillex, Gucci Mane (who’s also in the movie), and Waka Flocka Flame, among others. The Bling Ring features a mix of Hip-hop, Krautrock, and electronic pop that reads more eclectic than it actually sounds: Sleigh Bells, Kanye West, CAN, M.I.A., Azealia Banks, Klaus Schulze, Frank Ocean, and so on. Discounting the importance of music in creating the pressure that permeates these films would be an oversight.

Though these films are both cautionary tales of an extreme nature, they prove that caution isn’t cool. Youth might be wasted on the young, but our heroes don’t concern themselves with consequences.

It’s Tricky: Burgeoning Versioning

More mornings than not, either my fiancée or I will wake up with a song securely stuck in one of our heads. Yesterday morning in hers was “The Pursuit of Happiness” by Kid Cudi (2009). Once she found and played the song, I noticed something a bit off about it. I wondered if it had originally been sung by a woman and if he’d just jacked the chorus for the hook. I distinctly remembered the vocals being sung by a woman but also that they were mechanically looped, sampled, or manipulated in some way.

Upon further investigation I found that the song was indeed originally Kid Cudi’s, but that singer/songwriter Lissie had done a cover version of it. Her version is featured in the Girl/Chocolate skateboard video Pretty Sweet (2012), which I have watched many times (Peace to Guy Mariano). Even further digging found the true cause of my confusion: A sample of the Lissie version forms the hook of ScHoolboy Q’s song with A$AP Rocky, “Hands on the Wheel.” This last amalgam of allusions was the version I had in my head [runtime: 3:26]:

https://www.youtube.com/watch?v=dGd9DTTrX4U

So yeah, I sampled your voice. You was usin’ it wrong.
You made it a hot line. I made it a hot song.
— Jay-Z, “Takeover,” 2001

Citing Serge Lacasse, Justin Williams (2013) makes the distinction between the sampled and nonsampled quotation illustrated above. The former is the straight appropriation of previously recorded material; the latter is akin to the variations on a theme found in jazz or to covers like the Lissie version above: a song or part of a song performed rather than cut-and-pasted. Building on Gérard Genette’s work in literature, Lacasse (2000) calls these two types of quotation autosonic (sampled) and allosonic (performed). Of course the live DJ, blending and scratching previously recorded material, conflates these two types of quotation (Katz, 2010), and when we bring copyright law into the mix, things get even more confusing.

Run-DMC: Raising Hell (1986)

For instance, the song “It’s Tricky” by Run-DMC (1986) is primarily constructed from two previous songs. The musical track samples the guitars from “My Sharona” by The Knack, and the hook is an interpolation of the chorus from the hit “Mickey” by Toni Basil (1981). Explaining the old-school origins of the song, DMC told Kembrew McLeod and Peter DiCola, “I just changed the chorus around and talked about how this rap business can be tricky to a brother” (quoted in McLeod & DiCola, 2011, p. 32). Tricky indeed: Twenty years after the song was released, Berton Averre and Doug Fieger of The Knack sued Run-DMC for unauthorized use of their song. “That sound is not only the essence of ‘My Sharona’, it is one of the most recognizable sounds in rock ‘n’ roll,” says Fieger, The Knack’s lead singer. As true as that is, it’s not the most recognizable element of Run-DMC’s “It’s Tricky.”

Ice-T’s track “Rhyme Pays” (1987) samples a guitar riff from Black Sabbath’s “War Pigs” (1970). I remember hearing Faith No More’s 1989 cover version of the Black Sabbath song for the first time and wondering why in the world they’d be imitating an Ice-T song.

I guess I owe Kid Cudi an apology.

References:

Carter, Sean. (2001). Takeover [Recorded by Jay-Z]. On The Blueprint [LP]. New York: Roc-A-Fella/Def Jam.

Katz, Mark. (2010). Capturing Sound: How Technology has Changed Music. Berkeley, CA: University of California Press.

Lacasse, Serge. (2000). Intertextuality and Hypertextuality in Recorded Popular Music. In Michael Talbot (Ed.), The Musical Work: Reality or Invention? Liverpool: Liverpool University Press, pp. 35-58.

McLeod, Kembrew & DiCola, Peter. (2011). Creative License: The Law and Culture of Digital Sampling. Durham, NC: Duke University Press.

Williams, Justin A. (2013). Rhymin’ and Stealin’: Musical Borrowing in Hip-hop. Ann Arbor, MI: University of Michigan Press.

Paradigms Crossed: Building and Burning Bridges in Skateboarding’s Disposable History

Ever since I first saw Wes Humpston’s Dogtown cross on the bottom of a friend’s skateboard in 6th grade, I knew the wood, the wheels, and the art were going to be a part of my world. Like Alex Steinweiss and the album cover, skateboard graphics created the look of skateboarding. There were years where the only thing one knew about a particular skateboarder was the image on the bottom of his (rarely her) board. In the pre-internet world of skateboards, there were only a few companies, fewer videos, and only a few people who controlled almost everything. If you know anything from this era, it’s probably tied in some way to Powell and Peralta’s Bones Brigade.

The Bones Brigade

Only a few professional skateboarders outside of those pictured above mattered on as large a scale during the 1980s. Arguments could easily be made for Christian Hosoi, Gator Rogowski, Mark Gonzalez, and Natas Kaupas among others (my favorites from the era are Neil Blender and Jason Jessee), but The Bones Brigade defined the times. Stacy Peralta, already a skateboarding veteran from the Zephyr Team and the Dogtown of the 1970s, handpicked an iconic group of guys. From the household name of Tony Hawk to the kooky innovations of Rodney Mullen, from the longevity of Steve Caballero to the fierce fun of Lance Mountain, The Bones Brigade is the most legendary team in skateboard history. The empire they built only crumbled when it grew too big to feel or follow the zeitgeist.

Sean Cliver's Disposable

“While other companies scrambled to reinvent themselves with fresh, young teams and a more street-oriented direction,” Sean Cliver (2004) writes, “Powell Peralta remained steadfast in sticking to its guns but floundered in exactly how to go about bridging the old and new generations–especially when it came to graphics” (p. 50). Two main people bridge the genetic fallacy of the Big Five of the 1980s to the populist era of the early 1990s: Rodney Mullen and Sean Cliver. The former invented many of the maneuvers that make up modern street skating, and the latter designed the graphics and artwork. All credit due to Steve Rocco, Craig Stecyk, Mark Gonzalez, and Marc McKee, but those guys all remained in separate and largely opposing camps. Mullen and Cliver are the only ones who worked under the Bones Brigade banner at Powell Peralta as well as the Jolly Roger at Rocco’s World Industries (Mike Vallely notwithstanding; he was more of a pawn than a player and didn’t seem to want any part of it).

Skateboarding pro-cum-team manager Steve Rocco was once told by a company owner that skateboarders couldn’t run companies. After getting fired as a team manager, Rocco decided to do just that. He sniped team riders, pirated images for graphics, and concentrated on a street-smart street style that immediately connected with the kids of the time. The intense intricacies of freestyle were dead and the barriers to entry for riding monolithic vert ramps were prohibitive to most. Street skating was anyone’s game. Walk out the door, jump on your board, grind a curb: you’re street skating. Focusing on that and the irreverence of youth garnered Rocco unmitigated hate from the established skateboard companies, cease-and-desist orders from copyright holders he violated, and millions of faithful followers.

https://www.youtube.com/watch?v=q6btXtUrHTo

A lot of what Rocco did for skateboarding was no different from what Marcel Duchamp and, later, Andy Warhol did for art. It’s also no different from what sampling and Napster did for music. In his book Disrupt (FT Press, 2010), Luke Williams writes, “Differentiate all you want, but figure out a way to be the only one who does what you do, or die” (p. 2). The irony in skateboarding is that the products don’t differ very much from brand to brand. The subtleties of one board, wheel, or truck are infinitesimal. A world like that needs a Kuhnian shaking-up once in a while, and a lot of the shaking Rocco did back then is still reverberating today: Most skateboard companies are run by current and ex-skateboarders, most BMX companies are run by BMXers, street is the largest genre of either sport, and, thanks in large part to Rocco’s Big Brother Magazine, Jackass is still a thing. As the founder of Foundation and Tum Yeto, Tod Swank, put it to me (2007),

…when Rocco started World Industries, what he really did was liberate skateboarding so that it could move forward. He helped a lot of people start companies, not just me. He lent money and gave advice to a lot of other skateboarders who wanted to start companies. He wanted to see the industry run by skateboarders (p. 274).

“The life of an oppositionist is supposed to be difficult,” wrote Christopher Hitchens (2001, p. 3). Conformity is its own reward; dissent is not (Sunstein, 2003), so by upending the established order, Rocco brought a lot of grief upon himself. There’s the world the way you want it to be, and there’s the way that it is. George Powell and Stacy Peralta depicted skateboarding as they wanted it to be. Steve Rocco was more of a mirror of what it was becoming. For better or worse, it’s still going and growing in that direction.

References:

Christopher, Roy. (2007). Tod Swank: Foundation’s Edge. In R. Christopher (Ed.), Follow for Now: Interviews with Friends and Heroes (pp. 269-276). Seattle, WA: Well-Red Bear.

Cliver, Sean. (2004). Disposable: A History of Skateboard Art. Ontario, Canada: Concrete Wave.

Cliver, Sean. (2009). The Disposable Skateboard Bible. Berkeley, CA: Gingko Press.

Hill, Mike (Director). (2007). The Man Who Souled the World [Motion picture]. Los Angeles: Whyte House Entertainment.

Hitchens, Christopher. (2001). Letters to a Young Contrarian. New York: Basic Books.

Peralta, Stacy (Director). (2012). Bones Brigade: An Autobiography [Motion picture]. Santa Monica, CA: Nonfiction Unlimited.

Sunstein, Cass R. (2003). Why Societies Need Dissent. Cambridge, MA: Harvard University Press.

Williams, Luke. (2010). Disrupt: Think the Unthinkable to Spark Transformation in Your Business. Upper Saddle River, NJ: FT Press.

Satisficing and Psychic Erosion

A few years ago, I realized that I was wearing the wrong size pants. All of my pants were too short. Though I’d been buying the same size pants for years and coping accordingly, the realization was sudden. As soon as I was able, which took a few more months, I ditched them and bought all new, appropriately sized pants.

For a long time I used a stereo pre-amplifier I’d gotten at a thrift store to play music from my computer on larger, better-sounding speakers. The increased sound quality was amazing, but the volume knob on the amp had a short in it and often required readjusting. One speaker would go out, and I’d have to go jiggle the knob to get it back.

Pick Any Two.

These two cases are examples of what Herbert Simon called “satisficing”: dealing with decisions that are not optimal but just good enough. Simon claimed that since we can’t know all of the possibilities or consequences of our choices, satisficing is the best that we can do. In other words, we all satisfice in some way on a daily basis. The problem is that when a situation starts to wear on you in barely noticeable ways, slowly eroding your psyche, something seemingly small can quietly build into a real issue. I thought my pants were okay, not realizing for a long time that their ill-fitting length made me uncomfortable and wore on my confidence. Though my faulty volume knob was a chronic annoyance, I never thought it was that big a deal.
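Simon’s heuristic is easy to see in miniature: stop at the first option that clears an aspiration level instead of scanning every option for the best one. Here is a minimal sketch in Python, with the options, scores, and threshold invented purely for illustration.

```python
# A minimal sketch of Herbert Simon's satisficing heuristic: take the
# first option that clears an aspiration level instead of exhaustively
# searching for the optimum. All values here are invented for illustration.

def satisfice(options, score, aspiration):
    """Return the first option whose score meets the aspiration level."""
    for option in options:
        if score(option) >= aspiration:
            return option
    return None  # nothing was "good enough"

def optimize(options, score):
    """Return the best option, which requires evaluating every one."""
    return max(options, key=score)

# Hypothetical example: available pant lengths (inches), scored by fit.
pants = [30, 31, 32, 34]
fit = lambda length: -abs(length - 34)  # 34" is the length that actually fits

print(satisfice(pants, fit, aspiration=-3))  # -> 31: "good enough," worn for years
print(optimize(pants, fit))                  # -> 34: the right size all along
```

The gap between those two answers is exactly the kind of quiet, cumulative cost I’m talking about.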

And — in the biggest of pictures — it wasn’t, but the habit of making do, of dealing with the okay instead of the optimal, can be dangerous. In his latest appearance on Conan O’Brien’s show, Louis CK addresses a version of satisficing that can erode our psyches in the worst way. By avoiding sadness, we erode our humanness. “Sadness is poetic,” Louie says. “You’re lucky to live sad moments.” [runtime: 4:51]

https://www.youtube.com/watch?v=5HbYScltf1c

Being a person, present in the moment, is not always sad, but with our technologically enabled avoidance of sadness, we satisfice our lives away. “You never feel completely sad or completely happy,” Louie says. “You just feel kind of satisfied with your product, and then you die.”

Take It Easy: Television Still Matters

According to Marshall McLuhan’s most famous aphorism, no TV show will ever be more important than the existence of the television as a medium. He never said that the content didn’t matter, he simply said that it didn’t matter as much as the medium itself. Justin Theroux is one of my favorite actors, but his own dismissal of television’s content has been poking at me for weeks:

If I was roped into a seven-year TV contract I’d probably hang myself. It’s a TV show — selling cars, cereal, soda pop… The shows are incidental to the commercials. I always laugh when TV shows pat themselves on the back for being cutting-edge. I mean, an interracial kiss on Ally McBeal is cutting-edge? I’ve never been shocked by anything on television, except the news (IMDB).

Marshall McLuhan

Though I agree with Theroux’s sentiments about TV’s commercialization, to dismiss the medium wholesale is to recklessly miss out on a phenomenon that defines the way we see the world and that has for generations. Since 1960, about 90% of American homes have hosted at least one television set (Spigel, 1992). Film might still be the more powerful medium overall, and the internet might be the newest, but television has a circulation, a frequency, and an intimacy that movies only borrow from time to time and a continuous nature that is rarely replicated online. Television is still the medium that tells our stories.

Moreover, we live in a world composed of the stories we tell, and all of the programming on TV is presented as entertainment — even the news (Signorielli & Morgan, 1996). Cultivation theory states that heavy watchers of television tend to believe that the world outside their homes is like the one they see on the screen (Gerbner, 1967), and the average daily viewing time of seven hours per household and three hours per person has been stable for decades. Where a rented movie or content streamed online offers one a point of krisis, a juncture at which a decision must be made to find something new to watch or to stop watching altogether, broadcast television does a good job of stringing viewers along via overlapping episodes and enticing cold opens. Frequent viewers also tend to be less selective, regardless of their stated preferences, and watch each show until it ends (Signorielli & Morgan, 1996). Sometimes we want to turn on and make decisions; other times we want to turn off and just be entertained.

All of that adds up to one thing: content matters.

Television

The TV’s got them images
TV’s got them all
It’s not shocking
Every half an hour
Someone’s captured and
The cop moves them along
It’s just like the show before
And the news is just another show…
— Jane’s Addiction, “Ted Just Admit It,” Nothing’s Shocking

Tom Hanks once said that film is for directors, theatre is for actors, and television is for writers. Lauren Beukes’ The Shining Girls being developed for TV, Dexter originally being a series of novels, and Veronica Mars originally being planned as a young-adult novel all speak to this. A show like Breaking Bad moves like a novel, while many movies are adapted from short stories. And a recent show like the Sundance Channel’s The Writers’ Room showcases broadcast, screen-based storytelling. The printed page and online flickering signifiers notwithstanding, television is the medium where writers get to shine.

I take it easy
The ice is thinning in the valley of the jeep beats
And when the freaks come out I hug a TV
Somehow a channel zero bender’s less creepy…
— Aesop Rock, “Easy,” Bazooka Tooth

“TV shows matter,” writes David Wong (2012). “They shape the lens through which you see the world. The very fact that you don’t think they matter, that even right now you’re still resisting the idea, is what makes all of this so dangerous to you — you watch… so you can turn off your brain and let your guard down. But while your guard is down, you’re letting them jack directly into that part of your brain that creates your mythology. If you think about it, it’s an awesome responsibility on the part of the storyteller.” I might not see the end of broadcast television, but even if it goes away, I don’t believe that the structure of the serial narrative will disappear. The idea of the TV show will endure. No matter what we watch it on.

References:

Farrell, Perry. (1988). Ted Just Admit It [Recorded by Jane’s Addiction]. On Nothing’s Shocking. Los Angeles: Warner Brothers.

Gerbner, George. (1967). Mass media and human communication theory. In F. E. X. Dance (Ed.), Human Communication Theory: Original Essays. New York: Holt, Rinehart & Winston, pp. 40-60.

Rock, Aesop. (2003). Easy. On Bazooka Tooth [LP]. New York: Definitive Jux.

Signorielli, Nancy & Morgan, Michael. (1996). Cultivation Analysis: Research and Practice. In Michael B. Salwen & Don W. Stacks (Eds.), An Integrated Approach to Communication Theory and Research. Mahwah, NJ: Lawrence Erlbaum Associates.

Spigel, Lynn. (1992). Make Room for TV: Television and the Family in Postwar America. Chicago, IL: University of Chicago Press.

Wong, David. (2012, August 6). 5 Ways You Don’t Realize Movies are Controlling Your Brain. Cracked.com

The Mythology and Missteps of Dune

“A beginning is a very delicate time,” opens the narrative of David Lynch’s 1984 film adaptation of Frank Herbert’s Dune (1965). Herbert says of the novel’s beginnings, “It began with a concept: to do a long novel about the messianic convulsions which periodically inflict themselves on human societies. I had this idea that superheros [sic] were disastrous for humans” (quoted in O’Reilly, 1981). The concept and its subsequent story, which took Herbert eight years to execute, won the Hugo Award, the first Nebula Award for Best Novel, and the hearts and minds of millions. Chronicler of cinematic science fiction follies David Hughes (2001) writes, “While literary fads have come and gone, Herbert’s legacy endures, placing him as the Tolkien of his genre and architect of the greatest science fiction saga ever written” (p. 77). Kyle MacLachlan, who played Paul Atreides, adds, “This kind of story will survive forever” (quoted in McKernan, 1984, p. 96).

Writers of all kinds are motivated by the search and pursuit of story. A newspaper reporter from the mid-to-late 1950s until 1969, Herbert applied his newspaper research methods to the anti-superhero idea. He gathered notes on scenes and characters and spent years researching the origins of religions and mythologies (O’Reilly, 1981). Joseph Campbell, the mythologist with his finger closest to the pulse of the Universe, wrote, “The life of mythology derives from the vitality of its symbols as metaphors delivering, not simply the idea, but a sense of actual participation in such a realization of transcendence, infinity, and abundance… Indeed, the first and most essential service of a mythology is this one, of opening the mind and heart to the utter wonder of all being” (1986, p. 18). Dune is undeniably infused with the underlying assumptions of a powerful mythology.

The sleeper must awaken.

A lot of people have tried to film Dune. They all failed.
— Frank Herbert

After labored but failed attempts by Alejandro Jodorowsky, Haskell Wexler, and Ridley Scott (the last of whom offered the writing job to Harlan Ellison; see Ellison, 1989, p. 203) to adapt Dune to film (Hughes, 2001; Tuchman, 1984), David Lynch signed on to do it in 1981 (Naha, 1984). With The Elephant Man (1980) co-writers Eric Bergren and Christopher De Vore, Lynch started over from page one, ditching previous scripts by Jodorowsky, Rudolph Wurlitzer, and Frank Herbert himself, as well as conceptual art by H.R. Giger (who had designed many of the elements of the planet Giedi Prime, home of House Harkonnen), Jean Giraud, Dan O’Bannon, and Chris Foss. Originally 200 pages long, Lynch’s script went through five revisions before it was given the green light, which took another full year of rewriting (Hughes, 2001). “There’s a lot of the book that isn’t in the film,” Lynch said at the time. “When people read the book, they remember certain things, and those things are definitely in the film. It’s tight, but it’s there” (quoted in Tuchman, 1984, p. 99).

Lynch’s Dune is the brand of science fiction that requires one to suspend not only disbelief in the conceits of the story but also disbelief that one is still watching the movie. I’m thinking here of enjoyable but cheesy movies like Logan’s Run (1976), Tron (1982), The Last Starfighter (1984), and many moments of the original Star Wars trilogy (1977, 1980, 1983). I finally got to see it on the big screen last week at the Logan Theater in Chicago, and as many times as I’ve watched it (it has been regular bedtime viewing for me for years), it was still a treat to see it at the scale Lynch originally intended.

Dune is not necessarily a blight on Lynch’s otherwise stellar body of work, but many, including Lynch, think that it is. When describing the experience, he uses sentences like, “I got into a bad thing there,” “I really went pretty insane on that picture,” “Dune took me off at the knees. Maybe a little higher,” and, “It was a sad place to be” (quoted in Rodley, 1997, passim). Lynch’s experience with Dune stands with Ridley Scott’s Blade Runner and Terry Gilliam’s The Man Who Killed Don Quixote as chaotic case studies in the pitfalls of novel-adapting and movie-making gone wrong.

Beginnings are indeed delicate times, and Frank Herbert knew not what he had started. “I didn’t set out to write a classic or a bestseller,” he said. “In fact, once it was published, I wasn’t really aware of what was going on with the book, to be quite candid. I have this newspaperman’s attitude about yesterday’s news, you know? ‘I’ve done that one, now let me do something else.'” (Naha, 1984). He went on to write five sequels, and his son Brian and Kevin J. Anderson have written other novels set in the Dune universe. Even for its author, the mythology of Dune has proven too attractive to escape.

References:

Campbell, Joseph. (1986). The Inner Reaches of Outer Space: Metaphor as Myth and as Religion. New York: Harper & Row.

Ellison, Harlan. (1989). Harlan Ellison’s Watching. San Francisco, CA: Underwood-Miller.

Herbert, Frank. (1965). Dune. New York: Chilton Books.

Hughes, David. (2001). The Greatest Sci-Fi Movies Never Made. London: Titan Books.

McKernan, Brian. (1984, November). Dune: A Sneak Preview. Omni Magazine, (7)2, 94-97.

Naha, Ed. (1984). The Making of Dune. New York: Berkley Trade.

O’Reilly, Timothy. (1981). Frank Herbert. New York: Frederick Ungar Publishing.

Rodley, Chris (Ed.). (1997). Lynch on Lynch. London: Faber and Faber.

Tuchman, Mitch. (1984, November). The Arts: Film. Omni Magazine, (7)2, 40, 98-99.