Is Anyone There? On her and Transcendence

Cinema is our most viable and enduring form of design fiction. More than any other medium, it lets us peer into possible futures projected from the raw materials of the recent past, simulate scenes based on new visions via science and technology, gauge our reactions, and adjust our plans accordingly. These visions are equipment for living in a future heretofore unseen. As video artist Bill Viola (1995) puts it,

The implied goal of many of our efforts, including technological development, is the eradication of signal-to-noise ratio, which in the end is the ultimate transparent state where there is no perceived difference between the simulation and the reality, between ourselves and the other. We think of two lovers locked in a single ecstatic embrace. We think of futuristic descriptions of direct stimulation to the brain to evoke experiences and memories (p. 224).

— Miles explains love to Edgar the computer in Electric Dreams (1984)

Welcome to the world of Pinecone Computers. This model will learn with you, so type your name and press Enter key to begin.
— Miles Harding reading from a computer manual in Electric Dreams (1984)

Since the big-screen tales of the 1980s PC era, the idea of machines merging with humans has been a tenacious trope in popular culture. In Tron (1982), Kevin Flynn was sucked through a laser into the digital realm. Wired on testosterone, the hormone-driven juvenile geniuses of Weird Science (1985) set to work making the woman of their dreams. WarGames (1983) famously pitted suburban whiz-kids against a machine hell-bent on launching global thermonuclear war. In Electric Dreams (1984), which is admittedly as much montage as it is movie, Miles Harding (played by Lenny von Dohlen, who would go on to play the agoraphobic recluse Harold Smith in Twin Peaks) attempts to navigate a bizarre love triangle among himself, his comely neighbor, and his new computer.

From the jealous machine to falling in love with the machine, the theme remains pervasive 30 years on. As Ray Kurzweil writes of Spike Jonze’s her,

Jonze introduces another idea that I have written about (and that is the central theme of Barry Ptolemy’s movie about my ideas, Transcendent Man), namely, AIs creating an avatar of a deceased person based on their writings, other artifacts and people’s memories of that person. In her, the AIs get together and recreate 1960s philosopher Alan Watts (whom I remember from my teenage years).

— Theodore Twombly at work in her (2013).

I’d say “her” is a movie about (the education of) an interesting woman who falls in love with a man who, though sweet, is mired in biology. — Tweeted on February 16, 2014

In her, Theodore Twombly (played by Joaquin Phoenix) writes letters for a living: letters between fathers and daughters, long-distance lovers, husbands, wives. He condenses stories from the vapor of their nuances. In doing so, he is especially susceptible to the power of narrative himself, since his job involves the constant creation of believable, vicarious stories. His ability to immerse himself in the stories of others makes it that much easier for him to get lost in his operating system (“Samantha,” voiced by Scarlett Johansson) as she constructs narratives to create her personality, and thereby, their relationship.

In many ways, her can be read as a response to Lost in Translation (2003), directed by Jonze’s wife at the time, Sofia Coppola, who, like Jonze did for her, won an Academy Award for Best Original Screenplay. That movie is in part about the dissolution of Jonze and Coppola’s relationship. Where Giovanni Ribisi plays a goofy, self-involved Jonze (“John”) in Lost in Translation, Rooney Mara plays an unsympathetic, judgmental Coppola (“Catherine”) in her: mere caricatures of themselves played out in bit parts. Where others have no problem with it, ex-wife Catherine has no truck with Theodore’s new OS love. He nonetheless remains improbably committed.

Cognitive scientist Douglas Hofstadter calls our imbuing machines with more intelligence than they have—even when we know better—“The ELIZA Effect,” after Joseph Weizenbaum’s text-based psychoanalytic computer program, ELIZA. Hofstadter (1995) writes, “The most superficial of syntactic tricks convinced some people who interacted with ELIZA that the program actually understood everything that they were saying, sympathized with them, even empathized with them” (p. 158). ELIZA was written at MIT by Weizenbaum in the mid-1960s, but its effects linger on. “Like a tenacious virus that constantly mutates,” Hofstadter continues, “the Eliza effect seems to crop up over and over again in AI in ever-fresh disguises, and in subtler and subtler forms” (p. 158). To wit, in Chapter One of Alone Together (2011, pp. 24-25), Sherry Turkle extends the idea to our amenability to new technologies, including artificial intelligence, embodied or otherwise: “And true to the ELIZA effect, this is not so much because the robots are ready but because we are” (p. 25).
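The “superficial syntactic tricks” Hofstadter describes can be made concrete with a toy sketch in the spirit of ELIZA. This is not Weizenbaum’s actual script; the patterns and responses below are purely illustrative. All the program does is spot a keyword and echo the rest of the sentence back with the pronouns flipped, yet the effect can feel uncannily attentive.

```python
import re

# Minimal pronoun-reflection table: first person becomes second person,
# so the echo sounds like it is addressed back to the speaker.
REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you"}

# A few illustrative keyword patterns with response templates.
RULES = [
    (re.compile(r"i need (.*)", re.I), "Why do you need {0}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r"my (.*)", re.I), "Tell me more about your {0}."),
]

def reflect(fragment: str) -> str:
    """Swap first- and second-person words in a captured fragment."""
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in fragment.split())

def respond(sentence: str) -> str:
    """Return a canned response triggered by the first matching keyword."""
    for pattern, template in RULES:
        match = pattern.search(sentence)
        if match:
            return template.format(reflect(match.group(1)))
    return "Please, go on."  # default deflection when nothing matches
```

There is no model of meaning anywhere in this loop, which is precisely Hofstadter’s point: `respond("I am sad about my career")` yields “How long have you been sad about your career?” from nothing but pattern matching and word substitution.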

More germane to her is a program called KARI, which stands for “Knowledge Acquiring and Response Intelligence.” According to Dominic Pettman’s first and only conversation with Kari (see Pettman’s Look at the Bunny, 2013), there’s a long way to go before any of us are falling in love with our computers.

— Kevin Flynn getting zapped into the computer in Tron (1982).

Others imagine a much more deliberate merging, postulating an uploading of human consciousness into the machines themselves, known in robotics and artificial intelligence circles as “The Moravec Transfer.” Its namesake, roboticist Hans Moravec, describes a human brain being uploaded, neuron by neuron, until it exists unperturbed inside a machine. But Moravec wasn’t the first to imagine such a transition (for another early example, see Stine, 1979). NASA’s own Robert Jastrow wrote in 1984 that uploading our minds into machines is the endpoint of evolution, one that would make us immortal:

At last the human brain, ensconced in a computer, has been liberated from the weakness of the mortal flesh… The machine is its body; it is the machine’s mind… It seems to me that this must be the mature form of intelligent life in the Universe. Housed in indestructible lattices of silicon, and no longer constrained in the span of its years by the life and death cycle of a biological organism, such a kind of life could live forever (pp. 166-167).

— Dr. Will Caster merges with the machine in Transcendence (2014).

In Transcendence (2014) Dr. Will Caster (played by Johnny Depp) and his wife (“Evelyn,” played by Rebecca Hall, who almost seems to be filling in for an unavailable Johansson) do just that. Caster is terminally ill and on the verge of offloading his mortal shell. Once uploaded into a quantum computer connected to the internet, Caster becomes something less than himself and something more simultaneously. It’s the chronic consciousness question: What is it about you that makes you you? Is it still there once all of your bits are transferred into a new vessel? The Casters’ love was strong enough for them to try and find out.

If Kubrick and Spielberg’s A.I. Artificial Intelligence (2001) can be read as an allegory for gays being accepted by their parents (see Kraus, 2004, p. 182), what sociological anxieties can we superimpose over her and Transcendence? I am admittedly a lapsed student of AI, having dropped out of the University of Georgia’s Artificial Intelligence master’s program several years ago. My interest in AI lies in the weird ways that consciousness and creation butt heads in the midst of such advanced technologies. Mix a love story in there and you’ve got questions and quests for a lifetime. As Jonze himself puts it, “… a lot of the feelings you have about relationships or about technology are often contradictory” (quoted in Michael, 2013). Love and technology willing, when one of us has to be leaving, we won’t let that come between us, okay?

References:

Hofstadter, Douglas. (1995). Fluid Concepts and Creative Analogies: Computer Models of the Fundamental Mechanisms of Thought. New York: Basic Books.

Jastrow, Robert. (1984). The Enchanted Loom: Mind in the Universe. New York: Simon & Schuster.

Kraus, Chris. (2004). Video Green: Los Angeles Art and the Triumph of Nothingness. New York: Semiotext(e).

Michael, Chris. (2013, September 9). Spike Jonze on Letting Her Rip and Being John Malkovich. The Guardian.

Pettman, Dominic. (2013). Look at the Bunny: Totem, Taboo, Technology. Ropley Hants, UK: Zer0 Books.

Stine, G. Harry. (1979, July). The Bionic Brain. Omni, 1(10), 84-86, 121-122.

Turkle, Sherry. (2011). Alone Together: Why We Expect More from Technology and Less from Each Other. New York: Basic Books.

Viola, Bill. (1995). Reasons for Knocking at an Empty House: Writings 1973-1994. Cambridge, MA: The MIT Press.

Weizenbaum, Joseph. (1976). Computer Power and Human Reason. San Francisco: W.H. Freeman.

Mixed Metonymies: Mechanization and Culture

Meanings are malleable. Words bend and break under the stress of unintended use, abuse, or overuse. Like machine parts pushed past their limits, cogs stripped bare of their teeth, the language we use wears out, weakening the culture that carries it and our knowledge thereof.

Charles Babbage's wheel work.

Aldous Huxley (1970) writes, “In the days before machinery men and women who wanted to amuse themselves were compelled, in their humble way, to be artists. Now they sit still and permit professionals to entertain them by the aid of machinery” (p. 11). We use metaphors and metonymies of the machine to explain everything from individual bodies and brains to society and the cosmos (see Lakoff, 1993; Raunig, 2010; Wilden, 1972). Aristotle used many anthropomorphic ideas to describe natural occurrences, but the technology of the time, needing constant human intervention, offered little in the way of metaphors for the mind. Since then, we have compared the human mind to the clock, the steam engine, the radio, the radar, and the computer (Vroon, 1987). Machines, engines, motors—these are visible, tangible things. The mechanizations we need to watch are the ones we can’t see. As Bettina Knapp (1989) writes, “…machines increasingly cut people off from nature in general and from their own nature, in particular” (p. 28).

In Mechanization Takes Command: A Contribution to Anonymous History (University of Minnesota Press, 2013), originally published in 1948, Sigfried Giedion attempts to elucidate the cause of this splitting from our nature, the break between thought and feeling in modern society. The culprit according to Giedion? Mechanization. He uses a typological approach, moving chronologically through each of his categories: springs (movement), means (hand, key, assembly line), agriculture (gardening, bread-making, meat production), household (chair, table, furniture, feminism, refrigeration), and bath (steam, shower). This provides a matrix of mechanization (time vs. type) that creates a fresh view across this “anonymous history.”

In spite of the machines, interesting people are still central to the story. Giedion follows how the in-house feminism of Catharine Beecher, which “curtailed drudgery and improved organization” (p. 512), led to the further mechanization of the home. He illustrates how Charles Babbage informed the time studies, scientific management, and division of labor of Frederick Taylor and Henry Ford, the inventors of modern industrialization.

“More perhaps than machinery,” writes John Kenneth Galbraith (1967), “massive and complex business organizations are the tangible manifestation of advanced technology” (p. 19). Institutions, bureaucracies, organizations like organisms led to the globalization of the machine: processors, keyboards, hard drives, screens, spreadsheets, websites, databases, fiber optic cables, satellites, wireless clouds bulging gray with data… Paul Virilio (1995) notes that “cyberspace” is a shortening of its imaginary original form, “cybernetic space-time” (p. 140), the extending of which evokes the ultimate mechanical prosthesis of the mind: a planet-spanning, command-control system to end all such systems.

The usually glum Huxley (1970) has his high notes: “Giving leisure and wealth, machines make general culture possible. There can be no doubt that many people, who would otherwise have longed in vain, are now permitted, thanks to machinery, to satisfy their longing for culture” (p. 11). From tilling machines to networked screens, our technology curates our culture. Like the precision workings of cogs and gears, let us be mindful of the language we use to describe it.

References:

Galbraith, John Kenneth. (1967; 2007). The New Industrial State. Princeton, NJ: Princeton University Press.

Giedion, Sigfried. (1948; 2014). Mechanization Takes Command: A Contribution to Anonymous History. Minneapolis, MN: University of Minnesota Press.

Huxley, Aldous. (1970). America and the Future: An Essay. Austin, TX: Jenkins Publishing Company.

Knapp, Bettina. (1989). Machine, Metaphor, and the Writer: A Jungian View. University Park, PA: Pennsylvania State University Press.

Lakoff, George. (1993). The Contemporary Theory of Metaphor. In Andrew Ortony (Ed.), Metaphor and Thought (pp. 202–251). Cambridge, UK: Cambridge University Press.

Raunig, Gerald. (2010). A Thousand Machines: A Concise Philosophy of the Machine as Social Movement. New York: Semiotext(e).

Virilio, Paul. (1995). The Art of the Motor. Minneapolis, MN: University of Minnesota Press.

Vroon, P. A. (1987). Man-Machine Analogs and Theoretical Mainstreams in Psychology. In W. J. Baker, M. E. Hyland, H. van Rappard, & A.W. Staats (Eds.), Current Issues in Theoretical Psychology (pp. 393–141). New York: North-Holland.

Wilden, Anthony. (1972). System and Structure: Essays in Communication and Exchange. London: Tavistock.

————-

Babbage wheel-work image from James Gleick’s The Information (New York: Pantheon, 2011, p. 97).

Cool by Committee: Cultural Capital and Art

“Nobody wants to be uncool,” writes Chris Kraus in her book Video Green (Semiotext(e), 2004, p. 24). She’s writing about the trials of graduate school, specifically MFA programs and the inherent ambiguity in determining the value of art. The rigor of graduate work is part of the gatekeeping and cultural encoding that make the art world go ’round, that make cool art cool. Kraus continues,

…this two-year hazing process is essential to the development of value in the by-nature-elusive parameters of neoconceptual art. Without it, who would know which cibachrome photos of urban signage, which videotapes of socks tossing around a dryer, which neominimalist monochrome paintings are negligible and which are destined to be art? (p. 24)

Damien Hirst: Shark

In his search for authenticity, writer Andrew Potter reduces this hard-won pedigree down to just an artist’s brand. His favorite example is Damien Hirst. “One logical endpoint of this takes us to the world of contemporary art,” he writes (2010), “where many of the works in and of themselves are so ludicrous in concept and so inept in execution that the old philistine war cry ‘My child could do that’ is an insult to untalented children everywhere. But this objection misses the point, which is that the work itself is totally irrelevant. What is being sold is the artist himself [sic], his [sic] persona, or better, his [sic] brand” (p. 98). Brands in this context are largely decided on by the gatekeepers in art schools, galleries, and museums, not so much by “the market” in any economic sense. Potter’s reductionist view is blind to an artist’s training and talent, not to mention her art’s raw aesthetic appeal. Hirst’s art speaks in the language of authenticity (see Boyle, 2003), which must make it worse. Potter adds, “[S]narkiness over sharkiness isn’t serious art criticism, and judging Hirst’s work by the criteria of technical skill, artistic vision, and emotional resonance is like complaining that the Nike swoosh is just a check mark” (p. 99). We may think we’re unaffected by such subversions, but that is a danger in itself. “Considering yourself immune to advertising and branding is not a solution,” writes Rob Walker (2008), “it’s part of the problem” (p. 68).

When Thomas Kuhn (1970) conceived of a paradigm, he was referring to the attitudes and beliefs of the scientists in a community, not the scientific facts themselves. His paradigms are “the entire constellation of beliefs, values, techniques, and so on shared by the members of a given community” (p. 175).* Certain things matter because enough of us decide that they do. We also decide that some of those things matter more than others and that some of them are cooler than others. Cool is tribal. It travels in groups, committees, and communities (see Eckert, 2000; Liu, 2004; Wenger, 1998).

All of these examples hover between what Pierre Bourdieu (1986) called social capital and what he called cultural capital: a system of exchange that takes cultural knowledge as its gold standard. Such knowledge creates in-groups and out-groups (Leppihalme, 1997). You are down if you get the reference and not if you don’t. Craig Dworkin writes in his book No Medium (MIT Press, 2013), “…[W]e are misled when we think of media as objects. Indeed, the closer one looks at the materiality of a work—at the brute fact of its physical composition—the more sharply a social context is brought into focus” (p. 30). Communities of people imbue these objects and their relationships with value. Cool could be the product of an MFA, but it could just as easily be the right amount of properly placed irony or the timely subverting of a paradigm. As Dave Allen puts it in his recent piece “White Ants and Flying Saucers,”

As the famous phrase goes: You are entitled to your own opinions, but not your own facts. This is not to say there won’t be another transitory effect that may destabilize the current models, it is just to say that we must work hard to untangle our strongly held beliefs from the actual reality of the situation. That is where the opportunity for informed debate lies, and the opportunity should be embraced by all who have strong and passionate feelings for the “future of music.”

We tend to think of technological shifts as driven by their own forces (see Winner, 1977), as diffusing through the same old channels (see Rogers, 2003), or as slouching toward their own attractors. People still decide what counts though. Untangling the changes and how we feel about those changes points to the impossibility of finding distance from our devices: The changes happen without our noticing. It’s only when we look back that we can tell a threshold has been crossed, that the paradigm has shifted, that what we thought was cool is now not so much. Sound artist David Dunn (1999) describes it this way:

Most of what we live in now is a technological environment. That’s the status quo. That’s the social ground that constrains us. The degree to which we understand these tools is the degree to which we have freedom from them. If we don’t understand them and don’t know how they work, we easily ascribe to them some mystical significance and belief that the machines are doing our thinking for us (p. 65).

Capital may only want more capital, but art and technology don’t want anything. They are each radically subjective in their own ways. As Kaya Oakes (2009) writes, “Any valid culture, anything that changes people’s perception and way of thinking is made of many, many voices, and the disharmony and occasional harmony of those voices is what makes things interesting and complicated when you’re trying to define what that culture means” (p. 17). I prefer interesting and complicated over cool any day.

* Kuhn’s other definition of paradigms involves the models in use as puzzle-solving tools among those scientists (see Kuhn, 1970, p. 175).

References:

Allen, Dave. (2014, March 11). White Ants and Flying Saucers. Beats Music.

Boyle, David. (2003). Authenticity: Brands, Fakes, Spin and the Lust for Real Life. New York: Harper Perennial.

Dunn, David & van Peer, René. (1999). Music, Language, and Environment. Leonardo Music Journal, 9, 63-67.

Dworkin, Craig. (2013). No Medium. Cambridge, MA: The MIT Press.

Eckert, Penelope. (2000). Linguistic Variation as Social Practice: The Linguistic Construction of Identity in Belten High. Hoboken, NJ: Wiley-Blackwell.

Kraus, Chris. (2004). Video Green: Los Angeles Art and the Triumph of Nothingness. New York: Semiotext(e).

Kuhn, Thomas. (1970). The Structure of Scientific Revolutions (Second Edition, Enlarged). Chicago, IL: University of Chicago Press.

Leppihalme, Ritva. (1997). Culture Bumps: An Empirical Approach to the Translation of Allusions. Bristol, PA: Multilingual Matters.

Liu, Alan. (2004). The Laws of Cool: Knowledge Work and the Culture of Information. Chicago, IL: University of Chicago Press.

Oakes, Kaya. (2009). Slanted and Enchanted: The Evolution of Indie Culture. New York: Henry Holt & Co.

Potter, Andrew. (2010). The Authenticity Hoax: Why the “Real” Things We Seek Don’t Make Us Happy. New York: Harper Perennial.

Rogers, Everett M. (2003). Diffusion of Innovations (5th Edition). New York: Free Press.

Walker, Rob. (2008). Buying In: Why We Buy and Who We Are. New York: Random House.

Wenger, Etienne. (1998). Communities of Practice: Learning, Meaning, and Identity. Cambridge, UK: Cambridge University Press.

Winner, Langdon. (1977). Autonomous Technology: Technics-Out-of-Control as a Theme in Political Thought. Cambridge, MA: The MIT Press.

Social Media Fatigue

The closer we get to each other, the less likely we are to have things in common. The more we know about each other, the more likely we are to fundamentally disagree on how the world should work. The more intimate the details we share, the more likely one of us has done something unforgivable in the eyes of the other. Dig deep enough inside anyone and you’re going to find something you don’t like. As my friend Lucas Molandes puts it, the only reason you’re with the person you’re with right now is because all of your previous relationships failed.

I Can't Believe I'm Not Bitter!

Human relationships are messy. We get involved only when we have to. We skim across the surfaces of each other. We give and get only what is needed in each situation: filling out forms, ID numbers, driver’s licenses, log-ins, passwords, online presences, social networks. We inconvenience ourselves for institutions and one another. Even our personal opinions and comments have migrated from scattered sites and blogs to social media silos, soon to be replaced by Likes and Retweets. The illusion of being in touch. Spam disguised as social interaction.

It’s global. It’s local.
It’s the next thing in Social.
Hip-hop, rockin’, or microbloggin’ —
You get updates every time you log in.
So, come on in, we’re open,
And we’re hopin’ to rope in
All your Facebook friends and MySpace memories.
There’s a brand new place for all of your enemies.
You don’t really care about piracy or privacy.
You just want music and friends as far as the eye can see.
So, sign up, sign in, put in your information.
It’s the new online destination for a bored, boring nation.
Tell your friends, your sister, and your mom.
It’s time for something-something-something.com

The numbers say that social media doesn’t replace face-to-face communication; it enhances and encourages it. The numbers say that older people are uncooling social media and driving the youth to other means of interaction. The numbers tell them what we’ve bought in the past, what we’re buying now, and predict what we will buy later. The numbers tell them where we’ve been, where we are, and where we’re going. The numbers know who we are and who we’re likely to become. We are the products of social media. We are what it buys and sells.

And when time like the pyramids has worn away
All the mountains and the valleys of the words that we say
We have got to make sure that something remains
If we lose each other we’ve got no one to blame
— Oingo Boingo, “My Life”

The numbers can’t tell them what it’s like to hold her hand. How nice it is when she’s here or how empty it is when she’s not. They can’t quantify the unashamed laughs of children or the smiles in the eyes of parents. There’s no database for the barely perceivable daydream-driven smirk, no pivot table for the way that curl hits that curve in her neck just so. Big data seems so small in the face of real human detail.

Getting close to someone else is a sloppy, risky mess. The things you love most can quickly become the things you loathe. Taking that chance is the best thing in the world though. And there is no app for that.

It Toggles the Mind

Twenty years ago, Arthur Kroker (1993) described the predominant spirit of the times as a “spasm”: what Bruce Sterling (1998) describes as “that violently oscillating 1990s state when you feel totally hyper and nauseatingly bored. That gnawing sense that we’re on the road to nowhere at a million miles an hour.” The feeling has since expanded to the point where detached irony is our default emotional setting. David Foster Wallace called it “Total Noise” (quoted in Gleick, 2011, p. 403): an all-consuming cultural state that “tends to level everything out into an undifferentiated mass of high-quality description and trenchant reflection that becomes both numbing and euphoric” (Wallace, 2012, p. 301). It’s information anxiety coupled with complete boredom (Gleick, 2011). What happened to the chasm between those two extremes?

Always two things
switching.
Current runs through bodies
and then it doesn’t.
It was a language of sounds,
of noise,
of switching,
of signals.

On again.
Off again.
Always two things
switching.
One thing instantly replaces
another.

It was the language
of the Future.

— Laurie Anderson, United States

Constructing sameness is an essential intellectual activity that goes unobserved. — Mary Douglas, How Institutions Think

A skeuomorph is a design element that remains only as an allusion to a previous form, like a digital recording that includes the clicks and pops of a record player, woodgrain wallpaper, the desktop metaphor, or even the digital “page.” It’s obsolete except in signifying what it supplants. N. Katherine Hayles (1999) describes the concept, writing, “It calls into play a psychodynamic that finds the new more acceptable when it recalls the old that it is in the process of displacing and finds the traditional more comfortable when it is presented in a context that reminds us we can escape from it into the new” (p. 17; cf. Tenner, 2003, p. xii). Skeuomorphs mediate the liminal space between uncomfortable shifts and an uncertain future, translating the unknown into the terms of the known.

Translation is always an amalgam of hope and nostalgia, combining the yearning for home with the urge to press forward into new territories. — Matthew Battles, The Sovereignties of Invention

Just like a cramped muscle, the solution to Kroker’s metaphorical spasm is to stretch it out. In the most general sense, my central research question concerns the process by which we mediate our lives with our technologies. What I call The Medium Picture is that process, what it helps, hides, and hinders. A medium is literally a “middle, intermediary state” (Gleick, 2011, p. 153), and that is the place I’ve been investigating. Skeuomorphs bridge the threshold, obscuring the transition, and that is their purpose when it comes to adapting people to new technologies. They soften the blow of the inevitable upgrade. But every new contrivance augments some choices at the expense of others. What we lose is often unbeknownst to us.

… multifunctional lidless eyes watching, outside-in and inside-out; our technology has produced the vision of microscopic giants and intergalactic midgets, freezing time out of the picture, contracting space to a spasm. — Rosi Braidotti, Nomadic Subjects

With his finger ever on the flickering pulse, William Gibson (2012) writes, parenthetically, “(This perpetual toggling between nothing being new under the sun, and everything having very recently changed, absolutely, is perhaps the central driving tension of my work)” (p. 51). That binary belies a bulging, unexplored midsection. The space between that switch from one extreme to the other, that is what The Medium Picture is about.

References:

Anderson, Laurie. (1984). United States. New York: Harper & Row, p. 22.

Battles, Matthew. (2012). The Sovereignties of Invention. New York: Red Lemonade, p. 84.

Braidotti, Rosi. (1994). Nomadic Subjects. New York: Columbia University Press, p. 43.

Douglas, Mary. (1986). How Institutions Think. Syracuse, NY: Syracuse University Press, p. 60.

Gibson, William. (2012). Distrust That Particular Flavor. New York: Putnam.

Gleick, James. (2011). The Information: A History, a Theory, a Flood. New York: Pantheon.

Hayles, N. Katherine. (1999). How We Became Posthuman: Virtual Bodies in Cybernetics, Literature, and Informatics. Chicago: University of Chicago Press.

Kroker, Arthur. (1993). Spasm: Virtual Reality, Android Music and Electric Flesh. Montreal: New World Perspectives.

Sterling, Bruce. (1998, October 4). Viridian Design. Speech presented at the Yerba Buena Center for the Arts, San Francisco.

Tenner, Edward. (2003). Our Own Devices. New York: Knopf.

Wallace, David Foster. (2012). Both Flesh and Not: Essays. New York: Little, Brown & Co.

Flyover Culture: The Death of the Mainstream

We’re all home for the holidays. Looking around the living room today at the family assembled there, most were clicking around on laptops, two were also wearing headphones, and one was fingering a smartphone. The television was noticeably dark and silent, each of us engrossed in his or her own digital experience, be it a game, a TV show, or some social metamedium. Jaron Lanier (2008) writes, “…pop culture is important. It drags us all along with it; it is our shared fate. We can’t simply remain aloof” (p. 385). But what happens when we don’t share any of it anymore? Narrowcasting and narrowcatching: as each of us burrows further into our own interests, we have fewer of them in common as a whole. The mainstream has become less of a stream and more of a mist.

What We Share

A friend of mine noted recently that The Long Tail has gotten so long and so thick that there’s not much left in the Big Head. As the internet-enabled market supported a wider and wider variety of cultural artifacts with less and less depth of interest, the big, blockbuster hits have had ever-smaller audiences. This wasn’t the case just a decade ago. The audiences seem to decrease in proportion to the size of the screens. I have found this splintering more and more in the classroom as I try to pick somewhat universal media artifacts to use as examples. Even the biggest shows and movies I brought up this semester left nearly half of my students out, and if I ever got into the stuff I actually like, I was greeted with little more than cricket sounds. The postmodern promise of individual viewpoints and infinite fragmentation is coming closer to fruition.

Cultural divisions as such used to be framed as high versus low culture. New Yorker writer John Seabrook (2000) argues that we have evolved past such hierarchies into what he calls “nobrow culture.” Definitely erring on the high side, Seabrook doesn’t know Stormtroopers from Sand People. Depending on which side of the high/low fence you stand, he and his ilk have “condescended and/or pandered,” in the words of Hal Foster, to you for far too long. The mixing of high culture’s concerns with low culture’s lack thereof only makes sense if there’s a market in the middle. The mainstreaming of anything requires a middle class.

Middle Class, R.I.P.

The middle class is traditionally thought of as the masses of people who are above “working” class but also not quite “upper” class. By definition, membership in the middle class requires a certain amount of discretionary income. Mainstream pop culture relies on that. As that income diminishes and less of the extant money is spent on media due to an increasingly tech-savvy populace, the funding for frivolous entertainment decreases. Art and commerce have always been odd bedfellows, but their offspring are the least interesting children in history. Focus groups, product placement, and everything “brought to you by” a brand are not cool conventions. Mix that division and decline with pop culture’s obsession with its own past, what Simon Reynolds (2011) calls “retromania,” and we get reality television, ubiquitous advertising, and endless remakes and remixes. Reynolds likens the state of the culture industry to global economics, predicting an inevitable crash: “The world economy was brought down by derivatives and bad debt; music has been depleted of meaning through derivatives and indebtedness” (pp. 410-420). If the rest of pop culture ends up like the demonetized music industry, then we can bury the middle class next to the mainstream.

None of this is to say that underground culture is inherently better. It’s never made much sense to describe something aesthetically in terms of the mainstream, and now it makes less sense than ever. Working the ends against the middle, trying to get the best of both worlds, so-called “nobrow culture” ends up with the bad of both without any of the good. Watered down and widely disseminated, what’s left of the mainstream is the cultural equivalent of the muddy, middle heartland viewed from an airplane window. It’s flyover culture.

Wittgenstein (1953) once argued that there is no such thing as a private language; the presumption is that a language only works if it is shared. The same can be said of culture. It only works if it is shared. Here’s hoping we can continue to find some overlapping dirt to dig.

References:

Anderson, Chris. (2006). The Long Tail: Why the Future of Business Is Selling More of Less. New York: Hyperion.

Lanier, Jaron. (2008). Where Did the Music Go? In Paul D. Miller (Ed.), Sound Unbound: Sampling Digital Music and Culture. Cambridge, MA: The MIT Press, pp. 385-390.

Reynolds, Simon. (2011). Retromania: Pop Culture’s Addiction to Its Own Past. New York: Faber & Faber.

Seabrook, John. (2000). Nobrow: The Marketing of Culture and the Culture of Marketing. New York: Knopf.

Wittgenstein, Ludwig. (1953). Philosophical Investigations. Hoboken, NJ: Blackwell Publishing.

————

This post benefited greatly from discussion and correspondence with Mark Wieman and Tim Baker.

Lessons of My Wounded Knee

I’ve spent the last month in a leg brace and the first two weeks of it on crutches. The experience has slowed me down in many ways, not all of which were bad. I’m not recommending cracking a kneecap to get reacquainted with reality, but a good jarring of the sensorium might help us all once in a while. As Doug Rushkoff said recently, “Reality is the human’s home turf.” Nothing brings reality crashing back in like crashing into reality.

Fractured patella

In addition to my patella, I also smashed my phone. The cracked screen left it useless for texting or taking pictures. Ironically, the only things it will do now are make calls (provided I know or can find the number) and receive them. I also haven’t been wearing headphones, as my injury already makes me an easy mark. These two things — no texting and no headphones — reconnected me with aspects of my days I’d been avoiding or ignoring.

All Citizens Must.

Also, I’ve had to change up my commute. For one thing, I haven’t been able to ride my bike to work (obviously), which is what I was doing when I crashed. And I haven’t been able to take the train because I couldn’t walk that far on crutches. It should also be noted that there are only a few CTA train stations with elevators. Stairs were out of the question for a few weeks. This put me on a multiple bus-route commute that took me through parts of Chicago I’d never seen.

Possibly the most important factor in making this an enlightening experience is sociological rather than technological. Collectively, we tend to other the impaired among us; there seems to be a clear delineation between the impaired and the normal. When one of us is only temporarily injured, though, that line blurs, and we sympathize, empathize, or pity them.

In the month that I haven’t been texting or listening to music and have had a bum leg, I’ve had countless uplifting and informative conversations with people whom I wouldn’t have spoken to otherwise and who wouldn’t have spoken to me for one reason or another. All of the above has made me feel far more connected than any technology or so-called “social” media.

Triangle of Doom.

Speaking of which, I posted the following on Facebook about a week into my recovery, and I repost it here because it garnered more response than anything I’ve ever posted there:

My smashing my knee into the pavement at the origami triangle fold of traffic that is the intersection of Elston, Fullerton, and Damen in Chicago has shoved me out of my comfort zone in several ways. One thing I noticed today on my temporarily revised, much-longer commute to campus is a lot of needless anger: a man walking by the bus stop, angry at his dog for being a dog; a lady with her children, angry at them for being children; people on the bus, angry about being on the bus; the bus driver, angry about the people on the bus; and on and on.

I’m not exactly happy that my right patella is fractured in two places, and I’ve certainly had good and bad days since I broke it, and I’m not better than any of those mentioned above, but I try to smile at everyone, laugh at my fumbling around on crutches, do my work, and generally let others carry the anger.

It’s so easy to be angry, but it doesn’t take much more effort to be pleasant, and being pleasant makes everything easier for everyone.

Getting out of your comfort zone doesn’t have to be quite so uncomfortable, but sometimes being forced is the only way for it to happen. It feels like I needed it.

With that said, a physical therapist saw me out walking with my leg brace on the other day. He stopped and asked me about my injury with genuine and professional interest. He then informed me that a broken patella is the most painful kind of injury, which, he added, is supposedly why it is the chosen punishment for those late on their loan or gambling payments. I don’t recommend getting behind.

Satisficing and Psychic Erosion

A few years ago, I realized that I was wearing the wrong size pants. All of my pants were too short. Though I’d been buying the same size pants for years and coping accordingly, the realization was sudden. As soon as I was able, which took a few more months, I ditched them and bought all new, appropriately sized pants.

For a long time I used a stereo pre-amplifier I’d gotten at a thrift store to play music from my computer on larger, better-sounding speakers. The improvement in sound quality was amazing, but the volume knob on the amp had a short in it and often needed readjusting. One speaker would go out, and I’d have to go jiggle the knob to get it back.

Pick Any Two.

These two cases are examples of what Herbert Simon called “satisficing”: settling for decisions that are not optimal but just good enough. Simon claimed that since we can’t know all of the possibilities or consequences of our choices, satisficing is the best we can do. In other words, we all satisfice in some way on a daily basis. The problem is that when a situation wears on you in barely noticeable ways, slowly eroding your psyche, something seemingly small can quietly build into a real issue. I thought my pants were okay, not realizing for a long time that their ill-fitting length made me uncomfortable and wore on my confidence. Though my faulty volume knob was a chronic annoyance, I never thought it was that big a deal.
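Simon’s idea has a precise form worth spelling out: a satisficer stops searching at the first option that clears an aspiration level, while an optimizer must score every option to find the best. The sketch below is only an illustration; the pants-themed options and their scores are hypothetical, not anything from Simon.

```python
# A minimal sketch of Herbert Simon's satisficing heuristic: accept the
# first option that clears an "aspiration level" (a good-enough bar),
# rather than exhaustively searching for the optimum.

def satisfice(options, score, aspiration):
    """Return the first option whose score meets the aspiration level,
    or the best seen so far if none does (we still have to choose)."""
    best = None
    for option in options:
        if score(option) >= aspiration:
            return option  # good enough -- stop searching here
        if best is None or score(option) > score(best):
            best = option
    return best

def optimize(options, score):
    """Exhaustively pick the best option (requires scoring everything)."""
    return max(options, key=score)

# Hypothetical choices and fit scores, in the order they're encountered.
pants = {"too short": 0.4, "almost right": 0.7, "well fitted": 0.95}
score = pants.get

print(satisfice(pants, score, aspiration=0.6))  # "almost right"
print(optimize(pants, score))                   # "well fitted"
```

The gap between the two answers is the essay’s point: the satisficer walks away wearing the almost-right pants and never scores the rest of the rack.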

And — in the biggest of pictures — it wasn’t, but the habit of making do, of dealing with the okay instead of the optimal, can be dangerous. In his latest appearance on Conan O’Brien’s show, Louis C.K. addresses a version of satisficing that can erode our psyches in the worst way: by avoiding sadness, we erode our humanness. “Sadness is poetic,” Louie says. “You’re lucky to live sad moments.” [runtime: 4:51]


Being a person, present in the moment, is not always sad, but with our technologically enabled avoidance of sadness, we satisfice our lives away. “You never feel completely sad or completely happy,” Louie says. “You just feel kind of satisfied with your product, and then you die.”

Borg Like Me by Gareth Branwyn on Kickstarter

As you know, my interests tend to veer from the high-tech to the underground, from authors to zine-makers, from science to punk. Well, my friend Gareth Branwyn is a bit of both. He’s been an editor at Mondo 2000 and bOING-bOING, as well as at both high-minded WIRED and the D.I.Y.-bible MAKE. He recently stepped down as Editorial Director of the latter and is currently compiling all of his various and important writings into one volume, but first he has to fund the project.

I interviewed Branwyn years ago (2001), and he told me then:

One of the great things about being so bloody old is that I’ve had a chance to experience every flavor of fringe media from the mid-’70s on. I caught the tail end of ’70s hippie media, then the punk DIY movement of the ’80s, then the ’zine publishing scene of the ’90s, and then web publishing in the ’90s.

He’s never left the scene, making his one of the most important voices in (any) media today. Borg Like Me will be indispensable for understanding 21st-century media mayhem. But don’t take my word for it: check out a 25-page sample of the book [.pdf], and watch the video on its Kickstarter page. A worthier cause you’re not likely to find or fund.

My Rosi Braidotti Piece on H+ Magazine

My piece about Rosi Braidotti’s latest book (“Beyond the Body with Rosi Braidotti,” June 1, 2013) was picked up by h+ Magazine.

H+ Magazine

The site describes itself like so:

h+ covers technological, scientific, and cultural trends that are changing — and will change — human beings in fundamental ways. We follow developments in areas like NBIC (nano-bio-info-cog), longevity, performance enhancement and self-modification, Virtual Reality, “the Singularity,” and other areas that both promise and threaten to radically alter our lives and our view of the world and ourselves.

More than that, h+ aims to reflect this newest edge culture by featuring creative expressions of humanity on a razor’s edge where daily life and science fiction seem to be merging.

I’m sure you’ve already read it, but here it is anyway. Thanks to Peter Rothman for spreading the word(s).