Aesop Rock’s 900 Bats

Aesop Rock, who previously wrote here about breakfast, just launched a new website called 900 Bats — a creative resource for arts, information, and oddities. It shows the breadth of his interests as an artist (e.g., video, audio, art, photos). It's not his own artist site (try as I might to get him to do one); it goes way beyond something like that.

Aesop’s first post describes the concept:

In an effort to supply a sandbox for what I hope proves to be a multifarious and growing mix of contributors, I, with the help of Alex Tarrant and Justin Metros,  have created 900bats.com.  Original writing, photography, artwork, audio, and video content from varying sources will be posted regularly.

Contributors to the site so far include: Aesop Rock, Alex Pardee, Alexander Tarrant, Chrissy Piper, Colin Evoy Sebestyen, Coro, DJ Big Wiz, Jeremy Fish, Justin Metros, Kimya Dawson, Nick Flanagan, and Rob Sonic. Jeremy Fish did the illustrations for the site, and Alex Pardee supplied the logo. The site was named for the 900 bats killed by renovation workers at Bala Fort in Alwar district, who set the animals on fire to keep them from disrupting their work.

danah boyd: Privacy = Context + Control

danah boyd is one of the very few people worthy of the oft-bandied title “social media expert,” and perhaps the only one who studies social technology use with such a combination of academic rigor and popular appeal. She holds a Ph.D. from UC-Berkeley’s iSchool and is currently a Senior Social Media Researcher at Microsoft Research New England and a Fellow at Harvard University’s Berkman Center for Internet and Society. As the debates over sharing, privacy, and the online control of both smolder in posts and articles web-wide, boyd remains one of a handful of trustworthy, sober voices.

boyd’s thoughts on technology and society are widely available online, as well as in the extensive essay collection, Hanging Out, Messing Around, and Geeking Out (MIT Press, 2009). In what follows, we discuss several emerging issues in social media studies, mostly online privacy, which has always been a concern as youth and digital media become ever more intertwined.

Roy Christopher: Facebook is catching a lot of flak lately regarding their wishy-washy Terms of Service and their treatment of their members’ privacy. Is there something happening that’s specific to Facebook, or is it a coincidental critical mass of awareness of online privacy issues?

danah boyd: Facebook plays a central role in the lives of many people. People care about privacy in that they care about understanding a social situation and wisely determining what to share in that context and how much control they have over what they share. This is not to say that they don’t also want to be public; they do. It’s just that they also want control. Many flocked to Facebook because it allowed them to gather with friends and family and have a semi-private social space. Over time, things changed. Facebook’s recent changes have left people confused and frustrated, lacking trust in the company and wanting a space where they can really connect with the people they care about without risking social exposure. Meanwhile, many have been declaring privacy dead. Yet, that’s not the reality for everyday folks.

RC: Coincidentally, I just saw your and Samantha Biegler’s report on risky online behavior and young people. The news loves a juicy online scandal, but their worries always seem so overblown to those in the know. What should we do about it?

db: Find a different business model for news so that journalists don’t resort to sensationalism? More seriously, I don’t know how to combat a lot of fear mongering. It’s not just journalists. It’s parents and policy makers and educators. People are afraid and they fear what they don’t know. It’s really hard to grapple with that. But what really bothers me about the fear mongering is that it obscures the real risks that youth face while also failing to actually help the youth who are most at-risk.

RC: NYU’s Jay Rosen maintains that his online presence is “always personal, never private.” Is that just fancy semantics or is there something more to that?

db: The word “private” means many things. There are things that Jay keeps private. For example, I’ve never seen a sex tape produced by Jay. I’ve never read all of his emails. I’m not saying that I want to, but just that living in public is not a binary. Intimacy with others is about protecting a space for privacy between you and that other person. And I don’t just mean sexual intimacy. My best friend and I have conversations to which no one else is privy, not because they’re highly secretive, but because we expose raw emotional issues to one another that we’re not comfortable sharing with everyone. Hell, we’re often not sure that we’re comfortable admitting our own feelings to ourselves. That’s privacy. And when I post something online that’s an in-joke to some people but perfectly visible to anyone, that’s privacy. And when I write something behind a technical lock like email or a friends-only account because I want to minimize how far it spreads, that’s privacy. But in that case, I’m relying more on the individuals with whom I’m sharing than the technology itself. Privacy isn’t a binary that can be turned on or off. It’s about context, social situations, and control.

RC: Hannah Arendt defines the private and public realms respectively as “the distinction between things that should be hidden and things that should be shown.” How do you define the distinction?

db: I would say the public is where we go to see and be seen while minimizing our vulnerabilities while the private is where we expose ourselves in a trusted space with trusted individuals.

———————–

Ed. Note: It has come to my attention that what Jay Rosen actually said was, “In my Twitter feed I try to be 100 percent personal and zero percent private.” Apologies to everyone, especially Jay, for the misquote.

The Mesh We’re In: The Ecological Thought

If Special Agent Dale Cooper actually did quit the FBI and retire in Twin Peaks, this might be the book he would write. His beliefs in the connectivity of all things, Tibetan philosophy, and respecting others are all represented throughout The Ecological Thought (Harvard University Press, 2010). Actual author Timothy Morton puts so many aspects of our world into perspective that it makes describing this book and its ideas difficult. His writing flows like so much water over the falls, but the falls are the hard part.

Is this an environmental book? Yes and no. It’s environmental, anti-environmental, and post-environmental. The ecological thought knows the only way out is through. It’s not back-to-Nature, it’s get-past-Nature. It’s not about balance, it’s about difference. According to the ecological thought, this is the mesh we’re in:

Do we fill the hole in the world with holism or Heidegger? Or do we go all the way into the hole? Perhaps it’s a benign hole: through it we might glimpse the Universe. Many environmental writers tell us to “connect.” The issue is more about regrouping: reestablishing some functioning fantasy that will do for now, to preserve our sanity. Yet this is radically impossible, because of the total nature of the catastrophe and the fact that there is no script for it (we are “still here,” and so on). It’s like waking up: it becomes impossible to go back to sleep and dream in good faith. The ecological disaster is like being in a cinema when suddenly the movie itself melts. Then the screen melts. Then the cinema itself melts. Or you realize your chair is crawling with maggots. You can’t just change the movie. Fantasizing at all becomes dubious (pp. 31-32).

Sustainability is a fantasy. Your Prius is no more or less sustainable than your bicycle or your diet. This world is not sustainable. There’s no “re-enchanting” it. There is only enchantment. The end isn’t coming; it already happened. This is what the end looks like. It’s camouflaged to look like the now.

“The effect of mimicry is camouflage…” wrote Jacques Lacan, “It is not a question of harmonizing with the background, but against a mottled background, of becoming mottled — exactly like the technique of camouflage practised in human warfare” (p. 99). Morton writes, “Camouflage, deception, and pure appearance are the stock in trade of life forms” (p. 18). Non-humans do so many of the things that are supposed to separate us from them (e.g., language, imagination, reason, play, technology). Solidarity is the only choice. And why are there life forms at all? “Only because it benefits some replicators to clump together” (p. 85). Please, don’t draw lines in the mesh.

Space isn’t something that happens beyond the ionosphere. We are in space right now. — Timothy Morton

Do you realize, we’re floating in space? — The Flaming Lips, “Do You Realize?”

“There is a bigger picture here” (p. 121). Indeed. Perspectives abound. The Ecological Thought thinks irresistible, impossible, impassible things, because it has to. Because we all have to.

References:

Lacan, J. (1977). The four fundamental concepts of psychoanalysis. London: The Hogarth Press.

Lynch, D. & Frost, M. (Producers). (1990). Twin Peaks [Television series]. New York: ABC.

Morton, T. (2010). The ecological thought. Cambridge, MA: Harvard University Press.

Browser Don’t Surf: The Web’s Not Dead… Yet.

Remember when people used to “surf the web”? Now it is said that typical daily browsing behavior consists of visits to just five websites. William Gibson’s age-old summary of web experience, “I went a lot of places, and I never went back,” has become, “I go a few places, and I stay there all the time.” We don’t surf as much as we sit back and watch the waves. I started this post several months ago when I noticed that the lively conversations that used to happen on my website had all but ceased (and eventually ceased altogether). Though the number of visitors continued to increase, the comments had moved elsewhere. A link on Facebook to a post here garners comments galore on Facebook, but none on the actual post. I doubt that I’m alone in experiencing this phenomenon.

I Tweeted (that still sounds silly, doesn’t it?) sometime last year, “Facebook 2009 = AOL 1999.” I was being snarky at the time, but there are good reasons that the analogy holds. As Dave Allen of North pointed out recently, search engine optimization (SEO) and search engine marketing (SEM) are shams for users. For those who don’t know, SEO and SEM are strategies for gaming Google’s search algorithms, thereby attaining higher page-rank in search results. That’s great if the optimized site actually has what you’re looking for, but unfortunately this is becoming less and less the case (Dave was looking for some bamboo poles from a local source for his backyard in Portland. I challenge you to find one using Google).

Enter closed communities like AOL and Facebook: These social networks help filter the glut by bringing the human element back into the process. So-called “social search” or “social filtering” helps when Google fails. So, even as Facebook has become the new “training wheels” of the Web (as AOL was before it), it also serves as a new organizing principle for all of the stuff out there.

Once I read the Wired cover story on the death of the web, I knew this idea had to be revisited. The claim that the web is dead is more than a ploy to sell magazines and less than a statement of truth. Yes, we’ve used the terms “web” and “internet” interchangeably (even jokingly combining them in the portmanteau “interwebs”) when they’re not the same thing, but don’t get it twisted: The web is not dead. It’s changing, growing, reorganizing, yes. But it’s far from dead.

Organizing principles are just filters; they include, they exclude, they make sense of would-be chaos. Good examples include books, solar systems, and city grids. As an organizing principle, the web is lacking at best, but it’s not lacking enough to wither and die just yet. Sure, the “app-led” (i.e., Appled) future, with its smartphones, iPhones, iPads, and other gadgets, is forming closed silos using the internet’s backbone, but you aren’t likely to be sitting at your desk using anything other than the web for a good while yet.

That brings us back to the shift from outlying sites (like this one) to filtering sites (like Facebook). As long as web search is run by algorithms that can be gamed (thereby rendering them all but useless), the closed silos will keep stacking up — on and off the web proper. Where will that leave sites like mine? I don’t know, but no one is interested in The Roy Christopher App just yet.

SXSW 2011: My Panel/Talks

Voting has begun for South by Southwest 2011. I have proposed two talks and one panel. I am hereby requesting your support. Click on the links below and vote for them:

INTERACTIVE: Disconnecting the Dots: How Our Devices are Divisive:
We drive cars to the gym to run miles on a treadmill. Inclement weather notwithstanding, why don’t we just run down the street? The activities are disconnected. We sit in close physical proximity with each other and text others far away. The activities are disconnected. Technological mediation creates a disconnection between physical goals and technology’s “help” in easing our workload. There are at least two types of disconnection enveloping our days: one between ourselves and our environment (e.g., pumping water vs. pumping iron) and one between ourselves and each other (e.g., individual distraction vs. global connection) with technology wedged in between in both cases. If our culture is essentially technology-driven, then what kind of culture emerges from such disconnections between our physical goals and our technologically enabled activities?

FILM: Building a Mystery: Taxonomies for Creativity:
There is a limit — a rule of the grammar, if you will — to the number of elements that the average story can carry. There’s a point at which too many elements cause one story to fall apart, a line across which something else (e.g., a sequel) is needed. This limit is qualitative to be sure, but it’s not hard to tell when it’s been exceeded. While building a theory and weaving a narrative are very different enterprises, one can see parallels in the number of elements each will carry. It’s less like the chronological restrictions we place on certain activities (e.g., you must be 18 to vote, 21 to drink, etc.) and more like having enough cream and sugar in your coffee. It’s a difference like the one between hair and fur. So, how many elements make a good story?

MUSIC: Finding Success and Thriving on Chaos:
If you need help finding your way into the current music milieu or your way from a rut to a groove, this is the talk for you. Helmed by musicians with lengthy and successful yet unconventional careers and unconventional takes on the upended music industry (e.g., Paul D. Miller a.k.a. DJ Spooky, Dave Allen of Gang of Four/Shriekback, Aesop Rock, Rebecca Gates of The Spinanes, et al.), this panel will be stoked and stocked with helpful information, insight, and inspiration for the aspiring as well as the veteran artist. From punk rock to Hip-hop, all genres are welcome. The unserious need not apply.

Okay, so there are a million other awesome-looking panels and talks, but I must implore you all to vote for these. Voting closes on August 27th, so vote early and every day until then. Please and thank you.

Preston the Cat: R.I.P.

Suspected to have been dead for years, Preston the Cat finally received the call yesterday. He stayed at my parents’ house for seventeen years, through the tenure of two horses, and outlived Priscilla the Cat, Winnie the Dog, and Hershey the Goat. Like his archenemy, His Own Tail, he never liked me much, but we were almost friendly during his last days. At the time of his death, I had only one Preston-inflicted wound requiring a band-aid.

He is survived by Cindy (his initial owner), Moms (couldn’t care less about a cat), Jack (his primary caretaker who affectionately referred to him as “Worthless Furball”), myself, Push Broom, and his best friend Basket of ‘Tatas. His scowl, tail-hating neurosis, and intermittent but incessant knocking on the door will be missed.

Obscured by Crowds: Clay Shirky’s Cognitive Surplus

In The Young & The Digital (Beacon, 2009), Craig Watkins points out an overlooked irony in our switch from television screens to computer screens: We gather together around the former to watch passively, while we individually engage with the latter to actively connect with each other. This insight forms the core of Clay Shirky’s Cognitive Surplus: Creativity and Generosity in a Connected Age (Penguin, 2010). Shirky argues that the web has finally joined us in a prodigious version of McLuhan’s “global village” or Teilhard de Chardin’s “Noosphere,” wherein everyone online merges into one productive, creative, cooperative, collective consciousness. If that seems a little extreme, so are many of Shirky’s claims. The “cognitive surplus” marks the end of the individual literary mind and the emergence of the Borg-like clouds and crowds of Web 2.0.

Okay, not exactly, but he does argue for the potential of the cognitive collective. So, Wot’s… Uh, the deal?

Is Clay Shirky the new Seth Godin? I’d yet to read anything written by him that didn’t echo things I’d already read from David Weinberger or Howard Rheingold (or Marshall McLuhan, of course), and I hoped Cognitive Surplus would finally break the streak. Well, it does, and it doesn’t. As Shirky put it in his previous book, Here Comes Everybody (Penguin, 2008), “society doesn’t change when people adopt new tools; it changes when people adopt new behaviors.” This time around he argues that we adopt new behaviors when provided with new opportunities, which, by my estimate, are provided by new tools — especially online.

Steve Jobs once said that the computer and the television would never converge because we choose one when we want to engage and the other when we want to turn off. The problem with Shirky’s claims is that he never mentions this disparity of desire. A large percentage of people, given the opportunity or not, do not want to post things online, create a Facebook profile, or engage in any number of other web-enabled sharing activities. For example, I do not like baseball. I don’t like watching it, much less playing it. If all of a sudden baseballs, gloves, and bats were free, and every home were equipped with a baseball diamond, my desire to play baseball would not increase. Most people do not want to comment on blog posts, video clips, or news stories, much less create their own, regardless of the tools or opportunities made available to them. Cognitive surplus or not, its potential is just that without the collective desire to put it into action.

Shirky’s incessant lolcat bashing and his insistence that we focus instead on “public and civic value” come off as “net” elitism at its worst. The wisdom of crowds, in James Surowiecki’s phrase, doesn’t necessarily lead to the greater good, whatever that is. You can’t argue for bringing brains together and then expect them to “do right.” Are lolcats stupid? Probably, but they’re certainly not ushering in the end of Western civilization. It’s still less popular to be smart than it is to be a smartass, but that’s not the end of the world, online or off-. The crowd is as wise as the crowd does. Glorifying it as such, as Jaron Lanier points out in You Are Not a Gadget (Knopf, 2010), is just plain wrong-headed.

The last chapter, “Looking for the Mouse,” is where Shirky shines, though. [Its namesake echoes a story Jaron Lanier told in a 1998 Wired article about children being smarter and expecting more from technology. Lanier wrote, “My favorite anecdote concerns a three-year-old girl who complained that the TV was broken because all she could do was change channels.” Shirky’s version involves a four-year-old girl digging in the cables behind a TV, “looking for the mouse.”] His ability to condense vast swaths of knowledge into a set of tactics for new media development in this last chapter is stunning compared to the previous 180 pages. Perhaps he is the new Seth Godin after all.

References:

Lanier, J. (1998, January). “Taking Stock.” Wired, 6.01.

Lanier, J. (2010). You Are Not a Gadget: A Manifesto. New York: Knopf.

Shirky, C. (2010). Cognitive Surplus: Creativity and Generosity in a Connected Age. New York: Penguin.

Surowiecki, J. (2005). The Wisdom of Crowds. New York: Anchor.

Watkins, S. C. (2009). The Young & The Digital. Boston: Beacon.

Operation: Mindcrime — Inception

In his book Speaking into the Air (University of Chicago Press, 1999), John Durham Peters points out that if telepathy — presumably the only communication context more immediate than face-to-face interaction — were to occur, how would one know who sent the message? How would one authenticate or clarify the source? Planting an idea undetected into another’s mind, subconsciously in this case, is the central concept of Christopher Nolan’s Inception. [Warning: I will do my best to spoil it below.]

Looking down on empty streets, all she can see
Are the dreams all made solid
Are the dreams all made real

All of the buildings, all of those cars
Were once just a dream
In somebody’s head
— Peter Gabriel, “Mercy Street”

The meta-idea of planting an idea in someone’s mind, known to some as memetic engineering, is not new; however, conceptualizing the particulars of doing it undetected is. Subconscious cat-burglar Dominic Cobb (Leonardo DiCaprio) specializes in extracting information from slumbering vaults. After a dream-within-a-dream heist-gone-wrong, he’s offered a gig planting something in one: an idea that will grow to “transform the world and rewrite all the rules.” Cobb reminds me of Alex Gardner (Dennis Quaid) in the 1984 movie Dreamscape. Gardner is able to enter the dreams of others and alter their outcomes and thereby the outcomes of “real” situations. Cobb and his team do the same by creating and sharing dreams with others. The ability to share dreams — or to enter other worlds together via dreams, computer networks, hallucinations, mirrors, lions, witches, wardrobes, what-have-you — seems to be a persistent human fantasy. Overall, Nolan does a fine job adding to that canon of stories.

Cognitive linguist George Lakoff gets theory-checked mid-film when Cobb’s partner Arthur (Joseph Gordon-Levitt — standing in for Heath Ledger?) explains inception with the “don’t think of an elephant” ploy. What are you thinking about right now? Exactly. The problem is that you know why you’re thinking that right now. Successful inception requires that you think you thought of the idea yourself, independent of outside influence. It’s the artificial insemination of an original thought, “pure inspiration” in Cobb’s terms.

For better or worse, this concept (which takes the entire first act to establish), its mechanics (designer sedatives to sleep, primitive “kicks” to wake up), and the “big job” (a Lacanian catharsis culminating in the dismantling of a global empire) are just the devices that might enable the estranged Cobb to return home to his children. His late wife Mal (Marion Cotillard — standing in for Brittany Murphy?), or rather his projection thereof, haunts his dreams, jeopardizing his every job. Mal is a standout strong character and performance in a cast of (mostly; see below) strong characters and performances. She is beautiful, scary, and maintains an emotional gravity intermittently missing in this often weightless world. She is the strange attractor that tugs the chaos along. Whenever the oneiric ontology of Inception feels a bit too free-floating, Mal can always be counted on to anchor it in anger and affect.

The first time through, I thought that over-explaining the “idea” idea was the movie’s one flaw, finding myself thinking, “Okay, I get it” over and over. The second time through, though, I homed in on it: The one thing preventing the concept from fully taking hold in the holiest of holies in my head is Ellen Page. Sure, she ably carried the considerable weight of Hard Candy (2005) and manhandled the tomboyish Juno (2007) to breakout success (admittedly with Michael Cera’s help), but her character and performance in Inception are the splitting seam that unstitches the dream into so many threads of sober consciousness. She’s supposed to be a brilliant architect yet simultaneously unaware of the ins and outs of inception and extraction, but she only believably excels at the latter. Where Keanu Reeves’ bumbling and understated Neo made The Matrix (1999) work by asking questions and pulling the viewer into the second world, Page’s clueless Ariadne drags us, the pace, and the other actors down. With the inexperienced patron Saito’s (Ken Watanabe) cues and clues to guide us through the intricacies of dream-theft, Ariadne is rendered all but unnecessary. She’s mostly redundant.

The seed of every story is a conceit, an unrealistic event or idea that the rest of the story sets out to explain. The survivors of a loved one who has committed suicide can never really know why he or she did so. The living can always see another option. If nothing else, Inception succeeds in explaining the suicide of a completely rational person, but I think it succeeds at much more than that.

Note: This post greatly benefited from discussions with and thoughts from Jessy Helms, Cynthia Usery, and Matt Morris.

What Means These Screens? Two More Books

Every once in a while our reliance on technology initiates a corrective or at least a thorough reassessment. In a sort of Moore’s Law of agentic worry, the intervals seem to be shortening as fast as the technology is advancing, and the latest wave is upon us.

Sometimes these assessments are stiflingly negative and sometimes they are uselessly celebratory. Jaron Lanier’s recent book flirts with the former, while other current thinkers lean toward the latter. For instance, where Clay Shirky sees the book as an inconvenience born of an era characterized by a lack of access, Nicholas Carr’s The Shallows: What the Internet is Doing to Our Brains (W. W. Norton & Co., 2010) laments the attempt to shred books into bits and scatter them all over the internet, decontextualizing great paragraphs, sentences, phrases, and words. Apparently Shirky would rather read War and Pieces than War and Peace.

For all of its astute observations and well-argued points, The Shallows sometimes exhibits a strange disparity between what Carr hesitates to claim and what he writes as common knowledge. For example, he states outright that language is not a technology (p. 51), a claim I not only disagree with but find rather bold, yet he hedges when saying that the book is the medium most resistant to the influence of the internet (p. 99), a claim that seems pretty obvious to me. Books, as a medium and as an organizing principle, just do not lend themselves to the changes the digital revolution hath wrought on other media. Neither their form nor their fragmentation makes nearly as much sense.

When we do research, we rarely read an entire book. We scour indices and tables of contents for the relevant bits. As Howard Bloom gleefully explains in his contribution to this year’s summer reading list:

…if you prefer playing video games to plowing through a thousand pages of Joyce’s Ulysses and falling out of your beach chair with periodic bouts of sleep, I highly recommend the Google Book Search e-approach, deep dives into the minds of philosophers you would normally never think of sampling between games of badminton.

As much as I’d love to be able to run a digitally enabled quick-search on all the books on my bookshelf, that doesn’t mean I don’t want the option of pulling one down in its entirety once in a while. The same could be said for the fragmentation of the album as the organizing principle for music. It doesn’t take a 19th century librarian to see that preferring the excerpts and snippets of research is not the same thing as never wanting a book to read. This is the thick thicket, as Matt Schulte would call it, of digitizing books.

Carr’s point, though, is not just the dissolution of our books, but the dissolution of our minds. He claims that the manifold fragments and features of the web are preventing us from concentrating for a book-length spell, much less wanting one. As clear as his argument reads and as solid as his research seems (Carr assembled a firm foundation of writing history and media ecology on which to build), it’s difficult not to take the very point of it as so much pining for a previous era. He’s careful to blunt that point by praising the web’s usefulness and analyzing his own tech habits just enough to soften the prickly parts of his argument. It’s a seductive read in spite of itself.

I thoroughly enjoyed all of The Shallows, but the last chapter, “A Thing Like Me,” is one of the more frustrating twenty-odd pages I’ve read in some time. Not because it was bad, but because it was so dead-on in-tune with my recent thoughts on media and minds. It was a lengthy and weighty I-wish-I’d-written-that experience. Damn you, Nicholas Carr!

Speaking of things I wish I’d written, Tom Bissell’s Extra Lives: Why Video Games Matter (Pantheon, 2010) is a perfect model of how to write about something totally geeky, maintain the things that make it geeky, and still make it accessible to anyone. When I was a gamer, a self-identification I wouldn’t feel comfortable using even in jest today, there wasn’t such a category. Playing video games was a subset of the larger “nerd” label. Given my hiatus from said world, I should’ve been outmoded by Bissell’s admittedly narrow focus on recent console games, a focus he admits runs the “danger of seeming, in only a few years, as relevant as a biology textbook devoted to Lamarckism.” Thankfully, what this book’s subject matter lacks in breadth, Bissell’s intelligence, insight, writing, and wit make up for in spades.

Adult indulgence in video games raises questions of maturity and responsibility in the adult, but it also raises questions about the games themselves. Bissell explores some of both, but mostly the latter. He thoroughly refutes Roger Ebert’s recent claim that video games can never be art (Ebert has since retracted his statements), snags insider insights via interviews with several top game designers, makes fun of Resident Evil‘s deplorable dialog, and descends into the depths of addiction and abuse — on the screen and IRL — with Grand Theft Auto IV. It’s a thumb-blistering journey through the screen and into the machine, and, in spite of its candor and seriousness, it’s damn funny.

What I can say for very few recent books, I can say for The Shallows and Extra Lives: They are as entertaining and funny as they are provocative and informative. Simply put, they are good reads. Carr and Bissell should be proud.