Seven Decades of TV Noir: A Short Introduction

Originally published in the Noircon 2012 Proceedings
Edited and Produced by Lou Boxer, Deen Kogan, and Jeff Wong
November 2012
Philadelphia, PA 

Introduction

It might be a bit surprising to note that the legacy of TV noir stretches back almost seven decades. That is nearly as long a history as its more celebrated competitor and inspiration, film noir. While the first films noir were shot between 1941 and 1945, as Ray Starman notes in his book TV Noir, the first noir TV series made their debuts in 1949 with Martin Kane, PI and Man Against Crime. Of course, there is a major and indisputable reason why TV noir came after film noir: while Hollywood was reaching the apex of its studio era in the 1940s, there were no real TV networks until 1947.


But since 1949, noir has been a constant staple of television programming, generating hundreds of series and thousands of hours of viewing pleasure. Given its plentiful output, cultural impact, and historical legacy, it is likely that more people have encountered noir stories watching TV shows in their living rooms than by sitting in movie theaters. Yet in terms of scholarly and fan-based activities, there have been more books and festivals dedicated to film noir than to TV noir. TV noir is often dismissed as “inferior” to film noir, and many fans of noir will tell you they don’t “watch television.”

These are my working notes to accompany the 2012 Noircon panel, “Crime in Primetime: TV’s Most Innovative Noir Series.” Both here and during the panel, I will make a case for reassessing TV’s contributions and innovations to the noir style and storytelling. Towards these ends, I will briefly lay out some of the major ways in which TV noir differs from its cinematic counterpart in terms of its forms, authors, and constraints, and some of the reasons why you should be watching some truly exceptional television series right now if you are a fan of noir.

1. The Forms

Noir and film noir appear on television in myriad ways, from the replaying of classic movies to throwback skits on variety shows to “special episodes” of non-noir TV series. In this last instance, the noir style is mostly an excuse to shoot an episode in black and white, smoke cigarettes, and wear fedoras. In fact, the grisly crime stories that open many local nightly news broadcasts might be the most noir productions on TV. The pre-eminent form of TV noir is the one-hour prime-time drama. The one-hour drama is not actually a full hour: once you take into account commercial interruptions and station identifications, most hour-long shows run approximately 42 minutes. Each episode of a noir TV series is therefore considerably shorter than a classical film noir, which typically runs between 90 and 120 minutes.

Another major difference from the movies is TV noir’s serial format. TV noir is ultimately about the series—depending on the era and the network, that means anywhere from 13 to 26 episodes per season. Therefore, while Hollywood movies were clearly an influential source for early TV, the bigger influence in the beginning was commercial radio. Television appropriated one of radio’s main programming strategies: the regular and predictable rhythm of a prime-time schedule.

TV noir’s profuse output makes it harder to discuss these series in the same depth with which we can analyze films noir. It’s a matter of scope and scale. Consider Law and Order. Over its twenty-year run, the original Law and Order series produced 456 episodes. That is an astounding narrative archive that would take almost two months to view in its entirety if you watched eight episodes a day (as seen in the chart below, from Wikipedia, that tracks the major characters over 20 seasons). Such a large number of episodes also raises the difficulty of being a TV noir completist, i.e., of having seen every episode of a series. Multiply that problem by hundreds of series, and you can see the analytical difficulty of writing about and discussing TV noir.

[Chart from Wikipedia: Law and Order principal cast over 20 seasons]

In terms of serial structure, each episode of a series can be a self-contained story, which can sometimes feel like a slimmed-down (or, for some, watered-down) version of a film plot with a rushed and frequently predictable resolution. Or shows can employ continuing storylines or story arcs to extend narrative events over multiple episodes. Story arcs can tell a more complex story that unfolds over dozens of hours of viewing. Some of TV noir’s most innovative shows have opted for this latter approach. Shows like The Wire and Breaking Bad use intricate story structures to achieve an almost novelistic storytelling depth that is simply impossible to attempt in a ninety-minute film noir. Long-running series have their benefits.

Having to generate a new noir story every week also meant that TV noir returned to its serial roots in pulp fiction and mystery magazines. There can be a thrilling vicariousness in watching a well-made TV noir because you establish a powerful connection with characters that can extend and deepen for years. But the ongoing run of a TV series can also create challenges for writers, who have to make sure the show’s major cast members survive the impending doom of the fateful noir universe, robbing many TV noir series of the narrative uncertainty that animates some of our best films noir.

2. The Authors

Film has always tended to be identified as a director’s medium. Under the influence of auteur theory, directors are assumed to be the main creative drivers behind a film. It is just not the same on television. An episodic TV show requires a unified look and a similar directing style over multiple episodes, so it is important for directors to “stick to the script.” Even if a director wanted to get more visually adventurous, there is not enough time in pre-production to execute such ideas, and it’s the producers (not the director) who hire the creative personnel on the set. Even with such limitations, some well-known directors have been drawn to the small screen, including David Lynch (Twin Peaks), Alfred Hitchcock (Alfred Hitchcock Presents), Martin Scorsese (Boardwalk Empire), Quentin Tarantino (episodes of CSI:), and Rian Johnson (an episode of Breaking Bad).

But ultimately, television is a producer’s medium. For example, name the director of a recent TV episode you watched. Did you draw a blank? Don’t feel bad. Most people don’t recall who directs a TV episode, because we conventionally attribute authorship of a TV series to a producer or show creator. Now name a TV producer. Who created Hill Street Blues? The Streets of San Francisco? Law and Order? Breaking Bad? For most who have watched these shows numerous times, the names Steven Bochco, Quinn Martin, Dick Wolf, and Vince Gilligan would likely come to mind.

But Bochco, Martin, Wolf, and Gilligan (even when they also write entire episodes for their own series) tend to be more in control of a team of writers who operate under the constraints of the show’s template. This point returns me to my essay about networked authorship for Noircon 2010, where I examined the role of multiple writers on a film noir script. My argument was that there was much cross-pollination among creative personnel in classical Hollywood, and that the writing behind many films noir might be best understood as the hybridization of shared interests in hard-boiled stories and creative exchanges rather than the work of a single auteur. I used Strangers on a Train and the roles of Patricia Highsmith, Raymond Chandler, and Alfred Hitchcock as my example.

If we turn our attention to TV writing, the idea of networked authorship is its basic creative model—even more so than film. TV series must employ teams of writers and directors to meet the stringent production demands of producing between 13 and 26 episodes in a short time frame. This means that the show’s creator tends to be the creative hand that guides the overall evolution of the series, while separate writers are credited with individual episodes. Increasingly, some of our best noir TV shows have raised networked authorship to a new level. HBO’s The Wire might be one of the best examples. That show’s creator, David Simon, hired some of the best literary noir writers in the business to pen episodes, including George Pelecanos and Dennis Lehane. As a result, the writing on The Wire shines. (Image from Time Magazine, by David Johnson, picturing Ed Burns, David Simon, and George Pelecanos in Baltimore, where The Wire is set)


Moreover, like the Hollywood studio system that preceded it, TV noir can build new series on pre-existing literary properties. TV noir over its seven decades has been dominated by topical series that seem to be tireless retreads of “ripped from the headlines” stories, but TV noir can and has benefited from adapting hard-boiled literature. While one of the more prominent examples might be a show like the 1980s’ Mickey Spillane’s Mike Hammer, consider a couple of shows like Justified and Wallander. The first show, a hit on FX, is based on a character created by Elmore Leonard, and the latter, a series from the BBC, is based on Swedish novelist Henning Mankell’s internationally known police inspector, Kurt Wallander. While source material alone is no guarantee of quality when adapting for a new medium, Justified‘s writing has clearly benefited from a rule shared among the writing staff: when in doubt, WWLD? (What Would Leonard Do?) (Image from NJ.com, Elmore Leonard with Timothy Olyphant of Justified)


3. The Constraints

As we begin to consider the most innovative noir series in TV history, I believe we will find that the constraints of TV have fueled those shows in powerful ways. As Shannon Clute and I discuss in our book, The Maltese Touch of Evil: Film Noir and Potential Criticism (Dartmouth College Press, 2011), there are very good reasons to look at films noir as “constrained texts.” Moreover, some of our most creative noir films have consciously embraced constraints to reveal noir’s potential.

From its earliest days, the TV industry has been subjected to more constraints than the film industry. Part of this is due to the fact that TV was broadcast directly into our living rooms. Therefore, long after film noir shook off the most restrictive forms of censorship, TV’s versions of noir tales had to meet prime-time viewing standards on broadcast networks, and depictions of sex, violence, and language were heavily censored well into the 1990s. In 1993, NYPD Blue could still use censorship constraints in its pursuit of quality storytelling. Blue (as the show was frequently called) intentionally stirred controversy with partial nudity and adult language that now seem tame compared to pay-cable shows like The Sopranos, as in a much-discussed scene featuring NYPD Blue actress Charlotte Ross. But pushing against the constraints of broadcast standards did much to usher in today’s more complex noir series.

TV budgets were also historically lower than film budgets, so TV shows tended to be restricted to a few sets that got reused in almost every episode. Many TV noir sets can have the feel of a locked-room mystery. Location shooting was typically too expensive, so TV began to reuse old movie studio sets in ways reminiscent of Poverty Row studio practices. But this constraint also became one of TV’s biggest contributions to noir. TV noir tended to seek a distinctive “locale” for exterior shots that would give a show a geographic identity and differentiate it from other series. This moved TV noir stories beyond the urban environs of LA, New York, and San Francisco (though plenty of noir shows are still set in these cities). On television we get more noir than expected from sun-drenched locales such as Hawaii (a favorite destination of crime shows, such as Magnum, PI) or Miami (Miami Vice or CSI: Miami). We also get noir in some of the smaller regions of the US, as in Justified (with its great use of Harlan County, Kentucky) or Breaking Bad (and the emergence of New Mexico as a noir border-crossing region).


TV also had a much more restricted image size. Compared to the “big screen,” early television sets were extremely small. The quality of the TV image could never compare to a film print (until extremely recently in the digital age, and even now a digital print can’t match a well-struck 70mm Technicolor print). It was not even a fair comparison. TV in the 1950s was a blurry, low-resolution, electronically refreshed mess compared to the luxurious richness and dense visual field of projected film. And the technological indignities continued over the decades: TV developed color long after film, and new sound technologies were slow to be adopted. Sometimes, with the rise of HDTV, we forget all that.

Personnel working on TV series have also faced far greater limitations in visual design and cinematography. This meant that until fairly recently TV was not focused on visual storytelling as much as narrative design and the growth or goals of recurring characters. A show like The Fugitive in the 1960s is a great example. Dr. Kimble’s four-season pursuit of a one-armed man was a continuing story arc that wrested a creative benefit from episodic storytelling. Since TV, like film, constantly recycles narrative strategies, The Mentalist pursues a similar narrative structure with a serial killer named Red John.

Perhaps one of the best TV shows of all time is Breaking Bad (this is the TV series I will discuss on the panel, “Crime in Primetime”). The show’s creator, Vince Gilligan, is on record as a fan of constraints, noting that he has embraced them in the writing and design of Breaking Bad. Gilligan’s embrace of constraints can be seen in the final product. The cast is fairly small for a series going into its fifth and final season. The show has played out over a restrictive time frame, roughly a year of action over the first four seasons. And the show elegantly uses the “cold open” (a modern variant of in medias res) to introduce visual metaphors and plot elements that draw the viewer into the complex world of Breaking Bad, as in the recurring and ultimately deeply meaningful motif of the pink bear in Season Two.


And the show uses the character of Walt White—a high school teacher turned meth cook—to examine with an uncanny depth and human perspective the global reach of today’s decaying noir universe. And Gilligan does all this under the censorship constraints of AMC, a basic cable station.

Conclusion

What are the most innovative TV noir shows? I’ve only begun to scratch the surface of crime, mystery, and noir drama on the small screen. For seven decades, TV has supplied memorable (and some not-so-memorable) noir programming that has advanced noir storytelling. But with shows like The Shield, Justified, and Breaking Bad, TV noir is in a period of renascence. Noir stories are becoming more complex and intricate. New technologies and higher budgets have allowed TV noir to expand its visual design. And long-form stories are becoming ever more elaborate in shows such as 2007’s Forbrydelsen (a brilliant Danish police procedural recently remade by AMC as The Killing; the image below is of the character of Sarah Lund from Forbrydelsen).

[Image: Sarah Lund in Forbrydelsen]

Noir’s seventh decade is off to a great start.

Let the debates begin.

Tweeting the Lesson: Social Media Curation and the Cultivation of the Imagination

This is a presentation I gave in a lightning session at the 28th Annual Teaching and Learning Conference in Madison, WI, in August 2012. Since a lightning session could only be 10 minutes long, I gave myself the constraint of making my points in 10 slides only. Each slide lasted for exactly one minute (yet another constraint, coming out of my interest in the Oulipo and my earlier practices around Pecha Kucha presentations). I also wanted the visuals in each slide to carry the point as much as my spoken text, so I spent a good deal of time thinking about how to design the visual portion of this talk. In the end, I focused on how many different ways I could use the Twitter logo (that little blue bird) to make the underlying point that the goal in using Twitter in higher education is to help students “cultivate their imagination.”

I am glad I waited to post this talk, because my experience using social media in my MOOC, Investigating Film Noir, which I taught from March to April 2013, has helped me confirm my basic thesis here. Using Twitter, Storify, and Pinterest in my MOOC, I do believe I started to reach some of the goals that I articulate in this talk. My MOOC cultivated outside subject matter experts, and it involved the synthesis and application of learning outcomes. In my MOOC, students would tweet pictures of some of the images from the films we were studying, and other students would re-tweet or comment on those images. We would ask questions on Twitter, opening up answers to our questions to experts beyond those enrolled in the course. And many of the tweets exchanged among students displayed a metacognitive awareness of what it means to analyze film.

Here are the ten slides. Each slide is followed by my spoken commentary.

Slide01

Since I only have 10 minutes in a lightning session, I will focus on a single, main learning outcome for those of you interested in building social media into your course designs. Throughout this talk, I will return again and again to “cultivating the imagination” of our students. That phrase intentionally recalls the work of John Seely Brown and Doug Thomas in their wonderful book, A New Culture of Learning.

To achieve this “cultivation of the imagination” we will need innovation at 3 levels:

  • we will need trained and innovative instructors
  • we will need new and innovative assignments
  • we will need to encourage students to want to be innovative

I see “cultivation of the imagination” as the major learning outcome for higher education. Regardless of your favorite learning theory, whether you are a constructivist, a clicktivist, a connectivist, or a subscriber to any other theory, creating imaginative and self-efficacious thinkers rates highly in all of them. If we cultivate our students’ imaginations, we will have the lifelong learners we desire.

Social media is one area to explore along that path.

Slide02

Social media tools intrigue me because they have the capacity to encourage and support lifelong learning, and because they foreground their very informality. Twitter was not designed with college accreditation standards or assessment techniques in mind. It is a commercial tool that can be used for learning from others through the open and global Web, but it takes some design and purposefulness to make that happen. Currently, when we bring social media into the classroom, we tend to kill what makes it such a great learning tool to begin with. We tend (in my experience as an instructional designer) to enclose it within the confines of the formal course objectives, and that frequently short-circuits its engagement with larger online networks. In these instances, students start to wonder why they are using Twitter in the first place. Seldom if ever will a course hashtag trend worldwide.

An integrated learning approach tries to have it both ways: to balance and transfer between formal and informal learning. The course “walls” of a socially mediated learning experience need to be more porous than solid. I see Twitter (to pick just one social media tool out of the many) as a way to support student-centered learning. But I want students on Twitter in all its informal messiness. That way, Twitter can bring expert voices into your classroom. Twitter can be a medium for your students to share their passions and curiosity with a large number of followers and thereby build new connections and new relationships in the classroom and beyond it.

Slide03

And that brings me to curation.

Twitter is frequently seen as a curatorial site. But I would make a critical distinction between three modes of curation. 

Twitter is built with informal curation tools. It is a way for any user of Twitter to receive and assess the constant flow of tweets. We can favorite a tweet and keep it for future reference. We can retweet to show our interest in another person’s tweet. But informal curation is mostly about receiving new messages, and giving them an initial assessment.

A step up on the curatorial ladder is a more formal personal curation. Using tools such as Storify, paper.li, or scoop.it, social media stories can be assembled and recalled later when one wants to review or, better still, apply that information.

But finally, curation can be crowd-sourced and shared. It can be the basis of a networked curation. Students can produce new knowledge from an archive of aggregated tweets. They can add new information to others’ tweets by providing new contexts, new explanations, and new insights using a host of social media aggregators. This is the key goal for my talk today. How do we design our projects and use our learning outcomes to encourage these practices? When students engage in networked curation, they will get closer to the goal of cultivating the imagination.

Slide04

But as long as social media is locked within the confines of an online course, we will likely experience something closer to #TwitterFail. Students will feel that the exercise of social media is basically hollow. It becomes just another task to complete rather than a new kind of digital and informational literacy. To reach that higher learning outcome, we need more learning experiments involving social media curation and cultivation. We are at an early phase of social media integration into the online curriculum. What we don’t know is much greater than what we do know at this point.

How curation will continue to develop in the future is an open question in my mind, but now is the time for experimentation. And if the experiments are focused on generating more powerful learning outcomes through social media, we will find ourselves on the right track, even if some of those experiments fizzle.

Slide05

That brings me back to Twitter in its specific form. I see Twitter as having multiple layers, and where those layers reside is telling to me.

The top portion of an expanded tweet is the message and the person who sent the message – what we will call the “content.” There are some basic curatorial tools that require little effort on the part of the person who reads the message – one can retweet the message or favorite it. I will return to the “reply” function in a moment.

The next layer down is the crowd-sourcing information. You can find out if others have found this tweet interesting. How many times has it been retweeted? Favorited? Who did this? And should I follow some of these people, since they might share some of my interests?

Twitter also gives you a “timestamp” that will be useful when you start to aggregate multiple tweets.

But the bottom layer of a tweet is its cultivation layer. Retweeting and favoriting are good, but replying builds new connections and adds to the original knowledge object. It is in replying that the message is extended, and the learner can express new information. Yet this is the bottom layer of a tweet.

Slide06

Compare this to a tool like Storify. Storify is one aggregator among many on the open Web, but it will stand in for other ways of cultivating social media knowledge.

In Storify, the connection and cultivation layer is the top layer. To engage with Storify is to be both a curator and a cultivator.

As I will argue throughout this talk, you need both. You need to sort through the massive information network and make choices (curation) and have a means for adding new information and new syntheses in a structured way (cultivation). Or as I prefer to state it: we achieve cultivation through curation. We see critical thinking at the level of both the assemblage and its new context, and, like cultivation in agriculture, this work brings forth an entirely new ecology of knowledge. I particularly like Storify because it foregrounds its role as a knowledge cultivator through its trope of “stories.” To tell a story is an act that brings together many different learning outcomes.

Slide07

Which brings us back to cultivating the imagination. How does this approach to social media apply to online course design?

I would argue that “cultivation through curation” touches on almost all the key learning outcomes we might seek in our course designs. Storify can support the evaluation of information, as student learners evaluate their aggregated tweets. Storify can support the synthesis of knowledge, and we can assess how well students bring together tweets to show their engagement with disciplinary knowledge. Storify can aid in comprehension as students have to sort through and make sense of information that can be crowd-sourced and supplemented by subject matter experts from beyond the course.

Slide08

And this brings us back to the concept of “networked curation.” If one tweets to connect, then one of the major connections we are making through social media is a connection to participatory learning.

Social media rewards active participation in ways that make it valuable for online learners and online learning communities. One thing that I do like is how Twitter, or even tools like Storify and Pinterest, are only “parts of the story.” These socially mediated communication streams and tools only come to life when learners engage with them and complete the story. Storify allows students not only to aggregate their tweets, but also to comment on the “new story” that emerges from the act of aggregation. As tweets are built into stories, and as students share and reflect on those new stories, new possibilities for curation and the cultivation of the imagination emerge.

Slide09

These are four of the major takeaways when we focus on social media curation. As is always important in instructional design, you need to consider what social media adds to your discipline. This is going to be different for an Art History course, or a Sociology course, or a Business course. But in each case, the art of communication and the science of connection and curation can come together to transform student learning and engagement. Part of what is fun about social media curation is how visible the learning is. You can follow along with your students as they create these new knowledge connections and these new archives.

Slide10

Which brings me to my final point. If we take the cultivation metaphor to its logical end, we start to recognize that far too frequently, in our efforts to connect our students to disciplinary knowledge, we inadvertently bracket off the wealth of networks, the flow of information, collaborative energies, and networked data. We tend, even when we intend the opposite, to encourage our students to construct their information as personal, as if they are building old-fashioned knowledge cabinets. They might be able to collect our “disciplinary specimens” and arrange them in their own self-contained cabinets of curiosity that hearken back to the early days of the Enlightenment and the desire for encyclopedic knowledge. But I want to leave this talk by offering another vision, one not of information contained and walled off, but of information set free and shared. In this final maneuver, then, social media curation of disciplinary knowledge might lead towards opening up new fields of knowledge, new areas of engagement and collaboration. Information can be powerful when it is shared. Let’s see what kinds of information fields might open up in the social media age.


26 Ways to Use Twitter for Teaching and Learning: A Storify-ied Reflection


On 1/4/2012, Ed O’Neill (Twitter: @learningtech) posted an interesting series of 52 tweets on the topic of “Using Twitter for Teaching and Learning.” I liked how he was using Twitter to discuss Twitter, but I wasn’t sure about the best way to respond to his tweets. I felt that, since his tweets started from a single presentation, it would be best to try to respect the order and logic of his initial presentation.

Towards that end, I used Storify to collect his tweets in the order they were originally tweeted. This, of course, required me to re-order his tweets, since Twitter posts (by their very nature) arrive in reverse chronological order. Once I assembled all 52 of his tweets, the original aim of his presentation became much clearer to me, though I liked the way that Twitter required Ed to chunk his presentation into different lexia.
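As an aside for anyone curious about the mechanics, the re-ordering step is simple enough to sketch in code. This is just an illustration with made-up tweet data (not Ed's actual tweets, and no real Twitter API call): once you have each tweet's text and timestamp, restoring the original presentation order is a single sort.

```python
# Illustrative sketch only: restoring the original order of a tweet series.
# Twitter clients display tweets newest-first, so we sort oldest-first by timestamp.
from datetime import datetime

tweets = [  # hypothetical sample data
    {"text": "3. Use a course hashtag consistently.", "created_at": datetime(2012, 1, 4, 9, 2)},
    {"text": "1. Twitter can be a teaching tool.", "created_at": datetime(2012, 1, 4, 9, 0)},
    {"text": "2. Model good tweeting yourself.", "created_at": datetime(2012, 1, 4, 9, 1)},
]

in_original_order = sorted(tweets, key=lambda t: t["created_at"])
for tweet in in_original_order:
    print(tweet["text"])
```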

And this is worth noting: I felt that these lexia mostly benefited from having to fit into 140 characters. Twitter created a constraint that I felt was in keeping with the cognitive power of Ed’s argument. (And here I am intentionally thinking of Edward Tufte’s well-trod pamphlet The Cognitive Style of PowerPoint, where he notes how PowerPoint seems to rob slides of their full evidentiary and illuminating potential.) Twitter’s constraints probably work because breaking text into brief but precise messages is a different kind of cognitive activity from the cut-and-paste, overly templated, and bullet-pointed mentalities of many PowerPoints.

I also liked that, as I sought to extend my engagement with Ed’s originating tweets, Storify let me write in the “margins,” so to speak. Storify lets you comment on each individual tweet, so you can weave your own thoughts and reflections into the very fabric of a series of related tweets. This strikes me as a great way to restore an argumentative or presentational flow that might seem missing in the ever-flowing stream of tweets that sail through the columns of my TweetDeck.

I want to thank Ed again for starting this conversation. Ed brings a great perspective to learning technologies (in keeping with his Twitter de plume), one that reminds all of us educators of the importance of connecting new technologies to learning objectives and the spread of best practices.

Finally, Twitter, in my opinion, has not gained the traction I might have expected in the higher education classroom, and Ed’s tweets contain many ideas that faculty members could start using in their classes right away, particularly the value of Twitter as a tool for communication between faculty and students and as a way to get students collaborating and reciprocating around course-based ideas and themes.

If you want to read my Storify-ied reflection, click on the link at the top of this post.

Revealing Generation Text: A Video Documentary on Cell Phones in High School

“Mobile phones are a way of life for Generation Text.”
–Craig Watkins, The Young and the Digital

As anyone who reads my blog knows, I am a big supporter of digital and multimedia literacies in higher education. Since my postdoctoral work at the Institute of Multimedia Literacy back in the early ’00s, I have been researching and writing about uses of new media and new technologies in the classroom. In each of my academic appointments, at Saint Mary’s College and IUPUI, I have also experimented with multimedia literacies in my own pedagogy, frequently implementing and evaluating best practices around new technologies and their potential to improve student learning outcomes. And my own courses typically revolve around project-based learning. I prefer (whenever possible) to have my students create multimedia projects as a major learning component. Probably no surprise again, but I’m a big fan of higher-order processes in learning (following Bloom’s famous taxonomy) and find project-based learning to yield marvelous moments of synthesis and evaluation.

However, my focus to date had been exclusively on new technologies and their applications in higher education. That changed earlier this year when I got the opportunity to collaborate with my younger brother on an Oppenheimer Family Foundation grant. My brother, Bob Edwards, is a teacher at Phoenix Military Academy, a high school in the Chicago Public Schools district. Bob has been similarly interested in the role of new technologies in the classroom. I was thrilled to hear that he had won a Teacher Incentive Grant from the Oppenheimer Family Foundation. This grant is designed to support project-based learning in the Chicago Public Schools. My brother’s winning proposal was:

 Revealing Generation Text:
Students will research cell phone usage, investigating how texting is affecting teenagers. They will create a documentary film disclosing their findings.

My brother asked me to be a part of his project, since he knew of my areas of interest and because he was planning on making a documentary film involving teams of students from his senior-year Creative Writing class. Together we carved out a plan to have his students work in small teams with dedicated assignments. Each team would interview students, teachers, and administrators to hear their thoughts and observations about Generation Text and student texting. Moreover, we decided each team would also focus on a different issue related to Generation Text and the cell phone policies of Chicago Public Schools:

  1. What is and what should be the cell phone policy in Chicago Public Schools?
  2. What are the pros and cons of cell phone confiscation by teachers and administrators?
  3. How can cell phones be used as learning tools?
  4. What would a world without cell phones look like?

Using the money from the grant, we were able to purchase four Cisco Flip HD cams. A quick aside about Flip cams: I am aware that Cisco ceased support for this product in April 2011, but it was a great tool for us, especially for its ease of use, the quality of its picture, and, in my opinion, the quality of its small internal microphone. The Flip cams worked easily and flawlessly for our documentary, and it’s a shame that this product is no longer going to be on the market.

The students spent several weeks arranging for their interviews and shooting their videos. We got lots of great footage from each team, and then set about assembling the final cut.

You can see the final results for yourself here by watching the video, which we just completed and which was just screened for the Oppenheimer Family Foundation and a group of first-year students at Phoenix Military Academy.

I want to say a few more words about the documentary and what I learned. Overall, we wanted to have the student voices take center stage and present a balanced portrait of the everyday realities of cell phone uses and abuses in Chicago Public Schools. As someone who has worked exclusively in higher education, I can say the video was very illuminating to me. I consider myself up to date on the literature around cell phones in higher education, so I was surprised to learn about some of the very real problems created by cell phone use in K-12 education.

While I expected to hear about the cell phone as a tool of “distraction” (and this was a major theme echoed by all participants in the video), the cell phone is also a tool for bullying in K-12. I wasn’t aware of this as I began working on the documentary. As the Pew Internet and American Life Project noted last year,

Over a quarter (26%) of teen cell phone users reported having been harassed by someone else through their cell phone. Girls are significantly more likely to experience this (30%) than boys (22%). This trend is more common for those teens whose parents are under 40 and low in educational attainment.

Responses in the focus groups were split with regard to how serious of a problem this is. Some teens clearly believe this is major problem with serious social and psychological consequences, while others feel that it is “not really a big deal.”

Bullying and student fights came up as a big issue in our documentary. One reason for the “zero tolerance” ban in Chicago Public Schools is to stop the problem of bullying by cell phone. Therefore, educators who are trying to experiment with educational uses for cell phones have to be aware of the negative uses of this technology among K-12 students.

On the positive side, it was great to see the eagerness with which students embrace the new capabilities of cell phones, especially smart phones. However, there is something of a “cell phone arms race” among high school students, who want not just a cell phone but the “right” cell phone. Clearly there is a digital divide between high school students who have cell phones and those who have smart phones. This divide, especially in terms of social capital among one’s peers, appears to be a much greater gap than the one between students who have laptops and students who have no laptops. The role of texting and mobile communication in students’ everyday lives is primary in this regard. Students see their cell phone as an extension of their identity and lifestyle (many students commented on how confiscation of their phone was literally “the end of their world”; hyperbolic, perhaps, but the sentiment is genuine).

In one segment, a group of students ran “speed tests” between a laptop and a mobile phone (running Android 2.0). These students were able to demonstrate that they could get faster results from Google through their mobile phone connection than through the laptop running on the school’s wireless network. What intrigues me is that the students are aware of these speed differences. You get a sense that the mobile generation is deeply savvy about connection speeds – a useful thing to be aware of, especially if your “life” is being conducted through a mobile device.

Most of the other major insights we found are in the final video itself, but I was glad to hear that the students themselves are aware of how the cell phone can be a distraction. There is no doubt that many students are surreptitiously texting each other all day long, and as long as the policy is “zero tolerance,” students will continue to do their best to keep their texting out of sight. But even against the backdrop of “texting as distraction,” many students are keenly interested in exploring how cell phones can be used as educational tools. A good sampling of those possible uses can be found in the Speak Up 2010 Survey, which surveyed high school students on what they would like to use cell phones for during the school day:

  • Check grades
  • Conduct research
  • Take notes in class
  • Collaborate with friends
  • Use the calendar
  • Send an email
  • Access online textbooks
  • Check out school activities
  • Create and share videos

That list is a good starting point for conversation and reminds me of how I use my iPhone professionally. I anticipate that the most common counter-argument among K-12 teachers would be that computer labs and laptops can fulfill these functions, but that argument comes with some caveats. Certain smart phones are great video tools and are much easier to use than checking out a video camera from the AV closet. Moreover, in many school districts, students are much more likely to have cell phones than laptops (due to the difference in expense, even counting a monthly wireless plan), and we have to be cognizant of this “digital divide” as well. Finally, if we don’t bring cell phones into our classrooms, we run the risk of having students miss opportunities to use a mobile device as a component of a formal learning exercise rather than as a personal texting or game-playing tool. As we state in the documentary (quoting Liz Kolb), educators can take a lead role in teaching students how to use cell phones more ethically, a key lesson for preparing them for 21st-century learning and professional occupations.

Of course, as the video taught me, we have to tread carefully around the topic of cell phones in high schools. They have many positive uses, but also great potential downsides. My sense is that, going forward, we may see some small modifications to the current “zero tolerance” policies in many high schools and some exploration of ways to use cell phones in a limited capacity as educational tools.

Personally, I think it is worth the effort to experiment with cell phones due to their popularity and their ability to draw students into the classroom. Revealing Generation Text ends with a student who reminds us that the cell phone is, at heart, a communication tool and asks: wouldn’t it be great if cell phones led to more communication between students and teachers? That’s a vision I would love to see come to fruition some day.

All in all, I am very thankful my brother invited me to be a part of his grant and documentary video project. Thanks also to the Oppenheimer Family Foundation, the musician Moby (who gave us permission to use his song “Flying Foxes”), and all the teachers, administrators, and students who participated in our project. I learned a lot about cell phones in K-12 education.

I would love to hear feedback on this video and the thoughts of other educators who have been or might be considering using cell phones in their classrooms or schools.

Remix and Potential Criticism: CSA 2011 Talk

Here is the version of this paper I presented today at the Cultural Studies Association Conference 2011 at Columbia College Chicago. I am planning on revising this for publication. This version of the paper was crafted to fit into a 15-minute time slot, so I try to hit the high points of my argument in just under 2,000 words. As always, comments are most welcome, but I am most interested in where my argument is unclear or where it could benefit from expansion or concision. Thanks!

Epigraph:

“Potential reading has the charm of making manifest the duplicity of texts, be they oulipian or not.”

–Harry Mathews

In the July 2005 issue of Wired Magazine, the sci-fi novelist William Gibson offered his take on remix culture in the essay “God’s Little Toys: Confessions of a Cut and Paste Artist.” In that piece, Gibson directly linked digital remix culture back to the 1950s and the Beat Generation, especially William Burroughs and Brion Gysin’s technique of “cut ups.” In so doing, Gibson was conferring aesthetic cachet on a new set of disreputable practices by finding an older set of disreputable practices that had become respectable and tame over time. Besides glancing backwards for historical antecedents, Gibson also looked ahead: “We live at a peculiar juncture, one in which the record (an object) and the recombinant (a process) still, however briefly, coexist. But there seems little doubt as to the direction things are going.” (2005) For Gibson, the recombinant is marked by “appropriation” and “borrowings,” and its key unit is the “sample.” As he explains: “Everything I wrote, I believed instinctively, was to some extent collage. Meaning, ultimately, seemed a matter of adjacent data.” (2005) These notions—the “cut and paste” artist, the dominance of collage, and appropriation as creative act—exert a powerful hold on the critical and popular imagination and comprise a conventional litany on the pros and cons of remixing. However, appropriative collage is not the only method for deriving meaning from adjacent data.

In this paper, I argue that recombinatorial remix explores the “potentiality” that exists inside all texts. Rather than focusing on “appropriation,” “borrowing,” or even “artistic pilferings” (all of which have a history of practice as long as art itself), I will examine how formal, restrictive, and mathematical approaches to recombinatorial play are transformative and creative in an Oulipian sense. Texts remixed under constraint are experimental and playful in different ways from “cut and paste” works. Traditional uses of “sampling” or “borrowing” or “pilfering” overemphasize the creative role played by randomness and chance by focusing too much attention on the aleatory dimension as the key remix aesthetic. The abundant references in remix culture to Dadaism and Surrealism, for example, attest to a framing of the remix as an heir to “automatic writing” and “exquisite corpses.” If we generate a remix through aleatory mechanisms, the resulting “information” will be dependent upon chance operations.

I argue that such approaches ignore the rich legacy of creative works that have focused on potential of another kind. In this other approach, “potential” is generated from formal, mathematic logics, rules of mean and variation, and restrictive and constrained artistic modes. One group in particular, the Oulipo, has been at the forefront of this conscious investigation into potentiality.

The Oulipo is an acronym for Ouvroir de Littérature Potentielle, roughly translated into English as “Workshop of Potential Literature.” The Oulipo is a group of French writers and mathematicians whose creative work and research focuses on “all writing that [is] subjected to severely restricted methods.” (Mathews 205) Some of the better-known members of the group include the writers Raymond Queneau, Georges Perec, and Italo Calvino.

The Oulipo focuses on “creations that create” more than “created creations.” These “creations that create” are characterized by the use of formal, artificial, even mathematical constraints that are determined before the act of writing begins. The Oulipo believes that constraints help writers “escape that which is called inspiration.” (Lowenthal, xii) Lowenthal continues that Raymond Queneau, one of the founding members of the Oulipo, thought that “the typical act of inspiration draws from limited resources. Rather than restricting the possibilities of creation, [Queneau] argued, the use of artificial structure–mathematical and otherwise–opens the way to a vaster range of potential creation.” (xii) This idea of “potential” creation was specifically defined against Surrealism and very different in its operations from the technique of “automatic writing” or the random combinations of Burroughs’s cut-up texts.

Oulipian works reveal what they mean by “severely restricted procedures.” Georges Perec wrote A Void, a lipogrammatic novel in which he did not use the letter “e.” Italo Calvino wrote If on a Winter’s Night a Traveler, a book made up of ten first chapters of imaginary novels. But Raymond Queneau might have written the most famous work of potential literature, his poem A Hundred Thousand Billion Poems. It is a 14-line sonnet with a twist: there are ten alternate lines for each line in the poem. [the previous link brings you to a digital version of this poem] Even though there are only 140 total lines of poetry, the potential of the poem is spectacular, since the alternates combine into 10^14 possible sonnets. It has been determined that it would take more than a lifetime to read every possible version of the poem. Therefore, most of the meaning of the poem lies in a “potential” state, waiting to be remixed.
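To give a rough sense of just how far “more than a lifetime” undersells the scale, here is a quick back-of-the-envelope calculation (my own illustration, not part of the original paper), assuming a reader could get through one complete sonnet per minute, around the clock:

```python
# Rough scale of Queneau's combinatorial sonnet (illustrative assumptions only:
# one complete sonnet read per minute, with no breaks).
lines_per_sonnet = 14
choices_per_line = 10

total_sonnets = choices_per_line ** lines_per_sonnet        # 10**14 possible poems
minutes_per_year = 60 * 24 * 365

years_to_read_all = total_sonnets / minutes_per_year
print(f"{total_sonnets:,} possible sonnets")                 # 100,000,000,000,000
print(f"roughly {years_to_read_all:,.0f} years of reading")  # ~190,258,752 years
```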

It is worth noting that most remix artists would probably not identify themselves with the Oulipo or their legacy. But DJs and remix artists frequently have more in common with Georges Perec’s conscious use of constraints than William Burroughs’s chance-driven chains of meaning. For example, let’s look at DJ Freelance Hellraiser’s 2002 musical mashup, “A Stroke of Genie-us.” This song popularized the ‘A versus B’ mashup. In this song, Freelance Hellraiser took the instrumental track from the Strokes’ “Hard to Explain” and recombined it with Christina Aguilera’s vocals from “Genie in a Bottle.” The resulting mashup is not a random compilation of the two songs, but a seamless integration of the two competing musical styles into a new mix. To accomplish this new work, Freelance Hellraiser restricted himself to the limited resources of the “A” text and the “B” text. This is much more in keeping with lipogrammatic constraint than the random rearrangement of the “cut-up.” In fact, the pop-worthiness of “A Stroke of Genie-us” would not have been accomplished if Freelance Hellraiser had randomly compiled snippets of the music and the lyrics together. The success of the mashup is in its conscious and organic embrace of its constraint.

DJ Dangermouse extended the A vs. B mashup on a grand scale when he created The Grey Album. Like “A Stroke of Genie-us,” Dangermouse’s choice of title self-reflexively puns on the constraint. Dangermouse remixed Jay-Z’s a cappella version of The Black Album with musical samples from The Beatles’ White Album—creating that collision of opposites, The Grey Album. The ‘A vs. B’ mashup follows the logic of what the Oulipo call “The Prisoner’s Dilemma,” a lipogrammatic constraint that forces writers to compose a story using a limited set of alphabetic characters. Moreover, the “A vs. B” mashup is—in an Oulipian sense—a “creation that creates” and can be used just as the sonnet form is by both professional and amateur poets. “A Stroke of Genie-us” and The Grey Album spawned numerous remixes, mostly from amateur remixers. The Grey Album inspired remixes using the music of Weezer, Pavement, Prince, Metallica, Radiohead, and the Wu-Tang Clan.

So far I have focused on how ‘formal constraint’ is another way to approach the ‘potential’ of remix culture. But there is another dimension at play here: the role of the reader (or the listener or the viewer or the interactant). The use of constraints to generate potential texts leads towards “potential criticism.” A fuller demonstration of “potential criticism” can be found in my forthcoming book The Maltese Touch of Evil: Film Noir and Potential Criticism, which I co-wrote with Shannon Clute. The book will be published this Fall by the University Press of New England.

Oulipian Harry Mathews states at the beginning of his essay “Mathews’ Algorithm”: “Potential reading has the charm of making manifest the duplicity of texts, be they oulipian or not.” (Oulipo 1973, 105) Mathews continues: “the resultants derived from these texts can be used to two different ends: either the ‘analysis’ of the texts put in play, or the creation of a new work” (1973, 105). As we have seen with DJs Freelance Hellraiser and Dangermouse, the resultants of remixed texts are ‘the creation of a new work.’ What might not be as readily apparent is how these resultants operate as an ‘analysis of the texts put in play.’ Here it is important to look more closely at how producing texts under constraint is both a creative and an auto-exegetical act, which explains why Mathews begins an essay about his recombinatorial algorithm with a discussion of “potential reading” and claims it makes manifest the duplicity of texts produced through constraint.

Shannon Clute and I have identified three crucial elements of constrained textual productions: (1) intertextual allusions to other texts; (2) self-reflexive punning (often marked by quirky humor); and (3) formal mathematical logics that allow the text to explore and ultimately map its own typology. Taken together, these elements literally enact potential reading and reveal the duplicity of texts. In my previous work on mashups and remix culture, Chuck Tryon and I began to develop the idea of critical digital intertextuality. Here, I extend my investigation into intertextuality by seeing it alongside self-reflexive punning and formal restrictive logics that together form “a geometry of auto-exegesis.” In other words, texts written or produced under formal constraint can literally “read themselves.”

These three crucial elements are a particularly rich vein of creative work for scholars of remixes to explore. As Georges Perec notes in his afterword to A Void, citational art, i.e., texts under constraint that contain intertextual allusions, honors and mimics traditions of punning and plagiary with longstanding roots in French literature, going as far back as the 16th century and the work of Rabelais. Moving our frame of reference back from 20th-century artistic innovations, Perec identifies key characteristics of recombinatorial play within the humorous, pun-filled plagiarism of stories like Rabelais’s Gargantua and Pantagruel and Sterne’s Tristram Shandy.

Examples of video mashups can illustrate this point. Consider mashups like “Vote Different,” which embeds Hillary Clinton in an Apple Macintosh commercial; “Endless Love,” featuring a musical duet between President Bush and Prime Minister Blair; or “Shining,” a remixed trailer for Kubrick’s The Shining as a romantic comedy. While all three remixes are quite different, they each form a geometry of auto-exegesis via intertextual allusion, self-reflexive punning, and the use of formal constraints. Moreover, they confirm Mathews’s main point that potential readings through remix have the ‘charm of making manifest the duplicity of texts, be they oulipian or not.’ These three remixes are quite duplicitous: they subvert the meanings of their original source texts, and do so using quirky humor and self-reflexive punning driven by restrictive procedures. Indeed, self-reflexive punning and humor are not supplementary to the remix, but an outcome of what Perec calls ‘citational art.’ Quirky humor and self-reflexive punning are how texts remixed with other texts pleasure themselves under constraint.

In conclusion, we can identify remixes that engage even more thoroughly in artificial Oulipian constraints, as in Lenka Clayton’s remix video, “Qaeda Quality Question Quickly Quickly Quiet.” In a remix such as this, the power of potential criticism demonstrates a potential political dimension as well. Using a constraint quite popular among Oulipians—the alphabetic list—Clayton reorders every word of Bush’s 2003 State of the Union address into alphabetical order. The resulting twenty-minute video is a new recitation of the State of the Union, literally word for word. Through the use of the alphabetic constraint, Clayton’s video enacts a potential reading of the State of the Union. Unmoored from their original semantic positions, the individual words are rattled off as a list that reveals the latent, even hidden, meanings of Bush’s speech. Clayton’s work demonstrates that potential criticism can move beyond playful combinations of pop culture and investigate and reveal the latent meanings and duplicity contained in the texts that constitute our civic and political lives. Clayton’s video calls to mind what Jacques Roubaud has said about Raymond Queneau’s A Hundred Thousand Billion Poems: “Its constraint is rather elementary, but its potentiality is spectacular.” (2004, 100-101; trans. Jean-Jacques Poucel)
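For readers who want to see the mechanics of Clayton’s alphabetic-list constraint laid bare, here is a minimal sketch (my own illustration, applied to a made-up sentence rather than to the actual text of the address): every word of the source text, duplicates included, is simply recited in alphabetical order.

```python
# Illustrative sketch of the alphabetic-list constraint: every word of a text,
# duplicates and all, is re-ordered from A to Z.
import re

source_text = "The state of the union is strong because the people of the states are strong"

words = re.findall(r"[a-z']+", source_text.lower())  # tokenize into lowercase words
alphabetized = sorted(words)                          # the whole text, re-ordered A to Z

print(" ".join(alphabetized))
# are because is of of people state states strong strong the the the the union
```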

Works Cited:

Gibson, William. “God’s Little Toys,” Wired Magazine 13.07 (July 2005) http://www.wired.com/wired/archive/13.07/gibson.html

Lowenthal, Marc. Raymond Queneau: Stories and Remarks. Lincoln: University of Nebraska Press, 2000.

Mathews, Harry, and Alastair Brotchie, eds. Oulipo Compendium. London: Atlas Press, 2005.

Oulipo. Oulipo: La littérature potentielle. Saint-Amand: Editions Gallimard, 1973. Trans. Shannon Clute.

Perec, Georges. La disparition. Paris: Editions Denoël, 1969.

––––––. A Void. Trans. Gilbert Adair. Boston: Verba Mundi, 1994.

Roubaud, Jacques. “Perecquian OULIPO.” Trans. Jean-Jacques Poucel. Yale French Studies 105, Pereckonings: Reading Georges Perec (2004): 99-109.

As We May Publish, Part Two: A Reader’s Reflection on Two Publishing Experiments

“Free their books and their minds will follow.”

–Masthead slogan for The Concord Free Press

1. The Experimental Reader

In yesterday’s post (“As We May Publish”) I discussed what authors might consider taking away from AAUP’s report on “Sustaining Scholarly Publishing.” My reflections were oriented around why authors should care about the changes currently underway at university presses. I also mentioned that my interest in this topic was being driven—to some extent—by my own authorial perspective: my experiments in open access publishing, my interest in alternative scholarly publishing, and my forthcoming university press book that has digital and database logics at the core of its critical methodology.

As a companion set of ideas to that post, I look today at two publishing experiments that came to my attention as a reader of a particular genre of fiction. With a background in English Literature and as a scholar of film noir, I read noir fiction and hard-boiled literature. That genre—coming out of the pulp magazines, the dime novel, and the comic book—has always been at the forefront of publishing shifts. Noir authors and noir publishers have tended to adapt to new business models while retaining (and even extending) their core thematic interests and stories. Through my interest in that genre, two experiments came to my attention that I don’t think are yet widely known in digital humanities circles. I bring them up as case studies that have piqued my interest as a reader who likes to explore publishing experiments—both come out of the “wild” category of the publishing ecosystem—and they help me think about the reader’s role during this moment of experimentation.

My first example is Concord Free Press, which gives away its books for free, literally. But that is only part of the story: Concord Free Press has a particular institutional mission that asks readers to make voluntary donations to the charity of their choice in exchange for a free book. The Concord Free Press calls its mission “generosity-based publishing.” My second example is Level 26, a new media publishing venture started by CSI: creator Anthony Zuiker. His Level 26 venture involves book publishing, a web community, and video productions, organized around what Zuiker calls the “digi-novel.”

Though my two publishing examples are very different, both share a desire to “free their books” and encourage their readers to “give back” in highly structured ways via the open Web. I am not using the word “free” as in “free beer” (one of my favorite lines from Lawrence Lessig in his book Free Culture) but “free” as in “free speech,” “free culture,” and “freeing” as in “liberating” or “having independence.” To me, these projects are interesting to consider in light of the ongoing conversations around the sustainability of scholarly publishing. These examples strike me as publication models that are taking advantage of digital affordances. They are asking new questions about the role of presses, the nature of the “book,” and the participation of readers in online activities.

2. Scott Phillips’s Rut and the Generous Reader

“If you took the tender portrait of a town in decline in Richard Russo’s Empire Falls, mixed it with Kurt Vonnegut at his most satirical and biting, then sprinkled in a few grams of meth and a generous shot of piss from a syphilitic hobo into the resulting solution, you’d have a drink that could almost put you on your ass as fast as Rut sure-as-shit will.”

–Review of Rut by Spinetingler Magazine

Scott Phillips’s most recent book, Rut, was published by Concord Free Press. Scott Phillips is best known for his novel The Ice Harvest, which was also turned into a neo-noir film starring John Cusack. Instead of going with a traditional publisher, Phillips is having his latest work distributed for free by Concord. What this means is that it does not cost anything to obtain a copy of the book; it is actually free (as in costing no money). However, there is a reason for this. Concord Free Press asks all readers who take possession of the “free” novel to make a voluntary donation to the charity of their choice. The book-as-giveaway is used as an incentive to increase charitable donations. And so far, the experiment is paying off. Concord Free Press has raised over $200,000 in donations via the books it has published.

Now, it is important to note that its books, including Rut, are not published as e-books or released as PDFs—they are only available as traditional paperback novels. In fact, in one of my favorite parts of this experiment, the back page of each novel has ten blank lines where each reader of the book is supposed to sign their name and then pass the book along to another reader. This decidedly analog approach to forming a network of readers is a great way to encourage an ever-expanding readership. The physical book operates metaphorically a little bit like a digital bit; it is not meant to sit on one’s shelf, but is always intended to be in transit to another reader. It’s a virtuous model of sharing: the reading circle as lending library.

How do online communities come into play here? Concord Free Press hosts a website and asks everyone who donates because of one of its books to log onto its site (www.concordfreepress.com) and note where they gave and how much. What one notices when visiting Concord’s website is how many donations are significantly larger than what a reader would pay at a bookstore for a 230-page novel like Rut. It is not uncommon to see donations of $25.00 and higher, suggesting that one act of generosity (giving away a book by an established author) can potentially result in a greater act of generosity (a donation in excess of the typical consumer purchase of a paperback).

ForeWord Magazine says that Concord Free Press “re-conceptualizes the very goals of publishing, a grand experiment in subversive altruism.” It is this aspect of the experiment that I want to consider most closely. While Concord’s “subversive altruistic” model will not necessarily work for all publishers and all authors (note that Concord Free Press is publishing the work of already established authors), it is worth considering how models of “generosity” can induce and support participation via the open Web. [As an aside, is this type of generosity akin to the free labor that goes into supporting a publication model like Wikipedia?]

At this point, the publication model of the Concord Free Press raises more questions than answers for me. Still, I want to explore the implications of this kind of experiment for scholarly publishing. What would the scholarly version of this experiment look like on the open Web? How could scholars benefit from giving away books for free and asking for audience participation in return? Would a model of “book sharing and lending” (which is also at the heart of Concord’s experiment) work for a scholarly book? Is the logical extension of Concord’s paradigm to publish its novels in digitally native formats and remove the need for physical book publication altogether? How might scholars locate funding to write books that encourage “subversive altruism”?

3. Anthony Zuiker’s Level 26 and the Active Reader

“Not a hint of this appeared in the mainstream press. This material was relegated to a bunch of serial-killer-fan web sites, the most active being Level26.com.”

–Self-conscious, metatextual reference from the novel, Level 26: Dark Prophecy (p. 117)

CSI: creator Anthony Zuiker is exploring the “digi-novel” in a series of crime stories centered on a criminal profiler, a “Special Circs” agent named Steve Dark. Two novels in the series have already been published, Dark Origins and Dark Prophecy. Each novel is supported by (even architected around) digital components and an online community. There is a website, Level26.com, where fans can meet up. There are videos that function as “cut scenes” interspersed throughout each novel. There are the Level 26 apps that bring the novel and its digital components together into one application. A quick disclaimer: I don’t think Level 26 is everyone’s cup of tea. The story itself can be quite gruesome (a bit beyond where even the CSI: TV series will go) and will mostly appeal to hardcore fans of Zuiker’s TV shows or mainstream readers of the serial killer/mystery genre.

Zuiker’s vision for the “digi-novel” seems to be a digital convergence between the storyworlds of television, web, and book. But up to this point, it still feels more like a group of parts than a converging, transmediated storyworld. For example, there are “cyber-bridges” that extend the story beyond the written word. You can go to YouTube to see examples of his cyber-bridges. Cyber-bridges are video segments that occur approximately every 20 pages in the book and can even be reassembled into their own one-hour movie. To support the viewing of cyber-bridges, Zuiker hosts a free online community, Level26.com. To encourage readers to buy the book, the cyber-bridges must be unlocked using a printed code found in the book. (Of course, from an archival standpoint, one wonders what happens to the book’s future readers when and if the website goes offline in a few years.) I tend to find that the cyber-bridges interrupt my reading rather than plunging me deeper into the story. Cyber-bridges too often operate like cut-scenes in a video game, but without the feeling that one has “leveled.” And there can be a jarring effect when the characters in your reader’s “eye” are fleshed out in the video segments. I make note of these issues to highlight that Level 26 is still in an experimental stage. Zuiker himself has written about the problems he has faced in making all the pieces of the Level 26 franchise work together.

Of particular interest to me is the community forming at Level26.com (which has grown as large as 100,000 members). This is a community that is built for fans and aided by fans, but was not originally founded by fans. In fact, “official fan sites” can frequently be problematic, especially if readers sense that the community is little more than a marketing gimmick for a movie, TV show, or book. One way Level 26 is addressing this concern is by encouraging fan participation on topics beyond the Level 26 novels and making the site a destination for fans of serial killer fiction in general. How successful that maneuver will be has yet to be determined, but there are dedicated community members already organized around the subjects of serial killers, crime detection, and CSI: fandom. There is no doubt that Zuiker has learned a thing or two about building “franchises” from his CSI: success, and that this project benefits from his position in the media industries. In addition, Level26.com hosts fan contests, has a section for reader suggestions for future novels, and has active commentary sections related to the books.

Level 26 also takes advantage of handheld, touchscreen devices and in the process encourages active readers to click, touch, and play with the text and its digital components as the story unfolds on the screen. Using iPhone and iPad apps, Zuiker can eliminate the printed book’s hybrid status (straddling the analog and digital worlds) and produce a single, unified digital work. As Zuiker writes in February 2011: “Years ago, when I started working on Level 26: Dark Origins, there wasn’t a device available to showcase my vision for what the Digi-novel could be. Now, with the release of the iPad, it’s time to unleash the Ultimate Digi-novel!” While the interactivity of Dark Prophecy as an app is still fairly rudimentary (and maybe not quite living up to the hype of being the “ultimate digi-novel”), one can nonetheless begin to see the promise of digi-novels as a mode of digital storytelling. Issues that have plagued early experiments in digital storytelling are still present in the app version of this novel. Beyond the interruptions of the cyber-bridges, pulling up electronic dossiers on characters or collecting evidence in the flow of a particular chapter can feel like a tangent from the main storytelling rather than a valuable hyperlink. But even with these critiques, I fully appreciate how Zuiker is experimenting with digital storytelling and taking creative risks.

What might Level 26 suggest for the future of scholarly publication? First, the use of cyber-bridges would not be interruptive in a scholarly argument the way it is in a fictional narrative. I can see the potential of having cyber-bridges in a film or media studies book that could embed videos right alongside the written argument. Second, I think Level 26 points towards existing scholarly work that is moving more towards a “tablet-based reading” protocol or towards the expanded role of “video” in our reading practices. Here I sense deep affinities to a project like Alex Juhasz’s recent MIT Press “book,” Learning from YouTube. For me, certain disciplines seem primed to continue these trends and experiments in scholarly publication, especially film, television, and new media studies.

I would love to hear about other publishing experiments along the lines of Concord Free Press and the Level 26 franchise.

As We May Publish: My Reflections on AAUP’s Sustaining Scholarly Publishing Report

This blog post is a response to questions posed to me via Twitter by Shana Kimball, Head of Publishing Services, Outreach and Strategic Development at MPublishing, University of Michigan Library (http://lib.umich.edu/spo). She asked me in a tweet: “Curious about what you think authors should take notice of in the AAUP report? How should it change their publishing habits?” My immediate reaction was: great questions. But I realized quickly that my response was not likely to fit into a tweeted reply.

Some background: Since last week, I’ve been commenting on a new report from an AAUP Task Force on Economic Models for Scholarly Publishing. The members of the task force, representing several well-known university presses, have written a report entitled “Sustaining Scholarly Publishing: New Business Models for University Presses.”[1] The focus of the report is on university presses as a “keystone species” in the scholarly ecosystem, and the report’s conclusions speak to the need for collaboration and experimentation at and among university presses. So, there is a lot to digest in the report for the presses themselves and for their future, but why should any of these issues matter to authors, scholars, non-publishing faculty members, or even university administrators?

First, while most authors probably know (or at least sense) that university presses are in a transitional period and facing severe challenges, the report provides background and case studies about what is actually happening behind the scenes at the presses and discusses some of the new publishing models and strategies that are already being tried. The report seeks to address the challenges of a changing landscape for university presses from “new technologies to new economic conditions to changing relations with stakeholders.” While some might suggest that university presses—like many other businesses or academic enterprises—will just need to adapt to new economic and publishing realities, the report notes that the long-term sustainability of scholarly communication is not solely a question for university presses to answer but, in fact, requires a larger conversation among all stakeholders in scholarly publishing, including authors, researchers, universities, funding agencies, and librarians.

And the stakes in the evolution of scholarly communication remain high for faculty members since university press publication remains one of the primary ways we establish our reputations as scholars. Publishing with a university press is a major criterion in many promotion and tenure decisions and is still a major conduit by which we review, edit, share, and archive our research and our scholarship among our peers and within our various disciplines. Therefore, I would argue it behooves us as an academic community to be very concerned about what is happening at university presses. To use the report’s “ecosystem” metaphor, authors are an important species in that system as well.

Moreover, change in scholarly publishing is already being driven by faculty—sometimes directly, sometimes indirectly—especially in the context of our everyday campus-based requests and needs. The scholarly ecosystem is deeply interconnected. For example, changes in how we access other people’s research (the demand side of the equation) are being felt by university presses (the supply side). If demand for PDF files rises, then presses need to consider not only how to meet that need now but also how to put in place economic models that will allow them to meet it in the long run.[2] In this regard, I would cite what Barbara Fister, a librarian at Gustavus Adolphus College, wrote earlier this year in Library Journal: “Times have changed. The migration to electronic formats untethered to physical libraries has already happened—so far as readers are concerned. How often these days does one encounter even the hoariest of scholars insisting that print journals in the library are preferable to electronic access at their desk? But the publishing system (and much of the prestige bound up in it) remains tied to archaic production processes, largely because librarians and their budgets have managed to throw together make-shift bridges to close the gaps between scholars and the resources they need. We’re good at making things work for our faculty, for now, and we do it quietly, making compromises behind the scenes. We’re not so good at meeting the needs of future scholars. They aren’t able to raise a stink, so we ignore them.” Faculty members are not neutral or disinterested parties in this conversation—we already impact the system.

Digital publishing realities are here today. This is no longer quite the horizon issue it was ten years ago. Many of us are already exploring and adopting open access (OA) models and utilizing open publishing formats and platforms. Therefore, I would argue that it is valuable to have some perspective on how open access (among other shifts) is impacting the university press system.

One reaction I had in reading the AAUP report and its growing body of comments is that a “healthy scholarly ecosystem” may very well look different depending upon whether you are a librarian, an editor at a university press, an author, a grant administrator, a member of a scholarly society, etc. We may find broad agreement on what qualities we would all ideally want in a “healthy scholarly ecosystem.” But in the comments I’ve read so far, I detect thornier issues when the conversation switches to different university press missions, press priorities, scholarly agendas, and economic realities (among other things). I believe the report’s strength is in starting these conversations now and trying to bring all the stakeholders to the table for the discussion.

But this set of concerns leads to Shana’s second question: “How should the AAUP report change publishing habits?” I have two responses to this question.

First, I think, for many of us, our publishing habits have already been changing. We look to university presses today for support of new experiments and new forms of scholarship. Here, I am definitely influenced by my own experiences in scholarly publication. I have been involved in alt-scholarly publishing since 2005 when I began using podcasting as an open access platform for my scholarship on film noir. That project has generated a forthcoming book, The Maltese Touch of Evil: Film Noir and Potential Criticism (November 2011, University Press of New England). The Maltese Touch of Evil engages in experiments around publication and reception, including considerations about how we might take advantage of digital logics to extend our critical methodology. Finally, I am also working this year to launch a new open access journal dedicated to multimedia work in film and media studies that utilizes open peer review. As I have experimented with new forms of publishing, I have found organizations like MediaCommons to be sites for collaboration, encouragement and support, and the AAUP report becomes another link in that chain. The report and its use of CommentPress is part of that community-building effort.

Community building around newer models of scholarly publishing should remain a priority in my estimation, since it is unlikely that a “one size fits all” publishing approach will emerge in the future. As the AAUP report acknowledges:

“The one evident conclusion that emerges from the various reports on the current state of scholarly publishing, as well as in the research undertaken for this report, is that no single new business model will replace the traditional print-based model. Rather, a mix of revenue sources will be required to sustain scholarly publishing in the future, and that mix is likely to vary for different kinds of publications.”[3]

Second, for authors who haven’t changed their publishing habits yet or are not particularly interested in doing so, I hope the report and its surrounding discussion encourage curiosity about what is happening in academic publishing in the digital age. I hope authors throughout academia want to learn more about the benefits of open access publishing, experiment with digital publishing projects, or talk with their editors about how to release their books in new digital formats. And even beyond publishing issues, I hope reports such as this one help educate senior faculty about the changes that are greeting the newest members of our profession: graduate students and junior professors. We can no longer afford to have overly conservative promotion and tenure committees that do not give adequate consideration and scholarly weight to new forms of publication. As university presses meet the challenges of a transformed publishing landscape, the projects that are likely to emerge will be like paper-based books in content only. As we ask our presses and our libraries to meet the challenges of digitally enabled scholarship, let’s advocate for faculty governance that will review and reward this scholarly activity appropriately.

Finally, I hope the report opens a few eyes about the new realities at university presses. Some of these changes are likely to be quite transformational and will be a major part of our evolving scholarly ecosystem for years to come. If you haven’t considered it before, maybe now is the right time to start asking how your next book or journal article might benefit from open access as the primary mode of publication, or how the content of your research might benefit from being published in a digital format.

I would also love to hear responses and reactions from other authors and scholars on what the AAUP report might mean for their publishing habits and needs.


[1] The full report is being published at MediaCommons.org using the CommentPress software. You can read more about CommentPress and its functionality here, but its key feature is that it allows readers to leave comments at the paragraph level and engage in a targeted and threaded discussion of the report’s findings and conclusions.

[2] In support of this contention, I think the report’s section on “Open Access as a Primary Model of Publication” is particularly illuminating, with its case studies of publishers associated with the National Academies and the RAND Corporation.