The papers discussed in our online seminars are often excellent, but this one is apocalyptic, as well as being an exercise in fine writing. Lee Drummond was once a Chicago anthropology PhD specialising in the Caribbean and later in San Diego's tourist attractions. He taught at McGill University for over a decade before retiring to the wilderness (Palm Springs!). July and August there are his winter, when he withdraws into his air-conditioned study to avoid the heat. The OAC is a principal beneficiary of this aestivation, as witnessed by the paper attached here. All I can say is you gotta read it, whether or not you participate in the seminar.
Lee's point of departure is Lance Armstrong's confession on the Oprah Winfrey show. The man who perhaps deserves to be known as the greatest American athlete ever admitted to taking performance-enhancing drugs, thereby triggering an intense public outcry. Lee deconstructs what he takes to be a key feature of the American ideology, the opposition of nature to culture, showing that biology and technology have been inextricably woven together throughout human evolution and even before. If the influence of technology on sporting performance cannot be disentangled from that of nature, what about other areas of cultural achievement, literature for example? Should Hemingway's Nobel prize be taken away or Coleridge's poetry eliminated from the canon because they wrote under the influence of mind-altering substances?
Not content with this reductio ad absurdum, Lee then launches into a savage critique of American civilization and of the cultural anthropology it has spawned. Drawing on Marx's happy phrasing in the 18th Brumaire, he argues that the American tragedy (New World genocide) now reappears as farce (reality TV shows), one of which actually replayed the former in a grotesque reenactment of the competitive ideal. Anthropology tends to celebrate cultural achievement around the world, whereas in Lee's view, the current state of American society suggests that culture may be a disease killing off its carriers just as their ancestors once killed off the original inhabitants of what passes for the land of the American dream.
Replies are closed for this discussion.
How would we ever know how many males vs. females, adults vs. kids, etc., read, watched or otherwise engaged with Twilight? And what would it tell us (other than that Meyer made loads of money)? That girls liked Twilight more than the superhero movies also popular at the time is a possibility, but what does that tell us?
Peter Wogan said:
But even if race isn’t a clear pattern in the audience, surely gender is, right?
How would we ever know how many males vs. females, adults vs. kids, etc., read, watched or otherwise engaged with Twilight?
If we were a major consumer products company or advertising agency we would commission Gallup or some other research organisation to conduct a large-sample survey. The results would likely be presented as maps based on correspondence analysis (a statistical methodology brought to prominence by Pierre Bourdieu). What we would see on the map would not be neatly bounded categories of the sort anticipated by Aristotle, the Scholastics, and students trained by the University of Chicago to ask, "How do you define that?" Instead the maps would show clusters indicating what sorts of traits, a.k.a. answers to the questionnaire, have higher probabilities of being found together.
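For readers curious about the mechanics, here is a minimal sketch of the correspondence analysis John describes, turning a survey contingency table into coordinates for a cluster map. It assumes only NumPy; the table of audience segments and answers is invented purely for illustration, not real survey data.

```python
import numpy as np

def correspondence_analysis(N):
    """Project the rows and columns of a contingency table N
    (e.g. respondent categories x survey answers) into a shared
    low-dimensional map. Returns 2-D (row_coords, col_coords)."""
    P = N / N.sum()                         # correspondence matrix
    r = P.sum(axis=1)                       # row masses
    c = P.sum(axis=0)                       # column masses
    # standardized residuals: departures from row/column independence
    S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))
    U, sv, Vt = np.linalg.svd(S, full_matrices=False)
    # principal coordinates: traits that co-occur land close together
    rows = (U * sv) / np.sqrt(r)[:, None]
    cols = (Vt.T * sv) / np.sqrt(c)[:, None]
    return rows[:, :2], cols[:, :2]

# invented toy table: three audience segments x three answers
N = np.array([[120, 30, 10],
              [ 20, 80, 15],
              [ 10, 25, 90]], dtype=float)
row_xy, col_xy = correspondence_analysis(N)
```

Plotting `row_xy` and `col_xy` on the same axes gives exactly the kind of map John mentions: not bounded categories, but clusters of answers that tend to be found together.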
Alternatively, these days, we would send web crawlers scanning for data, winding up with samples of millions instead of just thousands of records. The results, however, would be presented in the same fashion, maps showing clusters of traits/attributes that most frequently co-occur. Given the amount of online activity surrounding blockbuster movies and TV series these days, finding the relevant data would be pretty simple, basically the kind of thing that the NSA does already with online traffic, e-commerce, POS scanner, and CCTV data. Then, if we had, in addition, hired some smart young qualitative researchers, we could ask them to develop "personas," ideal-type descriptions, of the individuals found in these clusters.
Now we, as academics (some of us) or independent scholars (me) don't have those kinds of resources. Are we flat out of luck? No. No need to be hassling people with questionnaires that ask them to stereotype themselves — the old Census-based, race (white, black, Hispanic, Asian, Other), gender (male, female, LGBT), religion (Protestant, Catholic, Jewish, Muslim, Other) format. Instead, we can do what Louise's Student L has done so very well, go to places where hard-core fans assemble, online or off, open our eyes and look around us and listen to what people say. There's a word for that—ethnography. It doesn't take rocket science to observe that most of the fans attending Twi-con were female or note the gender implications of someone saying, "I like the way Twilight involves people of all ages, from girls to grandmothers [emphasis added]." When L notes that it costs hundreds of dollars to attend and people are still lined up to buy the merchandise from the vendors, the economic implications are at least highly suggestive. We might even whip out our smartphones and snap a few crowd scenes, then count the people in them at our leisure. Random sampling? Statistical validity? No. But, "eyeballing these shots we can see that nine out of ten are female, mostly X or Y instead of Z" (where X, Y and Z are racial, religious, or other stereotypes, e.g., Goth vs Lolita vs Urban Casual styles)? Not so very difficult, is it?
However, putting methodology aside, we might ask why it is that things are so complicated these days and, examining a real cultural phenomenon, when and why people reject stereotypes instead of embracing them. This is a subject I address at length in my book on Japanese consumer behaviour. One of the things I discovered examining the research produced by the think tank associated with the ad agency that employed me was that there was a clearly visible trend, in visual imagery as well as analytical comment, from confident assertion of typologies in the early 1980s to attempts to probe the motives of increasingly slippery consumers who no longer fit neatly into traditional demographic categories. Further research revealed that this was a global problem, at least in OECD (advanced industrial) economies. Easily defined mass markets were fragmenting everywhere, leading to all sorts of market research innovations, from adding psychographics to demographics, to increasingly fine-grained breakdowns in what was called geodemographic analysis (the micro-niche marketing approach used, for example, so successfully by the Obama campaign), to, at the current limit, one-to-one marketing in which companies like Amazon track each customer's individual purchases and use this data to suggest books or other items that they might also want to buy. I have suggested before, but will now suggest again, that everyone should have a look at Albert-László Barabási's Bursts [a short summary by the man himself can be found on Huffington Post].
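As a toy illustration of the one-to-one logic John describes, here is a minimal "customers who bought X also bought Y" recommender built on co-purchase counts. The purchase baskets are invented, and this is only a sketch of the general idea, not anything a company like Amazon actually runs.

```python
from collections import Counter
from itertools import combinations

# invented purchase histories: each set is one customer's basket
baskets = [
    {"Twilight", "New Moon", "Eclipse"},
    {"Twilight", "New Moon", "The Host"},
    {"Twilight", "Dracula"},
    {"Dracula", "Frankenstein"},
]

# count how often each ordered pair of items appears in the same basket
co_counts = Counter()
for basket in baskets:
    for a, b in combinations(sorted(basket), 2):
        co_counts[(a, b)] += 1
        co_counts[(b, a)] += 1

def suggest(item, k=2):
    """Top-k items most often bought together with `item`."""
    scored = [(other, n) for (a, other), n in co_counts.items() if a == item]
    return [other for other, n in sorted(scored, key=lambda t: -t[1])[:k]]

print(suggest("Twilight"))  # "New Moon" tops the list
```

Real systems replace the raw counts with normalized similarity scores, but the underlying move is the same: individual histories, not demographic categories, drive the suggestion.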
Coming back closer to our topic, however, I would also recommend Danah Boyd's It's Complicated: The Social Lives of Networked Teens. Boyd is, hold onto your hat, an anthropologist and, judging by the book, a very skilled ethnographer, employed by Microsoft (your student L might want to consider this kind of career). In one of her most compelling observations, she describes a high school where the students attend class in integrated classrooms where they are taught that old-fashioned ideas like race shouldn't matter. In the cafeteria, however, they break up into racially defined groups, whites, blacks, Hispanics, Asians, each of which also has its own clearly bounded turf in other parts of the school grounds. It is the latter pattern that persists in their online behaviour, where social media interactions also break down along racial lines. They are told repeatedly in class that "We are all human, we are all individuals." They know that it isn't polite to bring up racial differences in mixed company. But when it comes to behaviour in their segregated groups, on or off line—that's a different story.
Might be something to think about when asking how people think with or respond to what they see in movies.
I didn’t have anything specific in mind, I just wanted to invite you to share some of the most interesting insights and/or new questions to come out of your close-in work with specific Twi fans. The possible directions are wide open, especially since John, Lee, and I know so little about these fans and are eager to know more…
Louise, John, Peter, All –
The Pace of Things Meets Resurrection and the Light
(Courtesy of the Easter Bunny)
Your last Comment caught me off-guard, in a couple of respects. First, when you write:
Twilight...I can't believe I let you all drag me back into the morass that was Twilight! I don't think anyone on the planet is talking about Twilight except us. It is starting to feel like the resurrection of a cult (perfect at Easter).
. . . you make me feel my years, big time. It’s almost as sobering an experience as hanging out with my granddaughter. Since the Saga debuted in 2005 and Breaking Dawn, Part 2 was released just a year and a half ago, I thought that, as a cultural anthropologist (of sorts), I was being shockingly contemporary, even avant, in undertaking a cultural analysis of the phenomenon. I now discover that in the fleeting reality of the world of fandom, Twilight (and its anthropologist) is yesterday’s news. The kids have moved on, and whatever profound cultural truths the Saga might reveal are fast becoming stale, fit for consumption only by geezers still stirring the ashes.
The realization is not entirely unexpected, however. I did see a portent of things to come a few (light) years ago with the release of Avatar. That movie was hyped to the gills, every talk show host babbled on and on about it, including Yoda and Kathie Lee, totally wasted as usual on their Tequila Tuesdays morning show. When Avatar opened it exceeded even those wild expectations, rapidly climbing to the position it now holds as the Number One top box office movie of all time. Having thought and written quite a bit about movies (now dry as dust) such as Jaws, Star Wars, and E.T., I began to get interested in taking a crack at Avatar. Then, to my complete surprise, the movie disappeared from the 24/7 cable news circuit. Unlike those earlier “classics” that had fueled my interest, movies that became instant icons and stayed in the news and in dinner party conversations for months, Avatar was gone, superseded by the very latest thing to happen.
James Gleick, an excellent science writer and biographer, chronicled this phenomenon a decade before Avatar’s appearance, in Faster: The Acceleration of Just About Everything. Obviously, “everything” includes the popularity and obsolescence of major movies. If Gleick is right, and I think he is, where does that leave those of us who try as best we can to conduct a cultural analysis of the society (societies) in which we live? That, I’d suggest, is a mega-question everyone in the social thought business (from wasted ladies on morning talk shows right through all those big thinks swimming around in the world’s think tanks) needs somehow to take into account.
As always in our “sky’s the limit” forum, everyone – participants, lurkers, even (or especially) those elusive Romanian anthropologists – is invited to step up to the plate and swing for the bleachers. Is the pace of things – contemporary life – simply too fast to analyze? Can we do anything but grab hold and try to hang on while the roller coaster of human existence barrels down the track?
My answer? What keeps me thinking and writing about such evanescent things as major movies? It’s provisional, at best: Whether a social / cultural phenomenon lasts a week, a month, a year, or ten years, if it moves people, particularly if it moves tens of millions of people, it is worth the attention of the cultural analyst. The Twilight Saga may now be part of a rapidly disappearing past, but its residue – in print, film, video, individual minds – remains with us, is a part, however antiquarian it may seem, of our history. And “history” is a living thing, not the mute debris of the past. So, such things are “good to think” – with and about.
Louise’s reference to Easter, which is tomorrow in my time zone, touches on a highly improbable but, I think, intriguing relationship between Twilight and themes of rebirth and immortality. Perhaps this will help to address the “faster” problem. Reza Aslan’s book, Zealot: The Life and Times of Jesus of Nazareth, makes two very interesting arguments, only one of which is likely to be emphasized at tomorrow’s sunrise services. The unlikely argument is that the historical Jesus was by no means a “love thy neighbor,” “turn the other cheek” sort of guy, the beneficent image of Christian love proclaimed throughout the Western world. Instead, he was a Jewish terrorist, intent on the violent expulsion of the occupying Roman army and the restoration of Judea to its former glory as a theocratic state. The Temple priests of Jerusalem and the Roman governor recognized him as such and crucified him. Aslan’s other argument, the staple of those Easter services, is that the cult that began to form immediately after the Crucifixion was unique in the Judean world for its belief in the resurrection and immortality of the slain Messiah. In the decades before and after Jesus lived, Roman-occupied Judea was in political and religious turmoil. Jewish revolutionaries claiming “messiah” status in the exact mold of Jesus emerged from the masses, led violent movements against the established order, and were put to death. What distinguished Jesus’ failed rebellion from all the rest was that his closest followers, Peter and several other apostles, swore throughout their lives that they had witnessed his Resurrection. The historical Jesus was reborn as the Son of Man. And with that rebirth, Christianity was born.
Resurrection and immortality: two ideas with tremendous appeal, phenomenal staying power. They’re still very much with (a considerable number of) us, and will again assume prominence tomorrow, taking lead-off (if not exactly Breaking News!) positions on the cable news and talk shows. And although none of those talking-head commentators will mention it, resurrection and immortality are also dominant themes in another, not so long-lasting cultural phenomenon: the Twilight Saga. It’s up to the cultural analyst to make that connection.
I would suggest that even now, with Twilight long superseded in the popular imagination of fandom by the Hunger Games series and Divergent, its residue remains in the minds and personae of those tween and teen girl fans and former fans. Eternal life, eternal youth is not far from the concerns of those girls who, like Bella, wake each morning to find some premonition of their grandmothers staring back at them from the bathroom mirror. The multi-billion dollar cosmetic and plastic surgery industries attest to this central fact of contemporary life. As does the phenomenal popularity of a poorly-written tale of a teenage girl who above all else wanted eternal love and eternal life. As a vampire. Remember, with Easter past, Senior Prom will soon be upon girls who, like Bella, are high school seniors. Before the limo pulls up to whisk them off to the hotel ballroom and, later, the hotel suites above it rented by doting parents, suites where God only knows what will happen. . . before all that there will need to be, as well as the trip to the boutique and the salon, that first (?) appointment with the plastic surgeon for a nip here, a tuck there, perhaps, in desperate cases, a nose job. No suffering is too great to promote that chance at eternal youth, eternal love.
The connection may be even closer, even more macabre. Suffering under the surgeon’s knife is a dreadful price for a young girl to pay, but it is nothing compared with the days of agony Bella and every other vampire experiences during “the change,” the transformation from human to vampire. The biting vampire’s venom that courses through their bodies, slowly and agonizingly transforming every cell is, like the Agony on the Cross, the way, the truth, and the light to Immortality.
I did want to get to the other issue you mentioned – whether the viewer “identifies” with movie characters. Another deep one. But, as usual, I’ve overstayed my welcome (typical anthropologist!) and should cut this short.
I want to thank you for your marvelously clear and sophisticated account of the exciting things going on in the borderland separating that tried-but-not-so-true dichotomy, “qualitative” vs. “quantitative” research. Your long career in advertising brings it home in a way that “research methodology” texts don’t – at least any I’ve been able to read for more than a few minutes.
I confess that I was only on familiar ground when you began with a reference to “market surveys,” but then immediately found myself reading and rereading your discussion of “correspondence analysis,” probability cluster traits, “web crawlers,” ideal-type personae, and even tightly focused ethnography (courtesy of Microsoft!), all research strategies designed to identify those “increasingly slippery consumers who no longer fit neatly into traditional demographic categories.” Since I’m pretty much convinced that “social reality” is a very slippery fish, I’m intrigued that these so-called “objective” methods point to the same conclusion.
The closest I’ve come to implementing this kind of approach is hanging out (that is, lurking and voyeuring) in movie theatre lobbies and bookstore coffee corners to see who goes to certain movies or browses certain books and what they talk about. And, yes, that’s probably about as effective as hanging out in old-style, hard-drinking singles bars versus today’s practice of logging on to eHarmony.com and letting electrons do the rest (but here I don’t have any “data points” for comparison).
I’ll try to check out the references you cite. I’m particularly intrigued by Danah Boyd’s ethnography of a high school, where teachers recited the mantra of racial equality all day and students kept to their own ethnic groups in the cafeteria, school grounds, and online. Does this possibly remind you of something? Say, the AAA reciting the same platitudes in its “Statement on Race,” while outside its hotel meeting rooms race-inspired gang warfare raged in the streets?
I think the kind of vital, hybrid research you outline in your Comment has affinities with other areas central to anthropological inquiry. Remarkable advances in the science of genotyping are shedding light on the pre-historical movements of human populations and even on human evolution as a whole (our long-debated biological ties with Neanderthals). And equally impressive developments in MRI technology now make it possible to observe a number of human brain functions. The anthropology of “symbolism” and the philosophy of “other minds” may never be the same.
Anyway, thanks again.
In the Spirit of Easter: Jesus was a zombie, the Ricktator is going to get him, and Johnny Depp is Transcendent.
You all have written such thought provoking things in the last 12 hours but I will only be able to get to some of them in a roundabout way. Forget roundabout, I am just going to write thoughts as they come. Please forgive me (it's Easter after all) for not engaging in a direct conversation.
Lee says, “It’s up to the cultural analyst to make that connection.“
I believe that is our job, to make connections that we, as anthropologists, are aware of that others just take for granted. I always thought that was the definition of anthropological research, not in an arrogant way, but because we are foolish enough to pay attention to the patterns, the little things, the seemingly innocuous. I remember when I was doing my dissertation fieldwork and a friendly academic from a local college advised me that I should just give up what I was doing because everything in the community was based on money. The disruptive behavior I was studying cost the community money and that is why they hated the misbehaving women. Needless to say, I ignored him and, lo and behold, the women (anti-nuclear protestors) actually brought tons of money into the community. The reason the locals hated the women was because they were challenging in very vivid ways the category of Woman. The point: only we would pay attention to the hundreds of little things people said and did that could be put together into a story very different from the one the economist told.
So, your work on Twilight in the past few weeks isn’t really just about Twilight. It’s about how groups of people come together to do what? Share a story that helps them do things, think things, be things, and tell other stories. I was talking about the Twi-fans but as I reread that, it is also talking about the people involved in this discussion. As anthropologists that is, I think, what we study as well as what we do. You can stick with Twilight or go onto another subject, you will be hopefully doing the same thing. And that, I think, is good.
Movie goers do exactly the same thing, whether it is a blockbuster movie or not. I sympathize with Lee’s lament that the movies he so carefully analyzed are not necessarily the ones people are talking about today. But in some ways they are, since references to E.T. (“phone home”) and Jaws (“You’re gonna need a bigger boat”) can be heard every week in the news, in magazines, online, in conversations, in classrooms. Casablanca, The Godfather, Independence Day, Jurassic Park, etc.: if you listen, they are being mentioned every day. I still hear endless Arnold Schwarzenegger references.
To look at any movie in isolation is to miss the point of the movies. Roughly speaking, there have probably been about 60,000 feature films released in the US (I can’t get a good number on that). That’s 60,000 stories, many of them similar, that we can USE to make sense of something else, to test out new friends (my son and all my students talk about how they will arrange a screening of their favorite movie to see if a potential new friend will fit in), to be an anthropologist for a few hours and visit other worlds. I am reading book 4 of George R. R. Martin’s A Song of Ice and Fire (a.k.a. Game of Thrones). Here is what a character thought today about books: “He understood the way that you could sometimes fall right into them, as if each page was a hole into another world.” That is what I think movies are and it is up to us to demonstrate that.
If we look at movies the same way that any other discipline does (I am looking at you, Cinema Studies and English Departments) then we are not doing anthropology. No one needs close readings of movies from us. They need something else. They need us to make connections, and not just to reality but to the ways stories are used. But if you don’t start with a movie as a story, then none of what I am saying makes sense. By the way, the Academy Awards in the last few years have shifted from calling the movies works of art to calling them stories. Just saying…
Example: today I went to see the new Johnny Depp movie Transcendence. You may think I went because I have written a book on Johnny Depp and so wanted to keep up with his oeuvre. Not true. I hate most of his recent movies (except Pirates). You may think I went (as I posted on Facebook) in honor of Easter. No, that was just a joke. I actually went because I was out shopping and it was too early to go home (no one there) and if I went to the theatre that Transcendence was playing in, starting in 20 minutes, I could go to Whole Foods afterwards and get some fish for dinner. That won’t come up in a purely textual analysis. Want an analysis of Transcendence? Well, Johnny Depp, a brilliant computer scientist, gets shot so his wife and best friend load his consciousness into a computer; he takes over the world, and even when he is just a computer image his wife still loves him. Wait, isn’t that just like Scarlett Johansson last year as a computer program in Her when that crazy actor (what’s his name, the one with the scar on his lip, oh yeah, Joaquin Phoenix) falls in love with her? And then it recalls the dozens of other movies where computers take over the world or people love their technology. Transcendence will disappear from most people’s movie radar pretty soon (I liked it but I doubt it will be popular) but the story, because it is one we tell over and over, lives on in the connections we make to it. That connection can just be a magazine article about Depp or one of his endless TV interviews, but it is still a connection. Or it can be a connection to the NSA (as most print media reviewers are saying). But it is still a story that goes back to those mythological themes, especially: what does it take to be human?
What does that have to do with Jesus being a zombie? Well, super healthy Whole Foods didn’t have marshmallow peeps, which are essential for making peep scenes (which are often about the latest blockbuster movies), so I will have to go out tomorrow, on Easter, and hope there are some left (not to eat, to make a scene). Last year I did a zombie peep scene, and anyone into zombies knows the story of Easter in which Jesus dies and rises from the dead, so zombie-like. Rick, of Walking Dead (the most popular show on cable), wouldn’t care if Jesus were the Savior or not, he’s a zombie and Rick will eliminate him. But wait, Walking Dead is on hiatus so I have to watch Game of Thrones Sunday nights. But wait again, it’s Easter all over again because Catelyn has died and then been resurrected. She’s walking dead. Thank you Jesus, thank you Jesus, thank you Jesus (a line from Dante’s Peak, revised for Easter). Is this a rant? No, this is what it would look like if we could get people to talk about the connections they make between movies and everything else. Wouldn’t that be fun? Or just too weird?
Happy Easter! And welcome to the longest running thread in OAC (or almost any known online forum's) history -- well on its way to the 1000th post.
Where once there were five, now there are two (plus you). The first 800-or-so posts came mostly from Keith, myself, Huon -- all of whom have moved on -- and John and Lee, who still aren't satisfied.
Lee fancies himself a "Nietzschean," by which he means he would like to get rid of the "humans." For someone who uses far too many "quotes," it struck me as curious early on that Lee puts "humans" in them. He is fascinated with all things not-human, including the genetics of pre-human biology and, of course, vampires. Thus, the topic of Twilight. His hope is that Twilight fans signal a shift away from "humanity." As you have aptly illustrated, they do not.
John is a Japanese advertising man whose primary interest is taking people and "seducing them of their affections" (his language). As all those in his business know, the "best" sort of person is a young woman, since they grow up to become "shoppers" and, thus, the life-blood of ad-driven commerce. Thus, again (he hoped), Twilight would tell us something about their "affections." According to John, anything is permitted -- short of actual *rape* of course. Mind-rape via psychological warfare is totally within permissible bounds. Indeed, if the mind were treated like the body, then how could we have any advertising at all in our times of non-violence?
Both John and Lee are "lapsed Baptists." They are what Hilaire Belloc described as "neo-pagans" (in his 1928 "Survivals and New Arrivals") and have rejected their ancestral religious heritage. John has adopted a Taiwanese itinerant version of Daoism and Lee pines for the Dionysian pantheism of the pre-Socratics. Yes, there has been a lot of "biography" shared in these many heart-warming posts.
Me? I'm a Catholic who will shortly be heading to a Latin Mass in a small chapel in Queens -- where I will be performing "cannibalism" on some "zombie" flesh. <g> I am also organizing the academic Center for the Study of Digital Life, where I will be hiring some anthropologists to help us to understand what *digital* technology is doing to the HUMANS. I joined this group to perform some ethnography on the ethnographers and it has been highly instructive.
Causes: As we've known for 2500 years, there are four of them. Formal. Final. Material. Efficient.
The one that survived (on the surface) as a result of that marvelous "skeptical" attack on all things Catholic (and Aristotelean) generally known as the Enlightenment is *efficient* cause. This is crucial for engineering and it's the driver of the Industrial Revolutions. Nowadays, as you have noted, cause-and-effect isn't quite what it once was. Statistics has replaced "causation" with "correlation" and the underlying science-of-logic has taken quite a blow from the work of Kurt Gödel (and others). RIP "efficient" causality!
That leaves us without any "causes" to support -- or does it? Not really. We humans love our causes!
The one that seems most popular today is *material* cause, as expressed in terms of "complexity" and "emergence." Typically cast in terms of a "loophole" in the 2nd Law of Thermodynamics (an earlier but now obsolete formulation of efficient causality), emergence is the focus of a high-school curriculum known as "Big History" (funded by none other than Bill Gates and very popular among the Russian neo-Cosmists) and the topic of a growing library of popular and even philosophical works, such as Manuel DeLanda's "Philosophy and Simulation: The Emergence of Synthetic Reason." John likes all this.
Slightly less popular is *final* cause -- the basis of all religious fundamentalisms, as well as romantic and utopian thinking. We still have many versions of Teilhard's "Omega Point" and Marx's "communism" running around. Of course, let's not forget that we also have the Singularity! And, the WAR ON TERROR -- which is actually one group of end-of-the-worlders (the "evangelical" Protestants) fighting another group (the "evangelical" Muslims), just as Sam Huntington (and Arnold Toynbee before him) said they would. Lee likes all this since it promises to end this "fallen world."
I prefer *formal* causality -- which is how "environments" operate. As an epigone of Marshall McLuhan, I believe that the "environment is the message" and that communications technologies (e.g. "media") constitute those changing environments. Yes, the Latin Mass that I'm heading to is also one of those "environments," or, following McLuhan (who was also a Catholic), a very powerful counter-environment. In *formal* terms! Accordingly, I am very interested in the *digital* "surround" in which we now live. John has no idea what I'm talking about (being a material cause sort of a guy). Lee has cooked up the notion of a "semiospace" but is also apparently baffled by my discussion of "formal cause" (being a final cause sort of a guy).
Just as well. I suspect that both John and Lee are the sorts of people that the Center will be studying (instead of employing), so the OAC has saved me a lot of trouble.
Good luck with your peeps and Happy Easter!
Mark Stahlman/Jersey City Heights
So, your work on Twilight in the past few weeks isn’t really just about Twilight. It’s about how groups of people come together to do what? Share a story that helps them do things, think things, be things, and tell other stories. I was talking about the Twi-fans but as I reread that, it is also talking about the people involved in this discussion.
Louise, this is very shrewdly observed. I can see that Student L takes after her teacher.
I should confess that, unlike you and Lee, I am not a movie fan. I rarely go to a movie theater unless I am in Virginia, when I sometimes go out with my son-in-law to see one of the superhero action films that he is into or go with my daughter, son-in-law and the grandkids to see something Disney or Pixar-like en famille. The movies I see are mainly those that pop up on the airline entertainment systems when I am on international flights with several hours to kill.
My personal obsession is the question, "How do we know that?" It has led me from being a smart-assed kid growing up in a religious family, disillusioned by the gap between what people said they believed and how they sometimes behaved, to philosophy of science as an undergraduate, then anthropology in graduate school, and a doctoral dissertation on Daoist magic. My business career has added to this a strong awareness of how often decisions to act as if certain things are known are constrained by deadlines, political or economic pressures, or people simply running out of steam and saying, "Let's go with this. It's good enough." It is from this perspective that my comments emerge. When it comes to examining particular films, I will have to rely on you, Lee and whoever else chimes in for details to stimulate my grey cells.
Thanks, by the way, for mentioning peeps. I had no idea what they were. Wikipedia came to the rescue. I will now seem less hopelessly out of it should my grandkids bring them up.
He Is Risen! And boy is he pissed. Or, if not exactly pissed, then prickly as ever. Welcome back. As one of your favorite seers, Yogi Berra, said, it’s déjà vu. . . And “Happy Easter” to you.
But after your extended absence, imagine my surprise. There I was, beginning the long, painful struggle into consciousness that is morning, starting on my coffee and Captain Crunch, firing up my creaky old computer, when, Voila! There you were, heading off to Latin Mass in a chapel which, despite its modest size, doubtless has the Formal Cause emanating from its altar. I, on the other hand, can look forward to Martha Stewart-designed Easter eggs, a grandchild now tweenishly bored with an occasion she used to delight in, and a not-so-amused cat in bunny ears. I don’t even have one of Louise’s marshmallow peep shows to brighten the day. This land, our land, is truly a plural society.
But since this is a day our plural society (or some of its pluralities) reserves for thoughts of death, rebirth, cleansing of sins, and salvation, let me respond to your thrusts and parries with what, for me, is an uncharacteristic openness. After all, isn’t confession the only way to receive His Grace?
First, to edge into this Confessional booth, let me provide a bit of background and housekeeping regarding my activity on the OAC and my own think tank (not so well-funded as your Center for the Study of Digital Life), the Center for Peripheral Studies, www.peripheralstudies.org . Before I uploaded my Lance Armstrong essay to the OAC website last August, I had never participated in any “Web 2.0” online interaction (except email, of course, which I take it doesn’t count). I don’t “friend” on Facebook or “tweet” on Twitter – the very terms and implications grate on my nerves. In short, I try, feebly and ineffectively I know, to preserve the last shreds of my privacy in your New Age of the Digital Environment.
At the other extreme of this whole writing-down-your-ideas-for-others-to-read thing, that is, actual publication in journals and books, I haven’t tried that for years. After knocking on enough closed doors, one eventually gets the message. That’s why, when I came across the OAC, it appealed to me as a promising middle-ground arrangement: The site is dedicated to anthropology, rather than to swapping recipes, photos, or spouses, and features a raft of thoughtful pieces on the subject. And what truly astonished me is that you can write something, upload it to the website, and, lo and behold, there it is! for anybody who wants to read it. No editorial committee, no “pee-you review,” no waiting months or years to see your thoughts in print. And, again completely unlike formal publications, anybody who wants to can respond to it, right away, and get that response entered alongside the original. The more the merrier. Such a marvelously simple, incredibly powerful idea. Keith Hart deserves huge praise for what I’m sure are thousands of hours and dollars he’s given to this project which, as far as I can tell, is unique to anthropology. He’s clearly the Daddy Rabbit of Bunny Day (bonne idée) and all the other days since the OAC began several years ago.
My participation in the OAC did involve a lot of learning-while-doing education. Although I’d seen the terms, I had no first-hand knowledge of “blogs,” “posts,” “threads,” or, particularly, of the OAC’s distinctions among “Comments,” “Forums,” “Featured Blog Posts,” “Latest Blog Posts,” and “Groups.” Keith was very helpful and patient in getting me oriented to his website world. Once underway, responses to the Lance essay were quick and numerous – far more than I’d expected, but then, again, I had no basis for comparison. I immediately found myself in the e-company of experienced bloggers, posters, and the like who, like Keith, were tolerant of my neophyte ways. I should note in passing that you’ve miscounted the current participants in the Forum – neglecting Peter Wogan, who is apparently another newcomer to online discussion groups and, a truly damning flaw, an anthropologist who breaks all the rules and writes about popular movies (see his excellent Hollywood Blockbusters: The Anthropology of Popular Movies).
Something I do find puzzling about the reception accorded my Lance essay is that the piece – and I’m not at all embarrassed to admit it – is a topical and rather sketchy attempt to flag a few issues I saw arising from the scandal surrounding Armstrong. You and John have not been hesitant in pointing out its shortcomings, and in my defense I can only say that I did not intend it to be a comprehensive analysis of anything, particularly American society / culture writ large. It was unapologetically polemical, an exercise in thrust-and-parry anthropology (I believe you’re familiar with the approach). I continue to be perplexed, and even a bit wary, that a discussion of the ramifications of a bicycle racer’s failed career, which has morphed (improbably) into another discussion of teenage heartthrob movies about vampires and werewolves, should garner such attention. The OAC is poised to take on much more important, much deeper issues, such as, oh, say, whether cultural anthropology might possibly want to develop a theory of culture (perish the thought!) or whether we might postpone the goody-goody advocacy of politically correct causes for a while and stare right into the face of an abyss: Is the human species inherently violent and aggressive (are we Chagnon’s “fierce people”), or did it come by that behavior late in the human career, so that earlier on we were Rousseau’s pacifists in fig leaves (Sahlins’s “affable savages”)? If improvement is needed in the OAC, I’d suggest that members come forward – and only they can do it; Keith can only do so much – and engage the community on these or similarly pressing topics. Hey, it’s only a click away. Go for it!
To try to tie my OAC experience together with one of my past lives, I think its discussions represent the best possible sort of graduate seminar. Most such get-togethers, as I recall, did not have the quality or detail of exchanges on the OAC. Peter and Keith, who are still hunkered down in those trenches, might want to revise and update my faded recollections. The basic idea, though, is that someone presents a reasoned argument, someone else makes a reasoned response, then that gets passed back and forth around the seminar table until, at the end, everyone’s understanding of the topic at hand is broadened, deepened, turned in new directions. Regrettably, in the real-life seminars of memory there were several barriers to this give-and-take of ideas. First, there is usually a status difference: a professor and students or, at least, a convenor who directs traffic. And in the former case, those students may well get graded on what they say (or don’t say – lurkers beware!). Nothing inhibits discussion like a grade. Second, even if things proceed in as egalitarian a manner as possible, everything happens on the fly. Apart from an initial presentation, which may or may not happen, one isn’t supposed to drone on for fifteen or twenty minutes. And third, it’s impossible to stop and check on the spot references to this or that work, this or that thinker, this or that historical event. Participant A says something like, “Lévi-Strauss argued such-and-such,” and Participant B responds with, “Yeah, but in another place he wrote something else,” or “That passage actually meant. . .”
What greatly appeals to me about our Forum is that it is possible for each participant to read what others have written, take time to think about it, check the references and search for other, new references that occur to them, then respond in an unhurried, deliberate way. When you refer to me as “someone who uses far too many ‘quotes’,” I guess I should ask, “Too many for whom?” I think it helps a lot to focus discussion if I provide a quote by another participant or an outside reference which keeps to the specifics of what is under debate. Otherwise, it’s too easy to drift into stereotyping, into “He said, she said” standoffs.
To transition from housekeeping to the furniture itself, let me – ah yes! – respond to your quoted remark:
Lee fancies himself to be a "Nietzschean,” by which he means he would like to get rid of the "humans.” For someone who uses far too many "quotes,” it struck me as curious early on that Lee puts "humans” in brackets. He is fascinated with all things not-human, including the genetics of pre-human biology and, of course, vampires. Thus, the topic of Twilight. His hope is that Twilight fans signal a shift away from "humanity." As you [Louise] have aptly illustrated, they do not.
Almost everything you write here is true, with the crucial exception that I do not at all want “to get rid of the ‘humans’” considered as the ongoing process of humanity. An anthropologist would have to be pretty self-destructive to want to do away with the basis for his professional existence. However, I’ve declared from the beginning that my principal interest – how I find it possible to identify and treat a phenomenon – is to pay close attention to its boundaries. Here I’ll compound my sin of over-quoting by resorting to the last resort of the academic scoundrel and quote myself. My little virtual salon, the Center for Peripheral Studies, is, above all else, about boundaries, as I try to make clear in what passes for a “mission statement.”
The Center exists to explore boundaries and their interconnective or intersystemic properties: boundaries between individuals, between human groups, between humans and animal species, between human and extraterrestrial species, and, ultimately, between human thought and physical reality. The perspective or bias that inspires these explorations is that the essence of a person, group, species, idea, or object is its edges: the interactions or intersystems it sets in motion in the process of being.
I simply don’t regard it as contentious – as some sort of radical philosophical or ideological stance – to recognize that We, “humanity,” have come from something that was Not-Us (the animal world) and, barring our own annihilation or the inconvenient comet, are going toward something (which, since I haven’t a clue what that might be, I’ve simply labeled “Something Else”) that is fundamentally different from Us. On an ethical level, I do recoil at the horrors “the humans” have wrought on the project of “humanity,” but since we happen to be the only game in town (unless E. T. shows up), we’re stuck with Us.
Since this is a Confessional, it may ameliorate – or not – the heady, cerebral sound of all this to let you and other readers in on a little joke, an incident of what Geertz might have called “deep play”: I came up with the name “Center for Peripheral Studies” as an offhand joke, a little wordplay years ago (in the mid-1990s) during a phone conversation with the then editor of Anthropology News. She was a bright, perceptive person (I could tell because, uncharacteristically, she was publishing my stuff), and in that phone call she was trying to be ever-so-polite in asking me (she needed a biographical blurb for my latest masterpiece) if I had a real job, something that would lend at least a veneer of academic respectability to my Mad Hatter ideas. In a flash, I replied, “Well, I am the director of the Center for Peripheral Studies.” She got a hearty laugh out of that but, amazingly, included it as my “title” in the bio. Hence was born that august research institution.
Continuing on the subject of biography, you’ve generously supplied John and me with thumbnail sketches:
Both John and Lee are "lapsed Baptists." They are what Hilaire Belloc described as "neo-pagans" (in his 1928 "Survivals and New Arrivals") and have rejected their ancestral religious heritage. John has adopted a Taiwanese itinerant version of Daoism and Lee pines for the Dionysian pantheism of the pre-Socratics. Yes, there has been a lot of "biography" shared in these many heart-warming posts.
If I ever come to dusting off my decades-old C. V., I’ll be sure to include “Neo-pagan Dionysian Pantheist” in my latest job description. Think that might land me a job at Harvard? Perhaps not.
In any event, since this sacred day is winding to a close – my tween has gone back to her Minecraft and my disgusted cat has slunk off somewhere – let’s get down to basics. As a very good friend from my Guyanese days once said, “I’ve only heard truth spoken over an empty bottle.” Meaning, of course, that all the rum shop bluster of grand exploits dies away and the guys around the table, with no more rum to fuel their fantasies, tell it like it is.
Still in that Confessional booth, I will “tell it like it is” or, at any rate, as I see it (you’re free to thrust and parry as you wish).
As you know, I am stridently anticlerical. I am appalled at the cruelties, the atrocities “religious” people commit in the name of God. And, yes, that includes your own Catholicism – “Formal Cause” or not. I simply find it incomprehensible that you could defend the endless atrocities committed in the name of the Christian God. But I am ecumenical: I levy the same charge against those Muslims who believe they have found the Truth and don’t mind stoning women to death or setting off bombs in pizza parlors filled with kids to advance their Holy Cause.
From that bit of knowledge about me, it is an easy step to identify me as an atheist. I am not an atheist. I am not religious enough to be an atheist. The atheist believes in something, in the proposition that “There is no God.” Speaking just for myself, I find that assertion, that outlook on life, ludicrous. Certainly, I don’t believe that God is an old white guy with a long beard, dressed in a toga and reclining on a cloud. All I have to go on is an endless series of question marks: What is this about? Is there any meaning here? And you can’t believe in a question.
But who am I, or you or anyone, to pronounce on the forces that may orchestrate the Cosmos? After centuries of scientific work and thought – which your Catholic Church impeded every step of the way (Hey, what’s a few excommunications, a few house arrests, a few autos-da-fé if they mean keeping the faithful in line?) – we’ve extended the known Universe beyond its terra-centric and heliocentric boundaries (yes, again) to some thirteen billion light years in every direction. And that’s just our hopelessly provincial take on things. The Cosmos, not the Universe, probably goes on and on – we’ve just seen a little patch of it – and may even contain such exotica as white holes, parallel universes, M-brane strings, ten or twelve dimensions of spacetime, and anti-matter inflation. Haldane nailed it long ago: “The Universe is not only queerer than we suppose, it is queerer than we can suppose.” Given that pronouncement, how can anyone possibly assert that established religion – or any system of thought – captures the way things are?
By far the most profound statement I have seen of this idea, and the one I subscribe to in part – it may count approximately as my “religion,” if you like, since today is a highly charged religious occasion – was made by Charles Darwin. Darwin, reviled by believers from his day to our own as the Anti-Christ, the Great Atheist, was nothing of the sort. A friend, troubled by the implications of the theory of evolution for his own faith, wrote to Darwin, asking if in all his years of studying countless organisms he perhaps saw some evidence of a grand design, of God’s hand in nature. Darwin replied (yet another quote!):
With respect to the theological view of the question. This is always painful to me. I am bewildered. I had no intention to write atheistically. But I own that I cannot see as plainly as others do, and as I should wish to do, evidence of design and beneficence on all sides of us. There seems to me too much misery in the world. I cannot persuade myself that a beneficent and omnipotent God would have designedly created the Ichneumonidae with the express intention of their feeding within the living bodies of Caterpillars, or that a cat should play with mice. Not believing this, I see no necessity in the belief that the eye was expressly designed. On the other hand, I cannot anyhow be contented to view this wonderful universe, and especially the nature of man, and to conclude that everything is the result of brute force. I am inclined to look at everything as resulting from designed laws, with the details, whether good or bad, left to the working out of what we may call chance. Not that this notion at all satisfies me. I feel most deeply that the whole subject is too profound for the human intellect. A dog might as well speculate on the mind of Newton. Let each man hope and believe what he can.
— Charles Darwin, Letters [my emphasis]
The Ichneumonidae are a large family of parasitic wasps whose females embed their eggs in the bodies of living caterpillars. The eggs hatch and the wasp larvae begin to feed. Have you seen Aliens? Is a Formal Cause at work here?
Anyway, Happy Easter (our day had to be better than His)!
Thanks for the "confession" – it underscores the fact that I have a problem. I need to find people who do NOT think like everyone else.
Being "appalled" at what passes for "popular history" is *exactly* what I don't need. <g>
Why do you think that you think so much like everyone else? (And, is that a proper anthropological question?)
Imagining humanity to be a "process" is commonplace. Longing to meet an E.T. is an everyday sentiment. No need to study anthropology. No need to have met Prof. Geertz.
I would never have imagined you (or Darwin, whom, as an evolutionary geneticist, I have also read) to have been an atheist. Your "agnostic" views about these things are the overwhelming average view of people in your age/education/cultural cohort. Nothing at all out-of-the-ordinary.
What is it about the environment you share with that cohort that makes you (and so many others) "think" the same way about these things?
And, given this phenomenon, how might you suggest that I try to find some people who "think different"? Find another environment?