I recently commented on the Australian Anthropological Society mailing list that universities seem to be fast becoming the worst places to produce academic scholarship as the trend towards metrics takes over. I received responses adding up to "It's not just you that thinks this... but where else do we go?"
Academic jobs are increasingly won and defended by playing into a system of metrics that measures scholarly output by ranking journals and so on. But, as we all know, this increases pressure to publish prematurely and decreases overall quality of what's in print. As a result, the production of rigorous scholarship within universities appears to be more of a luxury than a default practice.
Many academics are campaigning hard to change this, especially by working on issues such as job security and teaching workloads. There are debates about university business models, journal business models, and how these may be changed to accommodate a slower, more reflective production of scholarly work. However, by most accounts, the metric trend looks set to continue into the foreseeable future.
If universities aren't listening to academics or their unions, what can we do? Are there ways to side-step universities? After all, not everyone depends upon tertiary institutional channels to get work done.
Alternatives for individuals could include:
The problem with individual strategies is that they all require compromise and do little, if anything, to improve academia's position. We need to convince funding institutions that the independent and rigorous production of academic scholarship matters. Sciences such as biology and physics have cemented their position by producing tangible discoveries and engaging the public imagination. How important is it for the social sciences and humanities to do something similar? Will this increase our bargaining power within universities? And if we can't, where should we go?
In 1973, the British universities were invited to be part of a review of higher education that would include polytechnics, further education colleges and the like. The Association of University Teachers refused to join in on the grounds that professors were on the elite civil list, which included the royal family, judges, admirals, senior civil servants etc. The others got a 25% pay rise and we got nothing. The next year, university salaries were frozen at a time of 25% inflation -- effectively a 50% cut in two years. It should be said that the academics then treated administrators with contempt, like servants. In Manchester during February 1974, in freezing weather, the maximum temperature allowed for our offices was 58F. The typists were working with gloves on. I had occasion to go to the main administration building and the temperature inside was 74F. I knew then that we were in for a war that we would lose. The administrative class did not win all by themselves. The neoliberal turn of 1979/80 lent the power of the state to the corporatization of universities and we have been losing ever since. In Thatcher's case, taking out the universities was similar to her emasculation of city finances, the judiciary and other sources of decentralized power.
I want to insist, however, that the academics have themselves to blame for the situation described by Erin, Fran, Ryan and others. It is not just an issue of overweening bureaucrats supported by neoliberal governments and corporations. It's our fault too. And I have spent several decades trying to work out why. I have long observed that the most dehumanised, impersonal and exploitative labour markets are in universities, not in government or business. In the latter, you are more likely to be treated with consideration. The barbaric hiring practices of academia are legion. Why?
One hypothesis is that academics think of themselves as being detached from money and power, a sort of Brahminical residue in the caste structure. They therefore see no reason to moderate their treatment of vulnerable human beings, because they are not in a market or politics. Another way of putting this is that academics feel able to indulge their inhumanity because there is nothing really at stake, whereas the others inhabit institutions where it is known that there is. I realise that this is winging it, but the question is rarely if ever asked.
Another possible explanation is the private nature of intellectual work. There are few other walks of life where anti-social behaviour can be excused by saying "I have to write a paper". This individualism was once moderated by an informal culture of sharing which has been eroded by our swallowing the logic of intellectual property in the rat race for competitive personal advancement. Like Paul Stoller I have watched the last shreds of cooperation and communal life evaporate in the last few decades.
Finally, when I taught in Cambridge in the 80s and 90s, we had a rule that only 2 out of 11 department members could be on leave at any one time, so that the burden of administration and teaching would be fairly shared. Now the established professoriate take leave whenever they can be bought out and the administration happily replaces them with poorly paid and precarious adjuncts. In one department I know personally, three junior lecturers, all women, left in quick succession (not to other academic jobs) because they carried an unfair burden of teaching and administration and did not share in the power of decision-making enjoyed by the senior faculty.
We will not solve this problem until we take a critical look at our own behaviour over the years. This is something newcomers will find hard to do, and the old lags prefer to gloss over their own culpability, choosing rather to blame a class struggle with the powers that be, a story that leaves us free from guilt over how the universities arrived at this sorry pass.
The following comment related to this thread was just posted on Savage Minds:
Speaking up and sharing experiences is a good start, yes. But we can go further by examining the world through the eyes of people whose job it is to make things happen. Consider, for example, Dan Hill’s Dark Matter & Trojan Horses: A Strategic Design Vocabulary (Strelka Press): http://www.cityofsound.com/blog/2012/08/dark-matter-trojan-horses-s...
Hill is an experienced practitioner as well as advocate of what he calls “strategic design”, which he contrasts with the “design thinking” recently popular in business circles. Design thinking begins with a project [also, I observe, a budget and a deadline]. The designer is focused on finding the most elegant or efficient way to achieve a specified goal. In contrast, strategic design begins with a problem, a situation in which the goal to be pursued may be unclear. The designer must then be ready to explore the total context [we anthropologists might call it culture] in which the problem arises. So far, so good. I see some common ground here.
What, then, should the designer be looking for? Hill suggests three possibilities.
1. The MacGuffin. “The MacGuffin” is a term coined by Alfred Hitchcock to describe a plot device, the focus of attention around which the action occurs. The falcon in The Maltese Falcon is a good example. Note that it need not be of any other significance. A treasure map, a weapon design, a lost will, the Ark of the Covenant—all can be, and have been, used in similar ways.
2. The Trojan Horse. It may look like a straightforward solution to a particular issue, a solution that everyone is happy to accept. Its implementation, however, may conceal within it all sorts of unexpected and ultimately transformative innovations.
3. The Platform. In Japan, NTT did it with DoCoMo. Apple has done it worldwide with iOS, iTunes, and the App Store. The goal is not a final, all-in-one solution but the foundation for an ecology to which all sorts of new players will be attracted to contribute.
The “Dark Matter” in the title brings up another issue with its own possibilities. Every project is immersed in a normally invisible sea of habit, regulation, politics, existing institutions that purely technical or aesthetic solutions neglect at their peril. They may, on the other hand, offer opportunities to effect real change with no technical or aesthetic innovation at all.
One of the most striking examples in the book is the city of Newcastle (not the one in Britain, the one in New South Wales in Australia). Like numerous other cities around the world Newcastle was left with a depressed and decaying center, hollowed out as prosperous people moved to the suburbs and businesses and employment followed them. Today that once-decaying city center is thriving and a major tourist attraction. How did that happen?
A strategic designer observed that commercial space downtown was available only on long-term commercial lease terms, too expensive and risky for start-ups, artists looking for studios, people with ideas for new restaurants [the bunch that Richard Florida talks about as essential for the thriving of creative cities]. A small loophole/tweak in zoning regulations offered the possibility of short-term licensing, return-on-demand agreements that made it possible for landlords to minimize their risk as well. When the landlords bought into the idea and joined the design team in promoting it, the creative folk poured in, revitalizing the city center.
Just brainstorming, now—but suppose you regarded the university as a now decaying institution whose revitalization is hampered by existing arrangements. Is there some point at which something like that licensing arrangement or something else entirely would transform the situation? We’ve got a lot of smart people here, people who know a lot and could easily learn more about how universities now operate. What if they were looking for opportunities instead of doing nothing but moaning about the barriers that currently exist?
Let me add here that what I like about Dark Matter and Trojan Horses is the combination of fresh vocabulary, which helps to stimulate fresh thinking, and a forward-looking stance that transcends the academic position stuck behind its critical barricades.
Does anyone know if there's any one really good historical, global account of the changing nature of academia? I know there is a lot of commentary on it published in different places. It would be good to get a sense of how tertiary education has changed globally (for better and worse), but also how universities differ from each other. Which ones are doing things well, and why?
John, I was in Newcastle, Australia, when all that happened. I first moved there in 1996 as an undergraduate. At that time, Newcastle had already transformed significantly, replacing previously industrial land around the harbour with parks and restaurants. The centre of the city was still a ghost town, but was revitalised temporarily during the first Fringe Festival in 1996 with events and exhibitions held in vacant shops, as the book describes. Over the next few years, a massive construction effort saw dozens of apartment blocks built in town, which attracted grocery stores, restaurants, cafes and clothing stores. It's still a fairly quiet place, but it's a long way from being the alienating, decaying industrial city that threatened to take over when I originally arrived.
Many people at The University of Newcastle – faculty and students – were involved in various aspects of this transformation. Indeed, as the largest employer in town, the university is pivotal to Newcastle's economy and social life. While the university suffers from much the same meta-problems as everyone else, it is nevertheless a good example of community engagement. My impression as an undergraduate and, later, as a research assistant, was that the university and many of its faculty viewed their job as not just teaching and publishing but also a commitment to the city.
Later, living in Sydney and working at Sydney uni, I felt little of this sense of embeddedness. The university and the city alike felt too large for community. Engaging is difficult as everyone's too busy (and lives far away from everyone else), and my focus shifted entirely to getting my own work done. In other words, it was a perfect breeding ground for the individualism that Keith describes and the metrics to measure production. Some departments defend their position better than others – Anthropology at Sydney Uni puts in a good effort – but it is a constant battle.
I would like to see the public demand more from their universities. After all, they're largely funded with public money. If universities were answerable to the public, then performance may just have to be measured in more human (and rational) ways. However, it's up to academics to show that we can contribute to public life, that we have social relevance.
For me, open access keeps appearing as the platform on which this relevance can be built as it creates the community space that I saw in Newcastle and allowed the city to be transformed. Like Newcastle, it requires multiple players of different kinds that stretch way beyond the university. It also requires conceptualising how different projects will work together – not just blogging, not just open access journals, but a whole ecosystem of global knowledge production (research, writing, teaching) in which the parts cross-fertilise. We could learn from the hard sciences, who have done well to capture the public imagination, but also take their lessons further, using new tools that are available to change the ways that global knowledge production is done. As Gawain always tells me, there are advantages to riding the crest of the wave. This is especially true if you've come from behind.
Sorry to come to this conversation late, but I have been rather overwhelmed with my (ultimately quixotic) efforts to juggle being an activist (at least a little), professional author, and full-time academic all at the same time. It's quite impossible to do all these things at once, even if one accepts that one will inevitably end up doing all of them rather badly.
However, partly as a result, I have been thinking about these issues quite a lot recently. Is it possible to be an intellectual outside the academy? Especially if one is not already independently wealthy?
Keith is right that in the past, independent intellectuals were not numerous, because patronage was hard to come by. What Keith calls the "national university" - basically the Prussian model - that was adopted in the US and eventually became a worldwide phenomenon, was, indeed, a post-Enlightenment phenomenon (Enlightenment thinkers, at least in France, wanted nothing to do with universities, which they saw as medieval, superstitious, etc). Nonetheless, the new synthesis basically combined the older notion of an autonomous, self-governing community of scholars whose primary purpose was scholarship, and training a new generation of scholars - in other words, that was organized around its own values and imperatives - and a newfound alliance with the nation-state based on a willingness to train its functionaries. (And then gradually this was extended to everyone destined for a white-collar job.)
Keith and others have offered an economic interpretation of how this broke down, but I wonder if one might not also add a political one. I've always found it telling that the most productive, creative moments for social theory seem to surround moments of world revolution - just before and just after 1917 (especially in Germany), just before and after 1968 (especially in France). I remember chatting with an intellectual historian once about why it happened in those particular places at those times, and he surprised me by saying "well, a big factor was, those were places where there were just huge amounts of funding sloshing around." It was easy to get grants, set up new institutions (think of the Frankfurt school), or just exist at the margins of the academy, like Walter Benjamin or the Bataille circle, the Situationists, and a million other examples. And as a result, any number of people who might otherwise have been postmen or clerks somewhere had time to take part in informal seminars and pursue their pet intellectual projects. Actually I'm not sure exactly how it all worked, where the money was coming from, how it spread around, but it's clear that it was possible to live an intellectual bohemian life (and remember, as Bourdieu notes, bohemians were not mainly of elite background - many were children of peasants!) - a life that's not really possible to live today.
I wonder if after the campus unrest of the '60s, there was a decided move, on the part of those ultimately responsible for all this funding, to say "never again." Certainly, that penumbra surrounding the academy has vanished. But at the same time we see several convergent trends over the last 40 years:
* increasingly, everyone involved in intellectual or cultural production is being absorbed in the academy - most artists, writers, poets, even many investigative journalists, now have academic posts. The independent intellectual basically no longer exists.
* at the same time, after perhaps 800 years, the original, medieval notion of the academy as an autonomous organization of scholars pursuing their own ends has been decisively put to an end. As Gayatri Spivak recently put it, "even thirty years ago, when you said 'the university,' you were referring to the faculty. Now when you say 'the university,' you're referring to the administration." It's just accepted that even senior scholars at major universities are hired not because of their likely contribution to scholarship, but because of their skill as administrators and how well their work will answer to imperatives set by the administration and not the scholarly community itself.
* the endless growth and power of the administrative apparatus does not have the result of freeing intellectuals, now that they have all been herded into the university apparatus, from administrative tasks but in fact has the opposite effect - they all have to spend more and more of their time on administration (just, again, with less and less ultimate autonomy)
* the main way this happens is by the corporatization of the academic bureaucracy - that is, the introduction of competitive principles at every level, so that academics spend more and more of their time selling things to one another - writing grant proposals, judging grant proposals, writing letters of assessment or recommendation, marketing a new program or department or the university itself, or proposing plans to do so, etc etc etc. (Academic publishing is a particularly dramatic case in point - editors no longer really edit anything, everything is outsourced, they spend all their time at meetings trying to sell proposals to each other, or on the road trying to sell their books.) Professional self-marketers thrive (but in order to do so have to internalize the very habitus their work is usually ostensibly critiquing), the kind of old-fashioned impractical eccentrics who used to find the university their only refuge are relegated to their mothers' basements making the occasional acute intervention on the internet.
On the latter, I've actually always thought of this as one of the surest ways for a society to commit suicide. Most have some way to accommodate their brilliant, creative, but impractical citizens (and while the three obviously don't always go together, they often do). Ours seems to have decided we have no use for such characters at all. Add to that the plan to place all our young graduates in massive debt, eliminating any remaining bohemian enclaves, and you're making intellectual, creative, even technological stagnation pretty much inevitable. As indeed can be observed. As I've written somewhere else, "radical" social theory has been reduced to writing endless comments on French thinkers from forty or fifty years ago, all written in the guilty knowledge that if Deleuze or Foucault were to appear today, they'd probably not be able to get a job at all, and certainly be denied tenure.
I'm not saying this was a self-conscious conspiracy - it happened gradually and haphazardly - but if one actually were coming up with a plan to completely eliminate any radical potential from intellectual life, it would be hard to think of a better one.
David Graeber said:
Is it possible to be an intellectual outside the academy? Especially if one is not already independently wealthy?
Keith is right that in the past, independent intellectuals were not numerous, because patronage was hard to come by. What Keith calls the "national university" - basically the Prussian model - that was adopted in the US and eventually became a worldwide phenomenon, was, indeed, a post-Enlightenment phenomenon.
. . . .
I'm not saying this was a self-conscious conspiracy - it happened gradually and haphazardly - but if one actually were coming up with a plan to completely eliminate any radical potential from intellectual life, it would be hard to think of a better one.
Thanks for adding so much to this conversation, David. It seems clear enough that we need a better analysis and social history, not only of universities, but also of intellectual politics in the twentieth century. I will touch on each briefly, since you bring them up together in this fruitful way.
It is true that the national university system combined the idea of a self-reproducing scholarly community with training students for the bureaucracy. I would add to that the principle of the medieval guild system, since it is this which is being undermined by the explosion of money, markets and telecommunications since 1980, not bureaucracy per se. And if a pseudo-commercial bureaucracy is destroying scholarship, then research, teaching and publication are equally being transformed by the digital revolution. Moreover, the displacement of academics by administrators could be seen as a reactionary response to the contradictions of this historical moment.
The university system only came into its own after WW2 with the massive expansion of public services, welfare states and Cold War research funding of that period. Most new entrants look back to the 60s and 70s as a norm, when it was, in fact, exceptional and unrepeatable. Research and teaching were organized as top-down guild specialisms under the control of the masters, involving protracted apprenticeship to hierarchy and the expectation of a job for life at the end of it. One analytical task is to figure out what undermined all that. It is not hard. People can now get knowledge in other ways that are less rigidly compartmentalized, less expensive and give greater scope for self-learning.
If we selectively revisit the 60s and 70s as a period of insurgency and rebellion, we should remember that this was the apex of postwar national capitalism, when, as Feyerabend put it, politicians could say "Science gave you colour TV and a man on the moon. I'm in favour of science. Vote for me." The neoliberal counter-revolution at the end of the 70s killed off that bonanza.
Through my reading, it seems that independent intellectuals and artists flourished in some periods more than others, such as the 1840s and 1900s in Paris. But the great crisis of world civilization in the 30s and 40s generated an outpouring of such activity. We are still living off the technical and social inventions of WW2 and after. I have been particularly interested in working class autodidacts (with whom you are intimately familiar). In the late 40s Sartre and De Beauvoir were world famous, the Brad and Angelina of their day. This leads me to reflect on a broader cultural approach to the situation in the western universities today. It could be that the North Atlantic societies are just decadent. Hence the aptness of your closing remark. As you say, it doesn't take a conspiracy to explain why the wind has left the sails of the old empires.
The trick is to find out where it is blowing now.
Keith, David, thanks to you both for adding so much to this conversation. A special shout-out to David for adding the political dimension. Let me add another—changes in knowledge production arising from changes in scale and technology.
First, scale: We have all heard the phrase "information explosion" and heard about the implications in terms of hyper specialization and the thinning out of general knowledge—the stuff that everyone is supposed to know. Could we go a bit further? Here are some thoughts stimulated by reading in social network analysis, where the limited bandwidth of actors in social networks has long been recognized as a problem. Consider "small worlds," for example. It has long been possible to be within a few short links of connection with everyone in the world. But human individuals lack the capacity to keep up to date with and cultivate strong ties with more than a handful of people. Even politicians and salespeople rarely know the names and a few salient facts about more than a few hundred and require assistance—files and staff—to do so.
With these thoughts in mind, let us consider a simple model, an abstraction from reality whose relation to reality will definitely need consideration. Assume a world of 1,000 scholars in which 50 books are published each year, 5 of which are hailed as masterpieces that everyone should know about. Let us also assume that all 1,000 scholars have the ability to read and retain knowledge from, say, 100 books. In year 1, when we start the model running, all 1,000 scholars can read all of the 50 books written that year and be suitably impressed by the 5 masterpieces. In year 2, reading all 50 books raises the number of books they have read to 100—the limit of their memory capacity. In year 3, to read all 50 books and add what they learn from them, they have to forget what they learned from 50 of the books read in year 1 or year 2. It is easy to see why filtering mechanisms, including disciplinary boundaries, arise. These make it possible to shrink the number of books read each year. Let us assume that, for disciplinary reasons, only 10 of the 50 books published in a given year are read, leaving it still possible to read the 5 masterpieces whose significance transcends disciplinary boundaries. Even with this constraint, however, it will only take 7 years (7 × 15 = 105) to exceed the 100-book limit of memory capacity. If, for disciplinary reasons, a scholar reads only the 10 must-reads in her discipline, she will hit the memory capacity limit in 10 years, and not have read any of the 50 masterpieces published during the same period.
Now change the parameters. Assume a world of a million scholars producing the same proportions of books each year. Fifty is to 1,000 as 50,000 is to a million. Forget books in general. If the scholars read only what their disciplinary filters say they ought to read, they have to read 10,000 books a year. But if they can read and remember at most only 100 books a year, that implies that they can read and remember only 1% of the books they need to read to keep up with their disciplines. Notice that by the same logic, if they eliminate disciplinary boundaries and read only masterpieces, they can read only 2% of the 5,000 masterpieces published in a given year, and to read any more the next year must forget everything they learned the previous year.
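The arithmetic of this toy model can be checked with a few lines of Python (the function name and parameter labels are mine; the numbers are the ones assumed above):

```python
# Toy model: scholars with a fixed retention capacity, reading a fixed
# number of books per year. When does cumulative reading hit the limit?
CAPACITY = 100  # books a scholar can retain at once

def year_limit_reached(books_per_year, capacity=CAPACITY):
    """Return the first year in which cumulative reading reaches or
    exceeds the retention capacity."""
    read, year = 0, 0
    while read < capacity:
        year += 1
        read += books_per_year
    return year

# The 1,000-scholar world:
print(year_limit_reached(50))   # reading all 50 books/year -> limit in year 2
print(year_limit_reached(15))   # 10 disciplinary + 5 masterpieces -> year 7
print(year_limit_reached(10))   # discipline only -> year 10

# Scaling up to a million scholars, same proportions:
books = 50_000
filtered = books // 5            # 10,000 books pass the disciplinary filter
masterpieces = books // 10       # 5,000 masterpieces per year
readable = 100                   # annual reading capacity
print(readable / filtered)       # 0.01 -> 1% of the filtered literature
print(readable / masterpieces)   # 0.02 -> 2% of the masterpieces
```

The exact fractions matter less than the shape of the result: under any plausible parameters, the share of relevant work a single scholar can retain collapses as the population of producers grows.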
The implications seem pretty clear. The medieval model of the university as a place to study the classics and learn all that there was to be learned is impossible. The chances of agreement on a universal canon of books every literate person should have read also approach zero. The question is how to redesign education to equip people to scan what is, for practical purposes, an infinity of offerings, identify what is interesting and useful, perhaps even world-transforming, and persuade enough others of its importance to generate a movement that can, at a minimum, produce a transient fad that offers scholars a livelihood so long as it lasts or, at a maximum, change the world as we (but who are we?) know it.
This is where technology comes in. The ICT technology that sustains the information explosion also makes it possible to reach out and connect, via small-world links with all sorts of people doing and reading all sorts of things. Increasingly it makes it possible to rapidly survey vast amounts of potentially interesting and, perhaps world-transforming, information. It does not, in and of itself, solve the problem of, to borrow the late Paul Wellstone's words, mobilizing, energizing, and organizing people in large enough numbers to effect significant change in cities, nations, or the world as a whole. But without solving that particularly wicked problem, our chances of making a difference by talking to each other and others we can persuade to join us are exceedingly slim.
My previous comment was focused on issues related to scale and technology. Yet another dimension opens up if we ask ourselves where, intellectually speaking, the action is these days.
It has been nearly two decades since John Brockman published The Third Culture: Beyond the Scientific Revolution in 1996, arguing that the literary humanities' pursuit of pseudoscientific "theory" had alienated readers and opened a space being occupied by such scientist-authors as Stephen Jay Gould and Richard Dawkins, who were writing about science for educated readers who appreciate good writing as well as learning something new about the world. Brockman had already founded Edge in 1991. The more recent success of the TED talks demonstrates a successful, if new-mediatized, appeal to an audience similar to that envisioned by Brockman. Some of the talks seem shallow, mere pseudointellectual entertainment. But, at least in my experience, the ratio of genuinely interesting stuff to nonsense is higher than in most of the anthropological journals I peruse. And now, of course, Coursera is offering full-blown university-style courses packaged in online segments short enough for busy people to pursue at their leisure. The variety of content available on iTunes U and in podcasts produced by major universities is amazing. And as Brockman might have predicted, much of it is science for the sophisticated layman, pitched at what I think of as the Scientific American or Smithsonian level.
But what, you might ask, of serious scholarship? In my own reading, pursuit of advances in knowledge has led me, on the one hand, back to Asian studies, where an explosion of detailed research, largely undertaken by historians stimulated by an earlier generation of studies conducted by social anthropologists, has enormously enriched our understanding of China, Japan, Korea and surrounding parts of Asia. It has also led me to a growing body of sociological theory rooted in attempts to bring sophisticated mathematical thinking to bear on understanding how societies work. Miller and Page's Complex Adaptive Systems; Byrne and Ragin, eds., The SAGE Handbook of Case-Based Methods; and Andrew Abbott's Time Matters, The Chaos of Disciplines, and Methods of Discovery are all deferential to the classics (Marx, Weber, Durkheim, Simmel) but have moved far beyond them in attempts to think through the logic of social life. Another recent discovery, John F. Padgett and Walter W. Powell (2012) The Emergence of Organizations and Markets, also looks extremely promising, drawing on autocatalysis, an idea developed by biochemists to explain the origins of life, as a model for understanding innovation and invention in organizations.
Another field in which I have stumbled across interesting stuff is architecture and urban planning. Here Dan Hill's Dark Matter and Trojan Horses intersects with occasional work my company does for the architecture and urban planning program at the University of Yokohama. What is striking here is people who not only take seriously the cultural and political aspects of their work but build on a strong foundation of knowledge of material systems, water mains, sewage disposal, transport, energy and information networks, that sort of thing. This foundation keeps what they talk about literally grounded and less prone to the, shall we call it, "lightness of being" that afflicts so much "critical" anthropological speculation these days.
That last remark is not, by the way, directed at either Keith Hart or David Graeber, in whose scholarship and worldly experience I have a great deal of confidence. It does, however, express concern for colleagues who remind me of a younger me, drifting in and out of academia with a head filled with cloudy ideas whose relationships to material realities were mostly questionable. If we aspire to reinvent the university or—more modestly—reinvent anthropology for the 21st century, we need to explore a wider world than the aging apparatus of anthropological theory now addresses.
Erin asks, "What if there were something like life-long learning in anthropology?"
Sitting on top of our kitchen counter is a book, Haruo Shirane (1998) Traces of Dreams: Landscape, Cultural Memory, and the Poetry of Bashô. Shirane was a friend of my wife in graduate school at Yale. I pick the book up, start browsing through the introduction, and come across the following passage:
The seventeenth century witnessed not only a dramatic rise in the standard of living for almost all levels of society but a striking change in the nature of cultural production and consumption. In the medieval period, provincial military lords (daimyô) were able to learn about the Heian classics from traveling renga (classical linked verse) masters such as Sôgi (1421-1502), but the acquisition of classical texts was limited to a relatively small circle of poet-priests, powerful warriors, and aristocrats, who were deeply rooted in the traditional culture of Kyoto. A monopoly—epitomized by the Kokin denju, the secret teachings of the Kokinshû—had been established over the study of classical texts, which was often passed on through carefully controlled lineages, in one-to-one transmissions to the elected few. In the seventeenth century, by contrast, anyone who could afford to pay for lessons could receive instructions from "town teachers" (machi shisô) in any one of many arts or fields of learning. The transmission of learning was not dependent, as it had been in the medieval period, on the authority of poetry families or the patronage of large institutions such as Buddhist temples or powerful military lords.
I am reminded that, in Japan today, there exists alongside the universities a system of "culture centers." Operated mostly by newspapers and department stores, they play a role analogous to that of the "town teachers" mentioned by Shirane, offering lifelong learning classes to housewives and retirees on a vast range of subjects from homely cooking skills to classical Japanese literature and urban planning.
This reflection reminds me of other worlds of private education in the West: piano and other music teachers, and operators of craft shops who offer classes in knitting, crocheting or macramé, operating in effect as one-teacher culture centers with a limited range of offerings. My mind spins on: where was it that I saw a reference to philosophy cafes? A Google search turns up 5,700,000 hits. The first, from Wikipedia, says,
Café philosophique ("cafe-philo") is a grassroots forum for philosophical discussion, founded by philosopher Marc Sautet (1947–1998) in Paris, France, on December 13, 1992.
There were about 100 "cafés-philos" operating throughout France and some 150 cafés-philos internationally at the time of Sautet's death in 1998.
The subjects discussed at the cafes ranged from the Santa Claus myth to truth, beauty, sex, and death. They posed such questions as "What is a fact?" and "Is hope a violent thing?" Sautet made the discussions seem fun and exciting. The concept was to bring people together in a friendly public forum where they could discuss ideas, and a cafe tended to have just that sort of atmosphere, with people relaxed, drinking coffee, and carrying on conversations. This concept ultimately developed into the Café Philosophique that he founded.
Thousands of participants in philosophy cafes worldwide have adopted Sautet's idea as a way to enhance their thinking. Ideas are thrown out with concern for accuracy and philosophical rigor, and the concepts discussed are in the spirit of tolerance and openness. Sautet's philosophy cafes have spread around the world. The concept started in France, subsequently entered England, Germany, Belgium, Austria, and Switzerland, and eventually spread throughout Europe; it is now found in the United States, Canada, South America, Greece, Australia and even Japan. Due to this success, the French president Jacques Chirac sent a founding member on a goodwill mission to Latin America to introduce the concept there.
A common element in these, I will call them "para-academic," institutions is their social dimension. On any given subject, those who come to learn could find more brilliant lectures and better illustrated demonstrations online via Coursera, iTunes U, etc. What they still can't find there are social opportunities: real-world places to meet people who share similar interests, in settings where a shared hobby can lead to drinks, dinner, or (we gracefully draw the curtain) other forms of social activity.
Is it possible to imagine at least a few entrepreneurial anthropologists living comfortably, even prospering, by pursuing this line? Just last night I had dinner with an American friend living in Japan who has spun teaching English to dentists into organizing tours to international medical conferences, and who has just founded a company to use what she has learned, and the contacts she has made, to organize other, I will now call them "learning-socializing," events related to politics and spirituality, topics in which she has strong personal interests. She is not an anthropologist (originally, and still in another of her many roles, a professional jazz pianist), but perhaps she is a model that anthropologists stuck with no jobs or crap jobs in today's academic world might want to consider.
There are parallels of course in applied anthropology, not least in the world of 'development' and dev studies.
For resistance to the audit culture in development, see The Big Push Forward, which kicked off with a meeting at the Institute of Development Studies (University of Sussex) in September 2010, The Big Push Back.
For tips on escaping from the academic frying pan into this particular fire, see Duncan Green's quick guide on how to get a job in development, which shares some similarities with Keith's recommendations.
"Is it possible to be an intellectual outside the academy? Especially if one is not already independently wealthy?"
Ya, that's a question that has been on my mind a lot lately as well. I can see the writing on the wall--everything leads to the academy. That's where all of this is supposed to lead, and I get a lot of "advice" about what I need to be doing in order to make it within this system. I am in the writing stage of my PhD. I know what I am supposed to do next in order to get myself further entrenched in academia. Lots of my colleagues keep pushing forward, even though they are concerned about what lies ahead. But is this really the only option? Can something like anthropology only exist within the current university structure? Maybe. But maybe we have to push back and start creating other options and possibilities.
"I wonder if after the campus unrest of the '60s, there was a decided move, on the part of those ultimately responsible for all this funding, to say 'never again.'"
It would be interesting to try to track down some of these histories in the academy (if these kinds of changes are reflected or documented in any way, in policy changes, etc). Maybe through some good old fashioned ethnography.
"... increasingly, everyone involved in intellectual or cultural production is being absorbed in the academy - most artists, writers, poets, even many investigative journalists, now have academic posts. The independent intellectual basically no longer exists."
Absorbed is a good way of putting it. Maybe "neutralized" is another. This reminds me of something one of my profs said during a seminar class. She called the university a "convenient place to put inconvenient people." That really resonated with me, especially when I thought about how bogged down my whole department was with bureaucratic meetings, evaluations, over-extended teaching loads, getting tenure, etc. And we grad students were being socialized into the same system. We were all too busy trying to keep up with our workloads, and there was little time to look around and think about what the university was really producing. Everyone I knew was too busy to really think about addressing problems in the university--there was no time. So it all just keeps going forward. And radical ideas end up being little more than words jotted down and turned in as final papers for one class or another. So all the talk about power, hegemony, hierarchy and all that is reduced to little talking points and final grades. The grade makes it feel like something has been done with this information, but it really hasn't. And it seems to me that the publishing regime just extends that pattern. A lot of ideas get circulated, but where do they really go? Who sees them? When are they ever actually applied? At AAA conference hotels?
"The trick is to find out where it is blowing now."
Ya, and the other part of this is accepting the possibility that there are other routes. Lots of people seem to just be going with the flow because they don't see any alternatives. But that keeps the current system going, even if it's heading off a cliff.
"If we aspire to reinvent the university or—more modestly—reinvent anthropology for the 21st century, we need to explore a wider world than the aging apparatus of anthropological theory now addresses."
I don't know if the problem is just a matter of one aging apparatus or another--when it comes to theory, it might also be a case of deciding what to use to reestablish the theoretical foundation(s). If I had my way I would probably start with folks like Wolf, Mintz, Roseberry, and Trouillot (who all acknowledge the importance of history, not to mention material conditions), then work my way forward and backward a bit. I don't think the anthropology of the 21st century has to jettison everything or completely reinvent itself.
Ryan knows what we have exchanged privately and this medium encourages people like me to be flippant. But I say what I am about to say from the heart, knowing intimately what he and many like him are going through.
1. There is only one reason to do a doctoral thesis, and that is because you love it. It is an incredible privilege to write a book of your own and to be given some of the conditions to achieve it. The downside is doing it under a bureaucratic hierarchy. The other trap is seeing it as a ticket to a job. Either it is worth doing for itself or it is not. And given the current employment situation, it should be easier to take such a perspective on board.
2. Never accept that life means being one thing. Always pursue several activities at once, whether these are intellectual, economic or practical. My greatest fear was being owned by an employer with no alternatives. You may think getting a job, any job, is an end in itself. But once an employer senses that you don't want to move, they treat you like shit. The only way is to keep open the possibility of moving. So pluriactivity and mobility are essential.
This kind of advice doesn't suit every personality, and there are costs to being a nomadic bricoleur. Maybe it once made sense to seek a job for life in an academic bureaucracy. Not any more.