
A Mark I Once Made

I’ve owned this old paperback copy of Thus Spoke Zarathustra since the late 1970s. Ever since then, it has been my constant companion.

It was this very book that first awakened me to the pleasures of reading philosophy and the possibilities of doing philosophy, when I was still innocent of all serious philosophy.

I do not know why I first bought the book — it was not exactly recommended reading in the public high school I attended — but I seem to remember that I first read Zarathustra on a hiking trip in the Black Mountains. I know that it was night and there was a fire burning.

Was it The Stillest Hour? I would like to think so. I was reading by candlelight, when the words of Zarathustra’s Prologue leapt out at me. I’d never read anything like this! Whose words were these? In an uncanny way they seemed to be mine — or at least I wanted them to be mine. I couldn’t say I understood them fully but knew those words to be true and I wanted to live their truth.

So I dripped wax on the page, to mark it.

[Image: the wax mark on the page]

I left my mark in wax — as if I could make these words my secret, as if sealing a letter with wax. A letter, but to whom? To Nietzsche? More likely to myself, promising I would return.

It was the first time, though not the last, that the beauty, the passion, the madness and the truth of Nietzsche’s writing in Zarathustra struck me, stopped me in my tracks, overcame me. But I believe it was at that moment that I began to tell a new story about myself and about the world, or at least it was one of the first times I understood that I might have a story to tell. I was 17 years old.

Now, I have little claim on Nietzsche. I am not a professional philosopher and I am not a Nietzsche scholar by any stretch of the imagination. I once wrote a few words about Untimely Meditations in a review of a book by Bernard Williams; and when, in the 90s, I included Nietzsche in my Western Civilization courses, I usually taught The Birth of Tragedy or The Genealogy of Morals. But curiously enough, I never taught Zarathustra or wrote about Zarathustra, which of all Nietzsche’s writings has arguably — no, undeniably — had the strongest claim on my life and my imagination. The book has done its quiet, subtle work in my life for nearly thirty-five years.

“The dew,” Nietzsche writes, “falls upon the grass when the night is most silent.”

O, The Humanities!

Last week, the National Research Council of the National Academies issued Research Universities and the Future of America: Ten Breakthrough Actions Vital to Our Nation’s Prosperity and Security. I came to the report wondering how this august committee of bureaucrats, bigwigs and business people might go about defining the mission of the research university and how they would define “prosperity”; and I wanted to see what sort of future they envision for research that doesn’t immediately yield new machines, products or services, and doesn’t necessarily play well — historically has not played well — with business: namely, the kind of research I do and value, research into the human world and the human condition.

I’ve noticed that in most national debates over educational policy and funding (which this report is supposed to inform) and in discussions of the R & D Tax Credit (which this report touches on), “research” gets defined way too narrowly. It gets restricted to scientific research and the invention of useful products and machines. As for prosperity, it tends to get confused with economic growth, or reduced to GDP and employment figures. It’s a limited, myopic view in which “research” is valued only insofar as it yields new machines and tools and products to fuel economic growth.

That’s pretty much the view here.

There are gestures throughout this report toward finding a place for the humanities (along with the social sciences) in the research university centered on science and engineering. The authors consistently maintain that the research university has to be “comprehensive” in scope, “spanning the full spectrum of academic and professional disciplines,” in order “to provide the broad research and education programs required by a knowledge- and innovation-driven global economy.” But there is not much ink spilled here on the value or the purpose or the place of the humanities. The idea that I advanced as a “crazy” idea in previous posts (here, here, and here) – that research in the humanities might provide a much-needed critical orientation in an innovation-driven economy (and should therefore be covered by the R & D tax credit) – seems just as crazy as ever.

Perhaps we can expect a bolder stance on the humanities in the forthcoming report on the humanities and social sciences from the American Academy of Arts and Sciences mentioned in the footnotes here. Maybe without that report this group felt unqualified to tackle the subject, or they were simply being deferential to their colleagues. Be that as it may, Research Universities focuses on the humanities in just one place. This is in a chapter about “national goals.” It opens with a jingoistic account of American progress. Cue the bombastic voiceover:

In the course of our history, our nation has set grand goals that have defined us as a nation. And then we accomplished them. We created a republic, defeated totalitarianism, and extended civil rights to our citizens. We joined our coasts with a transcontinental railroad, linked our cities through the interstate highway system, and networked ourselves and the globe through the Internet. We electrified the nation. We sent men to the Moon. We created a large, strong, and dynamic economy, the largest in the world since the 1870s and today comprising one-quarter of nominal global gross domestic product (GDP).

The most muddled word in this historical muddle is, of course, “we.” The pronoun carries a lot of freight here, and it is meant to reduce history to a story of central planning. We set grand goals and we accomplish them: how grand!

At best, this version of American history is nothing more than the committee projecting the fantasy of central planning onto the past. But it’s also an attempt to sanitize history, to scrub off all the blood and dirt from our past and forget our present afflictions and troubles. Civil rights? The creation of a republic? These weren’t grand goals advanced in a planning session, set out in the form of pure ideas and then acted upon, but the difficult, tough, very real struggles of people to gain and maintain their liberty. In the area of civil rights, some would say we still have a long way to go; in the matter of the republic, some would argue that we are now more than ever at risk of losing it, if we have not already lost it.

The railroad? Think only of Josephson’s account of how the railroads were laid. Or to take a more recent example, consider what was really involved in networking “ourselves and the globe through the Internet” (and don’t forget that networks are not only systems of inclusion, but of exclusion). The Eisenhower Interstate system may have been the closest we ever came to nationwide military-industrial planning; but even that took a lot of cajoling, a propaganda campaign, and some serious political maneuvering, and given our current car-crazed, oil-dependent, environmentally weakened, militarized state, it is debatable whether the Interstate system really deserves unqualified accolades.

Of course these questions and considerations were kept out of the discussion here. But I would hasten to add that these are exactly the kinds of questions and considerations that research in the humanities (and social sciences) allows us to ask. These are questions not only about the past, but also about where we are going, what we want, what we need to do, what is the best thing to do, how we should go about doing it, and how we ought to discuss all those questions.

Just as importantly, the humanities allow us to look at the American story and ask who “we” are, and help us recognize that we are a plurality, not reducible to a single historical agency or identity or even a unified, entirely coherent, unimpeachable history. Indeed, it’s fair to say that the humanities – research into a broad domain of language and historical experience, and questions about the role of language in historical experience as well as the incommensurability of language and history – give us at some very basic level an awareness that history is many stories, that we can ask questions about those stories and that doing so creates the option of telling (and living) another story.

You’d think that at least some of this thinking – which is hardly radical or new – would find its way into this report. Or at least that at some point this report would acknowledge that research into language, thought and history is of value to deliberative democracy, and to considerations of American prosperity. But, no – not even a gesture toward the traditional notion of the “liberal arts” (artes liberales) as the arts most befitting a free people – arts of language and understanding that equip a free people to deliberate and exercise their freedom. In fact, when the report turns to “civic life,” the humanities play no role whatsoever in the discussion. Instead, the Council considers research in the humanities under the heading “Enhanced Security.”

Research in the social sciences and humanities has allowed us to better understand other cultures we may be allied or in conflict with so we can adapt strategies to improve diplomatic and military outcomes.

A handmaid to military strategy and diplomacy: that is a pretty poor rationale for the humanities – about as poor as one can imagine. Humanists can help military generals and diplomatic missions “adapt strategies” for dealing with friends and obliterating enemies. The understanding of “other cultures” – which involves complex, enduring, maybe unanswerable questions of interpretation, translation, language arts, anthropology, history – has been placed here in service of the all-powerful State. “We” are no longer the people, in the plural and in all our plurality, with all the uncertainties that entails, but one singular, grand, innovation-driven, militarized, secure State.

Our friends may delight in this technocratic fantasy, but our enemies had better look out.

What’s Eating American Intellectuals And, Now, What’s Eating Me

In yesterday’s post about what’s troubling American intellectuals I arrived at what I considered a fairly uncontroversial point of view, namely, that the diminished social stature of the intellectual – and, in some quarters, the scorn and mockery of educated “elites” — indicates something disturbing about our attitudes toward education and where we are headed as a society.

Just what that something might be is up for grabs, but I was tending toward the dramatic and alarming view that this is the first stage of the eclipse of liberal arts education in America, the onset of a dark age. I tried to hint at that in the final paragraph of my post.

Not a single comment all day — until last night, when someone registered his strong disagreement on my Facebook page, and I had the sinking feeling that maybe everybody had strongly disagreed with what I wrote, but was just too polite to say so.

Michael commented that he “lost all faith in the ‘liberal intellectuals’ long ago,” and went on to say that my post fails to register how badly intellectuals of all stripes have failed us, so they might just deserve our scorn.

…it was a bunch of Ivy Leaguers who got us into the damn mess we’re in–the latest version of “the best and the brightest.” Your intellectual aristocracy has failed us, Louis. They’ve screwed up the environment probably beyond redemption, they’ve brought us war without end, they’ve totally fucked up the global casino economy. This last half century of downhill slide wasn’t the consequence of a bunch of climate-denying yahoos and creationist boobs; it was all the brilliant scientists at MIT, all those glorious minds at the Kennedy School of Government, all those experts at G’town International relations, all those Harvard Business School MBAs. Thanks a million, minds.

Just to be clear, I am not out to defend tenured Ivy League professors, the best and the brightest, or an intellectual aristocracy (if there is such a thing). They don’t need me to defend them. Nor am I trying to put them at ease. I am simply trying to understand why they are so ill at ease these days, and what that might mean.

If I widen the historical lens I begin to wonder whether a certain idea of the intellectual is passing from the American stage and maybe from the world stage. Technocrats and scientists still garner our respect and admiration (despite what Michael says about the folks at MIT and elsewhere), and we are still captive to a narrative of scientific and technical progress; but we may have lost our faith in the idea that we can ever learn anything of consequence about human affairs or the human condition. That’s not something I can lay out arguments to prove; it is simply something I wonder about, and it’s a possibility I dread.

On the other hand, I can’t really go where Michael is going with his comment, partly because I recognize the inherent fallibility of all intellectual undertaking — it’s no surprise that the best and the brightest would fail to deliver us from evil; nobody can — and partly because I admit that most human endeavor ends in pure folly, no matter how noble and inspired and smart it might at first seem.

That is no reason to give up on education or enlightenment. This is a point Russell Kirk made, snidely but powerfully, in a passage quoted by Bainbridge:

Populism is a revolt against the Smart Guys. I am very ready to confess that the present Smart Guys, as represented by the dominant mentality of the Academy and of what the Bergers call the Knowledge Class today, are insufficiently endowed with right reason and moral imagination. But it would not be an improvement to supplant them by persons of thoroughgoing ignorance and incompetence.

To be sure, the current wave of populism will pass. My concern is that after the revolution, we’re going to have to start rebuilding, and it’s difficult to do that in darkness.

Credit Where Credit is Due: The Human Side of R & D

Amar Bhidé argues in a recent op-ed that making the R & D tax credit permanent will “not encourage the broad-based innovation that is crucial for widespread prosperity,” and he is skeptical of the idea – which has been around since the credit was first instituted, on a temporary basis, in 1981, and which has been one of the arguments advanced by the Clinton, Bush, and now the Obama administrations for making the credit permanent – that there is significant “spillover” or “public benefit” from private investment in research. While his skepticism seems warranted, the question whether corporate investment in “research” can produce “higher returns for society” really turns on how we think about research, innovation and technology, and how we address the broader, unsettled question of the proper role of business in society.

“Research and experimentation” has been a murky area, even after reforms were made to correct abuses of the original 1981 statute, abuses that yielded such triumphs of “research” as Chicken McNuggets and different flavors of soda pop, along with creative accounting that wrote off failed ventures as “experiments.” In the 1986 reforms, Congress developed a test – a statement of what qualified as research – to clarify the law on this point. As Robert S. McIntyre noted in a 2002 piece on the credit and its abuses:

The IRS eventually interpreted this “public benefit” or “discovery” test to require that qualifying research must be directed at “obtaining knowledge that exceeds, expands, or refines the common knowledge of skilled professionals in a particular field of science or engineering.” In other words, if everybody already knows what a “research” project is intended to “discover,” then the government won’t foolishly subsidize it with a tax credit.

This excess, expansion, or refinement of “knowledge” – something that goes beyond what “skilled professionals” in “a particular field of science or engineering” already know, or can anticipate or intend – is where the law tells us to look for the public benefits of corporate R & D. True innovation lies in unexpected outcomes. The assumption is that businesses should be rewarded for advancing technical knowledge, or at least given an incentive to do so, because the advancement of technical knowledge will bring economic prosperity and other benefits.

There are lots of assumptions being made here about the way things work, and it’s not at all clear that things really do – still – work this way. Much of the thinking here about business, society and technology goes back to the post-war era. There are, for instance, connections to the theories of economist Robert Solow about the role of technical progress in the growth of industrialized countries. Solow observed that technological advancement is the key force in economic growth – the “residual” that remains after all conventional inputs, including capital and labor, are accounted for. It seems reasonable to conclude from his observation that if we encourage capital-rich companies to invest more in R & D, technological advancement will propel the economy forward — while at the same time delivering new “knowledge” and new “discoveries” (and, the Obama administration hopes, new jobs).
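For readers who want to see the arithmetic behind that “residual,” here is a minimal sketch of the growth-accounting decomposition usually associated with Solow, written in the textbook Cobb-Douglas form (a standard simplification rather than the exact specification of Solow’s own papers). Here Y is output, K and L are capital and labor inputs, α is capital’s share of income, and A stands in for technical progress:

$$ Y = A\,K^{\alpha}L^{1-\alpha} \quad\Longrightarrow\quad \frac{\dot{Y}}{Y} = \frac{\dot{A}}{A} + \alpha\,\frac{\dot{K}}{K} + (1-\alpha)\,\frac{\dot{L}}{L} $$

$$ \frac{\dot{A}}{A} = \frac{\dot{Y}}{Y} - \alpha\,\frac{\dot{K}}{K} - (1-\alpha)\,\frac{\dot{L}}{L} \qquad \text{(the Solow residual)} $$

In words: whatever output growth cannot be explained by measured growth in capital and labor gets booked as technical advancement, which is precisely why the residual lends itself so readily to technological optimism.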

It is no discredit to Solow to say that his thinking exemplified and helped fuel the technological optimism of the postwar period. He famously calculated that four-fifths of the growth in US output per worker could be attributed to technical advances. Today, by contrast, certain technical advances have led to a decline in US productivity, or threaten US workers with obsolescence. This is just a small instance of the way in which our experience challenges our faith in technology.

Belief in the “residual” power of technology to fuel economic growth and materially benefit society has survived, for a number of reasons, well beyond the industrialized national economies Solow studied. Corporations do not simply – or cynically – want to encourage the belief that the tax breaks they receive will somehow benefit the larger society; corporations, too, are creatures of technology and wholly captive to technological optimism. More broadly, the notion that technology can deliver economic as well as social benefits is still something Americans believe, or want to believe, despite lots of evidence to the contrary. We’ve staked our whole way of life on the idea.

Of course, Bhidé doesn’t go this far. Instead, he argues, we need a more “inclusive” view of innovation – one that takes into account “innovations in design, marketing, logistics and organization” – if we are to get beyond the narrow (and, to his mind, mistaken) view that increased R & D spending is going to correct market failures or produce beneficial outcomes.

But that broader view is not something we should expect scientists and engineers, or lawmakers and the IRS, to deliver. Nor is it a subject on which we should simply defer to professors of business or economists — who will never settle the matter anyway. Part of the trouble, in my view, is not simply with the idea of innovation. It’s with the idea of “research” as something that produces only scientific and technical knowledge, or as an activity to be undertaken solely by scientists and engineers (aided and abetted, perhaps, by economists and professors of business administration).

It seems to me that if we are going to provide incentives for research, we can try to do better than hope for spillovers or accidental benefits from the lab-work of scientists and engineers. Why not broaden the scope of the research we encourage and underwrite with tax credits to include other kinds of research that might benefit the public? What would the corporate R & D picture look like then? How might organizations capture research into the human condition or the social world, and develop its discoveries and perspectives to improve their own performance, or obtain a truer and more complete picture of the world? How would their performance measures change, along with measures of prosperity, or real wealth?

You have to wonder why these considerations don’t really have a place in the conversation, and even seem out of bounds, far-fetched. It’s worth remembering, in this context, that to “credit” something is to put stock in it, believe in it, lend it credence. Are we now so captive to the story of scientific and technical progress that we think other forms of research could never benefit the public or contribute to the common wealth?

This much is clear. Scientific research unchecked by critical judgment, historical perspective, the broad study of culture and society, or meaningful public debate, is bound to lack human scale and a vital connection to the very “public” it is supposed to benefit. And the study of the choices we make, or how we make them, or what it is like to live in this moment, at this particular time and in this particular place, is bound to yield some richer understanding of what it will take to make the right choices tomorrow.

Unscientific about society

Social science may be able to account for society in part because it has remade society to suit its particular kind of knowledge (the “science” that I would call theory).

For most theorists who study society, there are great social forces at work, will we, nill we, and most of them do us no good; the self is a social construct; the individual is more patient than agent, subjected to a false or inauthentic subjectivity, often a victim.

But there are, interestingly enough, some resources for rethinking society in the history of the word society itself. Society, societas, denotes an elective or voluntary association, not an array of (dark, often hidden) forces that constrain and define and overwhelm the individual.

I want to think about the social not just as a precondition but as a human accomplishment, the fruit of liberty and free association, a state that human beings can achieve simply by choosing to come together, not just a gulag of the alienated, overdetermined self.

What Can Make You Soft?

A short while ago the bright lights at McKinsey and Co. announced that they had been thinking seriously about the role of business in society, and were prepared to go beyond the usual bromides about corporate social responsibility. An article by Ian Davis in The McKinsey Quarterly focused on the need for CEOs and other executives to wield “soft power” — which Harvard’s Joseph Nye describes as “the ability to get what you want by attracting and persuading others to adopt your goals.”

With carrots and sticks you can coerce people to do what you want; but carrots can get expensive and people resent too much stick. The soft leader takes a different and more subtle tack, entering into controversy to set or re-set an agenda, to frame the discussion, to gain credibility on an issue, and, above all, to lead by persuasion.

For Nye, whose book deals mainly with the exercise of American political power, soft is the road not taken by the Bush administration after 9/11 and in the war on terror. (If anything he is far too soft in his criticism on this point, but maybe he is just practicing what he preaches, persuading gently rather than berating and bashing.)

For Davis, soft power may not be the be all and end all of contemporary business leadership, but it’s an important ingredient. Though business leaders have been historically reluctant to enter into social and political discussions for fear of being compromised or caught up in controversy, Davis writes, they “are particularly well positioned” to exercise soft power on local, national and global issues.

Why? Partly because CEOs and other business leaders are used to dealing with “complex trade offs”; surely, Davis reasons, they can readily apply those skills to the “big social issues from climate change to health care to poverty. Business, particularly big business, has a vital role in resolving these immense challenges.” Not to mention a vital interest in directing the outcome of public debate.

You’d think that lobbying Congress, investing in some feel-good PR, and pulling the honeywagon up Capitol Hill would be enough. But it’s not. In an article in The Economist, Davis makes the exercise of soft power out to be the fulfillment of a Rousseauist social compact; but it’s less a social obligation than a prudent calculation: get involved in real-world issues, risk some public controversy perhaps, but in the long run put yourself in a better position to manage risk.

The thinking here is that by directing social and political change, or at least having a hand in it, you will be better positioned to anticipate it, exploit it, profit from it, turn it to your advantage. That all sounds very compelling — as long as you don’t worry too much about the tendency of history to take unexpected turns. Still, it’s worth considering how many times an unforeseen twist, or an unintended or unanticipated consequence, has undone even the best generals, politicians, diplomats, revolutionaries, dictators, bosses, organizers and televangelists.

Enter at your own risk. My main concern is this: how do you learn to wield soft power? Where do you learn how to exercise it? Who teaches soft skills? In the ancient world, there were schools of rhetoric to teach the art of soft power or persuasion. In the early modern and modern worlds, there were schools of liberal arts. But where do you learn those arts now?

Certainly the business schools are ill-equipped to teach rhetorical prowess and the practice of soft power in any real way: language and the constructive use of language (in dialogue, in persuasion), the ability to translate, literally and figuratively, among different languages or different ways of seeing the world, the ability to parse a conversation or to frame one at the outset, to place events and people and positions in historical context so as to better understand them — all this requires a kind of patience and diligence that has not exactly been institutionalized in our MBA programs.

So what about the liberal arts? What about the humanities? Can they make softer leaders? Maybe. If the deliberate and careful study of language, history and language arts has an important social and civic function, it surely must be something like this.

But it’s the rare CEO who has spent much time studying the humanities, except to fulfill a set of requirements or to play at business ethics; and, what’s worse, it’s the rare humanities program or humanities curriculum which thinks that its business is to teach anything practicable in the practical world. Teaching “critical thinking,” as many humanities programs claim to do, may be a start, but when thinking is captive to a particular cultural and political agenda, as it too often is, it ceases to be critical; and learning to use theoretical jargon is no substitute for learning how to parse a sentence in Latin or Russian or French — or English, for that matter. Grammar always trumps theory.

Peter Drucker was aware of this deficit in our educational system. In Managing in the Next Society, Drucker saw the need for a third way — a way between the mix of practical education and new-age sophistry of the business schools, on the one hand, and the narcissism, self-destructiveness, and ethical irresponsibility of liberal arts programs, on the other. In the meantime, those who want to soften themselves up to lead in the real world will have to be autodidacts, or — more likely — flush with money to hire people who know how to institute softness. And that is the rarest kind of business consultant. It’s a hard world out there.