
(Photo: Britannica)

How many scholarly stakes in the heart will we need before Martin Heidegger (1889-1976), still regarded by some as Germany’s greatest 20th-century philosopher, reaches his final resting place as a prolific, provincial Nazi hack? Overrated in his prime, bizarrely venerated by acolytes even now, the pretentious old Black Forest babbler makes one wonder whether there’s a university-press equivalent of wolfsbane, guaranteed to keep philosophical frauds at a distance.

To be sure, every philosophy reference book credits Heidegger with one or another headscratcher achievement. One lauds him for his “revival of ontology”… Another cites his helpful boost to phenomenology by directing our focus to that well-known entity, Dasein, or “Human Being”… A third praises his opposition to nihilism, an odd compliment for a conservative, nationalist thinker whose antihumanistic apotheosis of ruler over ruled helped grease the path of Adolf Hitler in the 1930s.

Next month Yale University Press will issue an English-language translation of Heidegger: The Introduction of Nazism Into Philosophy, by Emmanuel Faye, an associate professor at the University of Paris at Nanterre. It’s the latest, most comprehensive archival assault on the ostensibly magisterial thinker who informed Freiburg students in his infamous 1933 rectoral address of Nazism’s “inner truth and greatness,” declaring that “the Führer, and he alone, is the present and future of German reality, and its law.”


Carlin Romano
The Chronicle of Higher Education

Bill Maher, in Jerusalem

Robert Burton’s 17th-century treatise, The Anatomy of Melancholy, treats psychological disorders as a religious problem. Depression, Burton believed, is an expression of original sin. Three centuries later, Freud reversed the diagnosis entirely by calling religion a symptom of mental dysfunction. Now, a growing number of scientists are using modern research methods from a range of disciplines to study why we are religious. For some interpreters, such as philosopher Daniel Dennett and evolutionary biologist Richard Dawkins, science reveals religious beliefs to be malignant memes gnawing their way through believers’ brains, diseases needing to be cured. Yet for many of the researchers closest to this work, the recognition that religion has biological roots only makes it harder to talk about severing it from ourselves.

This must have come as a disappointment to comedian and Real Time host Bill Maher, who traveled the world making fun of religious people for his documentary Religulous. Standing at the prophesied site of Armageddon — Megiddo, Israel — Maher indicts religion as a “neurological disorder” that causes the afflicted to wish for apocalyptic death.

Maher interviewed Dean Hamer and Andrew Newberg, two scientists who study the biology of religion, to back up his anti-religious polemic; neither says much of substance in the film. Hamer, a geneticist at the National Institutes of Health, is the author of The God Gene, which posits that human beings are genetically predisposed for “self-transcendence,” the feeling that there is something beyond ordinary experience. In other words, we’re hard-wired to believe in a higher power. In his research, Hamer noticed a correlation between personality survey data and different alleles of the gene VMAT2, which codes for an emotion-regulating brain chemical. In the course of human evolution, he suspects, this gene helped foster “an innate sense of optimism” that had adaptive benefits.

Since the NIH doesn’t sanction Hamer’s religion research, Maher interviewed Hamer at a lab at American University. During the interview, “[Maher] really kept on pushing me to say that science proves religion is wrong,” Hamer recalls. “And I kept on trying to push back and say, ‘Science proves that people have an innate desire for religion.'” The interview lasted about an hour and a half, Hamer tells us, yet only a two-second clip from their conversation made the final cut. The scene is sandwiched in the middle of an awkward chat between Maher and an “ex-gay” Christian pastor who denies that homosexuality is innate. Then, the camera cuts to Maher asking Hamer if he’s the guy who discovered the “gay gene.” Hamer says yes. (Before The God Gene, Hamer wrote about the “gay gene” in another book, The Science of Desire.)

Still, Hamer has no regrets about his moment on the big screen. “Overall I was happy because I was one of the few people in the entire film that [Maher] did not make fun of.”

Religulous was slightly more attentive to Andrew Newberg, the University of Pennsylvania neurologist known for his research on religious experience. In the film he and Maher walk and talk at New York City’s Grand Central Station. Most of their conversation is muted to make way for Maher’s voiceovers, but we do hear Newberg trying to tone Maher down a bit. “How we define what is crazy or not crazy about religions is ultimately up to how we define ‘crazy,'” Newberg explains. When he mentions his studies on people speaking in tongues, the conversation is cut short to make way for shots of Pentecostals looking crazed.

Using single-photon emission computed tomography (SPECT), Newberg and his colleagues have studied the differences between the normal brain states and peak experiences of meditating Buddhist monks and praying Christian nuns. Among both, they observed an increase in blood flow to regions responsible for thinking and planning. Meanwhile, activity decreased in the posterior superior parietal lobe, an area that affects how we orient ourselves in the world. In a sense, this research shows that what goes on in the brain mirrors how believers describe their own religious experiences: a heightened awareness of a different way of being in the world.

Although Newberg does not regret being in the film, he admits he’s disappointed that Maher didn’t take his findings more to heart. “I think it’s a little difficult to write off everybody who has ever been religious as being delusional or psychotic,” he says. “I don’t think the data really supports that.”

Bill Maher may have hoped that science — religion’s age-old enemy, as the common story goes — would vindicate his ruthless agnosticism. But as more researchers explore religiosity, the variety of perspectives and interpretations on human faith is growing more complex, not more black and white.

Cognitive scientist and Evangelical Christian Justin Barrett, for instance, sees no contradiction between studying religion and being religious. His widely cited research examines common patterns of supernatural beliefs across cultures in order to describe the innate mental processes that give rise to them. And neuroscientist Rhawn Joseph, who has self-published several philosophical books alongside a long list of scientific publications, goes even further to claim that “each and every human being is born with a brain and mind that serves as a transmitter to god.” But in order to keep the battle lines between believers and nonbelievers clear, Bill Maher’s Religulous chose to ignore, as Hamer puts it, “the basic human biology of why religion is important.”

Nathan Schneider
Seed

From a review of Susan Neiman’s “Moral Clarity: A Guide for Grown-Up Idealists”:

It is very hard to write well about ethics, and especially so in a way that engages and interests that elusive phantom of writers’ imaginations, the general reader…like its predecessor, “Moral Clarity” is a sustained defense of a particular set of values, and of a moral vocabulary that enables us to express them. Neiman sees these values as neglected or threatened all along the political spectrum. They received their strongest defenses in the moral thought of the Enlightenment, in David Hume and Adam Smith, but more particularly in Jean-Jacques Rousseau and Immanuel Kant. So the book is not only a moral polemic, but a powerful argument in support of the resources that these Enlightenment figures left us. Neiman, an American who is currently the director of the Einstein Forum in Berlin, boldly asserts that when Marxism, postmodernism, theory and fundamentalism challenge the Enlightenment they invariably come off second best. I agree, and I wish more people did so.

Neiman’s Enlightenment is not the hyperbolic ideology detected by some critics. It is not the unthinking worship of science, the materialistic, technological ideology that upset the Romantics and continues to upset their followers. It is not an unthinking confidence in the human capacity for knowledge, and still less in human perfectibility and unending progress. On the other hand, neither is it merely an expression of liberty, a resistance to unearned authority and the discovery of tolerance, which, she argues, provides too pallid an ideology to tempt people away from the superstitions and fundamentalisms that promise them more. It is rather an attitude encapsulated in four virtues: happiness, reason, reverence and hope. The moral clarity of her title is therefore not the ability to calculate answers to the practical conundrums that life sets us. It is rather the ability to see life in ways infused with these categories: to cherish happiness, to respect reason, to revere dignity and to hope for a better future.

It may seem surprising that we could need reminding of these things, but a foray into an airport bookstore, or a trip around any gallery of contemporary art, would show how far our culture would have to move before it gets back to being comfortable with them. To take just one significant example that Neiman highlights, the current value placed on being a “victim,” and the glorification of victims as heroes, should be seen as a denial of human freedom and dignity, a denial of happiness and a barrier against hope.

Although her philosophical heroes are associated with the secular character of the Enlightenment, Neiman is deeply respectful of religious traditions and religious writings, and rightly dismissive of the kind of brash atheism that confidently insists there is no good in them. On the other hand, following Plato, she does not see ethics as the distinct preserve of the faithful. Instead, she writes, “religion is rather a way of trying to give shape and structure to the moral concepts that are embedded in our lives.” Her most profound engagement with a religious text is with the Book of Job, the confrontation with natural evil and injustice that conditioned almost all the subsequent contortions of theology.

Philosophically, one of the deepest discussions in the book is Neiman’s appropriation of Kant’s doctrine of freedom. This is a notoriously treacherous area, but Neiman correctly aligns it with the human capacity for noticing or inventing (it does not necessarily matter which) possibilities for action. As well as whatever is the case, we have what might be the case, or what we could make come about, as well as what ought to be the case. Freedom, in the sphere of action, is therefore associated with a refusal to accept that what is the case limits and constrains our possibility for doing the other thing, surprising the psychologist, as it were. If the biological scientist comes along and tells us that we are all selfish, we do not need to conduct surveys and build laboratories to disprove it. We just need to remember that it is open to us to tip the waitress although we will never see her again, or to refuse to comply with the unjust demand to condemn the innocent who is accused of some crime, even if it would benefit us to agree. If the biological scientist says that it is against human nature to do these things, we have it in our hands to refute him on the spot. If on the other hand he retreats to saying that doing them is just a disguise for selfishness, first, it is not clear that he is doing science anymore, and second, we can properly reply that if so it is the disguise, and not our supposed true nature, that matters to the waitress or the innocent who is accused. Theories about how moral education works are not nearly as important as we tend to think, provided we can keep our confidence that such education can work. The problem with our contemporary “scientism” about human nature is that too often it half convinces us that it cannot, and thus, Neiman says, helps dissolve both reverence and hope.


Simon Blackburn
New York Times

I find cooking to be very calming, even in the rush of it all for business. I think about the combinations of ingredients and the anticipated delight of the diner. From my own philosophical experience, I try to be aware of the good, the true and the beautiful in all endeavors – including cooking. A lot to ask from an item to be consumed, but I hope that I’m paying attention. I want to be aware of that first sip of good tea, coffee or wine and note that I should pay attention because this is good. It likely seems like fuzzy philosophy, but being open to the ineffable is being open to delight. Or is that a tautology? In any case, I don’t see cooking as a vacation from philosophy, but the action can put me in a state of mind where thinking is clearer. In some ways, having the mise en place kind of discipline is very Kantian in that there is a great deal of freedom arising through the discipline. I can’t have chaos in the kitchen, and I clean as I go. I hope that because of that discipline I can make culinary ideological leaps as well… Although there is great freedom offered in Nietzsche and great process can be learned from Kant, neither would likely be much good in a kitchen – not to trivialize. It’s somewhere between the chaos and the control.

Karen Peters, by way of Elatia Harris
3 Quarks Daily


Vitruvian Man, by Leonardo da Vinci

In an early chapter of his interesting new book, Symmetry: A Journey Into the Patterns of Nature, Marcus du Sautoy describes a visit to the Alhambra, the great Moorish palace in Granada, Spain. He and his young son spend an afternoon identifying 14 different types of symmetry represented in paving patterns, ornamentation, and tile work. To the layman, the patterns may look simply like pretty forms, but to du Sautoy, who teaches mathematics at Oxford University, they are expressions of deep geometries that have their own names: gyrations, *333s, miracles, double miracles.

Du Sautoy’s book is about mathematics, but his excursion to the Alhambra is a reminder that symmetry has always been an important part of architecture. Symmetry appears in small things and large: Floor tiles may be laid in symmetrical patterns; the design of door paneling can be symmetrical, and so can window panes. In frontal symmetry, the left side of a building’s facade mirrors the right (the entrance usually being in the middle); in axial-plan symmetry, the rooms on one side of the axis are a mirror image of those on the other. If the women’s restroom is on one side, chances are the men’s is on the other. Sometimes not being symmetrical is important; the fronts and backs of buildings, for example, are intentionally different.

Symmetros is a Greek word, and ancient Greek architecture used symmetry as a basic organizing principle. As did Roman, Romanesque, and Renaissance. Indeed, it is hard to think of any architectural tradition, Western or non-Western, that does not include symmetry. Symmetry is something that Islamic mosques, Chinese pagodas, Hindu temples, Shinto shrines, and Gothic cathedrals have in common.

Architectural Modernism thumbed its nose at tradition and firmly avoided symmetry. Being symmetrical was considered as retrograde as being, well, decorated. All exemplary Modernist buildings celebrated asymmetry: The wings of Walter Gropius’ Bauhaus shoot off in different directions; the columns of Mies van der Rohe’s Barcelona Pavilion are symmetrical, but you can hardly tell, thanks to the randomly spaced walls; nothing in Frank Lloyd Wright’s pinwheeling Fallingwater mirrors anything else; and Le Corbusier’s Ronchamp dispenses with traditional church geometry altogether. The facades of Philip Johnson’s Glass House are rare instances of Modernist symmetry, although all the elements of the interior—kitchen counter, storage wall, and brick cylinder containing the bathroom—are carefully located off-center.

Yet some Modernist pioneers did eventually recognize the evocative power of symmetry. After 1950, for example, Mies’s designs are increasingly symmetrical, both in plan and elevation. The Seagram Building is rigidly axial in plan—and has a front and a back—just like McKim, Mead, and White’s Racquet and Tennis Club across the street. Louis Kahn is a late Modernist who eschewed all architectural traditions except one; he returned to the symmetry of his Beaux-Arts education in the planning of his buildings. Eero Saarinen’s Ingalls Rink at Yale is axially symmetrical, but then hockey, like basketball or football, is played within symmetrical bounds.

Yet today’s expressionist fashion demands architectural asymmetry at any cost. That’s a shame, since architects sacrifice one of their art’s most powerful tools (not all architects—Norman Foster and Renzo Piano often use symmetry to great effect). Without occasional symmetry, all those angles and squiggles start to look the same. The hyperactive geometry of Daniel Libeskind’s addition to the Denver Art Museum, for example, can quickly become tiresome. The fey asymmetry of SANAA’s much-heralded New Museum of Contemporary Art in New York loses its impact after several viewings. A welcome exception is Frank Gehry’s Walt Disney Concert Hall in Los Angeles. While the exterior and the lobby are whimsically composed in standard Gehry fashion, the hall itself, like most concert halls, is perfectly symmetrical about its longitudinal axis. I don’t know if this was done for acoustical reasons or because the architect recognized the inherent calmness that axial symmetry affords.

Why is architectural symmetry so satisfying? As Leonardo da Vinci’s famous drawing demonstrated, it reflects the human body, which has a right side and a left, a back and a front, the navel in the very center. Du Sautoy writes that the human mind seems constantly drawn to anything that embodies some aspect of symmetry. He observes that “[a]rtwork, architecture and music from ancient times to the present day play on the idea of things which mirror each other in interesting ways.” When we walk around a Baroque church, we experience many changing views, but when we walk down the main aisle—the line along which the mirror images of the left and right sides meet—we know that we are in a special relationship to our surroundings. And when we stand below the dome of the crossing, at the confluence of four symmetries, we know we have arrived.

Witold Rybczynski
Slate

The best criticism, as Adam Gopnik wrote in an appreciation of the poet and critic Randall Jarrell, should be “not a slot machine of judgment but a tone of voice, a style, the promise of a whole view of life in a few pregnant sentences”.

And people who worry about the present state of criticism tend to fall into the trap of regarding it as a public service. The health of the arts, they say, depends on a robust and vigorous culture of criticism. I sympathise with the view and occasionally feel flattered by it. But I think it inflates the role of critics. As Robert Hughes once said, practising criticism is “like being the piano player in a whorehouse; you don’t have any control over the action going on upstairs”.

In place of public edification, I believe criticism is better seen as a (potential) public pleasure. It sounds obvious, but a piece of criticism, in the first instance, has to be worth reading. A good column might be a leisurely, soft-pedalled essay hinging on subtle discriminations, an ecstatic love letter to some new discovery, or a fuming snort of disgust. What matters is that it is written with conviction, and that it opens the reader’s eyes to things about its subject that they may not have considered in quite those terms before.

“Art deserves to be met with more than silence,” says The Guardian’s critic Adrian Searle. Artworks, he continues, “accrue meanings and readings through the ways they are interpreted and discussed and compared with one another”. It’s in this process that the real stimulations of criticism are to be found.
In the end, let’s face it, criticism is an indulgence: one that matters a great deal to those who have had their worlds changed and amplified by reading great examples of it, but hardly at all to many others.

Contrary to those who believe journalistic criticism will struggle to survive in the internet age, however, I think people are actually going to want more and more of it. If you step back and survey the situation, it seems simple. In affluent societies, of which there are more in the world than ever before, the arts rise in stature, and as they do, people naturally want to discuss them.

Nothing has happened in the digital age to fundamentally affect this, except that people increasingly feel themselves to be drowning in arbitrary information and ill-informed punditry. So, will they react by switching off entirely? Or will they rather seek out, with increasing appetite, the writing that seems best and most enjoyable to read? I think the latter.

Critics rehearse in public what we all do all the time: we make judgments. It’s common these days to hear people say, “I’m not being judgmental” or “Who are you to judge me?” But making judgments is how we negotiate our way through the world, how we organise and sharpen our pleasures and carve out our identities.

One could even say that critics try to do, in a breezier and less committed way, what artists do by nature (and without the need to apologise). For at the heart of every creative act are a zillion tiny decisions — conscious and unconscious — about what to do, what not to do, and what simply won’t do. All are forms of criticism: “taking the knife of criticism to God’s carefully considered handiwork”, as John Updike put it. That’s why, when you ask good artists about their contemporaries, they will either choose not to comment or say things that make even the most savage critic look benign.

Good criticism (and I mean this as an expression of an ideal) should be risky, challenging, candid and vulnerable. It should be urbane one moment, gauchely heartfelt the next. It should kick against cant wherever it sees it, and cherish and applaud not only art but the impulse to make art, for that impulse, which comes out of life as it is lived, is the real mystery, and the source of everything that makes it wonderful.

Sebastian Smee
The Australian

Professional critics perform a role that, in most aspects, is impossible to defend. Where does one start? With the arrogance of setting oneself up as a public judge of other people’s creative endeavours? With the inevitable superficiality of one’s responses, as one lurches from one subject to the next? Or with one’s repeated failure to get the tone right, to find the right combination of sympathy and discrimination, enthusiasm and intolerance?

The psychodynamics of criticism are easy enough to nail down. Just as children attracted to the police force are, naturally, weaklings desperate to wield power and exact revenge, critics are bookish nerds with bullying instincts.

“Just doing the job,” we tell ourselves as we pontificate from the safety of small, book-lined studies in the suburbs where no one can disturb us, let alone take issue with us.

And, of course, we’re hobbled by jealousy. Don’t doubt it for a second: critics envy artists. Inside every critic is a painter, photographer or sculptor fantasising about the opening of their own sell-out show.

In light of this, no one should be surprised that critics are rumoured to be losing their clout. Entertainment has ousted serious writing about the arts in all but a handful of newspapers and magazines. Criticism has given way to profiles, interviews and all the vapid paraphernalia of publicity.
Marketing and PR, says the prevailing wisdom, have eclipsed the influence critics once had over the reception of books, films and exhibitions. And reviewing on television — the only medium that can hope to compete with the spin machine — has been reduced to “I liked it”, “I didn’t”, with star ratings attached. Meanwhile, blogs are supposedly diluting the power that well-known critics once had.

If all this is really happening, what is the loss to our culture? What use, really, is criticism?

The great British theatre critic Kenneth Tynan once described the critic as “a man who knows the way but can’t drive the car”. It’s a neat and typically brilliant formulation, but to my mind a little generous. Often critics don’t even know the way.

But perhaps this matters less than people think. There are two assumptions about critics I think we need to jettison if the good name of criticism (and I use the phrase with irony) is to be salvaged.
One is the assumption that critics need, as often as possible, to be right. “To be right,” the painter Franz Kline once said, “is the most terrific personal state that no one is interested in.” The other is that they need to educate and edify their readers.

Of course, rejecting the first assumption — the importance of being right — is dangerous, because it sounds suspiciously close to insisting that critics don’t need to make judgments. But that’s preposterous: of course we do. It’s part of our contract with the reader. Making a negative or positive judgment may not be the most interesting thing a good review does. But it remains fundamental. From it, most of the truly interesting and fun aspects of criticism arise.

Many critics — perhaps out of politeness or timidity — don’t seem to want to admit this. A study conducted by the National Arts Journalism Program at Columbia University in New York a few years ago came up with some sobering facts. It asked how much critics earn (most make less than $US25,000 a year from critical writing), who they are (most are over 45 and white, and about half are female), how many are also practising artists (44 per cent), and who their favourite artists are.

Most astonishing of all was that only 27 per cent of those surveyed said they placed an emphasis on forming and expressing judgments. Of the five aspects of reviewing queried in the survey, making judgments ranked last.

So what exactly do critics think their job entails, if not criticism (which, in case you suddenly doubted it, is the judging of merits, faults, value and truth)? The answer is education. Art critics believe their job is primarily to educate their readers about art. An extraordinary 91 per cent of those surveyed by the Columbia program said their role was not just to inform their readers but educate them.

“The goal sounds benign,” as Christopher Knight noted in the Los Angeles Times at the time, “but its courtly arrogance is actually astounding. When a writer begins with the presumption that the reader is uneducated about the subject — or at least not as well educated as he — be prepared to be bored silly by what is written. Worse, a creeping tone of superciliousness is almost impossible to escape.”
Those who are made nervous by the business of expressing judgments often express the belief that criticism should be about contextualising. In other words, rather than merely telling readers whether Dirty Sexy Money is worth watching, critics should be explaining what the show means, what it says about our culture right now.

Again, this sort of thing is fine in theory. But in my opinion wisdom of the where-we’re-all-at kind is overrated and usually unreliable. Teenagers and merchant bankers are more savvy about what’s really going on in society than people who read books and go to art galleries. They have to be; for them, it’s a question of survival.

I’m not suggesting that critics should offer opinions and nothing else. Facts, too, are important. It’s fun to find out what Titian’s friends thought of him, or what Damien Hirst gets up to in the commercial sphere, or that Mogul artists obtained yellow from the urine of cows fed on mangoes.

But critics need to police their tone when imparting facts. If they affect the tone of a professional lecturer — or, just as bad, a street-smart stylist — they are asking for trouble.

Sebastian Smee
The Australian

There are few subjects more timely than the one tackled by Susan Jacoby in her new book, “The Age of American Unreason,” in which she asserts that “America is now ill with a powerful mutant strain of intertwined ignorance, anti-rationalism and anti-intellectualism.”

For more than a decade there have been growing symptoms of this affliction, from fundamentalist assaults on the teaching of evolution to the Bush administration’s willful disavowal of expert opinion on global warming and strategies for prosecuting the war in Iraq. Conservatives have turned the term “intellectual,” like the term “liberal,” into a dirty word in politics (even though neo-conservative intellectuals played a formative role in making the case for war against Iraq); policy positions tend to get less attention than personality and tactics in the current presidential campaign; and the democratizing influence of the Internet is working to banish expertise altogether, making everyone an authority on everything. Traditional policy channels involving careful analysis and debate have been circumvented by the Bush White House in favor of bold, gut-level calls, and reasoned public discussions have increasingly given way to noisy partisan warfare among politicians, commentators and bloggers alike…

As Ms. Jacoby sees it, there are several key reasons for “the resurgent American anti-intellectualism of the past 20 years.” To begin with, television, video games and the Internet have created a “culture of distraction” that has shortened attention spans and left people with “less time and desire” for “two human activities critical to a fruitful and demanding intellectual life: reading and conversation.”

The eclipse of print culture by video culture began in the 1960s, Ms. Jacoby argues, adding that the ascendance of youth culture in that decade also promoted an attitude denigrating the importance of tradition, history and knowledge.

By the ’80s, she goes on, self-education was giving way to self-improvement, core curriculums were giving way to classes intended to boost self-esteem, and old-fashioned striving after achievement was giving way to a rabid pursuit of celebrity and fame. The old middlebrow culture, which prized information and aspiration — and which manifested itself, during the post-World War II years, in a growing number of museums and symphony orchestras, and a Book-of-the-Month club avidity for reading — was replaced by a mass culture that revolved around television and blockbuster movies and rock music.

It was also in the ’60s, Ms. Jacoby writes, that a resurgent fundamentalism “received a jolt of adrenaline from both the civil rights laws” in the early years of that decade and the later “cultural rebellions.” She succinctly records the long history of fundamentalism in America, arguing that poorly educated settlers on the frontier were drawn to religious creeds that provided emotional comfort without intellectual demands, just as “the American experiment in complete religious liberty led large numbers of Americans to embrace anti-rational, anti-intellectual forms of faith.”

Michiko Kakutani
New York Times

From an interview with art critic Matthew Collings:


You could have said 50 years ago that the equivalent people in charge of modern and contemporary art packaged it for the masses because they thought it was good for them, or it would save society, or it was against fascism, or something. But now they don’t even pretend it’s out of decent motivations. It’s just for commercial reasons. In any case, I don’t care about any of that. But as I said, I only think those types of things when I’m being extreme.

The fact is, I am interested in what the grain is — the grain of contemporary art. But I don’t think that to be involved with that, you have to be involved in a zombie way. I think you can be involved in an intelligent way, and that might mean being sceptical. It might mean thinking against the grain. But that’s only because you’re thinking about the bigger picture… The equivalent in our time is young British art (the yBas), full of nihilism, satire, surrealism and decadence. That stuff can be pretty good, and I am sometimes interested in it. But again I would feel like I was suffocating if I thought that was all art could be. And because this art is so popular it’s like there’s no air. We’ve got to hear all this mind-destroying stuff all the time about the very narrow issues and concerns of this art, and of the art of the recent past, like Warhol and Bruce Nauman, and so on, that’s supposed to have begun it all. So when I’m gooning on the TV in front of the Turner Prize, and ironically indicating a bit of disapproval, while seeming to be blindly following the agenda; and then in interviews like this actually being quite explicitly aggressive toward the contemporary scene; it’s just to let in some air.

I don’t really mean that I hate those artists or even those moronic zombie curators, with their ghastly pc homily ideas. I went to art school to be an artist. For one reason or another I fell into this journalistic world. But I thought I was just explaining stories about what I knew to be the codes of the art world. I didn’t necessarily agree with the codes, I just felt I could describe them, because I knew them well. I never had the remotest interest in making this contemporary art scene that we now have, which as everybody knows is mostly just crap, accessible to an audience who has no real interest in it anyway… But I now find myself to be this person who meets strangers in the street who say ‘I really liked your programme’ — about art I actually might not have much interest in – and they say: ‘And it really opened my eyes to it!’ It’s rather moving to be praised like that, or acknowledged, or whatever, but it’s confusing. I don’t revere the art world, or at least certainly not the contemporary art world. But I learned to think in an art context. Art school was my higher education. So all that is an explanation of what I do.

3:AM Magazine


Bruce Nauman describes his place in art this way:

“I think that it’s not knowing what’s coming or what art is supposed to be or how you’re supposed to go about being an artist that keeps it interesting. It’s going into the studio and finding out what seems to be available or not. It’s almost, in a sense, a philosophical kind of quest, but on the other hand the reason I became an artist was because I like to make things. Sometimes they help each other out, and sometimes they get in each other’s way.”

Philadelphia Inquirer