For many, graffiti carries the stigma of crime, violence, and decay. But for one artist and activist, graffiti is just the opposite. Favianna Rodriguez is the co-founder of the East Side Arts Alliance in Oakland, California, a cultural center dedicated to providing an artistic haven for the local community. Through music, dance and art, ESAA preserves a creative environment wherein all locals can advocate for positive social change. One program in particular, Visual Element, is a graffiti art program provided for high school students, where they can practice their art on legally commissioned walls. A staunch political artist in her own right, Rodriguez has exhibited her work all over the world. And now, with food and environmental justice as her focus, Rodriguez maintains that we may not all be artists, but everything we do has intrinsic political power. And that power means change.
» HalogenLife: Tell me how East Side Arts Alliance got started.
» Rodriguez: In 2000 I was approached by a few other activists who were artists and culture workers. They asked me, ‘Do you want to start a cultural center?’ The East Side was a really inter-generational community. There was La Peña in Berkeley, but at that point there was no place for young people, for art, and for the stuff we found relevant. Graffiti, jazz, black music: those are the things that honor the diversity in Oakland. There are over a hundred languages spoken, especially in the neighborhood I live in. And in creating ESAA, it was about finding out how art can play a role in the economic development of the area. Why do we need art? We need cops and teachers, right? But we’re looking at how, when artists come into cities, they establish cultural zones. Gentrification is also a part of that. We [the ESAA] became a priority for the San Antonio neighborhood. And we’d also go to the state for support. We were evicted at one point for our views against police brutality. We had office spaces then, and it’s taken us seven years to get a building. There are sixteen units of affordable housing, a sound booth, a library, a performance space, and spaces for Visual Element.
» HalogenLife: What’s Visual Element?
» Rodriguez: The best way to meet a young person is to meet them at their level, so that’s why we chose graffiti as a basis. So we thought, what’s the political history of graffiti? Its origins are in hip-hop. And also, what’s the history of muralism? It goes back to the tradition of Mexican muralists. Visual Element is about combining those two. Visual Element is how young people can collectively get together, doing large-scale productions at art jams and in schools, and have a unified message around gentrification or stopping a war.
Virginette Acacio
HalogenLife
It says something about the phenomenon analyzed in David Crystal’s new book Txtng: The Gr8 Db8 (Oxford University Press) that the very title will tend to divide readers into two camps. One will be amused. The other will be disgusted.
The visceral reaction is more interesting to think about, in some ways – for disgust suggests that some boundary has been breached, some norm transgressed. We hear fewer warnings than we once did about texting and instant messaging – how they are destroying the English language, turning young people into semi-literate barbarians, and otherwise hastening the decline of civilization. But that doesn’t mean the sentiment itself is gone. It’s just difficult to come up with new ways to express curmudgeonliness.
And anyway, the cause is lost. According to one estimate cited by Crystal, some 158 billion text messages were sent in the United States in 2006 – almost twice as many as the previous year. As of the middle of this decade, roughly one trillion such messages per year are being sent worldwide. Text messaging has emerged as a growth industry, at one point generating more than three times the revenue of all Hollywood box office receipts. That rate of expansion is bound to decline. But it has established an array of abbreviations, contractions, emoticons and orthographic mutations – all made useful, if not inescapable, by the need to stay succinct while texting, given the size of the screen. IMHO. omg! LOL!
It can’t be helped. And as for Crystal – an honorary professor of linguistics at the University of Wales at Bangor and editor of the Cambridge Encyclopedia of the English Language, among other works – he does not complain. In his latest book, he makes the argument that the idiolect of texting is not just a response to the limitations of the medium but the product of basic, ordinary processes found in other forms of communication.
The most obvious case is initialism – with AWOL, ASAP, and SNAFU, for example, having long since become so commonplace that they practically replace the phrases they condense. E-mail revitalized the practice with expressions such as IMHO (“in my humble opinion”) and ROTFL (“rolling on the floor laughing”). Since then, texting and instant-messaging have turned initialism into a kind of competitive sport – with someone coining ROTFLMAOWTIME (“rolling on the floor laughing my ass off with tears in my eyes”).
Also familiar from pre-digital times is the habit of shortening words. Crystal cites a dictionary of common abbreviations from 1942 listing such text-message-like usages as amt (amount), agn (again), and wd (would). Mashing together letters and numbers to create phonetic shorthand (“before” as b4) is an example of the logogram, related to conjunctions of characters found in languages such as Chinese. It is also akin to the old puzzle form known as the rebus.
Beyond its utilitarian value of permitting users to say as much as they can in as few keystrokes as possible (which also means saving money), the language of texting is a manifestation of “the human ludic temperament,” as Crystal puts it. That is, it is a form of play: something closely associated with the process of learning to use language itself. Pace the alarms occasionally raised about how texting undermines literacy, Crystal cites recent studies showing that pre-teen students who text had standard language skills equal to or better than those of non-texters.
“Teenage texters are not stupid,” says Crystal. But what they lack is a sense of “the consequences of what they are doing, in the eyes of society as a whole…. They need to know when textisms are effective and when they are not. They need to appreciate the range of social reactions which texting, in its various forms, can elicit. This knowledge is slowly acquired from parents, peers, text etiquette websites, and (in the narrow sense) teachers. Teenagers have to learn to manage this new behavior, as indeed do we all. For one thing is certain: texting is not going to go away in the foreseeable future.”
It is also, at this point, a cross-cultural phenomenon. Crystal includes a set of tables showing the textisms used in a dozen languages. Chances are this will not be the last book on the subject by a linguist. As long as none of them is actually written in txt-ese, I guess I can live with that thought.
Scott McLemee
Inside Higher Ed
Everyone has been talking about an article in The Atlantic magazine called “Is Google Making Us Stupid?” Some subset of that group has actually read the 4,175-word article, by Nicholas Carr.
To save you some time, I was going to give you a 100-word abridged version. But there are just too many distractions to read that much. So here is the 140-character Twitter version (Twitter is a hyperspeed form of blogging in which you write about your life in bursts of 140 characters or fewer, including spaces and punctuation marks):
Google makes deep reading impossible. Media changes. Our brains’ wiring changes too. Computers think for us, flattening our intelligence.
If you managed to wade through that, maybe you are thinking that Twitter, not Google, is the enemy of human intellectual progress.
With Twitter, people subscribe to your “tweets.” Those who can make life’s mundane details interesting garner a large audience. Several services have been created to compete with Twitter. Others have been started to help people manage the prodigious flow of information from Twitterers.
There is even a version, Yammer, for use inside companies. You follow the word bursts of particular employees. (“In the weekly staff meeting. Good bagels. Why is everyone wearing khakis? All staff must file their T.P.S. reports on time, O.K.?”) As if there weren’t already enough to distract us in the workplace between meetings, phone calls, instant messages, e-mail messages and those Google searches.
If people question the benefit of Google, which has largely liberated us from the time-wasting activities associated with finding information, there is outright hostility to a tool that condenses our lives into haiku. The co-founder of Twitter, Jack Dorsey, was asked by M.I.T.’s Technology Review magazine — in a tweet, of course — why when people who aren’t familiar with Twitter are told about it, they are “uncomprehending or angry.” His response was brief and unsatisfying: “People have to discover value for themselves. Especially w/ something as simple & subtle as Twitter. It’s what you make of it.”
It is hard to think of a technology that wasn’t feared when it was introduced. In his Atlantic article, Mr. Carr says that Socrates feared the impact that writing would have on man’s ability to think. The advent of the printing press summoned similar fears. It wouldn’t be the last time.
When Hewlett-Packard invented the HP-35, the first hand-held scientific calculator, in 1972, the device was banned from some engineering classrooms. Professors feared that engineers would use it as a crutch, that they would no longer understand the relationships that either penciled calculations or a slide rule somehow provided for proficient scientific thought.
But the HP-35 hardly stultified engineering skills. Instead, in the last 36 years those engineers have brought us iPods, cellphones, high-definition TV and, yes, Google and Twitter. It freed engineers from wasting time on mundane tasks so they could spend more time creating.
Many technological advances have that effect. Take tax software, for instance. The tedious job of filing a tax return no longer requires several evenings, but just a few hours. It gives us time for more productive activities.
But for all the new technologies that increase our productivity, there are others that demand more of our time. That is one of the dialectics of our era. With its maps and Internet access, the iPhone saves us time; with its downloadable games, we also carry a game machine in our pocket. The proportion of time-wasters to time-savers may only grow. In a knowledge-based society in which knowledge is free, attention becomes the valued commodity. Companies compete for eyeballs, that great metric born in the dot-com boom, and vie to create media that are sticky, another great term from this era. We are not paid for our attention span, but rewarded for it with yet more distractions and demands on our time.
The pessimistic assumption that new technologies will somehow make our lives worse may be a function of occupation or training. Paul Saffo, the futurist, says he could divide the technology world into two kinds of people: engineers and natural scientists. He says the world outlook of the engineer is by nature optimistic. Every problem can be solved if you have the right tools and enough time and you pose the correct questions. Other people, who can be just as scientific, see the natural order of the world in terms of entropy, decline and death.
Those people aren’t necessarily wrong. But the engineer’s point of view puts trust in human improvement. Certainly there have been moments when that thinking has gone horribly awry — atonal music or molecular gastronomy. But over the course of human history, writing, printing, computing and Googling have only made it easier to think and communicate.
Damon Darlin
New York Times
Although there are many anecdotal stories of breakthroughs resulting from daydreams – Einstein, for instance, was notorious for his wandering mind – daydreaming itself is usually cast in a negative light. Children in school are encouraged to stop daydreaming and “focus,” and wandering minds are often cited as a leading cause of traffic accidents. In a culture obsessed with efficiency, daydreaming is derided as a lazy habit or a lack of discipline, the kind of thinking we rely on when we don’t really want to think. It’s a sign of procrastination, not productivity, something to be put away with your flip-flops and hammock as summer draws to a close.
In recent years, however, scientists have begun to see the act of daydreaming very differently. They’ve demonstrated that daydreaming is a fundamental feature of the human mind – so fundamental, in fact, that it’s often referred to as our “default” mode of thought. Many scientists argue that daydreaming is a crucial tool for creativity, a thought process that allows the brain to make new associations and connections. Instead of focusing on our immediate surroundings – such as the message of a church sermon – the daydreaming mind is free to engage in abstract thought and imaginative ramblings. As a result, we’re able to imagine things that don’t actually exist, like sticky yellow bookmarks.
“If your mind didn’t wander, then you’d be largely shackled to whatever you are doing right now,” says Jonathan Schooler, a psychologist at the University of California, Santa Barbara. “But instead you can engage in mental time travel and other kinds of simulation. During a daydream, your thoughts are really unbounded.”
The ability to think abstractly that flourishes during daydreams also has important social benefits. Mostly, what we daydream about is each other, as the mind retrieves memories, contemplates “what if” scenarios, and thinks about how it should behave in the future. In this sense, the content of daydreams often resembles a soap opera, with people reflecting on social interactions both real and make-believe. We can leave behind the world as it is and start imagining the world as it might be, if only we hadn’t lost our temper, or had superpowers, or were sipping a daiquiri on a Caribbean beach. It is this ability to tune out the present moment and contemplate the make-believe that separates the human mind from every other.
“Daydreaming builds on this fundamental capacity people have for being able to project themselves into imaginary situations, like the future,” Malia Mason, a neuroscientist at Columbia, says. “Without that skill, we’d be pretty limited creatures.”
Jonah Lehrer
Boston Globe
You could say that perfectionism is a crime against humanity. Adaptability is the characteristic that enables the species to survive—and if there’s one thing perfectionism does, it rigidifies behavior. It constricts people just when the fast-moving world requires more flexibility and comfort with ambiguity than ever. It turns people into success slaves.
Perfectionists, experts now know, are made and not born, commonly at an early age. They also know that perfectionism is increasing. One reason: Pressure on children to achieve is rampant, because parents now seek much of their status from the performance of their kids. And, by itself, pressure to achieve is perceived by kids as criticism for mistakes; criticism turns out to be implicit in it.
Perfectionism, too, is a form of parental control, and parental control of offspring is greater than ever in the new economy and global marketplace, realities that are deeply unsettling to today’s adults.
“I don’t understand it,” one bewildered student told me, speaking for the five others seated around the table during lunch at a small residential college in the Northeast. “My parents were perfectly happy to get Bs and Cs when they were in college. But they expect me to get As.” The others nodded in agreement. Today’s hothouse parents are not only over-involved in their children’s lives, they demand perfection from them in school.
And if ever there was a blueprint for breeding psychological distress, that’s it. Perfectionism seeps into the psyche and creates a pervasive personality style. It keeps people from engaging in challenging experiences; they don’t get to discover what they truly like or to create their own identities. Perfectionism reduces playfulness and the assimilation of knowledge; if you’re always focused on your own performance and on defending yourself, you can’t focus on learning a task. Here’s the cosmic thigh-slapper: Because it lowers the ability to take risks, perfectionism reduces creativity and innovation—exactly what’s not adaptive in the global marketplace…
Perfectionists fear that if they give up perfectionism, they won’t be good anymore at anything; they’ll fall apart. In fact, perfectionism harms performance more than it helps. The worst thing about it, says Randy Frost, is the belief that self-worth is contingent on performance—that if you don’t do well, you’re worthless. It’s possible to escape that thinking.
· First, watch a movie or a sunset or engage in some activity not affected by your perfectionistic strivings. Pay attention to how much pleasure you get from it.
· Then engage in some activity—say, tennis—that is subject to your perfectionism. How much pleasure do you get from it?
· Ask yourself: So I miss a shot, what does it mean for my self-worth?
· Apply that same insight to all other activities: Is this perfectionistic orientation worth it for this task?
· Now you actually need to experiment with a different way of evaluating yourself and your performance. So deliberately make a mistake; miss a shot in tennis.
· Ask yourself: Does your opponent think less of you? Do observers think less of you? If your opponent makes a mistake, do you think less of him?
· Play tennis and concentrate only on the motion of your body. Did you enjoy that set more?
· Understand the nature of mistakes. They’re something we learn from—more than from our successes.
· Look upon failure as information, not a fixed or frozen outcome. It’s a signal to try something else—another chance to learn.
Hara Estroff Marano
Psychology Today
During the recent Association of Arts Administration Educators conference here in Madison, the increasing proficiency and professionalism around our collective conversation were both a source of pride and a cause for pause. As a field of educators, researching and teaching cultural management and leadership, we’re clearly growing in reflection, connections, and success. But what if we’re doing so at a time when the profession, as we’ve defined it, is changing rapidly? What if we’re all getting increasingly proficient at a decreasingly relevant part of the ecosystem?
Consider, for example, the three-word phrase that often crops up at such conferences: “professional arts organization.” This phrase captures, in shorthand, the specific category of cultural endeavor we tend to be discussing. Professional arts organizations require professional management, aesthetic integrity, curatorial control, and stable but responsive structures to hold them together while moving their mission forward. These are the standards that drive our teaching and learning about the field.
But each of those three words — “professional,” “arts,” and “organization” — is in radical flux at the moment. That suggests that a phrase (and an assumption) combining all three could mean less and less in shorthand form.
This concern may come from my current reading matter, Clay Shirky’s new book Here Comes Everybody, about the increasing opportunities for collective action without traditional organizational structures — think Flickr or Wikipedia or iStockPhoto. But there’s something rumbling in the world that questions our basic assumptions about arts and cultural management. Let’s take a look at each word in the phrase, in reverse order:
· Organization
The formal organization (social, commercial, political, etc.) evolved in response to a set of structural barriers to collective action. Work that required more than one or a few people to complete — highway systems, national defense, mass-produced goods, save-the-spotted-owl initiatives, performing arts touring networks, museums — created large problems of coordination, alignment of resources (enough money in one place under one decision system), and high transaction costs (everyone having to agree every time…exhausting). The organization resolved these challenges through formalized decision structures, consolidated resources, and persistent identity (for example, a corporation lives separately from its founders, and is endowed with many/most of the rights of an individual). There was a cost to this structure, to be sure. A significant portion of any organization’s energy is consumed by self-maintenance rather than delivering on its purpose. Since the option was to not do the thing at all, we figured the costs were acceptable and necessary.
With the evolution of digital communications networks and software, however, many of the original challenges that required an organization are gone or significantly reduced. Collective action is increasingly available to distributed groups who don’t even know each other by name, and may convene around a cause only to disperse thereafter. The cost of production and distribution has dropped to almost zero for many goods and services. Organizations are still necessary and essential parts of the mix, but they’re not the only (or even the optimal) solution to every question, as they once were.
· Arts
There’s little need to go on about this particular word, which we all would agree is a fast-moving, increasingly amorphous creature. When we talk about “arts” in the context of “arts management” or “arts organizations,” we still generally mean predominantly Western forms of expression, with an assumed emphasis on technical or aesthetic excellence. We don’t always mean this, of course. But if you nudge most conversations by professionals, you’ll find this assumption just beneath the surface. Evidence comes from the fact that we still add qualifiers to the word when we mean something other than the above: “community arts,” “amateur arts.”
· Professional
Specialized organizations in specialized industries require specialized professionals — trained in the task by formal process or apprenticeship. Professionals earn the term when they are paid for their specialized work and when the nature and frame of their efforts are defined and evaluated by their peers rather than by their customers. Professional writers define what professional writers do. Professional doctors and realtors define the parameters and certifications for their peers.
But, again, what happens to the word “professional” when works of comparable quality and skill can be conceived, produced, and distributed without expensive or centralized means of production? Flickr has millions of exceptional images, many shot by individuals with no formal training, expecting no pay, and unfiltered by a traditional gatekeeper (curator, publisher, agent).
Says Shirky:
When reproduction, distribution, and categorization were all difficult, as they were for the last five hundred years, we needed professionals to undertake those jobs, and we properly venerated those people for the service they performed. Now those tasks are simpler, and the earlier roles have in many cases become optional, and are sometimes obstacles to direct access, often putting the providers of the older service at odds with their erstwhile patrons.
So, am I suggesting that we abandon our foundational phrase “professional arts organization”? Of course not. As long as there are complex processes, specialized physical requirements of expression (theaters, museums, even on-line forums), and a recognition of the value of extraordinary skill, vision, and voice, we will need organizations, professionals, and filtering systems to find, foster, and connect expressive works to the world.
But we may want to recalibrate our underlying assumptions as an industry (and as educators who hope to advance that industry and its goals) about the specific role of what we now call “professional arts organizations.” These are a subset of a massive ecology available to us to achieve our larger purpose. If we stick too rigidly to our terms, we may become obstacles to the missions we claim to have.
Andrew Taylor
The Artful Manager
The following comment by Dary appeared on Taylor’s posting and is a worthwhile continuation of the argument:
I actually just saw this guy speak at a… ahem… super-dorky “Web 2.0” Conference in San Francisco. He was really, really engaging and had some pretty cool viewpoints. One of his hypotheses is that our society as a whole is coming out of an age of collective intellectual inebriation, much like society did prior to the Industrial Revolution. He told a story about how rampant gin was in 18th-century England – to the point where there were gin pushcarts like our current-day ice cream carts – and how society as a whole was just drunk and lazy for decades. And then it went out of fashion, people started doing stuff, and we got the Industrial Revolution.
He makes the analogy of that gin-soaked drunkenness to the TV-soaked stupor of the past 50 years or so. He says now people are watching less television (which I haven’t checked the numbers on) and are spending more time applying actual brain power to such things as updating Wikipedia articles, tagging sites on del.icio.us and ma.gnolia, writing blogs, and twittering (brain power optional on that one).
His views are, of course, open to debate, and there are some intriguing counter-arguments to the seemingly pristine virtues of collective intelligence.
Anyway, in terms of how Shirky’s theories and the new communal web apply to Professional Arts Organizations, I’m not exactly sure what you’re getting at. With “Organizations,” the web makes it easier to schedule things and get in touch with people. Of course. You don’t really redefine anything with “Arts” in terms of this new landscape except to touch on the fact that Professionals think Amateurs are lame. And with “Professional,” you argue Web 2.0 makes it easier for non-professional artists to have their material discovered? Yes, of course, again. I dunno.
What’s more interesting to me is how a larger pool of available pieces of media changes society’s collective agreement on what is worthwhile and valuable in the arts and in general. Colbert jokes about “truthiness”, but it’s actually a valid point of philosophical debate within this new worldwide, social move to open up human knowledge. It’s especially pertinent to music, I think, not just in terms of what a society consumes, but how they consume it. And I go back-and-forth between whether these new aspects are wonderful and free or troubling and insulting.
Ask someone how many concerts they’ve been to vs. how many YouTube videos of concerts/pieces they’ve watched in the past year – my ratio is deplorable! And the idea that, because it’s now easy to create music – for $500 you can build a moderately decent home studio and make recordings of moderately decent quality – professionals aren’t as necessary anymore is worrisome.
It’s all happened so fast I don’t think people in general have really stopped to think about what this means for our society’s appreciation of the arts and value system for judging works.
So I’m thinking out loud, but clearly this is a contentious point for me. Thoughts?
The best criticism, as Adam Gopnik wrote in an appreciation of the poet and critic Randall Jarrell, should be “not a slot machine of judgment but a tone of voice, a style, the promise of a whole view of life in a few pregnant sentences”.
And people who worry about the present state of criticism tend to fall into the trap of regarding it as a public service. The health of the arts, they say, depends on a robust and vigorous culture of criticism. I sympathise with the view and occasionally feel flattered by it. But I think it inflates the role of critics. As Robert Hughes once said, practising criticism is “like being the piano player in a whorehouse; you don’t have any control over the action going on upstairs”.
In place of public edification, I believe criticism is better seen as a (potential) public pleasure. It sounds obvious, but a piece of criticism, in the first instance, has to be worth reading. A good column might be a leisurely, soft-pedalled essay hinging on subtle discriminations, an ecstatic love letter to some new discovery, or a fuming snort of disgust. What matters is that it is written with conviction, and that it opens the reader’s eyes to things about its subject that they may not have considered in quite those terms before.
“Art deserves to be met with more than silence,” says The Guardian’s critic Adrian Searle. Artworks, he continues, “accrue meanings and readings through the ways they are interpreted and discussed and compared with one another”. It’s in this process that the real stimulations of criticism are to be found.
In the end, let’s face it, criticism is an indulgence: one that matters a great deal to those who have had their worlds changed and amplified by reading great examples of it, but hardly at all to many others.
Contrary to those who believe journalistic criticism will struggle to survive in the internet age, however, I think people are actually going to want more and more of it. If you step back and survey the situation, it seems simple. In affluent societies, of which there are more in the world than ever before, the arts rise in stature, and as they do, people naturally want to discuss them.
Nothing has happened in the digital age to fundamentally affect this, except that people increasingly feel themselves to be drowning in arbitrary information and ill-informed punditry. So, will they react by switching off entirely? Or will they rather seek out, with increasing appetite, the writing that seems best and most enjoyable to read? I think the latter.
Critics rehearse in public what we all do all the time: we make judgments. It’s common these days to hear people say, “I’m not being judgmental” or “Who are you to judge me?” But making judgments is how we negotiate our way through the world, how we organise and sharpen our pleasures and carve out our identities.
One could even say that critics try to do, in a breezier and less committed way, what artists do by nature (and without the need to apologise). For at the heart of every creative act are a zillion tiny decisions — conscious and unconscious — about what to do, what not to do, and what simply won’t do. All are forms of criticism: “taking the knife of criticism to God’s carefully considered handiwork”, as John Updike put it. That’s why, when you ask good artists about their contemporaries, they will either choose not to comment or say things that make even the most savage critic look benign.
Good criticism (and I mean this as an expression of an ideal) should be risky, challenging, candid and vulnerable. It should be urbane one moment, gauchely heartfelt the next. It should kick against cant wherever it sees it, and cherish and applaud not only art but the impulse to make art, for that impulse, which comes out of life as it is lived, is the real mystery, and the source of everything that makes it wonderful.
Sebastian Smee
The Australian
Professional critics perform a role that, in most aspects, is impossible to defend. Where does one start? With the arrogance of setting oneself up as a public judge of other people’s creative endeavours? With the inevitable superficiality of one’s responses, as one lurches from one subject to the next? Or with one’s repeated failure to get the tone right, to find the right combination of sympathy and discrimination, enthusiasm and intolerance?
The psychodynamics of criticism are easy enough to nail down. Just as children attracted to the police force are, naturally, weaklings desperate to wield power and exact revenge, critics are bookish nerds with bullying instincts.
“Just doing the job,” we tell ourselves as we pontificate from the safety of small, book-lined studies in the suburbs where no one can disturb us, let alone take issue with us.
And, of course, we’re hobbled by jealousy. Don’t doubt it for a second: critics envy artists. Inside every critic is a painter, photographer or sculptor fantasising about the opening of their own sell-out show.
In light of this, no one should be surprised that critics are rumoured to be losing their clout. Entertainment has ousted serious writing about the arts in all but a handful of newspapers and magazines. Criticism has given way to profiles, interviews and all the vapid paraphernalia of publicity.
Marketing and PR, says the prevailing wisdom, have eclipsed the influence critics once had over the reception of books, films and exhibitions. And reviewing on television — the only medium that can hope to compete with the spin machine — has been reduced to “I liked it”, “I didn’t”, with star ratings attached. Meanwhile, blogs are supposedly diluting the power that well-known critics once had.
If all this is really happening, what is the loss to our culture? What use, really, is criticism?
The great British theatre critic Kenneth Tynan once described the critic as “a man who knows the way but can’t drive the car”. It’s a neat and typically brilliant formulation, but to my mind a little generous. Often critics don’t even know the way.
But perhaps this matters less than people think. There are two assumptions about critics I think we need to jettison if the good name of criticism (and I use the phrase with irony) is to be salvaged.
One is the assumption that critics need, as often as possible, to be right. “To be right,” the painter Franz Kline once said, “is the most terrific personal state that no one is interested in.” The other is that they need to educate and edify their readers.
Of course, rejecting the first assumption — the importance of being right — is dangerous, because it sounds suspiciously close to insisting that critics don’t need to make judgments. But that’s preposterous: of course we do. It’s part of our contract with the reader. Making a negative or positive judgment may not be the most interesting thing a good review does. But it remains fundamental. From it, most of the truly interesting and fun aspects of criticism arise.
Many critics — perhaps out of politeness or timidity — don’t seem to want to admit this. A study conducted by the national arts journalism program at Columbia University in New York a few years ago came up with some sobering facts. It asked how much critics earn (most make less than $US25,000 a year from critical writing), who they are (most are over 45 and white, and about half are female), how many are also practising artists (44 per cent), and who their favourite artists are.
Most astonishing of all was that only 27 per cent of those surveyed said they placed an emphasis on forming and expressing judgments. Of the five aspects of reviewing queried in the survey, making judgments ranked last.
So what exactly do critics think their job entails, if not criticism (which, in case you suddenly doubted it, is the judging of merits, faults, value and truth)? The answer is education. Art critics believe their job is primarily to educate their readers about art. An extraordinary 91 per cent of those surveyed by the Columbia program said their role was not just to inform their readers but educate them.
“The goal sounds benign,” as Christopher Knight noted in the Los Angeles Times at the time, “but its courtly arrogance is actually astounding. When a writer begins with the presumption that the reader is uneducated about the subject — or at least not as well educated as he — be prepared to be bored silly by what is written. Worse, a creeping tone of superciliousness is almost impossible to escape.”
Those who are made nervous by the business of expressing judgments often express the belief that criticism should be about contextualising. In other words, rather than merely telling readers whether Dirty Sexy Money is worth watching, critics should be explaining what the show means, what it says about our culture right now.
Again, this sort of thing is fine in theory. But in my opinion wisdom of the where-we’re-all-at kind is overrated and usually unreliable. Teenagers and merchant bankers are more savvy about what’s really going on in society than people who read books and go to art galleries. They have to be; for them, it’s a question of survival.
I’m not suggesting that critics should offer opinions and nothing else. Facts, too, are important. It’s fun to find out what Titian’s friends thought of him, or what Damien Hirst gets up to in the commercial sphere, or that Mogul artists obtained yellow from the urine of cows fed on mangoes.
But critics need to police their tone when imparting facts. If they affect the tone of a professional lecturer — or, just as bad, a street-smart stylist — they are asking for trouble.
Sebastian Smee
The Australian
It would seem that, in some ways, the Carnegie International markets itself. It is, after all, the only regular exhibition in North America of contemporary artists’ work from all over the world. And it has been around for more than a century.
But this year’s show, which opens May 3 and is the 55th since Andrew Carnegie held the first one in 1896, has a few extras that led the Carnegie Museum of Art to work a little harder to attract audiences. It’s longer than past shows, running nearly nine months compared to the usual six; and it comes as the region is about to launch a big birthday party for the city’s 250th anniversary.
To build and keep the buzz going through the summer months and deep into winter, the Carnegie decided on four marketing firsts: giving the contemporary art show a title; advertising it in The New York Times; buying substantial Internet ads; and launching an interactive Web site where visitors can learn about the artists and comment on their work at kiosks in the museum.
“Life on Mars” — the title of this year’s exhibition that showcases 40 artists — poses three questions: Are we alone in the universe? Do aliens exist? Or are we, ourselves, the strangers in our own worlds?
“It tees it up in a way for people who might not be familiar with the International,” said Kitty Julian, marketing director for the Carnegie Museums of Art and Natural History. “You wonder, what is that? It’s a way of opening a dialogue with our audiences.”
Attracting different audiences is essential because “nine months is a long time to hold people’s attention,” Julian said. In spring and summer, the exhibition hopes to draw cultural tourists from Los Angeles; New York; Baltimore; Chicago; Cleveland; Columbus, Ohio; Cincinnati; and Washington, D.C. In August and September, the focus will be college students and professors who are back in school here and in nearby states. In November and December, the target will be friends and family visiting Pittsburgh during Thanksgiving and Christmas.
Some board members did question why the second oldest survey of contemporary art — the first, begun in 1895, is the Venice Biennale — needed a title.
“The board’s reaction to the title was, ‘Why now? We’ve never done that before,’ ” said Richard Armstrong, the Carnegie Museum of Art’s H.J. Heinz II curator.
“It’s tremendously useful that we have a title, as a provocation. You understand why the hell you’re there,” Armstrong said, adding that the show is “an artistic reaction to an overly elaborate world. The show really has its own personality. What does it mean to be affiliated or alienated in today’s world? How can artists make an impact on that condition by exploiting it, examining it, explicating it or even ignoring it?”…
How relevant is a contemporary art show? Armstrong sees it as a forum “where people who have close connections to the avant-garde can self-reflect on the vocabulary of the moment. It’s an effort at collective consciousness.”
As an example, he noted that when some people think of the 1920s, they think of flappers, the stock market’s rise and fall, laissez-faire economics and Prohibition. But that decade’s culture, Armstrong noted, featured experimental music by Francis Poulenc and Erik Satie, daring modernist architects such as Walter Gropius and Marcel Breuer, and experimental literature by e.e. cummings.
“That’s what defined that era,” Armstrong said. “It’s very important for the cultural sector to be self-aware and define itself. Pop culture does it. The socio-political and economic sectors do it. Do you want to be recalled as the era of Paris Hilton or the era of one of the artists in the exhibition? We have the financial means, the space, the judgment and we really have the moral obligation to do this.”
Marylynne Pitz
Pittsburgh Post-Gazette