Our memory is like an ear of corn. At least, that’s what Valerie Reyna was taught in graduate school.

Its Forrest Gumpish feel notwithstanding, the metaphor seemed scientifically sound. After all, researchers had already concluded there are two distinct types of memory: verbatim, which allows us to recall what specifically happened at any given moment, and gist, which enables us to put the event in context and give it meaning.

“We were taught you extracted the gist from the verbatim memory,” recalled Reyna, an experimental psychologist and former senior research adviser to the U.S. Department of Education. “It was like husking an ear of corn. You threw away the husk, which was the verbatim, and you kept the gist, which was the kernel of meaning.”

There it was: Neat. Simple. Agrarian.

And also, as Reyna discovered over decades of subsequent research, wrong.

After conducting numerous studies with her partner, psychologist Charles Brainerd, Reyna concluded that verbatim and gist memory are separate, parallel systems. So separate, in fact, that “there is some evidence” they occupy different sections of the brain.

Reyna and Brainerd’s hypothesis, which they call “fuzzy trace theory,” explains how we can “remember” things that never really happened.

When an event occurs, verbatim memory records an accurate representation. But even as it is doing so, gist memory begins processing the information and determining how it fits into our existing storehouse of knowledge. Verbatim memories generally die away within a day or two, leaving only the gist memory, which records the event as we interpreted it.

Under certain circumstances, this can produce a phenomenon Reyna and her colleagues refer to as “phantom recollection.” She calls this “a powerful form of false alarm” in which gist memory — designed to look for patterns and fill in perceived gaps — creates a vivid but illusory image in our mind.
Mental snapshots soon fade; what lingers are our impressions of an occurrence, which are shaped by the meanings we attach to it…

“We’re looking at a number of things, including the effect of emotion on memory — how emotion interacts with your interpretation of events,” Reyna said. “Does arousal interfere with your encoding of memory? Does it ‘stamp it in,’ as some of the neuroscience literature suggests? The effect might be more complex than that.”

One question that can’t be answered in the lab is why, in evolutionary terms, we would develop two separate memory systems. Reyna, who has given this considerable thought, noted that if all we had was our rapidly fading verbatim memory, “it would be very hard to function — especially in an oral culture. Cognition appears to be engineered around gist memory, which endures and is stable.”

Consider the case of one of our prehistoric ancestors who is attacked by a saber-toothed tiger but manages to escape before being eaten. Verbatim memory would tell him precisely where the altercation took place, exactly what the tiger looked like and what tree he climbed to get beyond the animal’s reach. Gist memory would tell him: “Tigers are dangerous. If I go walking in the forest after dark, I’d better bring my spear.”

The first would be interesting; the second, essential. As Reyna wryly noted, “You don’t have to count the stripes to know the tiger is bad.”

Tom Jacobs
Miller-McCune

During the recent Association of Arts Administration Educators conference here in Madison, the increasing proficiency and professionalism of our collective conversation were both a source of pride and a cause for pause. As a field of educators, researching and teaching cultural management and leadership, we’re clearly growing in reflection, connections, and success. But what if we’re doing so at a time when the profession, as we’ve defined it, is changing rapidly? What if we’re all getting increasingly proficient at a decreasingly relevant part of the ecosystem?

Consider, for example, the three-word phrase that often crops up at such conferences: “professional arts organization.” This phrase captures, in shorthand, the specific category of cultural endeavor we tend to be discussing. Professional arts organizations require professional management, aesthetic integrity, curatorial control, and stable but responsive structures to hold them together while moving their mission forward. These are the standards that drive our teaching and learning about the field.
But each of those three words — “professional,” “arts,” and “organization” — is in radical flux at the moment. That suggests that a phrase (and an assumption) combining all three could mean less and less in shorthand form.

This concern may come from my current reading matter, Clay Shirky’s new book Here Comes Everybody, about the increasing opportunities for collective action without traditional organizational structures — think Flickr or Wikipedia or iStockPhoto. But there’s something rumbling in the world that questions our basic assumptions about arts and cultural management. Let’s take a look at each word in the phrase, in reverse order:

· Organization
The formal organization (social, commercial, political, etc.) evolved in response to a set of structural barriers to collective action. Work that required more than one or a few people to complete — highway systems, national defense, mass-produced goods, save-the-spotted-owl initiatives, performing arts touring networks, museums — created large problems of coordination, alignment of resources (enough money in one place under one decision system), and high transaction costs (everyone having to agree every time…exhausting). The organization resolved these challenges through formalized decision structures, consolidated resources, and persistent identity (for example, a corporation lives separately from its founders, and is endowed with many/most of the rights of an individual). There was a cost to this structure, to be sure. A significant portion of any organization’s energy is consumed by self-maintenance rather than delivering on its purpose. Since the option was to not do the thing at all, we figured the costs were acceptable and necessary.

With the evolution of digital communications networks and software, however, many of the original challenges that required an organization are gone or significantly reduced. Collective action is increasingly available to distributed groups who don’t even know each other by name, and may convene around a cause only to disperse thereafter. The cost of production and distribution has dropped to almost zero for many goods and services. Organizations are still necessary and essential parts of the mix, but they’re not the only (or even the optimal) solution to every question, as they once were.

· Arts
There’s little need to go on about this particular word, which we all would agree is a fast-moving, increasingly amorphous creature. When we talk about “arts” in the context of “arts management” or “arts organizations,” we still generally mean predominantly Western forms of expression, with an assumed emphasis on technical or aesthetic excellence. We don’t always mean this, of course. But if you nudge most conversations by professionals, you’ll find this assumption just beneath the surface. Evidence comes from the fact that we still add qualifiers to the word when we mean something other than the above: “community arts,” “amateur arts.”

· Professional
Specialized organizations in specialized industries require specialized professionals — trained in the task by formal process or apprenticeship. Professionals earn the term when they are paid for their specialized work and when the nature and frame of their efforts are defined and evaluated by their peers rather than by their customers. Professional writers define what professional writers do. Professional doctors and realtors define the parameters and certifications for their peers.
But, again, what happens to the word “professional” when works of comparable quality and skill can be conceived, produced, and distributed without expensive or centralized means of production? Flickr has millions of exceptional images, many shot by individuals with no formal training, expecting no pay, and unfiltered by a traditional gatekeeper (curator, publisher, agent).

Says Shirky:

When reproduction, distribution, and categorization were all difficult, as they were for the last five hundred years, we needed professionals to undertake those jobs, and we properly venerated those people for the service they performed. Now those tasks are simpler, and the earlier roles have in many cases become optional, and are sometimes obstacles to direct access, often putting the providers of the older service at odds with their erstwhile patrons.

So, am I suggesting that we abandon our foundational phrase “professional arts organization”? Of course not. As long as there are complex processes, specialized physical requirements of expression (theaters, museums, even on-line forums), and a recognition of the value of extraordinary skill, vision, and voice, we will need organizations, professionals, and filtering systems to find, foster, and connect expressive works to the world.

But we may want to recalibrate our underlying assumptions as an industry (and as educators who hope to advance that industry and its goals) about the specific role of what we now call “professional arts organizations.” These are a subset of a massive ecology available to us to achieve our larger purpose. If we stick too rigidly to our terms, we may become obstacles to the missions we claim to have.

Andrew Taylor
The Artful Manager

The following comment by Dary appeared on Taylor’s posting and is a worthwhile continuation of the argument:

I actually just saw this guy speak at a… ahem… super-dorky “Web 2.0” Conference in San Francisco. He was really, really engaging and had some pretty cool viewpoints. One of his hypotheses is that our society as a whole is coming out of an age of collective intellectual inebriation much like society did prior to the Industrial Revolution. He told a story about how rampant gin was in 18th-century England – to the point where there were gin pushcarts like our current-day ice cream carts – and how society as a whole was just drunk and lazy for decades. And then it went out of fashion, people started doing stuff, and we got the Industrial Revolution.

He makes the analogy of that gin-soaked drunkenness to the TV-soaked stupor of the past 50 years or so. He says now people are watching less television (which I haven’t checked the numbers on) and are spending more time applying actual brain power to such things as updating Wikipedia articles, tagging sites on del.icio.us and ma.gnolia, writing blogs, and twittering (brain power optional on that one).

His views are, of course, open to debate, and there are some intriguing counter-arguments to the seemingly pristine virtues of collective intelligence.

Anyway, in terms of how Shirky’s theories and the new communal web apply to Professional Arts Organizations, I’m not exactly sure what you’re getting at. With “Organizations” the web makes it easier to schedule things and get in touch with people. Of course. You don’t really redefine anything with “Arts” in terms of this new landscape except to touch on the fact that Professionals think Amateurs are lame. And with “Professional”, you argue Web 2.0 makes it easier for non-professional artists to have their material discovered? Yes, of course, again. I dunno.

What’s more interesting to me is how a larger pool of available pieces of media changes society’s collective agreement on what is worthwhile and valuable in the arts and in general. Colbert jokes about “truthiness”, but it’s actually a valid point of philosophical debate within this new worldwide, social move to open up human knowledge. It’s especially pertinent to music, I think, not just in terms of what a society consumes, but how they consume it. And I go back and forth between whether these new aspects are wonderful and free or troubling and insulting.

Ask someone how many concerts they’ve been to vs. how many YouTube videos of concerts/pieces they’ve watched in the past year: my ratio is deplorable! And the idea that because it’s now easy to create music – for $500 you can build a moderately decent home studio and create recordings of moderately decent quality – professionals aren’t as necessary anymore is worrisome.

It’s all happened so fast I don’t think people in general have really stopped to think about what this means for our society’s appreciation of the arts and value system for judging works.

So I’m thinking out loud, but clearly this is a contentious point for me. Thoughts?

The best criticism, as Adam Gopnik wrote in an appreciation of the poet and critic Randall Jarrell, should be “not a slot machine of judgment but a tone of voice, a style, the promise of a whole view of life in a few pregnant sentences”.

And people who worry about the present state of criticism tend to fall into the trap of regarding it as a public service. The health of the arts, they say, depends on a robust and vigorous culture of criticism. I sympathise with the view and occasionally feel flattered by it. But I think it inflates the role of critics. As Robert Hughes once said, practising criticism is “like being the piano player in a whorehouse; you don’t have any control over the action going on upstairs”.

In place of public edification, I believe criticism is better seen as a (potential) public pleasure. It sounds obvious, but a piece of criticism, in the first instance, has to be worth reading. A good column might be a leisurely, soft-pedalled essay hinging on subtle discriminations, an ecstatic love letter to some new discovery, or a fuming snort of disgust. What matters is that it is written with conviction, and that it opens the reader’s eyes to things about its subject that they may not have considered in quite those terms before.

“Art deserves to be met with more than silence,” says The Guardian’s critic Adrian Searle. Artworks, he continues, “accrue meanings and readings through the ways they are interpreted and discussed and compared with one another”. It’s in this process that the real stimulations of criticism are to be found.
In the end, let’s face it, criticism is an indulgence: one that matters a great deal to those who have had their worlds changed and amplified by reading great examples of it, but hardly at all to many others.

Contrary to those who believe journalistic criticism will struggle to survive in the internet age, however, I think people are actually going to want more and more of it. If you step back and survey the situation, it seems simple. In affluent societies, of which there are more in the world than ever before, the arts rise in stature, and as they do, people naturally want to discuss them.

Nothing has happened in the digital age to fundamentally affect this, except that people increasingly feel themselves to be drowning in arbitrary information and ill-informed punditry. So, will they react by switching off entirely? Or will they rather seek out, with increasing appetite, the writing that seems best and most enjoyable to read? I think the latter.

Critics rehearse in public what we all do all the time: we make judgments. It’s common these days to hear people say, “I’m not being judgmental” or “Who are you to judge me?” But making judgments is how we negotiate our way through the world, how we organise and sharpen our pleasures and carve out our identities.

One could even say that critics try to do, in a breezier and less committed way, what artists do by nature (and without the need to apologise). For at the heart of every creative act are a zillion tiny decisions — conscious and unconscious — about what to do, what not to do, and what simply won’t do. All are forms of criticism: “taking the knife of criticism to God’s carefully considered handiwork”, as John Updike put it. That’s why, when you ask good artists about their contemporaries, they will either choose not to comment or say things that make even the most savage critic look benign.

Good criticism (and I mean this as an expression of an ideal) should be risky, challenging, candid and vulnerable. It should be urbane one moment, gauchely heartfelt the next. It should kick against cant wherever it sees it, and cherish and applaud not only art but the impulse to make art, for that impulse, which comes out of life as it is lived, is the real mystery, and the source of everything that makes it wonderful.

Sebastian Smee
The Australian

It turns out that dull tasks really do numb the brain. Researchers have discovered that as people perform monotonous tasks, their brains shift towards an at-rest mode whether they like it or not.

And by monitoring that area of the brain, they were able to predict when someone was about to make a mistake before they made it, a study published Monday in the Proceedings of the National Academy of Sciences found.

“There’s this thing that’s probably intrinsic where your brain says I do need to take a little break here and there’s nothing you can do about it,” said study author Tom Eichele of Norway’s University of Bergen.

“Probably everyone knows that feeling that sometimes your brain is not as receptive or as well performing and you didn’t do anything to actually induce that.”

When that happens, blood flows into the part of the brain which is more active in states of rest.

And since this state begins about 30 seconds prior to a mistake being made, it could be possible to design an early-warning system which could alert people to be more focused or more careful, Eichele said.

That could significantly improve workplace safety and also improve performance in key tasks such as airport security screening.

Discovery News

There are few subjects more timely than the one tackled by Susan Jacoby in her new book, “The Age of American Unreason,” in which she asserts that “America is now ill with a powerful mutant strain of intertwined ignorance, anti-rationalism and anti-intellectualism.”

For more than a decade there have been growing symptoms of this affliction, from fundamentalist assaults on the teaching of evolution to the Bush administration’s willful disavowal of expert opinion on global warming and strategies for prosecuting the war in Iraq. Conservatives have turned the term “intellectual,” like the term “liberal,” into a dirty word in politics (even though neo-conservative intellectuals played a formative role in making the case for war against Iraq); policy positions tend to get less attention than personality and tactics in the current presidential campaign; and the democratizing influence of the Internet is working to banish expertise altogether, making everyone an authority on everything. Traditional policy channels involving careful analysis and debate have been circumvented by the Bush White House in favor of bold, gut-level calls, and reasoned public discussions have increasingly given way to noisy partisan warfare among politicians, commentators and bloggers alike…

As Ms. Jacoby sees it, there are several key reasons for “the resurgent American anti-intellectualism of the past 20 years.” To begin with, television, video games and the Internet have created a “culture of distraction” that has shortened attention spans and left people with “less time and desire” for “two human activities critical to a fruitful and demanding intellectual life: reading and conversation.”

The eclipse of print culture by video culture began in the 1960s, Ms. Jacoby argues, adding that the ascendance of youth culture in that decade also promoted an attitude denigrating the importance of tradition, history and knowledge.

By the ’80s, she goes on, self-education was giving way to self-improvement, core curriculums were giving way to classes intended to boost self-esteem, and old-fashioned striving after achievement was giving way to a rabid pursuit of celebrity and fame. The old middlebrow culture, which prized information and aspiration — and which manifested itself, during the post-World War II years, in a growing number of museums and symphony orchestras, and a Book-of-the-Month club avidity for reading — was replaced by a mass culture that revolved around television and blockbuster movies and rock music.

It was also in the ’60s, Ms. Jacoby writes, that a resurgent fundamentalism “received a jolt of adrenaline from both the civil rights laws” in the early years of that decade and the later “cultural rebellions.” She succinctly records the long history of fundamentalism in America, arguing that poorly educated settlers on the frontier were drawn to religious creeds that provided emotional comfort without intellectual demands, just as “the American experiment in complete religious liberty led large numbers of Americans to embrace anti-rational, anti-intellectual forms of faith.”

Michiko Kakutani
New York Times

Why does it seem odd to suggest that art can be humorous? It’s not as though we don’t encounter the words ‘art’ and ‘joke’ often enough in the same sentence, especially if ‘art’ is qualified by the adjective ‘modern’. But when we do it usually means that people’s suspicions are aroused. We make out that the joke is on us, so the art can be dismissed as not serious and therefore irrelevant. Art is supposed to come out of some discernible effort on the part of the artist, and the apparent effortlessness of a good joke inevitably undermines that expectation. If art is a joke then it’s not art, or so the thinking goes.

On the other hand, jokes and art have a good deal in common. They challenge assumptions, unsettle cosily habitual thought patterns and mock stereotypical behaviour. Surely they should often be found in each other’s company? In fact they are.

To take just two examples, the films of Swiss artist Roman Signer, currently showing in Edinburgh and soon to be seen in London, explore the comedic poetry of our encounter with objects. He calls himself an “emotional physicist” – maybe he really isn’t far removed from the comedian who walks into a lamppost. And the fact that we laugh at David Shrigley’s drawings reinforces rather than detracts from the sharp eye with which he observes life’s darknesses.

Making art nearly always involves destroying something, even if it’s only the pristine purity of a white sheet of paper. Humour, too, can be merciless. Harnessed together they can add up to much more than the sum of their parts. Modern art’s iconic figure, Marcel Duchamp, was nothing if not a joker. His sardonic sense of humour is evident everywhere, especially in the postcard-size reproduction of the Mona Lisa to which he added a moustache and goatee, together with the words LHOOQ. Telling us that the only reason we look at Leonardo’s painting is because the subject has a hot arse (elle a chaud au cul) is, of course, deliberately provocative.

Duchamp’s defacement of a cherished treasure is insolent, yet if it causes anger it does so not because it is attacking Leonardo – who is beyond that, anyway? – but because it is mocking our lazy prejudices about what has cultural value. Art, he is saying, is about ideas, so seeing it requires us to use our brains rather than merely indulging our propensity to emotional incontinence.

Michael Archer
The Guardian

Over the past decade, two facts have become increasingly obvious – that our ever-increasing consumption is wrecking the planet, and that continually chasing more stuff, more food and more entertainment no longer makes us any happier. Instead, levels of stress, obesity and dissatisfaction are spiralling.

So why is our culture still chasing, consuming, striving ever harder, even though we know in our sophisticated minds that it’s an unrewarding route to eco-geddon? New scientific studies are helping to reveal why. It’s our primitive brains. These marvellous machines got us down from the trees and around the world, through ice ages, famines, plagues and disasters, into our unprecedented era of abundance. But they never had to evolve an instinct that said, “enough”.

Instead, our wiring constantly, subliminally urges us: “Want. More. Now.” Western civilisation wisely reined in this urge for thousands of years with an array of cultural conventions, from Aristotle’s Golden Mean (neither too much, nor too little) to the Edwardian table-saying: “I have reached an elegant sufficiency and anything additional would be superfluous.”

Consumer culture ditched all that, though, constructing instead an ever more sophisticated system for pinging our primitive desire circuits into overdrive. It got us to the point where we created everything we need as a basis for contentment. Now it’s rushing us past the tipping point, beyond which getting more makes life worse rather than better. And it’s making our brains respond more weirdly than ever.

Our old wiring may condemn us to keep striving ever harder until finally we precipitate our dissatisfied demise. But, instead, we could learn to practise the comfortable art of “enough” in this overstuffed world. There is a broad armoury of strategies we can adopt to proof our brains against the pressure to pursue and consume too much, to work too hard and to feel constantly inadequate and underprivileged. The most fundamental of these is knowledge: forewarned is forearmed.

The Times

Ours are ominous times. We are on the verge of eroding away our ozone layer. Within decades we could face major oceanic flooding. We are close to annihilating hundreds of exquisite animal species. Soon our forests will be as bland as pavement. Moreover, we now find ourselves on the verge of a new cold war.

But there is another threat, perhaps as dangerous: We are eradicating a major cultural force, the muse behind much art and poetry and music. We are annihilating melancholia…

Why are most Americans so utterly willing to have an essential part of their hearts sliced away and discarded like so much waste? What are we to make of this American obsession with happiness, an obsession that could well lead to a sudden extinction of the creative impulse, that could result in an extermination as horrible as those foreshadowed by global warming and environmental crisis and nuclear proliferation? What drives this rage for complacency, this desperate contentment?

Surely all this happiness can’t be for real. How can so many people be happy in the midst of all the problems that beset our globe — not only the collective and apocalyptic ills but also those particular irritations that bedevil our everyday existences, those money issues and marital spats, those stifling vocations and lonely dawns? Are we to believe that four out of every five Americans can be content amid the general woe? Are some people lying, or are they simply afraid to be honest in a culture in which the status quo is nothing short of manic bliss? Aren’t we suspicious of this statistic? Aren’t we further troubled by our culture’s overemphasis on happiness? Don’t we fear that this rabid focus on exuberance leads to half-lives, to bland existences, to wastelands of mechanistic behavior?

I for one am afraid that American culture’s overemphasis on happiness at the expense of sadness might be dangerous, a wanton forgetting of an essential part of a full life. I further am concerned that to desire only happiness in a world undoubtedly tragic is to become inauthentic, to settle for unrealistic abstractions that ignore concrete situations. I am finally fearful of our society’s efforts to expunge melancholia. Without the agitations of the soul, would all of our magnificently yearning towers topple? Would our heart-torn symphonies cease?

Eric G. Wilson
Chronicle of Higher Education

As the political theater season kicks into full swing in Iowa tonight, I’m struck by the pervasiveness of contrived events — events designed and delivered specifically to be reported on and YouTubed and blogged. Way back in the 1960s, historian Daniel Boorstin labeled these as “pseudo-events,” voicing concern even then about their impact on our collective experience of community. As Boorstin defined it, a pseudo-event had the following characteristics (from The Image: A Guide to Pseudo-events in America):

1. It is not spontaneous, but comes about because someone has planned, planted, or incited it. Typically, it is not a train wreck or an earthquake, but an interview.

2. It is planted primarily (not always exclusively) for the immediate purpose of being reported or reproduced. Therefore, its occurrence is arranged for the convenience of the reporting or reproducing media. Its success is measured by how widely it is reported…

3. Its relation to the underlying reality of the situation is ambiguous. Its interest arises largely from this very ambiguity. Concerning a pseudo-event the question, ‘What does it mean?’ has a new dimension. While the news interest in a train wreck is in what happened and in the real consequences, the interest in an interview is always, in a sense, in whether it really happened and in what might have been the motives. Did the statement really mean what it said? Without some of this ambiguity a pseudo-event cannot be very interesting.

4. Usually it is intended to be a self-fulfilling prophecy. The hotel’s thirtieth-anniversary celebration, by saying that the hotel is a distinguished institution, actually makes it one.

We can all wring our hands at the fact that pseudo-events now comprise the large majority of our media experiences. But the more compelling question for me (at least for this blog) is how cultural managers should respond to the dominance of false reality. We are, after all, purveyors of contrived content — often meticulously planned, scripted, crafted, practiced, and delivered to exacting standards. What distinguishes our work from the larger social theater of politics, of marketing, of media?

Back in a 2000 essay in the New York Times, playwright Tom Donaghy called this very question for his peers in the live theater. In a world of reality television and “realness” in the commercial media, what’s the unique and powerful role of live cultural experience? Thankfully, he answered his own question:

[It is theater’s singular power] to contemplate our collective reality; as audience, actor and story engage in an unspoken discussion of what reality is, how definitions of reality can be broadened. Theater affords this opportunity like no other medium, as actors and audiences breathe side by side, together engendering the spiritual and meditative power that that shared experience implies.

In the end, we’re all wielding the same tools to construct the experiences and events we offer to the world. The difference is in the intent and purpose with which we wield them.

Andrew Taylor
The Artful Manager

Music, unlike every other art form we’ve got, is about hearing rather than seeing. And yet seeing is such a dominant sense that it still plays a very conspicuous role in music-making, whether it be the created-through-and-for-the-eyes scores we disseminate to musicians who then turn them into music by looking at them, the silent visual cues of a conductor that keep an ensemble together, or even the less formal visual nods that members of jazz and rock bands signal each other with…

Back in 1983, I stopped writing down the music I was composing for nearly two years, transmitting it only through aural means. My slogan at the time was: “Divorce sight from sound, now!” In crazier moments, I even contemplated boycotts of organizations that I felt were too reliant on visual means to create sonic realities, e.g. orchestras, etc. I only half-jokingly attempted to get some friends to construct and carry picket signs to venerable performance institutions, but to no avail. They thought, alternately, that I was either completely joking or totally out of my mind. I was, perhaps, too zealous and possibly even more naïve.

For better or worse, we are living in a visually-dominant culture and for music to have meaning within our culture, it needs to be seen to some degree. And in that regard, music is not unique. While food is something that is supposedly experienced predominantly by the sense of taste, an amusing blindfold test in last week’s issue of Time Out New York actually proved how impossible it is to identify ingredients in a meal without seeing them, even for the most seasoned culinary savants (among them a prominent chef and a food critic who has worked for Gourmet magazine).

That said, to this day, I continue to eschew visual references when they refer to musical matters. For example, I’d never say: “What concert did you see last night?” And when people say they’re going to send me a recording, I always say that I’m “listening forward” to it. I’m also still somewhat suspicious of musical matters which only make sense when you can see what they are. And actually, truth be told, this is probably why, though I voraciously attend concerts, my ideal mode of listening to music is on a recording with my eyes closed. Yet at the same time I confess that I love looking at record covers.

How reliant are you on your sense of sight when you create or experience music? How much do you feel you are losing from the experience of music when you are only able to listen to it, e.g. on the radio or a recording? And what exactly is it that you are losing?

Frank J. Oteri
NewMusicBox