Yes, Virginia, There is Still Racism and Colorism

I write this not to equate my experience with the experiences of those who suffer the at-times-frustrating, at-times-horrific, but always-absurd effects of racism and colorism. After all, I am an American of primarily European descent who enjoys many of the privileges of such a background. However, my particular appearance in the particular context of where I grew up strategically positioned me to experience a small amount of racism/colorism, and this experience served to ally me with people of color from an early age. I write not to say “I have experienced racism,” but rather to show the ubiquity and absurdity of racism/colorism to those who haven’t experienced it, don’t see it, and therefore don’t understand why the civil rights movement is still ongoing.

I am fifty percent Portuguese, twenty-five percent Lithuanian, twelve-and-a-half percent Syrian, six-and-a-quarter percent Irish, and six-and-a-quarter percent English. I know this is not how things work at the genetic level, but it’s nonetheless the breakdown of the cultural origins I inherited and with which I identify. Also, my mother’s first husband was Jewish, and so even though I am Roman Catholic, my older brother and sister were Jewish, and we celebrated Bar and Bat Mitzvahs and Hanukkah and the like. It was an accepting and multicultural household from the beginning, even if most of those cultures were of European descent.

But there was a twist. My Portuguese and Syrian ancestry bequeathed to me olive-colored skin, skin that ripened to a chestnut color in just one day of summer sun. And I felt an identity with these pieces of my background, especially the Portuguese half of my family. We spent lots of time in East Providence, Rhode Island, an enclave of Portuguese immigrants who came here looking for the American Dream and worked furiously as stevedores, day laborers, maids, au pairs, and the like in order to achieve an approximation of that dream. My dad and most of his siblings went to college. (The fact that they struggled to afford the homes and second cars that my grandfather could buy on his dockworker’s wages is beyond the scope of this essay.) When I was in East Providence, I was surrounded by the sounds, the food, and the religion of this culture. At the Festas do Espírito Santo, people carried baskets of bread on their heads, crowned queens of the festival, and marched through the streets singing songs. It was all very “ethnic.”

Contrast this with the place my parents moved to for work: Pittsfield, Massachusetts. Pittsfield was one of the WASP capitals of the world: the center of town, Park Square, was encircled by an Episcopal church, a Congregational church, a Baptist church, and a Methodist church. The Second Congregational Church, the Black church, was located in my neighborhood, a mile west up Columbus Avenue. The central Catholic church? A mile north. Its website boasts of its being “…the only known street-level church in the United States without steps.” Tradition says they had to lay the foundation, without steps, quickly enough to keep the anti-Catholic powers-that-be from putting a stop to its construction.

I presume every American has at least a cursory knowledge of the nativism against Catholics in the United States. Surely everyone who knows anything about the election of President John F. Kennedy is familiar with the questions he faced regarding his Roman Catholicism. In September of 1960, 150 Protestant ministers met in Washington and proclaimed that Kennedy could not remain independent of the Roman Catholic Church unless he denounced its teachings. K. O. White, pastor of Houston’s Downtown First Baptist Church and former pastor of Metropolitan Baptist Church in Washington, D.C., put it this way to the presidential hopeful: “The reason we are concerned is the fact that your church has stated that it has the right, the privilege, and responsibility to direct its members in various areas of life, including the political realm. We raise the question because we would like to know if you are elected President and your church elects to use that privilege and obligation, what your response will be under those circumstances?”

But what reason could I, a millennial Catholic of European descent, have to fear in Pittsfield, Massachusetts, half a century later? Was not Kennedy elected president after all? Was not Berkshire County now fifty percent Catholic? And, just to make things even less complicated, my parents enrolled me in the Roman Catholic schools of Pittsfield. How could I possibly experience any kind of discrimination?

Indeed, I did not experience any nativism in Pittsfield. But I did experience colorism in these Catholic schools. I was surrounded by a lot of Kellys, Murphys, and O’Sullivans, and when they saw a tan kid with a weird name with three “a’s” in it, they asked me, “What kind of a name is that?” “Portuguese,” I replied. “Puerto Rican?” “No, PORTUGUESE.” “Same difference.” And from then on I was known as Puerto Rican. One of my close friends, also of a darker skin tone than the rest, was known as “The Spic.” This was all the more odd, considering he had an Irish first and last name. The other thing we had in common besides skin tone was that we were out-of-towners. Everyone else’s parents were hometown boys. My friend and I never bonded over any of this in an expressed way. But I remember my little elementary-school self trying to stay out of the sun so that my skin would be lighter, and I remember my friend pining for blue contact lenses.

Was any of this malicious in intent? Do I hold any of my classmates responsible for it? No. They were just kids, after all, and so victims of the situational forces surrounding them. Did their and their parents’ view of me keep me off of certain sports teams and out of certain birthday parties? I do not know. If I were a better athlete, I would have a better case to make against them.

Everything I experienced pales in comparison to the racism experienced by blacks in our community. I remember one day catching a couple of my classmates in the alley outside the middle school telling jokes about black people using the N-word. In a rare moment of courage, I spoke up: “How can you tell those jokes? What about Gerard? He’s our friend, and he’s black.” Their reply? “Gerard’s not like other black kids.” (“How would they know?” I think to myself now. “He was the only black kid in our school.”) Even this overt racism pales in comparison to the violent racism that exists in other places in our country. So no, I am not equating my experience in any way with that of my friends who are People of Color. But I want to show the ubiquity of racism and colorism, the propensity of it to rear its ugly head even in the most monochrome of settings. Nativist and birtherist—what should just be called “racist”—worldviews still abound. As does their offspring, colorism, which infects the minds of all of us unconsciously, even its victims, as we look down on ourselves and those with hues different from our own. So, even though Kennedy was shot and Obama was not, we may have to attribute this more to the evolution of presidential security than to the evolution of our consciousness.

There is one thing for which I am grateful, and that is the way the colorism I experienced made me gravitate toward being an ally of people of color. It began at Suffolk University in Boston, where the black Director of Multicultural Affairs had the courage to hire me even though, as she explained to me, she usually reserved that position for students of color or GLBTQ students. In another rare moment of courage, I pointed out to her, “You’re a black woman. Your assistant director is a Puerto Rican woman. Your administrative assistant is a black woman. What better person to run around making all of your copies and your coffee than a white man?” She made me her diversity hire. I didn’t realize at the time just how much she was going out on a limb for me. Nor did I understand why my black manager at a later job had a directory of black businesses from which he hired people. I didn’t have the courage to ask him then, but I understand now. Much more recently, I slipped back into that unconsciousness, wondering why a black visitor felt singled out by security at a local venue. It took a very kind colleague to remind me that she probably overlooks dozens of these “singled-out” experiences each week. I am thankful for these friends and colleagues who have exercised so much patience with me as I stumble toward some idea of what their experience is really like.

A couple of these friends took me under their wing at my first job out of college. They were Puerto Rican, and it is with great pride that I now look back on the name they bestowed upon me, even with all of its colorist history. While “Puerto Rican” felt racist to me all those years ago in grade school, today, among these true Puerto Rican friends, I relish the fact that they feel comfortable enough to affectionately call me “El Gringo.”

A Story About You and Me: Myth, Demythologization, and the Surplus of Meaning, on the Eve of the Opening of the Museum of the Bible

“TWO creation stories? What do you mean the Bible has TWO creation stories?” Well, in the first one, God creates the earth, populates it with animals, and then creates men and women to have stewardship over it. “And the second one?” In the second one, God creates the earth, then creates a man, then tries in vain to find a suitable companion for the man by creating all of the animals, and then, finally, God puts the man to sleep, extracts his rib, and creates a woman. “I never realized the two stories were so different. They even contradict each other. How could people include both of these stories if they don’t even coincide scientifically and historically with one another?” Because the creators and the caretakers of these stories were not doing history or science. They were doing myth.

On the eve of the opening of the Museum of the Bible here in Washington, DC, I find myself reflecting on the two most ubiquitous views of scripture I hear voiced by people in my role as a teacher of religious studies. The first is what we might call the “reductionist” view, which claims that all of these myths are merely humanity’s early attempts to explain the world. This is where we get our modern connotation of the word “myth” as something false, made-up. The second view we might call “literalist,” as it holds these stories to be literally true, even when their truth seems to go against widely accepted scientific and historical truths about the world. Both the reductionist and the literalist views of myth are based on misconceptions of the origins and the purposes of these sacred stories. A brief look at some of the foremost thinkers-on-myth will not only elucidate these origins and purposes, but may even show us how we might discover the true value of these stories, the reasons why we have been telling them to each other for thousands of years.

“Myth” comes from the Greek word “mythos,” which means “story.” In religious studies, “myth” does not have the connotation of “false” or “made-up” that our popular usage carries. Stories may be fiction or nonfiction, but neither of these designations takes away their status as stories, or myths.

Myths have often been dismissed as early attempts by human beings to explain natural phenomena. This dismissal of myth is part of the demythicization process that has been underway since the Renaissance, as we tried to replace mythological explanations with scientific ones. For example, the story of Noah’s Ark seems like a story about why we have rainbows, and, now that we know what physical processes cause rainbows, we no longer need that story. But Noah’s Ark is no more a story about rainbows than the film The Matrix is a story to explain the phenomenon of déjà vu. Both stories do attempt explanations of those phenomena, but those explanations are merely to lend credibility to the rest of the story. Neither story was created merely to explain these things.

The word “myth” is almost always reserved for a particular class of stories: stories which point to a sacred reality. “Sacred reality” does not necessarily refer to God or gods or heaven. It may also refer to the natural order of things, an ethical order, or some other concept or value that is placed “on high” by a particular people. It may not be clear to the storyteller herself what exactly this sacred reality is—again, the elements in the myth are only pointing to the sacred reality, even when that reality is named. It is also important to note that the ontological status of that sacred reality (whether or how it exists) is not what is really important. “Sacred reality” is real enough just by virtue of its being designated as “sacred,” over and against the “profane”—the normal, everyday, worldly concerns of a people. The question is not “What is true?” but “What have peoples found necessary to point to and preserve as centrally important for their entire existence?” If a people have a myth saying, “God created man,” we do not know whether that deity exists, but we do know that “man” must be important to these people. So myth, or “truth embodied in a tale,” contains a kind of truth that is different from scientific or historical truth.

How do we know when we are in the realm of myth? How do we know when a story is speaking about meaning—existential, psychological, spiritual meaning—rather than about scientific, objective fact? Mircea Eliade points out that myths always take place in illo tempore, literally “at that time.” Eliade uses in illo tempore to refer to the unique phrases that begin all myths, phrases that bring the reader into “mythic time,” announcing that eternal, mythic, spiritual truths are about to be disclosed (as opposed to scientific or historical truths). Famous examples of in illo tempore include “In the Beginning,” “In the Dreamtime,” “When on High,” and “In a Galaxy Far, Far Away.” Myth is something that never happened and always happens. “On April 10, 1979” is history; “Once upon a time” refers to eternity.

Paul Ricoeur defined myth as “a pattern of symbols.” This symbolic nature of myth may account for the connotation that myth is something different from fact: Whether a story is historically true or not, it always has a meaning beyond the literal meaning of what is being related. In fact, a myth always has a plethora of meanings. A story about a tree may have an ostensive reference to the particular tree to which the original storyteller points while telling the story. It also refers to the central tree in any village in which the story is later told. Finally, “tree” may also symbolize the interconnectedness of nature or the human family, or some other meaning that is contained within the story.

The word “symbol” comes from two Greek words: ballein, “to throw,” and sym-, “together.” So a symbol is a place where two apparently unrelated things are “thrown together.” “Tree” does not literally have anything to do with “human family,” although it may be used to symbolize that in a myth. Of course, “tree” does have characteristics that make it a useful symbol for “human family.” Other symbols are less obvious: The “Golden Arches” may mean “hamburger” or “food” or “stomach ache” to a particular person, even though a golden letter “m” has nothing necessarily to do with any of those things.

Because of their symbolic nature, myths contain an infinite amount of meaning. Ricoeur referred to this as the “surplus of meaning,” stating that the discursive interpretations of a symbol or pattern of symbols can never exhaust the possible meanings of that symbol. Something of this idea is contained in the adage “a picture is worth a thousand words.” A myth is worth an infinite number of words. As Hans-Georg Gadamer points out, a text has many meanings: the literal and symbolic meanings intended by the author, the meanings constructed by the author’s original audience in their own place and time, and the meanings constructed by the current reader. This last case—the current reader—is what truly opens up the idea of a surplus of meaning. The current reader’s life and world are constantly new and changing, meaning there is an infinite number of things to which the text can refer. In other words, you can never read the same book twice. We can re-read Shakespeare hundreds of years later, and every angsty young lover can have his or her own Romeo or Juliet. This is also why myths are repositories of wisdom, containers for truth that is at once ancient and timeless and yet ever-new and relevant.

With all of the infinite number of ways we can interpret a text, how do we know which one is correct? Hermeneutics (from the Greek god Hermes, the “messenger”) is the science or art of interpretation. It was originally concerned with issues surrounding the interpretation of texts, specifically the Bible. Hermeneutics has an even wider application today, referring not only to the interpretation of texts, but also to visual art and music. It even asks questions about the interpretation involved in the very acts of seeing, hearing, and being in the world.

Gadamer points out that there are two basic facts about human understanding. These facts are present in every act of understanding, whether it is reading a book, watching a film, or engaging in a conversation:

1. You can’t understand the whole if you don’t understand the parts, and
2. You can’t understand the parts if you don’t understand the whole.

The first fact is obvious. You can’t understand a sentence if you don’t understand the words, and you can’t understand a book if you don’t understand the chapters. The second fact is less obvious, but here is an example that might help: If I say “He cut the blades of grass,” you know that I am talking about mowing a lawn. But you cannot know this by merely looking at the parts: Is “he” an animal or a man? Does “blades” refer to knives or swords, or to grass? Does “grass” refer to a lawn or to marijuana? And yet we get the meaning. How can this be so? The problem also comes to light when you think about any book or film that has a twist at the end. For example, the viewer of The Sixth Sense thinks that she understands the movie throughout the whole film, until the very end. Then, in the last scene, some information is given that requires the viewer to go back and review every scene of the film with new eyes. The viewer needs to understand the whole in order to more fully understand the parts, at the same time as he/she needs to understand the parts in order to understand the whole. But how can this be so? These two truths are contradictory—a paradox. So how do we accomplish understanding?

Gadamer’s answer comes in the form of a shocking word: “prejudice.” Human beings pre-judge all the time. This skill has earned a bad name because of its unbridled use in discriminating against various groups of people throughout history. However, prejudice or pre-judgment is a necessary step in understanding. A pre-judgment provides us with an immediate understanding of the whole—a grossly incomplete understanding, but an understanding nonetheless. This prejudiced understanding is then confirmed, disconfirmed, modified, or deepened as the reader comes to understand each of the parts. Then, once all the parts have been taken into account, the reader has an informed understanding of the whole. It is no longer a pre-judgment.

This happens all the time with books. We begin by literally “judging the book by its cover.” The cover and title give us an immediate idea of the whole. Other factors may contribute, too—such as who gave us the book, or in what section of the bookstore it was found. Then we read the table of contents, a part of the book which helps us to confirm or deny our initial judgment. Finally, we read the book and can make a true judgment as to its contents.

What about the case of our confusing sentence, “He cut the blades of grass”? Where do we get our prejudged whole so that we are not caught up in the ambiguity of each and every word? Scientists have witnessed Gadamer’s paradoxical truths at work even in the very act of reading. Observing the human eye’s behavior reveals that the eye does not read linearly, deciphering each word in order from beginning to end. Rather, the eye darts all over the place: from the beginning of the sentence to the end, then to another word toward the beginning, then to a word further on—all in an attempt to understand the whole and the parts simultaneously, knowing that one cannot be done without the other.

This process of understanding takes the shape of a spiral. As Ray Hart says, the hermeneutic spiral recognizes that our first reading of a work gives us an understanding of it, but that repeated readings are necessary to deepen this understanding. Our understanding of the whole is never complete. When we read a book once, we overcome our pre-judged understanding. But when we read it a second time, we are able to understand all of the parts better, now seeing them in the context of the whole. This process goes on indefinitely.

Does the hermeneutic spiral mean that we can never have a correct interpretation of a work? Most hermeneutic theorists speak of the validity or invalidity of an interpretation, rather than whether it is “correct” or not. The idea here is that if one can show that the pattern of symbols (the myth) roughly fits the pattern of the interpretation, then the interpretation is valid. Making sure these “patterns” fit is another way of talking about the internal context of the work. In our example, “He cut the blades of grass,” interpreting “blade” as “sword” is invalid, because while we can think of a time when a sword would cut something, we cannot think of a time when we would use it to cut grass. It doesn’t fit the context. Validity has a wide range, though, especially if the reader-interpreter indicates how they are using the myth. If the reader claims that their interpretation is what the author originally meant, the criteria for validity are different from those required for an interpretation that claims to apply the myth to one’s own life.

We said earlier that looking at myths can answer the question “What have peoples found necessary to point to and preserve as centrally important for their entire existence?” Thanks to the work of Ricoeur, Gadamer, Carl Jung, and Joseph Campbell, we have discovered that we can also ask another set of questions of myth: What can this story tell me about myself? How can its symbols be translated into a meaning that is personally relevant to me? In what ways can this story’s symbols get me to think about myself existentially, psychologically, developmentally, spiritually?

This particular type of interpretation is called demythologization, a term coined by the theologian Rudolf Bultmann. Where demythicization (de + myth = “remove the story”) sought to remove myth and replace it with science, demythologization (de + myth + logos = “remove the symbols of the story”) seeks to remove the symbols from the myth, exposing deeper philosophical meanings that are relevant to our own lives. For the demythologizer, myths are not just stories to explain the world, or ways of learning about the guiding principles of a culture. They are not stories about something that happened thousands of years ago. They are stories about you and me, right here, right now.

Demythologization asks us to see ourselves in the story. One or more of the symbols represent us. The story of David and Goliath may tell us about a historical event or a legendary event. It may tell us something about the place the underdog held in the value system of the ancient Hebrews, or in the hearts of modern-day Jewish people. But it can also teach us something about how to think about a bully when we are in third grade. And then again, when we get to be adults, the story may give us insight into how we can deal with some other seemingly insurmountable challenge we are facing. At another time, it may clue us in to our own bullying tendencies. All of these meanings and more are possible, as we grow and develop and read and re-read. It is auspicious that these ideas about interpretation can be found in all of the great wisdom traditions of the world. We find them in the PaRDeS, the four-level hermeneutic of the Torah in Rabbinic Judaism. We find them in Islam, in the historical, spiritual, and mystical levels of meaning in the Qur’an. We find them in the Christian Lectio Divina. We find them most consciously in the psychology of C. G. Jung, and in the theory of myth given to us by Joseph Campbell. We find them from the very mouths of babes who, when they are read the story of Little Red Riding Hood, exclaim with wide eyes, “What did I do next, Daddy? What did I do next?”

Indeed, myths can reveal eternal truths about a people, about all humanity, about the world, and about you and me. But to treat myths as history or as science, whether for the purpose of discrediting them or of exalting them beyond all reason, is to grossly misunderstand their origins, their purpose, and their true value.

A Letter To My Students:

When you go out into the world, especially the academic world, you will undoubtedly be confronted by something called “reductionism.” Reductionism is the practice of reducing one thing to some other thing: analyzing a whole into its parts and then asserting that the whole is equal to one of those parts.

For example, when you say you are in love, the reductionist says, “What you call love is actually the experience of endorphins and other biochemicals produced by your body in response to the physical—visual and/or auditory and/or pheromonal—presence of a possible mate.”

“But I’m in love,” you say. “I want to spend the rest of my life with this person!”

“Yes, but what you call ‘love’ is just the endorphin experience I just described, coupled with feelings of love and care that have evolved because of the evolutionary dividend of ensuring that your offspring will survive the harsh world due to the aid and protection of two parental figures.”

“I guess you’ve never been in love!” you reply, exasperated.

The problem is not that the reductionist’s accounts aren’t true, but that they are incomplete. The reductionist almost always becomes so excited about the fruitfulness of his reductionist theory that he forgets that it is just one aspect of the thing at hand. He becomes blind to the truth of the whole and becomes enraptured with his own little theory. This is easy to do in the case of intangible things like “love.” It is more difficult the more concrete and undeniable the thing in question is, such as when you see your reductionist friend outside after class, enjoying an apple.

“Wow, this apple is delicious!” your reductionist friend says. You decide to have some fun:

“You know, what you are calling a delicious apple is merely the mind’s integration of biochemical responses produced by the malic acid of the apple triggering taste receptors in the mouth, causing the evolutionarily developed habit of salivating, chewing, and swallowing in order to increase caloric intake proportionate to metabolism.”

“You’re a jerk,” your reductionist friend replies.

The reason for your reductionist friend’s anger is that he knows you have called him out on the pitfall of his worldview. You are exposing the fact that if we want to be reductionist in our view of the world, then we have to be reductionist in our view of the whole world. But reductionists tend to reduce only the things they want to reduce. If we saw this conversation continue, we would probably hear you say, “What? Isn’t my reduction of your apple the same as your reduction of my being in love?” And we would probably hear your friend reply that they are not the same, because the apple is somehow more real than love.

The problem with this reductionist view is that the only things allowed to count as “real” are those that have objective reality. Love is not real because it is highly subjective, and so we call it an epiphenomenon that can be reduced to endorphins, pheromones, and the like. Apples, on the other hand, are real because we can all see them and feel them and touch them and eat them. While we may all disagree on the merits of the taste of an apple, none of us can disagree that the apple exists.

This view of what counts as real and what does not is the product of what John Clayton called the “Enlightenment Project,” that collective endeavor that took place beginning in the 16th century in Europe in which intellectuals decided that what counts as knowledge is only that which can be established by a combination of firsthand experience and reason, without recourse to any kind of tradition, authority, subjectivity, or emotion. It is this narrow prescription of knowledge that presumably allowed us to break free from the intellectually repressive and human rights-oppressive bonds of religion and allowed us to study the world in such a way that has yielded us all the wonders of modern science and technology.

Unfortunately, the Enlightenment worldview reduced religion to only one aspect of itself (morality) and undermined any claim it may have had to other truths. I speak in the past tense, but this is still the position of religion in much of academia and other public intellectual spheres. I am thankful that, after seeking out places to study religion in graduate school, I discovered that there are an increasing number of schools and departments that contain both reductionist and nonreductionist thinkers. These institutions, too, are largely the product of the Enlightenment, and so I don’t want to beat up on the Enlightenment too much. As Clayton once said to me, “The fact that we are able to be here critiquing the Enlightenment Project in a safe, legally protected space is due to the Enlightenment Project!”

Many post-Enlightenment thinkers critiqued religion. They saw that there were many religions with many diverse and conflicting beliefs and practices. This meant that religion must be subjective and therefore relative, and so not worthy as a source of knowledge or wisdom. These thinkers looked past the subjective aspects of religion and studied only what the religions had in common, what was objective. Initially this meant the golden rule, which is present in all the major religions. Later, theorists focused on religions’ institutionalized means of social control, and so reduced religion to that. Feuerbach, Marx, Nietzsche, and Freud are the examples par excellence.

In this reductive view, religion was seen as the enemy of scientific knowledge and of human rights. But we have forgotten—or we never knew—that religion also made space for these same developments. Medieval Islam, for example, saw the world as the “Cosmic Qur’an,” or revelation of God, and so studying it was seen as a sacred duty revealing to us aspects of God. This is why so many scientific and mathematical discoveries were made in the Islamic world during that time. On the human rights side of things, we could look back even further within Islam to Muhammad’s revolutionary elevation of the status of women and orphans in the extremely oppressive culture out of which he sprang in the sixth century CE. Or we could note the Buddha’s rejection of caste and gender differences in India way back in the sixth century BCE. Or, more recently, we could point to the inspiration toward economic and political revolution that Christianity provided in Central and South America in the twentieth century.

For all of the insights reductionists have given us into religion (for they have; their insights are not false, only incomplete), they have mostly been unable to see religions in all their aspects. Scientifically, we have a duty to study those aspects as well. Even when someone like Jonathan Haidt studies one of the common positive outcomes of religion, as he does in his illuminating book The Righteous Mind, the problem is that it still remains a reductionist account. Religion becomes a useful adaptation that creates social cohesion and so has the evolutionary payoff of helping us to work together and survive the harsh world. Can that really be it?

You might be wondering if there is a piece of religion that resists reductionism. If scholars have studied the negative and now the positive and have found reductionist theories in both areas, then is there anything left? The answer can be found in an important book of which too many people are ignorant: The Two Sources of Morality and Religion by Henri Bergson. Bergson acknowledges the social-control and cohesion aspects of religion and the legitimacy of the critiques of those aspects. But he points out that there is another source of religion and morality: the subjective experiences of individuals. Sure, there is the Moses who hands down the 613 laws to the people. But there is also the Moses who stands dumbfounded and in awe of God’s presence at the burning bush. There is the Muhammad who provides Muslims with guidance for almost every major and minor social interaction one can think of, but there is also the Muhammad who was comforted by God’s words to him, that “wherever you experience hardship, this will be followed by ease.” And who could possibly ignore the Buddha’s tremendous introspective insights into the nature of consciousness and reduce him to merely a social reformer?

In academia, there have been some who felt there was wisdom and value to be found by reaching back into religious traditions. People like Mircea Eliade and Huston Smith, while respected in academic and popular circles, were sometimes labeled “traditionalist” or “conservative,” with all of the pejorative connotations that those terms held for those who saw progress as possible only within the confines of the secular trajectory of the Enlightenment Project. However, I would argue that these thinkers were the true progressives, the ones who first rediscovered value in beliefs, practices, and narratives other than their own, who insisted on the value of pluralism, and who saw that diversity is a prerequisite of intellectual, emotional, psychological, and spiritual growth. I use the past tense here because it is my (slightly overly optimistic) view that we are beyond the narrow Enlightenment Project view of the world and of religion. I see a re-appreciation of religions and the religious in academia, in spite of the fact that our world is still plagued by the negative as well as the positive experiences of this phenomenon.

Do scholars ever transcend the reductive tendency? There are many who have. There are those like Paul Tillich: scholars who happen also to be religious, and who use scholarship to further their religious efforts. There are those like Carl Jung, who saw scholarship as a way of opening up deeper understandings of the spiritual truths of the universe, finding in the common themes and structures of religion a human path that transcends any one particular religion. There are natural scientists like Andrew Newberg who, while using fMRI scans to illuminate the neurological aspect of human experience, know that this is just one aspect of that experience and that we must guard against reducing any part of life to this one measurable picture of it. And there are those like Ninian Smart, who emphasize the value of all disciplines in trying to grasp something that is too huge to ever be fully comprehended even by all lenses, let alone by any one.

And, more generally, there are the writers and poets and artists and musicians who daily do justice and reverence to the particular, the subjective, the mystical. These genres of creativity are more naturally inclined to this (someone once defined the artist as one who makes us notice the universal in the particular), but I find that it is especially the voices of minority and underprivileged persons who have aided all of us in resisting the tyranny of reductionism (for we are all influenced by it, as it is the dominant narrative). For people like Alice Walker and Emily Dickinson and Walt Whitman, it is their very underprivileged positions that give them a privileged view from which to break free of the dominant trends of the mainstream. To them we are grateful, as well as to the voices of any worldview different from our own, for they can all help us to see outside of our own boxes. That is, so long as we translate and interpret them with integrity, and not just with an eye to what we want to find confirmed in them.

I should also point out that we are all reductionists in a sense. Any time we analyze a phenomenon and attempt to explain it in terms of something else—a model, or an image, or a series of concepts—we are reducing things. Basically, we are reductionists whenever we open our mouths, whenever we say that something “is” something else. I think this reductionism is okay and even necessary. Or perhaps it is not necessary, but it is not necessarily destructive. A music scholar might analyze a John Coltrane recording to find the scales and modes and chord substitutions Trane used in his saxophone solo, but this does not mean she loses the power and magic of the experience of the music, or thinks that she can reduce it to just those notes. But so much of our discourse is reductive and analytical that we must be on guard against what the Buddhist Lankavatara Sutra describes as “mistaking the finger pointing at the moon for the moon.” The only way to avoid this is to use language in an apophatic (“speaking away”) manner, insisting that your listener must experience the thing for him- or herself, with no delusion that what is being said captures the experience itself.

And so, all of you mystics (and we are all mystics according to the reductionist, at least any of us who have seen God, or felt the oneness of all things, or been in love, or enjoyed an apple): consider yourselves forewarned of the reductionist voices you will encounter in your future endeavors to make sense of the world. Recognize that they have important perspectives to offer. Finally, know that their reign has ended, and that their voice can exist only as one among many ways of seeing and being in the world.