This post is a bit different. I am attempting to link an analysis of psychology to the historical narrative of modern American Christian fundamentalism. Most of the post is the buildup of an argument founded upon modern psychology, while the last third of the post is reserved for more theological questions. Keep that in mind. I think the post gets less dull and more interesting as you move through it. (But I think all of this is fascinating!) Anyway, let’s get started.
A Brief Look at Logic in the Human Mind
Developmental psychologist Jean Piaget spends the first half of his book, “The Psychology of Intelligence,” dispelling certain myths about logical thought. Piaget notices that, despite popular belief among logicians of his day, humans do not think entirely logically, nor is logical thought the apex of human thought in general. To him, logic is only a set of operational tools that help us modify other, more intuitive, mental constructions. These mentally created, intuitive concepts are not logical, but they collectively form a sort of “mental universe” in which our minds live and operate. This universe in the mind is a woven web of created thoughts, of unconscious intuition that makes up literally everything you know or take for granted, whether that is the physical space around you, some imaginary concept of the Earth as a whole, your understanding of “self” in the context of your world, and so on. These mental constructions lie at the core of human thought, especially religious thought.
Your “mental universe” is modified by new information and logic. Logic fills in gaps in your web of thought, linking concepts to each other. But when you receive new information that structurally contradicts a few other bits of information in your mental universe, your mind goes into a state of “disequilibrium,” and you must logically rearrange either the new data or entire sections of your own massive mental universe. You do this in order to make the new data fit properly within the mental universe, establishing mental “equilibrium.”
Logic is not the foundation of human thought. It is merely an operational tool that helps us modify a constantly fluctuating web of related ideas. Without it, we would just assimilate facts at random, regardless of their compatibility with existing thought structures, and we would be certifiably insane, or perhaps we would have the intelligence of a 2- to 6-year-old (I mean this seriously). Sometimes logical operations are complex, and it can take serious mental effort, even incredible strain, to keep track of all the operations necessary to logically relate a series of distant ideas. But more often we make easy work of obvious syllogisms. Most of the time, unless we think really hard, we overlook the subtler logical connections, because those connections do not immediately assault us with mental disequilibrium. The ideas involved may simply be too logically distant for us to easily detect their relationship. We only modify our construction of the mental universe when new facts present a logical disequilibrium of the mind. We can handle an incompatible fact (or set of facts) in one of three ways:
1. We can reject the fact outright, dismissing its conceptual validity and expelling it from the intuitive web of thought in our own minds. This is a common means of protecting our broad constructions of reality. For instance, if you witnessed five generations of your ancestors walking around in front of you, perfectly alive and talking to you, your first reaction might be intense fear. You probably could not handle something so far beyond your mental framework of the possible. After it is all over, however, you might tell yourself it never even happened, and try to believe it never did. You completely deny it. But this should not be confused with rationalizing the fact, which is different. Rationalization is option 3 below.
2. We can accept the fact, but without any critical insight into it. In psychological terms, we can accept the fact for purposes of recall and other minor uses, but not attempt to perform logical operations that link the fact to problematic sectors of our mental universe. However, as soon as we try to apply it to these incompatible sectors, we are forced to confront it and resort to options 1 or 3, or simply regress into the non-confrontational stance where we began.
3. We can rationalize the fact, playing through the variety of mental actions that may eventually form propositions we call rational ideas. But these ideas should only be understood as rational to the individual, while they may appear irrational to others. Inevitably, the new fact is almost never the object modified by logic. It is almost always our own perception of reality that is modified. We may question the fact, too, but a single idea is less likely at fault than is the structure of the malleable web of intuition that makes up our entire mental universe. In some cases, we will encounter facts that, if we take them seriously enough, will cause us to completely deconstruct whole systems of thought. Cases like these constitute the essence of a midlife crisis, or the demythologization crisis that occurs in the minds of many young college students.
The decision to take any one of these steps is usually made in a matter of seconds, but sometimes, when we encounter a large number of congruent facts that line up against our worldviews, we might decide to take option 3 and work it out in our minds. In this case it may take us months or years to form a new stable paradigm. And it can be painful, too. Nothing hurts like knowing we are wrong about the things that mean the most to us. It would be very difficult for a parent to admit to himself that he made horrible mistakes in raising his children. That would be almost impossible to swallow. We can also imagine what it must be like to admit we have been wrong about aspects of religion, especially when we are so sure of our intense, often ecstatic, religious experiences. These experiences are personal, and we hold on to them for dear life. In fact, coping with the deconstruction of long-held religious “myths,” as they are called, can drive people mad. Yet this kind of deconstruction happens to most people at some point in their lives, though often in completely unique ways. The plus side is that deconstruction is naturally followed by a period of “reconstruction.” We build more stable mental universes that way.
On a side note, all pastors and theologians should take note that deconstructing one’s religious ideas can be the most gut-wrenching feeling a person will ever experience. St. Augustine’s conversion comes to mind, but my own experience has taught me more than any book ever could. I have had only a handful of such experiences, but my most recent was the most deeply felt. It turned my world on its head, and it took me several destructive months to cope with it. With that in mind, I warn pastors not to take someone’s religious conviction lightly, no matter how ridiculous or seemingly illogical it may appear. People have literally killed themselves over these kinds of mental changes.
Bigger than Logic: Narrative
Logic is a means of modifying what we have called an “intuitive web of thought” or a “mental universe.” More specifically, these terms refer to what we call “narrative.” Narrative is the final form of an adult human’s conception of reality.
Narrative may also be called “story,” or what academics call myth. We do not fully develop the capacity to understand narrative until adulthood. If you try to get a 4- or 5-year-old to tell a story, she will usually spout off a series of random details, never linking them with any sort of logic. She cannot tell stories. But through the normal process of development, she will eventually learn to tell them and understand them.
This development begins in early adolescence, when the first item on the psychological agenda is developing a “myth of self,” or a “personal narrative.” Many call it an “identity crisis.” A normal adolescent is trying to enter the adult world, but she needs a story, a narrative, a myth, to give her life purpose. That is what an identity is. You see, it is in story that we find meaning, and each of our self-created stories is truly unique. If you take a moment to think about who you are, what kind of person you are, you will eventually finish using simple adjectives like “kind” and “smart,” and you will begin recounting your story. You might ponder where that story will lead in the future, or even how it will end. That is why any good movie you watch or book you read has a great story. Stories reflect how we imagine our own lives. You can only really relate to a character when he is living a story that resonates with your own. And you are always the protagonist.
Narrative even applies to religion. The Christian Bible is a case in point. If the Bible simply contained a set of moral rules and quotations from our Heavenly Father, maybe even a scientific explanation of physics and astronomy, nobody would have related to it. It would have been just a boring old book, and I doubt anyone but the most devout nerd would ever open it, let alone preserve it for thousands of years. Nay, its genius, its historical tradition, and its religious meaning—all of that belongs to its narrative.
The Bible doesn’t try to tell us facts. It does so much better. It tells us a story. At first it’s a story of civilization, of a people coping with morality, nature, war, famine, beliefs, politics, even an ancient form of civil disobedience (which you can read about in the Prophets). It proceeds into an account of all the greatness of God finding its way into the flesh of a man, a man who overcomes war, famine, moral dogmatism, death, and even evil itself. His name is Jesus. And this God-in-the-flesh teaches us his ways. He offers us hope and purpose, a guiding “way, truth, and life” for all of humanity to follow. But the story does not end there. It is open-ended. It lets us live out the end of our own story, and it never tells us how we will end up living it.
But Narrative Is Not Just For Individuals—It Is For Communities
To some people, our incapacity for flawless logical thought is a fault in human behavior. Many people suggest you should aspire to think only in the most logical terms. Modern science disagrees. In fact, it makes more sense to say that, as organisms in nature, our methods of thought are purely natural. But while we lament that others don’t see eye to eye due to this or that logical “flaw,” we can never rationally expect people to build the same mental constructions of the world that we do. They have different lives. They are basically living in different universes. At least different mental universes. For instance, when you offer someone information that you know to be key to your own understanding of modern politics, that information may have little relevance to the mental universe he lives in, so it’s unconvincing. Tough luck. Try another route.
But for the most part, logic is convincing. And, believe it or not, convincing people is its purpose. You regularly use it to convince yourself of ideas, though you might not think of it that way. You also use logic to convince others. According to our discussion on narrative, you use logic to modify your mental universe (I think I have officially coined a term here), but you can also present others with the opportunity to modify their own. That is why we communicate in the first place. We may not even be in an argument. If I tell you that the table is already set, and that you don’t have to worry about setting it, I have logically modified your mind to understand your physical universe in a new way. In your old system of thought, the table hadn’t been set yet, but in the new one, it has. Then you adjust your behavior accordingly. You might do something more productive than trying to set the table a second time. Humans do this in the most helpful ways, and social discourse helps us mutually modify our thoughts so that we arrive at more similar mental universes. By seeing the world in similar ways, we form closer communities, ones we feel more psychologically comfortable with. It’s a natural human thing to do.
However, left only to our own experiences, we will soon develop some pretty crazy ideas. “Old wives’ tales” are good examples of this, as are psychologically diagnosed delusions. Without a social influence, our own logic takes us to strange places. With our limited observations and preexisting knowledge, logic on its own will only recognize patterns, and those patterns can point to strange conclusions. It may seem logical that every time you sacrificed a goat to such-and-such god, he provided you with plenty of food the coming winter. Based on your experience, this might be completely logical, even if not rigorously scientific.
This is why logic itself will never lead us to truth. Only by listening to others and joining them in discussion will we ever learn truth. This requires openness to critique. Sometimes we may be unwilling to relinquish our firm beliefs in the face of the harshest criticism. Psychologically, this makes sense. In some cases, relinquishing our most precious beliefs is the equivalent of an apocalyptic destruction of the world itself. To the human mind, this could not be truer. Deconstruction of myth is a real psychological breakdown. It destroys whole sections of your mental universe. You may even begin to question whether anything is true at all. But as you pick up the pieces, you form a new one, one that is more malleable and adaptable. (Typically this happens in young adulthood, but for many people, this kind of intense deconstruction may not ever happen. Some people never complete this sort-of-optional developmental stage.)
Nevertheless, openness to new ideas is at the heart of community formation. But the opposite is also true. Complete rejection of new ideas is at the heart of separation, and at its worst, even war itself. When people desire to understand each other, bonds are made, bridges are built, hands are shaken, peace is established. When no understanding is sought, isolation becomes the norm, and fear sets in, in turn giving more reason to remain isolated.
Given the above tools for understanding human thought, my argument is that “isolation” is the foundation of religious fundamentalism.
The Foundation of Fundamentalism in Isolation
First, let us discuss a historical case of Church self-isolation.
Galileo based his astronomical work on that of Copernicus. He determined, like Copernicus before him, that the heavens did not revolve around the earth, but that the earth revolved around the sun, an outright heresy in the Catholic Church of his time. In fact, Galileo wrote to the religious authorities basically telling them to back off. He said theologians should stick to their trade, and he would stick to his. This wasn’t only a problem with Catholics, though. Many colleges and universities all over the world began teaching Copernican astronomy as soon as it became public. Harvard was one of the most criticized for Copernicanism, and America was Protestant then. In any case, most of the good Christians of the Western world, Catholics and Protestants alike, promptly flipped their lids at the thought of not being the very center of God’s heavenly creation.
But the point here is not to describe any sort of victory of Science over Religion. That question itself is really a fallacy. In fact, in the academic community, the “Science vs. Religion” paradigm is often dismissed, because these two streams of thought are not necessarily at odds.
The point is that it took a lot of convincing to get the Church on board with modern astronomy. It took centuries of theological development before the solar system was a mainstream concept. These days you will not find many Christians who disagree that we revolve around the Sun. But you may find a lot of disagreement over how to handle the now mainstream concept of evolution. Against both scientific astronomy and modern evolutionism, the Church has always found passages in Scripture to back up the traditional orthodox view. But we don’t use those arguments anymore against scientific astronomy, because the Church has since come up with ways to assimilate it into its theology. Indeed, it is already beginning to assimilate evolutionism, and I suspect evolutionism will be widely taken for granted by the end of the century. But the Church would never have come up with Copernican or Galilean astronomy on its own, not in that age, in its self-isolation. It needed painful convincing.
What about the field of theology itself? Well, let’s look at the Protestant Reformation. The real problem of the medieval Catholic church was not inherently that people could not read the Bible for themselves. The underlying problem was that you could not question Catholic authority. Supplying lay people with Bibles in their native languages was a means for them to challenge authority, to take back control of their minds and religious behavior from the hands of the self-isolated clergy. It was this isolation that kept the religious leaders in scientific and theological darkness, not their desire to lie about what they knew. Catholic leaders felt very strongly that their beliefs were true, and they quickly burned, tortured, and excommunicated those who thought differently. In effect, they isolated themselves from competing ideologies.
Many groups of Protestants in America do this same thing, just without all the torture. They isolate themselves into communities based on creeds or ideologies. Early on, this took the form of denominations, but denominations were much more regional then. They were more homogeneous, regional entities that resembled the Catholic form of isolation. These days religious groups are much more intermingled in cities, so over the past century they have developed different ways of religiously isolating themselves. This sometimes takes the form of denominations among older generations, but more and more people are forming new groups, like the “emergent church,” the “fundamentalists,” and the “new monastics,” along with a general trend among young people toward “social justice” church movements. And these movements extend far beyond the boundaries of denominational lines.
Any group of people tends to become isolated to some degree based on interests and beliefs. The New Monastics may seem to harmfully isolate themselves, but really they are embracing a broad and diverse community, the community of the poor and downtrodden. Instead, the group that stands out the most in modern Christianity is the fundamentalist movement. And of course it stands out—it gets all the media coverage, mostly because it provides awesomely controversial soundbites.
Today fundamentalists isolate themselves out of intense fear. In their minds, not only sin, but also wrong ideas about relatively minor things, will lead them to an eternity in hell. As the thinkers of the Enlightenment consistently refuted various factual readings of the Scriptures, fundamentalists emerged as a reaction to an encroaching community of critical thought. In retrospect, they should have engaged this intellectual community, but they refused.
By the beginning of the twentieth century, fundamentalists had already isolated themselves into fringe communities, developing all sorts of new myths about their “secular enemies.” I’ve heard many of these myths firsthand. Some of them report that outside thought is a demonic attack by Satan or his minions. Likewise, any information causing you to doubt the beliefs of your community is a satanic temptation. You can see how this would naturally scare people into intense isolation from society. The most famous myth that came from this movement was the belief in a brutal, hellish account of rapture. In the seventies, Hal Lindsey wrote a book called The Late Great Planet Earth. It sold like wildfire, both to fundamentalists looking for a community narrative and to mainstream onlookers who wanted to watch the insanity unfold. The book depicted a series of future events that would end in the earthly torture of secular outsiders, followed by their destruction and eternity in hell, while the theologically devout, “true,” Christians would escape destruction in a magnificent rapture. Later, in the nineties, the Left Behind series was released, following the same trend, even producing a film adaptation.
Sociologically speaking, these myths were created and adopted by fundamentalists as theological doctrine in order to give them a community narrative. They fulfilled their psychological need for an epic, larger-than-life, otherworldly purpose. In this case, they also justified their isolation. Their regression from mainstream society was no longer merely the result of their fear of competing ideologies; it was the means of Christian purity. By keeping their beliefs pure they could escape the coming destruction of all things. It also provided a motivation for action and evangelism. All in all, this apocalyptic narrative strengthened the community of American fundamentalists, and for obvious sociological reasons.
What Can We Learn from Fundamentalism?
Narratives are powerful binding forces. They are the glue that holds us together. Stories give us reason to think, to pursue relationships, to find jobs, to fight wars, to build nations, to pray. Stories give us a reason to live. Without our own set of stories, stories we can share with other people, life appears meaningless. But when our narratives keep us in a state of constant fear, they bind us to fearful communities. The fear of outsiders, or xenophobia, leads us to fight otherwise irrational wars, even commit genocide. Only by trying to understand people can we hope to find peace with them.
What can we learn from this knowledge? For us to form healthy communities, especially religious communities, we need to think, speak, read, and learn in community. We can think logically on our own, but logic does not lead to truth or wisdom. When we are left to increasingly small communities, we are limited to our personal experiences and narrow insights. When we open our minds to see things from the perspective of outsiders, foreigners, or even other religions entirely, we can begin to approach true wisdom. If we let people convince us, even just occasionally, our mental universe expands, and we begin to see things from a clearer, wiser perspective.
Best of all, we don’t have to fear the great world outside. We don’t have to fear new theology. We don’t have to fear a different reading of Scripture. If history has anything to teach us, there was never anything to fear, anyway. One of the most often spoken phrases in the entire Bible is, “Do not be afraid.”
Peace I leave with you; my peace I give you. I do not give to you as the world gives. Do not let your hearts be troubled and do not be afraid.
[Update: It has come to my attention that in my discussion of Piaget's theories of logic, I do not make clear the distinction between conventional 'propositional' logic and Piaget's 'serial operations,' and I mistakenly referred to both, interchangeably, under the general name "logic." I did not explain the process of 'serial operations,' so I used a more colloquial 'logic' to describe it. Also, I neglected to cite several sources. The most important of these are James Fowler's "Stages of Faith," Sharon D. Parks's "Big Questions, Worthy Dreams," and Paul Tillich's "Dynamics of Faith." I want to make clear that, like James Fowler, I make no qualitative claims for the process of 'de-/reconstruction,' nor do I suggest that 'propositional' logic should be the mode of discourse between communities a la Enlightenment progressivism. I only suggest that discourse with outsiders should be made with efforts to understand them rather than simply fear or antagonize them.]