This is a book about methodology. An intimidating subject, some might think; others might have worse names for it. Let me assure the reader from the outset: although the book is about methodology, there is little if any explicit methodology in it. This is not a book that will tell you how to think or how to solve problems. What it seeks to accomplish is merely to illustrate, in a series of essays on disparate topics, the consequences of following a specific methodological ideal. What is this methodological ideal? I call it realism, which simply means being honest, making the truth come first. If we apply all the zeal and intelligence at our disposal to remaining faithful advocates of fact and truth, it becomes less probable that we will spend our days feeding on illusions and wasting our lives in ignorance and despair.
It is easy enough to say one should be realistic and truthful, but how does one go about attaining this end? How does one make truth come first? The central premise of this book is that the greatest obstacle to the attainment and understanding of truth is human pride. Arrogance and egotistical presumption constitute the great enemies of truth, because they cause us to believe our own lies. Human beings have enough trouble remaining honest with others. But remaining honest with themselves poses difficulties that few can surmount.
As bad as it is to deceive others, it is even worse to deceive oneself. To know that something is a lie at least implies knowledge of truth. The case stands otherwise when we lie to ourselves. The lie we tell ourselves prevents us from ever knowing the truth. Through self-deception and a kind of “sincere” hypocrisy, the mind quarantines itself from reality. No longer can it distinguish fact from fiction and truth from error. Believing lies to be truth, the mind ends up becoming the victim of its own confidence scheme. The most dangerous lie is the lie we tell ourselves.
Unfortunately, human beings are little inclined to guard against the tendency toward self-delusion that exists in all of us. Individuals not only lie to themselves; even worse, they lie about themselves. Literally hundreds of experiments in social psychology have shown that most people think better of themselves than is warranted by the facts (1997, 422). The human beast, far from being the rational animal that romantic enthusiasts imagine him to be, could more accurately be described as the mendacious animal.
Is there any remedy to cure the innate mendacity of man? If by remedy we mean some trick or device to get rid of the predisposition altogether, the answer must be a resounding “No!” The limitations of human nature are congenital. Science ascribes the tragic fixity of these limitations to our DNA; theology blames them on original sin. Regardless of their origin, they are obvious facts that can be verified through everyday experience and the testimony of great literature.
Even though there exists no absolute remedy to cure the ills of self-deception, several stratagems can help us cope with the problem. The most important of these stratagems involves the individual’s humble acknowledgment that he, just like everyone else, is prone to self-deception. Until the individual admits his common humanity, which includes admitting that he, too, is subject to the limitations of human nature, there is no hope for him. It is by noting these limitations that the individual takes his first tentative steps toward the kind of blunt honesty and truculent realism advocated in this book.
Another important stratagem is to recognize how our tendency toward self-deception can affect the way we think. We all have a tendency to rationalize what we want to believe, even when our will-to-believe plainly goes against the grain of the truth. How do we go about combating this fatal tendency in ourselves? Two principles are critical in this respect. The first is openness to criticism: the eagerness to expose our ideas to the most rigorous process of experimentation and testing. The second is respect for ideas, notions, and usages that have stood the test of time.
At first blush, these principles may appear contradictory. After all, if all our ideas require the most rigorous process of experimentation and testing before we can accept them, why should ideas that have stood the test of time be respected above any other? Aren’t all ideas so much grist for the experimental mill? That would be a superficial way of regarding the issue. Any idea that has stood the test of time has already been rigorously tested: otherwise, it would long ago have become outmoded and forgotten. The fact that it is still around evinces its probable soundness. Moreover, not all ideas can in truth be rigorously tested. Some are based on very complicated judgments about the human condition, judgments so complicated that they could never be adequately justified on explicitly experimental grounds. Other ideas are based on incomplete information. Human beings are not omniscient. We can’t understand everything, because either the circumstances confronting us are too complicated or we don’t have access to all the relevant facts. If we could only evade the necessity of making a decision based on inadequate information, all might be well. Often, however, we are forced to act on incomplete information. What, then, should we do? In such circumstances, the best course is to follow established usages, because these usages, even when they are not entirely understood, have proven useful to generations of men. There is probably something good in them, and that something should be respected.
This does not mean that traditional ways of thinking and acting can never be challenged. Everything can and, when possible, should be challenged. The question is whether, once a traditional norm has been challenged, it should be rejected. Since any long-standing tradition will probably have proven its usefulness and fecundity many times over, this in itself constitutes a very strong presumption in its favor. So before we reject any tradition with a long and distinguished track record, we had better have very compelling evidence on the other side of the question. One should not, for instance, reject a long-established custom or some incidental ancient usage merely because it isn’t “perfect” or because it fails to accord with “reason.” What one needs is very compelling evidence that the custom or ancient usage in question is pernicious, or could be replaced by something more effective. Without such evidence, one should be wary of tampering with traditional ways of thinking and doing.
I have stressed the importance of evidence, because evidence usually constitutes the best way to get at the truth. Whenever some question arises over a matter of fact, the best way to settle the issue is to consult the relevant evidence. The worst way is to “reason” about it. What philosophers call “reason” is little more than clever rationalization. Knowledge of facts cannot be attained merely by reasoning about them. Reality is not a closed logical system; one fact does not logically beget another. Although conjectures about facts can be formed through logical speculation, such conjectures have little cognitive worth until they have been tested empirically.
To sum up: getting at the truth requires (1) awareness of the extent to which all of us fall prey to self-deception; (2) openness to criticism, particularly self-criticism, combined with an appreciation for established usages and venerable traditions; and (3) the realization that the best method of testing any claim about matters of fact is to consult the relevant evidence. If the individual allows these principles to help mold his attitude toward the business of getting at the truth, he will at least have a chance to see the world as it really is, free from the distorting influences of hubris, self-deceit, and wishful thinking.