Beyond Good and Evil

Dr. Ronnie J. Hastings


God –The Ultimate Meme, or The Problem of God

In Perception Theory and Memes — Full Circle, [March 2019], the epistemological concept of memes was used to “tie together” the basic concepts of Perception Theory, “circling back” to the beginnings of the theory. This tying-together of memes into Perception Theory, if you will, was done within the group of related posts having to do with Perception Theory.

Similarly, this post ties together two groups of posts, one again being the Perception Theory group (Group II.) and the other the origin-of-Christianity group (Group I.).  Both groups of posts share the constituent subjects of God and religion, or, to use my phrase, god and god stories.

Group I. consists of Sorting Out the Apostle Paul, [April, 2012], Sorting Out Constantine I the Great and His Momma, [Feb., 2015], Sorting Out Jesus, [July, 2015], At Last, a Probable Jesus, [August, 2015], and Jesus — A Keeper, [Sept., 2015].  It is a personal journey of religious belief utilizing history as a forensic science and my own “spiritual” experiences as a guide toward understanding how Christianity (and, by extrapolation, all religious systems of belief) came about.  It utilizes modern biblical criticism and the application of philosophy’s Occam’s Razor.  Conclusions gleaned in this group of posts rest upon the separation of theology and ethics, the former seen as mostly epistemologically and intellectually toxic, and the latter seen as epistemologically, intellectually, and socially essential and vital.  As the title Jesus — A Keeper, [Sept., 2015] implies, Christianity’s value (and by implication the value of all religions) lies in the time-proven ethics of the Golden Rule or Principle of Reciprocity, not in theology.

Group II. is much larger numerically, which correctly implies its greater subject breadth and depth.  It consists of Perception Is Everything, [Jan., 2016], Perception Theory (Perception is Everything) — Three Applications, [Feb., 2016], Perception Theory:  Adventures in Ontology — Rock, Dog, Freedom, & God, [March, 2016], I Believe!, [Oct., 2016], Hope and Faith, [Jan., 2017], Prayer, [Feb., 2017], Egalite: A Qualified Virtue, [Feb., 2018], Going Global, [March, 2018], AVAPS!, [May, 2018], Toward an Imagined Order of Everything, Using AVAPS, [June, 2018], The "Problem" of Free Will, [June, 2018], and, as indicated above, Perception Theory and Memes — Full Circle, [March, 2019].  This group develops a universal ontology and epistemology under the heading "Perception Theory."  Perception Theory is a combination of rationalism and existentialism which enjoys a wide range of applications, as demonstrated in Perception Theory (Perception is Everything) — Three Applications, [Feb., 2016] and The "Problem" of Free Will, [June, 2018].  In addition to illuminating directions of modern political and economic theory, Perception Theory particularly sheds light on topics from Group I., as shown by Perception Theory:  Adventures in Ontology — Rock, Dog, Freedom, & God, [March, 2016], I Believe!, [Oct., 2016], Hope and Faith, [Jan., 2017], and Prayer, [Feb., 2017].  Hence, from the perspective of sorting out "god and god stories," much of Group II. seems like a continuation and elaboration of Group I. (as the posting dates on www.ronniejhastings.com (site name Beyond Good and Evil) might indicate).

The blending of memes "full circle" into Perception Theory (Perception Theory and Memes — Full Circle, [March, 2019]) indicates that a common theme woven throughout both groups, the "what" and "why" of gods and god stories, will also come "full circle" of its own.  Philosophy of religion often posits the "problem" of God.  As with the "problem" of free will (The "Problem" of Free Will, [June, 2018]), a question is begged:  is there need of a "problem" at all?  The epistemological questions surrounding the formation of Christianity (and of all religious sects, for that matter), coupled with the suggestion that ontological differences among theists, atheists, and agnostics are silly and absurd (Perception Theory:  Adventures in Ontology — Rock, Dog, Freedom, & God, [March, 2016]), imply, in my opinion, that a resolution of any such "problem" is highly plausible.

{Here it is necessary to interject that the more familiar the reader is with the content of all the posts referenced above, greater and swifter will be the understanding of that which is to follow.  Bear in mind that, as always, “understanding” is not necessarily the same as “agreeing.”  Listing all the posts above emphasizes that the “full circle” attempted hereafter is not some momentary epiphany, revelation, emotional experience, recent whim, or musing, but, rather, is the result of years of methodical, careful thought leading to satisfying  personal conclusions.  That they would be satisfying to anyone else is unwarranted speculation on my part.  Achieving understanding (not necessarily agreeing) with others may be a forlorn hope (See Hope and Faith, [Jan., 2017]), but achieving any understanding from others at least would provide relief from any lingering angst over my personal “subjective trap” (See Perception Is Everything, [Jan., 2016]) — adding to the personal relief memes give (See Perception Theory and Memes — Full Circle, [March 2019]).}

In dealing with gods and god stories in terms of memes, we do not start "from scratch"; all terminology has been defined in the above posts in both Groups I. and II.  The context of our start is (1) we are star-stuff in self-contemplation, and (2) math is the language of the universe.  To this context is added (3) God is a looped, non-veridically based concept in our heads, or a meme having no resonance with the "real" veridical world or universe outside our epiphenomenal minds contained in our veridical physiological brains (Perception Theory:  Adventures in Ontology — Rock, Dog, Freedom, & God, [March, 2016]).  Therefore, God exists as does a unicorn, as does Santa Claus, as does the tooth fairy, as does Satan.  The same existence applies to the generic term "gods" as well as to stories about God, or god stories.

Memes or concepts of the veridical world outside us, like the idea of “rock” or “dog,” are non-veridical, like the memes of gods, but with a very important difference: they are resonant memes, resonating with the empirical data bombarding our senses when we experience a rock or a dog.  We use our epiphenomenal  imaginations to create memes of both looped concepts (non-veridically self-contained in the imagination) and resonant concepts (non-veridically related with the veridical “outside” world indicated by our continual “pouring in” of empirical sense data).  Imagined worlds in science fiction are looped memes and scientific theories are resonant memes.  “Scientific” objectivity is making memes as resonant as possible, or as veridical as possible (AVAPS!, [May, 2018] and Toward an Imagined Order of Everything, Using AVAPS, [June, 2018]).

Certain looped non-veridical memes, like Santa Claus and Satan, are made to appear resonant by saying Santa Claus is the "personification" of Christmas giving or Satan is the "personification" of human evil.  Personifications are like avatars, or manifestations of something else.  If the "something else" has a veridical existence, again, like a rock or a dog, then it would not be looped.  The behavior of giving at Christmas and acts of human evil are real enough, just as human values like "love" and "freedom" are real enough, but equating the spirit of giving with a human form, or evil acts in general with a human form, is as absurd as equating all the facets of human love to a single form (like a pagan goddess) or all the facets of freedom to a single form (like Miss Liberty).  Therefore, just as a goddess such as Venus or Aphrodite does not exist like a rock or dog, and a historical woman named Miss Liberty does not exist like a rock or dog, Santa Claus does not exist, nor does Satan.  As extant beings, Santa Claus, Satan, Venus, and Miss Liberty are looped memes; the phenomena of which these four are personifications (giving at Christmas, human evil, love, and freedom, respectively) do exist as scientifically observable distinct acts in the veridical real world, and, therefore, are resonating, non-veridical memes (Perception Theory:  Adventures in Ontology — Rock, Dog, Freedom, & God, [March, 2016]).  Personifying (or making gods of) real human activity is a primitive habit of human imagination that probably began with the earliest forms of animism and is linked with the origins of religion and its ritualization; personification was and still is a method of making sophisticated memes understandable for children.  It is strange that today, in Christian civilizations, we shed as adults the notion that Santa "really" (that is, veridically) exists, yet many of us still believe Satan "really" (i.e., veridically) exists.

What about the looped meme God, a.k.a. Yahweh, Elohim, or Jehovah in Judaism, God in Christianity, or Allah in Islam?  To what would God resonate to make God a resonant meme, like love, evil, or freedom?  To the whole world, given that God is the creator god?  Would that not be pantheism, meaning we worship the universe?  (How odd would that be, in that we are part of the universe?  To worship the universe is to make the matter and energy of our bodies also objects of adoration, along with mountains, stars, animals, etc.)  To worship any part of the universe is, again, a return to primitive religion, to idolatry.  It seems clear to me that we have made up God as the personification of everything, as the answer to any question we may pose.  As I said in Perception Theory:  Adventures in Ontology — Rock, Dog, Freedom, & God, [March, 2016], God is the Grand Answerer, Super-friend, and Creator.  God, once believed in within the individual heads of worshipers, can be used to any end by the clergy, from yesterday's shamans to today's popes, ministers, priests, mullahs, etc.  It seems easy for us to forget that just because we can imagine X, that does not mean X exists like a rock or a dog.  (Remember, a rock or a dog exists in our head like any other non-veridical meme — in the form of a concept stored as memory built by perception.)

God, therefore, is the ultimate meme, the meme beyond which nothing can be imagined.  The meme of God is seemingly a tribute to the power of our imagination, but the history of humanly imagined religion shows this tribute to be simultaneously a problem — a flexible meme easily twisted into a “pass” to do evil to each other; this is the toxicity of most, if not all, of theology; this is why Richard Dawkins describes religious, theological memes as agents of a chronic mental disease; this is why I separated ethics from theology in Jesus — A Keeper, [Sept., 2015].

But have I not described God as the atheists do?  No, not quite.  Perception Theory allows existence in the real, veridical universe outside our minds (which includes our bodies, including our brains), but also allows the epiphenomenal, non-veridical existence of imagined memes inside our minds, which are, in turn, inside our brains.  In other words, an imagined entity, like a unicorn, if defined in any mind, can have an ephemeral existence as stored data in the memory of the brain of that mind; in this sense looped non-veridical memes exist.  It is a very weak existence compared with the strong veridical existence of a rock's meme or the quickened and strong veridical existence of a dog's meme (Perception Theory:  Adventures in Ontology — Rock, Dog, Freedom, & God, [March, 2016]), for sure, but an existence made possible by our imaginative, epiphenomenal mind.  According to Perception Theory, then, an atheist recognizes only strong veridical existence, whereas a theist thinks that a weak existence is as strong as a strong existence.  An agnostic takes neither position, but Perception Theory would say all three positions are in denial of the ability of the mind to be both objective and subjective.  Theists, atheists, and agnostics can all agree that some form of God exists in the heads of both believers and non-believers (atheists have a meme of a god that does not exist in the real veridical world, unlike a meme of a rock or dog, which does), and that this existence of God has no basis outside the human mind; all can agree to the statement, "God exists!" in the dual veridical/non-veridical definition allowed by Perception Theory.  All the conflict, blood, and death perpetuated over disagreement as to what kind of God is "real" throughout the terrible annals of historical warfare, pillage, incarceration, and personal violence were never necessary and were, in the long run, silly; what still goes on today is folly, absurd and unjustified.
Are the billions of concepts (memes) of God in the imaginations of humans worldwide any less amazing than the consensus, imagined Creator God of, say, Genesis, Chapter 1?

In order for theists, atheists, and agnostics to agree on the existence of God or of the gods, atheists have to compromise but very little, while theists will have to move their position a great deal.  To agree that God exists in the imaginations of individual heads into which none but that individual can "see," due to the subjective trap, is not that far away from the "classic" atheistic claim that there is no supernatural deity or deities in the "real," veridical universe.  The theist "classic" claim is just the opposite of that of the atheist — there IS WITHOUT DOUBT a God that exists outside human imagination, just as some part of the universe or the universe itself actually exists.  If one listens carefully to the worshipful words of praise of theists (at least, this has been my experience), the existence of God is affirmed "within the heart" of the believer — affirmed by an epiphenomenal feeling of emotion fueled by faith (See Hope and Faith, [Jan., 2017]).  That is about as far from objective evidence as one can get.  This, instead of affirming God's existence, affirms what Perception Theory identifies as a looped, non-veridically based case for existence.  That is, the theist's affirmation of God's existence is no stronger than an affirmation of the existence of unicorns or tooth fairies, and is much weaker than an affirmation of the existence of, say, freedom.  And, of course, the theist's affirmation of God's existence is minuscule compared to the strong veridically based cases for the existence of, say, a rock or a dog (Perception Theory:  Adventures in Ontology — Rock, Dog, Freedom, & God, [March, 2016]).  As for agnostics, I would speculate that some would welcome the compromise about God's or the gods' existence with the "little-to-lose shoulder shrug" of the atheists, while others might remain skeptical and non-committal, not willing to come close to agreeing with theists, whom they see as gullible and naive.
All in all, I would speculate that the "table" of agreement of all three groups over Perception Theory's compromise on the existence of God would be disproportionately made up of atheists, with a smaller group of agnostics, followed by an even smaller group of theists who have bravely changed their ontological thinking a great deal.  The future success of Perception Theory might be measured by seeing whether the population at the compromise table approaches equal proportions from all three groups.  (No matter what the proportions at the table might be, Perception Theory might take credit for the absence of evangelism among the three groups, as, by definition, the table is one of agreement.)

Stated directly and succinctly, God or gods exist(s) only in our imaginations; we made up all deities, past, present, and future.  Most theology is not only useless, it can often be dangerous and even lethal.  Not all of religion is useless; part of religion is vital — the ethical part based upon the Golden Rule or Principle of Reciprocity (Jesus — A Keeper, [Sept., 2015]).  In Western culture this means a deliberate separation of ethics from theology in religions like the three Abrahamic ones, Judaism, Christianity, and Islam; this separation is already done in some religions of Eastern culture, like Buddhism, Jainism, Confucianism, and Taoism.  We have met the Creator God, and it is us; there is no problem of God or of the gods — just like all memes in our heads, the ultimate meme of God or the gods is at our disposal; we can do with theology what we will; we can make it impotent and irrelevant, just as we have made memes like pseudoscience, superstitions, and unwanted or uninteresting fantasies.  Just as was done by so many Americans in their revolution for independence, religion must be relegated and confined to individual minds, not made into social and sacred creeds demanding conflicting evangelism (The United States of America — A Christian Nation? [June, 2012]).

 

With the gods relegated to fantasy within our heads, we can now deal with god stories and the lessons they teach with historical utilitarianism.  Like so much of “ancient wisdom” from our distant past, such as the humanistic Principle of Reciprocity, we can both individually and collectively judge the god stories and their lessons without fear of supernatural reprisals.  For example, in Christian culture, from which I come, I can now see that the Old Testament of the Bible is a collection of literature blended together by Hebrew scholars and priests to teleologically justify the invasion and conquest by newly independent nomads of what we call the Holy Land, all under the theological guise of the Hebrews being God’s “Chosen People.”  I can now see that the New Testament of the Bible is a collection of literature blended together by the scholars of a new sect to teleologically justify the execution of their leader as a common criminal (See all of Group I. for details).  The New Testament is to Christians what the Icelandic Sagas were to many Scandinavians of the Viking persuasion.

Erich Fromm, a Jewish humanist philosopher who described himself as a "non-theist," did something very similar well before Perception Theory.  In Ye Shall Be As Gods (Fawcett Premier Books, New York, 1966 — ISBN 0-449-30763-8), Fromm "radically" interprets the Old Testament as the evolution of the relationship between the meme (concept) of God and the entirety of mankind, not just the "Chosen People."  He offers insight into the "God is dead" meme and into the New Testament's Passion of Christ, using Psalm 22.  The rabbinic teachings on the Old Testament during the centuries of the Diaspora are also employed.  By critically looking at the Old Testament, Fromm has, in my opinion, created paths toward its greater appreciation. (Why Some White Evangelical Christians Voted for and/or Still Support Donald Trump, [Dec., 2018])

With the gods relegated to fantasy within our heads, we can now investigate why religion sprang up within the heads of our species in the first place.  The reasons why belief in some form of supernatural entities or spirits in the real world became, apparently, necessary for human survival during the cognitive revolution of our species' "hunter-gatherer" stage can now be studied and made into a consensus of anthropology.  Elements dealing with the origins of religion from Groups I. and II. have already pointed the way (See At Last, a Probable Jesus, [August, 2015], Jesus — A Keeper, [Sept., 2015], Perception Is Everything, [Jan., 2016], I Believe!, [Oct., 2016], and Toward an Imagined Order of Everything, Using AVAPS, [June, 2018]).  Studying the physical and cognitive attributes passed on from generation to generation over thousands of years, attributes contributing to our species-wide universal "religiosity," will require breaking down the elements of our survival, such as cooperation, altruism, and the necessity of suspending doubt and questioning in times of emergency, as discussed in I Believe!, [Oct., 2016], wherein our ancestors' having to deal with a "leopard problem" is offered as a "thought scenario."  How did religion evolve from simple appeasement of a local "leopard god" to the continual sacrifice of thousands atop Aztec temples in Tenochtitlan?  How did we get from admonishing our children to be quiet when the shaman is speaking to the eruption of the Thirty Years War?  What a difference between believing a god or gods cause thunder and lightning and calling the faithful to the Crusades!

With the gods relegated to fantasy within our heads, we can now see how important the separation of theology from ethics is.  Moreover, such a separation is conveniently seen as a sorting of memes.  When the origin of religion, with its subsets of theology and ethics, is couched in terms of memes, I would suggest that the vital "good" memes are those of ethics, coming from the human mind and necessarily developing during the longest childhood of all primates, if not of all mammals.  That is, the memes of ethics for human beings necessarily formed on the "template" of the development of the nuclear family — mother, child, father, and extended family, including friends.  The rules of behavior taught to a child are extrapolated to apply not only to the mother-child relationship, but to all other possible relationships within the hunter-gatherer group, and these rules collectively are treated as social norms applied throughout childhood and adulthood.  In turn, these norms were justified upon the authority of the group.  This collective authority became more than "what our mothers and older siblings told us"; it became the authority of the political leaders and the authority of the "spiritual" leaders, the shamans: the beginning of politics and the beginning of religion.  But now, without the necessity of religious memes, only those of politics and ethics are still needed.
(Recalling a point germane to the "need" for religion made by Yuval Noah Harari in his book Sapiens: A Brief History of Humankind — that religion is a meme that can motivate many more than a leader within shouting distance, once that meme is transmitted to other minds — I would hasten to add that today's almost instant electronic communications over the worldwide internet have taken over religion's communicative role and can spread memes much, much better; spreading theological memes using the internet only accelerates the spread of the "poison.")  Religion and theology memes are not needed any more; only ethics memes are needed.

Gods as fantasy has at least one ancient precedent.  In India, in the 6th to 3rd centuries BCE (or BC), the original form of Buddhism, called Hinayana or Theravada Buddhism, basically ignored the question of the existence of the gods (curiously non-theological) and concentrated on the human, inner, existentialist self (Jainism, contemporary with the founding centuries of Buddhism, could be spoken of in a similar vein, and could even be seen as outward looking, not toward the gods, but toward practicing an extreme reverence for life).  Hinayana Buddhism dealt with attaining Nirvana, or enlightenment as demonstrated by Siddhartha, the founder of Buddhism; dealing with gods took a back seat to struggling with inner human desire; the gods were not germane to Siddhartha's original teaching.  In time Mahayana Buddhism (along with other forms, like Zen) became the dominant form of Siddhartha's teaching, in which Siddhartha himself, or Buddha, became deified as a god — much as Jesus became deified as a god in Christianity (Sorting Out Constantine I the Great and His Momma, [Feb., 2015]).  Imagery featuring statues of the Buddha is found at Mahayana sites, while sites featuring simple imagery, such as the Buddha's footprint, are Hinayana or Theravada sites.

Note that the "direction" of Hinayana Buddhism, though admirably unhindered by the gods, is inward, toward the non-veridical, not outward, toward the veridical, as are science, technology, engineering, and math (the STEM subjects in US schools), which are equally and admirably unhindered by the gods.  The success of studying "outward" toward the veridical is another way of repeating the message of AVAPS!, [May, 2018] — As Veridical As Possible, Stupid!  Hinayana Buddhism took its lack of theology and went in the "wrong" direction!  It should have done "a 180" (180 degrees) and gone the opposite way.

Without the threats of punishment after death or fantasies of paradise after death germane to much of theology, religion becomes transparent as many, many forms of the sociological phenomenon of a cult.  At every religion's beginning — more finely, at the beginning of every denomination's sect — it is a cult.  If I in another time had acted upon my "visitation" from my deceased great uncle in the form of a vivid dream, as described in At Last, a Probable Jesus, [August, 2015], and had convinced others around me I had communicated with the dead, I would have formed a cult.  Great religions of the world throughout history are successful cults, their "truth" erroneously measured by their success, and large subsets of great religions are smaller successful cults.  Cults venerate a "great" being (usually a god or person of "special" powers) through the leadership of a cult founder, who can also be the venerated.  Thus, Judaism can be seen as Moses founding the veneration of Yahweh, Elohim, or Jehovah, and Christianity can be seen as Peter, Paul, and Mary Magdalene venerating Jesus (See At Last, a Probable Jesus, [August, 2015]).  Smaller successful cults in the Christian vein include cult leaders such as many Popes, many Orthodox archbishops, many saints, Martin Luther (Lutherans), John Calvin (Presbyterians), Henry VIII and Thomas Cranmer (Anglicans in the U.K., Episcopalians in the U.S.), George Fox (Quakers), Jane Wardley, Ann Lee, and Lucy Wright (Shakers), John Smyth, Thomas Helwys, and Roger Williams (Baptists), Charles Wesley, John Wesley, and George Whitefield (Methodists), Joseph Smith (Mormons), Christian Rosenkreuz (Rosicrucians), Mary Baker Eddy (Christian Scientists), William Miller and Ellen G. White (Seventh-day Adventists), Barton W. Stone (Christian Church, Disciples of Christ), Alexander Campbell (Church of Christ), Charles Fox Parham and William Seymour (Pentecostals), the 1914 General Council at Hot Springs (Assembly of God), and Sun Myung Moon (Unification Church) — just to name a few with which I am familiar.  Two non-Christian examples of small successful cults are three Roman Emperors (veneration of Apollonius) (See Sorting Out Jesus, [July, 2015]) and Scientology (veneration of L. Ron Hubbard).  Two unsuccessful cult leaders and their cults here in the United States are Jim Jones (Peoples Temple) and David Koresh (Branch Davidians).  The toxicity of theology throughout history has been carried out through cults such as these.  The ethical kindness, love, and care of one group of humans toward another has also been carried out through cults such as these, but what has been overlooked is that ethical behavior needs no theology or organized religion to spread from one human to others.  When Jesus taught his version of the Golden Rule, he talked not of loving your neighbor as yourself through the social vehicle of the synagogue; the foundation of ethics, our caring for each other, has no origin in any religion or any theology; the Principle of Reciprocity began within each little hunter-gatherer group that successfully struggled for survival.  If theology exists as a meme in an individual, there it must stay — it should not be passed on to others; mental health services can help individuals for whom resisting that passing on is a struggle.  On the other hand, if ethics such as the ethical teachings of Jesus exists as a meme in an individual, by all means it should be passed on, as ethical memes were passed on in the little hunter-gatherer groups.  To be ethical in the manner spoken of here is to be human, not religious or theological.  We are not human to each other through the imagined groups to which we belong, but, rather, through the fact that we are Homo sapiens.

The general “shedding” of religion and its toxic theology, then, is seen as a veridically-based “enlightenment” which follows AVAPS toward more anthropological memes.  Imaginations young and old, fueled by the ethics of reciprocity (The Golden Rule), cannot but generate memes fired in the scrutiny of scientific consensus that will solve problems and heal wounds both for our species and for our planet and the universe beyond.  We are tweaking our inner-star-stuff to resonate more with the star-stuff that makes up the rest of the universe.

I would suggest that any reader who thinks this is but another announcement of another religion, of another cult, is victimized by a seemingly genetic tendency to think in terms of gods and god stories.  Such a reader needs to go back and read or re-read Groups I. and II.  God as the ultimate, unnecessary meme is NOT a new religion, NOT a new cult.  Rather, it is a veridically-directed philosophy transcendent of theism, atheism, and agnosticism.  Using the combination of rationalism and existentialism provided by Perception Theory, it suggests an expansion of anthropology to deal with the "who, what, why, and how" of human existence; these questions used to be handled by religion and its attendant theology, and I am suggesting that religion and theology have failed miserably.  The "should" statements used above are not evangelical pontifications, but, rather, calls to consider looking at existence veridically, in the opposite way Hinayana Buddhism did.  When I followed my own "shoulds" of Perception Theory tied to religion, I found the intellectual and emotional personal satisfaction I had been seeking for years.  ("Personal satisfaction" does not mean I've not continued to question "everything," especially memes like Perception Theory that my imagination conjures.)  Perhaps my own intellectual adventure might be of help toward others finding their own version of personal satisfaction.  Or, perhaps not.  I've written it down compelled by an ethical Principle of Reciprocity tens of thousands of years old, taught by Jesus and so many others.

RJH

 

 

Why Some White Evangelical Christians Voted for and/or Still Support Donald Trump

White evangelical Christians who apparently were "one issue" voters willing to sell their morality and soul by supporting Trump over an issue like abortion, prayer in schools, secularization of society, a too-liberal SCOTUS, demonization of liberals like the Clintons and Obama, etc. are, in my experience, not as dense as their stance might suggest; there had to be some "sacred" reason(s) they would knowingly be supportive of and culpable for the bigotry, immorality, and intellectual bankruptcy of Don of the present White House.  Finally, I have discovered at least one such reason.

 
Up until recently all the clues I had from evangelical Christian friends and family, always reluctant to talk politics and/or religion with me, were comments like “God moves in mysterious ways!” (from the hymn “God Moves in a Mysterious Way” by William Cowper (1774), based upon Romans 11:33) or “Hillary is evil!” Then my friend and former student Dr. John Andrews sent me a link entitled “The Political Theology of Trump” by Adam Kotsko, which begins with the question “Why do evangelical Christians support Trump?” Kotsko, who is apparently white and an evangelical Christian, pointed out something concerning the Old Testament that “clicked” with my life-long experience with white evangelical Christians. Turns out, for some white evangelicals, to support Trump is to support God’s will; to not support Trump is to work against God’s plan!

 
First, let’s be clear about whom I’m writing. I am not talking about all Christians; I am not talking about all evangelicals; I am not talking about all white Christians. I am talking about a minority within a minority within a minority…, like the innermost figure in a Russian matryoshka doll, or nesting doll, or stacking doll. This minority group is mightily qualified and nuanced. White, Protestant, evangelical, biblical literalist, apocalyptic, and often holier-than-anyone-else describes this group well. I need an acronym to cover efficiently all these qualifications — White, Evangelical, Protestant, Christian, biblical LiteralistS, or WEPCLS, pronounced “wep-cils.” (I’ve not included the nuance of politically conservative, which I assume is obvious.) WEPCLS vote for and support Trump with hypocrisy so “huge” and blatant they seem unaware of it, like not seeing the forest for the trees.

Here in the “Bible belt” part of Texas, it may not be apparent that the WEPCLS constitute a minority. After all, the large First Baptist Church of Dallas with Dr. Robert Jeffress, well-known Trump supporter, as pastor, is seen as a beacon of WEPCLS values. But even this congregation is not 100% WEPCLS. When all Christians nationwide and worldwide are taken into consideration, then even we Protestant Texans can see WEPCLS as a minority.

Second, the reason something “clicked” about the Old Testament with me is that, for those of you who don’t already know, I’ve lived my whole life among WEPCLS; many of my friends and family are WEPCLS and, therefore, voted for Trump. (Personally, I “got” the “W” in the acronym down pat! 23andMe showed me to be Scots-Irish, English, French, German, and Scandinavian; I’m so white I squeak!) The denomination in which I grew up, Southern Baptist, was and is replete with WEPCLS; not all Southern Baptists are WEPCLS, but every congregation in which I have been a member contained and contains not a few WEPCLS. Why did I not over the years join the WEPCLS? Because, briefly, I early on asked questions whose answers were NOT “Because the Bible said so,” “Because the Church, Sunday School teacher, pastor, your parents, etc. say so,” “Just because,” “Because God made it that way,” “You shouldn’t ask such things,” etc. These woefully inadequate and empty answers made me take a closer look at the Bible, and by the time I went to college I had read both testaments and had begun to see why so much of Scripture was not the subject of sermons or Sunday School lessons. (See Sorting Out the Apostle Paul [April, 2012] on my website www.ronniejhastings.com.) In short, I did not become a member of the WEPCLS in large part because I did not become a Biblical literalist, and over time the idea of evangelizing others based upon a faith that had few if any answers added to the social divisiveness around me — added to the “us vs. them” syndrome, the bane of all religions.

In addition to the WEPCLS’s Biblical literalism, which is the clue to their support of Trump, it is my opinion the WEPCLS have sold their birthright from the Reformation with their emphasis on conversion and conformity. The Reformation gave birth, it seems to me, to a Protestantism wherein congregations are not groups of sheep (pew warmers) led by shepherds (the clergy), but, rather, are groups of meritocratic believers, each of whom has his/her own pathway and relationship to God. Moreover, the WEPCLS have turned their backs on the great gift of the Enlightenment to everyone, including all believers — that everything is open to question, including this statement; there are no intellectual taboos. The human mind is free to question any- and everything, in the fine traditions of Job and doubting Thomas. Not that long ago a WEPCLS friend of mine referenced Martin Luther negatively because the Reformer was not godly enough, and blamed the Enlightenment for the blatant secularism of today. To ignore both the Reformation and the Enlightenment categorizes the WEPCLS as woefully anachronistic — downright medieval, even.

Incidentally, the mixing of politics and religion by so many WEPCLS (an attack on separation of church and state) is very unsettling because it is so un-American. As Jon Meacham, renowned American historian, said in his book American Gospel (2006, Random House pbk., ISBN 978-0-8129-7666-3) regarding the Founders’ view of the relationship between the new nation and Christianity, “The preponderance of historical evidence…suggests that the nation was not ‘Christian’ but rather a place of people whose experience with religious violence and the burdens of established churches led them to view religious liberty as one of humankind’s natural rights — a right as natural and as significant as those of thought and expression.” (p. 84) (See also my The United States of America — A Christian Nation? [June 2012] at www.ronniejhastings.com.)

Back to the clue of why the WEPCLS support Trump. If you are a Biblical literalist, chances are you hold the Bible as your sole source of truth — the source of true science (creationism and intelligent design) and of true history (Moses wrote the Pentateuch, Adam and Eve were actual historical beings, Joshua actually commanded the sun to stop in the sky, Mary of Nazareth was impregnated through some form of parthenogenesis, Jesus was resurrected back to life after crucifixion, etc., etc.). As time went on, it seemed to me that adult Biblical literalists actually believe Santa Claus, the tooth fairy, Satan, the Easter bunny, ghosts, Paul Bunyan, Pecos Bill, and Uncle Sam all exist just like the live friends and family that surround them, instead of as concepts in their heads. As I studied epistemology in college, it became obvious one could justify and believe literally anything through faith. Evidence-based truth does not apply to a Biblical literalist, and therefore does not apply to the WEPCLS.
Eventually, I became a physicist who likes to teach, instead of a WEPCLS. This post represents how the teacher in me compels me to pass on knowledge as best we know it at present; to fail to be skeptical, as all good scientists should be, and to fail to pass on what evidence-based skepticism cannot “shoot down,” as all good teachers should do, is for me to fail my family, my friends, and all my fellow Homo sapiens.

Recalling my days as a Sunday School teacher who relished the rare lessons from the “histories” of the Old Testament (like I & II Kings and I & II Chronicles), let me give you in brief outline the Biblical history that animates the WEPCLS (especially if Old Testament history is not your cup of tea):

1.) After the reigns of kings David and Solomon, the Israelite kingdom (consisting of the 12 tribes associated with the 12 sons of Jacob) split in twain, 10 tribes in the north known as Israel and 2 tribes in the south (close to Jerusalem) known as Judah. Each new kingdom had its own line of kings. The split occurred around 930 BCE (Before Common Era) or B.C. (Before Christ).

2.) Beginning about 740 BCE, the Assyrian Empire, which replaced the “Old” Babylonian Empire, invaded and overran the northern kingdom of 10-tribe Israel over some 20 years under the Assyrian kings Tiglath-Pileser III (Pul), Shalmaneser V, Sargon II, and Sennacherib. The 10 tribes were scattered in an Israelite diaspora and became known as the “lost tribes” of Israel. Assyria replaced the displaced Israelites with other peoples from the wider Mesopotamian region, who became known by New Testament times as Samaritans. Sennacherib tried unsuccessfully to conquer 2-tribe Judah in the south and was later killed by his sons. These events are covered in II Kings, Chaps. 15, 17, & 18, in I Chronicles Chap. 5, and in II Chronicles Chaps. 15, 30, & 31. The prophet known as “early Isaiah,” from the 1st of the three sections of the book of Isaiah, is the major “prophet of record.”

3.) The Assyrian Empire was replaced by the “New” Babylonian Empire under King Nebuchadnezzar II, and by 605 BCE the kingdom of Judah was succumbing to Babylon in the form of three deportations of Jews to Babylon, in 605-598 BCE, 598-597 BCE, and 588-587 BCE. The third deportation resulted in the Babylonian Captivity of 586-538 BCE, following the siege and fall of Jerusalem in July and August of 587 BCE, during which Solomon’s Temple was destroyed. The ends of II Kings and II Chronicles record the fall of Judah, and the Book of Jeremiah, Chaps. 39-43, offers the prophetic perspective (along with the book of Ezekiel), with the addition of the book of Ezra and the first six chapters of the book of Daniel.

4.) After Cyrus the Great of Persia captured Babylon, ending the Babylonian Empire and beginning the Persian Empire in 539 BCE, the Jews in exile in Babylon were allowed by Cyrus to return to Jerusalem in 538 BCE and eventually rebuild the Temple (II Chronicles 36:22-23 and “later” Isaiah). The book of Daniel records Cyrus’ (and, later, Darius I’s) role in the return, and the book of Ezra reports that construction of the second Temple in Jerusalem began around 537 BCE. Construction, which incorporated contributions from Nehemiah as well as Ezra, lasted at least until 516 BCE.

The Biblical histories and books of the prophets concerning the historical events described in 2.) through 4.) above show a “divine pattern” which the WEPCLS have seized upon. The great cataclysms brought upon the ancient Hebrews after Solomon were orchestrated by God as punishment for the sins (turning from God) of His Chosen People; moreover, God used pagan, heathen kings like Sennacherib and Nebuchadnezzar to punish His people and a pagan, heathen king like Cyrus for the restoration of His people. For instance, Nebuchadnezzar is called God’s servant in Jeremiah 25:9, yet only two verses later (Jeremiah 25:11) it is promised that the Babylonians’ land will be laid waste. Later Isaiah calls Cyrus God’s “anointed” (Isaiah 45:1) and promises Cyrus God’s divine favor (Isaiah 44:28 & 45:13), while nonetheless declaring that Cyrus “does not know” God (Isaiah 45:4).
In other words, the WEPCLS have been swept up in the “divine revelation” or “special knowledge” that, whatever happened to the ancient Hebrews (all the death, destruction, and utter humiliation), God was always in control of both punishment and reward, using unGodly evil empires as His tools to chastise His wayward “children.” Being Biblical literalists, the WEPCLS “naturally” transfer these Old Testament revelations to the present day, seeing “evil” Trump as God’s tool to punish the secular world for resisting God’s plan according to the interpretations of the WEPCLS. Trump as God’s tool is the WEPCLS’s “special knowledge” through which all their issues, like abortion, will be “taken care of” without regard to the pagan, heathen, and evil attributes of that tool — just as the pagan, heathen, and evil actions of the Assyrian, Babylonian, and Persian rulers were disregarded by the prophets.

Trump is a tool all right, but not God’s tool.

Before applying “higher” Biblical criticism (or just biblical criticism) to the WEPCLS’s interpretation of scripture, look at the conundrum the WEPCLS have created for themselves. Trump is so unGodly that the absurdity of evil being a tool of good somehow becomes proof that this must be, in the end, of God; Trump must be God’s President. And the more unGodly the tool, the greater the proof that the tool must be of God! It reminds me of the Christian existentialist Soren Kierkegaard’s assertion that the absurdity of accepting Jesus as God on nothing but pure, blind faith is all the more reason for taking the leap of faith and accepting Jesus Christ as your Lord and personal Savior. Or, on a more mundane level, it reminds me of the creationist scientist on the banks of the Paluxy River announcing that the absence of human prints in the Cretaceous limestone alongside those of dinosaurs must INCREASE the probability that human prints ARE to be found; in other words, absence of evidence means presence of evidence! One can’t help but think of the Orwellian “double-speak” mantras “Bad is good!” and “Good is bad!”

Faith, like falling in love, is irrational, but falling in love is not bat-shit crazy!

The epistemological problem with faith-based religion is that any one religious belief cannot be shown to be better or worse than any other. By faith the WEPCLS believe the Bible is the Word of God, established as everlasting truth about 1600 years ago (when the biblical canon was finally hammered out by the acceptance of some books and the rejection of others). For them truth is “set in concrete,” never to be altered by facts thereafter, despite the uncomfortable truth that God’s “concrete” of Jesus being God in the Trinity was not established as truth until about 400 years after Jesus’ crucifixion. What became amazing to me is that such canonization into unmoving, unchanging truth can be defended only by ignoring hundreds of years of new facts. If I were living in Europe around 1500, the fact that the Bible does not record the existence of a whole New World of two huge continents would make me revisit the rigidity of my faith and my beliefs. Nor does scripture mention all the scientific facts that evolve with ever-increasing evidence year after year, because the Bible is pre-scientific and was written well before widespread literacy.

Because Christianity is “set” in history for biblical literalists, and because history has become a forensic science, Christians such as the WEPCLS do not have history on their side, just like all other believers who believe solely on faith. The forensic science of biblical criticism shows that literalists such as the WEPCLS do not have to become atheists or agnostics if they seek the most reasonable and probable view of what must have happened in the past for the Bible as we know it today to be in our hands. They must simply accept more historical facts than they presently do — facts that are compatible with as objective a view of the past as possible, facts that command the broadest agreement across Christendom, facts that place Christians in a majority armed with modern techniques of forensic history and forensic science, like archaeology and the history of Judaeo-Christian scripture (see the Dec. 2018 issue of National Geographic).

What then does biblical criticism have to say about the WEPCLS’s interpretation of the Old Testament stories involving Assyria, Babylon, and Persia? Note the span of years covered by the events 1.) through 4.) above — essentially 930 BCE to 516 BCE. If you look at faith-based, conservative listings of the books of the Bible covering this span (I & II Kings, I & II Chronicles, Isaiah, Jeremiah, Ezekiel, Daniel, Ezra, Nehemiah) and when they were written, you would be told the books were written contemporaneously with or soon after the events with which they deal. But biblical criticism, which we have had since the 19th century or earlier, is finding, through archaeology and the study of the origin of scripture (Dec. 2018 National Geographic), that they were all written well after the events, as rationalizations or apologetics for the tribulations of what are supposed to be God’s Chosen People, whom He loves. (To say God employed “tough love” in dealing with the ancient Israelites is a gross understatement indeed!) For a fairly well-established example, the book of Daniel was not written during or soon after the Babylonian Captivity or exile (586-538 BCE), but rather in the 2nd century BCE, circa 165 BCE. Further, it appears the author of the book of Daniel was writing about the 2nd-century persecution of the Jews under the Seleucid king Antiochus IV Epiphanes, using the prior persecution of the exile as a cover. The same dating fraud is committed concerning the books of the New Testament, especially the Gospels. Faith-based conservatives such as the WEPCLS want the Gospels to have been written well before the Jewish Revolt against the Romans in 66-70 CE (Common Era or A.D., anno Domini), as close to the life of Jesus as, say, Paul’s letters. But biblical criticism based upon historical research shows the Gospels to have been written during or after the Revolt (See Sorting Out the Apostle Paul [April, 2012]).

As we enter the 21st century, we know much, much more about the origins of the Bible than ever. What is needed in Christian scholarship of the scriptures is more polemics, not more apologetics. For the WEPCLS to ignore this new wealth of historical findings for the sake of their medieval-like literalism is intellectually anachronistic and irresponsible. Consequently, the WEPCLS give Christians a bad name, as many non-Christians erroneously think the WEPCLS represent all Christians.

Epistemologically, the WEPCLS commit the intellectual fraud of decontextualization, the practice of plucking a source out of its context so that, ripped from its historical references, it can be made applicable to any time whatsoever, even a time bearing no relationship to its originally intended applicability. The WEPCLS have decontextualized much of the histories and major prophets of the Old Testament so that they can be used for their conservative, Trinitarian, evangelistic purposes. Higher Biblical criticism has exposed their attempts to recast Old Testament references to Old Testament historical individuals as references to the coming of Jesus Christ as the Son of God. To relate God’s use of Godless leaders in the Old Testament to today’s situation is not the WEPCLS’s first “fraudulent rodeo.”

I urge everyone in Christendom to apply biblical criticism to expose the WEPCLS as a corrosive influence on Christian evangelism. I urge believers of all religions to apply the same techniques of biblical criticism to their own faith-based creeds and/or practices. I urge non-believers to apply these same techniques to combat the politicization of the theologies of organized religions.

My own experience with biblical criticism suggests it need not drive the WEPCLS further from intellectual inquiry, nor drive anyone away from Biblical consideration forever. The Bible itself often is all that is needed for its foibles to be exposed; often the Bible is its own best critic. For instance, I found that by comparing pre-exile-written II Samuel 24:1 with post-exile-written I Chronicles 21:1, one discovers how the concept of Satan, a parallel to the Zoroastrian (Persian) evil co-god Ahriman (counterpart to the good god Ahura Mazda), was introduced into Judaism by the exile (and later into Christianity). Calling upon other sources from archaeology, the Christian codices found at Nag Hammadi in Egypt show that there were at least 21 possible Gospels, not 4. These codices also show how the early Church bishops strove mightily to suppress and destroy these “lost” Gospels and also perpetuated the besmirching of Mary Magdalene’s character. To my surprise, when I placed Genesis 1 in its literary context, I saw it was not a history of the beginning of the world at all, but, rather, a comparison of the “superior” Hebrew Creator god with the “inferior” gods of neighboring peoples; my respect for Genesis 1 has risen considerably. Biblical criticism opens your mind to broader horizons not suggested by the Church, and helps one understand the archaeological findings relating to ancient religions.

Biblical criticism and its related readings, applied to consensus world history, have led me to work through a “most probable” scenario of how, to me, Christianity came into human history (Read in order on my website www.ronniejhastings.com Sorting Out the Apostle Paul [April, 2012], Sorting Out Constantine I The Great and His Momma [Feb., 2015], Sorting Out Jesus [July, 2015], At Last, A Probable Jesus [August, 2015], and Jesus — A Keeper [Sept., 2015]). Any person so “armed” and inclined can come up with their own scenario as well as or better than I.

Regarding this matter of Biblical or biblical proportions and votes for Trump, I hope I have not failed my family, my friends, or my entire species in passing on what I see as the best of a growing majority consensus.

RJH

American Conservatism Belies History

[Waxing philosophical right now, so……CONSERVATIVE DISCRETION ADVISED!]
Seen as a parade of good and bad (and in-between) ideas instead of a parade of good and bad (and in-between) people’s lives, history reveals definite directions of advancement over, say, the centuries since the “discovery” of the American continents. These directions are easy to detect following the rise and fall of ideas along time’s arrow using a broad time scale (The Big Picture, [Sept., 2011]). Also easily detected are peoples’ ideas discarded along the way, ideas that didn’t “make it,” that didn’t “stand the test of time,” that history “left behind in its wake.”

For instance, the two world wars of the 20th century left in their wake discarded ideas such as monarchism and fascism (and certain forms of government they imply, like theocracy and oligarchy). Another resulting discarded idea was that of empires like the Roman, the Mogul, the Mongol, the Ottoman, the Spanish, and the British. The final “victory” of WWII was the end of the Cold War in 1989 when the idea of Soviet communism collapsed. These wars sent history toward liberal democracies (or democratic liberalism) in the form of republics (Reference former Republican Steve Schmidt for this terminology.). The economy of the victors was capitalism (witness how China today is employing a form of capitalism). But non-liberals (especially American conservatives) strive against the liberal capitalism that emerged victorious by practicing a perverted capitalism (They should read their Adam Smith.), wherein not enough profits are plowed back into business as capital and too much of the profit is selfishly stagnated as personal wealth — all of which opens the doors for oligarchy (striven for by Donald Trump) and its ancillary kleptocracy (striven for and practiced by Vladimir Putin). Autocracies of many forms, including “banana republics,” however, have yet to disappear.

(If you think democratic republics are “safe,” having been given the “nod” of 20th-century history, think again. Who was the only democratically elected President of Russia after the Soviet Union? Boris Yeltsin and Russian democracy are now gone. And just in the second decade of the 21st century, Turkey has collapsed into a form of fascism Mussolini, Hitler, and Hirohito would easily recognize.)

Also left behind by history are the ideas of the Luddites and those of the American Tories (also called loyalists) at the end of the American Revolution. Yet these are the same ideas animating the Republican Party led by Trump. (21st Century Luddites?, [March, 2017], and 21st Century Tories?, [March, 2017]) Despite history’s harsh lessons, “Trumpies” today fail to grasp the need for workers to adapt to ongoing new technology, and even fail to grasp what it means to be a citizen (“citizen” being well-defined by the blood spilled in the American and French Revolutions (Egalite: A Qualified Virtue, [Feb., 2018])).

Generally speaking, American conservatism has clung to antiquated, outdated, and anachronistic ideas history has “shaken off” like water off a dog’s back, such as isolationism, racism, xenophobia, homophobia, misogyny, nationalism, sacred political states, tariffs, elitism, class hierarchy, nepotism, non-universal health coverage, and non-universal suffrage. (Citizens (I) Call For the Destruction of the Political Professional Class, [Nov., 2012], Citizens (II) The Redistribution of Wealth, [Jan., 2013], Citizens (III) Call for Election Reform, [Jan., 2013], An Expose of American Conservatism — Part 1, [Dec., 2012], An Expose of American Conservatism — Part 2, [Dec., 2012], An Expose of American Conservatism — Part 3, [Dec., 2012], Some Thoughts on Trump’s Election, [Nov., 2016], and Dealing with Donald, or, A Citizen’s Survival Guide for Trump’s Apparent Presidency, [Dec., 2016])

The xenophobic “circling-the-wagons” mentality of so many American conservatives is based upon the human tendency to take on the “us-versus-them syndrome,” which served us well when we were all hunter-gatherers (about 70,000 to 12,000 years ago). That is, “They over there don’t look like us, so there must be something wrong and possibly dangerous about them.” The “sacred” “us-versus-them syndrome” serves all religions, ancient and modern, including Christianity, well: “They don’t believe the same things we do, so we must convince them to believe as we do or rid ourselves of them.” Here in the 21st century, I think there is no longer any need of the “us-versus-them syndrome,” nor of its attendant bad ideas of nationalism and evangelism; history has passed them by. (Going Global, [March, 2018], At Last, a Probable Jesus, [August, 2015], and Towards an Imagined Order of Everything, Using AVAPS, [June, 2018])

Speaking more specifically, it even seems Trump’s administration, in the name of historically despicable and bigoted immigration laws, is now using our tax money for systematic child abuse. (I have visions of him going down to the detention centers and throwing scraps of food and rolls of paper napkins over the edge of the cages and into the flaps of the tents — similar to his condescending actions in Puerto Rico.) The June 30, 2018 protests across the nation speak loud and clear: the crying two-year-old trumps Trump and all his zero tolerance.

Consider those Trump supporters who have not repudiated him and would still vote for him, despite his despicable words, actions, and inaction, such as “evangelical ‘single issue’ Christians” who turn a blind eye to his plethora of “sins” so they can have their conservative SCOTUS in the name of anti-abortion or pro-life (or immigration, or campaign finance, or some such). Pro-life is such a historically unsustainable position, much like creationism and intelligent design. These positions place their proponents at loggerheads with nature, and just like “history bats last,” “nature bats last.” As opposition to evolution is without evidence and completely useless, so is risking future babies to the horrors of genetic defects, when such risk is so unnecessary. I’m angry that sex education courses in schools and sex education at home and in places of worship do not inform future parents that we already have the medical skills in place to assure every pregnant mother she has the right to have a genetically healthy baby. Yet the pro-lifers, by denying mothers the basic right to control their reproductive cycles, force the possibility of tragedy upon families — tragedy that can with certainty be avoided. (It is like inequality of wealth forcing poverty upon countless people of minimal means, which also can be avoided.) The modern technology of human birth and “natural abortions” — miscarriages — compel history to give pro-choice the “nod.” If expectant mothers want to go ahead and take to term a baby with genetic defects detected early in gestation, that is their choice; there is a chance that in the future such defects can be rectified either in the womb or just after birth. But such a choice is risky, especially when based upon a religious belief. (The “A” Word — Don’t Get Angry, Calm Down, and Let Us Talk, [April, 2013], and The “A” Word Revisited (Because of Gov. Rick Perry of Texas), or A Word on Bad Eggs, [July, 2013]) To cling to pro-life is like clinging to slide rules and horse collars; it is out-of-date.

And moreover, such Christians as described above risk, by clinging to pro-life, walking into the theological quicksand of redefining Christianity (“You can’t be a Christian and be pro-choice.”), just as the creationists and intelligent designers have done (“You can’t be a Christian and ‘believe’ in evolution.”). (Creationism and Intelligent Design — On the Road to Extinction, [July, 2012]) You do not have to be anachronistic to be a Christian. (Jesus — A Keeper, [Sept., 2015]) Nor do you have to be historically clueless to be a Christian. (The United States of America — A Christian Nation?, [June, 2012])

Historically, American conservatives have lost their way. History is not on their side. And it is their own fault. They let their own credulity get the best of them, and then somehow become too lazy and/or too busy to vet any and all political statements. And today, with the sources we have at our fingertips thanks to social networks, it often takes only seconds to vet almost anything. Liars like Trump thrive because not enough people, regardless of political leanings, vet what he says. What do you think history will do with the “birthers”? Like the flat-earthers, history, I think, will fling them into the dustbin of bad ideas, worth only a laugh or a chuckle if ever remembered.

American conservatives, unless they start reading some history instead of listening to Fox News exclusively, risk, in the long run, going the path of the Luddites, the American Tories, the flat-earthers, the creationists, the intelligent designers, the pro-lifers, and the birthers. Unless they start reading some history, they risk becoming pawns of revivalist fascism, organized crime, communism, nationalism, isolationism, imperialism, and/or colonialism; they risk “warping” in their heads back into 1950’s America.

RJH

Going Global

In addition to being possible 21st century Luddites and possible 21st century Tories, early 21st century American ultra-conservatives, such as those brought “out of the woodwork” by the Donald Trump administration, display other facets worthy of condemnation (21st Century Luddites?, [March, 2017] and 21st Century Tories?, [March, 2017]).  A common thread running through American ultra-conservatism, very different from, say, lifting up the 2nd Amendment to the U.S. Constitution as a sacred call to own as many powerful weapons as possible {Guns, “Gun Control,” and School Massacres (Part The First), [March, 2013]; Guns, “Gun Control,” and School Massacres (Part The Second), [March, 2013]; Guns, “Gun Control,” and School Massacres (Part The Third), [April, 2013]; Guns, “Gun Control,” and School Massacres (Part The Fourth) — the “Smoking Gun,” [May, 2013]; Guns, “Gun Control,” and School Massacres (Part The Fifth) — “Four Dead in O-HI-O,” [June, 2013]}, is the categorical demonization of globalization.  Why?

First, I had to find out the consensus definition of “globalization,” when it began, and what its history is.  Two paired books helped me do just that:  1) 1491, New Revelations of the Americas Before Columbus, Charles C. Mann, Vintage Books, 2nd edition, New York, 2011, ISBN 978-1-4000-3205-1, and 2) 1493, Uncovering the New World Columbus Created, Charles C. Mann, Vintage Books, 1st edition, New York, 2012, ISBN 978-0-307-27824-1.  The two titles tell the reader a lot.  They sandwich the year before and the year after Columbus “discovered” America.  (Of course this language we learned in school slights historical characters like Leif Erikson and, worse, an entire people who migrated across the Bering Strait into the two continents of the New World thousands of years ago.)  Clearly the books compare the “before” and “after” of the European discovery of the New World; the pair present a measure of the impact of that discovery, an impact that echoes across the centuries to the present.  Mann’s major theme is that globalization as we know it today began with Columbus’ first voyage.

The year 1492 ushered in a world-wide exchange of cultures, knowledge, foods, diseases, wars, and forced labor in the form of slaves.  As technologies of transportation improved, worldwide trade and colonial exploitation integrated the planet Earth into a global market.  Projecting this sweeping historical view into the 21st century, Mann, in my opinion, suggests that the lesson of globalization is that trading with each other is better than exploiting and killing each other.  A rather obvious good lesson, I’d say.  So, why would anyone be against globalization as defined by these books?

Look again above at the grossly over-simplified list of what was and is being exchanged in globalization; not all of them can individually be labeled as “good.”  Sure, to take one of many foods from South America that “saved” Renaissance, Reformation, and Enlightenment Europe, the potato, the “spud,” became the basis of the diet of the poor and, later, the middle class.  Yet also from the New World came venereal disease, and to the New World came European diseases to which native Americans had little or no resistance.  European diseases were even more devastating to the New World people than the Black Death was to Europeans about 150 years before Columbus sailed westward.  Nevertheless, economies based upon world-wide trading were spawned, economies that are still expanding to this day.  For example, the gold and silver mined by the Spanish with native American slave labor in Mexico and the Andes went not only east to Europe, fueling many national economies, but also west across the Pacific to the Philippines, where Chinese traders exchanged Chinese goods like silk for the precious metals; this westward movement fueled the economies of China and the Philippines, as well as that of colonial Spain.  Foodstuffs like the potato and corn (maize) also went west.  Black markets and pirate economies sprang up in the Caribbean and in the waters off China as a result.  Another example was the flow of furs and timber to Europe from colonized North America.

But human beings, especially those from Africa, became commodities of trade to work the sugar cane and tobacco industries in the New World, later followed by the cotton industry.  Tropical diseases such as malaria killed off European overseers at such a rate that slave populations sometimes literally disappeared off the plantations into the interior to form new, independent, and undocumented societies, often of blended heritage with native Americans (societies of mulattoes and maroons, for example).  Because of the sickle cell trait carried from Africa, which confers some resistance to malaria, more slaves survived the ravages of disease than did the Europeans.

These examples are but “the tip of the iceberg” found in Mann’s books, but they are enough to clearly show that globalization is a mixed blessing; its contributions to our species often came at a considerable price of human suffering.

The more I learned about the history of globalization, the more the gift of hindsight compelled me to say the price mankind paid was worth it, given how the global trade of resources back and forth across the oceans made possible a worldwide improvement of life compared with that of hundreds of years ago.  Much of this improvement, like the establishment of democratic republics, the march toward universal suffrage and social justice, the rejection of monarchies, and the rejection of slavery, centers around making sure the price paid for globalization is more humane than ever before.  Yet ultra-conservatives speak of globalization as if they wished it had never happened, even while speaking in an environment filled with comforts and advantages made possible by globalization.

Could it be that conservatives don’t know enough history to appreciate what globalization has done for us?  Possibly, but there are lots of ultra-conservatives, like Steve Bannon of Trump administration infamy, who appear very smart and well-educated.  So the question presents itself — why, when you know the effects of globalization throughout modern history, would you despise it so?  Why are so-called liberals pro-globalization while so-called conservatives seem anti-globalization?  Those conservatives who still prefer war over trade are becoming fewer and farther between, as they are symptomatic of a vestigial colonialism and imperialism that began disappearing after WWI and WWII.  So it is possible a conservative might be both anti-war and anti-globalization.

I suspect the answer to the questions in the previous paragraph is found in the phrase above containing the words “mulattoes and maroons.”  Ultra-conservatives equate globalization with the mixing of races and, as a result, usually become political isolationists.  In a word, they are racists at the core; they are xenophobic toward persons not like them.  It is true that much mixing of races came with globalization:  Spaniards and Portuguese with American Indians became Mexicans, Central Americans, and South Americans; Europeans with Africans became mulattoes; Chinese with Filipinos became Sangleys, or Chinese Filipinos.  It is no accident that even in “progressive” societies like the U.S., many family trees were produced by brides and grooms marrying “one of their own.”  This is not to say that all who want to maintain a strong connection to the “mother country” are racists; rather, for some of them the attachment to the “mother country” is psychologically based upon a racist xenophobia.  Ultra-conservatives have politicized this racism and politically express their racist bias by opposing globalization.  Their economics resemble those of a long-past colonialist, imperialist overseer.

RJH

P.S.  Lest you, the reader, think my linking anti-globalization with racism is but fanciful whimsy or giddy rationalization, consider how a growing number of historians and anthropologists are agreeing that the concept of “racism” was not a concern in Western civilization until it was clearly possible Europeans and non-Europeans would be living together in an ongoing situation; that is, until different races lived together closely enough to make interracial mixing possible.  In other words, racism was not a considerable problem in Western culture until very different groups were shuffled across oceans; racism became entangled with globalization when globalization began such shuffling, when the New World was “discovered” by Columbus.

Hope and Faith

I remember singing in Sunday School, “Have faith, hope, and charity, That’s the way to live successfully, How do I know? The Bible tells me so!”   I assume the song’s words are taken directly from Paul’s epistle to the Corinthians (I Corinthians 13:13, KJV).  The three words faith, hope, and charity are called the “three theological virtues,” or just the “three virtues.”  Having sorted out what Perception Theory tells us about “belief” (I Believe! [Oct., 2016]), I here consider two of the three, faith and hope, or, in the order I take them up, hope and faith.  Both are related to belief, and though both are “separate virtues,” the pair, I intend to show, are very similar in Perception Theory, yet very distinguishable from one another.  (Perception is Everything, [Jan., 2016]; Perception Theory (Perception is Everything) — Three Applications, [Feb., 2016]; Perception Theory: Adventures in Ontology — Rock, Dog, Freedom, & God, [March, 2016])  Indeed, they are paired conceptually in Hebrews 11:1:  “Now faith is the substance of things hoped for, the evidence of things not seen.” (KJV)

Despite my skepticism that Paul should even be called an apostle, much less an accurate describer of Jesus (Sorting Out the Apostle Paul, [April, 2012]), and despite the consensus that Paul did not write Hebrews (Priscilla, Barnabas, Luke, Clement of Rome, and Apollos of Alexandria have been proposed as more likely authors of Hebrews than Paul), the presence of the same two words (hope and faith) together in both KJV verses provides a convenient “cutting board” upon which to dissect the two with Perception Theory.  In I Believe! [Oct., 2016] belief is shown to be far from having anything to do with evidence, yet the Hebrews verse links “substance” and “evidence” with faith.

Hence, if this linkage is accurate, faith has more to do with evidence than belief does.  In fact, starting from the absence of evidence, starting from belief, and heading in the direction of evidence, I see hope first, followed by faith, with evidence (“I know” statements — I Believe! [Oct., 2016]) coming only after faith.  “I believe” statements and “I know” statements, with hope and faith “sandwiched” in between, are all four non-veridical activities of the brain, with “I believe” statements devoid of resonance with the “outside,” real, veridical world beyond the volume of our brains and “I know” statements as resonant with the real, veridical world as they can possibly be (as allowed by the “subjective trap”).  This suggests that both hope and faith exist as resonating non-veridically based concepts, “in between” the looped non-veridically based existence of “I believe” statements and the strongly veridically based existence of “I know” statements.  In other words, belief is looped non-veridically based, like God, while hope and faith are possibly resonating non-veridically based, like freedom (Perception Theory: Adventures in Ontology — Rock, Dog, Freedom, & God, [March, 2016]); both hope and faith at first appear to “reach out” to the veridical world in a way belief does not bother to do.

To Perception Theory, however, hope is like a “wish statement” that may or may not resonate veridically.  To hope God hears our prayer is looped non-veridically based, but to hope your sick loved one gets well is resonating non-veridically based.  Hope statements can be in either non-veridically based camp — looped or resonating.  To Perception Theory faith leans strongly toward the resonating non-veridical, like having faith that your sick loved one will actually get well, which means the loved one’s health will be described with “I know” statements of wellness in the future.  If the sick one does not get well, the hope still seems justified, but the faith seems ill-placed; hope cannot ever count on “I know” statements to come, but faith risks counting upon “I know” statements coming.  One’s hope can never be squelched by the real veridical world (it is so looped); one’s faith can (it is so resonant).  Faith, then, is like a “prediction statement,” a declaration that something will in the future be supported by evidence and, therefore, by “I know” statements.  With hope I wish; with faith I predict or bet.  Moreover, faith is embedded with a confidence in a “real world” outcome, whether justified in hindsight or not.  This confidence reinforces the resonance of faith with the veridical.

Hebrews 11:1, therefore, is way off base.  Faith cannot be substance or evidence of anything.  I can believe in or hope for just about anything (wishing); conversely, I cannot bet on just anything (predicting) and be considered sane, no matter how confident my faith.  Based upon what we know about the universe that seems to be outside our heads, hoping that unicorns exist can be seen as “cute and charming,” while confidently predicting that unicorns exist will probably be seen as silly.  Stating I have faith that unicorns exist is not evidence that unicorns exist, but stating I hope unicorns exist “gets a pass” from those who demand evidence.  One is simply not taken as seriously when hoping as when bestowing faith.  Hope is more like belief than faith is; faith is more like predicting freedom in a veridical society than hope is, but with a confidence often falsely interpreted by others as connected with evidence.

An analogy might be in order.  Say I am about to witness the results of a wager I’ve made at a casino in Las Vegas:  the pull of the handle of a slot machine, the final resting place of the ball in a roulette wheel, a roll of the dice at the craps table, the revealing of the cards at the end of a round of poker, or the public posting of the results of a sporting event I have bet on.  Normally I hope I win (which is not the same as saying I predict I will win), but if I fail to win, the worst that can happen is the loss of my wager.  If I win, no conclusion other than realizing how lucky I am is warranted; I happened to beat the odds, the probability of which I knew was very low when I made the bet.  But if I have bestowed faith in winning the wager, as we have seen above, it is almost redundant to say I am betting, that is, predicting that I will win.  (Recall I can place a bet with mere hope, which is not a prediction.)  If I have faith that I will win, predicting that I will win, then the amount of the wager relative to my gambling budget is a measure of the strength of my faith.  If I fail to win, my faith will be seen as ill-placed and, in hindsight, unnecessary; confidence in my winning might in hindsight seem cruelly laughable.  However, if I win, my faith, along with the confidence attending it, seems (irrationally) justified.  In minds wherein suspension of rationality is commonplace, the win is imagined to have been impossible without the faith and its attendant confidence.  But the win would not have happened without the bet; the confident faith before the results had nothing to do with the win, yet too often the faith and its confidence are seen as the “cause” of the win!  Such an irrational conclusion is nothing short of believing in magic; it is a view of the win that is all in the head of the winner and has nothing to do with the real-world evidence that actually determined the mechanics of the results.  Perception Theory would say that, veridically, the results, win or lose, were the outcome of random probability; any hope or faith put in the results is a non-veridical process inside the brain (Perception is Everything, [Jan., 2016]).
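
The casino analogy can even be put to a toy numerical test.  Below is a minimal, hypothetical Python simulation (the names are my illustrative inventions, not anything from this essay):  each wager is tagged with the bettor’s attitude, “hope” or “faith,” yet the wheel never consults that tag, so both groups win at essentially the single-number roulette rate of 1/38.

```python
import random

def simulate(trials=100_000, p_win=1/38, seed=42):
    """Bet on a single roulette number many times; the bettor's attitude
    ('hope' or 'faith') is recorded but never consulted by the wheel."""
    rng = random.Random(seed)
    wins = {"hope": 0, "faith": 0}
    counts = {"hope": 0, "faith": 0}
    for _ in range(trials):
        attitude = rng.choice(["hope", "faith"])   # non-veridical state in the head
        counts[attitude] += 1
        if rng.random() < p_win:                   # veridical mechanics of the wheel
            wins[attitude] += 1
    return {a: wins[a] / counts[a] for a in wins}

rates = simulate()   # both rates land near 1/38, attitude notwithstanding
```

The point of the sketch is the one made in the paragraph above:  the outcome is fixed by random probability in the veridical world, while hope and faith live entirely inside the bettor’s head.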

Now, let’s get to the “elephant in the room,” the “gorilla sitting in the corner.”  Believing that God exists is just like hoping God exists — neither tells one anything about God’s existence, except that God is a concept in the head of the one making the belief statement or the hope statement (Perception Theory: Adventures in Ontology — Rock, Dog, Freedom, & God, [March, 2016]).  Having faith that God exists in the real veridical world bets, or predicts, that God exists like freedom, a dog, or a rock.  Bets and predictions can fail (as in gambling), as have all bets and predictions concerning both unicorns and God, so far.  Faith in God outside our heads, like faith in unicorns outside our heads, is ill-placed — in terms found in Perception Theory: Adventures in Ontology — Rock, Dog, Freedom, & God, it is absurd.  Unlike freedom, God and unicorns do not resonate with the veridical.  I can think of at least one statement about God with which we can all make an “I know” statement — God is a concept in our heads.  It is curiously difficult not to say we can all have faith that God is a concept in our heads.  Also curiously, I am betting, have faith, that the concept of God, under “high resolution,” is different in each and every head.  Perhaps this “God difference in every head” will one day be shown to be only a hope (an inescapable belief) or even, perhaps, another “I know” statement.

RJH

I Believe!

I must count myself in that school of thought which asserts that everyone has to believe in many things, but that the “trick” is to believe in things that are true.  Yet, it seems obvious to me that one can believe in anything.  And, since not just anything can be true, it must be equally obvious that mere belief is no reliable means of finding out the truth.  Curiously, the ability to believe seems basic to the human mind.  In my opinion, the pervasiveness of belief among the species Homo sapiens indicates that belief was, at the origin of our species, necessary for survival, just like our propensities to be religious, to be ethical, or to be evil.  The evolution of these last three propensities, based upon both physical and cultural anthropology, was a major vehicle in the development of the ideas, themes, and conclusions of 1) my series on the origin of Christianity (Sorting Out the Apostle Paul, [April, 2012]; Sorting Out Constantine I the Great and His Momma, [Feb., 2015]; Sorting Out Jesus, [July, 2015]; At Last, a Probable Jesus, [August, 2015]; Jesus — A Keeper, [Sept., 2015]) and of 2) the first of my series on Perception Theory (Perception is Everything, [Jan., 2016]; Perception Theory (Perception is Everything) — Three Applications, [Feb., 2016]; Perception Theory: Adventures in Ontology — Rock, Dog, Freedom, & God, [March, 2016]).  The discussion of human belief seems a good addition to 2) above, given the very broad applicability of the theory.

For every human mind there seems a hierarchy of importance of beliefs.  Whether or not one believes their sports team is going to win an upcoming contest seems pretty trivial compared to whether or not one believes their partner in life truly loves them; whether or not one believes they can accomplish a challenging task seems pretty trivial compared to whether or not one believes in God.  Moreover, human belief seems intimately entwined with human faith and trust.  Belief in an expected event, in the words of someone else, in the truth of ideas and/or assertions of all sorts, in anticipated future states of the world, and in the truth of past events all involve faith that the object of the belief is worthy of one’s trust.  In other words, I have faith that the resources leading me to believe in X, whatever X may be, are worthy of my trust to the extent I tell myself that X must be true; X is true to me because I have faith in the trustworthiness of believing in X.  Admittedly, this epistemological dissection of belief sounds esoteric, convoluted, and nuanced.  We do not normally think about either the hierarchy or the underlying philosophical assumptions of belief; we just believe, because we come into the world “wired” in our brain to do just that.  What I propose to do is to make thinking about belief less esoteric, convoluted, and nuanced — to make serious consideration of what it is we do when we believe more normal in day-to-day thinking.

In the context of expounding upon freedom of the press in the United States, Alexis de Tocqueville in Democracy in America (The Folio Society, London, 2002) said that a majority of US citizens reflecting upon freedom of the press “…will always stop in one of these two states:  they will believe without knowing why, or not know precisely what one must believe.” (p 179)  It seems to me this quote could be applied to any area of reflection, not just freedom of the press, given how muddled together “thinking” and “believing” have seemingly always been in common rational mentation.  So basic is our habit of believing without intellectual meditation and discrimination that being caught in the dilemma between the two states quoted above seemingly becomes all too often inevitable.  The hierarchy of importance among beliefs, as well as consideration of the roles faith and trust play in belief, becomes lost in an intellectually lazy resignation to the dilemma, in my opinion.

I think we can know why we believe.  I think we can know precisely what we must believe.  Note I did not use “I believe” to start the first two sentences of this paragraph; instead, I used “I think.”  So many thinking people tend to use “I believe” in sentences the same as or similar to these and thereby fall into a trap of circular reasoning; they mean “I think,” but utter “I believe.”  I think Perception Theory can help sort out the nuances associated with belief and point the way to how believing in things that are true is no trick at all but, rather, a sensible mode of using our minds.  And the first two sentences of this paragraph contain strong clues as to how to relieve “I believe…” and even “I think…” statements of ambiguity:  we simply give them reliability with the beginning words “I know…,” instead of “I believe…” or “I think…”  Herein I hope to lay out the epistemological process by which statements become reliable and thereafter merit the beginning words “I know…”  At the same time I hope to show that, in the name of truth, “I believe” and “I think” should not necessarily be thrown away but, rather, used with reticence, care, and candor.

 

I submit that the statement “I believe the sun will appear to rise in the east tomorrow morning.” is fundamentally different from the statement “I believe in the existence of God.”  Neither is verifiable on the spot, as, presumably, the speaker can deliver neither an image of a future event nor anything remotely resembling a deity standing alongside the speaker.  According to Perception Theory, any belief statement, certainly including these two, is non-veridical (At Last, a Probable Jesus, [August, 2015]; Perception is Everything, [Jan., 2016]), as a belief is a descriptive statement of some result of the mind, imagination, and other epiphenomenal processes operating within the brain.  As shown in Perception Theory: Adventures in Ontology — Rock, Dog, Freedom, & God, [March, 2016], such statements can resonate strongly, weakly, or not at all with the real or veridical world from which comes all empirical input into the brain through the senses.  The sun rising tomorrow resonates strongly or weakly with the veridical real world (depending upon how skeptical and/or cynical the speaker is), based upon previously experienced (directly or indirectly) sunrises; in the terms of that post, it is resonating non-veridically based.  God existing is, conversely, looped non-veridically based, as defined in the same post.  The second statement is purely epiphenomenal, while the first hearkens to a real empirical world; the second is a naked product of the mind, while the first links an epiphenomenal product to a presumed reality (phenomena) outside the brain.  Belief is in both cases epiphenomenal; the first is based upon empirical, veridical, phenomenal past perceptions; the second is based upon imaginative, non-veridical, epiphenomenal intra-brain biochemical activity.  In other words, sunrises are non-veridical images based upon empirical data, while God is non-veridical imagery based upon other non-veridical imagery.

At the risk of being redundant, it bears repeating that why we have the ability to believe in the two manners illustrated by the two belief statements of the previous paragraph is easily understood.  When our brains evolved the complexity making self-consciousness possible, assuring our survival as a small group of especially big-brained members of the genus Homo, applying our new ability to imagine ourselves in situations other than the present was not practically possible at all times; we still had to react instinctively in threatening situations, without pausing to think, or else we might not survive.  With, say, leopards attacking our little hunter-gatherer group during the night, questioning or thinking about alternatives to a plan of defense would potentially have made the situation more dangerous, not safer; in other words, often whoever hesitated by thinking about the situation got eaten.  Those who came up with or listened to a plan of defense without argument or disagreement tended to assure the success of the plan, as the group agreed to act quickly to avoid future nights of terror; often, acting unquestioningly led directly to successfully solving the leopard problem.  To justify individually joining the plan, we used our newly complex, self-conscious minds to suspend judgment and believe that the originators of the plan of defense, whether we ourselves, the leaders of the group, the shaman of the group, or just some unspecified member of the group, had some seemingly good idea to deal with the leopard problem; without rationalization of any sort, we believed the plan would work.  We often believed out of sheer desperation; we had no choice but to believe in some plan, to believe in something, or else we might die.  Hence, those who developed the ability to believe unthinkingly tended to be those who survived in the long run.

I submit that as human beings developed civilizations and cultures over the last several thousand years, the need for “knee-jerk,” unthinking belief has overall diminished.  Outside of modern totalitarian political, sectarian, or secular regimes, our brains can safely be used to question, scrutinize, vet, and adjudicate ideas, plans, positions, conclusions, etc. as never before.  As knowledge continues to increase, we can without desperation hesitate and “think it over;” immediate belief is no longer necessary in most situations.  Belief continues to be an option we all use at one time or another, but on important issues we no longer have to suspend judgment and “just believe.”  Don’t get me wrong — spouting beliefs “right and left” on issues of little or no importance, such as what I believe will be the outcome of upcoming sporting events or of the next pull on a slot machine in Las Vegas, can be fun.  What I am saying is that we do not have to agonize over what we believe, as long as the consequences of that belief portend little or nothing at all.  It also means we must train ourselves to start serious, important, and substantive declarations with “I think” rather than “I believe,” as I did above, which indicates some rational thought has gone into formulating those declarations.  Moreover, “I know” is even better than “I think,” in that the rational thought going into “I know” statements is so substantive and evidence-based that the statement is reliable and feels close to the “truth.”  Finally, it means we can suspend belief indefinitely, if we choose, or never think belief necessary at all.

Admittedly, belief does have use in motivational rhetoric, which may not be so trivial in many individual minds.  Often consensus for group action relies upon conjuring in individual minds the belief that the action is in the group’s collective best interest.  Halftime speeches in the locker room by coaches to their teams are one example that comes to mind; such locker rooms rely upon words and signs exhorting belief; evidence and reflection need not be evoked.  This common use of belief hearkens back to our evolutionary need to believe, as discussed above, but today conjuring emotionally charged adrenaline in a group is more a matter of avoiding losing a game or falling short of a group goal than of avoiding being eaten by leopards.  The outcome of the game or the striving for the goal determines whether the belief was fun and justified, or disappointing and misleading.  Neither outcome might seem trivial to many, but neither outcome would prove the belief conjured to be “true” or “false.”  Locker room belief shown justified or not justified by subsequent events is merely coincidence.

We can now list some characteristics about human belief:

1)  Belief is a non-veridical activity, existing in our minds as either a) resonant non-veridically based  or b) looped non-veridically based.

2)  Belief involves a denial, suspension, or avoidance of judgment, bypassing all forms of adjudication involved in rational scrutiny; it is lazy mentation.

3)  Belief has decreased in importance as culture and knowledge have increased in importance.

4)  Belief is bereft of epistemological value; just because one believes X is true does not necessarily make X true; just because one believes X is false does not necessarily make X false.

5)  Belief is an epiphenomenal, evolutionary vestige of the human mind; it has value today only as an amusing tool in trivial matters or as a rhetorical tool in matters many consider not so trivial.

6)  Beginning with “I think” rather than “I believe” is stronger, and can indicate a closer proximity to the truth, but “I think” does not evoke the confidence and reliability of “I know;” “I think” leaves room for reasonable doubt.

7)  Statements and issues of portent can be consistently begun with “I know” rather than “I believe” or “I think.”  Just how this is possible follows:

 

Knowing why we believe, we now turn to what we should believe.  Clearly, merely believing in non-trivial matters carries little weight and is hardly worthy of consideration in epistemological discussions.  Important ideas, plans, and systems of thought do not need belief — they need rational adjudication; we no longer need say “…we need to believe in or think upon what is true;” rather, we need to say “…I know X is true beyond reasonable doubt, independent of what I may believe or think.”  So we actually now turn to what is worthy of our thought, trusting that in future, instead of “what we should believe” or “what we should think,” we will say “what we know is true.”

Let’s say I want to unequivocally state my conviction that my wife loves me.  To say “I believe my wife loves me.” belies the fact that I have lived with the same woman for 48 years and counting, as of this writing.  To say “I believe” in this case sounds as if we have just fallen in love (I fell in love with her when we were sophomores in high school together.).  It sounds as if there has not been time to accumulate evidence she loves me transcendent of what I believe.  The truth of the matter is beyond belief, given the 48 years.

If I say “I think my wife loves me.” it can sound as if I have some doubt and/or there is some evidence that I should doubt, neither of which is the case.  Clearly, in my view, to say “I believe” or “I think” my wife loves me does not do the truth of the matter justice; neither is reliable enough to accurately describe the case from my perspective.

So, it is the case “I know my wife loves me.”  How do I know that?  Evidence, evidence, evidence.  And I’m not talking about saying to each other everyday “I love you,” which we do, by the way.  I am talking evidence transcendent of words.  For 48 years we have never been apart more than a few days, and at night we sleep in the same bed.  For 48 years she daily does so many little things for me over and beyond what she “has” to do.  She is consistently attendant, patient, gentle, caring, and comforting; she is true to her marriage vows daily.  I’ve joked for many years that either she loves me, or she is collecting data for writing a novel about living decades with an impossible man.  Truly, love is blind.

This example illustrates the 3-step process that has come to work for me in arriving at personally satisfying truth.  I’ve even personalized the steps, naming Step 1 for my younger son Chad when he was an elementary school student; Step 2 is named for my younger granddaughter Madison, Chad’s daughter, when she was in the 3rd grade; Step 3 is named for my older granddaughter Gabriella, my older son Dan’s daughter, when she was about 3 or 4 years old.  Hence, I call the process the Chad/Madison/Gabriella Method.  The Chad/Madison/Gabriella Method, or CMGM, bypasses “I believe” and “I think,” going straight to “I know.”  Transcendent of belief or speculation, CMGM allows me to arrive at the truth; I can confidently achieve reliability, conclusions I can count on; I can and have arrived at decisions, conclusions, and positions upon which I can stake not only my reputation but, if necessary, my life.

Yet CMGM does not provide absolute truth, the corner into which so many thinkers paint themselves.  The results of CMGM are highly probable truths, worthy of ultimate risks, as indicated above, but never can my mortal mind declare 100% certainty.  There is always a finite probability that the 3-step CMGM process will yield results shown to be false by unknown and/or forthcoming evidence in the future.  CMGM rests upon the philosophical premise of the universal fallibility of human knowledge.

How do we arrive, then, at what we know is true, realizing it really has nothing to do with our careless believing or casual thinking?  What are the “nuts and bolts” of the 3-step process CMGM?

Step 1:  When my son Chad was in elementary school, he discovered he had certain teachers to whom he could direct the question “How do you know?” when information was presented to him; some outstanding teachers could be asked that question without becoming upset or angry.  He also discovered you could not ask that of certain family members, Sunday School teachers, or other acquaintances without upsetting them.  It is a courageous question, one conjuring in me, his father, great pride.  “C,” Step 1, of the method is a universal skepticism declaring literally everything questionable, including this very sentence.  From the simple to the profound, whenever any declaration is stated, ask “How do you know?”

If no evidence is given when answering the question in Step 1, it is the same as if the question were not answered at all.  Answers like “Just because…,” “I just believe…,” “I just think…,” “They say that…,” or similar vacuous retorts are no answers at all.  Or, it is possible that some evidence might be cited.  If that evidence is presented as if it should be accepted beyond doubt and question because of the authority or reputation of its source, that outcome is taken to Step 2 just as no answer at all is.  Therefore, after Step 1, one has either 1) no answer or a vacuous answer or 2) cited evidence for the answer.

Step 2:  When my younger granddaughter was in the 3rd grade and I was the subject of a family conversation, she, Madison, said “Papa Doc is big on knowledge.”  (Instead of “Granddad,” “Grandfather,” or “Grandpa,” my granddaughters call me “Papa Doc.”)  In other words, gather your own evidence in response to the results of Step 1; “get your ducks in a row,” “get your shit together,” or “get your facts straight.”  If you received nothing in response to executing Step 1, then decide if you want to accumulate evidence for or against the original declaration.  If you don’t, dismiss or disregard the reliability of those who made the original declaration and “reset” for the next declaration.  If you decide to accumulate evidence, proceed just as if evidence had been cited in support of the original declaration.  Evidence given in Step 1 needs a search for other relevant evidence, and if you decide to respond to no evidence given in Step 1, the same search is needed.  The ability and quality of getting your “ducks/shit/facts” in a row/together/straight is directly proportional to your education (formal or not) and to the amount of personal experience you have.  “M,” Step 2, of the method is identifying reliable information as evidence for or against the declaration in Step 1; it requires not so much courage as effort.  Intellectually lazy persons seldom venture as far as Step 2; it requires work, time, and personal research skills whose quantity, price, and outcome are often unknown, so some courage in the form of confidence is needed to accomplish it.  It is the personal challenge of every successful scholar on any level, from pre-K through days on Medicare.  On some questions, such as “Should women be given equal rights with men?” or “Who were the United States’ founding fathers?”, it takes but moments for me to identify the reliable information, given my long experience reading US history.  On other questions, such as “How did Christianity originate?” or “Why did the American and French Revolutions proceed on such different paths when both were based upon similar ideals?”, it has taken me years of off-and-on reading to identify the reliable information allowing me, in my own estimation, to proceed to Step 3.

Step 3:  Well before she started school, my older granddaughter, Gabriella, listening carefully to family plans casually mentioned for the next day, would volunteer, “Actually,…” such-and-such is going to happen.  And she was correct, despite her extreme inexperience.  “G,” Step 3, is boldly and confidently stating the results indicated by the evidence from Step 2 applied to the original declaration in Step 1.  If the original declaration in C, Step 1, is “X,” and if the evidence from M in Step 2 is “a, b, c, d,…,” then Step 3 is “Actually, it is not X, but, rather, Y, because of a, b, c, d,….”  Step 3 takes both confidence and courage.  In Step 3 you are “running it up a flagpole to see who salutes it;” you are taking a chance that of those who listen, no one or only a few will agree; the chance that all will agree is almost infinitesimal.  Step 3 exposes you to both justified and unjustified criticism.  Step 3 “thickens your skin,” and if critical feedback to your Step 3 is justified and makes sense to you, that feedback can be used to tweak, modify, or redefine Y.  Justified critical feedback can change Y so that the new version is closer to the truth than the old.

Hence, the way to reliable knowledge I’m suggesting, the way to truth, is essentially an internal, personal, mental adjudication; your head is your own judge, jury, prosecution, and defense.  CMGM is suggested as a possible “instruction list” for this adjudication; CMGM works for me, but others might well find another “formula” that works better for them.  CMGM, Steps 1, 2, & 3, conjures X and usually changes X to Y, based upon a, b, c, d,….  Y is usually closer to the truth than X, but it is possible X “passes muster” (Step 2) relatively unchanged into Step 3.  This is not unlike how reliable knowledge is accumulated mentally in all areas of science, math, and engineering.  The advantage these three areas have over CMGM is that in them Y MUST be successfully tested by nature, by the real world, including the “real world” of logic in our heads, and independent investigators/testers also dealing with Y must corroborate it with the same independently derived results; some Y’s from CMGM might not be as easily tested, such as “Men and women can never completely understand each other,” or “A different set of universal physical laws was required to create the present set of universal physical laws,” or “At least one other universe exists along with our own.”
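For readers who think procedurally, the CMGM adjudication just summarized can be caricatured as a tiny program.  This is only an illustrative sketch under my own assumptions; the function name, the sample set of vacuous retorts, and the way evidence is represented are hypothetical, not anything from the post itself.

```python
# An illustrative sketch (not the author's) of CMGM adjudicating a declaration X.

def adjudicate(x, cited, gather):
    """Steps 1-3 (C, M, G): challenge X, marshal evidence, state the result Y."""
    vacuous = {"just because", "i just believe", "they say that"}
    # Step 1 (C): vacuous retorts count the same as no answer at all.
    evidence = [e for e in cited if e.lower().rstrip(".") not in vacuous]
    # Step 2 (M): gather your own evidence -- "get your facts straight."
    evidence += gather(x)
    if not evidence:
        return None  # dismiss the declaration; "reset" for the next one
    # Step 3 (G): "Actually, it is not X, but, rather, Y, because of a,b,c,d..."
    return {"claim": x, "because": evidence}
```

As in the text, X may “pass muster” relatively unchanged: when the gathered evidence a, b, c, d,… supports X, the returned Y is simply X with its evidence attached.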

 

If I want to make a truth statement, I need to begin it with “I know.”  I need to have “I know” statements backed up with evidence accumulated by personal adjudication produced by mental steps similar to CMGM.  If reliable knowledge and/or truth are not germane to my statements, then I can use “I believe” or “I think,” depending on how important these statements are to me; “I believe” and “I think” have little or no epistemological content.

How do I know X is true?  Chad-as-a-child makes me ask that very question.  I can say “I believe X is true” as a knee-jerk, off-the-top-of-my-head statement, just to add to the conversational mix; I feel no need to justify it.  Challenged to justify X, Madison-as-a-child reminds me I’ve got to do some scholarly work.  With some brief, cursory thought I might say “I think X is true,” maybe with a piece of evidence ‘a,’ but neither I nor my fellow conversationalists would think such a statement has much epistemological clout worthy of truth seekers.  With Madison’s work and Gabriella’s courage and confidence I sooner or later can say “I know Y is true, to the best of my ability.”  Gabriella-as-a-child tests my intellectual acumen; I must at some time bravely state Y publicly, regardless of the consequences.  In all probability X has morphed into Y thanks to the accumulated evidence ‘a, b, c, d,….’  Y has “epistemological meat” on its “bones.”  Y has brought me closer to the truth; it is a stepping stone with which to draw even closer.

Yes, I do believe all the time in lots of things.  But I think about certain things in whose reliability I’m more confident.  However, I can know a few things in whose reliability and truth I have as much intellectual and emotional confidence as I can muster.  For me, it is better to know than to just believe or to just think.  I am drawn to what you know, not necessarily to what you believe or what you think.

RJH

 

Perception Theory: Adventures in Ontology — Rock, Dog, Freedom, & God

Development and application of perception theory (Perception is Everything, [Jan., 2016] & Perception Theory (Perception is Everything) — Three Applications, [Feb., 2016]) has opened up for me seemingly unending possibilities of understanding better almost any aspect of human knowledge and experience.  Among my favorite areas of philosophy is ontology, the philosophy of being — what is existence?, what does it mean “to be?”, etc.  Modern existentialism has sprung from ontology, now armed with human psychology, cultural anthropology, and evolutionary psychology.  Perception theory thrives upon the notion that objectivity (the veridical) and subjectivity (the non-veridical) are not “at odds,” but, rather, exist in an evolutionary symbiosis via and upon our “world-view screen of perception” within our heads (See At Last, A Probable Jesus, [August, 2015] & Perception is Everything, [Jan., 2016]).  (Another way of thinking of this screen is that it is synonymous with the German Weltanschauung.)  What this work focuses upon is the light shed upon the question “What does it mean to exist?” provided by perception theory.

For anything to exist, there must be some perception, conception, or idea of that thing on the non-veridical side of the screen — in the human mind embedded in the human brain.  I recall several years ago finding agreement with a former friend and fundamentalist Christian on this universal premise of “knowing” anything — e.g. to know God is to have certain brain activity within your mind; to know anything else is to have different brain activity within your mind.  Not having worked out perception theory at that time, I only remembered the novelty of agreement between the two of us.  I now know this novelty was but an unrecognized feeling of the compatibility of the objective and the subjective; had the symbiosis between objectivity and subjectivity been clear to me back then, our discussion would have gotten much further than it did.

The definition of existence in the first sentence of the previous paragraph must not be mistaken for an affirmation of Bishop Berkeley’s ontological “proof of God” based upon “To be is to be perceived.”  The good bishop declared that God must exist because He is the Universal Perceiver keeping the world in existence around us, even when we are not directly perceiving it, such as when we are asleep.  Perception theory declares, on the other hand, that existence creates perception, not the other way around.  Existence is a processed quality actively attributed by the non-veridical upon both the veridical (empirical data bombarding the senses) and the non-veridical (ideas generated or processed by the mind using veridical data, other non-veridical concepts, or both).  All things perceived existent either in the outside world or in our heads must be non-veridical products, even though the genesis of all things perceived lies ultimately but indirectly in prior and/or present empirical data.

 

To demonstrate all this with examples, consider the existence of four non-veridical products — the idea of a rock, of a dog, of freedom, and of God.  In other words, how does perception theory describe the existence of a rock, a dog, freedom, and God?  Four ideas are chosen in anticipation of existence falling into four distinct categories.  Perhaps other ontologists using other theories would choose another number; perhaps other ontologists using my exact same perception theory would choose another number.  Moreover, the list of possible examples representing each category is virtually endless.  No doubt every single reader would come up with a completely different list than rock, dog, freedom, and God.

First, how do we know a rock exists?  Its existence is inferred by our minds from strong, direct empirical signals sent by our senses of primarily sight and touch.  If it is a relatively small rock, we can pick it up and collect even more empirical signals; we can, for instance, measure its size and we can weigh it.  A rock does not move of any volition from within; if broken apart, and if not a geode, it seems uniformly hard and dense throughout, etc. etc.  Each rock we investigate, even if only one in our entire life, contributes to an idea of a rock that becomes a non-veridical image on our perception screen in our head, an image reinforced by subsequent direct empirical experience of any particular rock “out there,” outside ourselves; typically this subsequent empirical experience could be our picking up a rock we’ve never seen before, or someone purposely or accidentally hitting us with a thrown rock, etc.  Finally, we know a rock exists because empirical data from other human beings having to do with rocks seems to correlate with the notion that their non-veridical perception of rocks is nearly the same as our non-veridical perception of rocks.  In fact, I have never seen a human holding a rock denying it is there.  This, despite the impossibility of our ever experiencing others’ non-veridical perception, due to the subjective trap (Perception is Everything, [Jan., 2016]).  In other words, other apparent perceptions of rocks assure me I am not “making rocks up” in my own head, or “If I’m crazy to say rocks exist, then apparently almost everyone else must also be crazy!”  Beings like me also behave as if rocks exist.

[Here I pause to interject and define a useful “test” to aid in contrasting and comparing the four examples of existence (the first of which is the existence of a rock just discussed).  I am going to employ three sentences with blanks to fill in with each of the examples, one at a time.  The three sentences are: 1) “_______ helps me to understand the universe better.” 2) “Wars over _______ are sometimes justified.” and 3) “I have a personal relationship with ________.”]

Let’s “test” the existence of a rock with the three sentences:  1) “A rock helps me to understand the universe better.”  That is hard to argue against (i.e. there is little or no absurdity in 1) about a rock).  Contemplating a rock is “classic” starstuff interacting with fellow starstuff (Perception Is Everything, [Jan., 2016]).  One of my many favorite photographs of my elder granddaughter when she was a toddler is of her sitting on the patio holding a fallen leaf with both hands and staring at it intently — if that is not starstuff contemplating fellow starstuff, I don’t know what is!  Just as my granddaughter left that patio so many years ago with “leaf,” apparently, as a new non-veridical concept in her brain, my holding and staring at a rock not only reinforces my catalog of non-veridical rock concepts in my brain, it further enriches my understanding of the place of rocks in my universe, the universe I assume we all share.  So, yes, 1) about a rock seems to be clearly true.

2) “Wars over a rock are sometimes justified.”  This one seems totally absurd, as if it is a theme of a classic Monty Python skit.  There may have been a time at least a hundred thousand years ago when a group of early Homo sapiens attacked a neighboring group that had stolen the first group’s “sacred stone,” or some such, but to kill each other over a rock is today considered insanity.

3) “I have a personal relationship with a rock.”  Again, this reeks strongly of the Pythonesque, but at least no one is getting hurt, it is assumed.  One thinks of the absurd fad a few years ago of owning a “pet rock.”  Good fun, if one is not serious about it, but the ones who had the most fun were the sellers of pet rocks making deposits in their bank accounts.  Similar to the pet-rock “relationship” is a person’s attachment to tools, equipment, houses, automobiles, etc.  For instance, in the building projects I have done, I’ve grown “attached” to tools such as my Dremel-brand rotary multi-tool.  But, like a pet rock, these inanimate objects can be replaced if lost, stolen, or worn out; replacements give the same attachment as the tools they replaced.  Hence, the relationship is to any tool that can do a specific job, not to a specific one — to the idea of efficient and practical rotary tools; to attach emotionally to a worn-out tool that no longer does the job is absurd.  I “loved” the old Dremel I had to replace, but as soon as the new one “fired up,” I no longer thought about the old one — I immediately “loved” the new one.  However, I often think fondly of a 1966 red Ford Mustang I used to own and later sold, but from the moment I sold it, I no longer had a personal relationship with that particular car — I had and still have a “love affair” with the idea of owning a red Ford Mustang, since I never replaced the one I sold.  3) speaks of a relationship with a particular rock, not with the idea of rocks in general.

Since the responses to 1), 2), and 3) for a rock are, respectively, “very true,” “absurd,” and “also absurd,” we can infer something about the type of existence exemplified by the existence of a rock.  I label this type of existence strongly veridically-based, as it always harkens and focuses back to the empirical, veridical source of the non-veridical concept of rocks in our heads (“rocks in our heads!” get it?……..never mind……) — namely, the universe outside our heads that we assume exists, else we would not behave the way almost all of us do; all existences conjured in the contemplation of the universe — again, anything outside our heads — are strongly veridically-based existences.  This means existing as science assumes existence to be; the existence of a rock is an example of “scientific existentialism,” a basic ontological assumption of the philosophy of science.  Strongly veridically-based existence suggests that objects like the rock exist independent of our perceiving them.  We logically infer the rock existed before anyone alive today (unless it is a man-made structure like a brick recently kilned), and, long after we are gone, long after the non-veridical perceptions, conceptions, and ideas of rocks have ceased to exist inside our heads, the rock will continue to exist.  (Even if the rock erodes considerably, we normally consider it to be the same rock; we could conceive of its deliberate or accidental destruction, such as being thrown or knocked into the magma of a volcano, but most rocks seem to survive for eons of time.)  Strongly veridically-based (rock) is the first category of existence.

 

Second, how do we know a dog exists?  Most of what is said about the existence of a rock above applies to the existence of a dog, with at least one obvious difference.  That difference is the reason I chose the idea of a dog as another existence example instead of lumping the canine with the rock.  That difference is best illustrated by an event that occurred not long ago in a favorite pub I frequent:  Early one afternoon in this establishment the lady co-owner walked through holding her newest family member — a puppy that looked like a wire-haired dachshund.  We all reacted as if she were carrying a new grandchild of hers; “how cute!” and similar exclamations abounded.  The evolutionary reasons we naturally respond to puppies are not germane to the point here, but imagining how different it would have been had she walked through holding a rock is.  Had she walked through with a rock rather than a young dog, many would not have noticed at all; those who did notice perhaps would have dismissed the observation immediately as not noteworthy, or, thinking it odd for the situation, would either have asked her about the rock or said nothing.

It seems obvious that the difference is that the dog is alive (“quickened”) like us while the rock is not.  Being alive (being “quick”) and animate portends a brain, and a brain portends some non-veridical potential such as humans have.  (Clearly, though plants are alive, the life forms I am here describing are animals.)  So the strongly veridically-based existence of a dog (we can empirically interact with a dog just as we do the rock) is modified, tweaked, or nuanced slightly; it is a somewhat different kind of veridically-based existence.  I label this type of existence quickened & strongly veridically-based.  Another ontological difference between a dog and a rock is that for the dog, as for all living beings, there is no notion of extended prior or future existence; like humans, dogs have very limited, terminated existences compared to rocks; brains are very finite.  Quickened & strongly veridically-based (dog) is the second category of existence.

1) “A dog helps me to understand the universe better.”  Again, for the same reasons as those of 1) for a rock, this seems very, very true.  Perhaps human understanding of the universe is furthered more by the dog than by the rock because we are more closely related physically to dogs than to rocks; a dog’s starstuff strongly reminds us of our own starstuff — both of us are mammals, etc.

2) “Wars over a dog are sometimes justified.”  Once more, unless we are talking about an imagined early, early time of Homo sapiens, this statement cannot be considered meaningful in our modern, civilized times.  Once again for 2), absurd.

So far, the three-statement test’s responses for the dog are just like the rock’s.  But a difference appears in 3):

3) “I have a personal relationship with a dog.”  Even if one has never owned a dog, one surely has observed dog owners and knows this statement has to be very true, and not absurd. We now know that just like perception theory describes a symbiotic relationship between objectivity and subjectivity, human cultural evolution now describes the symbiotic relationship between humans and their domesticated animals, especially dogs.  (Cat lovers undoubtedly would have chosen a cat instead of a dog in this work.  I have just as undoubtedly exposed myself as a dog lover.)

Summing up, the responses to 1), 2), and 3) for a dog are, respectively, “very true,” “absurd,” and “true.”  This shows that the difference between strongly veridically-based existence and quickened & strongly veridically-based existence is simply the difference between “alive” and “not alive.”  Strongly veridically-based existence of these two slightly different types is firmly planted in empirical data focused upon by perception; the rock and the dog exist scientifically, or, as we say, “The rock and the dog exist.”  Anyone who seriously disagrees with this statement is a hopeless solipsist doomed to self-exile from the rest of mankind.  Also, most of mankind would find the dog more interesting and emotionally satisfying than the rock for obvious reasons; we ontologically have more in common with a dog than with a rock.  We naturally quicken the dog, not the rock.

Before we continue, keep in mind that these two slightly different forms of existence, though veridically-based via being scientifically objective, have to be generated as all human knowledge is — subjectively and non-veridically generated within our brains and attributed to the perceptions from our senses we label as “rock” and “dog.”  We are convinced non-veridically that rocks and dogs exist veridically.

 

Third, how do we know freedom exists?  There is nothing “out there” outside our brains that we can see, touch, smell, etc. and label “freedom.”  There are plenty of symbols of freedom “out there” that fire our senses, to be sure, but we would never hang a giant “FREEDOM” sign around the neck of, say, the Statue of Liberty in the harbor of New York City and declare Lady Liberty equivalent to freedom; a symbol of freedom stands in for the idea, concept, or perception of freedom, reminding us what freedom is.  Freedom, then, is not only non-veridical in origin, like all knowledge and perception (and therefore a product of our imaginative, creative, and calculative capacities inside our brains), it never corresponds one-to-one to something “out there” outside our brains existing strongly veridically-based or quickened & strongly veridically-based (existing like a rock or a dog).  Yet most astute observers think of freedom as a quality and/or constituent of the “real” world of the veridical.  Freedom, then, has to be linked to the veridical universe outside our brains, but not as directly as the idea of a rock or of a dog.

Perception theory suggests freedom resonates with the veridical universe outside our heads (a universe assumed, as science assumes, to exist independent of our perception) not only through objects designated as symbols of freedom (e.g. the Statue of Liberty) but through observable actions and language (citizens deciding for themselves, and political speeches and books waxing long and eloquent about freedom — the latter of which are more symbols).  In other words, we say non-veridical freedom exists indirectly in the veridical real world by resonating with objects and actions that would not logically exist without the non-veridical concept of freedom in our heads, much like unseen moving air molecules cause seen leaves on a tree to move.  Remove the wind, and the leaves don’t “move in the breeze;” if freedom did not exist, we would not see different people respond differently, as if by “free choice,” to the same situation, and we would not have Thomas Jefferson’s words in the U.S. Declaration of Independence.  Freedom, then, exists as a resonating non-veridically based existence.  Resonating non-veridically based existence (freedom) is the third category of existence.

The example of freedom suggests all political, economic, artistic, and ethical theories are resonating non-veridically based.  The same goes for all scientific and mathematical theory; numbers are non-veridical constructs in our heads that resonate strongly (I know of no stronger example) with the veridical “real” world; mathematics is the “language of the universe;” the universe appears to us to behave mathematically, thanks to this strong resonance.  Like anything non-veridically based, we make these theories up in our heads, but they are distinguished from strictly fanciful ideas by our ability to appeal to the real world of the universe and the human culture inside the universe (cite evidence, in other words) and point to objects and/or social behaviors that correlate logically with the theories in our heads, all leading to a necessary consensus in a majority of heads around us.  Without the consensus of others, resonating non-veridically based ideas remain eccentric musings, speculations, or hypotheses.  If the resonating idea did not exist, there would be no consensus evidence to cite.  The vehicle of this resonance of the non-veridical with the veridical might very well be Richard Dawkins’ “memes,” bits of human culture that spread throughout humanity like genes or viruses or bacteria.

[We can now literally illustrate the three categories of existence so far listed.  Look at Figure 2 — A Model of the Subjectivity of Perception (The “Screen”) in Perception is Everything, [Jan., 2016].  Rocks and dogs (processed, veridical, and empirical screen results) would be drawn in the figure in a solid font, while freedom (a subjective, non-veridical, and algorithmic screen result) would be written in the figure as the word “freedom” in a “dashed font,” if I could do such using Word.  Everything on the screen is non-veridical in origin (“made up” in our heads), but the “solids” are direct products of our senses in contact with the “real world,” and the “dashed” are either indirectly but firmly connected to the “real world” (idea of a horse) or not connected at all to the “real world” (idea of a unicorn).  Again, in the world of Figure 2, rocks and dogs are solid, and freedom is dashed.]

Back to our ontological “adventure,” how do freedom’s 1), 2), and 3) read?

1) “Freedom helps me understand the universe better.”  There has to be agreement to this statement, even in disagreeing minds; leaders of democracies see freedom as something to be provided for the people and despots of all ilks see freedom as something to be denied the people.  The non-veridical concept of freedom is very useful and motivating in the real, veridical world.

Speaking of the really veridical, 2) “Wars over freedom are sometimes justified.”  So much of history screams for agreement to this 2) sentence.  No need to elaborate upon how much blood has been sacrificed in wars in which somebody’s freedom was at stake.

3) “I have a personal relationship with freedom.”  Plausibly, there would be a lot of agreement here too, even in disagreeing minds.  Citizens have a positive relationship with freedom, while despots have a negative one.

Interestingly, freedom’s three responses to 1), 2), and 3) are three resounding “true’s.”  a) Could it be that a general characteristic of resonating non-veridically based existence is the absence of “absurd” from the answers to the three questions?  (Same for other ideas like freedom?)  b) Is the absence of “absurd” in the answers always characteristic of any kind of non-veridically based existence, not just the resonant kind?  Take the resonant non-veridical case of “love;” I suspect that “absurd” would probably be the logical response to 2) in the case of love (all types, including eros, philos, and agape).  Imagine the insanity of making war on a group because they refused to love your group, or, conversely, because you refused to love them!  Therefore, the answer to the a) question of this paragraph is clearly “no.”  When it comes to scientific, resonating non-veridical ideas, the answer to a) is also “no,” as fighting wars over a scientific theory (whose existence is definitely resonating non-veridically based) is as absurd as the craziest Python skit.  [Imagine testing somebody’s new theory in quantum mechanics by rival, skeptical departments of physics of major universities attacking the claimant’s department instead of “hashing it out” at a conference presentation of lab data.]  Probably it is just coincidence, then, that freedom’s responses are three “true’s.”  Perhaps the proper conclusion to draw on this matter is that responses for the resonating non-veridically based (freedom) are more varied than the responses for the strongly veridically-based (rock) and the quickened & strongly veridically-based (dog).  Getting ahead of ourselves, the idea of a unicorn mentioned above is clearly non-veridical and suspiciously looks non-resonating.  Answers to 1), 2), and 3) for a unicorn must contain at least one “absurd,” if not two or three, so “no” also must be the response to b).
For all possible resonant non-veridically based existences, then, responses to 1), 2), and 3) should be “True,” either “True” or “Absurd” (depending upon the idea), and “True,” respectively.
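The pattern of responses accumulated so far can be condensed into a small lookup table.  The sketch below is my own summary of the discussion, using plain “true”/“absurd” strings; only rock, dog, and freedom are tabulated, since the text does not spell out all three responses for love or the unicorn.

```python
# Responses to the three test sentences 1), 2), and 3) for each category of
# existence discussed so far (as read from the text above).
test_responses = {
    "rock (strongly veridically-based)":            ("true", "absurd", "absurd"),
    "dog (quickened & strongly veridically-based)": ("true", "absurd", "true"),
    "freedom (resonating non-veridically based)":   ("true", "true",   "true"),
}

# The dog differs from the rock only in 3), the "personal relationship" sentence.
assert test_responses["rock (strongly veridically-based)"][:2] == \
       test_responses["dog (quickened & strongly veridically-based)"][:2]
```
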

 

Fourth, we come to the question of God.  I use the generic “God” to include all monotheistic and polytheistic views, in order to address the views of theists, agnostics, and atheists.  If God is used in the context of a specific religion or religious philosophy, I will naturally use the God of the Judeo-Christian tradition, as this is the religious culture in which I have lived.  However, my tack in this ontological “trek” is to come up with as widely applicable conceptions as possible, so that I could just as well use “deity” instead of “God.”  So, how do we know God exists?

God exists, like the rock, the dog, and freedom, as a non-veridical construct of our brain.  God is different from the other three in that God not only is not empirically verified in the “real” world outside our heads, God cannot “escape” our heads via resonance.  (Symbols, words, and actions purportedly representing God’s presence can be sensed all around, but like symbols and actions for freedom, they are NOT God — if they become God to certain worshipers they are NOT ontologically God; they are idols and/or icons or rituals.)  That is, the concept of God is so epiphenomenal (a secondary, coincidental, and unintentional by-product of brain activity), there is no world-wide consistency and agreement among these symbols, words, and actions, as there is for freedom, love, or ethical behavior.  The non-veridical creation of God does NOT resonate with the universe, because God is like an ultimate non-veridical heat sink or dumping ground in our minds for as much definition, blame, credit, love, mystery, origin, power, thought, etc. as we can bestow.  No resonant non-veridical existence, like the idea of freedom, is like that; resonant concepts are definitely defined and predictably correlated to specific objects and actions, not to just any and all objects and actions, as is the case for God.  God is said to be the answer for everything, which is absurd, as it says nothing.  God is said to be in everything, which again says nothing, as we have discovered something in everything (we call them elementary particles) but do not worship elementary particles as God.  Therefore, the non-veridical existence of God does not resonate; it “bounces back” or loops back into the brain’s fanciful, imaginative, creative faculties.  God, then, exists as a looped non-veridically based existence, a concept perpetually defying definition out in the real world outside our heads.  God is epiphenomenalism run amok.

God exists as Santa Claus, Satan, Heaven, Hell, Purgatory, ghosts, the Tooth Fairy, the Easter Bunny, and fairies exist in our brains, and in our brains only.  (It is possible that some, though perhaps not all, of the non-God listings in the previous sentence are resonant and exist as resonant non-veridically based, as will be shown below.)  Theists love and atheists despise the two words “God exists” at the beginning of the first sentence of this paragraph; atheists love and theists despise the entire sentence.  I would speculate that agnostics would be uneasy that theists and atheists could “sort of” agree upon something as “important” as God existing.  I just may have angered all three groups!  I’m not sure any of the three would be happy for me to join their group.

Things that exist as looped non-veridically based entities in the human brain, like God and Arthur Conan Doyle’s English garden fairies, remind us of the “imaginary friends” so many of us imagined as children.  Having imaginary friends probably evolved as culturally advantageous in psychologically dealing with stressful loneliness, a life-long problem for such social creatures as we; hermits are not the normal examples of Homo sapiens.  The modus operandi of creating imaginary friends is related to attributing human characteristics to non-human veridical and non-veridical entities.  We call this anthropomorphism, or the personification of phenomena.  Personification of looped non-veridically based entities in our heads is a hallmark of our epiphenomenal abilities.  Thus, Santa Claus is the personification of the very veridical altruistic behavior of giving at Christmas time; Satan is the personification of the very veridical phenomenon of human evil.  In this sense, Santa Claus and Satan very “weakly” exist, or superstitiously exist — exist as psychological “crutches” to “handle” not-so-simple observations in the real world.  Santa Claus and Satan, as superstitious personifications, enjoy in our heads the ontological label of resonant non-veridically based, as the desire to give and human evil are both very real.  But God could be seen as the superstitious personification of everything and anything, the ultimate “imaginary friend,” or “super-friend,” if you please.  And as a looped non-veridically based entity, God could also be an “all-answer” friend, the “answer” to any and all unanswerable questions.  (Recall the analogy of the ultimate heat sink — actually, functioning like an imaginary “black hole” in our head.)  It is but a short step to God being “the” answer to all we see, to being the origin and Creator of the universe, as well as our super-friend.
This is exactly what theists do; they pray to God one moment and are speechless with pious awe the next as they stare into a telescope at the clear night sky.   What a trick we do in our heads — God is not only “with us,” he/she/it is simultaneously somehow controlling the entire universe!  At one extreme God seems close to being the same as the universe (pantheism) and at the other God seems to be the perfect “person” we wish we could be (wishful narcissism).  Effortlessly swinging back and forth between these theological extremes, we don’t have to think; we only need one answer — God.

[The only way God could be added to Figure 2 in Perception Is Everything, [Jan., 2016] would be the word “God” in dashed format; there would be no world-wide consensus on any dashed object that would represent “God.”]

Thoughts applied to this “whatever and everything” looping non-veridical entity form theology, which varies and correlates with the particular culture of the brains producing the thoughts.  “Looped” is another way of saying “faith-based,” so it is easy to see that theology is a “sitting duck” destined to become toxic due to faith-based epistemology as described in Sorting Out the Apostle Paul, [April, 2012], Jesus — A Keeper, [Sept. 2015], Perception is Everything, [Jan., 2016], and Perception Theory (Perception is Everything) — Three Applications, [Feb., 2016].

Now to sentences 1), 2), and 3). 1) “God helps me understand the universe better.”  Definitely not, as “the” answer to every question is no answer at all.  There is no definition, comparison, or contrasting possible with God.  Even most theistic scientists agree here.

2) "Wars over God are sometimes justified."  Apparently so, as the history of Europe and the Middle East (not to mention events today in the Middle East) attests.  However, this may be the response only for today's theists.  Today's atheists would definitely say "no."  For lack of certainty, agnostics could not justify any "holy war."

3) "I have a personal relationship with God."  Theists say "You 'bet-cha'!"  Atheists say "Hell, no!"  Agnostics say "Who knows?"  The looped non-veridically based existence of God may very well render 3) non-applicable or nonsensical.

So, for God, the three responses in the possible theism, atheism, and agnosticism "triads" are "No!," "Yes/No/No," and "Yes/No/?"  (Or, to correlate with the other three sets of responses, "Absurd," "True/Absurd," and "True/Absurd.")  An astounding assortment of ambiguity, to say the least.  Ontology shows us, then, that God does not exist like a rock or a dog; nor does God exist like freedom.  God exists only in our heads; we have made he/she/it up, and he/she/it is so purely epiphenomenal that he/she/it becoming even weakly veridical (becoming resonant) seems impossible, even oxymoronic.

 

We can construct the following table of ontological results of this “adventure” for convenience:

CATEGORY OF EXISTENCE                        EXAMPLE      1), 2), 3) RESPONSES

Strongly Veridically-based                   Rock         True, Absurd, Absurd

Quickened & Strongly Veridically-based       Dog          True, Absurd, True

Resonating Non-Veridically-based             Freedom      True, True/Absurd, True

Looped Non-Veridically-based                 God          Absurd, True/Absurd, True/Absurd

Clearly, there are two main divisions of categories — the first two are veridical and the last two are non-veridical.  This is to be expected from perception theory with its assumption of "balance" between the objective and the subjective.  The veridically-based categories of existence indicate learning about the universe and avoiding war, while the non-veridically based indicate no definite pattern except ambiguity on war and on personal relationship.  Correlation between the two "veridicals" is strong, and correlation between the two "non-veridicals" is non-existent or, at best, really weak.  Reliability, not surprisingly, seems to lie with the universe outside us, not with that within our heads — with the two "veridicals" and with the non-veridical that resonates with the real world.  Nor is it surprising to see that if you want to know about the universe, you should direct the non-veridical toward the veridical in your head (Perception is Everything, [Jan., 2016]).  And war is clearly a function of our heads, not of the universe.  In my opinion, war may also find more favor among theists than among atheists or agnostics (perhaps I've not met enough Quakers).

The astute reader of perception theory might have noticed I've used, pretty much throughout, the terms "mind" and "brain" interchangeably, as if they are essentially synonymous.  They can be distinguished, but for the purposes of perception theory they obviously go together.  For completeness, let me mention their distinction:  perception theory is compatible with the idea that "mind" is an epiphenomenal by-product of the physiological complexity of the brain, mostly the complexity of those "johnny-come-latelys" of the brain, the frontal lobes; the "mind" is an incidental effect of the complex brain, which originally evolved for the survival of the species.  We needed to be cleverer than the animals competing for our resources and/or trying to eat us, so, with the addition of animal protein from dead animals, our brains enlarged enough, on the average, to be just that — cleverer.  Human birth canals did not enlarge enough to "keep up," so big-brained babies had to be born less mature than the babies of our primate cousins, chimps and gorillas.  This gave Homo sapiens a "long childhood," and child rearing to physical independence became a necessary part of developing human culture, contributing to the advancement of the "nuclear family" and of necessarily cooperative groups, usually of extended kinship.  The imaginations of our "new" big brains had a long time to exercise in this long childhood — so much so that, in my opinion, imagined concepts based upon veridical perceptions led to a self-concept of "that which imagines," or, the mind.  Our brains did not evolve "intentionally" to form a mind; they just happened to be complex enough to form a mind.

The astute reader also no doubt noticed that I described the looped non-veridically based concept of God in our heads as being epiphenomenal, a clear unintentional by-product of brain complexity — a product of our mind.  Perhaps I should have, throughout the presentation of perception theory, used the descriptor "epiphenomenal" with all non-veridical existence, both resonating and looped.  Our ideas and concepts exist as epiphenomenal products of our epiphenomenal mind.

As I began this "ontological adventure" of comparing the existence of a rock, a dog, freedom, and God as suggested by perception theory, I could see that the adventure had to end talking about theists, atheists, and agnostics.  Frankly, I did not at first see exactly where the adventure would leave me, a "perception theorist" or "perceptionist," in relation to these three groups of thinkers.  Would I come down agreeing with one of the groups, or two?  To my surprise, perception theory both agrees and disagrees with all three.  God exists all right, which makes the theists glad but the atheists furious (agnostics would not like this certainty of God's existence), but God exists confined in our heads as, again, "epiphenomenalism run amuck" — a dashed word on the perception screen of our mind — a Grand Answerer, a super-friend so super we don't have to struggle with where we and the universe came from, as God is the answer to that, too; he/she/it is not only the Grand Answerer and Grand Super-friend, he/she/it is also the Grand Creator.  God is all we need in one Grand Epiphenomenal Package, saving us from having to mentally struggle, think, and/or worry.  That God is only in our heads infuriates the theists and delights the atheists (and again is too certain for agnostics).

Perception theory, then, in a way makes the clashes, conflicts, debates, and ill feelings among theists, atheists, and agnostics seem rather silly.  The differences among them are interesting, but not worth fighting over.  I take my cue from Arian Foster, NFL running back formerly with the Houston Texans, the only NFL player I know of with the courage to "come out" in favor of freethinking amidst a locker room and overall profession teeming with theism.  Arian says it is better to have friendly, respectful dialogue about religious beliefs than to try to convert each other.  He is, in addition to being a free agent as of this writing, in my book a perfect candidate for being called a perceptionist.

 

Finally, I want to establish that despite many correlations between perception theory and Richard Dawkins' The God Delusion (2006, Houghton Mifflin Harcourt, New York, NY, ISBN 978-0-618-91824-9 (pbk.) or 0-618-91824-8 (pbk.)), I had developed perception theory before I read the book, though the book was written about a decade before my theory.  I am delighted at these independent correlations, as I've met Dr. Richard Dawkins personally and spent a few hours with him one-on-one, during which we did NOT discuss our religious positions.  I consider him a friend of casual acquaintance, though it is possible he has no recollection of meeting me.  I met him years ago as part of the cast of a BBC film featuring Richard that was part of the debunking of creationist fossilized "mantrack" claims along the Paluxy River near my home in Texas; my role was the "intrepid amateur paleontologist (with son)," among many amateur and professional scientists, who were showing that these claims had no scientific merit whatsoever.  (See Creationism and Intelligent Design — On the Road to Extinction, [July, 2012].)  I recommend all of Dawkins' books to the readers of perception theory.  The God Delusion presents the case for atheism very well for theists, atheists, and agnostics; I can only hope my presentation of the case for perception theory does something similar for all three groups.  I agree with Arian Foster: I hope in the future to have meaningful, respectful, and friendly dialogue among all three groups, during which I'd love to renew my acquaintance with Richard Dawkins and start one with Arian Foster.

[Incidentally, the BBC film done along the Paluxy River, entitled "God, Darwin, and the Dinosaurs," was so "controversial" in the U.S. that it was never aired on PBS's scientific TV series "NOVA."  It was, however, shown in Britain (I think) and Canada.  I got to see it only because a Canadian friend of mine mailed me a videotape copy he recorded off his TV!  I can only hope that public scientific sensibilities in the U.S. are now less "medieval" than they were then.]

RJH

 

Perception Is Everything

Recently a model of human perception has occurred to me. Perception is like that "screen" of appearance before us in our waking hours that is turned off when we are asleep. Yet it appears the screen does not really turn off during slumber, since we remember dreams we had before we awoke. The moments just before we "nod off" or just as we awake seem to be times when perception is "half-way" turned on. The "fuzziness" of this "half-way switch" is clearly apparent on those mornings we awake and momentarily do not know exactly where we slept.

 

Say I am sitting in an enclosed room with a large card painted uniformly with a bright red color. Focusing upon only my visual sensation, suppressing the fact that I am also sensing the tactile signals of sitting in a chair with my feet on the floor as well as peripherally seeing "in the corner of my eye" the walls and other features of the room, I am, for simplicity, only visually observing the color "red." Light from the card enters my eyes and is photo-electrically and electro-chemically processed into visual signals down my optic nerve to the parts of my brain responsible for vision. The result of this process is the perception of the color "red" on the "screen" of my perception. If I were to describe this perception to myself, I would simply imagine the word "red" in my head (or the equivalent word in some other language if my "normal" spoken language were not English); were I to describe this perception to someone else in the room, say, a friend standing behind me, I would say, "I am seeing the color red," again in the appropriate language.

Yet, if my friend could somehow see into my head and observe my brain as I claimed to be seeing red, that person would not experience my sensation or perception of “red.” He/she would see, perhaps with the help of medical instrumentation, biochemical reactions and signals on and in my brain cells. Presumably when I perceive red at a different moment in time later on, the observer of my brain would see the same pattern of chemical reactions and bio-electrical signals.

 
On the "screen" of my perception, I do NOT see the biochemistry of my brain responsible for my perception of red; were I to observe inside the head of my friend in the room while he/she was also focusing on the red card, I would NOT see his/her "screen" of perception, but only the biochemical and bio-electrical activity of his/her brain. It is IMPOSSIBLE to both experience (perceive) the subjective perception of red and observe the biochemistry responsible for that same subjective perception within the same person. We can hook up electrodes from our own head to a monitor that we observe at the same time we look at red, but we would only be seeing another representation of the biochemistry forming our perception, not the biochemistry itself, in addition to perceiving red. I call this impossibility "the subjective trap."

 
And yet, my friend and I make sense of each of our very individual impossibilities, of each of our very personal subjective traps, by behaving as if the other perceives red subjectively exactly the same, and as if the biochemical patterns in our respective brains are exactly the same. We ASSUME these subjective and biochemical correlations hold, but we could never show this is the case; we cannot prove the perceptions in our own head are the same perceptions in other heads; we cannot ever know that we perceive the same things that others around us perceive, even when focusing upon the exact same observation. The very weak justification of this assumption is that we both call our parallel perceptions, in this scenario, "red." But this is merely the learning of linguistic labels. What if I had been raised in complete isolation and been told that the card was "green"? I would say "green" when describing the card while my friend, raised "normally," would say "red." (Note I'm stipulating neither of us is color blind.) Such is the nature of the subjective trap.

 
[If one or both of us in the room were color-blind, comparison of visual perceptions in the context of our subjective traps would be meaningless — nothing to compare or assume. In this scenario, another sensation both of us could equally perceive, like touching the surface of a piece of carpet or rubbing the fur of a cute puppy in the room with us, would be substituted for seeing the color red.]

 
The subjective trap suggests the dichotomy of "objective" and "subjective." What we perceive "objectively" and what we perceive "subjectively" do not seem to overlap (though they seem related and linked), leading to a separation of the two adjectives in our culture, a separation with a checkered history. Using crude stereotypes, the sciences claim objectivity is good while subjectivity is suspect, while the liberal arts (humanities) claim subjectivity is good while objectivity is ignorable. Even schools, colleges, and universities are physically laid out with the science (including mathematics and engineering) buildings on one end of the campus and the liberal arts (including social studies and psychology) buildings on the other. This is the "set-up" for the "two cultures'" "war of words." I remember, as an undergraduate physics major, debating an undergraduate political science major as we walked across campus over which has had the greater impact upon civilization, science or politics. We soon came to an impasse, an impasse that, in retrospect over the years, could possibly be blamed on the subjective trap. Ideas about the world outside us seemed at odds with ideas about our self-perception; where we see ourselves seemed very different from who we see ourselves as; what we are is different from who we are.

Yet, despite being a physics major and coming down “hard” on the “science side” of the argument, I understood where the “subjective side” was coming from, as I was in the midst of attaining, in addition to my math minor, minors in philosophy and English; I was a physics major who really “dug” my course in existentialism. It was as if I “naturally” never accepted the “two cultures” divide; it was as if I somehow “knew” both the objective and the subjective had to co-exist to adequately describe human experience, to define the sequence of perception that defines a human’s lifespan. And, in this sense, if one’s lifespan can be seen as a spectrum of perception from birth to death of that individual, then, to that individual, perception IS everything.

How can the impossibility of the subjective trap be modeled? How can objectivity and subjectivity be seen as a symbiotic, rather than as an antagonistic, relationship within the human brain? Attempted answers to these questions constitute recent occurrences inside my brain.

 

Figure 1 is a schematic model of perception seen objectively – a schematic of the human brain and its interaction with sensory data, both from the world “outside” and from the mind “inside.” The center of the model is the “world display screen,” the result of a two-way flow of data, empirical (or “real world” or veridical) data from the left and subjective (or “imaginative” or non-veridical) data from the right. (Excellent analogies to the veridical/non-veridical definitions are the real image/virtual image definitions in optics; real images are those formed by actual rays of light and virtual images are those of appearance, only indirectly formed by light rays due to the way the human brain geometrically interprets signals from the optic nerves.) [For an extensive definition of veridical and non-veridical, see At Last, A Probable Jesus [August, 2015]] Entering the screen from the left is the result of empirical data processed by the body’s sense organs and nervous system, and entering the screen from the right is the result of imaginative concepts, subjective interpretations, and ideas processed by the brain. The “screen” or world display is perception emerging to the “mind’s eye” (shown on the right “inside the brain”) created by the interaction of this two-way flow.

 
Figure 1 is how others would view my brain functioning to produce my perception; Figure 1 is how I would view the brains of others functioning to produce their perceptions. This figure helps define the subjective trap in that I cannot see my own brain as it perceives; all I can "see" is my world display screen. Nor can I see the world display screens of others; I can only view the brains of others (short of opening up their heads) through some schematic model like Figure 1. In fact, Figure 1 is a schematic representation of what I would see were I to peer inside the skull of someone else. (Obviously, it is grossly schematic, bearing no resemblance to brain, nervous system, and sense organ physiology. Perhaps the many far more proficient in brain function than I, and surely such individuals in the future, can and will correlate the terms on the right side of Figure 1 with actual parts of the brain.)

 
Outside data collectively is labeled "INPUT" on the far left of Figure 1, bombarding all the body's senses — sight, sound, smell, taste, heat, and touch. Data that stimulates the senses is labeled "PERCEPTIVE" and either triggers the autonomic nervous system to the muscles for immediate reaction (sticking your fingers into a flame), requiring no processing or thinking, or goes on to be processed as possible veridical data for the world display. However, note that some inputs for processing "bounce off" and never reach the world display; if we processed the entirety of our data input, our brains would "overload," using up all brain function for storage and having none for consideration of the data "let in." This overloading could be considered a model for so-called "idiot savants," who perceive and remember so much more than the "average" person ("perfect memories"), yet have subnormal abilities for rational thought and consideration. Just how some data is ignored and some is processed is not yet understood, but I would guess it is a process that differs in every developing brain, resulting in no two brains, even those of twins, accepting and rejecting data EXACTLY alike. What is for sure is that over hundreds of thousands of years we have evolved the "selective" data perception that has assured our survival as a species.
The accepted, processed data that enter our world display in the center of Figure 1 as veridical data from the outside world make up the "picture" we "see" on our "screen" at any given moment, a picture dominated by the visual images of the objects before us, near and far, but also supplemented by sound, smell, tactile information from our skin, etc. (This subjective "picture" is illustrated in Figure 2.) The "pixels" of our screen, if you please, enter the subjective world of our brain shown on the right of Figure 1 in four categories – memory loops, ideas, self-perception, and concepts – as shown by the double-headed, broad, straight arrows penetrating the boundary between the world display and the four categories. The four categories "mix and grind" this newly-entered data with previous data in all four categories (shown by crossed and looped broad, double-headed arrows) to produce imagined and/or reasoned data projected back upon the same world display as the moment's "picture" – non-veridical data moving from the four categories back into the display (thus the "double-headedness" of the arrows). Thus can we imagine things before us that are not really there at the moment; we can, for instance, imagine a Platonic "perfect circle" (non-veridical) not really there upon a page of circles actually "out there" drawn upon a geometry textbook's page (veridical) at which we are staring. In fact, the Platonic "perfect circle" is an example of a "type" or "algorithmic" or symbolic representation of ALL circles, created by our subjective imagination so we do not have to "keep up" with all the individual circles we have seen in our lifetime. Algorithms and symbols represent the avoidance of brain overload.

 
From some considered input into the four categories of the brain come "commands" to the muscles and nervous system to create OUTPUT and FEEDBACK into the world outside us, in addition to the autonomic nerve commands mentioned above, like the command to turn the page of the geometry text at which we are looking. Through reactive and reflexive actions, bodily communication (e.g. talking), and environmental manipulation (like using tools) resulting from these feedback outputs into the real world (shown at bottom left of Figure 1), we act and behave just as if there had been an autonomic reaction, only this time the action or behavior is the result of "thinking" or "consideration" (see the curved arrow labeled "Considered" leading to the muscles in Figure 1).
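For readers who like to see a model in motion, the data flow just described (filtered INPUT, the world display, the subjective categories returning non-veridical data onto the same display, and "considered" OUTPUT) might be caricatured in a few lines of code. This is purely an illustrative toy of my own; the class, the "salience" threshold, and all names are inventions for illustration, claiming nothing about actual brain physiology:

```python
# Toy caricature of the Figure 1 data flow: sensory INPUT is filtered
# (some data "bounces off" to avoid overload), accepted data joins the
# world display as veridical data, the subjective categories feed
# non-veridical data back onto the same display, and "considered"
# output commands to the muscles emerge.

class PerceptionModel:
    def __init__(self, salience_threshold=0.5):
        # Illustrative filter: how "strong" a signal must be to be processed
        # rather than "bounce off."
        self.salience_threshold = salience_threshold
        # The four categories on the right of Figure 1.
        self.categories = {"memory_loops": [], "ideas": [],
                           "self_perception": [], "concepts": []}
        self.world_display = []  # the "screen" of perception

    def sense(self, inputs):
        """inputs: list of (label, salience) pairs; weak signals bounce off."""
        accepted = [label for label, salience in inputs
                    if salience >= self.salience_threshold]
        self.world_display = list(accepted)           # veridical data
        for label in accepted:                        # data enters a category
            self.categories["memory_loops"].append(label)
        # Non-veridical feedback: a remembered "type" rejoins the display.
        for remembered in self.categories["memory_loops"]:
            idea = f"idea-of-{remembered}"
            if idea not in self.world_display:
                self.world_display.append(idea)       # non-veridical data
        return self.world_display

    def considered_output(self):
        """A 'considered' command to the muscles, as opposed to autonomic."""
        return [f"act-on-{item}" for item in self.world_display]

model = PerceptionModel()
screen = model.sense([("red card", 0.9), ("hum of fan", 0.2)])
# The faint hum bounces off; the display holds the card plus its "idea."
```

The only point of the toy is the topology: veridical data enters from the "left," some of it never reaches the display, and the categories project non-veridical data onto the same display from the "right."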

 

Note how Figure 1 places epistemological and existential terms like CONSCIOUSNESS, Imagination, Knowing, Intention & Free Will, and Reason in place on the schematic, along with areas of the philosophy of epistemology, like Empiricism, Rationalism, and Existentialism (at the top of Figure 1). These placements are my own philosophical interpretations and are subject to change and placement alteration indicated by a consensus of professional and amateur philosophers, in conjunction with consensus from psychologists and brain physiologists, world-wide.
Figure 2 is a schematic of the "screen" of subjective perception that confronts us at every moment we see, hear, smell, taste, and/or touch. Figure 2 is again crudely schematic (like Figure 1), in this case devoid of the richness of the signals of our senses processed and displayed to our "mind's eye." Broad dashed arrows at the four corners of the figure represent the input to the screen from the four categories on the right of Figure 1 – memory loops, ideas, self-perception, and concepts. Solid illustrated objects on Figure 2 represent processed, veridical, empirical results flowing to the screen from the left in Figure 1, and dashed illustrated objects on Figure 2 represent subjective, non-veridical, type, and algorithmic results flowing to the screen from the right in Figure 1. Thus Figure 2 defines the screen of our perception as a result of the simultaneous flow of both the veridical and the non-veridical making up every waking moment.


Figure 1 — A Model of the Objectivity of Perception

 

(Mathematical equations cannot be printed in dashed format, so the solid equations and words, like History, FUTURE, Faith, and PRESENT, represent both veridical and non-veridical forms; note I was able to represent the veridical and non-veridical forms of single numbers, like “8” and certain symbols, like X, equals, and does not equal.) Thus, the solid lightning bolt, for example, represents an actual observed bolt in a thunderstorm and the dashed lightning bolt represents the “idea” of all lightning bolts observed in the past.

 

The “subjective trap” previously introduced above is defined and represented by the rule that nothing of Figure 1 can be seen on Figure 2, and vice-versa. In my “show-and-tell” presentation of this perception model encapsulated in both figures, I present the figures standing on end at right angles to each other, so that one figure’s area does not project upon the area of the other – two sheets slit half-height so that one sheet slides into the other. Again, a) Figure 2 represents my own individual subjective screen of perception no one else can see or experience; b) Figure 1 represents the only way I can describe someone else allegedly perceiving as I. I cannot prove a) and b) are true, nor can anyone else. I can only state with reasonable certainty that both someone else and I BEHAVE as if a) and b) are true. In other words, thanks to the common cultural experience of the same language, my non-color-blind friend and I in the room observing the red-painted card agree the card “is red.” To doubt our agreement that it is red would stretch both our limits of credulity into absurdity.

 
The model described above and schematically illustrated in Figures 1 and 2 can be seen as one way of describing the ontology of human beings, of describing human existence. Looking at Figure 1, anything to the left of the world display screen is the only way we know anything outside our brain exists, and anything to the right of the world display screen is the only way we know we as "I's" exist in a Cartesian sense; anything to the right is what we call our "mind," and we assume we think with our mind; in the words of Descartes, "I think, therefore I am." We see our mind as part of the universe being "bombarded" from the left, so we think of ourselves as part of the universe. Modern science has over the centuries given us some incredible ontological insights, such as that all physical existence is made up of atoms and molecules and elementary particles; we can objectively or "scientifically" describe our existence, but we do so, as we describe anything else, with our subjective mind; we, as self-conscious beings, describe the veridical in the only way we possibly can – non-veridically. Thus the model suggests an incredible statement made by scientists and philosophers of science lately. Recalling that atoms are created in the interiors of stars ("cooked," if you please, by nuclear fusion inside stars of various sizes and temperatures) that have long since "died" and spewed out their atoms in


Figure 2 — A Model of the Subjectivity of Perception (The “Screen”)

 

contribution to the formation of our own solar system around 4.6 billion earth years ago, and recalling that our bodies, including our brains, are made of molecules made from the atoms of dead and gone stars, the statement "We are 'star-stuff' in self-contemplation" makes, simultaneously, objective and subjective, or scientific and artistic, "spiritual sense."

We can veridically “take in,” “observe,” “experience,” or “contemplate” anything from the vast universe outside our body as well as the vast universe inside our body outside our brain while at the same time we can imagine non-veridically limitless ways of “making sense” of all this veridical data by filing it, storing it, mixing it, and thinking about it, all within our brain. We are limitless minds making up part of a limitless universe.

 

As if that were not enough, each of us, as a veridical/non-veridical "package of perception," is unique. Every human has a unique Figure 1 and a unique Figure 2. Our existence rests upon the common human genome of our species, the genetic "blueprint" that specifies the details of our biological existence. Yet every individual's genome is different from every other (even if only by 0.1%, a factor of 0.001), considering that mutations make even identical twins' two "blueprints" slightly different once the two organisms exist as separate zygotes in the womb. Moreover, how we behave, and, therefore, how we respond non-veridically to the veridical data we receive individually, even from the same environment shared by others, is mediated by the unique series of experiences each of us has had in our past. Hence, each person is a unique individual genome subjected to unique environmental experiences, an exact copy of which cannot possibly statistically exist.

 

The world display screen of an individual in any given moment has never been perceived before, nor will it ever be perceived again, as in the next moment the screen is modified by the dual flux of veridical data from the left and non-veridical data from the right in Figure 1. The life of an individual is a series of receiving this ever-changing dual flux and thinking or acting in the real world upon its basis; it is a series of two-way perceptions. The life of an individual is observed by another individual as a series of perceived behaviors assumed, but never proven, to be generated in the same way as those of the observer. All in the span of a human life is perception; to an individual human being, perception has to be everything.

 

This model suggests to me the absurdity of having objectivity and subjectivity irreconcilably separate; it suggests, rather, that they are inseparable; they go together like, in the words of the song, “horse and carriage” or “love and marriage.” The blending of objective data and imaginative concepts in our brain makes our perception, our conscious “everything,” or existence as a self-conscious being, if you please, possible. What we are is the veridical of our screen of perception; who we are is the non-veridical of the screen. In other words, the scientist is as potentially subjective as the poet, and the poet is as potentially objective as the scientist; they differ only in the emphases on the contents of their respective screens of perception. For the “two sides” of campuses of higher learning to be at “war” over the minds of mankind is absurd – as absurd as the impasse the political science major and I reached in conversation so many years ago.

 
If the above was all the model and its two figures did, its conjuring would have been well worth it, I think, but the above is just the tip of the iceberg of how the model can be applied to human experience. Knowing how prone we are to hyperbole when talking about our "brain children," I nonetheless feel compelled to suggest this model of perception can be intriguingly applied to almost any concept or idea the human brain can produce – in the sense of alternatively defining the concept using "both worlds," both the objective and the subjective, instead of using one much more than the other. In other words, we can define with this model almost anything more "humanly" than before; we can define and understand almost anything with "more" of ourselves than we have in the past.

 

Take the concept of the human "soul" for example. It seems to me possible that cultures that use the concept of soul, whether in a sacred or secular sense, whether in the context of religion or psychology, are close to using the concept of the "mind's eye" illustrated in Figure 1 of the model. The "mind's eye" is the subjective "I," the subjective observer of the screen, the "see-er," the "smell-er," the "taste-er," the "hear-er," the "touch-er," the "feel-er" of perception; the soul is the active perceiver of subjective human experience. The soul defines self-consciousness; it is synonymous with the ego. This view is consistent with the soul being defined as the essence of being alive, of being that which "leaves" the body upon death. Objectively, we would say that death marks the ceasing of processing veridical data; subjectively, we would say that death marks the ceasing of producing non-veridical data and the closing of the "mind's eye."

 

Yet the soul is a product of the same physiology as the pre-conscious “body” of our evolutionary ancestors. In other words, the soul “stands upon the shoulders” of the id, our collection of instincts hewn over millions of years. So, in addition, we would objectively say that death also marks the ceasing of “following” our instincts physically and mentally; our unique, individual genome stops defining our biological limitations and potentialities. The elements of our body, including our brain, eventually blend to join the elements of our environment. Objectively, we would say death marks our ceasing to exist as a living being. The concept of the soul allows death to be seen as the “exiting” or “leaving” of that necessary to be called “alive.”

 
So, the concept of the soul could be discussed as the same as or similar to the concept of the ego, and issues such as when a developing human fetus (or proto-baby) develops or "receives" a soul/ego (an issue which in turn has everything to do with the issue of abortion) can be discussed without necessarily coming to impasses. (See my The 'A' Word – Don't Get Angry, Calm Down, and Let Us Talk, [April, 2013] and my The 'A' Word Revisited (Because of Gov. Rick Perry of Texas), or A Word on Bad Eggs [July, 2013].) I said "could be," not "will be," discussed without possibly coming to impasses. Impasses between the objective and subjective seem more the norm than the exception, unfortunately; the "two cultures war" appears ingrained. Why?

 
Earlier, I mentioned casually the answer the model provides to this "Why?". The scientist/engineer and the artist/poet differ in their emphases of either the veridical flux to the world display screen or the non-veridical flux to the same world display screen of their individual brains. By "emphasis" I merely mean the individual's assigning more importance to one flux direction or the other in his/her head. At this point, one is reminded of the "left-brain, right-brain" dichotomy dominating brain/mind modeling since the phenomenon of the bicameral mind became widely accepted. The perception model being presented here incorporates on the non-veridical side of the perception screen both analytical (left) brain activity and emotional (right) brain activity in flux to the screen from the right side of Figure 1. Because my use of left/right in Figure 1 is not the left/right of bicameral mind/brain modeling, this model of perception is not directly analogous to bicameral modeling. What the perception model suggests, in my opinion, is that the analytical/emotional chasm of the human brain is not as unbridgeable as the "left-brain-right-brain" view might suggest.

More specifically, the perception model suggests that the "normal" or "sane" person keeps the two fluxes to the world display screen in his/her head "in balance," always one flux mitigating and blending with the other. It is possible "insanity" might be the domination of one flux over the other so great that the dominated flux is rendered relatively ineffective. If the veridical flux is completely dominant, the person's mind is in perpetual overload with empirical data, impotent to sort or otherwise deal with the one-way bombardment on his/her world display screen; such a person would presumably be desperate to "turn off" the bombardment; such a person would be driven to insanity by sensation. If the non-veridical flux is completely dominant, the person's mind is in a perpetual dream of self-induced fantasy, sensing with all senses that which is NOT "out there;" such a person would be driven to insanity by hallucination. In this view, the infamous "acid trips" of the 1960's induced by hallucinogenic drugs such as LSD could be seen as self-induced temporary periods of time in which the non-veridical flux "got the upper hand" over the veridical flux.

This discussion of "flux balance" explains why dreams are depicted in Figure 1 as "hovering" just outside the world display screen. The perception model suggests dreams are the brain's way of keeping the two fluxes in balance, keeping us as "sane" as possible. In fact, the need to keep the fluxes in balance, seen as the need to dream, may explain why we and other creatures with large brains apparently need to sleep. We need "time outs" from empirical data influx (not to mention "time outs" just to rest the body's muscular system and other systems) to give dreaming the chance to balance out the empirical with the fanciful on the stage of the world display. Dreams are the mixtures of the veridical and non-veridical not needed to be stored or acted upon in order to prevent overload from the fluxes of the previous day (or night, if we are "night owls"); they play out without being perceived in our sleeping unconsciousness (except for the dreams we "remember" just before we awaken) like files in computer systems sentenced to the "trash bin" or "recycle bin" marked for deletion. Dreams can be seen as a sort of "reset" procedure that prepares the world display screen for the upcoming day's (or night's) two-way flux flow that defines our being awake and conscious.

This model might possibly suggest new ways of defining a "scientific, analytical mind" ("left brain") and comparing that with an "artistic, emotional mind" ("right brain"). Each could be seen as a slight imbalance (emphasis on "slight" to remain "sane") of one flux over the other, or, better, as two possible cases of one flux mitigating the other slightly more. To think generally "scientifically," therefore, would be when the non-veridical flux blends "head-on" upon the world display screen with the veridical flux and produces new non-veridical data that focuses primarily upon the world external to the brain; the goal of this type of non-veridical focus is to create cause/effect explanations, to problem-solve, to recognize patterns, and to create non-veridically rational hypotheses, or, as I would say, "proto-theories," or scientific theories in-the-making. Thus is knowledge about the world outside our brain increased. To think generally "artistically," on the other hand, would be when the non-veridical flux takes on the veridical flux upon the world display screen as ancillary only, useful in focusing upon the "world" inside the brain; the goal of this type of non-veridical focus is to create new ways of dealing with likes, dislikes, and emotions, to evoke "feelings" from morbid to euphoric, and to modify and form tastes from fanciful thinking to dealing emotionally with the external world in irrational ways. Thus is knowledge about what we imagine and about what appears revealed to us inside our brain increased.

With these two new definitions, it is easy to see that we have evolved as a species capable of being simultaneously both scientific and artistic, both "left-brain" and "right-brain;" as I said earlier, the scientist is as potentially subjective as the poet, and the poet is as potentially objective as the scientist. We do ourselves a disservice when we believe we have to be one or the other; ontologically, we are both. Applying the rule of evolutionary psychology that any defining characteristic we possess as a species that we pass on to our progeny was probably necessary today and/or in our past to our survival (or, at minimum, was "neutral" in contributing to our survival), the fact we are necessarily a scientific/artistic creature was in all likelihood a major reason we evolved beyond our ancestral Homo erectus and "triumphed" over our evolutionary cousins like the Neanderthals. When we describe in our midst a "gifted scientist" or a "gifted artist," we are describing a person who, in their individual, unique existence, purposely developed, probably by following their tastes (likes and dislikes), one of the two potentialities over the other. The possibility that an individual can be gifted in both ways is very clear. (My most memorable example of a "both-way" gifted person was when I, as a graduate student, looked in the orchestra pit at a production of Handel's Messiah and saw in the first chair of the violin section one of my nuclear physics professors.) Successful people in certain vocations, in my opinion, do better because of strong development of both their "scientific" and "artistic" potentialities; those in business and in service positions need the ability to deal simultaneously and successfully with problem solving and with the emotions of colleagues and clientele.
Finding one’s “niche” in life and in one’s culture is a matter of taste, depending on whether the individual feels more comfortable and satisfied “leaning” one way or another, or, being “well-rounded” in both ways.

Regardless of the results of individual tastes in individual circumstances, the “scientist” being at odds with the “artist” and vice-versa is always unnecessary and ludicrous; the results of one are no better or worse than those of another, as long as those results come from the individual’s volition (not imposed upon the individual by others).

 

From the 1960’s “acid rock, hard rock” song by Jefferson Airplane, Somebody to Love:

When the truth is found to be……lies!
And all the joy within you…..dies!
Don’t you want somebody to love?
Don’t you need somebody to love?
Wouldn’t you love somebody to love?
You better find somebody to love!

These lyrics, belted out by front woman Grace Slick, will serve as the introduction to two of the most interesting and most controversial applications of this perception theory. The first part about truth, joy, and lies I’ll designate as GS1, for “Grace Slick Point 1” and the second part about somebody to love I’ll designate as GS2.

Going in reverse order, GS2 to me deals with that fundamental phenomenon without which our cerebral species or any such species could not have existed – falling in love and becoming parents, or, biologically speaking, pair bonding. The universal human theme of erotic love is the basis of so much of culture’s story-telling, literature, poetry, and romantic subjects of all genres. Hardwired into our mammalian genome is the urge, upon the outset of puberty, to pair-bond with another of our species and engage, upon mutual consent, in sexual activity. If the pair is made of two different genders, such activity might fulfill the genome’s “real” intent of this often very complex and convoluted bonding – procreation of offspring; procreation keeps the genes “going;” it is easily seen as a scientific form of “immortality;” we live on in the form of our children, and in our children’s children, and so on. Even human altruism seems to emerge biologically from the urge to propagate the genes we share with our kin.

Falling in love, or pair bonding, is highly irrational and, therefore, a very non-veridical phenomenon; love is blind. When one is in love, the shortcomings of the beloved are ignored, because their veridical signals are probably blocked non-veridically by the "smitten;" when one is in love, and when others bring up any shortcomings of the beloved, they are denied by the "smitten," often in defiance of veridical evidence. If this were not so, if pair bonding were a rational enterprise, far fewer pair bonds would occur, perhaps threatening the perpetuation of the species into another generation. [This irrationality of procreation was no better depicted than in an episode of the first Star Trek TV series back in the 1960's, wherein the half human-half alien (Vulcan) Enterprise First Science Officer Spock (played by Leonard Nimoy) horrifically went apparently berserk and crazy in order to get himself back to his home planet so he could find a mate (to the point of hijacking the starship Enterprise). I think it was the only actual moment of Spock's life on the series in which he was irrational, in which he behaved like us: fully human.]

GS1 is to me another way of introducing our religiosity, of asking why we are as a species religious. This question jump-started me on my "long and winding road," as I called it – a personal Christian religious journey in five titles, written in the order they need to be read: 1) Sorting Out the Apostle Paul [April, 2012], 2) Sorting Out Constantine I the Great and His Momma [Feb., 2015], 3) Sorting Out Jesus [July, 2015], 4) At Last, a Probable Jesus [August, 2015], and 5) Jesus – A Keeper [Sept., 2015]. Universal religiosity (which I take as an interpretation of GS1) is here suggested as being like the universality of the urge to procreate, though not nearly as ancient as GS2. As modern humans emerged and became self-conscious, they had to socially bond into small bands of hunter-gatherers to survive and protect themselves and their children, and part of the glue holding these bands together was not only pair-bonding and its attendant primitive culture, but also the development of un-evidenced beliefs – beliefs in gods and god stories – to answer the then unanswerable, like "What is lightning?" and "How will we survive the next attack from predators or the enemy over the next hill?" In other words, our non-veridical faculties in our brain dealt with the "great mysteries" of life and death by making up gods and god stories to provide assurance, unity, fear, and desperation sufficient to make survival of the group more probable. Often the gods took the shape of long-dead ancestors who "appeared" to individuals in dreams (At Last, a Probable Jesus [August, 2015]). Not that there are "religious genes" like there are "procreate genes," but, rather, our ancestors survived partly because the genes they passed on to us tended to make them cooperative for the good of the group bound by a set of accepted beliefs – gods and god stories; that is, bound by "religion."

The “lies” part of GS1 has to do with the epistemological toxicity of theology (the intellectual organization of the gods and god stories) – religious beliefs are faith-based, not evidence-based, a theme developed throughout the five parts of my “long and winding road.” On p. 149 of Jerry A. Coyne’s Faith vs. Fact, Why Science and Religion are Incompatible (ISBN 978-0-670-02653-1), the author characterizes this toxicity as a “metaphysical add-on….a supplement demanded not by evidence but by the emotional needs of the faithful.” Any one theology cannot be shown to be truer than any other theology; all theologies assume things unnecessary and un-evidenced; yet, all theologies declare themselves “true.” As my personal journey indicates, all theologies are exposed by this common epistemological toxicity, yet it is an exposé made possible only since the Enlightenment of Western Europe and the development of forensic history in the form of, in the case of Christianity, higher Biblical criticism. This exposé, in my experience, can keep your “joy” from dying because of “lies,” referring back to GS1.

Both GS1 and GS2 demonstrate the incredible influence of the non-veridical capabilities of the human brain. A beloved one can appear on the world display screen, can be perceived, as “the one” in the real world “out there,” and a god or the lesson of a god story can appear on the world display screen, can be perceived, as actually existing or as being actually manifest in the real world “out there.”

Putting GS1 in more direct terms of the perception model represented by Figures 1 and 2, non-veridical self-consciousness desires the comfort of understandable cause and effect as it develops from infancy into adulthood; in our brains we “need” answers — sometimes any answers will do; and the answers do not necessarily have to have veridical verification. Combining the social pressure of the group for conformity and cooperation, for the common survival and well-being of the group, with this individual need for answers, the “mind,” the non-veridical, epiphenomenal companion of our complex brain, creates a personified “cause” of the mysterious and a personified “answerer” to our nagging questions about life and death in general and in particular; we create a god or gods paralleling the created god or gods in the heads of those around us who came before us (if we are not the first of the group to so create). We experience non-veridically the god or gods of our own making through dreams, hallucinations, and other visions, all seen as revelations or visitations; these visions can be as “real” as the real objects “out there” that we sense veridically. (See At Last, a Probable Jesus [August, 2015] for examples of non-veridical visions, including some of my own.) Stories made up about the gods, often created to further explain the mysteries of our existence and of our experiences personally and collectively, combine with the god or gods to form theology. Not all of theology is toxic; but its propensity to become lethally dangerous to those who created it, when it is developed in large populations into what today are called the world’s “great religions,” and fueled by a clergy of some sort into a kind of “mass hysteria” (Crusades, jihads, ethnic “cleansings,” etc.), makes practicing theology analogous to playing with fire. As I pointed out in Jesus – A Keeper [Sept., 2015], epistemologically toxic theology is dangerously flawed. 
Just as we have veridically created the potential of destroying ourselves by learning how to make nuclear weapons of mass destruction, we have non-veridically created reasons for one group to try to kill off another group by learning how to make theologies of mass destruction; these theologies are based upon the "authority" of the gods we have non-veridically created and non-veridically "interpreted" or "listened to." It is good to remember Voltaire's words, or a paraphrase thereof: "Those who can make you believe absurdities can make you commit atrocities."

Also remember, the condemnation of toxic theology is not the condemnation of the non-veridical; a balance of the veridical flux and the non-veridical flux was absolutely necessary in the past and absolutely necessary today for our survival as individuals, and, therefore, as a species. Toxic theology, like fantasy, is the non-veridical focused upon the non-veridical – the imagination spawning even more images without checking with the veridical from the “real world out there.” Without reference to the veridical, the non-veridical has little or no accountability toward being reliable and “true.” All forms of theology, including the toxic kind, and all forms of fantasy, therefore, have no accountability toward reality “out there” outside our brains. Harmony with the universe of which we are a part is possible only when the non-veridical focuses upon referencing the veridical, referencing the information coming through our senses from the world “out there.” This is the definition of “balance” of the two fluxes to our world display screens in our heads.

Comparing this balanced flux concept with the unbalanced one dominated by the non-veridical (remember the unbalanced flux dominated by the veridical is brain overload leading to some form of insanity), it is easy to see why biologist Richard Dawkins sees religiosity as a kind of mental disease spread like a mental virus through the social pressures of one’s sacred setting and through evangelism. Immersing one’s non-veridical efforts into theology is in my opinion this model’s way of defining Dawkins’ “religiosity.” In the sense that such immersion can often lead to toxic theology, it is easy to see the mind “sickened” by the non-veridical toxins. Whether Dawkins describes it as a mental disease, or I as an imbalance of flux dominated by the non-veridical, religiosity or toxic theology is bad for our species, and, if the ethical is defined as that which is good for our species, then toxic theology is unethical, or, even, evil.

To say that the gods and god stories, which certainly include the Judeo-Christian God and the Islamic Allah, are all imaginative, non-veridical products of the human mind/brain is not necessarily atheistic in meaning, although I can understand that many a reader would respond with "atheist!" Atheism, as developed originally in ancient Greece and further developed after the European Enlightenment in both Europe and America, can be seen as still another form of theology, though a godless one, potentially as toxic as any other toxic theology. Atheism pushing no god or gods can be as fundamentalist as any religion pushing a god or gods, complete with its dogma without evidence, creeds without justification, evangelism without consideration of the evangelized, and intolerance of those who disagree; atheism can be but another religion. Atheism in the United States has in my opinion been particularly guilty in this regard. Therefore, I prefer to call the conclusions about religion spawned by this perception model some form of agnosticism; non-veridical products of the brain's imagination might be at their origin religious-like (lacking in veridical evidence or dream-like or revelatory or hallucinatory) but should never be seen as credible (called epistemologically "true") and worthy of one's faith, belief, and tastes until they are "weighed" against the veridical information coming into the world display screen; and when they can be seen by the individual as credible, then I would ask why call them "religious" at all, but, rather, call them "objective," "scientific," "moral," "good," or "common sense." I suggest this because of the horrendous toxicity with which religion in general and religions in particular are historically shackled.

We do not have to yield to the death of GS1 (When the truth is found to be lies, and all the joy within you dies!); GS2 (Love is all you need, to quote the Beatles instead of Grace Slick) can prevent that, even if our irrational love is not returned. In other words, we do not need the gods and god stories; what we need is the Golden Rule (Jesus – A Keeper [Sept., 2015]). This is my non-veridical “take” on the incredible non-veridical capabilities encapsulated in GS1 and GS2.

Western culture has historically entangled theology and ethics. (There is no better case in point than the Ten Commandments: about half have to do with God and the other half with our relationship to each other.) This entanglement makes the condemnation of theology suggested by this perception model of human ontology an uncomfortable consideration for many. Disentanglement would relieve this mental discomfort. Christianity is a good example of entangled theology and ethics, and I have suggested in Jesus – A Keeper [Sept., 2015] how to disentangle the two and avoid the "dark side" of Christian theology and theology in general.

Ethics, centered around the Golden Rule, or the Principle of Reciprocity, is clearly a product of non-veridical activity, but ethics, unlike theology and fantasy, is balanced with the veridical, in that our ethical behavior is measured through veridical feedback from others like us "out there." We became ethical beings similarly to our becoming religious beings – by responding to human needs. Coyne's book Faith vs. Fact, Why Science and Religion are Incompatible points out that in addition to our genetic tendency (our "nature") to behave altruistically, recognize taboos, favor our kin, condemn forms of violence like murder and rape, favor the Golden Rule, and develop the idea of fairness, we have culturally developed (our "nurture") moral values such as group loyalty, bravery, respect, recognition of property rights, and other moral sentiments we define as "recognizing right from wrong." Other values culturally developed and often not considered "moral" but considered at least "good" are friendship and senses of humor, both of which also seem present in other mammalian species, suggesting they are more genetic (nature) than cultural (nurture). Other cultural values (mentioned, in fact, in the letters of the "Apostle" Paul) are faith, hope, and charity, but none of these three need have anything to do with the gods and god stories, as Paul would have us believe. Still others are love of learning, generosity (individual charity), philanthropy (social charity), artistic expression of an ever-increasing number of forms, long childhoods filled with play, volunteerism, respect for others, loyalty, trust, research, individual work ethic, individual responsibility, and courtesy. The reader can doubtless add to this list. Behaving as suggested by these ideas and values (non-veridical products) produces veridical feedback from those around us that renders these ideas accountable and measurable. (It is good to do X, or it is bad to do X.)
What is good and what is bad is veridically verified, so that moral consensus in most of the groups of our species evolves into rules, laws, and sophisticated jurisprudence (e.g. the Code of Hammurabi and the latter half of the Ten Commandments). The group becomes a society that is stable, self-protecting, self-propagating, and a responsible steward of the environment upon which the existence of the group depends; the group has used its nature to nurture a human ethical set of rules that answers the call of our genes and grows beyond this call through cultural evolution. The irony of this scenario of the origin of ethics is that humans non-veridically mixed in gods and god stories (perhaps necessarily to get people to respond by fear and respect for authority for survival’s sake), and thereby risked infection of human ethics by toxic theology. Today, there is no need of such mixing; in fact, the future of human culture may well hinge upon our ability to separate, once and for all, ethics from theology.

A final example of applying the perception model illustrated by Figures 1 and 2 for this writing is the definition of mathematics. Mathematics is clearly a non-veridical, imaginative product of the human brain/mind; this is why all the equations in Figure 2 need a "dashed" version in addition to the "solid," as I was able to do for the single numbers like "8." But why is math the language of science? Why is something so imaginative so empirically veridical? In other words, why does math describe how the world works, or, why does the world behave mathematically?

Math is the quintessential example of non-veridical ideas rigidly fixed by logic and consistent patterns; math cannot deviate from its own set of rules. What “fixes” the rules is its applicability to the veridical data bombarding the world display screen from the “real” world “out there.” If math did not have its utility in the real world (from counting livestock at the end of the day to predicting how the next generation of computers can be designed) it would be a silly game lodged within the memory loops of the brain only. But, the brain is part of the star-stuff contemplating all the other star-stuff, including itself; it makes cosmological “sense” that star-stuff can communicate with itself; the language of that communication is math. Mathematics is an evolutionary product of evolutionary complexity of the human brain; it is the ultimate non-veridical focus upon the veridical. Mathematics is the “poster child” of the balance of the two fluxes upon the world display screen of every human brain/mind. No wonder the philosopher Spinoza is said to have had a “religious, emotional” experience gazing at a mathematical equation on paper! No wonder we should teach little children numbers at least as early as (or earlier than) we teach them the alphabet of their native culture!

Further applications of the perception model suggest themselves. Understanding politics, economics, education, and early individual human development are but four.

I understand the philosophical problem that a theory that explains everything might very well explain nothing. But this perception model is an ontological theory, which necessarily must explain some form of existence, which, in turn, entails "everything." I think the problem is avoided by imagining some aspect of human nature and culture the model cannot explain. For instance, my simplistic explanation of insanity as a flux imbalance may be woefully inadequate for those who study extreme forms of human psychosis. Artists who see their imaginations as more veridically driven than I have suggested might find the model in need of much "tuning," if not abandonment. I have found the model personally useful in piecing together basic, separate parts of human experience into a much-more-coherent and logically unified puzzle. To find a harmony between the objective and the subjective of human existence is to me very objective (intellectually satisfying) and subjective (simultaneously comforting and exciting). The problem of explaining nothing is non-existent if other harmonies can be conjured by others. Part of my mental comfort comes from developing an appreciation for, rather than a frustration with, the "subjective trap," the idea introduced at the beginning.

RJH

Jesus — A Keeper

My "long and winding road" through three sortings (Sorting Out the Apostle Paul, [April, 2012], Sorting Out Constantine I the Great and His Momma, [Feb., 2015], and Sorting Out Jesus, [July, 2015]) has led to what I personally think is a reliable biography of Jesus (At Last, a Probable Jesus, [August, 2015]).  This suggestive biography was made possible by the application of historical and biblical criticism developed for over a century and in continued development; Jesus' life emerges as the outcome of a considered application of history as a tool of forensic science.  This critical application functioned as an exposé and was expressed by the metaphor of stripping off varnish- or paint-like layers applied over time to an original table top representing the end of Jesus' life, an end agreed-upon by both believers and non-believers alike.  Among what was exposed was the epistemological bankruptcy of faith-based theology that presumably can be found in all religions, not just Christianity.  Jesus' teachings were dual-themed, a theological half based upon the messianic Son of Man and an ethical half based upon one of many versions of the Golden Rule (At Last, a Probable Jesus, [August, 2015]).  I suggest, from the biographical content that "survived" the table-top stripping, that Jesus' theological teaching was, like the theology layered upon the table top to exalt Jesus eventually as part of the Divine Trinity, bankrupt, historically speaking.

The great evolutionary biologist Richard Dawkins suggests religious belief in bankrupt theology is akin to a form of mental illness that spreads like a cultural virus.  My position is a little different, although I do understand Professor Dawkins' point.  The reason religious theology is bankrupt of historical reliability is that theology is a flawed product of the imagination and, therefore, necessarily nonveridical (At Last, a Probable Jesus, [August, 2015]). (In fact, a veridical theology just might be oxymoronic.)  Not all products of the imagination are similarly bankrupt, of course; it is just that the imagination generating theology is similar to that generating fantasy, with little or no correlation with the veridical data bombarding the brain from the "real world."  There is no accountability for theology, just as there is no accountability for fantasies; if one imagines a conclusion in science (including the forensic science of history), it must correlate veridically, correlate with the real world; not so with theology and fantasy.  What makes the nonveridical theology of Christianity (and all the other major world religions) not only bankrupt, but also absurd in a scary sort of way, is that a) it claims truth solely on the basis of faith and b) it originates in minds burdened by chronic stress (At Last, a Probable Jesus, [August, 2015]); there is no way truth can be veridically demonstrated in faith-based theology.  That faith is usually based upon some form of supernatural god or deity, which is by definition beyond veridical verification.  What makes the theology scary and toxic is that there are in the theology veridical or, at least, veridical-sounding punishments concurrently conjured for not accepting the "truths" prescribed by the faith.  (Those who don't believe are going to Hell.)
If those punishments were just expulsion into the group of “them” away from “us” (the “us-them” syndrome discussed in At Last, a Probable Jesus, [August, 2015]), nonveridical theology might not be of such concern, but history has paraded before us example after example of religious wars, purges, pogroms, executions, persecutions, pillaging, and incarcerations (just to list a few) that existed and exist solely upon the basis of nonveridical, absurd, and toxic theology; in Christianity the nonveridical absurdity of Jesus being part of the Godhead has spawned very veridical atrocities (the Crusades, the Inquisition, the Thirty Years War, etc.).  [Voltaire needs to be re-quoted here:  “Those who can make you believe absurdities can make you commit atrocities.”]

[Nonveridical theology need not be toxic, to be fair.  My good friend and retired Presbyterian minister Dr. Jim Burns (he holds the same degree I do, a Ph.D. in nuclear physics; I call him “Rev. Dr.”) and I periodically have a fun and intellectually stimulating discussion on whether the rewards of Heaven are individualized or not.  We do not try to convert each other, we agree to agree or agree to disagree, and nobody gets killed or maimed.]

In theology, blind faith, a conclusion of pure fantasy, is counted a virtue.  If positive feedback is concurrent with blind faith, that is pure coincidence, pure luck.

I apologize to the reader for the above, because it might make the “long and winding road” a little too redundant, a little too long, and a little too winding.  But I want to be sure my position is clearly understood, and I wanted to summarize briefly how we have come to this posting, which claims that Jesus’ ethical teaching, unlike its attendant theology, is more than worthy of keeping.

Jesus’ ethical teaching is centered around the so-called Golden Rule, or the Ethic of Reciprocity, though it is made up of far more, such as the Sermon on the Mount (Matthew, Chapters 5, 6, 7) and the Beatitudes (Matthew 5:3-12; Luke 6:20-22; Luke 6:24-26; Luke 11:37-54).  The Golden Rule is an idea that has been around since the beginnings of history among theologians and philosophers.  It is an idea, however, that appears transcendent of theology itself, in that it does not have anything to do with a god or gods.  It is based upon the idea that it is better to treat one’s fellow human being, whether family, friend, neighbor, or stranger, in the manner you yourself would like to be treated.  It is solidly built upon an individual’s self-interest, for it implies that your kind action will receive an equally kind reaction, making your life easier, better, and, therefore, happier.  Evil treatment among individuals tends to be weeded out with repeated application of the Golden Rule, as it is in the best interest of both parties, unless one or both is a masochist, to be nice to one another.  I particularly like it because it contains its own intellectual and practical motivation — treating someone kindly, respectfully, and courteously is its own reward.  No deity or deities are needed to command you to be good; the East realized the gods were unnecessary for ethics centuries before the West did; the East separated religion and ethics long before the West, which waited until Enlightenment philosophy (18th century CE) to resurrect the ancient Greek separation of the two.  Ethics requires no blind faith; blind faith, because of its susceptibility to toxicity of the mind, is not a virtue in ethics; often, it is to be avoided like a vice.  Positive feedback is not only real in Golden-Rule-based ethics, it often is not long in forthcoming, as in the gratitude of a stranger for whom you have just done a simple act of kindness or courtesy.

[Note the Synoptic Gospel references in the above paragraph leave out quotations from Mark — the earliest and, perhaps, the most “historically honest” Gospel of them all.  The near-absence of humanitarian parables as vehicles of ethical teachings in Mark does not bode well for my ethics-over-theology case, admittedly, but careful scrutiny of Mark reveals that beneficiaries of Jesus’ humane treatment (healings, etc.) needed to be receptive to have their needs met; they needed to have “faith” that Jesus could meet their needs.  (People needed to have the faith of a child, for instance.)  Though cynics might claim that Jesus merely took advantage of the credulous through the power of suggestion, it could just as well be the case that Jesus was by example teaching the importance of “faith in the physician” or “bedside manner” — that following the Golden Rule reaps rewards only when the beneficiary is humble enough to appreciate what is being freely given him/her.  This “receptiveness to kindness” might well be the best interpretation of the Parable of the Sower (Mark 4:1-9; Matthew 13:1-9; Luke 8:4-8).  It was certainly important to Jesus that this parable be understood, as all three of these passages end with Jesus’ exhortation: “Listen, then, if you have ears!”

The “faith” in Mark is faith in one another to do good to each other, perhaps.  Mark, then, is just as ethical as the other Synoptic Gospels.]

Listed below is a litany of religious or religious-like recognitions of the Golden Rule, which apparently has always been in our religious and ethical thinking.  Jesus was but one of many who saw the Ethic of Reciprocity as being necessary and fundamental to an ethical and fulfilling existence.  The list is in chronological order where dates are available, and each quote’s setting is given: if a specific work is cited, the time of that work’s origin is given (if known); if not, then the approximate date of the religion’s or belief system’s origin is given.  Note how the Golden Rule at the center of Jesus’ ethics was also taught in history both before and after Christianity.  The “first” to record the Golden Rule may never be known; as shown in the most recent entries on the list, it is still being recorded, claimed, and cited:

* “Do for one who may do for you, that you may cause him thus to do.” The Tale of the Eloquent Peasant, 109-110 [Ancient Egypt, 1800 BCE]

* “A man should wander about treating all creatures as he himself would be treated.” Sutrakritanga 1.11.33 [Jainism, 9th-7th centuries BCE]

* “Let no man do to another that which would be repugnant to himself; this is the sum of righteousness. A man obtains the proper rule by regarding another’s case as like his own.” [Upanishads, Hinduism, circa 700 BCE]

* “…thou shalt love thy neighbor as thyself.” Leviticus 19:18 [Judaism, 7th century BCE]

* “Hurt not others in ways that you yourself would find hurtful.” Udana-Varga 5:18 [Buddhism, 6th-4th centuries BCE]

* “To those who are good to me, I am good; to those who are not good to me, I am also good. Thus all get to be good.” [Taoism, 6th-5th centuries BCE]

* “That nature alone is good which refrains from doing to another whatsoever is not good for itself.” Dadisten-I-dinik, 94,5 [Zoroastrianism, 5th century BCE]

* “Tse-kung asked, ‘Is there one word that can serve as a principle of conduct for life?’ Confucius replied, ‘It is the word ‘shu’ — reciprocity. Do not impose on others what you yourself do not desire.’” Doctrine of the Mean 13.3 [Confucianism, 5th-4th centuries BCE]

* “If you see a jackal in your neighbor’s garden, drive it out. One might get into yours one day, and you would like the same done for you.” [Bakongo people of the Congo and Angola]

* “The law imprinted on the hearts of all men is to love the members of society as themselves.” [Roman Paganism, BCE-CE]

* “Therefore all things whatsoever ye would that men should do to you, do ye even so to them: for this is the law and the prophets.” Matthew 7:12 [Christianity, 1st century CE]

* “None of you [truly] believes until he wishes for his brother what he wishes for himself.” Number 13 of Imam Al-Nawawi’s Forty Hadiths [Islam, 7th century CE]

* “The heart of the person before you is a mirror. See there your own form.” Munetada Kurozumi [Shintoism, 7th century CE]

* “Do not wrong or hate your neighbor. For it is not he who you wrong, but yourself.” Pima proverb [Native American Spirituality]

* “No one is my enemy, none a stranger and everyone is my friend.” Guru Arjan Dev, AG 1299 [Sikhism, 1699 CE]

* “Ascribe not to any soul that which thou wouldst not have ascribed to thee, and say not that which thou doest not.” “Blessed is he who preferreth his brother before himself.” Baha’u’llah [Baha’i Faith, 1844 CE]

* “Don’t do things you wouldn’t want to have done to you.” [British Humanist Society]

* “Try to treat others as you would want them to treat you.” [L. Ron Hubbard, Scientology]

Here are some versions of the Golden Rule or Ethic of Reciprocity from some famous Western philosophers in chronological order, along with their settings.  These can be considered secular sources, clearly free of a theological context, in contrast to the above list:

Socrates: “Do not do to others that which would anger you if others did it to you.” [Greece; 5th century BCE]

Plato: “May I do to others as I would that they should do unto me.” [Greece; 4th century BCE]

Aristotle: “We should behave towards friends, as we would wish friends to behave towards us.” (This is a restricted version of the Golden Rule, limited only to friends.) [Greece; 4th century BCE]

Seneca: “Treat your inferiors as you would be treated by your superiors.” Epistle 47:11 [Rome; 1st century CE]

Epictetus: “What you would avoid suffering yourself, seek not to impose on others.” [Turkey; circa 100 CE]

Kant: “Act as if the maxim of thy action were to become by thy will a universal law of nature.” [Germany; 18th century CE]

John Stuart Mill: “To do as you would be done by, and to love your neighbor as yourself, constitute the ideal perfection of utilitarian morality.” [Britain; 19th century CE]

Jesus-stripped-of-theology must be what Thomas Jefferson in his The Jefferson Bible was trying to achieve; I like to think, thanks to my long and winding road, I now know why he was striving so.  (Perhaps, too, Jefferson’s friend Thomas Paine was similarly motivated in writing Age of Reason [Prometheus, 1964, ISBN 0-87975-273-4, pbk].)  Jesus-with-just-ethics is worth keeping; it is a blueprint for stress-free, confident, and happy living for Christian and non-Christian alike; as shown above, it is transcendent of theology and, therefore, free from absurdity and, hence, non-toxic.  Anyone can follow the Golden Rule or Ethic of Reciprocity, regardless of what they believe, or don’t believe.  You don’t need a god or gods; you don’t need a guru or teacher to follow.  If you must have someone tell you what to do, then pick out one of the quotes above by a religious leader or secular philosopher and try to live by that quote.

Though the following reference may well be another long and winding road (Don’t worry, I’m not taking it, just noting its “entrance gate.”), the words from John Lennon’s song “Imagine” also seem to “fit in” here:  “Imagine there’s no countries, It isn’t hard to do, Nothing to kill or die for, And no religion too, Imagine all the people, Living life in peace.”  The absurdities and toxicity of political ideology clearly parallel the absurdities and toxicity of theological ideology, thanks to history’s lessons, but to talk about political ideology as has been done here with theological ideology is, as I said, another long and winding road not taken just now.

Theology, compared with secular ethics, has the tremendous disadvantage of apparently needing a church, synagogue, temple, or mosque.  Time and wealth have to be invested in some kind of edifice of worship and veneration.  (However, atheism, which I consider a nihilistic theology, at least does not require an edifice.)  Another disadvantage is the need for a clergy to “run” the edifice.  (Quakerism, of all the sects of Christianity, may be the only group to “get it right” when it comes to a clergy; it requires none, but does need a meeting place.  Atheism also requires no clergy.)  Imagine if all the effort and money that goes into building an edifice and supporting a clergy went instead into meeting human need when encountered, as Jesus taught in his ethics!  Modern organizations such as the Red Cross, UNICEF (I like to think also the Red Crescent), Live Aid, and the Carter Center work to meet human needs with no “theological strings” attached; imagine liquidating all sacred wealth and channeling it into secular causes such as these.  (I’m sure the reader could name other worthy secular organizations in addition to these.)  Theologically-based organizations cannot possibly do as well serving mankind as the secular ones until the costs of the “theological strings” are eliminated.  Sound too fanciful, like theology?  I don’t think so, as these organizations have obtained tangible, veridical results; none of them existed until the late 19th and 20th centuries came along; how many more will emerge in the 21st?

Do not think this is holding up atheism in the place of Christian theology, or, for that matter, theology in general.  As I said above, atheism seems to me to be a nihilistic theology of some sort; atheism seems itself to be too “churchy”; I’ve met atheists who seem as evangelical as Christian evangelists.  One can have a theology or not have a theology, but theology is a personal commitment or belief, not transferable to anyone else.  Dawkins may have a point:  making someone agree with your personal theology, willingly or unwillingly, necessarily fosters the toxic “us-them” syndrome (At Last, a Probable Jesus, [August, 2015]), which is a “slippery slope” to human misery, according to history.  The toxicity of theology spreads from human mind to human mind, like a disease.

I have a personal theology, but feel no need for anyone else to feel and think the same.  In the words of Thomas Jefferson, “I am a sect of one.”  I often express elements of my theology, but never with the purpose of “converting” anyone.  Any agreements or disagreements with my personal theology are purely coincidental and carry with them no necessary consequences; my theology has no “baggage” and no “strings attached.”  I would never “condemn” anyone to a theological heaven, purgatory, or hell; I think the paraphrases of Jesus’ words “Judge not, that ye be not judged…” (Mark 4:24; Matthew 7:1; Luke 6:37) are ethical, not theological.

But in this personal theology, I am intellectually hamstrung.  I have no way of knowing whether or not the nonveridical creations of my mind correspond to the reality that in all my waking hours provides me veridical persuasion that something is “out there.”  I cannot “check out” my theological impressions; I can believe in a religious way, but I cannot know if I am believing in something true, something independent of my mind.  In 1763 Voltaire said, “The interest I have in believing in something is not a proof that the something exists.”  This is the “subjective trap.”  It is impossible to verify in a scientific sense any nonveridical, faith-based theology.  In fact, I have to assume that others are in the same subjective trap, but I can never demonstrate that is the case beyond doubt.  On the other hand, behaving ethically yields veridical feedback, usually from the beneficiaries of my kind and courteous acts — feedback so empirical it seems part of the natural world “out there,” a world so “real” that common-sense reason refuses to allow doubt that it exists outside my skull.  In the same way, the “hardest” of the sciences, along with all the other sciences, like forensic history, do not doubt the existence of the natural world.  All personal theologies, like all knowledge, are fallible.

Therefore, the destinations of my long and winding road here, my conclusions, may be wrong.  For instance, my position on the fate of Jesus after the crucifixion in At Last, a Probable Jesus, [August, 2015] may be shown to be erroneous if the ossuaries found in the Talpiot Tomb in Jerusalem mentioned near the end of Sorting Out the Apostle Paul, [April, 2012] are studied and turn out to be the actual remains of Mary’s and Joseph’s family, including their son Jesus.  But, just as in the sciences, I can live with the possibility of being wrong, of being content with tentative, temporary answers.  The journey never ends; “it’s not the kill, it’s the thrill of the chase.”  I trust that as new historical evidence is revealed, I will draw closer to the truth about Christianity and theology in general than I am at the time of this writing.  The great physicist Richard Feynman in a BBC interview stated, “I can live with doubt, and uncertainty, and not knowing.  I think it’s much more interesting to live not knowing than to have answers which might be wrong.”  Revealed religion, with its nonveridical theology based upon the quicksand of faith-based epistemology (as opposed to evidence-based epistemology), declares itself true and demands of believers unquestioning belief in that declaration; I say revealed religion offers no reason for either the declaration or the demand.

 

Evolutionary psychology suggests that anything that definitively marks human existence must at some time in our evolutionary past have been beneficial to our survival as a species.  Perhaps this explains why we as a species are so “religious.”  The serious study of the origins of human religion has, relatively speaking, only just begun.  Given the potential of lethal religious theology spawned by our minds, progress in this study seems not only needed, but eminently vital.

We must “own up” to the possibility in cultural anthropology that without the development of some form of theology that went unquestioned by the “believers,” preservation of “us” from the attacks of “them” on the other side of the hill, human or beast, would not have been possible; the “us-them” syndrome may have played a vital role in our survival.  “Don’t ask why, just have faith, praise god or the gods, and grab a spear or knife!” “Us” needed to have a vision beyond our visible, tangible leaders to conjure sufficient communal courage to meet the challenges of our hunter/gatherer past; we needed gods and “god stories.”  Communal bonding and identity developed around some local form of religion.

All who survived to the dawn of civilization, then, were probably predisposed to be religious.  Not that we had “religious genes,” but, rather, our genes worked in concert to make us tend to be religious.  As civilization grew from city states into nations and into empires, religion grew and consolidated into state or world religions.  Lethally, religions never lost their “us-them” syndrome.  Because Voltaire’s words above are so true (both quotes), we have molded our once-upon-a-time survival tool into potentially murderous madness.  We have met the enemy and it is us, or, rather, our gods and god stories. (Not to mention our nationalism, patriotism, and politics.)

But we who survived to the dawn of civilization were also probably predisposed to be ethical — to love one another and to treat each other with kindness and courtesy.  Ethics was at least as responsible for our survival as religion.  The Golden Rule never lost its value, and someone, somewhere, always recognizes that it and its implications can, if we will, trump the gods and god stories, if for no other reason than ethics is not potentially toxic like religious theology.  Ethics fosters no “us-them syndrome.”

Whether Jesus separated in his head theology and ethics as exemplified in the two preceding paragraphs may never be known.  What is known about Jesus is that there was a duality about his message.  The theological part of his message has not turned out so well in the modern world, just like the theologies of other world religions.  But the ethical part of his message resonates with the best that human beings can be in the modern world.

For almost three hundred years, enlightened rationality in lots of free, courageous minds has boldly separated the sheep from the goats, the theology from the ethics, the sacred from the secular.  Civilization’s philosophy may well need to redefine a “Great Commission.”  Instead of going out and teaching or conquering all nations, we need to go out and just be decent to each other — to live, in Lennon’s words, “life in peace.”

RJH

 

At Last, a Probable Jesus

After three successive sortings, Sorting Out the Apostle Paul, [April, 2012], Sorting Out Constantine I the Great and His Momma, [Feb., 2015], and Sorting Out Jesus, [July, 2015], it is now possible to recontextualize the biography of Jesus with some degree of historical reliability.  What distinguishes this rebuilding is that it is fact-based, utilizing the modern forensic science of history, rather than faith-based, as are just about all religions, creeds, and belief systems.  That is, it is based upon historical facts (defined by communicating and debating historians) as close to “consensually factual” as biblical criticism can come; it is not “hard” science, and it will change in the future as new historical and archaeological evidence emerges.  (Even “hard” science is not “chiseled in stone”; it, too, can change in the future as new researched evidence emerges.)  To make this recontextualization plausible, I have inserted into the historical consensus (in the paragraphs in italics) my own personal “take” on what-happened-when-and-why to give the biography a flowing narrative without, hopefully, weakening rational plausibility.

To those believers and non-believers who think this whole intellectual exercise is “throwing the baby out with the bath water,” that I am losing the essence of Christianity by ignoring most of Christian theology and the tenets of Christian faith, I can only remind them of the quicksand that is the epistemology of faith.  If truth is purely faith-based and comes by miraculous, indemonstrable revelation, then one theology/faith cannot be shown to be more truthful than any other theology/faith; a believer can believe literally anything; one can put faith in literally anything.  (See the example of the Flying Spaghetti Monster “religion,” or FSM, in Sorting Out the Apostle Paul, [April, 2012].)  From the outset of my Christian experience, from my early independent thoughts on Christianity, I thought the strength of Christianity was in its historicity, not in its spirituality.  The zeal of non-Christian faith is as strong as the zeal of Christian faith, but if history could be brought to bear witness to the claims of Christianity, faith in Jesus would seem to have a “leg up.”  So, instead of immersing myself in Sunday School lessons on Paul’s letters, or coming up with mental or verbal personal testimonies about Jesus’ sacrifice for our sins anew every Easter season (and every Sunday, for that matter), I looked into the historical case for Christianity (Sorting Out the Apostle Paul, [April, 2012]).  The three sortings listed above are summaries of this personal case study, summaries that have brought me to suggest a probable, plausible story of the life of Jesus.

Studying the historical origins of Christianity is nothing short of a startling revelation in its own right.  The nuances of this shocking realization vary from person to person, I’d surmise.  For many who grew up in the Church and have had this surprising revelation, fewer years were probably required than were for me.  Though I do not regret all the time I spent in church pews (I learned a lot.), it is now not surprising to me why congregations’ attention is drawn neither toward the historical origins of the denomination nor, especially, toward the historical origins of the faith itself.  Congregations are drawn instead to focus upon community services and/or increasing the church membership, while being told the egregious “tall” tale that they are behaving like “the” single Church of the 1st century CE; in contrast, questions like those I asked (Sorting Out the Apostle Paul, [April, 2012]) lead to the ludicrous centuries-long defining of who Jesus Christ was (Sorting Out Constantine I the Great and His Momma, [Feb., 2015]) and to the realization that Christianity has very little of the historical reliability I originally thought it had (Sorting Out Jesus, [July, 2015]).

The third sorting (Sorting Out Jesus, [July, 2015]) was made quick work of thanks to Bart D. Ehrman’s recently published book How Jesus Became God, The Exaltation of a Jewish Preacher from Galilee.  This work not only spelled out the contemporary consensus concerning Jesus reached by biblical criticism and archaeological studies, it also suggested to me the “layer” metaphor and analogy, wherein each alteration and addition to Jesus’ life could be seen as an obscurantist layer painted upon the surface of a table top that represents the historical situation at the time of Jesus’ death.  What the Church describes as Jesus Christ revealed in the Scriptures is actually a “trumped-up” Jesus exalted and defined well beyond anything he himself intended.  In fact, Jesus was exalted during the latter half of the 1st century CE (after his death) eventually to the divine status of Son of God using the same “blueprint” that exalted another historical character, Apollonius, during the same time span (Sorting Out Jesus, [July, 2015]).  The “layers” prevent us from seeing the table top; Jesus has over time been victimized by “bad press.”  I am not throwing out the baby with the bath water; the baby was taken out before the watery tossing, or was never in the bath water to begin with.  The reliable essence of Jesus is not in the theology of his exaltation or in faith in that theology.

The third sorting stripped all the layers above the table top down to just the original surface, a surface on which we can have some degree of historical confidence.  All we find on this stripped surface is i) a trial before Pilate, ii) an execution by crucifixion, and iii) a claim Jesus rose from the dead.  Now we need to work underneath the surface of the table, aware that much of the layers now stripped from above the surface had to do with Jesus’ life and ministry, but realizing there may be many layers below the table top needing to be stripped also.  What remains of Jesus’ life from birth to death that has historical credibility?  This posting is the answer to that question from my point of view, resulting in a believable, plausible, and probable Jesus  — my recontextualization of Jesus.  As in Sorting Out Jesus, [July, 2015], I shall attempt to employ the three criteria of “stripping” that determine what biblically is reliable and what is not:  1) as in The Jefferson Bible, ignore claims that smack of fanciful hyperbole and that appeal to those of strong credulity, 2) keep matters that orthodoxy finds problematic but had to leave in so as not to be charged with incompleteness, and 3) favor that which requires fewer, rather than more, assumptions; tend to select the simple as opposed to the nuanced and/or the confusing.

(Remember, the regular text is close to historical consensus; the italic text is my personal speculation that gives, hopefully, reasonable flow among the reliable events of Jesus’ life.)

 

It seems to me the life of Jesus went something like……..

Jesus, obviously conceived out of wedlock and born in Nazareth, had a very understanding and supportive mother in Mary.  Joseph married her, saving her reputation from being tarnished even more, and together the couple had four boys and three girls as Jesus’ younger siblings.  To arm Jesus against the social stigma of being a bastard, Mary doted upon him, which resulted in Jesus standing out in comparison to his brothers and sisters as a precocious child.

Mary was like an early version of a “stage mom,” paying particular attention to developing self-confidence in her eldest son.  She probably indulged his every inclination, giving him a sense of being “special” at a very early age.  Originally intending her son to be immune from the social scarring by the label of “bastard,” she found Jesus developing into a child with an early sense of purpose — almost a prodigy of early maturity.  [Incidentally, to me the only redeeming features of Mel Gibson’s 2004 film The Passion of the Christ were the “flashback” moments of Mary remembering the moving and tender mother/child times she had had with her son, who was being condemned to death.  Otherwise, I thought the film was anti-Semitic Hollywood hyperbole that recklessly added to scripture and stretched the limits of credulity (e.g. anyone who bled that much would not have the strength to carry a cross to Golgotha).]  Perhaps his father (adoptive father?) Joseph did his part in raising Jesus as a “special” child.

To say Jesus was gifted would be an understatement; he was observant, introspective, and reflective.  The closest thing he had to a formal education was the teaching he received at synagogue, conducted by rabbinic Pharisees.  He became fascinated by at least two social issues playing out before him: a) the inhumanity of applied Mosaic law, especially in everyday domestic situations, and b) the prevailing apocalyptic view that God was going to intervene to deliver the Jews from the oppression of the Roman Empire.  He gathered a group of followers who became his disciples, each less educated than he; he was the teacher, the Master, regardless of his age relative to each; Mary had developed his sense of worth well.

By observing the Pharisees, the local Roman officials, and the local Greek intellectuals, he mimicked their leadership skills, following his natural tendency to “stand out,” his desire to be noticed.  He appealed to the less educated of his peers, who also were fascinated with the uncomfortable inconsistencies of Jewish common law and with the idea that, as God’s “Chosen People,” the Jews would be relieved of their Roman masters by divine intervention.  Both Jesus and his chosen twelve found it easier to walk about in critical commentary on the social ills and myopia all around them than to stay in the “binding obligations” of working responsibilities at home or in the demanding vocations in which they had found themselves; for the disciples, the audacity of the charismatic Nazarene, with whose family they were probably familiar, gave them a release from the work in front of the rest of their lives.  Jesus and his disciples were very much like Socrates and his “pupils,” save Jesus was not blind; to any Greek-cultured observer, a young “teacher” speaking to an attentive band in parables must have appeared pretty normal, in a Hellenistic sort of way.

Not only did Jesus discover that the vehicle of parables was an effective way to communicate with his unsophisticated and even illiterate audiences (including the disciples), he also found parables a great way to deflect direct attacks upon his apocalyptic agenda or upon his lessons in social mores.  Parables require interpretation, which can utilize reflection in an obscurantist manner.  Such reflection became in time necessary, as his apocalyptic, messianic teaching about the coming of the Kingdom of God registered with Roman authority as potentially seditious, and his teaching of loving one’s neighbors, transcendent of religious laws of conduct, smacked of replacing the teachings of rabbis, Pharisees, and Sadducees with a “higher authority” of his own.  Jesus and his entourage of twelve were not always received well, and Jesus’ family grew concerned for his safety (Mark 3:31-35; Matthew 12:46-50; Luke 8:19-21).  Jesus, the twelve, and Jesus’ message were rejected in his home town of Nazareth (Mark 6:1-6; Matthew 13:53-58; Luke 4:16-30).

If Jesus’ family members had been in agreement with his teaching, that would have been emphasized in the Gospels.  Showing up out of concern could have indicated the family actively encouraged him to cease his teaching and return home.  And their concern was understandable, as I think Ehrman in How Jesus Became God, The Exaltation of a Jewish Preacher from Galilee is correct that there were in the first century CE three different versions of the Messiah in Jewish culture, and Jesus was linking himself with one of the three:  1) the religious-political Messiah, often called the Son of Man, who would come as God’s “right-hand man” to usher in the Kingdom of God by judging all of mankind — God’s prosecutor, in other words; 2) the “Temple-centered” Messiah, or God’s chosen “high priest,” who would make things right from (probably) the Temple at Jerusalem; and 3) the political Messiah, who would by Maccabean-like rebellion literally restore the line of David as the independent kings of God’s Chosen People.  Jesus was linking himself with 1), in all likelihood seeing himself as the Son of Man (Matthew 19:28).  He clearly was not an official rabbi, nor Pharisee, nor Sadducee; as a non-priest, therefore, he could not be preaching a type 2) Messiah.  Because of the obscurantist nature of his open-to-interpretation parables, he was most probably misinterpreted as seeing himself as a type 3) Messiah, which played a large part in his fatal condemnation.  As for his teachings on moral conduct, they smacked of a humane Epicureanism and Stoicism that had him calling the likes of Pharisees “vipers.”  Neither synagogue nor Roman authority could be pleased with what they were hearing from Jesus.

Personally, I like to think his Golden Rule-based humanism drove his dual-themed message, rather than the messianic, apocalyptic heralding of the coming of the Kingdom driving the double-headed agenda.  At first glance the two messages seem difficult to reconcile and harmonize, but I am inclined to think he used the widespread notion of God's intervention through the Son of Man as the "banner" around which to gather listeners, who were then regaled with the message of treating each other with love and respect, a not-so-bad idea even for the bloody-minded rebels wanting to throw off the yoke of Rome, if God's judgment through the Son of Man was inevitably coming.  You want to appear kind to your own, at least, even if the blood of your enemies stains your hands doing what you believe to be ultimately God's will.  Despite his mixed messages of bringing a sword and leaving the family, his teachings emphasized peace and pacifism.  Of course, this is just my opinion; perhaps Jesus lived his entire ministry without ever reconciling the two themes in his head; perhaps many turned away from his teachings because they could not see the possibility of such reconciliation.

Perhaps Jesus was emboldened by the similar ministry of John the Baptist.  Perhaps Jesus was at first a follower of John's teachings.  Whatever their relationship, and regardless of whether John baptized Jesus or not, Jesus' ministry was shaken by the beheading of John.  When the tetrarch of Galilee, Herod Antipas, had John executed in about 30 CE over John's moral condemnation of Herod's taking of Herodias as his wife, Jesus and his disciples tried to "lay low" for a while and keep a lower profile (Mark 6:31, Matthew 14:13, Luke 9:10).  But his followers (undoubtedly joined by followers of John) would not allow this public absence; the vacuum of need left by John's death had to be filled by Jesus, and he could not resist filling it.

The story of John the Baptist and the similarity of his and Jesus' teachings strongly indicate that such teachings were the "talk" of many more self-motivated prophets of apocalyptic doom and/or love-over-law than just John and Jesus.  It is highly probable Jesus saw John's death as God's sign that he was God's choice to pave the way for the coming of the Kingdom; the "torch" was passed to him from John as part of God's will; it is possible he began to believe he was destined to be the Son of Man in that Kingdom as a part of this divine sign.  If such a belief came to him, whenever it came, it turned out to be fatal for Jesus.

By the time the Passover of about 33 CE came, Jesus was emboldened by his followers, his disciples, and his acquired self-perception to go to Jerusalem, the "capital" of Judea, the political and religious center of the Jews and the heart of Roman power over the Jewish state.  Clearly, Jerusalem was the place where application of his dual-headed message would have the greatest, most far-reaching effects.  His fame had grown beyond his control, even to his being an advocate for women (a revolutionary idea for that society), as personified by his close relationship with Mary Magdalene, who in effect had become his closest feminine disciple as well as the thirteenth disciple added to the original twelve.  He was figuratively swept into Jerusalem by a destiny of his own making, all the while being scrutinized more and more by both Jewish and Roman authorities as his fame grew: a "dual watch" reflecting his dual message, which spawned dual suspicions of blasphemy and of subversion.  Knowingly or unknowingly, he was trapped between the two "horns" of his teachings.

Despite what the Church did to expunge Mary Magdalene from the story of Jesus, I think it highly probable not only that she was the closest of his disciples, but that they had a sexual relationship; she was the "beloved disciple," not John.  That Dan Brown's novel The Da Vinci Code was based partly upon historical evidence (2nd century CE gospels relating how Jesus taught Mary Magdalene beyond what he told the disciples, who were jealous of her) indicates her importance in Jesus' ministry (March 2012 National Geographic).  Her clear importance to the origin of Christianity is discussed below.

My inclination is to think Jesus could not have anticipated what was going to happen to him and his disciples in Jerusalem.  He was not the only “troublemaker” attending Passover that year, given the atmosphere of religious reform and political rancor at the time.  It does seem obvious to me that between the time of John’s execution and this trip to Jerusalem, he gained some sense of direction for his ministry, even if only from following the suggestions of the most vocal of his growing followers.  Perhaps he believed his own “hype,” or perhaps not.  Surely, he was too famous to run away and hide, even if he wanted that.

The "cleansing of the Temple" of the moneychangers (Mark 11:15-19; Matthew 21:12-17; Luke 19:45-48; John 2:13-22) was pivotal.  Whether from careful, calculated planning or from a moment of uncontrolled anger, or from something in between, Jesus suddenly acted as if he had the religious and spiritual authority to bring about reform through a rebellious and revolutionary act.  No amount of calm teaching in the Temple before or thereafter, actual or added by the Gospel writers, could soften the apparent fact that he had given his detractors the excuse they sought to bring his ministry to a close.  What was sold to the religious authority as blasphemous behavior in the Temple of the Lord was sold to the civil authority as disturbance of the peace at the absolutely most socially volatile time of the year (Passover).  The religious Jews had their moment of "heresy" to pin upon him, and the Romans had their moment of "sedition" or "revolt" to pin upon him.  He was taken into custody because of the two "horns" of his ministry.  His fate was sealed.

No amount of drama added by the Gospel writers to the time between the driving out of the moneychangers from the Temple and his arrest (e.g. the Last Supper) could rationalize his rash act at the holy site.  Attempts to do so resulted in convoluted conflicts within Jesus, in which he knew he was going to die for divine reasons, yet was full of human apprehensions.  Good drama, but hardly reliable history.  It is more likely that, as he felt the "heat" of public exposure of the incident, he placed his hopes in his actions in the Temple being lost among many similar incidents by other "troublemakers" during the tense chaos of Passover.  Had it not been for the betrayal of Judas, those hopes might have been well placed.

Ehrman, in How Jesus Became God, The Exaltation of a Jewish Preacher from Galilee, found the betrayal of Jesus by Judas difficult to understand.  I think it is easy to understand if you see Judas as a disciple who followed Jesus primarily because he believed Jesus to be a type 3) Messiah.  [Long after the betrayal, the authors of the Gospels of Matthew and Luke ironically fed the association of Jesus with a type 3) Messiah from the line of David by awkwardly and inconsistently listing Jesus' ancestry; the two lists don't agree (Matthew 1:1-17 and Luke 3:23-38).]  When Jesus showed himself not to be a political revolutionary starting a revolt against Rome, as some angry moments in Jesus' teaching had early on indicated to the mind of Judas, but, instead, showed himself a religious reformer of some sort when he purged the Temple, that was the "last straw" for Judas.  Probably reading into the obscurantist "lessons" of the parables his own signs and wishes for a political revolt, Judas, when considering the Temple incident, "snapped" and acted out against Jesus, out of anger at himself and his own years-long self-deception.  How could Judas, all that time, not have seen that Jesus was a "religious nut," and not the clever political firebrand Judas believed Judea desperately needed?  Judas realized Jesus was a pie-in-the-sky guru, not Spartacus.  Because he could not forgive himself for his own shortsightedness, Judas punished Jesus for not being the Messiah in whom Judas had placed his hopes and dreams.

As I mentioned earlier, had Judas not betrayed Jesus, there may not have been an arrest at all.  Internally, Judas was a firebrand rebel yearning for bloody revolution.  Yet he was simultaneously a natural-born follower, a hanger-on willing to invest his life in anyone who would take up the responsibilities of the dangerous causes he happened to believe in, because he did not have the fortitude to risk the danger himself; externally, Judas was a coward.  Unfortunately for Jesus, Judas did have the fortitude to act on his petty, selfish anger instead of slinking off into the obscurity of his disgruntlement: he betrayed Jesus to the authorities.  Whether true or not, the suicide of Judas had to be the end of his story to placate the pious hounds writing the Gospels and howling for justice.

At last, the arrival at the "table top" from the bottom has come, a table top consisting only of i) a trial before Pilate, ii) an execution by crucifixion, and iii) a claim that Jesus rose from the dead.  Clearly, many layers like those we saw on top of the table (Sorting Out Jesus, [July, 2015]) needed to be stripped from the bottom to arrive at a top that looks almost the same from both directions; now the Passion Week needs to be stripped so that the bottom view is the same as the top view.

The heart of making historical sense of i), the trial, is to understand why it was important to the Christianity being spawned near the end of the 1st century CE, decades after Jesus had died, to place the blame for the death of Jesus on the Jews and not the Romans.  This, despite the fact that Roman authority had power and responsibility over life and death at Jerusalem at the time; the exploitation of Judea by Rome demanded nothing less.  Recall that the Gospels were written during and just after the Jewish revolt against Rome of 66-70 CE, the one that culminated in the siege at Masada.  It was dangerous to speak or write ill of the winners' (Romans') actions at this time or at any time prior.  Conversely, it was politically expedient to speak or write ill of the losers (the Jews).  Moreover, despite the resistance the first Christians met in the Roman Empire (Paul's letters), spreading the Gospel of Jesus had nothing to gain by speaking ill of the Empire regarding the death of Jesus; Christians benefited from being seen at worst as neutral when it came to the Empire.  This conciliatory policy toward the Romans paid huge dividends when Roman persecutions of the Christians ceased and Christianity became the favored religion of the Empire under Constantine I (Sorting Out Constantine I the Great and His Momma, [Feb., 2015]); with Constantine, Christianity "crushed" other religions in the Empire like the Romans crushed the Jewish revolt of 66-70 CE.

But complete alienation of the Jews within Christianity was not going to work, for at least three reasons.  First, Jesus was a Jew.  What would it look like if Jesus had started something contrary to his own people?  No matter how non-human and divine Jesus became in the minds transforming and exalting him into the Christ, he was a man in a particular ethnic and cultural group, making him appealing and fascinating to Jew and Gentile alike.  Second, if Christianity was to become truly universal and embracing of all, the opportunity for Jews to convert to Christianity must always be available.  Third, Jesus' cultural origin as a Jew was so well known that any condemnation of the Jewish people by Jesus or his Jewish disciples would make the Christian movement seem traitorous to its roots; complete alienation of the Jews by Christianity would make Jesus look like a Benedict Arnold, something the antisemitism of later times "conveniently" overlooked.

Therefore, the Gospels of Mark, Matthew, Luke, and John inserted into the Passion Week stories of Jesus' arrest, trial, and sentencing a castigation of Jesus' own people in the capital, placing the ultimate blame for Jesus' fate upon Jewish authority in Jerusalem (the Sadducees, the Pharisees, and the Sanhedrin, the Jewish high court).  Pilate, other Roman authorities, and the Roman soldiers who carried out the sentence seemed "reluctant" (Pilate washing his hands of the whole affair) to carry out the desires of Jewish authority.  Yet not all Jews in Jerusalem wanted Jesus' blood; the story of Joseph of Arimathea (a member of the Sanhedrin?) providing a tomb and retrieving Jesus' body is a case in point; the tradition that Nicodemus assisted Joseph of Arimathea with the body is another.

More likely, in my opinion, Pilate saw in the condemnation and crucifixion of Jesus the opportunity to publicly demonstrate no toleration of disruption-leading-to-insurrection-leading-to-revolt, especially in the highly-charged atmosphere of Passover.  The Sadducees and the Sanhedrin saw the opportunity to rid themselves of still another religious troublemaker similar to John the Baptist, who, unlike John, was right in their midst drawing too much attention; they also saw in their assent to the execution the opportunity to demonstrate they were officially against the fomenting of anti-Roman sentiments, even religious ones, at least at that juncture. 

I think blaming the Jews as being at least partly culpable for Jesus’ death set up the template that guided Christianity centuries later into the horrors of antisemitism.  Not that the writers of the Synoptic Gospels and Acts were blatantly antisemitic; these writers, in their evangelistic zeal, did not and possibly could not foresee hellish interpretations of their writing; rather, as I said earlier, later antisemitism blatantly overlooked Jesus’ Jewishness.  As Jesus became the Christ, Jews became “Christ killers” in the twisted minds of antisemitism.  Even more horrifically, as Jesus became Jesus Christ, the divine Son of God, and one of the Trinity, Jews were seen as turning their back upon their biblical God to become “God killers.”  That logic leads to the conclusion that Jews do not deserve to live, at least not to live well, and this became the theme of European antisemitism, culminating in the ultimate, unthinkable crime against humanity history calls the Holocaust.  I urge all readers to visit the Holocaust Museum in Washington D.C. (if they cannot visit the Nazi death camps in Europe or memorials in Israel) to be reminded of the truth of Voltaire’s words: “Those who can make you believe absurdities can make you commit atrocities.”  History has shown religious and/or political power conjures the most inhuman absurdities spawning the most genocidal atrocities.

Curiously, as pointed out by Simcha Jacobovici, for crucifixion to have been allegedly so widespread throughout the Roman Empire, there is today remarkably little archaeological evidence for it.  Nor is there extensive contemporary writing on the process itself, beyond mention of the great numbers of victims.  There could be many explanations for this interesting phenomenon.  Perhaps, during the centuries of the Christianized Roman Empire, most believers were illiterate and extremely credulous and could be persuaded that any piece of ghoulish evidence from a crucifixion, like a nail in an ankle bone, was from Jesus' crucifixion; Christian relics of all sorts were "big business," and in some cases still are.  For literate believers, perhaps writers of the Empire found better subjects about which to write than the preferred method of executing criminals.  I prefer an explanation suggested by Ehrman (How Jesus Became God, The Exaltation of a Jewish Preacher from Galilee): crosses, nails, and ropes were used and reused, and the body was traditionally left to whatever carrion eaters were available; there was no need to bury crucified bodies, as whatever was left was thrown to the dogs; all traces of the crucified body usually disappeared, erasing the crucified from memory in time.

Hence, ii), Jesus’ crucifixion, was recorded well after the fact, replete with all kinds of circumstances attesting that there was a complete body to bury — his early death on the cross, Joseph of Arimathea, etc.  Most of Christian apologetics are rationalizations of the undeniable fact that Jesus was executed as a criminal.  If the gospel writers could have gotten by without talking about the crucifixion, they probably would have done so.  Of course, they could not do that, as Jesus’ execution fills his story with drama and pathos, and if anyone knew anything about Jesus, it was his death on the cross.  Think of how his disciples, minus Judas, must have felt, not to mention his family and Mary Magdalene.  There was every reason to believe the ministry of Jesus was over.

Jesus’ ministry was over.  I think his body disappeared just as those of other crucifixion victims.

Perhaps in as little as days after the crucifixion (the Friday-to-Sunday tradition), some version of iii) apparently came about: the belief that despite Jesus dying, he was now alive, having been resurrected from the dead.  There is no need for creative scenarios like Hugh J. Schonfield's The Passover Plot (Bantam paperback, 1966), in which Jesus was drugged to appear dead; believers and non-believers agree that he died.  To be resurrected intact, there needed to be an empty tomb for the Gospel writers.  At the time, however, I agree with Ehrman that there needed to be no tomb at all, much less an empty one.  All the stories of the resurrected Jesus appearing to individuals and to crowds, including the story of doubting Thomas, were added decades after the crucifixion as scaffolding to "hold up" belief in the resurrection.  By the time they were added, chances were that most who could have protested the stories' authenticity were in no position to be skeptical, as they were among the multitudes victimized by the revolt of 66-70 CE; the writers of the Synoptic Gospels and Acts had few critics to worry about.

So, how did iii) become the saving idea of Jesus' following?  How did the belief that Jesus rose from the dead ultimately make possible the "layering" of theology upon theology to create what became Christianity with all its attendant Christology?  Ehrman makes a strong case that "the resurrection idea" survived and "took off" because of just three "visions" or "appearances" of the resurrected Jesus: three separate visions before the three who just might be the bedrock creators of Christianity, namely, Peter, Paul, and Mary Magdalene.  (Just remember the '60's folk group Peter, Paul, and Mary of "If I Had a Hammer" and "Blowin' in the Wind" fame.)  The appearances before the disciples/apostles Peter and Mary Magdalene [Remember, I consider Mary Magdalene Jesus' closest disciple, and therefore an apostle; Paul should not be the Apostle Paul, in my opinion (Sorting Out the Apostle Paul, [April, 2012]), as he was not one of those who lived with Jesus during his ministry.] are the foundational or "immediate" visions, as Paul's (a result of head trauma due to a fall off his ass, trauma to a mind already traumatized by guilt over persecuting early Christians as Saul) came well after Jesus' alleged ascension into heaven.  [Paul's is important because it "kickstarted" Paul's ministry to the Gentiles, the universal theme of Christian theology.  As I have said, a case could be made that Christianity would be more accurately called "Paulianity." (Sorting Out the Apostle Paul, [April, 2012])]

As we learn more about how the human brain works, studies in the late 20th and early 21st centuries have given us insights into hallucinations, visions, and dreams.  Combining Ehrman’s terminology with my own recently-developed ideas on human perception, the following model can easily account for what happened to Peter, Paul, and Mary Magdalene:

What we perceive, what we "see" with the "mind's eye," is the combined product of empirical data coming to us from our five senses, from the real world "outside," and manufactured concepts and ideas from our mind's world "inside."  These "inside" concepts result from processing the empirical data from the outside, "digesting" them back into our perception through simplifying rules of pattern recognition and algorithms.  Thus our perception is part "in your face" outside world and part "made up" inside world.  The ratio of outside to inside is probably different from individual to individual and from moment to moment.  Perception from outside empirical data is called veridical (based upon the "real world"), and perception from inside the workings of the mind is called nonveridical (NOT based upon the "real world").  Human hallucinations and dreams are seen as nonveridical, and I agree with Ehrman that the three visions that ultimately made Christianity possible were nonveridical; the three had the experience of "seeing something," but what they saw was not from their immediate surroundings; they thought they saw and heard Jesus, but if others were nearby, as in the case of Paul's vision, those others would not have seen and heard Jesus.  (Incidentally, the nonveridical properties of the brain are not all negative, as they are necessary to generalize and organize the flood of empirical data bombarding our senses; without them we probably would have gone mad and not survived dealing with the unfathomable amount of data from "out there."  Moreover, our subjective imaginations are all nonveridical; ironically, any critique, including this one, is a nonveridical enterprise.)

Nonveridical visions are known to be associated with times the person is under stress.  Peter (who denied Jesus after his master’s arrest) and Paul (who persecuted Christians when he was called Saul) were both racked with guilt when they had their visions, and Mary Magdalene was racked with grief when she had hers, as she had just lost the most important person in her religious and personal life.  Who was the most consistent witness to the “empty tomb” across the Synoptic Gospels?  Mary Magdalene.  Why was an actual empty tomb not necessary?  She, at the time of Jesus’ death, was the most important and credible disciple (Some historians want to recognize her as the first evangelist, not Peter or Paul.); if she says she saw Jesus, then there is no need to find an empty tomb.  Who in their right mind among the believers would doubt her?  She, having had a vivid nonveridical experience, certainly believed she had had a visitation from him.  The same could be said about the credibility of Peter’s nonveridical visitation.  The scenes at Joseph of Arimathea’s lent tomb and stories like that of doubting Thomas were later added to the resurrection story by the gospel writers, as questions were asked over the centuries.  Questions like, “Why did the resurrected Jesus only appear to two of his disciples, and not the other ten?” (13 disciples, minus Peter and Mary Magdalene, and minus Judas equals ten).  The perception model I’m presenting here would answer, “Because those ten did not have nonveridical visions of Jesus after he died.”  (Moreover, Peter and Paul were dead by the time the Gospels were written, and probably so was Mary Magdalene, as tradition places her death about 100 CE, which would give her unusual longevity for a woman of that time.  Even if she were alive, lucid, and knew about the Gospel stories of her and the empty tomb, why would she as an evangelist repudiate them?)  
As it turned out, the empty tomb "layering" became accepted Christian tradition, looking as credible as "layers" like Jesus' plethora of post-death appearances and his ascension.  Besides, should anyone reading the freshly-written Synoptic Gospels have wanted to "check out" the empty tomb, probably nothing resembling such a tomb could be found, due to collateral damage from the revolt.  Those who believed in the empty tomb had to take the writers at their word.  About three hundred years later, Constantine's Momma, Helena, gave the newly-empowered Church an actual tomb site to venerate (Sorting Out Constantine I the Great and His Momma, [Feb., 2015]).

Almost any introspective person, I suspect, has had nonveridical experiences of their own.  I will mention three of mine:  a) Just recently, my son and I were playing a 3-hole washer game in his back yard and had to stop due to darkness.  In the twilight we had to search for a lost, dark-colored washer on the lawn; we finally had to get a flashlight to find it.  During the search I saw, in the dark, indistinct patterns of the grass, doughnut-like impressions, as if they were the washer we were seeking.  Clearly, the impressions were not really there.  b) Several years ago, I had a vivid dream about my long-dead maternal great uncle.  It must have been the dream right before I awakened, as I awoke remembering clearly details about his appearance and especially the words he had spoken to me in the dream.  I realized right away why primitive societies developed some form of ancestor worship; in a primitive society I might well have declared to my family and neighbors that my great uncle had visited me the night before.  I would have authenticated my "visitation" with the words I was told, which probably would have resonated with those listeners who happened to have known him.  If I and others had additional such "visitations" suggested by this first experience, then my great uncle might have joined our local "pantheon."  c) The third example is the most serious one, as it illustrates the possible nonveridical origins of even dangerous absurdities.  Sometime in the early grades of my schooling, when I was just old enough to think I could be an independent helping hand to my parents on the farm/ranch they owned, and I now own, I got separated from my parents on a cold, drizzly winter's day in the woods as we spread out from the barn trying to locate the herd of mohair goats we were running on the ranch then.  Being temporarily lost placed my mind under stress almost leading to panic, but I was too proud to shout out my location.
I moved to where I thought the barn and truck were, and sure enough the sight of a barn with truck appeared before me, complete with the stock tank dam on which I loved to play.  But I was so stressed over being lost that I convinced myself this was NOT our barn, truck, and tank, probably because I saw neither goats nor parents.  I turned around and went back into the woods!  I had found safety but convinced myself I had not; I was not lost anymore, but I thought I still was.  Luckily, I soon encountered in the woods some of our goats and then my parents; as we herded back in the direction from which I had come, I now saw the barn as our barn, confused as to why I had seen it so differently earlier.  c) could have been a dangerous situation for me, had I really gotten lost and backtracked on a larger piece of land beyond earshot of my parents.  I had made an absurd interpretation of my perception a fact in my head, thanks to nonveridical capabilities.

Now that the table top has been made to look the same, top and bottom, this streamlined, plausible biography of Jesus still allows that the emergence of Christianity, long though it took, from such meager yet credible historical sources is quite remarkable.  That does not make the theology of Christianity true, however.  If the theology is so bankrupt, then why has the Church "hung on," say, from the days of the Enlightenment and the establishment of biblical criticism?  Part of the answer is that Christian congregations have been "cocooned" from historical skepticism by their clergy, their own credulity, and their own intellectual laziness (it is easier and less trouble to believe what you are told).  But another part of the answer is that even skeptics of Christian theology concentrate on the "other horn" of Jesus' message, his marvelously humane ethical teachings based upon the Golden Rule, as being "worthy of keeping."  I agree, but ask why his ethics have to be taught through a vehicle shackled by the faith-based absurdities of the theology.  In an upcoming exploration, I consider "keeping" Jesus as one of many "Golden Rulers" whose teaching is spread by ways other than some kind of Church.

 

Hence, the creation of Jesus Christ, through an exaltation taking centuries to execute, is a huge historical distortion of a remarkable "common" man's biography.  From this, I am tempted to induce that the nonveridical capabilities of the human brain can turn any human or humans into a person or persons worthy of some kind of religious veneration.

Looking at the broader picture, Christianity is one of the three Semitic (Abraham-based) world religions that have separately done just that "turning": Judaism, Christianity, and Islam, in chronological order of origin.  There is at least one very dark consequence of creating nonveridically-derived religious venerates.  All three, perhaps in contrast with Eastern world religions (Hinduism, Buddhism, Taoism, Confucianism, etc.), paint themselves into a corner of intolerance, in that you cannot be an orthodox Jew and at the same time be of another faith; you cannot be a Christian and simultaneously be of another faith; you cannot be a Muslim and concurrently be of another faith.  In other words, these three world religions create an "us versus them" syndrome, with no way of comparing "us" with "them" save through a faith-based epistemology.  With no way, therefore, of demonstrating their "truth" save by faith, they must not tolerate "them" amidst "us" if "them" claims to be as true as, or truer than, "us."  Again, "us" cannot tolerate "them," and history has shown that encounters among the intolerant result in innocent people being killed: innocents murdered, in the final analysis, in the name of some nonveridical theology.

Another horrible consequence of both Christianity and Islam is the possibility of justifying antisemitism via their respective intolerances, as mentioned above.  This does not mean Judaism is "innocent" of murderous intolerance.  Look in the Old Testament at what was done to "Gentiles," whose only crime was not being Hebrew, not being "God's Chosen."  Nonveridical theology has indeed spawned evil in all three great Semitic world religions.

 

I hope now the reader has some understanding of why I took the "long and winding road" of the three sortings (apologies to the Beatles song "The Long and Winding Road") listed at the beginning.  Some important truths at the end of this road turn out to be unexpected, shocking, revealing, heretical, and/or blasphemous; therefore, it is imperative to "tread carefully," making the argument toward these truths step-by-step and carefully measured.  My long and winding intellectual path is necessarily a personal one; others arriving at conclusions similar to or different from mine would undoubtedly have a much different-looking path.  Religion is a subjective experience, a personal journey necessarily consisting of nonveridical mental exercises, some of which can be "good," "wholesome," "kind," and "loving," like conceptions of Jesus' ethical teachings.  It could be said that I have personally undertaken the task of separating the wheat from the chaff, the sheep from the goats, the reliable from the unreliable, or the ethics from the theology in Christianity.  I hope I have demonstrated this can be done for Christianity and any other religion; I hope I have inspired other freethinkers to do something similar.

Another personal reason for my "long and winding road" is that I can take that road without fearing the consequences of doing so; I criticize Christianity and other religions because I can.  I can, thanks to the heritage we have living in a free society, a heritage bought with the blood of courageous thinkers and doers of our past, a heritage made possible by ancient rationalists, the Renaissance, the Reformation, the Enlightenment, the Scientific Revolution, the Industrial Revolution, the American Revolution, and the French Revolution.  Because of this heritage, I understand and sympathize with the comment of a character in a French film, who sat down on the steps of the cathedral into which the rest of his family was going for a wedding, christening, or some such, in personal conflict over whether he should go in or not.  When asked why he was hesitating, he said unequivocally, "There is no place for churches in a republic!"  Nonveridical theology does not mix well with liberte, egalite, and fraternite, or with life, liberty, and the pursuit of happiness.  Or, as Ayaan Hirsi Ali writes on p. 237 of her book Heretic: Why Islam Needs a Reformation Now (Harper, 2015, ISBN 978-0-06-233393-3), we need "…to remind ourselves that the right to think, to speak, and to write in freedom and without fear is ultimately a more sacred thing than any religion."  Well said, Ayaan, well said and so true.

RJH
