Beyond Good and Evil

Dr. Ronnie J. Hastings

God – The Ultimate Meme, or The Problem of God

In Perception Theory and Memes — Full Circle, [March, 2019], the epistemological concept of memes was used to “tie together” the basic concepts of Perception Theory, “circling back” to the beginnings of the theory.  This tying-together of memes into Perception Theory, if you will, was done within the group of related posts having to do with Perception Theory.

Similarly, this post ties together two groups of posts, one again being the Perception Theory group (Group II.) and the other being the origin-of-Christianity group (Group I.).  Both groups of posts share the constituent subjects of God, religion, or, to use my phrase, gods and god stories.

Group I. consists of Sorting Out the Apostle Paul, [April, 2012], Sorting Out Constantine I the Great and His Momma, [Feb., 2015], Sorting Out Jesus, [July, 2015], At Last, a Probable Jesus, [August, 2015], and Jesus — A Keeper, [Sept., 2015].  It is a personal journey of religious belief utilizing history as a forensic science and my own “spiritual” experiences as a guide toward understanding how Christianity (and, by extrapolation, all religious systems of belief) came about.  It applies modern biblical criticism and philosophy’s Occam’s Razor.  Conclusions gleaned in this group of posts rest upon the separation of theology and ethics, the former seen as mostly epistemologically and intellectually toxic, and the latter seen as epistemologically, intellectually, and socially essential and vital.  As the title Jesus — A Keeper, [Sept., 2015] implies, Christianity’s value (and by implication the value of all religions) lies in the time-proven ethics of the Golden Rule or Principle of Reciprocity, not in theology.

Group II. is much larger numerically, which correctly implies its greater subject breadth and depth.  It consists of Perception Is Everything, [Jan., 2016], Perception Theory (Perception is Everything) — Three Applications, [Feb., 2016], Perception Theory: Adventures in Ontology — Rock, Dog, Freedom, & God, [March, 2016], I Believe!, [Oct., 2016], Hope and Faith, [Jan., 2017], Prayer, [Feb., 2017], Egalite: A Qualified Virtue, [Feb., 2018], Going Global, [March, 2018], AVAPS!, [May, 2018], Toward an Imagined Order of Everything, Using AVAPS, [June, 2018], The “Problem” of Free Will, [June, 2018], and, as indicated above, Perception Theory and Memes — Full Circle, [March, 2019].  This group develops a universal ontology and epistemology under the heading “Perception Theory.”  Perception Theory is a combination of rationalism and existentialism which enjoys a wide range of applications, as demonstrated in Perception Theory (Perception is Everything) — Three Applications, [Feb., 2016] and The “Problem” of Free Will, [June, 2018].  In addition to illuminating directions of modern political and economic theory, Perception Theory particularly sheds light on topics from Group I., as shown by Perception Theory: Adventures in Ontology — Rock, Dog, Freedom, & God, [March, 2016], I Believe!, [Oct., 2016], Hope and Faith, [Jan., 2017], and Prayer, [Feb., 2017].  Hence, from the perspective of sorting out “gods and god stories,” much of Group II. seems like a continuation and elaboration of Group I. (as the posting dates on www.ronniejhastings.com (site name Beyond Good and Evil) above might indicate).

The blending of memes “full circle” into Perception Theory (Perception Theory and Memes — Full Circle, [March, 2019]) indicates that a common theme woven throughout both groups, the “what” and “why” of gods and god stories, will also come “full circle” on its own.  Philosophy of religion often posits the “problem” of God.  As with the “problem” of free will (The “Problem” of Free Will, [June, 2018]), a question is begged: is there need of a “problem” at all?  The epistemological questions surrounding the formation of Christianity (and all religious sects, for that matter), coupled with the suggestion that ontological differences among theists, atheists, and agnostics are silly and absurd (Perception Theory: Adventures in Ontology — Rock, Dog, Freedom, & God, [March, 2016]), imply, in my opinion, that a resolution of any such “problem” is highly plausible.

{Here it is necessary to interject that the more familiar the reader is with the content of all the posts referenced above, the greater and swifter will be the understanding of that which is to follow.  Bear in mind that, as always, “understanding” is not necessarily the same as “agreeing.”  Listing all the posts above emphasizes that the “full circle” attempted hereafter is not some momentary epiphany, revelation, emotional experience, recent whim, or musing, but, rather, the result of years of methodical, careful thought leading to satisfying personal conclusions.  That they would be satisfying to anyone else is unwarranted speculation on my part.  Achieving understanding (not necessarily agreement) from others may be a forlorn hope (See Hope and Faith, [Jan., 2017]), but achieving any understanding from others would at least provide relief from any lingering angst over my personal “subjective trap” (See Perception Is Everything, [Jan., 2016]) — adding to the personal relief memes give (See Perception Theory and Memes — Full Circle, [March, 2019]).}

In dealing with gods and god stories in terms of memes, we do not start “from scratch;” all terminology has been defined in the above posts in both Groups I. and II.  The context of our start is: 1. we are star-stuff in self-contemplation; 2. math is the language of the universe.  To this context is added 3. God is a looped, non-veridically based concept in our heads, that is, a meme having no resonance with the “real” veridical world or universe outside our epiphenomenal minds, which are contained in our veridical physiological brains (Perception Theory: Adventures in Ontology — Rock, Dog, Freedom, & God, [March, 2016]).  Therefore, God exists as does a unicorn, as does Santa Claus, as does the tooth fairy, as does Satan.  The same existence applies to the generic term “gods” as well as to stories about God, or god stories.

Memes or concepts of the veridical world outside us, like the ideas of “rock” or “dog,” are non-veridical, like the memes of gods, but with a very important difference: they are resonant memes, resonating with the empirical data bombarding our senses when we experience a rock or a dog.  We use our epiphenomenal imaginations to create memes of both looped concepts (non-veridically self-contained in the imagination) and resonant concepts (non-veridically related to the veridical “outside” world indicated by the continual “pouring in” of empirical sense data).  Imagined worlds in science fiction are looped memes, and scientific theories are resonant memes.  “Scientific” objectivity is making memes as resonant, or as veridical, as possible (AVAPS!, [May, 2018] and Toward an Imagined Order of Everything, Using AVAPS, [June, 2018]).
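For readers who find a sketch clearer than prose, the looped/resonant distinction can be caricatured in a few lines of Python.  This is only a hypothetical illustration of the taxonomy described above (the Meme class and its resonant flag are my own labels, not part of Perception Theory’s vocabulary):

```python
# A hypothetical sketch of the meme taxonomy (class name and "resonant" flag
# are my own labels for the concepts in the text, not the author's terms).
from dataclasses import dataclass

@dataclass
class Meme:
    name: str
    resonant: bool  # does the concept resonate with empirical sense data?

def classify(meme: Meme) -> str:
    # Every meme is non-veridical (it lives only in a mind); only its
    # relation to the veridical outside world differs.
    if meme.resonant:
        return "resonant meme (checked against the veridical world)"
    return "looped meme (self-contained in the imagination)"

for m in [Meme("rock", True), Meme("dog", True), Meme("scientific theory", True),
          Meme("unicorn", False), Meme("Santa Claus", False), Meme("God", False)]:
    print(f"{m.name}: {classify(m)}")
```

The point of the sketch is that resonance, not presence in a head, is what separates a scientific theory from Santa Claus; every meme in the list “exists” non-veridically.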

Certain looped non-veridical memes, like Santa Claus and Satan, are made to appear resonant by saying Santa Claus is the “personification” of Christmas giving or Satan is the “personification” of human evil.  Personifications are like avatars, or manifestations of something else.  If the “something else” had a veridical existence, again, like a rock or a dog, then the meme would not be looped.  The behavior of giving at Christmas and acts of human evil are real enough, just as human values like “love” and “freedom” are, but equating the spirit of giving with a human form, or evil acts in general with a human form, is as absurd as equating all the facets of human love to a single form (like a pagan goddess) or all the facets of freedom to a single form (like Miss Liberty).  Therefore, just as a goddess such as Venus or Aphrodite does not exist like a rock or dog, and a historical woman named Miss Liberty does not exist like a rock or dog, Santa Claus does not exist, nor does Satan.  As extant beings, Santa Claus, Satan, Venus, and Miss Liberty are looped memes; the phenomena of which these four are personifications (giving at Christmas, human evil, love, and freedom, respectively) do exist as scientifically observable, distinct acts in the veridical real world and, therefore, are resonating non-veridical memes (Perception Theory: Adventures in Ontology — Rock, Dog, Freedom, & God, [March, 2016]).  Personifying (or making gods of) real human activity is a primitive habit of human imagination that probably began with the earliest forms of animism, and is linked with the origins of religion and its ritualization; personification was and still is a method of making sophisticated memes understandable for children.  It is strange that today, as adults in Christian civilizations, we shed the notion that Santa “really” (that is, veridically) exists, but many of us still believe Satan “really” (i.e., veridically) exists.

What about the looped meme God, a.k.a. Yahweh, Elohim, or Jehovah in Judaism, God in Christianity, or Allah in Islam?  To what would God resonate to make God a resonant meme, like love, evil, or freedom?  To the whole world, given that God is the creator god?  Would that not be pantheism, meaning we worship the universe?  (How odd would that be, in that we are part of the universe?  To worship the universe is to make the matter and energy of our bodies also objects of adoration, along with mountains, stars, animals, etc.)  To worship any part of the universe is, again, to return to primitive religion, to idolatry.  It seems clear to me that we have made up God as the personification of everything, as the answer to any question we may pose.  As I said in Perception Theory: Adventures in Ontology — Rock, Dog, Freedom, & God, [March, 2016], God is the Grand Answerer, Super-friend, and Creator.  God, once believed in within the individual heads of worshipers, can be used to any end by the clergy, from yesterday’s shamans to today’s popes, ministers, priests, mullahs, etc.  It seems easy for us to forget that just because we can imagine X, that does not mean that X exists like a rock or a dog.  (Remember, a rock or a dog exists in our head like any other non-veridical meme — in the form of a concept stored as memory built by perception.)

God, therefore, is the ultimate meme, the meme beyond which nothing can be imagined.  The meme of God is seemingly a tribute to the power of our imagination, but the history of humanly imagined religion shows this tribute to be simultaneously a problem — a flexible meme easily twisted into a “pass” to do evil to each other; this is the toxicity of most, if not all, of theology; this is why Richard Dawkins describes religious, theological memes as agents of a chronic mental disease; this is why I separated ethics from theology in Jesus — A Keeper, [Sept., 2015].

But have I not described God as the atheists do?  No, not quite.  Perception Theory allows existence in the real, veridical universe outside our minds (which includes our bodies, including our brains), but also allows the epiphenomenal, non-veridical existence of imagined memes inside our minds, which are, in turn, inside our brains.  In other words, an imagined entity, like a unicorn, if defined in any mind, can have an ephemeral existence as stored data in the memory of the brain of that mind; in this sense looped non-veridical memes exist.  It is a very weak existence compared with the strong veridical existence of a rock’s meme or the quickened and strong veridical existence of a dog’s meme (Perception Theory: Adventures in Ontology — Rock, Dog, Freedom, & God, [March, 2016]), for sure, but an existence made possible by our imaginative, epiphenomenal mind.  According to Perception Theory, then, an atheist recognizes only strong veridical existence, whereas a theist thinks that a weak existence is as strong as a strong existence.  An agnostic takes neither position, but Perception Theory would say all three positions are in denial of the ability of the mind to be both objective and subjective.  Theists, atheists, and agnostics can all agree that some form of God exists in the heads of both believers and non-believers (atheists have a meme of a god that does not exist in the real veridical world, unlike a meme of a rock or dog, which does), and that that existence of God has no basis outside the human mind; all can agree to the statement, “God exists!” in the dual veridical/non-veridical definition allowed in Perception Theory.  All the conflict, blood, and death perpetrated over disagreement as to what kind of God is “real” throughout the terrible annals of historical warfare, pillage, incarceration, and personal violence were never necessary, and in the long run were silly; what still goes on today is folly, absurd and unjustified.  Are the billions of concepts (memes) of God in the imaginations of humans worldwide any less amazing than the consensus, imagined Creator God of, say, Genesis, Chapter 1?

In order for theists, atheists, and agnostics to agree on the existence of God or of the gods, atheists have to compromise but very little, while theists will have to move their position a great deal.  To agree that God exists in the imaginations of individual heads into which no one but that individual can “see,” due to the subjective trap, is not that far from the “classic” atheistic claim that there is no supernatural deity or deities in the “real,” veridical universe.  The theist “classic” claim is just the opposite of the atheist’s — there IS WITHOUT DOUBT a God that exists outside human imagination, just as some part of the universe or the universe itself actually exists.  If one listens carefully to the worshipful words of praise of theists (at least, this has been my experience), the existence of God is affirmed “within the heart” of the believer — affirmed by an epiphenomenal feeling of emotion fueled by faith (See Hope and Faith, [Jan., 2017]).  That is about as far from objective evidence as one can get.  This, instead of affirming God’s existence, affirms what Perception Theory identifies as a looped, non-veridically based case for existence.  That is, the theist’s affirmation of God’s existence is no stronger than an affirmation of the existence of unicorns or tooth fairies, and is much weaker than an affirmation of the existence of, say, freedom.  And, of course, the theist’s affirmation of God’s existence is minuscule compared to the strong veridically based cases for the existence of, say, a rock or a dog (Perception Theory: Adventures in Ontology — Rock, Dog, Freedom, & God, [March, 2016]).  As for agnostics, I would speculate that some would welcome the compromise about God’s or the gods’ existence with the “little-to-lose shoulder shrug” of the atheists, while others might remain skeptical and non-committal, not willing to come close to agreeing with theists, whom they see as gullible and naive.  All in all, I would speculate that the “table” of agreement of all three groups over Perception Theory’s compromise on the existence of God would be disproportionately made up of atheists, with a smaller group of agnostics, followed by an even smaller group of theists who have bravely changed their ontological thinking a great deal.  The future success of Perception Theory might be measured by seeing whether the population at the compromise table approaches equal proportions from all three groups.  (No matter what the proportions at the table might be, Perception Theory might take credit for the absence of evangelism among the three groups, as, by definition, the table is one of agreement.)

Stated directly and succinctly, God or gods exist(s) only in our imaginations; we made up all deities, past, present, and future.  Most theology is not only useless, it can often be dangerous and even lethal.  Not all of religion is useless, however; part of religion is vital — the ethical part based upon the Golden Rule or Principle of Reciprocity (Jesus — A Keeper, [Sept., 2015]).  In Western culture this means a deliberate separation of ethics from theology in religions like the three Abrahamic ones, Judaism, Christianity, and Islam; this separation is already done in some religions of Eastern culture, like Buddhism, Jainism, Confucianism, and Taoism.  We have met the Creator God, and it is us; there is no problem of God or of the gods — just like all memes in our heads, the ultimate meme of God or the gods is at our disposal; we can do with theology what we will; we can make it impotent and irrelevant, just as we have done with memes like pseudoscience, superstitions, and unwanted or uninteresting fantasies.  Just as was done by so many Americans in their revolution for independence, religion must be relegated and confined to individual minds, not made into social and sacred creeds demanding conflicting evangelism (The United States of America — A Christian Nation? [June, 2012]).

With the gods relegated to fantasy within our heads, we can now deal with god stories and the lessons they teach with historical utilitarianism.  Like so much of “ancient wisdom” from our distant past, such as the humanistic Principle of Reciprocity, we can both individually and collectively judge the god stories and their lessons without fear of supernatural reprisals.  For example, in Christian culture, from which I come, I can now see that the Old Testament of the Bible is a collection of literature blended together by Hebrew scholars and priests to teleologically justify the invasion and conquest by newly independent nomads of what we call the Holy Land, all under the theological guise of the Hebrews being God’s “Chosen People.”  I can now see that the New Testament of the Bible is a collection of literature blended together by the scholars of a new sect to teleologically justify the execution of their leader as a common criminal (See all of Group I. for details).  The New Testament is to Christians what the Icelandic Sagas were to many Scandinavians of the Viking persuasion.

Erich Fromm, a Jewish humanist philosopher who described himself as a “non-theist,” did something very similar well before Perception Theory.  In Ye Shall Be As Gods (Fawcett Premier Books, New York, 1966 — ISBN 0-449-30763-8), Fromm “radically” interprets the Old Testament as the evolution of the relationship between the meme (concept) of God and the entirety of mankind, not just the “Chosen People.”  He offers understanding of the “God is dead” meme and gives insight into the New Testament’s Passion of Christ, using Psalm 22.  The rabbinic teachings on the Old Testament during the centuries of the Diaspora are also employed.  By critically looking at the Old Testament, Fromm has, in my opinion, created paths toward its greater appreciation.  (Why Some White Evangelical Christians Voted for and/or Still Support Donald Trump, [Dec., 2018])

With the gods relegated to fantasy within our heads, we can now investigate why religion sprang up within the heads of our species in the first place.  The reasons why belief in some form of supernatural entities or spirits in the real world became, apparently, necessary for human survival during the cognitive revolution of our species’ “hunter-gatherer” stage can now be studied and made into a consensus of anthropology.  Elements dealing with the origins of religion from Groups I. and II. have already pointed the way (See At Last, a Probable Jesus, [August, 2015], Jesus — A Keeper, [Sept., 2015], Perception Is Everything, [Jan., 2016], I Believe!, [Oct., 2016], and Toward an Imagined Order of Everything, Using AVAPS, [June, 2018]).  Studying the physical and cognitive attributes passed on from generation to generation over thousands of years that contributed to our species-wide, universal “religiosity” will require breaking down the elements of our survival, such as cooperation, altruism, and the necessity of suspending doubt and questioning in times of emergency, as discussed in I Believe!, [Oct., 2016], wherein our ancestors’ having to deal with a “leopard problem” is offered as a “thought scenario.”  How did religion evolve from simple appeasement of a local “leopard god” to the continual sacrifice of thousands atop Aztec temples in Tenochtitlan?  How did we get from admonishing our children to be quiet when the shaman is speaking to the eruption of the Thirty Years War?  What a difference between believing a god or gods cause thunder and lightning and calling the faithful to the Crusades!

With the gods relegated to fantasy within our heads, we can now see how important the separation of theology from ethics is.  Moreover, such a separation is conveniently seen as a sorting of memes.  When the origin of religion, with its subsets of theology and ethics, is couched in terms of memes, I would suggest that the vital “good” memes, those of ethics, come from the human mind, necessarily developing during the longest childhood of all primates, if not of all mammals.  That is, the memes of ethics for human beings necessarily formed on the “template” of the development of the nuclear family — mother, child, father, and extended family, including friends.  The rules of behavior taught to a child are extrapolated to apply not only to the mother-child relationship, but to all other possible relationships within the hunter-gatherer group, and these rules collectively are treated as social norms applied throughout childhood and adulthood.  In turn, these norms were justified upon the authority of the group.  This collective authority became more than “what our mothers and older siblings told us;” it became the authority of the political leaders and the authority of the “spiritual” leaders, the shamans: the beginning of politics and the beginning of religion.  But now, without the necessity of religious memes, only those of politics and ethics are still needed.  (Recalling a point germane to the “need” for religion shown by Yuval Noah Harari in his book Sapiens, A Brief History of Humankind — that religion is a meme that can motivate many more than a leader within shouting distance can, once that meme is transmitted to other minds — I would hasten to add that today’s almost instant electronic communication over the worldwide internet has taken over religion’s communicative role and can spread memes much, much better; spreading theological memes using the internet only accelerates the spread of the “poison.”)  Religion and theology memes are not needed any more; only ethics memes are needed.

Gods as fantasy has at least one ancient precedent.  In India, in the 6th to 3rd centuries BCE (or BC), the original form of Buddhism, called Hinayana or Theravada Buddhism, basically ignored the question of the existence of the gods (curiously non-theological) and concentrated on the human, inner, existentialist self (Jainism, contemporary with the founding centuries of Buddhism, could be spoken of in a similar vein, and could even be seen as outward-looking, not toward the gods, but toward practicing an extreme reverence for life).  Hinayana Buddhism dealt with attaining Nirvana, or enlightenment, as demonstrated by Siddhartha, the founder of Buddhism; dealing with gods took a back seat to struggling with inner human desire; the gods were not germane to Siddhartha’s original teaching.  In time Mahayana Buddhism (along with other forms, like Zen) became the dominant form of Siddhartha’s teaching, in which Siddhartha himself, the Buddha, became deified as a god — much as Jesus himself became deified as a god in Christianity (Sorting Out Constantine I the Great and His Momma, [Feb., 2015]).  Imagery featuring statues of the Buddha is found at Mahayana sites, while sites featuring simple imagery, such as the Buddha’s footprint, are Hinayana or Theravada sites.

Note that the “direction” of Hinayana Buddhism, though admirably unhindered by the gods, is inward, toward the non-veridical, not outward, toward the veridical, as are science, technology, math, and engineering (the STEM subjects in US schools), which are equally and admirably unhindered by the gods.  The success of studying “outward” toward the veridical is another way of repeating the message of AVAPS!, [May, 2018] — As Veridical As Possible, Stupid!  Hinayana Buddhism took its lack of theology and went in the “wrong” direction; it should have done “a 180” (180 degrees) and gone the opposite way.

Without the threats of punishment after death or fantasies of paradise after death germane to much of theology, religion becomes transparent as many, many forms of the sociological phenomenon of a cult.  At every religion’s beginning — more precisely, at the beginning of every denomination’s sect — it is a cult.  If I in another time had acted upon my “visitation” from my deceased great uncle in the form of a vivid dream, as described in At Last, a Probable Jesus, [August, 2015], and had convinced others around me I had communicated with the dead, I would have formed a cult.  Great religions of the world throughout history are successful cults, their “truth” erroneously measured by their success, and large subsets of great religions are smaller successful cults.  Cults venerate a “great” being (usually a god or person of “special” powers) through the leadership of a cult founder, who can also be the venerated.  Thus, Judaism can be seen as Moses founding the veneration of Yahweh, Elohim, or Jehovah, and Christianity can be seen as Peter, Paul, and Mary Magdalene venerating Jesus (See At Last, a Probable Jesus, [August, 2015]).  Smaller successful cults in the Christian vein include cult leaders such as many Popes, many Orthodox archbishops, many saints, Martin Luther (Lutherans), John Calvin (Presbyterians), Henry VIII and Thomas Cranmer (Anglicans in U.K., Episcopalians in U.S.), George Fox (Quakers), Jane Wardley, Ann Lee, and Lucy Wright (Shakers), John Smyth, Thomas Helwys, and Roger Williams (Baptists), Charles Wesley, John Wesley, and George Whitefield (Methodists), Joseph Smith (Mormons), Christian Rosenkreuz (Rosicrucians), Mary Baker Eddy (Christian Scientists), William Miller and Ellen G. White (Seventh-day Adventists), Barton W. Stone (Christian Church, Disciples of Christ), Alexander Campbell (Church of Christ), Charles Fox Parham and William Seymour (Pentecostals), the 1914 General Council at Hot Springs (Assembly of God), and Sun Myung Moon (Unification Church) — just to name a few with which I am familiar.  Two non-Christian examples of small successful cults are three Roman Emperors (veneration of Apollonius) (See Sorting Out Jesus, [July, 2015]) and Scientology (veneration of L. Ron Hubbard).  Two unsuccessful cult leaders and their cults here in the United States are Jim Jones (Peoples Temple) and David Koresh (Branch Davidians).  The toxicity of theology throughout history has been carried out through cults such as these.  The ethical kindness, love, and care of one group of humans toward another have also been carried out through cults such as these, but what has been overlooked is that ethical behavior needs no theology or organized religion to spread from one human to others.  When Jesus taught his version of the Golden Rule, he spoke of loving your neighbor as yourself, not of doing so through the social vehicle of the synagogue; the foundation of ethics, our caring for each other, has no origin in any religion or any theology; the Principle of Reciprocity began within each little hunter-gatherer group that successfully struggled for survival.  If theology exists as a meme in an individual, there it must stay — it should not be passed on to others; mental health services can help individuals for whom resisting that passing on is a struggle.  On the other hand, if ethics such as the ethical teachings of Jesus exists as a meme in an individual, by all means it should be passed on, as ethical memes were passed on in the little hunter-gatherer groups.  To be ethical in the manner spoken of here is to be human, not religious or theological.  We are not human to each other through the imagined groups to which we belong, but, rather, through the fact that we are Homo sapiens.

The general “shedding” of religion and its toxic theology, then, is seen as a veridically-based “enlightenment” which follows AVAPS toward more anthropological memes.  Imaginations young and old, fueled by the ethics of reciprocity (The Golden Rule), cannot but generate memes fired in the scrutiny of scientific consensus that will solve problems and heal wounds both for our species and for our planet and the universe beyond.  We are tweaking our inner-star-stuff to resonate more with the star-stuff that makes up the rest of the universe.

I would suggest that any reader who thinks this is but another announcement of another religion, of another cult, is victimized by his or her seemingly genetic tendency to think in terms of gods and god stories.  He or she needs to go back and read or re-read Groups I. and II.  God as the ultimate, unnecessary meme is NOT a new religion, NOT a new cult.  Rather, it is a veridically-directed philosophy transcendent of theism, atheism, or agnosticism.  Using the combination of rationalism and existentialism provided by Perception Theory, it suggests an expansion of anthropology to deal with the “who, what, why, and how” of human existence, which used to be handled by religion and its attendant theology; I am suggesting that they have failed miserably.  The “should” statements used above are not evangelical pontifications, but, rather, calls to consider looking at existence veridically, to look at existence in the opposite way Hinayana Buddhism did.  When I followed my own “shoulds” of Perception Theory tied to religion, I found the intellectual and emotional personal satisfaction I had been seeking for years.  (“Personal satisfaction” does not mean I’ve stopped questioning “everything,” especially memes like Perception Theory that my imagination conjures.)  Perhaps my own intellectual adventure might be of help toward others finding their own version of personal satisfaction.  Or, perhaps not.  I’ve written it down compelled by an ethical Principle of Reciprocity tens of thousands of years old, taught by Jesus and so many others.

RJH

Perception Theory (Perception is Everything) — Three Applications

In the presentation of a theory of human existence, Perception is Everything [Jan., 2016], it was suggested the theory could be applied to almost every aspect of human experience.  The model paints the picture of the objective/subjective duality of human existence as the interactive dual flow (or flux) of real-world, empirical, and veridical data bombarding our senses and of imaginative, conceptual, and non-veridical data generated by our mind, all encased within the organ we call the brain.  The two sides of the duality need not be at odds, and both sides are necessary; the objective and the subjective are in a symbiotic relationship that has evolved out of this necessity; what and who we are simultaneously exist because of this symbiosis that dwells in the head of every human individual.  No two humans are alike because no two symbioses in two brains are alike.

This post briefly demonstrates how the perception model of Perception is Everything [Jan., 2016] can be used to contribute insights into I. Development of Self-Consciousness in a Human Infant, II. Education, and III. The Origin of Politics.

I. Development of Self-Consciousness in a Human Infant – That the human mind has the ability to develop a concept of “self,” as opposed to “others,” is commonly seen as fundamentally human.  It might not be unique to our species, however, as we cannot perceive as do individuals of other species.  Often pet owners are convinced their dog or cat behaves as if it is aware of its own individuality.  But that might be just too much anthropomorphism cast toward Rover or Garfield by the loving owners.  So fundamental is our self-consciousness, most views would assert its development must commence just after birth, and my perception theory is no exception.

The human baby is born with its “nature” genetically dealt by the parents and altered by the “nurture” of the quality of its gestation within the mother’s womb (or within the “test tube” early on, or within the artificial womb of the future).  The world display screen in the head of the baby (Perception is Everything [Jan., 2016]) has to be primitive at birth, limited to whatever could bombard it veridically and non-veridically while in the womb.  (Can a baby sense empirical data?  Can a baby dream?  Are the reflex movements of the fetus, which the mother can feel before birth, recorded in the memory of the fetus?)  Regardless of any answers to these questions, perception theory would describe the first moments after the cutting of the umbilical cord as the beginning of a “piece of star-stuff contemplating star-stuff all around it” (Perception is Everything [Jan., 2016]).  The event causing the baby to take its first breath begins the lifelong empirical veridical flux entering one “side” of the baby’s world display screen, triggering an imaginative non-veridical flux from the other “side” of the screen.  The dual flux has begun; the baby is “alive” as an individual, independent of the symbiosis with its mother’s body; its life as a distinct person has begun.

The unique “long childhood” of Homo sapiens (due to the size-of-the-birth-canal/size-of-the-baby’s-skull-after-9-months’-gestation consideration), the longest “childhood” of any species before the offspring can “make it on its own” — a childhood necessarily elongated, else we would not be here as a species today — assures the world display screen is so primitive that the first few days, weeks, and months of each of us are never remembered as our memory develops on the non-veridical side of the screen.  It takes a while for memory generated from the empirical veridical flux to be able to create a counter-flow of imaginative non-veridical flux back to the screen.  Perception is Everything [Jan., 2016] indicates the dual flow is necessary for the screen to become “busy” enough to be noticed by the “mind’s eye,” that within us that “observes” the screen.  No doubt all of us first had our screens filled by perceptions of faces of caretakers (usually dominated by our mother’s face) and by sensations of sound, touch, smell, and taste as our bodies adapted to the cycles of eating, eliminating, and sleeping.  During waking hours in which we were doing none of these, we began to focus on the inputs of our senses.  We inevitably process non-veridically how we are aware of these inputs; just as inevitably, we at some point become aware of a “perceiver,” an observer of these inputs; we have an idea that “something” is perceiving, that that “something” relates to our caretaker(s) (whose face(s) we always feel good seeing), and that that “something” is us.  In each individual, the development of a subjective “I” is normally “there” in the head within a few months (the exact time interval probably differing for each individual); a distinction between “me” and “not-me” begins.  This distinction is self-consciousness in-the-making, or “proto-self-consciousness.”

That distinction between “me” and “not-me” is vital and fundamental for each piece of star-stuff beginning to contemplate his or her “fellow” star-stuff — contemplation that is constantly painting an increasingly complex world display screen inside his or her head.  Early on, anything that “disappears” when eyes are closed is “not-me;” anything that is hungry, that likes things in a hole below the eyes to quench that hunger, that experiences discomfort periodically way below the eyes, and that feels tactile sensations from different locales in the immediate vicinity (through the skin covering all the body as well as the “hole below,” the mouth) is “me.”  Eventually, “me” is refined further to include those strange appendages that can be moved at will (early volition) and put into the hunger hole below the eyes, two of which are easy to put in (hands and fingers) and two of which are harder to put in (feet and toes).  That face that seems to exist to make “me” feel better and even happy turns out to be part of “not-me,” and it becomes apparent that much of “not-me” does not necessarily make “me” feel better, but is interesting nonetheless.  Reality is being sorted out in the young brain into that which is sorted and that which sorts, the latter of which is the “mind’s eye,” self-consciousness.

In time, “me” can move at will, and that which can move thus is the “housing” and boundary limiting “me.”  As soon as the faces “me” can recognize are perceived to represent other “me’s,” the distinction between “me” and “you” begins, soon followed by “me,” “you,” and “them.”  Some “you’s” and “them’s” don’t look like other “you’s” and “them’s,” such as household pets.  Still other “you’s” and “them’s” don’t move on their own like “me, soon to be ‘I’” does, such as dolls and stuffed animals.  “You’s” and “them’s” separate into two categories — “alive” and “not-alive.”  As quantity becomes a more developed concept, it soon becomes apparent that outside “me” there are more “not-alives” than “alives;” “not-alives” soon are called “things,” and “alives” take on unique identities as “me” learns to recognize and later speak names.  Things are also non-veridically given names, and the genetic ability to quickly learn language “kicks in,” as does the genetic ability to count and learn math.  In a few months’ time, existence for “me” has become both complex and fixating to its mind/brain, and is growing at an increasing rate (accelerated growth).  The name non-veridically given to “me” is the subjective “I” or the objective “myself” — both of which are understood to be self-consciousness.

This clearly is an approach similar to a psychology of infants, which might deal eventually with the development of the ego and the id.  This approach using perception theory allows a seamless tracing of the development of the human mind back before birth, employing a more objective approach to talking about subjectivity than that possessed by some other psychological approaches; it is an approach based upon evolutionary psychology.  In addition, it is clear that the emergence of self-consciousness according to perception theory demands a singular definition of the “self” or of “I” or of “myself,” in order to avoid the problems of multiple personalities (dissociative conditions popularly conflated with schizophrenia).  Perhaps the widespread phenomenon of children making up “imaginary friends” is an evolved coping mechanism in the individual child’s imagination in order to avoid such fragmentation; an imaginary friend is not the same as the self-consciousness producing such a “friend.”  Just like the individual brain, self-consciousness is singularly unique, in ontological resonance with the brain.

II.  Education – Perception theory is compatible with the idea of what education should be.  Education is not a business turning students into future consumers; education is not a sports team turning students into participants; education is not training to turn students into operators of everything from computer keyboards to spaceship control panels.  Instead, education is but the development of students’ minds (1. Education Reform — Wrong Models! [May, 2013], 2. Education Reform — The Right Model [May, 2013], 3. Education Reform — How We Get the Teachers We Need [May, 2013], & Top Ten List for Teachers of HS Students Preparing for College or University (Not a Ranking) — A List for Their Students, Too! [Dec., 2014]).  The word “but” here is somewhat misleading, as it indicates that education might be simple.  However, education is so complex that as yet we have no science of education (#1 on the “List” in Top Ten List for Teachers of HS Students Preparing for College or University (Not a Ranking) — A List for Their Students, Too! [Dec., 2014]).  Perception theory indicates why education is so complex as to defy definition and “sorting out.”  Defining education is like the brain trying to define its own development, or like a piece of star-stuff trying to self-analyze and contemplate itself instead of the universe outside itself.  At this writing, I am inclined to say that a more definitive sorting out of what education is and how it is accomplished inside individual brains is not impossible in the way that an individual’s seeing his/her own brain activity is impossible, or in the way that another person’s seeing my subjective world display screen in my head is impossible (the “subjective trap”) (Perception is Everything [Jan., 2016]).

Following this optimistic inclination, education is seen as developing in individual brain/minds a continuous and stable dual flow of veridical flux and non-veridical flux upon the individual’s world display screen (Perception is Everything [Jan., 2016]).  A “balance” of this dual flow is seen in Perception is Everything [Jan., 2016] as a desired “mid-point” of a spectrum of sanity, the two ends of which denote extreme cases of veridical insanity and non-veridical insanity.  Therefore, the goal of education is to make the probability of becoming unbalanced, away from this mid-point in either direction, as small as possible; in other words, education ideally attempts to concentrate and focus the non-veridical in the student’s mind upon the veridical as much as possible.  The non-veridical vigor of “figuring out” the veridical from “out there” outside the brain is matched by the vigor of the empirical bombardment of that same veridical daily data.  Making this focus a life-long habit, making this focus a comfortable, “natural,” and “fun” thing for the non-veridical mind to do for all time, is another way to state this goal of education.  Defining education in this manner seems compatible and resonant with the way our mind/brain seems to be constructed (with the necessary duality of the objective and the subjective); our mind/brains seem evolved to be comfortable at the mid-point without struggling to get or stay there; self-educated individuals are those fortunate enough to have discovered this comfort mostly on their own; graduates of educational institutions who become life-long scholars have been guided by teachers and other “educators” to develop this “comfort zone” in their heads.  Education, in this sense, is seen as behaving compatibly with the structure of the brain/mind that has assured our survival over our evolution as a species.  In order to successfully, comfortably, and delightfully spend our individual spans of time in accordance with the evolution of our mind/brains, we must live a mental life of balance of the two fluxes; education, properly defined and thought upon in individual mind/brains, assures this balance, and therefore assures lives of success, comfort, and delight.  He/she who is so educated uses his/her head “in step” with the evolution of that head.
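To make the “mid-point” talk concrete, here is a toy quantification (entirely my own hypothetical illustration, not a formalism from Perception is Everything [Jan., 2016]) that places the balance of the two fluxes on a spectrum from -1.0 (the veridical-insanity extreme) to +1.0 (the non-veridical-insanity extreme), with 0.0 as the desired mid-point:

```python
# A toy model of the dual flow (hypothetical illustration only; the numeric
# fluxes and the [-1, +1] spectrum are my own assumptions, not the theory's).
def balance(veridical_flux: float, non_veridical_flux: float) -> float:
    """Signed imbalance of the dual flow; 0.0 is the balanced mid-point."""
    total = veridical_flux + non_veridical_flux
    return 0.0 if total == 0 else (non_veridical_flux - veridical_flux) / total

print(balance(5.0, 5.0))   #  0.0 -> the desired mid-point of the sanity spectrum
print(balance(9.0, 1.0))   # -0.8 -> drifting toward the veridical extreme
print(balance(1.0, 9.0))   #  0.8 -> drifting toward the non-veridical extreme
```

On this caricature, the goal of education stated above is simply to keep an individual’s expected drift away from 0.0 as small as possible over a lifetime.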

We evolved not to be religious, political, or artistic; we evolved to be in awe of the universe, not in awe of the gods, our leaders, or our creations.  We evolved not to be godly, patriotic, or impressive; we evolved to survive so that our progeny can also survive.  Religion, politics, and the arts are products of our cultural evolution, invented by our non-veridical minds to cope with surviving in our historical past.  In my opinion these aspects of human culture do not assure the balance of the two fluxes that maximizes the probability of our survival.  Only focusing upon the universe of which we are a part will maximize that probability — thinking scientifically and “speaking” mathematically, in other words.  Education, therefore, is properly defined as developing the scientifically focused mind/brain; that is, developing skills of observation, pattern recognition, mathematical expression, skepticism, imagination, and rational thinking.  But it is not an education in a vacuum without the ethical aspects of religion, the social lessons of political science and history, and the imaginative exercises of the arts.  In this manner religious studies, social studies, and the fine arts (not to mention vocational education) all can be seen as ancillary, participatory, and helpful in keeping the balance of the two fluxes, as they all strengthen the mind/brain to observe, recognize, think, and imagine (i.e., they exercise and maintain the “health” of the non-veridical).  I personally think non-scientific studies can make scientific studies even more effective in the mind/brain than scientific studies would be without them; non-scientific studies are excellent exercises in developing imagination, expression, senses of humor, and insight, attributes as important in doing science as in doing non-science.  The “well-rounded” scholar appreciates the roles both the objective and the subjective play in the benefit of culture better than the “specialist” scholar does, though both types of scholars should understand that the focus of all study, scientific or not, should be upon the veridical, the universe “out there.”  Not everyone can develop their talents, interests, and skills in the areas of science, math, engineering, and technology, but those who do not can focus their talents, interests, and skills toward developing some aspect of humanity-in-the-universe — toward exploring the limitless ramifications of star-stuff in self-contemplation.

Therefore, education, Pre-K through graduate school, needs a new vertical coordination or alignment of all curricula.  ALL curricula should be taught in a self-critical manner, as science courses are taught (or should be taught, if they are not).  An excellent example of what this means was the list of philosophy courses I took in undergraduate and graduate school.  Virtually all the philosophy courses I took or audited were taught in a “presentation of X, good things about X, and bad things about X” sequence.  In other words, all courses, regardless of level, should be taught as being fallible, not dogmatic, and subject to criticism.  A concept of reliable knowledge, not absolute truth, should be developed in every individual mind/brain, so that reliability is proportional to verification when tested against the “real world,” the origin of the veridical flux upon our world display screen; what “checks out” according to a consensus of widely-accepted facts and theories is seen as more reliable than something supported by no such consensus.  Hence, the philosophy of education should be the universal fallibility of human knowledge; even the statement of universal fallibility should be considered fallible.  Material of all curricula should be presented as for consideration, not as authoritative; schools are not to be practitioners of dogma or propagators of propaganda.  No change should occur in the incentive to learn the material if it is all considered questionable, as material often continues to be learned in order to pass each and every course through traditional educational assessment (tests, exams, quizzes, etc.).  And one does not get diplomas (and all the rights and privileges that come with them) unless one passes his/her courses.  Certainly the best incentive to learn material, with no consideration of its fallibility other than that it’s all fallible, is the reward of knowing for its own sake; for some students, the fortunate ones, the more one knows, the more one wants to know; just the knowing is its own reward.  Would that a higher percentage of present and future students felt that way about what they were learning in the classroom!

The “mantra” of education in presenting all-fallible curricula is embodied in the statement of the students and for the students.  Institutions of learning exist to develop the minds of students; socialization and extracurricular development of students are secondary or even tertiary compared to the academic development of students, as important as these secondary and tertiary effects obviously are.  As soon as students are in the upper years of secondary schooling, the phrase by the students should be added to the other two prepositional phrases; in other words, by the time students graduate from secondary schools, they should have first-hand experience with self-teaching and tutoring, and with self-administration through student government and leadership in other student organizations.  Teachers, administrators, coaches, sponsors, and other school personnel who do not do what they do for the sake of students’ minds are in the wrong line of work.

Educational goals of schools should be the facilitation of individual student discovery of likes, dislikes, strengths, weaknesses, tastes, and tendencies.  Whatever diploma a student clutches should be understood as marking the completion of a successful regimen of realistic self-analysis; to graduate at some level should mean each student knows himself/herself in a level-appropriate sense; at each level each student should be simultaneously comfortable with and motivated by a realistic view of who and what he/she is.  Education should strive to have student bodies free of “big-heads,” bullies, “wall-flowers,” and “wimps.”  Part of the non-academic, social responsibility of schools should be help for students who, at any level, struggle, for whatever reason, in reaching a realistic, comfortable, and inspiring self-assessment.  Schools are not only places where you learn stuff about reality outside the self; they are places where you learn about yourself.  Students who know a lot “outside and inside” themselves are students demonstrating that the two fluxes upon the world display screens in their heads are in some sense balanced.  (1. Education Reform — Wrong Models! [May, 2013], 2. Education Reform — The Right Model [May, 2013], 3. Education Reform — How We Get the Teachers We Need [May, 2013], & Top Ten List for Teachers of HS Students Preparing for College or University (Not a Ranking) — A List for Their Students, Too! [Dec., 2014])

Consequently, the only time education should be seen as guaranteeing equality is at the beginning, at the “start-line” the first day of grade K.  Education is in the “business” of individual development, not group development; there is no common “social” mind or consciousness — there is only agreement among individual brain/minds.  Phrases like “no child left behind” have resulted in overall mediocrity, rather than overall improvement.  Obviously, no group of graduates at any level can be at the same level of academic achievement, as each brain has gained knowledge in its own, unique way; some graduates emerge more knowledgeable, more talented, and more skilled than others; diverse educational results emerge from the diversity of our brain/minds; education must produce a spectrum of results because of the spectrum of our existence, our ontology, of countless brain/minds.  Education, therefore, should be seen as the guardian of perpetual equal opportunity from day 1 to death, not the champion of equal results anywhere along the way.

[Incidentally, one of the consequences of “re-centering” or “re-focusing” the philosophy, the goals, and the practices of education because of perception theory may be a surprising one.  One aspect of a scientific curriculum, compared to, say, an average “humanities” curriculum, is that in science, original sources are normally not used, unless it is a history and philosophy of science course (Is history/philosophy of science a humanities course?).  I am ending a 40-year career of teaching physics, mostly the first-year course of algebra-based physics for high school juniors and seniors, and, therefore, ending a 40-year career introducing students to the understanding and application of Isaac Newton’s three laws of motion and Newtonian gravitational theory.  Never once did I ever read to my physics students, nor did I ever assign them to read, a single passage from Philosophiae Naturalis Principia Mathematica, Newton’s introduction to the world of these theories.  Imagine studying Hamlet but never reading Shakespeare’s original version or some close revision of the original!

The reason for this comparison is easy to see (but not easy for me to put in few words): science polices its own content; if nature does not verify some idea or theory, that idea or theory is thrown out and replaced by something different that does a better job of explaining how nature works.  At any moment in historical time, the positions throughout science are expected to be the best we collectively know at that moment.  Interpretations and alternative views outside the present “best-we-know” consensus are the right and privilege of anyone who thinks about science, but until those interpretations and views start making better explanations of nature than the consensus, they are ignored (and, speaking as a scientist, laughed at).

Though many of the humanities are somewhat more “scientific” than in the past — for instance, history being more and more seen as a forensic science striving to recreate the most reasonable scenes of history — they are by definition focused on the non-veridical rather than the veridical.  They are justified in education, again, because they aid and “sharpen” the non-veridical to deal with the veridical with more insight than we have shown in the past.  The problems we face in the future are better handled not only with knowledge and application of science, math, engineering, and technology, but also with knowledge of what we think about, of what we imagine, of the good and bad decisions we have made collectively and individually in the past, and of the myriad ways we can express ourselves, especially about the veridical “real” world.  Since the original sources of these “humanities” studies are seen as being as applicable today as when they were written, since they, unlike Newton, were not describing reality but only telling often imaginative, indemonstrable, and unverifiable stories about human behavior to which humans today can still relate, the original authors’ versions are usually preferred over modern “re-hashes” of the original story-telling.  The interest in the humanities lies in relating to the non-veridical side of the human brain/mind, while the interest in the sciences lies in the world reflecting the same thing being said about it; Newton’s laws of motion are “cool” not because of the personality and times of Isaac, but because they appear to most people today “true;” Hamlet’s soliloquies are “cool” not because they help us understand the world around us, but because they help us understand and deal with our non-veridical selves, which makes their creator, Shakespeare, also “cool;” the laws of motion, not Newton, are relevant today, but Shakespeare’s play is relevant today because in its original form it still leads to a myriad of possibly useful interpretations.  What leads to veridical “truth” is independent of its human source; what leads to non-veridical “stories” is irrevocably labeled by its originator.

To finally state my bracketed point on altered education, as begged above the opening bracket: science, math, and engineering curricula should be expanded to include important historical details of scientific ideas, so that the expulsion of the bad ideas of the past, as well as the presentation of the good ideas of the present, is included.  Including the reasons the expunged ideas are not part of the curriculum today would be the “self-critical” part of science courses.  Science teachers would be reluctant to add anything to the curriculum because of lack of time, true enough, but the clever science teacher can find the few seconds needed for such additions by being more anecdotal in their lessons, which would require them to be more knowledgeable of the history and philosophy of science.  Hence, all the curricula in education suggested by perception theory would be similar — cast in the universal “presentation of X, good things about X, and bad things about X” mold.]

III.  The Origin of Politics (The “Toxic Twin”) – Perception is Everything [Jan., 2016] makes dealing with human politics straightforward, in that politics not only originated, in all likelihood, just as religion and its attendant theology originated, but has developed along lines so similar to theology’s that politics could be considered the “toxic twin” of theology, in that it can turn as toxic (dangerous) to humanity as theology can.  (Citizens! (I) Call For the Destruction of the Political Professional Class [Nov., 2012], Citizens! (II) The Redistribution of Wealth [Jan., 2013], Citizens! (III) Call for Election Reform [Jan., 2013], The United States of America — A Christian Nation? [June, 2012], An Expose of American Conservatism — Part 1 [Dec., 2012], An Expose of American Conservatism — Part 2 [Dec., 2012], An Expose of American Conservatism — Part 3 [Dec., 2012], Sorting Out Jesus [July, 2015], At Last, a Probable Jesus [August, 2015], & Jesus — A Keeper [Sept., 2015])  In order for us to survive in our hunter-gatherer past, leaders and organizers were apparently needed as much as shamans, or proto-priests; someone or a group of someones (leader, chief, council, elders, etc.) had to decide the best next thing for the collective group to do (usually regarding the procuring of food for the group’s next eating session, or regarding threats to the group from predators, storms, or enemy groups over the next hill, etc.); just as someone was approached to answer the then-unanswerable questions, like where storms come from and why so-and-so had to die, leaders of the group were looked to for solving the group’s practical and social problems.  In other words, politics evolved out of necessity, just like religion.  Our non-veridical capabilities produced politics to meet real needs, just as they produced religion to meet real needs.

But, just as theology can go toxic, so can politics and politics’ attendant economic theory.  Voltaire’s statement that those who can make you believe absurdities can make you commit atrocities applies to political and economic ideology just as it does to gods and god stories.  Anything based purely upon non-veridical imagination is subject to Voltaire’s statement.  However, I think politics has an “out” that theology does not.  Theology is epistemologically trapped, in that one god, several gods, or any god story cannot be shown to be truer (better in describing reality) than another god, other several gods, or another god story.  Politics is not so trapped, in my opinion, as it does not have to be “attached at the hip” to religion, as has been demonstrated in human history since the 18th century.  Politics can be shown to be “better” or “worse” than its previous version by comparing the political and social outcome of “before” with “after.”  No political solution solves all human problems, if for no other reason than that such problems continually evolve in a matter of weeks or less, and no political installment can anticipate the problems it will encounter, even when it has solved the problems of the “before.”  Nonetheless, I think one can argue that the fledgling United States of America created by the outcome of the American Revolution and the birth of the U.S. Constitution was better than the colonial regime established in the 13 colonies under the reign of George III.  The same can be said about the independent nations that emerged peacefully from the British Empire into its Commonwealth, like India, Canada, and Australia, though the USA, India, Canada, and Australia were not and are not perfect or free from “birth pangs.”

What are the political attributes that are “better” than what was “before?”  Many of the references cited just above point out many of them, a list I would not claim to be complete or sufficient.  Overall, however, the history of Western and Eastern Civilization has painfully demonstrated, at the cost of the spilled blood of millions (Thirty Years’ War, Napoleonic Wars, World War I, World War II, etc.), that theocracies and monarchies are “right out.”  [Here I am applying the philosophy that history is not so much a parade of great individuals, but, rather, is more aptly seen as a parade of great ideas — a parade of non-veridical products much better than other such products.]  Direct democracies work only for small populations, so a representative form of government, a republic, works for the larger populations of the modern world.  Clearly, secular autocracies and dictatorships are also “right out.”  Class structure of privilege and groundless entitlement still rears its ugly head even in representative republican governments in the form of rule-by-the-few of power (oligarchies) and/or wealth (plutocracies).  To prevent oligarchies and plutocracies, elected representative government officials should be limited in how long they can serve, so that they cannot become a political professional class (limited terms of office); in addition, politicians should be paid in such a way that they cannot make a profit from office.

[Almost exactly the same things can be said of government work staffs and other non-elected officials — the bureaucrats of “big government.”  Terms of service should be on a staggered schedule of limitations so that some “experience” is always present among both the elected and their staffs; bureaucrats should be paid in such a way that they cannot become a professional class of “bean-counters” at taxpayer expense; public service should be kept based upon timely representation, and civil service should be kept based upon a system of timely merit; politicians are elected by voters, and bureaucrats are selected by civil service testing — both groups subject to inevitable replacement.]

This, in turn, calls for severe restrictions on lobbying of elected officials of all types (making lobbying a crime?).  Preventing oligarchies and plutocracies of any “flavor” can only be effective if the overall political philosophy applied is a liberal one (“liberal” meaning the opportunity to achieve wealth, power, and influence while simultaneously working so that others around you (all over the globe) can achieve the same, all without unjust expense to someone else’s wealth, power, and influence).  The philosophy of such a liberal posture I call “liberalist,” meaning that freedom, equality, and brotherhood (the liberte, egalite, and fraternite of the French Revolution) are all three held constantly at equal strength.  When one or two of the three are reduced by the relative boosting of the remaining two or one, respectively, then things like the atrocities of the French Terror, the atrocities of fascism, the atrocities of communism, or the atrocities of unregulated capitalism result.

[The word “equality” in political philosophy as used above must be distinguished from the “equality” issue of education in II. above.  When the Declaration of Independence declares that “all men are created equal,” that does not mean equal in knowledge, talents, and skills; rather, it means a shared, universal entitlement to basic human rights, such as, in the Declaration’s words, “life, liberty, and the pursuit of happiness.”  We all have equal rights, not equal educational results; equal rights do not mean equal brain/minds — something the Terror tragically and horribly did not grasp; equal rights to education do not mean equal knowledge, talents, and skills for graduates — something too many “educators” tragically do not grasp.  Perception theory would suggest political equality is different from educational equality; the word “equality” must be understood in its context if the appropriate adjective is not used with the noun “equality.”  The difference is crucial; political equality is vital to the healthy social organization of the species, while educational “equality” (equal results, not equal opportunity) is tragic and harmful to the individual brain/minds of the species.  Awareness of this difference, or always making this semantic distinction, should avoid unnecessary confusion.]

Certain Western European countries, such as the Scandinavian countries, have shown the future of political systems toward which all nations should strive in accordance with liberal, liberalist views.  If anything is needed by the population at large, then a socialist program is called for to deal with all fairly — such as social security, free public education through university level, postal service, public transportation, universal single-payer health care, public safety, state security, and “fair-share” taxation of all who earn and/or own.  No one is allowed to achieve personal gain through regulated capitalism or through leadership in any of these socialist programs except upon merit, meaning his/her gain (in wealth, power, and/or influence) is not at the unjust loss of someone else, and is based solely upon the successful individual’s talents, skills, and knowledge; competition in capitalism and in program leadership is both necessary and in need of limitations.  It is OK to “lose” in the game of capitalism, as long as one loses “fair and square;” every business success and every business failure must be laid at the feet of the entrepreneur.  The political system with its social programs is merely the crucible of both individual success and individual failure, a crucible continually monitored and regulated so as to assure perpetual and equal opportunity for all.  Regulation of the political system crucible is achieved by the electors of political leadership and program leadership — regulation keeping the programs, like capitalism, perpetually merit-based, fair, and just.  This is a system of “checks and balances” toward which every political system should strive.

The foregoing is not a description of some “pie-in-the-sky” Utopia; it is a description of what history has painfully taught us as “the way” of avoiding a theology-like toxicity for politics.  Politics is not doomed to be theology’s “toxic twin;” it will be so doomed only if the bloody lessons of its past are not heeded.  In my opinion, it really is not complicated: it is better to liberally trade, tolerate, and befriend than to conservatively exploit, distrust, and demonize.  Politically speaking, we need to securely develop a xenophilia to replace our prehistoric and insecure xenophobia.  This “xeno-development” is one of the great lessons taught by the modern world over the last 300 years, and it is begged by perception theory.

RJH

 

Perception Is Everything

Recently a model of human perception has occurred to me.  Perception is like that “screen” of appearance before us in our waking hours that is turned off when we are asleep.  Yet it appears not to turn off entirely during slumber, as when we remember dreams we had just before we awoke.  The moments just before we “nod off” or just as we awake seem like times when perception is “half-way” turned on.  The “fuzziness” of this “half-way switch” is clearly apparent on those mornings we awake and momentarily do not know exactly where we slept.

 

Say I am sitting in an enclosed room with a large card painted uniformly with a bright red color.  For simplicity, I focus only upon my visual sensation, suppressing the fact that I am also sensing the tactile signals of sitting in a chair with my feet on the floor, as well as peripherally seeing “in the corner of my eye” the walls and other features of the room; I am only visually observing the color “red.”  Light from the card enters my eyes and is photo-electrically and electro-chemically processed into visual signals down my optic nerve to the parts of my brain responsible for my vision.  The result of this process is the perception of the color “red” on the “screen” of my perception.  If I were to describe this perception to myself, I would simply imagine the word “red” in my head (or the word for “red” in some other language, if my “normal” spoken language were not English); were I to describe this perception to someone else in the room, say, a friend standing behind me, I would say, “I am seeing the color red,” again in the appropriate language.

Yet, if my friend could somehow see into my head and observe my brain as I claimed to be seeing red, that person would not experience my sensation or perception of “red.” He/she would see, perhaps with the help of medical instrumentation, biochemical reactions and signals on and in my brain cells. Presumably when I perceive red at a different moment in time later on, the observer of my brain would see the same pattern of chemical reactions and bio-electrical signals.

 
On the “screen” of my perception, I do NOT see the biochemistry of my brain responsible for my perception of red; were I to observe inside the head of my friend in the room while he/she was also focusing on the red card, I would NOT see his/her “screen” of perception, but only the biochemical and bio-electrical activity of his/her brain.  It is IMPOSSIBLE to experience (to perceive) both the subjective perception of red and the biochemistry responsible for that same subjective perception within the same person.  We can hook up electrodes from our own head to a monitor that we observe at the same time we look at red, but we would only be seeing another representation of the biochemistry forming our perception, not the biochemistry itself, as well as perceiving the red perception.  I call this impossibility “the subjective trap.”

 
And yet, my friend and I make sense of each of our very individual impossibilities, of each of our very personal subjective traps, by behaving as if the other perceives red subjectively exactly the same, and as if the biochemical patterns in our respective brains are exactly the same.  We are ASSUMING these subjective and biochemical correlations are the same, but we could never show this is the case; we cannot prove the perceptions in our own head are the same perceptions in other heads; we can never know that we perceive the same things that others around us perceive, even if focusing upon the exact same observation.  The very weak justification of this assumption is that we call our parallel perceptions, in this scenario, “red.”  But this is merely the learning of linguistic labels.  What if I were raised in complete isolation and was told that the card was “green?”  I would say “green” when describing the card while my friend, raised “normally,” would say “red.”  (Note I’m stipulating neither of us is color blind.)  Such is the nature of the subjective trap.

 
[If one or both of us in the room were color-blind, comparison of visual perceptions in the context of our subjective traps would be meaningless — nothing to compare or assume. In this scenario, another sensation both of us could equally perceive, like touching the surface of a piece of carpet or rubbing the fur of a cute puppy in the room with us, would be substituted for seeing the color red.]

 
The subjective trap suggests the dichotomy of “objective” and “subjective.”  What we perceive “objectively” and what we perceive “subjectively” do not seem to overlap (though they seem related and linked), leading to a separation of the two adjectives in our culture, a separation which has a checkered history.  Using crude stereotypes, the sciences claim objectivity is good while subjectivity is suspect, while the liberal arts (humanities) claim subjectivity is good while objectivity is ignorable.  Even schools, colleges, and universities are physically laid out with the science (including mathematics and engineering) buildings on one end of the campus and the liberal arts (including social studies and psychology) buildings on the other.  This is the “set-up” for the “two cultures’” “war of words.”  I remember, as an undergraduate physics major, debating an undergraduate political science major as we walked across campus over which has had the greater impact upon civilization, science or politics.  We soon came to an impasse, an impasse that possibly could be blamed, in retrospect over the years, on the subjective trap.  Ideas about the world outside us seemed at odds with ideas about our self-perception; where we see ourselves seemed very different from how we see ourselves; what we are is different from who we are.

Yet, despite being a physics major and coming down “hard” on the “science side” of the argument, I understood where the “subjective side” was coming from, as I was in the midst of attaining, in addition to my math minor, minors in philosophy and English; I was a physics major who really “dug” my course in existentialism. It was as if I “naturally” never accepted the “two cultures” divide; it was as if I somehow “knew” both the objective and the subjective had to co-exist to adequately describe human experience, to define the sequence of perception that defines a human’s lifespan. And, in this sense, if one’s lifespan can be seen as a spectrum of perception from birth to death of that individual, then, to that individual, perception IS everything.

How can the impossibility of the subjective trap be modeled? How can objectivity and subjectivity be seen as a symbiotic, rather than as an antagonistic, relationship within the human brain? Attempted answers to these questions constitute recent occurrences inside my brain.

 

Figure 1 is a schematic model of perception seen objectively – a schematic of the human brain and its interaction with sensory data, both from the world “outside” and from the mind “inside.”  The center of the model is the “world display screen,” the result of a two-way flow of data, empirical (or “real world” or veridical) data from the left and subjective (or “imaginative” or non-veridical) data from the right.  (Excellent analogies to the veridical/non-veridical definitions are the real image/virtual image definitions in optics; real images are those formed by actual rays of light, and virtual images are those of appearance, only indirectly formed by light rays due to the way the human brain geometrically interprets signals from the optic nerves.)  [For an extensive definition of veridical and non-veridical, see At Last, a Probable Jesus [August, 2015].]  Entering the screen from the left is the result of empirical data processed by the body’s sense organs and nervous system, and entering the screen from the right is the result of imaginative concepts, subjective interpretations, and ideas processed by the brain.  The “screen” or world display is perception emerging to the “mind’s eye” (shown on the right, “inside the brain”), created by the interaction of this two-way flow.

 
Figure 1 is how others would view my brain functioning to produce my perception; Figure 1 is how I would view the brains of others functioning to produce their perceptions.  This figure helps define the subjective trap in that I cannot see my own brain as it perceives; all I can “see” is my world display screen.  Nor can I see the world display screens of others; I can only view the brains of others (short of opening up their heads) as some schematic model like Figure 1.  In fact, Figure 1 is a schematic representation of what I would see if I were to peer inside the skull of someone else.  (Obviously, it is grossly schematic, bearing no resemblance to brain, nervous system, and sense organ physiology.  Perhaps the many far more proficient in neuro-brain function than I, and surely such individuals in the future, can and will correlate the terms on the right side of Figure 1 with actual parts of the brain.)

 
Outside data collectively is labeled “INPUT” on the far left of Figure 1, bombarding all the body’s senses — sight, sound, smell and taste, heat, and touch.  Data that stimulates the senses is labeled “PERCEPTIVE” and either triggers the autonomic nervous system to the muscles for immediate reaction (sticking your fingers into a flame), requiring no processing or thinking, or goes on to be processed as possible veridical data for the world display.  However, note that some inputs for processing “bounce off” and never reach the world display; if we processed the entirety of our data input, our brains would “overload,” using up all brain function for storage and having none for consideration of the data “let in.”  This overloading could be considered a model for so-called “idiot savants” who perceive and remember so much more than the “average” person (“perfect memories”), yet have subnormal abilities for rational thought and consideration.  Just how some data is ignored and some is processed is not yet understood, but I would guess that it is a process that differs in every developing brain, resulting in no two brains, even those of twins, accepting and rejecting data EXACTLY alike.  What is for sure is that we have evolved “selective” data perception over hundreds of thousands of years that has assured our survival as a species.
The accepted, processed data that enter our world display in the center of Figure 1 as veridical data from the outside world make up the “picture” we “see” on our “screen” at any given moment, a picture dominated by the visual images of the objects we have before us, near and far, but also supplemented by sound, smell, tactile information from our skin, etc.  (This subjective “picture” is illustrated in Figure 2.)  The “pixels” of our screen, if you please, enter the subjective world of our brain shown on the right of Figure 1 in four categories – memory loops, ideas, self-perception, and concepts – as shown by the double-headed, broad, and straight arrows penetrating the boundary of the world display with the four categories.  The four categories “mix and grind” this newly-entered data with previous data in all four categories (shown by crossed and looped broad, double-headed arrows) to produce imagined and/or reasoned data cast back upon the same world display as the moment’s “picture” – non-veridical data moving from the four categories back into the display (thus, the “double-headedness” of the arrows).  Thus can we imagine things before us that are not really there at the moment; we can, for instance, imagine a Platonic “perfect circle” (non-veridical) not really there upon a page of circles actually “out there” drawn upon a geometry textbook’s page (veridical) at which we are staring.  In fact, the Platonic “perfect circle” is an example of a “type” or “algorithmic” or symbolic representation for ALL circles, created by our subjective imagination so we do not have to “keep up” with all the individual circles we have seen in our lifetime.  Algorithms and symbols represent the avoidance of brain overload.
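For readers who think in code, this two-way flux can be caricatured in a few lines.  What follows is a minimal sketch only, under toy assumptions; the names (Brain, ATTENTION_CAPACITY, and the like) are my own illustrative inventions, not part of the model itself:

```python
# A toy sketch of the two-way flux in Figure 1 (illustrative only; the class
# and constant names here are hypothetical, not part of the model itself).

import random

ATTENTION_CAPACITY = 5  # cap on accepted input, modeling data that "bounces off"

class Brain:
    def __init__(self):
        # The four non-veridical categories on the right of Figure 1.
        self.categories = {"memory_loops": [], "ideas": [],
                           "self_perception": [], "concepts": []}
        self.world_display = []  # the "screen" of perception

    def sense(self, raw_input):
        """Veridical flux: some data bounces off, the rest is displayed."""
        accepted = raw_input[:ATTENTION_CAPACITY]  # crude selective perception
        self.world_display = list(accepted)
        # The double-headed arrows: accepted data also enters the categories.
        for datum in accepted:
            random.choice(list(self.categories.values())).append(datum)

    def imagine(self):
        """Non-veridical flux: the categories mix old and new data and cast
        an imagined or reasoned result back onto the same display."""
        pool = [d for cat in self.categories.values() for d in cat]
        if pool:
            self.world_display.append(("imagined", random.choice(pool)))

brain = Brain()
brain.sense(["red card", "chair pressure", "wall", "hum", "footsteps", "draft"])
brain.imagine()
print(brain.world_display)  # five veridical "pixels" plus one non-veridical
```

The only point of the sketch is the data flow: a capacity-limited veridical intake (some input never reaches the screen, avoiding “overload”), and the four categories casting non-veridical products back onto the same display.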

 
From some considered input into our four categories of the brain come “commands” to the muscles and nervous system to create OUTPUT and FEEDBACK into the world outside us in addition to the autonomic nerve commands mentioned above, like the command to turn the page of the geometry text at which we are looking. Through reactive and reflexive actions, bodily communication (e.g. talking), and environmental manipulation (like using tools), resulting from these feedback outputs into the real world (shown at bottom left of Figure 1), we act and behave just as if there had been an autonomic reaction, only this time the action or behavior is the result of “thinking” or “consideration.” (The curved arrow labeled “Considered” leading to the muscles in Figure 1.)

 

Note how Figure 1 places epistemological and existential terms like CONSCIOUSNESS, Imagination, Knowing, Intention & Free Will, and Reason in place on the schematic, along with areas of the philosophy of epistemology, like Empiricism, Rationalism, and Existentialism (at the top of Figure 1). These placements are my own philosophical interpretations and are subject to change and placement alteration indicated by a consensus of professional and amateur philosophers, in conjunction with consensus from psychologists and brain physiologists, world-wide.
Figure 2 is a schematic of the “screen” of subjective perception that confronts us at every moment we see, hear, smell, taste, and/or touch.  Figure 2 is again crudely schematic (like Figure 1), in this case devoid of the richness of the signals of our senses processed and displayed to our “mind’s eye.”  Broad dashed arrows at the four corners of the figure represent the input to the screen from the four categories on the right of Figure 1 – memory loops, ideas, self-perception, and concepts.  Solid illustrated objects on Figure 2 represent processed, veridical, and empirical results flowing to the screen from the left in Figure 1, and dashed illustrated objects on Figure 2 represent subjective, non-veridical, type, and algorithmic results flowing to the screen from the right in Figure 1.  Thus Figure 2 defines the screen of our perception as a result of the simultaneous flow of both veridical and non-veridical making up every waking moment.

[Image: PerceptPic1]

Figure 1 — A Model of the Objectivity of Perception

 

(Mathematical equations cannot be printed in dashed format, so the solid equations and words, like History, FUTURE, Faith, and PRESENT, represent both veridical and non-veridical forms; note I was able to represent the veridical and non-veridical forms of single numbers, like “8” and certain symbols, like X, equals, and does not equal.) Thus, the solid lightning bolt, for example, represents an actual observed bolt in a thunderstorm and the dashed lightning bolt represents the “idea” of all lightning bolts observed in the past.

 

The “subjective trap” previously introduced above is defined and represented by the rule that nothing of Figure 1 can be seen on Figure 2, and vice-versa. In my “show-and-tell” presentation of this perception model encapsulated in both figures, I present the figures standing on end at right angles to each other, so that one figure’s area does not project upon the area of the other – two sheets slit half-height so that one sheet slides into the other. Again, a) Figure 2 represents my own individual subjective screen of perception no one else can see or experience; b) Figure 1 represents the only way I can describe someone else allegedly perceiving as I. I cannot prove a) and b) are true, nor can anyone else. I can only state with reasonable certainty that both someone else and I BEHAVE as if a) and b) are true. In other words, thanks to the common cultural experience of the same language, my non-color-blind friend and I in the room observing the red-painted card agree the card “is red.” To doubt our agreement that it is red would stretch both our limits of credulity into absurdity.

 
The model described above and schematically illustrated in Figures 1 and 2 can be seen as one way of describing the ontology of human beings, of describing human existence.  Looking at Figure 1, anything to the left of the world display screen is the only way we know anything outside our brain exists, and anything to the right of the world display screen is the only way we know we as “I’s” exist in a Cartesian sense; anything to the right is what we call our “mind,” and we assume we think with our mind; in the words of Descartes, “I think, therefore I am.”  We see our mind as part of the universe being “bombarded” from the left, so we think of ourselves as part of the universe.  Modern science has over the centuries given us some incredible ontological insights, such as that all physical existence is made up of atoms and molecules and elementary particles; we can objectively or “scientifically” describe our existence, but we do so, as we describe anything else, with our subjective mind; we, as self-conscious beings, describe the veridical in the only way we possibly can – non-veridically.  Thus the model suggests an incredible statement made by scientists and philosophers of science lately.  Recalling that atoms are created in the interiors of stars (“cooked,” if you please, by nuclear fusion inside stars of various sizes and temperatures) that have long since “died” and spewed out their atoms in

[Image: PerceptPic2]

Figure 2 — A Model of the Subjectivity of Perception (The “Screen”)

 

contribution to the formation of our own solar system around 4.6 billion Earth years ago, and recalling that our bodies, including our brains, are made of molecules made from the atoms of those dead and gone stars, the statement “We are ‘star-stuff’ in self-contemplation” makes, simultaneously, objective and subjective, or scientific and artistic, “spiritual sense.”

We can veridically “take in,” “observe,” “experience,” or “contemplate” anything from the vast universe outside our body as well as the vast universe inside our body outside our brain while at the same time we can imagine non-veridically limitless ways of “making sense” of all this veridical data by filing it, storing it, mixing it, and thinking about it, all within our brain. We are limitless minds making up part of a limitless universe.

 

As if that were not enough, each of us, as a veridical/non-veridical “package of perception,” is unique.  Every human has a unique Figure 1 and a unique Figure 2.  Our existence rests upon the common human genome of our species, the genetic “blueprint” that specifies the details of our biological existence.  Yet every individual’s genome is different from every other (even if only by 0.1%, or a factor of 0.001), considering that mutations, even for identical twins, make their two “blueprints” slightly different once the two organisms exist as separated zygotes in the womb.  Moreover, how we behave, and, therefore, how we respond non-veridically to the veridical data we receive individually, even from the same environment shared by others, is shaped by the unique series of experiences each of us has had in our past.  Hence, each person is a unique individual genome subjected to unique environmental experiences, the exact copy of which cannot possibly statistically exist.
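To get a feel for how overwhelming that uniqueness claim is numerically, here is a back-of-envelope illustration; the figures of roughly three billion base pairs and roughly 0.1% variation between individuals are commonly cited approximations supplied by me, not numbers from the model itself:

```python
# Back-of-envelope arithmetic on genomic uniqueness (illustrative figures).

base_pairs = 3_000_000_000   # approximate size of the human genome
variation_rate = 0.001       # ~0.1% difference between two individuals

differing_sites = int(base_pairs * variation_rate)
print(f"Two 'typical' genomes differ at roughly {differing_sites:,} sites.")

# Even if each differing site could take only two values, the number of
# possible combinations (2 ** differing_sites) dwarfs the ~100 billion
# humans who have ever lived -- before unique experience is even considered.
```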

 

The world display screen of an individual in any given moment has never been perceived before, nor will it ever be perceived again, as in the next moment the screen is modified by the dual flux, veridical from the left and non-veridical from the right in Figure 1.  The life of an individual is a series of receiving this ever-changing dual flux and thinking or acting in the real world upon the basis of this dual flux; it is a series of two-way perceptions.  The life of an individual is observed by another individual as a series of perceived behaviors assumed, but never proven, to be generated in the same way as those of the observer.  All in the span of a human life is perception; to an individual human being, perception has to be everything.

 

This model suggests to me the absurdity of having objectivity and subjectivity irreconcilably separate; it suggests, rather, that they are inseparable; they go together like, in the words of the song, “horse and carriage” or “love and marriage.” The blending of objective data and imaginative concepts in our brain makes our perception, our conscious “everything,” or existence as a self-conscious being, if you please, possible. What we are is the veridical of our screen of perception; who we are is the non-veridical of the screen. In other words, the scientist is as potentially subjective as the poet, and the poet is as potentially objective as the scientist; they differ only in the emphases on the contents of their respective screens of perception. For the “two sides” of campuses of higher learning to be at “war” over the minds of mankind is absurd – as absurd as the impasse the political science major and I reached in conversation so many years ago.

 
If the above were all the model and its two figures did, its conjuring would have been well worth it, I think, but the above is just the tip of the iceberg of how the model can be applied to human experience.  Knowing how prone we are to hyperbole when talking about our “brain children,” I nonetheless feel compelled to suggest this model of perception can be intriguingly applied to almost any concept or idea the human brain can produce – in the sense of alternatively defining the concept using “both worlds,” both the objective and the subjective, instead of using one much more than the other.  In other words, we can define with this model almost anything more “humanly” than before; we can define and understand almost anything with “more” of ourselves than we have done in the past.

 

Take the concept of the human “soul” for example.  It seems to me possible that cultures using the concept of soul, whether in a sacred or secular sense, whether in the context of religion or psychology, are close to using the concept of the “mind’s eye” illustrated in Figure 1 of the model.  The “mind’s eye” is the subjective “I,” the subjective observer of the screen, the “see-er,” the “smell-er,” the “taste-er,” the “hear-er,” the “touch-er,” the “feel-er” of perception; the soul is the active perceiver of subjective human experience.  The soul defines self-consciousness; it is synonymous with the ego.  This view is consistent with the soul being defined as the essence of being alive, as being that which “leaves” the body upon death.  Objectively, we would say that death marks the ceasing of processing veridical data; subjectively, we would say that death marks the ceasing of producing non-veridical data and the closing of the “mind’s eye.”

 

Yet the soul is a product of the same physiology as the pre-conscious “body” of our evolutionary ancestors. In other words, the soul “stands upon the shoulders” of the id, our collection of instincts hewn over millions of years. So, in addition, we would objectively say that death also marks the ceasing of “following” our instincts physically and mentally; our unique, individual genome stops defining our biological limitations and potentialities. The elements of our body, including our brain, eventually blend to join the elements of our environment. Objectively, we would say death marks our ceasing to exist as a living being. The concept of the soul allows death to be seen as the “exiting” or “leaving” of that necessary to be called “alive.”

 
So, the concept of the soul could be discussed as the same as or similar to the concept of the ego, and issues such as when a developing human fetus (or proto-baby) develops or “receives” a soul/ego, an issue which in turn has everything to do with the issue of abortion, can be discussed without necessarily coming to impasses.  (See my The ‘A’ Word – Don’t Get Angry, Calm Down, and Let Us Talk, [April, 2013] and my The ‘A’ Word Revisited (Because of Gov. Rick Perry of Texas), or A Word on Bad Eggs [July, 2013])  I said “could be,” not “will be,” discussed without possibly coming to impasses.  Impasses between the objective and subjective seem more the norm than the exception, unfortunately; the “two cultures war” appears ingrained.  Why?

 
Earlier, I casually mentioned the answer the model provides to this “Why?”.  The scientist/engineer and the artist/poet differ in their emphases of either the veridical flux to the world display screen or the non-veridical flux to the same world display screen of their individual brains.  By “emphasis” I merely mean the assigning of more importance by the individual to one flux direction or the other in his/her head.  At this point, one is reminded of the “left-brain, right-brain” dichotomy dominating brain/mind modeling since the phenomenon of the bicameral mind became widely accepted.  The perception model being presented here incorporates on the non-veridical side of the perception screen both analytical (“left brain”) activity and emotional (“right brain”) activity in flux to the screen from the right side of Figure 1.  Just as my use of left/right in Figure 1 is not like the use of left/right in bicameral mind/brain modeling, this model of perception is not directly analogous to bicameral modeling.  What the perception model suggests, in my opinion, is that the analytical/emotional chasm of the human brain is not as unbridgeable as the “left-brain-right-brain” view might suggest.

More specifically, the perception model suggests that the “normal” or “sane” person keeps the two fluxes to the world display screen in his/her head “in balance,” always one flux mitigating and blending with the other.  It is possible “insanity” might be the domination of one flux over the other so great that the dominated flux is rendered relatively ineffective.  If the veridical flux is completely dominant, the person’s mind is in perpetual overload with empirical data, impotent to sort or otherwise deal with the one-way bombardment on his/her world display screen; such a person would presumably be desperate to “turn off” the bombardment; such a person would be driven to insanity by sensation.  If the non-veridical flux is completely dominant, the person’s mind is in a perpetual dream of self-induced fantasy, sensing with all senses that which is NOT “out there;” such a person would be driven to insanity by hallucination.  In this view, the infamous “acid trips” of the 1960’s induced by hallucinatory drugs such as LSD could be seen as self-induced temporary periods in which the non-veridical flux “got the upper hand” over the veridical flux.
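A toy classifier makes the balance idea concrete.  This is a sketch under invented assumptions; the perception model itself assigns no numbers, and the thresholds below are arbitrary illustrations of mine:

```python
# A toy "flux balance" classifier (hypothetical thresholds; the perception
# model itself assigns no numbers -- this only caricatures the balance idea).

def mental_state(veridical_flux: float, non_veridical_flux: float) -> str:
    """Classify the world display screen's state by its flux ratio."""
    total = veridical_flux + non_veridical_flux
    if total == 0:
        return "no flux to the screen (dreamless sleep?)"
    ratio = veridical_flux / total
    if ratio > 0.95:
        return "sensory overload (insanity by sensation)"
    if ratio < 0.05:
        return "perpetual fantasy (insanity by hallucination)"
    return "balanced ('sane') perception"

print(mental_state(10.0, 8.0))  # balanced ('sane') perception
print(mental_state(10.0, 0.1))  # sensory overload (insanity by sensation)
print(mental_state(0.1, 10.0))  # perpetual fantasy (insanity by hallucination)
```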

This discussion of “flux balance” explains why dreams are depicted in Figure 1 as “hovering” just outside the world display screen.  The perception model suggests dreams are the brain’s way of keeping the two fluxes in balance, keeping us as “sane” as possible.  In fact, the need to keep the fluxes in balance, seen as the need to dream, may explain why we and other creatures with large brains apparently need to sleep.  We need “time outs” from empirical data influx (not to mention “time outs” just to rest the body’s muscular system and other systems) to give dreaming the chance to balance out the empirical with the fanciful on the stage of the world display.  Dreams are the mixtures of the veridical and non-veridical not needed to be stored or acted upon, played out in order to prevent overload from the fluxes of the previous day (or night, if we are “night owls”); they play out without being perceived in our sleeping unconsciousness (except for the dreams we “remember” just before we awaken), like files in computer systems sentenced to the “trash bin” or “recycle bin” marked for deletion.  Dreams can be seen as a sort of “reset” procedure that readies the world display screen for the upcoming day’s (or night’s) two-way flux flow that defines our being awake and conscious.
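Pressing the post’s own computer analogy a little further, here is a toy sketch of dreaming as a “recycle bin” reset; all names (WorldDisplayScreen, unprocessed_residue, and so on) are my own illustrative inventions:

```python
# A toy sketch of dreaming as a "recycle bin" reset of the world display
# screen (all names here are my own illustrative inventions).

class WorldDisplayScreen:
    def __init__(self):
        self.unprocessed_residue = []  # the day's flux not stored or acted upon

    def waking_moment(self, veridical, non_veridical):
        # Residue accumulates from the day's two-way flux.
        self.unprocessed_residue.append((veridical, non_veridical))

    def dream_and_reset(self):
        # Dreams "play out" the residue, then delete it, like files in a
        # recycle bin, readying the screen for the next day's two-way flux.
        replayed = list(self.unprocessed_residue)
        self.unprocessed_residue.clear()
        return replayed  # the few we "remember" just before waking

screen = WorldDisplayScreen()
screen.waking_moment("thunderstorm seen", "idea of all lightning bolts")
screen.waking_moment("red card seen", "the imagined word 'red'")
print(screen.dream_and_reset())    # the night's dream material
print(screen.unprocessed_residue)  # [] -- screen reset for the new day
```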

This model might possibly suggest new ways of defining a “scientific, analytical mind” (“left brain”) and comparing that with an “artistic, emotional mind” (“right brain”).  Each could be seen as a slight imbalance (emphasis on “slight,” to remain “sane”) of one flux over the other, or, better, as two possible cases of one flux mitigating the other slightly more.  To think generally “scientifically,” therefore, would be when the non-veridical flux blends “head-on” upon the world display screen with the veridical flux and produces new non-veridical data that focuses primarily upon the world external to the brain; the goal of this type of non-veridical focus is to create cause/effect explanations, to problem-solve, to recognize patterns, and to create non-veridically rational hypotheses, or, as I would say, “proto-theories,” or scientific theories in-the-making.  Thus is knowledge about the world outside our brain increased.  To think generally “artistically,” on the other hand, would be when the non-veridical flux takes on the veridical flux upon the world display screen as ancillary only, useful in focusing upon the “world” inside the brain; the goal of this type of non-veridical focus is to create new ways of dealing with likes, dislikes, and emotions, to evoke “feelings” from morbid to euphoric, and to modify and form tastes, from fanciful thinking to dealing emotionally with the external world in irrational ways.  Thus is knowledge about what we imagine, and about what appears revealed to us inside our brain, increased.

With these two new definitions, it is easy to see that we have evolved as a species capable of being simultaneously both scientific and artistic, both “left-brain” and “right-brain;” as I said earlier, the scientist is as potentially subjective as the poet, and the poet is as potentially objective as the scientist.  We do ourselves a disservice when we believe we have to be one or the other; ontologically, we are both.  Applying the rule of evolutionary psychology that any defining characteristic we possess as a species that we pass on to our progeny was probably necessary today and/or in our past to our survival (or, at minimum, was “neutral” in contributing to our survival), the fact we are necessarily a scientific/artistic creature was in all likelihood a major reason we evolved beyond our ancestral Homo erectus and “triumphed” over our evolutionary cousins like the Neanderthals.  When we describe in our midst a “gifted scientist” or a “gifted artist,” we are describing a person who, in his/her individual, unique existence, purposely developed, probably by following his/her tastes (likes and dislikes), one of the two potentialities over the other.  The possibility that an individual can be gifted in both ways is very clear.  (My most memorable example of a “both-way” gifted person was when I, as a graduate student, looked in the orchestra pit at a production of Handel’s Messiah and saw in the first chair of the violin section one of my nuclear physics professors.)  Successful people in certain vocations, in my opinion, do better because of strong development of both their “scientific” and “artistic” potentialities; those in business and in service positions need the ability to solve problems and to deal with the emotions of colleagues and clientele, simultaneously and successfully.  Finding one’s “niche” in life and in one’s culture is a matter of taste, depending on whether the individual feels more comfortable and satisfied “leaning” one way or another, or being “well-rounded” in both ways.

Regardless of the results of individual tastes in individual circumstances, the “scientist” being at odds with the “artist” and vice-versa is always unnecessary and ludicrous; the results of one are no better or worse than those of another, as long as those results come from the individual’s volition (not imposed upon the individual by others).

 

From the 1960’s “acid rock, hard rock” song by Jefferson Airplane, Somebody to Love:

When the truth is found to be……lies!
And all the joy within you…..dies!
Don’t you want somebody to love?
Don’t you need somebody to love?
Wouldn’t you love somebody to love?
You better find somebody to love!

These lyrics, belted out by front woman Grace Slick, will serve as the introduction to two of the most interesting and most controversial applications of this perception theory. The first part about truth, joy, and lies I’ll designate as GS1, for “Grace Slick Point 1” and the second part about somebody to love I’ll designate as GS2.

Going in reverse order, GS2 to me deals with that fundamental phenomenon without which our cerebral species or any such species could not have existed – falling in love and becoming parents, or, biologically speaking, pair bonding.  The universal human theme of erotic love is the basis of so much of culture’s story-telling, literature, poetry, and romantic subjects of all genres.  Hardwired into our mammalian genome is the urge, at the onset of puberty, to pair-bond with another of our species and engage, upon mutual consent, in sexual activity.  If the pair is made of two different genders, such activity might fulfill the genome’s “real” intent of this often very complex and convoluted bonding – procreation of offspring; procreation keeps the genes “going;” it is easily seen as a scientific form of “immortality;” we live on in the form of our children, and in our children’s children, and so on.  Even human altruism seems to emerge biologically from the urge to propagate the genes we share with our kin.

Falling in love, or pair bonding, is highly irrational and, therefore, a very non-veridical phenomenon; love is blind.  When one is in love, the shortcomings of the beloved are ignored, because their veridical signals are probably blocked non-veridically by the “smitten;” when one is in love, and when others bring up any shortcomings of the beloved, they are denied by the “smitten,” often in defiance of veridical evidence.  If this were not so, if pair bonding were a rational enterprise, far fewer pair bonds would occur, perhaps threatening the perpetuation of the species into another generation.  [This irrationality of procreation was no better defined than in an episode of the first Star Trek TV series back in the 1960’s, wherein the half-human, half-alien (Vulcan) Enterprise First Science Officer Spock (played by Leonard Nimoy) horrifically went apparently berserk and crazy in order to get himself back to his home planet so he could find a mate (to the point of hijacking the starship Enterprise).  I think it was the only actual moment of Spock’s life on the series in which he was irrational (in which he behaved like us – fully human).]

GS1 is to me another way of introducing our religiosity, of asking why we are as a species religious.  This question jump-started me on my “long and winding road,” as I called it – a personal Christian religious journey in five titles, written in the order they need to be read: 1) Sorting Out the Apostle Paul [April, 2012], 2) Sorting Out Constantine I the Great and His Momma [Feb., 2015], 3) Sorting Out Jesus [July, 2015], 4) At Last, a Probable Jesus [August, 2015], and 5) Jesus – A Keeper [Sept., 2015].  Universal religiosity (which I take as an interpretation of GS1) is here suggested as being like the universality of the urge to procreate, though not nearly as ancient as GS2.  As modern humans emerged and became self-conscious, they had to socially bond into small bands of hunter-gatherers to survive and protect themselves and their children, and part of the glue holding these bands together was not only pair-bonding and its attendant primitive culture, but the development of un-evidenced beliefs – beliefs in gods and god stories – to answer the then unanswerable, like “What is lightning?” and “How will we survive the next attack from predators or the enemy over the next hill?”  In other words, our non-veridical faculties in our brain dealt with the “great mysteries” of life and death by making up gods and god stories to provide assurance, unity, fear, and desperation sufficient to make survival of the group more probable.  Often the gods took the shape of long-dead ancestors who “appeared” to individuals in dreams (At Last, a Probable Jesus [August, 2015]).  Not that there are “religious genes” like there are “procreate genes;” rather, our ancestors survived partly because the genes they passed on to us tended to make them cooperative for the good of the group bound by a set of accepted beliefs – gods and god stories; that is, bound by “religion.”

The “lies” part of GS1 has to do with the epistemological toxicity of theology (the intellectual organization of the gods and god stories) – religious beliefs are faith-based, not evidence-based, a theme developed throughout the five parts of my “long and winding road.”  On p. 149 of Jerry A. Coyne’s Faith vs. Fact: Why Science and Religion Are Incompatible (ISBN 978-0-670-02653-1), the author characterizes this toxicity as a “metaphysical add-on….a supplement demanded not by evidence but by the emotional needs of the faithful.”  No one theology can be shown to be truer than any other theology; all theologies assume things unnecessary and un-evidenced; yet all theologies declare themselves “true.”  As my personal journey indicates, all theologies are exposed by this common epistemological toxicity, yet it is an exposé made possible only since the Enlightenment of Western Europe and the development of forensic history in the form of, in the case of Christianity, higher Biblical criticism.  This exposé, in my experience, can keep your “joy” from dying because of “lies,” referring back to GS1.

Both GS1 and GS2 demonstrate the incredible influence of the non-veridical capabilities of the human brain. A beloved one can appear on the world display screen, can be perceived, as “the one” in the real world “out there,” and a god or the lesson of a god story can appear on the world display screen, can be perceived, as actually existing or as being actually manifest in the real world “out there.”

Putting GS1 in more direct terms of the perception model represented by Figures 1 and 2, non-veridical self-consciousness desires the comfort of understandable cause and effect as it develops from infancy into adulthood; in our brains we “need” answers — sometimes any answers will do; and the answers do not necessarily have to have veridical verification. Combining the social pressure of the group for conformity and cooperation, for the common survival and well-being of the group, with this individual need for answers, the “mind,” the non-veridical, epiphenomenal companion of our complex brain, creates a personified “cause” of the mysterious and a personified “answerer” to our nagging questions about life and death in general and in particular; we create a god or gods paralleling the created god or gods in the heads of those around us who came before us (if we are not the first of the group to so create). We experience non-veridically the god or gods of our own making through dreams, hallucinations, and other visions, all seen as revelations or visitations; these visions can be as “real” as the real objects “out there” that we sense veridically. (See At Last, a Probable Jesus [August, 2015] for examples of non-veridical visions, including some of my own.) Stories made up about the gods, often created to further explain the mysteries of our existence and of our experiences personally and collectively, combine with the god or gods to form theology. Not all of theology is toxic; but its propensity to become lethally dangerous to those who created it, when it is developed in large populations into what today are called the world’s “great religions,” and fueled by a clergy of some sort into a kind of “mass hysteria” (Crusades, jihads, ethnic “cleansings,” etc.), makes practicing theology analogous to playing with fire. As I pointed out in Jesus – A Keeper [Sept., 2015], epistemologically toxic theology is dangerously flawed. Just as we have veridically created the potential of destroying ourselves by learning how to make nuclear weapons of mass destruction, we have non-veridically created reasons for one group to try and kill off another group by learning how to make theologies of mass destruction; these theologies are based upon the “authority” of the gods we have non-veridically created and non-veridically “interpreted” or “listened to.” It is good to remember Voltaire’s words, or a paraphrase thereof: “Those who can make you believe absurdities can make you commit atrocities.”

Also remember, the condemnation of toxic theology is not the condemnation of the non-veridical; a balance of the veridical flux and the non-veridical flux was absolutely necessary in the past and absolutely necessary today for our survival as individuals, and, therefore, as a species. Toxic theology, like fantasy, is the non-veridical focused upon the non-veridical – the imagination spawning even more images without checking with the veridical from the “real world out there.” Without reference to the veridical, the non-veridical has little or no accountability toward being reliable and “true.” All forms of theology, including the toxic kind, and all forms of fantasy, therefore, have no accountability toward reality “out there” outside our brains. Harmony with the universe of which we are a part is possible only when the non-veridical focuses upon referencing the veridical, referencing the information coming through our senses from the world “out there.” This is the definition of “balance” of the two fluxes to our world display screens in our heads.

Comparing this balanced flux concept with the unbalanced one dominated by the non-veridical (remember the unbalanced flux dominated by the veridical is brain overload leading to some form of insanity), it is easy to see why biologist Richard Dawkins sees religiosity as a kind of mental disease spread like a mental virus through the social pressures of one’s sacred setting and through evangelism. Immersing one’s non-veridical efforts into theology is in my opinion this model’s way of defining Dawkins’ “religiosity.” In the sense that such immersion can often lead to toxic theology, it is easy to see the mind “sickened” by the non-veridical toxins. Whether Dawkins describes it as a mental disease, or I as an imbalance of flux dominated by the non-veridical, religiosity or toxic theology is bad for our species, and, if the ethical is defined as that which is good for our species, then toxic theology is unethical, or, even, evil.

To say that the gods and god stories, which certainly include the Judeo-Christian God and the Islamic Allah, are all imaginative, non-veridical products of the human mind/brain is not necessarily atheistic in meaning, although I can understand that many a reader would respond with “atheist!”  Atheism, as developed originally in ancient Greece and further developed after the European Enlightenment in both Europe and America, can be seen as still another form of theology, though a godless one, potentially as toxic as any other toxic theology.  Atheism pushing no god or gods can be as fundamentalist as any religion pushing a god or gods, complete with its dogma without evidence, creeds without justification, evangelism without consideration of the evangelized, and intolerance of those who disagree; atheism can be but another religion.  Atheism in the United States has, in my opinion, been particularly guilty in this regard.  Therefore, I prefer to call the conclusions about religion spawned by this perception model some form of agnosticism; non-veridical products of the brain’s imagination might be at their origin religious-like (lacking in veridical evidence, or dream-like, or revelatory, or hallucinatory) but should never be seen as credible (called epistemologically “true”) and worthy of one’s faith, belief, and tastes until they are “weighed” against the veridical information coming into the world display screen; and when they can be seen by the individual as credible, then I would ask why call them “religious” at all, rather than “objective,” “scientific,” “moral,” “good,” or “common sense.”  I suggest this because of the horrendous toxicity with which religion in general and religions in particular are historically shackled.

We do not have to yield to the dying joy of GS1 (When the truth is found to be lies, and all the joy within you dies!); GS2 (Love is all you need, to quote the Beatles instead of Grace Slick) can prevent that, even if our irrational love is not returned.  In other words, we do not need the gods and god stories; what we need is the Golden Rule (Jesus – A Keeper [Sept., 2015]).  This is my non-veridical “take” on the incredible non-veridical capabilities encapsulated in GS1 and GS2.

Western culture has historically entangled theology and ethics (no better case in point than the Ten Commandments, about half of which have to do with God and the other half with our relationships to each other).  This entanglement makes the condemnation of theology suggested by this perception model of human ontology an uncomfortable consideration for many.  Disentanglement would relieve this mental discomfort.  Christianity is a good example of entangled theology and ethics, and I have suggested in Jesus – A Keeper [Sept., 2015] how to disentangle the two and avoid the “dark side” of Christian theology and of theology in general.

Ethics, centered around the Golden Rule, or the Principle of Reciprocity, is clearly a product of non-veridical activity, but ethics, unlike theology and fantasy, is balanced with the veridical, in that our ethical behavior is measured through veridical feedback from others like us “out there.”  We became ethical beings similarly to how we became religious beings – by responding to human needs.  Coyne’s book Faith vs. Fact: Why Science and Religion Are Incompatible points out that in addition to our genetic tendency (our “nature”) to behave altruistically, recognize taboos, favor our kin, condemn forms of violence like murder and rape, favor the Golden Rule, and develop the idea of fairness, we have culturally developed (our “nurture”) moral values such as group loyalty, bravery, respect, recognition of property rights, and other moral sentiments we define as “recognizing right from wrong.”  Other values culturally developed and often not considered “moral” but considered at least “good” are friendship and senses of humor, both of which also seem present in other mammalian species, suggesting they are more genetic (nature) than cultural (nurture).  Other cultural values (mentioned, in fact, in the letters of the “Apostle” Paul) are faith, hope, and charity, but none of these three need have anything to do with the gods and god stories, as Paul would have us believe.  Still others are love of learning, generosity (individual charity), philanthropy (social charity), artistic expression in an ever-increasing number of forms, long childhoods filled with play, volunteerism, respect for others, loyalty, trust, research, individual work ethic, individual responsibility, and courtesy.  The reader can doubtless add to this list.  Behaving as suggested by these ideas and values (non-veridical products) produces veridical feedback from those around us that renders these ideas accountable and measurable (it is good to do X, or it is bad to do X).  What is good and what is bad is veridically verified, so that moral consensus in most of the groups of our species evolves into rules, laws, and sophisticated jurisprudence (e.g. the Code of Hammurabi and the latter half of the Ten Commandments).  The group becomes a society that is stable, self-protecting, self-propagating, and a responsible steward of the environment upon which the existence of the group depends; the group has used its nature to nurture a human ethical set of rules that answers the call of our genes and grows beyond this call through cultural evolution.  The irony of this scenario of the origin of ethics is that humans non-veridically mixed in gods and god stories (perhaps necessarily, to get people to respond through fear and respect for authority for survival’s sake), and thereby risked infection of human ethics by toxic theology.  Today there is no need of such mixing; in fact, the future of human culture may well hinge upon our ability to separate, once and for all, ethics from theology.

A final example, for this writing, of applying the perception model illustrated by Figures 1 and 2 is the definition of mathematics. Mathematics is clearly a non-veridical, imaginative product of the human brain/mind; this is why all the equations in Figure 2 need a "dashed" version in addition to the "solid," as I was able to do for single numbers like "8." But why is math the language of science? Why is something so imaginative so empirically veridical? In other words, why does math describe how the world works, or, why does the world behave mathematically?

Math is the quintessential example of non-veridical ideas rigidly fixed by logic and consistent patterns; math cannot deviate from its own set of rules. What "fixes" those rules is their applicability to the veridical data bombarding the world display screen from the "real" world "out there." If math did not have its utility in the real world (from counting livestock at the end of the day to predicting how the next generation of computers can be designed), it would be a silly game lodged within the memory loops of the brain only. But the brain is part of the star-stuff contemplating all the other star-stuff, including itself; it makes cosmological "sense" that star-stuff can communicate with itself, and the language of that communication is math. Mathematics is an evolutionary product of the complexity of the human brain; it is the ultimate non-veridical focus upon the veridical. Mathematics is the "poster child" of the balance of the two fluxes upon the world display screen of every human brain/mind. No wonder the philosopher Spinoza is said to have had a "religious, emotional" experience gazing at a mathematical equation on paper! No wonder we should teach little children numbers at least as early as (or earlier than) we teach them the alphabet of their native culture!
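This "non-veridical focus upon the veridical" can itself be shown in miniature. The sketch below is my own illustration, not anything from the post: a rule derived purely in the head (Galileo's free-fall relation, t = sqrt(2h/g)) predicts a stopwatch reading "out there." The measured value in the code is hypothetical, made up only to show the comparison.

```python
# A minimal sketch (my illustration, not the post's): a non-veridical
# rule, pure symbol manipulation in the head, predicting a veridical,
# measurable event "out there."
import math

def predicted_fall_time(height_m: float, g: float = 9.81) -> float:
    """Free-fall time from rest: t = sqrt(2h/g), derived entirely on paper."""
    return math.sqrt(2.0 * height_m / g)

# Veridical check: drop a rock from 4.9 m and time it with a stopwatch.
# The "measured" value here is hypothetical, for illustration only.
measured_s = 1.02
predicted_s = predicted_fall_time(4.9)
print(f"predicted: {predicted_s:.2f} s, measured: {measured_s:.2f} s")
# The closeness of the two numbers is what keeps math from being a
# "silly game lodged within the memory loops of the brain only."
```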

Further applications of the perception model suggest themselves. Understanding politics, economics, education, and early individual human development are but four.

I understand the philosophical problem that a theory which explains everything might very well explain nothing. But this perception model is an ontological theory, which necessarily must explain some form of existence, which, in turn, entails "everything." I think the problem is avoided by imagining some aspect of human nature and culture the model cannot explain. For instance, my simplistic explanation of insanity as a flux imbalance may be woefully inadequate for those who study extreme forms of human psychosis. Artists who see their imaginations as more veridically driven than I have suggested might find the model in need of much "tuning," if not abandonment. I have found the model personally useful in piecing together basic, separate parts of human experience into a much more coherent and logically unified puzzle. To find a harmony between the objective and the subjective of human existence is to me very objective (intellectually satisfying) and subjective (simultaneously comforting and exciting). The problem of explaining nothing is non-existent if other such harmonies can be conjured by others. Part of my mental comfort comes from developing an appreciation for, rather than a frustration with, the "subjective trap," the idea introduced at the beginning.

RJH

Sticks and Stones May Break Our Bones, But Words We Don’t Know Can Also Hurt Us, or, Jesus Was a Liberalist

The Long List of names I have been called, and of labels others have tried to attach to me, keeps growing.

Beginning as far back as high school, I have been called or labeled a progressive, a liberal, a pinko, a communist, a socialist, a fascist, a Nazi, a Democrat, a secular humanist, a scientific revolution freak, a political revolution freak, an agnostic, an atheist, a Christian, a Texas-phile, a Texas Aggie, a Marxist, a liberation theologian, a Southern Baptist, an anti-cleric, a nuclear physicist, an arrogant high school teacher, a great teacher of math and physics, an unqualified math teacher, a painter of Texas flags on barns and sheds, a history freak, an American Civil War buff, an unintentional expert on Cretaceous fossil fish teeth, a barbed wire artist, a country redneck, a designer and builder of porches and decks out of composite materials, a male chauvinist pig, a land owner, a student of comparative religion, a gadfly, a Teutonic freak, a Napoleonic freak, a lover of '66 red Mustangs, a coon hunter, a rock mason using only unaltered, natural-shaped rocks, an optimist with rose-colored glasses, a member of a sneaky group of pranksters, an amateur dinosaur track hunter, a militaristic war-hawk, an Obama-phile, a dinosaur freak, a rock-and-roll freak, a painter of the Lake Cisco dam, a heavy metal music freak, a cancer survivor, an anti-creationist, an evolutionist, an anti-intelligent designer, a hippie, a PhD, an absent-minded professor, an empiricist, a philosophy-phile, an epistemology freak, an incurable screamer of rock songs in karaoke bars, a beer connoisseur, a protester of stupid rules, a feminist, an insatiable reader of non-fiction books, a war gamer, a lover of all things Cisco, Waxahachie, or College Station, an astronomy teacher, a fanatical football and baseball fan, a driver of tractors and trucks, and a writer of "improbable histories."

To this list, since my latest Facebook postings and the formation of my website, have been added 1) an intellectual, and 2) an idiot. (This last one brings me full circle, so to speak; it is exactly what I was called as a freshman in high school!) I must be doing something right!

Let's see, today is Wednesday, so if I were to call myself something for the day (for it would change each day, you see), I would say I am a dealer of ideas. (Some of you are old enough to remember the old black-and-white movie and TV series "Dr. Fu Manchu": "They say the Devil deals in men's souls; so does Dr. Fu Manchu!") They say the Devil deals in ideas; so does Dr. Ronnie J. Hastings!

Let me take one of the ideas suggested by the list above, say, "liberal." Problems occur right off the bat, because what Americans mean by liberal and what Europeans mean by liberal are slightly different things, and the difference, I think, is crucial. The word "liberal" was first used in reference to the Whig political agenda in Britain in the early 1800's. It was incorporated into American politics not so much through the American Whig party as through the American suffrage, grassroots, and populist movements of the 19th century.

The original political definition of "liberal" grew, in my opinion, out of the successes of the American Revolution and the French Revolution, both in the 18th century. There was nothing conservative about these two revolutions! What I would suggest as "liberalism" was actually born out of these two pivotal events, embodied by the words "life, liberty, and the pursuit of happiness" in the case of America, and "liberte, egalite, and fraternite" (liberty, equality, and brotherhood) in the case of France. The Reformation, the Renaissance, and the Enlightenment had combined to spark the minds of America's founding fathers (Franklin, Jefferson, Adams, and Paine) and to set up the political landscape of revolutionary France just prior to 1789, defining the terms "liberal" (those who sat on the "left" side of the chambers in France) and "conservative," or aristocratic (those who sat on the "right" side of the French chambers). Liberalism, as I will call it, is the equal balance of all three (liberty, equality, and brotherhood [humanity-oriented]) and is the political ideal to which I think history shows we should aspire. Liberalism has existed in this ideal form in America only in the short interval from Washington's first term to Jefferson's first; it existed in France only from the moment the Revolutionary government was formed to the institution of the Terror.

I am not sure we've witnessed any equal balance since, at least not in the USA; we have not truly reaped the benefits of liberalism. All systems of government seem to have the three words out of balance in some way. Some easy-to-see examples will suffice: the French Terror exalted equality at the expense of freedom and brotherhood; Marxist-Leninist communism exalts an inequality at the expense of freedom and brotherhood, ironically the same as monarchies, fascist regimes, and "Christian" regimes such as the Papal States and Cromwellian England. Modern-day socialism makes a mistake similar to the Terror's, pushing equality at the expense of individual freedom and of genuine brotherhood, only without the beheadings; unfortunately, in my opinion, that is what most Americans today call "liberal." It is essentially a misnomer. So, to be clear, I am pushing "liberalism," not whatever is labeled "liberal," like socialism. Perhaps, to avoid being mired in the prevailing view of "liberal" today, those who are of the persuasion of "liberalism" should be called "liberalists" instead of "liberals."

The original definition of conservatism was to work for no change, to keep and defend the status quo. Those already with power and wealth, the aristocrats and, later, the capitalist rich, had no need for change, for they deemphasized equality and brotherhood; they paid attention only to the "liberty" part. Today American conservatives interpret "life, liberty, and the pursuit of happiness" as "my freedom, my entitlement, and who-gives-a-shit about my neighbors." American conservatives whitewash over this "official" OK for selfishness, greed, and inhumane treatment by appealing to the myth that we are a Christian nation, which, in their myopic minds, means the poor, needy, and working have-nots will be taken care of by Christian charity (remember the solicitors of Scrooge in A Christmas Carol, and his response to them?). (Incidentally, Christian charity through the organized churches cannot begin to meet the growing need of social services in our country.) Conservatives, as a result, are champions of some form of elitism: the smarter, the richer, the powerful, etc., are deemed better than the others. I know the book was about communism, but the conservatives of today remind me of the pigs in Orwell's Animal Farm: "All animals are equal, but some are more equal than others." Conservatives, in my opinion, give only lip service to liberte, egalite, and fraternite, covering up their treason to the liberalist ideals that our forefathers ingeniously envisioned with feigned Christian piety, which is another treason: the betrayal of the separation of church and state, of the freedom to worship, and of the freedom from worship.

The progressive march of history is clear: conservative political philosophy cannot be sustained. At the price of the blood of millions since the 18th century, the imbalance of monarchies has failed and been dismantled, the imbalance of fascism has failed and been dismantled, the imbalance of communism has failed and been (almost everywhere) dismantled, and the imbalance of Latin American regimes of tyranny against personal liberty has failed and been dismantled. Guess what is going to happen in the future to the imbalance of dictatorships, kingdoms, and sectarian states that still survive!

Look at this progressive march in the United States: universal suffrage finally became a reality, but it took into the 20th century to achieve it (now white males are joined by females and descendants of slaves at the voting polls). The privileges of US citizenship are given without the shackles of discriminatory qualifications. (It doesn't matter if you are blue, covered with green polka dots, and worship an anthill in your back yard; you have the same rights, privileges, and opportunities as the rich, powerful, and influential in this country.) For all this you must pay a price, but a price well worth it, I believe: US citizenship means you have to work, you have to pay taxes, and you have to be a patriot in your new country. And (conservatives tend to overlook this) your freedom is qualified: you cannot climb the ladder of success at the expense of others! Your gain should not be someone else's loss.

The three-pronged revolution of the '60's (the anti-war movement, the Civil Rights movement, and the women's movement) was liberalist in spirit, undertaken to extend (instead of restrict, as the conservatives want to do) all of the following: 1) power over your personal affairs, 2) influence in the leadership of your country, 3) your rights as a working, tax-paying citizen, 4) your rights not to be victimized by any form of discrimination, 5) your rights to educate yourself as far as your mind will take you, and 6) your grasp upon the promise of the liberalist, revolutionary agenda of our Constitution and Declaration.

So, when I go to the polls to vote for President, I vote for the candidate closer to the ideals of a liberalist, closer to the ideas upon which our country was founded. To vote for a political conservative is to me tantamount to voting against the ideals of the American Revolution; it would be literally un-American!

And, incidentally, to me it would be anti-Christian. Note that all the unflattering references I made above to Christians and Christianity had to do with the church and those who attend church. They had nothing to do, in my opinion, with the teachings of Jesus. All those years I sat in Sunday School and in the church pews revealed to me how little emphasis, in the long scheme of things, was placed upon the teachings of the one supposed to have founded the church in the first place! It turns out, when you read the "red letters" of the four Gospels, or, better, the Jefferson Bible, that what Jesus is supposed to have said doesn't have much to do with the church, with organized religion. Jesus spoke in liberalist terms. The Sermon on the Mount translates almost verbatim into liberalist philosophy. Laws were made for people, not people for the laws. What is best for your fellow man trumps all other needs. The Golden Rule: so universal! Principles that can only be called humanistic are our guides, not some theology propping up some social class of clergy and a string of fancy buildings. He was a revolutionary in the truest sense of the word. Jesus' adversaries were the representatives of the established religion of his day. Any Son of Man can become a Son of God. I have discussed all this with minister friends of mine (names withheld here for obvious reasons), and in private they cannot disagree with me on most of these points.

Jesus was a forerunner of the liberalist principles of our founding fathers; he was a liberalist well before the liberalist "time" of the 18th century. The American Revolution was fought for purely secular, not sectarian, reasons; when the French aristocracy fell under the blade of the guillotine, so did the Church and its clergy. One of my favorite quotes from a French film is, "There can be no church in a true republic." I don't think we should burn down all the churches; I think we should stop giving Jesus credit for them, for such credit insults Him.

If all or part of this moves you to do so, get back with me. All I ask is that you try to do a little more than just add to the Long List of names and labels.

RJH
