Beyond Good and Evil

Dr. Ronnie J. Hastings


Why Some White Evangelical Christians Voted for and/or Still Support Donald Trump

White evangelical Christians who apparently were “one issue” voters, willing to sell their morality and soul by supporting Trump over a single issue like abortion, prayer in schools, the secularization of society, a too-liberal SCOTUS, or the demonization of liberals like the Clintons and Obama, are in my experience not as dense as their stance might suggest; there had to be some “sacred” reason(s) they would knowingly support, and be culpable for, the bigotry, immorality, and intellectual bankruptcy of the Don of the present White House. Finally, I have discovered at least one such reason.

 
Up until recently, all the clues I had from evangelical Christian friends and family, always reluctant to talk politics and/or religion with me, were comments like “God moves in mysterious ways!” (from the hymn “God Moves in a Mysterious Way” by William Cowper (1774), based upon Romans 11:33) or “Hillary is evil!” Then my friend and former student Dr. John Andrews sent me a link entitled “The Political Theology of Trump” by Adam Kotsko, which begins with the question “Why do evangelical Christians support Trump?” Kotsko, who is apparently white and an evangelical Christian, pointed out something concerning the Old Testament that “clicked” with my lifelong experience with white evangelical Christians. It turns out that, for some white evangelicals, to support Trump is to support God’s will; to not support Trump is to work against God’s plan!

 
First, let’s be clear about whom I’m writing. I am not talking about all Christians; I am not talking about all evangelicals; I am not talking about all white Christians. I am talking about a minority within a minority within a minority…, like the innermost figure in a Russian matryoshka doll, or nesting doll, or stacking doll. This minority group comes with many qualifications and nuances. White, Protestant, evangelical, biblical literalist, apocalyptic, and often holier-than-anyone-else describe this group well. I need an acronym to cover all these qualifications efficiently — White, Evangelical, Protestant, Christian, biblical LiteralistS, or WEPCLS, pronounced “wep-cils.” (I’ve not included the nuance of politically conservative, which I assume is obvious.) WEPCLS vote for and support Trump with hypocrisy so “huge” and blatant they seem unaware of it, like not seeing the forest for the trees.

 
Here in the “Bible belt” part of Texas, it may not be apparent that the WEPCLS constitute a minority. After all, the large First Baptist Church of Dallas, with well-known Trump supporter Dr. Robert Jeffress as pastor, is seen as a beacon of WEPCLS values. But even this congregation is not 100% WEPCLS. When all Christians nationwide and worldwide are taken into consideration, even we Protestant Texans can see the WEPCLS as a minority.

 
Second, the reason something “clicked” about the Old Testament with me is that, for those of you who don’t already know, I’ve lived my whole life among WEPCLS; many of my friends and family are WEPCLS and, therefore, voted for Trump. (Personally, I “got” the “W” in the acronym down pat! 23andMe showed me to be Scots-Irish, English, French, German, and Scandinavian; I’m so white I squeak!) The denomination in which I grew up, Southern Baptist, was and is replete with WEPCLS; not all Southern Baptists are WEPCLS, but every congregation in which I have been a member contained and contains not a few WEPCLS. Why did I not over the years join the WEPCLS? Because, briefly, I early on asked questions whose answers were NOT “Because the Bible said so,” “Because the Church, Sunday School teacher, pastor, your parents, etc. say so,” “Just because,” “Because God made it that way,” “You shouldn’t ask such things,” etc. These woefully inadequate and empty answers made me take a closer look at the Bible, and by the time I went to college I had read both testaments and had begun to see why so much of Scripture was not the subject of sermons or Sunday School lessons. (See Sorting Out the Apostle Paul [April, 2012] on my website www.ronniejhastings.com.) In short, I did not become a member of the WEPCLS in large part because I did not become a Biblical literalist, and over time the idea of evangelizing others based upon a faith that had few if any answers added to the social divisiveness around me — added to the “us vs. them” syndrome, the bane of all religions.

 
In addition to WEPCLS’s Biblical literalism, which is the clue to their support of Trump, it is my opinion the WEPCLS have sold their Reformation birthright with their emphasis on conversion and conformity. The Reformation gave birth, it seems to me, to a Protestantism wherein congregations are not groups of sheep (pew warmers) led by shepherds (the clergy) but, rather, groups of meritocratic believers, each of whom has his/her own pathway and relationship to God. Moreover, the WEPCLS have turned their backs on the great gift of the Enlightenment to everyone, including all believers — that everything is open to question, including this statement; there are no intellectual taboos. The human mind is free to question any- and everything, in the fine traditions of Job and doubting Thomas. Not long ago a WEPCLS friend of mine spoke negatively of Martin Luther because the Reformer was not godly enough, and blamed the Enlightenment for the blatant secularism of today. To ignore both the Reformation and the Enlightenment marks the WEPCLS as woefully anachronistic — downright medieval, even.

 
Incidentally, the mixing of politics and religion by so many WEPCLS (an attack on the separation of church and state) is very unsettling because it is so un-American. As Jon Meacham, renowned American historian, said in his book American Gospel (2006, Random House pbk., ISBN 978-0-8129-7666-3) regarding the Founders’ view of the relationship between the new nation and Christianity, “The preponderance of historical evidence … suggests that the nation was not ‘Christian’ but rather a place of people whose experience with religious violence and the burdens of established churches led them to view religious liberty as one of humankind’s natural rights — a right as natural and as significant as those of thought and expression.” (p. 84) (See also my The United States of America — A Christian Nation? [June 2012] at www.ronniejhastings.com.)

 
Back to the clue of why the WEPCLS support Trump. If you are a Biblical literalist, chances are you have to hold the Bible as your sole source of truth — the source of true science (creationism and intelligent design) and of true history (Moses wrote the Pentateuch, Adam and Eve were actual historical beings, Joshua actually commanded the sun to stop in the sky, Mary of Nazareth was impregnated through some form of parthenogenesis, Jesus was resurrected back to life after crucifixion, etc., etc.). As time went on, it seemed to me as if adult Biblical literalists actually believed Santa Claus, the tooth fairy, Satan, the Easter bunny, ghosts, Paul Bunyan, Pecos Bill, and Uncle Sam all exist just like the live friends and family who surround them, rather than as concepts in their heads. As I studied epistemology in college, it became obvious one could justify and believe in literally anything through faith. Evidence-based truth does not apply to a Biblical literalist, and therefore does not apply to the WEPCLS.
Eventually, I became a physicist who likes to teach instead of a WEPCLS. This post represents how the teacher in me compels me to pass on knowledge as best we know it at present; to not be skeptical, as all good scientists should be, and to not pass on what evidence-based skepticism cannot “shoot down,” as all good teachers should do, is for me to fail my family, my friends, and all my fellow Homo sapiens.

 
Recalling my days as a Sunday School teacher who relished the rare lessons from the “histories” of the Old Testament (like I & II Kings and I & II Chronicles), let me give you in brief outline the Biblical history that animates the WEPCLS (especially if Old Testament history is not your cup of tea):

 
1.) After the reigns of kings David and Solomon, the Israelite kingdom (consisting of the 12 tribes associated with the 12 sons of Jacob) split in twain, 10 tribes in the north known as Israel and 2 tribes in the south (close to Jerusalem) known as Judah. Each new kingdom had its own line of kings. The split occurred around 930 BCE (Before Common Era) or B.C. (Before Christ).

 
2.) Beginning about 740 BCE, the Assyrian Empire, which had supplanted the “Old” Babylonian Empire, invaded and overran the northern kingdom of 10-tribe Israel over some 20 years under the Assyrian kings Tiglath-Pileser III (Pul), Shalmaneser V, Sargon II, and Sennacherib. The 10 tribes were scattered in an Israelite diaspora and became known as the “lost tribes” of Israel. Assyria replaced the displaced Israelites with other peoples from the wider Mesopotamian region, who became known by New Testament times as Samaritans. Sennacherib tried unsuccessfully to conquer 2-tribe Judah in the south and was later killed by his sons. These events are covered in II Kings, Chaps. 15, 17, & 18, in I Chronicles, Chap. 5, and in II Chronicles, Chaps. 15, 30, & 31. The prophet known as “early Isaiah,” from the 1st of the three sections of the book of Isaiah, is the major “prophet of record.”

 
3.) The Assyrian Empire was replaced by the “New” Babylonian Empire under King Nebuchadnezzar II, and by 605 BCE the kingdom of Judah was succumbing to Babylon in the form of three deportations of Jews to Babylon, in the years 605-598 BCE, 598-597 BCE, and 588-587 BCE, the third resulting in the Babylonian Captivity of 586-538 BCE following the siege and fall of Jerusalem in July and August of 587 BCE, during which Solomon’s Temple was destroyed. The ends of II Kings and II Chronicles record the fall of Judah, and the book of Jeremiah, Chaps. 39-43, offers the prophetic perspective (along with the book of Ezekiel), with the addition of the book of Ezra and the first six chapters of the book of Daniel.

 
4.) After Cyrus the Great of Persia captured Babylon, ending the Babylonian Empire and beginning the Persian Empire in 539 BCE, the Jews in exile in Babylon were allowed by Cyrus to return to Jerusalem in 538 BCE and eventually rebuild the Temple (II Chronicles 36:22-23 and “later” Isaiah). The book of Daniel records Cyrus’ (and, later, Darius I’s) role in the return, and the book of Ezra reports the construction of the second Temple in Jerusalem, begun around 537 BCE. Construction, with which the contributions of Ezra and Nehemiah are associated, lasted at least until 516 BCE.

 
The Biblical histories and books of the prophets concerning the historical events described in 2.) through 4.) above show a “divine pattern” which the WEPCLS have seized upon. The great cataclysms brought upon the ancient Hebrews after Solomon were orchestrated by God as punishment for the sins (turning from God) of His Chosen People, and, moreover, God used pagan, heathen kings like Sennacherib and Nebuchadnezzar to punish His people and a pagan, heathen king like Cyrus for the restoration of His people. For instance, Nebuchadnezzar is called God’s servant in Jeremiah 25:9, yet only two verses later (Jeremiah 25:11) it is promised that the Babylonians’ land will itself be laid waste. Later Isaiah calls Cyrus God’s “anointed” (Isaiah 45:1) and promises Cyrus God’s divine favor (Isaiah 44:28 & 45:13), while nonetheless declaring that Cyrus “does not know” God (Isaiah 45:4).
In other words, the WEPCLS have been swept up in the “divine revelation” or “special knowledge” that, in whatever happened to the ancient Hebrews (all the death, destruction, and utter humiliation), God was always in control of both punishment and reward, using unGodly evil empires as His tools to chastise His wayward “children.” Being Biblical literalists, the WEPCLS “naturally” transfer these Old Testament revelations to the present day, seeing “evil” Trump as God’s tool to punish the secular world for resisting God’s plan according to the interpretations of the WEPCLS. Trump as God’s tool is the WEPCLS’s “special knowledge” through which all their issues, like abortion, will be “taken care of” without regard to the pagan, heathen, and evil attributes of that tool — just as the pagan, heathen, and evil actions of the Assyrian, Babylonian, and Persian rulers were disregarded by the prophets.

 
Trump is a tool all right, but not God’s tool.

 
Before applying “higher” Biblical criticism (or just biblical criticism) to the WEPCLS’s interpretation of scripture, look at the conundrum the WEPCLS have created for themselves. Trump is so unGodly that the absurdity of evil being a tool of good somehow becomes proof that this must be, in the end, of God; Trump must be God’s President. And the more unGodly the tool, the greater the proof that the tool must be of God! It reminds me of the Christian existentialist Søren Kierkegaard’s assertion that the absurdity of accepting Jesus as God on nothing but pure, blind faith is all the more reason for taking the leap of faith and accepting Jesus Christ as your Lord and personal Savior. Or, on a more mundane level, it reminds me of the creationist scientist on the banks of the Paluxy River announcing that the absence of human prints in the Cretaceous limestone alongside those of dinosaurs must INCREASE the probability that human prints ARE to be found; in other words, absence of evidence means presence of evidence! One can’t help but think of the Orwellian “double-speak” mantras “Bad is good!” and “Good is bad!”

 
Faith, like falling in love, is irrational, but falling in love is not bat-shit crazy!

 
The epistemological problem with faith-based religion is that any one religious belief cannot be shown to be better or worse than any other. By faith the WEPCLS believe the Bible is the Word of God, established as everlasting truth about 1,600 years ago (when the biblical canon was finally hammered out by the acceptance of some books and the rejection of others). For them truth is “set in concrete,” never to be altered by facts thereafter, despite the uncomfortable truth that God’s “concrete” of Jesus being God in the Trinity was not established as truth until about 400 years after Jesus’ crucifixion. What became amazing to me is that such canonization into unmoving, unchanging truth can only be defended by ignoring hundreds of years of new facts. If I were living in Europe around 1500, the fact that the Bible does not record the existence of a whole New World of two huge continents would make me revisit the rigidity of my faith and my beliefs. Nor does scripture mention all the scientific facts that evolve with ever-increasing evidence year after year, because the Bible is pre-scientific, written long before widespread literacy.

 
Because Christianity is “set” in history for biblical literalists, and because history has become a forensic science, Christians such as the WEPCLS do not have history on their side, any more than do all other believers who believe solely on faith. The forensic science of biblical criticism shows that literalists such as the WEPCLS do not have to become atheists or agnostics if they seek the most reasonable and probable view of what must have happened in the past for the Bible as we know it today to be in our hands. They must simply accept more historical facts than they presently do — facts that are compatible with as objective a view of the past as possible, facts that command the broadest agreement across Christendom, facts that place Christians in a majority armed with modern techniques of forensic history and forensic science, like archaeology and the history of Judaeo-Christian scripture (see the Dec. 2018 issue of National Geographic).

 
What, then, does biblical criticism have to say about the WEPCLS’s interpretation of the Old Testament stories involving Assyria, Babylon, and Persia? Note the span of years covered by the events 1.) through 4.) above — essentially 930 BCE to 516 BCE. If you look at faith-based, conservative listings of the books of the Bible covering this span (I & II Kings, I & II Chronicles, Isaiah, Jeremiah, Ezekiel, Daniel, Ezra, Nehemiah) and when they were written, you would be told the books were written contemporaneously with or soon after the events with which they deal. But biblical criticism, which we have had since the 19th century or earlier, is, through archaeology and the study of the origin of scripture (Dec. 2018 National Geographic), finding that they were all written well after the events, as rationalizations or apologetics for the tribulations of what are supposed to be God’s Chosen People whom He loves. (To say God employed “tough love” in dealing with the ancient Israelites is a gross understatement indeed!) For a fairly well-established example, the book of Daniel was not written during or soon after the Babylonian Captivity or exile (586-538 BCE), but rather in the 2nd century BCE, circa 165 BCE. Further, it appears the author of the book of Daniel was writing about the 2nd-century persecution of the Jews under the Seleucid king Antiochus IV Epiphanes, using the prior persecution of the exile as a cover. The same dating fraud is committed concerning the books of the New Testament, especially the Gospels. Faith-based conservatives such as the WEPCLS want the Gospels written well before the Jewish Revolt against the Romans in 66-70 CE (Common Era, or A.D., anno Domini), as close to the life of Jesus as, say, Paul’s letters. But biblical criticism based upon historical research shows the Gospels to have been written during or after the Revolt (See Sorting Out the Apostle Paul [April, 2012]).

 
As we enter the 21st century, we know much, much more about the origins of the Bible than ever. What is needed in Christian scholarship of the scriptures is more polemics, not more apologetics. For the WEPCLS to ignore this new wealth of historical findings for the sake of their medieval-like literalism is intellectually anachronistic and irresponsible. Consequently, the WEPCLS give Christians a bad name, as many non-Christians erroneously think the WEPCLS represent all Christians.

 
Epistemologically, the WEPCLS commit the intellectual fraud of decontextualization, the practice of plucking a source out of its context, ripping it from its historical references so that it seems applicable to any time whatsoever, even a time bearing no relationship to its originally intended applicability. The WEPCLS have decontextualized much of the histories and major prophets of the Old Testament so that they can be used for their conservative, Trinitarian, evangelistic purposes. Higher Biblical criticism has exposed their attempts to read Old Testament references to Old Testament historical individuals as references to the coming of Jesus Christ as the Son of God. To relate God’s use of Godless leaders in the Old Testament to today’s situation is not the WEPCLS’s first “fraudulent rodeo.”

 
I urge everyone in Christendom to apply biblical criticism to expose the WEPCLS as a corrosive influence on Christian evangelism. I urge believers of all religions to apply the same techniques of biblical criticism to their own faith-based creeds and/or practices. I urge non-believers to apply these same techniques to combat the politicization of the theologies of organized religions.

 
My own experience with biblical criticism suggests it need not drive the WEPCLS further from intellectual inquiry, nor drive one away from Biblical consideration forever. The Bible itself often is all that is needed for its foibles to be exposed; often the Bible is its own best critic. For instance, I found that by comparing pre-exile-written II Samuel 24:1 with post-exile-written I Chronicles 21:1, one discovers how the concept of Satan, a parallel to the Zoroastrian (Persian) evil co-god Ahriman (counterpart to the good god Ahura Mazda), was introduced into Judaism by the exile (and later into Christianity). Calling upon other sources from archaeology, the Christian scrolls found at Nag Hammadi in Egypt show that there were at least 21 possible Gospels, not 4. These scrolls also show how the early Church bishops strove mightily to suppress and destroy these “lost” Gospels and also perpetuated the besmirching of Mary Magdalene’s character. To my surprise, when I placed Genesis 1 in its literary context, I saw it was not a history of the beginning of the world at all but, rather, a comparison of the “superior” Hebrew Creator god with the “inferior” gods of neighboring peoples; my respect for Genesis 1 has risen considerably. Biblical criticism opens your mind to broader horizons not suggested by the Church and helps one understand the archaeological findings relating to ancient religions.

 
Biblical criticism and its related readings, applied to consensus world history, have led me to work through a “most probable” scenario of how, to me, Christianity came into human history (read in order on my website www.ronniejhastings.com: Sorting Out the Apostle Paul [April, 2012], Sorting Out Constantine I The Great and His Momma [Feb., 2015], Sorting Out Jesus [July, 2015], At Last, A Probable Jesus [August, 2015], and Jesus — A Keeper [Sept., 2015]). Any person so “armed” and inclined can come up with their own scenario as well as or better than I.

 

 

Regarding this matter of Biblical or biblical proportions and votes for Trump, I hope I have not failed my family, my friends, or my entire species in passing on what I see as the best of a growing majority consensus.

 

RJH

 

The “Problem” of Free Will

Perception Theory (Perception is Everything, [Jan., 2016]) describes human existence as a perpetual juxtaposition of empirical sense data from the veridical, “real,” objective world outside our brains with imagined data of concepts, ideas, and orders from the non-veridical, epiphenomenal subjectivity inside our brains — all projected upon our world-view “screen” (perceived by the mind’s “eye”), upon which we simultaneously perceive what we “see” from the real world and what we “see” with our imagination. (Again, see Perception is Everything, [Jan., 2016].) Clearly, the areas of philosophy emphasized by Perception Theory are ontology and epistemology.

Almost any extended discussion of human ontology and epistemology sooner or later gets around to the topic of “free will,” the problem of whether we have discretionary powers over what we think and do, or are slaves to the laws of physics, chemistry, and biochemistry, such that any such discretionary powers are delusional. Do we have free will or not?

It seems reasonable that Perception Theory has the ability to answer the question of free will and “solve” the problem of free will.

In Perception is Everything, [Jan., 2016], the “subjective trap” is defined as the impossibility of an individual’s seeing both the perception of something like “red” on the world screen inside our heads and the biochemistry within the neurons of our brain we know to be responsible for causing the perception of “red” on that screen. This impossibility leads to our assuming, without proof, that our perception of anything is just like someone else’s perception of the same thing. Were we to look inside the head of someone else perceiving red, we would see only his/her biochemistry of red, not his/her perception of red. Hence, because of the subjective trap, we ASSUME others’ perceptions are like our perceptions, but there is no way of justifying that assumption in a scientific, objective way; we justify the assumption only in a practical, utilitarian way — communication among all of us seems to be compatibly possible making this assumption.

Is free will assumed similarly, as are the perceptions of others?  If so, the assumptions would have to be within and about the individual mind, not about the perceptions of others.  Let’s say I am on a pleasant walk among a park’s many walkways and I come to a two-pronged fork in the path, with equally appealing potential pathways, and, to all appearances, including my own, I CHOOSE one of the two paths and continue my walk.  Did I choose of my own free will?  A proponent of the objective, deterministic view of free will might argue that all my previous experience, if known, would predict with certainty which path I would choose, and that only because I cannot command from my memory ALL my experiences (if I could, my brain would be flooded to insanity with stored empirical data) do I delude myself into thinking I flippantly, “for-no-reason,” “just-because-I-feel-like-it,” or randomly chose which path to take; in other words, I do not have free will but lack the capacity to realize I do not; my choosing is illusory.  A proponent of subjective free will might just as well argue that I have complete discretion between the two possible states of walking one path or the other.  Even if my past experiences incline me toward my left or my right, with each new decision I am free to choose either way in disregard of my tendencies, without having to justify that decision to anyone, including myself.  “Choosing without thinking about it” is a hallmark of my exercising what everyone is assumed to have, a free will.  But, just as the objective argument admits the futility of knowing all the assumed factors that “determine” the illusion of free will, the subjective argument irresponsibly assumes a “freedom” of choice ignoring all the physical laws to which the complexity of the brain and its epiphenomenal mind are subject.  Note how both arguments employ non-demonstrable assumptions, implying free will is not demonstrable without such assumptions.

Perception Theory, an admitted blend of the objective and the subjective (Perception is Everything, [Jan., 2016]), suggests both arguments are useful in solving the problem of free will.  The patterns of empirical data that demand strong veridical resonance of the mind with the “outside” world compel science and medicine to conclude all causes and effects, including our apparent free will, to be understandable in terms of particles, fields, and energy.  Yet these particles, fields, and energy are creations, or concepts, or imagined orders of the subjective mind.  (The epistemological “bottom line” of particles, fields, and energy existing outside our brains (mind) is that when we observe external to ourselves as objectively as possible [scientifically], we have to say the universe outside us behaves AS IF it is all made of particles, fields, and energy.)  We know how these particles, fields, and energy can demonstrate and explain physical phenomena throughout the universe, but we do not know how they can be used (yet) to demonstrate how empirical data and previously stored ideas can produce veridical and non-veridical projections upon our world screen of perception in our heads.  Similarly, particles, fields, and energy cannot (yet) demonstrate the explanation of free will not being “free” at all.  On the other hand, the “freedom” of the subjective argument cannot be truly free, as our perceptions ultimately are products of “star-stuff” just as much as our brain and body are, and star-stuff is bound by the universe’s demonstrable laws of physical science and life science.

What is suggested by Perception Theory, then, is that, just as it is logically impossible for a person to simultaneously experience both her biochemical (objective) perception of red and her non-veridical (subjective) perception of red, it is logically impossible for free will to be both completely deterministic and completely without empirical cause.  In other words, when I appear to exercise free will at the fork of paths, I cannot assume my choice is determined, NOR can I assume I’ve exercised any kind of free will.

So what is free will, given the logical impossibilities and forced assumptions of both free will’s detractors and proponents?  What is suggested in my mind as a trained physicist is that free will is just like light.  When you ask a physicist whether the nature of light is waves or particles, the answer is “both; it depends upon how light is measured or observed.”  Similarly, free will is neither determined nor undetermined.  “Free will” has to be a non-veridical concept, but not a scientific one trying to explain the veridical world outside our brain.  Rather, free will is a concept trying to explain human choice or volition, a behavior of possibilities, just as human love is a behavior of possibilities.  Gravity is a concept that can take on objectivity; free will, like any other human psychological concept, cannot, as DEFINITIVE SELF-STUDY CANNOT BE AS OBJECTIVE AS DEFINITIVE STUDY OF OUTSIDE THE SELF.  When we study the star-stuff that is us, we cannot escape ourselves, so we cannot ever see ourselves as if we were outside ourselves; we cannot see ourselves objectively, like the subjects of physical science.  This is why physics is considered a “hard” science, while psychology is considered a “soft” science.  It is as if the study of our minds has a built-in, unavoidable uncertainty principle, like Heisenberg’s uncertainty principle of quantum mechanics.  Just as light can behave differently in different cases, the exercise of our free will can appear deterministic in some cases and wildly free in others.  Two different observers of my choice at the fork of paths could describe my exercise of “free will” differently.  One might say he predicted my choice, and the other might say my choice looked completely random to her.  Neither could measure the “amount” of free will I exercised, and neither could I.  I could recall my choice later as one of conscious or unconscious deliberation, or as one of complete obliviousness to either path, or as one somewhere in between.
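For readers who want the referenced physics made explicit, Heisenberg’s uncertainty principle (a standard textbook relation, added here by way of illustration; it is not quoted from the original post) bounds how precisely a particle’s position x and momentum p can be known simultaneously:

\[
\Delta x \, \Delta p \;\geq\; \frac{\hbar}{2}
\]

No refinement of measurement technique can beat this bound; the uncertainty is built into the physics itself, which is the sense in which the study of our minds is said above to carry its own built-in uncertainty principle.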

All this uncertainty and lack of objective definition suggests that free will is a rationalization of convenience, arrived at in the minds of humans over thousands of years, to obtain the mental comfort of an explanation of a particular human behavior, the act of choosing.  Free will is psychological balm soothing the discomfort in trying to answer “Why did I do that?”, or “Why did he do that?”, or “Why did she do that?”  The real answer, down to the neuron, is, like education, too complicated to understand entirely.  The non-veridical concept of “free will” or “lack of free will” is assumed as a practical vehicle toward understanding human behavior.  Free will, like concepts of gods or god stories, is a practical yet illogical explanation that conveniently and more easily explains behaviors without one’s having to take the trouble to study them objectively; free will makes dealing with human choices efficient.  Free will is an unconscious assumption of the human mind, passed on generation to generation directly or indirectly.

So, who is right when it comes to free will, the objective proponent or the subjective proponent?  Both.  Who is wrong when it comes to free will, the objective proponent or the subjective proponent?  Both.  The “problem” of free will is not a problem at all.

 

Yet any impasse about free will implied by the foregoing discussion is not a “hard” impasse like the subjective trap in Perception is Everything, [Jan., 2016].  Progress can be made toward understanding free will by, first, dropping the “free” part and just talking about “will,” or human volition.  So my choice of paths employed above would become a discussion of my choice being a product of my personal volition in that moment.  Next, one’s volition, or will, can be seen as a well-developed psycho-physiological behavior practiced inside the individual from the early days of infancy, if not before in the womb (See “I.  Development of Self-Consciousness in a Human Infant” in Perception Theory (Perception is Everything) — Three Applications, [Feb., 2016]).

Part of human self-consciousness is the awareness that we can willfully do or think things just by employing an “I want to…” in our mind.  In my opinion, the “feeling,” perception, genetic tendency, or epiphenomenal “extra” of self-consciousness that we can will any action or thought of our own free will is one of many important evolved results of the “Cognitive Revolution” that occurred in our species, according to Harari (Sapiens and Homo Deus), between 70,000 and 12,000 years ago, before the Agricultural Revolution.  Clearly, our conviction that we have a will we control had, and probably still has, survival value — a trait “favored” by our physical and cultural evolution.  Perception Theory emphasizes that, as our self-consciousness developed, probably around and within the Cognitive Revolution, our imaginations developed the ability to perceive ourselves independent of our present setting.  That is, we could imagine ourselves not only in the present, but also in the past or in the future.  Imagining ourselves in this way naturally includes imagining ourselves doing or thinking something in the present, past, or future.  The logical explanation of the cause of our doing or thinking something independent of setting is having the ability to command the thoughts and actions of our imagination; it is logical to us that we have a will “barking orders of our judgment or whimsy” within our imagination.  And it is logical to us because we’ve been exercising that will since we were infants, according to our imagination. (Perception Theory (Perception is Everything) — Three Applications, [Feb., 2016])  We can easily imagine all infants, including ourselves when we were one, for the first time reaching out with a hand to touch or grasp some object that is not part of their body; the baby “wanted to” or “willed” himself/herself to touch or grasp.

Not only can “will” be seen as a natural evolutionary development in our heads, it can be seen, thanks to modern science, as subject to the statistics and probabilities of the complicated.  In the wake of the revolutionary development of the Kinetic Theory of Matter, wherein all matter (including our bodies and our brains) is seen as composed of countless particles called atoms, or of clusters of atoms called molecules, statistical mechanics was developed in place of Newtonian mechanics, which had “no prayer” of describing countless masses moving and colliding with each other.  Statistical measurements, such as temperature, were defined to represent an average value of kinetic energy for all the masses, which tells you nothing about the value for a single particle.  Moreover, the scale of atoms and molecules is quantum mechanical, meaning the relevant mechanics is quantum, not Newtonian.  Hence, interactions on an atomic scale, such as the firing of a neuron (a brain cell), are statistical and quantum, not biological, in scale and behavior.  In other words, our brain-based non-veridical “mind” exists because of countless neurons quantum mechanically interacting in accordance with biochemistry; just as the “well-defined” big-scale images on our TV screens are produced by atomic-level, quantum solid-state circuitry understood in terms of electrons so tiny they can only be “seen” indirectly, our “well-defined” imagined images on our world perception screen in our heads are produced by atomic-level, quantum biochemistry within neurons understood in terms of the same electrons.  And all quantum phenomena are “fuzzy,” not fixed, subject to statistical fluctuations and unavoidably described in uncertain probabilities; the appearance of certainty on the scale of our bodies (big-scale) is the statistical mean of atomic “outputs” filtered by our averaging senses into a single result.  When we perceive “red,” the probability that we are perceiving data similar to previous perceptions of red is high, but, statistically, the perception can never, ever be exactly the same, because the same exact set of electrons, atoms, and molecules that produced the previous perception is not available to produce the next; our big-scale senses only deliver the average of countless atomic-level inputs from incoming light data and processed, averaged biochemical data from our retina cells and optic nerve cells.  Imagine how “averaged” must be the non-veridical images on our world screen!  Our “feelings,” perceptions, and convictions are our big-scale utilitarian “averaging” of unimaginably numerous and unfathomably complicated quantum behaviors of the atomic-level particles making up our brain.  And each “averaging,” it stands to reason, can never be repeated in detail.  Equally reasonable is the assumption that the averaging only has to be accurate enough to “get us by,” to assure that we survive as a species.
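To make concrete the claim that temperature is an average that says nothing about any single particle, here is the standard kinetic-theory result for a monatomic ideal gas (my illustrative addition; the original post states the idea only in words):

\[
\left\langle \tfrac{1}{2} m v^{2} \right\rangle \;=\; \tfrac{3}{2} \, k_{B} T
\]

where m is a molecule’s mass, v its speed, k_B is Boltzmann’s constant, and the angle brackets denote an average over the countless molecules of the gas. T characterizes the ensemble as a whole; any individual molecule may be moving far faster or slower than the average.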

Our “will” is a self-imposed, evolutionary, imagined property describing our subjective “self,” the epiphenomenal result of the long-ago origin of self-awareness and self-consciousness.  It is a psychological, positive, mental “crutch” to attribute to ourselves the ability to conjure actions and thoughts; it is basic to our self-confidence.  There is, however, as best we know, no reason to call it “free.”

Further ontological insight into “will” can only be possible through future understanding, via scientific research, of how the physical, veridical brain can produce epiphenomenal, non-veridical perceptions.  The same research will perhaps make progress toward understanding and, maybe, redefining (“overcoming”) the subjective trap.  Though obviously useful, Perception Theory can be improved with better models and metaphors than veridical, non-veridical, world-view screen, etc.  Building a better theory seems necessary toward better understanding “will” and the subjective trap.

 

RJH

 

21st Century Luddites?

After the 2016 Presidential election, participants in and supporters of the US coal mining industry were asked why they voted against the industry’s being phased out, despite the widespread agreement that it is a “dirty” source of energy contributing mightily to atmospheric pollution and climate change, and despite the promise that participants could easily be retrained for far healthier employment in the future.  One particular answer from a participant spoke volumes to me — something to the effect that not only had his family been coal miners for generations, but he categorically rejected the notion of being retrained in anything other than what he had been doing!  It was sort of a “you can’t teach an old dog new tricks” answer.

I thought of the Luddites.  The Luddites were primarily textile factory workers in England during the years of the Napoleonic Wars who created a movement of destruction and violence from 1811 to 1816, a movement crushed by heavy-handed government reprisal supported by the factory owners.  The Luddites were most famous for breaking into factories and destroying the new looms and other machines that were doing the Luddites’ labor at less cost, more efficiently, and more productively.  It probably is a misconception that they destroyed the machines out of fear of the machines themselves replacing them; some research suggests they actually feared that time spent learning new skills (retraining) germane to the new machines would be wasted.  This suggests that perhaps a lot of destruction, maiming, and death could have been avoided had the factory owners at the time offered to retrain the dissident workers at full pay.  Nonetheless, the term “Luddites” came to mean those in opposition to industrialization, automation, and, today, computerization.  What has not changed from the early 19th century to today is that factory mechanization clearly allows faster and cheaper production by fewer laborers, who can even be less skilled — meaning they work for lower wages than the workers-before-machines whom the machines replaced.  This is not to overlook the present-day need for highly skilled and high-wage workers to maintain and repair the machines; the point is that the number of skilled and well-paid workers needed today is smaller than in the days when far fewer products were manufactured by workers.

The Luddites seem to occupy a place in a historical spectrum of labor whose roots go back to the medieval guilds, which gave way in the emergence of modern Europe (16th and 17th centuries) to organizations such as village and town support groups for traveling journeymen, which in turn pointed toward the labor unions following the era of the Luddites.  As you watch at length programs such as How It’s Made on the Science Channel, fostering the notion that machines “make everything” nowadays, the social and political influence of modern labor unions seems less germane to industrial economies in the last two or three decades, simply because the unions did their job protecting workers so well in the past.  I suspect this spectrum is laced throughout with workers’ stubborn refusal to change with the times, as per the Luddites.

I have witnessed in the past 30 years or so a “change of economic times” affecting farms and farm workers in the agricultural region south of Cisco, Texas — the town in which I grew up.  So much of southern Eastland County used to be “peanut country.”  My paternal grandfather was a peanut farmer, and my father grew peanuts on the family farms near the end of his working years and during his retirement.  The paternal side of my family traditionally had two “cash crops”: peanuts, and beef cattle raised on pasture land not devoted to planting peanuts.  Before my father died, the peanut economy south of Cisco was irrevocably transformed to the point of today’s disappearance.  First came the mechanization of peanut farming and of cattle feed farming (hay), so rapid that with tractors and all the accompanying attachments and implements, my father could do more by himself than what 3 or 4 of us could do only 15 or so years before.  Then came the expansion of irrigated peanut farming elsewhere in Texas, leaving the small-acreage peanut farms of Eastland County hard-pressed to compete with the volume of production and the ability of larger farms to sell at lower prices; the small-scale peanut farmer of Texas was being phased out.  Despite attempts to irrigate peanuts in the county as well, the main peanut mill in Gorman, Texas, dwindled into non-existence; peanut farmers could not economically survive even one bad season.  Farms did survive, by turning the peanut fields into hay fields, mostly nowadays growing coastal bermuda grass; peanut-growing implements became scrap iron or decorative antiques.  Southern Eastland County today has a hay/pasture/cattle agricultural economy.

What if the peanut growers of Eastland County had taken the attitude of the Luddites, the attitude of modern coal miners, and refused to change, citing family traditions of peanut farming as I have just done?  They would have gone to their graves owning fallow, unused ground, assuming they had not been forced to sell in order to pay the land’s taxes.  They would have lost everything, for they were never unionized like the coal miners; they had no economic “safety net.”  Instead they changed (begrudgingly, I admit) by seeing their land as something different — producing hay underwriting the cattle industry pervasive all over the county, not just in the southern part.  They are still farming today, needing fewer workers than ever before, thanks to machines, and producing hay (some irrigating, some dry-land), pasture land, and cattle.  Their fathers and grandfathers would not recognize the family land today!

 I am not saying that modern US coal miners will turn violent if they are not allowed to continue coal mining in the tradition of their forefathers, but I am saying the peanut farmers of Eastland County, Texas, should give these miners and their supporters pause.  The miners run the risk of being 21st century Luddites (without the violence) and dooming their traditional economy to an ignoble end, causing further, unnecessary environmental pollution along the way.  Circumstances forced the peanut farmers to change, just like circumstances are forcing coal mining to change; I think that the miners, just like the farmers, have no choice but to change.  So focused are the miners and their supporters on tradition, nostalgia, and reverence for the values of their ancestors, they only look to the past, not to the future; they are, in a word, anachronistic.  They are so anachronistic, they even vote against their own best interests, and thereby vote against the best interests of their children and grandchildren!  They as a group remind one of the irrational, tradition-bound “secret societies” many medieval guilds became.  Using the peanut farmer analogy, it would be like the farmers giving their heirs no choice but to continue growing peanuts, despite the regional support structure for growing “goobers” having long since dwindled away!  “Good luck, son and daughter, because I know you are going to have a harder time than I had!”  Again, downright medieval, if you ask me.

Nor am I saying worker organizations like unions are a cause of the “insanity” of “Luddite-ism.”  If the coal miner unions get behind the backward-looking position of the miners-who-refuse-to-change, then the very concept of unions is being abused.  Protection of jobs does not entail battling progress; unions should always be in step with what is best for the future of workers, not with irrational loyalty to family tradition.  Unions are the reason for child labor laws, safe and humane working conditions, and the exercise of workers’ basic rights; they are not perpetrators of the ancient, archaic idea of guilds based upon family tradition.

Also, to not change with the changing economic times is myopic and selfish.  When farmers in Eastland County gave up raising peanuts, they did not see that as betraying their family traditions; they did not cease to revere, love, and take pride in their peanut-farming heritage!  The farmers knew their ancestors would have done the same thing in their place, given the same circumstances; the way one makes a living is not sacred — it is an individual choice.  Do the coal miners actually think their ancestors would be proud of their continuing to do the same unhealthy things their grandfathers did?  I have a hard time believing that.  Instead, I think it comes down to the fact that it is easier not to change one’s employment than to change it.  In a word, they are, ironically, lazy.  Those who do one of the most physical, dangerous jobs still around may well be too lazy to change to an easier, safer job.  It takes effort on the part of the worker to be retrained, an effort the Luddites were not willing to exert.  So it is with today’s coal miners.  They need to be reminded, as they comfortably and longingly gaze into their past, that this is the 21st century of accelerated change, and that coal mining does not “revolve” around them, just as peanut farming did not “revolve” around the denizens of southern Eastland County.  Coal mining must look to the future, and will evolve according to environmental circumstances and changing means of obtaining clean energy, not according to the traditions of coal miners.

RJH

 

We All Can Have PTSD

PTSD (the acronym for post-traumatic stress disorder) has started expanding its applicability way beyond its military context, it seems to me.  Historically, the concept of PTSD developed from the stress of combat and other horrors of war causing damage either to brain physiology or to the individual psychology of the mind, or to both.  Its symptoms, regardless of the particular causes in particular cases, are a myriad of brain disorders causing mild to chronic disruptions of normal brain function.  In World War I it was called “shell shock,” and from World War II on into Vietnam it was called “combat fatigue.”  I want to make the case that all of us can have shell shock and combat fatigue without experiencing a second of combat, without a speck of horror or brain damage.

My most vivid experience of PTSD in a Vietnam vet came when I was working with faculty members from Waxahachie High School in preparation for a faculty party to be held at the Waxahachie National Guard Armory several years ago.  Helping us build stage sets for party performances was David Simmons, building trades instructor at the high school and a Vietnam vet.  The Waxahachie Guard was moving the last cargo truck out of the building when David, upon hearing the truck’s engine, immediately had a flashback to Vietnam.  He dropped his hammer and had to be helped to sit down on the edge of the stage we were building.  For a few moments, he could not stop the imagery in his head; only when the truck had exited the building did he return to “normal.”  Clearly this was purely mental PTSD, as I am not aware of his having suffered a head injury during the war.

Equally clear are PTSD-like cases of closed head injuries, such as those resulting from motorcycle accidents.  I remember my friend Rick Qualls and me visiting a motorcycle accident victim who was seeing blood on the fossils he was collecting; we were “experts” invited by his mother to examine the fossils and help him be a little more critical in his hopefully therapeutic hobby.  We could not, try as we might, convince him that his iron-compound stains were not blood, or that blood does not normally leave trace fossils.  At least he was not a “vegetable,” but that was little consolation to a mother whose son’s closed head injury had interjected tragedy so cruelly into the family.  The son was experiencing something personally real in his head, just as David was in his head inside the armory, but that something was permanent, not temporary as in David’s case.

I have come to think similarly about my older son Dan, who suffered a closed head injury in 1986, as a freshman in high school, in a collision of his bicycle with a van.  He is Sylvia’s and my “miracle child,” as he clearly recovered completely from all his physical injuries and almost completely from his brain injuries.  Years after his accident, only the stress of traumatic events like divorce revealed his inability to handle higher cognitive functions, and in the past few years he has been incapable of finding and holding a job.  Only recently have I recognized his cognitive trauma as PTSD-like, showing symptoms like paranoia, depression, mistrust, and hallucinatory reports.  But his brain recovery was so complete he now has a healthy case of denial, stubbornly refusing to recognize he is behaving abnormally.  Still, when seen in comparison to the motorcycle accident victim, our son could have suffered mentally much worse.

Also helping me to recognize my son’s form of PTSD (in my opinion) was my recent development of Perception Theory (Perception is Everything, [Jan., 2016]) and its wide spectrum of applications to our universal experiences (Perception Theory (Perception is Everything) — Three Applications, [Feb., 2016], Perception Theory: Adventures in Ontology — Rock, Dog, Freedom, & God, [March, 2016], and I Believe!, [October, 2016]).  Perception Theory was suggested to me while explaining the role hallucinations played in the origin and development of Christianity (At Last, a Probable Jesus, [August, 2015]), in which I shared my own flashback-like hallucinations.  Emerging from both projects was the realization that my own non-combat hallucinations (requiring only some kind of trauma of the mind — not necessarily bad or harmful trauma) might mean I too have a form of PTSD, and, by extrapolation, that all of us have the capacity to empathize with PTSD victims, for we may have experienced it ourselves without recognizing it as such.

 

I know I can empathize with David, with the motorcycle accident victim, and with my son Dan, for I have had several PTSD flashbacks over the years.  Rather than repeating those in At Last, a Probable Jesus, [August, 2015], I thought I would share with you three others:

1)  As I’ve said in my memoirs and in my book SXYTMCHSC1964M4M (ISBN 978-0-692-21783-2, College Street Press, Waxahachie, TX, 2014) {see Fun Read, [August, 2014] to learn how to obtain a copy}, I grew up simultaneously at three homes: one with my parents in town in Cisco, Texas, and the two rural homes of both sets of my grandparents outside Cisco.  The “home” of my maternal grandparents, the McKinneys, a site that belongs to my wife and me nowadays, was completely destroyed by a tornado in May, 2015.  For sentimental reasons I had the bulldozer and track hoe “cleaning up” the site leave a surviving iron yard gate still swinging on its hinges, so that any time I want, I can go out there, open the gate, and slam it shut.  The sound it makes when closing conjures images of the house and yard and of me going in and out the gate as a young boy.  I cannot help but see the house and yard, even though they are not there today.  The images are triggered by the slamming of the gate; it’s like being one of Pavlov’s dogs.  There is some possible bad trauma in this example, because of the memory of the tornado, but the images are pleasant and very sentimental.  This feels to me like a PTSD-like experience of bittersweet memories and pleasant imagery, triggered by an iron-on-iron collision.  The imagery doesn’t last but a few moments, but can be re-conjured by slamming the gate again.  (This gate triggering also seems to work, at least mildly, on first cousins of mine who also spent a lot of time at the site as young children.)

2)  In the summer of 2007 I brought a very personal and emotional moment upon myself when I confided in my good friend Bill Adling (see SXYTMCHSC1964M4M) that I was about to write my life’s novel at the Mirage Hotel and Casino in Las Vegas.  He was the first in whom I confided such information, and I had insisted on telling him in private, away from our wives.  The site chosen to reveal my secret to Adling was a neon display advertising the Beatles-based performances of “Love” by Cirque du Soleil at the Mirage.  The display had places at which we could sit.  It is hard to overstate how important the Beatles were and are to Adling’s and my friendship — for example, the two of us, along with our fellow fast friend/high school prankster Bob Berry, claim to be the very first Beatles fans in Cisco as 1963 changed to 1964.  How appropriate a setting for me to share my secret with Adling!  Fast forward to the summer of 2016, when just my wife and I were “taking in” Las Vegas, and I was wandering around the casino floor of the Mirage while my wife Sylvia was still playing video poker.  I wandered to the spot where the neon display had been 9 years earlier (it was now gone, despite the fact that “Love” was still playing — we saw the show again, incidentally), but I recognized the spot by its surroundings.  And suddenly, into my head came bright neon lights, Adling’s face, and exchanged words I seemed to remember from almost a decade before!  It was very fleeting but no less vivid.  The “trauma” must have been the “stress” of keeping the secret from everyone except Adling at the time, but the feeling was exhilarating, making me momentarily almost giddy!  I now look upon this moment as a PTSD-like experience.

3)  The third of this trio is the most PTSD-like to me and, coincidentally, the most gross.  Near the McKinney house of 1) above, my Granddad McKinney, among the other animals he raised, kept hogs, lots of hogs, for selling and butchering (yes, the tornado left the rock and concrete foundation of the old slaughter house).  Playing in and around the lots, sheds, and barns there as a boy, I was in a constant menagerie of not only hogs but cattle, chickens, turkeys, and peafowl.  Fast forward to just a few years ago: I had stopped at Brendan Odom’s house (Brendan today leases much of the land my wife and I own, including the McKinney place), which coincidentally is on the road between where my Granddad McKinney lived and where my Granddad Hastings lived, to ask him something.  Away from his house, but sort of in the extended front yard, was a covered cattle trailer, one of my dad’s old ones, in which Brendan kept wild hogs he had trapped for sale to buyers with customers craving “wild pork.” (Today, because of the collapse of the small-scale hog market, no one raises hogs as my grandfather did.)  As I walked by the trailer, I noted there were no hogs in it, but that there had recently been some “residents,” as my nose was bombarded by the unmistakable odor of hog shit!  And the imagery flowed in my head: hogs wallowing, hogs sleeping, hogs feeding, and hogs squealing.  I could not stop seeing them!  As David’s trigger was auditory, mine in this moment was olfactory.  I had to walk away, almost to the house, to get the imagery to stop.  The trauma, as well as the trigger, was the incredibly bad odor, so the images were not particularly pleasant.

 

Perception Theory (Perception is Everything, [Jan., 2016], Perception Theory (Perception is Everything) — Three Applications, [Feb., 2016], Perception Theory: Adventures in Ontology — Rock, Dog, Freedom, & God, [March, 2016], and I Believe!, [October, 2016]) suggests what is going on in our heads during PTSD experiences.  Some non-veridical trauma in our mind triggers uncontrollable perceptions upon our inner world view, momentarily or permanently blocking or suspending the non-veridical brain mechanisms by which we normally determine that what we are perceiving at the moment “must have been a dream.”  The uncontrollable perceptions seem as real as the controlled perceptions we receive from the “outside world” beyond our brains.  They are suspensions of rationality, much like what we do when we fall in love.  Often they make us doubt our sanity, and often we are reluctant to share them with others for fear they will doubt our sanity.  Yet, history has shown they can cover the spectrum of individual perception from the destruction of life, through little or no effect, to the basis of starting a religion or a political movement.

PTSD-like experiences are profound epiphenomenal capabilities of our brain, part of the evolutionary “baggage” that came with our “big brain” development.  I would guess it was a trait neutral to our survival (or one “tagging along” with our vital survival trait of the ability to irrationally fall in love), and, therefore, could be a vestigial trait passed into our future by the same genes that produce our vital non-veridical existence within our brains (in our minds).  Whatever future research into them brings, I will always be fascinated by their possible triggers within an individual, whether it be combat, closed-head injuries, a sound from the past, the Fab Four, or hog shit.

RJH

40 for 40

Upon retiring from public and private school classrooms after 40 years as a physicist who was “called” to teach physics and higher math to college-bound high school juniors (11th grade) and seniors (12th grade), I had accumulated over time certain sayings, thoughts, mores, musings, beliefs, philosophies, etc.  I decided to pen 40 of them, one for each year of my teaching career.  I do not pretend all of them are originally mine, as I’m sure many are paraphrases and/or plagiarisms of sentences that have personal meaning.  Many are school-related in particular, education-related in general, or related to both inside and outside the classroom — to life itself.  There is no order, as I left them in the sequence of my writing them down; therefore, they are not numbered, not only to remind the reader of their random sequencing, but also to remind the reader that they have to me no hierarchy — placing them in some order of importance is a prerogative of the reader, not a preference of mine.  Perhaps they will in part or whole have meaning or usefulness to the reader.  My highest hope is that they will in part or whole be thought-provoking.

(In-depth commentary upon many of these can be found throughout the posts under the title Beyond Good and Evil, on this site www.ronniejhastings.com.)

=> Unquestioning faith is not a virtue; it is a disability.

 

 

=> Knowledge is power for self-control and self-determination; knowledge is freedom of thought; knowledge carries with it the responsibility to pass it on to others.

 

 

=> Respect must be earned, not freely given nor expected.

 

 

=> There is no science of education.

 

 

=> Believe what people do, not what people say.

 
=> Einstein was right about what he said about the universe not because he was Einstein, but because the universe behaves as he said.

 
=> Schools are for the students, of the students, and (for upper grades) by the students.

 
=> The language of the universe is mathematics.

 
=> To be a great teacher, one only needs to be 1) competent and 2) caring.

 
=> Everything can and should be questioned, even this sentence.

 
=> Everything can be made fun of, but only if you include yourself and everything you hold sacred.

 
=> Don’t try to foist your values off on others, especially when they are not solicited.

 
=> We all are children of the stars; we are starstuff.

 
=> Human existence is starstuff in self-contemplation and in contemplation of all other starstuff.

 
=> Funerals are for the living, not for the dead.

 
=> Marriages are for the community of the bonded pair, not for the bonded pair.

 
=> It is highly probable men and women cannot understand each other, for, were that understanding possible, the fascination for each other necessary for pair bonding (& necessary for the propagation of the species) would not be nearly as intense. The two sexes were meant to “drive each other crazy,” so that we will always fall in love.

 
=> Schools are not businesses; schools are not sports teams; schools are not technology exhibitions; schools are not expensive baby-sitting facilities.

 
=> Schools ARE facilitators of developing students’ minds, coordinated by a group of professional colleagues called the faculty.

 
=> Education is multi-pathed communication among students and teachers.

 
=> Personal tastes and choices (e.g. food, drink, music, sports, literature, politics, religion, life styles, etc.) are not to be mandated by society; ethical behavior (e.g. The Golden Rule), on the other hand, is NOT a matter of taste.

 
=> Science is reliable because it is never considered sacred or finished, and because it is never held beyond vicious self-scrutiny.

 
=> Science is not so much “believed in” as it is “subscribed to,” in that subscription to any and all theories can be changed when a better alternative or better alternatives come(s) along.

 
=> Teaching is never better than when the teacher tries to “teach him/herself out of a job.” No greater gift can a teacher give a student than the self-confidence that the student can learn the curriculum just as well without the teacher.

 
=> The teacher who does not learn from the students is not paying attention to his/her classes.

 
=> Particular courses that should be added to public secondary school curricula (required or elective) are 1) philosophy, 2) comprehensive, responsible sex education, 3) comparative religion, and 4) the Bible as literature.

 
=> Teachers are not 2nd-class blue-collar workers; they are professionals, like medical doctors, veterinarians, and lawyers.

 
=> School administrators are too often nothing more than over-paid hall monitors; their job is to support classroom teachers, not manage them.

 
=> The highest paid professionals in a school district should be tenured teachers.

 
=> Students are the clients of teachers; teachers work for their clients, not for administrators, school districts, States, or nations.

 
=> Teacher contracts should not contain the word “insubordination.” Administrators are supporting peers of teachers, not teachers’ “bosses.”

 
=> Education courses are unnecessary for teacher certification; only a period of classroom “student teaching” is.

 
=> “Lesson plans” are unnecessary; they only fill administrators’ filing cabinets; teachers individually develop the syllabi by which they teach day-by-day.

 
=> As professionals, teachers should mentor teachers-to-be, who function in the classroom in a secretarial role and observe the “nuts & bolts” of teaching as part of their “student teaching” requirements.

 
=> HR departments of school districts are support staff for teachers, not strong arms of the district administration.

 
=> Each subject a teacher teaches should be taught as if it were absolutely vital that every student know its content; students should feel the teacher’s passion for the subject.

 
=> Teachers should be hired and fired by other teachers.

 
=> Outside the classroom a teacher should have interests beyond his/her specialty; a teacher should have an extracurricular mental life.

 
=> Schools waste taxpayers’ money through at least two corrupt “good ol’ boy” systems: 1) promoting administrators’ careers via favoritism instead of merit, and 2) exclusive use of school supply companies that deal in ridiculously inflated prices.

 
=> Understanding does NOT necessarily also mean agreement.

RJH

 

“Campusology” at Texas A&M and in Education 6-12

In the wake of my retirement beginning June, 2016, I’ve experienced personally that the positive I’ve done professionally as a Ph.D. scientist who likes to teach (as opposed to a science and math teacher) stands out more than the positive criticism of our educational system I’ve offered in recent years. The wonderful things my former students/present friends said to me at my retirement party were like a life-long justification of my decision so long ago to teach high school juniors and seniors rather than teach physics on the collegiate level. At the same time, too many people at the party seemed to ignore my criticisms of secondary education leading to high school graduation (probably because I used at the party an “inappropriate” term — the “s” word as part of a strong negative adjective abbreviated by “BS” — in describing part of the school administration I experienced over 40 years of teaching in public and private school) and missed the message because they could not get around a word in that message.  I used the “BS” descriptor as a word of emphasis to call people’s attention to the issues I raised in the years just prior to my retirement. In my opinion “BS” does not make my descriptor “positive criticism” oxymoronic, if one hears all the criticism. (See 1: Education Reform — Wrong Models!, [May, 2013], 2: Education Reform — The Right Model, [May, 2013], 3: Education Reform — How We Get the Teachers We Need, [May, 2013])  I learned that even in adulthood so many still judge people strongly by the language they use; such adults overlook that the language was utilized to draw attention to its context; ironic and sad, but true.

So, in sincere gratitude for all the accolades sent my way at the beginning of my retirement, I’d like to attempt my positive criticisms of the education of the young minds in our society once again, this time without such strong language that unfortunately distracts so many.  In a time-honored metaphor, I’d like not only to be an old war horse put out to pasture with accomplishments, fun, and fond memories attached, but also an old war horse put out to pasture with thought-provoking, reformative, and time-proven suggestions attached.  I’d like to leave a legacy both of knowledge in our children’s heads and of ways of improving what we are doing on our children’s campuses to improve that knowledge.  As I spell out in 1: Education Reform — Wrong Models!, [May, 2013], 2: Education Reform — The Right Model, [May, 2013], and 3: Education Reform — How We Get the Teachers We Need, [May, 2013], we can do better than we are.  I know this because I have experienced the “better;” my suggestions are not hopeful fairy tales of what could be; my suggestions are methods I’ve seen work marvelously at Cisco High School, Texas A&M University, Waxahachie High School, and Canterbury Episcopal School.

This time I’d like to use an analogy — one between my experience both in and out of Texas A&M University’s Corps of Cadets in my undergraduate years 1964-1968 and my experience living within administration/faculty/student body situations for 60 of my 70 (so far) years.

 

Texas A&M University is one of those unique institutions offering a full-time ROTC program by which undergraduates can simultaneously graduate with a bachelor’s degree and a commission as a 2nd Lieutenant in the armed forces, potentially leading to a service career of twenty years or so (The Citadel is another such institution, I think.).  As a freshman (a “fish”) at A&M in the school year 1964-1965, I had to be a member of the Corps as well as an entering physics major, as that was the last school year in which being in the Corps was compulsory.  Beginning the following year of 1965-1966, I opted out of the Corps, having learned I did not want to pursue a career in the military; so I became what was known then on campus as a “non-reg,” a non-Corps student.  For the record, I was in the Army ROTC branch of the Corps, assigned to company “Devil” D-3 of the Third Brigade.  Nowadays Texas A&M is almost 9 times larger than it was in 1964-1965, and the overwhelming majority of students are “non-regs.”

Like the service academies at West Point, Annapolis, and Colorado Springs, being in the Texas A&M Corps of Cadets means you “play soldier” 24-7 during the week and on most weekends.  You have to eat, drink, and play Corps as well as eat, drink, and play going to classes working toward your bachelor’s degree.  You are living simultaneously in two different, though not necessarily incompatible, worlds.  For instance, being in the Corps helped me become more disciplined in my study habits.  (As a fish back then I had to study, and only study, every weekday evening from 6:30-10:00 PM at my dorm room desk, monitored every 15 minutes by the sharp eyes of the upperclassmen hall monitors of our outfit (company).)  On the other hand, when I became a non-reg my sophomore year, I had only one world to deal with, which for me as a physics major was about all the “world” I could handle.  Therefore, I know the Corps and its effects upon academics “inside and out.”

Here’s the point:  in the Corps of Cadets at Texas A&M, there are three possible modes of “juggling” the two worlds, with various degrees of emphasis among the three modes, depending upon the mind of the individual student.  1)  You can successfully and consistently keep working toward two goals, the degree and the commission, 2) you can work harder on the degree without much concern for the commission, or 3) you can work harder on the commission without much concern for the degree.  In my fish year, I saw hundreds of fellow students in each mode.  It being the Vietnam war era, the war in southeast Asia played a part in each student’s mind as they found themselves in one of these modes, as each mode carried with it its own risks back in those days.  I was in mode #2 — I was well on my way to being kicked out of the Corps early in the second semester, so “un-sharp” was I (I was called “Cadet Basketball.”), until the company commanding officers (seniors) saw my good first semester grades.  My good grades were needed to keep the outfit’s grade average above the level of acceptability monitored by the Corps Command in the Trigon (A&M’s version of the Pentagon).  Those in mode #3 became known as majoring in “campusology,” more involved with extra-curricular Corps-related activities than with their responsibilities in the classroom.  Campusology was very seductive to lots of students, as #3 mode was lots of fun, even if you were on the receiving end of the hazing fish got constantly; I had loads of fun taking advantage of “fish privileges” and driving upperclassmen (sophomores (“piss-heads”), juniors (“serge-butts”), and seniors (“zipper-heads”)) “crazy.” (Mess hall had its own fun vocabulary — e.g. “bull-neck” was meat, “deal” was sliced bread, and “baby” was mustard, to name a few that can be repeated in polite conversation.)  Campusology majors ran the greatest risk, for if you didn’t pass, you couldn’t stay on campus long.  I tragically saw a good friend be seduced by #3.

Our fish year was only a couple of months along, and my friend and I were both taking calculus for the first time ever, but not in the same section (classroom).  Despite the fact we were in different outfits across campus from each other, he and I arranged a meeting in the library (Today that building looks like a tiny annex attached to the massive library that came after.) so I could help him with his calculus.  His first serious question to me was “What is a derivative?”  He was not joking; he was serious.  Those of you who are “calculus savvy” know someone asking that question two months into a calculus course is in, to put it mildly, trouble.  I helped him, but it was too little, too late for him.  He couldn’t get the help from me he needed, I worried about his academics from then on, and beginning the next semester, he slid into academic probation, resulting in his having to transfer to another college or university.  He was but one example among many who were lured by the fun of campusology.  I’m pleased to say that over many years thereafter he was able to work his way back and eventually get his degree from A&M.
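
(For readers not “calculus savvy,” here is a minimal reminder, in my own notation and added here purely for illustration, of what my friend should have had down cold two months in: the derivative of a function $f$ at a point $x$ is defined as the limit

$$f'(x) \;=\; \lim_{h \to 0} \frac{f(x+h) - f(x)}{h},$$

the instantaneous rate of change of $f$ at $x$.  It is among the very first concepts presented in any first course in calculus, which is why his question told me he was already far behind.)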

No institution of higher learning deliberately sets up a “campusology major” to ensnare unsuspecting students; students alone were responsible for the mode they found themselves in at A&M back in the 1960’s.  But campusology can be a problem on any campus.  For all non-Aggie or non-Aggie-knowledgeable readers, substitute “sorority” or “fraternity” for “Corps of Cadets” in the above and I think all readers will see my point (not to mention substituting “athletics,” “student government,” “committees,” “clubs,” “community service,” “jobs,” etc.).  The social structure of institutions today, just like at A&M in the ’60s, can become the “prize” for students instead of the intended prize of academic success — the only path to a college degree.  If students of higher learning look toward a prize other than the one intended by the establishment of the institution itself, they have created for themselves an inverted, “tail-wagging-the-dog” world; they have closed their eyes while swinging at the pitch; they will come to realize there are no degrees in distractions or diversions — no degrees in campusology.

 

A form of campusology is at work in our public and private schools, a form whose remedy would go a long way toward getting our high school graduates back on par with, and even ahead of, high school graduates in other countries around the world.  I’m not talking about the usual high school extra-curricular activities that take up students’ academic time, like athletics, band, theater, student government, clubs, squads, jobs, proms, etc., though these can be campusology-like impediments toward high school graduation; how many students tragically have to settle for GED’s because they didn’t graduate high school due to being seduced by secondary school campusology?  No, I’m talking about a campusology that affects students like those I was privileged to teach in high school — students sharp and talented enough to easily juggle loads of extra-curricular activities alongside impressive academic success in the classroom.  In other words, a campusology that impedes student preparation for direct matriculation into 4-year universities, an impediment in addition to cost.  It is a campusology that, curiously, involves the relationship between school administration and school faculty. If relieved of this campusology, the percentage of high school graduates in all sizes of graduating classes capable of successfully and immediately attending university classrooms would jump, in my observation, from around 10-11% to around 25%.

This high school administration/faculty campusology I’m speaking of here is labeled “wrong models” in 1: Education Reform — Wrong Models!.  Moreover, this labeling can be applied to administrations and faculties at the junior high or middle school level, in my opinion.  The relief from the problem of this campusology/wrong model is spelled out in 2: Education Reform — The Right Model and in 3: Education Reform — How We Get the Teachers We Need.

In brief, then, a campusology-like mode potentially grips both administration and faculty in public and private schools.  As articulated in, again, 2: Education Reform — The Right Model and 3: Education Reform — How We Get the Teachers We Need, an “adult game” of management diverts both administrators and faculty away from their one and only charge or client — the student body.  Teachers are called “professionals,” yet treated like “employees;” administrators are called “managers,” yet behave like “bosses.”  Policy-level academic decisions are made by administrators, not classroom teachers; teachers are called away from their time with students to help the administrators “manage” (hall duty, lunch room duty, bus arrival/departure duty, etc.).  The campusology becomes obvious when teachers are trained to focus upon being part of a “team,” as opposed to an egalitarian gathering of colleagues.  Teachers are groomed to climb the tenure ladder and establish working relationships with their “bosses,” the administrators, all to the neglect of their duties to their students in the classroom.  Teachers are encouraged to be promoted out of the classroom, to leave their students and become “middle managers” — i.e. assistant principals on the path to becoming head principals.  This promotion is heavily loaded with salary increases, increases towering over those reserved for becoming, say, a department head staying in the classroom.  Such a situation lends itself toward becoming a “good ol’ boy” system, wherein school districts cooperate with each other to pass bad teachers from one district to another and to create positions for “buddies” across districts so that the “buddy” can get a few more years’ service to pad their retirement payments.  Ex-teachers and ex-administrators often are grafted away from campuses to become salesmen for exclusive “good ol’ boy” school-supply companies (not in the spirit of capitalistic competition) charging inflated prices, taking advantage of the school-tax-dollar “cash cow.”  Nothing in the “good ol’ boy” system seems to have the students’ (or teachers’) best interests at heart.  This is why I called “BS” on it this past summer.

Schools are not about the careers of administrators and teachers; schools are about the development of young minds toward meeting the intellectual challenges of the world beyond high school.

2: Education Reform — The Right Model and 3: Education Reform — How We Get the Teachers We Need remind us this campusology featuring the “good ol’ boy” system need not be.  To repeat myself, I saw vestiges of the “right model,” the professional/collegial one, in Cisco High School, in at least two campus administrations during my 32 years teaching at Waxahachie High School, and in Canterbury Episcopal School.  Where I really saw this right model functioning full-time was in graduate school at Texas A&M University.  Wherever I saw this remedy for all forms of campusology, it had nothing to do with departments of education or with degrees in education.  This tells me that not only is there no science of education, but departments of education and their curricula are culpable, deliberately or inadvertently, in the perpetuation of campusology and the “good ol’ boy” system in secondary schools.  Sprinkled throughout 1: Education Reform — Wrong Models!, [May, 2013], 2: Education Reform — The Right Model, [May, 2013], and 3: Education Reform — How We Get the Teachers We Need, [May, 2013], is the constant theme of teachers declaring the education courses they took toward getting teacher-certified were “useless,” save the time spent in the classroom doing their student-teaching.  Holders of degrees in education need to get back in the classroom taking long, hard looks at student behavior rather than at school organization and management.  Moreover, school district administration and school boards need to focus on the classroom also, instead of trying to manage others from their ivory towers.  All school organization outside the classroom should be postured in support of the classroom; the schools are not the focus of education — students’ minds are.

 

As an aid in thinking about education reform, think about a large regional hospital, wherein the hospital administration and board are analogous to school administration (campus and district) and the school board, the professional staff (the doctors) are analogous to school faculty, and the patients are analogous to the classroom students.  Ideally the entire hospital focuses upon its mission of promoting the health and recovery of its patients; hospitals are for patients the same way schools are for students.  What if this hospital were infected by a campusology/good ol’ boy system such as the one described in schools above?  Doctors would have to have administrative approval of their diagnoses and their prescriptions; doctors would be encouraged to become hospital administrators, as the latter are paid more than the former; emphasis for doctors would be upon climbing the tenure ladder; the hospital softball team would become as important as successful patient stays.  That, to me as a potential patient, is insane!  Are schools fraught with diverting campusology and good ol’ boy tactics potentially harmful to student education any less insane?  Based upon 60 years in the classroom as a student or teacher, I don’t think so!  I trust all good teachers, those who are competent and who also care, resonate with what I’m saying.

 

If any school tax payer, any former or present school administrator, any former or present school teacher, or any former or present student thinks positively toward any or all of the above, please read my specific, suggested solutions toward needed education reform in 2: Education Reform — The Right Model and  3: Education Reform — How We Get the Teachers We Need.  In all probability, many such readers could come up with better suggestions, ideas, and plans than I.  In fact, good friend and former student Dr. Burl Barr thinks I don’t have a prayer reforming departments of education as I propose; reluctantly Dr. Stephen Weldon, another friend/former student, agreed with Dr. Barr.  See if you can change Dr. Barr’s and Dr. Weldon’s views on this matter.  Please pass on to us all your suggestions, ideas, and plans.

The idea/philosophy of public education is, in my opinion, one of the greatest, if not the greatest, legacies passed on to modern humankind by the United States of America.  What is happening now in our USA classrooms does not reflect that great legacy.  We need to do something about this incongruity, not just talk about it.  Please join me in working for the development of young minds.

RJH

 

 

Perception Theory (Perception is Everything) — Three Applications

In the presentation of a theory of human existence, Perception is Everything [Jan., 2016], it was suggested the theory could be applied to almost every aspect of human experience.  The model paints the picture of the objective/subjective duality of human existence as the interactive dual flow (or flux) of real-world, empirical, and veridical data bombarding our senses and of imaginative, conceptual, and non-veridical data generated by our mind, all encased within the organ we call the brain.  The two sides of the duality need not be at odds, and both sides are necessary; the objective and the subjective are in a symbiotic relationship that has evolved out of this necessity; what and who we are simultaneously exist because of this symbiosis that dwells in the head of every human individual.  No two humans are alike because no two symbioses in two brains are alike.

This post briefly demonstrates how the perception model of Perception is Everything [Jan., 2016] can be used to contribute insights into I. Development of Self-Consciousness in a Human Infant, II. Education, and III. The Origin of Politics.

 

I. Development of Self-Consciousness in a Human Infant – That the human mind has the ability to develop a concept of “self,” as opposed to “others,” is commonly seen as fundamentally human.  It might not be unique to our species, however, as we cannot perceive as do individuals of other species.  Often pet owners are convinced their dog or cat behaves as if it is aware of its own individuality.  But that might be just too much anthropomorphism cast toward Rover or Garfield by the loving owners.  So fundamental is our self-consciousness, most views would assert its development must commence just after birth, and my perception theory is no exception.

The human baby is born with its “nature” genetically dealt by the parents and altered by the “nurture” of the quality of its gestation within the mother’s womb (or within the “test tube” early on, or within the artificial womb of the future).  The world display screen in the head of the baby (Perception is Everything [Jan., 2016]) has to be primitive at birth, limited to whatever could bombard it veridically and non-veridically while in the womb.  (Can a baby sense empirical data?  Can a baby dream?  Are reflex movements of the fetus, which the mother can feel before birth, recorded in the memory of the fetus?)  Regardless of any answers to these questions, perception theory would describe the first moments after the cutting of the umbilical cord as the beginning of a “piece of star-stuff contemplating star-stuff all around it” (Perception is Everything [Jan., 2016]).  The event causing the baby to take its first breath begins the lifelong empirical veridical flux entering one “side” of the baby’s world display screen, triggering an imaginative non-veridical flux from the other “side” of the screen.  The dual flux has begun; the baby is “alive” as an individual, independent of the symbiosis with its mother’s body; its life as a distinct person has begun.

The unique “long childhood” of Homo sapiens (due to the size-of-the-birth-canal/size-of-the-baby’s-skull-after-9-months’-gestation consideration), the longest “childhood” of any species before the offspring can “make it on its own” — a childhood necessarily elongated, else we would not be here as a species today — assures the world display screen is so primitive that the first few days, weeks, and months of each of us are never remembered as our memory develops on the non-veridical side of the screen.  It takes a while for memory generated from the empirical veridical flux to be able to create a counter flow of imaginative non-veridical flux back to the screen.  Perception is Everything [Jan., 2016] indicates the dual flow is necessary for the screen to become “busy” enough to be noticed by the “mind’s eye,” that within us which “observes” the screen.  No doubt all of us first had our screens filled by perceptions of faces of caretakers (usually dominated by our mother’s face) and sensations of sound, touch, smell, and taste as our bodies adapted to the cycles of eating, eliminating, and sleeping.  In waking hours during which we were doing none of these, we began to focus on the inputs of our senses.  Inevitably we process non-veridically how we are aware of these inputs; and just as inevitably we at some point become aware of a “perceiver,” an observer of these inputs; we develop an idea that “something” is perceiving, that this “something” relates to our caretaker(s) (whose face(s) we always feel good seeing), and that this “something” is us.  In each individual, the development of a subjective “I” is normally “there” in the head in a few months (the exact time interval probably differing for each individual); a distinction between “me” and “not-me” begins.  This distinction is self-consciousness in-the-making, or “proto-self-consciousness.”

That distinction between “me” and “not-me” is vital and fundamental for each piece of star-stuff beginning to contemplate his or her “fellow” star-stuff — contemplation that is constantly painting an increasingly complex world display screen inside his or her head.  Early on, anything that “disappears” when eyes are closed is “not-me;” anything that is hungry, that likes things in a hole below the eyes to quench that hunger, that experiences discomfort periodically way below the eyes, and that feels tactile sensations from different locales in the immediate vicinity (through the skin covering all the body as well as the “hole below,” the mouth) is “me.”  Eventually, “me” is refined further to include those strange appendages that can be moved at will (early volition) and put into the hunger hole below the eyes, two of which are easy to put in (hands and fingers) and two of which are harder to put in (feet and toes).  That face that seems to exist to make “me” feel better and even happy turns out to be part of “not-me,” and it becomes apparent that much of “not-me” does not necessarily make “me” feel better, but is interesting nonetheless.  Reality is being sorted out in the young brain into that which is sorted and that which sorts, the latter of which is the “mind’s eye,” self-consciousness.

In time, “me” can move at will, and that which can move thus is the “housing” and boundary limiting “me.”  As soon as the faces “me” can recognize are perceived to represent other “me’s,” the distinction between “me” and “you” begins, soon followed by “me,” “you,” and “them.”  Some “you’s” and “them’s” don’t look like other “you’s” and “them’s,” such as household pets.  Still other “you’s” and “them’s” don’t move on their own like “me” (soon to be “I”) does, such as dolls and stuffed animals.  “You’s” and “them’s” separate into two categories — “alive” and “not-alive.”  As quantity becomes a more developed concept, it soon becomes apparent that outside “me” there are more “not-alives” than “alives;” “not-alives” soon are called “things,” and “alives” take on unique identities as “me” learns to recognize and later speak names.  Things are also non-veridically given names, and the genetic ability to quickly learn language “kicks in,” as well as the genetic ability to count and learn math.  In a few months’ time, existence for “me” has become both complex and fixating to its mind/brain, and growing at an increasing rate (accelerated growth).  The name non-veridically given to “me” is the subjective “I” or the objective “myself” — both of which are understood to be self-consciousness.

This clearly is an approach similar to a psychology of infants, which might deal eventually with the development of the ego and the id.  This approach using perception theory allows a seamless tracing of the development of the human mind back before birth, employing a more objective approach to talking about subjectivity than possessed by some other psychological approaches; it is an approach based upon evolutionary psychology.  In addition, it is clear that the emergence of self-consciousness according to perception theory demands a singular definition of the “self” or of “I” or of “myself,” in order to avoid problems like those of dissociative identity disorder (popularly, if inaccurately, called schizophrenia), with its multiple personalities.  Perhaps the widespread phenomenon of children making up “imaginary friends” is an evolved coping mechanism in the individual child’s imagination in order to avoid such disorders; an imaginary friend is not the same as the self-consciousness producing such a “friend.”  Just like the individual brain, self-consciousness is singularly unique, in ontological resonance with the brain.

 

II.  Education – Perception theory is compatible with the idea of what education should be.  Education is not a business turning students into future consumers; education is not a sports team turning students into participants; education is not training to turn students into operators of everything from computer keyboards to spaceship control panels.  Instead, education is but the development of students’ minds (1. Education Reform — Wrong Models! [May, 2013], 2. Education Reform — The Right Model [May, 2013], 3. Education Reform — How We Get the Teachers We Need [May, 2013], & Top Ten List for Teachers of HS Students Preparing for College or University (Not a Ranking) — A List for Their Students, Too! [Dec., 2014]).  The word “but” here is somewhat misleading, as it indicates that education might be simple.  However, education is so complex that as yet we have no science of education (#1 on the “List” in Top Ten List for Teachers of HS Students Preparing for College or University (Not a Ranking) — A List for Their Students, Too! [Dec., 2014]).  Perception theory indicates why education is so complex as to defy definition and “sorting out.”  Defining education is like the brain trying to define its own development, or like a piece of star-stuff trying to self-analyze and contemplate itself instead of the universe outside itself.  At this writing, I am inclined to say that a more definitive sorting out of what education is and how it is accomplished inside individual brains is not impossible in the way that an individual’s seeing his/her own brain activity is impossible, or in the way that another person’s seeing the subjective world display screen in my head is impossible (the “subjective trap”) (Perception is Everything [Jan., 2016]).

Following this optimistic inclination, education is seen as developing in individual brain/minds a continuous and stable dual flow of veridical flux and non-veridical flux upon the individual’s world display screen (Perception is Everything [Jan., 2016]).  A “balance” of this dual flow is seen in Perception is Everything [Jan., 2016] as a desired “mid-point” of a spectrum of sanity, the two ends of which denote extreme cases of veridical insanity and non-veridical insanity.  Therefore, the goal of education is to make the probability of becoming unbalanced away from this mid-point in either direction as small as possible; in other words, education attempts, ideally, to concentrate and focus the non-veridical in the student’s mind upon the veridical as much as possible.  The non-veridical vigor of “figuring out” the veridical from “out there” outside the brain is matched by the vigor of the empirical bombardment of that same veridical daily data.  Making this focus a life-long habit, making this focus a comfortable, “natural,” and “fun” thing for the non-veridical mind to do for all time, is another way to state this goal of education.  Defining education in this manner seems compatible and resonant with the way our mind/brain seems to be constructed (with the necessary duality of the objective and the subjective); our mind/brains seem evolved to be comfortable being at the mid-point without struggling to get or stay there; self-educated individuals are those fortunate enough to have discovered this comfort mostly on their own; graduates of educational institutions who become life-long scholars have been guided by teachers and other “educators” to develop this “comfort zone” in their heads.  Education, in this sense, is seen as behaving compatibly with the structure of the brain/mind that has assured our survival over our evolution as a species.  In order to successfully, comfortably, and delightfully spend our individual spans of time in accordance with the evolution of our mind/brains, we must live a mental life of balance of the two fluxes; education, properly defined and thought upon in individual mind/brains, assures this balance, and therefore assures lives of success, comfort, and delight.  He/she who is so educated uses his/her head “in step” with the evolution of that head.

We evolved not to be religious, political, or artistic; we evolved to be in awe of the universe, not in awe of the gods, our leaders, or our creations.  We evolved not to be godly, patriotic, or impressive; we evolved to survive so that our progeny can also survive.  Religion, politics, and the arts are products of our cultural evolution, invented by our non-veridical minds to cope with surviving in our historical past.  In my opinion these aspects of human culture do not assure the balance of the two fluxes that maximizes the probability of our survival.  Only focusing upon the universe of which we are a part will maximize that probability — thinking scientifically and “speaking” mathematically, in other words.  Education, therefore, is properly defined as developing the scientifically focused mind/brain; that is, developing skills of observation, pattern recognition, mathematical expression, skepticism, imagination, and rational thinking.  But it is not an education in a vacuum without the ethical aspects of religion, the social lessons of political science and history, and the imaginative exercises of the arts.  In this manner religious studies, social studies, and the fine arts (not to mention vocational education) all can be seen as ancillary, participatory, and helpful in keeping the balance of the two fluxes, as they all strengthen the mind/brain to observe, recognize, think, and imagine (i.e. they exercise and maintain the “health” of the non-veridical).  I personally think non-scientific studies can make scientific studies even more effective in the mind/brain than scientific studies without them; non-scientific studies are excellent exercises in developing imagination, expression, senses of humor, and insight, attributes as important in doing science as in doing non-science.  The “well-rounded” scholar appreciates the role both the objective and the subjective play in the benefit of culture better than the “specialist” scholar, though both types of scholars should understand that the focus of all study, scientific or not, should be upon the veridical, the universe “out there.”  Not everyone can develop their talents, interests, and skills in the areas of science, math, engineering, and technology, but those who do not can focus their talents, interests, and skills toward developing some aspect of humanity-in-the-universe — toward exploring the limitless ramifications of star-stuff in self-contemplation.

Therefore, education, Pre-K through graduate school, needs a new vertical coordination or alignment of all curricula.  ALL curricula should be taught in a self-critical manner, as science courses are taught (or should be taught if they are not).  An excellent example of what this means was the list of philosophy courses I took in undergraduate and graduate school.  Virtually all the philosophy courses I took or audited were taught in a sequence of presentation of X, of good things about X, and of bad things about X.  In other words, all courses, regardless of level, should be taught as being fallible, not dogmatic, and subject to criticism.  A concept of reliable knowledge, not absolute truth, should be developed in every individual mind/brain, so that reliability is proportional to verification when tested against the “real world,” the origin of the veridical flux upon our world display screen; what “checks out” according to a consensus of widely-accepted facts and theories is seen as more reliable than something supported by no such consensus.  Hence, the philosophy of education should be the universal fallibility of human knowledge; even the statement of universal fallibility should be considered fallible.  Material of all curricula should be presented as for consideration, not as authoritative; schools are not to be practitioners of dogma or propagators of propaganda.  No change should occur in the incentive to learn the material just because it is all considered questionable, as material continues often to be learned in order to pass each and every course through traditional educational assessment (tests, exams, quizzes, etc.).  And one does not get diplomas (and all the rights and privileges that come with them) unless one passes his/her courses.  Certainly the best incentive to learn material, with no consideration of its fallibility other than that it’s all fallible, is the reward of knowing for its own sake; for some students, the fortunate ones, the more one knows, the more one wants to know; just the knowing is its own reward.  Would that a higher percentage of present and future students felt that way about what they were learning in the classroom!

The “mantra” of education in presenting all-fallible curricula is embodied in the statement of the students and for the students.  Institutions of learning exist to develop the minds of students; socialization and extracurricular development of students are secondary or even tertiary compared to the academic development of students, as important as these secondary and tertiary effects obviously are.  As soon as students are in the upper years of secondary schooling, the phrase by the students should be added to the other two prepositional phrases; in other words, by the time students graduate from secondary schools, they should have first-hand experience with self-teaching and tutoring, and with self-administration through student government and leadership in other student organizations.  Teachers, administrators, coaches, sponsors, and other school personnel who do not do what they do for the sake of students’ minds are in the wrong line of work.

Educational goals of schools should be the facilitation of individual student discovery of likes, dislikes, strengths, weaknesses, tastes, and tendencies.  Whatever diploma a student clutches should be understood as completing a successful regimen of realistic self-analysis; to graduate at some level should mean each student knows him/herself in a level-appropriate sense; at each level each student should be simultaneously comfortable with and motivated by a realistic view of who and what he/she is.  Education should strive to have student bodies free of “big-heads,” bullies, “wall-flowers,” and “wimps.”  Part of the non-academic, social responsibility of schools should be help for students who, at any level, struggle, for whatever reason, in reaching a realistic, comfortable, and inspiring self-assessment of themselves.  Schools are not only places where you learn stuff about reality outside the self; they are places where you learn about yourself.  Students who know a lot “outside and inside” themselves are students demonstrating that the two fluxes upon the world display screen in their heads are in some sense balanced. (1. Education Reform — Wrong Models! [May, 2013], 2. Education Reform — The Right Model [May, 2013], 3. Education Reform — How We Get the Teachers We Need [May, 2013], & Top Ten List for Teachers of HS Students Preparing for College or University (Not a Ranking) — A List for Their Students, Too! [Dec., 2014])

Consequently, the only time education should be seen as guaranteeing equality is at the beginning, at the “start-line” the first day in grade K.  Education is in the “business” of individual development, not group development; there is no common “social” mind or consciousness — there is only agreement among individual brain/minds.  Phrases like “no child left behind” have resulted in overall mediocrity, rather than overall improvement.  Obviously, no group of graduates at any level can be at the same level of academic achievement, as each brain has gained knowledge in its own, unique way; some graduates emerge more knowledgeable, more talented, and more skilled than others; diverse educational results emerge from the diversity of our brain/minds; education must be a spectrum of results because of the spectrum of our existence, our ontology, of countless brain/minds.  Education, therefore, should be seen as the guardian of perpetual equal opportunity from day 1 to death, not the champion of equal results anywhere along the way.

[Incidentally, one of the consequences of “re-centering” or “re-focusing” the philosophy, the goals, and the practices of education because of perception theory may be a surprising one.  One aspect of a scientific curriculum compared to, say, an average “humanities” curriculum is that in science, original sources are normally not used, unless it is a history and philosophy of science course (Is history/philosophy of science a humanities course?).  I am ending a 40-year career of teaching physics, mostly the first-year course of algebra-based physics for high school juniors and seniors, and, therefore, ending a 40-year career introducing students to the understanding and application of Isaac Newton’s three laws of motion and Newtonian gravitational theory.  Never once did I ever read to my physics students, nor did I ever assign to my physics students to read, a single passage from Philosophiae Naturalis Principia Mathematica, Newton’s introduction to the world of these theories.  Imagine studying Hamlet but never reading Shakespeare’s original version or some close revised version of the original!

The reason for this comparison is easy to see (but not easy for me to put in few words):  science polices its own content; if nature does not verify some idea or theory, that idea or theory is thrown out and replaced by something different that does a better job of explaining how nature works.  At any moment in historical time, the positions throughout science are expected to be the best we collectively know at that moment.  Interpretations and alternative views outside the present “best-we-know” consensus are the right and privilege of anyone who thinks about science, but until those interpretations and views start making better explanations of nature than the consensus, they are ignored (and, speaking as a scientist, laughed at).

Though many of the humanities are somewhat more “scientific” than in the past — for instance, history being more and more seen as a forensic science striving to recreate the most reasonable scenes of history — they are by definition focused on the non-veridical rather than the veridical.  They are justified in education, again, because they aid and “sharpen” the non-veridical to deal with the veridical with more insight than we have had in the past.  The problems we face in the future are better handled not only with knowledge and application of science, math, engineering, and technology, but also with knowledge of what we think about, of what we imagine, of the good and bad decisions we have made collectively and individually in the past, and of the myriad of ways we can express ourselves, especially about the veridical “real” world.  Since the original sources of these “humanities” studies are seen as applicable today as they were when written — since they, unlike Newton, were not describing reality, but only telling often imaginative, indemonstrable, and unverifiable stories about human behavior to which humans today can still relate — the original authors’ versions are usually preferred over modern “re-hashes” of the original story-telling.  The interest in the humanities lies in relating to the non-veridical side of the human brain/mind, while the interest in the sciences lies in the world reflecting the same thing being said about it; Newton’s laws of motion are “cool” not because of the personality and times of Isaac, but because they appear to most people today “true;” Hamlet’s soliloquies are “cool” not because they help us understand the world around us, but because they help us understand and deal with our non-veridical selves, which makes their creator, Shakespeare, also “cool;” the laws of motion, not Newton, are today relevant, but Shakespeare’s play is relevant today because in its original form it still leads to a myriad of possibly useful interpretations.  What leads to veridical “truth” is independent of its human source; what leads to non-veridical “stories” is irrevocably labeled by its originator.

To finally state my bracketed point on altered education, as begged above the opening bracket: science, math, and engineering curricula should be expanded to include important historical details of scientific ideas, so that the expulsion of the bad ideas of the past as well as the presentation of the good ideas of the present are included.  Including the reasons the expunged ideas are not part of the curriculum today would be the “self-critical” part of science courses.  Science teachers would be reluctant to add anything to the curriculum because of lack of time, true enough, but the clever science teacher can find the time needed by being more anecdotal in his/her lessons, which would require being more knowledgeable of the history and philosophy of science.  Hence, all the curricula in education suggested by perception theory would be similar — cast in the universal presentation of X, of good things about X, and of bad things about X mold.]

 

III.  The Origin of Politics (The “Toxic Twin”) – Perception is Everything [Jan., 2016] makes dealing with human politics straightforward, in that politics not only originated, in all likelihood, just as religion and its attendant theology originated, it has developed along the same lines as theology, so similarly that politics could be considered the “toxic twin” of theology, in that it can turn as toxic (dangerous) to humanity as theology can turn. (Citizens! (I) Call For the Destruction of the Political Professional Class [Nov., 2012], Citizens! (II) The Redistribution of Wealth [Jan., 2013], Citizens! (III) Call for Election Reform [Jan., 2013], The United States of America — A Christian Nation? [June, 2012], An Expose of American Conservatism — Part 1 [Dec., 2012], An Expose of American Conservatism — Part 2 [Dec., 2012], An Expose of American Conservatism — Part 3 [Dec., 2012], Sorting Out Jesus [July, 2015], At Last, a Probable Jesus [Sept., 2015], & Jesus — A Keeper [Sept., 2015])  In order for us to survive in our hunter-gatherer past, leaders and organizers were apparently needed as much as shamans, or proto-priests; someone or a group of someones (leader, chief, council, elders, etc.) had to decide the best next thing for the collective group to do (usually regarding the procuring of food for the group’s next eating session, or regarding threats to the group from predators, storms, or enemy groups over the next hill, etc.); just as someone was approached to answer the then unanswerable questions, like where the storms come from and why so-and-so had to die, leaders of the group were looked to for solving the group’s practical and social problems.  In other words, politics evolved out of necessity, just like religion.  Our non-veridical capabilities produced politics to meet real needs, just as they produced religion to meet real needs.

But, just as theology can go toxic, so can politics and politics’ attendant economic theory.  Voltaire’s statement that those who can make you believe in absurdities can make you commit atrocities applies to political and economic ideology just as it does to gods and god stories.  Anything based purely upon non-veridical imagination is subject to the application of Voltaire’s statement.  However, I think politics has an “out” that theology does not.  Theology is epistemologically trapped, in that one god, several gods, or any god story cannot be shown to be truer (better in describing reality) than another god, other several gods, or another god story.  Politics is not so trapped, in my opinion, as it does not have to be “attached at the hip” to religion, as has been demonstrated in human history since the 18th century.  Politics can be shown to be “better” or “worse” than its previous version by comparing the political and social outcome of the “before” with the “after.”  No political solution solves all human problems, if for no other reasons than that such problems continually evolve in a matter of weeks or less, and that no political installment can anticipate the problems it will encounter, even when it has solved the problems of the “before.”  Nonetheless, I think one can argue that the fledgling United States of America created by the outcome of the American Revolution and the birth of the U.S. Constitution was better than the colonial regime established in the 13 colonies by the reign of George III.  The same can be said about the independent nations that emerged peacefully from being commonwealths of the British Empire, like India, Canada, and Australia, though the USA, India, Canada, and Australia were and are never perfect and free from “birth pangs.”

What are the political attributes that are “better” than what was “before?”  Many of the references cited just above point out many of them, a list I would not claim to be complete or sufficient.  Overall, however, the history of Western and Eastern Civilization has painfully demonstrated, at the cost of the spilled blood of millions (Thirty Years’ War, Napoleonic Wars, World War I, World War II, etc.), that theocracies and monarchies are “right out.”  [Here I am applying the philosophy that history is not so much a parade of great individuals, but, rather, is more aptly seen as a parade of great ideas — a parade of non-veridical products much better than other such products.]  Direct democracies only work for small populations, so a representative form of government, a republic, works for the larger populations of the modern world.  Clearly, secular autocracies and dictatorships are also “right out.”  Class structure of privilege and groundless entitlement still rears its ugly head even in representative republican governments, in the form of rule-by-the-few of power (oligarchies) and/or wealth (plutocracies).  To prevent oligarchies and plutocracies, elected representative government officials should be limited in how long they can serve so that they cannot become a political professional class (limited terms of office); in other words, politicians should be paid so that they cannot make a profit.

[Almost exactly the same things can be said of government staffs and other non-elected officials — the bureaucrats of “big government.” Terms of service should be on a staggered schedule of limitations so that some “experience” is always present among both the elected and their staffs; bureaucrats should be paid so that they cannot become a professional class of “bean-counters” at taxpayer expense; public service should be kept based upon timely representation, and civil service should be kept based upon a system of timely merit; politicians are elected by voters, and bureaucrats are selected by civil service testing — both groups subject to inevitable replacement.]

This, in turn, calls for severe restrictions on the lobbying of elected officials of all types (making lobbying a crime?). Preventing oligarchies and plutocracies of any “flavor” can only be effective if the overall political philosophy applied is a liberal one (“liberal” meaning the opportunity to achieve wealth, power, and influence while simultaneously working so that others around you, all over the globe, can achieve the same, all without unjust expense to someone else’s wealth, power, and influence). The philosophy of such a liberal posture I call “liberalist,” meaning that freedom, equality, and brotherhood (the liberté, égalité, and fraternité of the French Revolution) are all three held constantly at equal strength. When one or two of the three are reduced to relatively boost the other two or one, then things like the atrocities of the French Terror, the atrocities of fascism, the atrocities of communism, or the atrocities of unregulated capitalism result.

[The word “equality” in political philosophy as used above must be distinguished from the “equality” issue of education in II. above. When the Declaration of Independence says “all men are created equal,” that does not mean equal in knowledge, talents, and skills; rather it means a shared, universal entitlement to basic human rights, such as, in the Declaration’s words, “life, liberty, and the pursuit of happiness.” We all have equal rights, not equal educational results; equal rights do not mean equal brain/minds — something the Terror tragically and horribly did not grasp; equal rights to education do not mean equal knowledge, talents, and skills for graduates — something too many “educators” tragically do not grasp. Perception theory would suggest political equality is different from educational equality; the word “equality” must be understood in its context if the appropriate adjective is not used with the noun. The difference is crucial: political equality is crucial to the healthy social organization of the species, while educational “equality” (equal results, not equal opportunity) is tragic and harmful to the individual brain/minds of the species. Awareness of this difference, or always making this semantic distinction, should avoid unnecessary confusion.]

Certain Western European countries, such as the Scandinavian countries, have shown the future of political systems toward which all nations should strive in accordance with liberal, liberalist views. If anything is needed by the population at large, then a socialist program is called for to deal with all fairly — such as social security, free public education through the university level, postal service, public transportation, universal single-payer health care, public safety, state security, and “fair-share” taxation of all who earn and/or own. No one is allowed to achieve personal gain through regulated capitalism or through leadership in any of these socialist programs except upon merit, meaning his/her gain (in wealth, power, and/or influence) is not at the unjust loss of someone else and is based solely upon the successful individual’s talents, skills, and knowledge; competition in capitalism and in program leadership is both necessary and in need of limitations. It is OK to “lose” in the game of capitalism, as long as one loses “fair and square;” every business success and every business failure must be laid at the feet of the entrepreneur. The political system with its social programs is merely the crucible of both individual success and individual failure, a crucible continually monitored and regulated so as to assure perpetual and equal opportunity for all. Regulation of the crucible is achieved by the electors of political leadership and program leadership — regulation keeping the programs, like capitalism, perpetually merit-based, fair, and just. This is a system of “checks and balances” toward which every political system should strive.

History has taught us that the foregoing is not a description of some “pie-in-the-sky” Utopia; it is a description of what history has painfully taught us is “the way” of avoiding a theology-like toxicity in politics. Politics is not doomed to be theology’s “toxic twin,” but it will be so doomed if the bloody lessons of its past are not heeded. In my opinion, it really is not complicated: it is better to liberally trade, tolerate, and befriend than to conservatively exploit, distrust, and demonize. Politically speaking, we need to securely develop a xenophilia to replace our prehistoric and insecure xenophobia. This “xeno-development” is one of the great lessons taught by the modern world over the last 300 years, and it is begged by perception theory.

RJH

 

Perception Is Everything

Recently a model of human perception has occurred to me. Perception is like that “screen” of appearance before us in our waking hours that is turned off when we are asleep. Yet it appears not to turn off entirely during slumber, as when we remember the dreams we had just before we awoke. The moments just before we “nod off” or just as we awake seem to be times when perception is “half-way” turned on. The “fuzziness” of this “half-way switch” is clearly apparent on those mornings we awake and momentarily do not know exactly where we slept.

 

Say I am sitting in an enclosed room with a large card painted uniformly with a bright red color. For simplicity, I focus only upon my visual sensation, suppressing the facts that I am also sensing the tactile signals of sitting in a chair with my feet on the floor and that I am peripherally seeing “in the corner of my eye” the walls and other features of the room; I am only visually observing the color “red.” Light from the card enters my eyes and is photo-electrically and electro-chemically processed into visual signals down my optic nerve to the parts of my brain responsible for my vision. The result of this process is the perception of the color “red” on the “screen” of my perception. If I were to describe this perception to myself, I would simply imagine the word “red” in my head (or the word for “red” in some other language, if my “normal” spoken language were not English); were I to describe this perception to someone else in the room, say, a friend standing behind me, I would say, “I am seeing the color red,” again in the appropriate language.

Yet, if my friend could somehow see into my head and observe my brain as I claimed to be seeing red, that person would not experience my sensation or perception of “red.” He/she would see, perhaps with the help of medical instrumentation, biochemical reactions and signals on and in my brain cells. Presumably when I perceive red at a different moment in time later on, the observer of my brain would see the same pattern of chemical reactions and bio-electrical signals.

 
On the “screen” of my perception, I do NOT see the biochemistry of my brain responsible for my perception of red; were I to observe inside the head of my friend in the room while he/she was also focusing on the red card, I would NOT see his/her “screen” of perception, but only the biochemical and bio-electrical activity of his/her brain. It is IMPOSSIBLE to both experience (perceive) the subjective perception of red and observe the biochemistry responsible for that same subjective perception within the same person. We can hook up electrodes from our own head to a monitor and observe the monitor at the same time we look at red, but we would then be perceiving red alongside just another representation of the biochemistry forming our perception, not the biochemistry itself. I call this impossibility “the subjective trap.”

 
And yet, my friend and I make sense of our very individual impossibilities, of our very personal subjective traps, by behaving as if the other perceives red subjectively exactly the same, and as if the biochemical patterns in our respective brains are exactly the same. We are ASSUMING these subjective and biochemical correlations hold, but we could never show this is the case; we cannot prove the perceptions in our own head are the same perceptions in other heads; we cannot ever know that we perceive the same things that others around us perceive, even if we focus upon the exact same observation. The very weak justification of this assumption is that we call our parallel perceptions, in this scenario, “red.” But this is merely the learning of linguistic labels. What if I were raised in complete isolation and was told that the card was “green”? I would say “green” when describing the card while my friend, raised “normally,” would say “red.” (Note I’m stipulating neither of us is color blind.) Such is the nature of the subjective trap.

 
[If one or both of us in the room were color-blind, comparison of visual perceptions in the context of our subjective traps would be meaningless — nothing to compare or assume. In this scenario, another sensation both of us could equally perceive, like touching the surface of a piece of carpet or rubbing the fur of a cute puppy in the room with us, would be substituted for seeing the color red.]

 
The subjective trap suggests the dichotomy of “objective” and “subjective.” What we perceive “objectively” and what we perceive “subjectively” do not seem to overlap (though they seem related and linked), leading to a separation of the two adjectives that has a checkered history in our culture. Using crude stereotypes, the sciences claim objectivity is good while subjectivity is suspect, while the liberal arts (humanities) claim subjectivity is good while objectivity is ignorable. Even schools, colleges, and universities are physically laid out with the science (including mathematics and engineering) buildings on one end of the campus and the liberal arts (including social studies and psychology) buildings on the other. This is the “set-up” for the “war of words” between the “two cultures.” I remember, as an undergraduate physics major, debating an undergraduate political science major, as we walked across campus, over which has had the greater impact upon civilization, science or politics. We soon came to an impasse, an impasse that possibly could be blamed, in retrospect over the years, on the subjective trap. Ideas about the world outside us seemed at odds with ideas about our self-perception; where we see ourselves seemed very different from whom we see ourselves to be; what we are is different from who we are.

Yet, despite being a physics major and coming down “hard” on the “science side” of the argument, I understood where the “subjective side” was coming from, as I was in the midst of attaining, in addition to my math minor, minors in philosophy and English; I was a physics major who really “dug” my course in existentialism. It was as if I “naturally” never accepted the “two cultures” divide; it was as if I somehow “knew” both the objective and the subjective had to co-exist to adequately describe human experience, to define the sequence of perception that defines a human’s lifespan. And, in this sense, if one’s lifespan can be seen as a spectrum of perception from birth to death of that individual, then, to that individual, perception IS everything.

How can the impossibility of the subjective trap be modeled? How can objectivity and subjectivity be seen as a symbiotic, rather than as an antagonistic, relationship within the human brain? Attempted answers to these questions constitute recent occurrences inside my brain.

 

Figure 1 is a schematic model of perception seen objectively – a schematic of the human brain and its interaction with sensory data, both from the world “outside” and from the mind “inside.” The center of the model is the “world display screen,” the result of a two-way flow of data, empirical (or “real world” or veridical) data from the left and subjective (or “imaginative” or non-veridical) data from the right. (Excellent analogies to the veridical/non-veridical definitions are the real image/virtual image definitions in optics; real images are those formed by actual rays of light and virtual images are those of appearance, only indirectly formed by light rays due to the way the human brain geometrically interprets signals from the optic nerves.) [For an extensive definition of veridical and non-veridical, see At Last, A Probable Jesus [August, 2015]] Entering the screen from the left is the result of empirical data processed by the body’s sense organs and nervous system, and entering the screen from the right is the result of imaginative concepts, subjective interpretations, and ideas processed by the brain. The “screen” or world display is perception emerging to the “mind’s eye” (shown on the right “inside the brain”) created by the interaction of this two-way flow.

 
Figure 1 is how others would view my brain functioning to produce my perception; Figure 1 is how I would view the brains of others functioning to produce their perceptions. This figure helps define the subjective trap in that I cannot see my own brain as it perceives; all I can “see” is my world display screen. Nor can I see the world display screens of others; I can only view the brains of others (short of opening up their heads) as some schematic model like Figure 1. In fact, Figure 1 is a schematic representation of what I would see were I to peer inside the skull of someone else. (Obviously, it is grossly schematic, bearing no resemblance to brain, nervous system, and sense organ physiology. Perhaps those far more proficient in brain function than I, now and surely in the future, can and will correlate the terms on the right side of Figure 1 with actual parts of the brain.)

 
Outside data is collectively labeled “INPUT” on the far left of Figure 1, bombarding all the body’s senses — sight, sound, smell and taste, heat, and touch. Data that stimulates the senses is labeled “PERCEPTIVE” and either triggers the autonomic nervous system to the muscles for immediate reaction (sticking your fingers into a flame), requiring no processing or thinking, or goes on to be processed as possible veridical data for the world display. However, note that some inputs for processing “bounce off” and never reach the world display; if we processed the entirety of our data input, our brains would “overload,” using up all brain function for storage and having none left for consideration of the data “let in.” This overloading could be considered a model for so-called “idiot savants” who perceive and remember so much more than the “average” person (“perfect memories”), yet have subnormal abilities for rational thought and consideration. Just how some data is ignored and some is processed is not yet understood, but I would guess that it is a process that differs in every developing brain, resulting in no two brains, even those of twins, accepting and rejecting data EXACTLY alike. What is for sure is that we have evolved “selective” data perception over hundreds of thousands of years, and it has assured our survival as a species.
The accepted, processed data that enter our world display in the center of Figure 1 as veridical data from the outside world make up the “picture” we “see” on our “screen” at any given moment, a picture dominated by the visual images of the objects before us, near and far, but also supplemented by sound, smell, tactile information from our skin, etc. (This subjective “picture” is illustrated in Figure 2.) The “pixels” of our screen, if you please, enter the subjective world of our brain shown on the right of Figure 1 in four categories – memory loops, ideas, self-perception, and concepts – as shown by the broad, straight, double-headed arrows penetrating the boundary between the world display and the four categories. The four categories “mix and grind” this newly entered data with previous data in all four categories (shown by crossed and looped broad, double-headed arrows) to produce imagined and/or reasoned results projected back upon the same world display as the moment’s “picture” – non-veridical data moving from the four categories back into the display (thus the “double-headedness” of the arrows). Thus we can imagine things before us that are not really there at the moment; we can, for instance, imagine a Platonic “perfect circle” (non-veridical) not really there upon a page of circles actually “out there” drawn upon a geometry textbook’s page (veridical) at which we are staring. In fact, the Platonic “perfect circle” is an example of a “type” or “algorithmic” or symbolic representation for ALL circles, created by our subjective imagination so we do not have to “keep up” with all the individual circles we have seen in our lifetime. Algorithms and symbols represent the avoidance of brain overload.

 
From some considered input into our four categories of the brain come “commands” to the muscles and nervous system to create OUTPUT and FEEDBACK into the world outside us in addition to the autonomic nerve commands mentioned above, like the command to turn the page of the geometry text at which we are looking. Through reactive and reflexive actions, bodily communication (e.g. talking), and environmental manipulation (like using tools), resulting from these feedback outputs into the real world (shown at bottom left of Figure 1), we act and behave just as if there had been an autonomic reaction, only this time the action or behavior is the result of “thinking” or “consideration.” (The curved arrow labeled “Considered” leading to the muscles in Figure 1.)
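For readers who think best in code, the flow just described (throttled input, a display fed from two directions, and considered output) can be loosely caricatured as a toy program. What follows is a minimal sketch of the schematic only, in Python; every name in it (filter_input, mix_and_grind, world_display, and so on) is my own invention for illustration and bears no more resemblance to actual brain physiology than Figure 1 itself does:

```python
import random

# A toy of Figure 1's two-way flux (invented names; purely illustrative).
# The four subjective "categories" on the right side of the schematic:
categories = {"memory_loops": [], "ideas": [], "self_perception": [], "concepts": []}

world_display = []  # the "world display screen": the moment's perception


def filter_input(raw_input, capacity=5):
    """Some input 'bounces off' so the brain is not overloaded; only a
    limited sample of the bombardment is processed as veridical data."""
    return random.sample(raw_input, min(capacity, len(raw_input)))


def mix_and_grind(veridical):
    """The four categories blend newly entered veridical data with stored
    data and send non-veridical results (types, symbols) back to the display."""
    non_veridical = []
    for name, store in categories.items():
        store.extend(veridical)  # one head of the double-headed arrow: data enters
        non_veridical.append(f"{name}: image built from {len(store)} stored items")
    return non_veridical  # the other head: imagined data flows back


def perceive(raw_input):
    """One moment on the screen: veridical flux from the left,
    non-veridical flux from the right."""
    veridical = filter_input(raw_input)
    world_display.clear()
    world_display.extend(veridical)                 # empirical flux (left side of Figure 1)
    world_display.extend(mix_and_grind(veridical))  # imaginative flux (right side)
    return world_display


# One "moment" of perception: far more input bombards the senses than the
# display will hold, mimicking the "bounce off" of unprocessed data.
print(perceive([f"sense-datum-{i}" for i in range(20)]))
```

The point of the toy is only the topology of Figure 1: a throttled one-way flux from the left, a looping two-way flux on the right, and considered output back into the world.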

 

Note how Figure 1 places epistemological and existential terms like CONSCIOUSNESS, Imagination, Knowing, Intention & Free Will, and Reason on the schematic, along with areas of the philosophy of epistemology, like Empiricism, Rationalism, and Existentialism (at the top of Figure 1). These placements are my own philosophical interpretations and are subject to change and alteration as indicated by a worldwide consensus of professional and amateur philosophers, in conjunction with consensus from psychologists and brain physiologists.
Figure 2 is a schematic of the “screen” of subjective perception that confronts us at every moment we see, hear, smell, taste, and/or touch. Figure 2 is again crudely schematic (like Figure 1), in this case devoid of the richness of the signals of our senses processed and displayed to our “mind’s eye.” Broad dashed arrows at the four corners of the figure represent the input to the screen from the four categories on the right of Figure 1 – memory loops, ideas, self-perception, and concepts. Solid illustrated objects on Figure 2 represent processed, veridical, and empirical results flowing to the screen from the left in Figure 1, and dashed illustrated objects on Figure 2 represent subjective, non-veridical, type, and algorithmic results flowing to the screen from the right in Figure 1. Thus Figure 2 defines the screen of our perception as the result of the simultaneous flow of both the veridical and the non-veridical making up every waking moment.


Figure 1 — A Model of the Objectivity of Perception

 

(Mathematical equations cannot be printed in dashed format, so the solid equations and words, like History, FUTURE, Faith, and PRESENT, represent both veridical and non-veridical forms; note I was able to represent the veridical and non-veridical forms of single numbers, like “8,” and certain symbols, like X, equals, and does-not-equal.) Thus, the solid lightning bolt, for example, represents an actual observed bolt in a thunderstorm, and the dashed lightning bolt represents the “idea” of all lightning bolts observed in the past.

 

The “subjective trap” introduced above is defined and represented by the rule that nothing of Figure 1 can be seen on Figure 2, and vice-versa. In my “show-and-tell” presentation of this perception model encapsulated in both figures, I present the figures standing on end at right angles to each other, so that neither figure’s area projects upon the area of the other – two sheets slit half-height so that one sheet slides into the other. Again, a) Figure 2 represents my own individual subjective screen of perception no one else can see or experience; b) Figure 1 represents the only way I can describe someone else allegedly perceiving as I do. I cannot prove a) and b) are true, nor can anyone else. I can only state with reasonable certainty that both someone else and I BEHAVE as if a) and b) are true. In other words, thanks to the common cultural experience of the same language, my non-color-blind friend and I in the room observing the red-painted card agree the card “is red.” To doubt our agreement that it is red would stretch both our limits of credulity into absurdity.

 
The model described above and schematically illustrated in Figures 1 and 2 can be seen as one way of describing the ontology of human beings, of describing human existence. Looking at Figure 1, anything to the left of the world display screen is the only way we know anything outside our brain exists, and anything to the right of the world display screen is the only way we know we as “I’s” exist in a Cartesian sense; anything to the right is what we call our “mind,” and we assume we think with our mind; in the words of Descartes, “I think, therefore I am.” We see our mind as part of the universe being “bombarded” from the left, so we think of ourselves as part of the universe. Modern science has over the centuries given us some incredible ontological insights, such as that all physical existence is made up of atoms and molecules and elementary particles; we can objectively or “scientifically” describe our existence, but we do so, as we describe anything else, with our subjective mind; we, as self-conscious beings, describe the veridical in the only way we possibly can – non-veridically. Thus, the model suggests the incredible statement made by scientists and philosophers of science lately. Recalling that atoms are created in the interiors of stars (“cooked,” if you please, by nuclear fusion inside stars of various sizes and temperatures) that have long since “died” and spewed out their atoms in contribution to the formation of our own solar system around 4.6 billion earth years ago, and recalling that our bodies, including our brains, are made of molecules made from the atoms of dead and gone stars, the statement “We are ‘star-stuff’ in self-contemplation” makes, simultaneously, objective and subjective, or scientific and artistic, “spiritual sense.”

Figure 2 — A Model of the Subjectivity of Perception (The “Screen”)

We can veridically “take in,” “observe,” “experience,” or “contemplate” anything from the vast universe outside our body, as well as from the vast universe inside our body (outside our brain), while at the same time we can imagine non-veridically limitless ways of “making sense” of all this veridical data by filing it, storing it, mixing it, and thinking about it, all within our brain. We are limitless minds making up part of a limitless universe.

 

As if that were not enough, each of us, as a veridical/non-veridical “package of perception,” is unique. Every human has a unique Figure 1 and a unique Figure 2. Our existence rests upon the common human genome of our species, the genetic “blueprint” that specifies the details of our biological existence. Yet every individual’s genome is different from every other (even if only by 0.1%, or a factor of 0.001), considering that mutations, even for identical twins, make the two “blueprints” slightly different once the two organisms exist as separate zygotes in the womb. Moreover, how we behave, and, therefore, how we respond non-veridically to the veridical data we receive individually, even from the same environment shared by others, is shaped by the unique series of experiences each of us has had in our past. Hence, each person is a unique individual genome subjected to unique environmental experiences, an exact copy of which cannot possibly exist, statistically speaking.

 

The world display screen of an individual in any given moment has never been perceived before, nor will it ever be perceived again, as in the next moment the screen is modified by the dual flux – the veridical flux from the left and the non-veridical flux from the right in Figure 1. The life of an individual is a series of receiving this ever-changing dual flux and thinking or acting in the real world upon the basis of it; it is a series of two-way perceptions. The life of an individual is observed by another individual as a series of perceived behaviors assumed, but never proven, to be generated in the same way as those of the observer. All in the span of a human life is perception; to an individual human being, perception has to be everything.

 

This model suggests to me the absurdity of having objectivity and subjectivity irreconcilably separate; it suggests, rather, that they are inseparable; they go together like, in the words of the song, “horse and carriage” or “love and marriage.” The blending of objective data and imaginative concepts in our brain makes our perception, our conscious “everything,” or existence as a self-conscious being, if you please, possible. What we are is the veridical of our screen of perception; who we are is the non-veridical of the screen. In other words, the scientist is as potentially subjective as the poet, and the poet is as potentially objective as the scientist; they differ only in the emphases on the contents of their respective screens of perception. For the “two sides” of campuses of higher learning to be at “war” over the minds of mankind is absurd – as absurd as the impasse the political science major and I reached in conversation so many years ago.

 
If the above were all the model and its two figures did, its conjuring would have been well worth it, I think, but the above is just the tip of the iceberg of how the model can be applied to human experience. Knowing how prone we are to hyperbole when talking about our “brain children,” I nonetheless feel compelled to suggest this model of perception can be intriguingly applied to almost any concept or idea the human brain can produce – in the sense of alternatively defining the concept using “both worlds,” both the objective and the subjective, instead of using one much more than the other. In other words, we can define with this model almost anything more “humanly” than before; we can define and understand almost anything with “more” of ourselves than we have done in the past.

 

Take the concept of the human “soul,” for example. It seems to me possible that cultures that use the concept of soul, whether in a sacred or secular sense, whether in the context of religion or psychology, are close to using the concept of the “mind’s eye” illustrated in Figure 1 of the model. The “mind’s eye” is the subjective “I,” the subjective observer of the screen, the “see-er,” the “smell-er,” the “taste-er,” the “hear-er,” the “touch-er,” the “feel-er” of perception; the soul is the active perceiver of subjective human experience. The soul defines self-consciousness; it is synonymous with the ego. This view is consistent with the soul being defined as the essence of being alive, as that which “leaves” the body upon death. Objectively, we would say that death marks the ceasing of processing veridical data; subjectively, we would say that death marks the ceasing of producing non-veridical data and the closing of the “mind’s eye.”

 

Yet the soul is a product of the same physiology as the pre-conscious “body” of our evolutionary ancestors. In other words, the soul “stands upon the shoulders” of the id, our collection of instincts hewn over millions of years. So, in addition, we would objectively say that death also marks the ceasing of “following” our instincts physically and mentally; our unique, individual genome stops defining our biological limitations and potentialities. The elements of our body, including our brain, eventually blend to join the elements of our environment. Objectively, we would say death marks our ceasing to exist as a living being. The concept of the soul allows death to be seen as the “exiting” or “leaving” of that necessary to be called “alive.”

 
So, the concept of the soul could be discussed as the same as or similar to the concept of the ego, and issues such as when a developing human fetus (or proto-baby) develops or “receives” a soul/ego (an issue bearing directly on the issue of abortion) can be discussed without necessarily coming to impasses. (See my The ‘A’ Word – Don’t Get Angry, Calm Down, and Let Us Talk [April, 2013] and my The ‘A’ Word Revisited (Because of Gov. Rick Perry of Texas), or A Word on Bad Eggs [July, 2013]) I said “could be,” not “will be,” discussed without possibly coming to impasses. Impasses between the objective and subjective seem more the norm than the exception, unfortunately; the “two cultures war” appears ingrained. Why?

 
Earlier, I mentioned casually the answer the model provides to this “Why?” The scientist/engineer and the artist/poet differ in their emphases upon either the veridical flux to the world display screen or the non-veridical flux to the same world display screen of their individual brains. By “emphasis” I merely mean the assigning of more importance by the individual to one flux direction or the other in his/her head. At this point, one is reminded of the “left-brain, right-brain” dichotomy dominating brain/mind modeling since the phenomenon of the bicameral mind became widely accepted. The perception model being presented here incorporates, on the non-veridical side of the perception screen, both analytical (“left brain”) activity and emotional (“right brain”) activity in the flux to the screen from the right side of Figure 1. Since my use of left/right in Figure 1 is not like the use of left/right in bicameral mind/brain modeling, this model of perception is not directly analogous to bicameral modeling. What the perception model suggests, in my opinion, is that the analytical/emotional chasm of the human brain is not as unbridgeable as the “left-brain-right-brain” view might suggest.

More specifically, the perception model suggests that the “normal” or “sane” person keeps the two fluxes to the world display screen in his/her head “in balance,” each flux always mitigating and blending with the other. It is possible “insanity” might be a domination of one flux over the other so great that the dominated flux is rendered relatively ineffective. If the veridical flux is completely dominant, the person’s mind is in perpetual overload with empirical data, impotent to sort or otherwise deal with the one-way bombardment on his/her world display screen; such a person would presumably be desperate to “turn off” the bombardment; such a person would be driven to insanity by sensation. If the non-veridical flux is completely dominant, the person’s mind is in a perpetual dream of self-induced fantasy, sensing with all senses that which is NOT “out there;” such a person would be driven to insanity by hallucination. In this view, the infamous “acid trips” of the 1960s, induced by hallucinogenic drugs such as LSD, could be seen as self-induced temporary periods in which the non-veridical flux “got the upper hand” over the veridical flux.
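This balance-versus-domination idea can be caricatured in code. The toy function below is again my own minimal sketch, not a clinical claim of any kind; the 10%/90% thresholds are arbitrary stand-ins chosen only to make the two extremes of imbalance visible:

```python
def mental_state(veridical_flux, non_veridical_flux):
    """Toy classifier for the flux-balance idea: 'sanity' as the two fluxes
    mitigating each other, the extremes as overload or perpetual fantasy.
    The 0.1/0.9 thresholds are arbitrary stand-ins, not clinical figures."""
    total = veridical_flux + non_veridical_flux
    if total == 0:
        return "no flux on the display (dreamless sleep?)"
    veridical_share = veridical_flux / total
    if veridical_share > 0.9:
        return "veridical domination: overload, insanity by sensation"
    if veridical_share < 0.1:
        return "non-veridical domination: insanity by hallucination"
    return "balanced fluxes: 'sane' perception"


print(mental_state(10, 8))   # roughly balanced
print(mental_state(99, 1))   # empirical bombardment
print(mental_state(1, 99))   # self-induced fantasy (the "acid trip" case)
```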

This discussion of “flux balance” explains why dreams are depicted in Figure 1 as “hovering” just outside the world display screen. The perception model suggests dreams are the brain’s way of keeping the two fluxes in balance, keeping us as “sane” as possible. In fact, the need to keep the fluxes in balance, seen as the need to dream, may explain why we and other creatures with large brains apparently need to sleep. We need “time outs” from the empirical data influx (not to mention “time outs” just to rest the body’s muscular and other systems) to give dreaming the chance to balance out the empirical with the fanciful on the stage of the world display. Dreams are the mixtures of the veridical and non-veridical not needing to be stored or acted upon, purged in order to prevent overload from the fluxes of the previous day (or night, if we are “night owls”); they play out without being perceived in our sleeping unconsciousness (except for the dreams we “remember” just before we awaken), like files in computer systems sentenced to the “trash bin” or “recycle bin” and marked for deletion. Dreams can be seen as a sort of “reset” procedure that readies the world display screen for the upcoming day’s (or night’s) two-way flux flow that defines our being awake and conscious.

This model might suggest new ways of defining a “scientific, analytical mind” (“left brain”) and comparing it with an “artistic, emotional mind” (“right brain”). Each could be seen as a slight imbalance (emphasis on “slight,” to remain “sane”) of one flux over the other, or, better, as two possible cases of one flux mitigating the other slightly more. To think generally “scientifically,” therefore, would be when the non-veridical flux blends “head-on” upon the world display screen with the veridical flux and produces new non-veridical data focused primarily upon the world external to the brain; the goal of this type of non-veridical focus is to create cause/effect explanations, to problem-solve, to recognize patterns, and to create non-veridically rational hypotheses, or, as I would say, “proto-theories,” or scientific theories in-the-making. Thus is knowledge about the world outside our brain increased. To think generally “artistically,” on the other hand, would be when the non-veridical flux takes on the veridical flux upon the world display screen as ancillary only, useful in focusing upon the “world” inside the brain; the goal of this type of non-veridical focus is to create new ways of dealing with likes, dislikes, and emotions, to evoke “feelings” from morbid to euphoric, and to modify and form tastes, from fanciful thinking to dealing emotionally with the external world in irrational ways. Thus is knowledge about what we imagine, and about what appears revealed to us inside our brain, increased.

With these two new definitions, it is easy to see that we have evolved as a species capable of being simultaneously both scientific and artistic, both “left-brain” and “right-brain;” as I said earlier, the scientist is as potentially subjective as the poet, and the poet is as potentially objective as the scientist. We do ourselves a disservice when we believe we have to be one or the other; ontologically, we are both. Applying the rule of evolutionary psychology that any defining characteristic we possess as a species and pass on to our progeny was probably necessary to our survival, today and/or in our past (or, at minimum, was “neutral” in contributing to our survival), the fact that we are necessarily a scientific/artistic creature was in all likelihood a major reason we evolved beyond our ancestral Homo erectus and “triumphed” over our evolutionary cousins like the Neanderthals. When we describe in our midst a “gifted scientist” or a “gifted artist,” we are describing a person who, in his/her individual, unique existence, purposely developed, probably by following his/her tastes (likes and dislikes), one of the two potentialities over the other. The possibility that an individual can be gifted in both ways is very clear. (My most memorable example of a “both-ways” gifted person came when I, as a graduate student, looked into the orchestra pit at a production of Handel’s Messiah and saw, in the first chair of the violin section, one of my nuclear physics professors.) Successful people in certain vocations, in my opinion, do better because of strong development of both their “scientific” and their “artistic” potentialities; those in business and in service positions need the ability to deal simultaneously and successfully with problem solving and with the emotions of colleagues and clientele. Finding one’s “niche” in life and in one’s culture is a matter of taste, depending on whether the individual feels more comfortable and satisfied “leaning” one way or another, or being “well-rounded” in both ways.

Regardless of the results of individual tastes in individual circumstances, the “scientist” being at odds with the “artist” and vice-versa is always unnecessary and ludicrous; the results of one are no better or worse than those of another, as long as those results come from the individual’s volition (not imposed upon the individual by others).

 

From the 1960s “acid rock, hard rock” song by Jefferson Airplane, Somebody to Love:

When the truth is found to be……lies!
And all the joy within you…..dies!
Don’t you want somebody to love?
Don’t you need somebody to love?
Wouldn’t you love somebody to love?
You better find somebody to love!

These lyrics, belted out by front woman Grace Slick, will serve as the introduction to two of the most interesting and most controversial applications of this perception theory. The first part, about truth, joy, and lies, I’ll designate as GS1, for “Grace Slick Point 1,” and the second part, about somebody to love, I’ll designate as GS2.

Going in reverse order, GS2 to me deals with that fundamental phenomenon without which our cerebral species, or any such species, could not have existed – falling in love and becoming parents, or, biologically speaking, pair bonding. The universal human theme of erotic love is the basis of so much of culture’s story-telling, literature, poetry, and romantic subjects of all genres. Hardwired into our mammalian genome is the urge, at the onset of puberty, to pair-bond with another of our species and engage, upon mutual consent, in sexual activity. If the pair is made of two different genders, such activity might fulfill the genome’s “real” intent for this often very complex and convoluted bonding – procreation of offspring; procreation keeps the genes “going;” it is easily seen as a scientific form of “immortality;” we live on in the form of our children, and in our children’s children, and so on. Even human altruism seems to emerge biologically from the urge to propagate the genes we share with our kin.

Falling in love, or pair bonding, is highly irrational and, therefore, a very non-veridical phenomenon; love is blind. When one is in love, the shortcomings of the beloved are ignored, because their veridical signals are probably blocked non-veridically by the “smitten;” when one is in love and others bring up any shortcomings of the beloved, they are denied by the “smitten,” often in defiance of veridical evidence. If this were not so, if pair bonding were a rational enterprise, far fewer pair bonds would occur, perhaps threatening the perpetuation of the species into another generation. [This irrationality of procreation was no better defined than in an episode of the first Star Trek TV series back in the 1960s, wherein the half-human, half-alien (Vulcan) Enterprise First Science Officer Spock (played by Leonard Nimoy) horrifically went apparently berserk and crazy in order to get himself back to his home planet so he could find a mate (to the point of hijacking the starship Enterprise). I think it was the only actual moment of Spock’s life on the series in which he was irrational (in which he behaved like us – fully human).]

GS1 is to me another way of introducing our religiosity, of asking why we are as a species religious. This question jump-started me on my “long and winding road,” as I called it – a personal Christian religious journey in five titles, written in the order they need to be read: 1) Sorting Out the Apostle Paul [April, 2012], 2) Sorting Out Constantine I the Great and His Momma [Feb., 2015], 3) Sorting Out Jesus [July, 2015], 4) At Last, a Probable Jesus [August, 2015], and 5) Jesus – A Keeper [Sept., 2015]. Universal religiosity (which I take as an interpretation of GS1) is here suggested as being like the universality of the urge to procreate, though not nearly as ancient as GS2. As modern humans emerged and became self-conscious, they had to socially bond into small bands of hunter-gatherers to survive and protect themselves and their children, and part of the glue holding these bands together was not only pair-bonding and its attendant primitive culture, but also the development of un-evidenced beliefs – beliefs in gods and god stories – to answer the then unanswerable, like “What is lightning?” and “How will we survive the next attack from predators or the enemy over the next hill?” In other words, the non-veridical faculties in our brains dealt with the “great mysteries” of life and death by making up gods and god stories to provide assurance, unity, fear, and desperation sufficient to make survival of the group more probable. Often the gods took the shape of long-dead ancestors who “appeared” to individuals in dreams (At Last, a Probable Jesus [August, 2015]). It is not that there are “religious genes” like there are “procreate genes;” rather, our ancestors survived partly because the genes they passed on to us tended to make them cooperative for the good of the group bound by a set of accepted beliefs – gods and god stories; that is, bound by “religion.”

The “lies” part of GS1 has to do with the epistemological toxicity of theology (the intellectual organization of the gods and god stories) – religious beliefs are faith-based, not evidence-based, a theme developed throughout the five parts of my “long and winding road.” On p. 149 of Jerry A. Coyne’s Faith vs. Fact: Why Science and Religion are Incompatible (ISBN 978-0-670-02653-1), the author characterizes this toxicity as a “metaphysical add-on….a supplement demanded not by evidence but by the emotional needs of the faithful.” No one theology can be shown to be truer than any other theology; all theologies assume things unnecessary and un-evidenced; yet all theologies declare themselves “true.” As my personal journey indicates, all theologies are exposed by this common epistemological toxicity, yet it is an exposé made possible only since the Enlightenment of Western Europe and the development of forensic history in the form of, in the case of Christianity, higher Biblical criticism. This exposé, in my experience, can keep your “joy” from dying because of “lies,” referring back to GS1.

Both GS1 and GS2 demonstrate the incredible influence of the non-veridical capabilities of the human brain. A beloved one can appear on the world display screen, can be perceived, as “the one” in the real world “out there,” and a god or the lesson of a god story can appear on the world display screen, can be perceived, as actually existing or as being actually manifest in the real world “out there.”

Putting GS1 in more direct terms of the perception model represented by Figures 1 and 2, non-veridical self-consciousness desires the comfort of understandable cause and effect as it develops from infancy into adulthood; in our brains we “need” answers — sometimes any answers will do; and the answers do not necessarily have to have veridical verification. Combining the social pressure of the group for conformity and cooperation, for the common survival and well-being of the group, with this individual need for answers, the “mind,” the non-veridical, epiphenomenal companion of our complex brain, creates a personified “cause” of the mysterious and a personified “answerer” to our nagging questions about life and death in general and in particular; we create a god or gods paralleling the created god or gods in the heads of those around us who came before us (if we are not the first of the group to so create). We experience non-veridically the god or gods of our own making through dreams, hallucinations, and other visions, all seen as revelations or visitations; these visions can be as “real” as the real objects “out there” that we sense veridically. (See At Last, a Probable Jesus [August, 2015] for examples of non-veridical visions, including some of my own.) Stories made up about the gods, often created to further explain the mysteries of our existence and of our experiences personally and collectively, combine with the god or gods to form theology. Not all of theology is toxic; but its propensity to become lethally dangerous to those who created it, when it is developed in large populations into what today are called the world’s “great religions,” and fueled by a clergy of some sort into a kind of “mass hysteria” (Crusades, jihads, ethnic “cleansings,” etc.), makes practicing theology analogous to playing with fire. As I pointed out in Jesus – A Keeper [Sept., 2015], epistemologically toxic theology is dangerously flawed. Just as we have veridically created the potential of destroying ourselves by learning how to make nuclear weapons of mass destruction, we have non-veridically created reasons for one group to try and kill off another group by learning how to make theologies of mass destruction; these theologies are based upon the “authority” of the gods we have non-veridically created and non-veridically “interpreted” or “listened to.” It is good to remember Voltaire’s words, or a paraphrase thereof: “Those who can make you believe absurdities can make you commit atrocities.”

Also remember, the condemnation of toxic theology is not the condemnation of the non-veridical; a balance of the veridical flux and the non-veridical flux was absolutely necessary in the past and absolutely necessary today for our survival as individuals, and, therefore, as a species. Toxic theology, like fantasy, is the non-veridical focused upon the non-veridical – the imagination spawning even more images without checking with the veridical from the “real world out there.” Without reference to the veridical, the non-veridical has little or no accountability toward being reliable and “true.” All forms of theology, including the toxic kind, and all forms of fantasy, therefore, have no accountability toward reality “out there” outside our brains. Harmony with the universe of which we are a part is possible only when the non-veridical focuses upon referencing the veridical, referencing the information coming through our senses from the world “out there.” This is the definition of “balance” of the two fluxes to our world display screens in our heads.

Comparing this balanced flux concept with the unbalanced one dominated by the non-veridical (remember the unbalanced flux dominated by the veridical is brain overload leading to some form of insanity), it is easy to see why biologist Richard Dawkins sees religiosity as a kind of mental disease spread like a mental virus through the social pressures of one’s sacred setting and through evangelism. Immersing one’s non-veridical efforts into theology is in my opinion this model’s way of defining Dawkins’ “religiosity.” In the sense that such immersion can often lead to toxic theology, it is easy to see the mind “sickened” by the non-veridical toxins. Whether Dawkins describes it as a mental disease, or I as an imbalance of flux dominated by the non-veridical, religiosity or toxic theology is bad for our species, and, if the ethical is defined as that which is good for our species, then toxic theology is unethical, or, even, evil.

To say that the gods and god stories, which certainly include the Judeo-Christian God and the Islamic Allah, are all imaginative, non-veridical products of the human mind/brain is not necessarily atheistic in meaning, although I can understand that many a reader would respond with “atheist!” Atheism, as developed originally in ancient Greece and further developed after the European Enlightenment in both Europe and America, can be seen as still another form of theology, though a godless one, potentially as toxic as any other toxic theology. Atheism pushing no god or gods can be as fundamentalist as any religion pushing a god or gods, complete with its dogma without evidence, creeds without justification, evangelism without consideration of the evangelized, and intolerance of those who disagree; atheism can be but another religion. Atheism in the United States has in my opinion been particularly guilty in this regard. Therefore, I prefer to call the conclusions about religion spawned by this perception model some form of agnosticism; non-veridical products of the brain’s imagination might be at their origin religious-like (lacking in veridical evidence, or dream-like, or revelatory, or hallucinatory) but should never be seen as credible (called epistemologically “true”) and worthy of one’s faith, belief, and tastes until they are “weighed” against the veridical information coming into the world display screen; and when they can be seen by the individual as credible, then I would ask why call them “religious” at all, rather than “objective,” “scientific,” “moral,” “good,” or “common sense.” I suggest this because of the horrendous toxicity with which religion in general and religions in particular are historically shackled.

We do not have to yield to the death of GS1 (When the truth is found to be lies, and all the joy within you dies!); GS2 (Love is all you need, to quote the Beatles instead of Grace Slick) can prevent that, even if our irrational love is not returned. In other words, we do not need the gods and god stories; what we need is the Golden Rule (Jesus – A Keeper [Sept., 2015]). This is my non-veridical “take” on the incredible non-veridical capabilities encapsulated in GS1 and GS2.

Western culture has historically entangled theology and ethics (no better case in point than the Ten Commandments, about half of which have to do with God and the other half with our relationships to each other). This entanglement makes the condemnation of theology suggested by this perception model of human ontology an uncomfortable consideration for many. Disentanglement would relieve this mental discomfort. Christianity is a good example of entangled theology and ethics, and I have suggested in Jesus – A Keeper [Sept., 2015] how to disentangle the two and avoid the “dark side” of Christian theology and of theology in general.

Ethics, centered around the Golden Rule, or the Principle of Reciprocity, is clearly a product of non-veridical activity, but ethics, unlike theology and fantasy, is balanced with the veridical, in that our ethical behavior is measured through veridical feedback from others like us “out there.” We became ethical beings similarly to the way we became religious beings – by responding to human needs. Coyne’s book Faith vs. Fact: Why Science and Religion are Incompatible points out that in addition to our genetic tendency (our “nature”) to behave altruistically, recognize taboos, favor our kin, condemn forms of violence like murder and rape, favor the Golden Rule, and develop the idea of fairness, we have culturally developed (our “nurture”) moral values such as group loyalty, bravery, respect, recognition of property rights, and other moral sentiments we define as “recognizing right from wrong.” Other values culturally developed and often not considered “moral” but considered at least “good” are friendship and senses of humor, both of which also seem present in other mammalian species, suggesting they are more genetic (nature) than cultural (nurture). Other cultural values (mentioned, in fact, in the letters of the “Apostle” Paul) are faith, hope, and charity, but none of these three need have anything to do with the gods and god stories, as Paul would have us believe. Still others are love of learning, generosity (individual charity), philanthropy (social charity), artistic expression in an ever-increasing number of forms, long childhoods filled with play, volunteerism, respect for others, loyalty, trust, research, individual work ethic, individual responsibility, and courtesy. The reader can doubtless add to this list. Behaving as suggested by these ideas and values (non-veridical products) produces veridical feedback from those around us that renders these ideas accountable and measurable (it is good to do X, or it is bad to do X). What is good and what is bad is veridically verified, so that moral consensus in most of the groups of our species evolves into rules, laws, and sophisticated jurisprudence (e.g. the Code of Hammurabi and the latter half of the Ten Commandments). The group becomes a society that is stable, self-protecting, self-propagating, and a responsible steward of the environment upon which the existence of the group depends; the group has used its nature to nurture a human ethical set of rules that answers the call of our genes and grows beyond this call through cultural evolution.

The irony of this scenario of the origin of ethics is that humans non-veridically mixed in gods and god stories (perhaps necessarily, to get people to respond out of fear and respect for authority for survival’s sake), and thereby risked infection of human ethics by toxic theology. Today there is no need of such mixing; in fact, the future of human culture may well hinge upon our ability to separate, once and for all, ethics from theology.

A final example of applying the perception model illustrated by Figures 1 and 2, for this writing, is the definition of mathematics. Mathematics is clearly a non-veridical, imaginative product of the human brain/mind; this is why all the equations in Figure 2 need a “dashed” version in addition to the “solid,” as I was able to provide for the single numbers like “8.” But why is math the language of science? Why is something so imaginative so empirically veridical? In other words, why does math describe how the world works, or, why does the world behave mathematically?

Math is the quintessential example of non-veridical ideas rigidly fixed by logic and consistent patterns; math cannot deviate from its own set of rules. What “fixes” the rules is math’s applicability to the veridical data bombarding the world display screen from the “real” world “out there.” If math did not have its utility in the real world (from counting livestock at the end of the day to predicting how the next generation of computers can be designed), it would be a silly game lodged within the memory loops of the brain only. But the brain is part of the star-stuff contemplating all the other star-stuff, including itself; it makes cosmological “sense” that star-stuff can communicate with itself; the language of that communication is math. Mathematics is a product of the evolutionary complexity of the human brain; it is the ultimate non-veridical focus upon the veridical. Mathematics is the “poster child” of the balance of the two fluxes upon the world display screen of every human brain/mind. No wonder the philosopher Spinoza is said to have had a “religious, emotional” experience gazing at a mathematical equation on paper! No wonder we should teach little children numbers at least as early as (or earlier than) we teach them the alphabet of their native culture!

Further applications of the perception model suggest themselves. Understanding politics, economics, education, and early individual human development are but four.

I understand the philosophical problem that a theory that explains everything might very well explain nothing. But this perception model is an ontological theory, which necessarily must explain some form of existence, which, in turn, entails “everything.” I think the problem is avoided by imagining some aspect of human nature and culture the model cannot explain. For instance, my simplistic explanation of insanity as a flux imbalance may be woefully inadequate for those who study extreme forms of human psychosis. Artists who see their imaginations as more veridically driven than I may have suggested might find the model in need of much “tuning,” if not abandonment. I have found the model personally useful in piecing together basic, separate parts of human experience into a much more coherent and logically unified puzzle. To find a harmony between the objective and the subjective of human existence is to me very objective (intellectually satisfying) and very subjective (simultaneously comforting and exciting). The problem of explaining nothing is non-existent if other harmonies can be conjured by others. Part of my mental comfort comes from developing an appreciation for, rather than a frustration with, the “subjective trap,” the idea introduced at the beginning.

RJH

Sorting Out Constantine I the Great and His Momma

In the first volume of his three-volume history Byzantium (1988, Folio Press), John Julius Norwich ranks Constantine I, Constantine the Great, Emperor of the Roman Empire from 306-337 AD or CE, right up there with the Buddha, Jesus Christ, and the Prophet Mohammed among the most influential men in all history (p 2). Strikingly, the Emperor in question was not the founder of a great world religion, as were the other three. Ironically, the Emperor historically shares the founding of Christianity with Jesus Christ, as does the Apostle Paul (Sorting Out the Apostle Paul, [April, 2012]). Consequently, Norwich’s anointing of the Emperor is historically accurate and not as odd as it might first seem.

Norwich is very clear why Constantine is so highly “ranked”: (1) he is credited with adopting Christianity as the official religion of the Roman Empire, and (2) he transferred the capital of the Empire from Rome to the strategically located old town of Byzantium on the Bosporus and the Sea of Marmara, waterways that help link the Black Sea with the Aegean Sea. Interestingly, both acts are germane to defining what we know today as Christianity, not just the first. He had a lot of help from a lot of subjects along the way, of course, but none greater than his mother, “Momma” Helena.

As the Apostle Paul contributed to the definition of Christianity as much as or more than Jesus (Sorting Out the Apostle Paul, [April, 2012]), so did Constantine I and his Momma.

Set a “historical microscope” on low power on the history of Christianity, and a whole litany of important events never surfacing in detail in churches of any ilk appears. And it is not hard to understand why almost all churches are better served having their congregations of believers ignorant of so much history (Sorting Out the Apostle Paul, [April, 2012]). Look at the partial list making up the “Big Picture” of Christianity’s evolution: the first-century Jewish Revolt (66-70 CE) leaves Paul’s theology the primary interpretation of Jesus’ death; Constantine I calls for unity and consensus from a conflicting plethora of interpretations of Jesus’ life and death (325 CE); Christianity permanently splits in twain with the Great Schism or East-West Schism of 1054; the Roman Catholic Inquisition lasts from 1232 to 1820; the Western Church has two different papal sees in Rome and Avignon (1378-1417) (also confusingly called the Great Schism); Western Christianity splits via the Reformation into Catholicism and Protestantism in the 16th century; Catholics and Protestants try to kill each other off in the Thirty Years War (1618-1648); the Enlightenment of the 18th century ushers in the higher Biblical criticism of the 19th century in Western Europe; Christianity continues to shatter and schism into ever more “shards” into the 21st century (Mormonism, Scientology, etc.).

Constantine I, bolstered by Helena, set Christianity on the road to the conflicts and break-ups just listed by calling, ironically, for unity and consensus at the 1st Ecumenical Council at Nicaea in 325 CE. The long litany of what is orthodox (“code” for “winning beliefs”) and what is heretical (“code” for “losing beliefs”) was launched. Since epistemological systems (What is true, and how do you know it is true?) like religion are faith-based, no great religion, nor any intra-religious conflict or difference, can be shown to be “better” or “truer” than any other. Whoever has the power usually winds up claiming the orthodox label, and the heretical “loser” does not usually go away unless by exercise of that power; it seems always to be a case of “might makes right,” yet what is true and reliable knowledge, epistemology shows, has nothing to do with who has the “biggest stick.” Faith-based epistemology, as opposed to evidence-based epistemology, dooms settlement of conflict, sooner or later, into one group fighting or warring with another; consensus becomes impossible — only bloodshed and human misery are assured.

To be fair to Constantine, he probably never dreamed his Ecumenical Council was doomed, and he officially did not become baptized and an orthodox Christian until just before his death in 337, perhaps in deference to so many of his subjects who practiced a spectrum of non-Christian faiths, from Greco-Roman polytheism to Jewish monotheism. Momma Helena, just after orthodox Christianity was declared the religion of choice, made a holy pilgrimage to the Holy Land, “discovering” all the important sites of Jesus’ ministry (site of His birth, site of His crucifixion, etc.), and returned to Constantinople (the new name of Byzantium) laden with holy relics, like a piece of the True Cross, among the earliest of Christian icons. She was at the time, if not the most orthodox of the orthodox, the most powerful of the orthodox; her son was only too happy to back up her findings and declarations in the name of unity and consensus, which was, of course, unknowingly and impossibly out of reach.

And the long history of suffering and strife as listed above was started by Jesus Christ, the Prince of Peace, Son of the loving God? Was God’s plan through Jesus to establish a Church whose members are inspired to kill off each other, as well as destroy non-believers? I prefer to think not. Christianity, in the multi-forms of the Church, has an epistemological problem exposed by history; it always has, and always will. There is no better way to define this problem than by setting a “historical microscope” on high power and looking at the struggle to find a consensus definition of the nature of Jesus for orthodoxy over just centuries, a smaller window of time compared to the span of millennia at “low power.” I have deliberately limited the smaller window of time to coincide with Norwich’s first volume of Byzantium, from the founding of Constantinople to the crowning of Charlemagne in 800 CE, which marked the beginning of the Holy Roman Empire. In a nutshell, who Jesus was theologically and in relation to God was far from settled even 450 years after Constantine I’s death!

Constantine I’s “shot” was, again, the 1st Ecumenical Council at Nicaea in 325. It dealt primarily with the heresy of Arianism (the Father has primacy over the Son). Most of the Germanic “barbarian” tribes, like the Visigoths, the Ostrogoths, and the Vandals, were Arians — Christians, but Arian, not orthodox. The father of modern science and co-founder of calculus, Sir Isaac Newton (1642-1727), was an Arian. The 2nd Council, at Constantinople in 381, formally condemned the lingering and bothersome Arian heresy and declared the Constantinople see second only to that at Rome. The 3rd Council, held in 431 at Ephesus, dealt with the Nestorian heresy (Christ is both the Son of God and the man Jesus, as opposed to Rome’s view that He is fully God). The 4th Council, held at Chalcedon in 451, dealt with the heresy of monophysitism (the nature of Jesus Christ is singular, not dual, and His singular nature is divine). Variations of orthodox Christianity that survive to this day, like the Christian Copts, the Abyssinian Church, the Jacobites of Syria, and the Armenian Church, retain elements of monophysitism. The 4th also declared the sees at Constantinople and Rome equals, an early step toward the split between the Eastern and Western Churches. The 5th Council, the second at Constantinople, in 553, had more to do with the power struggle between the Eastern Church and the Western Church than with the nature of Jesus Christ, and was another step toward the final East-West Schism about 500 years later. The 6th Council, also known as the Third Council of Constantinople, was held in 680-681 and dealt with the heresy of monothelitism (Jesus Christ had only one will even though He had two natures). The 7th and last Ecumenical Council chosen for this listing within my window of “high power” time, the Second Council of Nicaea, came in 787 in the wake of the early iconoclast crisis of the Eastern Church and restored holy images as objects of veneration (not adoration); it technically defined icons, not attempting to define Jesus Christ directly, though it restored to orthodoxy the sanctioning of Jesus Christ as a subject of art. Later iconoclasm and puritanical Protestantism were to struggle against that sanction.

Thus, for slightly more than 350 years, the founder of Christianity could not be definitively defined. And it could be argued the ambiguity of His identity continues to this day. I suspect no great religion is free from such ambiguity, given all the branches and orders within them all. Such is the bane of faith-based religions everywhere and at any time in history, I would say. Such is the sword of evidence-based criticism of all sacred texts and of those who are the texts’ practitioners, in my opinion.

The sincerity of the Apostle Paul and of Helena should never be questioned; nor should the good intentions of Constantine I to unite the minds of his empire.  But they always overestimated the authority of written or spoken sacred texts or teachings and the seemingly unfailing willingness of believers to accept what they were told or what they themselves read as the truth.  They could not foresee a day when that authority would be questioned and held accountable; they could not imagine the blindness of mind that seems germane to faith-based religion.

 

The “sorting out” of the Apostle Paul (Sorting Out the Apostle Paul, [April, 2012]), Constantine I, and Constantine’s Momma raises the question: can the trap of faith-based epistemology be avoided? Are religions, or specifically Christianity, doomed to being just as good and just as bad as any other religion? Not only is history the instrument of the “sorting,” it can help answer the question just posed. Faith-based religions only skim the surface of the history that defines and describes their origins; history is used only in so far as it suits the purposes of the organized religion. Much like all faith-based religions, Christianity purposely employs de-contextualization — the plucking of lessons, facts, stories, and creeds from a bygone period of history out of their proper context and the force-feeding of them to contemporary society; they may or may not be applicable to today; only by faith are they assumed to all be applicable; only by faith are they seen as from God or from some similar concept representing Reality.

Therefore, we should not de-contextualize; just as Christianity retains lessons from the Old Testament (God’s covenant with His chosen people) and rejects others (the sacrifice of animals to God), the same should be done with the New Testament, as shown by higher Biblical criticism. What “fits” the modern world from Christianity (the Golden Rule and the parables) should be retained, and what does not (blood sacrifice for the sins of mankind and the concept of Satan) should be discarded. Atheism is not the inevitable outcome of the criticism of religion, but the retention of doubt to some degree is. Doubt is the means by which the quicksand of unquestioning faith-based belief can be avoided. Doubt is the way not to be shackled by the absurdities of Paul, Constantine, and Helena.

An example might help: Thomas Jefferson, our 3rd President, compiled a book called The Jefferson Bible (1989, Beacon Press, Boston, ISBN 0-8070-7702-X), which was published posthumously (avoiding all the furor among clergy and believers of all ilks that would have erupted had it been published before he died!) and the original of which is today a prized artifact of the Smithsonian. He doubted that all the red-letter words of Jesus were historically accurate; he compared various translations in various languages and cut out and kept only those words that had authenticity from historians and were compatible and non-contradictory across all four Gospels; he avoided what reasonably could not be verified, such as miracles and theological interpretations of the red-letter words. What was left was a literal cut-out version of the Gospels, but nonetheless a very humane and practical blueprint of how to live by the Golden Rule, a simple treatise on how we can treat each other according to love of our fellow man (philos). The essence of Jesus, Jefferson was trying to say, is not found in the Passion Week and interpreting that week’s meaning, as Paul would have us believe, but, rather, is found in the Sermon on the Mount and in the parables. To find and live by that essence is not to live by blind, unquestioning faith, but to live by purposeful, reasonable kindness toward each other. And, moreover, the essence of Jesus can be found almost universally in other sacred texts.

What Jefferson did may not appeal to all, but what he did reminds us we are all free to interpret religion, or to reject religion, or to find our own religion; we can believe what we want or disbelieve what we want; what we cannot do is foist our belief or unbelief on others; what we cannot do is found our religions solely on faith; what we cannot do, in the case of Christianity, is follow the examples of Paul, Constantine, and Helena, not if we want the truth.

So, maybe it’s time to sort out Jesus Himself....

RJH

 

Top Ten List for Teachers of HS Students Preparing for College or University (Not a Ranking) — A List for Their Students, Too!

Or, The Ten Commandments for Academically Successful HS Classes

Or, Ten-Point Guideline Toward Improving the Problems of Public and Private High Schools

(While both students and teachers need to be cognizant of all ten for the ideal student/teacher classroom relationship, the first four listed are directly addressed to teachers and administrators, while the last six are directly addressed to students.)

1. There is No Science of Education

If we understood education like we do scientific and mathematical theory, education would be achieving success everywhere. If there were a science of education, all schools would be using the same model based upon that science. As it is, wrong models are being applied in schools as if there were evidence they worked, such as the business model and the coaching model (See Education Reform — Wrong Models! [May, 2013]). Sometimes a technocratic model is also tried, as if expensive technology in the classroom will somehow salvage education. The preferred model, while not yet a science of education, is a step in the right direction: the professional/collegial model (See Education Reform — The Right Model [May, 2013] & Education Reform — How We Get the Teachers We Need [May, 2013]).

In other words, a school is not a business, administrators are not bosses, teachers are not workers, and students are not products. Nor is a school a sports team, administrators are not coaches, teachers are not team captains, and students are not team members. A school is not a communications center linked to cyberspace, administrators are not arcade designers, teachers are not links making students “users,” and students are not life-long video game players and internet surfers.

A school is an institution, administrators are faculty facilitators, teachers are professional colleagues, and students are individual, young, developing minds. Too seldom in my almost 40-year teaching career have I seen this “correct” model practiced in public and private schools; I know it is “correct” because I’ve seen the amazing success of this professional/collegial model in short bursts of time during which there were enlightened principals and superintendents, or while in graduate school working as a research assistant or teaching assistant in a scientific research institute.

2. Remember What It Was Like To Be a Student, and Teach From The Two C’s

Two absolutely necessary adjectives for teachers are (a) competent and (b) caring (the two C’s); they also are almost sufficient by themselves. If a teacher knows what he/she is talking about and is skillful in conveying it in many different ways, and if that same teacher cares passionately not only about what he/she teaches but also about students actively assimilating what is taught inside their brains, then success as a teacher is almost assured. This assumes that professional success is solely a function of what happens in students’ brains. During almost forty years of teaching upper-level high school students, I have seen many teachers turn out to be very poor ones; in each unfortunate case, the teacher lacked one or both of the two C’s.

Do unto your students what was done unto you (when you were a student) by your teachers who possessed the two C’s. NEVER forget what it was like to be a student.

3. Teach Yourself Out of a Job — (Teacher Independence)

Somewhere in one’s advanced education, it occurs to a certain number of students that they do not need a teacher to learn the material. There is no reason why this cannot happen to high school students. The “savvy” high school teacher needs to teach toward this revelation for his/her students, even though it at first sounds like he/she is deliberately making his/her job obsolete. As tough as it may be to actually do, have students leave the classroom as teacher-independent as they possibly can be; they don’t need the teacher. But, the teacher needs to remember, there is probably a fresh group coming into the classroom next school year who have little or no experience as independent learners, so there is a “need for you” every school year. Develop pride in being needed only at the beginning of the year and not needed at the end. A true teacher weans students from being dependent classroom minions into stand-alone scholars who ask their own questions.

4. Education is “Multi-Way” Communication

Astonishingly, all the wrong education models that are in use (see 1. above) assume that education is one-way — the teacher sending to the students. They don’t realize (I think because education is too burdened by the philosophy of behaviorism) that it is at least two-way: teacher-to-student and student-to-teacher. That means that for a single teacher with a single student there are two directions of communication, a “double-headed” arrow, if you please. For a teacher with n students, then, there are at least n “double-headed arrows,” or ways of communication. But it is more involved and communicative than that. Say the teacher has 2 students; then there is another double-headed arrow between the two students as well as the double-headed ones between the teacher and the student duo, for a total of 3 ways of communication. The student-student ways are as important as the ways involving the teacher; student-student ways can be just as instructive as those involving the teacher. One approach to a true theory of classroom education is to hypothesize that one must maximize at any moment in the classroom the ways of on-task communication, so that what is output at one end of a way is input at at least one other end. In other words, when the teacher communicates, at least one student is assimilating, preferably all of them. When one student is speaking to the material, at least one student (hopefully all the students) and the teacher are assimilating. Frivolous or disruptive speaking is not considered communication germane to the material being taught.

This “multi-way” theory of education has lots of potential; it can be mathematically defined. For a teacher where the number of students, n, is 2, as has been said, the number of ways of on-task/subject communication, f(2), is 3, or f(2) = 3, using function notation instead of sequence notation. The relation between one teacher with n students and the number of ways of communication is the sequence relating n with f(n) given by the recurrence relation f(1) = 1 and f(n) = n + f(n − 1) for n ≥ 2 (equivalently, f(n + 1) = (n + 1) + f(n) for n ≥ 1). Consequently, for only four students in a classroom there are ten possible ways of on-task/subject communication, and for six there are 21. Imagine how many there are in a “normal” classroom of 15-30 students! No wonder education models are much too simplistic.
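For readers who would like to see the counting concretely, below is a minimal sketch in Python (my illustration, not part of any education model; the function names ways_recurrence and ways_closed_form are mine). Because one teacher plus n students make n + 1 people, and every distinct pair of people is one “double-headed arrow,” the recurrence above telescopes into the closed form f(n) = n(n + 1)/2.

    # A sketch of the "multi-way" communication count: one teacher plus
    # n students, with every distinct pair of people a two-way channel.

    def ways_recurrence(n):
        """The essay's recurrence: f(1) = 1; f(n) = n + f(n - 1)."""
        total = 1
        for k in range(2, n + 1):
            total += k
        return total

    def ways_closed_form(n):
        """Equivalent closed form: f(n) = n(n + 1)/2, the number of
        pairs among n + 1 people (teacher plus n students)."""
        return n * (n + 1) // 2

    # Reproduce the examples above, then a "normal" classroom:
    for n in (1, 2, 4, 6, 15, 30):
        assert ways_recurrence(n) == ways_closed_form(n)
        print(n, ways_closed_form(n))  # prints 1, 3, 10, 21, 120, 465

So a “normal” classroom of 15-30 students carries between 120 and 465 potential on-task ways of communication; that quadratic growth is exactly why one-way education models are much too simplistic.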

5. Grades “Take Care of Themselves”

Because high school graduates are seen by the wrong educational models as future consumers and because it is impossible to have a grading system in schools that truly measures academic performance (so complicated is educating a young mind), grades are pervasively seen as measures of the student rather than as the imperfect reflections of biased judgements and evaluations they actually are. Consequently, students are taught to evaluate and judge themselves by their grades. Students K through 12 need to be reminded every day they are far, far more than their grades and their teacher evaluations. Students’ self-confidence does not have to be a function of a set of grades on a transcript, though having pride in one’s transcript is surely not a bad thing; the point is, grades are not the “only” thing.

Until we have a better system than the traditional grading system to evaluate what happens in students’ brains, we should have teachers with the two C’s (2. above) encourage students to be motivated not by making good grades, but by making all material of the curriculum “mental possessions” in their brains. In other words: concentrate on your work, students, and don’t worry about what your grades are going to be; if you focus on academically performing, grades will “take care of themselves.” Like most teachers, I suppose, in almost 40 years I saw only a handful of students in my classroom who never worried about grades, so busy were they being excited about learning the material. Getting students excited in this way is easier said than done, to say the least, as it depends upon a teacher realization at one end of the on-task/subject way of communication (4. above) and a similar, closely concurrent student realization at the other.

6. Self-Motivation in the Classroom

A student seen by school, administration, and faculty as a developing mind not having to worry about grades (5. above) is a student free to become truly self-motivated by the curriculum. The student is free to enter that enlightened state of “learning for its own sake,” of realizing gained knowledge and skills are beyond price and beyond numbers on a transcript. A self-motivated student naturally evolves into teacher independence (3. above) and begins to utilize schools, administrators, and teachers as aids and stepping-stones toward a life-long education, accumulating knowledge and skills for as long as he/she lives.

In other words, students can motivate themselves into becoming life-long scholars (See 10. below). They will not become scholars for someone else; they can only do it for themselves.

7. Accurate Self-Concept of Personal Beliefs and Academic Skills

Students taking their first steps toward becoming self-motivated scholars (6. above) need to develop as they go an accurate, honest self-evaluation of not only what they believe, but why they hold their particular beliefs; similarly, students need to know accurately how their academic knowledge and skills stack up against those of their classroom peers. Nothing is sadder to a “two C” teacher (2. above) than to see a student graduate high school with a distorted view of their comparative knowledge and skills and/or having no other justification of their belief system than mimicking and/or rejecting the system of their home.

Why is an honest self-concept of beliefs and academics important? Because, if not in high school, then after HS graduation, each student’s beliefs will be challenged in college, university, vocational school, or the workplace; the chances of success after graduation are directly proportional to the amount of knowledge and skills accumulated in high school, underwritten by self-knowledge and self-confidence. A knowledgeable, self-confident person is not afraid that they are different from the adults who raised them — an outcome that is highly probable, as we are all genetic hybrids of our two parents and unique compared to each and every one of our peers. Moreover, the knowledgeable, self-confident person knows why he/she is different from the adults who raised him/her.

8. Take Advantage of School Curricula

Every high school student should take as many courses as they possibly can, even if it means taking more courses than minimally required for graduation. Accelerated, gifted, AP, honors, IB, etc. courses in areas of personal preference should be maximized where possible. By all means, TAKE A FULL LOAD OF COURSES EACH AND EVERY YEAR IN HS, ESPECIALLY THE SENIOR YEAR! Graduates who “minimally” graduate walk off the stage, diploma in hand, having “sold themselves short,” and they have no one to blame but themselves and/or those who advised them to set minimal academic graduation goals and have an “easy” school year or so academically.

9. Counseling via “Gut Checks”

Whether or not students have taken so-called “aptitude” tests, students need, by the time they graduate HS, to have a detailed idea of their likes and dislikes, academically speaking. Counseling by classroom teachers and office counselors should be suggestive, not imperative. Students should be counseled by questions directing them to introspection; “Have you thought about...?” and “How do you feel about...?” should be used, not “You should...!” Students should be encouraged to do a self-evaluation at the end of each school year concerning not only their personal progress (How much more do I know now than at the beginning of the school year?), but also the status of what they like or dislike “deep down in their gut,” in that inner self to which only they have access. It is a procedure that will serve them well throughout their education beyond high school.

Students need not feel stuck, “deep down in their gut,” with the academic major or direction they sometimes have to write on a higher-education admission form; their likes at the time are a good guide. Likes and dislikes evolve over time on that campus, in that vocational school, or in that workplace; changing one’s “major” or academic “direction” is commonplace. Settling on a major or direction after about two years in post-high-school academia is probably wise.

10. Becoming a Scholar — “Own What You Learn”

To be a true scholar, as was mentioned in 6. above, each student must develop a relationship with the academic material of each subject, which is easy to do in courses the student likes, and not-so-easy to do in courses the student does not like. Regardless of the course, a well-developed set of likes and dislikes will enable the student to find SOMETHING even in his/her most detestable courses that will be intellectually stimulating and consequently make success even in these courses highly probable.

A scholar, regardless of likes and dislikes, develops a personal relationship with the curriculum of each course taken; that is, the scholar deliberately reaches out and “owns” the material taught in each and every course. Not only will the scholar be able to “regurgitate” the material back on tests and final exams, but the scholar will also assimilate the material of each course with the material of past courses, folding the elements of the various curricula into a coherent blend inside his/her mind or brain. A scholar not only knows stuff, he/she intellectually “owns” all that stuff.

And he/she who owns lots of stuff in his/her head has, in proportion to the amount of stuff thus owned, more control over all aspects of his/her life.

RJH
