Monday, July 09, 2007

The Meaning of Life

by Brian Schwartz


Someone sent me a link to a lecture given by an existential psychologist. He believes that the angst that follows us throughout life is caused by the basic problems life presents: lack of meaning; isolation and the impossibility of communion; the inevitable presence of that uninvited guest, death. He spoke of a dog, so happy to be thrown a stick, for, during the time it takes him to retrieve the stick, the chase gives his life purpose. I'm still waiting, he said, for God to throw me my stick. (http://www.yalom.com/pfister.html)

Now as it happens, I think about these things a lot, often at Crawpappy's Bar when I'm not distracted by girls. Sometimes they mix. More than one bewildered female has heard me exclaim, "Oh if only God would just tell me what to do!!!" Michalah, who knows the sort of things I think about, came up to me on Friday and said, "I love your guts."

And yet, it's better not to think too much. In the main I try to be like Alyosha, who after all is the hero of "The Brothers Karamazov". He was not one for grand ideology; his brother Ivan was the man for that. But when he encountered another human being, he did his best to make that person's life better, no matter the cost to himself. And he did this by instinct, without thinking, or reasoning, or wondering why.

If you are immersed in the joy of others, you can feel the miracle of their being wash over you, and for that moment you are truly immortal.

Third of July

One of my mom's nurses lives on a farm about fifty miles east of Tulsa. You drive along back roads and byways to get there. "The street is named after her!" I cried when we drove out there last year. And indeed a signpost by the road bore her name. "It's not named after me, it's my husband's grandfather", she said. Her family has been in the area a long time. Just beyond her farm, the road wound past the old brown Mennonite church that serves the region. Most of the people there are Mennonite or Amish.

Once a year, in early July, Liz, the nurse, drives about seventy miles to the small farm community of Porter, where she picks a bushel of a variety of peaches, called Red Haven, which grow only there. A delicious peach, redolent of the robust perfume of life. She makes those peaches into pies, with a light ethereal cream sauce and a crust as subtle as an epiphany. Lots of heavy existential metaphors there, but it's easy to write like that when you taste her pie. We wait for those pies all year long.

Yesterday she cooked dinner. Sort of a Fourth of July meal, a day early, and starring the pie. She got up at sunrise, put on rubber boots -- that endless rain which has hit eastern Oklahoma has turned the land into marsh and mud -- and trudged out to the farm. She dug up a lot of potatoes, picked some cucumbers. She got corn from a neighbor. A nearby farmer had just killed a cow, so she bought a few steaks. At our house, she peeled and boiled the potatoes and then seared the edges in a pan. She boiled the corn. The cucumber got sliced and served with a creamy yogurt-like dressing that a German grandmother had taught her to make. The steaks went on the grill. We ate and ate until we nearly burst, and then we ate the pie. It was a lovely meal, a family meal, a meal not unlike what a family would have had on a good day a hundred years ago and more. Everything on the table came from her farm and the neighbors'. To a city boy, those rich explosive flavors were a revelation. "You could never get a meal like that in New York," I told her. Yes, we have some of the finest cooking schools, and chefs, and restaurants in New York. But that food didn't come from a fine cooking school or chef. It came from generations and generations of family meals, carefully cultivated and lovingly prepared. It came from an American farm.

Fourth of July

I don't think I've written about Cathe before, though she's been one of my Mom's nurses for quite some time. Yesterday her family came over to help us celebrate the Fourth of July. Now it's only because of a happy accident that she had that family at all. (No, not the kind of accident you think.) Back in the '70s she was liberated, a feminist, and didn't think much of women who spent their life raising a brood of kids. And then one day she and her boyfriend got to talking with a Mormon missionary. She probably wasn't too impressed with the knowledge that Kolob is the planet closest to God, and she certainly wasn't thrilled to learn that her new church forbade sloe gin fizz and margaritas and all those delightful fun drinks she'd find at raucous weekend bars. Still, they prayed to God and asked Him to show them some sign if that was the right path. That night each of them had an ineffable experience, a sort of joyous (and indescribable) epiphany that convinced them beyond doubt that the Church of Latter-day Saints was indeed right for them. And so they became Mormons, and faithfully practice its tenets, though if there is some passage in the Book of Mormon suggesting that women should be meek and subservient, Cathe forgot those verses soon after reading them.

And so Ed and Cathe raised seven children. One more than the Brady Bunch. Cathe also found time to become a nurse, and helped a lot of people along the way, which is how we came to know her. Four of the kids got married after college, moved away, started families of their own. We follow their lives vicariously. One of them, Brianna, became a track star in college, and Cathe went to California to see her compete in national events. About a year later, Brianna had twins, and I still remember the frantic phone calls when the twins entered the world a few weeks ahead of their scheduled appearance. Cathe went out to California again to help Brianna deal with the very energetic duo, who seem to take turns bawling and raising ruckuses.

So that left three of the kids still living at home, one a college graduate, one just graduated from high school and already taking college courses for advanced placement, one a high school junior already taking her ACTs. And it was they who visited yesterday. Cathe always says they are picky eaters, so we fixed hamburgers and hot dogs -- no kid will turn that down. Susanna, the youngest, is fascinated with Japan and anime -- last month Cathe took her to an anime convention in Dallas where ten thousand teenagers spent the entire night roaming through a big hotel and convention center wearing the costumes of their favorite anime characters -- so I put a Ukiyo-e print by Hiroshige on my computer screen. The kids didn't talk much to me; they mostly interacted with each other, and had a great time. We played a card game that was mostly an excuse to giggle and have fun. Betty the neighbor won! No one could believe it.

Well that was our Fourth of July. And I began writing this as a companion piece to my Third of July description of an all-American meal. But I now realize it's really a companion piece to my essay on the meaning of life. That existential psychologist who wrote that life for most people is solitary and without meaning said that he sometimes plays the game of trying in his head to list people to fill tables at a dinner party: the table of introverts, the table of overachievers, etc. He wrote that the only table that he just cannot fill is the table reserved for people who have led good, rich meaningful lives, people who are happy with their life. If he knew Cathe and Ed and her seven kids, he could set nine more places at that table.


Saturday, June 30, 2007

Where is life located?

by Carle P. Graffunder

Artwork by Charmaine Frost

Where are the emotions of joy, chicanery, venality, pride, slyness and craftiness, elation, guilt, desire and any of a myriad of other qualities and experiences located? Indeed, where are imagination, inspiration, character, pain, exuberance, discriminatory pleasures and perversities and specializations of “rightness” and “evil” and satisfactions too numerous to mention?

Volitional aspects of human existence may present other difficulties of magnitude. Research psychologists and other pragmatists have long since abandoned attempts to find where “paying attention” or “focus your thoughts” or “listen carefully” can be located. Where “real” is located passed into obscurity long ago as a field of scientific interest, departing with the dictum that a thing is real if it is real in its consequences.

All urges, all fantasy, all insight, all intuition, all goal-setting, all of these and more slip into and out of our consciousness at times and places we can only partially - often, only with difficulty - even describe. To “locate” their “presence” seems even more elusive. It is as if a kind of emperor’s new clothes enclosed both our “selves” and these phenomena.

The eye does not see directly. The paths of stimuli are traceable through a series of connections, each energized briefly, only enough to start up the next, so that what finally reaches the visual field is not what shone in the eye of the beholder. The person believes he is “seeing,” but it is an illusion. When he understands that, he can begin to understand impairments of vision and vision-related phenomena such as migraine or dyslexia. So with hearing: we do not hear directly. Waves disturb various tissues in succession, and the waves are directed by physical engagements to take prescribed pathways to the relevant auditory brain sites, in accordance with the frequencies to which they are sensitive. What happens if pathways are blocked or misdirected? We have the illusion that we are hearing directly, and to comprehend that is an awakening experience. Such knowledge allows imagination to play a very large part in problem-solving in medical areas. But the knowledgeable person will also begin to get some insight into the fragile and tenuous nature of ideas and of any and all “knowledge.” He can discover that a world of illusion is where he lives.


Wednesday, April 25, 2007

Anthropocentrism vs. Cosmocentrism

Groping toward a Paradigm Shift

by Frank Luger

Abstract

Anthropocentrism views reality relative to Man, and maintains, directly or indirectly, that Man is the measure of all things. Based on immediacy and experience, as validated by sense-perception, this natural perspective was proposed by Aristotle, quite in harmony with the then-prevailing world-view, somewhere between the ancient Babylonian flat-Earth model and the Ptolemaic system of the 2nd century A.D. Despite several paradigm changes, from geocentrism to heliocentrism, from Newtonian Mechanics to Relativity Theory and Quantum Mechanics, anthropocentrism is still around, at least indirectly, sustained by instinct, inertia, and its emotionally satisfying features. That is, due to natural psychodynamics, most of our thinking and knowledge, as well as our epistemic tools, are still permeated by sophisticated anthropocentrism. However, current Science in general and modern Physics in particular have increasingly cast doubt on the adequacy and tenability of the anthropocentric paradigm. The expanding Universe from Big Bangs to Big Crunches, cosmic evolutions from the blurred mode of existence in modern microphysics to the blurred mode of existence in macrophysics, together with such recent evidence as the 2.7 K microwave background radiation, nonluminous cold and hot 'dark' matter, intergalactic plasma, etc., all seem to indicate that reality is independent of Man, and that Man is but a small cog in the cosmic scheme of things, regardless of cognitive abilities, now or ever. Therefore, the new world-view of cosmocentrism, based on cosmodynamics rather than psychodynamics, as introduced herein, proposes a radical cosmic paradigm with nothing less than a fundamental reversal of anthropocentrism. In short: cosmocentrism views reality relative to the Cosmos, and maintains that Cosmos, rather than Man, is the true measure of all things.

Overview

ANTHROPOCENTRISM (Man is the measure of all things, reality viewed relative to Man)

  1. Direct anthropocentrism: naïve realism
    1. Primitive: No model of the Universe, from prehistoric times to approximately 4000 B.C. (?)
    2. Crude: Flat-Earth model of the Universe (Mesopotamia), about 4000 — 400 B.C., approximately.
    3. Simple: Round-Earth model of the Universe (Aristotle), about 400 B.C. — 150 A.D., approximately.
    4. Smooth: Geocentric model of the Universe (Ptolemy, Tycho Brahe), about 150 — 1600 A.D.
  2. Indirect anthropocentrism: classical realism
    1. Fine: Heliocentric model of the Universe (Copernicus, Kepler, Galileo, Newton, etc.), 1600 — 1800 A.D.
    2. Sophisticated: No-center model of the Universe, infinity (Kant, Laplace, etc.), 1800 — 1900 A.D.
  3. Implied anthropocentrism: scientific realism, quantum realism
    1. Subtle: Relativity Theory and Quantum Mechanics, the expanding, steady-state, inflationary models of the Universe, Big Bang-Big Crunch (Planck, Einstein, Bohr, de Broglie, Schrödinger, Heisenberg, Dirac, Hubble, Hawking, Guth, etc.), 1900 — 2000 A.D. - ?

COSMOCENTRISM: cosmic realism (Cosmos is the measure of all things, reality viewed relative to the Cosmos, Luger's Genie), 2000 A.D. - ?

Introduction

About 400 years ago, two competing world-view paradigms, based on the geocentric model of Tycho Brahe and the heliocentric model of Nicolaus Copernicus, were equally compatible with all known observations. It was impossible to decide in favor of one or the other in terms of the available evidence. Thus, concerned natural philosophers may have sympathized with the absurd predicament of Buridan's ass, which starved to death between equally attractive feeding possibilities.

In time, heliocentrism superseded geocentrism; and thus a paradigm shift had taken place. One reason was that such new discoveries as those of Kepler, Galileo, Newton, etc. had slowly tilted the balance in favor of Copernicus. The other, perhaps even more important, reason was that supporters of Tycho Brahe had gradually died out.

Why was this important? Because the geocentric model was more compatible with emotional factors than the heliocentric one. These factors had to do with simple, common-sense, intuitive notions, as well as with philosophical-religious teachings about Man's privileged status in Nature, Man's closest kinship to God, and the like. Of course, these self-flattering notions were extremely resistant to rational arguments; therefore, the most vociferous partisans simply had to die out to make room for the new view. The emerging mechanistic world-view allowed far less arrogance and complacence and sharply accentuated the need for rationalism and empiricism. Nevertheless, human conceit and cosmic vanity have survived to the present day; and in spite of overwhelming contrary evidence, still find ample expression in the common-sense view of the Universe, which may be summed up as anthropocentrism.

Whether in crude or subtle ways, anthropocentrism regards Man as the central fact or final aim of the Universe, or of any system; and its evaluations are always relative to Man, always based on comparisons with Man. Direct anthropocentrism is the natural world-view of naïve realism. Based on instinctive and intuitive sense-perception, this simple and linear perspective maintains that reality is as it looks; things are what they seem. After Copernicus, indirect anthropocentrism gradually superseded the earlier view along the lines of classical realism. Finally, despite the rational objectivity of scientific realism and the counterintuitive or irrational features of quantum realism, implied anthropocentrism is still with us, as seen for example in the various sophisticated 'anthropic' principles. Today, all world-views are still intuitively anthropocentric, modern Science notwithstanding.

Against all this, in diametric opposition, cosmocentrism proposes the Cosmos as the central fact or final aim of the Universe, or of any system; and suggests that evaluations might approximate independent reality much more closely when they are made relative to the Cosmos, based on comparisons with the Cosmos. This is the (un)natural world-view of cosmic realism. Based on scientific research data and sometimes even counterintuitive synthesis, this complex and nonlinear cosmocentric perspective maintains that reality is not as it looks; things are not what they seem. Of course, this view assumes that scientific realism is correct; i.e., that there is a world 'out there' that really exists and that is independent of our attempts to observe it and in fact independent of our very being. Its corollary assumption is that scientific investigations can make this world comprehensible to us. Neither of these assumptions is arbitrary or ad hoc; they are based on plenty of evidence from modern Science as well as the lessons of History.

Perhaps the lessons of History have taught us to avoid the fate of Buridan's ass. Perhaps we no longer have to fritter precious time away, just waiting for partisans of the rival view to die out. Perhaps we have learned to recognize irrational clingings to self-flattering views, and we already know how to deal with ignorance and arrogance. Perhaps Mankind no longer needs cosmic vanity to be reconciled with 'fate' and natural reality. Perhaps a paradigm shift in favor of cosmocentrism will herald the dawn of a new era, when emotional maturity and tolerance begin to supersede fratricidal-suicidal adolescence. Let's hope so; and, therefore, let's start groping toward it!

Discussion

Perhaps the most concise definition of anthropocentrism was given by Aristotle when, some 2,300 years ago, he quoted the great sophist Protagoras (ca. 481-411 B.C.), who said that "Man is the measure of all things". This was in perfect agreement with common-sense views of Man, Nature, God or gods, and the Universe, based on the knowledge of those times and projections or extrapolations therefrom. In order to fully understand what anthropocentrism is and what its inadequacies are, it may be worthwhile to take a somewhat closer look.

For some 7,000 years, until the early XXth century, we had thought that we lived in a static Universe, characterized by eternity, permanence, stability, predictability, and reversibility. Although the importance of change and of time-bound, irreversible processes has always been recognized, the permanence-features of the Universe had been given greater emphasis. Why? Because of the adaptive preoccupation with God or gods, which in this context also represented the vast unknown segment of reality; and of course, God or gods had to be immutable in order to maintain divine status and absolute rights. What evidence was there to support such a view?

Not much. Our remote ancestors did not think so much in terms of evidence as in terms of plausibility. However, they were no fools. Keen observations formed the basis of their intuitive views and sharp analogies helped them to make sense of the bewildering world in which they lived. It was quite natural to observe human causation, from which simple intuition or projection led to superhuman causation. So, gods had always been thought to be giant humanoids with supernatural powers. Remember, Science did not exist as such; and authority was based on power, rather than knowledge. Thus, the Sun-god Shamash had divine authority by means of which laws could be conferred upon Sumerian society and enforced through the good offices of King Hammurabi about 4,000 years ago, throughout Babylonia.

This is how the first consistent world-view arose in ancient Mesopotamia, based on astronomical observations and practical considerations. It was the natural or instinctively intuitive flat-Earth view, according to which we live on a flat disk covered by a hemisphere. The ancient Persian religion of Zoroastrianism, among others, furnished gods and angels for Heaven above the hemisphere, and devils and demons for Hell beneath the disk. Stars and their constellations resembling something that humans could relate to, gave rise to astrology and associated myths. The most important feature of this natural world-view was consistency with all known facts as well as explanation in terms that were familiar and satisfying to the ancients.

Of course, partial explanations also flourished, as this was the age of fabulous myths and great legends. However, the most general view, being the most consistent with facts and features of reality thought to be important, was this anthropocentric flat-Earth model, which was also the most satisfying in terms of cognitive-emotional needs.

The only addition to this view was its extension by Aristotle. He simply took the flat Earth and spun it around, so that the hemisphere became a full sphere. By that time, more and more evidence seemed to suggest that the Earth was round, not flat; and this was more consistent with his philosophical reasoning, which emphasized natural beauty and harmony.

It was thus quite natural for Aristotle to propose his famous hierarchy, called "Scala Naturae", which put Man near the top of a ladder or apex of a pyramid, if you will. Beneath Man was the animal kingdom, and beneath that, the non-living world. Above Man was God or gods; by means of which the unknown could be rationalized, albeit in naïve anthropocentric terms. Man thus acquired dominion over Nature, Man was Nature's finest, destined to rule all the world, being subject but to God or gods.

Our truth-needs were satisfied by the simple anthropocentric world-view, while our love-needs were satisfied by the human-privilege notion of our closest kinship to God or gods. Together, they had taken care of our cognitive-emotional needs, with minor variations, all along the line. Thus, natural psychodynamics was the essence of anthropocentrism, quite understandable in prescientific times and unscientific terms. Things were what they seemed, and reality was as perceived by Man.

Based on careful astronomical observations, the Alexandrian astronomer Claudius Ptolemaeus formalized the Aristotelian world-view during the 2nd century, A.D. The Ptolemaic system simply postulated that the Earth was the spherical center of the Universe; and the Sun, the stars, and the other planets revolved in orbits and spheres around it. Heaven was still above it all, and Hell was still below the surface of the Earth.

A dozen centuries or so later, this was still the prevailing view, further extended and complicated by astronomical observations and postulates, such as stellar patterns and various epicycles. This perspective formed the basis of the geocentric paradigm as championed for example by the famous Danish astronomer Tycho Brahe at the end of the Renaissance period in the XVIth century. It was still a directly anthropocentric perspective, well in line with Church dogma. The Universe still revolved around Man; and thus, no matter how pompous it sounds, Man was still the crowning glory of Creation and God's gift on Earth.

Let's remember that ever since Man has known that he knows, reality has always been perceived in two categories: known and unknown. At first, knowledge was limited to Man's immediate experience; and everything else was unknown. But the unknown is unpredictable, hence anxiety-provoking; and unrationalized anxiety reduces Man to helplessness. By rationalizing thunder and lightning as the wrath of God or gods, for example, such phenomena could be given explanations that people could relate to; and, very importantly, no other explanations were available. Today, we have adequate explanations of natural phenomena without recourse to supernatural notions. We might still experience some anxiety when facing thunderstorms, for example, but we no longer have to invoke and try to placate gods or demons in order to survive such episodes. In other words, as knowledge has increased over the millennia, the unknown has decreased proportionately. Knowledge thus enables us to relate to various features of reality without superstitious beliefs and practices or incapacitating fears and anxieties. However, in this context, it is important to distinguish between subjective and objective kinds of knowledge. Perhaps a word of explanation is in order.

Subjective knowledge, while intuitively appealing and perhaps even emotionally satisfying, may also be unreliable and invalid. From perceptual selectivity to idiosyncratic preferences, subjective knowledge can easily lead to false beliefs and distortions of reality. For example, belief in witchcraft led to tragic persecutions and absurd injustices for many centuries. It is thus a moral duty to always strive for more and more adequate knowledge and to remain open to criticism, even self-critique; otherwise, arrogance and self-righteousness can lead to nothing but repetitions of the horrors of History.

Objective knowledge may be counterintuitive and even emotionally unsatisfactory, but being reliable and valid, it really helps to avoid self-righteousness and falsehoods. Fortunately, we have epistemological methods and safeguards to ensure the adequacy of objective knowledge. The built-in self-correction of the scientific method is our principal guarantee of reliability and validity, in spite of the inherent limitations of Science.

Science is not perfect. Nor is it complete. It may or may not be emotionally comforting, but it's still the best we have; and it works. Of course, it is also our moral duty to avoid the fallacies and pitfalls of scientism, and never to mistake Science for a religious substitute or make a substitute religion of it. Science, in the modern sense of a dynamic epistemological activity characterized by its hypothetico-deductive-inductive method, is still very young - barely 400 years old. What's that compared to 4,000 years of anthropocentrism, 40,000 years of cultural evolution, and 400,000 years of anthropological evolution?

Yes, it was perhaps 400 years ago that modern Science had begun to take shape. Francis Bacon of Verulam, among others, was instrumental in formulating its methodology. By that time, Copernicus had already proposed the heliocentric paradigm; and thanks to Gutenberg, printed knowledge had begun to spread. However, only elegance and Occam's razor argued in favor of Copernicus; and his vindication had to await the works of Kepler, Galileo, Newton, etc. Scientific measurement and systematic experimentation throughout the XVIIth century gave rise to the scientific revolution. To be sure, Science was still part and parcel, a 'handmaiden' of Natural Philosophy; but by the turn of the century, its emancipation was well under way, and direct anthropocentrism was in trouble!

The XVIIIth, XIXth, and especially XXth centuries saw indirect anthropocentrism gradually superseding the earlier direct view, as the thriving handmaiden of Natural Philosophy rapidly blossomed into a very attractive and effective young 'goddess'. Her emancipation became complete about a hundred years ago, and her superior beauty and efficiency have been amply confirmed by spectacular technological marvels that would have been called 'miraculous' not too long ago. When my Grandfather was a child, there was no such thing as an airplane; but in the year he died, Man walked on the Moon. And that's within a single lifetime! Since then, progress has accelerated and keeps increasing at a dizzying rate. Today's knowledge, its immensity notwithstanding, may be very rudimentary compared to tomorrow's knowledge. Where it's all going to lead is anybody's guess right now.

During its early evolution, Science generally proposed a mechanistic, deterministic, and mathematically predictable Universe, not unlike a great clockwork of great precision. The XVIIIth century had extended this static, hydraulic, machine-like view to Man, as shown, for example, by Julien de la Mettrie's "L'Homme Machine". Pierre Simon de Laplace's monumental work, "Mécanique Céleste", had taken determinism as far as doing away with God by doing without God. When questioned about it by the Emperor Napoleon, Laplace rather arrogantly replied that he had no need of that hypothesis.

Indeed, the rapid and spectacular progress of Science had demystified the Universe to the point that Friedrich Nietzsche announced that "God is dead". Nihilism, existentialism, and materialism had no room for anything supernatural. Positivism and Darwinism appeared to rob Man of his semi-divine privileges and cast serious doubt on divine creationism. Thus, about a hundred years ago, as Science had gradually begun to reveal that things are not what they seem, even indirect anthropocentrism started to be in trouble!

However, the Universe itself was still thought to be static. That is, the heliocentric model, ruled by blindly mechanical forces, was at first simply extended to infinity, both 'up' toward the macrocosm of stars and galaxies and 'down' toward the microcosm of atoms and molecules. Later, the Sun was deprived of its central position; and there was no further need for an astronomical center, as such. Stellar and galactic systems could make up the static Universe, without a preferred center; but being its prime observer, Man could still maintain dominion. This is a very subtle psychological point, well worth careful consideration. The static Universe remained indirectly anthropocentric, by virtue of potentially infinite observability and predictability, hence controllability. It was even thought that all essentials were already known, and the completion of Science would soon be forthcoming. Instead, what came forth was a series of knockout blows.

During the XXth century, it became clear that the Universe is not static, but dynamic and expanding. Worse, Relativity Theory in the macrocosm and Quantum Mechanics in the microcosm had completely overthrown common-sense, intuitive notions, and thus deprived us of conceptual comfort and security. As such, there could be no further doubt that things are definitely not what they seem. Increasing doubt had been cast on predictability and controllability. Worst of all, limits to knowability had begun to appear, such as Heisenberg's Uncertainty Principle in Physics and Gödel's Undecidability Theorem in Mathematics, for example. Schrödinger's wave mechanics cast doubt on exact determinism and substituted probabilistic interpretations. It was shown that the act of investigation itself may distort reality. The sheer proliferation of data, the information explosion, has blown all objective knowledge way out of proportion, in utter disregard of Man's perennial cognitive-emotional needs and the lessons of History. The result has been increasing confusion and frustration throughout the weary XXth century.

Nowadays, people don't know what to believe or whom to trust any more; and cognitive dissonance as well as emotional voids characterize modern Man's conflicts, which may be indicative of progressive neurosis, maybe even psychosis of some schizophrenic variety. By instinct, Man still directly perceives things relative to himself; but scientific knowledge forces him to think less and less as though the Universe revolved around him and more and more in very sophisticated, albeit still indirectly anthropocentric terms- but even that is rather objectionable. For example, we have to consciously remind ourselves that the galaxies are 'out there' and electromagnetism permeates everything and bacteria are 'all around', whether we see them or not. Together with the Big Bang cosmology and plenty of other evidence, the emerging picture seems to suggest that human sense perception, however extended by telescopes and microscopes, keeps Man locked into a 'bubble' of virtual reality, as it were, in dynamic interaction with the expansion of the Universe. But virtual reality is definitely not independent, real reality. This is tantamount to pronouncing the death sentence on all anthropocentrism, whether direct or indirect or both.

Let's put it differently. If we proceed from Man 'outward', we pass through increasing magnitudes: the solar system, then the stellar neighborhood, then the Galaxy, then the Local Group of galaxies, then the local supercluster, all the way to the outer limits of the Universe. As we reach these limits at the level of cosmology itself, things become increasingly blurred. The geometry is no longer Euclidean, visual information becomes less and less reliable, and more and more indirect methods have to be used, from radio astronomy and x-rays to mathematical modelling. There is a uniform microwave background radiation at 2.7 kelvin, which may be evidence of the Big Bang itself. Dark (nonluminous) matter, both hot and cold, seems to comprise some 90% of the Universe and is inaccessible to direct observation. Perhaps such dark matter could effectively close the Universe by raising its mean density above the critical value, whereby a pulsating or oscillating Big Bang - Big Crunch cosmology would perforce emerge, ad infinitum.
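As a quick plausibility check on the figure above: Wien's displacement law puts the emission peak of a 2.7 K blackbody near one millimetre, squarely in the microwave band. A minimal sketch follows; the constant and the measured CMB temperature of about 2.725 K are standard values, while the function name is my own:

```python
# Peak wavelength of blackbody radiation via Wien's displacement law.
WIEN_B = 2.897771955e-3  # Wien's displacement constant, in metre-kelvins

def peak_wavelength(temperature_k: float) -> float:
    """Wavelength (metres) of maximum blackbody emission: lambda = b / T."""
    return WIEN_B / temperature_k

# The cosmic microwave background, at about 2.725 K, peaks near 1 mm:
lam_mm = peak_wavelength(2.725) * 1e3
print(f"CMB peak wavelength: {lam_mm:.2f} mm")  # ~1.06 mm, i.e. microwave
```

The same one-line law explains why a 5800 K star peaks in visible light while the sky's relic radiation, a couple of kelvin above absolute zero, can only be seen by radio telescopes.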

Now, if we proceed from Man 'inward', we pass through decreasing magnitudes, through the 'worlds' of physiology and biochemistry, all the way to quarks and other subatomic particles. Finally, we reach the inner limits of the Universe, as it were. Here, again, a blurred mode of existence seems to prevail, as virtual particles spontaneously jump in and out of existence all the time, as described by Quantum Field Theory. Again, the geometry is no longer Euclidean, visual information becomes less and less reliable, and more and more indirect methods have to be used, from electron tunneling and x-ray scattering to mathematical modelling. The same dark matter as in cosmology might provide sufficient energy densities for the fundamental field, so that virtual particle fluctuation may continuously take place, again ad infinitum.

So, proceeding from Man outward, we reach the blurred mode of existence, which is cosmodynamics. Proceeding from Man inward, we also reach the blurred mode of existence, which is also cosmodynamics. Either way, the same Cosmos lies at the end, as per current knowledge. The Cosmos seems to be the infinite baseline of all existence, from which all material events arise and to which they periodically return. If we were to post an unbiased, ideal observer at the level of the bare Cosmos, the Universe would look very different from there than from here. Let's be a bit whimsical and call this nonhuman cosmic observer 'Genie', somewhat in the manner of Maxwell's 'Demon', if you will. Relative to Genie, all material events would be on a scale of positively increasing magnitudes, if we place the observer at the origin of Cartesian coordinates. Genie would observe all material as well as nonmaterial events as various motion phenomena, as though the vantage point were at the center of a sphere, assuming our habitual Euclidean geometry for present heuristics. Therefore, relative to Genie, it seems reasonable to conclude that the Cosmos is the central fact or the final aim of the Universe; and this is nothing less than the definition of cosmocentrism itself.

Let's imagine a straight line or spectrum with Man at the center and Cosmos at both ends. If we rotate either half of this line around Man, we get the anthropocentric paradigm. If we bend it in half and double it up, so that Man is at one end and Cosmos at the other, and then rotate it around the Cosmos, we get the cosmocentric paradigm. Since the doubled-up line resembles a loop, rotation yields a doughnut- or torus-shaped Universe, which is compatible with all present-day objective knowledge, including cosmology. Man is way out at the periphery of the torus, nowhere near the center. However, things are not quite this simple; while anthropocentrism is a simple, static, and linear world-view, cosmocentrism is a complex, dynamic, and nonlinear perspective, nay, a complete paradigm per se.

Cosmocentrism shifts focus from Man to the Cosmos. It considers Man as nothing special, but a perfectly normal and necessary phase of cyclic evolutionary cosmodynamics. Cosmic evolution seems to proceed by both positive and negative feedback loops, between Big Bangs and Big Crunches, following the irreversible thermodynamic Arrow of Time. The Cosmos itself seems to consist of an overall closed system and several open subsystems, in dynamic interaction, somewhat like multidimensional subsets within a universal master set. The overall cosmic matrix with its pulsating submatrices appears to be what existence is all about. That one of the submatrices may be called human need no longer distort the overall matrix or the proportions and relations of the submatrices. Relative to the bare Cosmos, which alone may be timeless, all material events are observably time-bound and transient. Cosmocentrism thus provides a perspective consistent with all objective knowledge, and a world-view more harmonious with independent reality than the severely flawed, directly or indirectly anthropocentric paradigm. As such, it may be instrumental in the eventual resolution of our conflicts.

No need to fear humiliation. Our cosmic dignity is assured by our cosmic citizenship without our having to imagine that the Universe revolves around us. Although our cosmic roles may appear rather insignificant, we are just as indispensable and integral to the Cosmos as any other living or nonliving entity.

Nor does cosmocentrism do away with God or religion. Although, as Professor Stephen Hawking noted, there is not much for a Creator to do in Big Bang cosmology, God and religion may still be invoked, albeit for emotional rather than cognitive needs. The challenge of cosmocentrism is that Man, not God, must be dethroned. Of course, God in the cosmocentric paradigm cannot very well resemble the Heavenly Father image of naïve realism; but perhaps it's just as well. Anyway, that's another story.

Summary & Conclusions

In summary, this paper has endeavored to show that a fundamental paradigm shift from anthropocentrism to cosmocentrism is possible and perhaps even overdue. That is because common-sense perception of everyday reality is ab ovo anthropocentric, and factual knowledge of objective reality increasingly proves it unreliable and invalid. It's high time for our intuitive world-view to become fully consistent with Nature as Nature is, rather than trying to squeeze Nature into our self-flattering pigeonholes. In short, it's time for a fundamental adjustment in our cognitive-emotional perspectives; it's time to transcend our bubble of virtual reality.

To be sure, direct anthropocentrism arose quite naturally along the lines of cultural evolution. From the ancient flat-Earth myth through the Ptolemaic system all the way to the geo-heliocentric model of Tycho Brahe, it was just a linear extension of a simple paradigm: that of Man on top of his world. Then the Copernican heliocentric model gave rise to a mechanistic and increasingly materialistic world-view, which, together with modern Science in general and modern Physics in particular, has gradually shown that even the indirect anthropocentric paradigm may be inadequate and seriously misleading. The anthropocentric evolution of world-views from primitive to sophisticated can be seen as a series of growing conflicts between subjective and objective perceptions of factual truth, all the way to the cognitive dissonances and emotional voids of today, always relative to Man.

Against this background, the new cosmocentric paradigm may be proposed as an adequate and truthful representation of cosmic reality, through the careful observations of an independent, factual, and unbiased, perhaps even ideally optimal, observer at the level of the Cosmos itself, as it were. This objective, nonhuman Genie has only one problem: available knowledge is still permeated by indirect or implied anthropocentrism of ever-increasing sophistication and subtlety, as seen, for example, in the various 'anthropic' principles. As even the existing tools, such as logic, mathematics, physics, and philosophy, are still 'contaminated' by anthropocentrism, new tools may be needed, thanks to which many discoveries may be made beyond our wildest dreams. Much remains to be discovered. Genie is going to be very busy, but Genie needs a lot of help for the complete substantiation of the cosmocentric paradigm. Although many of its features are counterintuitive, perhaps even irrational, there is already enough evidence in favor of adopting factual and objective cosmocentrism, and without repeating historical mistakes at that. Until now, every world-view has been anthropocentric, whether in crude or subtle ways. The radically new world-view of cosmic realism, called cosmocentrism and introduced here for the first time, is explicitly based on scientific realism, which holds that theoretical constructs (with some exceptions) refer to actually existing things, described differently on different levels of theory. Gravitation is really there, whether it be described by forces or by space-time curvature. Classical realism has been superseded by quantum realism, which in turn may be superseded by cosmic realism. What it all means for us is simply that we have to give up dominion. Cosmocentrism does not exalt Man. Rather, relative to the Cosmos, it shows the soberingly modest place of Man in Nature and Nature's proper place in Man. As such, cosmocentrism may be less emotionally satisfying than anthropocentrism; well, tough luck.

Presently, both paradigms are compatible with existing knowledge, and the choice may again be made on grounds of elegance and Occam's razor, at least for the time being, until Genie tilts the balance definitively and irreversibly. The choice is ours, but with an important caveat. Factual truth is a moral duty, and we really ought to keep in mind that things are not what they seem. The shift in favor of cosmocentrism is tantamount to a fundamental revolution at the conceptual level. The essence of this revolution is that the Cosmos, rather than Man, is the true measure of all things.


Paper written for and presented at the joint British Mensa P.D.G. — I.S.P.E. Conference, Braziers College, Oxford, U.K., May 5-7, 2000.

Published:

Telicom, Vol. XIII, No. 5, July 2000, pp. 30-40.

Commensal, No. 102, August 2000, pp. 24-33.

Gift of Fire, Issue 120, Nov. 2000, pp. 24-35.

PhiSIGma, No. 22, August 2001, pp. 26-36.

1 Chairman and Founder, Mensa Israel; Diplomate, International Society for Philosophical Enquiry.


Wednesday, February 21, 2007

The Tragedy in Death of a Salesman

"…how shall they believe in him of whom they have not heard?
and how shall they hear without a preacher?"
— Romans 10:14

Fred Vaughan headshot by Fred Vaughan

In "Death of a Salesman" by Arthur Miller there is an illusion nurtured by Willy that a man can be "worth more dead than alive."1 Obsessions with destiny can play such tricks on a person. In the end, however — but before Willy's suicide — there is a summing up: "Pop!" his son says, "I'm a dime a dozen, and so are you!" Willy's repudiation, "I am not a dime a dozen! I am Willy Loman," does nothing to substantiate an imagined reality in which the salesman Willy Loman has profound significance. But the Willy Lomans of the world, and perhaps even the Arthur Millers, cast short shadows in comparison to men to whom the appellation "tragedy" applies. There is neither a singular tragic flaw to precipitate their demise nor great ideas hanging in the balance with their life or death. So the terms "tragedy," "death," and "salesman" in my title pertain very little to the similarly named play. Miller argued that although Willy is indeed a "little man," he is worthy of the pathos we usually reserve for tragic heroes such as Oedipus Rex; any character willing to sacrifice his life to secure a sense of personal dignity, he held, invokes the sense of tragedy. So who knows, it could be, although I tend to doubt it. Nonetheless, I had something else in mind.

There must certainly be many cases throughout history in which ideas of extreme import have been lost for no other reason than the death of a chief proponent, although a full accounting of the overwhelming loss due to such events is well beyond any conceivable effort at historical reconstruction. Certainly the most complete instantiations of such carnage have, by their very effectiveness, destroyed all evidence of the ideas that were lost. We only occasionally glimpse that such situations may actually occur, because a meme has managed by some accident of fate to frustrate the procedure and escape into the world at large before the death of its initial advocate. Even in cases where the complete premature annihilation of an idea was unsuccessful, we find sad commentary on the surviving culture unilaterally pardoning the past sins by which a counterculture was illegitimately destroyed. Furthermore, desecrated ideas do not recur in the interim, as they are purported to be capable of doing in the cultural fairy tales that promote the "inevitability" of all great ideas. They are gone — it is possible that most truly great ideas have vanished forever! The context of history changes such that an unformulated idea would never occur to anyone else after its time had passed. Even "immortal" gods, perceived rationally by many as simply the products of human intellectual exercise, are vulnerable to extinction with their adherents. St. Paul knew this. And H. L. Mencken named one hundred seventy-one "immortal" gods that have long since succumbed to the nether world, quipping in conclusion: "All were theoretically omnipotent, omniscient, and immortal. And all are dead."2 One must conclude that preemptive violence against human hosts, employed to prevent unwanted meme epidemics, has been spectacularly successful in every area of intellectual endeavor, including philosophy, science, mathematics, music, religion, and, of course, politics.
The effectiveness of accidental death or ruthless intrigue on all sides of every issue has been truly appalling, and there is little reason to doubt that nascent ideas will be vicariously assassinated well into the future. It's happening right now. Machiavellian techniques apply not just to politics but, sadly, to every area of human intellectual endeavor.

Proponents of tired paradigms, inaugurated before the eldest living human was old enough to propound the previous paradigm, still melodramatically cite Thomas Kuhn's popularized notion that a paradigm becomes universally accepted only when death finally takes all those who upheld the previous one, and use it to illegitimately criticize opponents.3 It's a dumb argument. Establishmentarian ideas debated into the ground have not died on account of the deaths of their proponents! There was full knowledge of their inner workings as part of the debate that accompanied their demise. And long after the last proponent has been ushered to the nether region, stories survive of the victory of the new paradigm, which will be extolled until it is in turn replaced; and in extolling its success, the defeated ideas survive as a leitmotif against which it can be praised. Only fragile newborn ideas, unheard outside an inner circle, are truly vulnerable to death, whether by natural disease, accident, or inquisition of one or a few of their intellectual hosts. It is in this defenseless phase of private discovery and investigation, prior to joining the public debate, that destiny balances precariously on a fragile human fulcrum.

In his American classic, Robert Pirsig suggested that the philosophical ideas propounded by the sophists in pre-Parmenidean Greece may have been systematically destroyed by antagonists, and that what must once have been a heated debate turned into a unilateral attack on "sophistry" as mere rhetoric.4 With no sophist alive to set the record straight, these accusations held for millennia, and the sophists' alternative philosophical structure disappeared from the face of the earth and the minds of mankind. That is, of course, unless Pirsig actually did recapture some of the original intent, extracted from the roots of words and from innuendos in the accusations, in his revitalized concept of Quality as preeminent over the subsequent Westernized Aristotelian classifications.

In an earlier attempt at imitating the style of Jorge Luis Borges,5 I intimated that science perfunctorily expunges concepts from its registry as part of a normal retroactive redaction, such that records of the life work of the hapless characters Woran von Geht and Friedrich Spielen had already been expurgated from journals: "The considerable volume of their contributions… more recent translations…have mercifully omitted…" However, beyond the facetious novelty of that account, a real danger of very similar expurgation processes exists. Nearly a century ago, two of the most brilliant prospects for salvaging physics from the doldrums of academia vied with their alternative fixes to the then-current dilemmas. As a protégé of Poincaré, Walter Ritz had developed alternatives to the already gilded dogmas surrounding Maxwell's wave equations of electricity and magnetism. He was able to avoid the problem of having to throw away legitimate solutions to theoretically justified equations just because they ignobly refused to apply to the "real" world. Ritz's theory also competed honorably with Einstein's relativity for a time, accounting for many of the experiments because of the accepted factuality of what he pointed out with regard to the phenomenon of extinction of light by lenses, mirrors, and indeed any material medium. Some years later, Willem de Sitter promoted Einstein's special relativity in preference to Ritz's theory, using illegitimate arguments with regard to the non-existence of ghost images of binary stars.6 I sometimes wondered why so brilliant a physicist as Walter Ritz would not have rebutted such feeble arguments and thus kept the debate alive. I finally realized why: Walter Ritz had long since been dispatched to the nether world! Earlier, he and Albert Einstein had also argued at length about the origin of irreversibility in physics, an argument that had gone on for some time.
At length, the editor of the journal Physikalische Zeitschrift seems to have suggested that the two formulate their respective positions, sign an agreement to differ, and get on with it.7 So they did, in 1909, and the debate ended. But of course, as too few know, the primary reason the debate ended was that Walter Ritz died two months after the agreement to disagree was published. Hence, of course, de Sitter's subsequent claim of 1913 with regard to relativity would go unchallenged. Later in life, Einstein recapitulated the arguments about irreversibility to Wheeler and Feynman as stimulation to their development of absorber theory,8 and seems to have somewhat altered his own position on the issues, including the debate with Ritz.9 But Einstein is dead too, and most physicists have accepted his previously formulated position that complexity, with its associated need for probabilistic solutions, must in itself produce irreversibility without a microscopic counterpart. Cramer alone, who also challenged the "Copenhagen Interpretation" with his "Transactional Interpretation" of quantum mechanics, seems to maintain the standard propounded by Ritz.10 But sadly, "a formula, a phrase remains, — …the best is lost," as Edna St. Vincent Millay bemoaned.11 To my knowledge, no one has been able to reconstruct Ritz's electromagnetic theory.

In mathematics there is Évariste Galois, without whose willingness to write down the ideas of group theory the night before his duel over the dignity of a whore, we would not now have one of the major branches of mathematics. But, of course, if he had gotten a good night's sleep, practiced with his pistols, or, better yet, simply capitulated to his lust, all of mathematics might be much more sophisticated than it is. In music there was Mozart, perhaps murdered, or at least driven to deadly abstraction, by an opponent of his abilities.

If salesmanship and religion don't seem to fit in the same sentence, read Roger Rueff's play "Hospitality Suite,"12 or see the movie based on it, "The Big Kahuna," with Kevin Spacey and Danny DeVito. With regard to religious ideas, it should be noted that although Judaism, Christianity (for a time), and Islam (during the odd crusade) were repeatedly under attack, these attacks always came after the associated memes had leaked out into society at large and were therefore ineffective beyond the associated slaughter of humans. Zoroastrianism, on the other hand, like so many religious ideas before and after it in cultures throughout the world, including the previously cited immortal gods, did not fare so well. It was destroyed most effectively by the more or less total destruction of the Persians who held to the doctrine of good versus evil to the bloody end. Perhaps current administrative decisions by the U.S. may in some way revitalize this notion, which lacks so much in subtlety, by vainly attempting to destroy all those infected by the offending idea that the Western world is evil. Ethnic groups everywhere and always have seemed to annihilate without compunction anyone holding opposing religious ideas, for the greater glory of their own gods, their own culture, their own ideas.

In the political arena, character and literal assassination have been the norm, and the practice seems to have picked up momentum over the last quarter century. The tragic deaths, and subsequent annihilation of character, of key liberals by the resurgent American conservative movement were motivated in large part by an agenda that cared primarily for the destruction of liberal political ideas, a cause beside which these people's lives had no moral standing. In contrast, by elevating the stature of a chief proponent of terrorism and attempting but failing to destroy his person, his ideas may be emboldened like flames in a wind that has just failed to extinguish a fire. Creating public martyrs has the opposite effect of secret assassinations. So, although it is not surprising that bin Laden should find himself under attack by the most powerful nation ever to rule the world, it is indeed surprising that there would be so little awareness among Americans of the phase of this particular epidemic of anti-American sentiment. It seems well past the stage at which the incineration of any affected person, or even of a small group of people, could be effective in eradicating the viral meme. The idea that the Western world is consumed by its own power and glory is out there! That notion and the associated hatred of Americans have been out there for some time, with only the most naïve caught unaware on September 11, 2001. Now the idea is being reinforced by ill-conceived attempts to destroy it. It would seem that it should have been, and should still be, obvious that the idea must be debated openly to portray the proper perspective. Having resorted to prehistoric methods of idea extinction, too late in any case, the approach can only confirm by its success or failure what we desperately want to believe to be an invalid idea. How do we now convince anyone of its illegitimacy? Certainly neither Afghans nor Iraqis (nor any of the world's other billion-plus Muslims) will buy the idea that we do not, and will not, continue flaunting military and economic might throughout the Middle East and the entire world until we have utterly destroyed all cultures but our own. That is an idea worthy of our consideration, something to think about.

The death of a "salesman" of any idea, by any method whatsoever, is akin to killing the messenger. Certainly terrorists instrumental in mass killing are not merely salesmen being killed. They must be brought to a justice that may involve their own deaths, no less or more so than other perpetrators of heinous crimes. But let it be known that even in such cases, capital punishment is constitutionally administered in consequence of plans or actions involving the killing of human beings, not for the nurturing of ideas. For one thing (and it is, in fact, a major thing), to act otherwise is immoral by virtually any standard in any society. Those who treat human life as subsidiary to, or as a mere attribute of, the material symbols of an idea (or of the idea itself) are grossly immoral. Ideas must earn victory, and arguments should be won or lost on the relative merits of the competing ideas, not by "kill ratios" reminiscent of Vietnam. Pursuing ideological arguments with human slaughter, however effective, by definition disqualifies the participants from victory in any war alleged to pit good against evil. Once both sides have resorted to such tactics, what is left is a bloody crusade of "us" versus "them"!


1 Arthur Miller, Death of a Salesman, Penguin, USA (1998)

2 H. L. Mencken, "Memorial Service," Prejudices (a selection), Vintage, New York, 143-147 (1958)

3 Thomas Kuhn, The Structure of Scientific Revolution, University of Chicago Press, Chicago (1962)

4 Robert Pirsig, Zen and the Art of Motorcycle Maintenance, William Morrow, New York (1974)

5 See for example, Jorge Luis Borges, Labyrinths, New Directions Pub Corp., New York (1964)

6 R. Fred Vaughan, "Special Relativity: An Experimental Error," Gift of Fire, 31, 6-15 (July 1988).

7 Walter Ritz and Albert Einstein, "On the Current State of the Radiation Problem," Physikalische Zeitschrift, 10, 323-324 (1909).

8 John Wheeler and Richard Feynman, "Interaction with the Absorber as the Mechanism of Radiation," Reviews of Modern Physics, 17, 157 (1945).

9 Abraham Pais, Subtle is the Lord — The Science and Life of Albert Einstein, Oxford, 467 & 484 (1982).

10 John Cramer, "Velocity Reversal and the Arrows of Time," Foundations of Physics, 18, 1205 (1988).

11 Edna St. Vincent Millay, "Dirge Without Music," Collected Lyrics, Washington Square, New York, 172 (1959).

12 Roger Rueff's play "Hospitality Suite" does not seem to be available in print.


Wednesday, December 20, 2006

The Near Shall Be Far; And The Far Near

Richard May headshot by Richard May

In already-withered futures everyone will be incredibly famous throughout uncountable worlds of unimaginably remote galaxies in other parallel universes, considered celebrities by innumerable life forms unrecognizable and incomprehensible to them. However, the closer one approaches to anyone proximate, the more darkly obscure she will become, and then increasingly unfamiliar with the passage of time; no one nearby will be dimly recognizable or ever be known, even by rumor; languages will have no words for mother or other; standing before the mirror, one will see no reflection; yet this will be considered unremarkable.


Monday, December 18, 2006

Ideologies

Richard May headshot by Richard May

Freedom, peace, and prosperity are far better than their absence or negation. But ideologies are one-dimensional left-right maps of multidimensional territories of phenomenal processes and values: the attempted depiction of a cube or tesseract using only a straight line segment. There are no front-wingers or back-wingers, and no up-wingers or down-wingers. Moreover, unlike conventional maps, ideologies usually serve the manufacturer of the map better than the individual attempting to find his way.



Tuesday, December 05, 2006

Games, Simulation and Religion

Sean J. Vaughan headshot by Sean J. Vaughan

Tonight while putting my son to bed I shared some thoughts I have from time to time in terms he could understand...

Virtual Reality

It seems that games will eventually become as realistic as our everyday perceptions. A joint effort between MIT and Harvard has produced a rudimentary synthetic eye that interfaces the optic nerve with external cameras mounted on eyeglasses. Several other projects are working on brain implants for other human-aiding tasks, such as allowing quadriplegics to work with a computer. While actual working systems are rudimentary, there don't seem to be any insurmountable physical or technical barriers to interfacing our perceptive organs with synthetic systems.

sketch of Sean J. Vaughan and his son

Using our current perception-interfacing agents, joysticks, keyboards, and mice, many people already choose to inhabit rich synthetic universes. These universes include Everquest, The Sims, Star Wars Galaxies, Second Life, etc. There are many more.

It is thus easy to imagine a future where we can choose to inhabit a synthetic universe.

From my training in acting and Zen, I've found there is a difference between what we think of as acting and what it is to be. One of the main acting guys basically said that [good] acting is accepting imaginary circumstances as real and then simply being human.

In Zen, we practice kong-ans (Japanese: koans). In a simple sense, these are mind puzzles; for example, one of the most common is, "What is the sound of one hand clapping?" These are the lessons taught and learned in Zen. More deeply, they are gates for your self (or soul) to pass through, which must be experienced to be answered. There is no room for hesitation, irrelevant thought, or acting.

Be it for the attainment of life lessons or otherwise, it is easy to imagine a person choosing to embed themselves fully into a simulated universe, leaving memories of the real universe behind. Furthermore, for the secret of the real universe to be kept, this simulated universe must contain only others who have fully embedded themselves (i.e., without memories of the "real" universe).

At this point, assuming our senses are interfaced and their input simulated perfectly, and that the other players in the synthetic universe are likewise simulated, the synthetic universe is indistinguishable from our own real universe. Also, assuming the rest of our bodies can be simulated (or left behind), there isn't much cause for keeping them around in the real universe.

Astonishingly, if civilization progresses and is able to support a synthetic universe as I've described above, we are likely simulations in a game now.
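The jump from "civilization can run such simulations" to "we are likely simulations now" is the counting step of Bostrom's argument: if simulated minds come to vastly outnumber unsimulated ones, a mind with no way to tell which kind it is should assign high credence to being simulated. A toy sketch of that step follows; all the numbers are made-up illustrations, not estimates:

```python
# Toy version of the counting step in Bostrom's simulation argument.

def fraction_simulated(real_minds: float, sims_run: float,
                       minds_per_sim: float) -> float:
    """Fraction of all minds that are simulated, assuming simulated
    minds are subjectively indistinguishable from real ones."""
    simulated = sims_run * minds_per_sim
    return simulated / (simulated + real_minds)

# If 1,000 simulations each host as many minds as "real" history did,
# almost every mind that ever experiences anything is a simulated one:
f = fraction_simulated(real_minds=1e11, sims_run=1e3, minds_per_sim=1e11)
print(f"credence of being simulated: {f:.4f}")  # ~0.9990
```

The point of the sketch is only that the ratio is dominated by the product `sims_run * minds_per_sim`: once simulations are cheap to run, the fraction crowds toward 1 almost regardless of the other numbers.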

Ok, so now we have "The Matrix": big deal.

Getting back to my son: he had no response to this. It didn't seem to upset him, but I think I succeeded in giving him his first mind fu…, er, twist. He was thoughtful about it but didn't have much to say. I may have added to his family's current and future therapy bills, but hopefully it's for the good.

What my mind's been playing with, and what I haven't shared with my son (for good reason!), is how religion makes a hell of a lot more sense given that we are living in a simulation. God? He's the fella that created our simulation. Jesus? He used the real universe's Instant Messaging system; yup, he was able to get a direct account from here. He's like the first guy that got a gmail account and started inviting the rest of us into the system. Buddha? Whether you're synthetic or not doesn't really change what you are: apples are sweet.

References:

Papers collected under "The Simulation Argument: Are You Living In a Computer Simulation?" Nick Bostrom, PhD; Philosophy Faculty, Oxford University.

General simulated reality info from wikipedia including quality external links.

The Boston Retinal Implant Project.

Everquest.

Star Wars Galaxies.

The Sims.

Second Life.


Wednesday, October 11, 2006

Rationality and Intelligence

by Martin Hunt - copyright 2006 - All Rights Reserved

This article was included in the 37th issue of the Philosopher's Carnival.

One of the disconcerting things about science is that time and again the world is revealed to be not what we thought. The ancients did their best to account for the world that they saw, but their state of ignorance was such that many explanations (often conflicting) could account for what they knew. As knowledge is gained, the range of acceptable explanations diminishes. As we learn more, whole paradigms are shown to be invalid. This is an unpleasant outcome for the people committed to the invalidated paradigms: they are faced with the necessity of abandoning one world view and adopting another, which is much more difficult than admitting error. When the world view is old, backed by tradition and community, the transition to the new view is very difficult.

"The Robot's Rebellion" by Keith Stanovich is a book that proposes a very interesting and satisfying answer to the ancient puzzle: "How can a person (or anything, for that matter) be free?" A key concept in The Robot's Rebellion is an idea that comes from Richard Dawkins' book "The Selfish Gene". Dawkins' theme is that in biology the thing that replicates from generation to generation is genetic structure. The expressions of the genetic structure - plants and animals - are not replicators. They are vehicles that carry the real replicators into the future.

Dawkins' idea is one of these disconcerting changes in perspective that I mentioned at the start of this essay. Previously it was assumed that the purpose of genes was to enable creatures to replicate. In the new view creatures exist to enable genes to replicate.

An important idea that came to us from the Greeks is that things have immaterial essences that determine their nature. Plato developed ideas about 'ideal form' - an immaterial template to which all actual examples conform. He figured that there was an 'ideal horse'; a perfect, but immaterial horse; and that all real horses were more or less flawed examples of that ideal horse. For Plato this wasn't just a verbal shorthand. He argued that the ideal horse was more real than the actual horses. Related to the idea of essence is the idea of spirit. Spirits were seen as the things that animate matter. Matter without spirit was just stuff - inanimate lumps. Matter with spirit was active - it moved around, did things, had intentions and desires.

At the time that these sorts of ideas were invented they represented an advance in understanding. It was an idea that was in accord with experience that enabled more and more experience to be understood - and even a flawed understanding is better than no understanding.

I think that ideas like 'soul' and 'mind' are associated with these Platonic understandings. A mind, or a soul, is seen as an immaterial entity that exists inside us and that controls or 'drives' the body. Basically, in the ancient conception, a mind drives a body the way that a person drives a car.

The problem with these ancient concepts is that they just don't accord to present knowledge. Hundreds of years of looking have never revealed anything like a soul or a mind. Moreover, contemporary science has found that there is no way for an immaterial mind to interact with a body - it violates fundamental principles like the conservation of matter and energy.

This is the context of "Robot's Rebellion". Stanovich is not concerned with interpreting the world in terms of the ancient concepts. The closest he comes to that is when, at the start of the book, he discusses how it is religious people who most vividly feel the incompatibility between the ancient and the new ways of thinking about what we are. What Stanovich is concerned with is laying out a more adequate structure for understanding.

He accepts the metaphor of the body as a vehicle - but he throws away the driver. Instead of having a driver, the vehicle is a robot charged with the task of figuring out, on its own, how to get its passengers to their destination. The passengers are of course the replicators - the genes, and whatever other replicators may be going along for the ride.

There are robots and robots. Some robots are directly programmed so that each stimulus has a preordained response. Such robots work pretty well in fixed environments like car assembly lines, but don't work very well in an unstructured environment.

Say you were taking an interstellar journey that would last several lifetimes, during which you would be placed in some sort of suspended animation until you arrived at your goal. Would you trust your fate to a hardwired robot of the assembly-line type? Probably not - though it might work. What I would want is a robot that can figure out on its own how to satisfy my requirements. This is the sort of robot that people are - we are robots designed by evolution to figure out on our own how to satisfy the requirements of our passengers - genes. The genes have a bit of a problem - how to keep the robot loyal? Basically, Stanovich's idea is that people are robots that can be disloyal to their masters - Rebellious Robots.

How could disloyalty come about? How can anything like a vehicle gain autonomy? Autonomy is different from freedom. Freedom means causeless effects. Putting aside the question of whether causeless effects are even possible I ask instead - would you want your actions to be uncaused? Would you trust yourself to walk down a street knowing that you might, as a bus approached, leap in front of it - for no reason at all? That's the sort of thing that freedom implies.

Evolution is a process that causes the best replicators to populate the future. Could genes alone have produced autonomous vehicles? It is hard to see how. Even if genes had foresight, and wanted their vehicles to be autonomous - it's hard to imagine how they would do it. And, it must be acknowledged, genes have no foresight.

This is a problem that is quite general. For instance, any king with an ambassador faces the problem of the rebellious robot. Kings have been able to deal with this problem with various degrees of success. Kings tend to be very smart. How can a gene, which isn't smart at all, cope with this problem? Genes have a strategy not based on mind - they either replicate or they don't - genes don't care one way or the other. But with variation, genes can explore a huge possibility space, and in time can stumble upon all sorts of unlikely solutions.

How is it that a robot can rebel? This is possible if the capabilities built into it enable developments that the builders could not foresee. It was Dawkins, again in 'The Selfish Gene', who suggested a way that this could happen. He proposed that genes aren't the only possible replicators. The new replicator that he described depends on the capacity for imitation. Imitation is very similar to replication. With genetics, genes are replicated. With imitation, behaviour is replicated. Dawkins wondered whether there were circumstances where imitation would support the evolutionary algorithm - as genetic replication does. It turns out that imitation would indeed support an evolutionary algorithm.
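The claim that imitation can support an evolutionary algorithm is easy to demonstrate concretely. Below is a minimal toy sketch, not drawn from Dawkins or Blackmore - all names and numbers are invented for illustration. Behaviours are strings, agents preferentially imitate the highest-payoff behaviours, and imperfect copying supplies the variation:

```python
import random

random.seed(0)

TARGET = "forage-by-the-river"  # hypothetical behaviour this environment rewards

def payoff(behaviour):
    # Fitness = how closely a behaviour matches what the environment rewards.
    return sum(a == b for a, b in zip(behaviour, TARGET))

ALPHABET = "abcdefghijklmnopqrstuvwxyz-"

def imitate(behaviour, error_rate=0.02):
    # Imitation copies a behaviour, but imperfectly: copying errors are the
    # variation that the evolutionary algorithm needs.
    return "".join(c if random.random() > error_rate else random.choice(ALPHABET)
                   for c in behaviour)

# Start with a population of random behaviours.
population = ["".join(random.choice(ALPHABET) for _ in range(len(TARGET)))
              for _ in range(200)]

for generation in range(300):
    # Selection: agents preferentially imitate the highest-payoff behaviours.
    population.sort(key=payoff, reverse=True)
    models = population[:20]
    population = [imitate(random.choice(models)) for _ in range(200)]

best = max(population, key=payoff)
print(best, payoff(best))
```

With replication (imitation), variation (copying error), and selection (preferential imitation of successful models), the population converges toward the behaviour the environment rewards - exactly the three ingredients the evolutionary algorithm needs, with no genes involved.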

Dawkins proposed this idea, and Susan Blackmore and others have elaborated on it. The replicating ideas are called memes, and the study of the implications of memes is called memetics. The presence of memes in a brain means that a body is host to two independent sets of replicators - genes and memes. Genes evolve to produce creatures. Memes evolve to produce minds, and language, and culture. A creature with a mind is responding to two necessities - genetic necessities and memetic necessities.

A mind like ours needs to be both intelligent and rational. Rationality allows the construction and manipulation of fairly abstract mental structures. Intelligence determines the scope and effectiveness of mental structures and also the speed of their creation.

In an autonomous robot there are many systems that behave automatically, beyond the robot's direct control. These systems are very useful, and essential for autonomy - but they aren't themselves autonomous. Low-level systems like perception are among this collection of automatic systems. So are mid-level systems that generate our thoughts and utterances. And higher-level automatic systems might create capacities like intelligence and creativity. At the very top of this hierarchy is consciousness.

It is easy to be very mysterious about consciousness, but this doesn't get us anywhere. Let us accept a non-mysterious concept of consciousness and work from there. Consciousness is an ability that rests on a lower ability - the ability to generate a narrative. A narrative is a description of a sequence of events. A particular kind of narrative presents the events as a causal chain - these are explanations. Consciousness, I suggest, is a particular kind of explanatory narrative - it's a narrative that tells the creature what is going on around it. An extension of consciousness is self-awareness - the creature is aware of its own role in the narrative that it is both generating and listening to.

Memes are of crucial importance for narratives. All but the simplest narratives depend on words, and words depend on memes. A simple narrative that doesn't need words might be seen when a creature does something with unpleasant consequences. The non-verbal narrative might be expressed in words as "ooo! - bad outcome! - avoid this kind of situation!" A higher-level narrative arises when a creature produces the narrative upon observing another creature. Higher still is when a creature can imagine itself in a situation and work out what would happen.

Now all of this narrative creation can be completely automatic. But it enables a surprising and new thing. It allows competing narratives to be created, and behaviour can be caused by the interpretation and evaluation of those competing narratives. When behaviour is determined by evaluation of circumstances - then behaviour is autonomous. I think that this is the source of autonomy in the world.
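The mechanism just described - automatic narrative generators feeding a chooser that evaluates their endings - can be sketched in a few lines. This is purely a toy illustration; the actions, state variables, and scores are invented, and the bus echoes the earlier point about what raw "freedom" would mean:

```python
def narrative(action, world):
    # An automatic system that spins out a causal story: "if I do X, then Y".
    predicted = dict(world)
    if action == "chase_stick":
        predicted["tired"] = world["tired"] + 2
        predicted["satisfaction"] = world["satisfaction"] + 3
    elif action == "rest":
        predicted["tired"] = max(0, world["tired"] - 3)
    elif action == "leap_at_bus":
        predicted["alive"] = False  # a "free" (uncaused) act no evaluator would pick
    return predicted

def evaluate(predicted):
    # Evaluation of a narrative's ending; this step is where autonomy lives.
    if not predicted.get("alive", True):
        return float("-inf")
    return predicted["satisfaction"] - predicted["tired"]

world = {"alive": True, "tired": 4, "satisfaction": 0}
actions = ["chase_stick", "rest", "leap_at_bus"]

# Behaviour = the action whose competing narrative evaluates best.
choice = max(actions, key=lambda a: evaluate(narrative(a, world)))
print(choice)
```

Each narrative generator runs automatically, yet the behaviour that results is determined by comparing the competing narratives - which is the sense in which the behaviour is autonomous rather than either preordained or uncaused.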

Let us note the source of the autonomy. It is not something created by genes. Nor is it something created by memes. It is something that is created by an environment where genes and memes are co-creating a creature. The capacities that genes and memes build into their creatures surprisingly merge in a way that enables the creature to transcend their creators.

That we can transcend the interests of our creators and pursue goals of our own is the surprising outcome of the co-evolution of genes and memes. All of our mental capacities are important in this. BUT - (big but) - rationality is key. The reason for this is that it is rationality that produces reliable descriptions of the way that reality may evolve from the present - either into the future or into the past. There are other, irrational ways of producing such descriptions - but those descriptions are not found to be reliable.

We thus come to a crucial distinction between intelligence and rationality. Intelligence makes us better or worse at attaining our goals - whatever those goals might be. Intelligence, per se, doesn't provide guidance about what the goals should be. Rationality does, potentially, have the ability to evaluate goals. Rationality allows us to step back and ask "Do I want to do this?"

For a hundred years our culture has been obsessive about the value of intelligence and has been more or less disrespectful towards rationality. We have very smart people working towards bad goals. This is not a desired outcome. Better, I suggest, to focus our attention on rationality.

This blog is called Reason and Rhyme. Let us cherish both. Reason is rationality - the capacity we have for making sense of the world. Rhyme is for our other essential capacities - intelligence, creativity, sensitivity - that make living fun.

Both are pretty important.


Friday, October 06, 2006

Skepticism with Regard to an Astrophysical Trend

Fred Vaughan by Fred Vaughan

"If our friendship depends on space and time, then when we finally overcome space and time, we've destroyed our own brotherhood." - Jonathan Livingston Seagull1

Once again a friend has driven me to abstraction - a recurring situation for which I am repeatedly in his debt. (I guess that is a major criterion for entry into my inner circle of friends.) His articles place philosophical issues in a context that I find not only titillating, but damned unnerving at times as well! My search for irreversibility in microscopic interactions that have always been considered completely reversible by virtually everyone was spurred on as an objection to his counterclaims. I am very grateful for the opportunity to have sought (and I flatter myself in believing that I actually found!) the source of irreversibility in the usually negligible Doppler-shifted energy losses of the photon exchanges by which collisions between molecular constituents of substances are effected. The many hundreds of enjoyable hours of investigation into irreversibility were in direct response to the stimulating discussion of "the astrophysical trend."2 I must say that I truly believed I had handled the "hardest" problem he had posed in his article. Thus, I considered the total scope of my efforts to have been an adequate disputation (if not total refutation) of his notion of the inevitability of dire long-term "trends" in our universe's behavior as a whole. These apparent trends included, of course, that the universe must necessarily be winding down, and that photons generated in the heat of interaction while our universe is still interesting are being more or less "sucked out" into a vast chill in which he evidently conceives the universe to be immersed. If he and most modern cosmologists are correct, the universe will, as Robert Frost opined as an alternative, "end in ice" - only colder, with individual atoms continuing to separate themselves endlessly. Although there is still a heated contingent that "favors fire!", I argue instead for a "cosmocentric" equilibrium that avoids both of these drastic extremes.3

It is fashionable in physics nowadays to consider the Big Bang as the origin of everything otherwise unaccountable in physics, and in this regard the Astrophysical Trend was perhaps a trendsetter. Preoccupation with deducing from a presumed origin of the universe what formerly would have been determined by more inductive means only after extensive experimentation and observation is a bit presumptuous if you ask me. This backwards perspective (what I consider to be looking down the wrong end of telescopes) has given rise to extreme extravagance in physics: searches for heavy particles whose failures only lead proponents to insist that the particles do in fact exist but must involve higher energies, and that if this superstring, brane, or mini black hole cannot be detected by current instruments then it must be because it is even larger than we suspected - excuses ad infinitum. This disrespect for accurate predictions and the results of experimentation is rampant in physics today. But my friend went even further, to suggest an accomplice to the Big Bang: "The Big Bang would thus provide the 'push' while the cold nonreflecting space would serve as the 'pull,' for expansion-contraction processes..."4

There are other places where this allusion to a "pull" of cold outer space is employed. This fiction results from a theoretical model refuted by the facts of a hot plasma known to facilitate our view of the universe out to 10 or 15 billion light years...so far. Intergalactic space is not cold, or the low levels of dispersed hydrogen and helium would have absorbed the light by which we witness the cosmos. Conservative estimates of its temperature are between 10^4 and 10^6 K, but to effect such complete stripping of electrons it may well have to be 10^8 K. In fact it is as hot as the interiors of stars we observe - just much less dense. Virtually all electrons must indeed be "stripped" to accommodate the transparency of our view, notwithstanding islands of Lyman α forests where protogalaxies form! Certainly this data was not available in the sixties when this "astrophysical trend" was introduced.5 To be fair, the current view is that there was a time after ambient temperatures from the Big Bang cooled to about 10^4 K (very similar to Luger's indication of the temperature for electron capture), and after stars formed, when the intergalactic medium was reheated by the resulting radiation. I would argue this, but it is unnecessary in this context. The lesser argument stands - where is this cold sink for radiation? But it is not clear in what sense Luger sees cold outer space as contributing to any local physical process.

There is also, as he points out, the need for links to explain why entropic phenomena apply at the local level on a time-scale for which any evolution of the universe is irrelevant. This is the problem, not just an irrelevant corollary of the problem! The other way around is physics on its head.

Recently I found that my friend had not been convinced in the slightest by my arguments nor apparently had my efforts to establish a sound basis for irreversibility given him pause to even reconsider his position or attempt refutation of my hypothesis. In a recent e-mail he opined that:6 "The evidence for arrows is so overwhelming that I don't know where to begin, and there's little point in boring you with elaborate lists. A simple intuitive example that springs to mind is the surface temperatures of the Sun, which range from an inner one over a million degrees, whereas the outermost 'layer' is maybe a few thousand degrees."

The facts associated with there being vast variations in temperature and density throughout the multifarious domains of our universe were of course not news to me. It was only when he then succinctly asked, "How could such a steep gradient be possible without cold outer space?" that I realized what was at issue between us - the scope of the philosophical dilemma with which we wrestled. (The age-old problems of philosophy will never go away; conjectures that attempt to solve them will only cause these truly meaningful problems to be reformulated with successively more relevance accruing as time goes by, but forever nagging at our heels nonetheless.) With a renewed understanding of the nature of the gulf between us, i. e., the consequences of irreversibility originating at the bottom or the top, I decided to begin again with renewed vigor to attack the horns of the Parmenidean dilemma - Heraclitus's river that is always the same and always different. So I now proceed with my current understanding of how irreversible changes in variations of characteristic aspects may persist even in a continuously stable universe that never collapses, does not expand indefinitely, nor run down as a grandfather clock in need of some grandfatherly figure to rewind it.

Certainly to accurately assess whether a trend exists one must sample suspected behavior over time and space with samples that can be justified as representative of the phenomena to which the trend is presumed to apply. To this end one must have a valid model of the behavior of the system being sampled. This is exemplified by the shortcomings in the perspectives of two blind men who argue the nature of elephants from their own happenstance-tactile-limited experiences with a hind leg, trunk, or tusk. So in a real sense one should have a working global knowledge of what is being sampled before averring too sanctimoniously to have comprehended its inherent nature, let alone its "trend" into the undefinable future. Of course when the system under test is the entire universe one can run into unique modeling problems. Hawking, in defining what I have referred to elsewhere as a "Hawking sphere" and then claiming that it would appropriately represent the gravitational characteristics of an entire "infinite" Newtonian universe, erred by misrepresenting such a universe as having an inside and an outside, which is patently absurd. The terms "universe" and "infinite" by definition preclude the void "exterior" from which collapse derives in Hawking's derivation. This error precipitates many erroneous notions, including universes beyond the realm of our universe as though it were a mere galaxy, and other conceptions suggesting the hoarse shallow depth in the rattle of a dying man's last breath. Einstein had, of course, as Hawking knew, proceeded from just such assumptions:7


"As I have shown in the previous paper, the general theory of relativity requires that the universe be spatially finite. But this view of the universe necessitated an extension of equations with the introduction of a new universal constant λ, standing in a fixed relation to the total mass of the universe (or, respectively, to the equilibrium density of matter). This is gravely detrimental to the formal beauty of the theory."


I personally think it ludicrous to presume that one's methodologies and theoretical model might appropriately dictate requirements on the actual universe that one is attempting to model, as suggested in Einstein's remark. This is a much graver error than what Einstein considered to have been his "greatest error" in the above quotation. We must limit our models to valid descriptions of actual phenomena from which to extract explanations, and accept them only to the extent that they are valid descriptions, if we would have the entire universe acquiesce to such pronouncements. One easily falls prey to gibberish otherwise.

In Heraclitus's analogy of the river one must model much more than the solid banks of a river and the fluid that flows between if one is to resolve the paradox of identity in flux. It takes more than the addition of mountains, foothills, and valleys through which tributaries flow into the river, and more than models of the occurrence of seasonal rain and snow, if one is not to eventually have it run dry or fill the seas to overflowing. One must complete the loop in any valid model if one is ever to have a chance to understand an equilibrium situation. Without completing such logical loops, equilibrium will always be seen as an impossibility. I see it as no different with the astrophysical trend to which Dr. Luger defers. Certainly gradients and change are essential to our nontrivial world, but that does not preclude cosmological stability. Certainly there are gradients of temperature in the universe just as there are gradients associated with the flow of rivers, but that does not in itself suggest either that rivers will all one day run dry or that the oceans will overflow in a material manifestation of Olbers' paradox. There is more subtlety in heaven and earth than that! Olbers had not the slightest conception of the magnitudes involved in either the separations in space or the lifetimes of stars, or he would not have conjectured as he did, and others would not have wasted so much time on this supposed paradox.8

It is no surprise that open-loop models run dry. In resolving irreversibility at the microscopic level it is necessary to extend Einstein's blackbody radiation model to close that loop. This model then had to be extended to incorporate the complementary mechanical aspects of the system as well, to show that although any and every individual process dissipates energy, that energy goes somewhere, and energy from elsewhere can keep the system going - yes, even indefinitely.9 So it is definitely conceivable for there to be gradients of change in perpetuated systems, but it cannot, as Dr. Luger would quickly counter, be assumed as a foregone conclusion! This is particularly so when identified processes such as a "big bang" and "black holes" are claimed with some credibility to be, respectively, an exhausted one-time resource and irreversible sinks of the energetics that drive the whole system. But the presumed characteristics of these two processes as currently modeled are too obviously contradictory to allow such presumptions to limit discussion. According to virtually any version of the standard model of cosmology, we are either just within, or have just recently escaped from, the Schwarzschild radius of the "black hole" of our own universe. So either black holes are not singularities into which matter is sucked endlessly to a mathematical point, never to escape as established theory predicts, or the "big bang" never happened. Take your pick. A healthy "conceptual skepticism in irreversible energetics" would not allow one to rashly embrace both such conflicting models after a hearty breakfast.
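The Schwarzschild-radius remark can be checked on the back of an envelope. Taking a round-number mass for the observable universe of about 10^53 kg - an assumption for illustration, not a measured value - the Schwarzschild radius r_s = 2GM/c^2 comes out at the same order as cosmological scales, which is what gives the remark its bite:

```python
# Back-of-envelope check: Schwarzschild radius of a mass comparable to the
# observable universe. M is a rough illustrative figure, not a measurement.
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8            # speed of light, m/s
M = 1e53               # rough mass of the observable universe, kg (assumption)
LIGHT_YEAR = 9.461e15  # metres per light year

r_s = 2 * G * M / c**2          # Schwarzschild radius in metres
print(r_s / LIGHT_YEAR / 1e9, "billion light years")
```

The result is on the order of fifteen billion light years - comparable to the Hubble radius of roughly fourteen billion light years, so the universe-as-black-hole tension the author points to is not an artifact of sloppy numbers.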

One must incorporate a valid model of every known process, consistent with the assumptions of all others within the system, before one has a valid model of the system itself from which to declare "alpha and omega!" Glibly imposing requirements on a universe that happens quite defiantly to exist without regard for our conjectures is totally absurd! This is particularly the case if we are to avoid naïve presumptions of limited open-loop models of the entire universe! We can reach no valid conclusions without completeness, and this does not bode well for the increasingly popular "theories of everything" (TOEs) that have been hawked by Hawking and his lessers recently. With outstanding questions of such magnitude concerning the nature of the primary processes of our universe, we must emphasize observation. The overwhelming scope of our ignorance should certainly humble theorists. It seems to me, however, rather to have emboldened those who should know better to greater and greater levels of pugnacity. So I will not attempt to stick out my own big TOE to be stepped on here. I will rather content myself with labeling as presumption suggestions that currently observed "trends" imply that the metaphorical river is ineluctably running dry.

1 Richard Bach, Jonathan Livingston Seagull, Avon, New York (1970), p. 87.

2 Frank Luger, "Conceptual Skepticism in Irreversible Energetics," Gift of Fire, #119, October 2000, pp. 10-24.

3 This comment is, of course, an ironic reference to Dr. Luger's subsequent article that addressed the same trend, i. e., "Anthropocentrism vs. Cosmocentrism - groping toward a paradigm shift," Gift of Fire, #120, November 2000. Pp. 24-33.

4 Op. cit. Frank Luger, p. 22.

5 [See for example, J. V. Narlikar, Proc. R. Soc. London A270, 553 (1962).]

6 Private e-mail communication from Frank Luger to Russell F. Vaughan dated Sun, 17 Nov 2002.

7 A. Einstein, "Do gravitational fields play an essential part in the structure of the elementary particles of matter," republished in The Principle of Relativity - a collection of original papers on the special and general theory of relativity, Dover, New York (1952), p. 193.

8 We have that 10^10 years is a reasonably long lifetime for a star, and 10^23 light years is the average distance of a line of sight to encounter a stellar object, given the densities of stars encountered in our universe. Thus if we define the "sky cover ratio" X as the average night-time intensity along a line of sight divided by what it would be if directed directly at the sun, we obtain the likelihood of a line of sight encountering a bright star as 10^10/10^23 = 10^-13. This is only an estimate, good to within a factor of a few thousand, so that 10^-16 < X < 10^-10. In contrast, at midday the sun provides a total sky cover ratio of about 3×10^-4, so the night sky is predicted to be about a billion times darker than midday even if the universe were infinite. And one would not require sunglasses to enjoy the splendors of the night sky! We are just situated in a "hot spot" in the universe.
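The footnote's arithmetic can be reproduced directly. All figures here are the footnote's own order-of-magnitude estimates, good (as it says) only to within a factor of a few thousand:

```python
# Order-of-magnitude Olbers estimate, using the footnote's figures.
star_lifetime_years = 1e10   # typical stellar lifetime
mean_free_path_ly = 1e23     # average line-of-sight distance to hit a star

# Chance that a random line of sight ends on a star that is still shining:
# the night-time "sky cover ratio" X.
X_night = star_lifetime_years / mean_free_path_ly

X_day = 3e-4                 # sky cover ratio from the midday sun

print(X_night)               # of order 10^-13
print(X_day / X_night)       # night darker than midday by a factor of order 10^9
```

The night/day factor of a few billion is why finite stellar lifetimes and vast stellar separations dissolve the supposed paradox without any appeal to a "cold sink" for radiation.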

9 If we reverse all velocities involved in an interaction and try it again, the situation does not reverse. We've lost energy in the form of escaping radiation and in doing it again [in reverse], we'll lose some more. This escaping radiation may be absorbed within the boundaries of the defined 'system,' but unlike atomic matter that can be confined, it may escape into or beyond boundaries of any jar or laboratory. Unless the amount of radiation from outside the boundary makes up the deficit there will be a deficit.
