Monday, December 21, 2009

One of Us

Right after Jesus was born He was wrapped in a crude little piece of cloth and placed in a feeding trough. That is, He was wrapped in swaddling clothes and laid in a manger. Wise men at some time after His birth - maybe many weeks after - brought Him the ointments of kings. But at the time He came into the world, Jesus lacked any kind of regal reception. He came into the world surrounded by the sights, sounds and smells of farm animals.

For over two millennia we have been trying to understand what it means that the Creator of the world became like one of us. Christian orthodoxy accepts that He was born like a human baby, that He grew up like we do, and that He experienced the joys and the pains of mortality - only without committing sin. Nonetheless, according to this same orthodoxy, God was not transformed into a man, nor, for that matter, was man changed into a god. It all seems a bit confusing. He is like us and yet He isn’t. Why did He come to earth this way in the first place? It leaves one wondering if we haven’t left something important out of the “official version.”

Fortunately at Christmastime we celebrate the birth of Christ as related in the gospels and not as indicated in the official creeds. If we stop and reflect on these remarkable verses, we can still somehow hear the bleating of sheep and the lowing of cattle. We can imagine the farmyard steam rising around the baby Jesus in His morning manger, and breathe in the rural pungency of a warming spring day in Bethlehem.

Perhaps Baby Jesus had a lock of curly black hair, a freckle on His chin, maybe even playful greenish-brown eyes. He probably puckered His nose at a bothersome fly. And if you tickled His feet with a piece of straw, He very likely curled His chubby little toes. It seems to me that if these stories were written for any reason at all, they were written to make one thing perfectly clear: Jesus is one of us.

This turns out to be a troubling thought if you don’t have a very good perception of human beings. In fact it almost verges on blasphemy. How could a divine being - even living without sin - become a mortal being and still be God? Even worse is the conclusion atheists draw from a very mortal Jesus: “of course he was mortal, what else did you expect?”

The question that needs to be asked is: where did we come up with this poor perception of human beings to begin with? For “Ye are Gods; and all of you are children of the most High” (Psalm 82:6). If we start with the understanding that we are created in the image of God and that He insists on calling us His children (for the very obvious reason that He is our Father), why should we wonder that our older brother Jesus Christ was born a human being just like us?

We have trained our minds for too long imagining that we can never understand the condescension of God (or call it the Incarnation if you like). As a result, we have lost sight of our own potential. We have lost sight of who we really are.

The little baby wrapped in swaddling clothes is our brother, and He has made it possible to return home. If we haven’t figured this out yet, it’s time to think again about the true meaning of Christmas.

Saturday, December 12, 2009

Hope Abides

Some time ago in a class at church I listened as a comment was made about the principle of hope. “It’s an important principle,” admitted one of the class members, “but it’s kind of a weak one compared to some of the other principles of the gospel.”

This seemed rather sad to me, although I understood how someone might think so. Hope to many people is not much more than wishful thinking. In fact the dictionary considers hope to be just an optimistic desire about the future. And since the future is so ill-defined, how can hope be anything more substantial than a birthday wish made over candles on a cake?

It then occurred to me that hope has an additional meaning in Spanish. The word esperanza comes from esperar, which can mean “to wait.” Is it possible, I thought, that waiting was originally an important part of the meaning of hope - a part that has been lost to all except those speaking Spanish?

I looked in new dictionaries without much luck. Then I went to older dictionaries and still could not find any evidence for my hunch. Even the word elpis (in the Greek New Testament) and the Latin spes (in the Vulgate) lacked this sense of waiting. I was about to give up when I decided to check my copy of The Complete Biblical Library (Gilbrant et al.). In the second volume of the Greek-English Dictionary (in the discussion of “elpis”) I found it. Hope at one time had everything to do with waiting.

“There is probably no area in which the contrast between the Greek and the Hebraic concepts of life appear more clearly than in the differences between their conceptions of hope.”

So wrote the editors of this dictionary. In ancient Israel, hope did mean waiting. In fact it sometimes meant longsuffering. Certainly it could involve an optimism about the future but this optimism was tempered with time. It could also be tempered with grief. I was beginning to learn that hope was not necessarily a weak principle at all. To have hope - to endure the trials of life while remaining true to a divine desire - is to prove one’s faith. If faith is central to religion, hope becomes its refining fire.

The Hebrew word that conveys this sense of hope is qāwâh, and an important example of how it is used can be found in Isaiah 40:31:

“But they that wait [hope] upon the Lord shall renew their strength; they shall mount up with wings as eagles; they shall run, and not be weary; and they shall walk, and not faint.”

This word for hope is usually translated in our scriptures as waiting. This is, of course, appropriate. The trouble, though, is that because the word carries more than one meaning, translators had to choose between them, and in the choosing the sense of waiting was often lost.

Having made this discovery, I then looked in the New Testament to see if there was any of this Hebrew sense of hope there. I was pleased to find that there was, but one has to look for both words (hope and wait/endure) to find it.

One remarkable example is the 13th chapter of 1st Corinthians. This is Paul’s famous discussion on charity and not the first place you would think to look for references about hope. But Paul understood charity as a culmination of faith and hope. Specifically, he saw both faith and hope as virtues that endure.

What stands out in his discussion of hope is his creative use of the Greek language and his expansion of the Hebrew concept of hope - with its sense of waiting and longsuffering - to include a Christian optimism in Christ.

There are primarily two words used in the Greek New Testament to convey the Old Testament meaning of hope. One is hypomeneo (the most commonly used), which means waiting or enduring. The other is elpis, which is very close in meaning to our word hope. In his letter to the Corinthians, Paul uses both forms, although if you don’t read it in the Greek you will miss it.

“And now abideth [meneo] faith, hope [elpis], charity, these three; but the greatest of these is charity” (1st Corinthians 13:13)

Paul uses the optimistic form of hope (if we can call it that for convenience) here - the Greek elpis. For him, hope in Christ is something to experience optimistically. But he is also sensitive to the Jewish understanding of waiting and refers to it as well (he uses the word meneo - a less onerous form of hypomeneo). In this verse (from the King James Version) it is translated as “abideth”.

This may all seem a bit technical but the meaning is very simple. It is also significant: hope is much more than just wishful thinking.

This dual sense of hope continued into the early Christian period. The Epistle of Polycarp to the Philippians (written in the 2nd Century) pleads for both longsuffering and patience in hope (see Roberts and Donaldson).

“Let us then continually persevere in our hope, and the earnest of our righteousness, which is Jesus Christ … Let us then be imitators of His patience; and if we suffer for His name’s sake, let us glorify Him.”

Augustine, in his work on faith, hope and charity (the Enchiridion, written in the early 5th Century) does not have much to say about hope, but he does acknowledge Paul’s epistle to the Romans with its understanding of waiting:

“For we are saved by hope: but hope that is seen is not hope: for what a man seeth, why doth he yet hope for? But if we hope for that we see not, then do we with patience wait for it” (Romans 8:24-25).

By the time of Thomas Aquinas, however (mid 13th Century), the Old Testament understanding of hope is difficult to detect. His treatise on faith, hope and charity (in the Summa Theologica) comprises roughly 500 pages, but just over ten percent of them are given to hope, and most of those pages merely contrast hope with fear. The older concept of waiting is acknowledged briefly but ambiguously:

“The expectation which is mentioned in the definition of hope does not imply delay, although longanimity may pertain to hope. But hope implies a reference to the Divine assistance, whether that which we hope for be delayed or not.”

Much of the problem seems to be that there just hasn’t been an adequate word that conveys both the sense of waiting and the sense of wishing that the ancient authors appreciated. Translators have been left on their own to decide the best word to use in any given context, assuming they understood the nuances to begin with. An alternative would be to express both meanings in repetitive or contrasting ways, as Paul did in 1st Corinthians. Fortunately we have a handful of places where this method does occur. The best examples happen to be in the Book of Mormon.

One of the key chapters on faith in the Book of Mormon is Alma 32. It is also an important, though rarely recognized, chapter on hope. In fact verse 21 (perhaps the most widely quoted verse in the chapter) shows that the two principles are interconnected in an important way.

…“faith is not to have a perfect knowledge of things; therefore if ye have faith ye hope for things which are not seen, which are true” (Alma 32:21).

And what are these things that can’t be seen that we, with faith, will hope for? They are actually quite simple, but Alma first wants to explain something else. He wants to show us that through experimenting on the word and nourishing it we can look forward to the fruits of faith. And it is by looking forward to this fruit, coming from a tree that has become rooted and well cared for, that we can hope for eternal life (Alma 32:41).

It is this journey of faith to the fruit of faith that requires hope. And it isn’t until the last verse of this remarkable chapter that Alma mentions how this is to be done. In fact he doesn’t even use the word hope at all - at least the word for hope that we’re familiar with. Instead he describes it in the words of his scriptural heritage where hope means waiting.

“Then my brethren, ye shall reap the rewards of your faith, and your diligence, and patience, and long-suffering, waiting on the tree to bring forth fruit unto you” (Alma 32:43).

Obviously Alma understood hope to be much more than wishful thinking.

It appears that faith is to be tried. It is to be experimented with - even proven. And it is this trying of faith that informs hope. The prophet Ether, in another important chapter on faith, teaches that if men will humble themselves before the Lord and have faith in Him, He will turn their weaknesses into strengths. And then faith, hope and charity will bring men unto Christ, the fountain of all righteousness (Ether 12:27-28). So it is that faith must endure through trials of its own. As it does so, it waits. It endures. In a word, it hopes. As it is refined, it knows charity and brings us unto Christ.

Finally, at the end of the Book of Mormon, Moroni includes some of the teachings of his father Mormon about the interconnectedness of faith, hope and charity. He then adds his father’s prayer that God the Father will keep him “through the endurance of faith on his name to the end” (Moroni 8:3).

Sadly, our generation, more than any other, is handicapped in its ability to understand hope. Part of the reason is that we find it difficult to wait for, or through, anything. It’s truly revealing how agonized some of us become over a slow traffic light, a slow computer, or a package delayed by a day. Our hope-less condition becomes almost a spectacle in the long, frustrated lines of cars waiting for fast food at a drive-through window. Efficiency is the name of the game, and the innovator who helps us save a few minutes is our hero.

But who understands the Law of the Harvest anymore – the principle of reaping what we sow, the principle of working and waiting before we receive? Our ancestors used to know about waiting by the sheer necessity of living off of the land. They also knew that the planting of seeds was an act of faith that required a period of gestation – a period of patient maturation and care.

This is the way things are meant to be – both for seeds and for faith. The Lord doesn’t normally test our faith by requiring some immediately difficult task. He tests our faith in the prolonging of our trials. Without this period of testing we would never grow to understand the love of God.

This pathway to Christ - the pathway of faith, hope and charity - is quite a different pathway than we sometimes think, and one our generation does not easily understand. Faith, when truly experienced, is certainly much more than blind or merely rational belief. Charity, likewise, as a gift from God, is much more than another word for love or for almsgiving. And hope - that “middle” virtue that so often gets lost between faith and charity - is much more than a weak doctrine of wishful thinking. It is the very path of patience in faith that brings us to Christ. It is the connecting link that makes faith and charity understandable in a world of doctrinal semantics and temporal confusion. It is, most certainly, a crowning virtue. Faith, we are taught, is the evidence of things not seen that are true. And hope, properly understood, is the evidence of this faith. It endures. It abides.

Works Cited

Aquinas, Thomas. Summa Theologica, Part II, Question 17, Article 5.

Gilbrant, Thoralf, et al., eds. 1991. The Complete Biblical Library: Greek-English Dictionary. Springfield, Missouri: World Library Press.

Roberts, A., and J. Donaldson, eds. 1995. Ante-Nicene Fathers. Vol. 1, Chapter VIII.

Saturday, November 21, 2009

Notes on Constantine's Sword - by James Carroll

Carroll begins the book by recounting Pope John Paul II's visit to Auschwitz and the wooden cross erected there in memory of Christian martyrs. His concern is that the death camp, which has come to stand for "the abyss in which meaning itself died" for the Jews, has become "the sanctuary of someone else's recovered piety" (page 5). This is a poignant symbol of the flavor of the rest of the book. Carroll is himself a Roman Catholic - a former priest - and his sympathy for the Jews makes the book an important effort of reconciliation between the two great faiths. I learned much from the book, and there is much to be praised in it. My only significant criticism is that the author often seems to compromise his own faith in an overweening attempt to heal old wounds and to admit the errors of the Catholic Church.

There is a memorial at Yad Vashem in Jerusalem, the first memorial of the Holocaust. The legend reads, "Forgetfulness is the way to exile. Remembrance is the way to redemption." (See page 5.) This is powerfully true, especially as it refers to the God of Israel. Whether it was intended as such or not, it also implies the correct means of the reconciliation that Carroll seeks. If the cross at Auschwitz is a symbol of how the Catholic Church has obviated reconciliation in the past, the memorial at Yad Vashem might serve as a truly effective symbol of reconciliation. Carroll does not mention (nor would the Jews probably admit) that it is Christ who seeks to gather Israel together "as a hen gathereth her chickens under her wings, and ye would not!" (Matthew 23:37.) This is the scriptural affirmation of the Yad Vashem memorial. It is also a Christian affirmation. The symbol is from the New Testament and modern scripture, not from Israel's canon. It is a reminder that Jesus' message has always been more redemptive of Israel than that of any, or all, of Israel's prophets. (It is primarily Isaiah who can be seen as a voice of redemptive optimism among them.) Israel's exile is a symbol of our universal mortal exile from God's presence. But like any exile, it is helpful only to the extent that there is remembrance - remembrance of God, not just of our many mortal sufferings.

But this is largely the problem. Many Jewish voices (especially Elie Wiesel and others) insist that the Holocaust (or the Shoah to the Jews) is without meaning. The murder of a million children requires this. To this accusation I can answer only from my own experience. I have never lost a child to such a horrible end. Nor do I impose my answer on others that have. But I do know that I have never been forsaken in my own, not insignificant, sufferings.

But Carroll is right that the cross at Auschwitz is wrong. It is the symbol by which Christians have indicted the Jews for centuries as Christ killers (see page 7). Even if this was not the intent of the Catholic Church in placing the cross there, the least suggestion that it might be so interpreted should have kept it away.

Carroll also suggests that the unspeakable events at Auschwitz, if viewed too closely alongside the cross, might take on their own expiatory significance - types of atonement themselves for having killed Jesus. This also brings up the problem of the very word "holocaust," which in Greek means a burnt offering. Accepting the word seems to imply to some that the murders were justly meted out to the Jews for having killed Christ. Of course this interpretation is egregiously offensive, and Jews have turned from it in many cases by shunning the very word. They prefer "Shoah," a Hebrew word meaning catastrophe. In its biblical sense, shoah means an absence of God's presence. It is the opposite of ruach, the breath of God. Ruach in Genesis is how God drew order out of chaos; shoah is the undoing of this ordering (page 11). Of course I sympathize with this. I am pained at one of the implications, though: it implies that God has abandoned part of Israel - at least in the eyes of the Jews.

This Jewish/Christian conflict of misunderstanding, as represented by the cross at Auschwitz, seems almost incapable of resolution. The centuries of hatred and accusations that Carroll's book narrates are examples of this. How can it be otherwise? The Jesus of Palestine was a Jew. The Christian Jesus after Nicaea is a philosophical construct. They are clearly not the same being. I believe that the Jews, when they come to truly understand the nature of the divine son of a carpenter's wife from Bethlehem, will begin to feel again the breath of God. He will be, after all, one of them. Most certainly this will not be experienced before a cross at Auschwitz, nor will it be experienced before a papal tiara. When it happens, it will be in God's way. I look forward to that day.

But Carroll goes too far, I think, in his focus on the cross of Auschwitz. His comparison of the cross to the crosshairs of a spotting scope (page 20) is offensive to me, and I'm not even a Catholic. Another reason for the perpetual misunderstanding through the millennia has been the Christian perspective of the Jews, which has been largely Biblical (page 19). The recognition of Jews as a legitimate contemporary culture, on par with any other, seems to be lacking. In a way this is inevitable, for Jews have maintained their identity through the centuries by remembering their roots. This certainly is impressive and may turn out to be a virtue if viewed strictly from a Jewish standpoint. For Christians, it would be better in many ways to recognize the legitimacy of Israel as a state, unencumbered by the past. This is needed because the Jews have been seen as a primitive faith superseded by the higher Christianity - or perhaps better, as a people that at one time were a favored family among Israel but have forsaken the truth and rejected the higher and purer Christianity. This perspective clearly fails to recognize, let alone accept, the Jews as a people in their own right. It also seems to have doomed the Jews to the chronic stigma of being "Christ killers."

It seems to me that this trap may also be the reason for the apparent abuse of evolutionary thinking about religion in our time - even, strangely enough, from Christians. A Christian church that has supplanted a less favored Judaism imagines a religion that must evolve, or supersede, a malingering past. The seeds of the Reformation, and even the Enlightenment, may have been sown in a tradition of restoration or purification, but they have frequently been understood since in evolutionary terms. The restoration of the Church of Jesus Christ, on the other hand, has significant implications for the validity of this mindset. First of all, it is a restoration of the ancient truth and not an evolution of it. Its claim is not that of superseding Judaism, but of restoring even the faith of Israel to its original true form. It is grounded on eternal principles and as such implies, and even testifies of, the reality of eternal truths. It also draws into question the validity of reformation, enlightenment, and evolutionary ideologies that seem to be outgrowths of the old Jews-as-primitive mindset that we have inherited from the Jewish/Christian misunderstanding that Carroll narrates.

A central point of the book is the responsibility for the Holocaust that rests squarely on the Catholic Church. Part of this stems from the fact that Jew hatred was made a holy sentiment. In its more subtle manifestations we see the art, jewels, and even funds of pre-Holocaust Jews showing up in museums and banks, never having been acquired with just compensation. Apparently, even Volkswagen, Krupp, Ford and others benefited significantly from Jewish slave labor.

The church is also implicated by what it could have done to eliminate, or at least lessen, the scope of the Holocaust. Hitler, at one point, eliminated 70,000 people in his euthanasia program. Many thousands more were scheduled to be eliminated but were spared because of the concerted effort of the Catholic Church. The historian Deborah Lipstadt suggests that, "had the Nazi hierarchy encountered unambiguous and sustained revulsion by non-Jewish Germans at their antisemitic policies, there would have been no Final Solution." (Page 30.) Similarly, Cynthia Ozick asks, "How is it, that indifference, which on its own does no apparent or immediate positive harm, ends by washing itself in the very horrors it means to have nothing to do with? Hoping to confer no hurt, indifference finally grows lethal; why is that?"

This is indeed a troubling question. No doubt much of the answer lies in the truism that the sentiments of self-preservation are usually stronger than the sentiments of moral justice. We are, after all, mortal, and as such are constrained by our physical natures to avoid risks. Nonetheless, the awareness of our immortal souls does occasionally shine through, though usually only in a small minority of situations. Acts of life-giving altruism are uncommon, but they do exist, and for many of them scientists have been unable to give adequate naturalistic explanations. These examples show that some extraordinary individuals can live by a higher law than those that constrain the rest of us. Most cases of altruism are explained as efforts to preserve our genes by saving those of our relatives. The fight against euthanasia in Hitler's Germany can be seen as an example of this; the effort it would have taken to save another people, the Jews, cannot. That would have required the truly altruistic cases that science has not adequately explained. People with this kind of awareness are rare - far too few of them could ever be expected to have lived at once in Nazi Germany. Thus indifference can become lethal, because most of us fail to live by other than mortal laws.

Chapter five is a look at the Passion Plays of Germany, where they were particularly common and influential, especially around Good Friday. Carroll points out that these plays, though intended as reminders of Jesus' suffering (His passion), were also very much indictments of the Jews. They were examples of how "the Church defines itself entirely by its enemy" (page 32). In Carroll's youth this seems to have been the case: for many years he remembered only one Jew, other than his friend Peter Seligman, and that was Judas Iscariot. Judas' acts were particularly symbolic of Jews, at least in the eyes of Catholics, because he chose suicide instead of repentance. Perhaps stated more succinctly, Judas was a traitor. Carroll is right in denouncing this.

To me the suicide of Judas should never have been a symbol of Jewish cupidity. His end was indeed tragic, but it is also a testimony. It is evidence of at least a partial understanding of Jesus. After all, how does one go about asking forgiveness (as some have suggested he should) for causing the death of Christ? One might ask forgiveness, though with difficulty, for betraying a friend. But how does one do the same for the Son of God? To make out of Judas an example of venality, betrayal, and then unrepentant suicide only trivializes his understanding of Christ's deity and ultimately of the atonement itself.

Since the Holocaust, the Catholic Church has made significant efforts to make things right with the Jews. In 1962, Pope John XXIII convened the Second Vatican Council, which brought forth the declaration Nostra Aetate (page 38). This declaration essentially shifted the Church's blame for the crucifixion from the Jews to the Romans. This seems to have been a jolt to Carroll at the time. He had been taught the difference between anti-Semitism, which the Church deplored, and anti-Judaism, which the Church taught as an important part of defending the Faith. Nostra Aetate seemed to compromise this apparently important distinction.

John XXIII’s efforts of reconciliation contrast strongly with those of Pope Pius XII, who has often been referred to as Hitler's Pope. Pius XII is most strongly criticized for his silence during the Holocaust. His supporters insisted that he could have done nothing to stop the Final Solution; his critics insist otherwise, contrasting his silence with his stand against communism, where he excommunicated all communists with a stroke of the pen. This is troubling enough, but Carroll shows that even the local Catholic authorities in Germany supported Hitler, either by encouraging submission to authority or in outright support of the volk. Carroll cites Gordon Zahn on this (page 45). It seems to me that both authors are a bit tendentious on this subject, although I don't believe they are completely wrong in their assessment.

Judaism has without doubt been poorly understood by Catholics - and, I might add, by most Mormons. Carroll explains his own early understanding of Judaism as that of an antiquated religion that had been superseded by the "new Israel", which of course was the Catholic Church. This attitude conveniently vindicated the church in all condescending relationships with Jews. It also failed to recognize Judaism as a living faith. Members of The Church of Jesus Christ of Latter-day Saints have also found it convenient to understand Judaism only in a Biblical sense. Carroll uses Abraham Joshua Heschel as an example of recent Jewish thought that has not been adequately considered by Catholics. Heschel was a longtime professor at the Jewish Theological Seminary in New York. His two books, Man Is Not Alone and God in Search of Man, are important examples of the dynamism of actual Jewish thought. The central theme of these books is that of the living God (page 47). Heschel is cited as saying, "The craving for God has never subsided in the Jewish soul."

This need of man by God should be compared to the traditional Catholic understanding of man seeking God, exemplified in Augustine's "My heart is restless, Lord, until it rests in Thee." This Jewish contribution to our understanding of God is powerful, but I think the polarization between God needing man and man needing God is misplaced. I certainly believe that God greatly desires that we depend on Him. What a powerful concept that this need runs both ways. If this polarization is indeed a defining distinction between Catholics and Jews, then the truth of a mutual dependency, recognizing both beliefs, is an important distinction of Latter-day Saints.

I certainly believe that self-sufficiency is an important gospel truth. So is a divine dependency. These two principles should not be mutually exclusive. To need God, to depend upon him, is really just another way of saying that we have faith in Him. This is the trusting part of faith that transcends mere belief. Heschel's understanding of God's need of His children certainly affirms the Biblical jealousy of God for man's religious attention. God does not want us to trust in manmade deities. Nor does he want us to trust alone in mortal technology. In this sense, divine jealousy requires self-sufficiency - a self-sufficiency that ultimately means our emotional and spiritual longings are not mortal.

Carroll then relates the reaction of Rabbi Heschel to the silence of American Bishops about Vietnam. Heschel called it blasphemy. Carroll equated this silence to the silence of German Bishops during the Third Reich. To me this brings up one of the most difficult and delicate issues confronting organized religion: to what extent are advocates of truth justified in compromising their advocacy out of political expediency? Two extremes of this question might be seen in the cases of Pius XII and Wilford Woodruff. Pius XII represented a powerful Catholic Church and was not threatened politically by making strong statements against political regimes such as communism. By remaining silent about the Holocaust, his advocacy of truth must be questioned on moral grounds. Wilford Woodruff, on the other hand, was deeply troubled about the issue of plural marriage, yet was willing to stand behind it indefinitely if required to do so, even though it was politically unwise. He was the leader of a relatively small church and very susceptible to political pressures in America at the end of the 19th Century. He stands historically vindicated in my mind because he refused to be swayed by uncomfortable political realities. It required a revelation before he changed the Church's practice of plural marriage. This revelation is very instructive. It shows that the Lord can withdraw, or temporarily stay, eternal principles or truths if this is politically necessary - especially if it would otherwise mean the destruction of the church. Those who seek truth in history have vindicated Wilford Woodruff. Pius XII does not stand vindicated. Should the American Bishops who were silent about Vietnam be vindicated? I'm not ready to say. I will say, though, that this failure makes me wonder how much we have really learned from the Holocaust.

Carroll argues (on page 54) that when the cross and the crucifixion became central to Christian piety, this focus also indicted the Jews, who were seen as being responsible for the death of Christ. The cross and anti-Semitism developed together, perhaps inevitably. The cross became a symbol of contrast, defining Christianity in opposition to the Jews. This insight is useful. When I was a Mormon missionary in Catholic Spain, the cross represented apostasy to us. When asked why this was so, though, we never really had a good answer. It sometimes implied a corrupt Catholic clergy. We also believed that it focused attention too much on the death instead of the resurrection of Christ.

Since my mission, I have become less critical of crosses and crucifixes. I suppose this is because I have become more sympathetic to the immense devotion that sincere Christians have expressed through these images - devotion that it is not my intent to destroy. But it has become clearer to me that the cross became important in church history as the Catholic Church lost divine direction. If crosses rankle Jews, who see in the symbol of Christ's death the centuries of abuse they have received from Christians, I, as a Mormon, see in it a symbol of the corruption of the gospel of Jesus Christ. I am much happier with the symbol of the fish - an older symbol, one that the early Saints, still sincere followers of Christ's true gospel, recognized.

The issue of supersessionism is addressed on page 58. The word seems to come from the Latin 'supersedere', meaning 'to sit upon'. A footnote in the text lists eight points in which the Catholic Church has claimed to have advanced beyond Judaism as the chosen of God. In fact Catholicism partially defines itself by the faults of Judaism. There seem to be similarities between this contrast of faiths and the comparisons and contrasts of opposing ideologies in science. It seems to me that the Mormon Church, being a restoration, is not plagued with this impulse of criticizing the roots from which it sprang in order to vindicate its own existence - an impulse typical of almost all faiths and ideologies. To be sure, Mormonism, like early Christianity, started as a small group with a message contrasting with the religious milieu into which it was born. Both suffered persecution as a result. The difference between supersession and restoration is more easily seen when these faiths are successful. Christianity, even after Constantine, retained the impulse to criticize Judaism; that is, it continued the criticizing impulse of supersession. Mormonism, on the other hand, after having passed through its (major) period of persecution, retains no impulse to criticize the Christian community even though it continues to be criticized by many. I believe this must be due, in part, to the fact that the restoration does not need to define what it is by contrasting it with what it is not. Prescription as a means of political wisdom is clearly of value in a fallen world. The true gospel of Jesus Christ, on the other hand, is revealed at once and understands itself by this revelatory process, and not by a history of other faiths.

On page 60, Carroll indicates that the cross at Auschwitz was erected, in part, to show that the Jews did not have a monopoly on suffering. In fact Carroll suggests that suffering has been used as a source of identity among Christians and even as a source of imagined superiority. This is, of course, lamentable and involves a gross misunderstanding of suffering. It is true that many righteous people have suffered through no sin of their own. Sometimes these individuals suffer so others don't have to. Sometimes they suffer for any number of other reasons. It is a source of comfort to me that both Jesus Christ and Joseph Smith suffered greatly even though they were clearly instruments in the hands of the Father. Certainly our suffering does not imply unrighteousness. In fact, I believe that suffering must be viewed neutrally in the abstract. Individually it can provide invaluable meaning to our lives and to our relationship with God. But to use it as a claim of pious superiority is to forfeit any sanctifying power it might have. Followers of a faith that has epitomized martyrdom and holy suffering should know better.

Notes on later pages may be forthcoming.

Thursday, October 22, 2009

How Boys Used to Grow Up

It has been just a century since big cities became so important to so many people. The way it happened was that a handful of scientists in Europe discovered a way to produce synthetic fertilizers cheaply. This breakthrough, along with a few other advances in agriculture, enabled farmers, almost overnight, to produce many times as much food as they had before. And since almost everybody in those times was a farmer, it soon became obvious that they could grow a lot more than they needed to survive.

Now agriculture is not always a lot of fun. If there is a chance of doing something that requires less physically demanding work, most people are eager to take it. And, in fact, many people did just that. By the time of World War I, many rural areas began seeing their sons and daughters leave for the city. By mid-century, New York City, then the largest city in the world, had more than ten million inhabitants. By the end of the century, there were several cities with that many people, and cities with over a million were common.

During this same period of urban growth, or shall we say rural abandonment, more and more boys began having problems growing up. A hundred years ago, the problems of juvenile delinquency were almost unheard of. Boys grew up on farms and learned how to work from an early age. There were many manly examples nearby for them to learn from and the transition from boy to man happened naturally. In fact it happened so naturally that nobody stopped to give it much thought. It just happened.

It worked because boys want to become men, and the world of men was easily defined. A boy takes great pride when his voice starts changing and he begins to hear a deeper sound than he is used to. Then, when he starts to grow so fast that his pants and shoes no longer fit from one season to the next, he knows something is going on. When he starts to grow hair where he didn’t have any before, he begins to look at the world in a very different way. When all of this happens, the developing man needs to prove that he is no longer a boy. On a farm this is easy to do.

A man is strong enough to wield a heavy scythe. This means he can cut more grain than someone less strong. He can manage a team of horses or oxen with more confidence. He can heft a shovel and axe more deftly, accomplishing more than he could as a boy. A man is strong enough to build a barn or even a house. Before the advent of sawmills, it required the strength of a man to cut down trees, move them to a building site, and position them into sturdy structures. When a teenage boy was trusted to do this kind of work, he understood that he was becoming a man. After all, his body, including his growing muscles, was starting to look like a man’s.

This doesn’t mean that all boys used to grow up to be great men. Greatness has never been a democratic virtue. It does mean, though, that whereas boys used to grow up with a healthy sense of their proper place, all too many of them now have no idea about what they should do with their lives. Instead of shouldering a responsible life, they remain boys in grown-up bodies. Our urban lives and communities have made it hard for boys to find the right circumstances to become responsible adults - to become men. The prevalence of fatherless homes and the lack of venues where men can work side-by-side with boys are so common today that many of us can’t imagine any other way. This is at the heart of our problem.

Let’s at least try to imagine a different kind of world. If, by chance, we can learn how life used to be, maybe we can make changes to improve the way things are today. A couple of examples from Church history come immediately to mind. In both examples, we watch a boy become a man in spite of real setbacks. In the case of Joseph F. Smith, we see this happen in a fatherless family. In the case of B.H. Roberts, we see it happen without a father and without the support of a strong family. Despite the challenges, both boys became extraordinary men. Of course, a big part of the reason is that they were both extraordinary individuals in their own right. But they also lived in times when it wasn’t difficult to become a man.

Joseph F. Smith was the son of Hyrum Smith (the brother of the Prophet Joseph Smith) and Mary Fielding. He was only five years old when his father was killed at Carthage, not far from his home in Nauvoo, Illinois. He witnessed firsthand the challenges his mother faced as she outfitted a wagon, prepared her children (and other disadvantaged saints) for the long trip west, and refused to give up when others made it difficult for her to succeed.

On their trek to the Salt Lake Valley, Joseph, though only a boy, had to take responsibility on many occasions for the wagon and the team of oxen. This was no easy task because his company had only half as many animals as they needed to pull their two wagons. They were forced to hitch the two wagons together and have the animals pull both in tandem. When the trail was flat, the team managed well enough. When they came to a hill, however, Joseph would have to unhitch the wagons, pull one of them up the hill with the oxen, and then go back down the hill to get the other one.

This was a lot of work. In fact, it was a man’s work, but since there were not enough men to do it for them, nine-year-old Joseph managed to do it by himself. He was also required to take his turn guarding the wagons during the day, a job normally done exclusively by men. He would have had to do the same at night, but his mother wouldn’t allow it.

Once his family got to the Salt Lake Valley, Joseph continued to be responsible for the cattle. During the first winter in the valley, one of the cows gave birth out on the range. Joseph, though not yet old enough to be a deacon, would not desert the calf. In spite of a pack of hungry wolves, he carried and pushed the newborn calf until he arrived back at the safety of his home.

Only a few years later, at the age of 15, Joseph was called on a mission to the Hawaiian Islands. This seems unbelievably young to us, yet Joseph was known to be fully up to the task. He had been doing man’s work for years. His son Joseph Fielding Smith later wrote of his maturity at this early time of his life that, “[At] the time of his mother’s death he was thirteen years of age, but the life he had led during all the tribulation the family had passed through had made of him a man at that tender age.”

Another early and remarkable example of becoming a man is Brigham Henry Roberts (or B.H. Roberts, as he is more commonly called), who became the Church’s most important scholar of his time and a president of the Quorum of the Seventy. As a child, young Harry lived without either of his parents for several years and grew accustomed to the environment of a broken home. Both of his parents joined the church in Lancashire, England in 1857 (the year Roberts was born). But it wasn’t long before his father abandoned the family.

A few years later, Harry’s mother, after acquiring just enough money to transport herself and her small children to Utah, left Harry and his older sister Polly with members in England. She hoped to send for them when she could earn enough money. Unfortunately, these members turned out to be less than trustworthy. Not long after Harry’s mother left for Utah they stopped attending church and began wandering around England. They often worked as entertainers in bars where Harry learned to dance and sing in order to earn his keep. When he could get away with it, he crawled under a bar room table to sleep.

As if this weren’t tragic enough, he ended up losing contact with his mother altogether. This was partly because of the negligence (perhaps intentional) of his guardians but also because his care was entrusted to another family. Years were to pass before he was able to make contact again with the Church and make his way to Utah with the help of the Perpetual Emigration Fund. Roberts would later write of those years that, “my childhood was a nightmare; my boyhood a tragedy.”

There is an interesting story of young Harry walking across the plains that captures both the sense of his youth and of his maturity at this early age. His curiosity sometimes tempted him to leave his company in order to go exploring. On one occasion he found himself in a thicket and decided to rest. Before long he was asleep and only woke up when the last wagon was rolling away in the distance. He jumped up and ran to catch up, only to find a river between him and the others. A man on the other side asked if he knew how to swim. Harry said yes and quickly took off his shoes (they were actually clogs) and jacket, leaving them on the bank, and then swam across. He expected others coming after him to pick up his things, but he never saw them again.

For weeks Harry was left walking barefoot across the plains. Then one day his luck changed. He was exploring a small abandoned cabin when he discovered the corpse of a man who had only recently died. The body was still wearing a pair of boots. They were too big for Harry, but that didn’t stop him. He removed them and returned to the wagon.

He then decided, however, that he wanted the boots to look nice for his mother when he arrived in Salt Lake. So, accordingly, he placed them carefully in the wagon and walked the rest of the way to Salt Lake City barefoot. When his company finally arrived in the valley, Harry was seen walking down Main Street wearing a pair of boots several sizes too large. Somehow this was quite fitting. He had certainly proven that he was capable of filling them.

It was good that Harry was a hardy boy because life continued to be difficult even after arriving in the valley. At the age of twelve, he had a job with Utah Central Railway as an ox-team grader. This was a challenging job that required leading a team of oxen while also manipulating a heavy wood or metal scraper that the team was pulling. In order to remove hard uneven mounds of dirt, the grader had to be positioned just right. Truman Madsen wrote of the experience that, “[b]y shouts and an expert whip hand he could simultaneously drive the oxen and manipulate the controls of the scraper. He did a man-size day’s work.”

As Harry got older, he found work in Tooele County in a mining camp. His work habits served him well as an errand boy and general camp hand, but life there had very definite disadvantages, as only mining camps can have. As an impressionable youth he ended up participating in “irregular habits,” “improprieties,” “recklessness,” “jumped claims,” and “fights and gun play,” and was even a spectator of “murder.” His bishop ended up having him disfellowshipped from the church.

This is a hard way to grow up. In one sense, the stark realities of survival forced a responsibility on young men at an early age - clearly an important part of growing up. But the example of B.H. Roberts also shows that this is, by itself, not enough. Manhood, in the eyes of the Lord, requires more than just working hard.

Truman Madsen’s account of Harry’s mining years indicated that, “[b]y standards other than those of the Latter-day Saints, his mining camp lapses might be written off as ‘growing up.’” To Roberts, however, this was not the kind of growing up he was proud of. He struggled the rest of his life to stay above the bad habits he had learned as a youth. Many years later, as a General Authority, he pleaded for leaders to catch young men between boyhood and manhood, when they needed the most help.

Fortunately for Roberts, the same bishop who had disfellowshipped him helped him back into the Church. Some time later he began working with a blacksmith and came under the influence of a righteous mentor. Thereafter, he was able to put his life in order and start on the road to becoming a man of God.

These examples of Joseph F. Smith and B.H. Roberts are just two among many. Growing into a man of God may have been as challenging as ever, but the mere fact of growing up was taken for granted. Today this is certainly not the case.

Far too many boys are not growing up. They may be shaving and well past their teenage years, and still be unable or unwilling to shoulder the responsibilities of manhood. Is it any wonder, then, that the much more difficult goal of becoming a man of God now seems almost unreachable to so many of our struggling youth? What has changed? What can we do to fix this very serious problem?

[To be continued]

Thursday, October 8, 2009

American Drugs in Egyptian Mummies

American Drugs in Egyptian Mummies: A Review of the Evidence

Abstract: The recent findings of cocaine, nicotine, and hashish in Egyptian mummies by Balabanova et al. have been criticized on the grounds that contamination of the mummies may have occurred, that improper techniques may have been used, that chemical decomposition may have produced the compounds in question, that recent mummies of drug users were mistakenly evaluated, that no similar cases of such compounds in long-dead bodies are known, and especially that pre-Columbian transoceanic voyages are highly speculative. These criticisms are each discussed in turn. Balabanova et al. are shown to have used accepted methods and to have confirmed their findings with them. The possibility of the compounds being byproducts of decomposition is shown to be without precedent and highly unlikely. The possibility that the researchers made evaluations from faked mummies of recent drug users is shown to be highly unlikely in almost all cases. Several additional cases of identified American drugs in mummies are discussed. Additionally, it is shown that significant evidence exists for contact with the Americas in pre-Columbian times. It is determined that the original findings are supported by substantial evidence despite the initial criticisms.
In a one-page article appearing in Naturwissenschaften, German scientist Svetla Balabanova (1992) and two of her colleagues reported findings of cocaine, hashish and nicotine in Egyptian mummies. The findings were immediately identified as improbable on the grounds that two of the substances are known to be derived only from American plants - cocaine from Erythroxylon coca, and nicotine from Nicotiana tabacum. The suggestion that such compounds could have found their way to Egypt before Columbus' discovery of America seemed patently impossible.
The study was done as part of an ongoing program of investigating the use of hallucinogenic substances in ancient societies. The authors themselves were quite surprised by the findings (Discovery, 1997) but stood by their results despite being the major focus of criticism in the following volume of Naturwissenschaften. Of the nine mummies evaluated, all showed signs of cocaine and hashish (tetrahydrocannabinol), whereas all but one tested positive for nicotine. It is interesting, too, that the concentrations of the compounds suggest uses other than abuse. (For example, modern drug addicts often have concentrations of cocaine and nicotine in their hair 75 and 20 times higher, respectively, than those found in the mummy hair samples.) It is even possible that the quantities found are high due to concentration in body tissues through time.

Without question, the study has sparked an interest in various disciplines. As Balabanova et al. predicted, "…the results open up an entirely new field of research which unravels aspects of past human life-style far beyound [sic] basic biological reconstruction."

The Criticisms

The biggest criticism of the findings of Balabanova et al. was not necessarily directed at the extraction process per se, although this was discussed. The biggest criticism was that cocaine and nicotine could not possibly have been used in Egypt before the discovery of the New World, and that transatlantic journeys were not known - or at least are highly speculative. It is safe to say that the criticisms of the study would have been minimal or nonexistent if the findings had involved Old World drugs. Such findings, in fact, would not have been at all unusual, as the use of stimulants was known in Egypt. Poppy seeds and lotus plants have been identified for just this use in manuscripts (the Papyrus Ebers) and in hieroglyphs (as Balabanova et al. show).

Schafer (1993) argues that "the detection of pharmacologically active substances in mummified material never proves their use prior to death." He argues that such compounds could have been introduced as part of the mummification process. The suggestion is that nicotine, especially, could have been introduced around the mummy (and subsequently absorbed into its tissue) as an insecticide used as a preservative in relatively modern times. A similar criticism was raised by Bjorn (1993), who wondered if nicotine might have been absorbed by the mummies from cigarette smoke in the museums where they have been kept. According to Schafer, the only way to show that the compounds were taken into the bodies while they were alive would be to find different concentrations at different distances from the scalp - a procedure not undertaken by the authors.

Another interesting criticism of Schafer (1993) is that Balabanova et al. might have been the victims of faked mummies. Apparently people (living in the not too distant past) believed that mummies contained a black tar called bitumen and that it could be ground up and used to cure various illnesses. In fact the very word 'mummy' comes from the Persian mummia, meaning bitumen (Discovery, 1997). A business seems to have developed wherein recently dead bodies were deliberately aged to appear as mummies, and some of the perpetrators of such deeds were drug abusers.

The criticism that seems most popular is that the identified drugs might have been products of "necrochemical and necrobiochemical processes" (Schafer, 1993; Bjorn, 1993). One explanation is that Egyptian priests used tropine-alkaloid-containing plants during the mummification process that subsequently underwent changes in the mummy to resemble the identified compounds.

Yet another argument is that there is nothing in the literature showing that any of the three compounds have been identified in bodies that have been dead for some time.

Reply to the Critics

Analytical Techniques and Contamination

In the study, samples were taken from nine mummies dated between 1070 B.C. and 395 A.D. The samples of hair, skin and muscle were taken from the head and abdomen. Bone tissue was also taken from the skull. All tissues were pulverized, dissolved in NaCl solution, homogenized, and centrifuged. A portion of the supernatant was extracted with chloroform, dried, and then dissolved in a phosphate buffer. Samples were then measured by both radioimmunoassay (Merck; Biermann) and gas chromatography / mass spectrometry (Hewlett Packard) - hereinafter GCMS.

This is the procedure used to produce what McPhillips (1998) considered indisputable evidence for confirming products of substance abuse in hair. In recent years, hair analysis has been used more commonly in this kind of screening process, and the techniques employed have been optimized. Mistakes are known to have occurred in some cases evaluating for metals, but the ability to detect drugs such as cocaine, nicotine, and hashish seems not to have been problematic (Wilhelm, 1996). The two possible kinds of mistakes in analyzing hair for drugs are false positives, which are caused by environmental contamination, and false negatives, where actual compounds are lost because of such things as hair coloring or perming. In recent years, these techniques of hair analysis have revealed the interesting findings of arsenic in the hair of Napoleon Bonaparte and laudanum in the hair of the poet Keats.

The procedure includes a thorough washing of the hair to remove external contaminants, followed by a process of physical degradation using a variety of methods (such as digestion with enzymes or dissolution with acids, organic solvents, etc.). Following these preparatory procedures, the hair is then analyzed. Antibody testing (e.g. radioimmunoassay) is a well-established procedure, although there is a small potential of obtaining false positive results. These are mainly caused by the cross-reactivity of the antibody with other compounds, including minor analgesics, cold remedies and antipsychotic drugs - compounds not likely to be found in Egyptian mummies. Because of the possible false positives, chromatography (GCMS) is routinely utilized to confirm the results.

The suggestion of nicotine contamination from cigarette smoke is eliminated by the use of solvents and/or acids in the cleaning process - methods used by Balabanova et al. and all other researchers that have documented drugs in mummies.

The validity of Balabanova's findings seems to be vindicated, at least as far as the analytical methods used in the study are concerned. The authors' methods, as well as those in the additional findings reported here (see below), combined immunological and chromatographic methods to both analyze and confirm samples.

Faked Mummies

The argument that the mummies might have been modern fakes was investigated by David (Discovery, 1997). David is the Keeper of Egyptology at the Manchester Museum, and undertook her own analysis of mummies, independent of Balabanova's group. In addition, she traveled to Munich to evaluate for herself the mummies studied by Balabanova's group. Unfortunately, the mummies weren't available for filming and were being kept isolated from further research on grounds of religious respect. David had to resort to the museum's records. She found that, except for the city's famous mummy of Henut Taui (Lady of the Two Lands), the mummies were of unknown origin, and some were represented only by detached heads.

David's inability to examine the mummies herself may have kept the possibility of faked ones open; however, her evaluation of the museum's records seemed to indicate otherwise. The mummies were preserved with packages of their viscera inside. Some even contained images of the gods. In addition the state of mummification itself was very good. The isolated heads may have been fakes (evidence one way or the other is lacking) but the intact bodies examined in Balabanova's research were clearly genuine.

Chemical Changes

The argument that the identified drugs might be byproducts of decomposition is highly unlikely. The argument resembles a 'Just So' story of biochemical evolution without the benefit of natural selection. Schafer (1993) admits that natural decomposition or mummification has never been known to lead to the synthesis of cocaine or related alkaloids, but leaves the possibility open anyway. He argues that the compounds in question might theoretically have been produced from tropine-alkaloid-containing plants, such as were used in the mummification process.

The benefit of the doubt in this case clearly goes to Balabanova et al. Until it is shown how cocaine could be produced in this way, the argument is hypothetical at best.

Isolated Example

The detection of drugs in human hair is a fairly recent endeavor (McPhillips, 1998; Sachs, 1998). A few compounds were identified during the 1980s, but it wasn't until the 1990s that drug screening via hair analysis became accepted and used as a possible alternative to urine sampling. The criticism that no known cases of cocaine, nicotine, or hashish have been reported in human hair must, therefore, be interpreted with clarification. None of these compounds had been observed in human hair because the process had not been fully developed, nor had the application even been considered until quite recently. Even so, the claim is not true.

Cartmell et al. (1991), using a radioimmunoassay method, detected cocaine metabolites in pre-Columbian mummy hair from South America. In this study, two of the eight mummies analyzed showed cocaine metabolites. All samples tested were confirmed by a separate laboratory (Psychemedics Corporation, Santa Monica, California) using GCMS. The two mummies testing positive were from the Camarones Valley in northern Chile. The artifacts, as well as the mummies at this site, were typical of Inca culture.

Since the initial work of Balabanova et al., other studies have revealed the same drugs (cocaine, nicotine, and hashish) in Egyptian mummies, confirming the original results. Nerlich et al. (1995), in a study evaluating the tissue pathology of an Egyptian mummy dating from approximately 950 B.C., found the compounds in several of the mummy's organs. They found the highest amounts of nicotine and cocaine in the mummy's stomach, and the hashish traces primarily in the lungs. These findings were again identified using both radioimmunoassay and GCMS techniques. Very similar results were found in yet another study by Parsche and Nerlich (1995). Again, the findings were obtained using the immunological and chromatographic techniques.

David's work (Discovery, 1997) though not finding cocaine, did confirm the presence of nicotine. This finding has seemed a little less threatening to conservative scholarship in that it seems possible (albeit unlikely) that a nicotine-producing plant may have existed in Africa within historic times - only becoming extinct recently.

Such a possibility might allow a comfortable resolution for conservative scholarship, but it doesn't explain the evidence of cocaine. Additionally, the possibility of a native plant going extinct is unlikely. Much more reasonable would be that an introduced species under cultivation could go extinct, yet this only raises the question of the original provenance of the species.

In any event, considering the several confirmations of Balabanova's work (as well as that of Cartmell et al. prior to her study), it appears that the argument against these findings based on too little evidence is quickly vanishing (if not already obviated).

Pre-Columbian Voyages to America

The major reason for the initial criticisms of Balabanova's work is disbelief in pre-Columbian transoceanic contacts. Egyptologist John Baines (Discovery, 1997) went so far as to state, "The idea that the Egyptians should have traveled to America is overall absurd…and I also don't know anyone who spends time doing research in these areas, because they're not perceived to be areas that have any real meaning for the subjects." Another interpretation of why researchers haven't considered the subject more closely is given by Kehoe (1998): "After mid-century, any archaeologist worried about money or career avoided looking at pre-Columbian contacts across saltwater [p. 193]." It appears that acknowledging that pre-Columbian contacts occurred was not academically acceptable. Kehoe (1998) also gives examples of several researchers whose work has been academically marginalized because it supported these views (e.g. Stephen Jett, Carl Johannessen, Gordon Ekholm, Paul Tolstoy, and George Carter).

Surprising as it may seem, evidence for early ocean voyages to America from the Old World is not lacking - nor is it negligible. Within the last two years, two periodicals focusing on these contacts have been established. The first, entitled Pre-Columbiana, is edited by Stephen C. Jett, Professor of Textiles and Clothing at the University of California, Davis; the second is entitled Migration and Diffusion and is edited by Christine Pellech in Vienna, Austria. There are certainly quite a few spurious reports of early contacts from the Old World; however, a general disregard for all of the evidence is now itself evidence of academic negligence, as these two periodicals indicate.

A bibliography of these early contacts is given by John Sorenson (1998) in the first issue of Pre-Columbiana. It is a good example of the kinds of evidence being uncovered by legitimate researchers and institutions. The bibliography is itself a condensation of a two-volume work of these publications and includes titles such as: The world's oldest ship? (showing evidence for a pre-Columbian ship in America), published in Archaeology; Peruvian fabrics (showing very strong similarities between Peru and Asia), published in Anthropological Papers of the American Museum of Natural History; Robbing native American cultures: Van Sertima's Afrocentricity and the Olmecs (showing evidence for connections between Africa and the Olmecs of Middle America), published in Current Anthropology; Possible Indonesian or Southeast Asian Influences in New World textile industries (showing at least three textile-related inventions that appear in both Indonesia and the New World), published in Indonesian Textiles; and Genes may link Ancient Eurasians, Native Americans, published in Science.

And the list goes on and on - some of the evidence stronger than the rest - but as a whole it seems pretty much irrefutable. Claims to the contrary seem to be made by individuals with a vested interest in the isolationist position. The evidence, pro and con, when evaluated objectively, seems without question to favor the diffusionist position (which holds that pre-Columbian contacts took place).


The initial reactions to the findings of Balabanova et al. were highly critical. These criticisms were not based on any known failing in the authors' research methodology; rather, they were attempts to cast doubt on an implication of the research - that cocaine and nicotine were brought to Egypt from the New World before Columbus. This conclusion is not acceptable to conservative investigators of the past. Indeed, the reaction suggests a deep-rooted aversion to the possibility that Balabanova's findings might unravel accepted reconstructions of history. This aversion, according to Kehoe (1998), stems from the conviction that Indians were primitive savages destined to be overcome by the civilized world - that the acme of evolutionary success resided in the conquering race itself. "Childlike savages could never have voyaged across oceans."

Balabanova's findings bring forward yet more evidence that humanity is not so easily pigeonholed into preconceived notions of primitive and advanced - even as these might relate to the presumed technology of earlier times. The quest for discovery - to find new worlds - is not just a modern selective advantage. Perhaps it has always been a defining characteristic.

Literature Cited:

Balabanova, S., F. Parsche, and W. Pirsig. 1992. First identification of drugs in Egyptian mummies. Naturwissenschaften 79:358.
Bisset, N.G. and M.H. Zenk. 1993. Responding to 'First identification of drugs in Egyptian mummies'. Naturwissenschaften 80:244-245.
Bjorn, L.O. 1993. Responding to 'First identification of drugs in Egyptian mummies'. Naturwissenschaften 80:244.
Cartmell, L.W., et al. 1991. Cocaine metabolites in pre-Columbian mummy hair. Journal of the Oklahoma State Medical Association 84:11-12.
Discovery Information. 1997. Curse of the Cocaine Mummies. Thirty-six page transcript of program viewed on US national TV in January 1997 and July 1999.
Kehoe, A.B. 1998. The Land of Prehistory: A Critical History of American Archaeology. Routledge, New York and London. 266 pp.
McIntosh, N.D.P. 1993. Responding to 'First identification of drugs in Egyptian mummies'. Naturwissenschaften 80:245-246.
McPhillips, M., et al. 1998. Hair analysis: new laboratory ability to test for substance use. British Journal of Psychiatry 173:287-290.
Nerlich, A.G., et al. 1995. Extensive pulmonary haemorrhage in an Egyptian mummy. Virchows Archiv 127:423-429.
Parsche, F. 1993. Reply to "Responding to 'First identification of drugs in Egyptian mummies'". Naturwissenschaften 80:245-246.
Parsche, F. and A. Nerlich. 1995. Presence of drugs in different tissues of an Egyptian mummy. Fresenius' Journal of Analytical Chemistry 352:380-384.
Sachs, H. and P. Kintz. 1998. Testing for drugs in hair, critical review of chromatographic procedures since 1992. Journal of Chromatography (B) 713:147-161.
Schafer, T. 1993. Responding to 'First identification of drugs in Egyptian mummies'. Naturwissenschaften 80:243-244.
Sorenson, J.L. 1998. Bibliographia Pre-Columbiana. Pre-Columbiana 1(1&2):143-154.
Wilhelm, M. 1996. Hair analysis in environmental medicine. Zentralblatt für Hygiene und Umweltmedizin 198:485-501.

[This is a re-posting. The original is occasionally cited and can be hard to find so I am posting it here as well. The original can be found at:]

Saturday, August 15, 2009


Following a path
Of sunshine
The floor

From vast
And cold
And empty space
To warm my feet
So minuscule and far
And needing light

Sunday, August 9, 2009

Honesty is Not Negotiable

Some time ago I watched a young Christian lady talk herself into a number of dishonest acts. It troubled me because I know she is a good person with high ideals. It started rather innocently when she detected a double standard in the way her boss treated people. She then let this apparent injustice fester in her mind until she began to whine and complain about it. Then she began to justify doing things that she had been asked not to. “It’s only fair,” she would say to herself, followed by, “no one will ever know.”

What made this easy for her was that no major sins were being committed – or at least she didn’t think they were. It was her boss that was not being fair and she was on the side of justice. What she didn’t realize was that she was falling into a trap to which Christians, living in democracies, are very vulnerable. She put a higher value on fairness than on honesty.

Fairness is a very important virtue. When it is disregarded, societies fail. In fact, in a very real sense, the history of liberty is a history of making society ever fairer. We disregard fairness at our own peril. But justice is important because it helps us live with each other. At an individual level it diminishes in importance. Being fair to ourselves is usually not something we have to work on. Being fair to others is important, but if we’re honest, fairness usually takes care of itself. The reverse is not always true.

Consider, for example, two simple questions. Suppose you have just returned from the store and noticed that the clerk short-changed you a dollar. What is your reaction? Now suppose you have just returned from the same store and discovered that the clerk gave you one dollar too much change. Now what is your reaction? Well, the answer is pretty clear for most of us. We’re a bit upset by the first situation. A bit less upset by the second - in fact, maybe we’re not upset at all by it. Very few of us would be more upset by the second situation than by the first. Why is this?

No doubt part of the reason is self-interest. If we don’t take care of ourselves, how can we expect others to? Besides, the customer is always right. If the clerk made a small mistake that benefits me - well, that’s his problem. Isn’t it?

Another part of the reason is that we’ve divided honesty into manageable compartments. We’ve learned to be honest to the extent of not breaking the civil law. But when it comes to our allegiance to a higher law, we are much less eager to comply.

A fair question to ask is which virtue is more important to a community – honesty or justice. Clearly, it will be argued, as long as there are criminals, there must be ways of dealing with them. In an imperfect world, justice cannot be dispensed with. Fair enough.

But this is the point where we make our mistakes. It’s certainly the place where my friend made hers. By emphasizing the importance of justice at a community level, we fail to recognize more important virtues in our personal lives. Instead of trying to overcome our own imperfections, we see how unfair others are and then we start gossiping and complaining about their imperfections.

A comparison might be made to espionage. We recognize that it can be important in international politics. But who would argue that spying on each other is good for families? The same disconnect in the relative hierarchy of virtues applies to justice and honesty, but we never seem to give it much thought.

Part of the reason seems to be that honesty is not something free societies worry too much about. It tends to take care of itself – at least in so far as it matters to the society at large. A dishonest person gets his comeuppance sooner or later. His colleagues might lose trust in him; or, if the dishonesty is illegal, he might end up in jail.

The Christian ethic, however, has never considered honesty a minor virtue or something that will just take care of itself. In fact Christians are not only expected to be completely honest with each other, they are also to be completely honest with God – regardless of who might be watching.

In the Sermon on the Mount, Christ taught His followers not to judge others at all, for “with what judgment ye judge, ye shall be judged: and with what measure ye mete, it shall be measured to you again.” Later in the New Testament Jesus taught the parable of the laborers in the vineyard. In the story, the master agreed upon a wage with the laborers for working a full day. The master then agreed to pay others the same amount for much less than a full day’s work. The laborers that had worked all day were a bit upset, understandably. Even though this is a parable about the last and the first in the kingdom of God, it works as a parable because of its basic understanding of justice. Fairness is what we agree to, not something we negotiate after the fact. If there is a message for us in the perceived injustice it might be something like this: “don’t get so preoccupied with these things, life isn’t fair - deal with it”.

Yet while fairness preoccupies our thoughts, we often fail to see the effects of being honest. They happen automatically. To someone serious about never cheating, lying or taking more than a fair share, injustices hardly ever occur. When they do, they are accepted as part of an imperfect world. An honest Christian expects that sacrifices are necessary and that perfect justice has never been promised to anybody. In fact the central message of Christianity is that the justice that has a condemnatory claim on each one of us can be trumped by the sacrifice of Christ. It can be, that is, for those who are honest.

If we are fair, we will probably get along with each other. We may even be very ethical people. Justice certainly does not imply selfishness. But then again it doesn’t prevent it either. It depends on what sort of justice we believe in. But if we become imbalanced in our justice to the point of ignoring honesty, other things follow. Chances are we’ll get carried away with endless self-justifications. We become experts in situational justice, as if truth were negotiable. In the end, we live without peace, for there are no promises for this sort of thing other than a world of anger and offended people living tit-for-tat lives.

To those who are honest and are willing to sacrifice for a nobler cause, though, the rewards are immense. Life can be lived with a peaceful conscience. Injustices happen, of course, but somehow they seem to occur less frequently than they do to others. When they are overlooked, the honest person often wins a friend. More important is the spiritual maturity that comes from obeying a higher law and the joy of being acceptable to God (see Doctrine and Covenants 97:8).

It might seem that returning a dollar to the clerk who gave us too much money is an act of justice. And so it is. But we don’t praise this sort of thing for its fairness. We praise it for its honesty. Fairness, after all, is something we praise children for. It is a politeness like saying “thank you”. We are expected to have it when we grow up. It takes a lifetime and then some to become honest. It’s not something we can talk our way through by pointing out the failings in others. It is part of a higher law, and it isn’t negotiable.

Thursday, July 30, 2009


Most of us have lamented, on more than one occasion, how unfortunate it is that our memory is so poor. When we hear about people who seem never to forget things, it’s hard not to be jealous. Everybody knows of an unmotivated truant who gets better grades than the sedulous student because he never forgets what the teacher says - even if he doesn’t show up to half of the classes. It doesn’t seem fair.

But this is really only a minor part of the problem. Forgetfulness has much more to do with who we are than we think. It is one of the biggest limiting factors in our lives. Our ability to remember determines, in many ways, what we are and what we do with our lives. What we forget, on the other hand, determines in many ways how much we will be held back.

There is one particular kind of forgetfulness that is, by far, more important than all the others. It is the forgetfulness that we call birth. Unlike other kinds of forgetfulness, though, the forgetfulness at birth is evenly experienced by all of us. And, as often happens when everybody experiences something to the same degree (like breathing oxygen), we tend to ignore it.

But this is a mistake. Regardless of its universal nature, this forgetfulness can cause us a great deal of grief. If it were to disappear suddenly, our individual lives would be filled with rapture. Unconditional love would prevail. There would be no more problems of poor self-esteem. We would no longer be pre-occupied with social standing, with how we look, with how much money we make, or with all the things that we imagine will help us feel good about ourselves. Wars would cease altogether and we would be at peace with ourselves. In a word, life would be heavenly. And it would be heavenly because, in the eyes of God, we are His children and we are of great – even unimaginable – worth. The trouble is that we just don’t remember that we are.

Which is, of course, the way mortality was meant to be. Life here on earth, at least for the time being, is not intended to be heavenly. It is supposed to be a testing ground. And this forgetfulness is an important part of the test. It even has a name in some faiths. In the Judeo-Christian tradition it is called “the veil”.

The image of a veil is a fitting one. Veils hide things that are not meant to be profaned. In some cultures a woman’s face is so considered. Our life before and beyond mortality - life that remains obscured and yet informs our most sacred longings - surely fits this image as well. Yet, interestingly, the Judeo-Christian veil refers specifically to one particular veil - at least it used to. That veil was the veil in the temple in Jerusalem.

There was actually more than one veil in the temple. Some of them were like curtains separating rooms and sacred places. But the one that was specifically referred to as “the veil” separated the Holy of Holies from the Inner Court. This was the most sacred place on earth. It was the place that only the High Priest could enter – and then only once a year. The New Testament refers to this veil as the katapetasmatos, or literally, “the place to draw near to heaven (or winged things)”. It is mentioned in the Epistle to the Hebrews as the focus (or anchor) of a Christ-centered life (Hebrews 6:19).

This veil is not like other veils we are used to. We can’t just remove it and expect to see clearly. Neither can we just exercise, get a good night’s sleep, and then wake up remembering what we have forgotten for many years. The veil of forgetfulness is intended to remain with us for as long as we sojourn in mortality. And yet, remarkably, there are those who manage to sense what is on the other side anyway. I’m not referring to those who have near-death experiences - passing through and returning from the veil. Such cases are certainly noteworthy, but they seem remote from most of our own experience.

There is another way to understand what lies beyond – to get a glimpse of what it means to be a child of God in this life and in the eternities. This other way is through the window known as charity.

By charity I don’t mean the giving of money to the poor, or the love that we have for our family and close friends. This kind of love is one of the greatest experiences of life but it is, nonetheless, a part of this life – of mortality. It is in our genes. Charity, or the love of God, is more than this. It is the great gift of the spirit that is vouchsafed to us when we give all of our hearts to God. Through this window of charity we see and understand the great worth of each of our Father in Heaven’s children. Through this window we are filled with love for all whom we see – even, remarkably, for our enemies. When we look through this window we begin to understand just how valued we are as members of this divine family. It is only through this window that we can ever hope to get beyond the constraints of this fallen world – constraints that are necessary but inseparable from so much sadness – even to the desperate angst of existential despair.

The veil is quite misunderstood
When it is cast before the mind
But it is not the brain that worries me
As does another kind of memory
That sunders from the heart
Belief in Heaven’s pedigree

How much grief could be avoided if our criminals – even our angry neighbors and coworkers, for that matter – could somehow see through this window. Instead of trying to get a bigger boat, the family down the street might instead offer to help take care of the neighbor’s yard while they are on vacation at the lake. They wouldn’t be worrying about their own importance or their visible possessions. Instead of competing with each other at work in order to please the boss, we would already understand that we are accepted by the greatest boss of all – our Heavenly Father. Knowing this, we would spend our time being helpful to everybody – helping others get ahead.

And it is no wonder. Seeing through the veil has the effect of filling us with the love of God. The opposite, of course, is all too commonly seen. When we fail to look through the veil, the love of many seems to disappear. Selfishness is the opposite of this eternal perspective, and one of its greatest causes is not looking through the window often enough.

But failing to look through the veil is one thing. Forgetting about the veil altogether is quite another – sort of a compounded forgetfulness. Remembering that there is a place where we can glimpse into heaven – however imperfectly – should inform every aspect of our lives. Without this anchor, we are left to find meaning any other way we can. And, sadly, there are too many people living this way, and they are easy enough to spot. Charity is missing from their lives.

What then is to be done with this global epidemic of amnesia – with this failure to understand our relationship to God? The answer is really quite simple. It is the katapetasmatos. It is the veil in the House of the Lord. It is a life focused on that window into the knowledge of the love of God and of our relationship to Him.

Let’s face it. Life doesn’t offer us free samples of self esteem. How sad it is that most of us spend a lifetime trying to feel good about ourselves. It’s so much easier to just remember what we have forgotten – what is obvious on the other side of the veil. We really are children of God.

Thursday, July 16, 2009

Why We Need Field Guides

Travel books are popular and tend to sell well. They don’t make best-seller lists, at least none that I know of, but they usually occupy more shelf space in our favorite bookstores than titles on philosophy, for example, or music, or even field guides.

This shouldn’t surprise anybody. We prefer traveling to thinking deeply, after all; and, as far as music goes - well, we’d much prefer to just listen to it rather than read about it. But what about field guides? This might seem like an odd question. After all, what does traveling have to do with field guides?

Well, frankly, traveling has - or at least it should have - a lot to do with field guides. The truth is that the most distinctive part of any place we might visit is more evident in the native animals and plants (even in the fungi and microbes for that matter) than in whatever man-made structures we might otherwise associate with them.

This is evident in spades in the business traveler who flies to a big city, eats at a nice publicly owned restaurant, sleeps in an upscale hotel and hardly steps outside except to hop in a cab back to the airport. “Oh, the food was great,” our traveler might confess back at the office. But in reality, our traveler has not really traveled at all. At least he hasn’t experienced what is unique about the place he has just visited. All he has really done is change locations for a few days. Unfortunately this is becoming more and more common as places all around the world compete for the world traveler’s business. The more interconnected we become nationally - and especially internationally - the more our business and tourist attractions begin to look alike (as Daniel J. Boorstin has pointed out in The Image).

I don’t mean to downplay the many historical buildings and parks that make up our cities and give them personality and charm. I merely wish to point out what should be obvious. The complex of living things that have made individual places around the world their home is a part of the history of the world that should take priority over many other human constructs. These complexes are what make places unique.

We recognize this in ways we might not have realized. Experienced connoisseurs of wine (which I am not) are attuned to the types of grapes that grow in a particular region. Sometimes these grapes are different varieties and give a correspondingly distinct flavor. Sometimes, though, the same grapes simply taste different in places with different climates. Even subtle environmental differences can make a very different wine.

Restaurateurs are also keen on locally produced foods. Crab cakes are just better on the Delmarva Peninsula than they are in Detroit. I can also vouch for the superiority of key lime pie in the Florida Keys over any other place I’ve tried it. Place is important after all, even if the purveyors of global markets try ceaselessly to convince us otherwise.

An acquaintance of mine - a true foodie - once told me that he would never eat at a chain restaurant if a local restaurant were available. And although the local cuisine is not always impressive, it at least has individuality. If he happens to misjudge a place because of a poor dinner choice, he has at least made his judgment using better criteria than the quality of a handful of national chains.

All of this may seem like a long way from field guides but, in fact, it isn’t. If food and wine are sufficient to give a place a level of local interest, the kinds of wildlife that live in a given place should do so to an even greater degree. If you like the seafood in Seattle, you should check out the coast and see how many unusual seabirds you can identify. Of course, you’ll need a field guide to help you - and hopefully you’ll be able to find a local one. Bird guides of the entire US are nice to have but for beginners looking to identify organisms in a specific place of interest, local field guides are quite a bit more helpful.

When you’re done with your trip, you’ll probably have many more fond memories of the place you visited and a more accurate understanding of what makes it unique. Before long you’ll start planning your trips around places of natural interest instead of places that are merely popular. You’ll also have a better understanding of the real world and be wiser than you would have been as just a tourist. Before long you’ll have started to collect field guides from different places and have a lot of great reading material for bedtime. Field guides, after all, are the perfect light reading at the end of the day. The information comes in small bites and leaves you with thoughts of interesting places to visit. You may even start dreaming of exotic places and beasts. Then, to top it all off, you’ll also be a lot smarter. This is hard to beat. And all of it for the price of an inexpensive paperback - one that is usually built to take a bit of a beating. So next time you buy a travel book, make sure you stop by the field guides as well. They go hand in hand. And have a nice trip.

Sunday, July 5, 2009

Discovery and Invention

Our time is a time of invention and innovation. We expect a constant supply of new and better products. Our economy is based on a growth that is driven by these products and by new and better ways to do things. Our patent offices around the world are busier than ever before with all the new ideas that promise to make their inventors rich.

In contrast, the period of discovery is losing momentum. There are fewer and fewer unexplored places left on earth. No new maps are being made with the inviting words “Terra Incognita” written across faraway places. This doesn’t mean that we’ve discovered everything there is to know - far from it. The oceans are still mostly mysterious to us, and many new species of living things are found every year - sometimes in our own backyards. Even so, we are funding fewer and fewer taxonomists to handle the added diversity. There is a lot we still don’t know about our world, not to mention all that lives beyond it.

But in spite of all this, there has been a shift in the focus of our creativity. This shift may seem subtle but it is, I think, significant. Whereas discovery accepts the reality of the world as it is, invention attempts to re-create it. Neither discoverers nor inventors can claim a monopoly on virtue or take all the credit for improving the world. But when it comes to blame - blame for harming the world - inventors far exceed discoverers. Of course inventors usually don’t do this intentionally. But whether they do so intentionally or not, there is a different kind of arrogance that starts with a human construct outside of the natural order and imposes it upon the living world.

I say that this is a different kind of arrogance because I don’t wish to minimize the overweening pride of many discoverers and inventors alike. Without doubt the early explorers were often motivated by pride, and their arrogance led to much harm among conquered peoples. This is not so different from many discoverers today who compete with each other in laboratories and in the field for recognition and prestige. It may even be true that the pride of discoverers is greater than that of inventors, who very often are motivated more by wealth than by pride.

But this is not the point I wish to emphasize. Human hubris - the kind that has disrupted our planet and threatens to destroy us in any number of ways - is a problem we have inherited from inventors, and not so much from discoverers.

Now it is also true that inventors would have no basic building blocks to work with if it weren’t for the efforts of the discoverers. And so it might be tempting to blame them as well, but this would be a mistake. I am not arguing for a cessation of inventions. In fact, if anything, I am asking for more - but for inventions of the right kinds - the kinds that respect the natural order of things.

There is nothing inherently wrong (or even sinful) about inventing, just as there is a lot of room for wrong (and even sin) in acts of discovery. The difference is that the act of discovery itself is grounded in the creation, and even if the discoverer lacks any and all respect for the Creator, this grounding is at least a check on unnatural consequences.

The act of inventing, however, is a different thing. It often involves a modification of the natural order. When this saves lives or otherwise improves the world, it is commendable. As someone with a few patents to my name, I would be a hypocrite to argue otherwise.

But the part that we have ignored for too long is that there are consequences to everything we do. There are consequences that follow from natural events; these consequences are themselves part of the natural order. But consequences that stem from unnatural events can be an entirely different thing. These consequences may be small - like a dry shirt as a consequence of using a clothespin on a rope. Or they may be great - like the devastation of an atomic bomb. In some cases we don’t have the slightest idea of the consequences of the things that we invent.

But it’s about time we started thinking about them a little more seriously and stopped supposing that invention is an unambiguous good. A good place to start is to ask the simple question of how an invention impacts the natural order. Subsequent questions follow naturally from this.

Come to think of it, though, there’s an even more fundamental issue to settle before we can ask this question. We need to recognize that we are part of the natural order ourselves. When we ignore this, even while we unleash so many unnatural things upon the world, we risk much. Strange as it might seem, we need to “discover” again just how much we belong to this order.

Daniel Boorstin pointed out several years ago that we are living more and more in a world of what he called pseudo-events. These are unnatural events that we as humans contrive for our convenience and pride, and that now surround us and fill our lives to the exclusion of the natural order of things. A significant consequence of these many contrivances is that we are no longer grounded in reality. Boorstin writes,

“More and more of our experience thus becomes invention rather than discovery. The more planned and prefabricated our experience becomes, the more we include in it only what “interests” us. Then we can more effectively exclude the exotic world beyond our ken … and which we most need to make us more largely human.” (See The Image, A Guide to Pseudo-Events in America. Vintage Books (1987) p. 256.)

Of course the key issue here is a willingness to acknowledge that there is a natural order for us as humans, and that it is not the same as the order for animals only. This was always evident to our ancestors - who in many ways were much wiser than we seem to be. It was evident to them because they were discoverers. They were discoverers of many things - including of what it means to be human.

So let’s continue to fill our patent offices with ways to improve the world. But let us be wise enough to realize that unless we also continue to discover what it means to be truly human, we run the very real risk of destroying ourselves and a whole lot more.

Saturday, June 27, 2009

Owl Near Aspen Grove

It might have been a hunch
That had me turn my head
Right when the specter glided by

And then I felt the silent
Wielding of her wings
And knew it wasn’t me

For my heart wasn’t beating
When the feathers
Ruffled past

I think she saw me all along,
And merely came to wish me
Pleasant dreams

Sunday, June 21, 2009

Limiting Factors

Many years ago when I was in junior high school, I decided I wanted to be on the track and field team and compete in the high jump. The gym coach made arrangements for me and I spent many hours practicing. I learned quickly that I could clear the bar better by jumping backwards, and so I worked on that technique. I made quick progress at first as I strengthened my legs and learned how to better control my jump. But then I hit a plateau, and no matter how hard I tried, I couldn’t get any higher. The lower part of my back kept hitting the bar. I tried different kinds of exercises and lifted more weights, but no matter how strong or flexible I became, I remained limited by my lower back. We didn’t have anybody at the school to help me with my technique, so I never improved. When basketball season came around, I gave up high jumping; and, as it turned out, I never returned to the sport.

I’ve come to believe that limiting factors - whether they be a high-jumping technique or anything else - are much more important than many of us think. And yet we hardly ever pay much attention to them. The truth is that we really don’t like to, because it usually isn’t very pleasant. This is understandable because limiting factors are very often things in our lives that we don’t do well. We much prefer focusing on our strengths. It makes us feel better about ourselves. Unfortunately, this kind of thinking also keeps us stranded on plateaus. Sometimes we want very badly to get off these plateaus and do something more with our lives. When we seem unable to do so, we get frustrated. “Why can’t people see how good I am?” we ask ourselves, never realizing that it isn’t our strengths that hold us back. It’s our weaknesses that do.

When we are young we tend to take people at face value. If someone looks smart or pretty or keeps up a particular image, we believe that it accurately represents them. Not realizing that we are only seeing one side – the side the person wants us to see – we sometimes assume that they have no negative traits at all. When this person ends up, say, in prison, it completely surprises us. “That’s not the person I knew,” we tell ourselves. And this, of course, is not a lie. We never did thoroughly know that person. We would be a good deal wiser if we recognized this. We all have virtues. Everybody expects this. Why should we be surprised when we discover that we also have limitations?

I’ve watched a lot of basketball players through the years. One thing is common among almost all of the fairly good players - I mean players that are not quite good enough to make the team. They think the game is all about scoring, and so they practice and practice their shooting while ignoring other important parts of the game. Many of these individuals would have done much better to have learned to be more aggressive and to be tougher defenders and rebounders. These skills are much more commonly the limiting factors among this group of players.

The truth is that we are quite good at ignoring our limiting factors. The most common way that we do this is by making excuses or by otherwise justifying our behavior. I know a bright manager who desires to move up the corporate ladder. He has a lot of excellent qualifications but has a habit of being too glib and at times condescending in his conversation. He realizes this, but instead of trying to change his behavior, he says that people are just like that where he is from and that he can’t help the way he is. “We all have our little hang-ups anyway,” he says, or so he has convinced himself.

Another reason we tend to ignore our limiting factors is because they are often tied to our habits or to things that make us comfortable. The example that comes most immediately to mind is our preference to relax instead of work, like our habit of watching just one television program that then leads to a second and so on until the entire evening is wasted. It is a tragedy the amount of life that is wasted in this way.

These are obvious examples, but what about gossiping? It is also a habit that keeps us stuck on plateaus. Very few habits so readily mark us as mediocre as habitual gossip. It is the hallmark of prideful virtue and self-righteousness. It automatically puts one, whether merited or not, in the camp of the complainers instead of among the few that can be trusted. It is a pastime of moral laziness.

Another very common limiting factor is the lack of knowledge. Take for instance an inexperienced farmer who decides to plant an acre of his favorite sweet corn in a field that was planted in wheat the year before. Unknown to the farmer, thousands of hungry wireworms lie hidden in the soil, eager to eat every corn root they can find. With a bit more knowledge, the farmer could have learned that farming isn’t just a matter of planting, watering and harvesting. Losing a crop is a hard way to learn about limiting factors.

But the most serious kinds of limiting factors are those involving sin. One of the reasons that they are so serious is because sin tends to cascade into so many other limiting factors. Maybe we are kept back in our employment because we tend to be lazy occasionally. This, of course, is a sin against our employer. If we fail to overcome this habit, it very often leads to complaining or gossiping in order to vindicate our poor performance. When this happens, it becomes all too easy to tell lies, which then lead to even more serious problems. Even the smallest sins that are left to fester can be profoundly limiting.

But sins can also have a bright side. They can be overcome and changed into strengths. This is the message of the prophet Ether. Those that humble themselves, exercise faith in Christ and forsake their sins have the promise that their weaknesses will become strengths (see Ether 12:27). This is not always the case with other limiting factors.

Very often when we overcome a limiting factor, we progress. This isn’t necessarily because our weakness has become a strength but because we have simply removed so much of the dead weight that was holding us back. When weaknesses are removed by Christ, however, the promise is that not only will the dead weight be removed but that we will gain a new strength. This can happen because we gain access to strength beyond ourselves.

One of the great examples in literature of both the limiting and the strengthening nature of sin is the Reverend Arthur Dimmesdale in Nathaniel Hawthorne’s The Scarlet Letter. The honorable Reverend Dimmesdale, in a moment of passion, lost his virtue with the married woman Hester Prynne. Hester’s sin was readily visible in the child of their adulterous relationship, but the identity of the child’s father, Reverend Dimmesdale, remained unknown. And yet, for all its obscurity, the sin had a profound effect upon him. He planned many times to announce his error publicly but was unable to. In a society that was intolerant of this sin, his life would have been completely ruined. In such a society, there was no hope that repentance could lead to a better life.

And so Dimmesdale’s sin festered and he considered himself the worst of humanity. In fact, his self-demeaning habits were so obvious that the citizens of Boston considered him a very saintly and pure man because of them. Yet his burden made him physically sick. His health was even more compromised because of his doctor (who, unknown to him, was also Hester’s husband seeking revenge).

Through all this, the citizens believed their Reverend Dimmesdale to be among the greatest men alive. The reader is made to know that it was because of his sin that he had become such a humble and compassionate man. And yet, in the end, it was his sin that also took his life. His sin was the major limiting factor in his life. And yet it was also an unfulfilled strength. It has been argued that without his sin, he never would have been such a great spiritual leader. Hester believed that he had paid for his sin many times over and that he need not carry the burden around any longer.

But Dimmesdale knew that he had not completely repented. When he finally mustered the resolve to announce publicly what he had done, it came only after a crisis of faith and just before he died. He never gained any solace from all the good that he had done. A sin that truly could have been turned into a strength ended up being his greatest limiting factor. In the end it killed him.

The world around us is filled with people and their limiting factors. This world includes us. Maybe we hesitate to consider this fact in the people we care about. But acknowledging weakness in others does not need to be judgmental. It can be a prelude to great service. Considering weaknesses in ourselves can be even less appealing. And yet we really only have two choices in this matter. We can continue ignoring them – and remain forever limited. Or we can become better. Nothing needs to hold us back. In fact, the sky’s the limit.