
Miscellaneous

Mormons and the Mythology of the “Greatest Generation”

By August 15, 2013


This post is my contribution to our August theme highlighting the history of 20th-century Mormonism. A quick disclaimer: in this post I critique the idea of “The Greatest Generation.” This does not mean that I am degrading the patriotism or valor of the men and relatives who served in the military during World War II. Many served valiantly and admirably. I am writing to expose some of the blind spots created by focusing solely on the pluck of individual soldiers and their commanders. Also, I know this post is a little long, so gird up your loins and ring the bell when you get to the top (how’s that for a mixed metaphor?)

The series of worldwide conflicts now collectively known as World War II transformed the sociopolitical landscape of both the Global North and South. The fighting reshaped the ongoing European ideological struggles between fascism, communism, and capitalism. It inspired anti-colonial movements throughout the world to fight the bonds of European imperialism. It also caused more deaths than any other conflict in human history, with estimates of total deaths ranging between 50 and 85 million. From a global perspective, U.S. participation in the conflict appears relatively small–at least in terms of actual soldiers and casualties. Nevertheless, the number of U.S. deaths suffered during World War II exceeded the total from any other conflict except the American Civil War, and the actual number of combat deaths was probably greater. Somewhere around 275,000 Americans lost their lives in the fighting of World War II. In addition, a larger percentage of U.S. men of military age served in World War II than in any other U.S. conflict. Consequently, like the Civil War, World War II looms large in the American memory and consciousness. Historians, journalists, and novelists have written more books about these great 19th- and 20th-century conflicts than about any other event in U.S. history. [1]

The great paradox of U.S. involvement in World War II is that Americans shaped the conflict’s outcome disproportionately to the loss of life they experienced. The scope of the United States’ casualties and its influence on the war’s outcome have made World War II one of the most significant military engagements in U.S. history. Most Americans, including members of the Church of Jesus Christ of Latter-day Saints, have shown little interest in understanding the war’s complexities, yet feel a personal connection to the conflict through relatives or acquaintances who fought in Europe or the Pacific. This combination of ignorance and fascination has created a market for popular representations of World War II that both obscure the real horrors of war and reveal the contours of human courage. One popular solution for resolving America’s oblivious captivation with World War II has been the creation of the idea of the “Greatest Generation.” Mormons have embraced this popular conception with equal, if not greater, vigor. This post seeks to investigate this trope, demonstrate its significance to Mormon Studies, and illustrate its limitations.

Beginning in the 1990s, the United States seemed to experience a renaissance in the history and memory of World War II. This key moment in the creation of public and historical memory about the war emerged from the confluence of many demographic and cultural factors. The end of the Cold War made it appear that democracy, so fervently defended by World War II soldiers, had won, bringing about an “end of history.” The fiftieth anniversary of the conflict created a motivation and deadline to recover and share lost histories from the war. The temporal distance from the trauma had helped many veterans find the perspective and strength to speak about their wartime experiences, and their advanced age led chroniclers to try to record this history before too many passed away. While this trend normally might have run its course after ten years of commemoration, the events of September 11, 2001, and the ambivalence caused by subsequent wars in Afghanistan and Iraq led many to long for the nostalgia of the “good war” of their fathers and grandfathers. Consider one of the most memorable images from the aftermath of the attacks: the photograph of firefighters raising the American flag over the wreckage and debris at Ground Zero. Many have pointed out that this image consciously drew from the famous iconography of marines raising the flag at Iwo Jima. [2]

Many American journalists, authors, and filmmakers worked diligently to capture and generate this nostalgia. They created an industry of memory production about World War II in the years surrounding the turn of the twenty-first century. Consider the following works: Band of Brothers (1992) and other works by “historian” Stephen Ambrose; The Greatest Generation (1997) by Tom Brokaw; and Saving Private Ryan (1998) by Steven Spielberg, who also produced the Band of Brothers (2001) miniseries for HBO. These were only the most universally acclaimed offerings from what became a cottage industry of World War II memory production–remember Michael Bay’s Pearl Harbor (2001)? During this same period, the 1993 proposal for a National World War II Memorial led to fundraising, construction, and dedication in April of 2004–a correlation in time frame that seems more than coincidental. [3]

One hallmark of all of these portrayals of World War II was their focus on the valor, strength, and fortitude of individual soldiers. Brokaw observed of those who lived through the World War II era, the generation he called “the Greatest Generation”:

It is a generation that, by and large, made no demands of homage from those who followed and prospered economically, politically, and culturally because of its sacrifices. It is a generation of towering achievement and modest demeanor, a legacy of their formative years when they were participants in and witness to sacrifices of the highest order. . . Millions of men and women were involved in this tumultuous journey through adversity and achievement, despair and triumph. Certainly there were those who failed to measure up, but taken as a whole this generation did have a “rendezvous with destiny.” [4]

Brokaw wrote of his parents’ generation and understandably engaged in a certain amount of hyperbole. Nevertheless, his descriptions, along with the other portrayals previously discussed, have become the primary basis for public memory of World War II in the last twenty years.

Following the example of Brokaw and other popularizers of soldiers’ stories, producers of Mormon culture made sure that the contributions and faith of Latter-day Saint servicemen played a role in the mythology of the war. Since many of the leaders of the church had served in World War II, the idea that there were no atheists in the foxholes of war had become an origin story for many leaders’ spiritual dedication and service. Before many of his stories proved fictional, Elder Paul H. Dunn had personified this mythology. In 2001, Covenant Communications, a popular Mormon press, released Robert Freeman and Dennis Wright’s Saints at War. The two authors, who at the time taught in BYU’s Department of Church History and Doctrine, claimed that they had been inspired by Brokaw and Ambrose to create an archive of Mormon soldiers’ accounts of military service during times of war. The book and its related CD and DVD offered short vignettes from different LDS soldiers’ World War II experiences. Unlike Brokaw, the authors of Saints at War allowed their subjects to speak for themselves, but in very short spurts. Most accounts focused on individual heroism, spiritual guidance on military missions, or soldiers’ efforts to maintain fellowship on the battlefield. Freeman and Wright also chose to edit out any derogatory terms used by their subjects to refer to the Japanese, Germans, or Italians. Consequently, Saints at War, as Freeman and Wright admit, “should not be seen as a history of World War II.” It offers an account of Mormon soldiers trying to hold on to the doctrines and practices of their church in an atmosphere often inimical to religious faith. It is a PG rendering of history meant to develop the faith of the reader. [5]

At about the same time, film director Ryan Little offered his film Saints and Soldiers (2005), which placed a Mormon soldier suffering from post-traumatic stress disorder within a story of courage and intrigue supposedly inspired by real-life events. The Mormon soldier, along with several other escaped prisoners of war, struggled to help a crashed British pilot carry essential intelligence back to Allied lines. Mormon Corporal Nathan Greer gained his gentile companions’ respect and ultimately lost his life in his efforts to protect their lives and mission. Greer demonstrated his ability to follow Christ by giving his life for the sake of his friends. Similar to the argument made through the editing choices in Saints at War, Saints and Soldiers offers a portrayal of how Mormons lived their religion while in the trenches.

In many ways, such Mormon popular portrayals of Latter-day Saint wartime service followed the same basic pattern set by secular portrayals of World War II in the last two decades. This occurred, in part, because stories about Mormon participation in World War II fit into a narrative of Mormon integration into mainstream American society. World War II represented a moment when Mormons answered a call to service in the same way as other communities throughout the country. Popular Mormon portrayals of World War II generally focus on individuals rather than institutions. While revealing occasional weaknesses or misjudgments, LDS soldiers demonstrated the ability to hold to a set of admirable principles. These Mormon popularizers added the component of faith to the courage, humanity, patriotism, and loyalty that defined the “greatest generation.” Personal flaws were ignored, downplayed, or utilized as adversities overcome by courage. In popular LDS portrayals, Mormons emerged as premier examples of the “greatest generation,” elevated by their willingness to adhere to values above and beyond those of their ordinary companions and consequently blessed with an insight that allowed them to achieve and endure great and difficult things. Should anyone doubt the enduring legacy of the “greatest generation” on popular Mormon thought, consider Coach Bronco Mendenhall’s decision to brand his BYU football team a “Band of Brothers.”

Historians must hold the Mormon appendix to the “greatest generation” myth up to the same critiques forcefully advanced by Kenneth Rose in his book Myth and the Greatest Generation: A Social History of Americans in World War II. The focus on exemplary soldiers during their most courageous moments obfuscates the many terrible injustices perpetrated by American soldiers and society during the war. Rose busts myths about the conflict both at home and abroad. Consider, for example, the legacies of sexual violence in every theater of war. American supplies placed U.S. soldiers in a position of abundance even as the war forced them into environments of scarcity. For every soldier who used this abundance to give candy to children and feed starving refugees, there were others who leveraged their access to food into sexual violence. One of the terrible legacies of American occupation was the number of illegitimate children soldiers left in their wake. Sometimes they found women eager to please them; at other times they took what they wanted. U.S. victory and immunity from local law enforcement allowed a certain segment of American soldiers the freedom to rape and pillage even as they worked to make the world safe for democracy. [6]

In addition, focusing on individual stories of valor keeps the U.S. public from dealing with the legacy of the mass slaughter of civilians perpetrated by the U.S. military and its allies during the war. More civilians died in World War II than soldiers. American pilots indiscriminately bombed civilian targets throughout Germany and Japan. In Dresden alone, over 20,000 people, many of them civilians, perished by fire. The firebombing of Tokyo killed 100,000 people and left almost no building standing. The atomic bomb at Hiroshima directly killed 80,000 people, and the completely unnecessary bombing of Nagasaki killed upwards of 50,000. Tens of thousands more died from the effects of radiation. Military leaders advanced theories that such tactics saved U.S. lives, but this argument cannot mitigate the fact that Americans purposely targeted hundreds of thousands of civilians during the war. [7]

While historians offer many other criticisms of U.S. war strategy and policy abroad, my work focuses particularly on the injustices perpetrated by the U.S. military and government at home. The World War II army was a segregated army. African American soldiers often performed more menial tasks than white soldiers. They faced discrimination from officers and fellow soldiers. Often the discrimination proved so terrible that soldiers returned home to the United States determined to fight for civil rights so that they might avoid such degrading service in the future. Another little-known tragedy of the war was the treatment of pacifists such as Quakers. Men subject to the draft whose religious convictions kept them from fighting were placed in work camps in rural places such as eastern Oregon, where they cut lumber for the war effort under very difficult conditions. It is fairly well known that Franklin Roosevelt denied many Jews trying to escape the coming Holocaust entry to the United States. American officials denied the existence of concentration camps until their own soldiers started encountering them firsthand. Such willful ignorance demonstrated the widespread anti-Semitism held by many Americans before and during the war. Finally, the government incarcerated more than 100,000 Japanese Americans–most of them without any evidence besides the country from which their parents emigrated. The claim that the necessities of war justified this blatant violation of Japanese American citizens’ civil rights proved inadequate, and government officials lied to the Supreme Court in their effort to cover their mistakes.

In the end, historians agree that leaders and soldiers from Germany, Japan, and Italy committed terrible atrocities both at home and abroad. Hitler lived up to almost every villainous claim made against him and more. Many of the people in these countries bought into the hateful policies and rhetoric propagandized by their leaders. Historians also acknowledge the bravery and conviction held by soldiers like the ones depicted in recent popular culture, both Mormon and secular. While the stories often omit darker details and have been reformulated to fit into popular narratives, many American soldiers fought bravely and with great humanity. Nonetheless, the conception of World War II as a “good war” and its soldiers as the “greatest generation” offers a representation of a struggle that never existed. American leaders made difficult and sometimes evil choices. When Americans and Mormons forget that all conflicts create as much darkness as light, it becomes easy to forget that “war is hell.”

___________________________

[1]  J.M. Winter, “demography of the war,” in The Oxford Companion to World War II, eds. I.C.B. Dear and M.R.D. Foot (Oxford and New York: Oxford University Press, 1995), 289-292; the Companion also makes clear that casualty statistics are notoriously unreliable.

[2] Guy Westwell, “One Image Begets Another: A Comparative Analysis of Flag-raising on Iwo Jima and Ground Zero Spirit,” Journal of War and Culture Studies 1, no. 3 (2008): 325-340.

[3] Stephen E. Ambrose, Band of Brothers: E Company, 506th Regiment, 101st Airborne from Normandy to Hitler’s Eagle’s Nest, S & S Classic Edition (New York: Simon and Schuster, 2001); Tom Brokaw, The Greatest Generation, 2nd edition (New York: Random House, 2004).

[4] Brokaw, 11-12.

[5] Robert C. Freeman and Dennis A. Wright, Saints at War: Experiences of Latter-day Saints in World War II (American Fork: Covenant Communications, 2001).

[6] Kenneth Rose, Myth and the Greatest Generation: A Social History of Americans in World War II (New York: Routledge, 2007).

[7] Oxford Companion to World War II, s.v. “strategic air offensives.”


The Mormon Cancer, 1 of 2: Mountain Meadows Massacre

By August 11, 2013


Over the past few months I have posted on figures of speech involving Mormonism. To the Mormon Octopus, Robot, Hydra, and Upas Tree I now add “the Mormon cancer.” [1] Like so many of the negative characterizations of Mormonism, we begin our tour of Mormon cancers with John C. Bennett, who in 1842 wrote:

Nothing short of an excision of the cancer of Mormonism will effect a cure of that absorbing delusion, and the strong arm of military power must perform the operation at the edge of the sword, point of the bayonet, and mouth of the cannon. [2]

Bennett uses a fully developed surgical metaphor: Mormonism is a “cancer” in the present-day sense of a malignant tumor, and the surgical “operation” to remove it is military action. The “cure” part of the metaphor is, I think, the most important for interpreting the Mormon reaction to cancer metaphors. As I will suggest below and next week, into the 1900s Mormons–not without cause–understood cancer metaphors as calls for organized violence against them.

Continue Reading


Guest Book Review: Dominic Martinez on “Remembering Iosepa”

By August 10, 2013


Dominic Martinez {dominic.martinez AT ucdenver.edu} is currently a doctoral student at the University of Colorado Denver in the School of Education and Human Development with a focus on Leadership for Educational Equity. He has presented papers titled “The Iosepa Voyage: The Reconstruction of Hawaiian Voyaging within Mormon Context” and “Iosepa, Utah: Reclaiming History Through Connectedness” at national conferences. The Juvenile Instructor is pleased to share his review of Kester’s book on Iosepa.


Matthew Kester. Remembering Iosepa: History, Place, and Religion in the American West. New York: Oxford University Press, 2013. vii + 203 pp. Photographs, notes, bibliography, index. Hardcover: $44.35. ISBN 978-0-19-984491-3.


I had the opportunity to meet J. Matthew Kester in the summer of 2009 when I was in Hawai‘i conducting research for my Master’s thesis on Polynesian Mormons. I was thrilled to meet this exceptional scholar with his laid-back, surfer-dude personality. Our conversation focused on three main subjects: the history of Brigham Young University–Hawai‘i; a character from the Book of Mormon named Hagoth, who is speculated to have been one of the first ancestors of the Polynesian population; and Iosepa, a community in Utah founded by Mormon Hawaiians. Knowing his passion for the history of Mormonism and Hawaiian culture, I was pleased to see that his first published book is on Iosepa–a space, according to Dennis Atkin, that has not been researched enough (1). Other than Atkin’s Master’s thesis, his chapter “Iosepa: A Utah Home for Polynesians” in Voyages of Faith: Explorations in Mormon Pacific History (2), and Tracy E. Panek’s chapter “Life at Iosepa, Utah’s Polynesian Colony” in Proclamation to the People: Nineteenth-century Mormonism and the Pacific Basin Frontier (3), little attention has been paid to this Mormon colony for Polynesians in the West.

Continue Reading


Things I Did Not Know: Dinosaurs in the Manti Temple (Edit: New Images, ht Mina)

By August 4, 2013


A few weeks ago, I worshipped in the Manti Utah Temple for the first time. My parents were endowed, married, and sealed there, so it is a special place to me. Amidst my devotions and pondering, I was somewhat taken aback to find paintings of Mesozoic reptiles on the wall of the Creation Room. [1]

Continue Reading


Juvenile Instructor 2.0

By August 2, 2013



In just less than three months (on October 26, to be precise), the Juvenile Instructor will mark its sixth anniversary. To celebrate the occasion, we will be rolling out a few changes over the course of the next few months—some cosmetic, some content.

Continue Reading


“Free Toleration and Equal Privileges in this City”: Religious Freedom in Mormon Nauvoo

By July 31, 2013


Several years ago I reviewed David Sehat’s then-new book, The Myth of American Religious Freedom. Published in 2011, the book was intended as a corrective to what Sehat characterized as the conventional idea that Americans celebrate an unbroken and unblemished tradition of religious liberty. Demonstrating that America’s record of toleration and freedom isn’t flawless, Sehat chronicled many episodes of religious discrimination during the nineteenth century. Although, as many revisionist texts do, Sehat’s book may have overcorrected, he introduced an important new awareness of the historical reality not only of religious persecution but also of subtler forms of establishment coercion that existed in the land of the free during the nineteenth century. Mormons were, quite naturally, a constituency of Sehat’s work, though most of his focus was elsewhere. I expressed in that post my opinion that Mormonism presents a natural point of entry for the study of religious freedom in America. Because of their controversial practice of polygamy and their broad assumption of political autonomy, Mormons were at the center of much national debate over the boundaries of religious freedom in the latter half of the nineteenth century, and this is something that scholars like Kathleen Flake, Sarah Barringer Gordon, and now Leigh Eric Schmidt have worked on in various ways. [1] Relatively less has been said, though, about how early Mormons themselves conceived and understood religious liberty. How did this eminently democratic idea, resting on a premise of ideological pluralism, square with Mormon political theology?

Continue Reading


Mormonism’s Possible Political Theologies: Reading the Constitution through a Lens of Continuing Revelation, Part I

By July 30, 2013


Though one can trace a correspondence between Mormon scriptural and legal hermeneutics back to Joseph Smith, that indirect correlation has evolved in relation to ecclesiastical schisms and shifts and broader social and political developments. Despite recent criticisms, the equation between Mormonism and constitutional conservatism that developed in the wake of the Cold War era and that found embodiment in the person of Ezra Taft Benson remains a truism for some Latter-day Saints, many of whom embrace a scriptural literalism. A number of Saints uphold the Constitution as “A Heavenly Banner,” to be placed alongside the LDS canon. Indeed, mistrust of executive, legislative, and judicial interpreters leads some to insist on originalist interpretations (which, of course, are still interpretations) of the Constitution, while evidencing an openness to non-originalist interpretations of scripture, or at least to the readings of their leaders, which might be understood as literal.[1] While one can formulate defensible arguments that scriptural literalism and conservative constitutionalism are fruits of Mormonism, I want to suggest that the seeds of quite different approaches to sacred scriptural and legal texts can be found in the rich soil of early Mormon thought. Within the Mormon framework, accepting a text as sacred does not necessarily demand strict or literalist readings and may even call for alternative approaches. Before tracing out these potentialities in a subsequent post, here I aim to suggest that they may relate to broader intellectual trends and developments in antebellum biblical and constitutional interpretation.[2]

Though not the only force directing constitutional thought in the antebellum period, the South’s peculiar institution uniquely forced many Americans to reconsider the Constitution’s place in the present. Since the 1830s, a number of radical abolitionists concluded that the nation’s preeminent legal document had worn out its welcome and joined William Lloyd Garrison in dismissing it as a “covenant with death” and “an agreement with Hell.”[3] Such figures accepted the proslavery interpretation of their opponents as historically accurate and then condemned the Constitution as an outdated and immoral creed. They proved willing to throw out the Bible as well.[4] Other abolitionists, including Gerrit Smith and Frederick Douglass, advanced antislavery readings of the Constitution. Like Garrison, they appealed to the Declaration of Independence, but they read Madison’s text in light of Jefferson’s.[5] Static proslavery and antislavery readings dominated antebellum constitutional interpretation, leading to pro- and anti-constitutional readings, but some interpreters began to propose readings that valued the Constitution as an adaptable document “suited to time,” a kind of “raft, which should bend and yield, take the very shape of the waves, let the water in and out freely through its seams and junctures, and by its loose couplings and elastic movement divide and dissipate the force of any sudden shock.”[6]

The emerging view of the Constitution as malleable corresponded to a view of the Bible as a moldable book, a discussion that arose in relation to historical examinations of the biblical text. This relationship can be seen in Unitarian-turned-Transcendentalist Theodore Parker’s writings. His deep engagement with biblical criticism led him to distinguish between transient and permanent biblical truths and nourished his belief in divine communication.[7] Parker believed in the Bible’s usefulness as the historical expression of true religion, but the truth he privileged most rested in a Christ that aimed to foster future Christs. Indeed, Parker echoed Emerson in suggesting that by making Christ “the Son of God in a peculiar and exclusive sense–much of the significance of his character is gone.”[8] His religion was not restricted to a place, a past, a book, or a man, but “the inward Christ, which alone abideth forever, has much to say which the Bible never told,” or, as he added in a later edition, “much which the historical Jesus never knew.”[9] Parker’s abstract and minimalist beliefs freed him from allegiance to literalist and static meanings and allowed him to posit the Bible’s malleability. When biblical scholars used historical reasoning to interpret Paul’s decision to send a slave back to his owner and to then assert a clear correspondence between that decision and the Fugitive Slave Act, it was Parker’s engagement with biblical criticism and his deep-seated belief in an innate religious guide and the progress of religion that led him to accept that interpretation as historically accurate and to then dismiss it as historically dated. In the early 1840s, he lamented that “men justify slavery out of the New Testament, because Paul had not his eye open to the evil, but sent back a fugitive. It is dangerous,” he warned, “to rely on a troubled fountain for the water of life.”[10]

Parker’s approach to the Bible informed his interpretation of the Constitution. In response to Moses Stuart’s proslavery Conscience and the Constitution (1850), he suggested that “there is a ‘short and easy method’ with Professor Stuart, and all other men who defend slavery out of the Bible. If the Bible defends slavery, it is not so much better for slavery, but so much the worse for the Bible.”[11] Parker was no respecter of founding documents. He asserted that “if the Constitution of the United States will not allow [the nation to end slavery], there is another Constitution that will.”[12] In referencing a higher law, Parker made it clear that he preferred “conscience to cotton,” the Bible, and the Constitution.[13] In his view, historical research evidenced that these texts contained outdated moral and legal teachings, but rather than joining Garrison in jettisoning them, he ultimately maintained that these founding texts also conveyed transcendent religious and legal truths. While dismissing strict literalist readings, Parker claimed the spirit of these sacred religious and legal texts, which allowed him to posit their inherent capacity to adapt to historical change and modern circumstances.

Much separates Theodore Parker’s hermeneutics from Joseph Smith’s, and the relationship between scriptural and constitutional exegesis is much clearer in the Transcendentalist’s thought than in the Mormon Prophet’s. While holding in mind important distinctions and differences between early Mormon thought and broader developments in biblical and constitutional interpretation, we might consider whether Smith’s unique critique of the Bible and his emphatic assertion of new revelation might allow for and even demand a reading of sacred texts, both religious and legal, in light of historical change and development. In other words, does early Mormon thought call for a reading of the Constitution through the lens of continuing revelation? And, if so, what does that look like? I leave you to consider these questions, and hope to address them in part, at least, in a subsequent post.


[1] As Stanley Fish explains, the gesture to “disavow interpretation in favor of simply presenting the text” is actually “a gesture in which one set of interpretive principles is replaced by another that happens to claim for itself the virtue of not being an interpretation at all.” In this article, Fish famously concluded that “interpretation is the only game in town.” Stanley Fish, “What Makes Interpretation Acceptable,” in Is There a Text in This Class? The Authority of Interpretive Communities (Cambridge, MA: Harvard University Press, 1980), 353, 355.

[2] On the relationship between biblical and constitutional hermeneutics, see Jaroslav Pelikan, Interpreting the Bible and the Constitution (New Haven: Yale University Press, 2004).

[3] William Lloyd Garrison to Rev. Samuel J. May, July 17, 1845, in Walter M. Merrill, ed., The Letters of William Lloyd Garrison (Cambridge, Mass.: Harvard University Press, 1973), 3:303.

[4] See, for example, William E. Cain, ed., William Lloyd Garrison and the Fight against Slavery: Selections from the Liberator (Boston: St. Martin’s Press, 1995), 29-36.

[5] For an outline of these positions and a discussion of Douglass’s slow and studied adoption of Smith’s position, see David W. Blight, Frederick Douglass’ Civil War: Keeping Faith in Jubilee (Baton Rouge: Louisiana State University Press, 1989), 26-35.

[6] “A Chapter on Slavery,” The North American Review 92 (April 1861): 492-93, quotes on 493.

[7] See Parker, “A Discourse of the Transient and Permanent in Christianity,” in The Critical and Miscellaneous Writings of Theodore Parker, Minister of the Second Church in Roxbury (Boston: James Munroe and Company, 1843). On Parker’s prolonged engagement with biblical criticism, including the writings of figures such as De Wette and Strauss, see Dean Grodzins, American Heretic: Theodore Parker and Transcendentalism (Chapel Hill: University of North Carolina Press, 2002).

[8] Parker, “A Discourse of the Transient and Permanent in Christianity,” 158.

[9] Parker, A Discourse of Matters Pertaining to Religion, 376. See Parker, A Discourse of Matters Pertaining to Religion, 4th ed. (Boston: Little, Brown and Company, 1856), 354.

[10]  Parker, A Discourse of Matters Pertaining to Religion (Boston: Charles C. Little and James Brown 1842), 375. Others, including William Ellery Channing, contended that Paul, in fact, advanced antislavery sentiment, but slavery “had so penetrated society” in New Testament times that Paul “satisfied himself with spreading principles which, however slowly, could not but work its destruction.” Channing, Slavery (Boston: James Munroe and Company, 1835), 111. On the New Testament debate over slavery, see Albert J. Harrill, “The Use of the New Testament in the American Slave Controversy: A Case History in the Hermeneutical Tension between Biblical Criticism and Christian Moral Debate,” Religion and American Culture: A Journal of Interpretation 10, no. 2 (Summer 2000): 149-186. Channing’s interpretation was a kind of originalist argument based on the New Testament authors’ original expectations of change. A similar kind of argument emerged in relation to the Constitution. This originalist expectation of change found expression, for example, in the dissenting opinions of the Dred Scott decision (1857). 
John McLean wrote that “our independence was a great epoch in the history of freedom, and while I admit the Government was not made especially for the colored race, yet many of them were citizens of the New England States, and exercised, the rights of suffrage when the Constitution was adopted, and it was not doubted by any intelligent person that its tendencies would greatly ameliorate their condition.” Similarly, in reference to the founders, Benjamin Robbins Curtis contended “that a calm comparison of these assertions of universal abstract truths and of their own individual opinions and acts would not leave these men under any reproach of inconsistency; that the great truths they asserted on that solemn occasion, they were ready and anxious to make effectual, wherever a necessary regard to circumstances, which no statesman can disregard without producing more evil than good, would allow; and that it would not be just to them nor true in itself to allege that they intended to say that the Creator of all men had endowed the white race, exclusively, with the great natural rights which the Declaration of Independence asserts.” Dred Scott v. John F. A. Sandford, 60 US (19 Howard) 393, 537, 574-75 (1857).

[11] Parker, “The Slave Power,” in The Works of Theodore Parker, Centennial Edition, 15 vols. (Boston: American Unitarian Association, 1907-1913), 11:272.

[12] Parker, “The Slave Power,” 11:285.

[13] Parker, “The Slave Power,” 11:286.


Mormon Hydra 2 of 2

By July 28, 2013


As the examples in the first post showed, a Hydra could represent an individual (Joseph Smith), an institution (the Church), or a concept. The concept-as-Hydra was probably most common, implicating ideas like violence or fraud, usually with some specific incident(s) under discussion as an individual head (or heads) of the larger monster (for non-Mormon-related examples, see images below). [1]

[Image: composite of non-Mormon Hydra examples]



Guest Post: Brittany Chapman on Ruth May Fox, Mormon Women, and Political Rights

By July 25, 2013


[Today’s contribution to this month’s Mormonism & Politics series comes from Brittany Chapman, who basically runs the Church History Library nowadays.]

“Stronger than my political convictions,” wrote suffragist Ruth May Fox, “was my belief in the political rights of women.”[1]

I’ve been thinking lately about how women view themselves, and the seemingly monumental change in that perception since the nineteenth century. Often when we speak of women in politics during that period, we instantly mark “suffrage” as one of woman’s greatest achievements. Our nineteenth-century heroines are those who touted women’s advancement in the public sphere: education, employment, and, most heralded, the vote. Rightly so. Now four or even five generations removed from that innovation, the value of universal suffrage is obvious, and marginalizing woman’s voice at the ballot box is unthinkable. It is easy to assume that the value of the vote was always obvious and that every woman always wanted it. But alas, such was not the case for hundreds of thousands of women. So, who were the women who did not want the vote, and why? What were they saying? And, at the root of it all, how did they view themselves?

There is a fascinating piece by Susan Fenimore Cooper (the daughter of novelist James Fenimore Cooper) entitled “Female Suffrage: A Letter to the Christian Women of America.” Cooper, well-read and well-bred, represented a preponderance of women when she argued that they should not have the right to vote. In the same breath, she advocated women receiving higher education, equal pay for equal work, and other basic equalities. How did these seemingly inconsistent ideas of equality coexist?



Did her Religion Matter? Lenore Romney as Political Wife and Candidate

By July 24, 2013


The late Lenore Romney, the mother of 2008 and 2012 presidential candidate Mitt Romney and the wife of George Romney, Republican governor of Michigan from 1963 to 1969, came into the spotlight in 2012 when both Time magazine and the Washington Post featured stories covering her effect on her son’s political career. Both stories featured her failed run for a U.S. Senate seat from Michigan in 1970. Compared to the contemporary images of Ann Romney as a housewife, what was most striking about these stories was not that Lenore Romney lost the election but that she had run for office at all. It should be noted that Ann Romney also ran for and won public office in the 1970s: she was elected as a town meeting representative in Belmont, Massachusetts, in 1977.[1] However, probably because she has not pursued a political career of her own, that story has fallen mostly by the wayside since her husband stepped into the political spotlight.


