A New Museum, A New Perspective

by Charles Hubbard

Recently, the country has engaged in a vigorous debate about the treatment and interpretation of Civil War artifacts. The treatment and use of the Confederate battle flag has elevated the discussion into the public policy arena. For over a century, statues of Confederate leaders and soldiers have dotted the grounds surrounding public buildings and parks. Other symbols and memorials dedicated to remembering the so-called “lost cause” have provoked debate and legislation in several state legislatures. Public historians and museums are grappling with the question of how best to interpret the Civil War so that the public appreciates the impact of the war on contemporary American society.

On May 4, 2019, the new American Civil War Museum in Richmond, Virginia, opened to the general public. Richmond has always been an essential destination for Civil War buffs, enthusiasts, historians, and scholars. Richmond was the capital of the Confederacy, and the surrounding area was the site of over 40% of the battles of the Civil War. The new $25 million facility is located along the historic James River waterfront and includes the ruins of the Tredegar Iron Works. Certainly, the remains of the destroyed ironworks, where enslaved laborers produced cannons for the Confederacy, are a fitting place to reinterpret the war. The permanent exhibit includes over 500 artifacts, including Robert E. Lee’s hat, along with many tattered battle flags from both sides of the conflict. The first gallery in the museum includes a spectacular exhibit on the Henry House, which was blown to pieces during the First Battle of Bull Run. The widowed Mrs. Henry was killed when she refused to leave her house as the battle raged around her. The exhibits deliver the message that the American Civil War touched all Americans: white, black, North, South, Native Americans, and recent immigrants. The legacy of the Civil War reflects both the cultural diversity and the inclusiveness that define the American experience.

The new museum is the result of a merger between the 120-year-old Museum of the Confederacy and the American Civil War Center. The two institutions coexisted in Richmond for years even though both focused on the Civil War. Christy S. Coleman, director of the American Civil War Center, and S. Waite Rawls III, director of the Museum of the Confederacy, after lengthy discussions agreed to merge the museums and present a more comprehensive and historically accurate interpretation of the war and the social revolution it created. The Museum of the Confederacy had traditionally supported Confederate apologists, and a conscious decision was made to abandon the “Lost Cause” narrative while still maintaining respect for the courage of the Confederate soldier. Efforts are underway to digitize and make available the extraordinary archival material, which includes the private papers of numerous Confederate political and military leaders as well as the personal correspondence and diaries of Southern citizens, to further explain the passions and emotions of Southerners during and after the war.

The American Civil War Museum will continue to generate debate and discussion, but perhaps the stories revealed in these galleries are the best way to incorporate the legacy of the Civil War into our personal understanding of the overall American experience.


Success or Failure? 2019 Lincoln Memorial University Commencement Address

by Dr. Allen C. Guelzo

Dr. Allen C. Guelzo is the Henry R. Luce Professor of Civil War Era History at Gettysburg College and a leading scholar of Lincoln and the Civil War. On May 4, he delivered the commencement address for Lincoln Memorial University. His remarks, which are framed around the institution’s namesake, are presented below:

President Hess, distinguished trustees and faculty, and honored graduates, students and guests: I have come to you today to talk about failure.

That will surprise many of you, since any failures among this graduating class are, logically speaking, not supposed to be here. We are all here supposed to be successes. What’s more, commencement addresses are routinely expected to be celebrations of success, or exhortations to success; they are not expected to take so morbid a turn as to talk about failure, and thus rain on the academic parade. And backing all that up are the great maxims of our society, “Laugh and the world laughs with you; cry, and you cry alone.” “Nothing succeeds like success.” “Everybody loves a winner.” Success is glamorous; failure is a reproach, and such a bleak reproach to our personal qualities, that we can hardly even bear to use the word. The word failure has become what the word cancer was a generation or two ago — something to be discussed only in hushed tones, behind closed doors. But unless you are something other than human, you will experience failure. And an important part of your education is learning how to meet it; in fact, the most important part of your character will be shaped by how you encounter and overcome it.

I take as my guiding star today a comment made — we think in 1856 — by the man for whom this college is named, Abraham Lincoln. In 1856, Lincoln had, to the outward appearance, all the trappings of success. From early poverty he had risen to become a prominent lawyer, and in the mid-1850s his legal practice, especially with the railroads, had made him one of the most financially successful lawyers in Illinois, with an annual income that could easily translate into six figures, in modern terms. He had served in the Illinois legislature, he had been elected to a term in Congress, and had just made a nearly-successful run for the Senate in 1855. In the year we think he wrote the comment I’m going to read, Lincoln had been put forward for vice-president on the ticket of the new Republican party (even though in the end he fell short of getting the nomination). Any one of you who had racked up ribbons like that before your fiftieth birthday might well feel entitled to think of yourself as successful, and that’s probably how we would think of Lincoln.

But he did not. Instead, he compared himself to an even more well-known Illinoisan, Stephen A. Douglas, and in 1856, Lincoln did not like what he saw in that comparison.

“Twenty-two years ago Judge Douglas and I first became acquainted. We were both young then; he a trifle younger than I. Even then, we were both ambitious; I, perhaps, quite as much so as he. With me, the race of ambition has been a failure – a flat failure; with him it has been one of splendid success. His name fills the nation; and is not unknown, even, in foreign lands.”

“I affect no contempt for the high eminence he has reached,” Lincoln added. But it was obvious that by his own standard, Lincoln felt that he had spent his life for nothing.

One hundred and fifty-four years after Lincoln’s death, we read those words and smile, knowing that in the years following those sad reflections, Lincoln would become the most revered figure in our history. But he did not know that then, any more than we can know now what lies before all of us. And as numerous as his successes had been on many scores, he had not succeeded in the one thing which was the most important to him – public leadership. Perhaps we could say to Lincoln that he had simply set his sights too high, that if he counted his blessings, he would see how much he had to feel content about. I don’t think that would have been a very good tactic to employ on Lincoln. Or perhaps we could try bucking the tall man up with some inspirational reflections – sing him a tune about climbing ev’ry mountain, walking through storms with your head held high, and so on.  I don’t think it would have done him a bit of good. Nor should it. Lincoln’s talents for public leadership were real, and more real than Stephen A. Douglas’s. In a world of fairness, Lincoln should have been in Douglas’s seat. And he knew it. And not being there was, he knew, a mark of failure. As it was for Lincoln, so there will come times for all of you when you will have to drink of failure to the dregs. Will you smile then? Has what you have learned in life and here at LMU prepared you to deal, not just with success, but with failure?

First of all, I want you to remember that not all failures are failures — they only seem that way in the eyes of a society which measures success by trophy homes and prestige toys. And not all successes are successes. For instance: there was once a man so enamored of showing off the wealth he had accumulated that he decreed that, when he died, he should be buried in his gold Cadillac. When the day for his funeral arrived, his corpse was duly propped up behind the wheel, and his pall-bearers pushed the gold Cadillac into the cemetery and down a ramp into a specially-dug grave. As the gold Cadillac, with its deceased owner still at the wheel, descended into the grave, one of the grave-diggers, beholding the scene, remarked to another: “Man, that’s livin’!”

And (I repeat) not all failures are failures. Some of you will dedicate your lives to callings which are noble but poorly rewarded in our world. Today, when you are young and strong and full of passion for your mission, you will push away the thought of how meagre those rewards are likely to be for the joy of serving others. But there will come a time – perhaps in the form of a cutting remark from a neighbor or a family member, or a class reunion you can’t afford to attend, or the ingratitude of the people you’re trying to help — when your strength and enthusiasm will wane, and you will wonder if you have been the butt of life’s joke, a failure. At that moment you must ask yourself – by whose standard? If you can stop and ask that question, with regard only for that inward monitor which gives praise and blame according to what you love and value the most, then your courage, your persistence, your perseverance make you the most successful of all. And in their hearts, everyone will know it, and wish they had your success, not theirs. Our culture is full of people who have acquired tremendous wealth; the misery they live with is that they go begging for significance.

By whose standard do you judge success and failure? Unless you have that standard within yourself, you will never know how to measure either. There was once a doctor named Williams who determined to devote his life to serving the poor and disabled in a great city. His patients often had nothing to pay him with, and he had only a small office at the top of a flight of stairs, over a liquor store, marked with the single sign, “Doctor Williams is upstairs.” The years passed, and Doc Williams died. His friends found that he was so poor himself, that there was no money left over to erect a marker on his grave. But one wise man knew what to do: he took down the old office sign and planted it on the doctor’s grave: “Doctor Williams is upstairs.” By our superficial standards, Doc Williams was a failure. Or was he?

The second thing I want to tell you is how often failure is the crucible out of which real success arises. Soichiro Honda, the founder of Honda Motor Corporation, once said: “Success is 99% failure.” And on the strength of that, I must tell you that I will feel sorry for you if your course in life leads you from one easy triumph to another, because if it does it will give you the arrogant notion that you are totally self-sufficient, and need no one else in the world to teach you anything. I have met people like that; they are wealthy, they have all the toys, but they are some of the most dreadfully stunted human beings I know. They never failed; and therefore, they never learned anything which might be more important than winning, such as confession or forgiveness or recovery.

Failure is a teacher. It brings us back to fundamentals; it disenthrones our egos, makes us see ourselves for what we are, and shows us what we do not yet know. Do not be afraid to make mistakes; my mistakes (and they are more than a few in number) are my most important possessions, because they are what I have learned from. A reporter once asked a bank president to identify the secret of success. “Right decisions,” he replied. “Great,” said the reporter. “Now, how do you learn to make right decisions?” “Simple,” the president replied. “Experience.” “Well,” said the exasperated reporter, “how do you gain experience?” The president replied, “Wrong decisions.”

Failure also tells us who our friends are. As Oprah Winfrey once remarked, “Lots of people want to ride with you in the limo, but what you want is someone who will take the bus with you when the limo breaks down.”

Part of what makes Abraham Lincoln so admirable, so interesting to us even today, was that he had tasted the bitterness of failure, and was willing to take the bus with those who had failed. In 1860, he took time out from his presidential campaign to write to a friend of his son, Robert, who had failed the entrance exams at Harvard. “I have scarcely felt greater pain in my life than on learning yesterday…that you had failed to enter Harvard University,” Lincoln wrote to eighteen-year-old George Latham. There was empathy, the understanding of one who had been there, not the haughty superiority of one who looked down the nose. “And yet,” Lincoln continued,

“there is very little in it, if you will allow no feeling of discouragement to seize, and prey upon you. …I know not how to aid you, save in the assurance of one of mature age, and much severe experience, that you can not fail, if you resolutely determine, that you will not.”

Listen to Mr. Lincoln. Like him, I hope your paths will be strewn with success. Like him, I also know that it may be spiked with failure. Learn to embrace the failures as much as the successes, so that like him you may become what is more important even than being a “success” – becoming mature, resolute, persevering, hoping for all things, enduring all things, expecting all things.


Democracy in “The Federalist”

By John Grove

In January, Dr. Grove published an article in Polity examining the role of the public will in The Federalist Papers. Reflections takes its name from Alexander Hamilton’s iconic statement about the possibility of government directed by “reflection and choice.” Yet The Federalist is not exactly positive about democracy. Throughout, it indicates that democracy is prone to factionalism: the division of society into groups devoted to pursuing their own narrow interests rather than those of the whole. At the same time, The Federalist also indicates that it is important for any government to be “popular,” deriving its authority from the people. In understanding exactly what The Federalist says about democracy, Dr. Grove argues that it is important to remember that it was written by different authors, and that those authors may have had slightly different viewpoints on what exactly is problematic about democracy and how a good government can address these faults:

The two primary authors of The Federalist, Alexander Hamilton and James Madison, offered differing and conflicting accounts of the precise cause of factionalism and the manner in which the public will could safely be accommodated within the constitutional system. Alexander Hamilton believed that demagogic leadership was responsible for stirring up the otherwise politically apathetic citizenry into factional groups. The common citizen, he believed, was naturally uninterested in politics, preferring to focus on his own private life. This also meant, however, that the common citizen is relatively uninformed about political life and therefore susceptible to clever politicians who know just how to “flatter [the people’s] prejudices and betray their interests.”[1] He even calls these petty politicians “parasites and sycophants” who are willing to sacrifice the permanent good of society in order to win a position of power for themselves. As such, Hamilton believed the key to a successful political system was constructing it in such a way that allowed better leaders, those who cared not about gaining temporary popularity, but about achieving great things for their country, to occupy positions of authority. He believed the presidency was the key to this: Public opinion could be unified around the person and character of the president, preventing faction so long as that office was held by a person of great vision and high ambition. As such, Hamilton put his faith in the Electoral College, the unlimited number of terms available to a president, and the robust powers of the office to attract the highest quality of leader.

James Madison, however, saw factionalism arising naturally, without any impetus from poor leaders. “The latent causes of faction” were “sown in the nature of man,” he wrote.[2] This meant that, whatever quality of leadership may exist, factions will always arise in popular governments. Therefore, they must be accommodated and moderated in the best way possible. Madison relied not on the president to do this, but on a carefully crafted and limited legislature capable of refining the public will. Representatives would naturally reflect some of the biases of their constituents, but they would be placed in an environment which allowed for and encouraged healthy dialogue and compromise on the common good. Their terms would be just the appropriate length to provide a degree of independence from the factional will of the people, while still being ultimately answerable to it; the size of the legislature would be too large for casual intrigue and corruption, but too small to devolve into a mob; and the constitutional limitations on Congress’ power would mean representatives would be discussing broad, national issues and avoiding the local concerns most likely to pit factions against one another.

Understanding The Federalist in this way allows it to illuminate questions which continue to press on us today: To what extent are the divisions in American politics caused by naturally arising identity and interest groups, and to what extent are they stirred up by the rhetoric of provocative leaders? Do we find the solution in unifying leaders, or in deliberation and dialogue? As is often the case, a careful reading of this great work can continue to offer wisdom and a framework within which to consider these political puzzles.

Dr. Grove’s full paper can be found here for those with access.

[1] The Federalist, No. 71

[2] The Federalist, No. 10



The Habits of Scholarship

By John Grove 

The following is an abridged version of remarks prepared for the keynote address for the Blue Ridge Undergraduate Research Conference held at LMU on Friday, April 5th, 2019. Dr. Grove was unable to deliver the remarks, but is happy to present a brief version of them here:

Ask any academic, and they’ll tell you one of the more difficult things we’re sometimes asked to do is interdisciplinary collaboration. Few things challenge us more than having to think outside the small area of expertise in which we’ve been trained and find unifying themes across different areas of study. So it is quite a challenge to write keynote remarks for a conference as broad as this one.

What unites all these different kinds of researchers? It is much easier to see what does not unite everyone here: First, academic discipline: We have students from psychology, natural sciences, sports medicine, humanities fields and many others. Second, research interests: The research presented here runs the gamut from the herbaceous species at Pine Mountain to the effect of parenting styles on life satisfaction; from learning assessment styles to Unionist sentiment in East Tennessee during the Civil War; from Jonathan Swift to the therapeutic use of Legos. Finally, methodology: We have literary analysis; we have controlled experiments; we have statistical analysis; we have the firsthand collection of data on tick DNA; we have artistic work; we have theory application and any number of other methods of obtaining knowledge.

So how is it that we can say everyone here is a “scholar”, and that the work presented here is all “scholarship” if the content of the research is so different? There are surely many ways to approach this question. Perhaps one way to consider it is to look at the people presenting the work. Perhaps we are all scholars because we are all scholarly. Our pursuit of knowledge is marked by certain characteristics and habits. So I want to put a question to you, and then some possible answers: What are the habits of scholarship? What makes a person “scholarly”?

Dictionaries don’t help much: the consensus definition, “of or relating to a scholar,” is not terribly helpful. The word might have some connotations – if you say someone is particularly “scholarly,” it probably means that person is intelligent but abstract, out of touch with everyday life. I would like to put forward some more meaningful suggestions about what it means to be scholarly. These, I will preface, are not empirical observations: I certainly don’t mean to say that scholars always live up to these qualities. They are largely normative – what I think a scholar ought to be. But they are also characteristics that I think good scholarship hammers into us. They are habits that, if we regularly conduct high-quality scholarship in an honest way, may eventually be ingrained into our character. Scholarship makes us think and act in certain ways. This is certainly not an exhaustive list, and I’m quite sure there are plenty of critiques that can be lobbed its way. I hope my list, though, might prompt some thought about the ethos of scholarship – what it means to act in a scholarly way.

The first is intellectual humility. Certainly, the first characteristic that comes to mind when one thinks of academics is probably not “humility.” Academics have a not-altogether-undeserved reputation for pride. But conducting good research regularly reinforces the truth that we cannot presume to know answers. Regardless of our methodology, we cannot simply make assertions which we assume to be true. We must demonstrate the veracity of our claims with evidence and logical reasoning, both of which are open to critique by others. We always ought to be faced with the burden of proof. If we take that seriously, it drives home to us just how little we know instinctively, and how difficult it can be to justify even those claims that seem to us the most obvious in the world.

Second, and relatedly, is the ability to accept failure and turn it into something constructive. Scholarship is not terribly egalitarian when it comes to outcomes: There are good papers and there are bad ones. Good arguments and bad ones. Good, well-constructed experiments and bad ones. We use a peer-review system in scholarly publishing precisely to separate the wheat from the chaff; to determine what is truly worthy of being published and read by others. This inevitably means that anyone working in this arena will experience some form of failure. Someone will, at some point, point out a vital mistake; a significantly flawed argument; a misapplication of theory. I went back and found some peer review comments on some of my published work: “there are places in which this manuscript makes implausible claims on matters of some significance”; “I hope this is not unkind, but there is something musty in this article”; “The article says the author will address an eternal but ‘dormant’ question. It should remain dormant”; and my personal favorite, a comment made at the very first conference at which I presented as a graduate student: “I’m sorry, he’s just wrong!” A scholar must take these in stride, recognizing that temporary failures can point the way to greater success. All of the comments I just mentioned were made on papers that were eventually published.

Third, is an appreciation of knowledge for its own sake. When I was a graduate student, my dissertation advisor related to me the common metaphor for how an academic gets noticed and makes a name for himself: Find the biggest, tallest tree in the forest, and take an axe to the roots. What this means, of course, is that you should find the most influential, well-known and respected research on your topic, and prove it wrong. I would add a caveat, however: Realize that you probably won’t be able to chop the tree down. Be content if you’re able to chip away some of the bark.

Why would anybody be content with chipping away at the bark of a giant tree after setting out to chop it down? You wouldn’t: that’s where the metaphor breaks down. In most professions (including logging, I would presume), you would be discouraged if, after years and years of work, you managed only to make a tiny dent in what you set out to do. But when you deal in knowledge, the calculus is a bit different. Most of us making a living as scholars have had to come to accept the fact that we probably won’t revolutionize our fields of study. Our research won’t “change the world” – at least not in the direct, large-scale sense in which we often use that phrase. A 2007 study indicated that as many as half of all published academic papers are read by precisely no one beyond the author, the peer reviewers, and the journal editor. A different study found that a quarter of all papers in the natural sciences are never cited; 32% in the social sciences, and a whopping 80% in the humanities. Now, there are some questions about the veracity of these statistics. Nevertheless, it’s clear that many papers that students and professors pore over for months and years will never be read or cited by anyone else, leading one blogger on Intellectual to ask, “why are professors writing crap that nobody reads?”

So, why are we? Part of the answer is the hope that our paper will be the one to make a difference; that it will be cited, it will influence others, and it will change the world. But I think most of us have to find an appreciation for knowledge for its own sake: the conviction that uncovering some truth, not just for the world but for ourselves – that knowing something we did not know before – is worth the difficult task of learning it.

Fourth, good scholarship develops a habit of critical skepticism toward claims of knowledge. All it takes is a quick perusal of social media posts these days to observe that most folks could use a bit more critical skepticism in the way we interact with all sorts of material. When we recognize how much it takes to justify a claim in our own research, we’re not as easily taken in by those who claim to know it all.

But we don’t end with skepticism and criticism. Having been a father for the past two years, I’m fascinated by watching the way my daughter grows up and learns about the world around her for the first time. One thing that is hard to miss is how much easier it is for her to destroy things than to put them together. From building blocks to puzzles, she immediately spots the weaknesses and pulls them apart. Putting them back together, however, requires a bit of help and patience. Scholarship also demands that we do both of these: deconstruct and construct; critique and build up. Not only do we critically evaluate the work of others, we must also put forward better explanations – positive answers to the questions that we are exploring. We build theories, offer alternative explanations, and build on the scholarship that has come before ours. This is the imaginative element of scholarship: the ability to piece things together, see beyond the specific, narrow research questions that we examine, and appreciate a wider picture.

None of this should be taken as self-congratulation. Scholars do not always embody these traits; we often fight against them. We think we have all the answers. We have the same grand ambitions as people in every other field of life, which sometimes leaves us disappointed to uncover only some “minor” bit of knowledge. We get frustrated with failure. We sometimes fail to be critical when it comes to things we want to be true. And we are sometimes overly critical, satisfied only to tear down rather than build up our understanding. Yet I believe honest scholarship encourages these habits, and our hard work may not only build up the common stock of knowledge but may also make us better people.


Presidential Emergencies: Constitutional Power or Congressional Dereliction?

by John Grove

Earlier this week, Professor Hubbard made the important observation that President Trump’s declaration of emergency must be considered within a historical and constitutional context. I disagree, however, with Professor Hubbard’s general view that the use of domestic emergency powers can be seen as an outgrowth of our constitutional system of separated powers, so long as it is utilized with circumspection and is subject to judicial review. Instead, I believe presidential emergencies are fundamentally at odds with the principle of the separation of powers and pose a significant challenge to that view of our constitutional system.

Professor Hubbard is correct to suggest that the constitutional framers expected the President to be able to act with speed and unity in times of national crisis. However, their final product does not contain any indication that presidents would, after the declaration of a crisis, have constitutional authority to exercise powers beyond those delegated by Article II.

The constitution itself does not grant any specific authority on the part of the executive to suspend the normal operation of law and take on new powers. If we are to find any such constitutional authority, it must be derived from Article II, Sec. 1, which states that “The executive Power shall be vested in a President of the United States of America,” or Article II, Sec. 3, which states that the president “shall take care that the laws be faithfully executed.” This latter source I find untenable, as modern emergency declarations do not execute laws but alter them. It may be argued, however, that identifying times of crisis and utilizing the whole force of the nation in the manner best suited to address that crisis is an inherent part of “executive power” and, therefore, is constitutionally assigned to the president by Article II, Sec. 1.

There may be some evidence for this. It has long been argued that there are certain exigencies to which a legislative body, with its deliberate and slow character, cannot adequately respond. John Locke, for instance, argued that the executive authority possessed “prerogative” powers to act for the good of society outside the established law if and when the legislature is unable to act. Yet, by the time the constitution was ratified, prerogative powers in Great Britain had been systematized and limited, and many powers considered to be prerogatives of the Crown, such as the pardon power and refusal of the royal assent (veto), when incorporated into the American constitution, were specifically codified and listed. They were not left to the discretion of the president. Furthermore, the only power the constitution specifically recognizes to suspend the normal legal order in time of crisis is the authority to suspend habeas corpus. This is, however (Abraham Lincoln’s example notwithstanding), granted to Congress in Article I, not to the president.

While presidents continue to make rhetorical appeals to an inherent constitutional authority to declare emergencies, most declarations (and, indeed, all which abide by the requirements of the National Emergencies Act), rely upon specific statutes passed by Congress which authorize presidential actions in certain circumstances. Here, I believe, is the real origin of contemporary emergency declarations. Over the years, Congress has regularly and willingly ceded its decision-making authority to the executive for the sake of ease, efficiency and flexibility. This is not limited to times of emergency, as the vast majority of domestic regulation empowers executive branch agencies to decide, through administrative rule-making, substantive regulations. Declarations of emergency merely highlight this tendency in starker relief. The separation of powers has some drawbacks and one of them is that it makes it difficult to speedily adopt new policies. The regular use of national emergencies addresses this problem, but it does so from outside the framework of separated powers, as it allows for the executive branch to exercise legislative power.

Consider the National Emergencies Act. This law was intended as a restriction on presidential authority to declare emergencies, yet it still authorized the president, without Congressional input, to decide unilaterally when an emergency exists and to identify what powers he would wield to rectify it: a remarkable power. Congress initially seemed to understand just how much authority it was ceding to the president, as the original version of the National Emergencies Act allowed Congress to override a president’s declaration of emergency with a simple majority vote of both houses (the president being unable to veto the resolution). In 1983, however, the Supreme Court ruled that Congress could not simply abolish the presidential veto for certain acts of Congress, and found that portion of the act unconstitutional. The Court thereby made Congress choose: either allow the president unilateral authority to declare an emergency, knowing it can be rejected only by overriding the president’s veto, or refuse to allow the executive to usurp its legislative authority. Congress chose the former and amended the National Emergencies Act to allow presidents to veto a Congressional resolution overruling an emergency declaration, essentially removing Congress’s oversight of such acts. This is what happened last week.

Another difficulty that arises from viewing presidential emergencies as simply one part of the system of separated powers is the role of the courts. Professor Hubbard suggests that the Supreme Court ought to have final say on whether a president has overstepped his constitutional authority in a given emergency, just as it does on other separation of powers issues. But the Court cannot, and should not, play arbiter of what constitutes a national emergency or crisis. This is the essence of what the Court has called a “political question” with no clear legal answer. What does “crisis” mean? What circumstances justify a new, immediate and extra-legislative response? These are questions with no legal or constitutional answer, and they ought to be reserved for the political branches of government to decide. The Court should, of course, police the actions presidents take in the name of a national emergency to protect the rights of citizens, as it most famously did in the Youngstown Sheet and Tube Company case, but it cannot take it upon itself to decide what does and does not constitute an emergency.

This brings us back to Congress. It is Congress which possesses the constitutional authority to decide what policies and funding are appropriate in what circumstances, and it is Congress which ought to restrain presidential actions. Long ago, executive prerogative was justified by observing that legislatures were often not in session and could not easily be recalled to address a pressing crisis. In the age of internet communication, cell phones and air travel, we ought to consider whether this is still a necessity, or whether it might not be appropriate for Congress to retake its legislative authority.


Trump’s Emergency and the Separation of Powers

by Charles Hubbard

President Trump’s recent decision to declare a national emergency, allowing the diversion of previously appropriated funds to address the crisis of illegal immigration along the southern border of the United States, has focused national attention on the long-standing debate over the balance of power the Constitution authorizes among the three branches of the national government. The Constitution allows the executive branch to declare a national emergency to address an urgent national crisis. In addition, the president has the authority as commander-in-chief to exercise emergency war powers. The founders deliberately provided these vague and nonspecific powers to enable the president to respond to an immediate crisis quickly and without the delays often caused by congressional debate. However, the Constitution requires the president to seek congressional approval for all appropriations.

American presidents frequently declare national emergencies in cases of natural disasters, economic emergencies and threats to public security. However, there is a difference between a national emergency and the use of war powers. This pull and tug between Congress and the executive branch over the use of these powers has existed since the beginning of the Republic. Congress has the right under the Constitution to withhold funding to limit any abuse of the emergency powers by the president.

One of the earliest disputes over this issue arose when Thomas Jefferson unilaterally and without congressional authority purchased Louisiana from Napoleon. Jefferson, a strict constitutional constructionist, utilized a treaty agreement in order to bypass the responsibility of Congress to appropriate funding for the purchase of the vast territory. Over one hundred years later, in February 1917, Woodrow Wilson issued an emergency proclamation to address a maritime shipping shortage. Ultimately Wilson’s action led to the creation of the United States Shipping Board, whose successor agencies still regulate the Merchant Marine. In a highly controversial presidential decision, Franklin D. Roosevelt in 1933 declared a national banking emergency and ordered a bank holiday that closed the banks. The Supreme Court eventually ruled that the president’s power was limited, but it did not specify those limits or otherwise define the power of the executive to declare a national emergency.

In an effort to limit the power of the president to declare a national emergency, Congress in 1919 enacted legislation authorizing Congress to terminate or overturn actions taken under a presidential proclamation or emergency action. More recently, in 1976, Congress passed the National Emergencies Act in response to the abuses of the Nixon administration. This act requires a presidential emergency declaration or proclamation to be limited to a single, specific purpose and does not allow the president to provide for every contingency. The legislation specifically prevents the president from arbitrarily lumping multiple issues together at his sole discretion. The passage of this legislation, however, did not eliminate the fundamental problem with determining the existence of a national emergency: The president determines that a national emergency exists, and the president then determines the appropriate response to it.

Presidential war powers are distinctly different from the national emergency powers. Frequently over the years, presidents have found it necessary to resort to the use of the war powers. Certainly, Abraham Lincoln during the Civil War exercised extraordinarily broad powers under the provisions of the war powers. Theodore Roosevelt called upon the war powers to seize the Panama Canal Zone. Harry Truman went to war in Korea without a declaration of war by exercising his authority as commander-in-chief. Lyndon Johnson during the Vietnam War acted without congressional approval until the Gulf of Tonkin resolution passed in 1964. Ronald Reagan used the military in Grenada, and George H. W. Bush in Panama. The president has the constitutional right and the obligation to declare a national emergency or exercise extraordinary war powers to protect and preserve public order and security.

What then is the role of the court system as the third governing institution to provide checks and balances? It is the courts’ responsibility to determine whether there is a national emergency. In the case of the war powers, is a declaration of war required? In either case, are the actions of the president taken in response to the emergency constitutional? The legal process can be time-consuming and that is precisely why the founders granted these nonspecific powers to the executive. Ultimately, the court must have the final say.

Undoubtedly, the debate between the branches of government will continue. The Constitution, with its creation of three branches of government, each retaining specific powers, is a remarkable and unique contribution to political theory and practice. The debate and the resulting dialogue have produced limited and representative government for the people and by the people of the United States. It is important that we view the most recent use of expansive presidential power within the context of this delicate and essential balance.


The First to Die

By Stewart Harris

Elbert Williams was thirty-one years old in June of 1940.  He lived with his wife, Annie, in Haywood County, Tennessee, just northeast of Memphis.  He worked at the Sunshine Laundry in Brownsville, where he arguably had the most important job:  He kept the fire burning beneath the boiler that powered the entire enterprise.

Elbert was African-American, as were approximately seventy percent of the residents of Haywood County.  He served as a board member of the local chapter of the NAACP, which was trying to register black people to vote for the first time since Reconstruction.  Other NAACP leaders were threatened, their homes were burned down, and they fled.  Elbert, a big man and not easily intimidated, decided to remain in Haywood County.

Late one night, two Brownsville police officers showed up at the Williams home.  Elbert was just out of bed, barefoot, not dressed to go outside.  The police took him anyway.

Early the next morning, Annie went to the police station, where she tried to bring Elbert shoes and clothing.  The officer on duty—one of the two who had abducted her husband only hours before—looked straight at her and said, “I don’t know who he is.  Elbert Williams hasn’t been here tonight.”

Annie went to the Brownsville postmaster, a powerful local official.  Could he help her find her husband?  She still had his clothes.  The postmaster told her, “Maybe he doesn’t need any clothes.”

Eventually, desperately, Annie went to the Haywood County Sheriff, who finally acknowledged what everyone knew.  “Oh, Miz Williams, those boys are not going to hurt your husband.  They just want to ask him a few questions.  They’ll turn him loose.  If he’s not home in a day or two, let me know.”

Elbert Williams never came home.  On June 23, 1940, a Sunday morning, his battered, lifeless body was found in the nearby Hatchie River.  The coroner called an immediate inquest, right there by the side of the water, that same morning.  There was no autopsy, no medical examination of any kind, despite contemporaneous accounts that the body was bruised, battered, castrated, and perhaps chained to a heavy weight, and despite Annie’s insistence that there were two holes in Elbert’s chest.  The inquest found that death had been caused by “foul means by parties unknown.”  The coroner—the brother-in-law of one of the police officers who had kidnapped Elbert—ordered an immediate burial.  That same afternoon, with no family present, Elbert’s body was buried in an unmarked grave.

The death caused quite a stir, for a while, anyway, at least within the local black community.  Elbert Williams was the first NAACP member to be murdered for advocating civil rights.  The perpetrators wanted to send a message, and they succeeded.

With time, however, memories faded, or were suppressed, at least in the local white community.   People simply didn’t talk about it.  Decades passed.

Enter trial attorney Jim Emison, a white man.  After graduating from Vanderbilt and the University of Tennessee’s College of Law in the 1960s, Jim embarked upon a legal career in West Tennessee, a career marked by success, and accolades, and the respect of his peers.  He served as President of both the Tennessee Bar Association and the Tennessee Trial Lawyers Association.

Jim practiced law for years before he came across a reference to the case of Elbert Williams.  Intrigued, he asked colleagues about it.  Most of them had never heard of it.  More time passed, and Jim was busy, but he couldn’t stop thinking about the unsolved, largely forgotten murder that had happened so close to where he lived and worked.

When he retired in 2011, Jim could have devoted his life to golf, or travel, or any of the other things that successful former attorneys do.  He decided, instead, to bring his analytical and investigatory skills to bear on the Williams case.  It was then more than seventy years since Elbert had died.  Documentary evidence was scant.  Witnesses, and perhaps the perpetrators themselves, were dead.  There was no body.  Cases don’t get much colder than that.

Nonetheless, over the next seven years, Jim made considerable progress.  Others joined his cause.  On May 15, 2018, Governor Haslam signed legislation sponsored by Rep. Johnnie Turner and Sen. Mark Norris creating the Tennessee Civil Rights Crimes Information, Reconciliation, and Research Center to serve as a clearinghouse for cold civil rights cases.  On August 8, 2018, District Attorney Garry Brown re-opened the investigation into Elbert Williams’ murder.  There is no statute of limitations for murder in Tennessee.

Meanwhile, the search for Elbert Williams’ body has begun.  Vicksburg geophysicist Ryan North has used ground-penetrating radar to locate nine unmarked graves in the local cemetery where family lore says that Elbert was buried.  Careful excavation will soon begin, overseen by Dr. Amy Mundorff, a professor at the University of Tennessee’s Forensic Anthropology Center, home of the famous “body farm.”  If a male body is found matching Elbert’s large stature, its mitochondrial DNA will be compared to mitochondrial DNA of one of Elbert’s great-great-nieces.  Elbert’s body may still harbor forensic evidence, such as bullets.  Jim Emison has tracked down the sidearm carried by one of the police officers involved in the kidnapping.  Maybe, just maybe, a ballistics test will establish a match.

Last August, Jim gave a presentation at Lincoln Memorial University’s law school.  Like a spellbound jury, the audience hung on his every word.  Clearly moved, several members of Knoxville’s African-American community murmured along as Jim spoke: “Tell it!”  “Amen!”  At times, there were tears, both from the audience and from Jim himself.  A number of tissues were in evidence.

That same day, I interviewed Jim for my radio show.  If you’d like to listen to our discussion, click here.  Justice for Elbert Williams has been delayed, but perhaps it won’t ultimately be denied.

This essay originally appeared in DICTA: A Monthly Publication of the Knoxville Bar Association Vol. 45, Iss. 9, p. 26. Reprinted with permission of the author.