Perhaps the most revolutionary “self-evident truth” that Thomas Jefferson enumerated in the 1776 Declaration of Independence came in the second sentence of the second paragraph:
“Governments are instituted among men, deriving their just powers from the consent of the governed.”
The concept of the “consent of the governed” originated in the mid-17th century with John Milton, later Secretary for Foreign Tongues under Oliver Cromwell, in his 1649 book “The Tenure of Kings and Magistrates”. In it, Milton writes:
“The power of kings and magistrates is nothing else but what is only derivative, transferred and committed to them in trust from the people, to the common good of them all, in whom the power yet remains fundamentally and cannot be taken from them without a violation of their natural birthright.”
Milton’s book was a direct challenge to the “divine right of kings” and, more broadly, to the absolutism most famously articulated by English philosopher Thomas Hobbes of Malmesbury in his 1651 book “Leviathan, or The Matter, Forme and Power of a Commonwealth Ecclesiasticall and Civil”. The man who most directly challenged Hobbes, however, was English philosopher John Locke, who was himself strongly influenced by Milton. Hobbes was an absolutist: he wrote in “Leviathan” that people accede to a social contract by choosing a leader to rule over them, abdicating their rights and exchanging individual liberty for the protection of an absolute sovereign wielding absolute power. Locke disagreed, writing in his 1689 “Second Treatise of Government” that:
“The right of making laws with penalties…for the regulating and preserving of property and of employing the force of the community in the execution of such laws…all this only for the public good. Such a power can arise only by consent, and though this may be tacitly given, it must be the consent of each individual for himself. For civil power can have no right except as this is derived from the individual right of each man to protect himself and his property. The legislative and executive power used by government to protect property is nothing except the natural power of each man resigned into the hands of the community and it is justified merely because it is a better way of protecting natural rights than the self-help to which each man is naturally entitled. This is the original compact by which men incorporate into one society; it is a bare agreement to unite into one political society, which is all the compact that is or needs to be between individuals that enter into or make up a commonwealth.”
In the mid-18th century, the concept of the social contract came under attack from Scottish philosopher David Hume who, in his 1748 essay “Of the Original Contract”, treats the consent of the governed as a convenient fiction: the ideal foundation on which a government could rest, but one that had rarely, if ever, actually been given:
“The one party, by tracing up government to the deity, endeavor to render it so sacred and inviolate that it must be little less than sacrilege, however tyrannical it may become, to touch or invade it in the smallest article. The other party, by founding government altogether on the consent of the people, supposes that there is a kind of original contract, by which the subjects have tacitly reserved the power of resisting their sovereign whenever they find themselves aggrieved by that authority with which they have, for certain purposes, voluntarily entrusted him.”
Hume’s friend, Genevan philosopher Jean-Jacques Rousseau, sided with Locke. Rousseau wrote in Book I, Chapter VI (“The Social Pact”) of his 1762 book “Of the Social Contract, or Principles of Political Right” that:
“If then we set aside what is not of the essence of the social contract, we shall find that it is reducible to the following terms: Each of us puts in common his person and his whole power under the supreme direction of the general will, and in return we receive every member as an indivisible part of the whole.”
Hand in hand with the concept of rule by consent comes the right of revolution. In 1649, John Milton wrote in “The Tenure of Kings and Magistrates”, justifying the regicide of King Charles I:
“It is lawful, and hath been held so through the ages, for any who have the power to call to account a tyrant or wicked king, and after due conviction to depose and put him to death, if the ordinary magistrate have neglected or denied to do it.”
In Section 222 of Chapter XIX, “Of the Dissolution of Government”, of “An Essay Concerning the True Original Extent and End of Civil Government”, published in 1689, John Locke wrote that:
“Whenever the legislators endeavor to take away and destroy the property of the people, or to reduce them to slavery under arbitrary power, they put themselves into a state of war with the people, who are thereupon absolved from any further obedience and are left to the common refuge, which God hath provided for all men against force and violence. Whenever therefore the legislative shall transgress this fundamental rule of society and either by ambition, fear, folly or corruption endeavor to grasp themselves or put into the hands of any other absolute power over the lives, liberties and estates of the people, by this breach of trust they forfeit the power the people had put into their hands for quite contrary ends and it devolves to the people, who have a right to resume their original liberty.”
In 1776, in the Declaration of Independence, Thomas Jefferson included among his “self-evident truths” that:
“Whenever any form of government becomes destructive of these ends, it is the right of the people to alter or to abolish it, and to institute new government, laying its foundation on such principles and organizing its powers in such form, as to them shall seem most likely to effect their safety and happiness…When a long train of abuses and usurpations, pursuing invariably the same object, evinces a design to reduce them under absolute despotism, it is their right, it is their duty, to throw off such government, and to provide new guards for their future security.”
Jefferson, however, is more restrained than his predecessors in this right of revolution, moderating his statements by qualifying in the Declaration of Independence that:
“Prudence, indeed, will dictate that governments long established should not be changed for light and transient causes; and accordingly all experience hath shown that mankind are more disposed to suffer, while evils are sufferable, than to right themselves by abolishing the forms to which they are accustomed.“
A powerful influence on James Madison in authoring the Constitution of the United States was French lawyer and political philosopher Charles-Louis Montesquieu. In Book 11, chapter 6 of his 1748 treatise The Spirit of the Laws, Montesquieu wrote that “It is requisite the government be so constituted as one man need not be afraid of another.”
Locke also influenced Madison’s fellow Founding Father Thomas Jefferson. In his 1689 Second Treatise of Government Concerning the True Original Extent and End of Civil Government, Locke advocated the separation of governmental powers. In The Spirit of the Laws, Montesquieu built on Locke’s treatise by proposing a tripartite system, continuing in Chapter 6:
“When the legislative and executive powers are united in the same person, or in the same body of magistrates, there can be no liberty…There is no liberty if the judiciary power were not separated from the legislative and executive.”
In the Constitution of 1787, Madison made Montesquieu’s tripartite system manifest by devoting each of the Constitution’s first three Articles to one of the three branches of government: legislative, executive and judicial, each article establishing a separate institution to exercise the powers of its assigned branch. The order in which the Founding Fathers placed these three Articles, however, is telling of which branch of government the Founders believed took precedence.
Article One of the Constitution establishes the Legislative Branch in the form of a bicameral Congress. Montesquieu, in Chapter 6 of his Spirit of the Laws, advocates this idea, writing that:
“The legislative body being composed of two parts, they check one another by the mutual privilege of rejecting. They are both restrained by the executive power, as the executive is by the legislative.”
The prominence with which Madison and the Founding Fathers featured Montesquieu’s idea of checks and balances in the Constitution is no coincidence either. In his indictment of King George III of Great Britain in the July 1776 Declaration of Independence, Jefferson wrote:
“He has forbidden his governors to pass laws…for the accommodation of large districts of people, unless those people would relinquish the right of representation in the legislature, a right inestimable to them and formidable to tyrants only…He has dissolved representative houses repeatedly…the state remaining in the mean time exposed to all the dangers of invasion from without, and convulsions within…He has made judges dependent on his will alone.”
As much as the founding of the United States was premised on the rejection of just such abuses of power by a sovereign monarch, one incident of what Jefferson termed “convulsions within” convinced the Founders of the necessity of a strong executive branch of government: Shays’ Rebellion.
The rebellion was prompted by the inadequacies of the 1781 Articles of Confederation, many of which were rooted in the lack of a strong centralized federal government. The rebellion resulted in the deaths of six of the rebels and one government soldier, in part due to the federal government’s inability to organize and command a military under the Articles of Confederation. It was Shays’ Rebellion, more than anything, that prompted Madison to present the Virginia Plan to the 1787 Constitutional Convention, a plan that incorporated Montesquieu’s concept of a bicameral legislature while proposing a unitary executive. As a result, Article Two of Madison’s Constitution reads: “The executive power shall be vested in a President of the United States of America…he shall take care that the laws be faithfully executed.”
Echoes of Shays’ Rebellion can be seen in Article One, Section Eight of the Constitution, in which the United States Congress is granted the power:
“To provide for calling forth the militia to execute the laws of the union, suppress insurrections and repel invasions. To provide for organizing, arming and disciplining the militia, and for governing such part of them as may be employed in the service of the United States.”
Likewise, Article Two Section Two reads:
“The President shall be Commander in Chief of the Army and Navy of the United States, and of the militia of the several states, when called into the actual service of the United States.”
As is best illustrated by the character played by Latvian-born actor Elya Baskin in season 6, episode 14 of Aaron Sorkin’s NBC television drama The West Wing, it is this seeming idiosyncrasy that most baffles and confounds people from other countries: that the Commander in Chief does not declare war. It was controversial at the Constitutional Convention of 1787 as well.
The delegates had all but unanimously chosen the man who would assume the office of the Presidency before the Convention even began. Their choice was George Washington, the general who had led the American colonies’ military forces to victory over those of the British Empire in the American Revolutionary War, which had ended four years earlier in 1783. Like the 18th President of the United States, Ulysses Grant, and the 34th, Dwight Eisenhower, in the two centuries that followed, Washington had little experience in civilian government and was chosen largely on his military record. With General Washington in mind as the man who embodied their conceptualization of the American Presidency, there was a strong impulse on the part of many delegates to the Constitutional Convention to give President Washington broad powers as a strong Commander in Chief.
However, as the character of White House Director of Communications Toby Ziegler, played by actor Richard Schiff, points out to Baskin’s character in The West Wing, a Constitution is not about the current President, but about “the sixty guys who come after him”. Having only just broken away from a strong monarch who used his absolute power to oppress and wage war on his own citizens, and unable to foresee who the 43 Presidents of the United States to come after Washington would be, the Founding Fathers wisely placed the most power not in the Executive, but in the Congress, particularly when it came to making war. The result has been a representative constitutional democracy that has lasted for more than two hundred years.
The platitude most often heard about the Cold War, which I believe to be fallacious, is that it was a contest between equals. Many written and video histories of the Cold War portray the United States of America in the West and the Union of Soviet Socialist Republics in the East as twin global superpowers that emerged from the ashes of the Second World War. A closer examination of the two countries in the second half of the twentieth century, however, reveals a very different picture indeed.
Beginning in the 1950’s under President Dwight Eisenhower, the postwar United States enjoyed the longest sustained period of prosperity and economic growth in its nearly two-hundred-year history as a nation. Technological innovations, many of which sprang from technology first constructed for use in war, such as radio, television, satellite communication, electric refrigerators, toasters and microwaves revolutionized the American standard of living over the next quarter century. The advent of prefabricated homes and affordable automobiles led to the construction of the first suburbs and good-paying blue-collar jobs led to the creation of a thriving and prospering American middle class. The American way of life in the decades after the Second World War became the model for what it meant to be a developed, industrialized first-world nation.
While the government of the USSR may have used its massive Nazi-style state-owned propaganda machine to project an image to the world of Soviet Russia as being a prosperous modern first-world nation in the style of the United States, the reality on the ground for Russian citizens could not have been more different. For one thing, the Soviet Union never had a middle class. While the upper-class elite lived a very Westernized American-style way of life, the vast majority of the people under Soviet rule lived out their lives in conditions that most Americans in the twenty-first century would instantly identify as being those of an underdeveloped third-world country. This discrepancy is exemplified by the fact that, in the late 1980’s and early 1990’s, Soviet citizens’ exposure to the American way of life through television shows and movies critically undermined the control the Soviet government had over its citizenry. The Berlin Wall fell in 1989 in part due to the desire of the people of Soviet-controlled East Germany to experience the perceived prosperity of the West.
This inequality did not exist merely socially, culturally or economically, but militarily as well. Setting aside the fact that both nations had stockpiles of thermonuclear weapons, making any armed conflict between them an apocalyptic proposition, the governments of both the United States and the Soviet Union understood the simple truth that America was the dominant military power.
For this reason, among others, the conflicts between East and West were fought out via proxies, beginning with the war between the American-backed Republic of Korea in the south and the Soviet-backed Democratic People’s Republic of Korea in the north in the early 1950’s, and continuing through the conflict between the American-backed Republic of Vietnam in the south and the Soviet-backed Democratic Republic of Vietnam in the north in the late 1960’s and early 1970’s. The American and Soviet militaries never faced one another directly on the battlefield.
This made the American “war” against the Soviet Union unlike any war America had fought before in its history. No war was declared by the United States Congress against either North Korea or North Vietnam, nor did America ever officially declare war against the Soviet Union itself.
In every other war that America had ever fought, beginning with the American Revolutionary War and the War of 1812 against the United Kingdom of Great Britain and carrying on through the Second World War against Nazi Germany and the Empire of Japan, American soldiers had, at some point during the war, faced the opposing nation’s soldiers in battle.
However, even when America did face the Soviet military, in Afghanistan in the 1980’s, the United States government, in the form of the Central Intelligence Agency, did so by proxy, arming the militant Islamist fighters known as the Mujahedeen.
The clauses of Article I, Section VIII of the Constitution of the United States of America granting to the United States Congress the powers “to declare war”, “to raise and support armies”, “to provide and maintain a Navy”, “to provide for calling forth the militia” and “to provide for organizing, arming and disciplining the militia” were written not only at a time when the newly founded United States had no standing professional military, but also at a time when a war between nations necessitated a direct military confrontation between them. After the First World War, however, and especially after the Second, the United States did maintain a standing, professionally trained Army, Navy, Air Force, National Guard and Marine Corps. This rendered the powers granted to Congress in Article I, Section VIII regarding a “militia” all but obsolete. It also negated the necessity for Congress ever to exercise its other Constitutional power of “raising” an army for the United States, since one already very much existed.
Article II, Section II, Clause I of the Constitution grants the President the power to “be Commander in Chief of the Army and Navy of the United States, and of the Militia of the Several States, when called into the actual service of the United States.” With a standing, professionally trained army in place, the President’s powers in relation to the militia were almost as diminished as those of the Congress. Unlike Congress, however, the Presidency cannot be said to have lost any of its powers, or to have seen any become obsolete.
So while Congress’s Constitutional powers, such as the power to declare war, were rendered inconsequential in the Cold War between the United States and the Soviet Union, the same cannot be said of the President’s Constitutional powers as Commander in Chief. It is notable that nowhere does Article II say that the President’s Commander in Chief powers are in any way contingent on Congress having issued a declaration of war. The War Powers Resolution of 1973 made this de facto shift away from the conventional meaning of warfare official. Congress, in overriding the veto of President Richard Nixon, intended the Act to restrain the powers of a Presidency it perceived as having run amok in the final years of the Vietnam War. However, the War Powers Resolution officially authorizes the President, in his capacity as Commander in Chief, to send American forces into combat without a formal declaration of war by the Congress. As Congress has not issued such a declaration since 1942, during the Second World War, this new formalization of Presidential executive power was a perfect fit for the new form of indirect and proxy warfare in which America has found itself involved ever since.
The Cold War ended in 1991, and perhaps if the peace that resulted had lasted longer than a decade, Americans may very well have demanded that the Congress reassert its Constitutional role in American war-making policy.
However, such was not to be. Less than a decade after the dissolution of the USSR on Christmas Day 1991, the amorphous threat of what former President Ronald Reagan had called the “Evil Empire” was replaced by a new fear and a new threat: that of international religious terrorism.
On the morning of September 11, 2001, airplanes flown by members of the Al-Qaeda Islamist terrorist organization destroyed the twin towers of the World Trade Center in lower Manhattan in New York City and severely damaged the Pentagon, headquarters of the United States Department of Defense, in Arlington, Virginia, across the Potomac River from the American capital of Washington, D.C.
And just as abruptly as the Cold War had ended with the fall of the Berlin Wall a decade earlier, a new war was launched with the fall of the Twin Towers, a war against an enemy just as amorphous as “communism” had been half a century before, if not more so. George Walker Bush, then the 43rd President of the United States, in an address to Congress and to the nation shortly after September 11th, dubbed this new war a “war on terror”. While the war that President Bush launched in the fall of 2001 was not a proxy war as Korea and Vietnam had been, in that it involved an actual invasion of a foreign country by American troops, Bush’s invasion of Afghanistan was in many ways even more indirect than the Cold War had been. America was not fighting against any one nation with a military or a government, but rather against a practice, and indeed a concept: terrorism. And while the Cold War ended with the USSR’s collapse in 1991, Bush’s war on terror could, theoretically, continue indefinitely.
As Dr. Martin Luther King Jr. famously wrote in 1967, at the very height of the Vietnam War, in his book “Where Do We Go from Here: Chaos or Community?”:
“The ultimate weakness of violence is that it is a descending spiral, begetting the very thing it seeks to destroy…Returning hate for hate multiplies hate, returning violence for violence multiplies violence and toughness multiplies toughness in a descending spiral of destruction…Adding deeper darkness to night already devoid of stars…The chain reaction of evil—hate begetting hate, wars producing more wars—must be broken, or we shall be plunged into the dark abyss of annihilation.”
The same is even truer of war and terrorism. Oxford’s dictionary defines “terrorism” as “the use of violent action and intimidation in the pursuit of political aims or to force a government to act,” and “war” as “a fight or a sustained effort over a long period of time to deal with, get rid of, stop or end a particular unpleasant or undesirable situation or condition.” Under these and many other similar definitions of the two terms, a war can be, and is, classified as an act of terrorism in and of itself. So in a very real sense, terrorism will continue to exist so long as war exists. Like the proxy war against the amorphous concept of “communism” before it, the indefinite and perpetual war against the even more amorphous concept of “terror” is particularly conducive to the by-now institutionalized imbalance of Constitutional powers between the Congress and the Presidency under the War Powers Resolution.
In the era of mass communication, beginning with the first televised Presidential speech by President Franklin Roosevelt in 1939 and carrying on through the weekly messages released on YouTube by President Barack Obama, a politician’s relative power in the government has been dictated in no small part by his or her level of visibility. This has exacerbated the gulf in political power between the President and the Congress: while each house of Congress legislates as a single body, each Congressman and Senator can only ever appear on television as an individual. Every President since Roosevelt has been a television celebrity, peaking perhaps with the television coverage of the family life of President John Kennedy in the early 1960’s, a household that the reporters and journalists of the time labeled “Camelot”.
At the other end of Pennsylvania Avenue, however, the story reads quite differently. While the occasional Senator, such as Wisconsin’s Joe McCarthy and Gaylord Nelson, Massachusetts’ Edward Kennedy and John Kerry, and Arizona’s John McCain, may for better or worse achieve the status of a household name, polls have repeatedly shown not only that the vast majority of the members of the House of Representatives are unknown to the public, but that substantial majorities of the American people have difficulty even identifying their own Congressional Representative.
The indisputable celebrity of the Presidency, as contrasted with the relative anonymity of Congressmen, makes the President of the United States by far and away the most visible member of the United States Federal Government. When the President addresses the nation, television networks interrupt regularly scheduled programming to air the President’s words in their entirety, and 24-hour cable news networks devote round-the-clock discussion to them. By contrast, the vast majority of the speeches given by members of Congress on the floor of the House or Senate go untelevised, and thus unheard by most of their constituents. Since in the age of instantaneous communication a politician’s power is determined in large part by his or her visibility, this discrepancy between the relative visibility of the President and that of members of Congress further increases the inequality in power between the Executive and Legislative Branches.
In the age of instantaneous mass communication and the 24-hour news cycle, what President Theodore Roosevelt once famously referred to as the “bully pulpit” has become a megaphone with the capability to drown out any and all other voices, no matter how loud or how popular. This is particularly true when it comes to foreign policy, as was demonstrated by the Bush Administration’s artful deception and misdirection, via mass media, of the American people in the lead-up to the invasion of Iraq in March 2003.
The effect of the public not knowing who their Congressional Representatives are is that they follow the most visible, most vocal politician who appears most often on their television screens, regardless of whether anything he or she says is true. This is perhaps best described by the character of Lewis Rothschild, played by actor Michael J. Fox, in the 1995 film “The American President”, written by screenwriter Aaron Sorkin:
“People want leadership, Mister President, and in the absence of genuine leadership, they’ll listen to anyone who steps up to the microphone. They want leadership. They’re so thirsty for it they’ll crawl through the desert toward a mirage, and when they discover there’s no water, they’ll drink the sand.”
The reply from the character of fictional President of the United States Andrew Shepherd, played by actor Michael Douglas, unfortunately describes the results of the largely disengaged and apathetic electorate:
“Lewis, we’ve had Presidents who were beloved who couldn’t find a coherent sentence with two hands and a flashlight. People don’t drink the sand because they’re thirsty. They drink the sand because they don’t know the difference.”
“In 2011, then Secretary of State Hillary Clinton and counterterrorism director Audrey Tomason were in the Situation Room with the rest of the national security team watching the raid that killed Osama bin Laden. Der Tzitung, an Orthodox Jewish newspaper in Brooklyn, photoshopped both women out of the picture when they published the story. Editors wrote in response to criticism: “In accord with our religious beliefs, we do not publish photos of women, which in no way relegate them to a lower status…. Because of laws of modesty, we are not allowed to publish pictures of women, and we regret if this gives an impression of disparaging to women, which is certainly never our intention.” Does the newspaper have the right to alter the photo on First Amendment grounds? How might Der Tzitung have handled this situation better in a way that respects religious beliefs, freedom of the press, and responsible journalism?”
I believe today, and have always believed, that what any given person thinks or believes pales in significance compared to the importance of why they believe it. At the same time, I also hold that the true character of a person or group of people cannot be gauged with sufficient accuracy merely by listening to what they say, but must also include observation of their actions. As a maxim often attributed to the classical Greek philosophers, though in fact historian Will Durant’s summary of Aristotle’s ethics, puts it:
“We are what we repeatedly do. Excellence, then, is not an act, but a habit.”
Simply put, you cannot know what people believe merely by asking them and listening to what they say; you know what they believe by what they do, and the same applies to anyone you might happen to meet.
As this applies to the situation in question, I believe that the emphasis has been misplaced on the fact that the Hasidic Yiddish-language Brooklyn newspaper “Di Zeitung” and magazine “Der Voch” deleted 67th United States Secretary of State Hillary Rodham Clinton and National Security Council Director for Counterterrorism Audrey Tomason from an image, taken by White House Photography Office Director Pete Souza, of the White House Situation Room in the basement of the West Wing. Even if we take the editors at their word that their religious laws do not permit them to publish photographs of women, this still leaves starkly unanswered the question of why, exactly, the “modesty” beliefs of Orthodox Judaism so strictly prohibit the displaying of images of women.
However, unlike with individual people, the motivations behind the actions of organizations such as religious cults are usually far less of a mystery. In the case of Judaism, there is no mystery at all: the scripture from which these laws are derived is called the “Torah”, a Hebrew term often translated as “Law”. It is part of the most-printed and best-selling book in the known recorded history of human civilization: the Bible.
It should be noted, however, that if its laws were followed in their entirety to their conclusion, any society that adhered to the Jewish Bible, known in Western Europe and the Americas as the “Old Testament”, would bear much closer resemblance to the authoritarian, fundamentalist, theocratic dictatorships of the Middle East, the Arabian Peninsula, West Asia and the Indian Subcontinent than to anything seen in the developed Western world in the more than half a millennium since the end of the Medieval Dark Ages.
While there can be no question that, as an American newspaper, “Di Zeitung” has the prerogative to decide for itself what it will and will not print, it should also be noted that the very same First Amendment to the Constitution of the United States of America that guarantees it this freedom of the press also includes the Establishment Clause, a provision which explicitly and in no uncertain terms precludes religiously based laws from having any force or jurisdiction anywhere within the borders of the United States.
Again, I feel that the emphasis in the “Der Voch” story is misplaced. To me, the newspaper’s photoshopping of the Secretary of State and the Director for Counterterrorism out of the iconic Situation Room photograph is not an issue of censorship, but rather a question of the motivations of the censors. By their own admission, the editors deleted Secretary Clinton and Director Tomason from the image because their religious laws strictly instructed them that they must do so. The issue, therefore, is not whether newspapers should be permitted to selectively edit other people’s published material, such as Souza’s iconic photograph, and proceed to publish the deliberately doctored content in their own publications. The argument that such behavior, however intellectually dishonest for any organization claiming to be journalistic, is protected under the freedom of the press is, I feel, a strong one.
To me, the issue at hand is a question as old as the secular, pluralistic liberal democracy that is the United States of America under the First Amendment: whether religious cults and their adherent organizations should be permitted to take such infantile actions based solely on what they believe the laws of their particular chosen religion, which under the Establishment Clause have no jurisdiction and cannot be enforced within our nation’s borders, command them to do.
While the Free Exercise Clause, which immediately follows the Establishment Clause in the text of the First Amendment, guarantees all people the right to practice whatever religion they choose, it is my firm belief that this right, like the freedom of speech, is not and was never intended to be absolute. Throughout human history, the religious practices of civilizations around the globe have run the gamut from arbitrary dietary restrictions, such as the Old Testament’s prohibitions against the eating of owls and bats, to the ritualistic mass human sacrifice practiced on an industrial scale by the Aztecs and Mayans of what is today Mexico.
As I noted previously, anyone who adhered strictly to the laws and Commandments of the Old Testament would find themselves practicing precisely the sort of misogynistic, discriminatory policies and frequent executions by stoning, beheading and burning at the stake that we see in the Islamist theocracies of the Middle East and the Arabian Peninsula. Since that is not something we see in this or any other Western nation, it can only be concluded that those, such as the Orthodox Hasidic Jews in question here, who claim to be enforcing the laws of their religion are in fact doing nothing of the sort. By discriminating against women in ways that are relatively modest by Biblical standards, while not stoning to death divorced men who remarry for adultery or beheading clean-shaven men for failing to wear their beards and hair at the proper length, the Orthodox Jews at “Di Voch” demonstrate nothing if not that their proclaimed “enforcement” of Biblical commandments is, to put it mildly, highly selective.
The question of whether individuals have the right to practice freely whatever religion they choose was, I think, put to rest at least a couple of decades ago. The issue of whether organizations such as “Der Tzeitung” enjoy the same religious freedom, however, is one that is still being heatedly debated to this very day at the highest levels of our nation’s federal government.
Personally, I feel that no one, be it a person or a corporation, should be permitted to selectively enforce some of the discriminatory, misogynistic, racist and xenophobic Commandments of their chosen religious cult while ignoring others. This, to me, reeks of intellectual dishonesty and “do as I say, not as I do” hypocrisy.
Much of the coverage of this story has prominently featured the supposed necessity of respecting any and all religious beliefs. However, as I have stated numerous times elsewhere and will repeat here, I firmly believe that respect for anyone or anything should be neither automatic nor unconditional. Simply put, there exist beliefs that do not and never will deserve respect. A belief system, such as the Commandments of the Old Testament, that mandates its adherents to summarily execute en masse everyone who does not believe as they do, which in the case of Judeo-Christianity is seven out of every eight men, women and children on Earth, definitively qualifies as one such belief, one that never has been and never will be worthy of the respect of any sane, rational human being.
“The President and the “organized interests” sometimes bash each other, but they also need each other, or at least have to accommodate to one another. What does each side want from the other? Who do you think generally gets the upper hand, i.e., is better able to get what it wants? (If you think it’s even, say so, but say why.)”
The answer to both questions, which special interest groups lobby the Executive Branch and how much influence those groups wield within the White House, is the same: “It depends”. Namely, it depends on who the President is at the time.
Interest groups are little more than an expression of the rights granted to the American people by the Petition Clause of the First Amendment: “the right of the people peaceably to assemble, and to petition the Government for a redress of grievances”. If there is an overarching statement that can be made about what interest groups want from the Oval Office regardless of who occupies it, it is that they want what everyone who petitions the White House wants: the President’s help.
However, interest groups such as the Family Research Council or Focus on the Family are unlikely to spend any great deal of time or resources lobbying a President who is a progressive liberal secularist. Likewise, interest groups such as the American Civil Liberties Union and the NAACP are not inclined to lobby a President who is a conservative Southern racist Neo-Confederate anarcho-capitalist.
Who the President is also determines how much influence such interest groups wield in a given Administration.
In appointing James Watt as Secretary of the Interior, Republican President Ronald Reagan gave mining interests, and more specifically the coal mining industry, unprecedented levels of influence over the land use policies of the United States Federal Government.
Nearly a hundred miners lost their lives and more than twenty thousand were injured each year through the 1990s. Twelve miners died in a mine in Sago, West Virginia in January 2006; more than seventy would lose their lives that year alone. Nearly thirty more were killed in a mine in Montcoal, West Virginia on April 5, 2010.
Twenty years later, former oil and gas company executive George Walker Bush brought with him to the White House two fellow veterans of the fossil fuel and petrochemical industry: his Vice President, Richard Cheney, was the former CEO of Halliburton, a multinational Fortune 500 oilfield services corporation, and his National Security Advisor and later Secretary of State, Condoleezza Rice, had been a member of the Board of Directors of the multinational oil corporation Chevron. With Watt’s former employee Gale Norton as Secretary of the Interior, the fossil fuel and petrochemical industry was, for all practical intents and purposes, permitted to write the energy policy of the Bush-Cheney White House.
Although Vice President Cheney’s close association with the energy company Enron proved troublesome for the Administration in late 2001, what resulted was a nearly decade-long period of systematic deregulation of the oil, coal and natural gas industries.
This deregulation, and the accompanying defunding of the Department of the Interior’s Minerals Management Service, the federal agency then responsible for overseeing the safety of offshore drilling, proved tragic when the aptly named “Deepwater Horizon”, an oil rig on which Halliburton served as the cementing contractor, exploded in the Gulf of Mexico forty miles southeast of the Louisiana coast on April 20, 2010, killing eleven workers and injuring sixteen more. What followed was the largest accidental marine oil spill in history and among the largest environmental disasters in United States history.
This is not to say that such special interest groups wield the same amount of influence regardless of the occupant of the Oval Office.
The fossil fuel industry’s influence over the Bush-Cheney Administration is beyond dispute. By contrast, despite continued accusations of such cronyism from Republicans in Congress, no conclusive evidence has yet been found of comparable influence over the Presidential Administration of Barack Obama by the solar, wind and hydroelectric renewable energy industries or their constituent corporations. Whether other interest groups, such as the National Association for the Advancement of Colored People [NAACP], have wielded such influence in the Obama White House remains open to interpretation. As was the case with the previous Democratic President, William Clinton, it is far easier to state with certainty which special interest groups are not represented by the policies of the Obama Administration, such as the National Rifle Association, than it is to divine which groups, such as the NAACP, might be exercising influence inside President Obama’s White House.
This, more than anything, is what gives the President of the United States the upper hand in his dealings with interest groups: he controls the doors to the West Wing. Customarily, the easiest way to tell which groups were and were not lobbying the Office of the President has been simply to check the White House visitors’ logs, which are a matter of public record, to see who was and was not invited into the Oval Office. It should be noted, however, that President Obama upset this trend considerably when he formally invited NRA Executive Vice President Wayne LaPierre to the West Wing for a face-to-face meeting. In this case it was the interest group, and not the President, who decided whether the two would meet: the NRA unceremoniously, though not entirely unexpectedly, declined the President’s invitation.
One thing that the coal mining industry’s steering of Reagan-era environmental policy and the oil industry’s steering of Bush-era energy policy had in common was that both welcomed the input of groups such as the United States Chamber of Commerce, the largest conservative, pro-Republican business lobbying group in America, while neither included the voices or the interests of groups such as the American Federation of Labor and Congress of Industrial Organizations [AFL-CIO], the nation’s largest federation of trade unions. Given the great strides that such unions made in the twentieth century for occupational health and safety, it is difficult not to wonder how many dozens of lives might have been saved in the past quarter century had that dichotomy been reversed: the unions included and the business lobby excluded.
How far back the trend goes is open to interpretation: perhaps only as far as the reorganization of the Executive Branch that followed the Watergate scandal in 1974, or perhaps all the way back to the creation of the Executive Office of the President of the United States by Franklin Roosevelt in 1939. However far it extends, the trend within the Executive Branch has been and continues to be twofold. First, since the creation of the White House staff under President Roosevelt, the number of duties and responsibilities that the President himself must perform personally has decreased substantially. Second, the amount of power wielded by the Executive Branch, by the White House and by the President himself has expanded enormously.
It is doubtful, however, whether this trend is entirely the result of the expansion of the White House staff and the Office of the President; the streamlining effect of modern technology has certainly played its part, allowing the President’s office to exert far more influence, far more broadly, in Washington and in the world with significantly less effort.
The advent of television may also have hastened the trend toward the President carrying a lighter burden of duties. Since television cameras were first permitted inside the White House during the administration of President John Kennedy in the early 1960s, nothing has earned Presidents higher approval ratings among the American people than images of them spending time with their families.
It began with Kennedy, his wife Jacqueline and their children Caroline and John Jr., along with his brothers Robert and Ted, a family dynasty that came to be known colloquially as “Camelot”.
Since the Kennedys, each successive President has placed progressively greater emphasis on having cameras capture the time spent with family. In many ways this peaked with the Presidency of William Clinton, beginning in 1993, whose wife Hillary became almost as much of a social and political celebrity as her husband, and was therefore able to continue her own political career long after his had ended, even launching two bids of her own for the Presidency, in 2008 and 2016.
The men and women who served Presidents in the days before television are, for the most part, anonymous and unknown. As Presidents have placed greater emphasis on being filmed with their families, however, their Senior Staffs have assumed much more of a governing role within the Executive Branch. This, in turn, has led to the rise of the celebrity Presidential staffer.
The first of these, perhaps, was Henry Kissinger, who served as National Security Advisor to President Richard Nixon beginning in 1969 and then to Gerald Ford after Nixon resigned in 1974. Others included President Ford’s second White House Chief of Staff, appointed in 1974, Donald Rumsfeld, and his third, appointed in 1975, Richard Cheney. Also falling into this category would be Alexander Haig, Deputy National Security Advisor to President Nixon in 1970 and Ford’s first Chief of Staff in 1974. The tradition continued with the next Republican Administration, that of Ronald Reagan, whose celebrity staffers included, most prominently, his National Security Advisor from 1987, Colin Powell, as well as his first White House Chief of Staff, James Baker, appointed in 1981. Reagan’s successor, George Herbert Walker Bush, had fewer celebrity staffers, among them his Deputy National Security Advisor from 1989, Robert Gates.
Following Bush’s single-term Presidency, the Administration of President Clinton made a point of elevating many of its staffers to celebrity status. These included Clinton’s Senior Advisor from 1993, Rahm Emanuel; his White House Chief of Staff from 1994, Leon Panetta; and his Deputy Chief of Staff from 1997, Sylvia Mathews Burwell.
Whether a President who has progressively less to do while wielding exponentially more power is a positive or a negative development for America’s democracy and its government will ultimately be for history to decide. There is, however, little indication of this trend slowing, much less coming to an end, at any point in the foreseeable future.