Saturday, 19 January 2019

Ramblings

Allan C. Brownfeld

Allan C. Brownfeld is the author of five books, the latest of which is The Revolution Lobby (Council for Inter-American Security). He has been a staff aide to a U.S. vice president, members of Congress, and the U.S. Senate Internal Security Subcommittee. He is associate editor of The Lincoln Review, and a contributing editor to Human Events, The St. Croix Review, and The Washington Report on Middle East Affairs.

Thanksgiving: A Time for Americans to Come Together

Thanksgiving 2018 is coming along just when we need it. The divisions in our diverse society have been growing, in large part because of intemperate political rhetoric which casts those with whom we disagree on matters of public policy as “enemies,” and because of the growth of “identity politics,” in which we are asked to identify ourselves by race, gender, ethnicity, sexual orientation, and religion — not to view ourselves as individual citizens of a free and democratic society.

It is time to take a moment and recall the uniqueness of the American society. From its very earliest days, ours has been a country made up of men and women of every conceivable background. In colonial America, Thomas Paine noted that:

“If there is a country in the world where concord, according to common calculation, would be least expected, it is America. Made up, as it is, of people from different nations, accustomed to different forms and habits of government, speaking different languages, and more different in their modes of worship, it would appear that the union of such a people was impracticable. But by the simple operation of constructing government on the principles of society and the rights of man, every difficulty retires and the parts are brought into cordial unison.”

Oliver Wendell Holmes pointed out that “We are the Romans of the modern world — the great assimilating people.” America, F. Scott Fitzgerald pointed out, was not simply another country:

“France was a land. England was a people, but America, having about it still the quality of the idea, was harder to utter — it was the graves at Shiloh, and the tired, drawn, nervous faces of its great men, and the country boys dying in the Argonne for a phrase that was empty before their bodies withered. It was a willingness of the heart.”

In recent days, some have said that diversity is an American “weakness,” not a strength. Any who hold this view simply do not understand our history. Diversity is not a novel 21st century notion. It is the reality of our society from its earliest days — long before we became an independent nation. Visiting New Amsterdam in 1643, French Jesuit missionary Isaac Jogues was surprised to discover that eighteen languages were being spoken in this town of 8,000 people. J. Hector St. John Crevecoeur wrote in 1782 in his Letters From an American Farmer, that, “Here individuals of all nations are melted into a new race of men, whose labors and posterity will one day cause great changes in the world.”

There was never a time when the American society was not diverse. By the time of the first census in 1790, people of English origin were already a minority. Enslaved Africans and their American-born descendants made up 20 percent of the population. There were large clusters of Scotch-Irish, German, Dutch, and Scottish settlers, and smaller numbers of Swedes, Finns, Huguenots, and Sephardic Jews.

America has been a nation much loved. Germans have loved Germany, Frenchmen have loved France, Swedes have loved Sweden. This, of course, is only natural. But America has been loved not only by native-born Americans, but by men and women throughout the world who have yearned for freedom. In the 1840s, Herman Melville wrote that, “We are the heirs of all time and with all nations we divide our inheritance. If you kill an American, you shed the blood of the whole world.” America dreamed a bigger dream than any nation in history. The dream remains very much alive, despite the efforts of those who would diminish it. It will survive even the tortured partisanship of the present time.

At a time when intolerance is widely expressed — especially on social media that enables disgruntled and disturbed individuals to connect with one another — we see growing manifestations of hatred and violence. The murder of eleven worshipers at the Tree of Life Synagogue in Pittsburgh is a recent example. The alleged killer is a white nationalist, a neo-Nazi who expressed particular anger at the Hebrew Immigrant Aid Society (HIAS), which helps resettle refugees from around the world. HIAS started its work in the 1880s. It says it originally helped refugees because they were Jewish. Now it helps refugees — from Iraq, Syria, Bangladesh, and elsewhere — “because we are Jewish.” If the Pittsburgh shooter, who denounced what he called an “immigrant invasion,” thinks he was upholding some sort of American tradition, he could not have been more wrong. After all, his own ancestors were immigrants — as were the ancestors of all of us — other than the descendants of those who greeted them.

The American tradition we celebrate on Thanksgiving Day is the one set forth by George Washington in his now-famous letter to the Jewish congregation of Newport, Rhode Island, in 1790:

“The Citizens of the United States of America have a right to applaud themselves for having given to mankind examples of an enlarged and liberal policy: a policy worthy of imitation. All possess alike liberty of conscience and immunities of citizenship. It is now no more that toleration is spoken of as if it were by the indulgence of one class of people that another enjoyed the exercise of their inherent natural rights. For happily the Government of the United States, which gives to bigotry no sanction, to persecution no assistance, requires only that they who live under its protection should demean themselves as good citizens giving it on all occasions their effectual support.”

From the beginning, America has represented hope for a better future to people throughout the world. In a letter to Ralph Waldo Emerson in 1849, Thomas Carlyle wrote:

“How beautiful to think of lean tough Yankee settlers, tough as gutta-percha, with most occult unsubduable fire in their belly, steering over the Western Mountains to annihilate the jungle, and bring bacon and corn out of it for the Posterity of Adam. There is no Myth of Athene or Herakles to equal this fact.”

In 1866, Lord Acton, the British Liberal Party leader, said that America was becoming the “distant magnet.” Apart from the “millions who have crossed the ocean, who should reckon the millions whose hearts and hopes are in the United States, to whom the rising sun is in the West?”

We are a young country, but we are also an old one. Our Constitution is the oldest written national constitution still in force, and we have continuously maintained the freedoms to which we first paid homage. There has been no period in which freedom of religion, or of the press, or of assembly was eliminated. We have weathered wars and depressions. We will also weather the difficulties in which we are now embroiled. How ironic that, in a period of peace and prosperity, our political life has deteriorated to its present state. Democracy cannot thrive if men and women who disagree about public policy — health care, criminal justice, immigration, the environment, regulation of firearms, tax policy — are unwilling to work together and insist upon labeling those with whom they disagree “enemies of the people,” or worse. What happened to our traditional view of the “loyal opposition”?

We will move forward only if we recognize the fragility of a free and democratic society. It can be broken if its genuine uniqueness is not recognized and cherished. Thanksgiving is a time to recall our history and remember our values — and not give assent to those, on both the right and left, who seek only to condemn and divide. 

Remembering George H. W. Bush

It was with sadness that I learned of the death of George H. W. Bush.

I first met him in the early 1960s, when I was in law school. I spent a summer in Houston working as a reporter for the Houston Press — and lived in the home of my friends Marjorie and Raymond Arsht, who were close friends of the Bushes.

One night, George and Barbara Bush came to dinner. He was then chairman of the Harris County Republican Party. One of his goals was to convince black voters to join the Republican Party. At that time, many Texas Democrats were still sympathetic to segregation. He encouraged Marjorie and Ray to have a reception for black leaders at their home. This took place after I returned to law school.

Later, after Bush was elected to Congress, I worked with him when I served as assistant to the research director of the House Republican Conference. We had two future presidents on our committee — Bush and Gerald Ford. We met weekly. I don’t remember hearing any abusive — or mocking — rhetoric about the Democrats. They were not viewed as “enemies.” Our goal was to convince as many Democrats as possible of the merits of the public policy proposals we were developing.

George H. W. Bush wanted the Republican Party to genuinely be the party of Lincoln. He wanted it to welcome Americans of all backgrounds. In his view, the role of a leader was to unite the country — not divide it. Sadly, the generosity of spirit he brought to our political life is lacking at the present time. Hopefully, we will return to it and abandon the divisive and narrow partisanship that is now corrupting our public life.

Making a Place for Christmas in a Chaotic World

As we enter the Christmas season, it seems that most of society’s concerns and obsessions are quite the opposite of what is, in fact, being celebrated. We live, more and more, in a materialist era in which the Christmas season begins with “Black Friday” and “Cyber Monday.” Newspaper headlines tell us how much money was spent each day — the more the better. In our political life, we are told that “nationalism” and “America First” are values we should embrace. But the Christmas message is something quite different.

I remember, after the murder of Martin Luther King, attending a memorial service at Washington’s National Cathedral. The hymn chosen declared, “In Christ, there is no East or West.” Its words express a universal religious message, which many seem to ignore:

“In Christ there is no East or West.

In Him no North or South.

But one great Fellowship

Throughout the whole wide earth.”

The idea of viewing all men and women as children of God, of respecting the stranger as oneself, was part of the Jewish tradition Jesus learned from his earliest days. Ironically, we have political spokesmen who at the very same time stir suspicion of those who are different — either by race, religion, or ethnicity — and proclaim they are Christians. What would Jesus say?

The view of man and the world set forth by Jesus and the view that dominates the modern world are contradictory. Christmas should be a time of contemplation of the meaning of life — and of our own lives — and of seeking our answer to the question of what God expects of us.

In his book Jesus Rediscovered, Malcolm Muggeridge, the distinguished British author and editor, who had a religious conversion while preparing a BBC documentary about the life of Jesus, pointed out that a desire for power and riches is the opposite of what Jesus called for. Indeed, Jesus was tempted by the Devil with the very powers many of us so eagerly seek:

“Finally, the Devil showed Christ all the kingdoms of the world in a moment in time and said, ‘All this power I give thee, and the glory of them: for that is delivered unto me; and to whomsoever I will give it.’ All Christ had to do in return was worship the donor instead of God — which, of course, he could not do. How interesting though that power should be at the devil’s disposal, and only available through an understanding with him! Many have thought otherwise, and sought power in the belief that by its exercise they could lead men in brotherhood, and happiness, and peace — invariably with disastrous consequences. Always, in the end, the bargain with the Devil has to be fulfilled — as any Stalin, Napoleon or Cromwell must testify. ‘I am the light of the world,’ Christ said. ‘Power belongs to darkness.’”

Muggeridge, who died in 1990, lamented the direction in which he saw the Western world moving:

“I firmly believe that our civilization began with the Christian religion, and has been sustained and fortified by the values of the Christian religion, by which the greatest of them have tried to live. The Christian religion and these values no longer prevail. They no longer mean anything to ordinary people. Some suppose you can have a Christian civilization without Christian values. I disbelieve this. I think that the basis of order is a moral order; if there is no moral order there will be no political or social order, and we see this happening. This is how civilizations end.”

And yet, despite all of this, there is a spiritual yearning in our American society, a feeling that things are not what they should be, and a desire to set ourselves and our country back on a better path. Christmas speaks to the spiritual vacuum in our lives — but only if we will listen to the message.

G. K. Chesterton, discussing the meaning of Christmas, wrote:

“. . . there is a quite peculiar and individual character about the hold of this story on human nature; it is not, in its psychological substance, at all like a mere legend or the life of a great man. It does not in the ordinary sense turn our minds to greatness; to those extensions and exaggerations of humanity which are turned into gods and heroes, even by the healthiest form of hero worship. It does not exactly work outwards, adventurously, to the wonders to be found at the ends of the earth. It is rather something that surprises from behind, from the hidden and personal part of our being; like that which can sometimes take us off our guard in the pathos of small objects or the blind pieties of the poor. It is rather as if a man had found an inner room in the very heart of his own house, which he had never expected; and seen a light from within. It is as if he found something at the back of his own heart that betrayed him into good.”

A key question for Chesterton was, “How can we contrive to be at once astonished at the world and yet at home in it?” His sense that the world was a moral battleground, wrote his biographer Alzina Stone Dale, “helped Chesterton fight to keep the attitude that has been labeled ‘facile optimism,’ something that he could never recover, the wonder and surprise at ordinary life he had once felt as a child.”

The divisions in our society are unseemly and unnecessary — and the opposite of the Christmas message. Dividing people on the basis of race or ethnicity ignores the reality that all men and women are created in the image of God. To view people as “enemies” because they disagree about how best to deliver health care, or what the tax rate should be, or what our immigration policy should embrace, is to misunderstand the nature of democratic government. Men and women will naturally disagree about matters of public policy. That is why compromise in a democratic society is necessary. Genuine leaders strive to unite the American people, not divide them. We used to think that we could disagree without being disagreeable. Why is that no longer true for so many? Jesus urged his followers to love their enemies. Yet many who call themselves Christians cannot even love those with whom they disagree about one policy proposal or another.

This holiday season we would do well to reevaluate the real gods in our lives and in the life of our country. Our health and that of America may depend upon such a genuine celebration of Christmas.

As Political Passions Rise, Knowledge of American History and Government Declines

One of the ironies of our society at the present time is that, as political passions rise, the knowledge of American history, and how our system of constitutional government is meant to work, is in sharp decline.

The evidence of this decline is all around us. Recently, the Woodrow Wilson National Fellowship Foundation conducted a multiple-choice poll using questions drawn from the test administered by U.S. Citizenship and Immigration Services and found a shocking lack of knowledge.

Only 13 percent could identify 1787 as the year the Constitution was written. The foundation said passing the citizenship test requires a score of at least 60 percent. But just 36 percent of the citizens they surveyed achieved that score. The poll found older Americans did better, with 74 percent of seniors answering enough questions correctly to have passed. Fewer than one in five Americans under 45 cleared the threshold.

The Woodrow Wilson Foundation’s president, Arthur Levine, said:

“With voters heading to the polls . . . an informed and engaged citizenry is essential. Unfortunately, this study found the average American to be woefully informed regarding America’s history and incapable of passing the U.S. Citizenship Test. It would be an error to view these findings as merely an embarrassment. Knowledge of the history of our country is fundamental to maintaining a democratic society which is imperiled today.”

The evidence of this sad reality has been building for some time. Several years ago, a student group at Texas Tech University went around campus and asked three questions: “Who won the Civil War?”; “Who is our Vice President?”; “Who did we gain our independence from?” Students’ answers ranged from “The South” for the first question to “I have no idea” for all three. However, when asked which TV show Snooki starred in (“Jersey Shore”) or about Brad Pitt’s marriage history, they answered correctly.

A study by the Intercollegiate Studies Institute surveyed more than 2,500 Americans and found that only half of adults could name the three branches of government. Studies have shown that 60 percent of college graduates don’t know any of the steps necessary to ratify a constitutional amendment, and 50 percent don’t know how long the terms of representatives and senators are. Forty percent don’t know that Congress has the power to declare war, 43 percent don’t know that the First Amendment gives them the right to freedom of speech, and a third can’t identify a single right it guarantees.

A 2016 American Council of Trustees and Alumni report showed that, even though all 12th grade students had taken a course in civics, less than a quarter of them passed a basic examination at “proficient” or above. In a survey of over one thousand liberal arts colleges, only 18 percent included a course in U.S. history or government as part of their graduation requirements.

Diane Ravitch, an education historian, was invited by the National Assessment of Educational Progress’s governing board to review the results of a history and civics test in which 20 percent of fourth graders, 17 percent of eighth graders, and 12 percent of high school seniors demonstrated proficiency. She was particularly disturbed by the fact that only 2 percent of 12th graders correctly answered a question concerning Brown v. Board of Education, which she called “very likely the most important decision of the U.S. Supreme Court in the past seven decades.”

Students were given an excerpt, including the following passage: “We conclude that in the field of public education the doctrine of ‘separate but equal’ has no place. Separate educational facilities are inherently unequal.” Students were then asked what social problem the 1954 ruling was supposed to correct. “The answer was right in front of them,” said Ravitch. “This is alarming.”

The evidence of our failure to teach our history is abundant. Fewer than half of eighth graders knew the purpose of the Bill of Rights on a recent national civics examination and only one in ten demonstrated acceptable knowledge of the checks and balances among the legislative, executive, and judicial branches.

“These results confirm that we have a crisis on our hands when it comes to civics education,” said Sandra Day O’Connor, the former Supreme Court justice who founded icivics.org, a nonprofit group that teaches students civics through web-based games and other tools. Justice O’Connor says that:

“We face difficult challenges at home and abroad. Meanwhile, divisive rhetoric and a culture of sound bites threaten to drown our national dialogue. We cannot afford to continue to neglect the preparation of future generations for active and informed citizenship.”

Historian David McCullough laments that:

“We’re raising young people who are, by and large, historically illiterate. I know how much these young people, even at the most esteemed institutions of higher learning, don’t know. It’s shocking.”

McCullough tells of a young woman who came up to him after a lecture at a respected university and said: “Until I heard your talk this morning, I never realized the original thirteen colonies were all on the East Coast.”

Historian Paul Johnson points out that:

“The study of history is a powerful antidote to contemporary arrogance. It is humbling to discover how many of our glib assumptions, which seem to us novel and plausible, have been tested before, not once but many times and in innumerable guises; and discovered to be, at great human cost, totally false.”

The history of the world indicates that freedom is not natural to man, but must be carefully cultivated and taught. Through most of recorded history, man’s natural state has been to live under one form of tyranny or another. Freedom must be learned and carefully transmitted from one generation to another if it is to endure. As Cicero (106-43 B.C.) understood:

“To remain ignorant of things that happened before you were born is to remain a child. What is human life worth unless it is incorporated into the lives of one’s ancestors and set in a historical context?”

The men who framed the U.S. Constitution were careful students of history, particularly of the fate of the early democracies of the ancient world — Athens and the Roman Republic. They sought to learn lessons from the demise of those early democracies. As a result, they crafted a government of limited power, and divided that power among three separate branches, hoping that freedom would be preserved in this way.

But free societies are very fragile. Our overheated political rhetoric at the present time, with each party portraying its adversary as a virtual enemy of freedom itself, threatens the very civility and honest competition that a properly functioning democracy requires. The less we know of history — and we seem to know less each year — the further we move away from what the Founding Fathers understood were the necessary prerequisites for freedom. As Thomas Jefferson wrote in 1816: “If a nation expects to be ignorant and free, in a state of civilization, it expects what never was and never will be.”

Do Those Who Promote “Socialism” Have Any Idea of What It Means?

Suddenly, we are hearing a great deal about “socialism.” A Gallup Poll in August found that 57 percent of Democrats said they view socialism positively. Other polls show the popularity of socialism among millennials. And Alexandria Ocasio-Cortez achieved celebrity promoting socialism after she defeated the fourth-ranked Democratic House leader, Joseph Crowley, in a New York primary. Sen. Bernie Sanders (I-VT) has long called himself a “democratic socialist,” and gained widespread support in his 2016 challenge to Hillary Clinton.

What exactly do its proponents mean when they use the term “socialism”? In his article “Socialism Is So HOT Right Now” (Commentary, October 2018), Jonah Goldberg, a scholar at the American Enterprise Institute, notes that:

“. . . socialism has never been a particularly stable or coherent program. . . . It has always been best defined as whatever socialists want it to be at any given moment. That is because its chief utility is as a romantic indictment of the capitalist status quo. As many of the defenders of the new socialist craze admit, socialism is the off-the-shelf alternative to capitalism, which has been in bad odor since at least the financial crash of 2008.”

“For millennials,” writes the Huffington Post’s Zach Carter, “capitalism means ‘unacceptable people ripping off the world’ while ‘socialism’ simply means ‘not that.’”

There was a time when socialism was widely understood to involve government owning the means of production, deciding exactly what was to be distributed, and who would get it. If contemporary advocates of socialism believe that the economies of Norway, Sweden and Denmark represent their ideal, they must be reminded that these Scandinavian countries are capitalist countries, with thriving, privately owned industries. They simply have decided to have higher taxes than we do, and to provide additional social services. They are not socialist.

We did have our own colonial experience with a genuine variety of socialism. From the earliest days, the American colonists learned the important lesson that the entire idea of the “common ownership” of property was both impractical and inequitable.

Discussing the experience of the Plymouth Colony, Professor Gottfried Dietze, in his book, In Defense of Property, writes that:

“Irrespective of what each of the colonists produced, everything went into a common warehouse and the government doled out the proceeds of the warehouse as need seemed to require. However, this system soon proved to be unsatisfactory. The warehouse was constantly running out of provisions and many of the colonists were starving. In view of this emergency, Governor Bradford and the remaining members of the colony agreed during the third winter to give up the common ownership and permit each colonist to keep the products of his work. This gave incentive to all.”

When Spring came, reported Governor Bradford:

“. . . the women now wente willingly into ye field and tooke their little-ons with them to set corne, which before would alledge weakness, and inabilitie; whom to have compelled would have bene thought great tiranie and oppression.”

The result of these efforts was a happy one.

Professor Dietze, reviewing the history of the entire American colonial period, as well as the thinking of the framers of the Constitution, concludes that, “. . . the American Revolution became, to a great extent, a movement for the protection of property.”

Those who today advocate an “equal” distribution of property claim that in doing so, they are simply applying the philosophy of the Founding Fathers to matters of economic concern. Nothing could be further from the truth.

In The Federalist Papers, James Madison clearly deals with this question. He wrote:

“The diversity in the faculties of men, from which the rights of property originate, is not less an insuperable obstacle to a uniformity of interest. The protection of these faculties is the first object of government. From the protection of different and unequal faculties of acquiring property, the possession of different degrees and kinds of property immediately results.”

It is difficult to understand how political activists who express suspicion of government and the ruling elites they believe to be in charge would think that socialism — which would give government power over our entire economy — would, somehow, be an improvement. What they misunderstand is the fact that economic freedom is the form of organizing an economy most consistent with other freedoms — of religion, speech and press, among others.

This point was made by Professor Milton Friedman:

“The kind of economic organization that provides economic freedom directly, namely competitive capitalism, also promotes political freedom because it separates economic power from political power and in this way enables the one to offset the other.”

Unfortunately, we do not now have a system of genuine competitive free market capitalism. We have what some have called “crony capitalism,” with government subsidizing favored sectors of the economy, using taxpayer dollars to bail out sectors that have gone bankrupt, and interfering in the economy in myriad ways — most recently by imposing tariffs on products from a number of countries, leading to an unnecessary trade war. Democrats and Republicans are co-conspirators in this enterprise.

Jonah Goldberg explains how this works:

“The major difference between the left and the right when it comes to any movement dedicated to overthrowing the free market order . . . is which groups will be the winners and which groups will be the losers. A left-wing system might empower labor leaders, government bureaucrats, progressive intellectuals, universities, certain minority groups, and one set of industries. A right-wing system might reward a different set of industries, as well as traditional religious groups and their leaders, an ethnic majority, aristocrats, or perhaps rural interests. But both systems would be reactionary in the sense that they rejected the legacy of the Lockean revolution, preferring . . . a natural state where the ‘stakeholders’ colluded to determine what was best for their interests.”

Where today’s conservatives stand is, Goldberg argues, increasingly confused:

“Today, in America, we associate defense of the market with the political right, although the new nationalist fervor aroused by Donald Trump and his defenders may overturn that somewhat. Already, the president’s economic rhetoric — and considerable swaths of his policies — is more reminiscent of natural state economics. Just as Obama picked economic winners and losers to the benefit of his coalition, Trump rewards industries that are crucial to his. One can argue that favoring wind and solar power is better policy than favoring steel or coal, but it’s still an argument for favoritism.”

Socialism, real socialism as envisioned by Karl Marx and his adherents, has always led to economic inefficiency and scarcity, and has eliminated political and religious freedom as well. The state controls everything and citizens become mere pawns of those in power.

It seems that those in our American political arena who casually embrace “socialism” know little of this history. They would do well to undertake a study of what socialism really involves and where it has led. If they did, they might be surprised to learn that they are promoting an ideology far worse, with far greater inequality, than whatever problems they seek to address in our own imperfect, but far preferable, society.

If, as has been said, “ignorance is bliss,” then today’s advocates of socialism are having a moment of euphoria, to be followed, as night follows day, by a harsher reality.

The Green Book — The Travails of Traveling While Black During the Years of Segregation

For those of us who are old enough, and lived in the South, the years of segregation remain an indelible memory. I remember a time, not that long ago, when restaurants, restrooms, trains, buses, and almost every aspect of life was segregated. When I taught a course in international law at the Pentagon, I asked one of my students why there were so many restrooms along the hallways. I was told that the Pentagon, located in Virginia, was built during the years of segregation and that on the halls there were four sets of restrooms, for white men, black men, white women and black women.

The recent movie “Green Book” shows the travails endured by black travelers in those days. It tells the true story of Dr. Don Shirley, a world-class African-American pianist who is about to embark on a concert tour in the South in 1962. In need of a driver and protection, he recruits Tony Vallelonga, a tough-talking bouncer from an Italian-American neighborhood in the Bronx. Despite their differences, the two men soon develop an unexpected bond while confronting racism and danger. Tony is given a copy of The Green Book by the record studio, a guide for black travelers to find safe havens throughout the South. It guides them to the few establishments that were then safe for African Americans.

The Negro Motorist Green Book was originated and published by New York City mailman Victor Hugo Green from 1936 to 1966 as a guide to places and services relatively friendly to blacks. Many black Americans took to driving to avoid segregation on public transportation. The black journalist George Schuyler wrote in 1930 that, “All Negroes who can do so purchase an automobile as soon as possible in order to be free of discomfort, discrimination, segregation, and insult.”

Victor Green compiled resources “to give the Negro traveler information that will keep him from running into discrimination and to make his trip more enjoyable.” In 1917, the black scholar W. E. B. Du Bois observed that “the impact of ever-recurring race discrimination” had made it so difficult to travel to any number of destinations, from popular resorts to major cities, that it was now a “puzzling query as to what to do with vacations.”

It was not only in the South that black travelers were not welcome. In Cincinnati, the African American editor Wendell Dabney wrote of the situation in the 1920s that, “Hotels, restaurants, eating, and drinking places almost universally are closed to all people in whom the least tincture of colored blood can be detected.” Not one hotel or other accommodation was open to blacks in Salt Lake City in the 1920s. Only 6 percent of the more than 100 motels on Route 66 in Albuquerque, New Mexico, admitted black customers. Across the whole state of New Hampshire, only three motels served African Americans in 1956.

In 1943, George Schuyler wrote: “Many colored families have motored all across the United States without being able to secure overnight accommodations at a single tourist camp or hotel.” He suggested that they would find it easier to travel abroad than in their own country.

In Chicago in 1945, St. Clair Drake and Horace R. Cayton reported that “the city’s hotel managers by general agreement do not sanction the use of hotel facilities by Negroes, particularly sleeping accommodations.”

Lester Granger of the National Urban League reported that black travelers had to carry buckets or portable toilets because they usually were barred from bathrooms and rest areas in service stations. African American travelers often packed meals and carried cans of gasoline because many service stations did not welcome them as customers.

Civil rights leader Julian Bond recalled that his parents used The Green Book. He noted that:

“It told you not where the best places were to eat but where there was anyplace. You needed The Green Book to tell you where you could go without having doors slammed in your face.”

Victor Green looked forward to a time when such guidebooks would no longer be necessary:

“There will be a day sometime in the near future when this guide will not have to be published. That is when we as a race will have equal opportunities and privileges in the United States. It will be a great day for us to suspend this publication. For then we can go as we please without embarrassment.”

The 1966 edition was the last to be published, after the Civil Rights Act of 1964 outlawed racial discrimination in public accommodations. We have come a long way since then. When I was in college, in the years of segregation, if anyone had suggested that we would live to see a black Supreme Court justice, secretary of state, and president, that person would have been considered mad. Yet it has happened. Our society has shown a great capacity to change — for the better.

But the story is not over. Even today, we have politicians who seek to divide us on the basis of race. Too often, innocent people have been killed by the police, largely because of race. In December, a black man was escorted from the lobby of a hotel in Portland, Oregon, because he was innocently speaking on his telephone in the lobby — even though he was a guest at the hotel. The news, unfortunately, has too many such stories.

Reviewing the history of The Green Book is instructive. We have come a long way, but our journey is hardly finished. And, sadly, dividing people on the basis of race, religion, and ethnicity is hardly a uniquely American problem. Nationalism — often a euphemism for tribalism of one kind or another — is growing throughout the world. Hopefully, we — and people of good will everywhere — will learn some lessons from the story of The Green Book.

Friday, 09 November 2018

Ramblings

Socialism and American Politics: The Strange Involvement of Both Parties

Suddenly, the term “socialism” is on the lips of more and more men and women engaged in our political life. In August, Sen. Bernie Sanders said that socialism has gone “mainstream” and urged Democrats to embrace the term. He is an avowed Democratic Socialist, as is Alexandria Ocasio-Cortez, who unseated 20-year incumbent Rep. Joe Crowley in a New York primary. Another self-proclaimed socialist, Rashida Tlaib, is the Democratic candidate for Congress in Detroit. Both Ocasio-Cortez and Tlaib are almost certain winners.

The Economist notes that, “Socialism is having a moment in America unlike any since perhaps 1912, when Eugene Debs, the socialist candidate, won 6 percent of the national vote. A recent Gallup Poll showed that 57 percent of Democrats have a positive view of socialism.”

The poll, however, never defined “socialism,” so exactly what people were expressing support for was not clear. While Republicans immediately tried to tie Democrats identifying themselves as socialist with failed regimes in places like Venezuela and Cuba, and Newt Gingrich declared that socialists are “demons,” the reality may be somewhat more complicated.

Under classical Marxism, government ran the economy, owned factories and farms, and determined what was to be produced, who would get it, and what workers would be paid. This does not seem to be what today’s Democratic Socialists are promoting. Instead, the pro-free market Economist characterized what they are advocating this way:

“Even the platform of Bernie Sanders . . . left capitalism fundamentally intact, calling instead for a broader and more redistributive social safety net. His supporters seem enamored of Nordic-style social welfare policies. But those countries are not socialist; they are free market economies with huge rates of taxation that finance generous public services. Indeed, the ‘socialist’ part of those countries that (Democratic Socialists) support would be unaffordable without the dynamic capitalist part they dislike.”

While Republicans denounce “socialism,” the fact is that they endorse a form of government intervention in the economy — what many have called “crony capitalism” — which also challenges the idea of free market capitalism, except it serves a different constituency than would Bernie Sanders and those who embrace his philosophy.

In an article entitled “Corporate Welfare Lives On and On” in The American Conservative, Doug Bandow, senior fellow at the Cato Institute, notes that:

“Fiscal responsibility is out of fashion. The latest federal budget, drafted by a Republican president and Republican-controlled Congress, blew through the loose limits established by Democratic President Barack Obama. The result is trillion-dollar deficits as far as the eye can see.”

In Bandow’s view:

“Any amount of corporate welfare is too much. . . . Business plays a vital role in a free market. People should be able to invest and innovate, taking risks while accepting losses. In real capitalism there are no guaranteed profits. But corporate welfare gives the well-connected protection from many of the normal risks of business. Business subsidies undermine both capitalism and democracy. Allowing politicians to channel economic resources toward their preferred ends distorts investments and trade. Turning government into an engine of illicit profit encourages what economists call rent-seeking. Well-organized special interests usually triumph over the broader public and national interest.”

Tad DeHaven, a Mercatus scholar at George Mason University, makes the case that:

“Corporate welfare often subsidizes failing and mismanaged businesses and induces firms to spend more time on lobbying rather than on making better products. Instead of correcting market failures, federal subsidies misallocate resources and introduce government failures into the marketplace.”

Government aid to business comes in many forms and is distributed through a variety of agencies, such as the Export-Import Bank and the Small Business Administration. There is direct spending, usually in grants, loans, and loan guarantees. There are limits on competitors, such as tariffs and quotas. There are tax preferences attached to broader tax bills to benefit individual companies and industries. All of these are forms of corporate welfare — ensuring corporate profits. The corporations on the receiving end of such subsidies deploy armies of lobbyists — and make huge campaign contributions — to achieve their goals. They contribute to both parties, so they always have a friend in power.

The Cato Institute argues that:

“Agriculture in particular has spawned a gaggle of sometimes bizarre subsidies, payments, loans, crop insurance, import quotas and more to underwrite farmers. When these distort the marketplace, further efforts are concocted to address these dislocations. A dairy program created milk surpluses, which in turn encouraged state price fixing that generated massive cheese stockpiles. . . . The federal government killed off cows as it continued to subsidize milk. . . . The Export-Import Bank is known as Boeing’s Bank. It provides cheap credit for foreign buyers of American products. This gives foreign firms, such as airlines that purchase Boeing airplanes, an advantage over U.S. carriers that must pay full fare. The Export-Import Bank’s biggest beneficiary, in recent years, has been China.”

Those who believe in free markets have adversaries in both parties. The left’s advocacy of socialism and the right’s embrace of corporate welfare both lead us in the direction of a government-managed economy. In the long run, other basic freedoms are also challenged when government control of the economy increases.

When the Founding Fathers turned their attention to questions of economic organization, they asked themselves which economic form would best maintain the free society they were in the process of creating. Clearly, the answer was free enterprise. For men suspicious of government power, this was an obvious choice.

Professor Milton Friedman explained that:

“The kind of economic organization that provides economic freedom directly, namely competitive capitalism, also promotes political freedom because it separates economic power from political power and in this way enables the one to offset the other. Political freedom means the absence of coercion of a man by his fellow men. The fundamental threat to freedom is power to coerce, be it in the hands of a monarch, a dictator, an oligarchy, or a momentary majority.”

In Friedman’s view:

“The preservation of freedom requires the elimination of such concentration of power to the fullest possible extent and the dispersal and distribution of whatever power cannot be eliminated — a system of checks and balances. By removing the organization of economic activity from the control of political authority, the market eliminates that source of coercive power. It enables economic strength to be a check to political power rather than a reinforcement.”

Political partisanship prevents Americans from understanding the forces that are at work in Washington. Republicans and Democrats regularly demonize each other, but regardless of which party holds office, government power grows and freedom declines. Whether it is bailing out Wall Street with taxpayer dollars, or subsidizing failing businesses, or keeping out competing products with tariffs, the last thing either party seems to want is a genuinely free market.

Understanding that the political parties are co-conspirators in the expansion of political power and the diminution of freedom is the beginning of political wisdom.

What Would the Founding Fathers Think of the Growth of Executive Power?

Executive power has been steadily growing, regardless of which party was in power. The Constitution clearly gives Congress the power to declare war. Still, we have gone to war in Korea, Vietnam, Iraq, Afghanistan, and a host of other places upon the authority of the president alone. Today, the president, on his own authority — without approval by Congress — imposes tariffs upon China, Canada and a host of other countries. We are even told by some that a president cannot be indicted or subpoenaed — even though the Constitution says no such thing. This would, in effect, place a president above the law. And how many of the rules under which we live have been imposed by executive order — by Bill Clinton, George W. Bush, Barack Obama, and Donald Trump — with no action by our elected representatives in Congress?

The Founding Fathers understood that freedom was not man’s natural state. Their entire political philosophy was based on a fear of government power and the need to limit and control that power very strictly. It was their fear of total government which initially caused them to rebel against the arbitrary rule of King George III. In the Constitution, they tried their best to construct a form of government that, through a series of checks and balances and a clear division of powers, would protect the individual.

The Founding Fathers would be disappointed to see the growth of government power, particularly in the executive branch. But they would not be surprised. In a letter to Edward Carrington, Thomas Jefferson wrote that, “The natural progress of things is for liberty to yield and government to gain ground.” He noted that:

“One of the most profound preferences in human nature is for satisfying one’s needs and desires with the least possible exertion, for appropriating wealth produced by the labor of others, rather than producing it by one’s own labor . . . the stronger and more centralized the government, the safer would be the guarantees of such monopolies, the weaker the producer, the less consideration need be given him and the more might be taken away from him.”

That government should be limited — and clearly divided between separate branches — and that power is a corrupting force was the essential perception held by the men who formed the nation. In The Federalist Papers, James Madison declared:

“It may be a reflection on human nature that such devices should be necessary to control the abuses of government. . . . But what is government itself but the greatest of all reflections on human nature? If men were angels, no government would be necessary. If angels were to govern men, neither external nor internal controls on government would be necessary. In framing a government which is to be administered by men over men, the great difficulty lies in this: you must first enable the government to control the governed, and in the next place, oblige it to control itself.”

The Founding Fathers were not utopians. They understood man’s nature. They attempted to form a government that was consistent with — not contrary to — that nature. Alexander Hamilton pointed out that:

“Here we already have seen enough of the fallacy and extravagance of those idle theories which have amused us with promises of an exemption from the imperfections, weaknesses, and evils incident to society in every shape. Is it not time to awake from the deceitful dream of a golden age and adopt as a practical maxim for the direction of our political conduct that we, as well as the other inhabitants of the globe, are yet remote from the happy empire of perfect wisdom and perfect virtue?”

Rather than viewing man and government in positive terms, the framers of the Constitution had almost precisely the opposite view. John Adams expressed the view that, “Whoever would found a state and make proper laws for the government of it, must presume that all men are bad by nature.” Adams attempted to learn something from the pages of history:

“We may appeal to every page of history we have hitherto turned over, for proofs irrefragable, that the people, when they have been unchecked, have been as unjust, tyrannical, brutal, barbarous, and cruel as any king or senate possessed of uncontrollable power. . . . All projects of government, formed upon a supposition of continued vigilance, sagacity, and virtue, firmness of the people when possessed of the exercise of supreme power, are cheats and delusions. . . . The fundamental article of my political creed is that despotism, or unlimited sovereignty, or absolute power, is the same in a majority of a popular assembly, an aristocratic council, an oligarchical junto, and a single emperor. Equally bloody, arbitrary, cruel, and in every respect diabolical.”

During the colonial period, Americans became all too familiar with the dangers of an all-powerful king and unlimited and arbitrary government. The Revolution was fought to prevent such abuses. When the Articles of Confederation were being considered, fears of excessive concentration of authority were often expressed. The town of West Springfield, Massachusetts, to cite one example, reminded its representatives of the

“ . . . weaknesses of human nature and growing thirst for power. . . . It is freedom, Gentlemen, it is freedom, and not a choice of the forms of servitude for which we contend.”

To prevent the growth of unlimited government power, the Constitution divided government among legislative, executive, and judicial branches. The Congress was to be the most important branch, elected by the people on a frequent basis. The experience of life under an all-powerful king made a powerful president less than appealing. As years went by, however, the executive — whether Democrat or Republican — assumed more and more power.

Under President George W. Bush, for example, many began to refer to a new “Imperial Presidency.” The Cato Institute study “The Cult of the Presidency” notes that the Bush administration’s broad assertion of executive power included:

“ . . . the power to launch wars at will, to tap phones and read e-mails without a warrant, and to seize American citizens on American soil and hold them for the duration of the war on terror — in other words, perhaps forever — without ever having to answer to a judge.”

The study’s author, Gene Healy, points out that:

“Neither Left nor Right see the president as the Framers saw him: a constitutionally constrained chief executive with an important, but limited, job: to defend the country when attacked, check Congress when it violates the Constitution, enforce the law — and little else. Today, for conservatives as well as liberals, it is the president’s job to protect us from harm, to ‘grow the economy,’ to spread democracy and American ideals abroad, and even to heal spiritual malaise.”

The modern presidency has become one far different from the one set forth in the Constitution. The Cato Institute provides this assessment:

“The constitutional presidency, as the Framers conceived it, was designed to stand against the popular will as often as not, with the president wielding the veto power to restrain Congress when it transgressed its constitutional bounds. In contrast, the modern president considers himself the tribune of the people, promising transformative action and demanding the power to carry it out. The result is what political scientist Theodore J. Lowi has termed ‘the plebiscitary presidency’: ‘An office of tremendous personal power drawn from people . . . and based on the . . . theory that the presidency with all powers is the necessary condition for governing a large democratic nation.’”

The men who led the Revolution, unlike many today — in both parties — were suspicious of power and of those who hold it. Samuel Adams declared:

“There is a degree of watchfulness over all men possessed of power or influence upon which the liberties of mankind much depend. It is necessary to guard against the infirmities of the best as well as the wickedness of the worst of men. Jealousy is the best security of public Liberty.”

The Founding Fathers would not be happy with our increasingly powerful government — and chief executive — but they would not be surprised. Leaving the Constitutional Convention, Benjamin Franklin was asked what kind of government had been established. He replied, “A Republic, if you can keep it.”

People who call themselves “conservative” used to understand all this. Now, they seem to have forgotten.

An Epidemic of Child Abuse in the Catholic Church: What Would Jesus Say?

More than 300 Catholic priests across Pennsylvania sexually abused children over seven decades, protected by a church hierarchy that covered it up, according to a sweeping grand jury report released in mid-August. The investigation, one of the most comprehensive inquiries into church sex abuse in U.S. history, identified 1,000 child victims — but reported that there are probably thousands more.

The grand jury wrote that, “Priests were raping little boys and girls, and the men of God who were responsible for them not only did nothing, but hid it all for decades.” The 1,400-page report described some of the abuses in disturbing detail. In Erie, a 7-year-old boy was sexually abused by a priest who then told him he should go to confession and confess his “sins” to that same priest. Another boy was repeatedly raped from ages 13 to 15 by a priest who bore down so hard on the boy’s back that it caused severe spinal injuries. He became addicted to painkillers and later died of an overdose.

One victim in Pittsburgh was forced to pose naked as Christ on the cross as priests photographed him. Priests gave the boy and others gold cross necklaces to mark them as being “groomed” for abuse. One priest raped a girl who bore him a child. Another made his victim get an abortion. The report notes four cases in the Scranton diocese in which bishops and other church leaders allowed predator priests to continue in the ministry. The leadership also used confidentiality agreements with settlements to silence the victims. In one instance, they provided tuition for a boy to attend a school in the diocese.

Consider the case of the Rev. Thomas D. Skotek. He sexually assaulted a minor female between 1980 and 1985, resulting in pregnancy. He helped her get an abortion in 1986. When Bishop James Timlin became aware of the situation, he transferred Father Skotek to another parish in 1989, and offered $75,000 to the girl and her family, contingent on a nondisclosure and confidentiality agreement. After the settlement, Bishop Timlin sought to reassure senior Catholic leaders in Rome that Father Skotek’s “criminal” acts would likely remain hidden. Sadly, we could fill pages with reports such as these — and that is what the Pennsylvania grand jury did.

The unfortunate fact is that the decades-long cover-up by the church hierarchy has created a situation in which few criminal cases may result from the massive investigation because most instances of abuse are too old to be prosecuted because of the statutes of limitations. One answer many are now calling for is a re-thinking of the whole idea of statutes of limitations.

Pennsylvania State Rep. Mark Rozzi said he was raped by a priest at his Catholic church in Berks County, Pa. The same priest, he said, sexually abused one of his childhood friends, who killed himself in 2009. Rozzi called on fellow legislators to pass measures that would eliminate the statute of limitations for criminal prosecution of sexual abuse of children. In addition to ending such limitations, the grand jury also called for a law to allow older victims to sue a diocese for damage inflicted upon them as children, tighter laws that mandate the reporting of abuse, and an end to nondisclosure agreements when settlements have been reached.

Corruption in the church has been widespread, from parish priests to bishops and beyond. In July, Cardinal Theodore McCarrick, former Archbishop of Washington, resigned after being accused of sexually abusing children and adults for decades. Cardinal Donald Wuerl, the current archbishop of Washington, figures prominently in the report because he led the Pittsburgh diocese as its bishop from 1988 to 2006. It reports that, at times, he removed abusive priests, and, at other times, guided them back into parishes. 

The fact is that there has been no full accounting of abuse in the Catholic Church in the U.S. Peter Isely, a longtime advocate for victims of sexual abuse, said groups have long been pressing the U.S. government for a national investigation of child sex abuse, particularly in the Catholic Church. Isely, who was abused and is a spokesman for the global group Ending Clergy Abuse, said that a five-year inquiry in Australia is “the gold standard,” but that other nations, including Canada, Germany, and Ireland, have conducted national reviews. “Imagine if they did what was done in Pennsylvania, but nationwide,” he said. In Chile, prosecutors and police are raiding church offices, confiscating documents, and looking for crimes that went unreported to police.

Hopefully, the Pennsylvania grand jury report will lend new momentum to statute reform efforts both in that state and nationwide. “This will reignite these battles at the state level,” said Michael Moreland, a law professor at Villanova University, a Catholic school outside of Philadelphia. The grand jury also urges a two-year “civil window” in the existing statutes of limitations that would allow victims to sue the church for damages no matter when the abuse occurred. “These victims ran out of time before they even knew they had a case,” the grand jury wrote. In the past, the Catholic Church has lobbied fiercely against any such provisions that would hold it accountable. The church argues that it would be left open to “financial catastrophe.” Given the church’s role, that might, many would argue, constitute simple justice.

How would Jesus react to a church acting in His name in such a manner? When it comes to the abuse of children, consider these words of Jesus in the Gospel of Matthew:

“Anyone who welcomes one little child like this in my name, welcomes me. But anyone who is the downfall of one of these little ones who have faith in me would be better drowned in the depths of the sea with a great millstone around his neck.”

For a much lesser offense than the sexual abuse of children, we know how Jesus reacted to moneychangers and others who were corrupting the temple. Jesus and his disciples travel to Jerusalem for Passover. He finds the holy temple corrupted by merchants and moneychangers. He expels them for having turned the temple into a “den of thieves” through their commercial activities. In John 2:13-16, we read:

“And making a whip of cords, He drove them all out of the temple, with the sheep and oxen. And He poured out the coins of the moneychangers and overturned their tables. And He told those who sold the pigeons, ‘Take these things away; do not make my Father’s house a house of trade.’”

Nathan W. O’Halloran identifies the actions of Jesus with “a calculated prophetic action evocative of the temple condemnation in Jeremiah 7:1-15.” The Gospel of Mark uses the phrase, “Then he taught them. . . .” as Jesus references the prophet Jeremiah. The quote from Jeremiah reads:

“Are you to steal and murder, commit adultery and perjury, burn incense to Baal, go after strange gods that you know not, and yet come to stand before me in this house which bears my name, and say: ‘we are safe; we can commit all these abominations again’? Has this house which bears my name become in your eyes a den of thieves? I, too, see what is being done, says the Lord” (Jeremiah 7:9-11).

The Catholic Church portrayed in the Pennsylvania grand jury report is not a “den of thieves” but something far worse. One can only imagine how Jesus would respond to those who have inflicted such horror, pain and suffering in His name.     *

Tuesday, 10 July 2018 11:29

Ramblings

Allan C. Brownfeld

Fifty Years Ago, Washington Was Burning; Despite Continuing Problems, Advances in Race Relations Have Been Dramatic

Fifty years ago, on April 4, 1968, the Rev. Martin Luther King, Jr. was murdered in Memphis. Riots exploded in 125 cities nationwide; 43 people died, 3,500 were injured, and 27,000 were arrested in the violence during the ten days following King’s murder, according to Peter B. Levy’s new book about race riots in the 1960s, The Great Uprising. Damage estimates reached upwards of $65 million — about $442 million today.

In 1968, this writer, a few years out of law school, was working in the U.S. Senate — as Washington went up in flames. Thirteen people were killed, two of them never identified. The air was filled with smoke and tear gas and the streets were littered with broken glass. Parts of the city resembled combat zones, and 13,000 members of the Army, Marines, and National Guard were brought in to regain control. I remember tanks patrolling the streets of Capitol Hill and a curfew of 6 p.m. From the window of a friend’s apartment across the Potomac River in Alexandria, Virginia, we could see smoke filling the air of the nation’s capital. It felt as if our country was tearing itself apart.

In those days, Washington was a largely segregated city. It did not elect its own city government, but was controlled by the House Committee on the District of Columbia, presided over by Rep. John McMillan (D-SC) and other Southern Democrats who were sympathetic to segregation. Here was the very system of taxation without representation against which the American Revolution had been launched. Thus, a majority-black city was disenfranchised, adding to the anger on display in the riots.

Charlene Drew Jarvis, a fourth generation black Washingtonian, and former member of the D.C. Council, recalls that:

“There was a confluence of anger and hurt about the death of Martin Luther King. But there was also a way of breaking out of a cage in which African Americans felt they had been contained. A lot of it had to do with, ‘We’ve been contained here. We’re angry about this. We owe nothing to people who have confined us.’”

Much has changed in Washington since 1968. There is now an elected D.C. Mayor and City Council. Former D.C. Mayor Anthony Williams, who is black, notes that:

“Civil rights advances resulted in the desegregation of the federal and District government work forces, reversing discrimination that began, formally at least, more than 50 years earlier during the Woodrow Wilson administration.”

This is not to say that Washington does not still have serious social problems to deal with, or that many African-Americans do not continue to feel left behind. But the larger picture, as Anthony Williams points out, is a positive and hopeful one:

“For many residents, commuters and tourists, life is dramatically better. Seen through this lens, 50 years after rioting left large sections of the city in ruin, the District is a great success story. Washington has advanced markedly in its revitalization, its finances are on an enviable footing, its population continues to increase, investment continues to flow, and it is considered a front-runner for a new Amazon headquarters. . . . I am optimistic. Yes, inequality has been persistent; yes, the concentration of poverty in our city is daunting; but we have the capacity, we have the resources, and we’ve shown the willingness to tackle the big problems.”

There has been much concern expressed in recent days about the state of race relations in the American society. The election of Barack Obama in 2008 inspired hope that the country had moved into a new era of racial equality. More recently, the rise of white nationalist groups, such as those which organized last summer’s racist march in Charlottesville, Virginia, and continued police shootings of unarmed black men, have caused some to express doubt that real progress has, in fact, been made.

The reality, however, is that, despite shortcomings, things continue to improve. According to the Center for American Progress, the number of black men between 18 and 24 who attend some form of higher education is on the rise. Between 1976 and 2014, the share of black men aged 25 and over who had earned at least a bachelor’s degree rose from 6.3 percent to 20.4 percent. Over the same period, the high school dropout rate for black men more than halved, falling from 21.2 percent to 8.1 percent.

The respected black academician, Prof. Henry Louis Gates, Jr. of Harvard, believes that the past five decades have been, if not a new Reconstruction, the occasion for tremendous progress for black Americans:

“This period, 1965-2015, I was thinking of it as the Second Reconstruction. This specific period is one between the Voting Rights Act and the re-election of the first black man to occupy the White House.”

Gates refers to this 50-year period as one of “unparalleled advances for black people,” which he explored in a four-hour PBS series, “Black America Since MLK: And Still I Rise” in 2016. While racial injustice continues to exist, in Gates’ view:

“The picture is quite complicated. On one hand, the black middle class has doubled. The black upper middle class has quadrupled. We have more black people elected to state office than ever before. These things were scarcely imaginable the terrible day in April 1968 when Dr. King was killed.”

To those of us of a certain age, who lived in the South during the years of segregation, when a black person could not get a cup of coffee, or use a rest room or, in many cases, cast a ballot, to suggest that race relations have not been steadily improving is to ask us not to believe our own eyes. When I was in college, President Eisenhower sent federal troops to integrate the schools in Little Rock. In our dormitory discussions of events in the world, if anyone had suggested that we would live to see a black Governor of Virginia, or a black Secretary of State, let alone a black president, he would have been viewed as mad. We have, fortunately, lived to see things we never imagined. But things don’t move steadily in the right direction. Sometimes there is a backward step. Some problems prove difficult to resolve. But, taking all things in their proper perspective, 50 years after Dr. King’s murder and Washington in flames, we are a better country than we have ever been when it comes to race. Hopefully, despite all of our problems, we will become better still.

The Strange Criticism of the Movie “Chappaquiddick” — A Seeming Defense of Ted Kennedy’s Admittedly Bad Behavior

Political partisanship, on all sides of the political spectrum, makes people do strange things. Many find a way to defend the most outrageous behavior on the part of those within their own party — behavior they would find completely unacceptable if engaged in by those in the opposition. This is part of the reason people have such a low opinion of politicians, both Republicans and Democrats.

The new movie “Chappaquiddick,” which this writer found to be a fair presentation of what occurred on the night of Friday, July 18, 1969, has become the subject of controversy. The late-night accident occurred on Chappaquiddick Island, Massachusetts; it was caused by Sen. Edward M. Kennedy’s negligence and resulted in the death of 28-year-old Mary Jo Kopechne, who was trapped inside the vehicle.

According to his own testimony, Kennedy accidentally drove his car off the one-lane bridge and into a tidal basin. He swam free, left the scene, and did not report the accident to the police for ten hours. Kopechne died inside the fully submerged car. The next day, the car with Kopechne’s body inside was recovered by a diver, minutes before Kennedy reported the incident to local authorities. Kennedy pleaded guilty to a charge of leaving the scene of a crash causing personal injury and received a suspended two-month jail sentence.

The film begins the day before the crash and ends six days later. The film’s producer, Mark Ciardi, says that:

“It’s amazing how compelling that narrative is when you just look at the facts. The writers used the inquest. It wasn’t off of a book. We went with the facts we knew, and didn’t make a movie for the left or right. It’s for the truth. . . . It’s a very tight line to walk because it’s a pretty bad incident that happened. A girl died at his hands, and his actions after proved pretty incredible — not in a great way.”

Mary Jo Kopechne was a member of Robert Kennedy’s staff for four years, and was considered a political idealist with a promising future. On the night in question, Kennedy attended a party on Chappaquiddick, an island off the coast of Martha’s Vineyard. Also attending was a group of young women who had worked in Robert Kennedy’s 1968 presidential campaign. Kennedy left the party early, and Kopechne asked if she could join him for a ride back to the hotel.

In an interview with Breitbart senior editor Rebecca Mansour, Ciardi discussed the film’s revelation of how Kopechne might have been saved if Kennedy had acted differently after the accident. He notes that:

“We spoke and tracked down the scuba diver, John Farrar, and he had been on record. . . . As he was recounting it, it was chilling, as if it was yesterday. He said when he got into that car and saw the position of the body and the way it was almost kind of reaching for her last breath up in the corner, hands up. When he took her out and they put her on the beach, when they compressed the stomach and chest, that there was a kind of pink froth coming out of the nose and mouth, which that signals to him that it was asphyxiation.”

Kopechne’s death, says Ciardi:

“. . . was not drowning. He didn’t know how long. He said it could have been five minutes or up to a couple of hours. . . . But she was alive in the car. The fact that he (Kennedy) walked past, 75 yards away, the dike house with the light on, he could have lit that island up and they could have had help there. Maybe she could have been saved. We can’t say for sure. But even if there’s a chance, it’s pretty bad not to try. At least have that wherewithal, even if you’ve been drinking. You don’t worry about your own consequences. That’s his biggest failing, and he didn’t report it for ten hours. You cannot get around that, and then he was having brunch the next morning. And that’s factual.”

In Ciardi’s view:

“Kennedy portrayed himself almost as a victim following the accident. . . . I mean he’s responsible for someone’s death, and then not to notify anybody, and pretend like it didn’t happen. In some ways he was reduced to a kind of child. He was like a ten-year-old who threw a baseball through a window and pretended that it didn’t happen.”

Kennedy said that he “was not driving under the influence of liquor” and that his conduct after the accident “made no sense to him at all.” He regarded as “indefensible” the fact that he did not report the accident to the police “immediately.” He said there was “no truth whatsoever to the widely circulated suspicions of immoral conduct.” At the inquest in January 1970, Judge James A. Boyle found that Kennedy:

“. . . failed to execute due care as he approached the bridge. . . . There is probable cause to believe that Edward M. Kennedy operated his motor vehicle negligently . . . and that such operation appears to have contributed to the death of Mary Jo Kopechne.”

The movie does not allude to the many conspiracy theories that have surrounded the Chappaquiddick incident. It has received largely favorable reviews from such liberal publications as the Village Voice, Vanity Fair and The New York Times. But it has also come under attack.

CNN was particularly harsh. “‘Chappaquiddick’ is heavy-handed history, a film that at times seems to owe as much to ‘The X-Files’ as the many cinematic dives into the target-rich territory that is the Kennedy clan,” wrote critic Brian Lowry. New Yorker critic Richard Brody wrote that, “The sketches of Kennedy-family tensions and loyalties are thin and simplistic; the action rushes by with little insight or context.” Neal Gabler, an opinion writer in The New York Times, charges the movie with “character assassination.”

Mr. Gabler argues that the film’s advertised claim to tell the “untold true story” of a “cover-up” is pointless because:

“. . . the story has been told plenty, and no one but the most lunatic conspiracy theorists see this as anything but a tragic accident in which nothing much was covered up. . . . Many scenes cross from dramatic interpretation to outright character assassination. In this version, the Kennedy character leaves Kopechne to die as she gasps for air, and then with the aid of his brothers’ old advisers, cooks up a scheme to salvage his presidential ambitions.”

But, in fact, the movie’s portrayal of events is quite true to history. Washington Times columnist Joseph Curl notes that:

“Contrary to what the Times’ writer claims, the movie does not delve into conspiracy theories — Kennedy does not appear drunk and there’s no mention of the rumors that spread after the accident that Kopechne was pregnant with Kennedy’s child. But the film does wade into some territory for which there is much factual support. An autopsy was never performed on Kopechne (the police chief and judge involved were all in the bag for Kennedys) but there is evidence that she did not drown. . . . And the movie perfectly captures Kennedy’s attempts to cover up the circumstances of her death, bringing in a team of high-powered politicos to concoct a plausible story. In one hilarious scene, Kennedy dons a neck brace to look injured (he wasn’t) and his only true friend, cousin Joseph Gargan (who throughout the movie plays a sort of Good Angel on his shoulder), forcibly rips it off him.”

What the viewer is left with, writes Curl:

“. . . is simply a portrait of a weak man — perhaps beaten down by a brutal and demanding father and the pressure of being the last of four of America’s most famous brothers. But throughout, Kennedy’s weak moral core is exposed. He makes the easy choice every time, the one most likely to save his skin.”

The movie ends with Kennedy giving a nationwide T.V. speech. He closes the speech with a quote from his brother Jack’s book, Profiles In Courage (which was not written by John F. Kennedy at all, but by Ted Sorensen):

“It has been written, ‘A man does what he must — in spite of personal consequences, in spite of obstacles and dangers and pressures — and that is the basis of all human morality. Whatever may be the sacrifices he faces if he follows his conscience — the loss of his friends, his fortune, his contentment, even the esteem of his fellow man — each man must decide for himself the course he will follow. The stories of past courage cannot supply courage itself. For this, each man must look to his own soul.’”

Noble sentiments, indeed. But Ted Kennedy’s actions that night in 1969 were something quite different. This story is a part of our history, and in this movie that story is told accurately. Did Ted Kennedy regret his actions and move beyond them in later life? This seems to be the case. And the movie ends on precisely that note.

As the film ends, the camera rests on a still image of the Chappaquiddick bridge where Kopechne died, and the audience hears a sound montage listing Kennedy’s legislative accomplishments throughout his long political career after the incident. The film challenges viewers to consider the life of a complex man, with both achievements and great character flaws.

Why some narrow partisans have attacked this movie is difficult to understand — just as it is difficult to understand why honorable men and women will defend the dishonorable actions of politicians they view as being on “their” side. The political issues we debate — whether health care, the environment, taxes or education — may be less important in the long run than the moral character of those we choose as leaders, and the example they set. There is much to think about concerning our contemporary political life when considering “Chappaquiddick.”

Whatever Happened to American Conservatism: Remembering Russell Kirk

This year marks the 100th anniversary of the birth of Russell Kirk, who may be most responsible for the emergence of an intellectually vigorous and politically viable conservative movement in the latter part of the last century. If he were still with us, it would be interesting to consider his assessment of the many strange formulations that call themselves “conservative” at the present time. 

Historian Bradley Birzer, author of the recent biography, Russell Kirk: American Conservative, notes that, “Amidst today’s whirligig of populist conservatism, crass conservatism, consumerist conservatism, we conservatives and libertarians have almost completely forgotten our roots.”

Writing in The American Conservative, Birzer declares that:

“These roots can be found in Kirk’s thought, an eccentric but effective and potent mixture of stoicism, Burkeanism, anarchism, romanticism, and humanism. It is also important — critically so — to remember that Kirk’s vision of conservatism was never primarily a political one. Politics should play a role in the lives of Americans, but a role limited to its own sphere that stays out of rival areas of life. Family, business, education and religion should each remain sovereign, devoid of politics and politicization. Kirk wanted a conservatism of imagination, of liberal education, and of human dignity. Vitally, he wanted a conservatism that found all persons — regardless of their accidents of birth — as individual manifestations of the eternal and universal Logos. One hundred years after the birth of Russell Amos Kirk, those are ideas well worth remembering.”

To all who knew him, Kirk was a gentleman and scholar of the old school, always seeking to understand how men and women and societies work and interact, and to carefully delineate which things are permanent and must be preserved, and which are temporal and can, and often must, be altered.

Kirk was the author of more than 30 books. His best-known work, The Conservative Mind, was published in 1953 and presented the intellectual and historic framework for contemporary American conservatism. It was a bestseller when it appeared and has never been out of print in subsequent years. Speaking at a testimonial dinner in Kirk’s honor in Washington in 1981, President Ronald Reagan said:

“Dr. Kirk helped renew a generation’s interest and knowledge of ‘permanent things,’ which are the underpinnings and the intellectual infrastructure of the conservative revival of our nation.”

To those who argued that it was liberal ideas that defined the American experience, Kirk, through an extensive discussion of Edmund Burke, John Adams, James Fenimore Cooper, Nathaniel Hawthorne, Benjamin Disraeli, Herman Melville, T.S. Eliot, and George Santayana, presented readers with a different intellectual and moral tradition, one with deep roots in the past — the Judeo-Christian tradition, the experience of Greece and Rome, the tradition of democratic self-government as it evolved in England from the time of Magna Carta.

It was Kirk’s view that if our nation were to grow and thrive, it must remember and understand the historical roots from which it grew. In The Roots of American Order, he wrote that:

“Lacking a knowledge of how we arrived where we stand today, lacking the deep love of country which is nurtured by knowledge of the past, lacking the apprehension that we all take part in a great historical continuity — why, a people so deprived will not dare much, or take long views. With them, creature comforts will be everything; yet historical consciousness wanting, in the long run they must lose their creature comforts too.”

The roots of the American order, Kirk showed, went back to the ancient world — to the Jews and their understanding of a purposeful universe under God’s dominion, to the Greeks, with their high regard for the uses of reason, to the stern virtues of Romans such as Cicero, to Christianity, which taught the duties and limitations of man, and the importance of the transcendent in our lives. The roots of our order, in addition, include the traditions and universities of the medieval world, the Reformation and the response to it, the development of English common law, the debates of the 18th century, and the written words of the Declaration of Independence and the Constitution.

The beliefs which motivated the Founding Fathers, Kirk pointed out, were ancient in origin:

“From Israel . . . America inherited an understanding of the sanctity of law. Certain root principles of justice exist, arising from the nature which God has conferred upon man; law is a means for realizing those principles, so far as we can. That assumption was in the minds of the men who wrote the Declaration . . . and the Constitution. . . . A conviction of man’s sinfulness, and the need for laws to restrain every man’s will and appetite, influenced the legislators of the colonies and of the Republic. . . . Thomas Jefferson, rationalist though he was, declared that in matters of political power, one must not trust in the alleged goodness of man, ‘but bind him down with chains of the Constitution.’”

It was Kirk’s hope to persuade the rising generation to set their faces against:

“. . . political . . . fanaticism . . . and utopian schemes. ‘Politics is the art of the possible,’ the conservative says; he thinks of political policies as intended to preserve order, justice, and freedom. The ideologue, on the contrary, thinks of politics as a revolutionary instrument for transforming society and even transforming human nature. In his march toward Utopia, the ideologue is merciless.”

The ideologies that have been so costly — Nazism, Communism, Fascism — are, Kirk pointed out, really “inverted religion.” But, he noted:

“The prudential politician knows that ‘Utopia’ means ‘Nowhere,’ and that true religion is a discipline for the soul, not for the state. . . . In the 20th century it has been the body of opinion generally called ‘conservative’ that has defended the Permanent Things from ideological assault.”

Conservatism, to Kirk:

“. . . is not a bundle of theories got up by some closet philosopher. On the contrary, the conservative conviction grows out of experience, the experience of the species, of the nation, of the person. . . . It is the practical statesman, rather than the visionary recluse, who has maintained a healthy tension between the claims of authority and the claims of freedom.”

Not long before Kirk’s death in 1994, this writer, who knew him for more than three decades, spent a leisurely lunch with him and his wife, Annette, at which we discussed many of the problems facing our society. He lamented that the evidence of decadence is all around us — growing crime, increasingly unstable families, schools which are no longer transmitting our history and cultural traditions, and ever more wasteful government. He saw this as not dissimilar to Greece and Rome in their days of decline. Still, he was not a pessimist, for he took history’s long view, as he did in the epilogue of The Politics of Prudence, which had recently been published. He wrote, “We may remind ourselves that ages of decadence sometimes have been followed by ages of renewal.” He urged the young to explore the past, discover the roots of our civilization, and work to restore its sensibility. “Time is not a devourer only,” he concluded.

Both Time and Newsweek described Kirk as one of the nation’s most influential thinkers. He often quoted the 1843 speech of Orestes Brownson, given at Dartmouth College: “Ask not what your age wants, but what it needs, not what it will reward, but what, without which it cannot be saved, and that go and do.” For 75 years, Russell Kirk did just that.

It would be interesting to know what Russell Kirk would think of the politics of 2018, in particular what those who now call themselves “conservative” proclaim — the lack of civility, the characterization of those with whom we disagree as “enemies,” the coarseness and vulgarization of our political life. It is certain that he would be unhappy, but equally certain that he would not be surprised. Human nature being what it is, periods such as this have been seen before. He would, more than likely, lament that the conservative movement he helped to launch after World War II had evolved into something quite different. But this, he might say, will not last either. Something better, he might predict, is just over the horizon. If that would indeed be his prediction, let’s hope he’s right.     *

Friday, 07 July 2017 10:20

Ramblings

Allan C. Brownfeld

The Attack on Robert E. Lee Is an Assault on American History Itself

Early in February, the City Council of Charlottesville, Virginia, voted 3-2 to remove a bronze equestrian monument to Robert E. Lee that stands in a downtown park named in his honor. Vice Mayor Wes Bellamy, the council’s only African American member, led the effort to remove the statue. In the end, this vote may be largely symbolic. Those opposed to the statue’s removal intend to file a lawsuit and point to a state statute that says Virginia cities have no authority over the war memorials they inherited from past generations. “If such are erected,” the law reads, “it shall be unlawful for the authorities of the locality, or any other person or persons, to disturb or interfere with any monuments or memorials so erected.”

The attack on the Robert E. Lee statue is, in reality, an attack on American history itself. It has been suggested that the Washington Monument and Jefferson Memorial are inappropriate, since they celebrate men who owned slaves. Those who seek to erase our history seem a bit like the Taliban and ISIS, who are busy destroying historic structures all over the Middle East that predate the rise of Islam. History is what it is, a mixed bag of mankind’s strengths and weaknesses, of extraordinary achievements and the most horrible depredations. To judge the men and women of past eras by today’s standards is to be guilty of what the Quaker theologian Elton Trueblood called the “sin of contemporaneity.”

Those who refer to slavery as America’s “original sin” should review history. Sadly, from the beginning of recorded history until the 19th century, slavery was the way of the world. When the U.S. Constitution was written in 1787, slavery was legal everywhere in the world. What was unique was that in the American colonies there was strenuous objection to slavery, and that the most prominent framers of the Constitution wanted to eliminate it at the very start of the nation.

Our Judeo-Christian tradition, many now forget, accepted the legitimacy of slavery. The Old Testament regulates the relationship between master and slave in great detail. In Leviticus (XXV: 39-55), God instructs the Children of Israel to enslave the heathen and their progeny forever. In the New Testament, St. Paul urges slaves to obey their masters with full hearts and without equivocation. St. Peter urges slaves to obey even unjust orders from their masters.

At the time of its cultural peak, ancient Athens may have had 115,000 slaves to 43,000 citizens. The same was true of ancient Rome. Plutarch notes that on a single day in the year 167 B.C., 150,000 slaves were sold in one market. The British historian of classical slavery, Moses I. Finley, writes: “The cities in which individual freedom reached its highest expression — most obviously Athens — were cities in which chattel slavery flourished.”

American history is flawed, as is any human enterprise. Yet those who now call for the removal of statues and monuments commemorating our past are measuring our history against perfection, not against other real places. What other societies in 1787 — or any date in history prior to that time — would these critics find freer and more equitable than ours? Where else was religious freedom to be found in 1787? Compared to perfection, our ancestors are found wanting. Compared to other real places in the world, they were clearly ahead of their time, advancing the frontiers of freedom. 

In the case of Robert E. Lee himself, there is more to his story than the Charlottesville City Council may understand. Everyone knows that Lee’s surrender to Ulysses S. Grant at Appomattox effectively ended the Civil War. What few remember today is the real heroism of Robert E. Lee. By surrendering, he was violating the orders given by Jefferson Davis, the elected leader of the Confederacy. The story of April 1865 is not just one of decisions made, but also of decisions rejected. The importance of Lee’s rejection of continuing the war as a guerrilla struggle, which was Jefferson Davis’s preference, and of Grant’s choice to be magnanimous, cannot be overstated.

With the fall of Richmond, Davis and the Confederate government were often on the run. Davis, writes Prof. Jay Winik in his important book April 1865: The Month That Saved America,

“. . . was thinking about such things as a war of extermination . . . a national war that ruins the enemy. In short, guerrilla resistance. . . . The day after Richmond fell, Davis had called on the Confederacy to shift from a conventional war to a dynamic guerrilla war of attrition, designed to wear down the North and force it to conclude that keeping the South in the Union would not be worth the interminable pain and ongoing sacrifice.”

But Robert E. Lee knew the war was over. Grant was magnanimous in victory and, Winik points out,

“. . . was acutely aware that on this day, what had occurred was the surrender of one army to another — not of one government to another. The war was very much on. There were a number of potentially troubling rebel commanders in the field. And there were still some 175,000 other Confederates under arms elsewhere; one-half in scattered garrisons and the rest in three remaining rebel armies. What mattered now was laying the groundwork for persuading Lee’s fellow armies to join in his surrender — and also for reunion, the urgent matter of making the nation whole again.”

Appomattox was not preordained. “If anything,” notes Winik,

“. . . retribution had been the larger and longer precedent. So, if these moments teemed with hope — and they did — it was largely due to two men who rose to the occasion, to Grant’s and Lee’s respective actions: one general, magnanimous in victory, the other gracious and equally dignified in defeat, the two of them, for their own reasons and in their own ways, fervently interested in beginning the process to bind up the wounds of the last four years. . . . Above all, this surrender defied millenniums of tradition in which rebellions typically ended in yet greater shedding of blood. . . . One need only recall the harsh suppression of the peasants’ revolt in Germany in the 16th century, or the ravages of Alva during the Dutch rebellion, or the terrible punishments inflicted on the Irish by Cromwell, and then on the Scots after Culloden, or the bloodstained vengeance executed during the Napoleonic restoration, or the horrible retaliation imposed during the futile Chinese rebellion in the mid-19th century.”

Had Robert E. Lee blindly followed irrational instructions to keep fighting a guerrilla war indefinitely, the surrender at Appomattox never would have taken place, and our nation’s history would have been far different. Fortunately, our American tradition has never embraced the notion of blindly following orders, particularly if they involve illegal or immoral acts. No American could ever escape responsibility for such acts by saying, “I was simply following orders.”

The effort to erase our past, as the Charlottesville City Council proposes, comes about, in large part, because we know so little about our own history. Pulitzer Prize-winning historian David McCullough declares that:

“We are raising a generation of people who are historically illiterate. We can’t function in a society if we don’t know who we are and where we came from.”

More than two-thirds of college students and administrators who participated in a national survey were unable to remember that freedom of religion and the press are guaranteed by the Bill of Rights. In surveys conducted at 339 colleges and universities, more than one-fourth of students and administrators did not list freedom of speech as an essential right protected by the First Amendment. 

If we judge the past by the standards of today, must we stop reading Plato and Aristotle, Sophocles and Aristophanes, Dante and Chaucer? Will we soon hear calls to demolish the Acropolis and the Coliseum, as we do to remove memorials to Washington and Jefferson, and statues of Robert E. Lee? Must we abandon the Bible because it lacks modern sensibility? Where will it end? As theologian Elton Trueblood declared, “contemporaneity” is indeed a sin. We would all do well to avoid its embrace.

Free Speech Is Not Only Under Attack at Our Universities, But “Objective Truth” Itself Is Referred to as a “Racist Construct”

Free speech is not faring well on the nation’s college and university campuses. In mid-April, the University of California at Berkeley canceled a scheduled talk by conservative author Ann Coulter in what The New York Times called “the latest blow to the institution’s legacy and reputation as a promoter and bastion of free speech.” In a letter to the Berkeley College Republicans, which was sponsoring the talk, two vice chancellors said the university had been “unable to find a safe and suitable venue for your April 27 event . . .”

In February, a speech by controversial right-wing writer Milo Yiannopoulos, also sponsored by the College Republicans, was canceled after masked protestors smashed windows, set fires, and pelted the police with rocks. The Washington Post notes that:

“The decisions by U.C.-Berkeley to cancel both events are especially notable given the campus’s role during the 1960s and 1970s as the birthplace of the Free Speech Movement and its long tradition of social protest.”

Throughout the country, assaults on free speech are widespread at our colleges and universities. In March, author Charles Murray of the American Enterprise Institute was forced to abandon a lecture at Middlebury College in Vermont. The professor who was hosting him, a liberal Democrat, was assaulted. Recently, Notre Dame students said that they felt “unsafe” at the prospect of Vice President Mike Pence speaking at their commencement. In April, the Student Senate at the University of California at Davis voted to remove the American flag from their meetings. One student declared that the flag “represents a history of genocide, slavery and imperialism.”

Things at our universities are becoming increasingly difficult to understand. The Wall Street Journal reports that:

“Every year, Stanford asks its applicants an excellent question: ‘What matters to you, and why?’ Ziad Ahmed of Princeton, N.J. summed up his answer in three words. His essay consisted of the hashtag ‘#BlackLivesMatter’ repeated 100 times. He got in.”

Carrying things to an extreme unusual even for the advocates of political correctness, a group of students at California’s five-college Claremont Consortium says that objective truth is itself a “myth” espoused by “white supremacists.” This came after Pomona College President David Oxtoby released a statement in defense of free speech when conservative author Heather MacDonald of the Manhattan Institute had an event disrupted at nearby Claremont McKenna College.

President Oxtoby’s letter was met with a list of demands by minority activist students who called MacDonald “a white supremacist fascist supporter of the police state,” and objective truths, such as those cited in the Declaration of Independence, “a means of silencing oppressed peoples.” The authors, Dray Denson, Avery Jonas, and Shanaya Stephenson, received 22 co-signers. They said that silencing conservative speakers, like MacDonald, whose work has been published widely in The Wall Street Journal, The Washington Post, and other newspapers and journals, is a valid option for activists since such speaking engagements constitute “a form of violence.”

During her lecture, MacDonald was attempting to discuss the rise of anti-police attitudes when she was derailed by protestors banging on windows and shouting “F--k The Police” and “Black Lives Matter.” Campus security ultimately forced MacDonald to live stream her lecture from a near-empty room across campus. 

In his letter, President Oxtoby wrote that:

“Protest has a legitimate and celebrated place on college campuses. What we cannot support is the act of preventing others from engaging with an invited speaker. Our mission is founded upon the discovery of truth, the collaborative development of knowledge and the betterment of society.”

This call for free speech was rejected by the student protestors. They wrote:

“Free speech, a right that many freedom movements have fought for, has recently become a tool appropriated by hegemonic institutions. It has not just empowered students from marginalized backgrounds to voice their qualms and criticize aspects of the institution, but it has given those who seek to perpetuate systems of domination a platform to project their bigotry. Thus, if our mission is founded upon the discovery of truth, how does free speech uphold that value?”

The students said that the very idea of objective truth is a concept devised by “white supremacists” in an “attempt to silence oppressed peoples.” They declare that:

“Historically, white supremacy has venerated the idea of objectivity and wielded a dichotomy of ‘subjectivity vs. objectivity’ as a means of silencing oppressed peoples. The idea that there is a single truth —‘the Truth’ — is a construct of the Euro-West that is deeply rooted in the Enlightenment, which was a movement that also described Black and Brown people as both subhuman and impervious to pain. This construction is a myth and white supremacy, imperialism, colonization, capitalism, and the United States of America are all of its progeny. The idea that truth is an entity for which we must search, in matters that endanger our abilities to exist in open spaces, is an attempt to silence oppressed peoples.”

The assault on Heather MacDonald, viewed as a mainstream commentator, not an extremist, was particularly harsh. The students write:

“If engaged, Heather MacDonald would not be debating on mere difference of opinion, but the right of Black People to exist. Heather MacDonald is a fascist, a white supremacist, a War-hawk, a transphobic, a queerphobe, a classist, and ignorant of interlocking systems of domination that produce the lethal conditions under which oppressed peoples are forced to live. . . . Engaging with her, a white supremacist fascist supporter of the police state, is a form of violence.”

The assault on Western Civilization at our universities is hardly new. In the 1980s, Jesse Jackson led a group of militant students through the Stanford campus chanting, “Hey Hey, Ho Ho, Western Civilization Has Got To Go.” The opposition to transmitting our culture and civilization is based on the unusual idea that only books, music, and art created by men and women who share our own racial or ethnic background can be meaningful to us and should be transmitted. Under such a notion, only Jews could read the Bible, only Greeks could contemplate Plato or Aristotle, only those of English descent could read Shakespeare, and only Italians could appreciate Dante or Leonardo da Vinci.

Western culture is relevant to men and women of all races and backgrounds, particularly those living in the midst of our Western society — such as the students at Pomona College. The distinguished black scholar W. E. B. Du Bois recognized this reality when he wrote more than a hundred years ago:

“I sit with Shakespeare and he winces not. Across the color line, I walk arm in arm with Balzac and Dumas, where smiling men and welcoming women glide in gilded halls. From out of the caves of evening that swing between the strong-limbed earth and the tracery of the stars, I summon Aristotle and Aurelius and what soul I will, and they come all graciously, with no scorn or condescension. So, wed with truth, I dwell above the veil.”

When the attacks upon transmitting Western civilization began at our universities, Donald Kagan, Professor of History and Classics and Dean of Yale College, declared in his September 1990 address to the freshman class:

“The assault on the character of Western civilization badly distorts history. The West’s flaws are real enough, but they are common to almost all the civilizations known in any continent at any time in human history. What is remarkable about the Western heritage, and what makes it essential, are the important ways in which it has departed from the common experience. More than any other it has asserted the claims of the individual against those of the state, limiting the state’s power and creating a realm of privacy into which it cannot penetrate. . . . Western Civilization is the champion of representative democracy as the normal way for human beings to govern themselves, in place of the different varieties of monarchy, oligarchy, and tyranny that have ruled most of the human race throughout history and rule most of the world today. It has produced the theory and practice of separation of church and state, thereby creating a safe place for individual conscience. At its core is a tolerance and respect for diversity unknown in most cultures. One of its most telling characteristics is its encouragement of criticism of itself and its ways. Only in the West can one imagine a movement to neglect the culture’s own heritage in favor of some other.”

Our civilization is now under attack on many of our university campuses, as is the idea of objective truth itself, as the students at Pomona College have shown us. When will universities finally decide to remove from their campuses students who silence the speech of those with whom they disagree? When will alumni cut back their contributions to institutions that embrace identity politics and limit the speech of those who dare to differ? This is a serious challenge to our institutions of higher learning. Some of them are resisting. Others, such as Berkeley, seem to be acquiescing. It is hard to imagine student protestors who deny that there is such a thing as “truth” being taken seriously. That many take such irrationality as legitimate discourse tells us as much about today’s academic world as it does about those who would destroy free speech.

The Russian Revolution at 100: Remembering the Naïve Westerners Who Embraced It

One hundred years ago, Russia’s czar was overthrown and Communism began its reign. Sunday, March 12, is the date generally recognized as the start of the uprising. In Moscow, no celebrations are planned. Evidently the country remains too divided over Communism’s legacy. Mikhail Zygar, a Russian journalist and author of the book All The Kremlin’s Men, points out that:

“Vladimir Putin cannot compare himself to Nicholas II, nor to Lenin, nor to Kerensky because that is not Russian history to be proud of. In terms of 1917, nothing can be used as a propaganda tool.” 

Communism’s toll was a heavy one. The Black Book of Communism, an 846-page academic study, holds Communism responsible for the deaths of between 85 million and 100 million people worldwide. It estimates that the ideology claimed 45 million to 72 million in China, 20 million in the Soviet Union, between 1.3 million and 2.3 million in Cambodia, 2 million in North Korea, 1.7 million in Africa, 1.5 million in Afghanistan, 1 million in Vietnam, 1 million in Eastern Europe, and 150,000 in Latin America.

Through all those years, many intellectuals in the West insisted on disassociating Communism from the crimes committed in its name. Incredibly, in retrospect, we see many Western academics, clergymen, journalists, and literary figures not resisting Communist tyranny, but embracing it, defending it, and apologizing for it.

Consider the German playwright Bertolt Brecht, who created the modern propaganda play. When he visited the Manhattan apartment of American philosopher Sidney Hook in 1935, Stalin’s purges were just beginning. Hook, raising the cases of Zinoviev and Kamenev, asked Brecht how he could bear to work with the American Communists who were trumpeting their guilt. Brecht replied that the U.S. Communists were no good — nor were the Germans either — and that the only body that mattered was the Soviet party. Hook pointed out that they were all part of the same movement, responsible for the arrest and imprisonment of innocent former comrades. 

Brecht replied: “As for them, the more innocent they are, the more they deserve to be shot.” Hook asked, “Why, why?” Brecht did not answer. Hook got up, went into the next room and brought Brecht his hat and coat. During the entire course of Stalin’s purges, Brecht never uttered a word of protest. When Stalin died, Brecht’s comment was: “The oppressed of all five continents . . . must have felt their heartbeats stop when they heard that Stalin was dead. He was the embodiment of all their hopes.” 

Another case in point is French philosopher Jean-Paul Sartre. In a July 1954 interview with Liberation, Sartre, who had just returned from a visit to Russia, said that Soviet citizens did not travel, not because they were prevented from doing so, but because they had no desire to leave their wonderful country. “The Soviet citizens,” he declared, “criticize their government much more and more effectively than we do.” He maintained that, “There is total freedom of criticism in the Soviet Union.”

Another intellectual defender of tyranny was Lillian Hellman, the American playwright. She visited Russia in October 1937, when Stalin’s purge trials were at their height. On her return, she said she knew nothing about them. In 1938 she was among the signatories of an ad in the Communist publication New Masses that approved the trials. She supported the 1939 Soviet invasion of Finland, saying: “I don’t believe in that fine, lovable little Republic of Finland that everyone is weepy about. I’ve been there and it looks like a pro-Nazi little republic to me.” There is no evidence that Hellman ever visited Finland — and her biographer states that this is “highly improbable.”

The American Quaker H. T. Hodgkin provided this assessment: “As we look at Russia’s great experiment in brotherhood, it may seem to us some dim perception of Jesus’ way, all unbeknown, inspiring it.”

The case of New York Times correspondent Walter Duranty, who covered the Soviet Union in the 1930s, is also instructive. In the midst of the enforced famine in the Ukraine, Duranty visited the region and denied that starvation and death were rampant. In November 1932, Duranty reported that “. . . there is no famine or actual starvation nor is there likely to be.” In The Times of August 23, 1933, Duranty wrote: “Any report of a famine in Russia is today an exaggeration or malignant propaganda. . . . The food shortage which has affected almost the whole population last year . . . has, however, caused heavy loss of life.”

He estimated the deaths at nearly four times the usual rate, but did not blame Soviet policy. What Americans got was not the truth — but false reporting, and its influence was widespread. What Walter Duranty got was the highest honor in journalism — the Pulitzer Prize for 1933, complimenting him for “dispassionate, interpretive reporting of the news from Russia.” The citation declared that Duranty’s dispatches — which the world now knows to be false — were “marked by scholarship, profundity, impartiality, sound judgment, and exceptional clarity.”

Walter Duranty was only one of many correspondents and writers in the 1920s and 1930s who fed their readers in the West a steady diet of disinformation about the Soviet Union. Louis Fischer, who wrote for The Nation magazine, was also reluctant to tell his readers about the flaws in Soviet society. He, too, glossed over the searing famine of 1932-33. He once referred to what we now know as the “Gulags” as “a vast industrial organization and a big educational institution.” In 1936, he informed his readers that the dictatorship was “voluntarily abdicating” in favor of “democracy.”

Somehow, liberal intellectuals, who were harsh in their judgment of the American society, eagerly embraced the ruthless dictatorship of Joseph Stalin. Concerning the forced collectivization of Soviet agriculture, author Upton Sinclair wrote: “They drove rich peasants off the land — and sent them wholesale to work in lumber camps and on railroads. Maybe it cost a million lives — maybe it cost five million — but you cannot think intelligently about it unless you ask yourself how many millions it might have cost if the changes had not been made.”

W. E. B. Du Bois, the black intellectual, thought that, “He (Stalin) asked for neither adulation nor vengeance. He was reasonable and conciliatory.” It was not only Stalin who was embraced by many in the West, but Mao as well. Visiting Communist China, New York Times columnist James Reston said that he thought Chinese Communist doctrines and the Protestant ethic had much in common, and was generally impressed by “the atmosphere of intelligent and purposeful work.” (New York Times, July 30, 1971). He wrote:

“China’s most visible characteristics are the characteristics of youth . . . a kind of lean, muscular grace, relentless hard work, and an opportunistic and even amiable outlook on the future. . . . The people seem not only young but enthusiastic about their changing lives.”

Reston also believed that young people from the city who were forced to work as manual laborers in rural areas “were treating it like an escape from the city and an outing in the countryside.” When Mao died in 1976, the Times devoted three pages to his obituary, but only a few lines alluded to his enormous crimes against the Chinese people. It has been estimated that Mao was responsible for the deaths of 30 million to 60 million people. The Times referred to the execution of “a million to three million people, including landlords, nationalist agents, and others suspected of being class enemies.” The Washington Post also devoted three pages to Mao, concluding, “Mao the warrior, philosopher, and ruler was the closest the modern world has been to the God-heroes of antiquity.” The Post acknowledged that some three million persons had lost their lives in the 1950 “reign of terror,” but the only victims mentioned were “counter-revolutionaries.”

In his landmark study of intellectual support for Communism, Political Pilgrims, Professor Paul Hollander writes that an important myth to be laid to rest “is the belief in the unflinching commitment of intellectuals to freedom, and particularly to freedom of expression.” In the case of the Soviet Union and other Communist societies, he notes:

“It is very clear that the absence of freedoms . . . hardly concerned the visitors or interfered with the attractions of these societies. To the extent that the lack of free expression was observed — and it is by itself noteworthy how frequently it was overlooked — it was excused or rationalized on the familiar grounds of temporary necessity, amply compensated for by the various achievements of the regimes concerned.”

In addition, states Hollander:

“Attributions of idealism and disinterestedness also call for re-examination when intellectuals move with lightning speed from vehement moral indignation and moral absolutism (generally reserved for their own society) to a strangely pragmatic moral relativism brought to the assessment of policies of countries they are committed to support. . . . Scott Nearing, who often left his home in Maine in November rather than watch hunters kill deer, defended Soviet tanks in Budapest (in 1956). . . . Such misjudgments and moral double ‘bookkeeping’ (or double standards) are in part due to the readiness to believe ‘the other side.’”

The anniversary of the Russian Revolution is particularly meaningful for those of us who are old enough to remember the reality of what Communism was really like. This writer spent time in Eastern Europe during the darkest days of Communist rule, visiting both the Berlin Wall and Czechoslovakia in 1969, shortly after the Soviet Union brutally marched into Prague and put down the attempts at liberalization. Wherever one went in Czechoslovakia, the contempt for the occupying Soviet Army was clear. At a student club I visited, when word got around that an American was on the premises, many young people came by to extend greetings. I was invited to the homes of a number of Czechs who openly declared their hostility to Communism and their desire for their country to once again join the Western world. It is fair to say that I did not encounter a single Czech who spoke well of either Communism or the Soviet Union.

As we commemorate the 100th anniversary of the Russian Revolution, we should remember how eagerly naïve Westerners embraced it. Vladimir Putin, who served Communism as a KGB agent, was an eager participant in the Communist enterprise. It is interesting to observe his current reluctance to celebrate the totalitarian and imperialistic system to which he devoted much of his life. Sadly, he now seems intent upon restoring as much of the Soviet empire as he can and destabilizing NATO, the EU, and our own country. Let us hope that we do not engage in the same wishful thinking about Putin’s goals and objectives that so many in the West did about Communism. Remembering those who naïvely embraced tyranny should immunize us against following such a path in the future — that is, if we are willing to learn from history, something that is all too rare.     *


Monday, 27 March 2017 14:39

Ramblings

Allan C. Brownfeld


America Is Exceptional — But Now There Is an Effort to Make It Ordinary

Our society is unique in history — in other words, “exceptional.” Ronald Reagan described it as a “City on a Hill.” Now we confront an effort to make it ordinary: to build walls, to stoke fear of strangers, and to promote a narrow nationalism. Perhaps those who would make America small and narrow do not understand what generations of Americans, Republicans and Democrats, liberals and conservatives, have meant by the term “American exceptionalism.”

America has never simply been another country. From the very beginning, its vision of Liberty attracted people of every ethnic background and religion. At the time of the American Revolution, Thomas Paine wrote:

“If there is a country in the world where concord, according to common calculation, would be least expected, it is America. Made up, as it is, of people from different nations, accustomed to different forms and habits of government, speaking different languages, and more different in their modes of worship, it would appear that the Union of such a people was impracticable. But by the simple operation of constructing government on the principles of society and the rights of man, every difficulty retires and the parts are brought into cordial unison.”

In Redburn (1849), Herman Melville spelled out a vision of America which is as true today as it was then:

“There is something in the contemplation of the mode in which America has been settled that, in a noble breast, should forever extinguish the prejudices of national dislikes. Settled by the people of all nations, all nations may claim her for their own. You cannot spill a drop of American blood without spilling the blood of the whole world. Be he Englishman, German, Dane or Scot: the European who scoffs at an American . . . stands in danger of judgment. We are not a narrow tribe of men. . . . No: our blood is as the flood of the Amazon, made up of a thousand noble currents all pouring into one. We are not a nation, so much as a world.”

To make America simply another country, concerned only with its narrow self-interest, is to reverse our history. F. Scott Fitzgerald wrote:

“France was a land. England was a people, but America, having about it still the quality of the idea, was harder to utter — it was the graves at Shiloh, and the tired, drawn, nervous faces of its great men, and the country boys dying in the Argonne for a phrase that was empty before their bodies withered. It was a willingness of the heart.”

Today, our country is the most powerful and most prosperous in the world. We defeated Communism, Fascism and Nazism. Of course, there are always challenges to be confronted. ISIS threatens the West with terrorism, and it is important that it be defeated. But the promotion of fear by some in Washington is irrational. In his first Inaugural Address, in the midst of the Great Depression, as democracy was collapsing in Europe, Franklin D. Roosevelt told the country that:

“. . . the only thing we have to fear is fear itself — nameless, unreasoning, unjustified terror which paralyzes needed efforts to convert retreat into advance.”

Now, our new administration stirs fear with no basis for doing so. Scholars of the subject say they can think of no previous president so enamored of scare tactics as Mr. Trump. Historian Robert Dallek says, “If he frightens people, it puts him in the driver’s seat. These are what I think can be described as demagogic tendencies.”

There is nothing conservative about what we are hearing from the White House in recent days. Of the catchphrase “America First,” whose antecedents, unknown to many who now use it, lie in the movement to keep the U.S. out of the war against Nazi Germany, conservative commentator Charles Krauthammer writes:

“Some claim that putting America first is a reassertion of American exceptionalism. On the contrary, it is the antithesis. It makes America no different from all the other countries that define themselves by a particularist blood-and-soil nationalism. What made America exceptional, unique in the world, was defining its own national interest beyond its narrow economic and security needs to encompass the safety and prosperity of a vast array of allies. A free world marked by open trade and mutual defense was President Truman’s vision, shared by every president since. Until now. . . . For seventy years, we sustained an international system of open commerce and democratic alliances that has enabled America and the West to grow and thrive. Global leadership is what made America great. We abandon it at our peril.”

The people around presidential adviser Stephen Bannon and the “alt-right” philosophy he promoted on the Breitbart news website have more in common with the right-wing racial nationalism found in European parties such as Marine Le Pen’s National Front in France than with anything in our own history. To what degree President Trump has embraced such views remains less than clear. But many traditional conservatives see all of this as a dramatic departure from American exceptionalism. New York Times columnist David Brooks notes that:

“We are in the midst of a Great War of national identity. We thought we were in an ideological battle against radical Islam, but we are really fighting the national myths spread by Trump, Bannon, Putin, Le Pen and Farage. We can argue about immigration and trade and foreign policy, but nothing will be right until we restore and revive the meaning of America. Are we still the purpose-driven experiment Lincoln described and Emma Lazarus wrote about: assigned by providence to spread democracy and prosperity; to welcome the stranger. . . . Or are we just another nation, hunkered down in a fearful world?”

In 1866, Lord Acton, the British liberal leader, said that America was becoming the “distant magnet.” He asked: “Apart from the millions who have crossed the ocean, who shall reckon with the millions whose hearts and hopes are in the United States, to whom the rising sun is in the West?”

America has been a nation much loved. Germans have loved Germany. Frenchmen have loved France. Swedes have loved Sweden. This, of course, is only natural. But America has been beloved not only by native-born Americans, but by men and women throughout the world who have yearned for freedom. America dreamed a bigger dream than any nation in the history of man. Now, in Washington, that dream is being replaced with something far different. The Republican Party, which always embraced the idea of American exceptionalism, one to which Ronald Reagan and conservatives were particularly committed, now has a choice. Will it abandon its vision of America as exceptional and adopt the very ordinary nationalism that now is manifesting itself in the White House, or will it maintain its belief in an America that is, indeed, something new and positive in history? All of us will be losers if the vision of America embraced by the Founding Fathers and generations of Americans is abandoned by those whose notion of America is narrow and completely ahistorical.

The Strange Assault on Thomas Jefferson at the University He Founded

At the University of Virginia, its founding father, Thomas Jefferson, is under attack by some students and faculty.

After the November presidential election, university president Teresa Sullivan wrote a letter in which she quoted Jefferson in expressing the hope that students from the University would help our republic. Sullivan wrote:

“By coincidence, on this exact day 191 years ago — November 9, 1825, in the first year of classes at the University of Virginia — Thomas Jefferson wrote to a friend that U.Va. students ‘. . . are not of ordinary significance only; they are exactly the persons who are to succeed to the government of our country, and to rule its future enmities, its friendships and fortunes.’ I encourage today’s U.Va. students to embrace that responsibility.”

Almost immediately, a response was drafted by Noelle Hurd, an assistant professor of psychology, who declared that Thomas Jefferson “was deeply involved in the racist history of this university” and noted that:

“We would like for our administration to understand that although some members of this community may have come to this university because of Thomas Jefferson’s legacy, others of us came here in spite of it. For many of us, the inclusion of Jefferson quotations in these e-mails undermines the message of unity, equality, and civility that you are attempting to convey.”

Approximately 500 students and faculty signed the letter, with more adding their names later. President Sullivan responded that:

“Quoting Jefferson (or any historical figure) does not imply an endorsement of all the social structures and beliefs of his time, such as slavery and the exclusion of women and people of color from the university.”

Sullivan acknowledged “the university’s complicated Jeffersonian legacy.” She pointed out that:

“Today’s leaders are women and men, members of all racial and ethnic groups, members of the LGBTQ community and adherents of all religious traditions. All of them belong to today’s University of Virginia, whose founder’s most influential and quoted words were ‘all men are created equal.’ Those words were inherently contradictory in an era of slavery, but because of their power they became the fundamental expression of a more genuine equality today.”

What President Sullivan’s critics are doing is applying the standards of 2016 to 1787, when the Constitution was written, and finding our ancestors seriously deficient. They are guilty of what the Quaker theologian Elton Trueblood called “the sin of contemporaneity”: applying the standards of our own time to those who came before. It is possible to look at the colonial period from the vantage point of both the period that preceded it and the period that has followed. This is instructive when considering the question of slavery.

Slavery played an important part in many ancient civilizations. Indeed, most people of the ancient world regarded slavery as a natural condition of life, one that could befall anyone at any time, having nothing to do with race. It has existed almost universally through history among peoples of every level of material culture — among nomadic pastoralists in Asia, hunting societies of North American Indians, and seafaring peoples such as the Norsemen. The legal codes of Sumer provide documentary evidence that slavery existed there as early as the 4th millennium B.C. The Sumerian symbol for slave in cuneiform writing suggests “foreign.”

When the Constitutional Convention met in Philadelphia in 1787, not a single nation had made slavery illegal. As they looked back through history, the framers saw slavery as an accepted and acceptable institution. It was not until 1792 that Denmark became the first Western nation to abolish the slave trade. In 1807, the British Parliament passed a bill outlawing the slave trade — and slavery was abolished in British colonies between 1834 and 1840. France freed the slaves in its colonies in 1848. Spain ended slavery in Puerto Rico in 1873, and in Cuba in 1886. Brazil abolished slavery in 1888.

The respected British historian of classical slavery, Moses I. Finley, writes:

“The cities in which individual freedom reached its highest expression — most obviously Athens — were cities in which chattel slavery flourished.”

The same is true of Ancient Rome. Plutarch notes that on a single day in the year 167 B.C., 150,000 slaves were sold in a single market.

Our Judeo-Christian tradition was also one that accepted the legitimacy of slavery. The Old Testament regulates the relationship between master and slave in great detail. In Leviticus (XXV: 39-55), God instructs the Children of Israel to enslave the heathen and their progeny forever, but to employ poor Jews as servants only, and to free them and their children on the year of Jubilee. There is no departure from this approach to slavery in the New Testament. St. Paul urges slaves to obey their masters with full hearts and without obfuscation.

What is historically unique is not that slavery was the accepted way of the world in 1787, but that so many of the leading men of the American colonies of that day wanted to eliminate it — and pressed vigorously to do so.

Benjamin Franklin and Alexander Hamilton were ardent abolitionists. John Jay, who would become the first Chief Justice, was president of the New York Anti-Slavery Society. Rufus King and Gouverneur Morris were in the forefront of opposition to slavery.

One of the great debates at the Constitutional Convention related to the African slave trade. George Mason of Virginia made an eloquent plea for making it illegal. He said:

“Every master of slaves is born a petty tyrant. They bring the judgment of heaven on a country.”

In his original draft of the Declaration of Independence, one of the principal charges made by Thomas Jefferson against King George III and his predecessors was that they would not allow the American colonies to outlaw the importation of slaves. When Jefferson was first elected to the Virginia legislature at the age of twenty-five, his first political act was an effort to begin the elimination of slavery. Though unsuccessful, he tried to further encourage the emancipation process by writing into the Declaration of Independence that “all men are created equal.” In his draft of a constitution for Virginia, he provided that all slaves would be emancipated in that state by 1800, and that any child born in Virginia after 1801 would be born free. This, however, was not adopted.

In his draft instructions to the Virginia delegation to the Continental Congress of 1774, published as “A Summary View of the Rights of British America,” Jefferson charged the British crown with having prevented the colonies from abolishing slavery in the interest of avarice and greed:

“The abolition of domestic slavery is the great object of desire of these colonies, where it was, unhappily, introduced in their infant state. But previous to the enfranchisement of the slaves we have, it is necessary to exclude all further importations from Africa. Yet our repeated efforts to effect this by prohibition, and by imposing duties which might amount to a prohibition, have been hitherto defeated by his Majesty’s negative.” 

Thomas Jefferson and the other framers of the Constitution were imperfect men, and it is not difficult to discover their personal flaws. But these imperfect men did an extraordinary thing in creating a new nation, which now has the world’s oldest continuous form of government. Professor Forrest McDonald points out that:

“The framers were guided by principles but not by formulas. They understood that no form or system of government is universally desirable or workable; instead, if government is to be viable, it must be made to conform to human nature and to the genius of the people — to their customs, morals, habits, institutions, aspirations. The framers did just that, and thereby used old materials to create a new order for the ages.” 

While the majority of the framers of the Constitution were opposed to slavery, a small minority supported it, and had it been outlawed, the union never would have come into being. Thus, they compromised. What they did do was outlaw the slave trade as of 1808, and Congress, in 1787, outlawed slavery in the new territories by passing the Northwest Ordinance. It was, we must remember, the framers of the Constitution who were the first duly constituted authority in the Western world to act decisively against slavery.

One wonders how much of this history is known by those who wrote and signed the letter calling upon University of Virginia President Teresa Sullivan to stop quoting Thomas Jefferson. To her credit, President Sullivan understands the distinction between intrinsic principle and historical personality. To hold leaders of the past to the standards of the present time is to miss the larger message of our history. Jefferson and our other Founding Fathers set in place a system of government that permitted growth and change. While they may not have shared the views of today, neither did Socrates, Plato, Dante, or Shakespeare. Shall we only be able to quote those from the 20th and 21st centuries who share the standards we ourselves came to accept only a very short time ago? This would be “contemporaneity” gone mad.

Thomas Sowell Ends His Column, But His Intellectual Legacy Will Only Grow

Thomas Sowell, one of America’s foremost public intellectuals and most outspoken black conservatives, submitted his final column in December after 25 years in syndication. At 86, he said, the time had come to retire from this enterprise. Hopefully, his other literary pursuits will continue.

For more than 50 years, Sowell has published books and articles on race, economics, and government policy. He grew up in Harlem and was the first member of his family to go beyond 6th grade — eventually graduating from Harvard. A self-proclaimed Marxist in his 20s, he received his Ph.D. in economics from the University of Chicago, where he studied under Milton Friedman, the Nobel Prize-winning free-market economist. Sowell slowly lost faith in the ability of government to effectuate positive change in our economic life. He taught economics at Cornell and UCLA and has been a senior fellow at the Hoover Institution at Stanford University since 1980. (Shortly after he moved into his office at Stanford, I visited him there. I remember having dinner at a Mexican restaurant in Palo Alto, putting a tape recorder on the table, and engaging in a lengthy interview, which was subsequently published in Human Events.)

Thomas Sowell examined the history of race relations in America, and throughout the world. He questioned much of the orthodoxy to be found in intellectual circles and asked — and tried to answer — the most difficult questions. Do certain groups advance in society at varying rates because of the attitude of society toward them? Does discrimination against a given group cause it to do less well economically and educationally than those groups that do not face such external barriers?

In a landmark study, “The Economics and Politics of Race: An International Perspective” (1983), followed by an impressive succession of important books, Sowell uses an international framework to analyze group differences. Examining the experience of different groups in more than a dozen countries, he seeks to determine how much of each group’s economic fate has been due to the surrounding society and how much to internal patterns that follow the same group around the world.

The Italians in Australia and Argentina, for example, show social and economic patterns similar in many respects to those of Italians in Italy or in the United States. Chinese college students in Malaysia specialize in very much the same fields that Chinese students specialize in at American colleges — a far different set of specializations from those of other groups in both countries. Germans have, similarly, concentrated in very similar industries and occupations, whether in South America, North America, or Australia.

Analyzing the successes of each group, Sowell points to the group’s culture, which rewards some behaviors over others, as the determinant of skills, orientations and therefore economic performance. “Race may have no intrinsic significance,” he writes, “and yet be associated historically with vast cultural differences that are very consequential for economic performance.”

In Southeast Asia, for example, the overseas Chinese have been subjected to widespread discrimination. Quota systems were established in government employment and in admission to universities in Malaysia, and a “target” of 30 percent Malayan ownership in business and industry was established. In Indonesia, a 1959 law forbade the Chinese to engage in retailing in the villages. Chinese-owned rice mills were confiscated. In the Philippines, it was decreed that no new Chinese import business could be established, and Chinese establishments were closed by law. 

Despite all of this, Sowell points out, the Chinese thrived. As of 1972, they owned between 50 and 95 percent of the capital in Thailand’s banking and finance industries, transportation, wholesale and retail trade, restaurants, and the import and export business. In Malaysia, the Chinese earned double the income of Malays in 1976, despite a massive government program imposing preferential treatment of Malays in the private economy. In the U.S., as in Southeast Asia, writes Sowell, “The Chinese became hated for their virtues.” Despite discrimination, the Chinese advanced rapidly in the U.S., as did the Japanese, who met similar forms of racial bigotry, including special taxes and job restrictions.

In Europe, Sowell points out, precisely the same story can be told with regard to Jews. Anti-Semitism was a powerful force in many countries, yet Jews continued to advance. Although Jews were only one percent of the German population, they became 10 percent of the doctors and dentists, 17 percent of the lawyers, and won 27 percent of the Nobel Prizes awarded to Germans from 1901 to 1975. In the U.S., notes Sowell:

“Although the Jewish immigrants arrived with less money than most other immigrants, their rise to prosperity was unparalleled. Working long hours at low pay, they nevertheless saved money to start their own small businesses . . . or to send a child to college. While the Jews were initially destitute in financial terms, they brought with them not only specific skills but a tradition of success and entrepreneurship which could not be confiscated or eliminated, as the Russian and Polish governments had confiscated their wealth and eliminated most of their opportunities.”

In the case of blacks in the U.S., Sowell shows that West Indians have advanced much more rapidly than native-born American blacks because of major cultural differences. In the West Indies, slaves had to grow the bulk of their own food — and were able to sell what they did not need from their individual plots of land. They were given economic incentives to exercise initiative, as well as experience in buying, selling, and managing their own affairs — experiences denied to slaves in the U.S.

The two black groups — native-born Americans and West Indians — suffered the same racial discrimination, but advanced at dramatically different rates. By 1969, black West Indians earned 94 percent of the average income of Americans in general, while native blacks earned only 62 percent. Second generation West Indians in the U.S. earned 15 percent more than the average American. More than half of all black-owned businesses in New York State were owned by West Indians. The highest-ranking blacks in the New York City Police Department in 1970 were all West Indians, as were all the black judges in the city.

It is a serious mistake, Sowell believes, to ignore the fact that economic performance differences between whole races and cultures “are quite real and quite large.” Attitudes and work habits, he argues, are key ingredients of success or failure. The market rewards certain kinds of behavior, and penalizes other behavior patterns — in a color-blind manner. Blaming discrimination by others for a group’s status, he states, ignores the lessons of history.

Political efforts to address the “problems” of minorities, such as race-based affirmative action programs, usually fail, Sowell reports, because they refuse to deal with the real causes of such difficulties:

“. . . political ‘solutions’ tend to misconceive the basic issues. . . . Black civil rights leaders . . . often earn annual incomes running into hundreds of thousands of dollars, even if their programs and approaches prove futile for the larger purpose of lifting other blacks out of poverty.”

Crucial to a group’s ability to advance is the stability of its family life and the willingness to sacrifice:

“. . . more than four-fifths of all white children live with both their parents. But among black children, less than half live with both parents. . . . What is relevant is the willingness to pay a price to achieve goals. Large behavioral differences suggest that the trade-off of competing desires vary enormously among ethnic groups. . . . The complex personal and social prerequisites for a prosperous level of output are often simply glided over, and material wealth treated as having been produced somehow, with the only real question being how to distribute it justly.”

If we seek to understand group differences, it is to “human capital” that we must turn our attention, Sowell declares. The crucial question is not the fairness of its distribution but “whether society as a whole — or mankind as a whole — gains when the output of both the fortunate and the unfortunate is discouraged by disincentives.”

It is Sowell’s view that many black leaders have not served their constituencies but themselves. Instead of expressing concern over the decline of the black family, the increasing out-of-wedlock birth rate, the rise of inner-city crime — they speak only of “discrimination.” Instead of calling for an end to such government licensing laws as those that limit the number of taxicabs in cities such as New York and Philadelphia, they call for more government “make-work” jobs. 

While many blame all problems within the black community on the legacy of slavery, Sowell points to the fact that more black children lived in two-parent families during slavery, Reconstruction, and the years of segregation than at the present time. He writes that:

“In reality, most black children were raised in two-parent homes even during the era of slavery, and for generations after, blacks had higher rates of marriage than whites in the early 20th century, and higher rates of labor force participation in every census from 1890 to 1950. The real causes of the very different patterns among blacks in the world of today must be sought in the 20th century, not in the era before emancipation.”

Tom Sowell has been telling the hard truth for many years, and has received much abuse for doing so. He has been a strong advocate for a genuinely color-blind society, in which men and women would be judged on their individual merit, not on the basis of race. All Americans who believe in such a society, and believe that one’s views about economic, political, and other matters should be based on the facts as one sees them — not on race, religion, or ethnicity, as the promoters of today’s “identity politics” would have it — should recognize what a champion of freedom Sowell has been. We will miss his regular column, but hope he will continue to share his wisdom with us. It is certain that his intellectual legacy will grow, for it is based upon scholarship and a search for truth, not upon the changing needs of our political class for convenient and popular responses to the complex challenges we face. Sadly, there are too few such people among us. For a free society to thrive, we need more Thomas Sowells. We have been lucky indeed to have him with us.

Washington Once Again Shows Us That “Congressional Ethics” Is an Oxymoron

On the very first day of the new Congress, House Republicans met in secret. Their very first order of business was to vote to eliminate the quasi-independent office that investigates House ethics. Rep. Bob Goodlatte (R-VA) was the architect of the attack on the Office of Congressional Ethics, known as O.C.E. The rules change would have prevented the office from investigating potentially criminal allegations, allowed members of the House Ethics Committee to shut down any O.C.E. investigation, and silenced staff members in their dealings with the news media. 

The O.C.E. was created in 2008, after a series of bribery and corruption scandals involving members of both parties. Three House members were sent to jail. Among those joining Rep. Goodlatte in calling for the end of O.C.E. were Rep. Blake Farenthold (R-TX), who had been investigated by the O.C.E. for sexual harassment, Rep. Peter Roskam (R-IL), who was investigated after he and his wife took a $24,000 trip to Taiwan, which appeared to have been improperly paid for by the Taiwanese government, and Rep. Sam Graves (R-MO), who was ranking member of the House Committee on Small Business in 2009 when he invited expert testimony on the renewable fuel industry from a representative of a renewable fuels business in which his wife had a financial stake, a potential conflict of interest. Another advocate of ending O.C.E. was Rep. Steve Pearce (R-NM), who last year tried to eliminate the entire O.C.E. budget after it investigated one of his staff members. Or consider Rep. Duncan Hunter (R-CA), another supporter of eliminating O.C.E., who has used campaign funds for personal expenses, which is illegal. Among his reported expenditures: $1,400 for a dentist, $2,000 for a Thanksgiving trip to Italy, and $600 to take his children’s pet bunny on a commercial airplane. (After these expenses were exposed, he reimbursed his campaign $62,000). The list of those supporting the elimination of O.C.E. who have been the targets of investigation is not a short one.

President-elect Donald Trump quickly weighed in, questioning the priorities of Republican members of Congress. Shortly after, lawmakers were summoned to the basement of the Capitol for a meeting with Republican leaders. Rep. Kevin McCarthy (R-CA), the majority leader, asked his fellow Republicans whether they had campaigned to repeal the Affordable Care Act or to eliminate the ethics office. Shortly after this, the idea of eliminating the O.C.E. was scrapped.

This was not the first time that House lawmakers — Democrats and Republicans — had tried to curtail the powers or budget of the O.C.E. In 2011, Rep. Melvin Watt (D-NC), who later left Congress to join the Obama administration, tried to cut the agency’s budget by 40 percent, a proposal that failed on a 302-102 vote. The Republican effort, just after the election of Donald Trump, who promised to “drain the swamp” of Washington, was viewed as tone-deaf in the extreme. The vote to eliminate the O.C.E., noted The Economist, “showed those lawmakers to lack self-awareness to an amazing degree.” Rep. Walter Jones (R-NC) said:

“Mr. Trump campaigned that he was going to drain the swamp, and here we are on Day One trying to fill the swamp. . . . I just could not believe that the Congress does not understand that, if anything, we need to bring sunshine in.”

Many years ago, Mark Twain pointed out that Congress was our only “native born criminal class.” The evidence in recent years would fill many pages. In 2009, Rep. William Jefferson (D-LA) was convicted of corruption charges in a case made famous by the $90,000 in bribe money stuffed into his freezer. Federal jurors found Jefferson guilty of using his congressional office as a criminal enterprise to enrich himself, soliciting and accepting hundreds of thousands of dollars in bribes to support his business ventures in Africa. While the Jefferson case is an extreme example of congressional corruption, his attorney’s defense that, in effect, “everyone does it,” is not as far-fetched as it may appear. Other members of Congress may not have $90,000 in their freezers, but too many are guilty of questionable activities.

Just as Jefferson’s trial began, we learned of Sen. John Ensign’s (R-NV) affair with an aide and the subsequent payments to her family by his parents. Also at that time, Rep. Charles Rangel (D-NY), then chairman of the House Ways and Means Committee, was the subject of several ethics investigations over matters ranging from his occupying four apartments at below-market rents in a Harlem building owned by a prominent real estate developer to his admission that he had neglected to pay some taxes by failing to report $75,000 in rental income earned from a beachfront villa in the Dominican Republic. The Wall Street Journal commented: “Ever notice that those who endorse high taxes and those who actually pay them aren’t the same people?”

There is, of course, the larger question of the ethical standards of the Congress, beyond activities that are clearly illegal. Members of Congress subsidize, in one form or another, a host of special interests — farmers, businessmen, Wall Street, universities, welfare recipients, labor unions — and each group has a special Political Action Committee (PAC) that contributes to members’ campaigns. Cuts in subsidies to these groups will provoke cuts in contributions. The result: every group gets what it wants, and the budget deficits skyrocket. Added to this business-as-usual subsidization are the bailouts of failed banks, Wall Street firms, and auto companies — turning traditional ideas of free enterprise on their head. This is the “crony capitalism” now embraced by both political parties. 

We have created in America a permanent political class that has an interest in ever-expanding government. The party out of power always says government is too big — but once it comes to power, it makes it even bigger. Republicans accuse Democrats of being supporters of “big government,” which is true enough, but government power has also grown dramatically under Richard Nixon, Ronald Reagan, George H. W. Bush and George W. Bush. When will voters finally understand that both of our political parties are co-conspirators in the growth of both government power and our huge deficits? This is something the Founding Fathers sought to prevent — and would have been sorry to see. But they wouldn’t have been surprised.

Thomas Jefferson, in a letter to Edward Carrington, observed that:

“The natural progress of things is for liberty to yield and government to gain ground. . . . One of the profoundest preferences in human nature is for satisfying one’s needs and desires with the least possible exertion; for appropriating wealth produced by the labor of others, rather than producing it by one’s own labor. . . . In other words, the stronger the government, the weaker the producer, the less consideration need be given him and the more might be taken away from him. A deep instinct of human nature being for these reasons in favor of strong government, nothing could be a more natural progress of things than for Liberty to yield and government to gain ground.”

It was because of their fear of governmental power that the Framers of the Constitution limited government through the Bill of Rights and divided its authority through our federal system. By establishing the executive, legislative, and judicial branches — and by dividing authority between the state and national governments — the Framers hoped to ensure that no branch of government would ever obtain so much power that it would be a threat to freedom. 

The kind of activist government we have now — involved in every aspect of people’s lives, even running an automobile company — is the opposite of what the Founding Fathers had in mind. From the beginning of history, the great philosophers predicted that democratic government would not long preserve freedom. Plato, Aristotle and, more recently, de Tocqueville, Lord Bryce, and Macaulay predicted that men would give away their freedom voluntarily for what they perceived as greater security. The French political philosopher Bertrand de Jouvenel noted that:

“The state, when once it is made the giver of protection and security, has but to urge the necessities of its protectorate and over-lordship to justify its encroachments.”

Voters say that they are against big government and oppose inflation and deficit spending, but when it comes to their own particular share, they act in a different way entirely. Walter Judd, who represented Minnesota in Congress for many years, once recalled that a Republican businessman from his district:

“. . . who normally decried deficit spending, berated me for voting against a bill which would have brought several million federal dollars into our city. My answer was, ‘Where do you think federal funds for Minneapolis come from? People in St. Paul?’. . . My years in public life have taught me that politicians and citizens alike invariably claim that government spending should be restrained, except where the restraints cut off federal dollars flowing into their cities, or their pocketbooks.”

If each group curbed its demands upon government, it would not be difficult to balance the budget and restore health to the economy. But as long as we allow politicians to solicit virtually unlimited amounts of money from those special interests with business before Congress, this is unlikely — and both parties are in it together. Human nature leads to the unfortunate situation in which, under representative government, people have learned that they can secure funds for themselves that have, in fact, been produced by the hard work of others.

This point was made more than 200 years ago by the Scottish historian Alexander Fraser Tytler:

“A democracy cannot exist as a permanent form of government. It can only exist until the voters discover they can vote themselves largess out of the public treasury. From that moment on, the majority always votes for the candidate promising the most benefits from the public treasury — with the result that democracy collapses over a loose fiscal policy, always to be followed by dictatorship.”

The Founding Fathers never envisioned the creation of a permanent political class such as the one we have now. They believed that men would be farmers, businessmen, doctors, lawyers, teachers — and would devote several years of their lives to public service and then go home to their careers. Today, however, we have professional politicians — men and women who support their families by holding public office and intend to do so for many years. When they do leave public office, most do not go home; many remain in Washington as high-priced lobbyists. Their motivation, it seems, is whatever will keep them in office and in Washington, not the long-run best interests of the country. Incumbents running for re-election in one-party districts raise millions from special interests, money they do not need for their campaigns and can keep when they leave Congress.

The incoming Trump administration promises to “drain the swamp.” It will be interesting to see how — and if — this proceeds. Still, we must keep in mind that Members of Congress respond to our demands. As long as we — whether individuals, farmers, Wall Street banks, or any other special interest — seek to be subsidized by government, and make such subsidies the price politicians must pay for our support, the politicians of both parties will comply. In this sense, our own selfishness, as well as theirs, is the culprit. The term “congressional ethics” may indeed be an oxymoron. But the ethics of the rest of us may not be far behind. In a sense, then, we have the kind of government we deserve — one that indeed represents our values. A brazen effort to eliminate the independent ethics office by a secret vote of House Republicans shows us how far we have gone down this path.     *

Sunday, 22 January 2017 14:13

Ramblings

Allan C. Brownfeld


Why Did Fidel Castro, a Brutal Dictator, Attract So Much Western Support?

The death of Fidel Castro at the age of 90 marks the end of a long life spent inflicting a brutal, tyrannical regime upon the people of Cuba. People with AIDS were confined to sanitariums. Artists and writers were forced to join an official union and told that their work must support the Castro regime. In 1965, Castro admitted to holding 20,000 political prisoners. Foreign observers said the number was twice as high. The Castro regime carried out thousands of political executions.

Fidel Castro eliminated the celebration of Christmas. There were no elections and only a state-controlled press. Hundreds of thousands of Cubans simply left, most of them for the United States. Soon, Castro imposed restrictions, making it almost impossible to leave the country. In April 1980, he opened the port of Mariel to any Cuban wishing to leave. More than 125,000 people — branded as “worms” and “scum” by Castro — took advantage of the “boatlift” before it ended in October of that year. By 1994, economic conditions were so bad that riots in Havana were followed by another exodus. Thousands fled from the country’s beaches on makeshift rafts. 

Under Castro, Cubans lived mostly on black beans and rice. Once one of the richest countries in Latin America, Cuba under Castro sank into decay and poverty. Castro himself lived in luxury. His former bodyguard, Juan Sanchez, reports that Castro lived on a private island, Cayo Piedra, and liked to travel aboard a large yacht with Soviet-built engines, the Aquarama II.

After taking power, he turned on those of his former comrades who naively thought his “revolution” would bring democracy, not tyranny. One of them, Huber Matos, a long-time democratic opponent of the dictator Fulgencio Batista, protested against Castro’s increasing closeness with Moscow. After a show trial, including a 7-hour tirade of denunciation by Castro, Matos was jailed for 20 years — 16 in solitary confinement, during which he was repeatedly tortured.

Another victim was the poet Armando Valladares, originally a supporter of the revolution who, as Castro’s anti-democratic policies emerged, refused to put an “I’m With Fidel” sign on his office desk. He was charged with “terrorism” and sentenced to 30 years. He served 8,000 days — nearly 22 years — often confined to cells so small he could not lie down.

First, Valladares was sent to the huge complex on Isla de Pinos, where 100 lbs. of foodstuffs each day were allotted to feed 6,000 prisoners. Ironically, during the Batista regime, Fidel Castro had been held in the same prison. But Valladares points out:

“. . . he had been allowed visitors, national and international news, uncensored books, unlimited correspondence, a conjugal pavilion, and any food he wanted. He had never been mistreated.”

In his widely read book Against All Hope, Valladares quotes from a letter written by Castro on April 4, 1955:

“I get sun several hours every afternoon. . . . I’m taking two baths a day now. . . . I’m going to have dinner. . . . spaghetti and squid, Italian chocolates for dessert, then fresh-brewed coffee. . . . Don’t you envy me? . . . What would Karl Marx say about such revolutionaries?”

Under Castro, prison was quite different. Valladares and his fellow inmates suffered repeated beatings at the hands of the guards and were isolated for long stretches of time. Often, they were taken to punishment cells where they were held naked, unwashed and unable to escape the stench and disease produced by their own accumulating wastes. The food was the equivalent of a near-starvation diet. Less than a pound was allotted for every fifty prisoners each day — and this included almost no protein or vitamins.

The level of medical care in the prisons was reminiscent of the Nazi death camps. After repeated beatings, Valladares was suffering excruciating pain in his leg. He writes that:

“The military doctor was a Communist who tried to look like Lenin, wearing the same kind of goatee. . . . He wore the uniform of a doctor but was a sadist. When I asked for medical care, he looked through the peephole, stared at my leg, and told me he hoped it turned into a good case of gangrene, ‘so I can come in myself and cut it off.’”

While Fidel Castro imposed a totalitarian regime upon the people of Cuba, he was, somehow, viewed in heroic terms by many Americans and others in the West, particularly intellectuals.

Author Norman Mailer, the pillar of many radical causes, declared:

“So Fidel Castro, I announce to the city of New York that you gave all of us who are alone in this country. . . . some sense that there were heroes in the world. . . . It was as if the ghost of Cortez had appeared in our century riding Zapata’s white horse. You were the first and greatest hero to appear in the world since the Second World War.”

Elizabeth Sutherland, book and arts editor of The Nation, wrote, “He (Castro) seems, first of all, utterly devoted to the welfare of his people — and his people are the poor, not the rich.” Author Jonathan Kozol declared, “Each of my two visits to Cuba was a pilgrimage and an adventure.” The writer Susan Sontag wrote that, “. . . it seems sometimes as if the whole country (Cuba) is high on some beneficial kind of speed, and has been for years.” Frank Mankiewicz, once an aide to Sen. George McGovern and later head of National Public Radio, visited Cuba with Kirby Jones and wrote a book lauding the revolution. He and Jones found Castro “one of the most charming and entertaining men either of us had ever met.”

Author Frances FitzGerald, originally a sympathizer with the Cuban revolution, observed that:

“Many North American radicals who visited Cuba or live there have performed a kind of surgery on their critical faculties and reduced their conversation to a form of baby talk, in which everything is wonderful including the elevator that does not work and the rows of Soviet tanks on military parade that are in ‘the hands of the people.’”

When Castro visited New York in 1995 to address the U.N., Mort Zuckerman, owner of The New York Daily News, hosted a reception for him at his penthouse on Fifth Avenue. Time magazine declared, “Fidel Takes Manhattan!” Newsweek called Castro “The Hottest Ticket in Manhattan.”

The adoration of Castro by Western intellectuals was hardly unique. They also embraced Stalin. In 1954, the French philosopher Jean-Paul Sartre returned from a visit to the Soviet Union and declared that Soviets did not travel, not because they were prevented from doing so, but because they had no desire to leave their wonderful country. “The Soviet citizens,” he declared, “criticize their government much more, and more effectively, than we do. There is total freedom of criticism in the Soviet Union.”

Even during Stalin’s purge trials, many Western intellectuals warmly embraced the brutal dictator. Playwright Lillian Hellman, for example, visited Moscow in October 1937 — at the height of the trials — and returned to sign an ad in the Communist publication New Masses that approved of them. She even supported the 1939 Soviet invasion of Finland. Discussing Stalin’s powers, the British writers Beatrice and Sidney Webb wrote:

“He (Stalin) has not even the extensive power which the Congress of the U.S. has temporarily conferred on President Roosevelt or that which the American Constitution entrusts for four years to every successive President. . . . Stalin is not a dictator. . . . he is the duly elected representative of one of the Moscow constituencies of the Supreme Soviet. . . ” (The Truth About Soviet Russia, 1942). 

The world’s reaction to Fidel Castro’s death gives little indication that a brutal dictator has died. Vladimir Putin called Castro “a wise and strong leader . . . an inspiring example for all the world’s peoples.” Narendra Modi, Prime Minister of India, called Castro “one of the most iconic personalities of the 20th century.” Bashar Al-Assad of Syria, himself a brutal dictator, called Castro “a great leader.” Even Canadian Prime Minister Justin Trudeau referred to him as “a remarkable leader.”

It is important that the world recognize Fidel Castro’s real legacy. Yale historian Carlos Eire portrays that legacy in these terms:

“He turned Cuba into a colony of the Soviet Union and nearly caused a nuclear holocaust. He sponsored terrorism wherever he could and allied himself with many of the worst dictators on Earth. He was responsible for so many thousands of executions and disappearances in Cuba that a precise number is hard to reckon. He brooked no dissent and built concentration camps and prisons at an unprecedented rate, filling them to capacity, incarcerating a higher percentage of his own people than most other modern dictators, including Stalin. . . . He persecuted gay people and tried to eradicate religion. He censored all means of expression and communication. . . . He created a two-tier health-care system, with inferior medical care for the majority of Cubans and superior care for himself and his oligarchy. . . .”

Why Fidel Castro attracted admirers in our own society is part of the larger question of why Stalin and Communism itself appealed to men and women who seemed indifferent to Communism’s rejection of free speech, free elections, a free press, freedom of religion, and freedom of movement, and to its contempt for the rights of racial, religious, ethnic, and sexual minorities. Now that Fidel Castro is dead, perhaps those who admired him will take a closer look at his legacy, one which the suffering Cuban people will, hopefully, overcome and move beyond to a better, freer, and more prosperous future.

By Opposing Charter Schools, the NAACP Would Harm the Black Students Whose Interests It Claims to Support

At its national convention in July, the NAACP approved a resolution calling for a moratorium on the expansion of privately managed charter schools. The Movement For Black Lives, a network of Black Lives Matter organizers, also passed resolutions criticizing charter schools and calling for a moratorium on their growth. The NAACP went so far as to liken the expansion of charters to “predatory lending practices” that put low-income communities at risk.

Charter schools provide parents with an opportunity for school choice and give inner city parents an opportunity to remove their children from poorly performing schools. Several studies by Stanford University’s Center for Research on Education Outcomes found that students enrolled in charter schools in 41 of the nation’s urban regions learned significantly more than their traditional public school counterparts. According to one study, charter school students received the equivalent of 40 days of additional learning a year in reading. Educational gains for charter school students turned out to be significantly larger for black, Hispanic, low-income, and special education students in both math and reading.

Although some charter schools have been poorly run, a performance advantage has been found to be particularly significant in the San Francisco Bay Area, Boston, Washington, D.C., Memphis, and Newark. There is heavy demand for more charter schools among low-income black and Latino families who are often trapped in failing school districts.

Where charter schools are doing well, demand for admission is high. In New York City, charter schools enroll about 107,000 students, roughly 10 percent of the city’s total enrollment. But more than 44,000 students who sought admission for the current school year were turned away. In Harlem and the South Bronx there are now four applicants for every charter school seat.

The Black Alliance for Educational Options (BAEO) and the National Alliance for Public Charter Schools have launched a campaign to tell the story of why more than 700,000 African American families have chosen charter schools. More than 160 African American advocates and community leaders have urged the NAACP to reconsider and learn more about how charter schools are helping black families.

In a letter, they declare:

“A blanket moratorium on charter schools would limit black students’ access to some of the best schools in America and deny black parents the opportunity to make decisions about what’s best for their children. Instead of enforcing a moratorium, let’s work together to improve low-achieving public schools and expand those that are performing well.”

One of the letter’s signers was Cheryl Brown Henderson, daughter of Oliver Brown, plaintiff in Brown v. Board of Education, and founding president and CEO of the Brown Foundation for Educational Equity, Excellence and Research. Brown Henderson said:

“Over 60 years ago my father joined with numerous parents to stand with the NAACP and fight for African American students stuck in a separate, broken education system. Brown v. Board of Education created better public education options for African American students and made it the law of the land that neither skin color, socioeconomic status, nor geography should determine the quality of education a child receives.”

In opposing charter schools and real school choice, the NAACP is flying in the face of the views of black parents. There are now more than 6,600 charter schools across the nation, educating nearly three million children. Black students account for 17 percent of charter school enrollment nationally. The American Federation for Children national school choice survey, conducted in January 2016, found that 76 percent of African Americans support school choice. Polling by BAEO also found that the majority of black voters surveyed supported charter schools.

In September, more than 25,000 parents, students, and educators attended a rally in Brooklyn’s Prospect Park calling for an expansion of the city’s charter schools. Among the speakers was actor and hip-hop artist Common, who said, “I’m here to tell you that you participating and being a part of charter school success stories is your path to possibility.” In New York City, black charter school students were 60 percent more likely than their public school counterparts to earn a seat in one of the city’s specialized high schools.

The idea of choice in education, which includes a voucher system and charter schools, is attracting both liberal and conservative support. Opposition comes primarily from teacher unions. Last summer, speaking to the National Education Association, Hillary Clinton declared: “When schools get it right, whether they’re traditional public schools or public charter schools, let’s figure out what’s working and share it with schools across America.” For this statement, Clinton was booed by NEA members.

Editorially, The Washington Post declared:

“The reaction speaks volumes about labor’s uninformed and self-interested opposition to charter schools and contempt for what’s best for children. . . . Since the first charter school opened 25 years ago in Minnesota, support for the non-traditional schools has grown with nearly 3 million students in more than 6,700 charters in 42 states and the District. Demand is high with parents of school-age children — particularly those who have low incomes — overwhelmingly saying they favor the opening of more charter schools. . . . We urge the NAACP leadership to put the interests of African American children ahead of the interests of political allies who help finance the group’s activities. . . .”

Wall Street Journal columnist Jason Riley, who is black, says of the NAACP:

“The organization would rather deny black children good schools than risk losing money from teacher unions. The organization’s primary concern today is self-preservation and maintaining its own relevance, not meeting the 21st century needs of the black underclass.”

While some charter schools have had problems, as in Detroit, where charters are not outperforming the traditional school alternative, the real reason for opposition from the teacher unions and the NAACP is that charter schools threaten the union monopoly on education. Most charters are non-union, and their growth comes at the expense of poorly performing union-run public schools.

The NAACP’s opposition to charter schools seems to have nothing to do with the well-being of black students. According to the Labor Department, unions have given the NAACP and its affiliates at least $3 million since 2010. The two major national teacher unions, the American Federation of Teachers (AFT) and the National Education Association (NEA), gave the NAACP $265,000 last year, having significantly increased their contributions between 2010 and 2014.

Teacher unions have also been giving financial aid to the Congressional Black Caucus (CBC), which influences groups such as the NAACP. According to the Labor Department, the AFT and NEA have given the CBC Foundation and CBC Institute $911,000 since 2010. Open Secrets campaign donation data shows that the AFT and NEA have given CBC members $253,000 and $206,000, respectively, this election cycle.

It is ironic to see civil rights groups oppose charter schools even though black Americans are learning more at charters than at traditional public schools. In Boston, for example, students in charter middle schools outperform those in traditional public schools by two to three years’ worth of learning in math and about half that in reading. The Black Alliance for Educational Options, a pro-charter civil rights group, calls the NAACP resolution “inexplicable,” and urges the NAACP board to reject it.

Competition in education, which gives parents a right to choose where their children should be educated, is something all Americans should support. It is particularly important for low-income families in minority communities, who are often consigned to poorly performing schools. Conservatives have long embraced the idea of free choice in the form of a voucher system and of charter schools. Observing the success of such schools, many liberal voices, such as The New York Times and The Washington Post, have joined them. The NAACP should rethink its position if it hopes to remain relevant to the needs of the constituency it seeks to represent.

The Latest Target of Political Correctness on Campus: America’s “Melting Pot” Tradition

Strange things are happening in the name of political correctness at colleges and universities across the country.

California State University at Los Angeles (CSULA) has debuted segregated housing available to students who “identify as Black/African-Americans.” The Halisi Scholars Black Living Learning Community has opened approximately nine months after the CSULA Black Student Union issued a list of demands including “black student only” living space with a “full time resident director who can cater to the needs of black students.” Racially segregated housing can also be found at other universities, including University of California branches at Davis and Berkeley and the University of Connecticut.

A student at the University of Houston was punished for tweeting “All Lives Matter” after the shooting of five policemen in Dallas. The university’s student government sentenced the offending student to undergo mandatory diversity training. At Princeton, the word “man” is considered sexist, and employees were told to use gender-neutral terms such as “human beings.” At the University of Iowa, a clinical professor of pediatrics wrote to the athletic director expressing dismay over the ferocious facial expressions of Herky the Hawk, the mascot of the Hawkeyes, who was criticized for conveying an “invitation to act aggressively and even violence” and for lacking “emotional diversity.” We could fill pages with similar examples.

Last year, University of California administrators released a document warning professors not to describe America as a “melting pot” because this unduly pressured minorities to “assimilate to the dominant culture.” This is an assault on the very important history of our country embracing men and women of every race, religion, and ethnic background, and making them into Americans.

When the melting pot philosophy was alive and well, our society succeeded dramatically. Immigrants from around the world entered an America that had self-confidence and believed in its own culture, history, and values and was determined to transmit them to the newcomers. And the immigrants wanted to become Americans. That, after all, is why they came.

Remembering the way American public schools served to bring children of immigrants into the mainstream, Fotine Z. Nicholas, who taught for 30 years in New York City schools and wrote an education column for a Greek-American weekly, noted:

“I recall with nostalgia the way things used to be. At P.S. 82 in Manhattan, 90 percent of the students had European-born parents. Our teachers were mostly of Irish origin, and they tried hard to homogenize us. We might refer to ourselves as Czech, or Hungarian, or Greek, but we developed a sense of pride in being American. . . . There were two unifying factors, the attitude of our teachers and the English language. . . . After we started school, we spoke only English to our siblings, our classmates, and our friends. We studied and wrote in English, we played in English, we thought in English.”

America is indeed a nation of immigrants. Speaking in Philadelphia in 1776, Samuel Adams declared:

“Driven from every other corner of the earth, freedom of thought and the right of private judgment in matters of conscience direct their course in this happy country as their last resort.”

Those who think that the idea of the “melting pot” is, somehow, demeaning to those who come to our country as immigrants fail to understand the reality of what has happened in America during the past centuries. In his now famous letter to the Jewish congregation in Newport, Rhode Island in 1790, George Washington wrote:

“The Citizens of the United States of America have a right to applaud themselves for having given to mankind examples of an enlarged and liberal policy: a policy worthy of imitation. All possess alike liberty of conscience and immunities of citizenship. . . . For happily the Government of the United States, which gives to bigotry no sanction, to persecution no assistance, requires only that they who live under its protection should demean themselves as good citizens giving it on all occasions their effectual support.”

The man who coined the term “melting pot” was the British author Israel Zangwill. In a now famous passage from his 1908 play The Melting Pot, he wrote:

“America is God’s Crucible, the Great Melting Pot, where all the races of Europe are reforming. Here you stand, good folk, think I, when I see them at Ellis Island, here you stand in your fifty groups and your fifty languages and histories and your fifty blood-hatreds and rivalries. But you won’t long be like that, brothers, for these are the fires of God you’ve come to — these are the fires of God. A fig for your feuds and vendettas, Germans and Frenchmen, Irishmen and English, Jews and Russians, into the crucible with you all. God is making the American.”

America has been a nation much loved. Germans have loved Germany. Frenchmen have loved France. Swedes have loved Sweden. This, of course, is only natural. Yet, America is not simply another country. To think that it is — is to miss the point of our history. America has been beloved not only by Americans, but by men and women throughout the world who have yearned for freedom. By the millions they have come and found here the opportunities that existed in no other place.

America dreamed a bigger dream than any other nation in history. It was a dream of a free society in which a person’s race, religion, or ethnic origin would be completely beside the point. It was a dream of a common nationality in which the only price to be paid was a commitment to fulfill the responsibilities of citizenship. In the 1840s, Herman Melville wrote:

“We are the heirs of all time and with all nations we divide our inheritance. On this Western Hemisphere all tribes and peoples are forming into one federated whole and there is a future which shall see the estranged children of Adam restored as to the old hearthstone in Eden. The seed is sown and the harvest must come.”

America has been a new thing in the world, not without the problems and challenges that afflict any human enterprise, and which persist today. Yet, it remains a beacon for men and women in search of freedom in every corner of the world. When the enforcers of political correctness seek to banish the “melting pot” from our history, we can only lament that those in charge of some of our colleges and universities understand so little of the American story. How will the new generation learn that story at universities like these? That should be a question that concerns us all.

“Cultural Appropriation”: A Growing Political Correctness Tactic to Silence Free Expression

In the name of something called “cultural appropriation,” a growing assault upon free expression is now under way as “political correctness” expands its horizons.

This attack takes many forms. After the 2013 American Music Awards, Katy Perry was criticized for dressing like a geisha while performing her hit single, “Unconditionally.” Arab-American writer Randa Jarrar accused a Caucasian woman who practiced belly-dancing of “white appropriation of Eastern dance.” Daily Beast entertainment writer Amy Zimmerman wrote that pop star Iggy Azalea perpetuated “cultural crimes” by imitating African-American rap styles. At Oberlin College, students protested a “piratization” of Japanese culture when sushi was served in the school dining hall.

In 2015, the Museum of Fine Arts in Boston was charged with cultural insensitivity and racism for its “Kimono Wednesdays.” At the event, visitors were invited to try on a replica of the kimono worn by Claude Monet’s wife Camille in the painting “La Japonaise.” The historically accurate kimonos were made in Japan for that very purpose. Still, Asian-American activists and their supporters surrounded the exhibit with signs like, “Try on the kimono: learn what it’s like to be a racist imperialist today.” Others attacked “Yellow-face@the MFA” on Facebook. The museum eventually apologized and changed the program so that the kimonos were available for viewing only. Still, activists complained that the display invited a “creepy Orientalist gaze.”

At an Australian writers festival in Brisbane in September, American author Lionel Shriver stirred much attention by criticizing as runaway political correctness efforts to ban references to ethnicity, gender, or sexual orientation from Halloween celebrations, or to prevent artists from drawing on ethnic sources for their work. Ms. Shriver, the author of 13 books, was especially critical of efforts to stop novelists from “cultural appropriation.” She deplored the criticism directed at authors like Chris Cleave, an Englishman, for presuming to write from the point of view of a Nigerian girl in his best-selling book Little Bee.

Ms. Shriver noted that she had been criticized for using in The Mandibles the character of a black woman with Alzheimer’s disease, who is kept on a leash by her homeless white husband. And she defended her right to depict members of minority groups in any situation if it served her artistic purposes. “Otherwise, all I could write about would be smart-alecky 59-year-old 5-foot-2-inch white women from North Carolina,” she said.

Writing in The New York Times after the meeting in Australia, Ms. Shriver criticized her fellow liberals for embracing cultural conformity:

“Do we really want every intellectual conversation to be scrupulously cleansed of any whiff of controversy? Will people, so worried about inadvertently giving offense, avoid those with different backgrounds altogether? Is that the kind of fiction we want — in which the novels of white writers all depict John Cheever’s homogeneous Connecticut suburbs of the 1950s, while the real world outside their covers becomes ever more diverse? . . . Protecting freedom of speech involves protecting the voices of people with whom you may violently disagree. In my youth, liberals would defend the right of neo-Nazis to march down Main Street. I cannot imagine anyone on the left making that case today.”

Professor Susan Scafidi of the Fordham University Law School notes that:

“Taking intellectual property, traditional knowledge, cultural expressions, or artifacts from someone else’s culture without permission is the definition [of] cultural appropriation.”

Writing in The New York Review of Books, novelist Francine Prose asks:

“Should Harriet Beecher Stowe have been discouraged from including black characters in Uncle Tom’s Cabin — a book that helped persuade the audience of the evils of slavery? Should Mark Twain have left Jim out of Huckleberry Finn, a novel that, more fully than any historical account, allows modern readers to begin to understand what it was like to live in a slave-owning society? Should someone have talked Kazuo Ishiguro out of writing The Remains of the Day, the beautiful novel whose protagonist — a white butler in England before World War II — presumably shares few surface similarities with his creator? Should immigrant writers and writers of color be restricted to portraying their own communities?”

Francine Prose asks questions which today’s cultural police seem never to have considered:

“What would modern art be like if the impressionists and later Van Gogh had not been so profoundly affected by Japanese woodblock prints, or if Picasso and Braque had not been drawn to the beauty and sophistication of African art? Should Roberto Bolaño, a Chilean who lived mostly in Mexico, not have focused, in the third section of his novel 2666, on an African-American journalist, or set the novel’s final chapters in Europe during World War II? Don’t we want different cultures to enrich one another? Reading Chekhov, we are amazed by his range, by his ability to see the world through the eyes of the rich and the poor, men and women, the old and the young, city dwellers and peasants. But had he caved to the pressures of identity politics and only described characters of his own gender and class, few of his six hundred or so stories would have been written.”

Another author, Cathy Young, provides this assessment:

“Welcome to the new war on culture. At one time such critiques were leveled against truly offensive art — work that trafficked in demeaning caricatures, such as blackface, 19th century minstrel shows, or ethnological expositions which literally put indigenous people on display, often in cages. But these accusations have become a common attack against any artist or artwork, no matter how thoughtfully or artfully presented. A work can reinvent the material or even serve as a tribute, but no matter. If artists dabble outside their own cultural experience, they’ve committed a creative sin.”

The protests being launched by the militant advocates of political correctness, in Young’s view, have the potential not only to chill creativity and artistic expression but also to do equal damage to diversity:

“This raises the troubling specter of cultural cleansing. When we attack people for stepping outside their own cultural experiences, we hinder our ability to develop empathy and cross-cultural understanding. What will be declared ‘problematic’ next? Picasso’s and Matisse’s works inspired by African art? Puccini’s ‘Orientalist’ operas, ‘Madame Butterfly’ and ‘Turandot?’ Should we rid our homes of Japanese prints? . . . Can Catholics claim appropriation when religious paintings of Jesus or the Virgin Mary are exhibited in a secular context, or when movies from ‘The Sound of Music’ to ‘Sister Act’ use nuns for entertainment? . . . Appropriation is not a crime. It’s a way to breathe new life into culture. People have borrowed, adopted, taken, infiltrated or reinvented from time immemorial. . . . Russian culture with its Slavic roots is also the product of Greek, Nordic, Tatar and Mongol influences. America is the ultimate blended culture.”

Actor and playwright J. B. Alexander points out that:

“William Shakespeare never personally felt the sting of racism, yet he wrote the character of Othello. He was never subjected to anti-Semitism, yet he wrote the character of Shylock. Nor was he ever a female adolescent, yet he wrote the character of Juliet. And we are all the richer for it. Artists must be free to create characters that lie within the scope of their imaginations, not merely to replicate their own identities, because great art allows us to transcend those identities and recognize our common humanity.”

If the crusade against “cultural appropriation” continues, we may reach a point where only Jews can read the Bible, only Greeks can read Plato or Aristotle, and only Italians can read Dante or Machiavelli. Where will it end? Will only those of British descent be able to appreciate Shakespeare, and only those of Russian descent read Tolstoy or Dostoevsky?

More than 100 years ago, the distinguished black intellectual W. E. B. Du Bois understood that art and culture, whatever the source, are relevant to men and women of all backgrounds. He declared:

“I sit with Shakespeare and he winces not. Across the color line I walk arm in arm with Balzac and Dumas, where smiling men and welcoming women glide in gilded halls. From out of the caves of evening that swing between the strong-limbed earth and the tracery of the stars, I summon Aristotle and Aurelius and what soul I will, and they come all graciously, with no scorn or condescension. So, wed with truth, I dwell above the veil.”     *

Friday, 04 November 2016 14:21

Ramblings

Allan C. Brownfeld


The 2016 Election Campaign Shows the Dramatic Decline in American Politics

The 2016 presidential campaign looms large, and there is little positive to be said about it. The two candidates, Donald Trump and Hillary Clinton, are viewed in negative terms by a majority of voters, and both are widely regarded as untrustworthy, a strange distinction for presidential candidates.

In preparation for a trip to visit my four grandchildren, I went through old copies of Cobblestone, a children’s history magazine. I asked my oldest grandson, who will soon be ten, which issues I should bring with me. These were magazines his father read, together with his aunt and uncle. He selected those that featured Thomas Jefferson, James Madison, Abraham Lincoln and Benjamin Franklin.

This immediately brought to mind the contrast between the men who involved themselves in public life in the colonial era and those who are offering themselves as leaders at the present time. Washington, Jefferson, Madison and others, in revolting against the most powerful empire in the world, risked everything they had, including their lives. If they did not succeed, which seemed likely, their families would have been left destitute. Contrast that with Hillary Clinton, who has spent her career in public life, and made millions of dollars as a result. The product she sold was influence. The Clinton Foundation, now known as the Bill, Hillary and Chelsea Clinton Foundation, accepted millions of dollars from at least seven foreign governments while Mrs. Clinton served as Secretary of State. The Foundation has admitted that a $500,000 donation it received from Algeria violated a 2008 ethics agreement between the foundation and the Obama administration. Or Donald Trump, whose business career is littered with bankruptcies, lawsuits, and charges of fraud, and whose campaign has insulted his opponents, made fun of those with disabilities, and flirted with racism. His experience in government is non-existent and his understanding of international affairs seems limited, at best. He has suggested that NATO is irrelevant, seems prepared to have Japan and South Korea pursue nuclear weapons, and seems sympathetic to Vladimir Putin. He has endorsed torture and the murder of innocent relatives of those involved in terrorism.

The authors of the Constitution carefully studied the history of Athens and the Roman Republic and created a system of limited government and checks and balances that they hoped would prevent the kind of descent into tyranny that ended those early democracies of the ancient world. The system they established is now the oldest existing form of government in the world, which tells us a great deal about the difficulty men and women have had in establishing governmental systems which provide for free speech, free elections, and individual rights. In the lifetime of many of us, countries that are now functioning democracies — Germany, Italy, Japan, Poland, Hungary and many others — were totalitarian states. Democracy is a difficult and easily threatened way to organize a society. When economies fail and times become difficult, demagogues are waiting in the wings. Hitler and Mussolini are the best-known examples, but there are many others. In history, tyranny is, sadly, not the exception. It has been the rule. Our own Constitutional system is a rare exception. But, like any human enterprise, it is fragile. Many predicted that it would not survive in the long run.

In On Power, the French political philosopher Bertrand de Jouvenel points out that we frequently say, “Liberty is the most precious of all goods,” without noticing what this concept implies. He writes:

“A good thing which is of great price is not one of the primary necessities. Water costs nothing at all, and bread very little. What costs much is something like a Rembrandt, which though its price is above rubies, is wanted by very few people and by none who have not, as it happens, a sufficiency of bread and water. Precious things, therefore, are really desired by but few human beings and not even by them until their primary needs have been amply provided. It is from this point of view that Liberty needs to be looked at — the will to be free is in time of danger extinguished and revives again when once the need of security has received satisfaction. Liberty is in fact only a secondary need, the primary need is security.”

From the beginning of history, the great philosophers predicted that democratic government would produce this result. Plato, Aristotle and, more recently, Tocqueville, Lord Bryce, and Macaulay predicted that people would voluntarily give away their freedom for what they perceived as greater security. De Jouvenel concludes:

“The state when once it is made the giver of protection and security, has but to urge the necessities of its protectorate and overlordship to justify its encroachments.”

In a similar vein, Thomas Babington Macaulay, the British historian, lamented in 1857 in a letter to Henry Randall, an American, that:

“I have long been convinced that institutions purely democratic must, sooner or later, destroy liberty or civilization or both. In Europe, where the population is dense, the effect of such institutions would be almost instantaneous. . . . Either the poor would plunder the rich, and civilization would perish; or order and prosperity would be saved by a strong military government and liberty would perish.”

Macaulay, looking to America, declared that:

“Either some Caesar or Napoleon will seize the reins of government with a strong hand; or your republic will be as fearfully plundered and laid waste by barbarians in the 20th century as the Roman Empire was in the Fifth — with this difference — that your Huns and Vandals will have been engendered within your own country by your institutions.”

Nearly 200 years ago, the British historian Alexander Fraser Tytler declared that:

“A democracy cannot exist as a permanent form of government. It can only exist until the voters discover that they can vote themselves largess out of the public treasury. From that moment on, the majority always votes for the candidates promising the most benefits from the public treasury — with the result that democracy collapses over a loose fiscal policy, always to be followed by a dictatorship.”

In the colonial era, the best men in the American society were engaged in public life. They had little to gain personally for their efforts, and much to lose. It has been said that the American society is rare in history, for its Golden Age was at the beginning. Our system has evolved to become one in which to engage in political life means to be on an endless quest for funds from special interest groups which, in turn, determine policy. Politicians argue that taking millions of dollars from Wall Street has nothing whatever to do with bailing out failing banks with taxpayer funds. Does anyone really believe this? Would Jefferson or Washington or Adams have entered public life if it involved endless fund-raising and subservience to those who contribute?

John Adams observed that: “Democracy never lasts long. It soon wastes, exhausts and murders itself. There never was a democracy that did not commit suicide.” As he left the Constitutional Convention, Benjamin Franklin was asked what form of government had been created. He replied, “A republic, if you can keep it.”

The Founding Fathers would be disappointed in the dramatic decline in American politics, but they would not be surprised. They feared it would happen. It is now time for America’s elder statesmen — of both parties — to speak up and decry any politics that divides the American people on the basis of race, ethnicity, religion, or gender. As the late Rep. Shirley Chisholm (D-NY) used to say, “We came over on different ships, but we’re in the same boat now.” And that boat is now in increasingly troubled waters.

Growth of Executive Power Has Exploded Under President Obama — Altering Our System of Checks and Balances

Most Americans learned in government and civics classes that we live under a constitutional system of checks and balances. The elected representatives of the people in Congress pass the laws and the executive carries them out. This has not been our reality for some time, and both parties are responsible for the growth of executive power and the decline of the Congress. President Obama, The New York Times notes, “has been one of the most prolific authors of major regulations in presidential history.”

In its first seven years, the Obama administration finalized 500 major regulations — which were never passed by Congress. These were classified by the Congressional Budget Office as having particularly significant economic or social impacts. That was nearly 50 percent more than the George W. Bush administration issued during the comparable period, according to data kept by the regulatory studies center at George Washington University.

In recent years, whichever party has been in power, the president’s power, and his willingness to issue executive orders rather than seek legislation from Congress, have grown. Under President George W. Bush, what some called a new “Imperial Presidency” was said to have emerged. In The Cult of the Presidency, the Cato Institute’s Gene Healy noted that the administration’s broad assertion of executive power included:

“. . . the power to launch wars at will, to tap phones and read e-mail without a warrant, and to seize American citizens, and hold them for the duration of the war on terror — in other words perhaps forever.”

Healy points out that:

“Neither Left nor Right sees the president as the Framers saw him: a constitutionally constrained chief executive with an important, but limited job: to defend the country when attacked, check Congress when it violates the Constitution, enforce the law — and little else. Today, for conservatives as well as liberals, it is the president’s job to protect us from harm, to “grow the economy,” to spread democracy and American ideals abroad, and even to heal spiritual malaise.”

In 2014, President Obama vowed,

“I’ve got a pen and I’ve got a phone — and I can use that pen to sign executive orders and take executive actions and administrative action that move the ball forward.”

This notion of executive power has little to do with the Constitution. It is a formula for the rule of one individual, and has had appeal to those in the White House, of both parties.

In one celebrated case, President Obama issued an executive order that would have allowed millions of undocumented immigrants to remain in the country and work legally. Twenty-six states challenged the order in the courts, arguing that the president did not have the authority to determine how the immigration laws should be enforced in the case of millions of people, and that the issue should be left to Congress. The lower courts agreed. The Supreme Court, in the case of United States v. Texas, split 4-4 on the question. As a result, the lower court ruling blocking the president’s executive order was upheld. Such a check, however, is very rare. Most executive orders are quietly implemented with little discussion or debate.

The decline of the Congress and the growth of executive power are clear to all when it comes to the war-making power. The war in Iraq was not declared by Congress, nor were the wars in Korea, Vietnam, Panama, Haiti, Grenada, or Somalia. In recent years, Congress has relinquished more authority than ever before over the nation’s foreign policy.

The Constitution reserves to Congress alone the power to declare war, even though it names the president commander-in-chief of the armed forces. In Federalist No. 69, Alexander Hamilton notes that the president’s authority:

“. . . would be nominally the same with that of the King of Great Britain, but in substance much inferior to it. . . . While that of the British King extends to the declaring of war and to the raising and regulating of fleets and armies, all of which, by the Constitution under consideration appertain to the legislature.”

In the case of Perkins v. Rogers, the Supreme Court declared:

“The war making power is, by the Constitution, vested in Congress and . . . the President has no power to declare war or conclude peace except as he may be empowered by Congress.”

In Presidential War Power, Louis Fisher, a senior specialist in separation of powers at the Library of Congress, states:

“From 1789 to 1950, Congress either declared or authorized all major wars. Members of Congress understood that the Constitution vests in Congress, not the president, the decision to take the country from a state of peace to a state of war. The last half-century has witnessed presidential wars, including President Truman going to war against North Korea and President Clinton using military force against Yugoslavia, with neither president seeking authority from Congress.”

Indeed, in more than 200 years and more than 100 U.S. military engagements, Congress has formally declared only five wars — the War of 1812, the Mexican-American War (1846), the Spanish-American War (1898), World War I (1917), and World War II (1941).

The office of president that we now observe is far different from what the Framers of the Constitution had in mind. According to the Cato Institute:

“The constitutional presidency, as the Framers conceived it, was designed to stand against the popular will as often as not, with the president wielding the veto power to restrain Congress when it transgressed its constitutional bounds. In contrast the modern president considers himself a tribune of the people, promising transformative action and demanding the power to carry it out.”

The result is what political scientist Theodore J. Lowi has called:

“. . . the plebiscitary presidency . . . an office of tremendous personal power drawn from the people . . . and based on the New Democratic theory that the presidency with all powers is the necessary condition for governing a large democratic nation.”

The growth of executive power is a real threat to the system of government established by the Constitution. Both parties are co-conspirators in expanding the power of the president. Even at the beginning of the Republic, perceptive observers such as John Calhoun, in his Disquisition on Government, predicted that the powers of government would inevitably grow, that those in power would always advocate a “broad” use of power, and those out of power would always argue for a “narrow” use of power, and that no one would ever turn back governmental authority that had once been assumed.

The scope of federal regulation has continued to grow and there is little reason to believe that this trend will not continue. Presidents, both Democrats and Republicans, have asserted greater power in recent decades to dictate the shape of regulations while Congress has become less specific in its instructions, in effect, abdicating its own authority. When she was a Harvard law professor, Elena Kagan, now a Supreme Court justice, said, “We live in an era of presidential administration.” Professor Robert Hahn of Oxford says:

“The big issue that I grapple with is that the regulatory state keeps growing. And as it keeps growing, when does it become too much?”

Whether our new president is Donald Trump or Hillary Clinton, major opposition to their campaign promises will be found in Congress. To bypass Congress, they now have the legacy of presidents such as George W. Bush and Barack Obama, who believe that issuing executive orders is an easy way to avoid the legislative process. This is not the system our Constitution established, but it is the one we seem to have now. This is not good news for those who believe in the system of checks and balances and division of powers that the Constitution established.

Looking at Race Relations Beyond the Overheated Rhetoric in the Political Arena

In the wake of police shootings in Baton Rouge and St. Paul and the murder of five police officers in Dallas by a shooter who said his goal was killing white policemen, there has been increasing focus upon and discussion of race relations. The overheated rhetoric in the political arena, on all sides, obscures a more complex reality.

According to a New York Times/CBS News poll conducted in July, 69 percent of Americans say race relations are generally bad, and six in ten say they are growing worse, up from 38 percent a year ago. Relations between black Americans and the police have become so brittle that more than half of black people say they were not surprised by the attack that killed five police officers and wounded nine others in Dallas. Nearly half of white Americans say that they, too, were unsurprised by the episode.

While the particulars of recent killings of black men by white police officers are subject to differing analyses — in a number of cases police officers have been found innocent by federal and state authorities of any wrongdoing — there can be no doubt that a real problem exists. Recent events caused Sen. Tim Scott (R-SC), a conservative black Republican, to tell his own story.

The first black senator elected in the South since Reconstruction, Scott reports many run-ins with police officers over the course of his life. He recalls drawing the suspicion of a Capitol Police officer last year who insisted on seeing identification even though Scott was wearing the distinctive lapel pin worn by senators. “The officer looked at me, full of attitude, and said, ‘The pin I know, you I don’t. Show me your ID,’” he recalled. “I was thinking to myself, either he thinks I’m committing a crime — impersonating a member of Congress — or what?”

Sen. Scott states:

“While I thank God I have not endured bodily harm, I have, however, felt the pressure applied by the scales of justice when they are slanted. I have felt the anger, the frustration, the sadness and the humiliation that comes with feeling like you are being targeted for nothing more than being yourself. . . . The vast majority of time, I was pulled over for nothing more than driving a new car in the wrong neighborhood, or some reason just as trivial. Imagine the frustration, the irritation, the sense of a loss of dignity that accompanies each of those stops.”

When it comes to the question of the use of lethal force by police, it has been widely claimed that blacks are more often the victims. Here, the rhetoric and the reality seem at odds. A new study confirms that black men and women are treated differently at the hands of law enforcement: they are more likely to be touched, handcuffed, pushed to the ground, or pepper-sprayed by a police officer, even after accounting for how, where, and when they encounter the police. But when it comes to the most lethal force — police shootings — the study finds no racial bias.

“It is the most surprising result of my career,” said Roland G. Fryer, Jr., the author of the study, and a professor of economics at Harvard. The study examined more than 1,000 shootings in 10 major police departments in Texas, Florida, and California.

The results of this study by Dr. Fryer, who is black, contradict the image of police shootings that many Americans hold. He said that anger after the deaths of Michael Brown, Freddie Gray, and others drove him to study the issue. “You know, protesting is not my thing, but data is my thing,” he said.

“So I decided that I was going to collect a bunch of data and try to understand what really is going on when it comes to racial differences in police use of force.”

The idea that race relations are approaching the divisiveness of the 1960s is hard to justify. President Obama states that:

“When we start suggesting that somehow there is this enormous polarization and we’re back to the situation in the 1960s — that’s just not true. You’re not seeing riots, and you’re not seeing police going after people who are protesting peacefully.”

The progress made by black Americans since the years of segregation is impressive. In 1950, only 13.7 percent of adult black Americans (25 and older) had completed high school or more; by 2014, this was 66.7 percent, according to the Department of Education. Over the same period, the number of African Americans with a bachelor’s degree or higher went from 2.2 percent to 22.8 percent. The black upper-middle class — defined as households with incomes of at least $100,000 — has grown from 2.8 percent of households in 1967 to 13 percent in 2014.

In every area of American life, blacks have been advancing dramatically. Black elected officials have made huge gains, reports the Joint Center for Political and Economic Studies. When the Voting Rights Act was passed in 1965, five African Americans served in the House and Senate; now there are 44 House members and two senators. Over a similar period, the number of black state legislators grew from about 200 to 700. We have, of course, elected our first black president — and then re-elected him.

Attitudes toward racial intermarriage have changed dramatically. NORC, an academic polling organization at the University of Chicago, periodically explores intermarriage in its surveys. One question asks whites and blacks whether they would favor or oppose a marriage of “a close relative” to a person of the other race. In 1990, only 5 percent of whites favored interracial marriage; 30 percent were neutral, and 65 percent opposed. By 2014, only 16 percent opposed. Blacks have been even more open to interracial marriage; since 2000, roughly 90 percent have either approved or not objected.

According to a recent Manhattan Institute study of census results from thousands of neighborhoods, residential segregation has been dramatically curtailed. The study found that the nation’s cities are more racially integrated than at any time since 1910, and that all-white enclaves “are effectively extinct.” Prof. Reynolds Farley of the University of Michigan’s Population Studies Center says that:

“There is now much more black-white neighborhood integration than 40 years ago. Those of us who worked on segregation in the 1960s never anticipated such decline.”

Major disparities do exist between black and white Americans. Yet to argue that “white racism” is the cause of all such disparities is to overlook a larger reality. The fact that 70 percent of black births involve unmarried mothers has serious consequences. As Child Trends, a research group, puts it, “These children tend to face unstable living arrangements, life in poverty and . . . have low educational achievement.” When it comes to the large number of young black men killed in shootings, Wall Street Journal columnist Jason Riley, who is black, notes that:

“More than 95 percent of black shooting deaths don’t involve the police. . . . Sadly, rates of murder, rape, robbery, assault and other violent crimes are 7 to 10 times higher among blacks than among whites.”

It is such high rates of crime, in Riley’s view, “that obviously underlie tensions between poor minority communities and cops.”

Those of us old enough to have lived through the years of segregation remember an era of segregated schools, segregated bus and train stations, “white” and “black” restrooms (visit the Pentagon and see the proliferation of restrooms constructed in the years when it was illegal in Virginia for men and women of different races to use the same facilities), and water fountains reserved for “white” and “colored.” In many parts of the country blacks could not vote or sit on juries. Black travelers never knew when they would be able to stop for a meal. There was no pretense that racial equality of any kind existed.

Today, we live in an imperfect society, but one in which all citizens, regardless of race, have equal rights. It is against the law to discriminate on the basis of race. Men and women can go as far as their individual abilities can take them. Black Americans hold every conceivable position in our society — from CEO of major corporations, to chief of police in major cities (such as Dallas), to university president, to governor, to attorney general, to President of the United States.

No one should pretend that problems of race do not exist. Compared to the distance we have come, however, these problems should be put in perspective. “The sky is falling” is not an appropriate posture for those on the left or the right, although in this political season, speaking before thinking is increasingly becoming the norm. Our reality is far more positive and hopeful than the political debate we are forced to endure would indicate.

Kaepernick’s Protest: A Look Back at the Patriotism of Black Americans in Difficult Times

Controversy is now swirling around the decision by San Francisco 49ers quarterback Colin Kaepernick not to stand with his teammates during the national anthem. He has been condemned by some and hailed by others for this action, which he said was to protest racism in the American society.

Whatever one thinks of Kaepernick, the controversy his protest has provoked provides us with an opportunity to review the long history of patriotism on the part of black Americans, even in the years when they faced severe discrimination. Many Americans, of all backgrounds, are largely unaware of this history.

Few understand the complex role blacks have played in the history of the United States. The first blood shed in the struggle for American independence was that of Crispus Attucks, a black man who led the group that precipitated the “Boston Massacre.” The first electric streetlights in a metropolitan area of New York City were installed under the supervision of a black man, Lewis H. Latimer, assistant and associate of Thomas A. Edison. The U.S. flag was first placed at the North Pole by a black explorer, Matthew A. Henson. The list goes on and on.

Black Americans, although they suffered the indignity of slavery and, after slavery, the legal barriers of segregation, have been committed patriots. In his important book, The Negro in the Making of America, Professor Benjamin Quarles, a distinguished black historian, points out that from the beginning, black Americans made one important decision: they would remain in America. From the time of the Revolutionary War, blacks had been advised — by many, black as well as white — to return to Africa. Instead, the decision to remain in America and be free was pervasive. (The book was published in 1964, when the term “Negro” was in common usage.)

At a black church meeting in Rochester, New York, in 1853, whose chairman was the noted orator Frederick Douglass, a statement was adopted which declared: “We ask that in our native land we shall not be treated as strangers, and worse than strangers.” The delegates officially rejected any move to abandon the United States and supported, instead, a proposal to establish a manual labor school that would teach the skilled trades.

The enemies of the United States have made many efforts to enlist the support of black Americans, a group they viewed as likely to endorse calls for revolution because of its legitimate grievances.

To the Communist Party in the 1920s and 1930s, the black American was the prototype of the oppressed, exploited worker. During a 1925 meeting in Moscow, Joseph Stalin asked why blacks were not better represented in the U.S. Communist Party. To improve their standing with blacks, the Communists adopted a policy calling for self-determination for those areas of the American South where blacks lived in large numbers. Blacks were called an “oppressed nation” that had the right to separate from the United States.

This effort to attract black membership was a dismal failure. Blacks wanted to be free and equal within America, not separate from it. Dr. Quarles writes:

“Negroes simply did not seem to be attuned to the Communist message, for reasons that are not hard to fathom. Typically American, the Negro was individualistic, not likely to submerge his personality in conformity to a party line from which there could be no deviation. . . . The Negro, again like other Americans of his day, was not class-conscious — the vocabulary of the Communists struck him as foreign. Basically, too, the Negro was a man of conservative mold.”

Because black Americans protested against segregation, some thought blacks were radical. Instead, they sought only the opportunity to enter the American society as free and equal citizens, to be able to go as far as their individual abilities would take them, to share in a common color-blind citizenship.

Some black Americans who briefly entered the Communist Party were repelled by it, discovering that the very freedom they sought was rejected within the party itself. Thus, the respected author Richard Wright recalled his experience as a young party member in Chicago in the 1930s:

“I found myself arguing alone against the majority opinion, and then I made still another amazing discovery. I saw that even those who agreed with me would not support me. At that meeting I learned that when a man was informed of the wish of the party he submitted, even though he knew with all the strength of his brain that the wish was not a wise one, that it was one that would ultimately hurt the party’s interests. . . . It was not courage that made me oppose the party. I simply did not know any better. It was inconceivable to me, tough-bred in the lap of Southern hate, that a man could not have his say. I had spent a third of my life traveling from the place of my birth to the North just to talk freely, to escape the pressure of fear. And now I was facing fear again.”

Discussing the meaning of black history that is often overlooked, J. A. Parker, one of the early black conservatives and president of the Lincoln Institute, noted that:

“In reviewing the history of black Americans, we should focus upon those who vigorously opposed the efforts of extremists to turn them against America, to isolate them from others in society and to cause them to abandon their goal of a free society in which men and women were to be judged as individuals, not as members of one racial or ethnic or religious group or another. We should focus upon individuals such as Gen. Daniel “Chappie” James, authors Max Yergan and George Schuyler, and composer William Grant Still, to name only several whose proper place in black history often seems to be overlooked. These men were outstanding in their individual careers and never ceased to fight for the day when race would be incidental in determining the place of any man or woman in the American society. They understood that America was the last, best hope of the world to achieve a truly free and just society.”

Benjamin Quarles was correct when he wrote that:

“To most Negroes . . . the vision of the founders of this republic was still a vital force. Americans to the core, they believe that freedom and equality for all could be achieved in their native land. . . . The belief has been one of their significant contributions in the making of America. In enlarging the meaning of freedom and in giving it new expressions, the Negro has played a major role. He has been a watchman on the wall. More fully than other Americans, he knew that freedom was hard-won and could be preserved only by continuous effort. The faith and works of the Negro over the years has made it possible for the American creed to retain so much of its appeal, so much of its moving power.”

America has been a unique and ethnically diverse society from the beginning. By the time of the first census in 1790, people of English origin were already a slight minority. Enslaved Africans and their American-born descendants made up 20 percent of the population, and there were large clusters of Scotch-Irish, German, Scottish and Dutch settlers, and smaller numbers of Swedes, Finns, Huguenots and Sephardic Jews.

America has always been something unique, not simply another country. In the 1840s, Herman Melville wrote: “We are the heirs of all time and with all the nations we divide our inheritance. If you kill an American, you shed the blood of the whole world.” Visiting New Amsterdam in 1643, French Jesuit missionary Isaac Jogues was surprised to discover that 18 languages were spoken in this town of 8,000 people. In his Letters from an American Farmer, J. Hector St. John Crevecoeur wrote in 1782: “Here individuals of all nations are melted into a new race of men, whose labor and posterity will one day cause great changes in the world.”

Mario Puzo, the author of The Godfather, wrote that:

“What has happened here has never happened in any other country in any other time. The poor who had been poor for centuries, whose children had inherited their poverty, their illiteracy, their hopelessness, achieved some economic dignity and freedom. You didn’t get it for nothing, you had to pay a price in tears, in suffering, why not? And some even became artists.”

As a young man, growing up in Manhattan’s Lower East Side, Puzo was asked by his mother, an Italian immigrant, what he wanted to be when he grew up. When he said he wanted to be a writer, she responded that, “For a thousand years in Italy, no one in our family was even able to read.” But in America everything was possible — in a single generation.

Ours is a complex society of more than 300 million people of every race, religion, and ethnic background. Inevitably, such a society will have problems and difficulties. These we must confront and resolve. To see only the problems and overlook the larger American story is to misunderstand reality. It would be good for Colin Kaepernick to review this history. In our free society, he has a right not to stand for the national anthem. Perhaps upon further consideration and a review of the dramatic progress our society has made, and continues to make, he will make a different decision in the future.     *

Sunday, 20 December 2015 08:12

Ramblings

Allan C. Brownfeld


"White Privilege": Not a Term Generations of Hardworking Immigrants Would Understand

A new term has emerged as an energized, and very youthful, civil rights movement has sought to focus attention upon what it perceives as widespread racism in today's American society. That term is "white privilege."

There is a very aggressive policing of language now under way. Former Maryland Governor Martin O'Malley, a 2016 Democratic presidential candidate, was interrupted by protestors when he said "all lives matter" at the Netroots Nation conference in Phoenix in mid-July. He later apologized. "That was a mistake on my part, and I meant no disrespect," he said on "This Week In Blackness," a digital show. Several dozen demonstrators interrupted O'Malley's talk by shouting "Black Lives Matter," which has become a rallying cry in the wake of recent shootings of black men by police officers.

At the Phoenix meeting, O'Malley responded, "Black lives matter. White lives matter. All lives matter." This was unacceptable to the protestors, who also shouted down Sen. Bernie Sanders of Vermont, one of O'Malley's Democratic rivals. "Black lives, of course, matter. I spent 50 years of my life fighting for civil rights and for dignity," Sanders declared. "But if you don't want me to be here, that's OK. I don't want to outscream people."

A great deal of attention is being paid to the new book, Between the World and Me, by Ta-Nehisi Coates of The Atlantic. The 176-page book is addressed to his 14-year-old son and the subject is what it is like to be black in America today.

In America, Coates writes, "it is traditional to destroy the black body - it is heritage." "White America," he argues, is "a syndicate arrayed to protect its exclusive power to dominate and control our bodies."

Coates said that if he were king, he would let criminals out of prison, "And, by the way, I include violent criminals in that." He writes in his book that he watched the smoldering towers of the World Trade Center on 9/11 with a cold heart. He felt that the police and firefighters who died "were menaces of nature, they were the fire, the comet, the storm, which could - with no justification - shatter my body."

Racism is a blemish on our society. Older black observers, who lived through the years of segregation, recognize how far we have come. Ellis Cose wrote a book, The Rage of a Privileged Class, in 1993 in which he argued that many successful black Americans "were seething about what they saw as the nation's broken promise of equal opportunity." More recently, Cose wrote in Newsweek:

Now, Barack Obama sits in the highest office in the land and a series of high-powered African-Americans have soared to the uppermost realms of their professions. The idea of a glass ceiling is almost laughable. Serious thinkers are searching for a new vocabulary to explain an America where skin color is an unreliable marker of status . . .

The history of the world, sadly, shows us people at war with one another over questions of race, religion, and ethnicity. Today, radical Islamists are killing people because they are of another religion. In Israel there are efforts to define the state as legally "Jewish," thereby making its 20 percent non-Jewish population less than full citizens. Russia has invaded Ukraine and annexed Crimea, with its large ethnic Russian population. When Britain left India, millions of Muslims were forced to leave Hindu-majority India to form Pakistan - at the cost of an untold number of lives. We have seen Armenians slaughtered by Turks and have witnessed genocide in Nazi Germany, Rwanda, and Burundi.

Those who glibly call America a "racist" society are not comparing it to anyplace in the real world, either historically or at the present time. They are comparing it to perfection and here, of course, we fail, as would any collection of human beings. But our collection of human beings includes men and women of every race and nation. And the notion of "white privilege" seems not to understand the reality of the immigrant experience. Most of today's "white" Americans are descendants of those immigrants, who often suffered great prejudice and many indignities which they overcame through perseverance and hard work. They hardly considered themselves beneficiaries of "white privilege."

Consider the experience of Irish and Italian immigrants who arrived in the U.S. in the 19th and early 20th centuries.

Between 1840 and 1924, 35 million immigrants arrived, many of them illiterate, and most unable to speak English. People who are now described as "white Europeans," were viewed quite differently in the past. A century ago, the Irish were considered by many to be a separate and inferior race. As Mike Wallace and Edwin Burrows write:

Just as the English had long characterized their neighboring islanders more harshly than they had Africans, plenty of Anglo New Yorkers routinely used adjectives like "low-browed," "savage," "bestial," "wild" and "simian" to describe the Irish Catholic "race."

Thomas Nast, the leading political cartoonist from the 1870s to the 1890s, portrayed Irishmen almost as monkeys and drew Catholic bishops' hats as sharks' jaws. Andrew Greeley described the Irishman in American cartoons:

By the mid-19th century, he was a gorilla, stovepipe hat on his head, a shamrock in his lapel, a vast jug of liquor in one hand and a large club in the other. His face was a mask of simian brutality and stupidity.

Italian immigrants, largely illiterate peasants from southern Italy and Sicily who had no experience of urban life, were a visually distinctive group, viewed by many as belonging to another race. "Swarthy" was a term often used to describe them and, as Richard Alba notes, "To the eyes of Americans they bore other physical signs of degradation such as low foreheads." Leonard Dinnerstein and David Reimers write that in addition to using epithets such as "wop," "dago," and "guinea," Americans referred to Italians as "the Chinese of Europe." Many Americans doubted that Italians were "white." In the American South, Italians were often segregated like blacks and were classified as yet another race - "between." Eleven Italians were lynched in New Orleans in 1891, and five Italians were lynched in Tallulah, Louisiana, in 1899.

The notion that all immigrants from Europe were regarded as white Europeans and accepted without prejudice - upon which notions of "white privilege" are based - is an artifact of 1990s "multiculturalism" with no historical basis. Life was difficult and challenging. By 1910, there were 540,000 Eastern European Jews living in 1.5 square miles on the Lower East Side of Manhattan. There were 730 people per acre, possibly the highest density on earth. They lived in five- or six-story tenement houses, sleeping three or more to a room, with most rooms opening only to narrow airshafts. These grim conditions were highlighted in Jacob Riis's book, How the Other Half Lives.

But despite all of this, America was different and unique, a society in which, no matter your origin, you could go as far as your ability would take you. As a young man growing up in Manhattan, author Mario Puzo was asked by his mother, an Italian immigrant, what he wanted to be when he grew up. When he said he wanted to be a writer, she responded that, "For a thousand years in Italy, no one in our family was even able to read." But in America, everything was possible - in a single generation.

Puzo writes:

It was hard for my mother to believe that her son could become an artist. After all, her own dream in coming to America had been to earn her daily bread, a wild dream in itself, and looking back she was dead right. Her son an artist? To this day she shakes her head. I shake mine with her. . . . What has happened here has never happened in any other country in any other time. The poor who had been poor for centuries . . . whose children had inherited their poverty, their illiteracy, their hopelessness, achieved some economic dignity and freedom. You didn't get it for nothing, you had to pay a price in tears, in suffering, why not? And some even became artists.

America has been a nation much loved. Germans have loved Germany, Frenchmen have loved France, Swedes have loved Sweden. This, of course, is only natural. America has been beloved not only by native-born Americans but by men and women of every race and nation who have yearned for freedom. For all its failings, America dreamed a bigger dream than any nation in history. Those who think "white privilege" explains reality know little of America and the world.

America is more than simply another country. Visiting New Amsterdam in 1643, French Jesuit missionary Isaac Jogues was surprised to discover that 18 languages were spoken in this town of 8,000 people. In his Letters From An American Farmer, J. Hector St. John Crevecoeur wrote in 1782: "Here individuals of all nations are melted into a new race of men, whose labors and posterity will one day cause great changes in the world."

Looking at our complex history and recognizing only its shortcomings - comparing America only to perfection, not to other real places in the world - may lead Ta-Nehisi Coates and other young people in the "Black Lives Matter" movement to believe that "white privilege" is, somehow, an explanation for a reality that is multi-faceted. The millions of immigrants who suffered the travails of displacement and discrimination would not recognize the term as representing the experience they and their descendants have had.

The Sin of Contemporaneity: Cleansing History by Applying Today's Standards to Our Ancestors

It is good that the Confederate battle flag has been removed from the South Carolina statehouse grounds. It properly belongs in a museum. Robert E. Lee himself would agree. After surrendering in 1865, he sought to bring the country together. He urged his fellow Confederates to furl their flags. He left instructions that the Confederate flag not be displayed at his funeral. In fact, when Lee surrendered at Appomattox, he was going against Jefferson Davis' order to fight on. "It's over," Lee declared.

What we are witnessing now, however, is a wholesale assault upon our history. The Founding Fathers have been targeted. It has been suggested that the Washington Monument and Jefferson Memorial are inappropriate, since they celebrate men who owned slaves. CNN commentator Don Lemon suggested that we "rethink" any homage to Jefferson. Even in states where slavery was outlawed at an early date, state flags are under attack because of their depiction of Native Americans. Boston Globe columnist Yvonne Abraham said the Massachusetts flag "is no Confederate flag, but . . . still pretty awful." The Memphis City Council voted to dig up the bodies of Confederate Gen. Nathan Bedford Forrest and his wife from their public grave. The rebel flag-clad General Lee automobile from "The Dukes of Hazzard" has been removed from memorabilia shops and the show itself removed from re-runs. The Washington National Cathedral is considering removing its own stained-glass windows because they contain Confederate flag imagery that was meant to be conciliatory. Louis Farrakhan has demanded that the American flag itself be hauled down. Speaking at a Washington church, he declared:

I don't know what the fight is about over the Confederate flag. We've caught as much hell under the American flag as under the Confederate flag.

It's time for all of us to take a deep breath. Those who seek to erase our history sound a bit like the Taliban and ISIS, who are busy destroying historic structures all over the Middle East that predate the rise of Islam. History is what it is, a mixed bag of mankind's strengths and weaknesses, of extraordinary achievements and the most horrible depredations. To judge the men and women of past eras by today's standards is to be guilty of what the respected Quaker theologian Elton Trueblood called the "sin of contemporaneity." In the case of those who refer to slavery as our "original sin," a look at history is instructive.

Sadly, from the beginning of recorded history until the 19th century, slavery was the way of the world. Far from being an American peculiarity, slavery was legal everywhere in the world when the Constitution was written in 1787. What was unique was that in the American colonies there was strenuous objection to slavery, and that the most prominent framers of the Constitution wanted to eliminate it at the very start of the nation.

Slavery played an important part in many ancient civilizations. Indeed, most people in the ancient world regarded slavery as a natural condition of life, which could befall anyone at any time. It has existed almost universally through history among peoples of every level of material culture - it existed among nomadic pastoralists of Asia, hunting societies of North American Indians, and sea people such as the Norsemen. The legal codes of Sumer provide documentary evidence that slavery existed there as early as the 4th millennium B.C. The Sumerian symbol for slave in cuneiform writing suggests "foreign."

The British historian of classical slavery, Moses I. Finley, writes:

The cities in which individual freedom reached its highest expression - most obviously Athens - were cities in which chattel slavery flourished.

At the time of its cultural peak, Athens may have had 115,000 slaves to 43,000 citizens. The same is true of ancient Rome. Plutarch notes that on a single day in the year 167 B.C., 150,000 slaves were sold in one market.

Our Judeo-Christian tradition was also one which accepted the legitimacy of slavery. The Old Testament regulates the relationship between master and slave in great detail. In Leviticus (XXV: 39-55), God instructs the Children of Israel to enslave the heathen and their progeny forever. By classical standards, the treatment of slaves called for in the Bible was humane. In Exodus (XXI: 26-27), it states that if a master blinded his slave or knocked out one of his teeth, the slave was to go free. There is no departure from this approach to slavery in the New Testament. In a number of places, St. Paul urges slaves to obey their masters with full hearts and without equivocation. St. Peter urges slaves to obey even unjust orders of their masters.

Slavery was a continuous reality throughout the entire history which preceded the American Revolution. In England, ten percent of the persons enumerated in the Domesday Book (A.D. 1086) were slaves, and they could be put to death with impunity by their owners. During the Viking age, Norse merchant sailors sold Russian slaves in Constantinople. Venice grew to prosperity and power as a slave-trading republic, which took its human cargo from the Byzantine Empire. Portugal imported large numbers of African slaves from 1444 on. By the middle of the 16th century, Lisbon had more black residents than white.

Slavery was not a European invention, but was universal. Throughout the Middle Ages, black Africans sold slaves to other Africans and to Moslem traders who also brought slaves to Asia. Among the Aztecs, a man who could not pay his debts sold himself into slavery to his creditor. In China, poor families who could not feed all of their children often sold some as slaves. As the Founding Fathers looked through history, they saw slavery as an accepted institution.

What is historically unique is not that slavery was the accepted way of the world in 1787, but that so many of the leading men in the American colonies of that day wanted to eliminate it, and pressed vigorously to do so. Benjamin Franklin and Alexander Hamilton were ardent abolitionists. John Jay, who would become the first Chief Justice, was president of the New York Anti-Slavery Society. Rufus King and Gouverneur Morris were in the forefront of opposition to slavery.

One of the great debates at the Constitutional Convention related to the African slave trade. George Mason of Virginia made an eloquent plea for making it illegal:

The infernal traffic originated in the avarice of British merchants. The British government constantly checked the attempt of Virginia to put a stop to it. . . . Slavery discourages arts and manufactures. The poor despise labor when performed by slaves. . . . Every master of slaves is born a petty tyrant. They bring the judgment of heaven on a country.

The provision finally adopted read:

The Migration or Importation of such Persons as any of the States now existing shall think proper to admit, shall not be prohibited by the Congress prior to the year one thousand eight hundred and eight.

This clause was widely viewed by opponents of slavery as an important first step toward abolition. The delay of 20 years was considered the price ten of the states were willing to pay in order to assure that the original union would include the three states of Georgia, South Carolina and North Carolina. Even in those states there was sympathy for an end to slavery, but they wanted additional time to phase out their economic dependence on it.

In his original draft of the Declaration of Independence, one of the principal charges made by Thomas Jefferson against King George III and his predecessors was that they would not allow the American colonies to outlaw the importation of slaves. When Jefferson was first elected to the Virginia legislature, at the age of 25, his first political act was to begin the elimination of slavery. Though unsuccessful, he tried to further encourage the emancipation process by writing in the Declaration of Independence that "all men are created equal." In his draft of a constitution for Virginia he provided that all slaves would be emancipated in that state by 1800, and that any child born in Virginia after 1801 would be born free. This, however, was not adopted.

In his autobiography, Jefferson declared, "Nothing is more certainly written in the book of fate than that these people are to be free." In 1784, Jefferson was a leading supporter of an unsuccessful effort to exclude slavery from the Northwest Territory. Finally, with the passage of the Northwest Ordinance of 1787, slavery was indeed excluded from these territories - a further step along the path to the final elimination of slavery, and a clear indication of the view of slavery which predominated among the framers of the Constitution.

American history is flawed, as is any human enterprise. Yet those who now call for the removal of statues and monuments commemorating our past are measuring our history against perfection, not against other real places. What other society in 1787 - or at any date in history prior to that time - would these critics find more free and equitable than the one established by the Constitution? Where else in 1787 was there religious freedom, with no religious test for public office? Compared to perfection, our ancestors are found wanting. Compared to other real places in the world, they were clearly ahead of their time, advancing the frontiers of freedom.

If we judge the past by the standards of today, must we stop reading Plato and Aristotle, Sophocles and Aristophanes, Dante and Chaucer? Will we soon hear calls to demolish the Acropolis and the Coliseum, as we now hear calls to remove memorials to Jefferson and statues of Robert E. Lee? Must we abandon the Bible because it lacks modern sensibility? Where will it end? As theologian Elton Trueblood said, "contemporaneity" is indeed a sin. We would do well to avoid its embrace.

Remembering a Time When Our Leaders Risked Their Lives and Fortunes for What They Believed

When Patrick Henry made his famous declaration, "Give me liberty or give me death," at a church in Richmond, Virginia, on the eve of the American Revolution, his words had real meaning. Indeed, by advocating revolution against England, the most powerful nation in the world at the time, with the world's largest army and navy, the Founding Fathers were risking everything. If the revolution failed, which seemed likely to many, they would have lost their property - and their lives.

George Washington's home at Mt. Vernon, Thomas Jefferson's at Monticello, James Madison's at Montpelier - all would have been confiscated by the victorious British, had the war been lost. At the time the Declaration of Independence was signed in 1776, only one third of the population of the thirteen colonies supported breaking away from the British Empire. Those who supported independence put their lives on the line.

In his book American Creation, historian Joseph Ellis writes:

The revolutionary generation won the first successful war for colonial independence in the modern era, against all odds, defeating the most powerful army and navy in the world. . . . The British philosopher and essayist Alfred North Whitehead observed that there have been only two instances in the history of Western civilization when the political leaders of an emerging nation behaved as well as anyone could reasonably expect. The first was Rome under Caesar Augustus and the second was America's revolutionary generation. . . . The late eighteenth century was the most politically creative era in American history. They were, in effect, always on their best behavior because they knew we would be watching, an idea we should find endearing because it makes us complicitous in their greatness.

The Founding Fathers did not consult the colonial equivalent of pollsters to find out what people would like to hear. Instead, they developed ideas about how a government should be run and how freedom could be established in an environment of order and law. Alexander Hamilton and James Madison were prime movers behind the summoning of the Constitutional Convention and the chief authors of The Federalist Papers, an undertaking to convince Americans to support the Constitution of 1787.

In his biography, Alexander Hamilton, Ron Chernow notes that:

He had a pragmatic mind that minted comprehensive programs. In contriving the smoothly running machinery of a modern nation-state - including a budget system, a funded debt, a tax system, a central bank, a customs service, and a coast guard - and justifying them in some of America's most influential state papers, he set a high-water mark for administrative competence that has never been equaled. If Jefferson provided the essential poetry of American political discourse, Hamilton established the prose of American statecraft. No other founder articulated such a clear and prescient vision of America's future political, military, and economic strength or crafted such ingenious mechanisms to bind the nation together.

Hamilton, a careful reader of the skeptical Scottish philosopher David Hume, quoted his view that in framing a government "every man ought to be supposed a knave and to have no other end in all his actions but private interests." The task of government, he believed, was not to stop selfish striving - a hopeless task - but to harness it to public good. In starting to outline the contours of his own vision of government, Hamilton was spurred by Hume's dark vision of human nature, which corresponded to his own. From the "First Philippic" of Demosthenes, he plucked a passage that summed up his conception of a leader as someone who would not pander to popular whims. "As a general marches at the head of his troops," so should wise political leaders

. . . march at the head of affairs, insomuch that they ought not to wait the event to know what measures to take, but the measures which they have taken ought to produce the event.

The Founding Fathers - Washington, Adams, Jefferson, Madison, Monroe, Hamilton, Franklin and the others - were an extraordinary group of men, truly representing a golden age in our history. The creation of the new American government clearly required both Republicans and Federalists, both a Jefferson and a Hamilton, both those jealous for individual freedom and those concerned that such freedom could only exist and be maintained within an orderly society ruled by law. In a society of only a few million people, we produced leaders who have stood the test of time. Such a generation has never again been seen, on these shores or elsewhere.

These men did not hire ghostwriters for The Federalist Papers. Their words and their thoughts were their own. They did not hire consultants and pollsters to tell them what their views should be on the issues of the day. They often took highly unpopular positions and did their best to convince their colleagues and the public at large of their merits. They risked their lives and everything they owned to declare independence, and knew very well that the possibility of losing everything was very real.

The contrast between the Founding Fathers and those engaged in public life at the present time could not be greater. In the colonial period, our leaders risked their fortunes for the principle of independence. Today, men and women make their fortunes through their participation in politics.

Hillary Clinton, for example, reported that she earned $10.2 million from 45 speeches in 2014, her first full year out of office. Of that, almost $4.6 million came from clients who did lobbying to shape policies on issues as varied as taxes, trade, financial regulation and health care. Later, we learned that the Clinton Foundation had received as much as $26.4 million in previously undisclosed payments from major corporations, universities, foreign sources, and other groups. The money was paid as "fees" for speeches for Bill, Hillary, and Chelsea Clinton.

There can be little doubt that this money was given to the Clintons because of Hillary Clinton's candidacy for president and her ability to provide assistance to those contributing. Sheila Krumholz, executive director of the money-tracking Center for Responsive Politics, states:

It's big money. They're spending it because they have far greater sums riding on those decisions that they're trying to shape. Every man or woman in the street thought Hillary Clinton would run again.

Even those who are sympathetic to Mrs. Clinton's candidacy, such as liberal Washington Post columnist Ruth Marcus, have expressed dismay: "Again with the speeches. The gross excessiveness of it all, vacuuming up six-figure checks well past the point of rational need or political seemliness. . . ."

But Hillary Clinton is hardly alone. Norman R. Braman, a Florida billionaire who has long bolstered the career and personal finances of Sen. Marco Rubio (R-FL), is reported ready to invest $10 million or more in the Senator's presidential candidacy. Las Vegas casino billionaire Sheldon Adelson has auditioned possible Republican candidates who seek his support. Endorsement of the policies of Israeli Prime Minister Netanyahu, including his rejection of a two-state solution, is a requirement. When New Jersey Governor Chris Christie appeared, he made the mistake of referring to the West Bank as "occupied territory" (which, of course, it is under international law, as well as U.S. policy, under both Republicans and Democrats). Christie quickly apologized for his "mistake." Jeb Bush also sought Adelson's support and turned his back on long-time Bush family friend and former Secretary of State James Baker to get it. Baker, in a recent talk to J Street, was critical of Israel's rejection of the two-state solution, which was unacceptable to Adelson.

Beyond this, many candidates don't seem to know where they stand on the issues - except when their pollsters tell them what is necessary to win in Iowa or South Carolina or New Hampshire. Hillary Clinton once was a supporter of the trade pact being considered in the Congress. Now, she refuses to take a position - or even take questions from the press. Scott Walker was first hot and then cold on a path to citizenship for undocumented immigrants. Marco Rubio was in and then out on offsetting increased military spending with other cuts. And what exactly is his current position on immigration? Sen. Lindsey Graham (R-SC) is sure of one thing: his opposition to gambling on the Internet. This, of course, is a crusade of Sheldon Adelson, who wants no competition for his gambling casinos. He seems to favor competition and free enterprise in every commercial undertaking but his own.

The contrast between the political leaders of America's golden age and those we observe today could not be starker. No one today is risking his life, property, or honor for anything. The state of our government reflects this fact all too well. *

Sunday, 20 December 2015 08:08

Ramblings

Allan C. Brownfeld

Every Tragic Incident - Such as That in Missouri - Produces Cries That America Is a "Racist" Society, Overlooking a More Complex Reality

The killing of 18-year-old Michael Brown, who is black, by a white police officer in Ferguson, Missouri led to days of demonstrations, rioting, and looting. There has been criticism of the overwhelming police response, as well as charges that racism was involved in the death of this teenager. Beyond this, many have proclaimed that this incident shows us that America is a "racist" society, and that talk of racial progress and a movement toward a genuinely "color-blind" society is false.

Exactly what happened in Ferguson will be determined by a thorough investigation, including participation by the FBI and the Department of Justice. If there was wrongdoing by the police officer involved, this will be documented and appropriate action will be taken. In the meantime, we can only withhold judgment on what actually occurred.

What we can properly lament, however, is the manner in which a chorus of voices is immediately heard after every negative event telling us that racism is alive and well in almost every sector of our society. The reality is far more complex.

Typical of this phenomenon is a column in The New York Times by Charles Blow, who is black. He declares that:

The criminalization of black and brown bodies, particularly male ones, [starts] from the moment they are first introduced to the institutions and power structures with which they must interact. . . . Black male dropout rates are more than one and a half times those of white males, the bias of the educational system bleeds easily into the bias of the criminal justice system, from cops to courts to correctional facilities. The school-to-prison pipeline is complete.

Earlier this year, the Department of Education's Office for Civil Rights released "the first comprehensive look at civil rights from every public school in the country in nearly 15 years." Attorney General Eric Holder said:

The critical report shows that racial disparities in school discipline policies are not only well documented among older students but actually begin during pre-school.

The fact that more young black men drop out of school, that they are over-represented in our criminal justice system, and that they are more often subjected to school discipline, is not necessarily an indication of "institutional racism" in our society, as Mr. Blow and so many others rush to proclaim. There are other, much more plausible explanations.

By 2004, federal data showed that black Americans, 13 percent of the population, accounted for 37 percent of violent crimes, 54 percent of arrests for robbery, and 51 percent for murder. Most of the victims of these violent criminals were also black. If black men are over-represented in our prison population, the reason appears to be that they commit a disproportionate amount of crime. Commentator Juan Williams, who is black, laments that:

Any mention of black America's responsibility for committing the crimes, big and small, that lead so many people to prison is barely mumbled, if mentioned at all.

In a column titled "Our Selective Outrage," The Washington Post's Eugene Robinson, who is black, notes that:

The killing of 18-year-old Michael Brown has rightly provoked widespread outrage, drawing international media attention and prompting a comment from President Obama. The same should be true, but tragically is not, of the killing of 3-year-old Knijah Amore Bibb. Brown was killed in Ferguson, Missouri; Knijah died the following day in Landover, Maryland. Both victims were African-American. Both had their whole lives before them. The salient difference is that Brown was shot to death by a white police officer, according to witnesses, while the fugitive suspect in Bibb's killing is a 25-year-old black man with a long criminal record.

Robinson points to statistics showing the dimensions of the problem. According to the FBI, in 2012, the last year for which figures are available, 2,614 whites were killed by white offenders, and 2,412 blacks were killed by black offenders, similar numbers. "But," writes Robinson,

. . . the non-Hispanic white population is almost five times as large as the African-American population, meaning the homicide rate in black communities is staggeringly higher. . . . We need to get angry before we have to mourn the next Knijah Bibb.

It is not "white racism" which causes black-on-black crime, and it may be something other than racism that causes disciplinary disparities and the number of school dropouts. The breakdown of the black family is a more likely cause for such disparities.

In 1940, the black rate of out-of-wedlock birth was around 14 percent. Now it is 75 percent. In 1870, right after slavery, 70 to 80 percent of black families were intact. Today, after the end of segregation, the enactment of legislation making racial discrimination illegal, and myriad affirmative action programs, 70 percent of black children have single mothers, and estimates are that an even larger percentage will grow up without a father in the home.

Blaming the problems we confront on "racism" misses the point of the real dilemmas we face. Attorney General Holder does black Americans no favor by ignoring the disintegration of the black family in explaining disparities in school dropouts and disciplinary problems. White racism is not, somehow, compelling out-of-wedlock birth in the black community; family breakdown is a far more plausible causative factor in these statistical disparities than an amorphous "institutional racism."

What was missing in the response to developments in Missouri, which included rioting and arson, and cries of "No Justice, No Peace," was "the calming voice of a national civil rights leader of the kind that was so impressive during the 1950s and 1960s," writes author Joseph Epstein:

In those days, there were Martin Luther King, Jr. of the Southern Christian Leadership Conference, Roy Wilkins of the NAACP, Whitney Young of the National Urban League, Bayard Rustin of the A. Philip Randolph Institute - all solid, serious men, each impressive in different ways, who through dignified forbearance and strategic action, brought down a body of unequivocally immoral laws aimed at America's black population.

The NAACP, the Urban League, and the SCLC still exist, notes Epstein,

. . . yet few people are likely to know the names of their leaders. That is because no black leader has come forth to set out a program for progress for the substantial part of the black population that has remained for generations in the slough of poverty, crime, and despair. . . . In Chicago, where I live, much of the murder and crime that has captured the interest of the media is black-on-black and cannot be chalked up to racism. Bill Cosby, Thomas Sowell, Shelby Steele, and a few others have dared to speak about the pathologies at work, and for doing so these black figures have been castigated.

Soon enough, exactly what happened in Ferguson, Missouri will become clear and the matter will be resolved through our legal system. It will take a much longer time before our society begins to confront the real causes of the racial disparities and pathologies which are all too easily, and falsely, attributed to "white racism." Until we do, the sad story of Ferguson is likely to happen again and again.

Family Breakdown: One Important Cause of Many of Society's Ills

In 1965, Daniel Patrick Moynihan, then assistant secretary of labor who went on to serve as Democratic U.S. senator from New York for nearly a quarter century, issued a report warning of a crisis growing for America's black families. It reported a dramatic increase in out-of-wedlock births and one-parent families and warned of the "tangle of pathologies" which resulted. Among these were poor performance in school, increased drug use, and a growing rate of incarceration for crime.

"The Moynihan argument . . . assumed that the troubles impending for black America were unique," writes Nicholas Eberstadt of the American Enterprise Institute:

. . . a consequence of the singular historical burdens that black Americans had endured in our country. That argument was not only plausible at the time, but also persuasive. Yet today that same "tangle of pathology" can no longer be described as characteristic of just one group within our country. Quite the contrary . . . these pathologies are evident throughout all of America today, regardless of race or ethnicity.

Single motherhood has become so common in America that demographers believe that half of all children will live with a single mother at some point before age 18. Research from Princeton University's Sara McLanahan and Harvard University's Christopher Jencks shows that more than 70 percent of all black children are born to an unmarried mother, a threefold increase since the 1960s.

In a new paper, McLanahan and Jencks assess the state of children born to single mothers, nearly fifty years after the Moynihan Report warned that the growing number of fatherless black children would struggle to avoid poverty. The report looks prescient. Black children today are about twice as likely as the national average to live with an unmarried mother. Research is confirming Moynihan's fears that children of unmarried mothers face more obstacles in life.

In the studies reviewed by McLanahan and Jencks, it was found that these children experience more family instability, with new partners moving in and out, and more half-siblings fathered by different men. The growing number of studies in this field also suggest that these children have more problem behaviors and more trouble finishing school.

The growing debate about income inequality ignores the evidence that shows that unwed parents raise poorer children. Isabel Sawhill of the Brookings Institution calculates that returning marriage rates to their 1970 level would lower the child poverty rate by a fifth. There may be a partisan political reason why this point is not made more often. The Economist suggests that, "This omission may be deliberate. Democrats are reluctant to offend unmarried women, 60 percent of whom voted for the party's candidates in 2014."

There may be, some observers point out, a connection between government welfare programs and the breakdown of the family, as well as the declining number of men in the workforce. As late as 1963, on the eve of the War on Poverty, more than 93 percent of American babies were coming into the world with two married parents. According to the 1960 census, nearly 88 percent of children under 18 were then living with two parents. For the quarter century from 1940 to 1965, official data recorded a rise in the fraction of births to unmarried women from 3.8 percent to 7.7 percent. Over the following quarter century, 1965-1990, out-of-wedlock births jumped from 7.7 percent of the nationwide total to 28 percent. The most recent data available, for 2012, show that America's overall out-of-wedlock ratio had moved beyond 40 percent.

The trends discussed in the 1965 Moynihan Report for black families have now extended to American families of all racial backgrounds. Among Hispanic Americans, more than 30 percent of children were in single-parent homes by 2013, and well over half were born out-of-wedlock by 2012. Among non-Hispanic white Americans, there were few signs of family breakdown before the massive government entitlement programs began with the War on Poverty in the 1960s. Between 1940 and 1963, the out-of-wedlock birth ratio increased, but only from 2 percent to 3 percent. In 1960, just 6 percent of white children lived with single mothers. As of 2012, the proportion of out-of-wedlock births was 29 percent, nearly 10 times as high as it was just before the War on Poverty.

In his study, The Great Society at Fifty: The Triumph and the Tragedy, Nicholas Eberstadt argues that:

What is indisputable . . . is that the new American welfare state facilitated these new American trends by helping to finance them: by providing support for working-age men who are no longer seeking employment and for single women with children who would not be able to maintain independent households without government aid. Regardless of the origins of the flight from work and family breakdown, the War on Poverty and successive welfare policies have made each of these modern tendencies more feasible as mass phenomena in our country today.

The War on Poverty, of course, did not envision such a result. These were unintended consequences that, as we have seen, are often the case with many well-intentioned government programs. President Lyndon Johnson wanted to bring dependence on government handouts to an eventual end, and did not intend to perpetuate them into the future. Three months after his Great Society speech, Johnson declared:

We are not content to accept the endless growth of relief rolls, of welfare rolls. . . . Our American answer to poverty is not to make the poor more secure in their poverty but to reach down and to help them lift themselves out of the ruts of poverty and move with the large majority along the high road of hope and prosperity.

In Eberstadt's view:

Held against this ideal, the actual unfolding of America's antipoverty policies can be seen only as a tragic failure. Dependence on government relief, in its many modern versions, is more widespread today, and possibly also more habitual, than at any time in our history. To make matters much worse, such aid has become integral to financing lifestyles and behavioral patterns plainly destructive to our commonwealth - and on a scale far more vast than could have been imagined in an era before such antipoverty aid was all but unconditionally available.

Any serious discussion of poverty and the growing gaps in income must confront the reasons why, for example, in the past 50 years the fraction of civilian men ages 25 to 34 who are neither working nor looking for work has quadrupled, and why, for many women, children, and even working-age men, the entitlement state has become the breadwinner. Daniel Patrick Moynihan once said, "the issue of welfare is not what it costs those who provide it, but what it costs those who receive it."

At the heart of the social and economic decline we face at the present time is the breakdown of the family. Few in the political arena, in either party, are addressing this question. Unless they do, their proposals to move our economy forward and lessen the gaps in income and wealth are unlikely to succeed.

There Is a Growing Danger That Police Are Being Made Scapegoats for Larger Racial Problems That Society Ignores

The attacks upon police for "racism" have been mounting as a result of the killings of black men in Ferguson, Staten Island, and elsewhere. Many with a history of demagoguery when it comes to questions of race relations, Jesse Jackson and Al Sharpton among them, have done their best to keep this issue alive. Sadly, they have cast more heat than light on a question that is far more complex than their self-serving analysis would lead Americans to believe.

Recently, FBI director James Comey addressed this question. At the outset, he declared certain "hard truths," including the fact that the history of law enforcement has been tied to enforcing slavery, segregation, and other forms of discrimination. "One reason we cannot forget our law enforcement legacy," he said, "is that the people we serve and protect cannot forget it, either."

Mr. Comey also acknowledged the existence of unconscious racial bias "in our white-majority culture," and how that influences policing. He conceded that people in law enforcement can develop "different flavors of cynicism" that can be "lazy mental shortcuts," resulting in more pronounced racial profiling.

But he then warned against using police as scapegoats to avoid coming to grips with much more complex problems affecting minority communities, including a lack of "role models, adequate education, and decent employment," as well as "all sorts of opportunities that most of us take for granted." In his address at Georgetown University, Comey declared:

I worry that this incredibly important and difficult conversation about policing has become focused entirely on the nature and character of law enforcement officers when it should also be about something much harder to discuss.

Citing the song "Everyone's a Little Bit Racist" from the Broadway show "Avenue Q," Comey said that police officers of all races viewed black and white men differently using a mental shortcut that "becomes almost irresistible and maybe even rational by some lights" because black men commit crime at a much higher rate than white men.

Comey said that nearly all police officers had joined the force because they wanted to help others. Speaking in personal terms, he described how most Americans had initially viewed Irish immigrants like his ancestors "as drunks, ruffians, and criminals." He noted that, "Law enforcement's biased view of the Irish lives on in the nickname we still use for the vehicle that transports groups of prisoners. It is, after all, the 'Paddy Wagon.'"

If black men are committing crime out of proportion to their numbers, it is important to consider the reason. According to a report just released by the Marriage and Religion Research Institute (MARRI), by age 17 only 17 percent of black teenagers live with two married parents. Professor Orlando Patterson, a Harvard sociologist who is black, published an article in December in the Chronicle of Higher Education, lamenting that "fearful" sociologists had abandoned "studies of the cultural dimensions of poverty, particularly black poverty," and declared that the discipline had become "largely irrelevant."

Now, Patterson and Ethan Fosse, a Harvard doctoral student, are publishing a new anthology called The Cultural Matrix: Understanding Black Youth. In Patterson's view, fifty years after Daniel Moynihan issued his report about the decline of the black family, "History has been kind to Moynihan." Moynihan was concerned about an out-of-wedlock birth rate in the black community of 25 percent. According to the Centers for Disease Control and Prevention, the equivalent rate for 2013 was 71.5 percent. (The rate for non-Hispanic whites was 29.3 percent.)

The inner-city culture that promotes social dissolution and crime has been written about for many years by respected black observers. In 1899, the scholar W. E. B. Du Bois drew on interviews and census data to produce The Philadelphia Negro: A Social Study. He spent a year living in the neighborhood he wrote about, in the midst of what he described as "an atmosphere of dirt, drunkenness, poverty and crime." He observed, in language much harsher than Moynihan's, the large number of unmarried mothers, many of whom he referred to as "ignorant and loose." He called upon whites to stop employment discrimination, which he called "morally wrong, politically dangerous, industrially wasteful, and socially silly." He told black readers they had a duty to work harder, to behave better, and to stem the tide of "Negro crime," which he called "a menace to civilized people."

In 1999, on the hundredth anniversary of Du Bois's study, Elijah Anderson published a new sociological study of poor black neighborhoods in Philadelphia, Code of the Street, and recorded its informants' characterization of themselves and their neighbors as either "decent" or "street" or, in some cases, a bit of both. In The Cultural Matrix, Orlando Patterson lists "three main social groups" - the middle class, the working class, and "disconnected street people" - that are common in "disadvantaged" African-American neighborhoods. He also lists "four focal cultural configurations" (adapted mainstream, proletarian, street, and hip-hop).

Patterson views the "hip-hop" culture of the inner city as a destructive phenomenon, compares MC Hammer to Nietzsche, contends that hip-hop routinely celebrates "forced abortions," and calls Lil Wayne "irredeemably vulgar" and "all too typical" of the genre. Tommie Shelby, a professor of African and African-American Studies at Harvard, writes in The Cultural Matrix that "suboptimal cultural traits" are the major impediment for many African-Americans seeking to escape poverty. "Some in ghetto communities," he writes, "are believed to devalue traditional co-parenting and to eschew mainstream styles of childbearing."

In his speech on race in 2008, President Obama said that African-Americans needed to take more responsibility for their own communities by "demanding more from our fathers." Fifty years ago, Daniel Moynihan worried that "the Negro community" was in a state of decline with an increasingly matriarchal family structure that led to increasing crime. In the fifteen years after he published his report, the homicide rate doubled, with blacks overrepresented among both perpetrators and victims.

Orlando Patterson, in a recent interview with Slate, said: "I am not in favor of a national conversation on race," and noted that most white people in America had come to accept racial equality. But whether or not such a "national conversation" is useful, we are now in the midst of such an enterprise. FBI director Comey is contributing to that exchange. He asks:

Why are so many black men in jail? Is it because cops, prosecutors, judges and juries are racist because they are turning a blind eye to white robbers and drug dealers? . . . I don't think so. If it were so, that would be easier to address. . . . The percentage of young men not working or not enrolled in school is nearly twice as high for blacks as it is for whites. . . . Young people in those neighborhoods too often inherit a legacy of crime and prison, and with that inheritance they become part of the police officer's life and shape the way that officer, whether white or black, sees the world. Changing that legacy is a challenge so enormous and so complicated that it is, unfortunately, easier to talk only about the cops. And that's not fair. *

Wednesday, 16 December 2015 12:07

Ramblings

Allan C. Brownfeld

What Dennis Hastert's Case Tells Us about Washington's Institutional Corruption

We now know that J. Dennis Hastert, who served for eight years as speaker of the House of Representatives, was paying a former student hundreds of thousands of dollars not to say publicly that Hastert had sexually abused him decades ago. According to The New York Times, this information became public from "two people briefed on the evidence in an F.B.I. investigation."

Federal prosecutors announced the indictment of Hastert late in May on allegations that he made cash withdrawals totaling $1.7 million, to evade detection by banks. The federal authorities also charged him with lying to them about the purpose of the withdrawals.

This is a personal tragedy for Hastert and his family. But it also paints a vivid picture of institutional corruption in Washington, how men and women of modest means are elected to public office and, before long, become very wealthy, using their public positions to do so.

How, we may ask, did a former high school teacher who held elective office from 1981 to 2007 leave Congress with a fortune estimated at $4 million to $17 million? When Hastert entered Congress in 1987, his net worth was reported to be at most $270,000. The record shows that he was the beneficiary of lucrative land deals while in Congress, and since leaving office he has earned more than $2 million a year as a lobbyist - influencing his former colleagues.

Writing in National Review, John Fund recalls that:

Denny Hastert used to visit The Wall Street Journal where I worked when he was speaker. He was a bland, utterly conventional supporter of the status quo; his idea of reform was to squelch anyone who disturbed Congress' usual way of doing business. I saw him become passionate only once, when he defended earmarks - the special projects such as Alaska's "Bridge to Nowhere" - that members dropped at the last minute into conference reports, deliberately giving no time to debate or amend them. Earmarks reached the staggering level of 15,000 in 2005, and their stench helped cost the GOP control of Congress the next year. But Hastert was unbowed. "Who knows best where to put a bridge or a highway or a red light in his district?" I recall him bellowing. I responded that the Illinois Department of Transportation came to mind, and then we agreed to disagree.

The Sunlight Foundation found that Hastert had used a secret trust to join others and invest in farmland near the proposed route of a new road called the Prairie Parkway. He then helped secure a $207 million earmark for the road. The land, approximately 138 acres, was bought for about $2.1 million in 2004 and later sold for almost $5 million, a profit of 140 percent. Local land records and Congressional disclosure forms never identified Hastert as the co-owner of any of the land in the trust. Hastert turned a $1.3 million investment (his portion of the land holdings) into a $1.8 million profit in less than two years.

Once he left Congress, Hastert joined the professional services firm of Dickstein Shapiro, working all sides of various issues and glad-handing his former colleagues in Congress, often for controversial clients. From 2011 to 2014, Lorillard Tobacco paid Dickstein Shapiro nearly $8 million to lobby for the benefit of candy-flavored tobacco and electronic cigarettes, and Hastert was the most prominent member of the lobbying team.

Hastert also pressed lawmakers on climate change for Peabody Energy, the largest private-sector coal company in the world, in 2013 and 2014 - then switched sides this year and pushed for requiring renewable fuel production on behalf of Fuels America. Lawmakers and fellow lobbyists compared Hastert's qualities as a lobbyist to those he displayed as speaker: affable and low-key, but attractive to clients. In the post-earmark world, notes Rep. Tom Cole (R-OK), a senior member of the House Appropriations Committee, Hastert pressed for policy "riders" in appropriations bills and programmatic changes that helped his clients' interests.

"As you'd expect, he was very effective," said Cole, "Number one, he knew the process extremely well, and he knew all the players. When the former speaker calls no member rejects it." Former Rep. Jack Kingston, a Republican who led the Appropriations Committee, says, "Yeah, it's possible, he could amass in a 10-year period a nest egg of $5 to $10 million. I'm not saying it's easy, but it's not that hard."

Sadly, Dennis Hastert's use of public office as a path to wealth is hardly unique. National Review reports that:

In 2004, Sen. Harry Reid (D-NV) made $700,000 off a land deal that was, to say the least, unorthodox. It started in 1998 when he bought a parcel of land with attorney Jay Brown, a close friend whose name has surfaced multiple times in organized crime investigations. Reid transferred his portion of the property to Patrick Lane Industrial Center L.L.C., a holding company Brown controlled. But Reid kept putting the property on his financial disclosures, and when the company sold it in 2004, he profited from the deal - a deal on land he didn't technically own and that nearly tripled in value in three years.

In addition, according to Judicial Watch, Reid, the Democratic leader of the Senate, sponsored at least $47 million in earmarks that directly benefited organizations with close ties to his son Key Reid.

In 2008, then-House Speaker Nancy Pelosi and her husband, Paul, made the first of three purchases of Visa stock as Visa was holding an initial public offering, among the most lucrative ever. The Pelosis were granted early access to the IPO as "special customers" who received their shares at the opening price of $44, and they turned a 50 percent profit in just two days. Starting March 18, the speaker and her husband made three Visa stock buys, totaling between $1 million and $5 million. Peter Schweizer, a scholar at the Hoover Institution, notes that "Mere mortals would have to wait until March 19, when the stock would be publicly traded, to get their shares." He points out that the Pelosis got their stock just two weeks after legislation was introduced in the House that would have allowed merchants to negotiate lower interchange fees with credit card companies. Visa's general counsel described it as a "bad bill." The speaker squelched it and kept further action bottled up for more than two years. During that period, the value of her Visa stock jumped more than 200 percent while the stock market as a whole dropped 15 percent.

Another former House speaker, Newt Gingrich, served as a paid consultant for the drug industry's lobby group, and, according to conservative columnist Timothy Carney:

Gingrich worked hard to persuade Republican congressmen to vote for the Medicare drug subsidy that the industry favored. . . . Newt Gingrich spent the last decade being paid by big business to convince conservatives to support big government policies that would profit his clients.

The role of former members of Congress reaping financial gain by lobbying their former colleagues, as Dennis Hastert did, is increasingly widespread. Former Rep. Billy Tauzin of Louisiana, originally elected as a Democrat but later switching to the Republican Party, left his post as chairman of the powerful House Energy and Commerce Committee to become a lobbyist for the drug industry. In 2009, he reportedly earned $4.48 million as the head of PhRMA, the drug industry lobby - a huge increase from his congressional salary.

The idea of politicians enriching themselves as a result of holding political office is something new. Bill and Hillary Clinton have amassed millions from various special interest groups - and foreign interests - hoping to influence a future president. Bill Clinton reported being paid more than $104 million from 2001 through 2012 just for speeches. As recently as the presidencies of Harry Truman and Dwight Eisenhower, it was considered unthinkable to trade upon having held high office to enrich oneself. Until 1958, former presidents did not even receive a pension. Congress finally awarded Harry Truman and Herbert Hoover pensions and funds for staff. Washington, Jefferson, Adams, Madison, Monroe, and other early leaders lost a great deal of money while serving in office. George Washington, it is reported, had to borrow money to make the trip from Mt. Vernon to New York for his own inauguration. Now, public office has, for many, become a path to riches.

A recent Rasmussen Reports poll finds that 82 percent of the American people now believe that we have a professional political class that is more focused on preserving its power and privilege than on conducting the people's business. Dennis Hastert has become the new face of this phenomenon, along with the Clintons, Nancy Pelosi, Newt Gingrich and a host of others. Whether public dismay with our current politics can be transformed into an effective effort to alter this behavior remains to be seen. Too many in Washington have a vested interest in today's corrupt system as it exists. How to change the incentive structure for those in political life is our real challenge.

Remembering the Real Heroism of Robert E. Lee at Appomattox

The surrender of Confederate Gen. Robert E. Lee to Union Lt. Gen. Ulysses S. Grant 150 years ago effectively ended the Civil War. What few remember today is the real heroism of Robert E. Lee. By surrendering, he was violating the orders given by Jefferson Davis, the elected leader of the Confederacy.

The story of April 1865 is not just one of decisions made, but also of decisions rejected. The importance of Lee's refusal to continue the war as a guerrilla struggle, which Jefferson Davis preferred, and of Grant's choice to be magnanimous at Appomattox, cannot be overstated.

With the fall of Richmond, Jefferson Davis and the Confederate government were on the run. Davis, writes Professor Jay Winik of the University of Maryland in his important book April 1865: The Month That Saved America:

. . . was thinking about such things as a war of extermination . . . a national war that ruins the enemy. In short, guerrilla resistance. . . . The day after Richmond fell, Davis had called on the Confederacy to shift from a conventional war to a dynamic guerrilla war of attrition, designed to wear down the North and force it to conclude that keeping the South in the Union would not be worth the interminable pain and ongoing sacrifice.

Davis declared that:

We have now entered upon a new phase of a struggle the memory of which is to endure for the ages. Relieved from the necessity of guarding cities and particular points, important but not vital to our defense, with an army free to move from point to point and strike in detail detachments and garrisons of the enemy, operating on the interior of our own country, where supplies are more accessible, and where the foe will be far removed from his own base and cut off from all succor in case of reverse, nothing is now needed to render our triumph certain but the exhibition of our own unquenchable resolve. Let us but will it, and we are free.

But Robert E. Lee knew the war was over. Grant was magnanimous in victory and, Winik points out:

. . . was acutely aware that on this day, what had occurred was the surrender of one army to another - not of one government to another. The war was very much on. There were a number of potentially troubling rebel commanders in the field. And there were still some 175,000 other Confederates under arms elsewhere; one-half in scattered garrisons and the rest in three remaining rebel armies. What mattered now was laying the groundwork for persuading Lee's fellow armies to join in his surrender - and also for reunion, the urgent matter of making the nation whole again. Thus, it should be no great surprise that there was a curious restraint in Grant's tepid victory message passed on to Washington.

Appomattox was not preordained. "If anything," notes Winik:

. . . retribution had been the larger and longer precedent. So, if these moments teemed with hope - and they did - it was largely due to two men who rose to the occasion, to Grant's and Lee's respective actions: one general, magnanimous in victory, the other gracious and equally dignified in defeat, the two of them, for their own reasons and in their own ways, fervently interested in beginning the process to bind up the wounds of the last four years. . . . Above all, this surrender defied millenniums of tradition in which rebellions typically ended in yet greater shedding of blood. . . . One need only recall the harsh suppression of the peasants' revolt in Germany in the 16th century, or the ravages of Alva during the Dutch rebellion, or the terrible punishments inflicted on the Irish by Cromwell and then on the Scots after Culloden, or the bloodstained vengeance executed during the Napoleonic restoration, or the horrible retaliation imposed during the futile Chinese rebellion in the mid-19th century.

Lee was not alone in rejecting the idea of guerrilla war. General Joe Johnston, offered generous terms of surrender by Union General William Tecumseh Sherman, cabled the Confederate government for instructions. He was ordered to fight on and to fall back to Georgia with as many of his men as possible. Johnston refused and decided to surrender, later acknowledging that he had directly "disobeyed" his instructions. But Johnston, who wired back to Davis that such a plan of retreat was "impracticable," saw no other way. In his view, it would be "the greatest of crimes for us to attempt to continue the war." To fight further, he declared, would only "spread ruin all over the south." By brazenly violating the chain of command, he helped to save many lives and to heal the country.

In early May, when the Mississippi governor and the former governor of Tennessee rode out and urged General Nathan Bedford Forrest to retreat with his cavalry to continue a guerrilla war, Forrest responded: "Any man who is in favor of further prosecution of this war is a fit subject for a lunatic asylum." The attempt to establish a "separate and independent confederacy had failed," Forrest noted, and they should meet their responsibilities "like men." He added: "Reason dictates and humanity demands that no more blood be shed."

In words that echo the sentiments of Robert E. Lee before him, in places almost word for word, Forrest added:

I have never on the field of battle sent you where I was unwilling to go myself, nor would I advise you to a course which I felt myself unwilling to pursue. You have been good soldiers, you can be good citizens. Obey the laws, preserve your honor, and the government to which you have surrendered can afford to be and will be magnanimous.

"April 1865," writes Professor Winik:

. . . was incontestably one of America's finest hours: for it was not the deranged spirit of an assassin that defined the country at war's end, but the conciliatory spirit of leaders who led as much in peace as in war, warriors and politicians who, by their example, their exhortation, their deeds, overcame their personal rancor, their heartache, and spoke as citizens of not two lands, but one, thereby bringing the country together. True, much hard work remained. But much, too, had already been accomplished.

Had Robert E. Lee blindly followed irrational instructions to keep fighting a guerrilla war indefinitely, the surrender at Appomattox never would have taken place, and our nation's history might have been far different. Fortunately, our American tradition has never embraced the notion of blindly following orders, particularly when they involve illegal or immoral acts. No American could ever escape responsibility for such acts by saying, "I was simply following orders."

The Civil War era poet James Russell Lowell makes this point:

"Taint your eppyletts an' feathers,
Make the thing a grain more right;
"Taint afollerin' your bell-wethers
Will excuse ye in His sight;
Ef you take a sword an' dror it,
An' go stick a feller thru,
Guv'ment aint to answer for it,
God'll send the bill to you.

Without Robert E. Lee's decision to surrender - against his instructions - we would not be celebrating the 150th anniversary of Appomattox. This heroic act has not been widely recognized. It deserves to be.

The Proposal to Remove Alexander Hamilton from the $10 Bill Is an Assault on American History

Secretary of the Treasury Jack Lew announced in mid-June that the Treasury Department's 2020 redesign of the $10 bill will feature a female portrait. While including women on our currency is long overdue, and a welcome step, removing Alexander Hamilton makes no sense. In fact, it is an assault upon American history itself.

Alexander Hamilton is a towering figure. His story is an inspiring one. Born out of wedlock and raised in the West Indies, Hamilton emigrated to America and rose by the sheer force of intellect to shape our entire nation. He died in 1804, before reaching the age of 50. In that short life, his achievements were extraordinary.

Three years before he died, he founded The New York Evening Post, which is still being published today as the New York Post. George Washington promoted him from an artillery captain to a colonel on his staff during the Revolutionary War. In 1785, Hamilton helped found the New York Manumission Society to work for an end to slavery in that state. Emancipation was not achieved in New York until 1827, long after Hamilton's death.

In 1787 and 1788, Hamilton, along with collaborators John Jay and James Madison, produced a series of 85 opinion pieces supporting ratification of the Constitution - what we now know as The Federalist Papers. Writing under the shared pseudonym Publius, Hamilton produced 51 of the 85 essays himself. When we speak of an American political philosophy, the starting point is The Federalist Papers.

That government should be clearly limited and that power is a corrupting force was the essential perception of the framers of the Constitution. The Founding Fathers were not utopians. They understood man's nature and attempted to form a government consistent with, not contrary to, that nature. In Federalist No. 6, Hamilton asked:

Have we not already seen enough of the fallacy and extravagance of those idle theories which have amused us with promises of an exemption from the imperfections, weaknesses, and evils incident to society in every shape? Is it not time to awake from the deceitful dream of a golden age, and to adopt as a practical maxim for the direction of our political conduct that we, as well as the other inhabitants of the globe, are yet remote from the happy empire of perfect wisdom and perfect virtue?

As our first Treasury Secretary under George Washington, Hamilton set the nation on a path of financial stability. He had the new federal government assume the debts of the states along with its own, and he declared that all holders of U.S. debt would be paid in a non-discriminating manner. Many opposed Hamilton's plans, arguing that the assumption of state debts rewarded those states which had been lax in meeting their responsibilities, and that a policy of non-discrimination in paying holders of U.S. debt rewarded speculators. In the end, Hamilton prevailed.

Hamilton also established a central bank - the Bank of the United States. This, again, was highly controversial. Though chartered by the federal government, the bank would operate in part as a private institution, selling stock and making loans. His collaborator on The Federalist Papers, James Madison, thought the bank was illegal, since the Constitution made no mention of a bank. Hamilton responded that a bank was necessary to fulfill a function the Constitution did mention: borrowing money on the credit of the United States. This was, Hamilton argued, an implied power. George Washington and the Congress agreed.

The response to the idea of removing Hamilton from the $10 bill has been uniformly negative from observers of all points of view. Former Federal Reserve Chairman Ben Bernanke declared himself "appalled" by the idea of Hamilton's removal, writing:

Replace Andrew Jackson, a man of many unattractive qualities, and a poor president, on the $20 bill. Given his views on central banking, Jackson would probably be fine with having his image dropped from a Federal Reserve note. Another, less attractive possibility, is to circulate two versions of the $10 bill, one of which continues to feature Hamilton. . . . The importance of his achievement can be judged by the problems that the combination of uncoordinated national fiscal policies and a single currency has caused the Eurozone in recent years.

Ron Chernow, the author of a highly regarded biography, Alexander Hamilton (2004), laments that:

There is something sad and shockingly misguided in the spectacle of Treasury Secretary Jack Lew acting to belittle the significance of the foremost Treasury Secretary in U.S. history. . . . Hamilton was undeniably the most influential person in our history who never attained the presidency. . . . Drawing on a blank slate, Hamilton arose as the visionary architect of the executive branch, forming from scratch the first fiscal, monetary, tax and accounting systems. He assembled the Coast Guard, the customs service, and the Bank of the United States. . . . He took a country bankrupted by Revolutionary War debt and restored American credit.

Chernow declares:

Yes, by all means let us have a debate about the political figures on our currency, and, yes, let us now praise famous women. But why on earth should we start the debate by singling out and punishing Alexander Hamilton, who did so much to invent our country?

Alexander Hamilton believed that genuine freedom could be found only in a society which guaranteed economic freedom. In his Second Treatise of Government, John Locke, the philosopher who most significantly influenced the thinking of Hamilton and the other Founding Fathers, stated that:

The great and chief end . . . of men's uniting into commonwealths and putting themselves under government is the preservation of their property. . . . Every man has a property in his own person. This nobody has any right to but himself. The labor of his body and the work of his hands, we may say, are properly his. Whatsoever, then, he removes out of the state that nature hath provided and left it in, he hath mixed his labor with, and joined to it something that is his own, and thereby makes it his property.

Those who advocate an "equal" distribution of property claim that, in doing so, they are simply applying the philosophy of the Founding Fathers to matters of economic concern. Nothing could be further from the truth.

In Federalist No. 10, James Madison wrote:

The diversity in the faculties of men, from which the rights of property originate, is not less an insuperable obstacle to a uniformity of interests. The protection of these faculties is the first object of government. From the protection of different and unequal faculties of acquiring property, the possession of different degrees and kinds of property immediately results.

Writing in The New York Times, Steven Rattner notes that:

The various women who've been put forward for this pioneering role - including Susan B. Anthony (a second try after her dollar coin flopped twice), Harriet Tubman and Eleanor Roosevelt - are all outstanding individuals worthy of recognition. Just don't push Alexander Hamilton aside to make room.

Writing in The Washington Post, Steven Mufson charges that, "By pushing aside Alexander Hamilton . . . the Obama administration has committed a grave historical injustice."

Jack Lew's assault on U.S. history has found almost no defenders. Removing Alexander Hamilton from the $10 bill is clearly a bad idea. What, we must wonder, was Mr. Lew thinking when he came up with this proposal? *
