Ramblings

Allan C. Brownfeld

Allan C. Brownfeld is the author of five books, the latest of which is The Revolution Lobby (Council for Inter-American Security). He has been a staff aide to a U.S. Vice President, Members of Congress, and the U.S. Senate Internal Security Subcommittee. He is associate editor of The Lincoln Review, and a contributing editor to Human Events, The St. Croix Review, and The Washington Report on Middle East Affairs.

Out-of-Control Executive Authority Is a Growing Threat to Representative Government

The growth of presidential power in recent years represents a serious threat to representative government. The idea of the executive "executing" the laws passed by the elected representatives of the people in the Congress seems to those in power - whether Republicans or Democrats - to be an old-fashioned notion.

When President Obama unilaterally called a halt to deportation proceedings for certain unauthorized immigrants who came to the U.S. as minors, the eligibility requirements roughly tracked the requirements of the Dream Act, which was never passed by Congress.

In an interview with a panel of Latino journalists last fall, the president said:

This notion that somehow I can just change the laws unilaterally is just not true. We live in a democracy. You have to pass bills through the legislature and then I can sign it.

Gene Healy, vice president of the Cato Institute, notes that,

As it happens, Obama's "royal dispensation" for young immigrants is hardly the most terrifying instance of administration unilateralism. In fact, as a policy matter, it's a humane and judicious use of prosecutorial resources. But given the context, it stinks. It looks uncomfortably like implementing parts of a bill that didn't pass and - carried out as it was with great fanfare and an eye to the impending election - the move sits uneasily with the president's constitutional responsibility to "take care that the laws be faithfully executed."

Or consider the president's claim of "executive privilege" in withholding information about the Justice Department's Operation Fast and Furious, which deliberately put assault weapons in the hands of Mexican drug cartels as part of a sting, and then lost track of hundreds of them. A Border Patrol agent was killed in 2010, apparently by one of these guns.

Executive privilege, affirmed by the Supreme Court in U.S. v. Nixon, is historically limited to the president's own discussions. President Obama has now extended it to his attorney general. This contravenes the president's promises of transparency.

Recent legislation has legalized the president's power to detain a person indefinitely on suspicion of affiliation with terrorist organizations or "associated forces" - a broad, vague power that can be abused without real oversight from the courts or the Congress.

At the same time, American citizens can now be targeted for assassination or indefinite detention. Recent laws have also canceled the restraints in the Foreign Intelligence Surveillance Act of 1978 to allow unprecedented violations of our right to privacy through warrantless wiretapping and government mining of our electronic communications.

According to The New York Times, President Obama is personally deciding upon every drone strike in Yemen and Somalia and the riskiest ones in Pakistan, assisted only by his own aides. It is reported that suspects are now being killed in Yemen without anyone knowing their names, using criteria that have not been made public.

Editorially, The Times declares that no president

. . . should be able to unilaterally order the killing of American citizens or foreigners located far from a battlefield - depriving Americans of their due process rights - without the consent of someone outside his political inner circle. How can the world know whether this president or a successor truly pursued all methods short of assassination, or instead - to avoid a political charge of weakness - built up a tough-sounding list of kills?

To permit President Obama - or any president - to execute American citizens without judicial review and outside the theater of war gives him the power of judge, jury, and executioner without any check or balance. This is clearly an abuse of presidential power.

For many years, under both parties, the power of the executive has been growing. In his classic work, The American Presidency, written in 1956, Professor Clinton Rossiter writes that:

The presidency has the same general outlines as that of 1789, but the whole picture is a hundred times magnified. The president is all the things he was intended to be, and he is several other things as well. . . . The presidency today is distinctly more powerful. It cuts deeply into the power of Congress. In fact it has quite reversed the expectations of the framers by becoming itself a vortex into which these powers have been drawn in massive amounts. It cuts deeply into the lives of the people; in fact, it commands authority over their comings and goings that Hamilton himself might tremble to behold. The outstanding feature of American constitutional development is the growth of the power and prestige of the Presidency.

He also makes the converse explicit:

The long decline of Congress has contributed greatly to the rise of the presidency. The framers . . . expected Congress to be the focus of our system of government.

That was 1956. The power of the presidency has steadily expanded since then, no matter which party was in power.

When Republican presidents have expanded the power of the presidency, Republicans in the Congress have acquiesced. When Democratic presidents have expanded the power of the executive, Democrats in the Congress have embraced that expansion. The result is an executive branch increasingly unaccountable to the elected representatives of the people. That is not the system the authors of the Constitution had in mind. We would do well to return to the constitutional philosophy of checks and balances and a division of powers. An all-powerful executive is a threat to freedom and accountability, as the Framers of the Constitution understood very well from their own experience and from the experience of the world.

To Reduce Government - and Debt - We Must Change the Incentive Structure for Politicians

Government spending and government debt have been skyrocketing. Under President George W. Bush, the debt reached unprecedented levels. Under President Barack Obama, it has exploded still further. Whichever party is in power, government gets larger and government debt increases.

Our political system, sadly, rewards big spenders. Every organized special interest group in American society - farmers, teachers, labor unions, manufacturers, Wall Street financiers, realtors, and the rest - wants one or another form of government subsidization for itself.

They all have active political action committees, which promise rewards for those who open the government coffers to them, and penalties for those who do not. The incentive is clearly one-sided. As Democrats used to say in the New Deal days, the way to electoral success is clear: "Spend and spend, tax and tax, elect and elect." Now, Republicans too have learned this lesson. Since neither Republicans nor Democrats are too eager to antagonize voters by raising taxes to pay for their extravagant spending, the budget deficits grow each year.

In May, for example, President Obama reauthorized the Export-Import Bank, raising its lending authority 40 percent, to $140 billion by 2014, one day before the 78-year-old federal bank would have been shut down had he not signed the bill. During the 2008 presidential campaign, Mr. Obama had called the bank "little more than a fund for corporate welfare."

Despite President Obama's frequent criticism of corporate jets, the bill includes $1 billion in subsidies for jet manufacturers, which have experienced a steep decline in demand in recent years. Export-Import Bank supporters in the business community - who speak of "free markets" but campaign vigorously for government subsidies - welcomed the President's support. John Murphy, vice president for international affairs at the U.S. Chamber of Commerce, called the President's action "great news for thousands of American workers and businesses of all sizes." The National Association of Manufacturers - and both Republicans and Democrats in Congress - supported the reauthorization of the Export-Import Bank.

Tim Phillips, president of Americans for Prosperity, described the Bank in these terms:

In (its) nearly 80 years, the official credit export agency of the United States has financed over $450 billion in purchases. Ex-Im allows the federal government to pick winners and losers in the market, and all too often, that leads to back room deals and government cronyism. . . . It is a heinous practice that gives money to a small number of politically connected companies while leaving taxpayers with the risk. . . . The American taxpayer does not exist in order to keep businesses from failing.

Republicans and Democrats are co-conspirators in this enterprise. The incentive structure for both parties is precisely the same. Republicans may talk of the "free market" and argue that Democrats are against it, but both parties raise their funds on Wall Street and in corporate boardrooms, and both parties have supported bailouts of failed businesses and subsidies for others.

Voters say that they are against big government, and oppose deficit spending, but when it comes to their own particular share, they act in a different manner entirely. This is nothing new. Longtime Minnesota Republican Congressman Walter Judd once recalled that a Republican businessman from his district

. . . who normally decried deficit spending berated me for voting against a bill which would have brought several million federal dollars into our city. My answer was, "Where do you think federal funds for Minneapolis come from? People in St. Paul?". . . My years in public life have taught me that politicians and citizens alike invariably claim that government spending should be restrained - except where the restraints cut off federal dollars flowing into their cities, their businesses, or their pocketbooks.

If each group curbed its demands upon government, it would be easy to restore health to our economy. Human nature, however, leads to the unfortunate situation in which, under representative government, people have learned that through political pressure they can vote themselves funds that have, in fact, been earned by the hard work of others.

This point was made 200 years ago by the British historian Alexander Tytler:

A democracy cannot exist as a permanent form of government. It can only exist until the voters discover they can vote themselves largess out of the public treasury. From that moment on, the majority always votes for the candidates promising the most benefits from the public treasury - with the result that democracy collapses over a loose fiscal policy, always to be followed by a dictatorship.

Hopefully, we can avoid fulfilling this prediction. It is an illusion to think that such a thing as "government money" exists. The only money government has is what it first takes from citizens. Many years ago, Senator William Proxmire (D-Wisconsin) pointed out that no one ever petitions members of Congress to "leave us alone"; everyone who comes before Congress, he lamented, wants something. Members of Congress - of both parties - have the same incentive: to give each group what it wants, to ensure its support in the future. The result is that government spending - and government debt - steadily grows.

Unless we find a way to change this incentive structure, it seems unlikely that we will bring government spending - and government deficits - under control. As the presidential campaign gets under way, neither party is addressing this crucial question. Politics as usual, unfortunately, will not help us to resolve the very real problems we face.

Finally, Attention Is Being Focused on Our System of Excessive Public Pensions

In June, the effort by labor unions and others to recall Wisconsin Governor Scott Walker was soundly defeated. Governor Walker had not committed any crime or other indiscretion. He was being recalled only because he had implemented the policies he promised during his campaign.

In February 2011, Walker announced his plan to limit the subjects covered by collective bargaining for public employees, compel them to contribute more to their healthcare and pension plans, stop government from collecting dues automatically on unions' behalf, and require public employee unions to hold annual certification elections. Once in office, he implemented these policies.

Wisconsin's recall policy is questionable and, in many ways, a threat to representative democracy. Officeholders whose policies voters dislike can be removed at the conclusion of their terms; recall elections over mere policy disagreements are another matter. Wisconsin has had 14 elected state officials face recall elections during the past year alone. The state's largest newspaper, the Milwaukee Journal Sentinel, endorsed Governor Walker, arguing that elected officials should not be recalled simply for policy differences. Some 60 percent of voters in exit polls agreed.

Beyond this, the union effort in Wisconsin has focused a much-needed spotlight on the excesses of public pensions. Over the years, politicians have given in to union demands for higher pensions - rather than wage increases - because they knew that such pensions would be paid many years later, under someone else's watch. Now, these bills are coming due - and many states and cities are in no position to pay them.

In New Jersey, Governor Chris Christie is seeking to reform his state's sick-pay policies. Current law allows public workers to accumulate unused sick pay, which they can cash in upon retirement. "They call them 'boat checks,'" Christie says.

Know the reason they call them boat checks? It is the check they use to buy their boat when they retire - literally.

He tells the story of the town of Parsippany, where four police officers retired at one time and were owed a collective $900,000 in unused sick pay. The municipality didn't have the money and had to float a bond in order to make the payments.

Governor Christie wants to end sick-pay accumulation. "If you're sick, take your sick day," he says. "If you don't take your sick day, know what your reward is? You weren't sick - that was the reward." Newsweek notes that,

It was by the force of such arguments that Christie was able to overcome the powerful teachers' union and force educators to help pay for their healthcare costs, and to win broad rollbacks in benefits for the state's huge public workforce.

Shortly after the vote in Wisconsin, ballot measures meant to cut back public-sector retirees' benefits won landslide victories in San Jose and San Diego, California. Warren Buffett calls the costs of public-sector retirees a "time bomb." They are, he believes, the single biggest threat to our fiscal health.

In California, total pension liabilities - the money the state is legally required to pay its public-sector retirees - are 30 times its annual budget deficit. Annual pension benefits rose by 2,000 percent from 1999 to 2009. In Illinois, pension payments are already 15 percent of general revenue and growing. Ohio's pension liabilities are now 35 percent of the state's GDP.

Commentator Fareed Zakaria notes that,

The accounting at the heart of the government pension plans is fraudulent, so much so that it should be illegal. Here's how it works. For a plan to be deemed solvent, employees and the government must finance it with regular monthly contributions as determined by assumptions about investment returns of the plan. The better the investment returns the less the state has to put in. So states everywhere made magical assumptions about investment returns.

David Crane, an economic adviser to former California Governor Arnold Schwarzenegger, points out that state pension funds have assumed that the stock market will grow 40 percent faster in the 21st century than it did in the 20th. While the market grew to 175 times its size over the past 100 years, state governments are now assuming that it will grow to 1,750 times its size over the next 100 years.
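Those two figures are consistent only because returns compound; a rough annualization (our arithmetic, not Crane's) makes the point:

(1 + r)^100 = 175 implies an annual return of about 5.3 percent;

(1 + r)^100 = 1,750 implies an annual return of about 7.7 percent.

An assumed annual return roughly 40 to 45 percent higher, compounded over a century, yields a market ten times as large - which is why a seemingly modest change in the assumed rate of return can hide so much of what taxpayers will eventually owe.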

Inflated retirement benefits are the reason for dramatic cuts in everything else state and local governments spend money on. Last year, California spent $32 billion on employee pay and benefits, up 65 percent over the past 10 years. Over the same period, spending on higher education fell 5 percent. Three-quarters of San Jose's discretionary spending goes to its public safety workers alone. The city has closed libraries, cut back on park services, laid off many civil servants, and asked the rest to take pay cuts. By 2014, San Jose, the 10th largest city in the country, will be serviced by 1,600 public workers, one-third the number it had 25 years ago.

The Pew Center on the States has quantified the problem. In 2008, the states had set aside $2.35 trillion to pay for public employees' retirement benefits while owing $3.35 trillion in promises. A year later, the trillion-dollar gap had grown by 26 percent. This massive, expanding obligation cuts into the provision of government services. Former Los Angeles Mayor Richard Riordan notes that:

A lot of things are going to happen dramatically over the next couple of years and then people will listen. If you close down all the parks and all the libraries, this is political dynamite.

In Wisconsin, as a result of Governor Walker's reforms, the state has balanced its two-year budget without tax increases and local school districts have used their new bargaining power to save money without layoffs or significant increases in class size. While leading Democrats, such as President Obama and former President Clinton, supported the recall effort in Wisconsin, many others, such as the liberal Democratic mayor of San Jose, recognize that it is time to bring the excesses of public sector unions under control. Editorially, The Washington Post declared,

. . . those who voted for Mr. Walker to show approval for his policies, and not just disapproval for the recall itself, had plausible reasons for doing so. . . . Public employee union leaders are pledging to fight . . . new laws in court. . . . They would do better to engage governments in a good-faith effort to restructure and preserve public services for the long term. States and localities face genuine problems, and the unions share responsibility for them.

Black-on-Black Crime: A Subject Which the African-American Community, Finally, Must Confront

In recent days, with the extraordinary publicity surrounding the Trayvon Martin case in Florida and an escalation in overheated racial rhetoric, one would think that the real problems facing black Americans are a result of "white racism."

Needless to say, this overlooks the fact that race relations in America have dramatically improved in recent years and that we are well on our way to achieving a genuinely color-blind society.

Writing in The Washington Post, columnist Richard Cohen points out that most Americans

. . . do not know what a miracle has been pulled off - how a nation that once contained so much bigotry now contains so little. I am not a fool on these matters, I think, and I recognize . . . the residue of bigotry, but still the big picture is that Obama is a black man and is the president of the United States. Mamma, can you believe it?

Cohen provides this assessment:

Some insist that not much has changed. They cite a persistent racism. There are many such examples - not all that many, actually - but they are newsworthy because they are exceptions to the rule, not what we expect.

Recently, Wesley A. Brown, the first African American to graduate from the U.S. Naval Academy, died. He was the sixth black man admitted and the only one to successfully endure the racist hazing that had forced the others to quit. He graduated in 1949. Cohen writes that,

When I read the obituary on Wesley A. Brown, I was shocked once again at the depth and meanness of our racism and just plain dumbstruck by how far we have come. The new field house at the Naval Academy is named for Brown. He called it, "The most beautiful building I've ever seen," but he was wrong. It's not a building. It's a monument.

This is not to say that the black community does not face many problems. These problems, however, are not a result of white racism but of internal forces at work within the black community. One such serious problem is crime.

Each year, roughly 7,000 blacks are murdered; 94 percent of the time, the murderer is another black person. According to the Bureau of Justice Statistics, between 1976 and 2011, there were 279,384 black murder victims. The 94 percent figure suggests that 262,621 were murdered by other blacks.

Though blacks are 13 percent of the national population, they account for more than 50 percent of homicide victims. Nationally, the black homicide victimization rate is six times that of whites, and in some cities, it is 32 times that of whites. Blacks are also disproportionately victimized by violent personal crimes, such as assault and robbery.

Economist Walter Williams points out that,

The magnitude of this tragic mayhem can be viewed in another light. According to a Tuskegee Institute study, between the years 1882 and 1998, 3,446 blacks were lynched at the hands of whites. Black fatalities during the Korean War (3,075), Vietnam War (7,243) and all the wars since 1980 (8,107) come to 18,425, a number that pales in comparison with black loss of life at home. Tragically, young black males have a greater chance of reaching maturity on the battlefields of Iraq and Afghanistan than on the streets of Philadelphia, Chicago, Detroit, Oakland, Newark, and other cities.

Sadly, the question is hardly ever discussed by black leaders. In Williams' view,

A much larger issue is how might we interpret the deafening silence about the day-to-day murder in black communities compared with the national uproar over the killing of Trayvon Martin. Such a response by politicians, civil rights organizations, and the mainstream news media could easily be interpreted as blacks killing other blacks is of little concern, but it's unacceptable for a white to kill a black person.

Several black leaders have started to discuss black-on-black crime. When President Obama commented about the Martin case, William Fair, president of the Urban League of Greater Miami, said that "the outrage should be about us killing each other, about black-on-black crime." He asked rhetorically:

Wouldn't you think to have 41 people shot (in Chicago) between Friday morning and Monday morning would be much more newsworthy and deserve much more outrage?

Former NAACP leader Pastor C. L. Bryant said that the rallies organized by Al Sharpton and Jesse Jackson suggest there is an epidemic of "white men killing young black men," adding, "The epidemic is truly black-on-black crime. The greatest danger to the lives of young black men are young black men."

Beyond this, argues Walter Williams,

Not only is there silence about black-on-black crime, there's silence about black racist attacks on whites - for example, the recent attacks on two Virginian-Pilot newspaper reporters set upon and beaten by a mob of young blacks (in Norfolk, Virginia). The story wasn't even covered by their own newspaper. In March, a black mob assaulted and knocked unconscious, disrobed and robbed a white tourist in downtown Baltimore. Black mobs have roamed the streets of Denver, Chicago, Philadelphia, New York, Cleveland, Washington, Los Angeles and other cities, making unprovoked attacks on whites and running off with their belongings.

This is not a new story. This writer, with Lincoln Review editor J. A. Parker, was the author of a 1974 book entitled What The Negro Can Do About Crime (Arlington House), which included an extensive discussion of black-on-black crime and of the manner in which black leaders refused to confront it.

On Page 54 is the following passage:

Criticizing those Negroes who have not spoken out against crime, Roy Wilkins, executive director of the NAACP, declared that, "except for a few voices, Negro citizens have given consent to robbery, muggings, assaults, and murder by their silence. They have been intimidated by a curious twisting of the 'us blacks together' philosophy that holds that complaining of black criminals is somehow 'betraying the race.'" This is nonsense. One can be proud of being black without embracing every black mugger, rapist, and auto thief.

For those in the black community genuinely concerned about the future prospects of its young men and women, focusing upon the black-on-black crime wave that now engulfs our inner cities, and has broken out into attacks upon the community at large, is an important place to begin. Thus far, however, this has largely been ignored in favor of repeated attacks upon "white racism," which, by any standard, has receded dramatically. Such racial demagoguery ill serves the very community in whose name it is launched. It is time for a radically different direction.

Finally, Taking a Long-Needed Second Look at Vocational Education

We are now in an era when we are told that a proper goal for society is for "everyone" to go to college. At the same time, there is a serious mismatch between the jobs now available and the number of individuals qualified to fill them. Manufacturing companies, for example, cannot find enough high-tech machinists, and they are subsidizing tuition at local community colleges in a desperate effort to fill vacancies.

The Cato Institute's Andrew Coulson reports that we spend - in real terms - almost twice as much per student in a public school as we did in 1970. Despite this, academic achievement has remained flat or worsened. Vocational training, a long and important path to gainful employment, has been pushed aside.

Vocational education once played an important part in our schools, designed for those who were not suited for, or had no interest in, higher education. About forty years ago, it began to fall out of fashion, in part because it became a civil rights issue. As Time recently noted:

Vocational education was seen as a form of segregation, a convenient dumping ground for minority kids in Northern cities.

Former New York City schools chancellor Joel Klein says that,

This was a real problem. And the vocational education programs were pretty awful. They weren't training the kids for specific jobs or for certified skills. It really was a waste of time and money.

In an important article, "Learning That Works," Time writer Joe Klein declares that,

Unfortunately, the education establishment's response to the voc-ed problem only made things worse. Over time, it morphed into the theology that every child should go to college (a four-year liberal arts college at that) and therefore every child should be required to pursue a college-prep course in high school. The results have been awful. High school dropout rates continue to be a national embarrassment, and most high school graduates are not prepared for the world of work. The unemployment rate for recent high school graduates who are not in school is a stratospheric 33 percent. The results for even those who go on to higher education are brutal: four-year colleges graduate only about 40 percent of the students who start them, and two-year community colleges graduate less than that, about 25 percent.

Diane Ravitch, a professor of education at New York University, says that,

College for everyone has become a matter of political correctness. But according to the Bureau of Labor Statistics, less than a quarter of new job openings will require a bachelor of arts degree. We're not training our students for the jobs that actually exist.

At the same time, the U.S. is beginning to run out of welders, glaziers, and auto mechanics - jobs that actually keep things running and cannot be outsourced.

In Arizona and a few other states, things are beginning to change. Vocational education there is now called career and technical education (CTE) and attracts about 27 percent of students. CTE students have been found to be more likely than their non-CTE peers to score well on the state's aptitude tests, to graduate from high school, and to go on to higher education.

"It's not rocket science," says Sally Downey, superintendent of the East Valley Institute of Technology in Mesa, Arizona, 98.5 percent of whose students graduate from high school. "It's just finding something they like and teaching it to them with rigor."

At the auto shop in East Valley, there are 40 late-model cars and the latest in diagnostic equipment, donated by Phoenix auto dealers, who are in need of trained technicians. "If you can master the computer science and electronic components," Downey says, "you can make over $100,000 a year as an auto mechanic."

Carolyn Warner, a former Arizona schools chancellor, says tech track students

. . . are more focused, so they're more likely to graduate from two- and four-year colleges. Those who graduate from high school with a certificate of technical expertise in a field like auto repair or welding are certainly more likely to find jobs.

At East Valley, there are 38 programs, with more coming. There are firefighter, police, and EMT programs; a state-of-the-art kitchen for culinary services training; and instruction in welding (which can pay $40 per hour), aeronautics, radio broadcasting, marketing, and massage therapy. Almost all of these courses lead to professional certificates. In addition to earning high school diplomas, many of the students are trained by employers for needed technical specialties.

An interesting example of business participation in technical and vocational education can be seen in the case of a new public school in Brooklyn, New York, called P-Tech, or Pathways in Technology Early College High School. Started last September, it is a partnership of the New York City Department of Education, the New York City College of Technology, the City University of New York, and IBM, whose head of corporate social responsibility, Stanley Litow, used to be the city's deputy schools chancellor.

The goal is to create a science and tech-heavy curriculum to prepare students - some of whom would be the first in their families to graduate from high school - for entry and mid-level jobs at top tech-oriented companies. Each student gets an IBM mentor and there is also a core curriculum focused on English, math, science, and technology.

P-Tech students will graduate with not only a high school diploma but an associate's degree as well. This is important, since 63 percent of American jobs will require postsecondary training by 2018. The U.S. economy will create more than 14 million new jobs over the next decade, but only for people with at least a community college degree. These jobs - positions like dental hygienist, medical laboratory technician, aircraft mechanic and entry level software engineer - will allow millions entry into the middle class. Many of them will require serious technology skills.

Harvard Business School professor Rosabeth Moss Kanter argues that as much as a third of the increase in unemployment in recent years can be attributed to a mismatch between skills and jobs. The gap is greatest in positions that require more than a high school diploma but less than a bachelor's degree. Companies feel that schools are simply not turning out graduates with the skills they need. That was an impetus for IBM's role with New York's P-Tech.

Chicago Mayor Rahm Emanuel is setting up five new STEM schools - the acronym stands for science, tech, engineering and math - in partnership with IBM, Microsoft, Verizon, Cisco and other companies.

Vocational education deserves a serious second look by school systems across the country. Training young men and women for jobs that actually exist in our economy - something our current educational system is not doing very well - is certainly worth doing, both for the sake of the young people involved and for the health of our larger society and economy. *

Ramblings

Allan C. Brownfeld

Demagoguery and the Trayvon Martin Case: Denying Dramatic Progress in Race Relations

The facts in the case of the killing of Trayvon Martin in Sanford, Florida, remain unclear. As the trial proceeds, the facts should be revealed.

By any standard, the shooting of an unarmed 17-year-old is a tragedy. Did Martin attack the alleged shooter, George Zimmerman, who claims he was defending himself? Was Zimmerman animated by racial animus? All of this, as we move forward, will, hopefully, become known.

What we have seen, however, is a rush to judgment, particularly by those who seem to have a vested interest of their own in painting a bleak picture of race relations in the United States.

In an interview with the Los Angeles Times, the Rev. Jesse Jackson explained that with the election of President Obama:

. . . there was this feeling that we were kind of beyond racism. . . . That's not true. This victory has triggered tremendous backlash. Blacks are under attack.

The New Black Panther Party (NBPP), involved in voter intimidation in Philadelphia but never prosecuted, has offered a $10,000 bounty for the capture of George Zimmerman. The Orlando Sentinel asked NBPP spokesman Mikhail Muhammad whether the call for a bounty was incitement. The response: "An eye for an eye, a tooth for a tooth." In an interview with CNN's Anderson Cooper, Muhammad said that black people were not obliged to obey "the white man's law."

On Capitol Hill, Rep. Bobby Rush (D-IL), who is black, was ousted from the House floor for violating the chamber's dress code after attempting to deliver a statement while wearing a gray hoodie with the hood pulled over his head. Rush contended that the hoodie Trayvon Martin was wearing symbolized the "racial profiling" that led to his death. "Racial profiling has got to stop," Rush said. "Just because someone wears a hoodie does not make them a hoodlum."

Writing in The Washington Post, Reniqua Allen of the New America Foundation argues that the election of a black president has made it more difficult to talk about race in America. In her view:

The Obama presidency is "post-racial" only in the sense that it gives us an excuse not to grapple with race anymore . . . I have encountered many people who seem to believe . . . that Obama's win is proof that America has reached the mountaintop. What more is there to say about race, they ask me, after this country so proudly and overwhelmingly elected a black president? They cite success stories as disparate as Oprah Winfrey, Jay-Z, and former Time Warner chief Dick Parsons. . . . Even the most well-intentioned white people, who fundamentally understand the challenges of race in America, often can't understand why race, as a subject to wrestle with, can never be "over."

There is no doubt that racial problems have not disappeared overnight with the election of a black president. Still the evidence that race relations have been steadily improving is clear, and it is not helpful for various black spokesmen - from Jesse Jackson to Al Sharpton to Spike Lee - to use any incident, such as in Sanford, Florida, to publicly proclaim that nothing - or very little - has, in fact, changed.

Things in the Trayvon Martin case have clearly gotten out of hand. Marcus Davonne Higgins, a Los Angeles man, sent a tweet to several celebrities including what he thought was the street address of alleged shooter George Zimmerman. Film director Spike Lee didn't check but re-tweeted the incorrect address to the 250,000 people who follow him on Twitter.

Columnist Gregory Kane, who is black, reports that:

Suddenly the Sanford home of Elaine McClain, 70, and her 72-year-old husband, David McClain, started receiving hate mail and threats. George M. Zimmerman does not and has never lived at the address that Lee and others published on Twitter. But William George Zimmerman, Elaine McClain's son from a previous marriage, lived there at one time. Higgins had tweeted the wrong address. . . . Lee, an African-American who's always trying to prove how black he is, and how down with the brothers he is, probably couldn't resist what must have come naturally to him. He decided to retweet the address, the better to make a statement about the Martin shooting. The McClains had to move from their home to a hotel. . . .

Despite the demagoguery we have seen in the wake of this incident in Florida, there are abundant signs that America is really moving in the direction of becoming a color-blind society. According to a study of U.S. Census data released late in January, residential segregation has been dramatically curtailed. The Manhattan Institute study of census results from thousands of neighborhoods found that the nation's cities are more racially integrated than at any time since 1910. It found that all-white enclaves "are effectively extinct."

"There is now much more black-white neighborhood integration than 40 years ago," said Professor Reynolds Farley of the University of Michigan's Population Studies Center. "Those of us who worked on segregation in the 1960s never anticipated such decline."

At the same time, interracial marriages in the U.S. have climbed to 4.8 million - a record 1 in 12. A Pew Research Center study, released in February, details an America where interracial unions and mixed-race children are challenging typical notions of race.

"The rise in interracial marriage indicates that race relations have improved over the past quarter-century," said Daniel Lichter, a sociology professor at Cornell University.

Mixed-race children have blurred America's color line. They often interact with others on either side of the racial divide, and frequently serve as brokers between friends and family members of different racial backgrounds.

Black Americans are optimistic about the future. A 2011 Washington Post-Kaiser-Harvard survey found that, in the midst of our economic downturn, 60 percent of blacks believed their children's standard of living would be better than their own, while only 36 percent of whites held this view. On the eve of President Obama's inauguration, 69 percent of black respondents told CNN pollsters that Martin Luther King's vision had been "fulfilled."

From 2002 to 2007, the number of black-owned businesses grew by 60.5 percent to 1.9 million, more than triple the national rate of 18 percent, according to the Census Bureau. Black Americans hold positions of responsibility in every aspect of our society - from President, to Governor, to Attorney General, to Supreme Court justice. We have, in recent years, had two black Secretaries of State. There is no position in our society to which black Americans cannot aspire.

Whatever facts finally emerge in the Trayvon Martin case, we must reject those racial demagogues who seek every opportunity to deny racial progress and to promote themselves as leaders of an embattled and isolated minority. Our society has made dramatic progress. Certainly, there is more progress to be made in the future. But no incident - such as the one in Florida - should be used as a means to deny that progress and paint a dark - and untrue - picture of a society that has moved dramatically to overcome the racial barriers of the past. Those who engage in such tactics are not friends of the black community but may, in the end, be doing it as much harm as genuine racists.

Serious Thought Should Be Given to Unintended Consequences of War with Iran

At the present time, many are speaking of launching a pre-emptive strike against Iran. Washington Post columnist Dana Milbank writes that:

It's beginning to feel a lot like 2003 in the capital. Nine years ago . . . there was a similar feeling of inevitability - that despite President George W. Bush's frequent insistence that "war is my last choice," war in Iraq was coming.

In the case of Iraq, one of the key reasons given for launching our attack was that Saddam Hussein was in possession of weapons of mass destruction. This, of course, turned out not to be the case. Once again, some are urging an attack upon Iran because of that country's nuclear program. Our experience in Iraq should give us pause. Not only did we go to war with a country that had not attacked us, had no weapons of mass destruction, and no connection with the terrorists who were responsible for 9/11, but we did serious damage to our economy and lost untold numbers of American lives in an effort which now seems difficult to explain and understand.

Everyone agrees that Iran does not currently have nuclear weapons. U.S. intelligence believes that Iran is several years from achieving a nuclear capacity, and it remains unclear whether the Iranian leaders have made a decision to move in that direction. Those who have studied this region are most critical of those who call for war.

Gary Sick, the White House's principal aide on Iran during that country's Islamic revolution, does not envision a situation in which Iran decides to break out and build a bomb, unless it is first attacked. Actually crossing the nuclear threshold would be "inviting an attack," Sick said, and would not be in Tehran's interest. But if Iran doesn't build a bomb, its demonstrated capability to do so, Sick explains, will make it a member of a small club of nations, such as Japan, Brazil, and Sweden, that can acquire a nuclear weapon if they break away from the Non-Proliferation Treaty. In either case, Iran's goal is to assert its position as a major player in the region, one that the world should take seriously and with which it should consult.

The International Atomic Energy Agency (IAEA) has documented that Iran is putting all the pieces in place to have the option to develop nuclear weapons at some point. If Supreme Leader Ayatollah Ali Khamenei decides to produce a bomb, Iran is believed to have the technical capability to produce a testable nuclear device in a year or so and a missile-capable device in several years. But as Director of National Intelligence James Clapper told the Senate Armed Services Committee on February 16, it does not appear that Khamenei has made this decision.

Colin Kahl, an associate professor at Georgetown University's School of Foreign Service, who was deputy assistant secretary of defense for the Middle East from 2009 to 2011, argues:

Khamenei is unlikely to dash for a bomb in the near future because IAEA inspectors would probably detect Iranian efforts to divert low-enriched uranium and enrich it to weapons-grade level at declared facilities. Such brazen acts would trigger a draconian international response. Until Iran can pursue such efforts more quickly or in secret - which could be years from now - Khamenei is unlikely to act.

A full-page ad in The Washington Post was headlined: "Mr. President: Say No to War of Choice With Iran." The signatories included General Joseph Hoar (USMC, Ret.), Brigadier General John H. Johns (USA, Ret.), Major General Paul Eaton (USA, Ret.), Tom Fingar, former Deputy Director of National Intelligence for Analysis, and Paul Pillar, former National Intelligence Officer for the Near East and South Asia. They declare:

The U.S. military is the most formidable military force on earth. But not every challenge has a military solution. Unless we, or an ally, are attacked, arms should be the option of last resort. Our brave servicemen and women expect you to exhaust all diplomatic and peaceful options before you send them into harm's way. Preventing a nuclear-armed Iran is rightfully your priority and your red line. Fortunately, diplomacy has not been exhausted and peaceful solutions are still possible. Military action is not only unnecessary, it is dangerous - for the United States and for Israel. We urge you to resist the pressure for a war of choice with Iran.

In Israel, opinion is sharply divided over the question of pre-emptive war. Many respected Israelis believe that a pre-emptive attack against Iran would be a serious mistake for Israel and would do it serious long-term harm. Political scientist Yehezkel Dror, an Israel Prize winner and founding president of the Jewish People Policy Institute, says that with regard to Iran, Israel needs to rely on "ultimate deterrence": an attack on Tehran's nuclear facilities would be counterproductive, and the real danger Israel faces is a gradual wearing away of its staying power.

"Assuming you attack, then what?" he says.

In five years, they will recuperate with absolute determination to revenge. The idea that an Israeli attack will make Iran into a peace-loving country is not on my horizon. I don't know anything like this in history. I know the opposite from history. . . . Iran has a very low probability of being a suicidal state. They have a long culture, a long history, and they are much more involved in the Shia-Sunni conflict than the Israeli side issue. I think no one has any doubt that if Israel's future existence is in danger it will use mass killing weapons.

The Jerusalem Report notes that:

Three men once most closely involved in Israeli efforts to stop Iran - former Mossad chiefs Meir Dagan (2002-2011), Efraim Halevy (1998-2002) and Danny Yatom (1996-1998) - all see a lone Israeli military attack as a last resort, to be avoided if at all possible.

Speaking at the Hebrew University last May, Dagan derided an Israeli strike as "a stupid idea" that might not achieve its goals. It could lead to a long war and, worse, could give Iranian leaders justification to build a nuclear weapon. In Dagan's view, precipitate Israeli action could break up the current anti-Iranian consensus, leading to less pressure on Iran, not more.

According to The Jerusalem Report:

Dagan holds that there is still time; last year he estimated that Iran would not have a nuclear weapon before 2015. . . . Efraim Halevy says Israel should recognize that it is a regional power and act like one. He says the country is too strong to be destroyed and the Israeli people should not have existential fears about Iran or anything else. . . . Israel's strategy should be to work with its allies to convince the Iranian regime to change course without force coming into play. In Halevy's view, this is achievable since the Iranian regime is dedicated primarily to its own survival and will likely back down if it feels threatened by even more crippling sanctions. Israel should be using its international connections to ratchet up pressure on the Iranian regime, while preparing a military option if, and only if, all else fails.

It seems clear that if Iran were ever to develop and use a nuclear weapon, there would be massive retaliation, endangering the country's entire population. During the Cold War, the Soviet Union was armed to the teeth with nuclear weapons and never used them, precisely because of the fear of retaliation that became known as Mutual Assured Destruction. Iran would have to be suicidal to even think of using such a weapon.

The available evidence is that Iran is not suicidal. General Martin Dempsey recently explained that he viewed Iran as a "rational actor." Although some protested this characterization, Time's Fareed Zakaria points out:

Dempsey was making a good point. A rational actor is not necessarily a reasonable actor or one who has the same goals or values that you or I do. A rational actor is someone who is concerned about his survival.
Compared with radical revolutionary regimes like Mao's China - which spoke of sacrificing half of China's population in a nuclear war to promote global Communism - the Iranian regime has been rational and calculating in its actions.

In an essay in the Washington Monthly, former senior U.S. intelligence official Paul Pillar writes:

More than three decades of history demonstrate that the Islamic Republic's rulers, like most rulers elsewhere, are overwhelmingly concerned with preserving their regime and their power - in this life, not some future one.

For the most powerful country in the world even to contemplate a pre-emptive war against a country that has not attacked us, that has no nuclear weapons, and that may not even have decided to pursue them would itself be an irrational act. It is time for a serious debate - and serious discussion of the consequences of any such action. And if the time comes when the U.S. decides that an attack on Iran does make sense, it should take the form of a declaration of war by the U.S. Congress, as called for in our Constitution.

Efforts Grow to Restore Private Property Rights

Respect for private property is an essential element of a free society. In his Discourse on Political Economy, Rousseau writes that:

It should be remembered that the foundation of the social contract is property; and its first condition, that every one should be maintained in the peaceful possession of what belongs to him.

In The Prince, Machiavelli notes that, "When neither their property nor their liberty is touched, the majority of men live content."

In our own society, there have been increasing efforts to limit the rights of property owners. Fortunately, efforts are now growing to reverse such trends.

In March, the Supreme Court ruled that an Idaho couple facing ruinous fines for attempting to build a home on private property that the federal government considered protected wetlands may challenge an order from the Environmental Protection Agency.

This case was considered the most significant property rights case on the court's docket this year, with the potential to change the balance of power between landowners and the EPA in disputes over land use, development, and the enforcement of environmental regulations.

Critics called the EPA action an example of overreach, as the property in question was a small vacant lot in the middle of an established residential subdivision. The government argued that allowing EPA compliance orders to be challenged in court could severely delay actions needed to prevent imminent ecological disasters.

Justice Antonin Scalia, writing for a unanimous court, said that Michael and Chantell Sackett are entitled to appeal the EPA order, rejecting the agency's argument that allowing landowners timely challenges to its decisions would undermine its ability to protect sensitive wetlands.

In the decision, Justice Scalia wrote:

The law's presumption of judicial review is a repudiation of the principle that efficiency of regulation conquers all. And there is no reason to think that the Clean Water Act was uniquely designed to enable the strong-arming of regulated parties into "voluntary compliance" without the opportunity for judicial review - even judicial review of the question whether the regulated party is within the EPA's jurisdiction.

The EPA issues nearly 3,000 administrative compliance orders a year that call on suspected violators of environmental laws to stop what they're doing and repair the harm they have caused. Business groups, homebuilders, road builders and agricultural interests all came out in opposition to the EPA in the case.

Mr. Sackett said that the Supreme Court ruling affirmed his belief that "the EPA is not a law unto itself." He said that, "The EPA used bullying and threats of terrifying fines, and has made our life hell for the past five years."

Senator John Barrasso (R-Wyoming) said that:

This decision delivers a devastating blow to the Obama administration's "War on Western Jobs." This victory by one Western couple against a massive Washington bureaucracy will inspire others to challenge the administration's regulatory overreach.

The case stemmed from the Sacketts' purchase of a 0.63-acre lot for $23,000 near Priest Lake, Idaho, in 2005. They had begun to lay gravel on the land, located in a residential neighborhood, when they were hit by an EPA compliance order informing them that the property had been designated a wetland under the Clean Water Act. Justice Scalia noted that the property bore little resemblance to any popular concept of a wetland, protected or not.

The Pacific Legal Foundation in Sacramento, which represented the Sacketts, called it

. . . a precedent-setting victory for the rights of all property owners. . . . The Supreme Court's ruling makes it clear that EPA bureaucrats are answerable to the law in the courts like the rest of us.

There are also efforts under way to stop the abuses of the policy of eminent domain. In February, the Virginia General Assembly gave its first approval to a constitutional amendment restoring the sanctity of private property. The measure was made necessary by the 2005 Supreme Court decision in Kelo v. New London, which gave towns and cities free rein to grab land not for public uses, but for the use and benefit of well-connected developers.

Over the years, the Supreme Court has expanded the scope of government takings by redefining "public use." The Washington Times declares that:

Originally, the term was applied to such things as parks, roads or rail lines - all of which were open for use by the entire community. The high court elasticized the concept to include land intended for a public "purpose" such as eliminating blight or other catch-all categories related to public safety. The Kelo court went further to rule that economic growth, and the tax revenue that would accrue from it, was sufficient to justify a land grab.

Virginia Attorney General Kenneth Cuccinelli and Governor Robert McDonnell have been leading the fight to reform the state's property laws over the objections of the developer interests that, until now, had succeeded in blocking the amendment from consideration. The measure clarifies that eminent domain may be used only for purposes that are truly public. Land could not be transferred by the government to private entities to generate more tax dollars.

Christina Walsh, of the Institute for Justice, which argued the Kelo case before the Supreme Court, states that:

The power of eminent domain is supposed to be for "public use" so government can build things like roads and schools. . . . But starting with the wildly unsuccessful urban renewal efforts of the 1940s and 1950s, "public use" has been stretched to mean anything that could possibly benefit the public. . . . It has been demonstrated time and again that eminent domain is routinely used to wipe out black, Hispanic, and poorer communities, with less political capital and influence in favor of developers' grand plans.

Groups across the political spectrum have recognized the need to limit this abuse of power. The diverse coalition has included the League of United Latin American Citizens, the National Federation of Independent Business and the Farm Bureau. There is now a bipartisan bill, H.R. 1433, making its way through the House that would strip a city of federal economic development funding for two years if the city takes private property to give to someone else for private use.

In all these cases - the Supreme Court decision concerning the EPA, the proposed constitutional amendment in Virginia, and the legislation now being considered in the House - we see a commitment to restore private property rights, an essential ingredient of a genuinely free society. All of these efforts should be encouraged and supported.

Understanding the Reasons for America's Growing Class Divide

The question of economic inequality has become an important part of our national conversation. Recently, the Congressional Budget Office supplied hard data on the widening economic gap. Among Western countries, America stands out as the place where economic and social status is most likely to be inherited.

What is less often discussed are the reasons for this disparity. A key element has been the dramatic changes that have taken place in recent years in family life.

In 1965, the respected liberal intellectual, and later Senator from New York, Daniel Patrick Moynihan, wrote a controversial report on the perilous state of the black family, pointing out that 24 percent of births among blacks and 3 percent among whites were out of wedlock. In retrospect, we can see that the decline in the American family was only beginning. Today, out-of-wedlock births account for 73 percent of births among blacks, 53 percent among Latinos, and 29 percent among whites.

Recently, a front-page article in The New York Times reported that more than half of births to mothers under age 30 now occur out of wedlock. Many are casting aside the notion that children should be raised in a stable two-parent family.

The economic class divide that is attracting increasing attention cannot be considered outside of an understanding of the lifestyle choices of those involved. Almost 70 percent of births to high school dropouts and 51 percent to high school graduates are out of wedlock. Among those with some college experience, the figure is 34 percent, and for those with a college degree, just 8 percent.

The breakdown of the family has a significant impact upon children. Children in two-parent families, University of Virginia sociologist Bradford Wilcox shows, are more likely to "graduate from high school, finish college, become gainfully employed, and enjoy a stable family life themselves."

In the new book, Coming Apart: The State of White America, 1960-2010, Charles Murray, of the American Enterprise Institute, focusing on white Americans so that his findings cannot be attributed to the particular problems faced by minority groups, sees a significant decline in what he considers America's founding virtues - industriousness, honesty, marriage, and religiosity - over the last 50 years.

That decline, he illustrates, has not been uniform among different segments of the white population. Among the top 20 percent in income and education, he finds that rates of marriage and church attendance, after falling marginally in the 1970s, have plateaued at a high level since then. And these people have been working longer hours than ever before.

In contrast, among the bottom 30 percent, those indicators started falling in the 1970s, and have been plunging ever since. Among this group, he reports, one-third of men age 30 to 49 are not making a living, one-fifth of women are single mothers raising children, and nearly 40 percent have no involvement in a secular or religious organization. The result is that children being raised in such settings have the odds stacked against them.

Discussing Murray's book, columnist Michael Barone declares that:

These findings turn some conventional political wisdom on its head. They tend to contradict the liberals who blame increasing income disparity on free-market economics. In fact it is driven in large part by personal behavior and choices. They also undermine the conservatives who say that a liberation-minded upper class has been undermining traditional values to which more downscale Americans are striving to adhere. Murray's complaint against upscale liberals is not that they are libertines but that they fail to preach what they practice.

Society does not, of course, move only in a single direction. Some indicators of social dysfunction have improved dramatically, even as traditional families continue to lose ground. There has, for example, been a dramatic decline in teenage pregnancies among all racial groups since 1990. There has also been a 60 percent decline in violent crime since the mid-1990s.

Still, something is clearly happening to the traditional working-class family. Part of it, of course, is a reduction in the work opportunities available to less-educated men as many unskilled jobs move abroad to cheaper labor markets, such as China. Entry-level wages of male high school graduates, adjusted for inflation, have fallen, and benefits have eroded as well: most such workers in the private sector once had health benefits, but, by 2009, the figure was down to 29 percent.

In 1996, sociologist William Julius Wilson published When Work Disappears: The New World of the Urban Poor, in which he argued that much of the social disruption among African-Americans popularly attributed to collapsing values was actually caused by a lack of blue-collar jobs in urban areas.

As with all complex social problems, there are many causes. Charles Murray makes an important point about the importance of marriage and family in fostering economic security and well-being, something which cannot be ignored in confronting the question of economic inequality.

Writing in Time, Rich Lowry, editor of National Review, notes that:

No one wants to be preachy about marriage when everyone knows its inevitable frustrations. . . . At the very least, though, we should provide the facts about the importance of marriage as a matter of child welfare and economic aspiration. As a society, we have launched highly effective public-education campaigns on much less momentous issues, from smoking to recycling. It's not hard to think of a spokeswoman. Michelle Obama is the daughter in a traditional two-parent family and the mother in another one that even her husband's critics admire. If she took up marriage as a cause, she could ultimately have a much more meaningful impact on the lives of children than she will ever have urging them to do jumping jacks. For now, the decline of marriage is our most ignored national crisis. As it continues to slide away, our country will become less just and less mobile.

Students Are Not Learning What They Need to Compete in Today's Economy

There is growing evidence that our colleges and universities are not teaching students what they need to compete for jobs in our high-tech international economy.

A 2010 study published by the Association of American Colleges and Universities found that 87 percent of employers believe that higher-education institutions have to raise student achievement if the U.S. is to be competitive in the global market. Sixty-three percent say that recent college graduates do not have the skills they need to succeed. And, according to a separate survey, more than a quarter of employers say entry-level writing skills are deficient.

A recent book, Academically Adrift: Limited Learning on College Campuses, by Richard Arum of New York University and Josipa Roksa of the University of Virginia, points out that gains in critical thinking, complex reasoning, and writing skills are either

. . . exceedingly small or nonexistent for a larger proportion of students. It has been found that 36 percent of students experience no significant improvement in learning (as measured by the Collegiate Learning Assessment) over four years of higher education.

Most universities do not require the courses considered core education subjects - math, science, foreign languages at the intermediate level, U.S. government or history, composition, literature, and economics.

The American Council of Trustees and Alumni (ACTA) has rated schools according to how many of the core subjects are required. A review of more than 1,000 colleges and universities found that 29 percent of schools require two or fewer subjects. Only 5 percent require economics. Less than 20 percent require U.S. government or history.

ACTA President Anne Neal declares:

How can one think critically about anything if one does not have a foundation of skills and knowledge? It's like suggesting that our future leaders only need to go to Wikipedia to determine the direction of our country.

Eight years ago, leaders at the University of Texas set out to measure something few in higher education had thought to do - how much their students learn before graduation. The answer that emerged was: not very much. The conclusion is based on results from a 90-minute essay test given to freshmen and seniors that aims to gauge gains in critical thinking and communication skills. Both the University of Texas and several hundred other public universities have joined the growing accountability movement in higher education in an effort to quantify collegiate learning on a large scale.

Last year, University of Texas freshmen scored an average 1261 on the assessment, which is graded on a scale similar to that of the SAT. Seniors averaged 1303. Both groups scored well, but seniors fared little better than freshmen. "The seniors have spent four years there, and the scores have not gone up that much," says New York University's Richard Arum.

Needless to say, it is not only our colleges that seem not to be properly preparing our students. Our high schools have fallen dramatically behind in teaching algebra, geometry, and trigonometry. This means, writes economist Walter Williams, that:

There are certain relatively high-paying careers that are probably off-limits for life. These include careers in architecture, chemistry, computer programming, engineering, medicine and certain technical fields. For example, one might meet all of the physical requirements to be a fighter pilot, but he's grounded if he doesn't have enough math to understand physics, aerodynamics and navigation. Mathematical ability provides the disciplined structure that helps people to think, speak, and write more clearly.

Drs. Eric Hanushek and Paul Peterson, senior fellows at the Hoover Institution, compared the performance of our young people with that of their counterparts in other nations in their Newsweek article, "Why Can't American Students Compete?" last year. The latest international tests administered by the Organization for Economic Cooperation and Development found that only 32 percent of U.S. students ranked proficient in math - coming in between Portugal and Italy, but far behind South Korea, Finland, Canada, and the Netherlands. Seventy-five percent of Shanghai students tested proficient. In the U.S., only 7 percent could perform at an advanced level in mathematics.

In a 2009 article in The New York Times, "Do We Need Foreign Technology Workers?," Dr. Vivek Wadhwa of Duke University said

. . . 49 percent of all U.S. science and engineering workers with doctorates are immigrants, as were 67 percent of the additions to the U.S. science and engineering work force between 1995 and 2006. And roughly 60 percent of engineering Ph.D. students and 40 percent of master's students are foreign nationals.

Recently, President Obama proposed making kids stay in school until they are 18. This would not do much to address the nation's educational woes, say education specialists. "It's not the slam bang that it looks like," said Russ Whitehurst, director of the Brown Center on Education Policy at the Brookings Institution. "It's not like you raise the age to 18 and they're going to go ahead and graduate - they're just going to stay in school."

There is much talk about the need for "everyone" to go to college - and very little discussion about what is actually being taught in our colleges. Professor Richard Vedder of Ohio University argues that:

The number going to college exceeds the number capable of mastering higher levels of intellectual inquiry. This leads colleges to alter their mission, watering down the intellectual content of what they do.

Simply put, colleges dumb down courses so that the students they admit can pass them.

Professor Walter Williams notes that:

Much of American education is a shambles. Part of a solution is for colleges to stop admitting students who are unprepared for real college work. That would help reveal the shoddy education provided at the primary and secondary school levels. But college administrators are more interested in larger numbers of students because they translate to more money.

Beyond this, the nation's security is also at risk if schools do not improve, warns a report by a panel led by former Secretary of State Condoleezza Rice and Joel I. Klein, a former chancellor of New York City's school system.

"The dominant power of the twenty-first century will depend on human capital," the report said. "The failure to produce that capital will undermine American security."

The report said that the State Department and intelligence agencies face critical shortages in the number of foreign-language speakers, and that fields like science, defense, and aerospace face a shortage of skilled workers that is likely to worsen as baby boomers retire.

According to the panel, 75 percent of young adults do not qualify to serve in the military because they are physically unfit, have criminal records, or have inadequate levels of education. It said 30 percent of high school graduates do not do well enough on an aptitude test to serve.

In our global, high-tech economy, we cannot afford to continue the educational system we have. It is high time that we turned our attention to making the necessary changes and reforms that would keep America competitive in the twenty-first century. *

Saturday, 05 December 2015 05:04

Ramblings

Allan C. Brownfeld

Narrow Political Partisanship Obscures the Fact that Institutional Corruption Distorts Our Political Life

There can be little doubt that government spending is out of hand, and that Washington's role in our society has dramatically expanded in recent years. The American people are dismayed about the manner in which our political life has deteriorated. The party out of power, whichever one it may be, seems to want the party in power to fail - so that it can take its place. The long-term best interests of the country are obscured.

Many tend to think of our problems in narrow partisan terms. Some argue, for example, that Democrats favor big government and deficit spending, while Republicans favor balanced budgets and limited government. Our choices in elections would be clear-cut if this were, in fact, the case.

More realistically, we see that whichever party is in power tends to expand government power. Deficits reached all-time highs under President George W. Bush - and have now reached even higher levels - dramatically higher - under President Obama. The dilemma we face is far more complex than partisan political spokesmen permit themselves to admit.

An important new book, Throw Them All Out: How Politicians and Their Friends Get Rich Off Insider Stock Tips, Land Deals, and Cronyism That Would Send the Rest of Us to Prison (Houghton Mifflin Harcourt), by Peter Schweizer, explores the world in which our politicians - both Democrats and Republicans - live.

Three years ago, then-House Speaker Nancy Pelosi and her husband, Paul, made the first of three purchases of Visa stock - Visa was holding an initial public offering, among the most lucrative ever. The Pelosis were granted early access to the IPO as "special customers" who received their shares at the opening price, $44. They turned a 50 percent profit in just two days.

Starting on March 18, the speaker and her husband made the first of three Visa stock buys, totaling between $1 million and $5 million. "Mere mortals would have to wait until March 19, when the stock would be publicly traded, to get their shares," writes Schweizer, a scholar at the Hoover Institution. He points out that the Pelosis got their stock just two weeks after legislation was introduced in the House that would have allowed merchants to negotiate lower interchange fees with credit card companies. Visa's general counsel described it as a "bad bill." The speaker squelched it and kept further action bottled up for more than two years. During that period, the value of her Visa stock jumped more than 200 percent while the stock market as a whole dropped 15 percent.

"Isn't crony capitalism beautiful?" asks Schweitzer. The book shows members of Congress enriching themselves through earmarks and unpunished insider trading, politically connected companies being given billions of dollars in stimulus funds, and public money intended to help the environment, plus many varieties of kickbacks and favors.

Sadly, most of these actions fall within the letter, if not the spirit, of the law and ethics rules governing Congress.

While Senator John F. Kerry (D-MA) was working on healthcare in 2009, he and his wife began buying stock in Teva Pharmaceuticals. The Kerrys purchased nearly $750,000 in November alone. As the bill got closer to passing, the stock value soared. Pharmaceutical companies supported these legislative efforts because they would increase the demand for prescription drugs. When President Obama's healthcare bill became law, the Kerrys reaped tens of thousands of dollars in capital gains while holding onto more than $1 million in Teva shares.

Republicans join their Democratic colleagues in these and other enterprises. House Majority Leader Eric Cantor (R-VA) relentlessly attacks runaway government spending. To Cantor, an $8 billion high-speed rail line connecting Las Vegas to Disneyland is wasteful "pork-barrel spending." Rep. Cantor set up the "YouCut" website to demonstrate how easy it is to slash government spending. Yet Cantor has been pressing the Transportation Department to spend nearly $3 billion in stimulus money on a high-speed rail project in his home state of Virginia. Newsweek found about five dozen of the most fiscally conservative Republicans - including Texas Governor Rick Perry and Rep. Ron Paul (R-TX) - trying to gain access to the very government spending they publicly oppose. According to Newsweek:

The stack of spending-request letters between these GOP members and federal agencies stands more than a foot tall, and disheartens some of the activists who sent Republicans to Washington the last election.

Judson Phillips, founder of the Tea Party Nation, says:

It's pretty disturbing. We sent many of these people there, and really, I wish some of our folks would get up and say, you know what, we have to cut the budget, and the budget is never going to get cut if all 535 members of Congress have their hands out all the time.

Former vice presidential candidate Sarah Palin, writing in The Wall Street Journal, declares that:

The corruption isn't confined to one political party or just a few bad apples. It's an endemic problem encompassing leadership on both sides of the aisle. It's an entire system of public servants feathering their own nests.

Now, Republican presidential candidate Newt Gingrich denounces big government. Previously, he enriched himself at its trough. Conservative columnist Timothy P. Carney notes that:

When Newt Gingrich says he never lobbied, he's not telling the truth. When he was a paid consultant for the drug industry's lobby group, Gingrich worked hard to persuade Republican congressmen to vote for the Medicare drug subsidy that the industry favored. To deny Gingrich was a lobbyist requires an Obama-like parsing over who is and who isn't a lobbyist. . . . Newt Gingrich spent the last decade being paid by big businesses to convince conservatives to support the big government policies that would profit his clients.

The fact - which partisans on both sides like to deny - is that both parties are responsible for the sad state of our political life - and our economic decline. Money-making opportunities for members of Congress are widespread. Peter Schweizer details the most lucrative methods: accepting sweetheart deals of IPO stock from companies seeking to influence legislation, practicing insider trading with nonpublic government information, earmarking projects that benefit personal real-estate holdings, and even subtly extorting campaign donations through the threat of legislation unfavorable to an industry. The list is a long one.

Congress has been able to exempt itself from the laws it applies to everyone else. That includes laws that protect whistleblowers - nothing prevents members of Congress from retaliating against staff members who expose corruption - as well as the Freedom of Information Act. Some say that it is easier to get classified documents from the CIA than from a congressional office.

To correct any problem, it is essential first to understand it properly. The problems in our political life are institutional, and thinking that a simple change of parties will correct them is to misunderstand reality. Those who seek limited government, balanced budgets, and a respect for the Constitution must understand that both parties are responsible for the current state of affairs. With an appreciation of the real challenge before us, perhaps real solutions will be explored and debated. These, however, do not appear to be on today's political agenda.

Is It Really Racist to Insist that Voters Identify Themselves at the Polls?

Late in December, the Justice Department blocked a new South Carolina law that would require voters to present photo identification, saying the law would disproportionately suppress turnout among eligible minority voters.

The move was the first time since 1994 that the department has exercised its powers under the Voting Rights Act to block a voter identification law. It followed a speech by Attorney General Eric Holder that signaled an aggressive stance in reviewing a wave of new state voting restrictions enacted in the name of fighting fraud.

In a letter to the South Carolina government, Thomas E. Perez, the assistant attorney general for civil rights, said that allowing the new requirement to go into effect would create "significant racial disparities."

Richard L. Hasen, an election law specialist at the University of California at Irvine, predicts that South Carolina will go to court, which could set up a "momentous" decision in the Supreme Court on whether a part of the Voting Rights Act that prevents states like South Carolina from changing their voting rules without federal permission is unconstitutional.

Governor Nikki Haley criticized the decision, accusing the Obama administration of "bullying" the state. She declared: "It is outrageous, and we plan to look at every possible option to get this terrible, clearly political decision overturned so we can protect the integrity of our electoral process and our 10th amendment rights."

Under the Voting Rights Act, an election rule or practice that disproportionately affects minority voters is illegal - even if there is no sign of discriminatory intent. South Carolina is one of several states that, because of a history of discriminatory practices, must prove that a measure would not disproportionately discourage minority voting.

In 2011, eight states - Arkansas, Kansas, Mississippi, Rhode Island, South Carolina, Tennessee, Texas and Wisconsin - passed variations of a rule requiring photo identification for voters. It is unclear if the four states not subject to the Voting Rights Act requirement - Wisconsin, Kansas, Rhode Island, and Tennessee - will face challenges to their laws. These laws have proven popular. In November, Mississippi voters easily approved an initiative requiring a government-issued photo ID at the polls.

Artur Davis, who served in Congress from 2003 to 2011, and was an active member of the Congressional Black Caucus, once vigorously opposed voter ID laws. Now, he has changed his mind. In a commentary in the Montgomery Advertiser, Davis says that Alabama "did the right thing" in passing a voter ID law and admits, "I wish I had gotten it right when I was in political office."

As a congressman, he says, he "took the path of least resistance," opposing voter ID laws without any evidence to justify his position. He simply

. . . lapsed into the rhetoric of various partisans and activists who contend that requiring photo identification to vote is a suppression tactic aimed at thwarting black voter participation.

Today, Davis recognizes that the "most aggressive" voter suppression in the black community "is the wholesale manufacture of ballots at the polls" in some predominantly black districts.

Hans A. von Spakovsky, senior legal fellow at the Heritage Foundation and a former member of the Federal Election Commission, wrote a case study about voter-fraud prosecutions in one such district, Greene County, Alabama, which is 80 percent black. He writes that,

Incumbent black county officials had stolen elections there for years, perpetrating widespread, systematic voter fraud. The Democratic incumbents were challenged by black Democratic reformers in 1994 who wanted to clean up local government. Voter fraud ran rampant that year. Ultimately, the U.S. Department of Justice won 11 convictions of Greene County miscreants who had cast hundreds of fraudulent votes.

Von Spakovsky argues that,

There was no question that (fraudulent) tactics changed the election in Greene County in 1994. But the worst thing from the standpoint of the reformers who had complained to the FBI was the reaction of the NAACP and the Southern Christian Leadership Conference (SCLC). The reformers thought those civil rights organizations would be eager to help those whose elections had been stolen through fraud. Instead both organizations attacked the FBI and federal prosecutors, claiming that the voter-fraud investigation was simply an attempt to suppress black voters and keep them from the polls.

One of the black reformers, John Kennard, a local member of the NAACP, wrote a letter to then-NAACP chairman Julian Bond charging the group with "defending people who knowingly and willingly participated in an organized . . . effort to steal the 1994 election from other black candidates." Mr. Bond replied simply that "sinister forces" behind the prosecution were "part and parcel of an ongoing attempt to stifle black voting strength." The NAACP Legal Defense Fund even defended those later found guilty of fraud.

The rhetoric used by the NAACP at that time, states von Spakovsky, "is exactly the same kind that is being used today by . . . the NAACP and others who oppose voter ID laws. . . . Mr. Davis was disappointed to see Bill Clinton . . . compare voter ID to Jim Crow."

In Davis's view, voter ID is "unlikely to impede a single good-faith voter - and that only gives voting the same elements of security as writing a check at the store, or maintaining a library card. The case for voter ID is a good one, and it ought to make politics a little cleaner and the process of conducting elections much fairer."

Photo IDs are required to drive a car, cash a check, collect government assistance, and fly on a plane - among other things. No one suggests that the need for photo ID in such transactions is "racist." To ask voters to properly identify themselves seems to be simply common sense.

Robert Knight, a senior fellow for the American Civil Rights Union, notes that, "Article I, Section 4 of the U.S. Constitution leaves voting procedures largely to the states. The Voting Rights Act requires stricter scrutiny of some states, but the case for voter suppression has yet to be made."

What is motivating the Obama administration to embark upon a crusade against voters identifying who they are before casting their ballots is less than clear. If it thinks it is somehow fighting "racism," it is clearly on the wrong track.

In an Increasingly Post-Racial Society, the Realization Is Growing that Not All Black Americans Think Alike

For many years there has been an effort to read black Americans who dare to think for themselves out of the black community. To disagree with liberal politics or affirmative action is to be, in some way, rejecting one's blackness.

One of the vocal enforcers of this policy of thought control is Professor Randall Kennedy of Harvard, author of books such as Sellout: The Politics of Racial Betrayal. In Kennedy's view, there should be an expulsion option in the black community for blacks who adopt conservative views. Clarence Thomas, he argues, should turn in his black card. There should be boundaries, he declares, or else the notion of a black community bound by shared struggle disappears.

Fortunately for all of us, this point of view is now in retreat. When Professor Cornel West painted President Obama as cowardly and out of touch with black culture, he was sharply criticized by Professor Melissa Harris-Perry of Tulane. Writing in The Nation, she declared:

I vigorously object to the oft-repeated sentiment that African-Americans should avoid public disagreements and settle matters internally to present a united front. . . . Citizenship in a democratic system rests on the ability to freely and openly choose, criticize, and depose one's leaders. This must obtain whether those leaders are elected or self-appointed. It cannot be contingent on whether the critiques are accurate or false, empirical or ideological, well or poorly made. Citizenship is voice. . . . That African-Americans strenuously disagree among ourselves about goals and strategies is an ancient historical truth.

The media attention given to the criticism of President Obama by Professor West, states Harris-Perry, "can be understood only by the repeated refusal by mainstream media and broader American political culture to adequately grasp the heterogeneity of black thought."

An important new book, Who's Afraid of Post-Blackness? has just appeared. Its author, Toure, is a correspondent for MSNBC, a contributing editor at Rolling Stone, and the author of three previous books. The central point of the book is that there is no single way to be black. Justice Clarence Thomas, in his view, is no less black than Jay-Z. One of his goals, Toure writes, is "to attack and destroy the idea that there is a correct or legitimate way of doing blackness." Post-blackness, he declares, has no patience with "self-appointed identity cops" and their "cultural bullying."

What this means, according to the 105 prominent black Americans interviewed for the book, is a liberating pursuit of individuality. Black artists, like other professionals, now feel free to pursue any interest they like and are no longer burdened with the requirement to represent "the race."

Reviewing Toure's book for The New York Times, Professor Orlando Patterson of Harvard notes that:

. . . this is one of the most acutely observed accounts of what it is like to be young, black, and middle-class in contemporary America. Toure inventively draws on a range of evidence - autobiography, music, art, interviews, comedy and popular social analysis - for a performance carried through with unsparing honesty, in a distinctive voice that is often humorous, occasionally wary and defensive, but always intensely engaging.

Toure repeats a line from Harvard University's Henry Louis Gates, Jr. - "If there are 40 million black Americans, then there are 40 million ways to be black" - and adds:

I'm telling the self-appointed identity cops, who want to say, "This person isn't black enough," to put down their swords. Fear of post-blackness just inhibits our potential. Stop the bullying, and stop telling people they don't walk right, talk right, think right or like the right things. It's silly and ridiculous and pernicious.

When he was a student at Emory University, Toure made friends with the white students in his dormitory. Then he read The Autobiography of Malcolm X, switched his major to African-American studies, started a black-nationalist newspaper and moved into the Black Student Association's private house.

It was in this all-black house, he says, after a party, in a room full of black people, that he was "loudly and angrily told by a linebacker-sized brother: 'Shut up, Toure! You ain't black!'" This episode led to something of an epiphany, he says. "Who gave him the right to determine what is and is not blackness for me? Who made him the judge of blackness?"

An interesting phenomenon of the emerging 2012 presidential election is the success of Herman Cain among Republican candidates. Ron Christie, a former special assistant to President George W. Bush and a fellow at the Institute of Politics at Harvard's Kennedy School of Government, notes that:

Cain's candidacy is the ultimate extension of the Obama presidency. A contender for the highest office in the land can be taken seriously regardless of race. We are heading into a 2012 election cycle in which Republican and tea party conservatives appear eager to support a candidate who just happens to be black, based on his convictions and ideas.

Several members of the Congressional Black Caucus have called the tea party movement and its backers racist. In August, Rep. Andre Carson (D-Ind.) told an audience at a CBC event in Miami that "some of them in Congress right now of this tea party movement would love to see you and me . . . hanging on a tree." He likened to "Jim Crow" the efforts of the tea party and its supporters in Congress to limit the size of the federal government.

Mr. Christie, who is black, declares that:

There will always be a fringe element in this country that is unable to accept individuals based on the color of their skin. But to me, continuing to paint the tea party as racist - even as Cain is surging - is simply more race baiting by dissatisfied Democrats.

In an interview with CNN, Herman Cain said he thinks at least a third of black voters would be inclined to support his candidacy because they are "open-minded." He declared:

This whole notion that all black Americans are necessarily going to stay and vote for Obama, that's simply not true. More and more black Americans are thinking for themselves, and that's a good thing.

Sadly, for many years, freedom of speech and debate, hailed in the nation at large as an essential element of a thriving democratic society, has been discouraged in the black community in the name of "unity." As Julius Lester, a one-time black radical and later a member of the faculty of the University of Massachusetts, said almost twenty years ago:

For two decades, an honest exchange of ideas in black America has been discouraged in the name of something called unity. Public disagreements have been perceived as providing ammunition to "the Enemy," that amorphous white "they" that works with a relentlessly evil intent against blacks. . . . The suppression of dissent and differences in the name of unity evolved into a form of social fascism, especially on college and university campuses. In some instances, black students were harassed and ostracized for having white friends. . . . Thinking black took precedence over thinking intelligently. . . .

Stifling free speech in the name of "unity," Lester shows, is something quite new in black American history. He notes that:

In the first part of the 19th century, Negro national conventions were held where black leaders debated and disagreed bitterly with each other over slavery and freedom, abolitionism and separatism. Frederick Douglass, the first national leader, and Martin Delany, the first black separatist, were political adversaries and friends. Dissent and disagreement have been the hallmark of black history.

The first black to win a seat in the U.S. House of Representatives in the 20th century, and the first to be elected from a Northern state, was Oscar De Priest of Illinois, a Republican. He believed in limited government, hard work, and the free market.

Finally, the realization is growing that not all black Americans think alike - nor do white, Hispanic, or Asian Americans. This understanding is long overdue. *

Saturday, 05 December 2015 04:55

Ramblings

Allan C. Brownfeld

Eighty-four Percent of Americans Disapprove of Congress: Their Contempt Is Justified

A Washington Post-ABC News poll shows a new high - 84 percent of Americans - disapproving of the job Congress is doing, with almost two-thirds saying they "disapprove strongly." Just 13 percent of Americans approve of how things are going. It has been nearly four years since even 30 percent expressed approval of Congress.

Editorially, The Washington Examiner notes that,

Nobody can remember the last time the public approval rating of Congress was so low. That's because it's never been as low as it is now. . . . It's not hard to see why: the American people are fed up with the bipartisan corruption, endless partisan bickering, and lack of concrete action to address the nation's most pressing problems, especially out-of-control spending and the exploding national debt. . . . Both parties have presided over congressional majorities as Congress sank in public esteem during the past decade.

One reason for public dismay is the manner in which members of Congress often support while in office the interests they then go to work for once out of office. Of equal concern is the manner in which members of Congress vote for subsidies to the groups that have contributed to their political campaigns. This is true of members of Congress from both parties.

For example, soon after he retired last year as one of the leading liberals in Congress, former Rep. William D. Delahunt (D-MA) started his own lobbying firm with an office on the 16th floor of a Boston skyscraper. One of his first clients was a small coastal town that has agreed to pay him $15,000 a month for help in developing a wind energy project.

The New York Times reports that,

Amid the revolving door of congressmen-turned-lobbyists, there is nothing particularly remarkable about Mr. Delahunt's transition, except for one thing. While in Congress, he personally earmarked $1.7 million for the same energy project. So today, his firm, the Delahunt Group, stands to collect $90,000 or more for six months of work from the town of Hull, on Massachusetts Bay, with 80 percent of it coming from the pot of money he created through a pair of Energy Department grants in his final term in office.

Beyond the town of Hull, Delahunt's clients include at least three others who received millions of dollars in federal aid with his direct assistance. Barney Keller, communications director for the Club for Growth, a conservative group that tracks earmarks, says:

I cannot recall such an obvious example of a member of Congress allocating money that went directly into his own pocket. It speaks to why members of Congress shouldn't be using earmarks.

While this case may be somewhat extreme, it is repeatedly duplicated in one form or another by members of Congress. Consider former Senator Rick Santorum of Pennsylvania. A review of Santorum's many earmarks suggests that the federal money he helped direct to Pennsylvania paid off in the form of campaign cash. In just one piece of legislation, the defense appropriations bill for the 2006 fiscal year, Santorum helped secure $124 million in federal financing for 54 earmarks, according to Taxpayers for Common Sense, a budget watchdog group. In that year's election cycle, Santorum's Senate campaign committee and its "leadership PAC" took in more than $200,000 in contributions from people associated with the companies that benefited or their lobbyists. In all, Taxpayers for Common Sense estimated, Santorum helped secure more than $1 billion in earmarks during his Senate career.

Or consider former House Speaker Newt Gingrich, who speaks about being a "Reagan conservative" who supports "limited government," yet received $1.6 million from Freddie Mac over an eight-year period and gave the government-backed mortgage giant assistance in resisting reformers in Congress. Mr. Gingrich denies that he was a "lobbyist," as do some other former members of Congress. The Lobbying Disclosure Act of 1995 has three tests:

(1) Do you make more than $3,000 over three months from lobbying?
(2) Have you had more than one lobbying contact?
(3) Have you spent more than 20 percent of your time lobbying for a single client over three months?

Only a person who meets all three tests must register as a lobbyist. Thus, a former member of Congress who makes many lobbying contacts and earns $1 million a year lobbying, but has no single client who takes up more than 20 percent of his time, would not be considered a lobbyist.

Clearly, it is time to change this rule. A task force of the American Bar Association recommended last year that the 20 percent rule be eliminated, which would require far more people to register as lobbyists, and subject them to ethics and disclosure requirements. The Center for Responsive Politics found that more than 3,000 lobbyists simply "de-registered" after Congress imposed new reporting requirements for lobbyists in 2007.

With regard to Gingrich, Washington Times columnist Don Lambro writes:

Mr. Gingrich . . . is the quintessential Washington insider, peddling influence in government. . . . He denied he was lobbying, insisting that he was hired to be a historian, when he was selling his services to one of the richest bidders in government. He was being paid well out of Freddie Mac's coffers while it was sowing the seeds of a housing scandal that resulted in an economic meltdown that has hurt millions of Americans and cost taxpayers billions of dollars. In other words, as a paid insider, he was part of the problem, not part of the solution.

Cutting the size of government, reducing our debt, and balancing the budget are embraced rhetorically by candidates for public office. Once elected, however, many become part of the system they campaigned against. The incentive structure once in office is to raise money to stay in office, and the way to do this is to vote subsidies to those groups being called upon to contribute. Both parties are engaged in this behavior, and candidates of both parties are rewarded so that special interests will have a friend in office no matter who is elected.

Sadly, the actions of Congress - and the lobbying enterprises of former members of Congress - are legal. This, of course, is because it is Congress itself that writes the laws. There was a time when members of Congress, when they retired or were defeated, returned home. Some still do. Many others, however, remain in Washington, getting rich trying to influence their former colleagues.

This enterprise, of course, is only part of why Congress is viewed in such negative terms by 84 percent of Americans. Narrow partisanship and a greater concern for politics than for the country's well-being is another. All of this is on naked display in today's Washington. The public contempt has been well earned. Whether that public dismay with our current politics can be transformed into an effective effort to alter this behavior remains to be seen. Too many in Washington have a vested interest in today's corrupt system. How to change the incentive structure for those in political life is our real challenge.

We Must Recognize a New Threat to Freedom in the Name of "National Security"

On December 31, 2011, President Obama signed the National Defense Authorization Act, which was supported by both Republicans and Democrats in the Congress. This legislation allows for the indefinite detention of American citizens within the United States - without charging them with a crime.

Under this law, those suspected of involvement with terrorism are to be held by the military. The president has the authority to detain citizens indefinitely. While Senator Carl Levin (D-MI) said that the bill followed existing law, "whatever the law is," the Senate specifically rejected an amendment that would have exempted citizens, and the administration has opposed efforts to challenge such authority in federal court. The administration claims the right to strip citizens of legal protections based on its sole discretion.

This legislation was passed by the Senate 93 to 7. "The only comparable example was Reconstruction in the South," says constitutional law scholar Bruce Fein.

That was 150 years ago. This is the greatest expansion of the militarization of law enforcement in this country since.

The opposition to this legislation assembled an unlikely coalition of liberal Democrats, the American Civil Liberties Union, constitutional conservatives, libertarians, and three Republican senators - Rand Paul (KY), Mark Kirk (IL), and Mike Lee (UT).

The law, argued Senator Paul:

. . . would arm the military with the authority to detain indefinitely - without due process or trial - suspected al-Qaeda sympathizers, including American citizens apprehended on American soil. I want to repeat that. We are talking about people who are merely suspected of a crime. And we are talking about American citizens. If these provisions pass, we could see American citizens being sent to Guantanamo Bay.

Senator Mark Udall (D-CO), who proposed a failed amendment to strip the language from the bill, said that these provisions would "authorize the military to exercise unprecedented power on U.S. soil."

Writing in The American Conservative, Kelley Beaucar Vlahos notes that:

Already the federal government has broad authority to decide whether terror suspects are detained and held by federal law enforcement agencies and tried in regular courts or carried off by the military under the Military Commissions Act. This new legislation would allow the military to take control over the detention of suspects first - which means no Miranda rights and potentially no trial even on U.S. soil, putting the front lines of the War on Terror squarely on Main Street.

Bruce Fein argues that the ambiguity of words like "associated groups" or "substantially supports" gives the military wide discretion over who is considered a terrorist. "It's a totally arbitrary weapon that can be used to silence people."

Rep. Justin Amash (R-MI), one of the leading critics of the bill in the House of Representatives, issued a fact-checking memo outlining how the language can be abused:

For example, a person makes a one-time donation to a non-violent humanitarian group. Years later, the group commits hostile acts against an ally of the U.S. Under the Senate's NDAA, if the President determines the group was "associated" with terrorists, the President is authorized to detain the donor indefinitely, and without charge or trial.

James Madison warned that, "The means of defense against foreign danger historically have become instruments of tyranny at home."

Senator Paul states that:

The discussion now to suspend certain rights to due process is especially worrisome, given that we are engaged in a war that appears to have no end. Rights given up now cannot be expected to be returned. So we do well to contemplate the diminishment of due process, knowing that the rights we lose now may never be restored. . . . This legislation would arm the military with the authority to detain indefinitely - without due process or trial - suspected al-Qaeda sympathizers, including American citizens apprehended on American soil. . . . There is one thing and one thing only protecting innocent Americans from being detained at will by the hands of a too-powerful state: our Constitution and the checks it puts on government power. Should we err and remove some of the most important checks on state power in the name of fighting terrorism, well, then the terrorists will have won.

In his dissent in Hamdi v. Rumsfeld, Justice Antonin Scalia declared:

Where the government accuses a citizen of waging war against it, our constitutional tradition has been to prosecute him in federal court for treason or some other crime. . . . The very core of liberty secured by our Anglo-Saxon system of separated powers has been freedom from indefinite imprisonment at the will of the executive.

Jonathan Turley, professor of law at George Washington University, points out that:

In a signing statement with the defense authorization bill, Obama said he does not intend to use the latest power to indefinitely imprison citizens. Yet, he still accepted the power as a sort of regretful autocrat. An authoritarian nation is defined not just by the use of authoritarian powers, but by the ability to use them. If a president can take away your freedom or your life on his own authority, all rights become little more than a discretionary grant subject to executive will.

James Madison, Turley recalls,

. . . famously warned that we needed a system that did not depend on the good intentions or motivations of our rulers: "if men were angels, no government would be necessary." Since 9/11, we have created the very government the framers feared: a government with sweeping and largely unchecked powers resting on the hope that they will be used wisely. The indefinite-detention provision in the defense authorization bill seemed to many civil libertarians like a betrayal by Obama. While the president had promised to veto the law over that provision, Senator Levin, a sponsor of the bill, disclosed on the Senate floor that it was in fact the White House that asked for the removal of an exception for citizens from indefinite detention.

Historically, those who seek to expand government power and diminish freedom always have a variety of good reasons to set forth for their purposes. In the case of Olmstead v. United States (1928), Justice Louis Brandeis warned that:

Experience should teach us to be most on our guard to protect liberty when the government's purposes are beneficent. Men born to freedom are naturally alert to repel invasion of their liberty by evil-minded rulers. The greatest dangers to liberty lurk in the insidious encroachment of men of zeal, well meaning but without understanding.

In recent years, in the name of ecology, racial equality, public health, and a variety of other "beneficent" purposes, the power of government has grown and the freedom of the individual has diminished, just as Justice Brandeis feared it would. But it has also diminished in the name of national security, something many conservatives, usually alert to the growth of government power, tend to support - or to acquiesce in. This is a serious mistake, as we now face the new threat of indefinite detention of American citizens. Freedom cannot be preserved by taking it away.

The Arab Spring: Understanding the Promise and Peril of Revolution in the Middle East

Developments in the Middle East remain chaotic. In the wake of the Arab Spring we have seen the overthrow of autocratic regimes in Tunisia and Egypt, a virtual civil war in Syria, and challenges to such governments as those in Bahrain and Yemen. The brutal Libyan dictator Muammar Ghaddafi has been overthrown. What comes next in this volatile region is difficult to know.

In an important new book, The Invisible Arab, Marwan Bishara, senior political analyst for Al Jazeera's English-language service, the editor of its flagship show "Empire," and a former lecturer at the American University of Paris, provides a thoughtful analysis of how Arabs broke their own psychological barrier of fear to kindle one of the first significant revolutionary transformations of the 21st century.

Bishara describes how the historic takeover of Tunisia's November 7 Square, Egypt's Tahrir Square, and Bahrain's Pearl Square, among others, was the culmination of a long social and political struggle: countless sit-ins, strikes, and demonstrations by people who risked and suffered intimidation, torture, and imprisonment. It was aided by the dramatic rise of satellite television networks, including Al Jazeera, which bypass attempts by governments to censor news and information.

"Like most revolutions," he writes,

. . . this one was a long time coming. . . . They were the culmination of a long social and political struggle - countless sit-ins, strikes, pickets, and demonstrations. . . . The story begins with the young Arabs whose networking and organizations brought the people out into the streets. The youth, who make up 60 percent of all Arabs, have been looked upon as a "demographic bomb," an "economic burden," or a "reservoir for extremism." However, unlike previous generations, this group heralded change.

For decades, Bishara argues, these Arab citizens and their social and political movements

. . . have been either unfairly demonized or totally ignored by the West . . . who saw the region through the prism of Israel, oil, terrorists, or radical Islamism. But today's Arabs are presenting a stark contrast to the distortion . . . heaped upon them. Characterized as unreceptive to democracy and freedom, they are now giving the world a lesson in both.

The more difficult part of this revolutionary journey, he notes, will come as

. . . the Arabs, sooner rather than later, discover that democracy and freedom come with greater responsibility. Defeating dictators is a prerequisite for progress, but does not guarantee it, especially in the absence of functional state institutions, democratic traditions, and modern infrastructure. The prevalence of poverty, inequality, and rising regional and international competition present huge challenges.

The origins of what he calls "the miserable Arab reality" are not civilizational, economic, or philosophical per se. Instead,

. . . . The origins . . . are political par excellence. Like capital to capitalists, or individualism to liberalism, the use and misuse of political power has been the factor that defines the contemporary Arab state. Arab regimes have subjugated or transformed all facets of Arab society.

By the beginning of the 21st century, Arab autocracies represented some of the oldest dictators in the world. Zine el-Abidine Ben Ali's dictatorship in Tunisia, the most recently established in the region, ruled for 25 years, followed by 30 years for Egypt's Mubarak, 33 years for Yemen's Ali Abdullah Saleh, and 43 years for Ghaddafi in Libya. In Syria, the al-Assad dynasty has ruled for 43 years, and Saddam Hussein was removed in 2003 after 24 bloody years ruling Iraq. Only the authoritarian Arab monarchies precede these dictatorships in longevity. Bahrain, a repressive Sunni monarchy, has ruled a Shia majority since its independence from Britain in 1971.

Arab states, writes Bishara,

. . . were, for a lack of better words, turned into the private estates of the ruling families. While these regimes boasted of secular republicanism, they were run similar to the Kingdom of Saudi Arabia and the United Arab Emirates, where no political activism was allowed and where the ruling families dominated all facets of political life. . . . The energy-producing Arab states are sustained rentier-type economies, characterized by a trade-off between economic welfare and political representation. Whereas the modern democratic state was founded on the cry of "no taxation without representation" . . . the modern Arab state has turned that notion on its head. With free-flowing petro-dollars pouring into their countries, Arab leaders have been able to sell off national resources and enrich themselves without having to turn to their citizens for personal taxation. . . . It became a ritual in the wealthy monarchies for the kings, emirs, or princes to provide small sums of money to their "subjects," and the poor in particular, as a makrama or "generous gift" that was generated from the natural resources in their land.

According to the U.N. Development Program's (UNDP) first Arab Human Development Report, written exclusively by Arab experts,

. . . Arab countries have not developed as quickly as comparable nations in other regions. Indeed, more than half of Arab women are illiterate; the region's infant mortality rate is twice as high as in Latin America and the Caribbean. Over the past 20 years, income growth per capita has also been extremely low.

In virtually every Arab country, more than half the population is under 30 - more than 140 million people - while a quarter are between the ages of 15 and 29, making this generation the largest youth cohort in the history of the Middle East. This unemployed and increasingly angry demographic has given traction to the "youth bulge" theory, which posits that when population growth outstrips that of jobs, social unrest is inevitable.

The influence of the information revolution has been crucial to developments in the region. As a result, notes Bishara,

. . . The Arab youth were able to think for themselves, freely exchange ideas, and see clearly beyond their ruler's deception, vengeful jihadist violence, or cynical Western calculations. . . . At the beginning of 2011, there were 27 million Arabs on Facebook, including 6 million Egyptians. Within a few nights, 2 million more Egyptians joined, underlining the centrality of the medium to the changes in the country. More than 60 million people in the Arab world are online.

Yemeni activist and 2011 Nobel Peace Prize co-winner Tawakkol Karman described the use of social media:

The revolution in Yemen began immediately after the fall of Ben Ali in Tunisia. . . . As I always do when arranging a demonstration, I posted a message on Facebook, calling on people to celebrate the Tunisian uprising.

This new media, writes Bishara,

. . . had an important cultural, even sociological role to play in patriarchal Arab societies. It helped young people break free from social constraints. It propelled them into uncharted territory, and it helped them mold a certain type of individualism. They began to enjoy an uninhibited space where they could share information and experiences, join chat rooms, and participate with one another. Theirs is a new found egalitarianism. . . .

Bishara laments the fact that,

Arabs have been valued not for their embrace of freedom or respect for human rights, but rather in terms of their proximity to U.S. interests. A subservient ally and energy providing partner made for a good Arab regime, regardless of its despotic or theocratic rule. . . . Western leaders have talked in slogans . . . about democracy and Islam, but have always been as indifferent to the people of the region as their dictators.

What does the future hold? Bishara recognizes that there are great dangers:

Islamist movements, the likes of the Egyptian Brotherhood, have already opened dialogue with the military and with Western powers on the basis of mutual interest and respect. This might be seen as a positive development, that allows for a new sort of regional order on the basis of a new accommodation among Islamists, the generals, and Western leaders. However, this triangle could eventually be as oppressive and totalitarian as the previous dictatorships . . . the Islamists must make sure that they reconcile with the principles of democracy and modern statehood, not a division of labor with the military. . . . Many of the Islamists I spoke to reckon that if they have a majority they have a democratic right to change the constitution and govern as they see religiously fit. They don't recognize democracy as first and foremost a system of government based on democratic values that go beyond the right of the majority to rule, to ensure that the rights and privileges of the minorities are respected and preserved. . . .

The Invisible Arab is a thoughtful contribution to our understanding of the Middle East from one of the region's articulate new voices. Bishara shows how the revolutions have evolved - and how it could all go terribly wrong. He hopes for a free and democratic Middle East - and he has his fingers crossed. *

Ramblings

Allan C. Brownfeld

"Flash Mobs" in the Summer of 2011: An Example of Family Breakdown

The summer of 2011 saw a proliferation of a phenomenon that has come to be known as "flash mobs." Organized largely through text messages and via Facebook and Twitter, gangs of unruly youths, usually members of minority communities, have beaten and robbed citizens in Philadelphia, disrupted a fireworks display outside Cleveland, attacked fairgoers in Wisconsin, and looted a 7-Eleven in Germantown, Maryland.

Riots in London during the summer mirrored some of the worst uprisings in modern U.S. history. Hundreds of stores across London, Manchester, Birmingham, and other British cities were torched or ransacked in four nights of mayhem after the police killing of a north Londoner named Mark Duggan, whose death was quickly overshadowed by the wave of recreational violence. "This is criminality, pure and simple," said Prime Minister David Cameron.

The looting was more than simply a race riot. While Duggan was black, and there are strong correlations between race and class in Britain, some of the worst violence happened in majority-white neighborhoods like Croydon. "This is much broader than race," says Caryl Phillips, a British writer with Afro-Caribbean roots. "This is about a whole group - black, white, and brown - who live just outside the law."

In the U.S., notes Jerry Ratcliffe, chairman of the Department of Criminal Justice at Temple University, and a former London police officer:

This is an old crime being organized with new tools. There's nothing new about groups of people assaulting people and robbing, but what's new is the technology. There's a fascination with the speed by which this can now take place. You can go from nothing happening to something happening in a matter of moments. Flash mobs take advantage of opportunities. Those opportunities are that the victims are outnumbered by the group and that there is an absence of law enforcement.

In Philadelphia, Mayor Michael A. Nutter, who is black, told marauding black youths, "You have damaged your own race." After imposing a strict curfew, Nutter told young people: "Take those God-darn hoodies down, especially in the summer. Pull your pants up and use a belt 'cause no one wants to see your underwear. . . ."

Mayor Nutter moved up the weekend curfew for minors to 9 p.m. and told parents that they would face increased fines each time their child is caught violating the curfew.

The head of the Philadelphia chapter of the NAACP, J. Whyatt Mondesire, said it "took courage" for Mr. Nutter to deliver the message. "These are majority African-American youths and they need to be called on it."

In the past two years, Philadelphia has been the scene of a number of flash mobs in which youths meet at planned locations by texting one another and then commit assorted mayhem. In one episode, teens knocked down passersby on a Center City street and entered an upscale department store where they assaulted shoppers. In another incident, about 20 to 30 youths descended on Center City after dark, then punched, beat, and robbed bystanders. One man was kicked so savagely that he was hospitalized with a fractured skull. Police arrested four people, including an 11-year-old.

Speaking from the pulpit of his Baptist church, Mr. Nutter delivered a 30-minute sermon about black families taking responsibility for their children's behavior. He said:

The Immaculate Conception of our Lord Jesus Christ took place a long time ago, and it didn't happen in Philadelphia. So every one of these kids has two parents who were around and participating at the time. They need to be around now.

The Mayor told parents:

If you're just hanging out out there, maybe you're sending them a check or bringing some cash by. That's not being a father. You're just a human ATM. . . . And if you're not providing the guidance and you're not sending any money, you're just a sperm donor.

Columnist Gregory Kane, who is black, writes:

What is particularly instructive in this instance is where the 11-year old (arrested in Philadelphia) ended up: in the custody of his grandmother. We don't know what the boy's mother and father are doing right about now, but we know what they aren't doing: parenting their son. . . . Excuses for the flash mobbers, many of whom are black, with some attacking whites at random . . . have been coming fast and furious. They need jobs, the excuse makers tell us. They need recreational facilities. What they need are parents who don't hesitate to put a foot squarely in their derrieres when a foot in that spot is needed.

In Mayor Nutter's view, the collapse of the black family is a key element in the problems we face.

Let me speak plainer: That's part of the problem in the black community. . . . We have too many men making too many babies they don't want to take care of, and then we end up dealing with your children.

In the U.S. at the present time, out-of-wedlock births are now at 41 percent of overall births, but there is a tremendous variation in illegitimate births by race. Such births are the norm in both the black (72 percent) and Hispanic (53 percent) communities, but less than a third of white births (29 percent) are illegitimate.

It is clear that there has been a racial component in the flash mob events this past summer. Columnist Gregory Kane states:

I don't know what else to call it when mobs of blacks single out whites to attack. But there still exists this notion that blacks can't be racists. Racism requires power, the thinking goes. Since blacks have no power, they can't be racists. Such nonsense is bad enough when left-wing loonies and black nationalist types parrot it. But the Rev. Jesse Jackson is a prominent black leader. He, at least, should know better. Alas, he does not. This, he declares, "Is nonsense."

The respected economist Thomas Sowell disputes the idea that the violence of flash mobs can be explained by disparities in income. In his view:

Today's politically correct intelligentsia will tell you that the reason for this alienation and lashing out is that there are great disparities and inequities that need to be addressed. But such barbarism was not nearly as widespread two generations ago, in the middle of the 20th century. Were there no disparities or inequities then? Actually there were more. What is different today is that there has been - for decades - a steady drumbeat of media and political hype about differences in income, education and other outcomes, blaming these differences on oppression against those with fewer achievements or lesser prosperity.

The fact that so many black voices are now being heard about the decline of the black family and the manner in which that decline has led to such events as the flash mobs of this past summer is a hopeful sign. No problem can be resolved unless it is properly understood. Hopefully, that understanding will grow and the real problems we face can then be addressed.

Crony Capitalism: A Growing Threat to Economic Freedom

Crony capitalism - the close alliance of big business with government - leads not to free enterprise but to its opposite, in which government, not the market, chooses winners and losers through subsidies and other forms of government largesse. Adam Smith, the great philosopher of capitalism, understood that businessmen want to maximize profits, and that how they do so is of secondary interest to them. Indeed, he once said that when two businessmen get together, the subject of discussion is how to keep the third out of the market. Adam Smith - and more recent philosophers of the free market such as Friedrich Hayek, Ludwig von Mises, and Milton Friedman - believed deeply in capitalism. Many businessmen, and many on Wall Street, do not.

Consider some of the recent manifestations of this phenomenon.

The U.S. Government guaranteed a $535 million loan for Solyndra, LLC, the now bankrupt California company that was the centerpiece of President Obama's "clean energy" future. There are at least 16 more such loan guarantees worth in excess of $10 billion.

From e-mails made public in mid-September by the House Energy and Commerce subcommittee on Oversight and Investigation, it is clear that key Solyndra loan decisions were guided primarily by political considerations.

President Obama was not in the White House when the proposal to back the company initially appeared in Washington, but two weeks before President George W. Bush left office, an Energy Department review panel unanimously recommended against making the loan. Even after Obama decided to support the proposal, career employees at the Office of Management and Budget cautioned against doing so. One predicted that Solyndra would run out of money and file for bankruptcy by September 2011. A Government Accountability Office report said that the Energy Department had circumvented its own rules at least five times to make the loan. The leading investors in Solyndra were two investment funds with ties to George B. Kaiser, a major fundraising "bundler" for Obama.

Both Republicans and Democrats supported the loan-guarantee program, which was approved by the Republican-controlled Congress in 2005. The loan guarantee program for alternative energy companies was created as part of the Energy Policy Act of 2005, sponsored by Rep. Joe Barton (R-TX), who has been a leader in the congressional probe of Solyndra's ties to the Obama administration.

Similarly, Senator Jim DeMint (R-SC) said in the Senate that the Solyndra case exposed the "unintended results when our government tries to pick winners and losers." This, of course, is quite true. Yet DeMint himself had been a supporter of the loan-guarantee legislation in 2005.

The fact is that solar companies are not the only energy companies getting federal loan guarantees. The power giant Southern Co. won a $3.4 billion loan guarantee from the Energy Department last summer. Yet, even some Republican critics of big government have supported this huge expenditure. Rep. Phil Gingrey (R-GA) declared that it was wrong to compare Southern to Solyndra because "Southern Co. owns Mississippi Power, Alabama Power, Georgia Power, among others, and employs literally thousands of people."

Washington Examiner columnist Timothy Carney notes that:

The implication was clear: Federal subsidies to big, established companies are fine. It's the handouts to these upstarts that are objectionable. So Gingrey is embracing the heart of Obamanomics - the proposition that government ought to be an active partner in shaping the economy and helping business. . . . If Republicans were willing to broaden their attack beyond criticizing this one (Solyndra) deal, they could indict the whole practice of government-business collusion.

Or consider the Export-Import Bank, supported by both Republicans and Democrats, which is a government agency that subsidizes U.S. exporters. Recently, it broke its record for the most subsidy dollars provided in a single year, primarily to Boeing.

Members of both parties have voted to bail out failed banks, auto companies, and other enterprises considered "too big to fail." Now, business interests are lining up to influence the work of the new congressional "supercommittee" that will help decide whether to impose massive cuts in spending for defense, health-care, and other areas. Nearly 100 registered lobbyists for big corporations used to work for members of the committee and will be able to lobby their former employers to limit the effect of any reductions. They are representing defense companies, health-care conglomerates, Wall Street banks, and others with a vested interest in the outcome of the panel's work. Three Democrats and three Republicans on the panel also employ former lobbyists on their staff.

The 12-member committee is tasked with identifying $1.5 trillion in spending reductions over a decade. "When the committee sits down to do its work, it's not like they're in an idealized platonic debating committee," said Bill Allison, editorial director of the Sunlight Foundation, which is tracking ties between lobbyists and the panel. "They're going to have in mind the interests of those they are most familiar with, including their big donors and former advisers."

General Electric, for example, has been awarded nearly $32 billion in federal contracts over the past decade, with much of that business going to lucrative defense and health-care subsidiaries. General Electric's chief executive, Jeffrey Immelt, also heads President Obama's Council on Jobs and Competitiveness. At least eight GE lobbyists used to work for members of the supercommittee.

Top donors to the deficit committee members include AT&T, $562,045; Blue Cross/Blue Shield, $460,02; General Electric, $452,999; American Bankers Association, $421,883; Citigroup, $443,006; and National Association of Realtors, $418,000. Needless to say, they contribute to both parties.

A study last year from the London School of Economics found 1,113 lobbyists who had formerly worked in the personal offices of lawmakers. At least nine members of the 12-member supercommittee have scheduled fundraisers this fall, putting them in a position to take money from industry donors at the same time they are helping to decide what to cut from government spending. The most active fundraiser on the panel appears to be Rep. James Clyburn (D-SC), who has at least five donor events scheduled before the panel's Thanksgiving deadline. According to the Sunlight Foundation, contributions given during the time the supercommittee is meeting will not be disclosed to the Federal Election Commission until January - well after the final decision is made.

Sadly, free markets are more often genuinely embraced by intellectuals than by businessmen. All too often, businessmen seek government subsidy, bailout, and intervention to keep competitors out of the market. When Congress acted to eliminate the Civil Aeronautics Board and the Interstate Commerce Commission and open up the airline and trucking industries to real competition, it was the industries themselves that opposed deregulation, for they had found a way to control the government agencies involved on their own behalf.

The old warning by the economist Friedrich Hayek that socialism in its radical form is not nearly as dangerous as socialism in its conservative form is worthy of serious reconsideration. When the advocates of state power and the advocates of corporate bigness become allies, government involvement in the economy - a form of socialism - is inevitable. The result is the crony capitalism we now face. *

Ramblings

Allan C. Brownfeld

How America Goes to War: Rediscovering the Dangers of an All-Powerful Executive

At present, our country is embroiled in three wars - in Iraq, Afghanistan, and Libya.

Article I, Section 8, of the U.S. Constitution clearly gives Congress - not the executive - the power to declare war. Since the Constitution was signed in 1787, Congress has declared war five times: the War of 1812, the Mexican War, the Spanish-American War, and World Wars I and II. Yet, since 1787, the U.S. has been involved in numerous military conflicts without a declaration.

In the case of the Korean War, President Truman sent some 1.8 million soldiers, sailors, and airmen over a period of just three years, and 36,000 lost their lives - but he never sought or received a congressional declaration of war. Congress has not declared war since World War II, despite dozens of conflicts since then.

In 1973, Congress passed the War Powers Resolution, which was meant to counteract what Presidents Nixon and Johnson had done in Vietnam. Congress felt deceived, particularly since it was later discovered that the Gulf of Tonkin incident that precipitated a larger war had never, in fact, taken place.

The law, however, hardly reasserts Congress' very clear constitutional power to declare war. Instead, it simply asks the President to notify Congress and then gives him a deadline: it requires the President to withdraw U.S. forces from armed hostilities if Congress has not given its approval within 60 days.

Even fulfilling the requirements of the War Powers Resolution appears to be too much for the Obama Administration. In fact, the President rejected the views of top lawyers at the Pentagon and the Justice Department when he decided that he had the legal authority to continue American military participation in the air war in Libya without congressional authorization.

Jeh C. Johnson, the Pentagon general counsel, and Caroline D. Krass, the acting head of the Justice Department's Office of Legal Counsel, told the White House that they believed that the U.S. military's activities in the NATO-led air war amounted to "hostilities" under the War Powers Resolution, which would require Mr. Obama to terminate or scale back the mission after May 20.

The President, however, adopted the legal analysis of the White House counsel, Robert Bauer, and several others who argued that the military's activities in Libya fell short of "hostilities." Under that view, Obama needed no permission from Congress to continue the mission unchanged.

Late in June, the House rejected a bill to authorize the U.S. military operations in Libya. The resolution to support the mission failed 295 to 123, with 70 Democrats joining Republicans in a rebuff to the President. Still, the House also defeated a measure that would have limited financing to support these efforts.

Rep. Jason Chaffetz (R-UT) said:

It didn't go far enough. Under that resolution, the president is still going to be engaged in the war. We've been inept and irrelevant on the war actions. We have not lived up to our constitutional duty.

In Libya, the goal of our mission appears to have changed from month to month. In March, the President said that U.S. intervention would be confined to implementing a no-fly zone. He declared that, "Broadening our mission to include regime change would be a mistake." By May, the mission was to make Libyans "finally free of 40 years of tyranny." By June, after more than 10,000 sorties, including those by attack helicopters, the strategy seems to boil down to an effort to eliminate Gaddafi himself.

While some have charged that opponents of the conflict in Libya are "isolationists," conservative columnist George Will notes that:

Disgust with this debacle has been darkly described as a recrudescence of "isolationism" as though people opposing this absurdly disproportionate and patently illegal war are akin to those who, after 1938, opposed resisting Germany and Japan. Such slovenly thinking is a byproduct of shabby behavior.

While men and women of good will may disagree about the merits of the U.S. intervention in Libya - or Afghanistan and Iraq - the larger question is whether one man, the President, can take the country to war without a congressional declaration, as clearly called for in the Constitution.

What we are dealing with is the dangerous growth of executive power. During the years of the New Deal, when the power of the president was dramatically expanded, Republicans, who were in the opposition, objected to the growth of such power as a threat to freedom. Later, when Republicans held the Presidency, they, too, expanded executive power, and Democrats, now in opposition, objected. This has been characterized as argument from circumstance, not principle. If you hold power, you expand it. No one in power has an incentive to cede back the power that has been assumed.

Even early in the Republic's history, perceptive men such as John Calhoun predicted that government would inevitably grow, that those in power would advocate a "broad" use of power while those out of power argued for a "narrow" use, and that no one would ever turn back government authority once it had been embraced.

Calhoun was all too prophetic when he wrote the following in "A Disquisition On Government":

. . . . Being the party in possession of government, they will . . . be in favor of the powers granted by the Constitution and opposed to the restrictions intended to limit them. As the major and dominant parties, they will have no need of these restrictions for their protection. . . . The minor or weaker party, on the contrary, would take the opposite direction and regard them as essential to their protection against the dominant party. . . . But where there are no means by which they could compel the major party to observe the restrictions, the only resort left then would be a strict construction of the Constitution. . . . To this the major party would oppose a liberal construction . . . one which would give to the words of the grant the broadest meaning of which they were susceptible.

Calhoun continued:

It would then be construction against construction - the one to contract and the other to enlarge the powers of the government to the utmost. But of what possible avail could the strict construction of the minor party be, against the liberal construction of the major party, when the one would have all the powers of the government to carry its construction into effect and the other be deprived of all means of enforcing its construction? In a contest so unequal, the result would not be doubtful. The party in favor of the restrictions would be overpowered. . . . The end of the contest would be the subversion of the Constitution . . . the restrictions would ultimately be annulled and the government be converted into one of unlimited powers.

Our history shows that this is true. Republicans opposed big government when Democrats were in power, but spoke of concepts such as "executive privilege" when their own party held positions of authority. The Democrats have done exactly the same thing. The growth of government power has been a steady process, regardless of who was in office.

Those who want to rein in government power, to return to the federal system set forth in our Constitution, with its clearly defined separation of powers and checks and balances, would do well to turn their attention to the question of who has the power to take America to war. The Constitution did not give one man that power, although events in Afghanistan, Iraq, and Libya show us that this seems no longer to be the case. Concern over developments in Libya is a healthy sign that more and more Americans seem to be paying attention to the question of the war-making power.

Dramatic Decline in Public Education Leads to Renewed Push for Voucher Programs

Mounting evidence of a dramatic decline in American public education is leading to a renewed push for voucher programs across the country.

Evidence of the decline is all around us. Some of the New York City high schools that received the highest grades under the Education Department's school assessment system are graduating students who are not ready for college. Of the 70 high schools that earned an "A" on the most recent city progress report and have at least one third of graduates attending college at City University of New York (CUNY), 46 posted remediation rates above 50 percent, according to reports sent to the city's high schools. Remediation rates - the percentage of students who fail a CUNY entrance exam and require remediation classes - rose to 49 percent in 2010 from 45 percent in 2007.

About three quarters of the 17,500 freshmen at CUNY community colleges this year have needed remedial instruction in reading, writing, or math, and nearly a quarter of the freshmen have required such instruction in all three subjects.

Fewer than half of all New York state students who graduated from high school in 2009 were prepared for college or careers, as measured by state Regents tests in English and math. In New York City, that number was 23 percent.

At LaGuardia Community College in Queens, where 40 percent of the math classes are remedial, faculty member Jerry G. Ianni says:

Most students have serious challenges remembering the basic rules of arithmetic. The course is really a refresher, but they aren't ready for a refresher. They need to learn how to learn.

About 65 percent of all community college students nationwide need some form of remedial education, with students' shortcomings in math outnumbering those in reading two to one, said Thomas R. Bailey, director of the Community College Research Center at Teachers College, Columbia University.

The New York State Department of Education released new data in June showing that only 37 percent of students who entered high school in 2006 left four years later adequately prepared for college, with even smaller percentages of minority graduates and those in the largest cities meeting that standard. In New York City, 21 percent of those who started high school in 2006 graduated last year with high enough scores on state math and English tests to be deemed ready for higher education or well-paying careers. In Rochester, it was 6 percent; in Yonkers, 14.5 percent.

Nearly one fourth of the students who try to join the U.S. Army fail its entrance exam, painting a grim picture of an educational system that produces graduates who can't answer basic math, science, and reading questions. The report by the Education Trust bolsters a growing worry among military and education leaders that the pool of young people qualified for military service will grow too small.

"Too many of our high school students are not graduating ready to begin college or a career - and many are not eligible to serve in our armed forces," Education Secretary Arne Duncan said. "I am deeply troubled by the national security burden created by America's underperforming education system."

The report found that 23 percent of recent high school graduates don't get the minimum score needed on the enlistment test to join any branch of the military. Questions are often basic, such as: "If 2 plus X equals 4, what is the value of X?"

The military exam results are also of concern because the test is given to a limited pool of people. Pentagon data show that 75 percent of those aged 17 to 24 don't even qualify to take the test because they are physically unfit, have a criminal record, or lack a high school diploma.

"It's surprising and shocking that we still have students who are walking across the stage who really don't deserve to and haven't earned that right," said Tim Callahan with the Professional Association of Georgia Educators, a group that represents more than 80,000 educators.

The study shows wide disparities in scores among white and minority students, similar to racial gaps on other standardized tests. Nearly 40 percent of black students and 30 percent of Hispanics don't pass, compared with 16 percent of whites. The average score for blacks is 39 and for Hispanics is 44, compared to whites' average score of 55.

The decline in American public education has led to a renewed campaign for a voucher system that would give middle-class and poor parents the same freedom to choose where to send their children to school that only well-to-do parents now enjoy.

Early in May, Indiana Governor Mitch Daniels signed what is probably the broadest voucher law in the country. A few days later, Oklahoma approved tax credits for those who contribute to a privately funded private school "opportunity scholarship" program. In New Jersey, in May, a voucher bill was approved by a Senate committee with bipartisan support. In Washington, D.C., the voucher program, which was killed by the Democratic majorities in the last Congress, is all but certain to be restored. In Wisconsin, Governor Scott Walker is pushing hard to broaden Milwaukee's voucher program to other cities and many more children.

According to the Foundation for Educational Choice, a pro-voucher group that lists Milton Friedman as its patriarch, more than 52 bills have emerged this year, some passed, some still pending, in 36 states - among them Arizona, Florida, Ohio, Oregon, and Pennsylvania - providing funding for vouchers, tax credits, or other tax-funded benefits for private education. "No year in recent memory," said foundation president Robert Enlow, "has provided better opportunities for the cause."

Writing in The Nation, Peter Schrag, a liberal, declares that, "Milton Friedman's vision for school choice is becoming a reality around the country."

Early in April, a divided Supreme Court further heartened the movement by upholding Arizona's law providing tax credits for contributions to "school tuition organizations" - scholarship funds for private and religious schools.

Many forget that vouchers have never been an exclusively conservative issue. In the 1960s, liberal school reformers like Paul Goodman and John Holt, pushing for "free schools," the "open school," and other escapes from what they regarded as "over-bureaucratized, lockstep" school structures, embraced vouchers as a way of getting there.

Later, liberals like Berkeley law professor John Coons, who helped launch lawsuits seeking equity in school spending, became strong voucher advocates as a way to allow poor and minority children some way out of the ghetto schools.

Clearly, the time seems to have come for a voucher system - and genuinely free choice for parents with regard to where to send their children to school.

The Supreme Court's Strange Embrace of Violent Video Games for Children

In a 7-2 ruling late in June, the U.S. Supreme Court, in a decision written by Justice Antonin Scalia, struck down a California law barring the sale of extremely violent video games to children, holding that it violated children's First Amendment rights to buy interactive games in which they vicariously steal, rape, torture, and decapitate people to score points.

Justice Scalia said that the state had no compelling interest in limiting the sale of such violent videos. He made light of studies showing that violent videos correlate with aggressive behavior in some children and denied that reading about violence is different from participating in full-color, sound-filled interactive depictions in which the children themselves commit the violence.

Justices Stephen G. Breyer, a liberal, and Clarence Thomas, a conservative, filed the only dissents, arguing that the law was intended to empower parents, not erode the First Amendment. The law targets adults who sell this material to children. The goal, clearly, was not to disempower children but to curb predators.

In a concurring opinion joined by Chief Justice John G. Roberts, Justice Samuel Alito argued that the law should be struck down because of vagueness, but added that:

The Court is far too quick to dismiss the possibility that the experience of playing video games (and the effects on minors of playing violent video games) may be very different from anything that we have seen before. . . . In some of these games, the violence is astounding. Victims by the dozens are killed with every imaginable implement . . . dismembered, decapitated, disemboweled, set on fire and chopped into little pieces. They cry out in agony and beg for mercy. Blood gushes, splatters, and pools. Severed body parts and gobs of human remains are graphically shown. In some games, points are awarded based, not only on the number of victims killed, but on the killing technique employed.

In his dissent, Justice Thomas declared that:

The Framers could not possibly have understood the freedom of speech to include a qualified right to speak to minors. Specifically, I am sure that the founding generation would not have understood "the freedom of speech" to include a right to speak to children without going through their parents.

In his dissent, Justice Breyer quoted from a 1944 case, where the court recognized that the "power of the state to control the conduct of children reaches beyond the scope of its authority over adults."

Most adult Americans probably have no idea of the nature of the video games children are playing - and which the Supreme Court has now embraced as free speech. One such graphic game involves the player torturing a girl as she pleads for mercy, urinating on her, dousing her with gasoline and setting her on fire.

Among the most popular games is "Bloody Day," described this way:

Back alley butchering has never been so much fun. It's like having your own barrel with moderately slow moving fish. How many kills can you rack up?

Another is "Boneless Girl," which is presented in these terms:

Poke and pull this scantily clad babe all over bubble-land. You'll be amazed by the small spaces she can fit through, and throwing her across the screen never gets old.

Sadly, notes Joel Bakan, author of the forthcoming book Childhood Under Siege: How Big Business Targets Children, children:

. . . don't need to rent or buy casual games. They are available on computers, tablets, and cellphones - free. (California's law wouldn't have applied to these games, even if it had survived the court's scrutiny, because they are not rented or sold.) Many popular casual games contain as much violence as notorious video games like Postal 2 and Grand Theft Auto, if not more. But they tend to exist under the radar; they're part of an obscure world into which teenagers and children escape and about which parents are often in the dark. (I learned about them only after I asked my 12-year-old son what he liked to do online.)

Bakan reports that,

Nickelodeon's www.addictinggames.com, a premier casual game site, calls itself "the largest source of the best free online games." It attracts 20 million unique monthly users, mostly children and teens. . . . Like other leading casual game sites, www.addictinggames.com makes money by running advertisements. According to Viacom, the site's corporate owner, the aptly named site allows "junkies" to "gorge themselves" and to "fuel their addiction." Viacom's interest in promoting addiction helps explain why Nickelodeon, the award-winning children's network, might want to push brutal, violent entertainment. Violence sells. And it continues to sell to children, teens, and tweens "hooked" at an early age and hungry for more. . . . The games' use of graphic violence to generate profit is strategic and calculated.

In the 1949 case of Terminiello v. Chicago, Justice Robert H. Jackson, in a famous dissent, declared that, "The Constitution is not a suicide pact." He wrote that,

The choice is not between order and liberty. It is between liberty with order and anarchy without either. There is danger that, if the court does not temper its doctrinaire logic with a little practical wisdom, it will convert the constitutional Bill of Rights into a suicide pact.

Discussing the California video game case, Brown v. Entertainment Merchants Association, Robert Knight, senior fellow for the American Civil Rights Union, notes that:

The Constitution is the greatest political document in history and the guarantor of our God-given rights. The First Amendment has proved foundational to maintaining all of our freedoms. Exceptions should be few and necessary. But in the hands of America's ruling lawmakers and jurists, the First Amendment is sometimes misapplied as a free pass for dysfunction and decadence.

It is apparently perfectly within the law for movie theaters to refuse to sell tickets to minors to see R- or X-rated movies. What is the difference when it comes to violent video games? To protect children from material of this kind has always been viewed as a sign of civilization. The evidence that watching violent material has a serious impact upon young people is widespread. Consider the role such violent videos played in the lives of the perpetrators of the massacre at Columbine.

Why would the Supreme Court turn its back on such evidence - and normal common sense - to issue a ruling such as the one it did? This is difficult to understand, and we are fortunate that Justices Breyer and Thomas dissented. Their dissents, one hopes, provide a basis on which this decision can be revisited in the future. *

Ramblings

Allan C. Brownfeld

In Contemporary American Society, Truth Is in Increasingly Short Supply

Truth seems to be an increasingly rare commodity in contemporary American society.

In our political life, the lies are legion. In June, after ten days of adamant denials, Rep. Anthony Weiner (D-NY) finally admitted to having sent sexually explicit photographs to various young women. After telling the nation that he "did not have sexual relations with that woman," former President Clinton finally admitted the truth. Former Senator John Ensign (R-NV) denied the facts of his relationship with a married staff member and the payoff by his parents to the woman's husband. Former Senator John Edwards (D-NC) had a staff member claim to be the father of his mistress's child.

But lack of truth goes far beyond the personal lives of our politicians. Where were the weapons of mass destruction we were told Saddam Hussein possessed - and which were used as a justification for launching the war in Iraq? We now know that the Gulf of Tonkin incident, the precipitating event which led to President Johnson's launching the Vietnam War, did not really happen. Sadly, the list is a long one.

And it is not only in our political life that truth is hard to find. In an important new book, Tangled Webs: How False Statements Are Undermining America: From Martha Stewart to Bernie Madoff, James B. Stewart warns of the risks from an epidemic of perjury that has "infected nearly every aspect of society."

Citing prosecutors who speak of a recent surge of deliberate lying by sophisticated individuals, often represented by the best lawyers, he focuses on four cases involving well-known people: business executive and lifestyle guru Martha Stewart, convicted of lying to investigators about the reasons her ImClone stock was sold; former Dick Cheney adviser Lewis "Scooter" Libby, found guilty of perjury in conjunction with the leak of CIA operative Valerie Plame's identity; baseball star Barry Bonds, indicted for perjury related to illegal use of steroid drugs; and Bernard Madoff, who, while conducting the greatest Ponzi scheme in history and lying to investors and investigators, was never actually indicted for perjury.

Stewart is particularly outraged when it comes to the failure to indict Madoff for perjury. It was clear to Securities and Exchange Commission investigators in 2005 that he was lying about his investment business, but their superiors decided not to press the issue:

At the time of his sworn testimony in 2006, Madoff purported to have approximately $20 billion under management. By the time his scheme collapsed, he had $65 billion. Failing to pursue his lies cost innocent victims another $45 billion.

Stewart believes that lying is on the rise, threatening to swamp the legal system and sow cynicism nationwide. In the end, he argues, "it undermines civilization itself."

Consider the case of Greg Mortenson. His best-selling books, Three Cups of Tea and Stones into Schools, are full of lies and evasions. He tells the story of how, in 1993, he stumbled into the tiny Pakistani village of Korphe after a failed attempt at climbing K2. He explains how the kind villagers nursed him back to health with many cups of tea and how, as payment for their generosity, he returned to build a school. That school then became hundreds of schools across Pakistan and Afghanistan. Millions were inspired by the idea that a man could make such a profound difference in a desperate part of the world. Mortenson was nominated three times for the Nobel Prize. He was called a secular saint.

In April, as a result of an investigative report by bestselling author Jon Krakauer and a "60 Minutes" expose, we learned that Mortenson may very well be a charlatan. The most significant passages in the book seem to be fictitious, including the whole story about his recovery in Korphe. The "Taliban abductors" described in "Three Cups of Tea" were supposedly friendly villagers protecting him as a guest of honor. It was reported that his charity is apparently badly mismanaged and that many of its schools stand empty, some of them serving as storage sheds for hay.

In 2009, only 41 percent of donations to Mortenson's charity went to its work in Afghanistan and Pakistan. Much of the rest, charge Krakauer and "60 Minutes," went to Mortenson himself - to chartered jets, massive purchases of his books (at retail, so he would get the royalties and keep them on the bestseller list), and advertisements for them in The New Yorker at more than $100,000 each time.

More and more Americans are also claiming to have military honors they never earned. Joseph Brian Cryer, for example, is a former candidate for City Council in Ocean City, Maryland. He claimed to be an elite U.S. Navy SEAL and bragged online about having "77 confirmed kills" in 102 hours during a Libyan operation in 1986. To prove his bona fides, he produced a government ID card showing him to be 100 percent disabled and a Navy commander.

But Cryer is a fraud, said Don Shipley, a retired SEAL who makes it his business to expose false ones. Shipley has access to a database of all Navy SEALs since 1947. Since Navy SEAL Team 6 took out Osama bin Laden in May, he said, he has received about 50 requests each day to investigate people who claim to be SEALs.

The list of those criminally charged for falsifying their military service is a long one. In one case, Command Sgt. Maj. Stoney Crump, the senior enlisted man at Walter Reed Army Medical Center, was fired for faking his record and wearing numerous unauthorized awards and decorations. He was sentenced to six months in prison.

In another case, former Marine Corps Sgt. David Budwah was sentenced in 2009 to 18 months confinement and fined $25,000 for pretending to be an injured war hero to get free seats at rock concerts and professional sporting events.

"Every society in history, since the caveman days, has revered its warriors," said B. G. Burkett, author of Stolen Valor. He has uncovered thousands of suspected fakes and says most lie out of lack of self-esteem. "They haven't done anything in their lives," he said. "But the second they say they're a warrior, everybody sees them in a different light."

Congress passed the Stolen Valor Act in 2006. The law makes it illegal for someone to falsely claim to hold military honors or decorations. But some of those who have faced criminal charges claim the law is unconstitutional, arguing that it violates the First Amendment. The law "has every good intention," said Ken Paulson, president of the First Amendment Center. "But courts have been reluctant to outlaw lying in America. It's just too prevalent to legislate."

Thus far, federal courts have split on the law's constitutionality. A federal judge in Virginia ruled this year that the First Amendment doesn't protect the false claims the act makes illegal. But the California-based 9th Circuit Court of Appeals found the law unconstitutional last year.

In May, Rep. Joe Heck (R-NV) introduced a revised Stolen Valor Act that would make it a crime of fraud to benefit, or intend to benefit, from lying about military awards. "It's not O.K. to misrepresent yourself as a physician and practice medicine," Mr. Heck said.

It's not O.K. to misrepresent yourself as a police officer. Why should you be able to misrepresent yourself as a member of the military, specifically if you're trying to gain something of value?

The widespread telling of untruths - and the claim that people have a legal right to engage in lying about basic credentials - is an indication of our society's current moral standards. In the end, more is involved than simply immoral behavior. Such behavior is, in fact, a threat to democratic self-government.

Edmund Burke, in his letter to a member of the French National Assembly in 1791, made a point we might well ponder today:

Men are qualified for civil liberty in exact proportion to their disposition to put chains upon their own appetites; in proportion as their love of justice is above their rapacity; in proportion as their soundness and honesty of understanding is above their vanity and presumption; in proportion as they are more disposed to listen to the counsels of the wise and good in preference to the flattery of knaves. Society cannot exist unless a controlling power upon will and appetite be placed somewhere, and the less there is of it within, the more of it there must be without. It is ordained in the eternal constitution of things that men of intemperate minds cannot be free. Their passions forge their fetters.

Can a Free Society Endure if It Does Not Teach Its History and Its Values to the Next Generation?

American students are less proficient in their nation's history than in any other subject, according to results of a nationwide test released in June, with most fourth graders unable to say why Abraham Lincoln was an important figure, and few high school seniors able to identify China as the North Korean ally that fought American troops during the Korean War.

Overall, 20 percent of fourth graders, 17 percent of eighth graders and 12 percent of high school seniors demonstrated proficiency on the exam, the National Assessment of Educational Progress. Fewer than a third of eighth graders could answer what was described as a "seemingly easy question," asking them to identify an important advantage American forces had over the British during the Revolution, the government's statement on the results said.

Diane Ravitch, an education historian who was invited by the national assessment's governing board to review the results, said she was particularly disturbed by the fact that only 2 percent of 12th graders correctly answered a question concerning Brown v. Board of Education, which she called "very likely the most important decision of the U.S. Supreme Court in the past seven decades."

Students were given an excerpt containing the following passage, and were asked what social problem the 1954 ruling was supposed to correct:

We conclude that in the field of public education the doctrine of "separate but equal" has no place. Separate educational facilities are inherently unequal.

"The answer was right in front of them," Ms. Ravitch said. "This is alarming."

"The results tell us that, as a country, we are failing to provide children with a high-quality, well-rounded education," said Education Secretary Arne Duncan.

The evidence of our failure to teach our history is abundant. Fewer than half of American eighth graders knew the purpose of the Bill of Rights on the most recent national civics examination, and only one in 10 demonstrated acceptable knowledge of the checks and balances among the legislative, executive, and judicial branches, according to the test results released in April.

"These results confirm that we have a crisis on our hands when it comes to civics education," said Sandra Day O'Connor, the former Supreme Court justice, who last year founded icivics.org, a nonprofit group that teaches students civics through web-based games and other tools.

"The results confirm an alarming and continuing trend that civics in America is in decline," said Charles N. Quigley, executive director of the Center for Civic Education. "During the past decade or so, educational policy and practice appear to have focused more and more upon developing the worker at the expense of developing the citizen."

"We face difficult challenges at home and abroad," said Justice O'Connor.

Meanwhile, divisive rhetoric and a culture of sound bites threaten to drown out rational dialogue and debate. We cannot afford to continue to neglect the preparation of future generations for active and informed citizenship.

Historian David McCullough says that:

We're raising young people who are, by and large, historically illiterate. I know how much these young people - even at the most esteemed institutions of higher learning - don't know. It's shocking.

McCullough, who has lectured on more than 100 college campuses, tells of a young woman who came up to him after a lecture at a renowned university in the Midwest. "Until I heard your talk this morning, I never realized the original 13 colonies were all on the East Coast," she said.

Some years ago, when 111 ninth graders in a Honolulu school were asked to write the Pledge of Allegiance, no one could do it correctly. One response described the United States as a nation "under guard" and dedicated "for richest stand." A teacher, who asked not to be identified so her students would not be embarrassed, called the results frightening. She said all the students had spelling problems and had little grasp of what the pledge words meant. The word "indivisible," for example, came out as "in the visible." The teacher said that 12 students had trouble spelling the word "America." The word appeared in some papers as "Americain," "Americai," "Amereca," "Amicra," and "Amica." The teacher said, "I'm sick. I don't know what to do or where to turn."

These trends were hardly new. More than twenty years ago, writing in Public Opinion magazine, author Ben Stein reported:

Recently a 19-year-old junior at the University of Southern California sat with me while I watched "Guadalcanal Diary" on T.V. It goes without saying that the child had never heard of Guadalcanal. More surprisingly, she did not know whom the U.S. was fighting against in the Pacific. ("The Germans?") She was genuinely shocked to learn that all those people were Japanese and that the U.S. had fought a war against them. ("Who won?") Another student at USC did not have any clear idea when World War II was fought. . . . She also had no clear notion of what had begun the war for the U.S. Even more astounding, she was not sure which side Russia was on and whether Germany was on our side or against us. In fact, I have not yet found one single student in Los Angeles, in either college or high school, who could tell me the years when World War II was fought. Nor have I found one who knew when the American Civil War was fought.

Stein laments that:

Unless our gilded, innocent children are given some concept of why the society must be protected and defended, I fear that they will learn too soon about a whole variety of ugly ideas they did not want to know about. . . . People who do not value what they have rarely keep it for long, and neither will we.

Things have gotten far worse since Stein wrote those words. One reason for students' poor showing on recent tests is the neglect shown to the study of history by federal and state policy makers - both Republicans and Democrats - especially since the 2002 No Child Left Behind Act began requiring schools to raise scores in math and reading, but in no other subject. This federal accountability law (surprisingly embraced by Republicans, who previously argued that education was a state and local - not a federal - matter) has given schools and teachers an incentive to spend most of their time teaching to the math and reading tests, ignoring history almost entirely.

"History is very much being shortchanged," said Linda K. Salvucci, a history professor in San Antonio who is chairwoman-elect of the National Council for History Education.

Historian Paul Johnson points out that:

The study of history is a powerful antidote to contemporary arrogance. It is humbling to discover how many of our glib assumptions, which seem to us novel and plausible, have been tested before, not once but many times and in innumerable guises; and discovered to be, at great human cost, wholly false.

Free societies are rare in history. If their history and values are not transmitted to the next generation, their survival is questionable. As Cicero (106-43 B.C.) understood:

To remain ignorant of things that happened before you were born is to remain a child. What is human life worth unless it is incorporated into the lives of one's ancestors and set in a historical context?

European Leaders Are Turning Against Multi-culturalism - a Dilemma Faced by Our Own Society as Well

As immigration problems - particularly among the large North African and Middle Eastern populations in France, Germany, the Netherlands, Great Britain, and other West European countries - rise to the surface, the idea of "multi-culturalism" is coming under increasing criticism.

Germany's chancellor, Angela Merkel, called it "a total failure," and France's president, Nicolas Sarkozy, told an interviewer that immigrants should "melt into a single community." In a speech in Munich, Britain's prime minister, David Cameron, traced the problem of homegrown Islamist alienation and terrorism to "a question of identity."

"A passively tolerant society," Cameron said, "stands neutral between different values." But "a generally liberal country . . . says to its citizens, this is what defines us as a society: to belong here is to believe in these things."

The things Cameron went on to cite were freedom of speech and worship, democracy, the rule of law, and equal rights. Much of this is not new, as concern over multiculturalism has been growing for years. A year after the London bombings of July 2005, Ruth Kelly, then the Labour Party minister in charge of community policies, asked whether - in its anxiety to avoid imposing a single British identity on diverse communities - multiculturalism had encouraged "separateness."

In December 2006, Tony Blair gave a speech on multi-culturalism which included many of Prime Minister Cameron's points. Both prime ministers called for tighter controls on Muslim groups receiving public funds, an entry ban on foreign preachers with extremist views, a tougher position on forced marriages, and an expectation that all British citizens support common values, from the rule of law to a rejection of discrimination.

French president Sarkozy declared that:

If you come to France, you accept to melt into a single community, which is the national community, and if you do not want to accept that, you cannot be welcome in France. Of course, we must respect all differences, but we do not want . . . a society where communities coexist side by side.

Europe's dilemma is real, as is its need for immigrants. Deaths are expected to outnumber births this year in 10 of the European Union's 27 member states. As of 2015 the EU as a whole will experience negative natural population growth, demographers say, and the gap will grow to one million excess deaths a year by 2035. By 2050 the EU will have 52 million fewer people of working age, the European Commission warns. Businesses across Europe are already facing severe shortages of engineers, technicians, craftspeople, and other skilled professionals, with four million unfilled jobs across the continent.

For decades, most European countries have consigned immigrants to the margins. In Germany - which, until recently, continued to proclaim it was "not an immigrant society" - some professions were restricted to German citizens well into the 1990s, while eligibility for citizenship itself was based on bloodlines until a landmark reform in 2001. Millions of refugees were legally barred from working, which forced them into welfare dependency. Muslims, in particular, remain unintegrated and ghettoized in many European countries.

The attention now being focused on the need to integrate immigrants into European society is a hopeful sign. We have had this same debate in the U.S. for some time, but, for a variety of reasons, have done a better job in integrating immigrants into our society. Fortunately, our "melting pot" tradition has served us well.

What Americans have in common is not a common racial, ethnic, or religious background, but, instead, a commitment to the concept of individual freedom in a society established by the U.S. Constitution, which protects and preserves it.

We have, of course, had our advocates of "bilingual," "Afro-centric," and other forms of multicultural education. According to the multiculturalist worldview, notes Linda Chavez:

African-Americans, Puerto Ricans, and Chinese Americans living in New York City have more in common with persons of their ancestral group living in Lagos or San Juan or Hong Kong than they do with other New Yorkers who are white. Culture becomes a fixed entity, transmitted, as it were, in the genes, rather than through experience.

Historian Arthur M. Schlesinger, Jr. declared that:

Multiculturalists would have our educational system reinforce, promote, and perpetuate separate ethnic communities and do so at the expense of the idea of a common culture and a common national identity.

Afro-centric education and other forms of separate education for separate groups are the opposite of the traditional goal of civil rights leaders, who wanted only to open up American education to all students, regardless of race. The distinguished black leader of the early twentieth century, W. E. B. Du Bois, disputed the multiculturalists of his own day. He said:

I sit with Shakespeare and he winces not . . . . Across the color line I move arm in arm with Balzac and Dumas. I summon Aristotle and Aurelius and what soul I will, and they come all graciously with no scorn or condescension. So, wed with Truth, I dwell above the veil.

To him, the timeless wisdom of the classical works of Western civilization spoke to all people and races, not just to whites of European ancestry.

Professor Seymour Martin Lipset of the Hoover Institution at Stanford University declares:

The history of bilingual and bicultural societies that do not assimilate are histories of turmoil, tension, and tragedy. Canada, Belgium, Malaysia, Lebanon - all face crises of national existence in which minorities press for autonomy, if not independence. Pakistan and Cyprus have divided. Nigeria suppressed an ethnic rebellion. France faces difficulties with its Basques, Bretons, and Corsicans.

European societies will resolve their difficulties with today's immigrants when they adopt the American model: recognizing that their societies will no longer be homogeneous, and that their goal should be to assimilate new immigrants into the culture and civilization of France, England, and the other countries of Western Europe.

Some of today's Islamic immigrants may provide a greater challenge, but this should not prove insurmountable if the proper policies are adopted. Finally, Western European leaders seem to have come to the understanding that multiculturalism is not the way.

Immigrants leave their native countries for the West because there is something in the West they want. It is the responsibility of these Western European countries to transmit to their new immigrants the values, culture, and civilization of their societies. This has been going on in our own country for more than 300 years with great success. The first female black candidate for president, Rep. Shirley Chisholm (D-NY), once said, "We came over on different ships, but we're in the same boat now." Multiculturalism is a dangerous detour from the assimilation and acculturation of immigrants, which should be the primary goal of those societies now receiving large numbers of new residents. Abandoning the notion that Western European countries are "not immigrant societies" is an important step forward. *

Saturday, 05 December 2015 04:37

Ramblings

Allan C. Brownfeld

One Reason for Our Educational Decline May Be Bad Students, Not Bad Schools

It is abundantly clear that our schools are not doing very well in educating our students. A recent international assessment showed that as China's economic power has risen, so has its educational progress. The survey, which rated the aptitude of 15-year-olds in 65 countries, shows teens in Shanghai far outscoring their international peers in reading, math, and science. The U.S. lags far behind countries such as Finland and South Korea. In math, Chinese students scored 600 while American students scored 497. In science, the Chinese scored 575 and Americans 501.

A study released in November by McKinsey, the international consulting firm, showed that throwing money at education does not seem to do much good, at least in those countries that already send all their young people to school. The U.S., for example, increased its spending on schools by 21 percent between 2000 and 2007, while Britain pumped in 37 percent more funds. Yet, during this period, standards in both countries slipped.

Many school systems that did not receive extra funds did much better. Schools in the German state of Saxony, and in Latvia, Lithuania, Slovenia, and Poland, have all raised their achievement scores. Even poor countries such as Chile and Ghana have made progress.

In an important new book, Bad Students, Not Bad Schools (Transaction), Robert Weissberg, who taught political science at the University of Illinois for decades, argues that the reason for educational decline "is the students, stupid, not the facilities or the curriculum."

It is his view that this "obvious truth" is one which none dares to speak. He reports that millions of lazy, incurious, disruptive, and nearly illiterate youngsters flood classrooms every day, and none of the popular and expensive initiatives and ideas that are being promoted by well-meaning foundations and professors of education will change them. In his view, the middling students far outnumber the motivated ones, and the most difficult ones - troublemakers and vandals, immigrants struggling with English, kids who hate homework (an Indiana survey counts 50 percent who complete one hour or less of studying per week) - effectively turn classrooms into chaotic free-for-alls.

Year after year, Weissberg shows, a new initiative is presented - laptops for every eighth grader, bills to equalize school funding, after-school day care for single mothers - founded on the assumption that a better environment will invigorate lagging students, close the racial gap, and prepare every student for college.

Weissberg notes that bright students with a good work ethic excel whether they study with new laptops or well-worn textbooks. His example is the children of the Vietnamese boat people, who came from close-knit families that placed a high priority on educational achievement. By contrast, students from families that pay little attention to their children's education - children who have never been read to or taught to appreciate books, and whose parents, often a single mother, are usually otherwise occupied - do not come to school prepared to learn. It should be no surprise, Weissberg argues, that they do so poorly.

The kinds of reforms advocated to improve our schools tend, Weissberg believes, to completely miss the point of the real challenge we face:

In sports, this would be as if a basketball franchise with inept, lackadaisical players tried to reverse its fortunes by constructing a spectacular new arena, adding high-tech training facilities, inventing clever new plays, and hiring a Hall of Fame coach.

To resolve these problems, Weissberg advocates a radical approach, one unlikely to find much support. He explicitly advocates a policy "to eliminate the bottom quarter of those past 8th grade" or "altering the mix of smart and not so smart students." In this formulation, unintelligent and idle students consume a great deal of time and labor, holding back gifted students. Once the bad-student pool reaches a certain proportion, the teacher, principal, and school board end up devoting all of their attention to it.

One can find this prescription too harsh, and perhaps counterproductive, while still recognizing that Weissberg has put his finger on what is wrong with current reform efforts: they do not address the real problems we face. He devotes the bulk of his book to the various elements of the reform movement, documenting the way each one rationalizes away the bored and unruly students. Judges force municipalities to integrate schools, foundations write multi-million-dollar checks to enhance "learning opportunities" in urban schools, conservative leaders push charter-school expansion. No one addresses the fact that we do not turn out the same kind of students we used to because we do not get the same kind in.

Professor Mark Bauerlein of Emory University, writing in Commentary, notes that:

An entire industry has prospered on malfunction. If public schools produced skilled, knowledgeable graduates, if at-risk kids turned into low-risk kids, if students in crumbling buildings achieved as well as others, an army of professors, advocates and activists and lawyers, foundation personnel, contractors, and construction workers, security officers, after-school and summer-school tutors, and school administrations, counselors, and professional developers, would have to find other employment. The more schools fail, the more money pours in. Los Angeles has one of the lowest graduation rates in the country, and in September it opened the $578 million Robert F. Kennedy Community Schools complex, the most expensive school in U.S. history.

The 2008 documentary "Hard Times at Douglass High" features a Baltimore student named Audie. "This is what we do," Audie said, talking about himself and other students who roamed the halls all day, learning nothing:

Just walking the halls all day, baby. (Bleep) class. That (bleep's) for clowns, man. Don't nobody go to class here, man. Man, (bleep) academics.

Discussing the Weissberg book, columnist Gregory Kane, who is black, declares: "Want to really reform American education? Get guys like Audie out of our schools."

No problem can be resolved unless it is properly understood. Whatever one thinks of Professor Weissberg's proposed solutions, his analysis is worthy of serious attention.

The Focus of Attention on the Role of Public Sector Unions in Leading Cities and States to Fiscal Crisis Is Long Overdue

The debate in Wisconsin, Ohio, Indiana, and other states over collective bargaining on the part of public employees has focused long-overdue attention upon the role of public sector unions in moving our cities and states toward fiscal crisis and, in many instances, insolvency.

Collective bargaining for public employees is a rather new concept. Wisconsin was the first state to provide such rights, in 1959. Other states followed, and California became the biggest convert in 1978, under Jerry Brown in his first term as governor. In 1962, President Kennedy for the first time permitted some federal workers to organize, although not to bargain collectively.

Few Americans today remember how, until recent years, even our most liberal political leaders resisted collective bargaining for public employees. President Franklin Roosevelt opposed it, as did legendary New York City Mayor Fiorello LaGuardia. Even George Meany, the long-time AFL-CIO president, opposed the right of public employees to bargain collectively with the government.

The Wall Street Journal explained the basis for such opposition:

. . . unlike in the private economy, a public union has a natural monopoly over government services. An industrial union will fight for a greater share of corporate profits, but it also knows that a business must make profits or it will move or shut down. The union chief for teachers, transit workers, or firemen knows that the city is not going to close the schools, buses, or firehouses. This monopoly power, in turn, gives public unions inordinate sway over elected officials. The money they collect from member dues helps elect politicians who are then supposed to represent the taxpayers during collective bargaining. . . . Public unions depend entirely on tax revenues to fund their pay and benefits. They thus have every incentive to elect politicians who favor higher taxes and more government spending. The great expansion of state and local spending followed the rise of public unions.

Concern about public sector unions is hardly new. When three-fourths of the Boston police department went on strike in 1919, leading to an escalation of crime, then-Massachusetts Governor Calvin Coolidge called out the state militia and broke the strike. Coolidge declared: "There is no right to strike against the public safety by anybody, anywhere, any time."

In August 1981, the Professional Air Traffic Controllers Organization called a strike over working conditions, pay, and a 32-hour work week. In doing so, the union violated a law that banned strikes by government unions. President Reagan declared the strike "a peril to national safety" and ordered the members back to work under the terms of the Taft-Hartley Act of 1947. Only 1,300 of the nearly 13,000 controllers returned to work. President Reagan demanded that those remaining on strike resume work within 48 hours or forfeit their jobs. In the end, Reagan fired 11,345 striking air traffic controllers and banned them from federal service for life.

Collective bargaining rights are only one part of the problem we face. Consider the state of Virginia, which bans collective bargaining. Like pension systems in states friendlier to unions, Virginia's public employee retirement system is underfunded by $17.6 billion. At the same time, teachers in Virginia have slightly higher average salaries than the unionized teachers in Wisconsin, and over the past decade, Virginia teacher pay grew faster than teacher pay in Wisconsin.

The fact is that, regardless of the question of collective bargaining, states and local governments across the country are faced with chronic fiscal problems rooted in unsustainable employee compensation systems. This is an issue beyond traditional liberal and conservative divisions. Editorially, The Washington Post notes that:

Much of the issue is rooted in healthcare costs, especially benefits for public-sector retirees. States face a combined $555 billion in unfunded retiree health coverage liabilities. Yet in 14 states, taxpayers pick up 100 percent of the premium tab for retirees, who often collect benefits for a decade or more before going on Medicare. This is not only unfair to taxpayers, for whom free healthcare is usually a remote dream. It also encourages overconsumption of medical goods and services, thus raising the cost for everyone.

More than a third of the nation's $9.3 trillion in pension assets belongs to state and local government employees, even though they make up only 15 percent of the U.S. work force, according to a study by the Spectrum investment group. Even with $3.4 trillion set aside to pay public pensions, dozens of state and local governments are struggling to make payments. Wisconsin, Ohio, and Florida are calling on state employees for the first time to contribute to their retirement plans the way workers do in the private sector. The $3.4 trillion set aside for public pensions understates the burden for states and taxpayers, since the plans are collectively underfunded by as much as $2.5 trillion, said Milton Ezrati, senior economist at Lord Abbett & Co.

"The undeniable fact is that most states and municipalities offer more generous pensions that they can afford," he said, noting that the plans typically allow employees full retirement benefits after 20 or 30 years of employment and include generous cost-of-living increases, healthcare benefits, and other perks that are not common in the private sector.

The Spectrum study found that the nation's 19.7 million state and local employees constituted 15 percent of the American work force of 128 million in 2009. Yet they laid claim to more than $3 in retirement assets for every $1 set aside for the retirement of the 108 million workers in the private sector.

According to the Bureau of Labor Statistics, in 2010 the total compensation costs of state and local government workers were 44 percent higher than those in private industry; pay was only 33 percent higher, but benefits cost 70 percent more.

"The cost," says Daniel DiSalvo of the City College of New York,

. . . of public-sector pay and benefits (which in many cases far exceed what comparable workers earn in the private sector), combined with hundreds of billions of dollars in unfunded pension liabilities for retired government workers, are weighing down state and city budgets. And staggering as these burdens seem now, they are actually poised to grow exponentially in the years ahead.

At long last, public attention is being focused upon the role of public sector unions. It could not come a moment too soon.

Horrors Continue in Zimbabwe, but the World Largely Looks Away

When it comes to dictators in Africa clinging to power, the list, unfortunately, is a long one. This year, popular uprisings in North Africa have led to the removal of Zine el-Abidine Ben Ali's 23-year regime in Tunisia and Hosni Mubarak's 30-year rule in Egypt. At present, Libya's Moammar Gaddafi is fighting popular resistance - as well as Western air strikes - to maintain his 41-year grip on power.

Sadly, many other dictators remain in place. Robert Mugabe of Zimbabwe has always been clear about his own ambition. "No matter what force you may have," he declared in 2001, "this is my territory and that which is mine I cling to until death."

In April 2008, voters in Zimbabwe flocked to the polls, and, by an overwhelming margin, repudiated Mugabe's rule. Then 84 and in failing health, Mugabe seemed ready to concede defeat to the opposition leader, Morgan Tsvangirai. Instead, Mugabe and his supporters launched a counterattack. The Zimbabwe Electoral Commission, controlled by the ruling party, falsified the vote count, forcing Tsvangirai into a second round. Foreign journalists were detained and removed from the country. Mugabe loyalists hunted down, beat, and killed supporters of Tsvangirai's Movement for Democratic Change (M.D.C.). Mugabe's generals called it "Operation Who Did You Vote For?"

In a new book, The Fear: Robert Mugabe and the Martyrdom of Zimbabwe, Peter Godwin, who was born and raised in Zimbabwe when it was still Rhodesia, recalls that he was then one of the few Western journalists remaining in the country. He traveled from Harare to rural Zimbabwe, documenting the bloodshed. He visited hospitals overflowing with maimed and burned victims. "Think of deep, bone-deep lacerations, of buttocks with no skin left on them, think of being flayed alive."

He writes of a torture method called falanga: "Think of swollen, broken feet, of people unable to stand, unable to sit, unable to lie on their backs because of the blinding pain."

At one point, Godwin joins James McGee, the American ambassador, on a fact-finding trip outside Harare. They repeatedly confront policemen, militia members, and intelligence agents, but McGee manages to press forward as he and his team gather evidence of torture and murder. Godwin wanders into a farmhouse used as a torture center by Mugabe's hit teams and discovers a notebook documenting interrogations and naming people "who are to be beaten." Finally, Godwin is advised to leave the country for his own safety, and he watches from New York as Tsvangirai withdraws from the runoff, saying he cannot participate in a "violent, illegitimate sham."

A few months later, Tsvangirai and Mugabe sign the so-called Global Political Agreement. Negotiated under international pressure by South African president Thabo Mbeki - who remained silent as the murder count rose - the deal kept Mugabe entrenched in power but forced him to install Tsvangirai as prime minister and turn over half the cabinet seats to members of the Movement for Democratic Change.

Peter Godwin returns to Zimbabwe to witness the inauguration of the new government and quickly realizes that the ruling party has no intention of upholding the agreement. Godwin's friend Roy Bennett, a white, Shona-speaking ex-farmer and M.D.C. leader popular with his black constituents, returns from exile in South Africa to assume a junior cabinet post and is almost immediately jailed, held for weeks in very poor conditions. Tendai Biti, a courageous attorney and M.D.C. secretary general, survives his own incarceration on treason charges and reluctantly signs on as finance minister. "Here is Tendai," Godwin writes, "trying to scrounge the money to pay for the bullets that were used against his own supporters in the last election."

Godwin portrays Mugabe as an "African Robespierre" - highly educated and completely ruthless. He cautions against viewing him as a case of a good leader gone bad. "His reaction to opposition has invariably been a violent one," writes Godwin.

Using violence to win elections has long been Mugabe's method of remaining in power. He first set out his views on electoral democracy in a 1976 radio broadcast, during the guerrilla war against the government of Rhodesia - a time when he was widely embraced in the West, including in Washington: "Our votes must go together with our guns." He even boasted of having "a degree in violence." Since coming to power in 1980, he has regularly resorted to the gun to deal with whatever challenge his regime has faced.

Peter Godwin details the manner in which, after the 2008 elections, Mugabe unleashed the army, police, security agencies, and party militias to beat the electorate into submission in time for the second round of elections. Among the electorate this campaign was known simply as "chidudu" - the fear. Villagers were beaten and told to "vote Mugabe next time or you will die." Scores of opposition organizers were murdered by death squads. Rape, arson, and false arrests were widespread.

Mugabe was open about his intentions and his contempt for democracy. "We are not going to give up our country because of a mere 'X,'" he told supporters at an election rally. "How can a ballpoint fight with a gun?"

What stands out in Godwin's reporting is not just the scale of the destruction that Mugabe has inflicted on his country but the courage of Zimbabweans who defy his tyranny, knowing the consequences of doing so. Godwin describes the "insane bravery" of an opposition candidate who continued to taunt his attackers even while they were beating him and who later, defying doctors' orders, appeared in a plaster cast to take his place at the swearing-in ceremony of a local council.

The African Union, formerly the Organization of African Unity, says that it is determined to be more rigorous than its predecessor, which turned a blind eye to dictatorship and tyranny. According to The Economist:

. . . The AU still exudes a lot of hot air. . . . The AU's instinct is still to wring hands . . . rather than resolve issues. Its credibility was hurt when Moammar Gaddafi was elected chairman for 2009. This year, Equatorial Guinea's Teodoro Obiang, one of Africa's more venal leaders, looks likely to get the job.

And the people of Zimbabwe continue to suffer as the world, including our own country, which bears some responsibility for installing Mugabe in power, looks away. The brave men and women who have shown their willingness to put their lives on the line for freedom deserve better.

American Colleges and Universities Are Failing to Transmit Our History and Culture

There is growing evidence that our colleges and universities are failing to transmit our history and culture.

Recently, the Intercollegiate Studies Institute (ISI) gave a 60-question civic literacy test to more than 28,000 college students:

Less than half knew about federalism, judicial review, the Declaration of Independence, the Gettysburg Address, and NATO. And this was a multiple choice test, with the answers staring them right in the face. . . .

said political scientist Richard Brake, co-chairman of ISI's Civic Literacy Board. Brake said:

Ten percent thought that "We hold these truths to be self-evident, that all men are created equal . . ." came from the Communist Manifesto.

In another study, a large number of U.S. university students were shown to have failed to develop critical thinking, reasoning, and writing skills because of easy classes and too little time spent studying.

The study of 3,000 students at 29 four-year universities found that 45 percent "did not demonstrate any significant improvement in learning" during their first two years in college as measured by a standardized test. After the full four years, 36 percent had shown no development in critical thinking, reasoning, and writing, according to the study, which forms the basis of the new book Academically Adrift: Limited Learning on College Campuses. The study attributed much of the problem to easy courses and lax study habits.

Real requirements at most colleges and universities have all but disappeared. Johns Hopkins University, for example, is America's premier research institution. Yet a student there could complete a bachelor's degree without ever taking a course in science, math, history, or English. Students at Johns Hopkins - and many other colleges - notes Washington Post writer Daniel DeVise:

. . . choose classes the way a diner patron assembles a meal, selecting items from a vast menu. Broad distribution requirements ensure that students explore the academic universe outside their majors. But no one is required to study any particular field, let alone take a specific course. Shakespeare, Plato, Euclid - all are on the menu: none is required.

The American Council of Trustees and Alumni, a Washington-based advocacy group, recently handed out F grades to Hopkins and many of its peers, inviting debate on a basic question: What, if anything, should America's college students be required to learn?

The group faulted the schools, including Yale, Brown, Cornell, Amherst, and the University of California, Berkeley, for failing to require students to take courses in more than one of seven core academic subjects: math, science, history, economics, foreign language, literature, and composition.

"At Stanford, you can fulfill the American cultures requirement by taking a course on a Japanese drum," said Anne Neil, president of the trustees group.

"We're certainly not saying that Harvard or Hopkins or Yale are not good schools, or that their graduates are not smart kids," said Neal, who attended Harvard and Harvard Law. "What we're saying is that those schools don't do a good job at providing their students with a coherent core."

Richard Ekman, president of the Council of Independent Colleges in Washington, states: "I think the criticism that students may not be learning enough in general education resonates with most colleges."

Neal says that the group's examination of more than 700 college catalogs proves that:

It is quite possible to avoid American history, or Plato, or science. Many colleges don't even require their English majors to take a course on Shakespeare.

The study of history is in the process of dramatic change. In 1975, three quarters of college history departments employed at least one diplomatic historian; in 2005, fewer than half did. The number of departments with an economic historian fell to 32.7 percent from 54.7 percent. By contrast, the biggest gains were in women's history, which now has a representative in four out of five history departments.

The shift in focus began in the late 1960s and early 1970s when a generation of academics began looking into the roles of people generally missing from history books - women, minorities, immigrants, workers. Social and cultural history, then referred to as bottom-up history, offered fresh subjects.

At the University of Wisconsin, Madison, out of the 45 history faculty members listed, one includes diplomatic history as a specialty, one other lists American foreign policy; 13 name gender, race, or ethnicity. Of the 12 American history professors at Brown University, the single specialist in U.S. foreign policy also lists political and cultural history as areas of interest. The professor of international studies focuses on victims of genocide.

"The boomer generation made a decision in the 1960s that history was starting over," said David Kaiser, a history professor at the Naval War College. "It was an overreaction to a terrible mistake that was the Vietnam War." The result is that "history is no longer focused on government, politics, or institutions."

There are no reliable statistics on course offerings, but Mr. Kaiser and others argue that there has been an obvious decline. "European diplomacy is just about completely dead," Kaiser said, "and it's very hard to find a course on the origins of the First World War."

At Ohio University in Athens, when a military historian recently retired, there was a vigorous debate about how to advertise for a replacement. Some faculty members had the view that "military history is evil," said Alonzo L. Hamby, a history professor. The department finally agreed to post a listing for a specialist in "U.S. and the world," the sort of "mushy description that could allow for a lot of possibilities."

Our unity as a nation is threatened by those who would replace the teaching of our history and culture with something else, argued Donald Kagan, Professor of History and Classics and Dean of Yale College, in his address to Yale's freshman class in September 1990. He declared:

. . . American culture derives chiefly from the experience of Western Civilization, and especially from England, whose language and institutions are the most copious springs from which it draws its life. I say this without embarrassment, as an immigrant who arrived here as an infant from Lithuania. . . . Our students will be handicapped in their lives after college if they do not have a broad and deep knowledge of the culture in which they live and roots from which they come . . . . As our land becomes ever more diverse, the danger of separation and segregation by ethnic group . . . increases and with it the danger to the national unity which, ironically, is essential to the qualities that attracted its many peoples to this country.

In his book The Roots of American Order, Russell Kirk pointed out that these roots go back to the ancient world: to the Jews and their understanding of a purposeful universe under God's dominion; to the Greeks, with their high regard for the use of reason; to the stern virtues of Romans such as Cicero; and to Christianity, which taught the duties and limitations of Man and the importance of the Transcendent in our lives. These roots also include the traditions and universities of the medieval world, the Reformation and the response to it, the development of English Common Law, the debates of the 18th century, and the written words of the Declaration of Independence and the Constitution.

American colleges and universities do our students - and our country - a disservice by not transmitting our history and culture to the next generation. Unless those who understand the very fragile nature of our civilization, and the uniqueness of the tradition upon which free institutions are based, rise in its defense, that culture may well be swept away. If this takes place, all of us will be the losers, not least the various groups in whose name such a cultural assault has been launched. *

Saturday, 05 December 2015 04:34

Ramblings

Allan C. Brownfeld

The Need to Curb the Role of Public Employee Unions Is Clear as Bankruptcy Looms for Many States and Cities

The state of Illinois is struggling to pay billions of dollars in bills submitted last year by schools and social service providers. Arizona recently stopped paying for certain organ transplants for people in its Medicaid program. States are releasing prisoners early, largely to cut expenses. In December, the city of Newark, New Jersey, laid off 13 percent of its police officers.

"It seems to me that crying wolf is probably a good thing to do at this point," said Felix Rohatyn, the financier who helped save New York City from bankruptcy in the 1970s.

One of the important contributing factors in the current decline in the economic fortunes of our states and cities is the role being played by public employee unions.

While union membership has collapsed in the private sector over the past 30 years, from 33 percent of the workforce to 15 percent, it has remained buoyant in the public sector. Today, more than 35 percent of public employees are unionized, compared with only 11 percent in 1960.

The role of public employee unions in our political life has been growing. "We just won an election," labor boss Andy Stern declared two years ago, at about the time Barack Obama was taking the oath of office and the union movement was giving itself much of the credit for his victory. After spending some $450 million to elect Obama and other Democrats, labor was indeed riding high. Teachers alone accounted for a tenth of the delegates to the Democratic convention in 2008.

All too often, states The Economist:

Politicians have repeatedly given in, usually sneakily - by swelling pensions, adding yet more holidays or dropping reforms, rather than by increasing pay. Too many state workers can retire in their mid-50s on close to full pay. America's states have as much as $5 trillion in unfunded pension liabilities. . . . Sixty-five should be a minimum age for retirement for people who spend their lives in classrooms and offices; and new civil servants should be switched to defined contribution pensions.

Another battleground, reports The Economist, will be

. . . the unions' legal privileges. It is not that long since politicians of all persuasions were uncomfortable with the idea of government workers joining unions. (Franklin Roosevelt opposed this on the grounds that public servants have "special relations" and "special obligations" to the government and the rest of the public.) It would be perverse to ban public sector unions outright at a time when governments are trying to make public services more like private ones. But their right to strike should be more tightly limited; and the rules governing political donations and even unionization itself should be changed to "opt-in" ones, in which a member decides whether to give or join.

There are now more American workers in unions in the public sector (7.6 million) than in the private sector (7 million), although the private sector employs five times as many people. In fact, union density is now higher in the public sector than it was in the private sector in the 1950s.

Andy Stern, head of the Service Employees International Union (SEIU), was the most frequent guest at the White House in the first six months of the Obama administration. Public-sector unions, as providers of vital monopoly services, can close down entire cities. As powerful political machines, they can help pick the people who sit on the other side of the bargaining table. Daniel DiSalvo, writing in National Affairs, points out that the American Federation of State, County and Municipal Employees (AFSCME) was the biggest contributor to political campaigns between 1989 and 2004. He also notes that such influence is more decisive in local campaigns, where turnout is low, than in national ones.

Evidence from the Bureau of Labor Statistics shows that public-sector unions have used their power to extract a wage premium: public-sector workers earn, on average, a third more than their private-sector counterparts. At the same time, governments give their workers generous pensions, which do not have to come out of current budgets. Many public employees also game the system. Eighty-two percent of senior California Highway Patrol officers, for example, discover a disabling injury about a year before they retire.

Unions have also made it almost impossible to remove incompetent workers. Mary Jo McGrath, a California lawyer, says that "getting rid of a problem teacher can make the O. J. Simpson trial look like a cake-walk." Between 2000 and 2010, the Los Angeles school district spent $3.5 million trying to get rid of seven of its 33,000 teachers, and succeeded with only five.

Newly elected governors - both Republicans and Democrats - are focusing their attention on public-sector unions. In Wisconsin, Governor Scott Walker, a Republican, is promising to use "every legal means" to weaken the bargaining power of state workers - including decertification of the public employees' union. Ohio's new governor, Republican John Kasich, wants to end the rule that requires nonunion contractors to pay union wages, and is targeting the right of public employees to strike.

Even in states where Democrats remain in power, unions are under the gun. New York's Governor Andrew Cuomo intends to freeze the salaries of the state's 190,000 government workers, and has promised to tighten the budget belt when public union contracts are renegotiated this year. In California, Governor Jerry Brown, who gave public employees the right to unionize when he was governor in the 1970s, now speaks about the unsustainable drain that union pensions and health benefits are on the state's budget.

Things in the states are so bad that Arizona has sold several state buildings - including the tower in which the governor has her office - for a $735 million upfront payment. But leasing the buildings back over the next 20 years will ultimately cost taxpayers an extra $400 million in interest. Many states are delaying payments to their pension funds, payments that eventually must be made. In New Jersey, Governor Chris Christie deferred paying the $3.1 billion that was due to the pension funds in 2010.

The role of our public employee unions in leading our states and cities to insolvency has not been properly examined. It is high time that it was.

Up from the Projects: The Life of Walter E. Williams

Walter Williams has had a distinguished career - as an economist, author, columnist, teacher, and sometime radio and television personality. As a black advocate of the free market and a genuinely color-blind society, he has often come under bitter attack. Now 74, he has decided to tell the story of his life.

"What I've done, said, and written, and the positions I have taken challenging conventional wisdom," he writes

. . . have angered and threatened the agendas of many people. I've always given little thought to the possibility of offending critics and contravening political correctness and have called things as I have seen them. With so many "revisionist historians" around, it's worthwhile for me to set the record straight about my past and, in the process, discuss some of the general past, particularly as it relates to ordinary black people.

He recalls an angry reply that former Secretary of Health, Education, and Welfare Patricia Roberts Harris wrote in 1981 to "Blacker Than Thou," a two-part series in The Washington Post by another black conservative economist and good friend, Thomas Sowell. Assessing Sowell's criticism of black civil rights leaders, Harris said, "People like Sowell and Williams are middle class. They don't know what it is to be poor."

This assessment, however, is completely false. Williams notes that:

Both Sowell and I were born into poor families, as were most blacks who are now as old as we are. . . . While starting out poor, my life, like that of so many other Americans, both black and white, illustrates one of the many great things about our country: just because you know where a person ended up in life doesn't mean you know with any certainty where he began. . . . Unlike so many other societies around the world, in this country, one needn't start out at, or anywhere near, the top to eventually reach it. That's the kind of economic mobility that is the envy of the world. It helps explain why the number one destination of people around the world, if they could have their way, would be America.

Williams describes his humble beginnings, growing up in a lower middle-class, predominantly black neighborhood in West Philadelphia in the 1940s, raised by a strong and demanding single mother with high academic aspirations for her children. He recalls the teachers in middle school and high school who influenced him - teachers who gave him an honest assessment of his learning and accepted no excuses. In discussing his army experience, he recounts incidents of racial discrimination but stresses that his time in the army was a valuable part of his maturation process.

Growing up in the Richard Allen housing project, he remembers that:

Grocery, drug, and clothing stores lined Poplar between 10th and 13th streets. . . . Most of the grocery stores had unattended stands set up outside for fruits and vegetables - with little concern about theft. Often customers would select their fruits and vegetables and take them into the store to be weighed and paid for. . . . There was nothing particularly notable about a thriving business community in black neighborhoods, except that it would one day virtually disappear due to high crime and the 1960s riots. Such a disappearance had at least several results: in order to shop, today's poor residents must travel longer distances . . . and bear that expense; high crime costs reduce incentives for either a black or white business to locate in these neighborhoods. . . .

The absence of shops and other businesses, writes Williams:

. . . also reduces work opportunities for residents. One of my after-school and weekend jobs was to work at Sam Raboy's grocery store. . . . I waited on customers, delivered orders, stocked shelves and cleaned up. Other stores hired other young people to do the same kind of work.

After high school, Williams joined his father in Los Angeles and enrolled at Los Angeles City College. Following Army service that included time in Korea, and after his marriage to Connie - a major force in his life - he enrolled in February 1962 as a full-time student at California State College in Los Angeles, originally majoring in sociology. In the summer of 1965, having shifted to economics, Williams graduated from California State and was encouraged by his professors to consider graduate school. He was admitted to UCLA, worked nights for the county probation department, and decided to pursue his Ph.D.

Initially, Williams failed in economic theory. He notes that:

I later realized this did have a benefit. It convinced me that UCLA professors didn't care anything about my race; they'd flunk me just as they'd flunk anyone else who didn't make the grade. The treatment reassured me in terms of my credentials.

After completing his Ph.D. examinations, he was offered a full-time tenure-track assistant professorship at Cal State. "Sometimes," Williams writes:

I sarcastically, perhaps cynically, say that I'm glad that I received virtually all of my education before it became fashionable for white people to like black people. By that I mean that I encountered back then a more honest assessment of my strengths and weaknesses. Professors didn't hesitate to criticize me - sometimes even to the point of saying, "That's nonsense, Williams."

In those years, Williams' political views were liberal. In 1964, he voted for Lyndon Johnson. He believed that higher minimum wages were the way to help poor people. "That political attitude," he writes:

. . . endured until I had a conversation with a UCLA professor (it might have been Armen Alchian) who asked me whether I most cared about the intentions behind a higher minimum wage or its effects. If I was concerned about the effects, he said, I should read studies by University of Chicago professor Yale Brozen and others about the devastating effects of the minimum wage on employment opportunities for minimally skilled workers. I probably became a libertarian through exposure to tough-minded professors who encouraged me to think with my brain instead of my heart.

It was Williams' desire

. . . to share my conviction that personal liberty, along with free markets, is morally superior to other forms of human organization. The most effective means of getting them to share it is to give them the tools to be rigorous, tough-minded thinkers.

Being a black professor, he reports:

. . . led to calls to become involved with the campus concerns of black students. They invited me to attend meetings of the Black Student Union. I tried to provide guidance with regard to some of the BSU's demands, such as black studies programs and an increase in the number of black faculty. As I was to do later at Temple University, I offered tutorial services for students having trouble in math. One of my efforts that fell mostly on deaf ears was an attempt to persuade black students that the most appropriate use of their time as students was to learn their subject as opposed to pursuing a political agenda.

Walter Williams' academic career ranged from teaching one class a week at Los Angeles City College to eventually holding the department chairmanship at George Mason University. He tells of his long friendship with economist Thomas Sowell and with J. A. Parker, president of the Lincoln Institute, with which Williams has been associated for many years. He reports on his time at the Urban Institute and the Hoover Institution at Stanford University, as well as his frequent testimony before Congress on issues ranging from the minimum wage to the negative effects of the Davis-Bacon Act. He was a member of the Reagan administration's transition team at the Department of Labor but, following the advice of economist Milton Friedman, declined a position in the administration so that he could remain in academia, where he could speak his mind freely.

The attacks upon black conservatives, which he cites, were often bitter. NAACP General Counsel Thomas Atkins, upon hearing that President Reagan was considering appointing Thomas Sowell as head of the Council of Economic Advisers, declared that Sowell "would play the same kind of role which historically house niggers played for the plantation owners." Syndicated columnist Carl Rowan said "If you give Thomas (Sowell) a little flour on his face, you'd think you had (former Ku Klux Klan leader) David Duke." NAACP Executive Director Benjamin Hooks called black conservatives "a new breed of Uncle Tom and some of the biggest liars the world ever saw."

Williams has much to say about what he calls "prevailing racial dogma." One element of that dogma asserts that "black role models in teaching are necessary to raise black achievement, instill pride, and offset the effects of our legacy of slavery and subsequent discrimination." But, Williams argues, his own life is a refutation of that notion:

Attending predominantly black junior high and high schools, and graduating from the latter in 1954, I recall having no more than two, possibly three, black teachers. . . . Nonetheless, many of my classmates, who grew up in the Richard Allen housing project and with whom I've kept up over the years, managed to become middle-class adults; and one, Bill Cosby, became a multi-millionaire. Our role models were primarily our parents and family; any teachers who also served in that role were white, not black.

Every few years, former Richard Allen residents hold a reunion. "I've asked some of my friends," Williams writes:

. . . and ex-schoolmates whether they recall any of our peers who couldn't read or write well enough to fill out a job application, or who spoke the poor language that's often heard today among black youngsters. The answer is they don't remember anyone doing either. Yet in 2005, at my high school alma mater, Benjamin Franklin, only 4 percent of eleventh grade students scored "proficient" and above in reading, 12 percent in writing, and 1 percent in math. Today's Philadelphia school system includes a high percentage of black teachers and black administrators, but academic achievement is a shadow of what it was yesteryear. If the dogma about role models had any substance, the opposite should be the case.

Throughout this book, Williams refers to the immeasurable contribution of his wife of 48 years, who shared his vision through hard work and love. He leaves the reader with a bit of advice passed on by his stepfather: a lot of life is luck and chance, and you never know when the opportunity train is going to come along. Be packed and ready to hop on board.

Walter Williams has lived a singular American life, and this book deserves widespread recognition as a record of that life. He is still teaching economics and, one hopes, will be doing so for many years to come.

A Thoughtful Look at Christianity as the Lifeblood of the American Society

When we look to the earliest days of the American society and seek to discover the beliefs and worldview which animated the Founders, we must carefully consider the role of religion.

In a thoughtful new book, Christianity: Lifeblood of America's Free Society (1620-1945), Dr. John Howard argues that Christianity was the dominant influence in the development of the American nation and the American society.

John Howard has had a distinguished career. After service with the First Infantry in World War II, he returned with two Silver Stars, two Purple Hearts, and a commitment to use his career to sustain and strengthen America's religious ideals. In the Eisenhower administration, he headed the first program using government contracts to open jobs for qualified minority applicants. Dr. Howard served as president of Rockford College for seventeen years and as national president of the American Association of Presidents of Independent Colleges and Universities for three years. Currently, he is a Senior Fellow at the Howard Center for Family, Religion, and Society. In the 2007 national contest for speechwriters sponsored by Vital Speeches and The Executive Speaker, his "Liberty Revisited" received the Grand Award as the best speech submitted in any of the 27 categories.

Sadly, at the present time, America's religious history is largely unknown. The review provided by Dr. Howard is instructive.

After the Plymouth colony was established, the number of settlers coming from England increased rapidly. In 1630, five ships with 900 passengers arrived to start the Massachusetts Bay colony, with John Winthrop as governor. During the two-month voyage, Winthrop preached a long sermon entitled "A Model of Christian Charity," setting forth the tenets of Jesus' teaching that should be applied in the daily life of the new colony.

His concluding instructions included the following:

The end is to improve our lives to do more service to the Lord. . . . We are entered into a covenant with Him to do this work. . . . We must entertain each other in brotherly affection. . . . We must delight in each other, make others' condition our own, rejoice together, mourn together, labor together and suffer together . . . so shall we keep the unity of spirit in the bond of peace. The Lord will be our God and delight to dwell among us. . . . We must consider that we shall be as a city upon a hill. The eyes of all people are upon us.

Within ten years, the population of the Massachusetts Bay colony swelled to 16,000. Dr. Howard notes that:

It was recognized that schools as well as churches were essential to the Christian well-being of the society. In 1636, the Massachusetts legislature authorized the establishment of a college, and two years later Harvard College enrolled students. The purpose was "to train a literate clergy." . . . For many years, most of the Congregational Ministers in New England were Harvard graduates. . . . It is not surprising that the first English language book published in North America was religious. The Whole Book of Psalms Faithfully Translated into English Meter appeared in 1640. It was so much in demand that there were twenty editions of it and seventy printings.

During the 18th century, a number of powerful preachers influenced the American society - among them Jonathan Edwards, John Wesley, and George Whitefield. Whitefield, an Englishman, is acknowledged as the most powerful influence in spreading the Great Awakening of that time. He had a dramatic flair that brought people from long distances to hear him, and the crowds grew until he was preaching to over 30,000 people at once, with no amplification. Benjamin Franklin wrote in his autobiography that Whitefield's voice could be heard nearly a mile away. The two men became close friends, and Franklin built a huge auditorium in Philadelphia to accommodate Whitefield's revival meetings. The edifice later became the first building of the University of Pennsylvania.

"During the three-quarters of a century leading up to the Revolutionary War," writes Howard:

. . . the church services, revival meetings, and religious encampments were the primary social events in the lives of the colonists. . . . Many of the foremost clergy were instrumental in convincing the colonists that the revolution was necessary and just.

The people of a self-governing republic or democracy must, said Montesquieu, be virtuous, or that form of government cannot operate. In his inaugural address on April 30, 1789, George Washington stressed the need for honorable and conscientious citizens. "Like the other Founding Fathers," writes Howard:

Washington knew . . . that the people of a self-governing republic must be virtuous for that form of government to operate successfully and was keenly committed to do everything he could to help Americans measure up to the standards of virtue required for a self-governing republic. Rectitude and patriotism, he declared, keep the acts of government fair and just and free of the damage caused by party politics. Rectitude and patriotism will also assure that "national policy will be laid in the pure and immutable principles of private morality. . . . There is no truth more thoroughly established than that there exists . . . an indissoluble union between virtue and happiness; between duty and advantage; between the genuine maxims of an honest and magnanimous policy and the solid reward of prosperity and felicity. . . ."

When Alexis de Tocqueville visited the U.S. in 1831, he was amazed by the religious character of the people and the impact Christianity had on the systems and institutions of the society. In Democracy in America he writes that:

By their practice Americans show that they feel the urgent necessity to instill morality into democracy by means of religion. What they think of themselves in this respect enshrines a truth that should penetrate into the consciousness of every democratic nation.

De Tocqueville marveled that the different denominations were in comfortable agreement about teaching morality:

There is an innumerable multitude of sects in the United States. They are all different in the worship they offer to the Creator, but all agree concerning the duties of men to one another . . . all preach the same morality in the name of God. . . .

The historian Paul Johnson observes that this denominational unity about teaching morality was even broader. It was

. . . a consensus which even non-Christians, deists, and rationalists could share. Non-Christianity, preeminently including Judaism, could thus be accommodated within the framework of Christianity. Both Catholicism and Judaism became heavily influenced by the moral assumptions of American Protestantism because both accepted its premise that religion (meaning morality) was essential to democratic institutions.

In the post-World War II years, Dr. Howard shows, the influence of religion upon American society has been in decline. In 1988, after a number of highly placed graduates of Harvard and other elite universities engaged in a variety of less-than-honorable behavior, Harvard president Derek Bok published a long essay in Harvard Magazine. He provided a summary of Harvard's transition from being an instrument of the Christian church to a modern intellectual and research center, free of any coordinated effort to teach Christian or any other morality to the student body.

He wrote:

Until this century, educational institutions throughout history not only sought to build the character of their students, they made this task their central responsibility. . . . These tendencies continued strongly into the 19th century.

During the 20th century, he notes:

First artists and intellectuals, then broader segments of society, challenged every convention, every prohibition, every regulation that cramped the human spirit and blocked its appetites and ambitions.

In 1954, Robert Hutchins, the former president of the University of Chicago, noted that:

The pedagogical problem is how to use the educational system to form the kind of man that the country wants to produce. But in the absence of a clear ideal, and one that is attainable through education, the pedagogical problem is unsolvable; it cannot even be stated. The loss of an intelligible ideal lies at the root of the troubles of American education.

John Howard concludes his book with a quote from James Russell Lowell, author, poet, editor of the Atlantic Monthly, and U.S. ambassador to Spain from 1877 to 1880, and then to Great Britain. The French historian François Guizot once asked Lowell, "How long will the American Republic endure?" Lowell replied, "As long as the ideas of the men who founded it remain dominant."

John Howard hopes that we can recapture those ideas. He has performed a notable service with his important review of the role religion has played in the development of our country - and can play once again. *

Saturday, 05 December 2015 04:30

Ramblings

Allan C. Brownfeld

Allan C. Brownfeld is the author of five books, the latest of which is The Revolution Lobby (Council for Inter-American Security). He has been a staff aide to a U.S. Vice President, Members of Congress, and the U.S. Senate Internal Security Subcommittee. He is associate editor of The Lincoln Review and a contributing editor to such publications as Human Events, The St. Croix Review, and The Washington Report on Middle East Affairs.

A Growing and Largely Ignored Crisis: Public Pension Funds Are Running Out of Money

As of fiscal 2008, there was a $1 trillion gap between what states had promised to public employees in retiree pensions, health care, and other benefits and the money they had set aside to pay for it, according to a Pew Center on the States study. Some economists say that Pew is too conservative and that the problem is two or three times as large.

Roger Lowenstein, an outside director of the Sequoia Fund and author of While America Aged, points out that:

For years, localities and states have been skimping on what they owe. Public pension funds are now massively short of the money to pay future claims -- depending on how their liabilities are valued, the deficit ranges from $1 trillion to $3 trillion. Pension funds subsist on three revenue streams: contributions from the employer; contributions from the employees; and investment earnings. But public employers have often contributed less than the actuarially determined share, in effect borrowing against retirement plans to avoid having to cut budgets or raise taxes.

Pension funds, Lowenstein notes, also assumed that they could count on high annual returns, typically 8 percent, on their investments.

In the past, many funds did earn that much, but not lately. Thanks to high assumed returns, governments projected that they could afford to both ratchet up benefits and minimize contributions. Except, of course, returns were not guaranteed. Optimistic benchmarks actually heightened the risk because they forced fund managers to overreach.

Consider the case of Massachusetts. The target of its pension board was 8.25 percent. "That was the starting point for all of our investment decisions," says Michael Travaglini, until recently its executive director. "There was no way a conservative strategy is going to meet that."

Lowenstein notes:

Travaglini put a third of the state's money into hedge funds, private equity, real estate, and timber. In 2008, assets fell 29 percent. New York State's fund, which is run by the comptroller, Thomas DiNapoli, a former state assemblyman with no previous investment experience, lost $40 billion in 2008.

In October, the Empire Center for New York State Policy, a research organization that studies fiscal policy, issued a report finding that the cities, counties, and authorities of New York have promised more than $200 billion worth of health benefits to their retirees while setting aside almost nothing, putting the public work force on a collision course with the taxpayers who are expected to foot the bill.

Lowenstein notes:

The Teachers' Retirement System of Illinois lost 22 percent in the 2009 fiscal year. Alexandra Harris, a graduate journalism student at Northwestern University who investigated the pension fund, reported that it invested in credit-default swaps on A.I.G., the State of California, Germany, Brazil, and "a ton" of subprime mortgage securities.

According to Joshua Rauh of the Kellogg School of Management at Northwestern, assuming states make contributions at recent rates and assuming they do earn 8 percent, 20 state funds will run out of cash by 2025. Illinois, the first, will run dry by 2018.

In a new report issued by Professor Rauh and Robert Novy-Marx, a University of Rochester professor, five major cities -- Boston, Chicago, Cincinnati, Jacksonville, and St. Paul -- are said to have pension assets that can pay for promised benefits only through 2020. Philadelphia, according to the report, has assets on hand that can only pay for promised benefits through 2015.

Professor Rauh declares that:

We need fundamental reform of the ways employees in the public sector are compensated. It is not feasible to make promises of risk-free pensions when in the private sector (nearly) everyone has to share some of the risk.

In Roger Lowenstein's view, states need to cut pension benefits:

About half have made modest trims, but only for future workers. Reforming pensions is painfully slow, because pensions of existing workers are legally protected. But public employees benefit from a unique notion that, once they have worked a single day, their pension arrangement going forward can never be altered. No other Americans enjoy such protections. Private companies often negotiate (or force upon their workers) pension adjustments. But in the world of public employment, even discussion of cuts is taboo.

The market forced private employers like General Motors to restructure retirement plans or suffer bankruptcy. Government's greater ability to borrow enables it to defer hard choices but, as Greece discovered, not even governments can borrow forever. The days when state officials may shield their workers while subjecting all other constituents to hardship are fast coming to an end.

Recently, some states have begun to test the legal boundary. Minnesota and Colorado cut cost-of-living adjustments for existing workers' pensions. Each now faces a lawsuit.

In Colorado, in what many have called an act of political courage, a bipartisan coalition of state legislators passed a pension overhaul bill. Among other things, the bill reduced the raise that people who are already retired get in their pension checks each year. "We have to take this on, if there is any way of bringing fiscal sanity to our children," said former Governor Richard Lamm, a Democrat. "The New Deal is demographically obsolete. You can't fund the dream of the 1960s on the economy of 2010."

In Colorado, the average public retiree stops working at 58 and receives a check of $2,883 each month. Many of them also got a 3.5 percent annual raise, no matter what inflation was, until the rules changed this year.

Discussing the Colorado case in The New York Times, Ron Lieber notes that:

Private sector retirees who want their own monthly $2,883 check for life, complete with inflation adjustments, would need an immediate fixed annuity if they don't have a pension. A 58-year-old male shopping for one from an A-rated insurance company would have to hand over a minimum of $860,000, according to Craig Hemke of Buyapension.com. A woman would need at least $928,000 because of her longer life expectancy. Who among aspiring retirees has a nest egg that size, let alone people with the same moderate earning history as many state employees? And who wants to pay to top off someone else's pile of money via increased income taxes or a radical decline in state services?

"We have to do what unions call givebacks," said Mr. Lamm, the former Colorado governor. "That's the only way to sanity. Any other alternative, therein lie dragons."

Americans were quite properly shocked when it was revealed that the Los Angeles blue-collar suburb of Bell, California, was paying its city manager, Robert Rizzo, $787,637 a year -- with 12 percent annual pay increases. In July, Rizzo, along with Police Chief Randy Adams and Assistant City Manager Angela Spaccia, resigned. The combined annual salary of these three employees was $1,620,925 in a city where one of every six residents lives in poverty. The city's debt quadrupled between 2004 and 2009.

The Washington Examiner states that:

The likely reason why Rizzo, Adams, and Spaccia resigned so readily is that they are eligible for public pensions. Under current formulations, Adams will make $411,000 annually in retirement and Spaccia could make as much as $250,000, when she's eligible for retirement in four years at age 55. . . . California is but one of many states on the brink of fiscal ruin largely due to outrageous public employee benefits.

While the Bell, California, example may be extreme, the crisis in public employee pension funds across the country is very real -- and must be confronted to avoid massive bankruptcies in the future.

Racial Achievement Gap Is Alarming and Focuses Renewed Attention on the "Culture of Poverty"

An achievement gap separating black from white students has long been documented. But a new report focusing on black males suggests that the picture is bleaker than generally known.

Only 12 percent of black fourth-grade boys are proficient in reading, compared with 38 percent of white boys, and only 12 percent of black eighth-grade boys are proficient in math, compared with 44 percent of white boys.

Poverty alone does not seem to explain the differences: poor white boys do just as well as black boys who do not live in poverty, as measured by whether they qualify for subsidized school lunches.

This data comes from national math and reading tests, known as the National Assessment of Educational Progress, which are given to students in fourth and eighth grades, most recently in 2009. The report, "A Call for Change," was released in November by the Council of the Great City Schools, an advocacy group for urban public schools.

"What this clearly shows is that black males who are not eligible for free and reduced-price lunch are doing no better than white males who are poor," said Michael Casserly, executive director of the council.

The report shows that black boys on average fall behind from the earliest years. Black mothers have a higher infant mortality rate and black children are twice as likely as whites to live in a home where no parent has a job. In high school, black boys drop out at nearly twice the rate of white boys, and their SAT scores are on average 104 points lower. In college, black men represented just 5 percent of students in 2008.

The search for explanations is looking at causes besides poverty. "There's accumulating evidence that there are racial differences in what kids experience before the first day of kindergarten," said Ronald Ferguson, director of the Achievement Gap Initiative at Harvard. He said:

They have to do with a lot of sociological and historical forces. In order to address those, we have to be able to have conversations that people are unwilling to have.

Those include "conversations about early childhood parenting practices," Dr. Ferguson said.

The activities that parents conduct with their 2-, 3-, and 4-year-olds. How much we talk to them, the ways we talk to them, the ways we enforce discipline, the ways we encourage them to think and develop a sense of autonomy.

The New York Times reports that:

For more than 40 years, social scientists investigating the causes of poverty have tended to treat cultural explanations like Lord Voldemort: That Which Must Not Be Named. The reticence was a legacy of the ugly battles that erupted after Daniel Patrick Moynihan, then an assistant labor secretary in the Johnson administration, introduced the idea of a "culture of poverty" to the public in a startling 1965 report. . . . His description of the black family as caught in an inescapable "tangle of pathology" of unmarried mothers and welfare dependency was seen as attributing self-perpetuating moral deficiencies to black people, as if blaming them for their own misfortune.

Now, after decades of silence, even liberal scholars who were harshly critical of Moynihan's thesis are conceding that culture and persistent poverty are enmeshed.

"We've finally reached the state where people aren't afraid of being politically incorrect," said Douglas S. Massey, a sociologist at Princeton who has argued that Moynihan was unfairly maligned.

In September, Princeton and the Brookings Institution released a collection of papers on unmarried parents, a subject, it noted, that became off limits after the Moynihan report. At the recent annual meeting of the American Sociological Association, attendees discussed the resurgence of scholarship on culture.

In recent years, prominent black Americans have begun to speak out on the subject. In 2004 the comedian Bill Cosby made headlines when he criticized poor blacks for "not parenting" and dropping out of school. President Obama, who was abandoned by his father, has repeatedly talked about "responsible fatherhood."

None of this is new. Repeated studies have shown a clear link between the decline of the black family and declines in graduation rates, student achievement, and employment. A 2002 report from the Institute for American Values, a non-partisan group that studies families, concluded that "marriage is an issue of paramount importance if we wish to help the most vulnerable members of our society: the poor, minorities, and children."

The statistical evidence for that claim is strong. Research shows that most black children, 68 percent, are born to unwed mothers. Those numbers have real consequences. For example, 35 percent of black women who have had a child out of wedlock live in poverty, while only 17 percent of married black women are in poverty. In a 2005 report, the institute concluded:

Economically, marriage for black Americans is a wealth-creating and poverty-reducing institution. The marital status of African American parents is one of the most powerful determinants of the economic status of African-American families.

Over the past fifty years, the percentage of black families headed by married couples declined from 78 percent to 34 percent. In the thirty years from 1950 to 1980, households headed by black women who never married jumped from 3.8 per thousand to 69.7 per thousand. In 1940, 75 percent of black children lived with both parents. By 1990, only 33 percent of black children lived with a mother and father.

"For policymakers who care about black America, marriage matters," wrote the authors of the report, a group of black scholars. They called marriage in black America an important strategy for "improving the well-being of African Americans and for strengthening civil society."

The latest study of the achievement gap separating black and white students should focus attention on the real causes for this problem. Unless a problem is diagnosed properly, it will never be solved. For too long, all discussions of the "culture of poverty" have been silenced with the false charge of "blaming the victim."

In America today, any individual, regardless of race or background, can go as far as his or her abilities will allow. But when doors are opened to all, it still requires hard work and determination to take advantage of these opportunities. Without strong and intact families, these qualities seem to be in short supply. How to rebuild such families should be the focus of increasing attention.

NPR and Juan Williams: the Peril of Speaking Honestly in an Era of Political Correctness

Juan Williams, a respected journalist and ten-year veteran of National Public Radio (NPR), was fired in October as a top news analyst for remarks he made about Muslims during an appearance on Fox News.

In a segment with Fox News talk-show host Bill O'Reilly, Williams acknowledged feeling "nervous" in the wake of the September 11 attacks when he sees Muslims board a plane on which he is traveling.

He said:

Look, Bill, I'm not a bigot. You know the kind of books I've written about the civil rights movement in this country. But when I get on the plane, I got to tell you, if I see people who are in Muslim garb and I think, you know, they are identifying themselves first and foremost as Muslims, I get worried. I get nervous.

Later in the same segment, Williams challenged O'Reilly's suggestion that "the Muslims attacked us on 9/11," saying it was wrong to generalize about Muslims in this way, just as it would be wrong to generalize about Christians from those, such as Oklahoma City bomber Timothy McVeigh, who have committed acts of terrorism. "There are good Muslims," Williams said later, distinguishing them from "extremists."

A former Washington Post reporter and columnist, Williams began his tenure with Fox News in 1997, three years before he was hired by NPR. While at NPR, he hosted the daily program "Talk of the Nation" and commented on its news programs "Morning Edition" and "All Things Considered."

In an interview shortly after being fired, Williams declared:

As a journalist, it's unsupportable that your employer would fire you for stating your honest feelings in an appropriate setting. . . . I think that I am open to being misinterpreted only if you snip one line out of what I said. But I would never guess that people who are professional journalists would just take one line and make me look bigoted so they can use it as an excuse to get rid of me.

Williams was not only fired by NPR, but his mental stability was also questioned. NPR chief executive Vivian Schiller told an audience at the Atlanta Press Club that Williams should have kept his feelings about Muslims between himself and "his psychiatrist or his publicist."

The outcry against Williams' firing has been widespread. Conservatives, of course, were particularly vocal. Leading Republicans, including former Arkansas Governor Mike Huckabee, former Alaska Governor Sarah Palin, political strategist Karl Rove, and House Minority Leader John Boehner of Ohio, released statements attacking the move and questioning why taxpayers should help fund NPR's budget. This, of course, was to be expected.

Many others have also criticized NPR's action. The Washington Post reported that, "Even NPR's own staff expressed exasperation at the decision during a meeting . . . with NPR president Vivian Schiller." Editorially, The Post declared that, "In firing Juan Williams, NPR discourages honest conversation."

In The Post's view:

In a democracy, the media must foster a free and robust political debate, even if such debate may, at times, offend some people. . . . What was Mr. Williams' sin? He admitted, with apparent chagrin, that he has engaged in a kind of racial profiling in the years since the September 11 attacks. . . . In making this confession, Mr. Williams undoubtedly spoke for many Americans who are wrestling with similar feelings. His words could be offensive to some, if construed as an endorsement of negative stereotyping. But the full broadcast makes clear that Mr. Williams intended the opposite. To be sure, he struggled to get his point across, because host Bill O'Reilly kept interrupting him. But Mr. Williams did manage to observe that "We don't want in America people to have their rights violated, to be attacked on the street because they hear rhetoric from Bill O'Reilly and they act crazy."

The Post concludes:

In short, Mr. Williams was attempting to do exactly what a responsible commentator should do: speak honestly without being inflammatory. His reward was to lose his job, just as Agriculture Department employee Shirley Sherrod lost hers over purportedly racist remarks that turned out to be anything but. NPR management appears to have learned nothing from that rush to judgment. "Political correctness can lead to some kind of paralysis where you don't address reality," Mr. Williams told Mr. O'Reilly. NPR, alas, has proved his point.

TV political satirist Jon Stewart criticized the firing of Juan Williams, as well as the extremes to which "political correctness" now frames what is permissible speech. When CNN commentator Rick Sanchez complained of Stewart's mocking him on "The Daily Show," he cast his treatment as that of a member of a minority group, being a Cuban-American. His interviewer pointed out that Stewart, being Jewish, was also a member of a minority. Sanchez responded that most of those in leadership positions at CNN and other networks were "just like Stewart." He was then fired. Stewart spoke out against Sanchez's firing as well, citing it as another example of the extremes to which political correctness has taken us.

Interestingly, examples of such editing of speech come from liberals as well. The recipient of this year's Mark Twain Prize for American Humor was Tina Fey. In its broadcast of the Kennedy Center's award ceremony, PBS edited Fey's acceptance speech, in which she mock-praised "conservative women" like Sarah Palin, whom Fey has impersonated on "Saturday Night Live." Fey said that the rise of conservative women in politics is good for all women "unless you don't want to pay for your own rape kit. . . . Unless you're a lesbian who wants to get married to your partner of 20 years. . . . Unless you believe in evolution."

That, however, was not what viewers heard when PBS broadcast an edited version of Fey's speech. The part about the rape kits and evolution was gone, leaving only Fey's blander, more harmonious comments about Palin and politics. Was PBS shielding its viewers from Fey's more pointed remarks?

It is unfortunate that our society has fewer and fewer newspapers. It is unfortunate that cable news provides less news and more overheated opinion, on both the right and the left. What a free society desperately needs is free and open discussion -- with a variety of viewpoints being heard. Juan Williams has always provided a thoughtful voice, whether or not one agrees with his views. To fire him, as NPR has done, sends a chilling message to the nation about free speech. Juan Williams, of course, has a new contract with Fox and is doing very well indeed. It is the rest of us who are the losers. *

"No taxes can be devised which are not more or less inconvenient and unpleasant." --George Washington
