Sunday, March 15, 2009

GOD IS DEAD?

The New York Times

March 15, 2009
Op-Ed Columnist
The Culture Warriors Get Laid Off
By FRANK RICH

SOMEDAY we’ll learn the whole story of why George W. Bush brushed off that intelligence briefing of Aug. 6, 2001, “Bin Laden Determined to Strike in U.S.” But surely a big distraction was the major speech he was readying for delivery on Aug. 9, his first prime-time address to the nation. The subject — which Bush hyped as “one of the most profound of our time” — was stem cells. For a presidency in thrall to a thriving religious right (and a presidency incapable of multi-tasking), nothing, not even terrorism, could be more urgent.

When Barack Obama ended the Bush stem-cell policy last week, there were no such overheated theatrics. No oversold prime-time address. No hysteria from politicians, the news media or the public. The family-values dinosaurs that once stalked the earth — Falwell, Robertson, Dobson and Reed — are now either dead, retired or disgraced. Their less-famous successors pumped out their pro forma e-mail blasts, but to little avail. The Republican National Committee said nothing whatsoever about Obama’s reversal of Bush stem-cell policy. That’s quite a contrast to 2006, when the party’s wild and crazy (and perhaps transitory) new chairman, Michael Steele, likened embryonic stem-cell research to Nazi medical experiments during his failed Senate campaign.

What has happened between 2001 and 2009 to so radically change the cultural climate? Here, at last, is one piece of good news in our global economic meltdown: Americans have less and less patience for the intrusive and divisive moral scolds who thrived in the bubbles of the Clinton and Bush years. Culture wars are a luxury the country — the G.O.P. included — can no longer afford.

Not only was Obama’s stem-cell decree an anticlimactic blip in the news, but so was his earlier reversal of Bush restrictions on the use of federal money by organizations offering abortions overseas. When the administration tardily ends “don’t ask, don’t tell,” you can bet that this action, too, will be greeted by more yawns than howls.

Once again, both the president and the country are following New Deal-era precedent. In the 1920s boom, the reigning moral crusade was Prohibition, and it packed so much political muscle that F.D.R. didn’t oppose it. The Anti-Saloon League was the Moral Majority of its day, the vanguard of a powerful fundamentalist movement that pushed anti-evolution legislation as vehemently as it did its war on booze. (The Scopes “monkey trial” was in 1925.) But the political standing of this crowd crashed along with the stock market. Roosevelt shrewdly came down on the side of “the wets” in his presidential campaign, leaving Hoover to drown with “the dries.”

Much as Obama repealed the Bush restrictions on abortion and stem-cell research shortly after pushing through his stimulus package, so F.D.R. jump-started the repeal of Prohibition by asking Congress to legalize beer and wine just days after his March 1933 inauguration and declaration of a bank holiday. As Michael A. Lerner writes in his fascinating 2007 book “Dry Manhattan,” Roosevelt’s stance reassured many Americans that they would have a president “who not only cared about their economic well-being” but who also understood their desire to be liberated from “the intrusion of the state into their private lives.” Having lost plenty in the Depression, the public did not want to surrender any more freedoms to the noisy minority that had shut down the nation’s saloons.

In our own hard times, the former moral “majority” has been downsized to more of a minority than ever. Polling shows that nearly 60 percent of Americans agree with ending Bush restrictions on stem-cell research (a Washington Post/ABC News survey in January); that 55 percent endorse either gay civil unions or same-sex marriage (Newsweek, December 2008); and that 75 percent believe openly gay Americans should serve in the military (Post/ABC, July 2008). Even the old indecency wars have subsided. When a federal court last year struck down the F.C.C. fine against CBS for Janet Jackson’s “wardrobe malfunction” at the 2004 Super Bowl, few Americans either noticed or cared about the latest twist in what had once been a national cause célèbre.

It’s not hard to see why Eric Cantor, the conservative House firebrand who is vehemently opposed to stem-cell research, was disinclined to linger on the subject when asked about it on CNN last Sunday. He instead accused the White House of acting on stem cells as a ploy to distract from the economy. “Let’s take care of business first,” he said. “People are out of jobs.” (On this, he’s joining us late, but better late than never.)

Even were the public still in the mood for fiery invective about family values, the G.O.P. has long since lost any authority to lead the charge. The current Democratic president and his family are exemplars of precisely the Eisenhower-era squareness — albeit refurbished by feminism — that the Republicans often preached but rarely practiced. Obama actually walks the walk. As the former Bush speechwriter David Frum recently wrote, the new president is an “apparently devoted husband and father” whose worst vice is “an occasional cigarette.”

Frum was contrasting Obama to his own party’s star attraction, Rush Limbaugh, whose “history of drug dependency” and “tangled marital history” make him “a walking stereotype of self-indulgence.” Indeed, the two top candidates for leader of the post-Bush G.O.P., Rush and Newt, have six marriages between them. The party that once declared war on unmarried welfare moms, homosexual “recruiters” and Bill Clinton’s private life has been rebranded by Mark Foley, Larry Craig, David Vitter and the irrepressible Palins. Even before the economy tanked, Americans had more faith in medical researchers using discarded embryos to battle Parkinson’s and Alzheimer’s than in Washington politicians making ad hoc medical decisions for Terri Schiavo.

What’s been revealing about watching conservatives debate their fate since their Election Day Waterloo is how, the occasional Frum excepted, so many of them don’t want to confront the obsolescence of culture wars as a political crutch. They’d rather, like Cantor, just change the subject — much as they avoid talking about Bush and avoid reckoning with the doomed demographics of the G.O.P.’s old white male base. To recognize all these failings would be to confront why a once-national party can now be tucked into the Bible Belt.

The religious right is even more in denial than the Republicans. When Obama nominated Kathleen Sebelius, the Roman Catholic Kansas governor who supports abortion rights, as his secretary of health and human services, Tony Perkins, the leader of the Family Research Council, became nearly as apoplectic as the other Tony Perkins playing Norman Bates. “If Republicans won’t take a stand now, when will they?” the godly Perkins thundered online. But Congressional Republicans ignored him, sending out (at most) tepid press releases of complaint, much as they did in response to Obama’s stem-cell order. The two antiabortion Kansas Republicans in the Senate, Sam Brownback and Pat Roberts, both endorsed Sebelius.

Perkins is now praying that economic failure will be a stimulus for his family-values business. “As the economy goes downward,” he has theorized, “I think people are going to be driven to religion.” Wrong again. The latest American Religious Identification Survey, published last week, found that most faiths have lost ground since 1990 and that the fastest-growing religious choice is “None,” up from 8 percent to 15 percent (which makes it larger than all denominations except Roman Catholics and Baptists). Another highly regarded poll, the General Social Survey, had an even more startling finding in its preliminary 2008 data released this month: Twice as many Americans have a “great deal” of confidence in the scientific community as do in organized religion. How the almighty has fallen: organized religion is in a dead heat with banks and financial institutions on the confidence scale.

This, too, is a replay of the Great Depression. “One might have expected that in such a crisis great numbers of these people would have turned to the consolations and inspirations of religion,” wrote Frederick Lewis Allen in “Since Yesterday,” his history of the 1930s published in 1940. But that did not happen: “The long slow retreat of the churches into less and less significance in the life of the country, and even in the lives of the majority of their members, continued almost unabated.”

The new American faith, Allen wrote, was the “secular religion of social consciousness.” It took the form of campaigns for economic and social justice — as exemplified by the New Deal and those movements that challenged it from both the left and the right. It’s too early in our crisis and too early in the new administration to know whether this decade will so closely replicate the 1930s, but so far Obama has far more moral authority than any religious leader in America with the possible exception of his sometime ally, the Rev. Rick Warren.

History is cyclical, and it would be foolhardy to assume that the culture wars will never return. But after the humiliations of the Scopes trial and the repeal of Prohibition, it did take a good four decades for the religious right to begin its comeback in the 1970s. In our tough times, when any happy news can be counted as a miracle, a 40-year exodus for these ayatollahs can pass for an answer to America’s prayers.

Wednesday, March 11, 2009

Stewart has Another Big Swinging Dick on his Pole



________________________________________

Cramer on Stock Manipulation: "It's satisfying"

Sunday, March 08, 2009

The Death of Ayn Rand

from The New York Times

March 8, 2009
Op-Ed Columnist
Some Things Don’t Change in Grover’s Corners
By FRANK RICH

“WHEREVER you come near the human race, there’s layers and layers of nonsense,” says the Stage Manager in Thornton Wilder’s “Our Town.” Those words were first heard by New York audiences in February 1938, as America continued to reel from hard times. The Times’s front page told of 100,000 auto workers protesting layoffs in Detroit and of a Republican official attacking the New Deal as “fascist.” Though no one was buying cars, F.D.R. had the gall to endorse a mammoth transcontinental highway construction program to put men back to work.

In the 71 years since, Wilder’s drama has become a permanent yet often dormant fixture in our culture, like the breakfront that’s been in the dining room so long you stopped noticing its contents. Requiring no scenery and many players, “Our Town” is the perennial go-to “High School Play.” But according to A. Tappan Wilder, the playwright’s nephew and literary executor, professional productions have doubled since 2005, including two separate hit revivals newly opened in Chicago and New York.

You can see why there’s a spike in the “Our Town” market. Once again its astringent distillation of life and death in the fictional early-20th-century town of Grover’s Corners, N.H., is desperately needed to help strip away “layers and layers of nonsense” so Americans can remember who we are — and how lost we got in the boom before our bust.

At the director David Cromer’s shattering rendition of the play now running in Greenwich Village, it’s impossible not to be moved by that Act III passage where the Stage Manager comes upon the graves of Civil War veterans in the town cemetery. “New Hampshire boys,” he says, “had a notion that the Union ought to be kept together, though they’d never seen more than 50 miles of it themselves. All they knew was the name, friends — the United States of America. The United States of America. And they went and died about it.”

Wilder was not a nostalgic, sentimental or jingoistic writer. Grover’s Corners isn’t populated by saints but by regular people, some frivolous and some ignorant and at least one suicidal. But when the narrator evokes a common national good and purpose — unfurling our country’s full name in the rhetorical manner also favored by our current president — you feel the graveyard’s chill wind. It’s a trace memory of an American faith we soiled and buried with all our own nonsense in the first decade of our new century.

Retrieving that faith now requires extraordinary patience and optimism. We’re still working our way through the aftershocks of the orgy of irresponsibility and greed that brought America to this nadir. In his recent letter to shareholders, a chastened Warren Buffett likened our financial institutions’ recklessness to venereal disease. Even the innocent were infected because “it’s not just whom you sleep with” but also “whom they” — unnamed huge financial institutions — “are sleeping with,” he wrote. Indeed, our government is in the morally untenable position of rewarding the most promiscuous carrier of them all, A.I.G., with as much as $180 billion in taxpayers’ cash transfusions (so far) precisely because it can’t be disentangled from all the careless (and unidentified) trading partners sharing its infection.

Buffett’s sermon coincided with the public soul searching of another national sage, Elie Wiesel, who joined a Portfolio magazine panel discussion on Bernie Madoff. Some $37 million of Wiesel’s charitable foundation and personal wealth vanished in Madoff’s Ponzi scheme. “We gave him everything,” Wiesel told the audience. “We thought he was God.”

How did reality become so warped that Wiesel, let alone thousands of lesser mortals, could mistake Madoff for God? It was this crook’s ability to pass for a deity that allowed his fraud to escape scrutiny not just from his victims but from the S.E.C. and the “money managers” who pimped his wares. This aura of godliness also shielded the “legal” Madoffs at firms like Citibank and Goldman Sachs. They spread V.D. with esoteric derivatives, then hedged their wild gambles with A.I.G. “insurance” (credit-default swaps) that proved to be the most porous prophylactics in the history of finance.

The simplest explanation for why America’s reality got so distorted is the economic imbalance that Barack Obama now wants to remedy with policies that his critics deride as “socialist” (“fascist” can’t be far behind): the obscene widening of income inequality between the very rich and everyone else since the 1970s. “There is something wrong when we allow the playing field to be tilted so far in the favor of so few,” the president said in his budget message. He was calling for fundamental fairness, not class warfare. America hasn’t seen such gaping inequality since the Gilded Age and 1920s boom that preceded the Great Depression.

This inequity was compounded by Bush tax policy and by lawmakers and regulators of both parties who enabled and protected the banking scam artists who fled with their bonuses and left us holding the toxic remains. The fantasy of easy money at the top of the economic pyramid trickled down to the masses, who piled up debt by leveraging their homes much as their ’20s predecessors once floated stock purchases “on margin.” Our culture, meanwhile, painted halos over celebrity C.E.O.’s, turning the fundamentalist gospel of the market into a national religion that further accelerated the country’s wholesale flight from reality.

The once-lionized lifestyles of the rich and infamous were appallingly tacky. John Thain’s parchment trash can was merely the tip of the kitschy iceberg. The level of taste flaunted by America’s upper caste at the bubble’s height had less in common with the Medicis than, say, Uday and Qusay Hussein.

The cultural crash should have been a tip-off to the economic crash to come. Paul Greenwood and Stephen Walsh, money managers whose alleged $667 million fraud looted the endowments at the University of Pittsburgh and Carnegie Mellon, were fond of collecting Steiff stuffed animals, including an $80,000 teddy bear. Sir Robert Allen Stanford — a Texan who purchased that “Sir” by greasing palms in Antigua — poured some of his alleged $8 billion in ill-gotten gains into a castle, complete with moat, man-made cliff and pub. He later demolished it, no doubt out of boredom.

In a class apart is the genteel Walter Noel, whose family-staffed Fairfield Greenwich Group fed some $7 billion into Madoff’s maw. The Noels promoted themselves, their business and their countless homes by posing for Town & Country. Their firm took in at least $500 million in fees (since 2003 alone) for delivering sheep to the Madoff slaughterhouse. In exchange, Fairfield Greenwich claimed to apply “due diligence” to every portfolio transaction — though we now know Madoff didn’t actually trade a single stock or bond listed in his statements for at least the past 13 years.

But in the bubble culture, money ennobled absolutely. A former Wall Street executive vouched for his pal Noel to The Times: “He’s a terribly good person, almost in the sense of Jimmy Stewart in ‘It’s a Wonderful Life’ combined with an overtone of Gregory Peck in ‘To Kill a Mockingbird.’ ”

Last week Jon Stewart whipped up a well-earned frenzy with an eight-minute “Daily Show” takedown of the stars of CNBC, the business network that venerated our financial gods, plugged their stocks and hyped the bubble’s reckless delusions. (Just as it had in the dot-com bubble.) Stewart’s horrifying clip reel featured Jim Cramer reassuring viewers that Bear Stearns was “not in trouble” just six days before its March 2008 collapse; Charlie Gasparino lip-syncing A.I.G.’s claim that its subprime losses were “very manageable” in December 2007; and Larry Kudlow declaring last April that “the worst of this subprime business is over.” The coup de grâce was a CNBC interviewer fawning over the lordly Robert Allen Stanford. Stewart spoke for many when he concluded, “Between the two of them I can’t decide which one of those guys I’d rather see in jail.”

Led by Cramer and Kudlow, the CNBC carnival barkers are now, without any irony whatsoever, assailing the president as a radical saboteur of capitalism. It’s particularly rich to hear Cramer tar Obama (or anyone else) for “wealth destruction” when he followed up his bum steer to viewers on Bear Stearns with oleaginous on-camera salesmanship for Wachovia and its brilliant chief executive, a Cramer friend and former boss, just two weeks before it, too, collapsed. What should really terrify the White House is that Cramer last month gave a big thumbs-up to Timothy Geithner’s bank-rescue plan.

In one way, though, the remaining vestiges of the past decade’s excesses, whether they live on in the shouted sophistry of CNBC or in the ashes of Stanford’s castle, are useful. Seen in the cold light of our long hangover, they remind us that it was the America of the bubble that was aberrant and perverse, creating a new normal that wasn’t normal at all.

The true American faith endures in “Our Town.” The key word in its title is the collective “our,” just as “united” is the resonant note hit by the new president when saying the full name of the country. The notion that Americans must all rise and fall together is the ideal we still yearn to reclaim, and that a majority voted for in November. But how we get there from this economic graveyard is a challenge rapidly rivaling the one that faced Wilder’s audience in that dark late winter of 1938.

__________________________________

The New York Times

March 8, 2009
Op-Ed Columnist
The Inflection Is Near?
By THOMAS L. FRIEDMAN

Sometimes the satirical newspaper The Onion is so right on, I can’t resist quoting from it. Consider this faux article from June 2005 about America’s addiction to Chinese exports:

FENGHUA, China — Chen Hsien, an employee of Fenghua Ningbo Plastic Works Ltd., a plastics factory that manufactures lightweight household items for Western markets, expressed his disbelief Monday over the “sheer amount of [garbage] Americans will buy. Often, when we’re assigned a new order for, say, ‘salad shooters,’ I will say to myself, ‘There’s no way that anyone will ever buy these.’ ... One month later, we will receive an order for the same product, but three times the quantity. How can anyone have a need for such useless [garbage]? I hear that Americans can buy anything they want, and I believe it, judging from the things I’ve made for them,” Chen said. “And I also hear that, when they no longer want an item, they simply throw it away. So wasteful and contemptible.”

Let’s today step out of the normal boundaries of analysis of our economic crisis and ask a radical question: What if the crisis of 2008 represents something much more fundamental than a deep recession? What if it’s telling us that the whole growth model we created over the last 50 years is simply unsustainable economically and ecologically and that 2008 was when we hit the wall — when Mother Nature and the market both said: “No more.”

We have created a system for growth that depended on our building more and more stores to sell more and more stuff made in more and more factories in China, powered by more and more coal that would cause more and more climate change but earn China more and more dollars to buy more and more U.S. T-bills so America would have more and more money to build more and more stores and sell more and more stuff that would employ more and more Chinese ...

We can’t do this anymore.

“We created a way of raising standards of living that we can’t possibly pass on to our children,” said Joe Romm, a physicist and climate expert who writes the indispensable blog climateprogress.org. We have been getting rich by depleting all our natural stocks — water, hydrocarbons, forests, rivers, fish and arable land — and not by generating renewable flows.

“You can get this burst of wealth that we have created from this rapacious behavior,” added Romm. “But it has to collapse, unless adults stand up and say, ‘This is a Ponzi scheme. We have not generated real wealth, and we are destroying a livable climate ...’ Real wealth is something you can pass on in a way that others can enjoy.”

Over a billion people today suffer from water scarcity; deforestation in the tropics destroys an area the size of Greece every year — more than 25 million acres; more than half of the world’s fisheries are over-fished or fished at their limit.

“Just as a few lonely economists warned us we were living beyond our financial means and overdrawing our financial assets, scientists are warning us that we’re living beyond our ecological means and overdrawing our natural assets,” argues Glenn Prickett, senior vice president at Conservation International. But, he cautioned, as environmentalists have pointed out: “Mother Nature doesn’t do bailouts.”

One of those who has been warning me of this for a long time is Paul Gilding, the Australian environmental business expert. He has a name for this moment — when both Mother Nature and Father Greed have hit the wall at once — “The Great Disruption.”

“We are taking a system operating past its capacity and driving it faster and harder,” he wrote me. “No matter how wonderful the system is, the laws of physics and biology still apply.” We must have growth, but we must grow in a different way. For starters, economies need to transition to the concept of net-zero, whereby buildings, cars, factories and homes are designed not only to generate as much energy as they use but to be infinitely recyclable in as many parts as possible. Let’s grow by creating flows rather than plundering more stocks.

Gilding says he’s actually an optimist. So am I. People are already using this economic slowdown to retool and reorient economies. Germany, Britain, China and the U.S. have all used stimulus bills to make huge new investments in clean power. South Korea’s new national paradigm for development is called: “Low carbon, green growth.” Who knew? People are realizing we need more than incremental changes — and we’re seeing the first stirrings of growth in smarter, more efficient, more responsible ways.

In the meantime, says Gilding, take notes: “When we look back, 2008 will be a momentous year in human history. Our children and grandchildren will ask us, ‘What was it like? What were you doing when it started to fall apart? What did you think? What did you do?’ ” Often in the middle of something momentous, we can’t see its significance. But for me there is no doubt: 2008 will be the marker — the year when ‘The Great Disruption’ began.

Friday, February 20, 2009

Money for Idiots

February 20, 2009
Op-Ed Columnist
Money for Idiots
By DAVID BROOKS

Our moral and economic system is based on individual responsibility. It’s based on the idea that people have to live with the consequences of their decisions. This makes them more careful deciders. This means that society tends toward justice — people get what they deserve as much as possible.

Over the last few months, we’ve made a hash of all that. The Bush and Obama administrations have compensated foolishness and irresponsibility. The financial bailouts reward bankers who took insane risks. The auto bailouts subsidize companies and unions that made self-indulgent decisions a few decades ago that drove their industry into the ground.

The stimulus package handed tens of billions of dollars to states that spent profligately during the prosperity years. The Obama housing plan will force people who bought sensible homes to subsidize the mortgages of people who bought houses they could not afford. It will almost certainly force people who were honest on their loan forms to subsidize people who were dishonest on theirs.

These injustices are stoking anger across the country, lustily expressed by Rick Santelli on CNBC Thursday morning. “The government is promoting bad behavior!” Santelli cried as Chicago traders cheered him on. “The president ... should put up a Web site ... to have people vote ... to see if they want to subsidize losers’ mortgages!”

Well, in some cases we probably do. That’s because government isn’t fundamentally in the Last Judgment business, making sure everybody serves penance for their sins. In times like these, government is fundamentally in the business of stabilizing the economic system as a whole.

Let me put it this way: Psychologists have a saying that when a couple comes in for marriage therapy, there are three patients in the room — the husband, the wife and the marriage itself. The marriage is the living history of all the things that have happened between husband and wife. Once the patterns are set, the marriage itself begins to shape their individual behavior. Though it exists in the space between them, it has an influence all its own.

In the same way, an economy has an economic culture. Out of billions of individual decisions, a common economic landscape emerges, which frames and influences the decisions everybody makes.

Right now, the economic landscape looks like that movie of the swaying Tacoma Narrows Bridge you might have seen in a high school science class. It started swinging in small ways and then the oscillations built on one another until the whole thing was freakishly alive and the pavement looked like liquid.
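
A side note not in Brooks's column: the Tacoma Narrows image is essentially resonance, small pushes timed to a system's own rhythm compounding into huge swings. The toy simulation below is only meant to show that compounding; every parameter is arbitrary, and it models neither the bridge nor the economy.

import math

OMEGA = 1.0        # natural frequency of the system (rad/s)
FORCE = 0.05       # size of each small, well-timed push
DT = 0.001         # integration time step (s)

x, v = 0.0, 0.0    # start at rest
peak = 0.0
for step in range(200_000):
    t = step * DT
    # restoring pull back toward center, plus a push in sync with the swing
    a = -OMEGA ** 2 * x + FORCE * math.sin(OMEGA * t)
    v += a * DT    # semi-implicit Euler update
    x += v * DT
    peak = max(peak, abs(x))
    if step % 40_000 == 0:
        print(f"t = {t:6.1f} s   largest swing so far: {peak:.2f}")

Each printout shows the swing growing even though the push itself never gets any bigger.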

A few years ago, the global economic culture began swaying. The government enabled people to buy homes they couldn’t afford. The Fed provided easy money. The Chinese sloshed in oceans of capital. The giddy upward sway produced a crushing ride down.

These oscillations are the real moral hazard. Individual responsibility doesn’t mean much in an economy like this one. We all know people who have been laid off through no fault of their own. The responsible have been punished along with the profligate.

It makes sense for the government to intervene to try to reduce the oscillation. It makes sense for government to try to restore some communal order. And the sad reality is that in these circumstances government has to spend money on precisely those sectors that have been swinging most wildly — housing, finance, etc. It has to help stabilize people who have been idiots.

Actually executing this is a near-impossible task. Looking at the auto, housing and banking bailouts, we’re getting a sense of how the propeller heads around Obama operate. They try to put together programs that are bold, but without the huge interventions in the market implied by, say, nationalization. They’re balancing so many cross-pressures, they often come up with technocratic Rube Goldberg schemes that alter incentives in lots of medium and small ways. Some economists argue that the plans are too ineffectual, others that they are too opaque (estimates for the mortgage plan range from $75 billion to $275 billion and up). Personally, I hate the idea of 10 guys sitting around in the White House trying to redesign huge swaths of the U.S. economy on legal pads.

But at least they seem to be driven by a spirit of moderation and restraint. They seem to be trying to keep as many market structures in place as possible so things can return to normal relatively smoothly.

And they seem to understand the big thing. The nation’s economy is not just the sum of its individuals. It is an interwoven context that we all share. To stabilize that communal landscape, sometimes you have to shower money upon those who have been foolish or self-indulgent. The greedy idiots may be greedy idiots, but they are our countrymen. And at some level, we’re all in this together. If their lives don’t stabilize, then our lives don’t stabilize.

Tuesday, January 27, 2009

so little faith...

The New York Times

January 27, 2009
What Life Asks of Us
By DAVID BROOKS

A few years ago, a faculty committee at Harvard produced a report on the purpose of education. “The aim of a liberal education,” the report declared, “is to unsettle presumptions, to defamiliarize the familiar, to reveal what is going on beneath and behind appearances, to disorient young people and to help them to find ways to reorient themselves.”

The report implied an entire way of living. Individuals should learn to think for themselves. They should be skeptical of pre-existing arrangements. They should break free from the way they were raised, examine life from the outside and discover their own values.

This approach is deeply consistent with the individualism of modern culture, with its emphasis on personal inquiry, personal self-discovery and personal happiness. But there is another, older way of living, and it was discussed in a neglected book that came out last summer called “On Thinking Institutionally” by the political scientist Hugh Heclo.

In this way of living, to borrow an old phrase, we are not defined by what we ask of life. We are defined by what life asks of us. As we go through life, we travel through institutions — first family and school, then the institutions of a profession or a craft.

Each of these institutions comes with certain rules and obligations that tell us how to do what we’re supposed to do. Journalism imposes habits that help reporters keep a mental distance from those they cover. Scientists have obligations to the community of researchers. In the process of absorbing the rules of the institutions we inhabit, we become who we are.

New generations don’t invent institutional practices. These practices are passed down and evolve. So the institutionalist has a deep reverence for those who came before and built up the rules that he has temporarily taken delivery of. “In taking delivery,” Heclo writes, “institutionalists see themselves as debtors who owe something, not creditors to whom something is owed.”

The rules of a profession or an institution are not like traffic regulations. They are deeply woven into the identity of the people who practice them. A teacher’s relationship to the craft of teaching, an athlete’s relationship to her sport, a farmer’s relation to her land is not an individual choice that can be easily reversed when psychic losses exceed psychic profits. Her social function defines who she is. The connection is more like a covenant. There will be many long periods when you put more into your institutions than you get out.

In 2005, Ryne Sandberg was inducted into the baseball Hall of Fame. Heclo cites his speech as an example of how people talk when they are defined by their devotion to an institution:

“I was in awe every time I walked onto the field. That’s respect. I was taught you never, ever disrespect your opponents or your teammates or your organization or your manager and never, ever your uniform. You make a great play, act like you’ve done it before; get a big hit, look for the third base coach and get ready to run the bases.”

Sandberg motioned to those inducted before him, “These guys sitting up here did not pave the way for the rest of us so that players could swing for the fences every time up and forget how to move a runner over to third. It’s disrespectful to them, to you and to the game of baseball that we all played growing up.

“Respect. A lot of people say this honor validates my career, but I didn’t work hard for validation. I didn’t play the game right because I saw a reward at the end of the tunnel. I played it right because that’s what you’re supposed to do, play it right and with respect ... . If this validates anything, it’s that guys who taught me the game ... did what they were supposed to do, and I did what I was supposed to do.”

I thought it worth devoting a column to institutional thinking because, first, I try to keep a list of the people in public life I admire most. Invariably, the people who make that list have subjugated themselves to their profession, social function or institution.

Second, institutional thinking is eroding. Faith in all institutions, including charities, has declined precipitously over the past generation, not only in the U.S. but around the world. Lack of institutional awareness has bred cynicism and undermined habits of behavior. Bankers, for example, used to have a code that made them a bit stodgy and which held them up for ridicule in movies like “Mary Poppins.” But the banker’s code has eroded, and the result was not liberation but self-destruction.

Institutions do all the things that are supposed to be bad. They impede personal exploration. They enforce conformity.

But they often save us from our weaknesses and give meaning to life.

Monday, January 05, 2009

jobs


By SARAH E. NEEDLEMAN

Nineteen years ago, Jennifer Courter set out on a career path that has since provided her with a steady stream of lucrative, low-stress jobs. Now, her occupation -- mathematician -- has landed at the top spot on a new study ranking the best and worst jobs in the U.S.

"It's a lot more than just some boring subject that everybody has to take in school," says Ms. Courter, a research mathematician at mental images Inc., a maker of 3D-visualization software in San Francisco. "It's the science of problem-solving."

The study, to be released Tuesday from CareerCast.com, a new job site, evaluates 200 professions to determine the best and worst according to five criteria inherent to every job: environment, income, employment outlook, physical demands and stress. (CareerCast.com is published by Adicio Inc., in which Wall Street Journal owner News Corp. holds a minority stake.)

The findings were compiled by Les Krantz, author of "Jobs Rated Almanac," and are based on data from the U.S. Bureau of Labor Statistics and the Census Bureau, as well as studies from trade associations and Mr. Krantz's own expertise.
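
The article doesn't spell out how the five criteria are combined into a single ranking. Purely as an illustration of the general approach — a weighted composite score across normalized criteria — here is a minimal sketch; the weights, sample scores and function names are hypothetical, not CareerCast's or Mr. Krantz's actual method.

# Hypothetical composite-score ranking over five criteria
# (environment, income, outlook, physical demands, stress).
# All weights and sample scores below are invented for illustration.

CRITERIA = ["environment", "income", "outlook", "physical", "stress"]
WEIGHTS = {c: 1.0 for c in CRITERIA}  # equal weighting is an assumption

# Higher = better; assume "physical" and "stress" are already inverted
# so that low demands and low stress score high.
SAMPLE_JOBS = {
    "Mathematician": {"environment": 9, "income": 8, "outlook": 7, "physical": 9, "stress": 8},
    "Roofer":        {"environment": 3, "income": 4, "outlook": 4, "physical": 2, "stress": 4},
    "Lumberjack":    {"environment": 2, "income": 3, "outlook": 2, "physical": 1, "stress": 3},
}

def composite_score(scores):
    """Weighted sum of a job's five criterion scores."""
    return sum(WEIGHTS[c] * scores[c] for c in CRITERIA)

def rank_jobs(jobs):
    """Return (job, score) pairs sorted best-first."""
    ranked = sorted(jobs.items(), key=lambda kv: composite_score(kv[1]), reverse=True)
    return [(name, composite_score(scores)) for name, scores in ranked]

for name, score in rank_jobs(SAMPLE_JOBS):
    print(f"{name}: {score:.1f}")

Run as written, this toy example prints Mathematician first and Lumberjack last, mirroring the study's two extremes.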

According to the study, mathematicians fared best in part because they typically work in favorable conditions -- indoors and in places free of toxic fumes or noise -- unlike those toward the bottom of the list like sewage-plant operator, painter and bricklayer. They also aren't expected to do any heavy lifting, crawling or crouching -- attributes associated with occupations such as firefighter, auto mechanic and plumber.

The study also considers pay, which was determined by measuring each job's median income and growth potential. Mathematicians' annual income was pegged at $94,160, but Ms. Courter, 38, says her salary exceeds that amount.

Her job entails working as part of a virtual team that designs mathematically based computer programs, some of which have been used to make films such as "The Matrix" and "Speed Racer." She telecommutes from her home and rarely works overtime or feels stressed out. "Problem-solving involves a lot of thinking," says Ms. Courter. "I find that calming."

Other jobs at the top of the study’s list include actuary, statistician, biologist, software engineer, computer-systems analyst, historian and sociologist.

The Best and Worst Jobs

Of the 200 jobs studied, these came out on top -- and at the bottom:

    The Best                              The Worst
 1. Mathematician                    200. Lumberjack
 2. Actuary                          199. Dairy Farmer
 3. Statistician                     198. Taxi Driver
 4. Biologist                        197. Seaman
 5. Software Engineer                196. EMT
 6. Computer Systems Analyst         195. Garbage Collector
 7. Historian                        194. Welder
 8. Sociologist                      193. Roustabout
 9. Industrial Designer              192. Ironworker
10. Accountant                       191. Construction Worker
11. Economist                        190. Mail Carrier
12. Philosopher                      189. Sheet Metal Worker
13. Physicist                        188. Auto Mechanic
14. Parole Officer                   187. Butcher
15. Meteorologist                    186. Nuclear Decontamination Tech
16. Medical Laboratory Technician    185. Nurse (LN)
17. Paralegal Assistant              184. Painter
18. Computer Programmer              183. Child Care Worker
19. Motion Picture Editor            182. Firefighter
20. Astronomer                       181. Bricklayer

More on the Methodology

For methodology info and detailed job descriptions, go to http://careercast.com/jobs/content/JobsRated_Methodology


Mark Nord is a sociologist working for the Department of Agriculture's Economic Research Service in Washington, D.C. He studies hunger in American households and writes research reports about his findings. "The best part of the job is the sense that I'm making some contribution to good policy making," he says. "The kind of stuff that I crank out gets picked up by advocacy organizations, media and policy officials."

The study estimates sociologists earn $63,195, though Mr. Nord, 62, says his income is about double that amount. He says he isn't surprised by the findings because his job generates little stress and he works a steady 7:30 a.m. to 4 p.m. schedule. "It's all done at the computer at my desk," he says. "The main occupational hazard is carpal tunnel syndrome."

On the opposite end of the career spectrum are lumberjacks. The study shows these workers, also known as timber cutters and loggers, as having the worst occupation, because of the dangerous nature of their work, a poor employment outlook and low annual pay -- just $32,124.

New protective gear -- such as trouser covers made of fiber-reinforcement materials -- and an increased emphasis on safety have helped to reduce injuries among lumberjacks, says Paul Branch, who manages the timber department at Pike Lumber Co. in Akron, Ind. Still, accidents do occur from time to time, and some even result in death. "It's not a job everybody can do," says Mr. Branch.

But Eric Nellans, who has been cutting timber for the past 11 years for Pike Lumber, is passionate about his profession. "It's a very rewarding job, especially at the end of the day when you see the work you accomplished," he says. Mr. Nellans, 35, didn't become discouraged even after he accidentally knocked down a dead tree and broke his right leg in the process four years ago. "I was back in the woods cutting timber in five weeks," he says.

Other jobs at the bottom of the study: dairy farmer, taxi driver, seaman, emergency medical technician and roofer.

Mike Riegel, a 43-year-old roofer in Flemington, N.J., says he likes working "outside in the fresh air." Since he runs his own business, which he inherited from his father, he can start and end his day early in hot weather or do the opposite when it's cold.

The study estimates roofers earn annual incomes of $34,164, which Mr. Riegel says is consistent with what he pays new employees. Roofers also ranked poorly because of their hazardous working conditions. "You obviously can't be afraid of heights," says Mr. Riegel, who once fell two stories while working on a rooftop in the rain but luckily landed safely on a pile of soft dirt. "I missed some cement by 10 feet."

Write to Sarah E. Needleman at sarah.needleman@wsj.com

Sunday, January 04, 2009

A President Remembered

January 4, 2009
Op-Ed Columnist
A President Forgotten but Not Gone
By FRANK RICH

WE like our failed presidents to be Shakespearean, or at least large enough to inspire Oscar-worthy performances from magnificent tragedians like Frank Langella. So here, too, George W. Bush has let us down. Even the banality of evil is too grandiose a concept for 43. He is not a memorable villain so much as a sometimes affable second banana whom Josh Brolin and Will Ferrell can nail without breaking a sweat. He’s the reckless Yalie Tom Buchanan, not Gatsby. He is smaller than life.

The last NBC News/Wall Street Journal poll on Bush’s presidency found that 79 percent of Americans will not miss him after he leaves the White House. He is being forgotten already, even if he’s not yet gone. You start to pity him until you remember how vast the wreckage is. It stretches from the Middle East to Wall Street to Main Street and even into the heavens, which have been a safe haven for toxins under his passive stewardship. The discrepancy between the grandeur of the failure and the stature of the man is a puzzlement. We are still trying to compute it.

The one indisputable talent of his White House was its ability to create and sell propaganda both to the public and the press. Now that bag of tricks is empty as well. Bush’s first and last photo-ops in Iraq could serve as bookends to his entire tenure. On Thanksgiving weekend 2003, even as the Iraqi insurgency was spiraling, his secret trip to the war zone was a P.R. slam-dunk. The photo of the beaming commander in chief bearing a supersized decorative turkey for the troops was designed to make every front page and newscast in the country, and it did. Five years later, in what was intended as a farewell victory lap to show off Iraq’s improved post-surge security, Bush was reduced to ducking shoes.

He tried to spin the ruckus as another victory for his administration’s program of democracy promotion. “That’s what people do in a free society,” he said. He had made the same claim three years ago after the Palestinian elections, championed by his “freedom agenda” (and almost $500 million of American aid), led to a landslide victory for Hamas. “There is something healthy about a system that does that,” Bush observed at the time, as he congratulated Palestinian voters for rejecting “the old guard.”

The ruins of his administration’s top policy priority can be found not only in Gaza but in the new “democratic” Iraq, where the local journalist who tossed the shoes was jailed without formal charges and may have been tortured. Almost simultaneously, opponents of Prime Minister Nuri al-Maliki accused him of making politically motivated arrests of rival-party government officials in anticipation of this month’s much-postponed provincial elections.

Condi Rice blamed the press for the image that sullied Bush’s Iraq swan song: “That someone chose to throw a shoe at the president is what gets reported over and over.” We are back where we came in. This was the same line Donald Rumsfeld used to deny the significance of the looting in Baghdad during his famous “Stuff happens!” press conference of April 2003. “Images you are seeing on television you are seeing over, and over, and over,” he said then, referring to the much-recycled video of a man stealing a vase from the Baghdad museum. “Is it possible that there were that many vases in the whole country?” he asked, playing for laughs.

The joke was on us. Iraq burned, New Orleans flooded, and Bush remained oblivious to each and every pratfall on his watch. Americans essentially stopped listening to him after Hurricane Katrina hit in 2005, but he still doesn’t grasp the finality of their defection. Lately he’s promised not to steal the spotlight from Barack Obama once he’s in retirement — as if he could do so by any act short of running naked through downtown Dallas. The latest CNN poll finds that only one-third of his fellow citizens want him to play a post-presidency role in public life.

Bush is equally blind to the collapse of his propaganda machinery. Almost poignantly, he keeps trying to hawk his goods in these final days, like a salesman who hasn’t been told by the home office that his product has been discontinued. Though no one is listening, he has given more exit interviews than either Clinton or Reagan did. Along with old cronies like Karl Rove and Karen Hughes, he has also embarked on a Bush “legacy project,” as Stephen Hayes of The Weekly Standard described it on CNN.

To this end, Rove has repeated a stunt he first fed to the press two years ago: he is once again claiming that he and Bush have an annual book-reading contest, with Bush chalking up as many as 95 books a year, by authors as hifalutin as Camus. This hagiographic portrait of Bush the Egghead might be easier to buy were the former national security official Richard Clarke not quoted in the new Vanity Fair saying that both Rice and her deputy, Stephen Hadley, had instructed him early on to keep his memos short because the president is “not a big reader.”

Another, far more elaborate example of legacy spin can be downloaded from the White House Web site: a booklet recounting “highlights” of the administration’s “accomplishments and results.” With big type, much white space, children’s-book-like trivia boxes titled “Did You Know?” and lots of color photos of the Bushes posing with blacks and troops, its 52 pages require a reading level closer to “My Pet Goat” than “The Stranger.”

This document is the literary correlative to “Mission Accomplished.” Bush kept America safe (provided his presidency began Sept. 12, 2001). He gave America record economic growth (provided his presidency ended December 2007). He vanquished all the leading Qaeda terrorists (if you don’t count the leaders bin Laden and al-Zawahri). He gave Afghanistan a thriving “market economy” (if you count its skyrocketing opium trade) and a “democratically elected president” (presiding over one of the world’s most corrupt governments). He supported elections in Pakistan (after propping up Pervez Musharraf past the point of no return). He “led the world in providing food aid and natural disaster relief” (if you leave out Brownie and Katrina).

If this is the best case that even Bush and his handlers can make for his achievements, you wonder why they bothered. Desperate for padding, they devote four risible pages to portraying our dear leader as a zealous environmentalist.

But the brazenness of Bush’s alternative-reality history is itself revelatory. The audacity of its hype helps clear up the mystery of how someone so slight could inflict so much damage. So do his many print and television exit interviews.

The man who emerges is a narcissist with no self-awareness whatsoever. It’s that arrogance that allowed him to tune out even the most calamitous of realities, freeing him to compound them without missing a step. The president who famously couldn’t name a single mistake of his presidency at a press conference in 2004 still can’t.

He can, however, blame everyone else. Asked (by Charles Gibson) if he feels any responsibility for the economic meltdown, Bush says, “People will realize a lot of the decisions that were made on Wall Street took place over a decade or so, before I arrived.” Asked if the 2008 election was a repudiation of his administration, he says “it was a repudiation of Republicans.”

“The attacks of September the 11th came out of nowhere,” he said in another interview, as if he hadn’t ignored frantic intelligence warnings that summer of a Qaeda attack. But it was an “intelligence failure,” not his relentless invocation of patently fictitious “mushroom clouds,” that sped us into Iraq. Did he take too long to change course in Iraq? “What seems like an eternity today,” he says, “may seem like a moment tomorrow.” Try telling that to the families of the thousands killed and maimed during that multiyear “moment” as Bush stubbornly stayed his disastrous course.

The crowning personality tic revealed by Bush’s final propaganda push is his bottomless capacity for self-pity. “I was a wartime president, and war is very exhausting,” he told C-Span. “The president ends up carrying a lot of people’s grief in his soul,” he told Gibson. And so when he visits military hospitals, “it’s always been a healing experience,” he told The Wall Street Journal. But, incredibly enough, it’s his own healing he is concerned about, not that of the grievously wounded men and women he sent to war on false pretenses. It’s “the comforter in chief” who “gets comforted,” he explained, by “the character of the American people.” The American people are surely relieved to hear it.

With this level of self-regard, it’s no wonder that Bush could remain undeterred as he drove the country off a cliff. The smugness is reinforced not just by his history as the entitled scion of one of America’s aristocratic dynasties but also by his conviction that his every action is blessed from on high. Asked last month by an interviewer what he has learned from his time in office, he replied: “I’ve learned that God is good. All the time.”

Once again he is shifting the blame. This presidency was not about Him. Bush failed because in the end it was all about him.