Wednesday, September 30, 2009

Feeble recovery leaves economies confused

Sep 29, 2009

Over the past week, our correspondents looked at the seeds of revival in key Asian economies. Today, to wrap up the series, our Europe correspondent Jonathan Eyal looks at the prospects for the continent.

LONDON: A gentle sigh of relief can be heard from all European capitals. A year after the worst financial crisis in living memory, manufacturing output is rising, house prices are reviving and stock markets roaring again.

Yet the recovery is feeble - a mere 0.2 per cent overall growth. And the downturn has left deep scars. The economies of the 'old continent' are emerging from the storm battered and confused about their future.

After initially dismissing the crisis as just an American 'disease', European governments quickly made up for lost time: While the United States Congress was still debating its options, Europe bailed out its failing banks and was the first to unveil massive economic stimulus packages.

In doing so, the continent enjoyed two distinct advantages. It has the European Central Bank, which runs the euro, the currency of its biggest economies. Despite its reputation for prudence, the bank flooded Europe with cash. Even the small east European countries, outside the euro zone and vulnerable, were helped by this largesse.

Europe's second trump card was its social security net, the world's most extensive and lavish. It swung into action as people became unemployed or fell below the poverty line.

So, the mixture of extra government spending and social benefits ensured that the continent retained its genteel ways of life.

Mr Alfred Butt, a German factory worker, went ahead with his traditional family holiday on a sunny Mediterranean island, despite losing his job: 'We still have enough income, and it was good to get away,' he said.

Perversely, one key European sector actually benefited from the crisis: the car industry. Car manufacturers enjoyed an unprecedented boom, as customers took advantage of state subsidies to swop old models for new ones.

In Germany, at least two million people grabbed the $5,000 individual payout to ditch their old vehicles; new car sales shot up by 28 per cent last month alone.

The carnival atmosphere was so great that one German woman wondered if governments would introduce a scheme to swop old husbands for new. 'I'm in the market for that as well,' she said.

But the day of reckoning is approaching, because the measures to stimulate the economy have resulted in the fastest rise in government debt since World War II. In effect, Europe avoided an economic disaster, only to be saddled with a lingering problem for generations.

If no belt-tightening measures are taken, government debt will exceed Europe's entire annual wealth by the middle of the next decade.

And, if this was not enough, the economic recovery remains uneven. Germany and France may be out of the recession but Britain - hitherto the continent's best performer - is stuck: its economy will contract by a whopping 4 per cent this year.

In London, expatriates who flocked to work in Europe's financial capital are now deserting it in droves.

Mr Andrew Wesbecher, a young American who specialised in selling software to investment funds, left Britain this summer. 'When the performance bonuses go away, the value of being in this country goes away,' he concluded.

Over the past year, 45,000 jobs were lost in London's banking sector alone.

For many Europeans tired of being lectured about the virtues of unregulated capitalism, Britain's predicament is a source of wry satisfaction. French Prime Minister Francois Fillon claimed recently that 'the crisis has modified Europe's ideological landscape'.

From now on, the people of the continent will expect more state intervention, precisely what France argued for all along.

Perhaps, but the snag remains that no European government is in a position to borrow more, without risking massive inflation. Europe will have to cut expenditure and raise taxes at the same time. Governments will need to reprivatise their nationalised banks as fast as possible.

So, far from governments commanding the economy, economic realities will command policy.

The period of adjustment will be prolonged and painful. Unemployment is already rising fast: it averages 10 per cent of the labour force across western Europe, and stands at a horrible 18 per cent in Spain.

Mr Daniel Gros, who directs the Centre for European Policy Studies think-tank, articulated the continent's frustration. 'This crisis may have started in the US, but even more combustible material had accumulated in Europe, so it is likely that the cost will be higher here.'

Meanwhile, the rest of the world is moving on while the Europeans continue to lag behind.

Perhaps that is why French President Nicolas Sarkozy now advocates a radical new approach to measuring economic performance.

Instead of the old, boring national statistics, Mr Sarkozy is suggesting that other factors - such as long holidays, leisure activities or health care - should be included in a 'new index of happiness'.

Based on such yardsticks, Europe may still score well. But being a pleasant place to live is not the same as remaining a relevant actor on the world stage.

Sunday, September 27, 2009

Former Aware president Josie Lau leaves DBS

SINGAPORE: The former president of the Association of Women for Action and Research (Aware), Ms Josie Lau, has left DBS Bank and joined hospitality group Overseas Union Enterprise (OUE).

She is now vice-president for centre management at Mandarin Gallery, where she oversees operations and marketing for the new premier shopping mall, Weekend TODAY has learnt.

The 48-year-old was thrust into the limelight in March, when she became embroiled in a leadership tussle at Aware. Her decision to take on the role of president of the women's advocacy group earned her a public rebuke from DBS for ignoring its code of conduct.

At the time, she was DBS' head of marketing, cards and unsecured loans; she became its vice-president for network planning and deployment in May, prior to her resignation.

Ms Lau, who started her new job on Thursday, said she took it because it "allows me to leverage on all my past work experience and my love for fashion retail and marketing".

OUE chief executive Thio Gim Hock said Ms Lau had "expressed her desire to look for new pastures", and he felt she might be suitable for the post. "I asked her to send in her CV and forwarded (it) to the senior vice-president (of) leasing, who interviewed her."

Mr Thio is married to lawyer Thio Su Mien, who had encouraged Ms Lau to get involved in Aware. They attend the same church, but Ms Lau stressed her hiring was "based on my merit".

Likewise Mr Thio - who has known Ms Lau, "though not very well", for over 10 years - said: "It's easier and more desirable to employ people you already know ... Her getting the job was entirely due to her experience, qualification and personality."

Mr Thio also noted Ms Lau's "traumatic experience" during the Aware saga which included "having her employer threatening to sack her".

When contacted, a DBS spokesman said: "Ms Lau decided to leave DBS to pursue other opportunities. It was a personal decision, and we wish her the best in her new endeavours."

[I guess this will be seen as more evidence that the newspapers are out to get Josie. But it is news of interest. And it lends support to the idea that the ingroup will protect each other. Still, she needs to make a living and I can only imagine how the environment at DBS was for her.]

Friday, September 25, 2009

Program to curb malpractice lawsuits

Sep 24, 2009

WASHINGTON - ADMITTING fault and saying sorry can be hard for anyone, but for an American doctor whose medical error has killed a patient it can mean a lawsuit that ends his or her career.

Doug Wojcieszak recalled the way doctors literally ran away from his mother in the hallway of the hospital after their fatal misdiagnosis of his brother Jim in 1998.

'It's a cultural thing in medicine, that initial instinct to pull back and to sever the relationship with the patient or the family and clam up as a way to mitigate or reduce the chance of a lawsuit,' he told AFP.

His brother had walked into hospital complaining of chest, shoulder, neck and stomach pains - classic signs of a possible heart attack.

Because he was only 39, and 'a big, strong guy', the doctors automatically assumed he was having some kind of stomach problem.

When his parents brought him back the next day in excruciating pain, they had a closer look but critically mixed up his X-rays with those of his father, who had undergone tests in the same hospital months before.

'His blood test is showing heart distress, but they are looking at my dad's charts showing no blockage, so they misdiagnose my brother with a bacterial infection to the heart,' explained Mr Wojcieszak.

Doctors pumped Jim full of antibiotics for the next two days and by the time they realized their error it was too late - he died during emergency open heart surgery as they tried desperately to unblock four major arteries.

The tragedy was hard for the family to take, but their grief was compounded by an ensuing cover-up that meant they had no choice but to sue. Eventually, with the family still struggling to find any kind of closure, a settlement was reached in 2000.

'My parents had to relive the death of their first-born son for two years,' said Mr Wojcieszak.

Five years later he founded The Sorry Works! Coalition to unite fledgling projects that encouraged doctors to disclose mistakes, apologise and offer compensation.

His coalition argues that solving the malpractice crisis is not a legal problem, involving complex reform of America's tort laws, but simply a customer service one.

President Barack Obama might agree as he has instructed Health Secretary Kathleen Sebelius to examine such projects as part of a US$25 million (S$35 million) program to curb medical malpractice lawsuits.

'We're thrilled that it's part of the national debate,' said Mr Wojcieszak, who dismissed the existing system as a 'never-ending fight between lawyers and doctors, pointing fingers and yelling and screaming'.

Malpractice reform has stalled in the United States for decades, even as studies indicate more than 100,000 people die each year as a result of an estimated 15 million preventable medical errors.

In high-risk specialities such as neurosurgery and obstetrics, doctors also fork out hundreds of thousands of dollars a year on insurance premiums to offset the risk of professional error.

Mr Obama surprised many when he promised a new look at the complex issue as part of his ambitious health care overhaul, defying the conventional wisdom that malpractice reform is only a Republican cause.

Republicans, backed by the doctors, argue that caps should be imposed on the amount of damages patients can receive, but Mr Obama told the American Medical Association in June such measures would be unfair to those wrongfully harmed.

He is leaning more towards special courts and programs like Sorry Works, and Sebelius has been instructed to award grants from early 2010 to states and hospitals that find innovative ways to address the malpractice mess.

Mr Tim McDonald, chief safety officer at the University of Illinois Medical Center in Chicago, finds himself at the vanguard of this movement.

His doctors maintain communication with families when things go wrong, conduct full investigations of any errors and then apologise, when appropriate, before fast-tracking cases for possible settlement.

Since the program launched in 2006, the culture of admission has helped doctors introduce new monitoring procedures for sedation, develop a better system to prevent blood clots, and even find equipment left inside patients.

'We had a case where we left a sponge behind and totally changed our process for how we prevent those,' Mr McDonald told AFP.

'There are certain patients at risk for leaving things behind and we actually get X-rays on them even when we think we've accounted for everything in the operating room. We've found several objects in those patients by having changed that process.'

Ms Susan Steinman, director of policy at the American Association for Justice, the leading organisation for trial lawyers in the United States, is also on board.

'I think that 'Sorry Works' in particular has a lot of merit and has been used successfully by several large university hospitals,' she told AFP.

'It's a good program because not only does it look at ways to reduce litigation costs, it looks at ways to reduce medical errors.'

Insurers were very skeptical at first as they thought disclosure would increase the number of lawsuits.

'That hasn't happened,' said Mr McDonald. 'Instead, we've been able to show we've prevented certain kinds of events from happening again.'

For Mr Wojcieszak, the Sorry Works revolution can't come soon enough.

'Patients and families can live with screw-ups, even fatal screw-ups, so long as someone's got the integrity to stand up and say: "I made a mistake, let's talk about how I can try and make this right by you." This is simply just taking care of people and reaping the benefits.' -- AFP

Thursday, September 24, 2009

Dust storm covers Sydney in orange haze

Sep 24, 2009

SYDNEY: The worst dust storm in decades swept across eastern Australia yesterday, blanketing Sydney and snarling transport as freak conditions also brought earthquakes and giant hailstones.

Gale-force winds dumped thousands of tonnes of red desert dust on Australia's biggest city, shrouding it in an eerie orange haze and coating the iconic Sydney Opera House in a fine layer of powder.

Sydney residents told local radio that they woke up to scenes from an apocalyptic Hollywood movie, while many contacted emergency services fearing a big bushfire in the city.

One woman said she woke up to find that red dust had covered her floors, and birds had been blown out of their nests.

'It did feel like Armageddon because when I was in the kitchen looking out the skylight, there was this red, red glow coming through,' the resident, identified only as Karen, told Australian radio.

The storm, reportedly the most serious since the 1940s, then spread 600km up the coast to Queensland and could even hit New Zealand, some 4,000km away, experts said.

Dust covered most of New South Wales, Australia's most populous state, pushing air pollution to record levels and depositing about 75,000 tonnes of powder in the Tasman Sea every hour.

Dust storm disrupts air traffic

'Dust storms like this occur quite regularly but they rarely travel this far east and come through Sydney,' said Mr John Leys, principal research scientist with New South Wales' Department of Climate Change and Water.

Sydney residents wore face masks and covered their mouths with scarves as they travelled to work under hazy skies. Traffic was bumper-to-bumper on major highways.

Air transport was severely disrupted, with passengers facing long delays at Sydney airport and many international flights diverted to Melbourne and Brisbane.

Though the dust over Sydney had largely cleared by mid-afternoon, flag-carrier Qantas urged passengers to cancel any non-urgent travel, while budget offshoot Jetstar offered free flight rescheduling and refunds.

Singapore Airlines, responding to queries from The Straits Times, said that only one flight, which left Singapore on Tuesday, had to be diverted.

SQ221, which was diverted to Melbourne, has since arrived in Sydney, it said. All other operations to and from Sydney were proceeding as normal, it added.

Sydney Ferries suspended harbour services and police warned drivers to take extra care in poor visibility. Ambulance workers reported a sudden spike in respiratory problems.

Australia, in the grip of a decade-long drought, is emerging from an abnormally hot southern hemisphere winter, including the hottest August on record.

Elsewhere in New South Wales, hailstones 'the size of cricket balls' smashed windows, as thunderstorms and gale-force winds lashed the state late on Tuesday.

'We've had reports of cars with both their front and rear windscreens smashed,' an official from the State Emergency Service said.

Further north, Queensland imposed a ban on lighting fires across large parts of the state a day after a dozen bush blazes sprang up following a hot, dry spell.

Tough water restrictions there are to be set aside temporarily to allow people to wash dust from their cars, homes and business premises, the Australian AAP news agency reported.

Victoria state was on alert for flash floods as heavy rains fell, following a pair of minor earthquakes on Tuesday. The 3.0- and 2.6-magnitude tremors did not cause any damage, officials said.


[Dust, wind, bush fires, hail, earthquake - yep. End of the world.]

Lest we become strangers in our own land

Sep 24, 2009

By Ngiam Tong Dow

A CASE could be made that Singapore's transition from a British colony to an independent state was shaped by the Cold War. After all, Singapore achieved independence by first merging with Malaya because of fears it might succumb to communism if it were left alone.

The end of the Cold War brought about seismic geopolitical changes. For Singapore and other Asian countries, the most critical was China's decision to stop exporting revolution. Instead, Deng Xiaoping, in the words of Chinese President Hu Jintao, adopted the strategy of the 'peaceful rise of China'.

China opened its centrally planned economy to international trade in 1978. China today is well on its way to becoming one of the world's three largest economies.

Singapore's economic relations with China are growing by the day. Yet barely 40 years ago, states like Singapore with sizeable ethnic Chinese populations were wary if not fearful of trading with China lest cheap Chinese products were used to seduce our people politically.

Singapore's national trading arm, Intraco, was instructed by the Finance Ministry to diversify the country's sources of rice imports. In the late 1960s, China was the most competitive supplier of rice. But the Singapore authorities were afraid that Chinese rice could be used to subsidise revolution.

Singapore and China were mutually suspicious of each other then, as the following story indicates:

In the late 1970s, China placed an order with a shipbuilder here for two oil drilling rigs. Six Chinese engineers led by a political commissar were dispatched to Singapore to supervise the building of the rigs. As standard operating procedure, the Chinese were placed under surveillance. Singapore's intelligence officers followed them everywhere they went. One day, the leader of the Chinese team, in exasperation, told our liaison officer that there was no need to tail his people. He was in fact more worried that his people would defect to Singapore.

In the 1960s, the world was divided into political blocs. National economies produced behind tariff walls. The term 'global economy' was not yet coined. But by necessity, the 'little red dot', Singapore, had to be open.

Though the world has changed since then, the fundamentals of Singapore's economic and trade policies remain the same. Singapore has to be useful to its trading partners, as it has been since the 15th century, in order to survive.

So long as we add to our knowledge and remain nimble, we can earn a living. Our fundamental challenge is political. How do we become one people despite our diversity?

As we are unlikely to ever restore our natural birth rates to replacement levels, we have no choice but to add to our population through immigration. But how do we assimilate the newcomers? With a small population, will we ever be in a position to assimilate anyone? Or will we instead be absorbed by them as they come from stronger cultures? At what pace should we bring in new immigrants?

I do not want to sound alarmist but a recurring nightmare of mine is that someday we will find ourselves strangers in our own land.

The East India Company, and later the British colonial office, essentially followed a policy of laissez-faire: they let people come and go. Our forefathers who migrated to Malaya and Singapore in the late 1800s and early 1900s fended for themselves. They built their own businesses and social organisations; they established schools.

Some of them went back to their ancestral homelands to die. Most stayed in their adopted country. We are their children and grandchildren.

Today, migration is economics driven. The best and the brightest move around the world searching for higher paying jobs. We risk having them use us as a stepping stone. Foreign fathers may advise their sons born in Singapore to leave when they reach the national service age of 18.

Singapore will be left with the second tier of average people. Educationally, they would hardly measure up to the Singapore average. When they are given citizenship and the right to vote, they will use their new-found electoral power to demand the same access to social services as other Singaporeans. The difference is that existing citizens would have paid for those social services over a lifetime of tax payments; the new citizens would not.

The population planners need to remember that international economic competitiveness is now knowledge-based. It is no longer a numbers game. Why the haste in adding to the population? Do we have the absorptive capacity to accommodate a million new people within a decade?

I believe we should make haste slowly. We should avoid repeating the 1960s mistake of 'stopping at two' - but this time in reverse.

The writer, a former senior civil servant, is currently an adjunct professor at Nanyang Technological University. The above is an excerpt from a 'fireside chat' he delivered to the Singapore chapter of the World Presidents' Organisation.

Wednesday, September 23, 2009

Is Hokkien my 'mother tongue'?

Sep 23, 2009

By Alfian Sa'at

A LONG time ago, a Chinese man saw some Malays eating a fruit. It had a spiky shell, but its insides were filled with large seeds covered by yellow, buttery flesh. He had never seen (nor smelt!) a fruit like it in his native village in Fujian. What was the fruit called, he asked the Malays.

'Durian,' they replied - from the Malay word duri, meaning 'thorn'. And so the Chinese man went back and told his friends about this new fruit. As the word spread, it became incorporated into Hokkien as loo lian.

Then one day, a new fruit made its appearance, native to South America. It was also green, with a spiky exterior. It was known as 'soursop' in English.

The Malays had a tendency to append the word belanda (meaning 'Dutch') to anything foreign that they had never seen before. Examples include kambing belanda (sheep), ayam belanda (turkey), kucing belanda (rabbit). So they called soursop durian belanda.

The Hokkiens, on the other hand, called it ang mo loo lian. Ang mo - roughly 'Western' - was also used for other edibles, like ang mo kio (tomato) and ang mo chai thou (carrot). The word ang mo loo lian carries traces of Hokkien's contact with both Malay and the West.

The study of loan words has always fascinated me, for they give clues to the kinds of social interactions that occurred in the past. I sketched a scenario above of how a single word from one language entered another. But the process is much more complex than that, probably involving long-term, sustained contact. The chain of transmission might even involve an intermediary, such as the Straits Chinese (or Peranakans), whose Baba patois contains both Malay and Hokkien words.

Here are some words that were borrowed from Hokkien into Malay: beca (trishaw), bihun (vermicelli), cat (paint), cincai (any old how), gua (I/me), guli (marbles), kentang (potato), kamceng (close), kuih (cake), kongsi (share), kuaci (melon seeds), teko (teapot), taugeh (bean sprout), tahu (beancurd) and tauke (boss). (Note that 'c' in Malay has the 'ch' sound.)

This linguistic exchange was a two-way process. Here are some Malay words that penetrated Hokkien: agak (guess or moderate), botak (bald), champur (mix), gadoh (fight), gaji (wages), jamban (toilet), kachiau (disturb), otang (owe/debt), pakat (conspire), pasar (market), pitchia (break), salah (wrong), senget (crooked), sukak (like), tiam (quiet) and torlong (help).

There are even some Cantonese words that are now part of Malay parlance, such as pokai (broke or penniless) and samseng (gangster). Interestingly, it has been postulated that the word sam seng (three stars) was derived from the fact that recruits in the Malayan People's Anti-Japanese Army wore caps emblazoned with three stars, each representing one of the main races in Malaya: Malays, Chinese and Indians.

In the Singapore Armed Forces, one of the things all NSmen were told by their sergeants was that 'over here, Hokkien is your mother tongue'. This was based on the stereotypes that Hokkien was a gendered, macho language, with the most pungent swear words.

But considering how Hokkien words have entered the Malay language, I have realised that there is a larger truth to that statement. It is like tracing a family tree and then discovering that I had a Hokkien great-great-great-great-grandmother. As a matter of fact, since almost two-thirds of the Malay lexicon consists of borrowings, I definitely had Arabic and Indian (linguistic) ancestors too.

Malays have a saying: bahasa jiwa bangsa, 'language is the soul of a race'. But there is a tension in the phrase. We tend to think of 'race' as something bounded and rigid. But 'language' does not have such impermeable borders. Words of various origins pass through open checkpoints, undergo shifts in meaning, and become naturalised over time.

Thus, as much as we may like to be essentialist about our race, we cannot escape from the hybridities already extant in our language. There is humility in the idea that no language is perfect on its own, and will borrow words to make up for its lack.

My Hokkien friends who travel overseas would often relate to me the sense of dislocation they feel when speaking to other Hokkien speakers. A friend who went to Taiwan, for example, was surprised to note that they did not understand what loti meant. Another friend shared a story about the nuances of pokai in Hong Kong.

At the end of the month, he moaned out loud at the office kam chi pokai le ('I'm broke this time') and all eyes turned on him. Pokai means 'broke' in Singapore. But in Hong Kong, pokai (literally, 'cast out on the streets') suggests something worse, like being destitute on the streets or being beaten up.

It is easy to interpret these instances as evidence that the Chinese in this part of the world have been 'contaminated' by other cultures. I happen to take the opposite view: The Nanyang Chinese have evolved an identity of their own, incorporating elements of other cultures. That this has been possible is a testament to their openness and curiosity.

Much ink (and tears) has been spilled on how the promotion of Mandarin here has resulted in what some have called the 'cultural lobotomy' of the Chinese community. In many ways, I sympathise with the late Kuo Pao Kun's observation that Chinese Singaporeans are 'cultural orphans', snatched as they were from their biological southern Chinese bosoms and placed in the laps of Mandarin-speaking foster mothers.

A familiar lament is that the declining use of the southern Chinese languages has resulted in the estrangement between generations of Chinese Singaporeans. I would argue that it has also led to some estrangement among the various languages. I do not know if I should worry about the fact that the traffic of loan words has almost ceased between Malay and Mandarin.

It is premature to theorise that this is a symptom of less interaction among the races. After all, there is English to mediate our communications with one another. But the fact remains that I do not know of a single Malay word that has Mandarin origins.

Somehow, our forefathers, of various races, knew how to pakat against common enemies, were able to kongsi their resources, and in the process of all that champur, became kamcheng with one another. The product of their alliances, friendships and inter-marriages is reflected in the languages we have inherited.

To lose this legacy is to sever a vital connection not only to the historical origins of the Nanyang Chinese, but also to Singapore's dynamic multicultural past.

The writer is a poet and playwright. He thanks Lai Chee Kien for his inputs. A longer version of this essay first appeared in The Online Citizen.

Monday, September 21, 2009

Rajaratnam, keeper of the faith

Sep 20, 2009

The Pledge is the enduring legacy of the late minister, to whom Singapore was worth fighting and dying for
By Irene Ng

As any writer knows, first drafts are subject to revision. Good writers hone, sculpt and polish their drafts to make sure that their final versions sparkle. Mr S. Rajaratnam, who drafted Singapore's national Pledge, was a very good writer. Before he joined politics in 1959, he had distinguished himself as an influential newspaper columnist and a short-story writer.

He understood the power of language and of ideas. And the idea he loved most was: A Singaporean Singapore where its citizens transcend their boundaries of race, religion and language, and unite to become one people.

The Pledge and its origins have been much discussed recently. The subject is of particular interest to me as I am writing Mr Rajaratnam's biography for the Institute of Southeast Asian Studies. In the course of my research over the last five years, I have gained a deeper appreciation of how the Pledge came about and why it stands as Mr Rajaratnam's enduring legacy to Singapore.

Before the then Foreign Minister put his mind to the wording of the Pledge in 1966, the initial idea, as envisaged by the then Education Minister Ong Pang Boon, was that the Pledge would be recited by students in classrooms and during flag-raising ceremonies.

The original idea arose, in fact, as a sort of administrative compromise, given the constraints of schools. It can be traced back to October 1965, shortly after Singapore's separation from Malaysia. The Education Ministry wanted to inculcate national consciousness and patriotism among students in schools by assembling them for a flag-raising ceremony to the strains of the National Anthem. However, the mass singing of the anthem, to be accompanied by a brass band, required a large field or assembly hall. Many schools lacked such facilities. Even those with the facilities could not carry out the flag-raising ceremony daily because of the tight curriculum.

In a letter dated Oct 20, 1965, Mr Willie Cheng, the Ministry of Education's principal assistant secretary (administration), proposed that the flag-raising ceremonies be carried out in the classroom instead, where students would salute the flag. In a response dated Oct 22, 1965, Mr Kwan Sai Kheong, acting permanent secretary and director of education, wrote that Mr Ong had suggested, as a compromise, a pledge of two to three lines to be recited in the classroom, in place of singing the national anthem.

On that basis, two earlier versions of the Pledge were produced.

The first, dated Dec 17, 1965, was by Mr Philip Liau, adviser on textbooks and syllabuses. He used the American Students' Pledge as a reference: 'I pledge (reaffirm) my allegiance (loyalty) to the Flag of Singapore, and to the country for which it stands; one sovereign nation of many freedom-loving people of one heart, one mind and one spirit, dedicated to a just and equal society.'

The second version, dated Dec 30, 1965, by Mr George Thomson, director of the Political Study Centre, read: 'I proudly and wholeheartedly pledge my loyalty to our flag of Singapore and to the honour and independence of our Republic whose banner it is. We come from different races, religions and cultures, but we are now united in mind and heart as one nation, and one people, dedicated to build by democratic means a more just and equal society.'

Both versions were submitted to Mr Ong on Jan 26, 1966. His senior staff preferred the first version, considering it shorter and less abstract.

It was no surprise that Mr Ong then turned to Mr Rajaratnam, a master stylist known for his strong convictions on building a common national identity. In a letter to Mr Rajaratnam dated Feb 2, 1966, Mr Ong asked him for comments on the two versions and for 'whatever amendments you wish to suggest'.

By the time Mr Rajaratnam reverted with his earliest draft dated Feb 18, 1966, the Pledge was completely transformed. While the first two versions were all about pledging loyalty to the flag and country, Mr Rajaratnam's was about pledging to shared ideals, to a vision of Singapore that was, to his mind, worth fighting and dying for.

And while the first two versions used the personal pronoun 'I', Mr Rajaratnam opted for the collective 'We, as citizens of Singapore'. Defying the evidence before his eyes, he imagined a nation pulsating as one people. Only he would have the boldness to envision a people giving voice to such ideals in unison, declaring their identity as 'citizens of Singapore', as opposed to 'the Malays', 'the Chinese', 'the Indians' and so on.

More profoundly, he changed the entire premise for the Pledge as originally conceived by the Education Ministry. It was not just a few lines to be recited to the flag. It was a promise made to oneself, to each other and to future generations.

Mr Rajaratnam's earliest draft which we have on record went: 'We, as citizens of Singapore, pledge to forget differences of race, language and religion and become one united people; to build a democratic society where justice and equality will prevail, and where we will seek happiness and progress by helping one another.'

When he wrote 'to forget differences of race, language and religion', it was not a call for collective amnesia. He was not enjoining the people to deny their differences. He was calling on them to disregard these differences in their quest for a Singaporean Singapore. As his speeches during that period show, he was not blind to the fact that these differences were deeply rooted. On the contrary, he was all too conscious of their potency after the trauma of the racial riots in 1964 and the Separation in 1965.

He was subjected to repeated reminders of the emotive power of appeals to these differences. During the years of merger, he was the strongest advocate for a Malaysian Malaysia, as opposed to a Malay Malaysia. Indeed, it was he who coined the slogan 'Malaysian Malaysia'.

What he did not foresee was that Singapore would be expelled. The Separation in 1965 was, as he put it, 'the crushing of my dreams'. 'I believed in one nation, regardless of race and religion. My dreams were shattered,' he said. In the wake of that agonising moment came another test.

Shortly after the Separation, a group of Chinese chauvinists, aware that their community now constituted 75 per cent of the population, wanted Chinese language and culture to be the dominant consideration in government policy. Mr Rajaratnam recounted: 'They charged the PAP government with betraying Chinese language and culture. They believed that, in a predominantly Chinese Singapore, where the Chinese had overwhelming strength, the Government could be panicked into opting for Chinese chauvinism.'

The minorities, particularly the Malays, were fearful of what Singapore would be like.

When Mr Rajaratnam wrote his first draft of the Pledge, asking people to 'forget their differences', it was in the context of such anxious times. Disregard these differences of race, language and religion; do not let them stand in the way of becoming one united people, or of the dream of creating a Singaporean Singapore, where justice and equality will prevail. Help one another to seek happiness and progress. Do this regardless of race, language or religion, we are implicitly urged.

This ideal permeates many of his speeches on the topic at the time. Indeed, he was the first minister to use the phrase 'unity in diversity' in the 1960s, enjoining people to come together while celebrating the diversity among them.

From available records, we can ascertain that the Pledge - as is used today - was finalised in July 1966. There is unfortunately a gap in the historical records on the revisions between Mr Rajaratnam's first draft, dated Feb 18, 1966, and the final one. But knowing how he worked and liked to work, in his days as a journalist as well as a politician, he would have refined it along the way, taking into account various views. As part of this process, he consulted the then Prime Minister Lee Kuan Yew.

According to Mr Lee's recollection, as related in an interview with me, he had pointed out to Mr Rajaratnam that some words sounded too idealistic. Mr Lee also tightened the draft.

Unlike the Singapore flag which was discussed in the Cabinet in 1959, there is no record of any discussion in Cabinet on the Pledge. A search of the Cabinet minutes of meetings and papers in 1965 and 1966 revealed nothing on it. From this and available records, one can conclude that the final version was firmed up between individual ministers with input from Ministry of Education officials.

There is no doubt in my mind, however, that the final words, spirit and sentiments of the Pledge owed much to Mr Rajaratnam. Mr Lee rightly described him as 'a great idealist'. He was also a great visionary. Without such qualities, the Pledge would not have been the imaginative leap that it was. He had great faith in the power of the human will to overcome the differences of race, language, and religion, and to transform the separate communities into a nation.

After Mr Rajaratnam left the Cabinet in 1988, he became concerned with policies which appeared to encourage Singaporeans to assert their communal identities. In 1990, he spoke up against what he perceived as a dangerous form of Chinese self-assertion in the Speak Mandarin campaign.

While making clear that he supported the campaign, just as he would support others encouraging people to speak 'Malay, Tamil, French, Japanese, or even better English', he took issue with the campaign's slogan 'if you are Chinese, make a statement - in Mandarin'.

This, for the nervous minorities, would carry the subliminal message that the Chinese are different, he warned. Had he been asked, he would have suggested this slogan instead: 'Make a Singaporean statement in Mandarin.'

Such episodes underscored the importance of Mr Rajaratnam's role as the keeper of the faith, a role which few others could play with equal conviction. He became an institution as a protector of the vision in the national Pledge - his enduring legacy to Singapore.

The writer is a Member of Parliament and writer-in-residence at the Institute of Southeast Asian Studies (Iseas). The first volume of Mr S. Rajaratnam's two-part biography will be published by Iseas by early next year.

Sunday, September 20, 2009

Caesar's is dead

Sep 19, 2009

TIJUANA (Mexico) - THE Tijuana restaurant that popularised the Caesar salad has closed, an apparent victim of a tourism-dependent economy devastated by crime, drug violence and the H1N1 flu.

The slump in visitors from the US also appeared to have claimed another food innovator on the border - an eatery linked to the invention of nachos in the town of Piedras Negras, across the Rio Grande from Texas.

Ironically, legend has it that both dishes were whipped up in a hurry to satisfy hungry US visitors. Now they aren't coming, scared off by drug cartel turf battles that often occur in Mexico's border regions and by last spring's flu outbreak.

Employees at Caesar's restaurant said on Friday that the business shut down on Monday after decades of serving the romaine lettuce-and-dressing combination. The restaurant had ties to the Tijuana spot where the dish is believed to have been invented in the 1920s, but had moved to a new location long ago.

'I showed up for work on Monday and I found all the furniture outside,' said Mr Miguel Angel Ventura Oros, who had worked as a waiter at the restaurant for the last three years. 'The manager told us there was an eviction for not paying the rent.'

Ms Gabriela Mondragon, director of operations for the Tijuana office of the National Restaurant Chamber, said Caesar's 'became famous for making a very classic, traditional version of this salad'.

Ms Martha Gonzalez, an employee of the Smart Price drug store next door to Caesar's, said the drop-off in tourism appeared to have done in the restaurant, which toward the end also functioned as a bar. 'They tried everything to revive it, but the tourism declined. It was mainly a bar, and occasionally they got a tourist who ordered the salad, but only for the sake of tradition,' Ms Gonzalez said.

Contacted by telephone, an employee of the Restaurant Moderno in Piedras Negras, across from Eagle Pass, Texas, confirmed the reputed birthplace of the nacho closed this summer.

A waiter at the Moderno, Mr Ignacio Anaya, was credited with inventing the corn chip, cheese and jalapeno dish. The name came from Anaya's nickname, 'Nacho'.

The employee gave no reason for the shutdown, but local media quoted another employee as saying the closure in June was caused by a drop-off in business due to crime and the H1N1 flu. -- AP

Time to reinvent economics

Sep 16, 2009

By Robert J. Shiller

THE widespread failure of economists to forecast the financial crisis that erupted last year has much to do with faulty models. The lack of sound models meant that policymakers received no warning of what was to come.

The current financial crisis was driven by speculative bubbles in the housing market, the stock market and commodities markets. Bubbles are caused by feedback loops: rising speculative prices encourage optimism, which encourages more buying, and hence further speculative price increases - until the crash comes.
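The feedback loop Shiller describes can be run as a toy simulation. This is an illustrative sketch of my own, with invented parameters (the feedback strength, the initial shock, the burst threshold), not a model from the article: price growth feeds optimism, optimism feeds buying, and the loop runs until prices overshoot a notional fundamental value and snap back.

```python
# Toy feedback-loop bubble: an illustrative sketch with made-up parameters.
# Buyers extrapolate the most recent return, amplified by a feedback factor,
# until the price overshoots a notional fundamental value and crashes.

def simulate_bubble(fundamental=100.0, feedback=0.6, burst_ratio=2.0, max_steps=200):
    prices = [fundamental, fundamental * 1.02]  # a small initial price shock
    for _ in range(max_steps):
        last_return = prices[-1] / prices[-2] - 1
        # Optimism: the recent return is extrapolated and amplified.
        next_price = prices[-1] * (1 + last_return * (1 + feedback))
        prices.append(next_price)
        if next_price > fundamental * burst_ratio:
            prices.append(fundamental)  # the crash: back to fundamental value
            break
    return prices

prices = simulate_bubble()
```

With these parameters the price inflates for a handful of steps, then collapses, which is the whole point of the parable: nothing in the "fundamentals" changed, only the self-reinforcing expectations.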

But you won't find the word 'bubble' in most economics treatises. A search of papers produced by central banks in recent years yields few instances of 'bubbles' even being mentioned. Indeed, the idea that bubbles exist has become so disreputable in much of the profession that bringing them up in an economics seminar is like bringing up astrology to a group of astronomers.

A generation of mainstream macroeconomic theorists has come to accept a theory that has an error at its very core: the axiom that people are fully rational. And as the statistician Leonard 'Jimmie' Savage showed in 1954, if people follow certain axioms of rationality, they must behave as if they knew all the probabilities and did all the appropriate calculations.

So economists assume that people do indeed use all publicly available information and know the probabilities of all conceivable future events. They update the probabilities as soon as new information becomes available. So any change in their behaviour must be attributable to their rational response to new information. And if economic actors are always rational, then no bubbles - irrational market responses - are allowed.

But abundant psychological evidence has shown that people do not satisfy Savage's axioms of rationality. This is the core element of the behavioural economics revolution that has begun to sweep economics over the last decade or so.

In fact, people almost never know the probabilities of future economic events. They live in a world where economic decisions are fundamentally ambiguous, because the future doesn't seem to be a mere repetition of a quantifiable past. For many people, it always seems that 'this time is different'.

Neuroscientists Scott Huettel and Michael Platt have shown, through functional magnetic resonance imaging, that 'decision-making under ambiguity does not represent a special, more complex case of risky decision-making; instead, these two forms of uncertainty are supported by distinct mechanisms'. In other words, different parts of the brain and emotional pathways are involved when ambiguity is present.

Economist Donald J. Brown and psychologist Laurie R. Santos are running experiments to try to understand how human tolerance for ambiguity in economic decision-making varies over time. They theorise that 'bull markets are characterised by ambiguity-seeking behaviour and bear markets by ambiguity-avoiding behaviour'. These behaviours are aspects of changing confidence, which we are only just beginning to understand.

To be sure, the purely rational theory remains useful for many things. It can be applied in areas where the consequences of violating Savage's axiom are not too severe. Economists have been right to apply the theory to microeconomic issues, such as why monopolists set higher prices.

But it has been overextended. For example, the Dynamic Stochastic General Equilibrium Model Of The Euro Area, developed by Frank Smets of the European Central Bank and Raf Wouters of the National Bank of Belgium, is very good at giving a precise list of external shocks that are presumed to drive the economy. But nowhere are bubbles modelled: the economy is assumed to do nothing more than respond in a completely rational way to external shocks.

Milton Friedman and Anna J. Schwartz, in their 1963 book A Monetary History Of The United States, showed that monetary policy anomalies were a significant factor in the Great Depression. Economists such as Barry Eichengreen, Jeffrey Sachs and Ben Bernanke have helped us to understand that these anomalies were the result of individual central banks' effort to stay on the gold standard, causing them to keep interest rates relatively high despite economic weakness.

To some, this revelation represented a culminating event for economic theory. The worst economic crisis of the 20th century was explained, and a way to correct it suggested, with a theory that does not rely on bubbles. Yet the Great Depression, as well as the recent crisis, will never be fully understood without understanding bubbles. The fact that monetary policy mistakes were an important cause of the Great Depression does not mean that we completely understand that crisis, or that other crises fit that mould.

The failure of economists' models to forecast the current crisis will mark the beginning of their overhaul. This will happen as economists listen to scientists with different expertise. Only then will monetary authorities gain a better understanding of when and how bubbles can derail an economy, and what can be done to prevent that outcome.

The writer is Professor of Economics at Yale University.


How Did Economists Get It So Wrong?

September 6, 2009


It’s hard to believe now, but not long ago economists were congratulating themselves over the success of their field. Those successes — or so they believed — were both theoretical and practical, leading to a golden era for the profession. On the theoretical side, they thought that they had resolved their internal disputes. Thus, in a 2008 paper titled “The State of Macro” (that is, macroeconomics, the study of big-picture issues like recessions), Olivier Blanchard of M.I.T., now the chief economist at the International Monetary Fund, declared that “the state of macro is good.” The battles of yesteryear, he said, were over, and there had been a “broad convergence of vision.” And in the real world, economists believed they had things under control: the “central problem of depression-prevention has been solved,” declared Robert Lucas of the University of Chicago in his 2003 presidential address to the American Economic Association. In 2004, Ben Bernanke, a former Princeton professor who is now the chairman of the Federal Reserve Board, celebrated the Great Moderation in economic performance over the previous two decades, which he attributed in part to improved economic policy making.

Last year, everything came apart.

Few economists saw our current crisis coming, but this predictive failure was the least of the field’s problems. More important was the profession’s blindness to the very possibility of catastrophic failures in a market economy. During the golden years, financial economists came to believe that markets were inherently stable — indeed, that stocks and other assets were always priced just right. There was nothing in the prevailing models suggesting the possibility of the kind of collapse that happened last year. Meanwhile, macroeconomists were divided in their views. But the main division was between those who insisted that free-market economies never go astray and those who believed that economies may stray now and then but that any major deviations from the path of prosperity could and would be corrected by the all-powerful Fed. Neither side was prepared to cope with an economy that went off the rails despite the Fed’s best efforts.

And in the wake of the crisis, the fault lines in the economics profession have yawned wider than ever. Lucas says the Obama administration’s stimulus plans are “schlock economics,” and his Chicago colleague John Cochrane says they’re based on discredited “fairy tales.” In response, Brad DeLong of the University of California, Berkeley, writes of the “intellectual collapse” of the Chicago School, and I myself have written that comments from Chicago economists are the product of a Dark Age of macroeconomics in which hard-won knowledge has been forgotten.

What happened to the economics profession? And where does it go from here?

As I see it, the economics profession went astray because economists, as a group, mistook beauty, clad in impressive-looking mathematics, for truth. Until the Great Depression, most economists clung to a vision of capitalism as a perfect or nearly perfect system. That vision wasn’t sustainable in the face of mass unemployment, but as memories of the Depression faded, economists fell back in love with the old, idealized vision of an economy in which rational individuals interact in perfect markets, this time gussied up with fancy equations. The renewed romance with the idealized market was, to be sure, partly a response to shifting political winds, partly a response to financial incentives. But while sabbaticals at the Hoover Institution and job opportunities on Wall Street are nothing to sneeze at, the central cause of the profession’s failure was the desire for an all-encompassing, intellectually elegant approach that also gave economists a chance to show off their mathematical prowess.

Unfortunately, this romanticized and sanitized vision of the economy led most economists to ignore all the things that can go wrong. They turned a blind eye to the limitations of human rationality that often lead to bubbles and busts; to the problems of institutions that run amok; to the imperfections of markets — especially financial markets — that can cause the economy’s operating system to undergo sudden, unpredictable crashes; and to the dangers created when regulators don’t believe in regulation.

It’s much harder to say where the economics profession goes from here. But what’s almost certain is that economists will have to learn to live with messiness. That is, they will have to acknowledge the importance of irrational and often unpredictable behavior, face up to the often idiosyncratic imperfections of markets and accept that an elegant economic “theory of everything” is a long way off. In practical terms, this will translate into more cautious policy advice — and a reduced willingness to dismantle economic safeguards in the faith that markets will solve all problems.


The birth of economics as a discipline is usually credited to Adam Smith, who published “The Wealth of Nations” in 1776. Over the next 160 years an extensive body of economic theory was developed, whose central message was: Trust the market. Yes, economists admitted that there were cases in which markets might fail, of which the most important was the case of “externalities” — costs that people impose on others without paying the price, like traffic congestion or pollution. But the basic presumption of “neoclassical” economics (named after the late-19th-century theorists who elaborated on the concepts of their “classical” predecessors) was that we should have faith in the market system.

This faith was, however, shattered by the Great Depression. Actually, even in the face of total collapse some economists insisted that whatever happens in a market economy must be right: “Depressions are not simply evils,” declared Joseph Schumpeter in 1934 — 1934! They are, he added, “forms of something which has to be done.” But many, and eventually most, economists turned to the insights of John Maynard Keynes for both an explanation of what had happened and a solution to future depressions.

Keynes did not, despite what you may have heard, want the government to run the economy. He described his analysis in his 1936 masterwork, “The General Theory of Employment, Interest and Money,” as “moderately conservative in its implications.” He wanted to fix capitalism, not replace it. But he did challenge the notion that free-market economies can function without a minder, expressing particular contempt for financial markets, which he viewed as being dominated by short-term speculation with little regard for fundamentals. And he called for active government intervention — printing more money and, if necessary, spending heavily on public works — to fight unemployment during slumps.

It’s important to understand that Keynes did much more than make bold assertions. “The General Theory” is a work of profound, deep analysis — analysis that persuaded the best young economists of the day. Yet the story of economics over the past half century is, to a large degree, the story of a retreat from Keynesianism and a return to neoclassicism. The neoclassical revival was initially led by Milton Friedman of the University of Chicago, who asserted as early as 1953 that neoclassical economics works well enough as a description of the way the economy actually functions to be “both extremely fruitful and deserving of much confidence.” But what about depressions?

Friedman’s counterattack against Keynes began with the doctrine known as monetarism. Monetarists didn’t disagree in principle with the idea that a market economy needs deliberate stabilization. “We are all Keynesians now,” Friedman once said, although he later claimed he was quoted out of context. Monetarists asserted, however, that a very limited, circumscribed form of government intervention — namely, instructing central banks to keep the nation’s money supply, the sum of cash in circulation and bank deposits, growing on a steady path — is all that’s required to prevent depressions. Famously, Friedman and his collaborator, Anna Schwartz, argued that if the Federal Reserve had done its job properly, the Great Depression would not have happened. Later, Friedman made a compelling case against any deliberate effort by government to push unemployment below its “natural” level (currently thought to be about 4.8 percent in the United States): excessively expansionary policies, he predicted, would lead to a combination of inflation and high unemployment — a prediction that was borne out by the stagflation of the 1970s, which greatly advanced the credibility of the anti-Keynesian movement.

Eventually, however, the anti-Keynesian counterrevolution went far beyond Friedman’s position, which came to seem relatively moderate compared with what his successors were saying. Among financial economists, Keynes’s disparaging vision of financial markets as a “casino” was replaced by “efficient market” theory, which asserted that financial markets always get asset prices right given the available information. Meanwhile, many macroeconomists completely rejected Keynes’s framework for understanding economic slumps. Some returned to the view of Schumpeter and other apologists for the Great Depression, viewing recessions as a good thing, part of the economy’s adjustment to change. And even those not willing to go that far argued that any attempt to fight an economic slump would do more harm than good.

Not all macroeconomists were willing to go down this road: many became self-described New Keynesians, who continued to believe in an active role for the government. Yet even they mostly accepted the notion that investors and consumers are rational and that markets generally get it right.

Of course, there were exceptions to these trends: a few economists challenged the assumption of rational behavior, questioned the belief that financial markets can be trusted and pointed to the long history of financial crises that had devastating economic consequences. But they were swimming against the tide, unable to make much headway against a pervasive and, in retrospect, foolish complacency.


In the 1930s, financial markets, for obvious reasons, didn’t get much respect. Keynes compared them to “those newspaper competitions in which the competitors have to pick out the six prettiest faces from a hundred photographs, the prize being awarded to the competitor whose choice most nearly corresponds to the average preferences of the competitors as a whole; so that each competitor has to pick, not those faces which he himself finds prettiest, but those that he thinks likeliest to catch the fancy of the other competitors.”

And Keynes considered it a very bad idea to let such markets, in which speculators spent their time chasing one another’s tails, dictate important business decisions: “When the capital development of a country becomes a by-product of the activities of a casino, the job is likely to be ill-done.”

By 1970 or so, however, the study of financial markets seemed to have been taken over by Voltaire’s Dr. Pangloss, who insisted that we live in the best of all possible worlds. Discussion of investor irrationality, of bubbles, of destructive speculation had virtually disappeared from academic discourse. The field was dominated by the “efficient-market hypothesis,” promulgated by Eugene Fama of the University of Chicago, which claims that financial markets price assets precisely at their intrinsic worth given all publicly available information. (The price of a company’s stock, for example, always accurately reflects the company’s value given the information available on the company’s earnings, its business prospects and so on.) And by the 1980s, finance economists, notably Michael Jensen of the Harvard Business School, were arguing that because financial markets always get prices right, the best thing corporate chieftains can do, not just for themselves but for the sake of the economy, is to maximize their stock prices. In other words, finance economists believed that we should put the capital development of the nation in the hands of what Keynes had called a “casino.”

It’s hard to argue that this transformation in the profession was driven by events. True, the memory of 1929 was gradually receding, but there continued to be bull markets, with widespread tales of speculative excess, followed by bear markets. In 1973-4, for example, stocks lost 48 percent of their value. And the 1987 stock crash, in which the Dow plunged nearly 23 percent in a day for no clear reason, should have raised at least a few doubts about market rationality.

These events, however, which Keynes would have considered evidence of the unreliability of markets, did little to blunt the force of a beautiful idea. The theoretical model that finance economists developed by assuming that every investor rationally balances risk against reward — the so-called Capital Asset Pricing Model, or CAPM (pronounced cap-em) — is wonderfully elegant. And if you accept its premises it’s also extremely useful. CAPM not only tells you how to choose your portfolio — even more important from the financial industry’s point of view, it tells you how to put a price on financial derivatives, claims on claims. The elegance and apparent usefulness of the new theory led to a string of Nobel prizes for its creators, and many of the theory’s adepts also received more mundane rewards: Armed with their new models and formidable math skills — the more arcane uses of CAPM require physicist-level computations — mild-mannered business-school professors could and did become Wall Street rocket scientists, earning Wall Street paychecks.

To be fair, finance theorists didn’t accept the efficient-market hypothesis merely because it was elegant, convenient and lucrative. They also produced a great deal of statistical evidence, which at first seemed strongly supportive. But this evidence was of an oddly limited form. Finance economists rarely asked the seemingly obvious (though not easily answered) question of whether asset prices made sense given real-world fundamentals like earnings. Instead, they asked only whether asset prices made sense given other asset prices. Larry Summers, now the top economic adviser in the Obama administration, once mocked finance professors with a parable about “ketchup economists” who “have shown that two-quart bottles of ketchup invariably sell for exactly twice as much as one-quart bottles of ketchup,” and conclude from this that the ketchup market is perfectly efficient.

But neither this mockery nor more polite critiques from economists like Robert Shiller of Yale had much effect. Finance theorists continued to believe that their models were essentially right, and so did many people making real-world decisions. Not least among these was Alan Greenspan, who was then the Fed chairman and a long-time supporter of financial deregulation whose rejection of calls to rein in subprime lending or address the ever-inflating housing bubble rested in large part on the belief that modern financial economics had everything under control. There was a telling moment in 2005, at a conference held to honor Greenspan’s tenure at the Fed. One brave attendee, Raghuram Rajan (of the University of Chicago, surprisingly), presented a paper warning that the financial system was taking on potentially dangerous levels of risk. He was mocked by almost all present — including, by the way, Larry Summers, who dismissed his warnings as “misguided.”

By October of last year, however, Greenspan was admitting that he was in a state of “shocked disbelief,” because “the whole intellectual edifice” had “collapsed.” Since this collapse of the intellectual edifice was also a collapse of real-world markets, the result was a severe recession — the worst, by many measures, since the Great Depression. What should policy makers do? Unfortunately, macroeconomics, which should have been providing clear guidance about how to address the slumping economy, was in its own state of disarray.


“We have involved ourselves in a colossal muddle, having blundered in the control of a delicate machine, the working of which we do not understand. The result is that our possibilities of wealth may run to waste for a time — perhaps for a long time.” So wrote John Maynard Keynes in an essay titled “The Great Slump of 1930,” in which he tried to explain the catastrophe then overtaking the world. And the world’s possibilities of wealth did indeed run to waste for a long time; it took World War II to bring the Great Depression to a definitive end.

Why was Keynes’s diagnosis of the Great Depression as a “colossal muddle” so compelling at first? And why did economics, circa 1975, divide into opposing camps over the value of Keynes’s views?

I like to explain the essence of Keynesian economics with a true story that also serves as a parable, a small-scale version of the messes that can afflict entire economies. Consider the travails of the Capitol Hill Baby-Sitting Co-op.

This co-op, whose problems were recounted in a 1977 article in The Journal of Money, Credit and Banking, was an association of about 150 young couples who agreed to help one another by baby-sitting for one another’s children when parents wanted a night out. To ensure that every couple did its fair share of baby-sitting, the co-op introduced a form of scrip: coupons made out of heavy pieces of paper, each entitling the bearer to one half-hour of sitting time. Initially, members received 20 coupons on joining and were required to return the same amount on departing the group.

Unfortunately, it turned out that the co-op’s members, on average, wanted to hold a reserve of more than 20 coupons, perhaps, in case they should want to go out several times in a row. As a result, relatively few people wanted to spend their scrip and go out, while many wanted to baby-sit so they could add to their hoard. But since baby-sitting opportunities arise only when someone goes out for the night, this meant that baby-sitting jobs were hard to find, which made members of the co-op even more reluctant to go out, making baby-sitting jobs even scarcer. . . .

In short, the co-op fell into a recession.
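The co-op's feedback loop is simple enough to simulate. Here is a minimal sketch in Python; the spending rule, the number of nights, and the target-reserve values are illustrative assumptions of mine, not parameters from the 1977 article:

```python
import random

random.seed(0)

N_COUPLES = 150        # roughly the co-op's size, per the article
INITIAL_COUPONS = 20   # scrip issued to each couple on joining

def nights_out(target_reserve, nights=30):
    """Total nights out over a period, given the coupon reserve each
    couple wants to hold.  A couple spends a coupon to go out only if
    its holdings exceed the target; each night out creates exactly one
    baby-sitting job, paying one coupon to another couple."""
    holdings = [INITIAL_COUPONS] * N_COUPLES
    total = 0
    for _ in range(nights):
        spenders = [i for i, h in enumerate(holdings) if h > target_reserve]
        for i in spenders:
            sitter = random.choice([j for j in range(N_COUPLES) if j != i])
            holdings[i] -= 1
            holdings[sitter] += 1
            total += 1
    return total

# Only 20 coupons per couple exist, so if everyone wants to hold more
# than 20, nobody ever spends: the co-op's recession.
print(nights_out(target_reserve=25))  # 0
# With a modest target reserve, the co-op hums along:
print(nights_out(target_reserve=10))  # positive, lots of nights out
```

The total stock of scrip never changes hands out of the system, so when every couple tries to hold more coupons than exist per couple, the only possible outcome is that nobody goes out, no matter how much everyone would like both to sit and to be sat for.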

O.K., what do you think of this story? Don’t dismiss it as silly and trivial: economists have used small-scale examples to shed light on big questions ever since Adam Smith saw the roots of economic progress in a pin factory, and they’re right to do so. The question is whether this particular example, in which a recession is a problem of inadequate demand — there isn’t enough demand for baby-sitting to provide jobs for everyone who wants one — gets at the essence of what happens in a recession.

Forty years ago most economists would have agreed with this interpretation. But since then macroeconomics has divided into two great factions: “saltwater” economists (mainly in coastal U.S. universities), who have a more or less Keynesian vision of what recessions are all about; and “freshwater” economists (mainly at inland schools), who consider that vision nonsense.

Freshwater economists are, essentially, neoclassical purists. They believe that all worthwhile economic analysis starts from the premise that people are rational and markets work, a premise violated by the story of the baby-sitting co-op. As they see it, a general lack of sufficient demand isn’t possible, because prices always move to match supply with demand. If people want more baby-sitting coupons, the value of those coupons will rise, so that they’re worth, say, 40 minutes of baby-sitting rather than half an hour — or, equivalently, the cost of an hour’s baby-sitting would fall from 2 coupons to 1.5. And that would solve the problem: the purchasing power of the coupons in circulation would have risen, so that people would feel no need to hoard more, and there would be no recession.

But don’t recessions look like periods in which there just isn’t enough demand to employ everyone willing to work? Appearances can be deceiving, say the freshwater theorists. Sound economics, in their view, says that overall failures of demand can’t happen — and that means that they don’t. Keynesian economics has been “proved false,” John Cochrane, of the University of Chicago, says.

Yet recessions do happen. Why? In the 1970s the leading freshwater macroeconomist, the Nobel laureate Robert Lucas, argued that recessions were caused by temporary confusion: workers and companies had trouble distinguishing overall changes in the level of prices because of inflation or deflation from changes in their own particular business situation. And Lucas warned that any attempt to fight the business cycle would be counterproductive: activist policies, he argued, would just add to the confusion.

By the 1980s, however, even this severely limited acceptance of the idea that recessions are bad things had been rejected by many freshwater economists. Instead, the new leaders of the movement, especially Edward Prescott, who was then at the University of Minnesota (you can see where the freshwater moniker comes from), argued that price fluctuations and changes in demand actually had nothing to do with the business cycle. Rather, the business cycle reflects fluctuations in the rate of technological progress, which are amplified by the rational response of workers, who voluntarily work more when the environment is favorable and less when it’s unfavorable. Unemployment is a deliberate decision by workers to take time off.

Put baldly like that, this theory sounds foolish — was the Great Depression really the Great Vacation? And to be honest, I think it really is silly. But the basic premise of Prescott’s “real business cycle” theory was embedded in ingeniously constructed mathematical models, which were mapped onto real data using sophisticated statistical techniques, and the theory came to dominate the teaching of macroeconomics in many university departments. In 2004, reflecting the theory’s influence, Prescott shared a Nobel with Finn Kydland of Carnegie Mellon University.

Meanwhile, saltwater economists balked. Where the freshwater economists were purists, saltwater economists were pragmatists. While economists like N. Gregory Mankiw at Harvard, Olivier Blanchard at M.I.T. and David Romer at the University of California, Berkeley, acknowledged that it was hard to reconcile a Keynesian demand-side view of recessions with neoclassical theory, they found the evidence that recessions are, in fact, demand-driven too compelling to reject. So they were willing to deviate from the assumption of perfect markets or perfect rationality, or both, adding enough imperfections to accommodate a more or less Keynesian view of recessions. And in the saltwater view, active policy to fight recessions remained desirable.

But the self-described New Keynesian economists weren’t immune to the charms of rational individuals and perfect markets. They tried to keep their deviations from neoclassical orthodoxy as limited as possible. This meant that there was no room in the prevailing models for such things as bubbles and banking-system collapse. The fact that such things continued to happen in the real world — there was a terrible financial and macroeconomic crisis in much of Asia in 1997-8 and a depression-level slump in Argentina in 2002 — wasn’t reflected in the mainstream of New Keynesian thinking.

Even so, you might have thought that the differing worldviews of freshwater and saltwater economists would have put them constantly at loggerheads over economic policy. Somewhat surprisingly, however, between around 1985 and 2007 the disputes between freshwater and saltwater economists were mainly about theory, not action. The reason, I believe, is that New Keynesians, unlike the original Keynesians, didn’t think fiscal policy — changes in government spending or taxes — was needed to fight recessions. They believed that monetary policy, administered by the technocrats at the Fed, could provide whatever remedies the economy needed. At a 90th birthday celebration for Milton Friedman, Ben Bernanke, formerly a more or less New Keynesian professor at Princeton, and by then a member of the Fed’s governing board, declared of the Great Depression: “You’re right. We did it. We’re very sorry. But thanks to you, it won’t happen again.” The clear message was that all you need to avoid depressions is a smarter Fed.

And as long as macroeconomic policy was left in the hands of the maestro Greenspan, without Keynesian-type stimulus programs, freshwater economists found little to complain about. (They didn’t believe that monetary policy did any good, but they didn’t believe it did any harm, either.)

It would take a crisis to reveal both how little common ground there was and how Panglossian even New Keynesian economics had become.


In recent, rueful economics discussions, an all-purpose punch line has become “nobody could have predicted. . . .” It’s what you say with regard to disasters that could have been predicted, should have been predicted and actually were predicted by a few economists who were scoffed at for their pains.

Take, for example, the precipitous rise and fall of housing prices. Some economists, notably Robert Shiller, did identify the bubble and warn of painful consequences if it were to burst. Yet key policy makers failed to see the obvious. In 2004, Alan Greenspan dismissed talk of a housing bubble: “a national severe price distortion,” he declared, was “most unlikely.” Home-price increases, Ben Bernanke said in 2005, “largely reflect strong economic fundamentals.”

How did they miss the bubble? To be fair, interest rates were unusually low, possibly explaining part of the price rise. It may be that Greenspan and Bernanke also wanted to celebrate the Fed’s success in pulling the economy out of the 2001 recession; conceding that much of that success rested on the creation of a monstrous bubble would have placed a damper on the festivities.

But there was something else going on: a general belief that bubbles just don’t happen. What’s striking, when you reread Greenspan’s assurances, is that they weren’t based on evidence — they were based on the a priori assertion that there simply can’t be a bubble in housing. And the finance theorists were even more adamant on this point. In a 2007 interview, Eugene Fama, the father of the efficient-market hypothesis, declared that “the word ‘bubble’ drives me nuts,” and went on to explain why we can trust the housing market: “Housing markets are less liquid, but people are very careful when they buy houses. It’s typically the biggest investment they’re going to make, so they look around very carefully and they compare prices. The bidding process is very detailed.”

Indeed, home buyers generally do carefully compare prices — that is, they compare the price of their potential purchase with the prices of other houses. But this says nothing about whether the overall price of houses is justified. It’s ketchup economics, again: because a two-quart bottle of ketchup costs twice as much as a one-quart bottle, finance theorists declare that the price of ketchup must be right.

In short, the belief in efficient financial markets blinded many if not most economists to the emergence of the biggest financial bubble in history. And efficient-market theory also played a significant role in inflating that bubble in the first place.

Now that the undiagnosed bubble has burst, the true riskiness of supposedly safe assets has been revealed and the financial system has demonstrated its fragility. U.S. households have seen $13 trillion in wealth evaporate. More than six million jobs have been lost, and the unemployment rate appears headed for its highest level since 1940. So what guidance does modern economics have to offer in our current predicament? And should we trust it?


Between 1985 and 2007 a false peace settled over the field of macroeconomics. There hadn’t been any real convergence of views between the saltwater and freshwater factions. But these were the years of the Great Moderation — an extended period during which inflation was subdued and recessions were relatively mild. Saltwater economists believed that the Federal Reserve had everything under control. Freshwater economists didn’t think the Fed’s actions were actually beneficial, but they were willing to let matters lie.

But the crisis ended the phony peace. Suddenly the narrow, technocratic policies both sides were willing to accept were no longer sufficient — and the need for a broader policy response brought the old conflicts out into the open, fiercer than ever.

Why weren’t those narrow, technocratic policies sufficient? The answer, in a word, is zero.

During a normal recession, the Fed responds by buying Treasury bills — short-term government debt — from banks. This drives interest rates on government debt down; investors seeking a higher rate of return move into other assets, driving other interest rates down as well; and normally these lower interest rates eventually lead to an economic bounceback. The Fed dealt with the recession that began in 1990 by driving short-term interest rates from 9 percent down to 3 percent. It dealt with the recession that began in 2001 by driving rates from 6.5 percent to 1 percent. And it tried to deal with the current recession by driving rates down from 5.25 percent to zero.

But zero, it turned out, isn’t low enough to end this recession. And the Fed can’t push rates below zero, since at near-zero rates investors simply hoard cash rather than lending it out. So by late 2008, with interest rates basically at what macroeconomists call the “zero lower bound” even as the recession continued to deepen, conventional monetary policy had lost all traction.
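One way to see why zero bites is a toy policy-rule calculation, loosely in the spirit of a Taylor rule. The coefficients and gap values below are illustrative assumptions, not the Fed's actual reaction function:

```python
def prescribed_rate(neutral=4.0, inflation_gap=0.0, output_gap=0.0):
    """A stylized Taylor-type rule: the interest rate a central bank
    'wants' to set, given how far inflation and output are from target.
    The 1.5 and 0.5 weights are conventional textbook choices."""
    return neutral + 1.5 * inflation_gap + 0.5 * output_gap

# Mild recession: the rule still prescribes a positive rate,
# so conventional rate cuts can do their job.
print(max(prescribed_rate(output_gap=-2.0), 0.0))   # 3.0

# Severe slump with falling inflation: the rule calls for a
# negative rate, but the actual rate is clamped at zero --
# this is the zero lower bound.
wanted = prescribed_rate(inflation_gap=-1.0, output_gap=-8.0)
print(wanted, max(wanted, 0.0))   # -1.5 0.0
```

When the rate the rule "wants" is negative, the gap between the prescribed rate and zero is stimulus that conventional monetary policy simply cannot deliver, which is where the Keynesian case for fiscal policy comes in.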

Now what? This is the second time America has been up against the zero lower bound, the previous occasion being the Great Depression. And it was precisely the observation that there’s a lower bound to interest rates that led Keynes to advocate higher government spending: when monetary policy is ineffective and the private sector can’t be persuaded to spend more, the public sector must take its place in supporting the economy. Fiscal stimulus is the Keynesian answer to the kind of depression-type economic situation we’re currently in.

Such Keynesian thinking underlies the Obama administration’s economic policies — and the freshwater economists are furious. For 25 or so years they tolerated the Fed’s efforts to manage the economy, but a full-blown Keynesian resurgence was something entirely different. Back in 1980, Lucas, of the University of Chicago, wrote that Keynesian economics was so ludicrous that “at research seminars, people don’t take Keynesian theorizing seriously anymore; the audience starts to whisper and giggle to one another.” Admitting that Keynes was largely right, after all, would be too humiliating a comedown.

And so Chicago’s Cochrane, outraged at the idea that government spending could mitigate the latest recession, declared: “It’s not part of what anybody has taught graduate students since the 1960s. They [Keynesian ideas] are fairy tales that have been proved false. It is very comforting in times of stress to go back to the fairy tales we heard as children, but it doesn’t make them less false.” (It’s a mark of how deep the division between saltwater and freshwater runs that Cochrane doesn’t believe that “anybody” teaches ideas that are, in fact, taught in places like Princeton, M.I.T. and Harvard.)

Meanwhile, saltwater economists, who had comforted themselves with the belief that the great divide in macroeconomics was narrowing, were shocked to realize that freshwater economists hadn’t been listening at all. Freshwater economists who inveighed against the stimulus didn’t sound like scholars who had weighed Keynesian arguments and found them wanting. Rather, they sounded like people who had no idea what Keynesian economics was about, who were resurrecting pre-1930 fallacies in the belief that they were saying something new and profound.

And it wasn’t just Keynes whose ideas seemed to have been forgotten. As Brad DeLong of the University of California, Berkeley, has pointed out in his laments about the Chicago school’s “intellectual collapse,” the school’s current stance amounts to a wholesale rejection of Milton Friedman’s ideas, as well. Friedman believed that Fed policy rather than changes in government spending should be used to stabilize the economy, but he never asserted that an increase in government spending cannot, under any circumstances, increase employment. In fact, rereading Friedman’s 1970 summary of his ideas, “A Theoretical Framework for Monetary Analysis,” what’s striking is how Keynesian it seems.

And Friedman certainly never bought into the idea that mass unemployment represents a voluntary reduction in work effort or the idea that recessions are actually good for the economy. Yet the current generation of freshwater economists has been making both arguments. Thus Chicago’s Casey Mulligan suggests that unemployment is so high because many workers are choosing not to take jobs: “Employees face financial incentives that encourage them not to work . . . decreased employment is explained more by reductions in the supply of labor (the willingness of people to work) and less by the demand for labor (the number of workers that employers need to hire).” Mulligan has suggested, in particular, that workers are choosing to remain unemployed because that improves their odds of receiving mortgage relief. And Cochrane declares that high unemployment is actually good: “We should have a recession. People who spend their lives pounding nails in Nevada need something else to do.”

Personally, I think this is crazy. Why should it take mass unemployment across the whole nation to get carpenters to move out of Nevada? Can anyone seriously claim that we’ve lost 6.7 million jobs because fewer Americans want to work? But it was inevitable that freshwater economists would find themselves trapped in this cul-de-sac: if you start from the assumption that people are perfectly rational and markets are perfectly efficient, you have to conclude that unemployment is voluntary and recessions are desirable.

Yet if the crisis has pushed freshwater economists into absurdity, it has also created a lot of soul-searching among saltwater economists. Their framework, unlike that of the Chicago school, both allows for the possibility of involuntary unemployment and considers it a bad thing. But the New Keynesian models that have come to dominate teaching and research assume that people are perfectly rational and financial markets are perfectly efficient. To get anything like the current slump into their models, New Keynesians are forced to introduce some kind of fudge factor that for reasons unspecified temporarily depresses private spending. (I’ve done exactly that in some of my own work.) And if the analysis of where we are now rests on this fudge factor, how much confidence can we have in the models’ predictions about where we are going?

The state of macro, in short, is not good. So where does the profession go from here?


Economics, as a field, got in trouble because economists were seduced by the vision of a perfect, frictionless market system. If the profession is to redeem itself, it will have to reconcile itself to a less alluring vision — that of a market economy that has many virtues but that is also shot through with flaws and frictions. The good news is that we don’t have to start from scratch. Even during the heyday of perfect-market economics, there was a lot of work done on the ways in which the real economy deviated from the theoretical ideal. What’s probably going to happen now — in fact, it’s already happening — is that flaws-and-frictions economics will move from the periphery of economic analysis to its center.

There’s already a fairly well developed example of the kind of economics I have in mind: the school of thought known as behavioral finance. Practitioners of this approach emphasize two things. First, many real-world investors bear little resemblance to the cool calculators of efficient-market theory: they’re all too subject to herd behavior, to bouts of irrational exuberance and unwarranted panic. Second, even those who try to base their decisions on cool calculation often find that they can’t, that problems of trust, credibility and limited collateral force them to run with the herd.

On the first point: even during the heyday of the efficient-market hypothesis, it seemed obvious that many real-world investors aren’t as rational as the prevailing models assumed. Larry Summers once began a paper on finance by declaring: “THERE ARE IDIOTS. Look around.” But what kind of idiots (the preferred term in the academic literature, actually, is “noise traders”) are we talking about? Behavioral finance, drawing on the broader movement known as behavioral economics, tries to answer that question by relating the apparent irrationality of investors to known biases in human cognition, like the tendency to care more about small losses than small gains or the tendency to extrapolate too readily from small samples (e.g., assuming that because home prices rose in the past few years, they’ll keep on rising).

Until the crisis, efficient-market advocates like Eugene Fama dismissed the evidence produced on behalf of behavioral finance as a collection of “curiosity items” of no real importance. That’s a much harder position to maintain now that the collapse of a vast bubble — a bubble correctly diagnosed by behavioral economists like Robert Shiller of Yale, who related it to past episodes of “irrational exuberance” — has brought the world economy to its knees.

On the second point: suppose that there are, indeed, idiots. How much do they matter? Not much, argued Milton Friedman in an influential 1953 paper: smart investors will make money by buying when the idiots sell and selling when they buy and will stabilize markets in the process. But the second strand of behavioral finance says that Friedman was wrong, that financial markets are sometimes highly unstable, and right now that view seems hard to reject.

Probably the most influential paper in this vein was a 1997 publication by Andrei Shleifer of Harvard and Robert Vishny of Chicago, which amounted to a formalization of the old line that “the market can stay irrational longer than you can stay solvent.” As they pointed out, arbitrageurs — the people who are supposed to buy low and sell high — need capital to do their jobs. And a severe plunge in asset prices, even if it makes no sense in terms of fundamentals, tends to deplete that capital. As a result, the smart money is forced out of the market, and prices may go into a downward spiral.
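The mechanism can be caricatured in a few lines. The sketch below is my own stylization, not the Shleifer-Vishny model itself: each round's price drop inflicts a leveraged loss on arbitrageurs, and their forced sales set the size of the next round's drop.

```python
def price_path(initial_drop, leverage, impact=0.4, rounds=6):
    """Iterate a fire-sale spiral.  A fractional price drop d wipes
    out leveraged arbitrage capital; the resulting forced selling
    produces the next drop:  d_next = impact * leverage * d.
    When impact * leverage < 1 the spiral damps out; when it
    exceeds 1, each round of forced selling is bigger than the last.
    All parameters here are illustrative."""
    price, d = 100.0, initial_drop
    path = [price]
    for _ in range(rounds):
        d = min(d, 0.99)        # a price can fall, but not below zero
        price *= 1 - d
        path.append(round(price, 1))
        d = impact * leverage * d
    return path

# Well-capitalized arbitrageurs absorb a 5% shock (0.4 * 1.5 = 0.6 < 1):
calm = price_path(0.05, leverage=1.5)
# Thin, leveraged capital turns the same shock into a rout
# (0.4 * 4.0 = 1.6 > 1):
spiral = price_path(0.05, leverage=4.0)
print(calm)
print(spiral)
```

The point of the caricature is the threshold: the very same initial shock either fades away or snowballs, depending entirely on how much capital the would-be stabilizers have left after each leg down.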

The spread of the current financial crisis seemed almost like an object lesson in the perils of financial instability. And the general ideas underlying models of financial instability have proved highly relevant to economic policy: a focus on the depleted capital of financial institutions helped guide policy actions taken after the fall of Lehman, and it looks (cross your fingers) as if these actions successfully headed off an even bigger financial collapse.

Meanwhile, what about macroeconomics? Recent events have pretty decisively refuted the idea that recessions are an optimal response to fluctuations in the rate of technological progress; a more or less Keynesian view is the only plausible game in town. Yet standard New Keynesian models left no room for a crisis like the one we’re having, because those models generally accepted the efficient-market view of the financial sector.

There were some exceptions. One line of work, pioneered by none other than Ben Bernanke working with Mark Gertler of New York University, emphasized the way the lack of sufficient collateral can hinder the ability of businesses to raise funds and pursue investment opportunities. A related line of work, largely established by my Princeton colleague Nobuhiro Kiyotaki and John Moore of the London School of Economics, argued that prices of assets such as real estate can suffer self-reinforcing plunges that in turn depress the economy as a whole. But until now the impact of dysfunctional finance hasn’t been at the core even of Keynesian economics. Clearly, that has to change.


So here’s what I think economists have to do. First, they have to face up to the inconvenient reality that financial markets fall far short of perfection, that they are subject to extraordinary delusions and the madness of crowds. Second, they have to admit — and this will be very hard for the people who giggled and whispered over Keynes — that Keynesian economics remains the best framework we have for making sense of recessions and depressions. Third, they’ll have to do their best to incorporate the realities of finance into macroeconomics.

Many economists will find these changes deeply disturbing. It will be a long time, if ever, before the new, more realistic approaches to finance and macroeconomics offer the same kind of clarity, completeness and sheer beauty that characterizes the full neoclassical approach. To some economists that will be a reason to cling to neoclassicism, despite its utter failure to make sense of the greatest economic crisis in three generations. This seems, however, like a good time to recall the words of H. L. Mencken: “There is always an easy solution to every human problem — neat, plausible and wrong.”

When it comes to the all-too-human problem of recessions and depressions, economists need to abandon the neat but wrong solution of assuming that everyone is rational and markets work perfectly. The vision that emerges as the profession rethinks its foundations may not be all that clear; it certainly won’t be neat; but we can hope that it will have the virtue of being at least partly right.

Paul Krugman is a Times Op-Ed columnist and winner of the 2008 Nobel Memorial Prize in Economic Science. His latest book is “The Return of Depression Economics and the Crisis of 2008.”

This article has been revised to reflect the following correction:

Correction: September 6, 2009
Because of an editing error, an article on Page 36 this weekend about the failure of economists to anticipate the latest recession misquotes the economist John Maynard Keynes, who compared the financial markets of the 1930s to newspaper beauty contests in which readers tried to correctly pick all six eventual winners. Keynes noted that a competitor did not have to pick “those faces which he himself finds prettiest, but those that he thinks likeliest to catch the fancy of the other competitors.” He did not say, “nor even those that he thinks likeliest to catch the fancy of other competitors.”

