A Data Weapon to Avoid the Next Financial Crisis

Richard Berner, director of the Office of Financial Research, in 2010. Credit Michael Falco for The New York Times

Many economists see a data revolution that could transform their field, opening a window to seeing and measuring economic behavior in greater detail than ever before. The potential — and limitations — of what can be thought of as Big Data economics was the topic of my column over the weekend.

The idea is that better measurement will inform better management of the economy. It is true, as they say in business, that you can’t manage what you can’t measure. But just because you can measure something doesn’t necessarily mean you can manage it — especially in the messy realm of human affairs.

What about a more targeted approach? That is, trying to get a better reading on one crucial slice of the economy to guide policy and perhaps behavior. That, in broad strokes, is the animating philosophy behind the Office of Financial Research, a unit of the Treasury Department established by the Dodd-Frank Act of 2010.

The precise recipe of causes — and apportioning responsibility — for the financial crisis is open to debate. But it is clear that leaders in the financial industry and policy makers were largely blindsided. Regulators and bankers lacked the data and analysis to see all the hidden risks in the financial system.

One of the lessons of the crisis, said Richard Berner, director of the Office of Financial Research, is that there were “serious deficiencies” in financial measurement. More stringent reporting requirements are one way to close some of the gaps. “The crisis really did spawn a lot of data collection efforts,” said Mr. Berner, a former adviser to the Treasury and former chief economist at Morgan Stanley.

After delays, Mr. Berner was nominated by the Obama administration and confirmed by the Senate this year.

The data analysis by the Office of Financial Research, Mr. Berner said, must balance more detailed data collection against data security, protecting the trade secrets of investment banks and other financial institutions.

Yet additional data collection on money market funds, credit-default swaps, financial leverage and counterparty risk exposure, he said, could give financial institutions themselves, as well as regulators, early-warning signals of trouble.

Mr. Berner said his office was looking at a “variety of approaches” to get a faster handle on emerging risks. One that looks intriguing is a proposal in a research paper that combines data analysis, financial economics and computer science.

In the paper, the authors contend that new streams of financial data — aggregated, properly encrypted and then analyzed — could give strong clues to hidden risk bombs in the system, like the institutions that touched off the crisis in the fall of 2008, Lehman Brothers and the American International Group.

Such data, the article argues, could “have played a critical role in providing regulators and investors with advance notice of A.I.G.’s unusually concentrated position in credit-default swaps, as well as the exposure of money market funds to Lehman bonds.”
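The core idea is that institutions could report their exposures in a form that reveals market-wide concentrations without disclosing any single firm's book. As a purely hypothetical illustration (not the paper's actual protocol), additive secret sharing shows how several institutions could let regulators learn an aggregate exposure while each firm's individual position stays hidden — the figures and variable names below are invented for the example:

```python
import secrets

PRIME = 2**61 - 1  # large prime modulus; all arithmetic is done in this field


def share(value, n):
    """Split `value` into n random additive shares that sum to it mod PRIME."""
    shares = [secrets.randbelow(PRIME) for _ in range(n - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares


def reconstruct(shares):
    """Recombine shares by summing them mod PRIME."""
    return sum(shares) % PRIME


# Hypothetical credit-default-swap exposures (in $ millions) of three firms
exposures = [120, 450, 80]
n = len(exposures)

# Each firm splits its exposure into n shares; share j goes to aggregator j.
# Any single aggregator sees only uniformly random numbers.
all_shares = [share(v, n) for v in exposures]

# Each aggregator sums the one share it received from every firm
partial_sums = [sum(all_shares[i][j] for i in range(n)) % PRIME
                for j in range(n)]

# Combining the partial sums reveals only the system-wide total exposure
total = reconstruct(partial_sums)
print(total)  # aggregate exposure: 650
```

A regulator watching this total over time could flag an unusually concentrated buildup — the A.I.G. scenario the authors describe — without ever seeing which firm holds which position, so long as the aggregators do not collude.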

“This is an effort to use technology to simplify some of the challenges created by our technology-driven financial markets,” said Andrew W. Lo, one of the article’s authors and an economist and finance expert at the Massachusetts Institute of Technology’s Sloan School of Management.

The other co-authors of the paper, “Privacy-Preserving Methods for Sharing Financial Risk Exposures,” are Emmanuel A. Abbe, a computer scientist now at Princeton University, and Amir E. Khandani, a financial engineer at Morgan Stanley.