A common theme throughout contemporary financial stress testing is “data, the risky river,” said David O’Connell, Senior Analyst, Aite Group, a financial services consulting group. He was the second of two speakers to address issues around data in stress testing in a webinar organized by GARP on January 28, 2014.
The recent financial crisis has permanently altered the relationship between the central bank and all other financial institutions, said O’Connell. The central bank is now looking at them as potential customers for a line of credit, and thus must carry out due diligence including asking for proof that the financial institutions can pay back what they are loaned, even in bad times. His “stress testing 101” summary was a fresh way of thinking about the increased regulatory burden.
O’Connell distinguished between levels of reporting. The Comprehensive Capital Analysis and Review (CCAR) is the most onerous and highly governed report, run by the Office of the Comptroller of the Currency (OCC) and the Federal Reserve Board. It covers “too big to fail” banks. Another category is that of Dodd-Frank Act Stress Testing (DFAST) banks, where “guidance is vague.” The CAMEL banks have assets “poorly understood by regulators” and are given a 5-parameter rating.
Nowadays, “stress testing has teeth,” said O’Connell. Regulators need, and want, to see high levels of involvement from senior management at all stages of stress testing.
O’Connell listed several causes of regulatory citations: insufficient capital; insufficiently harsh economic down-turn scenarios; faulty calculations; faulty reporting. Or, financial institutions might simply have “data problems.”
Even if data quality issues do not result in citation, such issues “can harm banks by making stress testing ‘brittle, unstable, and costly.’” Should errors cascade, as they often do, stress tests can become unusable.
O’Connell considered some instances where “good data goes bad.” The need for granular data is so great that many financial institutions are building extensions to core banking and loan origination systems. This is good, he said, “because you get loan-level data.” However, the drawback is the “lack of credit context,” he said, “due to the extreme obligor focus.”
O’Connell urged webinar attendees to press for fully automated stress test reporting at their respective firms. He presented a detailed cost analysis of data problems at a bank. Once automation is brought in, then “the data that needs to be monitored can easily be done so.” He noted the virtuous cycle that begins with better data quality: “when you have increased trust in the data, there’s an increase in productivity.”
Ultimately, a good dataset does more than fill in the regulatory blanks. “I want you to use risk data to answer [strategic enterprise-wide] questions.” Data assurance should not be a daunting task. “There’s a finite, manageable number of parameters to keep your eye on.”
Click here to read about the first presentation.
The webinar presentation slides can be found at: http://event.on24.com/r.htm?e=733934&s=1&k=58F70BFD2BC23EA12BFFC8023A7C4B08
The research paper it’s based on can be found at: http://www.aitegroup.com/report/data-tail-wags-stress-test
The Twitter feed for David O’Connell is at: https://twitter.com/davidpoconnell
The photo of the river and many other stunning natural sights can be seen at Shawn Benbow Photography