During the throes of the last financial crisis, banks and regulators alike “struggled” to get good quality information. “The infrastructure was not there,” said James Dennison, CFA, Managing Director, Operational Risk Division, Office of the Superintendent of Financial Institutions (OSFI).

To enhance banks’ risk management infrastructure, the Basel Committee on Banking Supervision (BCBS) released a set of Principles for Effective Risk Data Aggregation and Risk Reporting in January 2013. Dennison was first to speak on the evening of September 19, 2013 at the Toronto chapter meeting of the Global Association of Risk Professionals (GARP). The meeting was convened at First Canadian Place so that members could hear two Canadian perspectives—the regulator’s and a bank’s—on the new guidelines.

The guidelines “are not about technology,” Dennison emphasized. Instead, they focus on what the information should look like. The risk data document took four to five months to draft and was put out for comment in 2012. It applies to both global systemically important banks (G-SIBs) and domestic systemically important banks (D-SIBs), with the difference lying in the date of implementation (January 1, 2016 for G-SIBs and December 31, 2016 for Canadian D-SIBs, respectively).

The mechanisms that bring together, or aggregate, underlying data are highly significant. The guidelines are intended to improve the flow of data to risk management groups. Weaknesses in information about the control environment can lead to significant losses. Adopting the guidelines will lead to improvement, but Dennison expressed concern that the timeline allowed “not much time, considering the magnitude of changes and all the other initiatives” that banks have underway.

Dennison described the overarching governance and infrastructure. The regulator expects the board of directors and senior management to know the capability of their own bank’s aggregation and risk reporting, no matter what state it is currently in. Senior management must ensure enough resources are dedicated to improvement. For example, if the risk reporting is outsourced, how does this impact the outcome? There are significant differences among the Big Six Canadian banks in geography and market focus.

“Risk reporting may occur at a certain frequency, but you must build systems that can handle a crisis—those critical times when you need a fast turnaround,” said Dennison. This extends to the business continuity plan, because “who knows what form the next crisis will take?”

It is important to establish data taxonomy and architecture across the banking group. Who owns the data, and what does ownership mean?

Risk data governance is an emerging issue globally. Dennison said the regulator is aware that not everything can be automated, but banks should strive for less human intervention. There should be a minimum of “manual touches,” and risk estimates “should not be too spreadsheet dependent.”

Banks will need to devise a metric to define completeness of data. The infrastructure needs to be adaptable, so that banks can respond to a broad range of on-demand queries to “slice and dice.”

Timeliness depends on the type of risk exposure (such as month-to-month timelines for credit risk versus day-to-day for market risk). “The refresh rate should be fast enough so that if you need the information with greater frequency, you can get it,” he said, without adversely affecting the accuracy.

Risk reporting will have to be reconciled and validated. It must be accurate, comprehensive, clear and useful. The stakeholders should have some idea “how much things will change if input changes or is erroneous.”

All this, in a pint-size package. Dennison reminded the audience that boards could be overwhelmed by reports that are too big and unfocused.

Canadian banks completed a self-assessment on June 30, 2013 using the BCBS template and will continue to report annually until the end of 2016 on the progress of their updates.