FRTB: Assessing the Challenges

Xavier Bellouard |
June 2, 2016

A few weeks ago, Quartet FS co-hosted an FRTB round-table event with PwC in London, as we reported in a previous post. A large number of decision-makers from international banks took part in a discussion that started with a simple opening gambit: “How ready are you to implement FRTB?”

Most of the guests admitted that they were still in the planning and gap-analysis phase while waiting for the Basel Committee to clarify several issues, such as P&L attribution, backtesting, and the definitions and boundaries between trading and banking books, but they already had a good understanding of the challenges that lie ahead.

They identified five main areas where the path to a successful integration of the new regulations appeared particularly arduous:

  • Data availability, quality, lineage and consistency will be onerous to maintain, and reconciliation could become a considerable burden
  • Model validation desk by desk will be a lengthy and costly exercise, especially if you get it wrong
  • Banks have already spent considerable time, money and energy constituting Volcker desks; would these have to be re-evaluated in light of the potential capital charges?
  • A large part of the capital charge could be attributable to non-risk-sensitive measures such as Residual Risk Add-Ons and Non-Modellable Risk Factors, which are not captured by either the Standardised Approach or the Internal Models Approach
  • The complexity of the P&L attribution tests, even though banks can now use hypothetical P&L data from the front office; hedged books present a particular challenge when trying to figure out why they failed the tests (see the sketch after this list)
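To make the attribution tests more concrete, here is a minimal sketch of the two ratios defined in the January 2016 FRTB text: the mean of unexplained P&L (risk-theoretical minus hypothetical) relative to the volatility of hypothetical P&L, and the variance of unexplained P&L relative to the variance of hypothetical P&L. The thresholds and breach logic below follow that 2016 version and may differ in later revisions; the function name and the sample data are purely illustrative.

import numpy as np

def pnl_attribution_ratios(hypothetical_pnl, risk_theoretical_pnl):
    # Unexplained P&L is the part of the risk model's P&L that the
    # front-office (hypothetical) P&L does not account for.
    hyp = np.asarray(hypothetical_pnl, dtype=float)
    rtpl = np.asarray(risk_theoretical_pnl, dtype=float)
    unexplained = rtpl - hyp

    # Test 1: mean unexplained P&L over the volatility of hypothetical P&L
    mean_ratio = unexplained.mean() / hyp.std(ddof=1)

    # Test 2: variance of unexplained P&L over variance of hypothetical P&L
    variance_ratio = unexplained.var(ddof=1) / hyp.var(ddof=1)

    # Thresholds in the January 2016 text: |mean_ratio| <= 10%, variance_ratio <= 20%
    breach = abs(mean_ratio) > 0.10 or variance_ratio > 0.20
    return mean_ratio, variance_ratio, breach

# Illustrative month of made-up daily P&L figures
hyp = np.random.default_rng(1).normal(0, 50_000, 21)
rtpl = hyp + np.random.default_rng(2).normal(0, 10_000, 21)
print(pnl_attribution_ratios(hyp, rtpl))

Both denominators are driven by the hypothetical P&L, which is precisely why well-hedged books are awkward: their hypothetical P&L is small by construction, so even modest unexplained amounts can push the ratios over the 10% and 20% thresholds.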

The need for technological and managerial evolution

Underlying those challenges is an overriding concern, recognised by everyone: the exponential increase required in data storage and computational power. Massive extra storage would be required to collect and maintain additional observed market data with lineage control, trade data across all asset classes, and P&L, market, credit and liquidity risk vectors and scenarios, retained for at least 10 years for stress testing as well as backtesting.

In addition, extra computational power would be necessary to run the extraordinary number of additional scenarios required – especially for the calculation of the Default Risk Charge.
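To illustrate why the Default Risk Charge drives so much of the extra compute, the sketch below runs a simple one-factor Gaussian default simulation over a hypothetical five-issuer portfolio and reads off the 99.9% one-year loss quantile on which the IMA Default Risk Charge is based. Everything in it (the portfolio, the factor loadings, the single-factor structure) is an assumption for illustration; a production DRC model would use multiple systematic factors and cover far more issuers and scenarios.

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Hypothetical portfolio: exposure, loss-given-default and 1-year default probability per issuer
exposure = np.array([5e6, 3e6, 8e6, 2e6, 4e6])
lgd      = np.array([0.60, 0.40, 0.60, 0.75, 0.50])
pd_1y    = np.array([0.010, 0.020, 0.005, 0.030, 0.015])
beta     = np.array([0.40, 0.50, 0.30, 0.60, 0.45])   # loading on the systematic factor

n_sims = 1_000_000                  # the 99.9% tail needs a very large number of scenarios
threshold = norm.ppf(pd_1y)         # default threshold for each issuer's latent variable

# One-factor Gaussian latent-variable model: an issuer defaults when its latent
# variable falls below the threshold implied by its default probability
z = rng.standard_normal(n_sims)[:, None]             # systematic factor
eps = rng.standard_normal((n_sims, len(exposure)))   # idiosyncratic shocks
latent = beta * z + np.sqrt(1.0 - beta**2) * eps
defaults = latent < threshold

losses = (defaults * exposure * lgd).sum(axis=1)
drc = np.quantile(losses, 0.999)    # 99.9% one-year loss quantile
print(f"Simulated 99.9% default loss: {drc:,.0f}")

Even this toy example needs on the order of a million scenarios to estimate a 99.9% quantile with any stability; scaling that to thousands of issuers and several correlated factors is exactly where the demand for extra computational power comes from.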

Ultimately, everyone agreed that, beyond the necessary IT developments, the transition will also require cultural change and strong senior management leadership.

In the next part of our series, we will examine a few ideas to help tackle those challenges and move forward. In the meantime, have a read of our White Paper on FRTB. We think you will find it both informative and interesting.

About the author

Xavier Bellouard

Co-founder & Managing Director
ActiveViam
Managing Director with 30+ years' experience and dual expertise in financial markets and technology. Results-driven, with attention to detail, he co-founded Quotient/Summit, one of the most successful financial software products in capital markets, and brings a wide range of skills including software design and development, professional services, sales, marketing, business development and people management. He is also a co-founder of ActiveViam, a data analytics platform specialised for financial services.
