Quartet’s Value at Risk Software Demo – Understanding the Measures Used

Xavier Bellouard |
June 27, 2011

In the previous post, we described the VaR demo and its cube structure. This post describes the different measures used in the demo and how to use the reference data files.

Value at Risk Software Measures


pnlVector.VaRDetail is a Value at Risk software measure that exposes the contents of the vector to the UI. Use it with care because it returns 1000 rows to the UI – its primary purpose is to let us prove, during demos, that a large set of data has been loaded.



historical.VaR.Quality, another Value at Risk software measure, indicates the level of quality at each cell location. Because data is assembled from several trade and VaR vector data sources, some of the historical P&L vectors used in the historical Value at Risk calculation may be missing for parts of the portfolio. The measure ranges from 1.0 (100%) when all historical P&L vectors are present down to 0 when all P&L vectors are missing.
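The quality measure can be thought of as the fraction of positions in the cell whose historical P&L vector is present. A minimal sketch, assuming missing vectors are represented as None (the function name and representation are illustrative, not the actual post processor):

```python
def var_quality(pnl_vectors):
    """Illustrative sketch of historical.VaR.Quality: the fraction of
    P&L vectors present among those aggregated into one cell.
    pnl_vectors: list of vectors, with None standing in for a missing one.
    Returns a value between 0.0 (all missing) and 1.0 (all present)."""
    if not pnl_vectors:
        return 0.0
    present = sum(1 for v in pnl_vectors if v is not None)
    return present / len(pnl_vectors)
```

For example, a cell assembled from four positions with one missing vector would report a quality of 0.75.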


historical.VaR and historical.VaRDate

This measure displays the first 500 elements of the array. The post processor sorts the array and locates the element at the required confidence level, which is usually 99%. historical.VaRDate returns the date associated with the VaR.
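The sort-and-locate step can be sketched as follows. This is an illustrative stand-in for the actual post processor, assuming the P&L vector and its scenario dates are parallel lists and that the VaR element sits at the (1 - confidence) quantile of the sorted losses:

```python
def historical_var(pnl_vector, dates, confidence=0.99):
    """Illustrative sketch of historical.VaR / historical.VaRDate:
    sort the historical P&L vector and pick the element at the
    required confidence level, returning it with its scenario date."""
    # Indices ordered from worst (most negative) to best P&L.
    order = sorted(range(len(pnl_vector)), key=lambda i: pnl_vector[i])
    # For 500 elements at 99% confidence this selects ordinal 5.
    k = int((1.0 - confidence) * len(pnl_vector))
    worst = order[k]
    return pnl_vector[worst], dates[worst]
```

Returning the index alongside the value is what lets historical.VaRDate report which historical scenario produced the VaR figure.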


historical.VaRAddon1, VaRAddon2 and VaRAddon3

The addon functionality of our Value at Risk software allows corrections to be applied to the historical VaR results. The corrections are applied at Book level, so it is best to demo this functionality in conjunction with the Books on the pivot table. The addons are defined in a configuration file (addon.txt) under the referential directory. The three addon values, 1, 2 and 3, are independent of each other.

The file of addons can be edited and reloaded on the fly; however, the measures will not be re-evaluated until the next query.
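Conceptually, each addon is a per-book correction added on top of the base historical VaR. A minimal sketch, assuming a simple book-to-correction mapping (the dictionary here is an illustrative stand-in for the contents of addon.txt, not its actual format):

```python
# Hypothetical stand-in for the per-book corrections loaded from
# referential/addon.txt; the real file format is not shown here.
addons = {
    "BookA": 1500.0,
    "BookB": -300.0,
}

def var_with_addon(book, historical_var_value):
    """Illustrative sketch of a VaRAddon measure: the base historical
    VaR corrected by the book's addon (0 when no addon is defined)."""
    return historical_var_value + addons.get(book, 0.0)
```

Because the three addon values are independent, each of VaRAddon1, VaRAddon2 and VaRAddon3 would apply its own mapping in the same way.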


This measure is used in conjunction with the AsOfDate dimension and allows the movement in VaR from one day to the next to be analysed.



This measure is used in conjunction with the book dimension and shows the effect on the total VaR if the book were to be omitted from the VaR calculation.


This apparently complex calculation is achieved using a formula post processor which makes use of the disaggregation operator. First, a temporary VaR vector is constructed by disaggregating the contribution of the selected book from the parent value; then a VaR value at the required confidence level is extracted from that temporary vector and subtracted from the VaR value at the parent.
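The two steps can be sketched as follows. This is an illustrative sketch, not the actual formula post processor; it assumes the disaggregation amounts to an element-wise subtraction of the book's P&L vector from the parent's, and follows the sign convention described above (temporary-vector VaR subtracted from the parent VaR):

```python
def incremental_var(parent_vector, book_vector, confidence=0.99):
    """Illustrative sketch: effect on parent VaR of omitting one book.
    Step 1: disaggregate the book's contribution from the parent vector.
    Step 2: extract VaR from both vectors and take the difference."""
    temp = [p - b for p, b in zip(parent_vector, book_vector)]
    k = int((1.0 - confidence) * len(parent_vector))

    def var_of(vector):
        # VaR element at the required confidence level, worst losses first.
        return sorted(vector)[k]

    return var_of(parent_vector) - var_of(temp)
```

A negative result would mean the book is adding risk: removing it would make the VaR less severe.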


This highlights the possibility of further calculations on the VaR measure, such as multiplying the value by 3. This simple calculation is performed by a formula post processor.
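As an illustration of how trivial such a derived measure is (this is a sketch, not the actual formula post processor definition):

```python
def var_times_three(var_value):
    """Illustrative sketch of a derived measure that simply scales
    the underlying VaR value by a constant factor of 3."""
    return 3.0 * var_value
```

Any other arithmetic transformation of the VaR measure would be defined in the same way.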

stress.VaR, stress.VaRDate, stress.VaRShift

These measures operate in the same way as for the historical measures described above.


This measure identifies the 100% confidence level in the scenarios section of the vector (the last 250 elements). The post processor outputs both the value and the scenario number (an ordinal in the range 0-249) that can be used to identify which scenario causes the worst Value at Risk.
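Finding the worst scenario can be sketched as a scan over the tail of the vector. An illustrative sketch, assuming the scenarios occupy the last 250 elements and a lower (more negative) P&L is worse:

```python
def worst_scenario(vector, n_scenarios=250):
    """Illustrative sketch: scan the scenarios section (the last 250
    elements of the vector) and return the worst value together with
    its scenario ordinal (0 to n_scenarios - 1)."""
    scenarios = vector[-n_scenarios:]
    ordinal = min(range(len(scenarios)), key=lambda i: scenarios[i])
    return scenarios[ordinal], ordinal
```

The ordinal is what lets the UI name the stress scenario responsible for the worst outcome.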


About the author


Xavier Bellouard

Co-founder & Managing Director
Managing Director with 30+ years of experience and dual expertise in financial markets and technology. Results-driven, with attention to detail. Co-founder of Quotient/Summit, one of the most successful financial software products in Capital Markets, with a wide range of skills including software design and development, professional services, sales, marketing, business development and people management. Co-founder of ActiveViam, a data analytics platform specialized for Financial Services.
