Quartet FS research into the problems investment banks face as a result of Big Data has produced some thought-provoking results. Of the IT Managers and Systems Architects surveyed, 100% find analysing Big Data a significant problem — compelling evidence that Big Data is something financial institutions need to address sooner rather than later.
The reasons behind this problem are varied. 47% of respondents say the main problem is the volume of data, 37% the velocity, and 17% the variety. For many, these results will come as no surprise: high data volumes are a notable problem across all industry sectors, not just financial services. Velocity, however, is set to be the next big challenge for financial institutions. In industries like investment banking, data is generated and delivered at an overwhelming rate, yet it must still be analysed in near real time. Conversely, data variety is a greater problem in other industries, such as ecommerce, where data is unstructured and comes from many different sources.
It is unsurprising that there is currently little agreement within the financial services industry on how to tackle the problem of Big Data. There remains an attachment to more traditional technologies, with over half of the respondents seeing data warehousing appliances as the best way to work with Big Data at speed. However, a more progressive attitude is certainly emerging, with 35% of respondents already identifying in-memory analytics as their preferred solution. Crucially, the outlook for newer technologies is exceptionally bright: the same research shows that respondents believe in-memory analytics will become the predominant architecture within just two and a half years.
Read the full report online at: http://activeviam.com/en/white-papers/188-in-memory-analytics-solving-the-big-data-challenge-for-banks
Watch this space for our next blog on the topic of in-memory analytics, which will dissect more of the survey findings.