From both a business and a technology standpoint, one of the biggest trends facing financial services executives is big data, and the business value derived from this data is helping to drive the pace of adoption.
Few industries are as data-centric as banking and financial services. Every interaction that a client or partner system has with a banking institution produces actionable data with potential business value. Retail banking, wealth management, consumer banking, and capital markets have historically accumulated multiple sources and silos of data across the front, middle, and back offices.
Today, however, these firms are beginning to ask how to onboard that data and draw actionable insights from it, all within a reasonable SLA. Many IT organizations are feeling pressure to deliver on this vision as big data moves from industry hype to the datacenter.
Challenges That Financial Institutions Face
Financial services firms have operated under significantly increased regulatory requirements, such as Basel III, since the 2008 financial crisis. As capital and liquidity reserve requirements have increased, knowing exactly how much capital must be reserved, based on current exposures, has become critical. Unnecessarily tying up excess capital can keep a firm from taking advantage of business and market opportunities. Today's risk management systems must respond to new reporting requirements while handling ever-growing volumes of data for more comprehensive analysis of credit, counterparty, and geopolitical risk. Existing systems that were not designed for these demands cannot finish reporting in time for the start of business or trading, which can lead to uninformed decisions. The problem is compounded by the growing need for intra-day reporting and by the shrinking window for overnight batch processing imposed by global trading and electronic exchanges. Many of these systems are also inflexible and expensive to operate.
Aging in-house solutions can present other problems as well:
Data is often stored across many silos throughout the firm, using multiple technologies that each require a different access method. Instead of focusing on analysis and reporting, teams waste valuable time figuring out how to reliably obtain the necessary data.
Many proprietary solutions have been built on high-performance computers or grid computing clusters that are inflexible and can consume a large portion of the technology budget without meeting evolving requirements. Because these systems often lack standard interfaces, off-the-shelf tools either can't be used or require custom development.
Existing systems typically lack the security and controls necessary to keep up with compliance and data security requirements.
Want to learn how these challenges are being addressed? Join Vamsi Chemitiganti (Red Hat) and Ajay Singh (Hortonworks) for our upcoming webinar on April 22 at 9am PT / 12pm ET to learn more about:
Big data use cases and best practices
Requirements for a successful big data deployment
The joint Red Hat and Hortonworks solution for risk management