Big Data, Risk Management, and Full-Fidelity Analytics

The following was cross-published by the Wall Street Technology Association in the most recent issue of the WSTA Ticker e-zine.

The definition of Big Data is rapidly changing within the financial services industry—pivoting from a measure of volume, variety, and velocity to an evaluation of systems-enabled strategies. Whereas most of the discussion has revolved around the challenges of managing petabytes of unstructured data-in-motion, the most important questions actually relate to the potential of analyzing full data sets spanning multiple silos: the option to combine many data formats to deliver much faster time to insight and greater iterative flexibility.

With an enterprise data hub built on Apache Hadoop, financial services firms can not only fulfill increasingly stringent regulatory demands without the capital burden of specialized systems, but also take on more advanced workloads and realize new strategic benefits from the same data they need to keep on hand for compliance reporting. The IT department works across business units to build a multi-tenant active archive that supports many workloads and applications in real time with full fidelity, security, and governance based on role and profile.

Big Data and Basel III

One relevant example of the need for a data hub is Basel III compliance. Firms want risk management systems that are flexible enough to both incorporate current market and credit risk regulations and respond to future rules and calculation standards. Banks are compelled to build complex models that perform ongoing stress tests and scenario analyses across all their portfolios. They need to plan adequately and transparently for the possibility of an economic downturn over multiple time horizons throughout the life of each exposure. Accordingly, monitoring and reporting systems must be fully interactive to evaluate the new loss-mitigation strategies and hedges banks intend to put in place.

Unfortunately, many current systems are incapable of analyzing positions valued simultaneously against large sets of historical data across multiple market factors such as volatility, foreign exchange rates, and interest rates. Trading desks typically model scenarios in Microsoft Excel spreadsheets, which means they consider only data snapshots, and those are insufficient to meet the new requirements. Conversely, specialized architecture for risk and capital adequacy is complex and expensive, with separate systems for modeling, extract-transform-load (ETL), grid computation, and data warehousing. Dedicated systems also may not be able to handle the rapid iterations required to test new models, which may at first be error-prone and inconsistent before they are coded for firm-wide reporting.
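
To make the scenario-analysis idea concrete, here is a minimal, purely illustrative sketch of revaluing a single fixed-coupon bond position under joint shocks to two of the market factors mentioned above (interest rates and FX). All figures, function names, and scenarios are hypothetical stand-ins, not any firm's actual risk model; a real stress test would revalue every exposure in every portfolio against full market histories.

```python
# Hypothetical stress-test sketch: revalue one bond position under
# joint interest-rate and FX shocks. Figures are illustrative only.

def bond_value(face, coupon, years, rate):
    """Present value of a fixed-coupon bond at a flat discount rate."""
    coupons = sum(face * coupon / (1 + rate) ** t for t in range(1, years + 1))
    return coupons + face / (1 + rate) ** years

def stressed_values(face, coupon, years, base_rate, fx_rate, scenarios):
    """Value the position in home currency under each (rate, fx) shock."""
    results = {}
    for name, (rate_shock, fx_shock) in scenarios.items():
        value = bond_value(face, coupon, years, base_rate + rate_shock)
        results[name] = value * (fx_rate + fx_shock)
    return results

scenarios = {
    "baseline":       (0.00,  0.00),   # no shock
    "rates_up_200bp": (0.02,  0.00),   # parallel rate rise
    "fx_down_10pct":  (0.00, -0.11),   # home-currency FX rate drops
}
values = stressed_values(face=1_000_000, coupon=0.05, years=5,
                         base_rate=0.03, fx_rate=1.10, scenarios=scenarios)
for name, v in sorted(values.items()):
    print(f"{name}: {v:,.0f}")
```

Even this toy version shows why spreadsheets struggle: the scenario grid grows multiplicatively with factors and horizons, which is exactly the workload the article argues belongs on a shared, full-history platform.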

Minimize Risk with a Data Hub

An enterprise data hub enables risk managers to model tens of thousands of opportunities per second, and enables the trading desk to perform intra-day calculations by running scenarios against a real-time events database—or tick store—as well as against massive historical data, all accessed centrally in original form. Instead of separate systems for each link in the data processing chain of storage, ETL, analysis, and reporting, a data hub offers Impala—Hadoop’s massively parallel processing structured query language (SQL) engine—and Apache Spark—the next-generation processing engine that combines batch, streaming, and interactive analytics via in-memory processing—fully integrated with the storage and application layers of existing data infrastructure to provide fast, complete transformation, calculation, and reporting at a fraction of the cost. Apache HBase—Hadoop’s distributed, scalable NoSQL database for Big Data—provides real-time storage of massive tick data alongside richer descriptive data, enabling analysis of intra-day risk at much greater scale.
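
As a rough illustration of the kind of full-history calculation described above, the sketch below computes a one-day historical-simulation value-at-risk (VaR) from a price series. This is plain Python with made-up numbers, not Cloudera's implementation: in a deployment like the one described, the tick history would live in HBase and the computation would be distributed via Spark or queried through Impala, but the core logic is the same.

```python
# Hypothetical historical-simulation VaR sketch. In production the
# price history would come from a tick store (e.g. HBase) and the
# computation would be distributed (e.g. Spark); here plain Python
# stands in for the logic.

def historical_var(prices, position, confidence=0.99):
    """One-day VaR by historical simulation: compute daily P&L for a
    position over the price history, sort it worst-first, and read off
    the loss at the requested confidence level (as a positive number)."""
    returns = [(b - a) / a for a, b in zip(prices, prices[1:])]
    pnl = sorted(r * position for r in returns)   # worst losses first
    index = int((1 - confidence) * len(pnl))      # tail cutoff
    return -pnl[index]

# Toy price history and a 1,000,000-unit position.
prices = [100, 98, 101, 99, 102, 100, 103]
print(f"1-day 99% VaR: {historical_var(prices, 1_000_000):,.0f}")
```

The point of the surrounding argument is that this same calculation, run over years of ticks and thousands of positions instead of seven prices and one, is only practical when the full data set is centrally accessible in original form.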

Cloudera Enterprise offers the first complete hub for Big Data built on Apache Hadoop, including deployments at three of the top five banks, as well as at the world’s leading insurers, credit card and payment companies, and regulatory agencies. With Cloudera, financial services firms can affordably and scalably analyze custom scenarios on an ad hoc basis prior to trade execution by extending the capabilities of existing tools within the data center, rather than requiring expensive, new, dedicated systems.

Read more about Information-Driven Financial Services >>

Learn more about Full-Fidelity Analytics and Regulatory Compliance >>

Find out why Morgan Stanley recognized Cloudera >>

The post Big Data, Risk Management, and Full-Fidelity Analytics appeared first on Cloudera VISION.
