Insurers are risk managers by trade, but they've struggled to establish the unified technology and process environments needed to achieve enterprise risk management in their own businesses. Now big data has the potential to help enhance their ERM capabilities and achieve better business results.
Increased competition, emerging risks such as more frequent natural catastrophes, and growing regulatory demands are making ERM more important. Insurers that invest in ERM initiatives are likely to see improvements in their ratings and capital requirements, according to Towers Watson's 2012 ERM survey of 200 North American insurance executives.
The survey, published in February, revealed that prominent business changes resulting from respondents' ERM programs have steadily increased since Towers Watson last conducted the survey, in 2010. North American insurers ranked changes in product pricing, risk strategy and reinsurance strategy as the areas of their business most affected by their evolving ERM programs (see chart, p. 15).
The introduction of big data is likely to improve these results, says Don Canning, a VP in the Financial Systems division of software and services company SunGard.
Big data brings with it more volume, velocity and variety in the data being analyzed and will drive more advanced assessments of risk, Canning says. That, in turn, will help board members make better informed and faster decisions, he says.
Without big data, actuaries spend as much as 80% of their time gathering and cleansing data rather than focusing on the higher-value tasks of analyzing model output and searching for patterns, Canning adds. "Big data not only provides the ability to process massive amounts of data today but also allows for the inclusion of other data sources, such as claims, new business and investment management factors," he says. These additional sources enable a broader picture of an insurer's risk appetite than modeling from traditional, mostly internal data sources, Canning says.
New technologies are increasing the computing power available to many insurers, says Van Beach, a principal with Milliman, a global provider of actuarial and related products and services. Large insurers with deep pockets have long used MPP (massively parallel processing) for complex actuarial calculations. Now high-volume, high-velocity computing is accessible to smaller insurers -- and cheaper for larger ones -- through "Big Compute," essentially the extension of MPP into the cloud, which obviates the need for on-site high-performance computing capabilities.
"In the world of Big Compute, 10 computers for 100 hours costs the same as 1,000 computers for one hour," Beach explains. "If you have a simulation of 100 scenarios where each scenario takes one hour, in the old data center approach it would take 10 hours to complete; with the cloud it takes one hour."
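Beach's point is that the total compute bill is fixed while the wall-clock time shrinks with the number of machines. A minimal sketch of that arithmetic in Python (the function names are illustrative, not from any cloud vendor's API):

```python
import math

def wall_clock_hours(scenarios: int, hours_per_scenario: float, machines: int) -> float:
    """Wall-clock time to run independent scenarios across a pool of machines.

    Scenarios are assumed independent, so they run in batches of `machines` at a time.
    """
    batches = math.ceil(scenarios / machines)
    return batches * hours_per_scenario

def machine_hours(scenarios: int, hours_per_scenario: float) -> float:
    """Total compute consumed -- and billed -- regardless of parallelism."""
    return scenarios * hours_per_scenario

# 100 one-hour scenarios: a fixed 10-node data center vs. an elastic 100-node cloud pool.
print(wall_clock_hours(100, 1.0, machines=10))   # 10.0 hours
print(wall_clock_hours(100, 1.0, machines=100))  # 1.0 hour
print(machine_hours(100, 1.0))                   # 100.0 machine-hours either way
```

Because the billed machine-hours are identical, renting 100 machines for one hour costs the same as occupying 10 machines for ten, which is the trade Beach describes.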
Big data is democratizing insurers' ability to address the ERM challenge and to build more sophisticated, accurate predictive models on the largest possible data sets, explains Ali Bouhouch, senior manager in Wipro's enterprise architecture practice. Instead of pursuing a perfect risk model, carriers are building simpler but adequate models that leverage big data tools running over massive volumes of historical and real-time data, Bouhouch says.
"Statistical models are subject to clustering anomalies, so the size of data samples must be large enough to include boundary conditions and outlier scenarios," Bouhouch says. "Big data enables the timely processing of these massive amounts of data from different sources and enables the delivery of proper guidance for identification, assessment and treatment of enterprise risk in an unprecedented manner."
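A back-of-the-envelope illustration of Bouhouch's sample-size point: a rare boundary event simply tends not to appear in a small sample at all. The sketch below (hypothetical numbers, not from Wipro's tooling) computes the probability that a 1-in-1,000 event shows up at least once in a sample of a given size:

```python
def prob_tail_event_observed(event_prob: float, sample_size: int) -> float:
    """Probability that at least one occurrence of a rare event appears in a sample,
    assuming independent observations: 1 - (1 - p)^n."""
    return 1 - (1 - event_prob) ** sample_size

# A 1-in-1,000 event in a small vs. a big-data-scale sample:
print(prob_tail_event_observed(0.001, 100))        # ~0.095 -- usually missed
print(prob_tail_event_observed(0.001, 1_000_000))  # ~1.0   -- virtually certain to appear
```

With only 100 observations there is roughly a 90% chance the outlier scenario never enters the model at all, which is why sample sizes must be large enough to capture boundary conditions.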
Big data's impact on ERM goes beyond increased processing efficiency and quantity, providing a paradigm-shifting qualitative jump, says Todd Cooper, VP of enterprise risk and compliance at Wolters Kluwer, a worldwide provider of risk management, compliance, finance and audit products and services. "You start to get more data points outside of the actuarial universe, including external sources and the 'data exhaust,' or tangential information thrown off core systems that aren't central to their function," he explains. "You begin to examine seemingly disparate information where there's a correlation of impact -- it's like suddenly acquiring stereoscopic vision."
Anthony O'Donnell has covered technology in the insurance industry since 2000, when he joined the editorial staff of Insurance & Technology.