
Management Strategies

Battening Down the Hatches

When it comes to predicting low-frequency, high-impact catastrophes, underwriters of risk must rely on simulated scenarios spanning thousands of years rather than on their proprietary loss data alone. Technology is critical to pulling off this high-stakes juggling act.

In a high-stakes game of hot potato, the most agile carriers and reinsurers are employing the latest catastrophe modeling technologies to get a leg up in risk transfer. Unfortunately for underwriters relying solely on existing loss data, this isn't a game at a kid's birthday party. When the music stops, billions of dollars, along with their companies' solvency, are at risk.

"Simply relying on loss data for low-frequency, high-severity events is not sufficient," warns Uday Virkud, senior vice president, AIR Worldwide Corp. (Boston). There isn't enough historical information, he asserts. "You need scientific technological engineering that can simulate thousands of years' worth of scenarios. These scenarios help provide information for pricing and portfolio risk accumulations for things like buying reinsurance or creating rates."

Marked by the two worst natural disasters and the worst terrorist attack in U.S. history, the past 12 years are a testament to the need for such models. A study released by reinsurance intermediary Guy Carpenter & Company (New York), "Advanced Techniques for Modeling Terrorism Risk," cites figures on the damage inflicted by these catastrophes. The study reports that 1992's Hurricane Andrew was responsible for $20 billion in damages and the loss of 40 to 60 lives, while 1994's Northridge Earthquake cost an estimated $12.5 billion and 25 lives. Although the losses caused by those events pale in comparison to the attacks of Sept. 11, 2001, which inflicted more than $40 billion in damages and a death toll in excess of 3,000, they've all had eye-opening effects on American underwriters of risk.

"Prior to Hurricane Andrew, catastrophe modeling software was in its early stages of adoption," explains AIR's Virkud. "Many companies went belly up because they didn't know about their risk exposure, so they couldn't protect it with reinsurance or by underwriting more strategically." And, although lessons learned helped underwriters beef up their future mitigation strategies, nothing could have prepared them for the magnitude and breadth of loss that resulted from the events of Sept. 11.

Until that day, risk managers "had always aimed to analyze property risk accumulation," Virkud says. "But since 9/11, workers' compensation and life insurance risk exposure have also come to the fore."

Catastrophe modeling providers such as Risk Management Solutions (RMS, Newark, Calif.) have developed terrorism risk models in response to the renewed focus on such exposure, relates Robert Muir-Wood, chief research officer, RMS. But federal involvement in terrorism risk assessment has stalled the adoption of such technology solutions.

Still, though President Bush's signing of the Terrorism Risk Insurance Act of 2002 lifted some of the burden of terrorism risk from carriers, the events of Sept. 11 have catalyzed a more micro-level approach to managing the geographic density of policies, explains Kimberly Morton, director of location intelligence risk management, MapInfo (Troy, N.Y.). MapInfo embeds mapping information into its products so that underwriters can determine proximity to different landmarks and run "what-if" scenarios.

Mapping tools, such as those offered by MapInfo, are created through the use of geographic information systems (GIS), reports Jamie Bisker, research director, Insurance Practice, TowerGroup (Needham, Mass.). "Carriers can plot locations and judge them by their proximity to fault lines and areas prone to earthquakes, brushfires and tornadoes," he says. "Understanding a location in relation to other risks is a real advantage."
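
Here is a minimal sketch of the proximity screening Bisker describes, assuming a hypothetical book of policies and a single hazard coordinate; production GIS tools such as MapInfo's work against full fault-line and hazard-zone geometries rather than one point.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two latitude/longitude points, in km."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * asin(sqrt(a))

# Hypothetical book of business: (policy_id, latitude, longitude)
policies = [
    ("POL-001", 34.05, -118.25),  # downtown Los Angeles
    ("POL-002", 37.77, -122.42),  # San Francisco
    ("POL-003", 40.71, -74.01),   # New York
]

# Assumed hazard coordinate (e.g., a point on a fault trace) and screen radius
FAULT_POINT = (37.90, -122.30)
RADIUS_KM = 50.0

for policy_id, lat, lon in policies:
    dist = haversine_km(lat, lon, *FAULT_POINT)
    if dist <= RADIUS_KM:
        print(f"{policy_id}: {dist:.0f} km from fault -- flag for accumulation review")
```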

It's also part of the reason RMS scrapped postal codes in favor of more accurate location identification through GIS technologies. "[RMS] has rebuilt its models to work at higher resolutions, especially around bigger concentrations of exposure or areas with a strong hazard gradient," explains RMS' Muir-Wood.
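
The toy calculation below shows why that matters: where the hazard gradient is steep, two risks in the same postal code can face very different perils. The decay function, distances, and depths here are invented purely for illustration.

```python
def surge_depth_m(dist_from_coast_km: float) -> float:
    """Hypothetical storm-surge depth that decays quickly moving inland."""
    return max(0.0, 3.0 - 1.5 * dist_from_coast_km)

# Two risks share a postal code but sit 2 km apart along the hazard gradient.
risk_a = surge_depth_m(0.2)  # near the shoreline
risk_b = surge_depth_m(2.2)  # farther inland

# A postal-code-resolution model effectively averages the two;
# a high-resolution model rates each location on its own hazard.
zip_average = (risk_a + risk_b) / 2
print(f"risk A: {risk_a:.1f} m, risk B: {risk_b:.1f} m, "
      f"postal-code average: {zip_average:.1f} m")
```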

Although the incorporation of this critical spatial information into a carrier's underwriting system can lead to more dynamic pricing scenarios and better management of portfolios of risk, TowerGroup's Bisker reports that many insurance carriers are still using these models manually. "Some have integrated models with their underwriting systems, but it's not as many as you might like to think," he says.

This lack of integration might have something to do with the regulation of the personal insurance market, hypothesizes RMS' Muir-Wood. "Because commercial lines are not regulated, it's much more opportunistic for [commercial underwriters] to use modeling more dynamically," he contends. "For personal insurers, filing rates can be a complex and slow procedure. Therefore, it is often not simple for a personal lines underwriter to employ the latest scientific knowledge to change the way that they apply differential rates for risk."

So, as with most technology strategies, the best catastrophe modeling approach for your organization depends on its specific needs. The following is a sampling of the approaches taken by a range of underwriters.
