Insurers’ Buried Treasure

Veteran insurance executives might find it amusing to explain their esoteric trade to the ignorant. They might patiently relate to an uncomprehending skeptic that it is possible to make money by predicting loss effectively through the examination of information about a given risk. But insurers might be less amused to consider that when it comes to technology-driven business intelligence (BI), they play the role of the uncomprehending skeptic - and that if they continue to do so, someone else will be making the money.

Insurance always has been about information, but emerging technologies make it possible to access, combine and analyze the information within insurers' grasp as never before. Insurance companies are loaded with data of their own and have access to data from many other sources. The ability to mine that data for actionable intelligence will be the basis for new levels of competitive distinction, as well as a new level of support for insurance's core competency - the effective prediction of loss. But the industry has yet to fully digest and act upon that insight. The problem is not that insurance leaders are ignorant of BI technologies and techniques; rather, they misapprehend their potential because they underestimate the value of the information they already possess.

Information Rich/Knowledge Poor

Insurance companies remain "information rich and knowledge poor," according to John Lucker, principal and Advanced Quantitative Services practice leader at Deloitte Consulting (New York). "Too few insurers think of information as their mega-asset," Lucker asserts. "There isn't enough attention paid to how best to store, inventory, clean, normalize and leverage information."

That is the case even in companies that have driven BI initiatives to one degree or another, Lucker asserts. In the first place, technology leadership too often is focused on the technology, infrastructure and architecture themselves rather than their business purposes, he argues. "Too often, the CIO doesn't take the 'information' part of the title seriously enough and ends up being more of a CTO than a CIO," Lucker says.

Insurers also are discouraged by the failure of data warehouse initiatives and paralyzed by the perception that the "dirtiness" of data presents an insurmountable obstacle, Lucker continues. "Nearly every client I've ever spoken with has said their data was too dirty to do business intelligence and complex analytics," he reports. And yet, Lucker insists, "We find their data to be quite clean - if it weren't, they couldn't run the company."
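
Lucker's claim is easy to test: a quick profiling pass over a policy extract usually shows that the fields the company actually runs on are largely populated and consistent. What follows is a minimal sketch of such an audit in Python, using pandas; the column names and values are entirely invented:

```python
import pandas as pd

# Hypothetical policy extract; columns and values are illustrative only.
policies = pd.DataFrame({
    "policy_id": ["P001", "P002", "P003", "P004"],
    "premium": [1200.0, 950.0, None, 2100.0],
    "effective_date": ["2004-01-15", "2004-03-01", "2004-03-01", None],
    "line": ["auto", "home", "auto", "AUTO"],
})

# Profile each column: how complete and how varied is it?
audit = pd.DataFrame({
    "pct_populated": policies.notna().mean() * 100,
    "distinct_values": policies.nunique(),
})
print(audit)

# Inconsistent casing ("auto" vs. "AUTO") is "dirt," but trivially normalized.
policies["line"] = policies["line"].str.lower()
```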

In other words, companies shouldn't wait for perfection to get started. The first step is to conduct a robust inventory of existing data sources within the company and the time frames for which the data is available, according to Lucker, who stresses that companies should by no means restrict their inquiries to electronic data. "If I could execute an analysis with a group of 10 people working with paper and pencil and generate $5 million in benefit, shouldn't I do that?" he asks.
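
In code, such an inventory can start as nothing more than a catalog recording what each source holds and the years it covers - paper archives included. A minimal sketch, with invented source names:

```python
from dataclasses import dataclass

@dataclass
class DataSource:
    name: str          # system or archive name (illustrative)
    contents: str      # what the source holds
    start_year: int    # earliest usable data
    end_year: int      # latest usable data
    electronic: bool   # False for paper archives, which count, too

inventory = [
    DataSource("policy_admin", "policies, endorsements", 1995, 2005, True),
    DataSource("claims_mainframe", "claims, reserves, payments", 1991, 2005, True),
    DataSource("branch_files", "underwriting notes", 1988, 1999, False),
]

# Which sources cover a ten-year analysis window ending in 2004?
for src in inventory:
    if src.start_year <= 1994 and src.end_year >= 2004:
        print(f"{src.name}: usable for the 1994-2004 window")
```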

Lucker advises that companies then inventory external data available from vendors or other sources, which could yield value when married to internal data. Likening data mining to digging for gold, Lucker identifies this technique as a poorly exploited source of rich ore. The question to ask, he says, is, "What are the real nuggets of external information that, when brought together with internal information, really provide value?"
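
Mechanically, that "marriage" is usually a keyed join of an external file onto the internal book, followed by a check of how far the new fields actually reach. A minimal sketch with hypothetical vendor fields:

```python
import pandas as pd

# Internal book of business (illustrative).
internal = pd.DataFrame({
    "account_id": ["A1", "A2", "A3"],
    "written_premium": [5000.0, 12000.0, 8000.0],
})

# Hypothetical external extract, e.g., from a commercial data vendor.
external = pd.DataFrame({
    "account_id": ["A1", "A3"],
    "credit_score_band": ["low_risk", "high_risk"],
    "years_in_business": [12, 2],
})

# A left join keeps the whole book; the match rate shows how far the "ore" extends.
enriched = internal.merge(external, on="account_id", how="left")
match_rate = enriched["credit_score_band"].notna().mean()
print(f"External match rate: {match_rate:.0%}")
print(enriched)
```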

Lucker also recommends that companies designate a "data czar" who is focused on the potential of the information embedded in the various data sources and dedicated to seeking opportunities to exploit the available information. It's not rare to find people at insurance companies who understand BI technologies, but it is rare for them to have this kind of scope and authority within the enterprise, he adds. "A data czar is someone who takes on the task of evaluating the kind of information a company has and can recognize uses that it can be put to, given the state of technology," Lucker explains. "He then develops an information strategy based on that - as opposed to pursuing piecemeal solutions."

The data czar should be empowered to recruit teams to study potential uses of information and should be able to operate creatively, with an understanding that experimentation will result in some failures along with success. Finally, work must be pursued aggressively. "It's not all right to be a laggard - you should seek to be a leader or a fast follower," Lucker submits. "The companies that know more, understand risk better and can price better than everyone else are going to be the most successful."

Refined Focus

The leadership of Seattle-based Safeco ($6.2 billion in 2004 revenue) agrees. The company, which narrowed its business focus with the sale of its life and investment divisions in 2004, has used BI technology to refine its P&C underwriting capability. In 2001, Safeco began to lay the foundation for what might be called a BI work in progress, according to Randy Eng, assistant vice president of business intelligence, Safeco, who explains that the carrier continues to build new insights on that foundation.

Seeing an opportunity to automate routine underwriting decisions, Safeco created predictive scorecards and predictive models to segment and price risk, Eng relates. "We wanted to identify a solution that was proprietary and innovative, based on the experiences we've had with our customers and distributors over the years," he says. "What it came down to was the wealth of information we had in different systems around the company."
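
Eng doesn't detail the models themselves, but a scorecard of this kind generally reduces to a model that maps risk characteristics to a loss propensity, then bins that propensity into tiers an automated underwriting rule can act on. A minimal sketch on synthetic data using scikit-learn's logistic regression; the features and cutoffs are invented:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic book: two illustrative rating characteristics per risk.
n = 1000
X = np.column_stack([
    rng.integers(18, 80, n),   # e.g., insured age
    rng.integers(0, 4, n),     # e.g., prior claim count
])
# Synthetic loss indicator, loosely tied to prior claims.
y = (rng.random(n) < 0.05 + 0.08 * X[:, 1]).astype(int)

model = LogisticRegression().fit(X, y)
propensity = model.predict_proba(X)[:, 1]

# Bin propensities into tiers for automated decisions.
tiers = np.digitize(propensity, bins=[0.10, 0.20, 0.35])
for t, label in enumerate(["accept", "accept with surcharge", "refer", "decline"]):
    print(f"{label}: {(tiers == t).sum()} risks")
```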

The carrier conducted a comprehensive inventory of data sources throughout the company, first in personal, then in commercial lines. "The challenge was bringing all that data together, normalizing it and chaining it - tracing the continuity across systems of all the related bits of information during the life of a customer or policy - so that there was a story that could be used in the development of our models," Eng explains.
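
"Chaining," in other words, means stitching each policy's records from separate systems into one ordered history on a common key. A minimal sketch with hypothetical extracts from two systems:

```python
import pandas as pd

# Hypothetical extracts, keyed on the same policy number.
policy_sys = pd.DataFrame({
    "policy_id": ["P1", "P1", "P2"],
    "event_date": ["2001-01-01", "2002-01-01", "2001-06-01"],
    "event": ["new business", "renewal", "new business"],
})
claims_sys = pd.DataFrame({
    "policy_id": ["P1", "P2"],
    "event_date": ["2001-07-15", "2002-02-10"],
    "event": ["claim reported", "claim reported"],
})

# Normalize, concatenate and order: each policy now tells one continuous story.
chained = (
    pd.concat([policy_sys, claims_sys], ignore_index=True)
      .assign(event_date=lambda d: pd.to_datetime(d["event_date"]))
      .sort_values(["policy_id", "event_date"])
)
print(chained)
```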

As an example of the challenge, Eng cites Safeco's assimilation of historical data from American States, a small-commercial insurer that Safeco acquired in 1997. "We didn't look at the most recent policy terms when we created our predictive models," he says. "In order to develop credible numbers from a loss development perspective, we had to go back to the early '90s to construct a viable warehouse of data."
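
The deep history matters because casualty losses keep developing for years after a policy term ends; credible development factors require old accident years that have matured. A minimal chain-ladder sketch on a toy cumulative loss triangle (all figures invented):

```python
import numpy as np

# Toy triangle: rows are accident years, columns are development ages.
# NaN marks ages not yet observed for the more recent years.
triangle = np.array([
    [1000.0, 1500.0, 1650.0],   # oldest year: fully developed
    [1100.0, 1700.0, np.nan],
    [1250.0, np.nan, np.nan],
])

# Age-to-age factors, computed from years observed at both ages.
factors = []
for j in range(triangle.shape[1] - 1):
    seen = ~np.isnan(triangle[:, j]) & ~np.isnan(triangle[:, j + 1])
    factors.append(triangle[seen, j + 1].sum() / triangle[seen, j].sum())
print("age-to-age factors:", [round(f, 3) for f in factors])

# Project each year's latest observed value to ultimate.
for i in range(triangle.shape[0]):
    for j in range(triangle.shape[1] - 1):
        if np.isnan(triangle[i, j + 1]):
            triangle[i, j + 1] = triangle[i, j] * factors[j]
print("projected ultimate losses:", triangle[:, -1])
```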

The scorecards and predictive models went live in 2002 via the Safeco Now online quote-and-issue platform. The carrier has continued to add new lines of business through 2005, and all automatically underwritten products, in both personal and commercial lines, are available through the portal.

Eng says the disciplines that drove creation of the scorecards and models are now being applied to other data sources to develop similar insights to address other challenges the carrier faces. These include what Eng calls "cracking the code" of the factors that drive customer satisfaction and, ultimately, customer behavior and loyalty; understanding what will enable the carrier to continue to grow profitably in a softening market; and developing insights to help strengthen bonds with its distribution partners. "By harnessing the collective intelligence that lies within our business, we believe we'll be able to solve some of our industry's age-old challenges in proprietary and innovative ways," Eng asserts.

According to Eng, gains are continual and cumulative. "It is a closed-loop process - we're growing our enterprise warehouse and have paid particular attention to the technology so that it's efficient and scalable," he stresses. "We're growing the infrastructure and we're sourcing that infrastructure with strategic data sources, and we're applying tools on top of the data to help us draw insights by various user groups in different areas of the company."

Akhil Tripathi, SVP and CIO of Harleysville, Pa.-based Harleysville Insurance ($1.3 billion in 2004 direct written premium), believes that a softening market makes BI more important than ever for P&C carriers such as Safeco and Harleysville. Accordingly, last year Harleysville leveraged its existing BI infrastructure - a large mainframe-based data warehouse - and a mixture of internal and external data sources, such as D&B's (Short Hills, N.J.) and Experian's (Costa Mesa, Calif.) databases, to target the small-commercial business niche. "We know that in the consumer space all the major players are leveraging consumer information; in the commercial space there has not been as much evolution," Tripathi says. "There are enormous amounts of data available in the public domain, and what we have done is to leverage that in terms of risk selection and underwriting."

BI Evolution

Tripathi says that those efforts are beginning to bear fruit, though he declines to share figures. And, he notes, Harleysville is moving to the next stage in its BI evolution, which will be implemented over the next 12 to 24 months. "Where we need to go from here is to evolve with more modern tools that facilitate the extraction and transformation of data and leverage that data onto a server-based environment that allows various business units to slice, dice and mine the data," he says. The goal is "custom underwriting," which Tripathi says "could be done in the field, within our agent relationship-oriented model, as opposed to focusing on referrals to the home office."

To drive the work to the next stage, Harleysville brought on board Fred Goxhaj, assistant vice president of enterprise data, whom Tripathi describes as the company's "data mastermind." Goxhaj first looked at what he calls the operational aspects of enterprise data, which involve upgrading the database platform and acquiring new extraction, transformation and loading (ETL) tools; new real-time application integration tools; and new BI tools that offer not just strong reporting, but also strong analytical and data-mining capabilities, he explains. "The reason we looked at operational aspects immediately is that we needed not just to upgrade the infrastructure, but also improve productivity to make sure the decision-making process flows much faster and in a more elegant and streamlined way all the way to the business units," Goxhaj says.
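
Stripped to essentials, the ETL layer Goxhaj describes is a repeatable extract-reshape-load pipeline. A minimal sketch using SQLite as a stand-in warehouse; the table and column names are invented:

```python
import sqlite3
import pandas as pd

# Extract: in production this would read from the source system, not a literal.
raw = pd.DataFrame({
    "POL_NO": [" p100 ", "P101"],
    "PREM": ["1,200.50", "980.00"],
})

# Transform: standardize keys and types before anything reaches the warehouse.
clean = pd.DataFrame({
    "policy_id": raw["POL_NO"].str.strip().str.upper(),
    "premium": raw["PREM"].str.replace(",", "", regex=False).astype(float),
})

# Load: append to the warehouse table.
with sqlite3.connect("warehouse.db") as conn:
    clean.to_sql("fact_premium", conn, if_exists="append", index=False)
    print(pd.read_sql("SELECT * FROM fact_premium", conn))
```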

Along with the technology upgrade comes an upgrade of skills and team structures. "We're pushing hard to organize various teams and processes to streamline the operations," Goxhaj relates.

A third push was made with regard to the actual data architecture, according to Goxhaj. "A great deal of work has been done in the past, but now the name of the game will be integration," he says. "We're bringing sources together - internal and external - integrating them with each other, allowing a multifaceted view of the business in the hands of the business users."

What prevailed prior to this push was an architecture visible on a department-by-department basis, within siloed information structures. What Harleysville will have going forward, according to Goxhaj, is one version of the truth, visible from various user standpoints, that integrates various sources under one enterprise data architecture and multiple, flexible physical data entities. "We want this architecture to support application development, operational reporting, strategic and trending analysis, and predictive analytics," Goxhaj relates. "This architecture will be the umbrella that manages and preps the data for decision making."
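
One common way to deliver "one version of the truth" from multiple standpoints - not necessarily Harleysville's actual design - is a conformed dimensional model: shared dimension tables that every subject area's fact table joins to, so departmental views reconcile to the same records. A minimal sketch with invented tables:

```python
import pandas as pd

# Conformed dimension shared by every subject area.
dim_policy = pd.DataFrame({
    "policy_key": [1, 2],
    "policy_id": ["P100", "P101"],
    "line": ["auto", "home"],
})

# Separate fact tables, both keyed to the same dimension.
fact_premium = pd.DataFrame({"policy_key": [1, 2], "premium": [1200.0, 980.0]})
fact_claims = pd.DataFrame({"policy_key": [1], "incurred": [450.0]})

# Underwriting and claims users query different facts but agree on the policy.
print(fact_premium.merge(dim_policy, on="policy_key"))
print(fact_claims.merge(dim_policy, on="policy_key"))
```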

Key to pulling that off are efforts directed at streamlining the cleansing and quality management of the data. Given the ambitious nature of the new data architecture, that will require not only new concepts but also new technology, according to Goxhaj.
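
Streamlining cleansing typically means codifying the rules - standardize, deduplicate, validate - so they run on every load rather than as one-off fixes. A minimal sketch of such a rule set, with invented fields:

```python
import pandas as pd

customers = pd.DataFrame({
    "name": ["Acme Mfg", "ACME MFG.", "Beta LLC"],
    "zip": ["19438", "19438", "1904"],
})

def cleanse(df: pd.DataFrame) -> pd.DataFrame:
    """Apply the standing cleansing rules; extend the rules, don't patch ad hoc."""
    out = df.copy()
    # Rule 1: standardize names (case, punctuation) before matching.
    out["name_std"] = (out["name"].str.upper()
                                  .str.replace(r"[^\w\s]", "", regex=True)
                                  .str.strip())
    # Rule 2: deduplicate on standardized name plus ZIP.
    out = out.drop_duplicates(subset=["name_std", "zip"])
    # Rule 3: validate ZIPs; rows that fail would be quarantined, not loaded.
    return out[out["zip"].str.fullmatch(r"\d{5}")]

print(cleanse(customers))
```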

However, the greatest challenge the initiative presents is making the data strategy business-centric. Goxhaj observes that it is a well-known syndrome of IT shops to be data-centric. "Although we have a good history of being business oriented, and the IT shop has done an excellent job working with the various business units to accommodate their needs, the architecture itself needs to become more business-centric," he says. "It must surround the whole business process and not simply accommodate unique pieces of data - and that will change the whole business strategy mind-set."

Goxhaj's team now is in the process of interviewing business units across the company, identifying their needs and logging specific requirements. "These requirements are going to be the main driver of the new architecture," he says.

On The Net

www.deloitte.com
www.safeco.com
www.safeconow.com
www.harleysville.com
www.dnb.com

Anthony O'Donnell has covered technology in the insurance industry since 2000, when he joined the editorial staff of Insurance & Technology. As an editor and reporter for I&T and InformationWeek Financial Services at TechWeb, he has written on all areas of information ...
