Bring It on Home
Determining the relative merits of more- or less-centralized data center models is a recurring dilemma for technology decision makers. Keeping things in one place tends to bring advantages of control and efficiency. But monolithic models are bound to give way to more decentralized schemes to allow greater flexibility and agility. Distributed computing approaches have their own drawbacks, of course. Decentralizing resources tends to breed redundancy and isolate parts of the enterprise from each other.
With a tight economy now forcing efficiency, and the events of September 11 raising novel security concerns, the pendulum is swinging back toward more centralized approaches to the data center. But the state of technology today means it's no longer a question of simply being centralized or decentralized. As it becomes easier to move information between points, the debate turns to whether assets are dedicated or shared. Furthermore, centralization can apply to management rather than to the physical location of assets, a distinction that matters because the risk of housing all of a company's technical architecture in a single location came starkly to light after September 11. As a consequence, gaining some of the benefits of centralization may not mean giving up all the advantages of distributing technology resources.
"Centralization and decentralization is a cyclical process that we go through over the years," says Robert Bruce, president, Fiserv Advanced Insurance Solutions (formerly The FREEDOM Group, West Des Moines, IA). On the one hand, companies centralize their IT resources to achieve economies of scale and reduce costs in timessuch as nowwhen expense control is critical, Bruce asserts. On the other hand, centralization tends to lessen the degree of service to a firm's business units. "There's always a trend to end up in a more decentralized model to improve levels of service and get the IT organizations closer to the people doing the work," Bruce says. "That tends to happen in boom times when business units have funds and are seeking a competitive performance edge. Ultimately it cycles back again."
Among the reasons for its return are the obstacles that need to be overcome in order to unite different parts of the business. "It's amazing how many enterprise initiatives, such as CRM, are running into problems based around infrastructure questions," says Chuck Johnston, vice president and director, insurance information strategies, META Group (Stamford, CT). "In part, it's a question of what has to live in a 'glass house,' what applications can coexist properly together, and how do processes have to change in order to deal with the 7x24 world."
Large multi-line insurers play in several distinct businesses, which move at different velocities and have characteristic requirements, Johnston notes. When the data of those businesses is spread out in a distributed environment, there is a fear of losing some of it and a consequent desire to corral it into a more protected data center environment. "But, by the same token, they don't want to lose the flexibility that in some cases is the hallmark of that level of security," Johnston says. Despite technological advances in distributed management, "the mainframe is still the best managed platform in the enterprise," he adds. But companies "are not ready to give up the flexibility which is the driving force behind client/server."
Exactly where and in what state a company should keep its data center assets has taken on a new urgency in the wake of the September 11 terrorist attacks. Fiserv AIS's Bruce opines that the attacks are likely to encourage more centralized approaches, since insurers can thereby "increase their security environment by having a critical mass of people focus on that function; in a more decentralized mode it's more costly and slower to have that corporate oversight of data," he says.
Eggs in a Basket
But there are also concerns of the "all-the-eggs-in-one-basket" variety, which counsel spreading exposure through the distribution of assets, including human capital. In this respect, according to Pat Tilton, partner, KPMG LLP (New York), "It's not to say that companies are moving away from centralization, but they're re-evaluating it and saying, 'Gee, do we have an increased business risk by being centralized?'" A likely alternative, Tilton adds, is "a centrally managed data environment, but housed in multiple locations."
Because technology has reached a point where it's easier to combine flexibility with control and efficiency, according to Tilton, the virtues of centralized and decentralized models need not all end up on one side of an either/or proposition. Consequently, "companies can look to see how flexible they can be technologically, while still operating cost-effectively and efficiently," she says.
Over the years, insurers have indeed often looked at the question as an all-or-nothing proposition, according to Craig Lowenthal, CIO, Hartford Financial Products (HFP, New York; a unit of The Hartford, $170.6 billion in assets). "It's always one extreme or another; why can't we find a happy medium that says, 'Let's use centralized/shared services where it makes sense and let's use decentralized services where that makes sense,'" he asks.
The ability to leverage common infrastructure-related products and services across an organization via shared services means the technology organization in a given business unit can concentrate on core concerns. "I don't have to spend a lot of my time on proven technologies or LAN/WAN, or figuring out what machines to purchase," Lowenthal says. Through shared services the organization gains efficiencies in purchasing and maintenance, and enjoys a kind of plug-and-play interchangeability.
A certain degree of control is relinquished through shared services, but the economy and flexibility are worth it, according to Lowenthal. "However, when it comes to application development, I need to be flexible enough to go out and underwrite a new product; I can't afford to wait for the technology from some other part of the organization or go through certain corporate processes," which might take months, he says. "If you pick and choose your shared services, you can leverage the best of both worlds; it doesn't have to be more expensive to be decentralized in the application environment."
In fact, to strike the most profitable balance, it's necessary to qualify the concept of centralization. There is no doubt that economics will drive some kind of centralization, since distribution of assets tends to result in low utilization, according to Steve Sheinheit, CTO, MetLife ($302.5 billion in assets). To conceive in terms only of centralized and decentralized is an oversimplification, Sheinheit asserts. Other dimensions to consider, he says, include, "Are we talking about the physical location of the equipment? Are we talking about the management of the data center or environment or equipment?"
The first thing to be done is to set the strategy and the architecture, according to Sheinheit. "Once I do that I can determine whether an asset is dedicated or shared, and then what the best location is for it," he says. "The location then becomes an economic decision around the best place to put it to handle both business and protection requirements," he says. "It's not until you can get into a shared environment with integrated and common technologies that you can raise the capacity of utilization."
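Sheinheit's utilization argument can be illustrated with a rough back-of-the-envelope sketch. Everything in it is an assumption made up for illustration, not data from MetLife or this article: four business units with arbitrary peak demands, a common server capacity, and a guessed 60 percent coincidence of peaks. The point is simply that sizing a dedicated box around each unit's own peak leaves most capacity idle, while pooling the same workloads onto shared assets pushes average utilization up.

```python
import math

# Hypothetical sketch of the pooled-utilization argument.
# All figures below are illustrative assumptions, not numbers from the article.

def servers_needed(peak_demand: float, capacity_per_server: float, headroom: float = 0.25) -> int:
    """Servers required to carry a peak demand while reserving some headroom."""
    usable = capacity_per_server * (1 - headroom)
    return math.ceil(peak_demand / usable)

# Four business units, each sized around its own peak demand (arbitrary units).
unit_peaks = [20, 15, 30, 25]
capacity = 100  # capacity of a single server, same arbitrary units

# Dedicated model: one box per business unit, each mostly idle off-peak.
dedicated_servers = len(unit_peaks)
dedicated_util = sum(unit_peaks) / (dedicated_servers * capacity)

# Shared model: peaks rarely coincide, so the pooled peak is assumed to be
# 60 percent of the sum of individual peaks (but never below the largest one).
pooled_peak = max(0.6 * sum(unit_peaks), max(unit_peaks))
shared_servers = servers_needed(pooled_peak, capacity)
shared_util = sum(unit_peaks) / (shared_servers * capacity)

print(f"Dedicated: {dedicated_servers} servers, average utilization ~{dedicated_util:.0%}")
print(f"Shared:    {shared_servers} server(s), average utilization ~{shared_util:.0%}")
```

Under these assumed figures, the dedicated model runs several lightly loaded machines, while the shared pool carries the same work on far fewer, better-utilized ones; the location of those shared assets then becomes, as Sheinheit puts it, an economic decision.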