Insurer Recovery Teams Do the Job
In the wake of the devastating attacks on the United States, one high-priority item on most insurance company CIOs' to-do lists will be not only re-evaluating disaster recovery plans, but also rehearsing and running through the entire disaster recovery scenario.
"The recent terrorist attacks underscore the critically important role of insurance companies' technology organizations," according to Glenn Sieber, managing partner, insurance industry practice, North America, Accenture (Chicago). "This is the time to ensure that disaster planning and recovery are sufficient to enable operations to continue in the wake of a catastrophe."
One company that saw its contingency plans kick in successfully was New York-based American International Group (AIG, $268 billion in assets). On September 11 AIG had offices in nine buildings in and around Ground Zero, including its headquarters at 70 Pine St. and its Aviation Brokerage subsidiary on the 53rd floor of One World Trade Center.
"By the following morning we had most of our New York City operations staff functional in Livingston, Berkeley Heights and Basking Ridge in New Jersey, and in Jericho, New York," says Mark Popolano, the firm's chief information officer and an Insurance & Technology Elite 8 2001 Award honoree (see profile). "We were about 95 percent operational by about 10 am" on Tuesday morning, September 12.
Internet Never Went Down
AIG's Internet operation was unaffected because it is housed in Livingston, Popolano says. "Most of our systems had backups and recovery tapes, and we had SunGard (Wayne, PA) already initiated and on call," he recounts. "UPS (uninterruptible power supply) systems kicked in automatically in the main building when local utility Con Edison went out, and that allowed our routing systems to be fully operational, along with our phone systems."
The first order of business at Livingston on the morning of September 12 was to set up a crisis center, according to Popolano. "The next thing we did was lay out a plan and create a book with everyone's team organizations, phone numbers, access points and e-mail addresses, along with the groups, organizations and units they had to cover," he says. "Then we created crisis running-event logs by building, time and status. This is what we kept communicating to people."
Communications were maintained initially via the Internet, and the crisis center was up and running by 4 pm, with functioning emergency 877 phone numbers. "We had business units already calling us and letting us know what was going on, and we started doing two-hour updates to all systems personnel," according to Popolano.
The crisis management team also linked to help desks and call centers in order to ticket and track all technology-related transactions, Popolano says. "By contacting multiple technology vendors, such as Compaq (Houston) and Dell (Round Rock, TX), we got equipment in within six hours," Popolano adds.
Once crisis unit team members were identified, each organization was assigned a team lead. "We then created service packs with equipment such as flashlights, masks, goggles and gloves," Popolano says. "That way, when we came back to New York we could do a floor-by-floor search to make sure that each one of the LAN rooms was clean, prepped and powered." Popolano's teams worked 12-hour shifts until Sunday, September 23, by which time they had also fended off a computer virus attack that came in the wake of the WTC catastrophe.
Anthony O'Donnell has covered technology in the insurance industry since 2000, when he joined the editorial staff of Insurance & Technology. As an editor and reporter for I&T and TechWeb's InformationWeek Financial Services, he has written on all areas of information ...