Insurance & Technology is part of the Informa Tech Division of Informa PLC



Nicholas Ilyadis, Network Computing

Rethinking Data Center Design

With the skyrocketing number of connected devices and data processing requirements, data center operators are migrating to a new open architecture that's focused on virtualization.

By the end of the decade, the number of connected devices is expected to reach 50 billion. These billions of devices are generating a massive amount of data: as early as 2017, an estimated 7.7 zettabytes of data will cross the network. These processing demands pose a massive challenge for the data center ecosystem as operators abandon client-server and LAN architectures in favor of designs that emphasize virtualization across servers, storage, and networking.
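To put those two headline figures side by side, a quick back-of-envelope calculation is useful. Note the figures come from different forecasts (7.7 zettabytes is a 2017 traffic estimate; 50 billion devices is an end-of-decade projection), so the per-device result below is purely illustrative:

```python
# Back-of-envelope scale check using the article's figures.
# Illustrative only: the two numbers are from different forecast years.
ZETTABYTE = 10**21  # bytes (decimal SI prefix)

annual_traffic_bytes = 7.7 * ZETTABYTE   # forecast network traffic
connected_devices = 50 * 10**9           # forecast device count

per_device_bytes = annual_traffic_bytes / connected_devices
per_device_gb = per_device_bytes / 10**9  # decimal gigabytes

print(f"~{per_device_gb:.0f} GB of network traffic per device per year")
# → ~154 GB of network traffic per device per year
```

Even averaged across tens of billions of mostly lightweight devices, that works out to triple-digit gigabytes per device annually, which is the kind of aggregate load driving the data center redesigns the article describes.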

Increasingly, companies are embracing a more flexible, open platform built on the technology pillars of mobile computing, cloud services, big data, and social networking. Trendsetters such as Facebook are building megascale data centers to handle the tremendous bandwidth demands and workloads. Facebook has said it achieved $1.2 billion in savings as a result of its open-platform approach.

Many enterprises are embracing cloud computing, essentially buying compute capacity from a third party, which spares them the capital and operating expenses of running their own data centers. As a result, cloud service providers are among the heaviest investors in open-platform, megascale data centers. Traditional server vendors, which provide high-level service at a premium, are likely to face serious competition from open-platform vendors offering less expensive, more flexible, and more scalable infrastructure.

Using an open-platform approach means looking at a data center development project as a whole. Though servers are a core technology, it's important to look at the entire system of servers, storage, networking, and software together and take a fresh approach to how those components need to be better integrated to bring truly disruptive change to the data center.


An open-platform approach touches on more than just the server, but the server still plays a critical role in delivering the capacity, processing speed, and energy efficiency demanded of the next-generation data center. As virtualization becomes the norm, each physical server must be built to host scores of virtual machines in order to drive up utilization. Servers need to be powered by multi-core processors that are both fast and energy efficient, and they must seamlessly interact with increasingly virtualized storage and networking systems.
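The utilization argument behind that consolidation can be sketched with simple arithmetic. The per-VM load and VM count below are hypothetical numbers chosen for illustration, not figures from the article:

```python
# Illustrative consolidation math (hypothetical inputs, not vendor data).
# Dedicated physical servers often idle at low single-digit utilization;
# packing N virtual machines onto one host multiplies its aggregate load.
def consolidated_utilization(per_vm_load: float, vms_per_host: int) -> float:
    """Aggregate CPU utilization of one host running vms_per_host VMs,
    each contributing per_vm_load (a fraction of host capacity)."""
    return per_vm_load * vms_per_host

# e.g. 20 VMs, each needing 4% of a host's capacity
u = consolidated_utilization(0.04, 20)
print(f"host utilization: {u:.0%}")
# → host utilization: 80%
```

Under these assumed numbers, a workload that would leave 20 dedicated machines 96% idle instead keeps one host 80% busy, which is the utilization gain that motivates building servers around dense virtualization.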

Read the rest of this article on Network Computing
