What Is a Data Center?

The need to efficiently manage data has grown as the world produces data at a breakneck pace. By 2025, the volume of global data is expected to nearly triple its 2020 level. With more data being created, consumed and exchanged, data centers will play a critical role in sustaining companies and countries through the digital age.

What Is a Data Center?

Data centers are facilities that process, transmit and store data. They house large amounts of IT equipment — including routers, storage devices and servers — along with environmental controls that remove the heat the hardware generates. Because they are the primary repository for company data, data centers are equipped with security features that keep operations uninterrupted and protect against cyber attacks.

Data center design has trended toward hyperscale centers — multi-level facilities with a high density of IT equipment and greater efficiency than smaller, traditional data centers. Site selection is driven by factors such as security, network proximity, tax incentives, access to renewable energy and energy costs.


Why Are Data Centers Important?

As of September 2023, there are more than 5,000 data centers in the United States. That’s roughly double the count from the previous year. In addition, the North American data center market could grow at an annual rate of over 15 percent, surpassing $55.8 billion by 2028.

Why such growth? Consumption. As storage and compute usage rises, generally speaking, so too does infrastructure demand. That’s hardly surprising considering that data centers are the load-bearing beams that support just about everything we do online.

Data centers power much of what we do online, from streaming video and e-commerce to email, social media and cloud-based business applications. That said, if you never thought about where “the cloud” physically lives, you’re hardly alone. But whether or not we connect the dots, data centers remain the backbone of our increasingly online lives.


How Do Data Centers Work?

A data center contains a network of servers, with each server possessing memory, storage space and a processor — similar to a personal computer, but much more advanced. Servers are also linked to each other through internal or external connections, so they can communicate with each other to receive, store and share data. To keep servers’ workloads manageable, data centers rely on software to group servers and distribute workloads among these groups. 
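The grouping-and-distribution idea can be sketched in a few lines. This is an illustrative example, not any specific data center's software; the server names, capacities and least-loaded policy are all assumptions:

```python
# Illustrative sketch of how data center software might distribute workloads
# across a group of servers (names, capacities and the least-loaded policy
# are assumptions for the example, not any vendor's actual scheduler).

class Server:
    def __init__(self, name, capacity):
        self.name = name
        self.capacity = capacity  # maximum concurrent workload units
        self.load = 0             # workload units currently assigned

    def utilization(self):
        return self.load / self.capacity

def assign(servers, units):
    """Send a workload to whichever server is least utilized."""
    target = min(servers, key=Server.utilization)
    target.load += units
    return target.name

cluster = [Server("web-01", 100), Server("web-02", 100), Server("web-03", 50)]
for job in (30, 30, 30, 10):
    assign(cluster, job)

print({s.name: s.load for s in cluster})  # loads spread across the group
```

Real schedulers weigh many more signals (CPU, memory, network, affinity rules), but the core loop of measuring utilization and steering new work to spare capacity is the same.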

While data centers are still physical facilities, modern data centers often feature a combination of physical and virtual servers as a result of virtualization. Virtualization takes memory, storage space and other hardware components of computers and allows them to be divided among virtual computers, or virtual machines (VMs). A VM may only contain a part of the physical computer’s hardware, but it still acts like a computer.
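A toy model can make virtualization concrete. The host size and VM names below are made up, and real hypervisors are far more sophisticated, but the basic bookkeeping, carving one machine's CPU and memory into slices that each behave like a computer, looks roughly like this:

```python
# Toy model of virtualization (host size and VM names are assumptions for
# illustration, not a real hypervisor API): one physical host's CPU and RAM
# are divided among virtual machines, each acting like its own computer.

class Host:
    def __init__(self, cpus, ram_gb):
        self.cpus, self.ram_gb = cpus, ram_gb
        self.vms = []

    def create_vm(self, name, cpus, ram_gb):
        used_cpus = sum(v["cpus"] for v in self.vms)
        used_ram = sum(v["ram_gb"] for v in self.vms)
        if used_cpus + cpus > self.cpus or used_ram + ram_gb > self.ram_gb:
            raise ValueError("host is out of capacity")
        self.vms.append({"name": name, "cpus": cpus, "ram_gb": ram_gb})

host = Host(cpus=64, ram_gb=512)
host.create_vm("db-vm", cpus=16, ram_gb=128)   # each VM gets only part of
host.create_vm("app-vm", cpus=8, ram_gb=64)    # the hardware, but acts like
print(len(host.vms))                            # a full computer
```

The capacity check is the key point: a VM can only ever claim a slice of the physical machine, which is what lets many workloads share one server safely.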

This transition to a virtual, cloud-based approach enables data centers to better manage resources, applications, services and workloads. Data centers can now accomplish more while using less hardware and resources, making them more scalable as they take on greater workloads and adapt to changing customer needs. 


What Are the Core Components of a Data Center?

Data centers are made up of core components that can be broken down into three main categories: computing, storage and network. Additional support infrastructure ensures these various elements are performing properly and working well together.

Computing Infrastructure

Computing infrastructure consists of servers, which are systems or devices that can receive, store and share data with other computers and devices. Servers accomplish these tasks through memory, storage and processing capabilities. These are the most common types of servers data centers use: 

  • Rack servers: Rack servers are flat, rectangular servers designed to be mounted one above another in a standardized rack. This efficient setup leaves room for each server to have its own cooling fans, cables, power supplies and other features.
  • Blade servers: Blade servers slot side by side into a compact enclosure known as a chassis. The chassis supplies shared network and power resources, giving blade servers greater density and energy efficiency than rack servers.
  • Mainframes: Mainframes are more advanced than typical servers and can handle billions of calculations in real time. A single mainframe can do the same amount of work as a room full of rack or blade servers.

Storage Infrastructure

Data centers must have some kind of storage system in place, so servers can manage company data. Below are a few types of storage methods:  

  • Direct-attached storage (DAS): True to its name, DAS refers to a storage system that is directly connected to a server. This system allows only the host computer to access data. Any other devices must go through the host computer first to reach the data.
  • Network-attached storage (NAS): NAS uses an Ethernet connection to give multiple servers access to shared data storage. Data centers can then store massive amounts of files, making this method well suited to maintaining media archives.
  • Storage area network (SAN): While similar to NAS, a SAN connects servers, hardware and software over a separate network dedicated solely to storage traffic, giving it higher performance and capacity than NAS.

Network Infrastructure 

A system of cables, switches and routers connects servers to each other and enables the flow of data throughout a data center. At the same time, firewalls let data centers monitor and regulate traffic, preventing cyber threats from infiltrating their networks.

Support Infrastructure

Companies and customers need data centers to be reliable storers and providers of data. To ensure seamless operations, data centers are equipped with additional features: 

  • Uninterruptible power supplies (UPS) sustain data centers during outages and surges. 
  • Power generators run data centers during more extreme outages.
  • Ventilation and cooling systems keep servers at a reasonable temperature.
  • Fire suppression systems put out fires quickly to prevent damage to data centers. 
  • Properly spaced power cables prevent cross-talk and overheating.
  • Redundant array of independent disks (RAID) protects data by mirroring it or storing parity information, so the contents of a failed disk can be rebuilt.
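The RAID entry above deserves a closer look. One common scheme (RAID 5-style parity) stores the XOR of the data blocks, so any single lost block can be rebuilt from the survivors. A minimal sketch, with made-up block contents:

```python
# Sketch of XOR parity as used in RAID 5-style arrays: the parity block is
# the XOR of the data blocks, so any one lost block can be reconstructed
# from the rest. Block contents are made up for the example.
from functools import reduce

def xor_blocks(blocks):
    """XOR equal-length byte blocks together, byte by byte."""
    return bytes(reduce(lambda a, b: a ^ b, byte_tuple)
                 for byte_tuple in zip(*blocks))

disk1 = b"hello world!"
disk2 = b"data center!"
parity = xor_blocks([disk1, disk2])   # stored on a third disk

# Simulate losing disk1: rebuild it from the surviving disk and the parity.
rebuilt = xor_blocks([disk2, parity])
print(rebuilt)  # b'hello world!'
```

Because XOR is its own inverse, the same function both computes the parity and performs the rebuild; production arrays do this across many disks and at the block-device level.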


What Are the Different Data Center Standards and Tiers?

The Uptime Institute standard is the most widely used system for classifying power resiliency in the industry. The Uptime Standard is a four-tier ranking — the higher the tier, the greater the redundancy.

  • Tier I (Basic Capacity): Data centers that have one path for power, cooling and network, and no backup.
  • Tier II (Redundant Capacity): Data centers on which “select” maintenance can be performed without impacting service, and which have “an increased margin of safety” against equipment failures over Tier I centers. Tier II data centers have a backup generator system for power and cooling systems.
  • Tier III (Concurrently Maintainable): Data centers that can have maintenance performed on any component of the system without interrupting service. Tier IIIs have more than one generator and more than one cooling system.
  • Tier IV (Fault Tolerant): Data centers that have at least two fully independent paths for power, cooling and network, so the failure of any single component does not interrupt service.
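While the Uptime Institute defines the tiers in terms of redundancy rather than uptime percentages, each tier is commonly associated with an availability figure, which translates into a yearly downtime budget. The percentages below are the commonly cited values, assumed here for illustration:

```python
# Availability figures commonly associated with the Uptime Institute tiers
# (assumed here for illustration), converted into yearly downtime budgets.

tier_availability = {
    "Tier I": 99.671,
    "Tier II": 99.741,
    "Tier III": 99.982,
    "Tier IV": 99.995,
}

HOURS_PER_YEAR = 365 * 24

for tier, pct in tier_availability.items():
    downtime_hours = (1 - pct / 100) * HOURS_PER_YEAR
    print(f"{tier}: up to {downtime_hours:.1f} hours of downtime per year")
```

The jump from Tier II to Tier III is the dramatic one: roughly 22 hours of allowable downtime per year shrinks to under two.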

Once a data center builder establishes a track record and becomes a trusted entity, the certification itself is often considered superfluous within the industry, particularly in the United States. (Certifications, even from established colocation builders, matter more outside the United States, in emerging markets, where clients may be less familiar with the providers.) There’s even less need to go through the process for tech firms that build data centers for their own operations, unlike colocation providers, which rent out space.


What Are the Types of Data Centers?

Cloud Data Centers

Cloud data centers are built and maintained by cloud computing providers, such as Amazon, Microsoft and Google. Clients rent instances for storage and computing tasks and have no access to the physical servers.

Colocation Data Centers

Colocation data centers are facilities in which clients rent space for, and manage, their own servers. Responsibility for power, cooling, resiliency, security and other environmental support falls to the facility owner.

Managed Service Data Centers

These data centers provide similar services as cloud providers but also allow users access at the physical server level.

Enterprise Data Centers

Enterprise data centers are built and maintained by the single company that uses their servers. These range from small, on-premise facilities that predate the cloud computing trend to multi-level hyperscale campuses built by and for the likes of Meta.


What Is in a Data Center Facility?

Take a 20-minute walk south down Pacific from the historic AOL-turned-Stack facility and you’ll arrive at VA11, a 250,000-square-foot Vantage data center that is, in some ways, representative of broader data center approaches and, in others, more distinctive.

Built in 2019, VA11 is the first of five planned data centers on what will be a $1 billion, 142MW campus, stretched across 42 acres. It’s indicative of a long-term trend away from smaller data centers toward densely built hyperscale designs, which tend to be far more energy efficient than their smaller predecessors.

Flooring layout is a big consideration in data centers. VA11 incorporates both of the two most common design options in different parts of the facility: raised floors and concrete floors. Raised floors, which hide cabling below your feet, are hardly unique to data centers. (Your office building may have them.) But the gaps between your feet and the real floor are deeper in data centers, with around two feet of airspace to help with cooling. This flooring tends to be used with lower-power servers, which require less sophisticated airflow.

Even though the industry is, by and large, moving toward concrete slabs, water piping considerations — and a strong dislike of visible cable clutter — may lead some data center designers to opt for some raised flooring.

Concrete floors are more efficient and advantageous, according to Lee Kestler, chief commercial officer of Vantage and a longtime leader in the Northern Virginia market. They allow you to bring in heavier servers, and all the wiring for both network and power lives above the racks, which simplifies access. (At VA11, that cabling is sheltered overhead by a Gordon ceiling grid, the assembly video for which is a fun watch for engineering nerds.)

Tim Hughes, director of strategy and development at data center provider Stack Infrastructure, noted that Stack favors concrete, but agreed that the call boils down to situational drivers. “It’s really a combination of the client’s preferences combined with the capabilities of the facility,” he said.

As you might expect, security is an important detail too. At VA11, that includes perimeter fencing, CCTV monitoring and 24/7 security patrols. Every door is alarmed and uses PIN, badges and biometrics for entry. Inside the infrastructure rooms, low-sensitivity smoke detectors “detect the smallest amount of smoke,” said Steve Conner, vice president of solutions engineering and sales at Vantage, in a walkthrough video, which also peeks into the mechanical gallery, generator yard, electrical room and other spots in the building.

Even though servers can motor along at surprisingly high temperatures, guidelines exist to make sure they run at optimal ones. To keep things cool, both Vantage Data Centers and Stack Infrastructure use air-cooled chillers, which sit on the rooftop and circulate cooling fluid through a closed loop into the data rooms, while devices called computer room air handling (CRAH) units draw out heat.

In so-called hot aisles, where the backs of server racks face each other, temperatures can reach as high as 100 degrees Fahrenheit. “That heat gets evacuated up into a plenum, taken back to the air conditioning units, recycled and pushed back through — and the cold air comes out around 72 [degrees],” Kestler said.
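A standard HVAC rule of thumb ties those temperatures to cooling capacity: sensible heat removed is roughly 1.08 × airflow (in CFM) × temperature difference (in °F), expressed in BTU per hour. The airflow figure below is an assumption for illustration, not a Vantage specification:

```python
# Rule-of-thumb sensible cooling estimate. The 1.08 factor is the standard
# HVAC constant for air at sea level; the 10,000 CFM airflow is an assumed
# figure for illustration, not a measurement from any real facility.

BTU_PER_HOUR_PER_KW = 3412

def heat_removed_kw(airflow_cfm, hot_f, cold_f):
    """Approximate heat removed by cooling air from hot_f to cold_f."""
    btu_per_hr = 1.08 * airflow_cfm * (hot_f - cold_f)
    return btu_per_hr / BTU_PER_HOUR_PER_KW

# A 100 °F hot aisle returned as 72 °F supply air, at 10,000 CFM:
print(f"{heat_removed_kw(10_000, 100, 72):.0f} kW of IT load removed")
```

This is why the 28-degree spread between hot aisle and supply air matters: for a fixed airflow, the wider the temperature difference, the more server heat each air handler can carry away.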

What do data centers do?

Data centers act as a central location for organizations to store the infrastructure needed to run their applications and services and manage any associated data. Appropriate personnel can then easily access and share data as needed to serve the needs of employees, customers and other stakeholders.

What are the types of data centers?

Cloud, colocation, managed service and enterprise data centers are the four main types of data centers.

How do data centers make money?

Operators of data centers may choose to lease out space or computing power to customers as a way to finance data center operations.

