Humidity control is a critical component in the design and operation of data centers, though it can take a backseat to the more obvious issue of heat generation.

Too much humidity, and you risk condensation, internal corrosion, and electrical shorts; not enough humidity, and you risk electrostatic discharge (ESD). Beyond the absolute level, the rate of change of humidity can cause sensitive equipment, such as tape storage media, to expand and contract, leading to damage or premature failure.

There is a fine line between a well-controlled data center and one that houses equipment in danger of catastrophic failure. The installed mechanical systems must be provisioned and optimized to prevent downtime. Here are some considerations to ensure you are getting the most out of your mechanical systems.


Isolate Critical Spaces from Outside Conditions

Local climate is one of the biggest factors in humidity control. Typically, a makeup air system provides minimum ventilation to critical spaces. Precisely controlling and conditioning that ventilation air prevents excess moisture from being introduced into critical spaces. Buildings that utilize vapor barriers and tight construction will naturally limit the introduction of outside humidity. In certain cases, it may make sense to buffer critical spaces from external walls with noncritical spaces to minimize infiltration. Regardless of how moisture enters, action must be taken to control humidity once it is inside critical spaces.
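
To make the ventilation decision concrete: the makeup air handler must dehumidify whenever the outdoor dew point exceeds the dew point maintained in the space. Here is a minimal sketch of that comparison, using the Magnus approximation and hypothetical setpoint and weather values:

    import math

    def dew_point_c(temp_c: float, rh_pct: float) -> float:
        """Dew point in degrees C via the Magnus approximation (over water)."""
        gamma = math.log(rh_pct / 100.0) + 17.62 * temp_c / (243.12 + temp_c)
        return 243.12 * gamma / (17.62 - gamma)

    # Hypothetical muggy afternoon vs. an assumed 12 C space dew point setpoint.
    outdoor_dp = dew_point_c(32.0, 65.0)  # roughly 24.6 C
    SPACE_DP_SETPOINT_C = 12.0

    if outdoor_dp > SPACE_DP_SETPOINT_C:
        print(f"Dehumidify makeup air: outdoor dew point {outdoor_dp:.1f} C "
              f"exceeds the {SPACE_DP_SETPOINT_C:.0f} C space setpoint")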


Controlling Humidity within the Space

Low relative humidity (RH%) is typically addressed by adding a humidification system that precisely controls the amount of water added to the air. Humidification systems must be carefully designed and operated to avoid excessive energy or water use. There may also be opportunities to limit outside air or correct issues with the cooling system so that additional humidification is unnecessary. These options should be considered before additional equipment is installed.

High RH% can be controlled in a variety of ways, the most common being removal of moisture via direct-expansion or chilled water cooling coils. This can be done at the makeup air system or directly within the cooling system serving the critical spaces.

Another approach to high RH% is raising the space temperature of critical areas, which naturally lowers RH because warmer air can hold more moisture. Note: this can only be done within the allowable temperature ranges for the IT equipment, and it has its limitations. Make sure you fully understand the root cause of your high RH% before implementing solutions.
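
To see why this works, recall that relative humidity compares the air's actual moisture content to its saturation capacity, and saturation capacity rises with temperature. The short sketch below, using the Magnus approximation and illustrative numbers, shows warming a space from 68°F to 77°F cutting RH from 60% to about 44% without removing any moisture at all:

    import math

    def saturation_vapor_pressure_hpa(temp_c: float) -> float:
        """Saturation vapor pressure over water (hPa), Magnus approximation."""
        return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

    def rh_after_warming(t_old_c: float, rh_old_pct: float, t_new_c: float) -> float:
        """RH at the new temperature, holding absolute moisture content constant."""
        vapor_pressure = (rh_old_pct / 100.0) * saturation_vapor_pressure_hpa(t_old_c)
        return 100.0 * vapor_pressure / saturation_vapor_pressure_hpa(t_new_c)

    # Warming from 20 C (68 F) to 25 C (77 F) at constant moisture content:
    print(f"{rh_after_warming(20.0, 60.0, 25.0):.1f}% RH")  # ~44.3%, down from 60%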

Transients and excursions are more difficult to identify and control. Certain factors, such as sensor placement and rack density, could create unfavorable conditions for IT equipment and exacerbate transients and excursions. A common approach is to analyze averaged space condition trend data, but this must be done with caution. Averaged data typically smooths trends and may not indicate transients or excursions that could be damaging your equipment. Wherever possible, analyze raw data at the sensor or device level.
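
The synthetic example below illustrates the danger: an hour of one-minute RH readings whose hourly average looks perfectly healthy, even though the raw samples contain a short spike and a steep rate of change (all values are invented for illustration):

    import numpy as np

    # One hour of RH readings at 1-minute resolution: a stable 45% RH baseline
    # with a hypothetical 5-minute excursion to 70% RH.
    rh = np.full(60, 45.0)
    rh[20:25] = 70.0

    hourly_average = rh.mean()              # ~47% RH -- looks healthy
    raw_maximum = rh.max()                  # 70% RH  -- the real story
    worst_step = np.abs(np.diff(rh)).max()  # 25% RH in a single minute

    print(f"avg {hourly_average:.1f}%, max {raw_maximum:.1f}%, "
          f"worst step {worst_step:.1f}%/min")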


Trends for the Future

Advances in IT equipment technology, including improved thermal capabilities and higher exhaust temperatures, continually push the boundaries for allowable operating conditions. ASHRAE Technical Committee 9.9 provides guidelines for many aspects of data center design, including recommended operating temperature and humidity ranges for critical IT equipment. TC9.9’s “Thermal Guidelines for Data Processing Environments” originally recommended a range of 68°-77°F (20°-25°C) in 2004, which was reasonably conservative given the data available at the time. The latest edition, published in 2015, recommends 64°-81°F (18°-27°C) for all non-legacy equipment, with much wider allowable ranges for certain classes of equipment.
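
One practical way to apply these numbers is a simple screening check on sensor readings against the recommended envelope. In the sketch below, the temperature limits are the 2015 figures cited above, while the humidity limits are placeholders to be replaced with the current TC9.9 values and your manufacturers’ specifications:

    RECOMMENDED_TEMP_C = (18.0, 27.0)  # 64-81 F, per the 2015 guidelines above
    PLACEHOLDER_RH_PCT = (20.0, 60.0)  # illustrative assumption, not a TC9.9 value

    def check_reading(temp_c: float, rh_pct: float) -> list[str]:
        """Return human-readable violations for a single sensor reading."""
        violations = []
        if not RECOMMENDED_TEMP_C[0] <= temp_c <= RECOMMENDED_TEMP_C[1]:
            violations.append(f"temperature {temp_c:.1f} C outside recommended range")
        if not PLACEHOLDER_RH_PCT[0] <= rh_pct <= PLACEHOLDER_RH_PCT[1]:
            violations.append(f"RH {rh_pct:.1f}% outside configured limits")
        return violations

    print(check_reading(29.5, 45.0))  # flags the temperature excursion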

Use caution when applying more aggressive thermal guidelines to legacy equipment or where tape storage media is deployed. As always, ensure that all equipment operates within manufacturer-recommended ranges regardless of the latest thermal guidelines.


Continuing Developments

Humidity and temperature control are interdependent processes and must be considered together. If any part of the mechanical system is not operating as expected, critical failures and data center downtime can result. Meanwhile, the thermal capabilities of IT equipment continue to expand, giving designers more options.

Those options allow for reduced costs across the board, not only in construction but also in operation. Wider allowable conditions increase the availability of economization, which directly reduces the need for mechanical cooling, and they greatly reduce the need for resource-intensive humidification and dehumidification. Be sure to follow these continuing developments to ensure you are getting the most out of your data center mechanical systems.