

Set-Point Recommendations for Data Centres (ASHRAE TC 9.9)

CRAC Services Engineering

Operating your data hall at 22°C costs you 30% more cooling power than 25°C — for no reliability benefit. Here's the case for raising set-points to the warm end of ASHRAE TC 9.9.

Why set-points matter

The set-point at which you operate your data centre determines:

  • Annual cooling energy consumption (every degree warmer = ~3-5% saved on cooling kWh)
  • Equipment failure rate (warmer ≠ less reliable, within ASHRAE bands)
  • Free-cooling availability (warmer set-points unlock more economiser hours)
  • Humidity control regime

Most Australian data centres are still operating at 21-22°C — a legacy of older equipment specifications and conservative defaults from 2005. Modern equipment is rated for substantially warmer operation, and the cost of running cool is real.
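The per-degree savings in the first bullet compound rather than add. A minimal sketch, assuming the midpoint of the article's 3-5%/°C range (the function name and 4% default are illustrative, not engineering constants):

```python
def cooling_savings(delta_c: float, pct_per_degree: float = 0.04) -> float:
    """Fraction of cooling energy saved by raising the set-point by
    delta_c degrees, assuming per-degree savings compound."""
    return 1 - (1 - pct_per_degree) ** delta_c

# Raising 22°C -> 25°C at ~4%/°C saves roughly 11-12% of cooling energy.
print(f"{cooling_savings(3):.1%}")
```

At 3°C of uplift the compounding barely matters (11.5% vs a naive 12%), but over a 5°C migration the difference is worth modelling correctly.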

ASHRAE TC 9.9 envelope

ASHRAE Technical Committee 9.9 publishes the industry-standard Thermal Guidelines for Data Processing Environments. The 2021 revision (the current edition) defines four classes of air-cooled equipment:

  • Class A1 — recommended 18-27°C, allowable 15-32°C. Enterprise servers, storage.
  • Class A2 — recommended 18-27°C, allowable 10-35°C. Volume servers, storage.
  • Class A3 — recommended 18-27°C, allowable 5-40°C. Volume servers (extended).
  • Class A4 — recommended 18-27°C, allowable 5-45°C. Volume servers (very extended).

The recommended range is the envelope in which ASHRAE expects equipment to operate at full reliability indefinitely; it is the band intended for normal, long-term operation.

The allowable range is where equipment can operate without immediate failure. Time spent in the allowable but not recommended range affects long-term reliability statistics, but the impact is smaller than most operators assume.

Most current-generation IT equipment is rated A2 or higher. The vast majority of equipment in Australian data centres is rated for at least 27°C continuous operation.
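The class bands above can be codified into a simple inlet-temperature check. A sketch using the dry-bulb ranges from the list (the function and dictionary names are ours, not ASHRAE's):

```python
# Recommended and allowable dry-bulb ranges (°C), per the ASHRAE TC 9.9
# (2021) class summary above.
ASHRAE_CLASSES = {
    "A1": {"recommended": (18, 27), "allowable": (15, 32)},
    "A2": {"recommended": (18, 27), "allowable": (10, 35)},
    "A3": {"recommended": (18, 27), "allowable": (5, 40)},
    "A4": {"recommended": (18, 27), "allowable": (5, 45)},
}

def classify_inlet(temp_c: float, equip_class: str = "A2") -> str:
    """Return which envelope a measured server-inlet temperature falls in."""
    bands = ASHRAE_CLASSES[equip_class]
    lo, hi = bands["recommended"]
    if lo <= temp_c <= hi:
        return "recommended"
    lo, hi = bands["allowable"]
    if lo <= temp_c <= hi:
        return "allowable"
    return "out-of-envelope"
```

Note the check applies to the server inlet, not supply air or return air; the same 27°C reading means very different things at different measurement points.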

Why most data centres run too cool

Three historical reasons:

  • Cooler used to mean safer. When equipment was rated for 18-22°C continuous, lower set-points gave headroom against cooling failure. Modern equipment has wider tolerance.
  • Operator habit. Server rooms run cool because they always have; set-points are rarely revisited after commissioning.
  • Cold-aisle measurement. Operators controlling to an 18°C supply-air sensor may not realise the actual server intake is closer to 21°C — the plant is being held colder than the equipment ever sees, let alone requires.

What raising the set-point unlocks

For a 1 MW IT-load data centre in Sydney, operating at:

  • 22°C: ~$420,000/year on cooling power (typical commercial tariff)
  • 25°C: ~$320,000/year (saves $100k/year)
  • 27°C: ~$260,000/year (saves $160k/year)

Savings come from three compounding effects:

  • Compressor / chiller efficiency is higher at warmer return temperatures (less compressor work per kW of heat removed).
  • Free-cooling / economiser hours increase substantially. At 22°C you might get 2,000 free-cooling hours/year; at 27°C you get 5,000+ in Sydney climate.
  • Plant capacity increases at higher set-points, deferring or eliminating future capacity expansions.
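The Sydney figures above can be approximated by compounding the first two effects per degree. A rough sketch — both per-degree rates are illustrative assumptions tuned to land near the article's numbers, not engineering constants:

```python
def annual_cooling_cost(setpoint_c: float,
                        base_cost: float = 420_000,     # AUD/yr at 22°C
                        base_setpoint: float = 22.0,
                        mech_saving_per_deg: float = 0.04,
                        free_cooling_per_deg: float = 0.055) -> float:
    """Rough annual cooling cost for the article's 1 MW Sydney example:
    per-degree mechanical efficiency gains compound with the extra
    economiser hours each degree unlocks."""
    dt = setpoint_c - base_setpoint
    per_degree = (1 - mech_saving_per_deg) * (1 - free_cooling_per_deg)
    return base_cost * per_degree ** dt
```

With these assumed rates, 25°C lands near $314k/year and 27°C near $258k/year — close to the figures quoted, with the economiser term doing the heavy lifting beyond the 3-5%/°C mechanical savings.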

What raising the set-point does NOT cost

Reliability impact between 22°C and 27°C, for ASHRAE A2-class equipment:

  • Server failure rate: negligible difference (within statistical noise of normal failure rates)
  • Hard drive AFR (annual failure rate): roughly a +0.5-1% relative increase per °C above 22°C, which is in the noise of typical failure rates (3-5%/year)
  • CPU temperature: server fans compensate, slight power increase at the IT side (typically less than 1% of IT power)

The net effect on a typical data centre is dominated by cooling power saved, not reliability lost.
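To see why the drive-AFR figure is "in the noise": reading the +0.5-1%/°C as a relative increase on a 3-5%/year baseline (midpoints assumed below), a full 5°C migration moves the needle by a fraction of a percentage point:

```python
def drive_afr(setpoint_c: float,
              base_afr: float = 0.04,              # 4%/yr baseline (midpoint)
              rel_increase_per_deg: float = 0.0075,  # +0.75%/°C relative (midpoint)
              base_setpoint: float = 22.0) -> float:
    """Estimated hard-drive annual failure rate at a given set-point,
    treating the per-degree figure as a relative (compounding) increase."""
    dt = max(0.0, setpoint_c - base_setpoint)
    return base_afr * (1 + rel_increase_per_deg) ** dt
```

At 27°C this gives roughly 4.15% AFR versus 4.0% at 22°C — well inside the year-to-year variance of a real fleet.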

Humidity considerations

Set-point alone is half the story. The humidity envelope is the other half. ASHRAE TC 9.9 recommends:

  • Dew point: -9°C to 15°C
  • Relative humidity: 8-80% (recommended), 5-95% (allowable)

Most data centres over-control humidity, with humidifiers and dehumidifiers fighting each other when a wider tolerance band would do. The energy consumed in humidity control is often comparable to the cooling energy itself.

For most modern data centres, widening the RH band to 20-70% saves substantial humidifier and dehumidifier power without compromising equipment.
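A measurement can be checked against both halves of the envelope at once — the widened 20-70% RH band suggested above and the recommended dew-point band. A sketch using the standard Magnus approximation for dew point (the band defaults and function names are ours):

```python
import math

def dew_point_c(temp_c: float, rh_pct: float) -> float:
    """Magnus approximation: dew point from dry-bulb temperature and RH."""
    a, b = 17.62, 243.12
    gamma = math.log(rh_pct / 100.0) + a * temp_c / (b + temp_c)
    return b * gamma / (a - gamma)

def humidity_ok(temp_c: float, rh_pct: float,
                rh_band=(20, 70), dp_band=(-9, 15)) -> bool:
    """True if the measurement sits inside both the RH band and the
    dew-point band."""
    dp = dew_point_c(temp_c, rh_pct)
    return rh_band[0] <= rh_pct <= rh_band[1] and dp_band[0] <= dp <= dp_band[1]
```

The dew-point check matters because a warm hall at high RH can exceed the 15°C dew-point ceiling even while the RH reading looks acceptable — 27°C at 65% RH, for example, has a dew point near 20°C.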

Implementation approach

We recommend a phased set-point migration rather than a single jump:

  • Audit current equipment — confirm ASHRAE class ratings, establish current actual server inlet temperatures (not return air or rack-top sensors)
  • Phase 1: 22°C → 24°C — most equipment is comfortable at 24°C; this is the easiest win and unlocks 5-7% cooling savings
  • Monitor for 4-12 weeks — track actual server inlet temps, fan speeds, equipment alarms; verify no impact
  • Phase 2: 24°C → 26°C — additional 5-7% savings, more free-cooling hours
  • Phase 3: 26°C → 27°C — final increment, realising full ASHRAE recommended range

Each phase needs its own monitoring window before the next increment; allowing for those windows, the total programme is typically 12-16 weeks for a careful migration.
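The gate between phases can be made explicit in code. A minimal sketch — the phase targets come from the plan above, while the pass criteria (every observed inlet within the recommended band, zero equipment alarms) are our assumption of how a team might codify "verify no impact":

```python
PHASES = [24.0, 26.0, 27.0]  # °C set-point targets from the plan above

def phase_passed(inlet_temps_c: list[float], alarms: int,
                 recommended_max: float = 27.0) -> bool:
    """Gate: all observed server inlets stayed within the recommended
    band and no equipment alarms fired during the monitoring window."""
    return alarms == 0 and all(t <= recommended_max for t in inlet_temps_c)

def next_setpoint(current: float, inlet_temps_c: list[float],
                  alarms: int) -> float:
    """Advance to the next phase only if the monitoring gate passed;
    otherwise hold the current set-point and investigate."""
    if not phase_passed(inlet_temps_c, alarms):
        return current
    higher = [p for p in PHASES if p > current]
    return higher[0] if higher else current
```

The point of encoding the gate is that it forces the monitoring data (inlet temperatures, alarm counts) to be collected before anyone turns the dial, rather than after.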

What can go wrong

The failure modes for set-point migration:

  • Mis-mapped temperature sensors — operating to a sensor that isn't actually at the server intake leads to over-cooling or under-cooling
  • Pre-2010 legacy equipment — older equipment may be rated A1 only and have less margin; identify before migration
  • Hot-spot rack — single rack drawing far more heat than the average creates a localised hot zone that the average set-point doesn't cover
  • Insufficient airflow management — set-point change doesn't help if hot-aisle air is recirculating to cold aisle (containment is a prerequisite)
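The hot-spot-rack failure mode is easy to screen for once per-rack inlet temperatures are being logged. A sketch flagging racks that sit well above the hall average — the 3°C margin is an illustrative threshold, not a standard:

```python
def hot_spot_racks(rack_inlets_c: dict[str, float],
                   margin_c: float = 3.0) -> list[str]:
    """Flag racks whose inlet temperature sits more than margin_c above
    the hall average -- candidates for the localised hot zones that an
    average set-point doesn't cover."""
    avg = sum(rack_inlets_c.values()) / len(rack_inlets_c)
    return sorted(r for r, t in rack_inlets_c.items() if t - avg > margin_c)
```

Any rack this flags needs airflow investigation (containment, blanking panels, floor-tile placement) before a set-point migration proceeds, because raising the hall average raises the hot spot with it.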

We perform set-point audits as part of our precision cooling design service. The audit covers actual sensor mapping, equipment class ratings, and airflow management before recommending any migration.

When to call us

For a set-point optimisation programme — including the audit, phased migration plan, and ongoing monitoring — contact us. The first-year savings typically pay for the audit several times over.

[Request a Quote](/contact#quick-quote).

References

  • ASHRAE Technical Committee 9.9 — Thermal Guidelines for Data Processing Environments (2021)
  • ASHRAE Datacom Series — Thermal Guidelines for Data Processing Environments (2nd edition)
  • AS/NZS 1668.2 — Mechanical ventilation
  • NCC Section J — Energy efficiency requirements