
Why is infrastructure important?

A resilient enterprise-class infrastructure is essential for customers' peace of mind. At the same time, more and more of our customers are seeking increased IT efficiency and value for money.

How can Aegis ONE support this?

  • Our Data Centre has been engineered for maximum cooling efficiency across all power densities.
  • Direct Fresh Air Cooling allows customers to scale up their IT utilisation continually.
  • No limits on either power density or headroom, allowing for total flexibility across your IT footprint.

Aegis ONE is supplied by dedicated dual (N+N) feeds, with 7.7MVA of power available across the Surrey Data Park. This gives our customers the confidence to plan for tomorrow, knowing they have power headroom for many years to come.

"The Internet of Things, gaming, virtual reality, and streaming media are only a handful of the latest technologies that have provided a number of opportunities for the data centre to capitalise on."

About Aegis ONE

Specifically designed to house High Performance Compute (HPC) and Open Compute Project capabilities, and supported with Direct Fresh Air Cooling (DFAC), Aegis ONE is a 34,000ft² facility configured to provide complete flexibility across your rack environment.

Whether it’s a single-phase 4kW rack or a three-phase 25kW rack, all configurations can be housed contiguously in our data halls without the requirement for expensive segregation or ancillary cooling.
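
As a rough illustration of why both configurations fit on standard feeds, the current draw of each rack stays within common supply ratings. A minimal sketch, assuming unity power factor and the 230V/415V supplies listed in the power specifications (illustrative only, not a commissioning calculation):

```python
import math

def single_phase_current(power_w, voltage_v=230.0):
    """Current (A) drawn by a single-phase load, assuming unity power factor."""
    return power_w / voltage_v

def three_phase_current(power_w, line_voltage_v=415.0):
    """Current (A) drawn by a balanced three-phase load, assuming unity power factor."""
    return power_w / (math.sqrt(3) * line_voltage_v)

# A 4kW single-phase rack and a 25kW three-phase rack
print(round(single_phase_current(4_000), 1))   # ≈ 17.4 A, within a 32A feed
print(round(three_phase_current(25_000), 1))   # ≈ 34.8 A, within a 64A feed
```

Both figures sit comfortably inside the 32A and 64A supplies quoted below, which is why mixed-density racks can share a hall.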

The resilience and security of the data centre is supported by our fully integrated Building Management System (BMS). The real-time, interactive BMS provides constant monitoring of power and cooling infrastructure, leak detection, fire detection and security systems. To ensure the integrity of the facility, the BMS is monitored 24x7 by a dedicated team with structured notification and escalation processes. This is further enhanced by an optional remote tenant DCIM monitoring capability.

Data Centre Technical Specifications

Building

  • 2 floors of data halls
  • 1,500m² of white space
  • 1.2MW of fully fitted space
  • Loading/delivery secure docking area
  • Equipment lift (2m x 2.5m) – 1.5 tons
  • Flexibility and scalability of infrastructure and white space offerings

Power

  • 6,200kVA – Dual dedicated building power feeds (N+N)
  • 2,000kVA N+1 power distribution to PDUs
  • 2,000kVA diesel-driven generator with 24 hours of run time at full load
  • 400kVA UPS modules with static and maintenance bypass
  • Design average power supply of 7.5kW per rack
  • Rack loads of 25kW+ housed contiguously
  • AC power voltage 230V or 415V (32A or 64A)
  • Service Level Agreements (SLAs) of 100% availability of power
  • Under floor critical dual power distribution
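
The headline figures above imply a straightforward capacity calculation. A minimal sketch using only numbers quoted in this specification (1.2MW of fully fitted space from the Building section, 7.5kW design average per rack); illustrative figures, not a sales commitment:

```python
FITTED_CAPACITY_KW = 1_200   # 1.2MW of fully fitted space (Building section)
DESIGN_AVG_RACK_KW = 7.5     # design average power supply per rack (Power section)

# Racks supportable at the design average load
racks_at_design_average = FITTED_CAPACITY_KW / DESIGN_AVG_RACK_KW
print(int(racks_at_design_average))  # 160
```

Higher-density 25kW racks simply consume more of the same budget; the contiguous-housing model means no separate allocation is needed for them.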

Environment

  • Average supply temperature: 20°C – 25°C ± 2°C (in line with ASHRAE TC 9.9 guidelines)
  • N+1 cooling infrastructure delivered under floor via Computer Room Air Conditioning (CRAC) units
  • 100% Direct Fresh Air Cooling capability N+1
  • Heat rejection installed to take air away from white space
  • Moisture detection and control sensors
  • Full N+1 DX backup cooling infrastructure
  • Design PUE of 1.13 at full load (with no external influences)
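
PUE (Power Usage Effectiveness) is the ratio of total facility power to IT power, so a design PUE of 1.13 means roughly 0.13kW of cooling and distribution overhead for every kW of IT load. A minimal sketch, reusing the 1.2MW fitted-space figure from the Building section as an illustrative IT load:

```python
DESIGN_PUE = 1.13
IT_LOAD_KW = 1_200  # 1.2MW of fully fitted space, used here as the IT load

# PUE = total facility power / IT power
total_facility_kw = DESIGN_PUE * IT_LOAD_KW
overhead_kw = total_facility_kw - IT_LOAD_KW

print(round(total_facility_kw, 1))  # 1356.0
print(round(overhead_kw, 1))        # 156.0
```

The low overhead is what the Direct Fresh Air Cooling design is buying: most of the year, outside air does the work that DX cooling would otherwise draw power for.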

Monitoring

  • Full component monitoring back to NOC
  • Power metering per rack
  • Site-wide DCIM (Data Centre Infrastructure Management)
  • Customer Portal for DCIM interface

Fire Detection & Suppression

  • Inert gas fire extinguishing system throughout the technical areas
  • Very Early Smoke Detection Apparatus (VESDA)
  • 1-hour fire-rated walls