As the number and size of the services provided by Research Computing have grown over the past five years, we’ve put steadily increasing pressure on the University’s data centres. The power consumption of CPUs has risen as servers are packed with more and more cores and, at the same time, driven by the demands of research, our BlueBEAR HPC and BEAR Cloud services have grown and data is being generated at a phenomenal rate.
We’re at the stage where we are rapidly running out of power and cooling capacity in the existing facilities. But all is not lost! Over the past 12 months we’ve been hard at work planning a new build, and just two months since the contractor started work, the building is taking shape… Yes, the shape is “just” a box, but a box engineered specifically for its purpose – to meet the demands of a research-intensive university.
The data centre is the product of a collaboration between Research Computing, IT Services, Estates and external contractors. It will deliver a flexible space to support Research Computing into the future, with enough capacity to grow in response to the success of Birmingham’s research. For the University it is critical that this investment is a highly efficient and cost-effective facility. We already use some of the most efficient methods of cooling our high-density compute systems. The water-cooling technology from Lenovo™ allows us to recover up to 85% of the heat directly from the system and whisk it away. This means we don’t need to deploy large-scale air conditioning, saving energy and giving us a greener and smaller data centre. A green roof is another feature helping to limit the environmental impact of the operation by providing a wildlife habitat.
Although physically small, once the data centre is complete we’ll have a potential power footprint of nearly 1MW, with a highly efficient and expandable cooling system to match. There will be space for 30 racks of equipment which, with high-density compute, means we can accommodate many thousands of compute cores and extreme-scale storage.
The expertise and involvement of the Research Computing Team in the design and development of the facility is key to its success. Not only are we responsible for the technology which will go into the data centre, but we’ve also taken care of the many small but essential details, from making sure the doors are tall enough to get the equipment in, to ensuring the floor can take the weight of our high-density kit – with some racks weighing nearly 1,200kg!
We’re currently expecting to take over the data centre after Easter 2018, and we plan to rapidly migrate much of the existing computing infrastructure into the new space. This will of course mean some disruption to our services, as we have to move (literally) tonnes of equipment and bring it back into service in the new location! Our current plan is that the majority of this work will take place in the first week after the Easter break (early April 2018). Preparations are already underway to get BEAR services ready for this big step forward and to limit the impact during the move.
We’re sure you’ll agree this short-term disruption is a small price to pay for ensuring the long-term sustainability of BEAR operations and the ambitious programme of technology innovation which underpins the service.
As the plan progresses and we get closer to having firm dates, we’ll be sending out more detailed information to all BEAR service users. And, once the migration is underway, we will keep BEAR users up to date with progress throughout.