We are delighted to announce the launch of BEAR Cloud, an 'on demand' service for computationally and data-intensive research, developed by the Research Computing Team in IT Services.
To celebrate, a one-day conference took place in the Bramall Concert Hall on Friday 21st October. The event, opened by Professor Tim Softley, Pro-Vice-Chancellor for Research and Knowledge Transfer, featured talks by some of our leading academics, sector experts and our technology-industry partners, alongside central contributions from Dr John Owen and Simon Thompson of the Research Support Section. Further information, including biographies for the presenters and their slides, is available online. The day was topped off with a lively panel session.
The BEAR Cloud infrastructure is designed to meet the needs of researchers across a broad spectrum of academic disciplines who require big compute power but for whom BlueBEAR HPC is not the answer. Using OpenStack technology and building on the experience gained from implementing CLIMB[1], the Research Computing Team can now provide that power on demand, with highly tailored and specialized environments as well as the secure isolation vital to research groups handling sensitive data. BEAR Cloud is tightly coupled with existing BEAR services to deliver excellent performance and easy integration across the different aspects of a research group's work.
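For readers curious about what 'on demand' looks like in practice, the sketch below shows how an instance might be launched programmatically using the standard OpenStack SDK for Python. The cloud, flavor, image and network names are illustrative placeholders, not actual BEAR Cloud values.

```python
import openstack

# Connect using credentials from clouds.yaml or the usual OS_* environment
# variables; "bear-cloud" is a hypothetical cloud entry, not a real one.
conn = openstack.connect(cloud="bear-cloud")

# Launch a virtual machine on demand. The flavor, image and network names
# below are illustrative placeholders for a tailored research environment.
server = conn.compute.create_server(
    name="analysis-node-01",
    flavor_id=conn.compute.find_flavor("research.large").id,
    image_id=conn.compute.find_image("genomics-env").id,
    networks=[{"uuid": conn.network.find_network("project-net").id}],
)

# Wait until the instance is ACTIVE, then report where it can be reached.
server = conn.compute.wait_for_server(server)
print(server.status, server.addresses)
```

The same pattern scales from a single analysis node to a fully isolated project environment, which is what makes the model attractive for groups handling sensitive data.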
Early adopters are already using BEAR Cloud for work on bladder and liver cancers, for managing software development, and for building systems to support research into the New Testament.
Look away now and skip the following section if you are NOT interested in the technical detail!
The new infrastructure exploits several recent developments from our key technology partners. We pushed them hard and they responded, in some cases modifying their product lines to fit our needs. As a result:
- We are the first UK site to deploy Lenovo's direct warm-water-cooled server technology, and one of the first in the world to do so with Broadwell (Intel's latest generation of CPU). As a direct result of this server and cooling technology, we can reject up to 90% of the heat generated by the equipment directly to the outside, radically reducing the energy consumed by the data centre's air conditioning.
- We have deployed Mellanox's ConnectX-4 technology, which includes hardware offload for network virtualization functions, ensuring low-latency Ethernet and preserving the compute power for research workloads. It also gives us EDR InfiniBand with 100Gbps connectivity into the BEAR InfiniBand fabric.
- Building on this collaboration with Mellanox, we are one of the first sites in the world to deploy Switch-IB 2 to support HPC and cloud workloads. This gives us the capability to support scalable workloads with minimal latency, allowing data to move faster and jobs to run more quickly.
- Across all BEAR services, storage is managed using IBM Spectrum Scale, another key element in delivering extreme performance and minimizing maintenance outages. Underpinning BEAR Cloud is DDN storage, some of the fastest and most price-performant storage available.
Judging by the reaction from our peers, it is evident that, with our partners, we have pulled together a truly innovative, performant and flexible solution to support research at Birmingham.
Dr John Owen, Head of Research Support
For more information please visit: https://intranet.birmingham.ac.uk/it/teams/infrastructure/research/bear/index.aspx
[1] CLIMB – an MRC-funded cloud infrastructure for microbial bioinformatics built by Birmingham and three other universities. The system has already been used in the fight against deadly diseases such as Zika and Ebola.