Matt Maccaux
Global Field CTO for HPE Ezmeral Enterprise Software
April 7, 2022
University research data workloads have become increasingly complex to provision and manage. Research has found that as big data has shifted from the edge of the higher education compute environment into the mainstream, it has brought with it complexities around agility, provisioning, storage and capacity.
The challenge is to find smart ways of managing infrastructure and compute without compromising a university’s access to, and ability to leverage, this data, which has become critically important to growth, funding and student engagement.
The demands placed on infrastructure, technology, people and funding by data-intensive research are extensive. Universities have to find smart ways of resolving this challenge to ensure they can provide the storage, security, access and agility needed to attract top researchers, and to give those researchers the tools they need to analyse increasingly large datasets.
The same research also found that the key obstacles facing researchers when it came to their workloads and data analytics requirements were: managing complex data; developing structures for collaboration; sharing knowledge; and support and training.
This is echoed in the report Supporting Academic Research: Understanding the Challenges, which found that researchers are under pressure to find their own funding, face a challenging administrative burden, and want more support for collaboration and interdisciplinary work. For power users and dedicated research universities there is a growing need for ‘well-coordinated scientific and engineering partnerships’, and to ensure that data is not just stored and managed properly, but made as transparent and as accessible as possible. Researchers need tools that will help them sift through vast quantities of data so they can glean insights, recognise patterns and expand their studies.
However, these tools have to be capable of withstanding the weight of that data, both new and old, without compromising system integrity, accessibility or speed. Researchers need smarter systems, better platforms, improved collaborative capabilities and support for ever-evolving learning requirements. They need data-centric digital transformation that doesn’t fundamentally impact budgets and funding, but instead allows for incremental steps towards high-performance capacity and improved efficiencies. After all, it is this access to research and data that delivers a proven long-term benefit to the economy and to citizen wellbeing.
Shifting within the tectonic plates of data-centric digital transformation—an imperative for any establishment looking to stay ahead in a highly competitive market—is the concept of data platform modernisation. Many universities are battling with data silos, vague data strategies, legacy architecture and limited integration across multiple third-party systems and service providers. By modernising their infrastructure, they can pull every one of these threads together into a cohesive and holistic ecosystem that connects the digital dots, and puts the data in the right places.
Modernising data platforms allows for increasingly seamless integration at scale, providing unique flexibility and agility that not only improve speed and accessibility, but also reduce the university’s reliance on physical servers and infrastructure. For example, a container-based, platform-as-a-service solution can leverage the capabilities of a hybrid cloud platform to let workloads scale up and down at a reduced cost to the university. It also allows the institution to expand into virtual and cloud-based environments on demand.
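As a rough illustration of what this kind of elastic, container-based scaling can look like in practice, the sketch below uses the open-source Kubernetes Python client rather than any HPE Ezmeral-specific interface, and the deployment and namespace names are hypothetical. The point is simply that on a modern container platform, capacity for a research workload can be adjusted through a couple of API calls instead of a hardware request.

```python
# Illustrative sketch only: scale a containerised research workload on a
# Kubernetes-style platform. Deployment and namespace names are hypothetical.
from kubernetes import client, config

config.load_kube_config()            # authenticate with the researcher's kubeconfig
apps = client.AppsV1Api()

# Check how many replicas the (hypothetical) analytics pipeline is running today.
scale = apps.read_namespaced_deployment_scale(
    name="genomics-pipeline", namespace="research-platform"
)
print(f"Current replicas: {scale.spec.replicas}")

# Burst the workload out for a large run; the same call shrinks it back afterwards.
apps.patch_namespaced_deployment_scale(
    name="genomics-pipeline",
    namespace="research-platform",
    body={"spec": {"replicas": 20}},
)
```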
Many universities need to undertake pure research that requires a cloud experience for the researcher or the research team. These teams need access to modern tools on a self-service level, while retaining secure access to the data—the latter being of critical importance when considering the sensitivity of the data often used by universities and research teams.
This data cannot leave the premises or systems, so alternative approaches are required, such as virtualised systems and cloud-based solutions. Cloud allows for this intelligent approach while also giving research teams unprecedented flexibility. Lab environments used to process vast quantities of data could, when not in use, be reclaimed to run queued research tasks thanks to the ubiquity and scale of the cloud environment.
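One common pattern for this, sketched below with the Kubernetes Python client purely as an assumption about the underlying container platform, is to submit research tasks as low-priority batch jobs pinned to on-premises nodes: the data never leaves the site, and the jobs only soak up capacity the lab clusters would otherwise leave idle. The node label, priority class, image and paths are all placeholders.

```python
# Illustrative sketch only: run a research task as a batch job that stays on
# on-premises nodes (data residency) and yields capacity to interactive lab work.
from kubernetes import client, config

config.load_kube_config()
batch = client.BatchV1Api()

job = client.V1Job(
    metadata=client.V1ObjectMeta(name="variant-calling-batch"),
    spec=client.V1JobSpec(
        backoff_limit=2,
        template=client.V1PodTemplateSpec(
            spec=client.V1PodSpec(
                restart_policy="Never",
                node_selector={"datacentre": "on-prem"},   # hypothetical label keeping the job (and data) on site
                priority_class_name="scavenger",           # hypothetical low-priority class, preempted by lab workloads
                containers=[
                    client.V1Container(
                        name="analysis",
                        image="registry.example.ac.uk/research/variant-calling:1.4",
                        command=["python", "run_analysis.py", "--input", "/data/cohort-2022"],
                    )
                ],
            )
        ),
    ),
)

batch.create_namespaced_job(namespace="bioinformatics", body=job)
```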
For many universities the biggest inhibitors are cost and access. This is where HPE Ezmeral, in collaboration with DTP Group, excels. Providing fast infrastructure with on-demand access to data, this intelligent, data-centric solution creates a platform from which researchers can leap into the digital unknown. Using these resources and this platform, teams can create sandboxes for experimentation, develop and introduce their own research tools, provision environments as required, and iterate on research environments without lengthy wait times or impact on other users.
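What self-service sandbox provisioning might look like under the hood is again sketched below with the Kubernetes Python client rather than any specific Ezmeral or DTP interface: each team gets an isolated namespace capped by a resource quota, so experimentation never starves other users of compute. The team name and quota figures are placeholders.

```python
# Illustrative sketch only: provision an isolated, quota-bound sandbox for a research team.
from kubernetes import client, config

config.load_kube_config()
core = client.CoreV1Api()

team = "particle-physics"          # placeholder team identifier
sandbox = f"sandbox-{team}"

# 1. An isolated namespace gives the team its own workspace on the shared platform.
core.create_namespace(
    client.V1Namespace(metadata=client.V1ObjectMeta(name=sandbox, labels={"team": team}))
)

# 2. A resource quota caps what the sandbox can consume, so experiments cannot
#    impact other researchers' CPU, memory or GPU allocations.
core.create_namespaced_resource_quota(
    namespace=sandbox,
    body=client.V1ResourceQuota(
        metadata=client.V1ObjectMeta(name="sandbox-quota"),
        spec=client.V1ResourceQuotaSpec(
            hard={"requests.cpu": "32", "requests.memory": "128Gi", "requests.nvidia.com/gpu": "4"}
        ),
    ),
)
```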
With HPE Ezmeral and DTP, universities can create data-centric platforms that deliver secure access to data and allow for self-service provisioning, all within the cloud. It’s a resilient and agile solution that puts the data where it delivers the most value and gives universities the keys they need to unlock the future.