Data can reveal a lot about an organisation's environment: the good, the bad and the ugly. Data is now generated across every facet of a digital organisation, and examining how it is generated, where it is stored and, most importantly, how it is used can be a powerful diagnostic tool. Perhaps nowhere is this more observable than in the beating heart of an organisation: the data centre. Here, the implications of underperformance are clearly visible, far reaching and keenly felt.
With virtual machine growth now rapid and often uncontrolled, ensuring complete visibility of the resources (or lack thereof) within your data centre is more important than ever. Just how many application servers have been spun up in an environment that was initially designed to run 200 VMs? How much runway do you have until you reach capacity? Which applications are consuming the most resources?
For IT teams, having the answers to these types of questions, and the ability to turn swathes of data into actionable pieces of information, is critical to building and maintaining infrastructure that delivers on promised efficiencies and cost savings. Accurate, rich insights and a complete picture of the health of one of an organisation’s biggest assets can enable far more informed strategic business decisions. In fact, companies that use data analytics to make decisions are 19 times more likely to be profitable1. However, according to Forrester, 73% of business data goes unused – leading to decisions lacking in facts and evidence2.
At Data#3 we’ve offered free data centre assessments to customers for many years. Having completed hundreds of these assessments, our specialist team has seen a range of data centre setups. From the disastrous, to the decent and dependable, this blog shares discoveries from the data centre, and provides a deep dive into the engagement.
Traditionally, the process of gathering actionable workload characteristics and performance details has been complex and resource-intensive, typically taking months to complete. Even then, if your usage data is not accurate, your data centre solution may end up over- or under-provisioned, or over budget.
This is where a Data#3 Data Centre Assessment steps in. Using Live Optics – lightweight, remote and agentless software – the specialist team can quickly collect, visualise and share data about a customer's infrastructure and workloads. This drastically simplifies the formerly clunky discovery process, enabling us to analyse the data and gain a deeper, real-world understanding of workload performance. Ultimately, the Data Centre Assessment creates a mutual, data-led understanding of optimal project requirements.
Let’s break down exactly what insights can be gained from the assessment exercise.
When considering a data centre refresh or consolidation, understanding consumption and how these resources are being allocated and utilised is key to guiding the right infrastructure decisions. Too much and you'll be overspending; too little and you'll be under-delivering. Data#3's Data Centre Assessments uncover which resources are underutilised, overutilised or creating performance issues – and can also reveal hidden assets that have been unknowingly consuming resources, or wasted resources that are whittling away the budget.
The assessment can also serve as a catalyst for an internal clean-up process. If consumption isn't growing organically, it can be a key indicator that teams are not undertaking regular or thorough housekeeping. Shedding light on these issues can be hugely advantageous, as it allows organisations to adapt their maintenance strategy and reduce the negative impact on service levels. A data centre assessment can also help you plan for disaster recovery, forecast capacity and scope a move to the cloud. For instance:
Traditionally, it has been extremely difficult to understand not only what data is being consumed, but also how much data is written to storage each day. When it comes to disaster recovery, it's critical to be able to replicate this stored data to a secondary data centre.
As an example, let's compare two customers, each with 10TB of data. One is overwriting only 100GB of their database each day; the other is overwriting 30% of their 10TB – around 3TB – daily. These two scenarios require vastly different disaster recovery solutions. Nailing down exactly how much data is being overwritten can be a major hurdle, and the Data Centre Assessment helps customers understand these critical figures, providing insights to inform right-fit disaster recovery infrastructure and processes.
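As a rough, back-of-the-envelope illustration (not part of the assessment tooling itself), the sketch below uses those two hypothetical figures to show why daily change rate, rather than total capacity, drives replication sizing:

```python
# Back-of-the-envelope comparison of two hypothetical 10TB customers.
# The figures (100GB/day vs 30% of 10TB/day) are the illustrative values
# from the example above, not real assessment output.

SECONDS_PER_DAY = 24 * 60 * 60

def sustained_mbps(changed_gb_per_day: float) -> float:
    """Average throughput (Mbit/s) needed to replicate one day's
    changed data within 24 hours."""
    bits = changed_gb_per_day * 1000**3 * 8       # decimal GB to bits
    return bits / SECONDS_PER_DAY / 1_000_000     # bits/s to Mbit/s

customers = {
    "Customer A (100GB/day changed)": 100,
    "Customer B (30% of 10TB/day changed)": 0.30 * 10_000,  # 3,000GB
}

for name, gb_per_day in customers.items():
    print(f"{name}: ~{sustained_mbps(gb_per_day):,.0f} Mbit/s sustained")
# Customer A: ~9 Mbit/s sustained
# Customer B: ~278 Mbit/s sustained
```

Despite holding the same 10TB, the second customer needs roughly thirty times the sustained replication throughput of the first.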
Another consideration is the bandwidth required if replication is needed for disaster recovery. Data#3's Victorian team supported a company that had been replicating their backup to a secondary site daily. The organisation had assumed their existing bandwidth would be sufficient if they initiated replication for their disaster recovery requirements with an RPO (Recovery Point Objective) of one hour. Data#3's assessment helped the customer identify the amount of data that needed to be shipped to the secondary site each day to maintain the RPO, and then determine the WAN bandwidth that volume of data required. Ultimately, this helped the customer save costs by optimising their WAN requirements.
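To make the RPO point concrete, here is a minimal sketch of the same calculation driven by a recovery point objective. The 50GB-per-hour change rate is a made-up placeholder; the real figure is exactly what the assessment measures:

```python
# Minimal RPO-driven bandwidth estimate. The 50GB/hour change rate is a
# hypothetical placeholder, not a measured or assessment-derived value.

def min_wan_mbps(changed_gb_per_window: float, rpo_hours: float) -> float:
    """Minimum average WAN throughput (Mbit/s) needed to ship the data
    changed in one RPO window before the window closes."""
    bits = changed_gb_per_window * 1000**3 * 8    # decimal GB to bits
    return bits / (rpo_hours * 3600) / 1_000_000  # bits/s to Mbit/s

# e.g. 50GB changed per hour with a 1-hour RPO
print(f"~{min_wan_mbps(50, 1.0):,.0f} Mbit/s")    # ~111 Mbit/s
```

In practice you would also allow headroom for protocol overhead, compression ratios and peak (rather than average) change rates – which is why measured figures matter so much.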
Capacity planning requires an understanding of the consumption of all resources – compute, storage, networking, and assets on the floor, as well as anticipating run rates and growth rates. By leveraging the power of Live Optics in a Data Centre Assessment, we can deliver a holistic view of a customer’s environment. These insights allow us to design and validate a solution that accurately reflects an organisation’s strategic requirements, today and tomorrow.
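As a simple illustration of the "runway" question raised earlier, the sketch below projects the months until a storage pool is full from a current usage figure and a monthly growth rate. Both inputs are invented for the example; an assessment would supply measured values:

```python
import math

# Illustrative capacity-runway projection. The capacity, usage and growth
# figures are invented placeholders, not assessment output.

def months_until_full(capacity_tb: float, used_tb: float,
                      monthly_growth_rate: float) -> float:
    """Months until usage reaches capacity, assuming compound monthly growth."""
    if used_tb >= capacity_tb:
        return 0.0
    return math.log(capacity_tb / used_tb) / math.log(1 + monthly_growth_rate)

# e.g. 100TB pool, 70TB used, growing 3% per month
print(f"~{months_until_full(100, 70, 0.03):.1f} months of runway")  # ~12.1 months
```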
Innately helpful if you're making the move to the cloud, the engagement can also offer guidance on the cost of hosting your existing on-premises workloads in the public cloud. For example, a Queensland customer was planning to migrate a percentage of its virtual machine workloads to Azure. They were seeking a high-level view of the public cloud costs, as well as of the consolidated, refreshed infrastructure required for the remaining workloads in their on-premises data centre. The Data Centre Assessment provided these insights quickly and accurately, saving the customer precious time and giving them the tools to make the right strategic decision for their hybrid cloud transition.
The process is free, quick and simple, and can be completed remotely.
As a Dell Titanium Solutions Provider Partner, and the current ANZ Dell Server Partner of the Year and APJ Dell Transformational Partner of the Year, Data#3 is uniquely positioned to provide end-to-end services that enable our customers to optimise storage budgets and outcomes.
Book your free data centre assessment today. Simply leave us your details below and a storage specialist will be in touch with you shortly.
1. McKinsey (2016). Straight talk about big data. [Online] Available at: https://www.mckinsey.com/business-functions/digital-mckinsey/our-insights/straight-talk-about-big-data
2. Forrester (2016). The Forrester Wave: Big Data Hadoop Distributions. [Online] Available at: https://www.forrester.com/report/The+Forrester+Wave+Big+Data+Hadoop+Distributions+Q1+2016/-/E-RES121574
Tags: Backup, Cisco Data Centre, Data Centre, Data Storage, Dell, Dell EMC Servers, Dell Technologies, Disaster Recovery, Infrastructure, VMware