Imagine being a doctor expecting a time-sensitive organ donation and eager to tell your patient that the long wait is finally over. There’s just one problem: the credentials to access your patient’s medical records don’t work. Then you get a message: the current heat wave has caused an IT system failure across the entire hospital. No operations will be performed today. Your patient will return to the donor waiting list.
Unfortunately, this example is not far-fetched. During the recent heat wave in Europe, computer servers overheated in data centers used by one of the UK’s largest hospital systems. This prevented doctors from requesting medical records, accessing results for CT and MRI scans, and even performing surgery. Some seriously ill patients had to be transferred to other hospitals in the area.
Welcome to the least glamorous but unquestionably critical corner of the tech world. You’ve heard of “the cloud”; it’s not in the sky. In more than 8,000 data centers scattered around the world, rows of computers and miles of cables make up the infrastructure that houses trillions of gigabytes of data, ranging from family photos to top-secret government information – all the data needed to keep the modern world running.
Manage data, not just generate it
It is said that “data is the new oil” of our information economy, fueling trillions of dollars in economic activity. If the flow of that data were slowed – either by catastrophic failure or our own inability to keep up with demand – there would be incalculable economic damage, compounded by the human toll of canceled operations, missed flights and more. So we must keep our ability to manage data on pace with our ability to generate it.
Modern data centers are in high demand and immensely complex to design and build, with precise physical layouts, exacting ventilation requirements, power budgets and many other factors to consider. Facilities must withstand environmental disturbances, operate as sustainably as possible, and be equipped with redundant backups at every step to ensure 100% uptime.
Fortunately, we now have the digital design technology to tame these daunting challenges efficiently. Redesigning or upgrading a complex facility once meant going “back to the drawing board.” But we can now create a “virtual twin” of buildings, processes, systems and even cities. With this tool, we can create digital layouts, evaluate virtual adjustments, and run thousands of simulations to see which changes are likely to produce the best real-world results. Virtual twins safely speed up the design process and help avoid costly, time-consuming adjustments once physical construction begins.
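To make the “thousands of simulations” idea concrete, here is a minimal Python sketch of a design sweep. Everything in it is a hypothetical stand-in – the toy PUE model, the design variables and their ranges are illustrative assumptions, not the workings of any particular virtual twin product:

```python
import itertools
import random

# Hypothetical toy model: estimate PUE (power usage effectiveness) for a
# candidate data center design. A real virtual twin uses detailed physics
# simulations; this stand-in just illustrates the search-over-designs idea.
def simulate_pue(setpoint_c: float, racks_per_row: int, seed: int) -> float:
    rng = random.Random(seed)
    cooling_overhead = max(0.1, (30 - setpoint_c) * 0.04)  # colder = costlier
    density_penalty = 0.02 * max(0, racks_per_row - 10)    # denser = hotspots
    noise = rng.gauss(0, 0.02)                             # workload variation
    return 1.0 + cooling_overhead + density_penalty + noise

# Sweep the design variants, averaging many simulated "days" per variant.
candidates = itertools.product([22.0, 25.0, 27.0], [8, 10, 12])
results = []
for setpoint, racks in candidates:
    runs = [simulate_pue(setpoint, racks, seed) for seed in range(1000)]
    results.append((sum(runs) / len(runs), setpoint, racks))

best_pue, best_setpoint, best_racks = min(results)
print(f"Best variant: setpoint={best_setpoint}°C, racks/row={best_racks}, "
      f"mean PUE≈{best_pue:.3f}")
```

A production virtual twin would replace the toy model with physics-based simulation (computational fluid dynamics for airflow, electrical and thermal models), but the pattern of sweeping variants and comparing simulated outcomes is the same.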
This virtual design capability has revolutionary implications for assessing and improving system and process performance. Our data centers are the place to start, as they are in dire need of a sustainability overhaul.
Virtual twin essential
Existing data centers usually started on an ad hoc basis, in response to data storage needs that few realized were about to grow exponentially. They then expanded haphazardly into devourers of power and water, needing ever more of both to keep the electrons buzzing and the hardware cooled.
Today, data centers consume 3% of global electricity, a figure that could jump to 8% by the end of the decade. Data centers already produce about 2% of global greenhouse gas emissions, roughly equivalent to the entire aviation sector. And the average data center requires three to five million gallons of water per day – as much as seven Olympic-size swimming pools – to prevent key technology components from overheating.
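A quick back-of-envelope check of that comparison, assuming an Olympic-size pool holds roughly 660,000 US gallons (about 2.5 million liters):

```python
# Back-of-envelope check of the swimming pool comparison. The pool volume is
# an assumed round figure: ~660,000 US gallons for an Olympic-size pool.
OLYMPIC_POOL_GALLONS = 660_000

for daily_gallons in (3_000_000, 5_000_000):
    pools = daily_gallons / OLYMPIC_POOL_GALLONS
    print(f"{daily_gallons:,} gal/day ≈ {pools:.1f} Olympic pools")
# 3,000,000 gal/day ≈ 4.5 Olympic pools
# 5,000,000 gal/day ≈ 7.6 Olympic pools
```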
We need to get a handle on this rising consumption cycle because, while we can’t live with it, we literally can’t live without the work these data centers are doing. Data centers need to run 24/7, and virtual twins can help create a resilient, reliable facility before the first concrete is poured.
Keeping up with data in the zettabytes
How much redundancy do you need to keep a system running at all times? Where are the vulnerabilities, and how do you best protect yourself against failure or deliberate exploitation? What is the best way to reduce power and water consumption?
With a virtual twin available, the answers to questions like these can be digitally explored in detail. A data center virtual twin can provide guidance in eradicating inefficiencies, improving performance, and even determining the best sequence for implementing physical changes designed on the twin. The twin can continue to grow alongside its real-world counterpart, creating a permanent simulation platform for exploring improvements.
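As a simple illustration of how the redundancy question can be framed quantitatively, the sketch below computes how many redundant units are needed to reach a target availability. The independence assumption is exactly the simplification a virtual twin exists to remove – correlated failures (a shared heat wave, a single power feed) are what detailed simulation captures:

```python
# A minimal sketch of the redundancy question: if a single power or cooling
# unit is up a fraction `a` of the time, how many units in parallel are
# needed to reach a target availability? Assumes independent failures --
# a simplification, since correlated failures are what real simulations
# of a facility are meant to expose.
def units_needed(unit_availability: float, target: float) -> int:
    n = 1
    while 1 - (1 - unit_availability) ** n < target:
        n += 1
    return n

for a in (0.99, 0.999):
    n = units_needed(a, target=0.99999)  # "five nines"
    print(f"unit availability {a:.1%} -> {n} parallel units for five nines")
```

Even this toy calculation shows why redundancy planning is not guesswork: a unit that is up 99% of the time needs three parallel copies to reach five nines, while a 99.9% unit needs only two.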
As of 2021, the world had generated 79 zettabytes of data. We need to ensure our data centers can keep up as we climb to an estimated 181 zettabytes by 2025 – more than double the 2021 figure.
We’ve never had better technology to apply to that task, and the technology itself is getting better every day. It is now not only possible but realistic to think in terms of 100% uptime. But that requires both technical capability and 100% human effort.
David Brown is VP Customer Solutions for North America at Dassault Systèmes.