Having shifted to the cloud, government agencies are now eyeing the formation of their very own Software-Defined Data Centre (SDDC) architecture. When you think about it, it makes sense. Nobody else processes as much valuable sensitive data and holds as much stake in storing, managing, and securing it from prying eyes. In addition, government agencies face mounting pressure from their citizens to move paper processes online and offer stable, if not seamless, federal services. All things considered, SDDC is a sound strategy—but are agencies ready for it?
An SDDC requires three elements: server virtualisation, software-defined networking (SDN), and software-defined storage (SDS). Then there's a fourth: software that allows all three to talk to one another seamlessly and enables automated workflows and data transfers between them. Chances are, most agencies already have one (likely server virtualisation) or two (likely storage) running alongside more traditional network designs on-prem, thanks to recent shifts to the cloud. Required architecture aside, there are also at least three considerations for agencies before they flip the SDDC switch.
What’s the size of your environment?
The concept of SDDC is compelling to federal decision-makers, but the shift relies heavily on one keyword: virtualisation. The biggest barrier would be swapping physical equipment, such as switches, routers, and storage arrays, for software-defined equivalents. This is harder than it sounds, depending on the size and complexity of your agency's network infrastructure.
If your team's daily struggles lie in managing a heterogeneous environment, then be warned: SDDC represents a steep challenge. For starters, you'll face the arduous task of virtualising the critical areas of that environment without significantly disrupting daily operations. And then you'll run into the same complexity troubles as before: remapping that diverse virtual environment, constantly spinning up more VMs or cloud services, and reconnecting established processes, protocols, and workflows.
On the flip side, SDDC makes more sense for smaller, homogeneous environments, where virtualisation involves less complexity and less need for re-integration. In fact, I'd argue smaller environments have more to gain from the operational efficiency and cost savings that full virtualisation brings.
The simple verdict? SDDC architecture works against larger environments and for smaller ones. That's not to say large-scale SDDC architecting is impossible—it's just more time-consuming and resource-intensive. If larger agencies are keen on adopting SDDC regardless, then I'd recommend virtualising in batches rather than all at once, and implementing some form of virtualisation management and monitoring software for better visibility into the health and traffic flows of both on-prem and virtual networks.
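To make the batch approach concrete, a migration plan can start as something as simple as grouping the system inventory into waves. The Python sketch below is a hypothetical illustration; the inventory names and batch size are invented, not drawn from any real agency environment:

```python
from typing import Iterator


def migration_batches(systems: list[str], batch_size: int) -> Iterator[list[str]]:
    """Split an inventory of systems into fixed-size migration waves,
    so virtualisation proceeds batch by batch rather than all at once."""
    for start in range(0, len(systems), batch_size):
        yield systems[start:start + batch_size]


# Hypothetical inventory: lower-risk file servers virtualised first,
# core infrastructure held back for the final wave.
inventory = ["file-srv-01", "file-srv-02", "app-srv-01", "app-srv-02", "core-rtr-01"]
for wave, batch in enumerate(migration_batches(inventory, batch_size=2), start=1):
    print(f"Wave {wave}: {batch}")
```

Ordering the inventory by risk, so that the lessons from early waves inform the riskier ones, is the main design choice here; the batching itself is trivial.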
How are applications being developed?
SDDC architecture is a boon for in-house application development. Cost-effective deployments of, and easy access to, cloud-based services and virtual solutions have the potential to transform application or software development for the better. That's great news for agencies facing pressure to increasingly offer self-service or online portal access for citizen services.
But before agencies make the jump to SDDC, it may be pertinent to question how, and how frequently, applications will be developed and supported over time. Take contact tracing apps, for instance. Agencies could choose to create an app that only accomplishes a single purpose, requiring fewer development resources. But what if they wanted to create an application that's constantly adapting to new requirements or adding new features, as in the case of New York state's COVID app? Virtualisation will make it easier to deploy test environments and feedback loops for DevOps and Agile methodologies, which are key to developing higher-quality applications and improving them over time.
Compare those examples with your agency's long-term application development demands, and you'll have an idea of whether SDDC would benefit its in-house development team. If applications or software don't require further development beyond initial deployment, agencies are better off using cloud-based services and even external developer teams. But if the application roadmap features extensive improvement and iteration cycles, then a move to SDDC architecture pays dividends down the line.
How critical is stability for daily operations?
While the shift to SDDC architecture is disruptive to a degree, the payoff is greater stability and service quality, as government agency IT teams gain greater visibility over the virtual network. The move to SDDC could benefit agencies that require round-the-clock stability of digital operations the general public relies on, such as law enforcement or utilities.
When the unspoken goal is to avoid any disruption or failure whatsoever from impacting public wellbeing, an SDDC-based architecture allows IT teams to quickly spin up alternative solutions or capacity to prop up operations while they work to rapidly identify and troubleshoot problems. A fully virtualised back end, coupled with in-depth monitoring, will allow agencies to get things back to normal far sooner than they could before virtualisation.
These are broad considerations, but they're nonetheless solid handles for agencies ruminating on the viability of SDDC. It's worth taking a deeper dive into any one of these considerations if your agency is deciding whether to take the leap, if only because SDDC transition efforts can become costly fast without clear oversight. But done well, the move will allow government agencies to become more nimble, responsive, and stable than they have ever been before.