
Revisiting our “3 Critical Steps to Transition to the Cloud”

Insight from John Partee, NTS Machine Learning Engineer

Back in 2018, we wrote about the DoD’s transition to the cloud, the challenges that were coming, and our personal approach to migrating an app to the cloud. A lot has changed in the years since, so we figured it was time for a retrospective!

Here I’ll cover what I think we got right and wrong, and how our perspective has shifted now that the cloud environment and our approach are more mature. I’m an AWS guy, so I’ll be talking mostly about AWS’s progress in the space, but the same is more or less true across the board if you’re using another provider.

“In fact, from our perspective, the cloud remains a mythical beast in the government space. Everyone wants to talk about being in the cloud, but no one really knows how to get there.”

This phrasing is still pretty true. In the government space there seem to be two major camps: those that think the cloud is wrong for their mission (with some merit!), and those that are already there.

“At the end of the day, the cloud belongs to someone else.” 

This has been a big shift culturally. AWS’s introduction of IL4/5/6 has more or less assuaged the security fears we all once had. GD’s GovCloud contract vehicle makes it extremely accessible, and AWS services are available on any network, at any classification that we’d want. 

“And while it can be that great connector between the data center and edge devices, moving data back and forth as needed, it’s still a stressful decision to make the cloud move.” 

I think we got this right, for the wrong reasons. The real power of the cloud in 2021 is offloading workloads to it when we can afford to. The edge has gained a ton of market and government attention in the last year or two, and we have made strides in making this distributed future easier to manage. Traditional huge strategic data centers have been usurped by easy-to-use cloud services, improving reliability and reducing overhead. The stress of the new world isn’t so much security these days; it’s managing all of the nodes we’ve spread out across the globe!

“First, we recommend a small-scale cloud project with servers that have non-sensitive information.”

Here we’ve changed our approach a lot! Sensitivity isn’t as much of a concern, but the workload’s ability to be shifted is crucial. Latency and throughput constraints are the leading problem in the DoD edge fight, especially as we refocus on near-peer competition. The cloud is useless if you can’t reach out and touch it, which drives our edge focus. Scale is a non-factor; that’s the beauty of the cloud!
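To make that tradeoff concrete, here is a toy sketch in Python of the kind of back-of-the-envelope check an edge node might make before shipping a job to the cloud. The numbers, field names, and the offload_to_cloud function are all made up for illustration; nothing here comes from the original article or any particular product.

```python
from dataclasses import dataclass

@dataclass
class Link:
    rtt_ms: float   # measured round-trip time to the cloud region, in milliseconds
    mbps: float     # measured usable throughput on the link, in megabits per second

@dataclass
class Job:
    payload_mb: float   # data we would have to ship, in megabytes
    deadline_s: float   # how soon the answer is needed, in seconds

def offload_to_cloud(job: Job, link: Link) -> bool:
    """Return True if shipping the job to the cloud still meets its deadline."""
    if link.mbps <= 0:
        return False                                   # disconnected: run it at the edge
    transfer_s = (job.payload_mb * 8) / link.mbps      # time to move the payload
    round_trip_s = link.rtt_ms / 1000.0                # time for the request/response hop
    return (transfer_s + round_trip_s) < job.deadline_s

# A healthy link makes the cloud worth it; a degraded one keeps the work local.
print(offload_to_cloud(Job(payload_mb=50, deadline_s=5), Link(rtt_ms=120, mbps=200)))  # True
print(offload_to_cloud(Job(payload_mb=50, deadline_s=5), Link(rtt_ms=120, mbps=20)))   # False
```

If the link can’t move the payload and return an answer inside the deadline, the work stays at the edge, which is exactly why reachability, not scale, is the deciding factor.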

The other major consideration here is “primitive” flexibility. In software architecture, we talk about “primitive” types: numbers, strings, booleans, and so on. In the cloud, our primitives have become considerably more granular than in the olden days. If you are just sticking a virtual machine (VM) in the cloud, you’re probably doing it wrong. That approach drives up cost and latency, and it doesn’t deliver any of the benefits of the cloud, like scalability.

Shifting from a traditional monolithic VM to a microservice, serverless, or container-oriented architecture is the way to go if you are determined to live in the cloud. This does make you more dependent on one cloud provider (as they all do it a little differently), but it also saves a ton of money if done well and makes systems nearly infinitely scalable!
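As a rough illustration of what a finer-grained cloud “primitive” looks like, here is a minimal serverless sketch in Python. The handler(event, context) shape is the standard AWS Lambda convention, but the event fields and the do_work logic are hypothetical stand-ins for whatever the monolith used to do.

```python
import json

# Hypothetical Lambda entry point -- the event shape and do_work() logic are
# illustrative stand-ins, not taken from the original article.
def handler(event, context):
    """Handle one unit of work per invocation instead of running a long-lived
    service on a VM; the provider scales concurrent invocations for us."""
    record = event.get("detail", {})   # payload from whatever triggered the function
    result = do_work(record)
    return {
        "statusCode": 200,
        "body": json.dumps(result),
    }

def do_work(record):
    # Placeholder for the business logic that used to live inside the monolith.
    return {"processed": True, "fields_seen": sorted(record.keys())}
```

Each function does one small thing, the provider handles the scaling and patching, and you pay per invocation instead of for an idle VM; that is where the cost savings and near-infinite scalability come from.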

“Second, after getting that comfort level and a closer familiarity with the cloud service provider’s security and risk, it is appropriate to move up to a data server that has more important data which merits stronger security requirements.”

Again, a bit of a shift. These days we start walking while we’re crawling, and it’s fine. Security has improved massively since the early days of cloud adoption, and we can start with more sensitive data. This is where we advocate that teams start to build modern software practices and tools into their workflow: for instance, DevSecOps practices like continuous code scanning, GitOps, and more agile development and management practices. GitLab covers most of these bases for us! The software industry has gotten a lot right, and this is where we go from just “lift and shift” to the cloudy, secure, wonderful future.

“Run. Third, larger quantities of data can now be moved in and out of the data center as needed. This can become a large deployment because that trust and comfort have been earned.” 

These days, we have the trust already built! At this stage, we are usually focusing on making our edge management more reliable and scalable. What apps should we shift to the edge? How are they communicating with the cloud? How long does it take to stand up a new edge node? Are we warehousing data returned from the cloud? Scaling our enterprise solutions and adding machine learning is the move here.

Overall, the landscape has shifted. The cloud is a big player for traditional data center operations, but the edge has come into much sharper focus. IoT sensors and edge computing are here to stay. Scalable apps, Kubernetes, and serverless are gargantuan shifts in the thinking of our engineers, making our apps faster and keeping them under budget. We just have to leverage new tech the right way!

NexTech Solutions is a team of experts and engineers who understand the challenges that Federal agencies face in finding and implementing the best technologies and IT solutions to meet their mission requirements. For more information about how we can help your organization transition to the cloud, view our cloud capabilities.