What is the Cloud?
Unless you have been hiding under a rock for the last decade, chances are you have heard the term “Cloud” in relation to IT devices and the IT industry. Most of our readers likely have a solid understanding of the concept, and for those of you who don’t, simply put: there is no cloud, it is just someone else’s computer. All joking aside, the term Cloud abstracts the complexity of computer systems that are interconnected, software-driven, and easily consumable. More eloquently put, “Cloud computing is the on-demand availability of computer system resources, especially data storage and computing power, without direct active management by the user.”
That said, the definition of the Cloud coined many years ago has evolved over time and can mean different things depending on who is using the term. In general, the Cloud comes in many shapes, sizes, and even models. The main differentiator in most of these situations is how much control you have as a consumer. Here are the important terms to understand as you navigate this complex landscape.
Infrastructure as a Service (IaaS):
IaaS is a cloud computing model where virtualized infrastructure is offered to, and managed for, businesses by external cloud providers. With IaaS, companies can outsource storage, servers, data center space, and cloud networking components connected through the internet, offering functionality similar to that of an on-premises infrastructure. Common uses of IaaS include automated, policy-driven operations such as backup, recovery, monitoring, clustering, internal networking, and website hosting. The service provider is responsible for building the servers and storage, the networking and firewalls/security, and the physical data center. Some key players offering IaaS are Amazon Web Services (EC2), Microsoft Azure, Google Cloud Platform, GoGrid, Rackspace, and DigitalOcean, among others.
Platform as a Service (PaaS):
PaaS builds on IaaS. Here, cloud vendors deliver the computing resources, both cloud software and hardware infrastructure components such as middleware and operating systems, required to develop and test applications. The PaaS environment enables cloud users (accessing it via a webpage) to install and host data sets, development tools, and business analytics applications without building and maintaining the underlying hardware. Some key players offering PaaS are IBM Bluemix, CloudBees, Salesforce.com, Google App Engine, Heroku, AWS, Microsoft Azure, OpenShift, Oracle Cloud, and SAP.
Software as a Service (SaaS):
SaaS is special in that it incorporates both IaaS and PaaS. Here, the cloud service provider delivers the entire software suite on a pay-per-use model. SaaS lets users easily access software applications, such as email, over the internet. The most common examples of SaaS are Microsoft Office 365, AppDynamics, Adobe Creative Cloud, Google G Suite, Zoho, Salesforce, Marketo, Oracle CRM, Pardot Marketing Automation, and SAP Business ByDesign.
The public cloud, in general, delivers services to users over the internet. It is the most economical option for users, with the service provider bearing the expenses of bandwidth and infrastructure. It offers limited configurations, and cost is determined by usage capacity. That said, a limitation of the public cloud is its lack of detailed SLA specifications. Despite high reliability, lower costs, zero maintenance, and on-demand scalability, the public cloud is not always suitable for organizations operating with sensitive information that must comply with stringent security regulations.
As the name suggests, the private cloud is used by large organizations to build and manage their own data centers for specific business and IT needs and operations. The private cloud provides more control over customizability, scalability, and flexibility, while improving the security of assets and business operations. This sort of infrastructure can be built on premises or outsourced to a third-party service provider; either way, the hardware and software environment is maintained over a private network solely for the owner. Large- and medium-scale financial enterprises and government agencies typically opt for private clouds.
The hybrid cloud is the combination of a private and a public cloud, giving businesses control over critical operations and assets along with improved flexibility and cost efficiency. The hybrid cloud architecture enables companies to take advantage of the public cloud as and when necessary through easy workload migration. For instance, a business can run high-volume applications like email in the public cloud, keep sensitive assets like financial data in the private cloud, and burst to the public cloud during scheduled maintenance or spikes in demand.
A multi-cloud strategy is the use of two or more cloud computing services. While a multi-cloud deployment can refer to any implementation of multiple software as a service (SaaS) or platform as a service (PaaS) cloud offerings, today, it generally refers to a mix of public infrastructure as a service (IaaS) environments, such as Amazon Web Services and Microsoft Azure.
Cloud as an Operating Model
We at Sanity Solutions don’t look at the cloud as a destination; we treat it as an operational model driven by software-defined architecture. It’s about simplifying and automating the recurring tasks so the IT organization can drive business strategy. It’s about using hyperscale environments to increase agility and drive value for the business. It’s about making the infrastructure seamless for its consumers and improving migration and lifecycle management. It’s about determining the best economies of scale and having the portability to take advantage of those economies.
What are the driving forces and where is the market going?
The cloud was really born in the virtualization age. Hypervisors abstracted the hardware layers as software, and a revolution was born. Virtualization is at the heart of cloud infrastructures, and it embodies the “software-defined everything” model. The virtual machine (VM) is at the core of virtualization, and VMs rule supreme in most ecosystems today. VMs typically function as a means of running an application, and one of the major pitfalls of the VM is its lack of portability for applications.
While virtual machines are king in today’s infrastructure, a container uprising is occurring. One of the main drawbacks of a virtual machine is its portability: VMs cannot be transferred across hypervisors without special tools, and doing so requires downtime. Software developers have come up with a solution to this problem in the form of containers. A container is an abstracted resource: it is allocated the CPU, memory, and networking resources that allow an application to run in it, but it does not contain an operating system.
A container has to run on top of an operating system, but the absence of an operating system within the container allows you to move containers between platforms without vendor lock-in. There are some caveats to this at a high level, as it is up to the developers of the application to ensure its portability between systems.
The bottom line is that consumers are already operating multi-cloud environments today, and this will likely continue to trend upwards. If you want to take advantage of the best cloud economics and the agility the public cloud is all about, you will need to use containers.
Dark Clouds can be avoided
Clouds can provide agility and scalability, which can lead to the creation of rainbows. However, there are important considerations to keep in mind when looking at deploying into various clouds.
Not all clouds are created equal, and there are many different cost structures. Public clouds typically quote a price to rent virtual infrastructure and then charge consumers for the networking, storage, API calls, and other resources they consume. Many private clouds and managed service providers charge a flat rate for use of the resources, though some take the metered approach as well. Ultimately, it is up to consumers to understand their workloads and how those workloads interact with each other, because failure to do so can have a significant financial impact on a business.
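To illustrate how metered charges add up, the sketch below totals one month of usage against per-unit rates. Both the rates and the workload figures are hypothetical numbers invented for this example; they do not reflect any provider’s actual price list.

```python
# Hypothetical per-unit rates -- illustrative only, not real pricing.
RATES = {
    "vm_hours": 0.05,       # $ per VM-hour
    "storage_gb": 0.023,    # $ per GB-month of storage
    "egress_gb": 0.09,      # $ per GB of outbound traffic
    "api_calls": 0.000004,  # $ per API request
}

def monthly_cost(usage: dict) -> float:
    """Sum the metered charges for one month of usage."""
    return round(sum(RATES[key] * amount for key, amount in usage.items()), 2)

# A made-up workload: 2 VMs running all month (~730 h each), 500 GB stored,
# 1.2 TB of egress, and 5 million API calls.
workload = {
    "vm_hours": 2 * 730,
    "storage_gb": 500,
    "egress_gb": 1200,
    "api_calls": 5_000_000,
}

print(monthly_cost(workload))
```

Note how much of the bill comes from egress and API calls rather than the VMs themselves; that is exactly the kind of interaction between workloads the paragraph above warns about.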
The major public cloud providers, along with their third-party affiliates, offer consumers a myriad of services. Some of these offerings come with proprietary software and configurations. It may be very easy to get something deployed up front, but companies need to be deliberate about how they architect their applications, as it is possible to get locked in to a specific vendor. That has the potential to take agility away from consumers.
Companies should strongly consider a vendor-agnostic approach to cloud offerings, as it can be very costly to pivot in a different direction later. However, some companies may decide to take a different approach and build upon a stack provided by the cloud companies. The major public cloud providers typically make this model very easy to consume, likely because they would like to keep you as a customer for a long time. All in all, companies should be diligent about which path they choose, as it could make or break them down the road.
Security is everyone’s responsibility, and cloud providers have invested billions of dollars to ensure solid security practices are implemented within their infrastructure. At the same time, securing what runs in the public cloud is still the consumer’s responsibility.
There are many products and services in the public cloud marketplaces that allow companies to implement security practices in the fashion they see fit. Most major security vendors provide access to physical or virtual resources in those marketplaces, making it easy for consumers to extend their on-premises security ecosystem into the public cloud for hybrid cloud deployments. Many of the public cloud providers also offer their own tools and services you can use to build your security practice in the cloud. Just make sure you have a good plan and strategy for using proprietary services, as you may experience some vendor lock-in depending on which tools and services you utilize.
Public and private cloud infrastructure is all about adding layers of resiliency to ensure infrastructure and software uptime. However, those built-in layers typically cover only the primary copy of the data. As such, it is important to have a well-thought-out, robust backup strategy. Businesses must have well-defined recovery time objectives (RTOs) and recovery point objectives (RPOs) so that IT teams can make effective decisions about their backup strategies.
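The RPO side of that decision can be reduced to simple arithmetic: with periodic backups, the worst-case data loss is one full backup interval, so the interval must not exceed the RPO. The sketch below is our own minimal illustration of that check, not a standard tool.

```python
def meets_rpo(backup_interval_hours: float, rpo_hours: float) -> bool:
    """A periodic backup schedule's worst-case data loss is one full
    interval between backups, so that interval must not exceed the RPO."""
    return backup_interval_hours <= rpo_hours

# Nightly backups (24 h apart) cannot satisfy a 4-hour RPO;
# hourly backups can.
print(meets_rpo(24, 4))  # False
print(meets_rpo(1, 4))   # True
```

The same reasoning applies in reverse to RTO: the time to restore from a given tier of storage must fit inside the recovery time the business has defined.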
Incorporating the standard 3-2-1 rule (3 copies of your data, on 2 different media types, with 1 copy offsite) is still very necessary to ensure data integrity. Many cloud offerings have a service level agreement specifying how long the provider is responsible for ensuring the integrity of the data, and it is very important to read the fine print so that you know what your business is responsible for and how resilient the data structures provided to you actually are. Many service and cloud offerings will give you the ability to build a 3-2-1 strategy, but there are typically additional costs associated with those offerings.
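A quick way to sanity-check a backup inventory against the 3-2-1 rule is to count copies, distinct media types, and offsite copies. The sketch below uses a made-up inventory format of our own devising purely to illustrate the rule.

```python
def satisfies_321(copies: list) -> bool:
    """Check the 3-2-1 rule: at least 3 copies of the data,
    on at least 2 media types, with at least 1 copy offsite."""
    return (
        len(copies) >= 3
        and len({c["media"] for c in copies}) >= 2
        and any(c["offsite"] for c in copies)
    )

backups = [
    {"media": "disk",  "offsite": False},  # primary on-prem copy
    {"media": "disk",  "offsite": False},  # local replica
    {"media": "cloud", "offsite": True},   # cloud object storage
]

print(satisfies_321(backups))  # True
```

Drop the cloud copy from the list and the check fails, since both the media-type count and the offsite requirement are no longer met.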
Staffing and human considerations are often among the most difficult things to put costs on, which makes them especially important to understand when you are looking at venturing into different clouds.
Cloud infrastructure is incredibly complex, and with this complexity come new technologies and new interfaces that will take employees time to learn and adapt to. Adopting a new cloud offering might mean you need to hire new skill sets or bring in third-party resources to help the business complete projects and migrations.
Most companies want to save time and/or money in the long run by reducing the effort required to manage an environment. That is what the cloud promises, and consumers want to spend more of their time driving revenue for the business. Ultimately, companies need to do the work up front to factor in what it may cost to get there.
Sanity Solutions has you covered. A better approach than a questionnaire or an automated discovery method is one that consults with the business to identify individual business transactions, documents their end-to-end flow through the IT infrastructure, and maps the business requirements and their importance to the technological considerations supporting them. If you are ready to optimize your use of the cloud to best fit your organization, let us help by contacting us at firstname.lastname@example.org.