Reverse hybrid cloud: how the public cloud is moving on-premises

Google Cloud Platform and Amazon Web Services are making their way into private data centers.

In the multicloud era, companies use a variety of public, private, and on-premises infrastructure to run their services, applications, and workloads. Perhaps surprisingly, the large public cloud platforms have begun entering private data centers to expand their coverage and deepen customer relationships. In this article, I will explore some of the ways public clouds are being extended into private data centers, further blurring the boundaries between the different types of infrastructure in enterprise IT architecture.
Amazon Web Services (AWS) launched the Snowball Edge appliance in 2015 to give enterprise cloud users on-site storage and limited compute power. The initial use case for the Edge devices (and the AWS Snowmobile truck) was to help companies transfer large amounts of data into the AWS cloud.

In July 2018, AWS announced EC2 compute support on the Snowball Edge device, bringing real AWS cloud power to the edge. Customers can now run virtualized applications on local EC2 instances wherever they have power, even without an internet connection. Snowball Edge also runs AWS Lambda for serverless computing, and AWS Greengrass connects Snowball Edge to the AWS cloud for data processing, including machine learning. Together with support for Amazon S3 data storage, these improvements have transformed Snowball Edge from a simple data-transfer medium into a capable compute node.
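The appeal of this model is that application code need not change: Snowball Edge exposes an S3-compatible interface on the local network, so the same storage client logic can target either the device or the public cloud. A minimal sketch of that idea, assuming a hypothetical `make_storage_endpoint` helper and illustrative endpoint URLs (this is not part of any AWS SDK):

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative public S3 endpoint; region and URL format are assumptions.
AWS_S3_ENDPOINT = "https://s3.us-east-1.amazonaws.com"


@dataclass
class StorageEndpoint:
    url: str
    is_local: bool


def make_storage_endpoint(edge_device_ip: Optional[str] = None) -> StorageEndpoint:
    """Return a local Snowball-Edge-style endpoint when a device IP is
    configured, otherwise fall back to the public S3 endpoint.
    Application code above this layer stays identical either way."""
    if edge_device_ip:
        # Snowball Edge presents an S3-compatible interface on the LAN;
        # the port here is illustrative.
        return StorageEndpoint(url=f"https://{edge_device_ip}:8443", is_local=True)
    return StorageEndpoint(url=AWS_S3_ENDPOINT, is_local=False)
```

In practice the selected URL would be passed to an S3 client as a custom endpoint; the point of the sketch is only that the choice between edge and cloud collapses into configuration.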


A few days after the AWS announcement, Google Cloud Platform (GCP) announced that it will extend Google Kubernetes Engine (GKE) – its core service for managing containers – to support on-premises container deployments. Google has long said it intends to end the false dichotomy between on-premises and cloud: GKE On-Prem will appear in the Google Cloud dashboard as just another available region and can be monitored and managed alongside infrastructure running in the public cloud.

What is a reverse hybrid cloud?

Reverse hybrid cloud is a term describing an architecture in which an enterprise runs public cloud software and services in its own data center. In this model, the company still consumes traditional public cloud services while retaining on-premises infrastructure and the capability to manage its own hardware and software.

Usually, hybrid cloud refers to the use of two or more distinct clouds connected through common or proprietary technologies (often a hybrid integration platform). Older businesses moving into the cloud, and digital-native companies leaving it for any number of reasons, often find themselves running a hybrid cloud. But GCP and AWS moving into private data centers is a major play by the public cloud platforms to help companies transform their IT wherever they are in that journey.

What are the benefits of a reverse hybrid cloud?

Here are some ways companies can benefit from using a reverse hybrid cloud architecture to bring the power of the public cloud home.

This setup does not force teams to build and maintain multiple environments; instead it encourages companies to standardize on technologies that can be deployed anywhere. With Google Kubernetes Engine, it doesn't matter whether a workload runs on-site or on Google's machines – the interface, applications, technology, and required skills remain the same.

Large enterprises gain better control and broader choice through containers and microservices, even when, for whatever reason, they need to run hardware and infrastructure internally.

Using common technologies across environments – for example, standardizing on Kubernetes – can help IT decision makers improve policy enforcement and compliance. This kind of standardization becomes possible by running cloud-based technology locally.
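One concrete payoff of standardizing on Kubernetes: a single compliance check can vet manifests bound for on-prem and cloud clusters alike. A minimal sketch, where the required labels and the manifest shape are illustrative assumptions, not a real policy:

```python
# Because on-prem and cloud clusters both speak Kubernetes, one policy
# check covers every environment. The required labels here are assumptions.
REQUIRED_LABELS = {"team", "cost-center", "data-classification"}


def policy_violations(manifest: dict) -> list:
    """Return the required labels a Kubernetes-style manifest is missing."""
    labels = manifest.get("metadata", {}).get("labels", {})
    return sorted(REQUIRED_LABELS - labels.keys())


manifest = {
    "kind": "Deployment",
    "metadata": {"name": "billing-api", "labels": {"team": "payments"}},
}
# policy_violations(manifest) -> ['cost-center', 'data-classification']
```

In a real deployment this kind of check would typically live in an admission controller or CI pipeline, but the logic is the same wherever the cluster runs.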

It gives IT organizations the opportunity to test and develop cloud-based services entirely in-house before pushing them to the public cloud.

For cloud-based workloads that require minimal latency, local placement can improve performance and potentially save WAN-related costs.

For large-scale data transfers, it can be cheaper to load data onto an edge device and ship the device to the cloud provider than to push the data over the network.
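A back-of-the-envelope calculation shows why shipping a device can win: even a fast WAN link takes days to move tens of terabytes. The link speed and sustained-utilization figures below are assumptions for illustration:

```python
def wan_transfer_days(terabytes: float, link_gbps: float,
                      utilization: float = 0.8) -> float:
    """Days needed to push a dataset over a WAN link, assuming the link
    sustains the given fraction of its nominal bandwidth."""
    bits = terabytes * 1e12 * 8                      # decimal TB -> bits
    seconds = bits / (link_gbps * 1e9 * utilization)  # bits / effective bps
    return seconds / 86_400                           # seconds per day


# Pushing 80 TB (roughly one appliance's worth of data) over a dedicated
# 1 Gbps link at 80% utilization takes on the order of nine days --
# comparable to a round-trip shipment, but with the WAN bill on top.
days = wan_transfer_days(80, link_gbps=1.0)
```

The crossover point depends on the organization's actual link speed, utilization, and bandwidth pricing, but at petabyte scale the physical shipment wins decisively.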

Running EC2 at the edge lets businesses process their data before sending it to AWS, while they work out the right balance between private and hybrid clouds.
