CompTIA Cloud+ Study Guide - Ben Piper

decide to run a TV commercial during a Saturday afternoon televised game. After the commercial airs, the website experiences a huge traffic spike and an increase in online orders. Once the load subsides to normal levels, autoscaling terminates the additional web servers so that the retailer doesn't have to keep paying for them when they're not needed. This works well because the retailer can match the load on the website with the needed amount of computing, memory, storage, and other back-end resources in the cloud. Combining this pay-as-you-go model with autoscaling maximizes cost efficiency because you don't have to spend money up front on hardware for peak loads or future growth. Autoscaling will just provision more capacity when needed. With automation and rapid provisioning, adding capacity can be as simple as a few clicks in a console, and the resources are immediately deployed!
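      As a rough illustration of how such a scaling policy might be defined, the following Python sketch uses AWS's boto3 SDK (one provider's SDK, chosen only as an example); the Auto Scaling group name and the 60 percent CPU target are hypothetical values, not a recommendation.

        import boto3

        autoscaling = boto3.client("autoscaling")

        # Hypothetical target-tracking policy: keep average CPU near 60%,
        # letting the provider add or remove web servers as load changes.
        autoscaling.put_scaling_policy(
            AutoScalingGroupName="retail-web-asg",      # hypothetical group name
            PolicyName="keep-cpu-near-60-percent",
            PolicyType="TargetTrackingScaling",
            TargetTrackingConfiguration={
                "PredefinedMetricSpecification": {
                    "PredefinedMetricType": "ASGAverageCPUUtilization"
                },
                "TargetValue": 60.0,
            },
        )

      With a policy like this in place, the traffic spike after the commercial triggers scale-out automatically, and scale-in happens just as automatically once the load subsides.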

      Contrast this scenario with what would happen without autoscaling. If the retailer were stuck with only three web servers, during the traffic spike the servers might slow down or crash. Adding more servers would be a manual, expensive, and time-consuming process that even in a best-case scenario would take several minutes to complete. By that time, the damage would have already been done.

      Understanding Cloud Performance

      Cloud performance encompasses all of the individual capabilities of the various components as well as how they interoperate. The performance you are able to achieve with your deployment is a combination of the capabilities and architecture of the cloud service provider and how you design and implement your operations.

      Ongoing network monitoring and management allow you to measure and view an almost unlimited number of cloud objects. If any parameter extends beyond your predefined boundaries, alarms can be generated to alert operations and even to run automated scripts to remedy the issue. Here are just a few of the things you may want to monitor (a short alarm example follows the list):

       Database performance

       Bandwidth usage

       Network latency

       Storage I/O operations per second (IOPS)

       Memory utilization
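      To make the alarm idea concrete, here is a minimal Python sketch using boto3 and Amazon CloudWatch as one example of a provider's monitoring service; the instance ID, threshold, and notification topic are all hypothetical.

        import boto3

        cloudwatch = boto3.client("cloudwatch")

        # Alarm if average CPU on one instance stays above 80% for two
        # consecutive 5-minute periods; the alarm action could page the
        # operations team or kick off an automated remediation script.
        cloudwatch.put_metric_alarm(
            AlarmName="web-server-cpu-high",
            Namespace="AWS/EC2",
            MetricName="CPUUtilization",
            Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],  # hypothetical
            Statistic="Average",
            Period=300,
            EvaluationPeriods=2,
            Threshold=80.0,
            ComparisonOperator="GreaterThanThreshold",
            AlarmActions=["arn:aws:sns:us-east-1:111122223333:ops-alerts"],  # hypothetical topic
        )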

      Delivering High Availability Operations

      By implementing a well-architected network using best design practices, and by selecting a capable cloud service provider, you can achieve high availability operations. You and the cloud provider share responsibility for achieving high availability for your applications running in the cloud.

      The cloud provider must engineer its data centers for redundant power, cooling, and network systems, and create an architecture for rapid failover if a data center goes offline for whatever reason. As we discussed with computing, network, and storage pools, the cloud provider is responsible for ensuring high availability of these pools, which means that they're also responsible for ensuring redundancy of the physical components that compose these pools.

      It's your responsibility as the cloud customer to engineer and deploy your applications with the appropriate levels of availability based on your requirements and budgetary constraints. This means using different regions and availability zones to eliminate any single point of failure. It also means taking advantage of load balancing and autoscaling to route around and recover from individual component failures, like an application server or database server going offline.
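      As a sketch of what eliminating a single point of failure can look like in practice, the following boto3 example (all names, subnets, and ARNs are hypothetical) spreads an Auto Scaling group across two availability zones behind a load balancer and lets the load balancer's health checks drive instance replacement.

        import boto3

        autoscaling = boto3.client("autoscaling")

        # Spread instances across subnets in two different availability zones
        # and register them with a load balancer target group, so the failure
        # of one server -- or one zone -- doesn't take the application down.
        autoscaling.create_auto_scaling_group(
            AutoScalingGroupName="orders-app-asg",                        # hypothetical
            LaunchTemplate={"LaunchTemplateName": "orders-app", "Version": "$Latest"},
            MinSize=2,
            MaxSize=6,
            VPCZoneIdentifier="subnet-aaa111,subnet-bbb222",              # one subnet per AZ (hypothetical)
            TargetGroupARNs=["arn:aws:elasticloadbalancing:us-east-1:111122223333:targetgroup/orders/abc123"],
            HealthCheckType="ELB",            # replace instances that fail load balancer health checks
            HealthCheckGracePeriod=300,
        )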

      Managing and Connecting to Your Cloud Resources

      By definition, your cloud resources are off-premises. This raises the question of how to connect to the remote cloud data center in a way that is both reliable and secure. You'll look at this question in this chapter. Finally, you'll learn about firewalls, a mainstay of network security, and the role they play in cloud management deployments.

      Managing Your Cloud Resources

      It's instructive to note the distinction between managing your cloud resources and using them. Managing your cloud resources includes things like provisioning VMs, deploying an application, or subscribing to a SaaS service such as hosted email. You'll typically manage your cloud services in one of three ways:

       Web management interface

       Command-line interface (CLI)

       APIs and SDKs

      Web Management Interface

      When getting started with the cloud, one of the first ways you'll manage your cloud resources is via a web interface the cloud provider offers. You'll securely access the web management interface over the Internet. Here are a few examples of what you can do with a typical cloud provider web interface:

       IaaS: Provision VMs, create elastic block storage volumes, create virtual networks

       PaaS: Upload and execute an application written in Python, deploy a web application from a Git repository

       SaaS: Send and receive email, create and collaborate on documents

      Note that when it comes to the PaaS and SaaS side of things, there's considerable overlap between managing a service and using it.

      Command-Line Interface, APIs, and SDKs

      Cloud providers offer one or more command-line interfaces to allow scripted/programmatic management of your cloud resources. The command-line interface is geared toward sysadmins who want to perform routine management tasks without having to log in and click around a web interface.

      Command-line interfaces work by using the cloud provider's APIs. In simple terms, the API allows you to manage your cloud resources programmatically. In contrast to a web management interface, in which you're clicking and typing, an API endpoint is a web service that listens for specially structured requests. Cloud provider API endpoints are usually open to the Internet, encrypted using Transport Layer Security (TLS), and require some form of authentication.
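      As a purely illustrative sketch (the endpoint URL, token, and response fields below are hypothetical, not any particular provider's API), a structured request to an API endpoint might look like this in Python:

        import requests

        API_ENDPOINT = "https://api.example-cloud.com/v1/instances"   # hypothetical endpoint
        API_TOKEN = "replace-with-your-token"                         # some form of authentication

        # The request travels over HTTPS (TLS) and carries credentials in a
        # header; the endpoint replies with structured data (JSON here).
        response = requests.get(
            API_ENDPOINT,
            headers={"Authorization": f"Bearer {API_TOKEN}"},
            timeout=10,
        )
        response.raise_for_status()
        for instance in response.json().get("instances", []):
            print(instance["id"], instance["state"])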

      Cloud providers offer software development kits (SDKs) for software developers who want to write applications that integrate with the cloud. SDKs take care of the details of communicating with the API endpoints so that developers can focus on writing their application.
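      For example, using AWS's boto3 SDK for Python (one provider's SDK among many), listing your VMs takes only a few lines; the SDK builds, signs, and sends the underlying API requests for you:

        import boto3   # AWS SDK for Python

        # The SDK handles endpoints, TLS, request signing, and response parsing;
        # the developer just calls a method and works with Python data structures.
        ec2 = boto3.client("ec2", region_name="us-east-1")
        for reservation in ec2.describe_instances()["Reservations"]:
            for instance in reservation["Instances"]:
                print(instance["InstanceId"], instance["State"]["Name"])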

      Connecting to Your Cloud Resources

      How you connect to your cloud resources depends on how you set them up. As I alluded to earlier, cloud resources that you create are not necessarily reachable via the Internet by default. There are three ways that you can connect to your resources:

       Internet

       VPN access

       Dedicated private connections

      Internet

      If you're hosting an application that needs to be reachable anytime and anywhere, you'll likely open it up to the Internet. If a resource is open to the Internet, it will have a publicly routable Internet IP address. This is typically going to be a web application, but it doesn't have to be. Although anywhere, anytime access can be a great benefit, keep in mind that traffic traversing the Internet is subject to high,
