A few weeks ago, Sia wrote about configuring your DNS for high availability. Another aspect of making sure you've got an always-on web app is load balancing, which helps limit time-outs.
As more and more workloads move to the cloud, load balancing is modern computing's answer to giving the thousands of concurrent users requesting information from your website an effective customer experience. When your network is being hit at peak traffic times, distributing incoming requests from clients and users across a group of back-end resources lets you handle those requests faster and more reliably. Think of it as a traffic orchestration manager.
Why are load balancers important?
It's a known fact: there's only so much processing power, memory, and disk space associated with a single server. Furthermore, there are always going to be network limitations on the TCP/IP protocol based on the number of connections it can actually serve at any given moment.
A physical server has a finite number of resources, whereas hosting your app on more than one server lets you share and balance the load, handling far more connections because you have multiple resources to route and manage traffic across.
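The core idea of sharing the load can be sketched in a few lines of Python: requests arrive one at a time and are handed out to back-end servers in rotation. The server addresses here are hypothetical, and a real load balancer does far more (connection tracking, weighting, health checks), but this is the essence of round-robin distribution:

```python
from itertools import cycle

# Hypothetical pool of back-end servers the balancer can route to.
BACKENDS = ["10.0.0.1:8080", "10.0.0.2:8080", "10.0.0.3:8080"]

def round_robin(backends):
    """Yield back-end addresses in an endless round-robin rotation."""
    return cycle(backends)

rotation = round_robin(BACKENDS)

# Six incoming requests get spread evenly: each server handles two.
assignments = [next(rotation) for _ in range(6)]
print(assignments)
```

Because no single server sees all six requests, each one stays well within its own processing and connection limits.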
What are some of the benefits?
There are very few (if any) tech companies out there that don't require some sort of scheduled maintenance or planned downtime. Most companies try to schedule this for obscure hours, like early Sunday morning, to avoid disruption. But if you're a global business with users across multiple timezones, someone somewhere is likely to get hacked off. With load balancing, if you need to take a server out for maintenance, you can simply shut it down and channel traffic to your other resources.
Handling Peak Performance
One of the other advantages load balancing provides is the ability to add and remove instances as your needs change, without disrupting incoming traffic. If you're an e-tailer or media publisher, there are bound to be times when your servers get hit hard (Black Friday, Christmas sales, or breaking news, to name a few), so being able to automatically scale and balance the load quickly and easily is critical to the user experience.
Most load balancers are sophisticated in how they monitor channels, handle incoming application requests, and check the load across the server instances you're running. If you're using multiple data centres across a number of cloud providers, detected failures can be bypassed by re-routing traffic to unaffected areas, with minimal disruption.
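The failure-detection behaviour described above boils down to a health check: the balancer probes each back end and only routes traffic to the ones that respond. Here's a minimal sketch, where `is_healthy` stands in for a real probe (such as a TCP connect or an HTTP GET against a health endpoint), and the addresses are hypothetical:

```python
def healthy_backends(backends, is_healthy):
    """Return only the back ends that pass a health probe.

    `is_healthy` is a stand-in for a real check, e.g. a TCP
    connect attempt or an HTTP GET against a /health endpoint.
    """
    return [b for b in backends if is_healthy(b)]

# Simulate one failed back end: 10.0.0.2 has stopped responding.
status = {
    "10.0.0.1:8080": True,
    "10.0.0.2:8080": False,
    "10.0.0.3:8080": True,
}
pool = healthy_backends(list(status), status.get)
print(pool)  # traffic is re-routed to the two unaffected servers
```

Run on a schedule, a check like this is what lets a balancer pull a failed server (or a whole data centre) out of rotation before users notice.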
Most experts consider it best practice to run your load balancer concurrently in the same environment as the application resources you want to distribute. Cloud 66 offers a number of options for how you can set this up, depending on the cloud vendor you're using. All your existing web servers will automatically be added to the load balancer, and if you're using a cloud provider with an in-house load balancer, like Amazon's ELB service, Cloud 66 won't charge you for it.
You can also add a load balancer straight from the Cloud 66 control panel if you prefer that we manage this for you. With the load balancer in place, it's then really easy to add more servers based on the needs of your incoming traffic.