How Does Bandwidth Affect Website Performance?

by Ellis Davidson

One of the most crucial aspects of a website's performance is the amount of bandwidth allocated to its use. Bandwidth determines how quickly the Web server can deliver requested information to visitors' browsers. While there are other factors to consider regarding a website's performance, bandwidth is frequently the limiting factor.

Bandwidth Definition

Bandwidth is measured in bits per second. A bit is the smallest unit of computer information, a zero or one, and eight bits make a single byte. Historically, network connections are measured in bits, while end-user storage such as computer memory and hard drives is measured in bytes. In networking, a megabit is one million bits and a gigabit is one billion bits. Therefore, a 100-megabit-per-second connection can send 100 million bits per second, which is more usefully phrased as 12.5MB per second.
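The conversion above can be sketched in a few lines of Python. The 100-megabit link speed is the article's example figure:

```python
# Network bandwidth uses decimal (SI) units: 1 megabit = 1,000,000 bits.
BITS_PER_BYTE = 8
BITS_PER_MEGABIT = 1_000_000
BYTES_PER_MEGABYTE = 1_000_000

def megabits_to_megabytes_per_second(mbps: float) -> float:
    """Convert a link speed in megabits/s to megabytes/s."""
    return mbps * BITS_PER_MEGABIT / BITS_PER_BYTE / BYTES_PER_MEGABYTE

print(megabits_to_megabytes_per_second(100))  # prints 12.5
```

Dividing the bit rate by eight is the whole trick; the remaining constants just keep the mega- prefixes consistent.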

Calculating Bandwidth Requirements

Most Web servers send static pages and images to Web browsers upon request, which means that no modification is necessary to these files before they're sent over the Internet. In these cases, bandwidth requirements are fairly straightforward. Add up the total number of bytes used by the resources that make up a particular page on your site: an HTML page is a certain number of kilobytes of HTML, JavaScript and included files, while the images the page requests will probably be a larger number of kilobytes. For example, a homepage may be 25KB of HTML, Cascading Style Sheets and JavaScript, which then requests 250KB of images. A first-time visitor therefore downloads 275KB of data, or 2,200 kilobits. Over a 100-megabit Internet connection, each page load uses 2.2 megabits -- just over 2 percent of the connection for one second -- which works out to roughly 45 complete page loads per second. Some amount of bandwidth is always lost to overhead networking requirements, so rounding down these estimates (in this case, from 45 to about 40 page loads per second) is a good idea. Websites that expect heavier traffic than that would need more Internet bandwidth, or smaller files on their pages.
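The estimate above can be written as a small calculator. The page sizes are the article's example figures; the 10 percent overhead allowance is an assumption for illustration:

```python
BITS_PER_BYTE = 8
KILOBITS_PER_MEGABIT = 1000

def page_loads_per_second(page_kb: float, link_mbps: float,
                          overhead: float = 0.10) -> float:
    """How many complete page loads per second a link can deliver."""
    page_megabits = page_kb * BITS_PER_BYTE / KILOBITS_PER_MEGABIT
    usable_mbps = link_mbps * (1 - overhead)  # reserve some for overhead
    return usable_mbps / page_megabits

# 25KB of HTML/CSS/JavaScript plus 250KB of images over a 100-megabit link:
print(int(page_loads_per_second(275, 100)))  # prints 40
```

Shrinking the page or widening the link raises the ceiling proportionally, which is why image optimization is often the cheapest capacity upgrade.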

CPU Bottlenecking

A website can also be slowed down by the need to serve dynamic files. This is the case whenever a Web page is generated by programming code; for example, pages on a WordPress site are not static HTML but are generated on each request by the PHP code in the WordPress template. The server's processor then sets its own ceiling: if building each page consumes one-third of 1 percent of the CPU, the server can generate at most about 300 pages per second, while at 1 percent of the CPU per page, the ceiling drops to about 100 pages per second, no matter how much bandwidth is available. Whichever resource runs out first -- bandwidth or CPU -- determines the site's real capacity.
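The two ceilings can be combined in a one-line rule: capacity is the lower of the bandwidth limit and the CPU limit. The figures below are illustrative, not measurements:

```python
def max_pages_per_second(bandwidth_limit: float,
                         cpu_pct_per_page: float) -> float:
    """Pages/s is capped by whichever resource runs out first."""
    cpu_limit = 100.0 / cpu_pct_per_page  # 100% of CPU divided by per-page cost
    return min(bandwidth_limit, cpu_limit)

print(max_pages_per_second(300, 1.0))  # prints 100.0 (CPU-bound)
print(max_pages_per_second(300, 0.2))  # prints 300 (bandwidth-bound)
```

Measuring the actual CPU cost of one page render (with a profiler or load test) is the only reliable way to know which limit a real site hits first.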

Caching and Content Delivery Networks

Bandwidth bottlenecks can be relieved by content delivery networks, or CDNs, which store copies of your Web server's data at multiple points across the Internet. For example, a CDN may have nodes in Tokyo and London, duplicating -- or caching -- information from your Web server for users in Asia and Europe respectively. Because the CDN is physically closer to the user, delivery speeds up and the number of hits on your primary Web server drops. Caching, whether on the local Web server or on a CDN, can also greatly reduce CPU requirements, but at the cost of sometimes sending slightly out-of-date pages to Web browsers.
