Reduce Server Load for Apache and PHP Websites

This introductory tutorial teaches you easy, practical techniques for reducing your website's server load. It applies to websites hosted on Apache web servers, preferably managed through cPanel, and assumes you are using PHP as your server-side scripting language.

As a website administrator, it is good practice to check your website's server load regularly. The lower the server load, the faster your website serves content to your visitors; and the faster you serve content, the easier it becomes to grow sales, followers, and trust.

Server Load Metrics in cPanel

Before you begin learning these tips, it helps to know which server load metrics to monitor in cPanel. To find them:

1.) Log in to your cPanel hosting account.
2.) Under “Stats” you will see plenty of hosting-related information such as Main Domain, Home Directory, Disk space usage and so forth. Scroll down until you see “Service Status” and click the “Click to View” link.
3.) Under “Server Status”, you will see the services running on your server, alongside their status. Examples are imap, mysql, sshd, etc.

The following are the important metrics you need to check:

a.) Server Load – This is the load on the server CPU. The lower this figure, the better. What counts as acceptable depends on the number of CPUs your server is using; as a rule of thumb, a load average at or below the number of CPUs means the server is keeping up, while readings well above it indicate overload.

Every hosting server may use a different number of CPUs, so always interpret the reading relative to the CPU count shown next to it. For example, if your server load reading in cPanel is 2.37 (4 CPUs), your website is experiencing moderate server load; since 2.37 falls in the 1.65-2.85 range for a 4-CPU server, this is not yet a problem.

However, if your server load reading is around 4.51 (2 CPUs), your website is experiencing high or excessive server load, because 4.51 is well beyond 1.9, the threshold for a 2-CPU server. How frequently this happens also matters a great deal: if the reading sits consistently at 3.8 to 4.0 on a 4-CPU server, you need to worry.

If it happens only rarely, it might be a special event (such as having your website featured on the front page of a high-traffic site like www.yahoo.com). Take server load readings at different times of the day, not just on one occasion.

Increasing the number of CPUs is not a permanent solution to all high server load issues. You still need to check your overall website status to make sure no web application has the possibility of abusing your server load.
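The same reading can be taken from a PHP script, which is handy for automated monitoring. Below is a minimal sketch; it assumes a Linux host where /proc/cpuinfo is readable, and the 1.0-per-CPU yardstick in the comment is the general rule of thumb mentioned above, not a cPanel-specific value:

```php
<?php
// Read the 1-, 5- and 15-minute load averages (built-in PHP function).
$load = sys_getloadavg();

// Count CPUs by counting "processor" entries in /proc/cpuinfo
// (a Linux-specific assumption).
preg_match_all('/^processor/m', file_get_contents('/proc/cpuinfo'), $m);
$cpus = max(1, count($m[0]));

// Rule of thumb: load per CPU above ~1.0 means the server is falling behind.
$perCpu = $load[0] / $cpus;
printf("1-min load: %.2f over %d CPU(s) = %.2f per CPU\n",
       $load[0], $cpus, $perCpu);
```

You could run this from a cron job and email yourself when the per-CPU figure stays high across several samples.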

b.) Memory used – generally, as long as it does not stay above 85% all the time, it is fine. High memory usage can affect server load because whenever your server runs short on memory, it uses the swap file, which is essentially hard drive space (similar in principle to the page file in Windows). The more read and write operations the hard drive performs, the higher the CPU usage climbs, which in turn increases the server load.

c.) tmp space usage (e.g. Disk /dev/sda3 (/tmp)) – this is the storage space for temporary files. When it runs out of space, many of your website's features can be seriously affected. As long as it does not sit consistently in the 85% to 100% range, it is fine. Contact your web host if you see above-average usage.
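A quick way to watch this from PHP, assuming the /tmp partition is visible to your script (the 85% warning level mirrors the guidance above):

```php
<?php
// Report /tmp usage the same way cPanel does: as a percentage used.
$total = disk_total_space('/tmp');
$free  = disk_free_space('/tmp');
$usedPct = 100 * (1 - $free / $total);

printf("/tmp usage: %.1f%%\n", $usedPct);
if ($usedPct > 85) {
    echo "Warning: /tmp is nearly full - contact your web host.\n";
}
```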

If your website is experiencing high server load, I recommend the following steps:

Reducing Server Load Tip #1: Implement Captcha on Your Website

There are malicious bots (scripts) programmed by spammers to exploit your website’s features. For example, if you have a useful tool on your website, it attracts not only normal visitors but spammers as well.

If your web application accepts user input and is not captcha protected, spammers can write scripts to exploit it. Once it is exploited, the user of your web application is no longer a human, but a bot. Bots can make thousands of requests in a small amount of time, which in turn increases the server load of your website.

Things you need to do:

1.) Examine all of the pages and features on your website that accept user input.
2.) Apply a captcha to these pages to ensure all users are verified humans and not bots.

For PHP, the most widely recommended captcha solution (strong and standard) is reCAPTCHA: http://code.google.com/apis/recaptcha/docs/php.html
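The reCAPTCHA library linked above handles the verification for you. Purely to illustrate the general idea, here is a minimal homegrown sketch: the function names are hypothetical, and a real site should prefer reCAPTCHA over rolling its own.

```php
<?php
// In a real page you would call session_start() before using $_SESSION.

// Generate a random code and remember it server-side for later comparison.
function make_captcha_code(int $length = 6): string {
    $chars = 'ABCDEFGHJKLMNPQRSTUVWXYZ23456789'; // skip look-alike characters
    $code = '';
    for ($i = 0; $i < $length; $i++) {
        $code .= $chars[random_int(0, strlen($chars) - 1)];
    }
    $_SESSION['captcha'] = $code; // serve this as a distorted image, never plain text
    return $code;
}

// Compare the submitted answer against the stored code in constant time.
function check_captcha_answer(string $answer): bool {
    return isset($_SESSION['captcha'])
        && hash_equals($_SESSION['captcha'], strtoupper(trim($answer)));
}
```

Only process the form submission when `check_captcha_answer()` returns true; otherwise redisplay the form with a fresh code.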

Reducing Server Load Tip #2: Enable Gzip Compression

Gzip compression is very useful for reducing website bandwidth because the information transmitted from your server to your users is compressed, significantly reducing its size. Reducing website bandwidth can cut transfer times significantly, which in turn reduces server load.

Things you need to do:

1.) Check to see if you have enabled gzip compression. You can use this tool: http://www.whatsmyip.org/http_compression/

2.) Enter your homepage URL as well as one of your inner pages. If you do NOT see a result such as “http://www.example.com is gzipped” then you have not enabled Gzip on your server.

3.) To enable Gzip compression on Apache servers using cPanel, log in to cPanel and go to Software/Services. Click “Optimize Website”. For the best results, select “Compress the specified MIME types”, as compressing all of your content can sometimes cause problems with your hosting configuration. Make sure all of the MIME types your website serves are listed so you get the most benefit from the compression.
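If your host does not offer the “Optimize Website” screen, the same compression can usually be enabled in .htaccess via mod_deflate. A sketch, assuming the module is loaded on your server (these are standard Apache 2.x directives):

```apacheconf
<IfModule mod_deflate.c>
  # Compress the text-based MIME types; images are already compressed.
  AddOutputFilterByType DEFLATE text/html text/plain text/css text/xml
  AddOutputFilterByType DEFLATE application/javascript application/x-javascript
</IfModule>
```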


Reducing Server Load Tip #3: Put Limits on Your PHP Scripts’ Runtime Execution

Suppose you have a web application that generates random numbers between X and Y, with a quantity specified by the user. If you do not put limits on the PHP scripts that run this application, it is open to abuse, which can drive up server load.

Why? In a realistic scenario, a normal user might want to generate 50 random numbers between 1 and 100. This is not a CPU-intensive activity. But what if a malicious user asks for 10,000,000,000 random numbers between 0.01 and 1,000,000,000,000,000? That consumes a great deal of server resources, and things only get worse if a bot does this repeatedly.

Things you will need to do:

1.) Examine your web applications and ensure that you impose limitations on them.
2.) Protect your web applications with captcha.
3.) Look for bugs in your scripts that can cause infinite looping, etc.
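For the random-number generator described above, such limits might look like the sketch below. The cap of 1,000 numbers and the 5-second runtime limit are illustrative choices, not fixed rules:

```php
<?php
// Hard cap on what a single request may ask for (illustrative value).
const MAX_QUANTITY = 1000;

// Kill the script if it somehow runs longer than 5 seconds.
set_time_limit(5);

// Clamp user input into a safe range instead of trusting it.
function clamp_quantity($raw): int {
    $qty = (int) $raw;
    return max(1, min($qty, MAX_QUANTITY));
}

$quantity = clamp_quantity($_GET['quantity'] ?? 1);
$numbers = [];
for ($i = 0; $i < $quantity; $i++) {
    $numbers[] = random_int(1, 100);
}
```

Clamping silently is one design choice; you could instead reject out-of-range requests with an error message so legitimate users know the limit exists.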

Reducing Server Load Tip #4: Prevent Unauthorized Websites from “HotLinking” To Your Site

You may not be aware of this, but if you have a popular website (or even not so popular), malicious users can abuse your site by doing extensive “hotlinking”. Suppose you have files, images, and so forth on your server for downloading. These files, depending on the application, can be massive in size.

Now if you allow other websites to link to that file, any bots and users coming from other websites can end up downloading your massive files, which in turn can increase the server load.

Things you need to implement:

1.) Use .htaccess located in the root directory of your website to allow only authorized websites to remotely access your files. Other websites will be blocked. The .htaccess example syntax is shown below:

RewriteEngine on
# Options +FollowSymlinks
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^http://(www\.)?yourdomain\.com/.*$ [NC]
RewriteCond %{HTTP_REFERER} !^http://(subdomain\.)?yourdomain\.com/.*$ [NC]
RewriteCond %{HTTP_REFERER} !^http://(www\.)?allowthisdomain1\.com/.*$ [NC]
RewriteCond %{HTTP_REFERER} !^http://(www\.)?allowthisdomain2\.com/.*$ [NC]

RewriteRule \.(gif|jpg|jpeg|png|bmp|js|css|mpg|mp3|zip|ppt|pps|xls|inc|gz|pdf|doc|wav)$ - [F]

Simply replace “yourdomain.com” with your own domain name (the website whose server load you want to reduce). If you have a subdomain within your main domain, add it as well; otherwise, delete that line from the syntax above.

If you want to allow other domains, specify them in place of “allowthisdomain1.com”. If instead you want to block ALL external domains from accessing your content, simply delete those lines. You can also adjust the list of file types you want to protect from being hotlinked by other sites.

Reducing Server Load Tip #5: Use AJAX along with PHP

AJAX is a wonderful technology that lets users interact with your web applications (such as web forms) without reloading the page. AJAX updates only the necessary portions of the HTML, so the page does not have to be reloaded. This significantly reduces server load.

This page provides a good introduction on using AJAX with PHP: http://www.w3schools.com/php/php_ajax_php.asp
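On the PHP side, an AJAX endpoint typically returns just the small piece of data the page needs, as JSON, instead of a full HTML page. A minimal sketch (the field and parameter names are hypothetical):

```php
<?php
// Build only the fragment of data the page asked for.
function build_response(string $name): string {
    // json_encode produces the compact payload that the browser-side
    // JavaScript inserts into the page without a reload.
    return json_encode(['greeting' => 'Hello, ' . $name]);
}

// In a real endpoint you would emit it with the proper header:
// header('Content-Type: application/json');
echo build_response($_GET['name'] ?? 'visitor');
```

Because the response is a few bytes of JSON rather than a fully rendered page, each interaction costs the server far less work.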

Reducing Server Load Tip #6: Enable Website Caching

Depending on your web host, you need to make sure mod_headers is enabled for your Apache web server. Then add the following lines to .htaccess:

<IfModule mod_headers.c>
  <FilesMatch "\.(ico|pdf|flv)$">
    Header set Cache-Control "max-age=29030400, public"
  </FilesMatch>
  <FilesMatch "\.(jpg|jpeg|png|gif|swf)$">
    Header set Cache-Control "max-age=604800, public"
  </FilesMatch>
  <FilesMatch "\.(xml|txt|css|js)$">
    Header set Cache-Control "max-age=172800, proxy-revalidate"
  </FilesMatch>
  <FilesMatch "\.(html|htm|php)$">
    Header set Cache-Control "max-age=60, private, proxy-revalidate"
  </FilesMatch>
</IfModule>


Source: http://www.askapache.com/htaccess/speed-up-sites-with-htaccess-caching.html

Why does caching reduce server load? After a visitor views a page, your website's content (images, etc.) resides in the browser cache. If that content is set to last up to a week (as defined for images in the cache headers above), the visitor's browser will not request the same content from your server again while it remains cached. By reducing server requests, you also reduce server load.
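Dynamic PHP pages can send the same kind of cache headers themselves. A small sketch, where the 60-second lifetime simply mirrors the max-age used for .php files in the .htaccess rules above:

```php
<?php
// Build the cache headers for a dynamic page; 60 seconds matches the
// max-age this article uses for .php files.
function cache_headers(int $maxAge = 60): array {
    return [
        'Cache-Control: max-age=' . $maxAge . ', private, proxy-revalidate',
    ];
}

foreach (cache_headers() as $h) {
    header($h); // headers must be sent before any page output
}
echo "Current server time: " . date('H:i:s');
```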

Reducing Server Load Tip #7: Use robots.txt or .htaccess to Block Bots

Robots.txt (http://support.microsoft.com/kb/217103) and “Deny from all” directives (http://www.kavoir.com/2009/01/htaccess-deny-from-all-restrict-directory-access.html) in .htaccess help prevent unauthorized visitors such as bots from reaching sections of your website that store large amounts of data, such as document archives.

If bots crawl these sections, they consume large amounts of bandwidth and server load, because the files are downloaded unintentionally. Using a combination of robots.txt and .htaccess helps prevent this.
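A minimal robots.txt along these lines, placed in your site's root directory, tells well-behaved crawlers to stay out (the /downloads/ and /documents/ paths are placeholders for your own large-file directories):

```
User-agent: *
Disallow: /downloads/
Disallow: /documents/
```

For directories that should never be served to outside visitors at all, a one-line .htaccess containing `Deny from all` blocks every request, including bots that ignore robots.txt.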
