
Webserver Security (Part II)

This second part of our two-part series on webserver security explores the problem of keeping private data in publicly accessible areas of your server, and of preventing data from untrustworthy sources from entering your system.

  1. Webserver Security (Part II)
  2. The server trusts data from untrustworthy sources
By: Kristian Kohntopp
May 08, 2000



The second large class of errors we are going to address in this article deals with private data in public directories on a server. Many webspace providers offer just that: web space. Their hosting solutions map the root of your FTP directory onto the root of your server. That is, the server directory "/home/www/servers/www.customer.com/" is visible to the customer via FTP as "/". It is also readable by everyone through the URL "http://www.customer.com/". Should the customer's web application need to keep some files private and inaccessible from the web, there is no such location. A file stored via FTP as "/password" will be available through the URL "http://www.customer.com/password".
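The problem can be sketched with a small local simulation; the temporary directory below is a stand-in for the hosting layout just described:

```shell
# Simulate a hosting setup where the ftp root IS the document root.
# $DOCROOT stands in for /home/www/servers/www.customer.com/.
DOCROOT=$(mktemp -d)

# The customer uploads a private file via ftp as "/password" ...
echo "s3cr3t" > "$DOCROOT/password"

# ... but the web server resolves the URL path /password against
# the very same directory, so the file is public:
cat "$DOCROOT/password"
```

There is no directory in this layout that FTP can reach but the web server cannot.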

Many webshops write order logs or debugging output into one or more logfiles, or have configuration files containing passwords and article data. If this data is stored below the document root, it has a URL and is by default accessible via the web. All an attacker has to do is guess the filenames of these files. This is quite easy if you know the defaults used by the 20 most popular shopping solutions and know how to identify the software in use.

This problem does not occur with hosting solutions which were designed to provide private data storage as well as public page directories. In such solutions the FTP root directory "/" is mapped onto "/home/www/servers/www.customer.com/", but the server's document root is located one level further down at "/home/www/servers/www.customer.com/pages", accessible via FTP as "/pages". In such a setup the customer can create additional directories above and in parallel to the document root and store sensitive data there. Because these directories are available through FTP, but not through HTTP, they cannot be accessed via the web.
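A minimal sketch of such a separated layout, again using a temporary directory in place of the real server paths:

```shell
# $BASE stands in for /home/www/servers/www.customer.com/.
BASE=$(mktemp -d)

# "/pages" is the document root; only files below it get a URL.
mkdir -p "$BASE/pages"

# A sibling directory is reachable via ftp but has no URL.
mkdir -p "$BASE/private"
echo "dbpassword=example" > "$BASE/private/shop.conf"

# The web server only ever maps URLs into $BASE/pages,
# so shop.conf cannot be fetched over http.
ls "$BASE"
```

Anything the application must keep secret goes into the sibling directory; anything meant to be served goes below "pages".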

On systems where such a root separation scheme is not in effect, it is possible to work around the problem by creating a storage directory such as "/shop" below the document root and creating a .htaccess file in it which denies all HTTP access (this requires the Apache webserver):

$ cat /shop/.htaccess
Order deny,allow
Deny from all

Files can be transferred from this directory only via FTP, because FTP ignores .htaccess files. This approach is slightly more risky than a directory outside the document root, though, because this kind of protection will fail if the server administrator accidentally turns off the required "AllowOverride Limit" privilege for the directory in the server's master configuration.
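For the .htaccess protection above to hold, the master configuration must keep .htaccess processing of access directives enabled for that directory. A sketch of the relevant section, using the illustrative customer path from above:

```apache
# In the server master configuration: allow .htaccess files in the
# customer's /shop directory to set access restrictions.
<Directory /home/www/servers/www.customer.com/shop>
    AllowOverride Limit
</Directory>
```

If this is later changed to "AllowOverride None", the .htaccess file is silently ignored and the directory becomes world-readable again.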

A variant of this problem exists if multiple customers are hosted on a single machine and one customer can trick the machine into accessing paths outside the customer's directory hierarchy, that is, into accessing any file outside of the tree "/home/www/servers/www.customer.com". Often this is possible with symbolic links or hard links to files stored outside the virtual webserver of the customer. Targets of interest are include files and private keys of other customers, which yield their database passwords or other secret information that has to be stored in the clear in order for their applications to work. Other possible targets are their order logs or other usable information stored in non-public directories.
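The link trick is easy to reproduce locally; the paths below are temporary stand-ins, and the symbolic link plays the role of one created by a rogue customer's script:

```shell
# $VICTIM stands in for another customer's private directory,
# $ROGUE for the attacker's own web space.
VICTIM=$(mktemp -d)
ROGUE=$(mktemp -d)
echo "dbpassword=example" > "$VICTIM/config.inc"

# The rogue customer links into the foreign tree ...
ln -s "$VICTIM/config.inc" "$ROGUE/steal.inc"

# ... and any process that follows symlinks (such as a shared
# web server) happily reads the foreign file:
cat "$ROGUE/steal.inc"
```

Note that the read happens under the privileges of the process following the link, not those of the rogue customer, which is why the shared web server is such an attractive vehicle.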

This problem can be partly reduced by chroot()ing as many services as possible, for example by using the sbox replacement for the Apache suexec program to have all CGI programs run in an isolated environment and under the user ID of the customer instead of the user ID of the web server. Also, many installations run an FTP server such as wu-ftpd, which performs all file transfers chrooted, again to protect well-behaved customers from being spied on by others.

A rogue customer could still use CGI programs to create a symbolic link into the storage area of another customer and use the web server itself to look at foreign files, since in a hosted environment the web server cannot easily run chrooted and under the user ID of the customer for whom it serves the request. Administrators should instruct their web server and all other file transfer programs not to follow symbolic links. In Apache, this can be done by disabling the "FollowSymLinks" option at the top level and not turning it on at lower levels, for example using a configuration section like the following:

<Directory />
Options -FollowSymLinks
</Directory>




