There are a number of ways to make your database-backed Web application run faster on the same hardware, allowing you to postpone upgrades for at least a little while and thus cut costs. One way is to examine how your application interacts with the database. Tom Copeland explains how to "tune the queries" for a PostgreSQL database.
Everyone wants their database-backed Web application to run faster on the same hardware - if a software tweak can postpone a hardware upgrade for a year, it's usually well worth it. One way to improve performance is to examine how the application is interacting with the database and see if there's anything we can do to speed things up. We'll take a look at a handy script that can examine PostgreSQL logs and tell you which queries are being run the most.
It's worth taking a moment to think about the various types of optimization you can do on a database. There's tuning the database server itself - for PostgreSQL, this means tweaking the shared memory parameters, write-ahead log sizes, and such in the PostgreSQL configuration file. There's tuning the operating system on which the database is running - making sure you've got enough processes available to keep the server from starving, making sure you've selected a proper filesystem, and so forth. We're going to tune things at a slightly higher level - the database queries themselves.
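To make the server-level knobs concrete, here's a sketch of the relevant portion of postgresql.conf. The parameter names are real PostgreSQL settings, but the values below are placeholders, not recommendations - appropriate settings depend entirely on your available RAM and workload.

```
# postgresql.conf (excerpt) - illustrative values only
shared_buffers = 1000        # shared memory buffers; raise with available RAM
wal_buffers = 8              # write-ahead log buffers
checkpoint_segments = 3      # WAL segments between checkpoints
```

Again, that's server tuning, which is not our focus here; we'll be looking at the queries instead.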
Naturally, "tuning the queries" can itself be broken down further. One option is to optimize the queries behind a specific part of your application. For example, if you notice that bringing up a customer detail screen takes a long time - whatever "long" means in your environment - it's worth seeing what queries are being run to gather that information. If the application is doing twenty different queries to get the data for that screen, perhaps the queries can be combined; or perhaps an expensive table scan is being done on a large but infrequently updated table, and an index could be added somewhere.
Another approach is a more general optimization: seeing which queries are run the most and working on those. This doesn't have the immediate satisfaction of seeing a screen display faster, but it's not a bad idea if you've got some time and want to do some proofing against future load. That's the sort of optimization we're going to focus on here.
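Before reaching for a dedicated analysis script, you can get a rough first cut at "which queries run the most" with standard Unix tools. The sketch below assumes your server is logging statements (the log_statement setting) in lines that look like "LOG:  statement: SELECT ..."; the exact log format varies with your PostgreSQL version and logging configuration, and the sample log file here is made up for illustration. Normalizing the literal numbers to "?" lets structurally identical queries be counted together.

```shell
# Create a stand-in for a real PostgreSQL server log (hypothetical data).
cat > /tmp/pg_sample.log <<'EOF'
LOG:  statement: SELECT * FROM users WHERE user_id = 1
LOG:  statement: SELECT * FROM users WHERE user_id = 2
LOG:  statement: SELECT * FROM groups WHERE group_id = 7
LOG:  statement: SELECT * FROM users WHERE user_id = 3
EOF

# Pull out the statements, replace numeric literals with '?',
# then count the distinct query shapes, most frequent first.
grep 'statement:' /tmp/pg_sample.log \
  | sed -e 's/.*statement: //' -e 's/[0-9][0-9]*/?/g' \
  | sort | uniq -c | sort -rn
```

This pipeline reports that the users-by-id lookup ran three times and the groups lookup once. A real log analyzer does the same thing more carefully - handling quoted string literals, multi-line statements, and query durations - which is why we'll use a proper script shortly.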
As background information, I help develop GForge (http://gforge.org/), an Open Source collaborative software development tool that allows you to organize and manage any number of software development projects. I also administer a fairly active GForge installation called RubyForge (http://rubyforge.org/); it hosts 170 projects, has over 500 users, and is backed by a PostgreSQL database that executes about 60,000 queries per day. Thus, I've got a vested interest in helping GForge run efficiently on commodity hardware. There are many other sites running GForge as well - you can find a list of sites here (http://gforge.org/docman/view.php/1/52/gforge-sites.html). Several of these installations are much larger than mine, and general optimizations will help them, too.