I can make your site run 10 times faster

Mike Peters, 01-06-2008
I have been doing a lot of work over the last few weeks that involves crawling, parsing and indexing a large number of webpages.

I'm talking half a dozen servers parsing 200,000 pages per hour over 40 IP addresses, 24 hours a day.

As part of this project I ran plenty of benchmarking tests, trying every technique imaginable to squeeze more juice out of the servers.

It worked out phenomenally well.

And so, I wanted to take this opportunity to share a few "secrets" that are going to put your site on steroids.

What slows your site down? (Most expensive to least)

1. Database write access (read is cheaper)

2. Database read access

3. PHP, ASP, JSP and any other server side scripting

4. Client side JavaScript

5. Multiple or oversized images, scripts and CSS files served from different domains on your page

6. Slow keep-alive client connections, clogging your available sockets

How to make your site run faster

== Database layer

1. Switch all database writes to offline processing

2. Reduce database reads to the bare minimum: no more than two queries per page.

3. Denormalize your database and optimize your MySQL tables

4. Implement Memcached and change your database-access layer to fetch information from the in-memory cache first. At the very least, store all sessions in memory.

5. If your system is read-heavy, keep MySQL tables as MyISAM. If it is write-heavy, switch MySQL tables to InnoDB. If you need 99.999% availability, consider switching to MySQL Cluster storage.
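Item 4 above is a read-through cache. Here is a minimal Python sketch of the idea, assuming a Memcached-style client that exposes `get`/`set`; the demo substitutes a plain dict for the real client so it runs standalone, and `cached_fetch`/`DictCache` are hypothetical names:

```python
import json

def cached_fetch(cache, db_query, key, ttl=300):
    """Read-through cache: try the in-memory store first,
    fall back to the database on a miss and populate the cache."""
    hit = cache.get(key)
    if hit is not None:
        return json.loads(hit)
    row = db_query(key)                      # the expensive database read
    cache.set(key, json.dumps(row), ttl)
    return row

# Demo: a plain dict stands in for the real Memcached client,
# so this sketch runs without a memcached server.
class DictCache(dict):
    def set(self, key, value, ttl=0):
        self[key] = value

cache = DictCache()
calls = []

def db_query(key):
    calls.append(key)                        # count real database hits
    return {"id": key, "name": "widget"}

first = cached_fetch(cache, db_query, "item:1")   # miss: hits the database
second = cached_fetch(cache, db_query, "item:1")  # hit: served from memory
```

With this pattern, a thousand requests for the same key cost one database read; the rest are served from RAM.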

== Server side scripting

6. Limit server-side processing to the minimum.

Keep scripts short and queue any heavy-duty processing to run offline. Use a caching engine that generates static files from dynamic pages, so processing only takes place once.

7. Precompile all PHP scripts using eAccelerator. If you're using WordPress, implement WP-Cache.
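The static-file caching idea from step 6 (which is also what WP-Cache does under the hood) can be sketched as: render a dynamic page once, write the result to disk, and serve that file until it expires. A hedged Python sketch with hypothetical names; `render_page` stands in for the real PHP/SQL work:

```python
import os
import time
import hashlib
import tempfile

CACHE_DIR = tempfile.mkdtemp()   # in production: a directory the web server serves directly
TTL = 600                        # regenerate each page every 10 minutes

def render_page(slug):
    # Stands in for the expensive dynamic work (PHP + SQL in the article).
    return "<html><body>Post: %s</body></html>" % slug

def cached_page(slug):
    name = hashlib.md5(slug.encode()).hexdigest() + ".html"
    path = os.path.join(CACHE_DIR, name)
    if os.path.exists(path) and time.time() - os.path.getmtime(path) < TTL:
        with open(path) as f:        # cache hit: no dynamic processing at all
            return f.read()
    html = render_page(slug)         # cache miss: render once...
    with open(path, "w") as f:       # ...then serve the static copy afterwards
        f.write(html)
    return html

page1 = cached_page("hello-world")   # rendered and written to disk
page2 = cached_page("hello-world")   # read straight from the static file
```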

== Front end

8. Reduce the size of all images with an image optimizer, merge multiple CSS/JS files into one, and minify your .js scripts.

9. Avoid hardlinking to images or scripts residing on other domains.

10. Put .css references at the top of your page, .js scripts at the bottom.

11. Install Firefox with Firebug and YSlow. YSlow analyzes your web pages on the fly, giving you a performance grade and recommending the changes you need to make.
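The merge-and-minify step can be sketched in a few lines. This is deliberately naive (it only strips comments and blank lines; a real project should use a proper minifier such as YUI Compressor), and `merge_and_minify` is a hypothetical helper name:

```python
import os
import re
import tempfile

def merge_and_minify(paths, out_path):
    """Concatenate several .js (or .css) files into one and strip
    comments and blank lines, so the browser makes a single request
    for a smaller file."""
    merged = "\n".join(open(p).read() for p in paths)
    merged = re.sub(r"/\*.*?\*/", "", merged, flags=re.S)      # /* block */ comments
    merged = re.sub(r"^\s*//[^\n]*$", "", merged, flags=re.M)  # whole-line // comments
    merged = re.sub(r"\n\s*\n", "\n", merged).strip()          # collapse blank lines
    with open(out_path, "w") as f:
        f.write(merged)
    return merged

# Demo with two throwaway files:
d = tempfile.mkdtemp()
with open(os.path.join(d, "a.js"), "w") as f:
    f.write("/* lib */\nvar x = 1;\n")
with open(os.path.join(d, "b.js"), "w") as f:
    f.write("// helper\nvar y = 2;\n")

result = merge_and_minify(
    [os.path.join(d, "a.js"), os.path.join(d, "b.js")],
    os.path.join(d, "all.min.js"),
)
```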

== Web Server

12. Optimize httpd.conf to kill connections after 5 seconds of inactivity, and turn gzip compression on.

13. Configure Apache to add Expires and ETag headers, allowing client web browsers to cache images, .css and .js files.

14. Consider dumping Apache and replacing it with Lighttpd or Nginx. If you absolutely need to stick with Apache, upgrade to its latest version.
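Steps 12 and 13 translate into a handful of httpd.conf directives. The directive names below are real Apache settings (mod_deflate and mod_expires must be enabled), but the values are illustrative and should be tuned for your own traffic:

```apache
# Kill idle connections quickly so slow clients don't clog sockets
Timeout 5
KeepAlive On
KeepAliveTimeout 5

# gzip compression (requires mod_deflate)
AddOutputFilterByType DEFLATE text/html text/css application/javascript

# Far-future Expires headers so browsers cache static assets (mod_expires)
ExpiresActive On
ExpiresByType image/png "access plus 1 month"
ExpiresByType text/css "access plus 1 month"
ExpiresByType application/javascript "access plus 1 month"

# ETags are on by default; FileETag controls which attributes they include
FileETag MTime Size
```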

The Results

Implementing the steps described above will result in a faster browsing experience for your visitors as well as significantly improved website scalability.

The chart below illustrates the impact of heavy load on a non-optimized machine (linear degradation in performance) vs. an optimized machine.

With our non-optimized machine, CPU spiked to 90% at 50 concurrent connections. The optimized machine effectively handled 500 concurrent connections with CPU at 8% and no degradation in performance.


Kate Murphy, 01-07-2008
No, it's a great post title!!

I didn't know about Lighttpd. My Apache explodes past the 30-concurrent-connections point. I'm going to give Lighttpd a try.

Mari Holt, 01-07-2008
I have a WordPress blog using WP-Cache. Should I also use eAccelerator? What is the difference between the two?

Andrew Johnson, 01-07-2008
WP-Cache turns your pages static so the code only has to be executed once.

eAccelerator precompiles all your PHP libraries so executing the code (even if it's only once per unique page) is faster.

jean, 01-07-2008
Blogs require DB writes and reads; basically every page requires them. So how do you put your words into action in that case?

Andrew Johnson, 01-07-2008
Jean - You completely missed the whole point of this post. Read it again and you'll see he is suggesting you use WP-Cache and eAccelerator to minimize the number of times your blog is accessing the database.

If you have 1,000 visitors to your blog, WP-Cache turns what would have been 1,000 reads and 1,000 writes into a single database read and write.

Mike Peters, 01-07-2008
Thanks Andrew. You got it

Dan, 01-07-2008
I'd recommend Nginx over Lighttpd. It's a touch faster, but more importantly it's much more stable.

Mike Peters, 01-07-2008
Thanks Dan!

I did some digging around and it does sound very promising.

Quite a few sites are reporting that Nginx has better overall CPU utilization and doesn't fragment memory.

I'll just have to get over the original documentation being written in Russian. We'll test it out and post our review here.

Belinda Jones, 01-07-2008
Excellent article. I just subscribed to your RSS feed

mausch, 03-01-2008
Nice sum-up. I would add:
- Maximize parallel browser connections by spreading content across multiple hostnames.

Google "http images concurrent connections" for more info.

Bogdan, 03-04-2008
Thanks for these tips.
They are of great use.

Daniel Hepper, 10-02-2008
9. Avoid hardlinking to images or scripts residing on other domains.

How about loading libraries from Google or Yahoo!, or pictures from Flickr? Isn't that supposed to take some load off your server?

jep, 10-06-2008
Daniel, hotlinking items from other sites only slows down the browsing experience. Sure, server load will drop, but layout rendering will be a lot slower and the user experience will suffer dramatically.

Adrian Singer, 08-25-2009
Here's a great tool to measure page load time:

It tests with a clear cache and DNS, so you get a true reading of what a first-time user sees.

Sam Burdge, 01-23-2010
Great article! very useful info indeed.

One thing I have noticed about using caching plugins in WordPress such as WP-Cache / Supercache is that they respond badly to robot crawler traffic.

If you are receiving a high volume of visitors from sites like Digg or Slashdot, the caching plugin will help reduce load on the database CPU, as each page only has to process the SQL/PHP once; the cached version is then loaded for all subsequent visitors.

However, crawlers will be accessing many of the long-tail pages of your site, which aren't cached. So in this instance, not only are all the MySQL queries and PHP processing necessary for each page crawled, but a cache of each page is also created on the fly, which adds greatly to the processor load.

I tried caching the MySQL queries themselves using the db-cache plugin, but at the first spike in traffic the CPU usage went through the roof! Processing the cached query data proved to be more expensive than actually performing the queries themselves...

WP-Supercache works great most of the time and is crucial for the occasions when we get high spikes in traffic to a particular page, but certain crawlers (particularly Yandex) can bring the server to its knees.
Any advice?
