We've come to use a lot of caching to optimise a web application that we've been developing for over a year now. It's past its prototype stage and is being used in anger, so we needed to reduce latency as much as we could.
Caching was the obvious solution: instead of the server sending queries off to our RDBMS, we can cache the results of previous calls and return those instead - saving the travel of the query, the crunching of the query and the return of the result. There's a lovely video illustrating this process in which Amazon introduces its ElastiCache service. I got the chance to play with the Amazon service yesterday afternoon and it's brilliant.
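That check-the-cache-first pattern (often called cache-aside) can be sketched in a few lines. This is a rough illustration, not our actual code: it's in Python rather than PHP, a plain dict stands in for Memcached, and `slow_rdbms_query` is a made-up placeholder for the real database call.

```python
import time

# Dict standing in for Memcached; in real life this would be a memcached
# client talking to the cache server.
cache = {}

def slow_rdbms_query(sql):
    # Made-up stand-in for the real database round trip.
    time.sleep(0.01)
    return "rows for: " + sql

def cached_query(sql):
    # Cache-aside: return a previous result if we have one,
    # otherwise hit the database and remember the answer.
    if sql in cache:
        return cache[sql]
    result = slow_rdbms_query(sql)
    cache[sql] = result
    return result

first = cached_query("SELECT * FROM users")   # hits the "database"
second = cached_query("SELECT * FROM users")  # served from the cache
```

In a real deployment you'd also want an expiry (TTL) on each entry so stale results eventually get refreshed, which the sketch above skips.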
We use base machine images running a LAMP stack and had to install Memcached separately in order to make use of it. That's not much of an issue TBH, and once it's up and running it's a dream to manage in that there are no options: you can't do anything other than empty it if you need to. I guess it can be quite memory-intensive as it runs in RAM (and virtual RAM in our case), but we've not noticed anything as yet, so I guess it's all fine and dandy.
I was impressed anyway ;-)
Amazon's ElastiCache, if you haven't already watched the video above, acts in exactly the same way except that the cache lives in their cloud. It takes about 5 mins to set one up, though YMMV, and once it's done you simply point PHP's Memcache object at its location in the cloud rather than your localhost. The testing that I did showed that the difference in speed wasn't statistically significant, but where this comes in handy is the sharing of the cache between machines. Instead of setting up some elaborate system for keeping the same key-value pairs in sync across a number of machines, they can all share the same one. That's very cool!
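The shared-cache point is worth a quick sketch. Again this is an illustration, not real code: `CacheClient` and the single shared dict are made up, Python stands in for PHP, and the dict stands in for the one ElastiCache node that every app server's client would be pointed at instead of localhost.

```python
class CacheClient:
    """Made-up stand-in for a memcached client on one app server."""

    def __init__(self, backend):
        # backend represents the cache node this client connects to
        self.backend = backend

    def set(self, key, value):
        self.backend[key] = value

    def get(self, key):
        return self.backend.get(key)

# One shared node in the cloud, rather than a separate cache per machine.
elasticache_node = {}

server_a = CacheClient(elasticache_node)
server_b = CacheClient(elasticache_node)

server_a.set("user:42", "Alice")
# Server B sees the value server A cached - no syncing scheme required.
value = server_b.get("user:42")
```

With per-machine caches, server B would miss here and re-run the query; pointing every machine at the same endpoint means one warm cache serves them all.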