I got about 12k people through to my blog post "Thanks Louis C.K now here's my dad" the other day, and then about 4,000 of them clicked through to nickdooley.com (which is on the same server).
I had already increased my MaxClients setting to 120 a few weeks earlier, when about the same number came through to my "Your templating engine sucks and everything you've ever written is spaghetti code" post. But I forgot that with people downloading those mp3 files the connections would be held open much longer, so at one point I had to put MaxClients up again to 220 and restart Apache (which I'm sure kicked a bunch of people off, some of whom would have given up trying to load the site).
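For reference, the relevant bit of a prefork-era httpd.conf looks something like this; everything other than the MaxClients value is an illustrative default, not my actual config:

```apache
# /etc/httpd/conf/httpd.conf -- prefork MPM section (Apache 2.2-era)
<IfModule prefork.c>
    StartServers          8
    MinSpareServers       5
    MaxSpareServers      20
    ServerLimit         220   # must be >= MaxClients
    MaxClients          220
    MaxRequestsPerChild 4000
</IfModule>
```

Worth noting: an `apachectl graceful` reload applies a new MaxClients without dropping in-flight downloads, as long as ServerLimit doesn't also have to increase (ServerLimit is only read at full startup).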
With MaxClients set at 220, the site served pages quickly and reliably for the rest of the day, while roughly 5,500 people downloaded my dad's music files.
I'm running the site on Apache 2.x on a CentOS VM with PHP 5.2.x. We don't serve static assets with nginx, so everything comes out of the same Apache instance, and the VM has 2GB of RAM (yes, I know I really need to get around to sorting that out).
Right now we have a crappy cache in Decal CMS (which we make, and which runs all our sites): pages are still served through PHP, but from cached content.
At some point we're going to move to Varnish in front of nginx for static assets, so that the only requests reaching Apache are page generation after a site publish and requests where a cookie is present (i.e. when someone is actually editing their site). For now, though, we can manage with the shitty hack we've got in place to serve cached documents.
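The cookie-based split described above could be sketched in Varnish along these lines (a hypothetical VCL 4.0 fragment; the backend name and port are assumptions):

```vcl
vcl 4.0;

backend apache {
    .host = "127.0.0.1";
    .port = "8080";   # Apache moved off :80, behind Varnish
}

sub vcl_recv {
    # A cookie means someone is actually editing their site:
    # let those requests through to Apache uncached.
    if (req.http.Cookie) {
        return (pass);
    }
    # Published pages are anonymous and cacheable.
    unset req.http.Cookie;
}
```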
Setting your MaxClients to 220 on that setup is a terrible idea. You will probably cause the machine to start swapping, as there is insufficient memory for that many PHP processes, and it will grind to a halt. You should calculate your MaxClients based on the amount of load the machine can actually handle, not the amount you wish it could handle :-)
A conservative estimate of 20MB per PHP process would already put the requirements for 220 of them at over 4GB (twice what you have), and that's not allowing anything for the OS and whatever else is running on the machine. It's not an exact science figuring out the appropriate MaxClients, but you should find out how much the rest of the machine needs, look up the average Apache process size (ps), divide the available memory by that, and then reduce it a little. Use this as your starting point for MaxClients. Keep an eye on the free memory for a while (vmstat) and gradually raise the limit until you reach a comfortable working level for the setting.
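As a sketch of that arithmetic (the numbers here are illustrative, not measurements from the machine in question):

```shell
# Average resident size of the running httpd workers, in MB:
#   ps -C httpd -o rss= | awk '{s+=$1; n++} END {print s/n/1024}'
total_mb=2048      # RAM on the VM
reserved_mb=512    # guess for the OS and everything else
per_proc_mb=25     # assumed average httpd+PHP process size
maxclients=$(( (total_mb - reserved_mb) / per_proc_mb ))
echo "MaxClients starting point: $maxclients"
```

Then watch `vmstat` under real load and nudge the value up while free memory holds.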
Yeah - what's weird is that it didn't start swapping. I haven't really looked into it, but maybe because so many of the processes were downloading those MP3 files they didn't take up as much memory (although that sounds wrong; I'm pretty sure that as far as Apache is concerned, a process is a process regardless of what's being downloaded).
At any rate it worked fine, and it was maxed out at one point, so hell knows why. I increased it gradually while watching the free memory etc., but it was never exhausted, even though `ps ax | grep httpd` showed circa 200 processes.
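One plausible explanation: summing per-process RSS overstates Apache's real footprint, because prefork children share the parent's code and copy-on-write pages, and each shared page is counted again in every child's RSS. A quick way to see the naive numbers (the awk pipeline is the illustrative part):

```shell
# Count httpd workers and their average resident size.
# RSS includes pages shared between children, so a naive
# "processes x avg RSS" estimate can sit well above real usage.
ps -C httpd -o rss= \
  | awk '{n++; s+=$1} END {if (n) printf "%d procs, avg %.1f MB\n", n, s/n/1024}'
```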
I've put it back down to something safer now ;) When I have some spare time I'm going to try to figure out if/why it didn't appear to run out of memory when it looked like it should have.
But then again, when I have some spare time I really just want to set up a better hosting environment that doesn't rely so much on Apache.
I never got 10K from Hacker News alone, but on my blog I got 10K from HN combined with Reddit, with the links submitted at approximately the same time, for three articles I wrote.
My website is static (built with Jekyll), with no MySQL or PHP to speak of; right now it's hosted on an AWS micro instance and served by nginx. Before that it was on Heroku's free plan, with static assets offloaded to GAE. For free, and it didn't even blink.
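A static Jekyll build needs almost nothing from nginx; a minimal server block is roughly this (the domain and paths are placeholders):

```nginx
server {
    listen 80;
    server_name example.com;     # placeholder
    root /var/www/blog/_site;    # Jekyll's default output directory
    index index.html;

    location / {
        try_files $uri $uri/ =404;
        expires 1h;              # static builds are safe to cache
    }
}
```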
Seeing people talk about caching, load balancing, clusters, beefy servers and so on just makes me think how extremely awful and bloated WordPress is.
I don't think it's fair to conclude that WordPress is awful and bloated based on it performing worse than your setup. Static files, a CDN, and no dynamic scripting are the ultimate optimization. WordPress compares unfavourably because anything involving dynamic features would compare unfavourably.