So if you haven't noticed yet, we've been down for a chunk of today and last night. In response, I updated the site with code that I felt 99% sure would resolve the memory issues we've been having. After only 2 hours, it's apparent that my assumption was incorrect. The culprit is Google, but there's no reason the site shouldn't handle their crawlers just fine -- they've been crawling us for months without any problems. Now, however, they've got 10-30 crawlers running almost non-stop, and it's as if they've completely ignored the crawl-delay directives in my robots.txt.
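For anyone curious, the directive in question looks roughly like this (the user-agents and delay values below are illustrative, not the exact contents of our file):

    # Illustrative robots.txt snippet -- not our exact file
    User-agent: Googlebot
    Crawl-delay: 10    # ask the crawler to wait ~10 seconds between requests

    User-agent: *
    Crawl-delay: 5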
I originally thought the main issue was these bots creating "ghost" sessions. That now appears not to be the case, since we're not even using session management in the traditional sense.
I'll keep ya updated.
-Dain