Performance issues under load, unsure of what to do next

Hi all,

I am at a loss as to where to look next regarding some performance issues I am having.

I have a Yii-based application with about 20 inter-related tables. It’s a fairly straightforward application with user tables, item tables, locations, profiles, etc. All the tables have relations to other tables.

The site is very fast: the DB queries all run extremely quickly and pages load fast when only a handful of users are on the site. I’ve started doing tests with openload to simulate load with 20, 40 and 50 concurrent connections to my main index page.

My main index page is heavily cached and only performs 6 (non-cached) database calls on load, to retrieve a list of "items" and their relations (location, user, category). Once the site hits about 20 concurrent connections, it starts to really hammer the CPU on the server.

Once I get to 40 and 50 connections the server starts to go down and Apache becomes unresponsive.

At the moment I am on a Micro instance on Amazon Web Services. I ran a test after upgrading it to a Medium instance, and while I could sustain a higher load, I would have expected to handle more connections than that before maxing out.

As a bit of background, it’s a single Debian Linux server running MySQL and Apache; nothing else really worth mentioning on the server.

I’ve tuned it in the following ways:

  • Set up APC opcode caching for the PHP files

  • Disabled debug mode

  • Switched to yiilite.php

  • Set up APC caching for some of the database queries (including schema caching)
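For reference, the query/schema caching in the last point can be wired up in a Yii 1.x config roughly like this (a sketch only: `CApcCache` and `schemaCachingDuration` are real Yii 1.x names, but the DSN and durations are placeholders):

```php
<?php
// protected/config/main.php (sketch, Yii 1.x)
return array(
    'components' => array(
        // APC-backed cache component used for query and fragment caching
        'cache' => array(
            'class' => 'CApcCache',
        ),
        'db' => array(
            'connectionString' => 'mysql:host=localhost;dbname=app', // placeholder DSN
            // cache table metadata so Yii doesn't re-read the schema on every request
            'schemaCachingDuration' => 3600,
        ),
    ),
);
```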

Is there anything else I should do? Obviously the load tool I am running is hammering the PHP scripts, which isn’t really reflective of real user activity, but it is concerning that someone could bring my site to a grinding halt so easily.

I thought about a firewall solution to limit how often users can access the site, but that’s more of a workaround than a fix for the root cause. What sort of limits should I expect with respect to maximum simultaneous user connections?

I appreciate any feedback or comments that anyone has… :)


  1. Have you configured Apache to handle that many parallel connections?

  2. Having nginx (at least to serve static content) will help.

  3. Try disabling APC stat checks (apc.stat = 0) and rebuilding the cache on each deploy.

  4. Disable Yii DB profiling and param logging if these are enabled.

  5. yiilite isn’t helpful in all cases. You should test both ways; sometimes it’s better without yiilite.
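Point 4 maps to two flags on Yii 1.x’s `CDbConnection`; a minimal sketch (the DSN is a placeholder):

```php
<?php
// protected/config/main.php (sketch) — turn off query profiling and
// parameter logging, both of which add overhead on every query
return array(
    'components' => array(
        'db' => array(
            'connectionString' => 'mysql:host=localhost;dbname=app', // placeholder DSN
            'enableProfiling' => false,
            'enableParamLogging' => false,
        ),
    ),
);
```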

A Medium instance has 3.75 GB of RAM and a single virtual CPU core, right?

Of those 3.75 GB, your system already uses about 1 GB at idle, so you are left with roughly 2.75 GB.

A single request to Yii can vary from 10 to 60 MB (at least as far as I saw in some of my pages that were doing lots of queries). In any case, you can run a request and check the memory usage (the Yii debug toolbar is helpful here).

I assume you are using InnoDB as your MySQL storage engine? If so, InnoDB needs fine-tuning in order to work as it should, and it isn’t easy to find the right settings; you need many tests for each InnoDB variable to get the most out of it.
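As a starting point only (every value needs load-testing on your workload; these numbers assume the ~3.75 GB Medium instance shared with Apache/PHP), a my.cnf fragment might look like:

```ini
# /etc/mysql/my.cnf — illustrative starting values, not tuned recommendations
[mysqld]
innodb_buffer_pool_size        = 1G      # biggest single win: keep hot data in RAM
innodb_log_file_size           = 128M    # larger redo log smooths write bursts
innodb_flush_log_at_trx_commit = 2       # relax durability slightly for throughput
innodb_flush_method            = O_DIRECT # avoid double-buffering via the OS cache
```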

I am sure you also have slow queries (every application has some), so fix those too; if not all of them, at least a fair amount.

Also, maybe install another open-source CMS, like Drupal/WordPress/Magento (Magento uses InnoDB, so it is worth installing and testing), run tests on those too, and compare the results to see whether the fault is in your Yii app or in the server itself.

Also, as already mentioned, besides fine-tuning the MySQL server, the Apache server needs it too.

Another thing: when using PHP with Apache, PHP should run as an Apache module, not FastCGI.

This one is a bit tedious, but installing the stock MySQL/PHP/Apache from the distribution repository (Debian apt-get) isn’t the same as downloading, compiling and installing their latest versions.

Have you enabled the MySQL slow query log? Are you sure that all your queries run quickly enough?

Sometimes it is something silly like missing an index on a table that can make a big difference.
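For example, you can turn on the slow query log at runtime and use EXPLAIN to spot full table scans; the table and column names below are made up for illustration:

```sql
-- Log anything slower than half a second (dynamic since MySQL 5.1)
SET GLOBAL slow_query_log = 'ON';
SET GLOBAL long_query_time = 0.5;

-- Check how a suspect query is executed
EXPLAIN SELECT * FROM item WHERE location_id = 42;

-- If EXPLAIN shows type = ALL (a full table scan), add the missing index
CREATE INDEX idx_item_location ON item (location_id);
```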

Hi all,

I’ve since run some more tests. I have moved the database server to another server and confirmed that the load on that is negligible. It is definitely focused around the Apache/PHP component of the stack.

The queries are efficient, and they’re only doing an inner join to retrieve some relations. I did try the slow query log and nothing was output.

I have also disabled apc.stat and tuned as per the recommendations on the Yii site. Profiling and param logging are also disabled. PHP is running as a module, not CGI.

I am only testing the application with my main controller, and this uses about 7 MB per request. Memory usage seems fine, so I know the server can handle the number of connections from a memory point of view, but the CPU gets hammered.

Interestingly enough, I installed the Yii blog demo on the server using MySQL, ran the same tests against it, and saw very similar results to my own application. Are there any performance stats that indicate what sort of maximum connections the framework should support under certain configurations?



Maybe your server is just too weak for 50 simultaneous connections…

you can see this for example:

1.5 million visitors per month and no performance issues

Apache is part of the issue, because it starts a new process (or thread, depending on the MPM) for each concurrent connection. That isn’t a problem until the concurrent connections start stacking up.
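If you stay on Apache, the prefork MPM at least needs its limits matched to your RAM, since with mod_php each child carries a full PHP interpreter. A sketch (directive names are standard Apache 2.2 prefork ones; the numbers are illustrative, derived from free RAM divided by per-child memory):

```apacheconf
# /etc/apache2/apache2.conf — illustrative prefork MPM limits
<IfModule mpm_prefork_module>
    StartServers          5
    MinSpareServers       5
    MaxSpareServers      10
    MaxClients           40     # e.g. ~2.5 GB free / ~60 MB per child
    MaxRequestsPerChild 1000    # recycle children to limit memory creep
</IfModule>
```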

I’d suggest looking at an nginx + php-fpm stack. nginx is a beast under high concurrent connections; amazing performance, light years ahead of Apache.

Here’s a link to an {nginx php-fpm mysql} bash installer for setting up a quick test VPS install

I would also suggest trying nginx, especially on a Micro instance where RAM is low. I suspect that with Apache you will run into swap easily.

Regarding your concern that "someone can bring my site to a grinding halt so easily": with nginx you can define zone limits (maximum number of concurrent connections, maximum number of requests per minute) and attach them to locations. For example, you could define a zone which allows up to 2 connections per IP and attach it to the location which handles PHP requests. A browser may still establish several connections to retrieve the assets (CSS, images, etc.), but no more than 2 connections per IP will be allowed to run PHP scripts. You can also define a zone which allows up to x requests per minute and attach it to the PHP location as well.
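In nginx config terms this looks roughly like the following (zone names and limits are illustrative; `limit_conn_zone` is the modern directive name, older nginx versions used `limit_zone`):

```nginx
# http block: one zone keyed on client IP for connections, one for request rate
limit_conn_zone $binary_remote_addr zone=phpconn:10m;
limit_req_zone  $binary_remote_addr zone=phpreq:10m rate=60r/m;

server {
    # static assets stay unlimited so CSS/images can load in parallel

    location ~ \.php$ {
        limit_conn phpconn 2;         # at most 2 concurrent PHP requests per IP
        limit_req  phpreq burst=10;   # ~60 requests/minute per IP, small burst
        # ... fastcgi_pass to php-fpm goes here ...
    }
}
```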

If a rate limit is reached, nginx blocks the request with an error page and writes an entry to the error.log. With fail2ban you could automatically block those logged IPs (using the iptables firewall) for a specified amount of time.
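A fail2ban jail for that might look like this sketch (the filter name `nginx-limit-req` is assumed; you need a matching filter whose regex fits your error.log format):

```ini
# /etc/fail2ban/jail.local — illustrative jail for nginx rate-limit hits
[nginx-limit-req]
enabled  = true
filter   = nginx-limit-req
logpath  = /var/log/nginx/error.log
findtime = 600     ; look at the last 10 minutes
maxretry = 10      ; ban after 10 limit violations
bantime  = 3600    ; ban for one hour
```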

Probably possible with Apache as well somehow.