Out of memory error when fetching 3000 records of data

I’m trying to index a table that contains more than 3000 records with Zend Lucene in a console command.

Here is my code:

    public function run($args) {
        ini_set('memory_limit', '60M');

        require_once 'Zend/Search/Lucene.php';

        Zend_Search_Lucene_Analysis_Analyzer::setDefault(
            new Zend_Search_Lucene_Analysis_Analyzer_Common_Utf8_CaseInsensitive()
        );

        $searchPath = Yii::app()->runtimePath.'/search';
        $index = Zend_Search_Lucene::create($searchPath);

        /**
         * fetch data to index
         */
        $sql = 'select id, title, content, succinct, create_time from `news` where status="published"';
        $query = Yii::app()->db->createCommand($sql);
        $result = $query->queryAll();

        foreach ($result as $data) {
            $doc = new Zend_Search_Lucene_Document();
            $doc->addField(Zend_Search_Lucene_Field::Text('title',
                CHtml::encode($data['title']), 'UTF-8'));
            $doc->addField(Zend_Search_Lucene_Field::Text('succinct',
                strip_tags($data['succinct']), 'UTF-8'));
            $doc->addField(Zend_Search_Lucene_Field::UnStored('content',
                strip_tags($data['content']), 'UTF-8'));
            $doc->addField(Zend_Search_Lucene_Field::UnIndexed('id',
                $data['id'], 'UTF-8'));
            $doc->addField(Zend_Search_Lucene_Field::UnIndexed('create_time',
                $data['create_time'], 'UTF-8'));
            $index->addDocument($doc);
        }

        $index->commit();
    }

But I get this error:

Out of memory (Needed 228480 bytes)

exception 'CDbException' with message 'CDbCommand failed to execute the SQL statement: SQLSTATE[HY000]: General error: 2008 MySQL client ran out of memory. The SQL statement executed was: select id, title, content, succinct, create_time from news where status="published"' in /home/user/public_html/v2/lib/framework/db/CDbCommand.php:528

I don’t think this is a Yii or even a PHP issue: error 2008 means the MySQL *client library* ran out of memory while buffering the full result set on the client side. This can happen well below millions of rows when the rows themselves are large, as they are here with full article bodies in `content`. I suppose you would get the same error fetching the whole result set with another MySQL client like MySQL Workbench, or even from the console.
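If client-side buffering is the culprit, one workaround is to switch the connection to unbuffered queries so rows stream from the server as you iterate instead of being held in client memory. A sketch, assuming Yii 1.x on top of the pdo_mysql driver (`$sql` is the query from the question):

```php
<?php
// Sketch: disable client-side result buffering (assumes Yii 1.x + pdo_mysql).
// With buffering off, rows stream from the server as you iterate, so the
// client library no longer holds the whole result set in memory.
$pdo = Yii::app()->db->getPdoInstance();
$pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);

// Iterate with a data reader instead of queryAll(), which would still
// build the full array in PHP memory anyway:
$reader = Yii::app()->db->createCommand($sql)->query();
foreach ($reader as $data) {
    // build and add one Zend_Search_Lucene_Document per row here
}
```

One caveat: while an unbuffered result set is open, you cannot issue other queries on the same connection, so finish the loop before running anything else.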


Should this be inside the loop?

I suspect the creation of the Lucene doc objects is using a lot of memory.

During a recent project I used a Yii extension to handle Lucene objects, and it had no problem indexing 30,000+ documents.

btw.: I would not query all rows at once but try a batched approach. Instead of loading all documents (and the result set will only grow over time), you could load 500 at a time, index them with Lucene, then fetch the next 500. That way MySQL doesn’t have to buffer all the documents with all the content in them (I assume the content can be a very long text).
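A minimal sketch of that batching loop. The `indexInBatches` helper and its two callbacks are hypothetical names, not part of Yii or Zend; the commented query shows how a LIMIT/OFFSET fetch might plug in:

```php
<?php
// Hypothetical helper: pull rows in fixed-size batches and hand each row to
// a callback, so only one batch is ever held in memory at a time.
function indexInBatches(callable $fetchBatch, callable $handleRow, $batchSize = 500)
{
    $offset = 0;
    $total  = 0;
    do {
        // $fetchBatch should run the SQL with LIMIT $batchSize OFFSET $offset,
        // e.g. in Yii 1.x (an assumption, adapt to your command):
        //   Yii::app()->db->createCommand($sql . ' limit :lim offset :off')
        //       ->queryAll(true, array(':lim' => $batchSize, ':off' => $offset));
        $rows = $fetchBatch($batchSize, $offset);
        foreach ($rows as $row) {
            $handleRow($row); // build and add the Zend_Search_Lucene_Document here
            $total++;
        }
        $offset += $batchSize;
    } while (count($rows) === $batchSize); // a short (or empty) batch means we're done
    return $total;
}
```

Calling `$index->commit()` once per batch instead of once at the very end would likely also keep peak memory in the indexer lower.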