beginTransaction() Fails

I suspect the issue is that the db connection thinks it is dealing with the MyISAM engine, even though each of the tables is InnoDB. The following line fails: $trans = $model->dbConnection->beginTransaction();

Any suggestions?

EDIT: I also tried $trans = Yii::app()->db->beginTransaction(); - same failure
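
In case it's useful, here is how I plan to double-check which engine each table is actually using (just a sketch, assuming the default Yii db connection against a MySQL server's information_schema):

$rows = Yii::app()->db->createCommand(
    "SELECT TABLE_NAME, ENGINE FROM information_schema.TABLES WHERE TABLE_SCHEMA = DATABASE()"
)->queryAll();

foreach ($rows as $row) {
    echo $row['TABLE_NAME'] . ' => ' . $row['ENGINE'] . "\n"; // I expect InnoDB for every table
}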

I was using Yii::app()->db->beginTransaction();


    $dao = Yii::app()->db;
    $transaction = $dao->beginTransaction();

    try {
        $cmd = $dao->createCommand($query1);
        $cmd->execute();

        $cmd2 = $dao->createCommand($query2);
        $cmd2->execute();

        $transaction->commit();
        return true;
    } catch (Exception $e) { // If any query failed, roll back.
        $transaction->rollBack();
        return false;
    }

Thank you! That’s the code structure I was using, and I did try Yii::app()->db->beginTransaction() as well. No luck.

What do you mean by "it fails"? Is there an error?

Thanks everyone. I may have something else going awry. I’m new to Yii but not to PHP or MySQL. This test case does hit the exception:

$trans = Yii::app()->db->beginTransaction();
$trans->commit();
throw new CException("Test case stops here");

I found I need to redesign that code sequence in any event. Forty million inserts using Active Record just plain takes too long! It’s a batch process importing comma-separated-value data, which means I ought to be able to run a lot faster with PDO, building the SQL commands once and binding new data for each import row.

(EDIT: LOAD DATA won’t work for me because I’m doing multiple inserts per CSV row. The 200 or so columns exceed the 65,535-byte maximum row length, so I’m spreading the result across various normalized tables. Brute-force fgetcsv() seems the best approach.)
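
Roughly the shape I have in mind for the import loop (just a sketch; the file path, table, and column names are placeholders, and it assumes each CSV row fans out into key/value rows):

$pdo = Yii::app()->db->getPdoInstance();
$pdo->beginTransaction();

// Prepare the statement once, then bind fresh values for every CSV field.
$stmt = $pdo->prepare(
    'INSERT INTO import_value (import_id, col_name, col_value) VALUES (:id, :name, :value)'
);

$fh = fopen('/path/to/import.csv', 'r');
$header = fgetcsv($fh); // first row holds the column names

while (($fields = fgetcsv($fh)) !== false) {
    $importId = $fields[0]; // placeholder: assume the first field identifies the row
    for ($i = 1; $i < count($fields); $i++) {
        $stmt->execute(array(
            ':id'    => $importId,
            ':name'  => $header[$i],
            ':value' => $fields[$i],
        ));
    }
}

fclose($fh);
$pdo->commit();

Committing once at the end (or in large chunks) rather than per row should also help at this volume.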

I sincerely appreciate the rapid responses here. I won’t be shy next time I get stuck!

Ed Barnard

Cannon Falls MN

Is that a pure import process? If so, MySQL understands CSV as a storage format. I wonder if that wouldn’t be a lot faster …
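
Something like this, for example (a sketch; CSV-engine tables can’t have indexes and every column must be NOT NULL):

Yii::app()->db->createCommand("
    CREATE TABLE import_raw (
        col_a VARCHAR(64) NOT NULL,
        col_b VARCHAR(64) NOT NULL
    ) ENGINE=CSV
")->execute();

The table’s data then lives on disk as a plain text CSV file, which is what makes me wonder about the speed.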

I tried building a table 200 or so columns wide (that took a while!), but MySQL choked, complaining that the total row width exceeds 65,535 bytes. LOAD DATA runs extremely quickly, but things don’t work smoothly with this many columns, and I would still need a second phase to move the imported data into normalized tables. A two-hundred-column table sounds like awfully bad data design to me anyway!

I’m also finding errors in the import data, which need to be accounted for, so line-by-line brute force is looking like the best solution in terms of reliability. It’s also looking like the specific columns may change over time, which makes the 200-column table sound even more unwieldy.

For what it’s worth, my changed data design has a main table with very few columns, about 200,000 rows per import, and then a second table (one-to-many relationship) with key/value pairs representing each of the columns for that row, about 41,000,000 rows per import.
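
In rough outline, it looks something like this (a sketch only; the real table and column names differ):

Yii::app()->db->createCommand("
    CREATE TABLE import_row (
        id INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
        import_date DATE NOT NULL,
        source_line INT UNSIGNED NOT NULL
    ) ENGINE=InnoDB
")->execute();

Yii::app()->db->createCommand("
    CREATE TABLE import_value (
        import_id INT UNSIGNED NOT NULL, -- one-to-many back to import_row.id
        col_name  VARCHAR(64) NOT NULL,  -- the original CSV column
        col_value VARCHAR(255) NOT NULL, -- its value for that row
        KEY idx_import_id (import_id)
    ) ENGINE=InnoDB
")->execute();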

The imports are an overnight batch process. From that I generate various reports. The report generation is also a batch process. For anything we might want to look at interactively (i.e., Yii-based web pages), I can wrangle the data into reporting tables. In effect, any "view" of interest gets built as a separate table.

That’s the cost of storing everything as key/value pairs. It’s hard to do joins, etc.! But only a very few data items are of interest for the summaries, so it’s easy enough to extract that data into tables designed for that purpose.
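
For example, pulling one item of interest into a reporting table is roughly this (a sketch; the report table and the 'amount' key are made up):

Yii::app()->db->createCommand("
    INSERT INTO report_amount_by_row (import_id, total_amount)
    SELECT import_id, SUM(CAST(col_value AS DECIMAL(12,2)))
      FROM import_value
     WHERE col_name = 'amount'
     GROUP BY import_id
")->execute();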

That’s the luxury of doing the batch process overnight. I can wrangle the data once, and read-only interaction during the day can be fast and responsive.

For what it’s worth, I’ve been driving my development via PHPUnit. The batch functionality is in components/, and can be called by PHPUnit directly. The actual commands (./yiic mystuff mystuff) can be called in PHPUnit via exec(), allowing PHPUnit to check the exit code but not much else.
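
A test around the console command ends up looking roughly like this (a sketch; the class and method names are made up, and all it can really assert is the exit code):

class MyStuffCommandTest extends PHPUnit_Framework_TestCase
{
    public function testCommandExitsCleanly()
    {
        $output = array();
        $exitCode = null;
        exec('./yiic mystuff mystuff 2>&1', $output, $exitCode);

        // A non-zero exit code fails the test; the captured output helps with debugging.
        $this->assertSame(0, $exitCode, implode("\n", $output));
    }
}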