Is there an option to force model->save() to use SQL LIMIT 1 when it updates a record in the database? It's clear how this can be done with other methods or with a command, but I'm wondering whether there is a way to use save() and force it to add the LIMIT clause.
I see that no one has replied, so I will try to describe the problem in more detail. I have a database table with more than 6,500,000 records, and soon it will have more than 10,000,000. I only perform searches and updates on that table. It stores a kind of ticket: a user can try to use a ticket, so I search to check whether it exists, and that works well. The other operation I run on this table is an update, where I set a flag field from '0' to '1'. The code that does this is something like:
$model->used = '1';
$model->save();
And here I have a problem. Looking at the mysql-slow-queries log file, I see that this update is one of the slow queries, and the logged query has no LIMIT 1 clause. So I started wondering whether there is some way to push the save() function to add that clause when it performs an update; otherwise I will need to make it custom.
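Just to show what I mean by "make it custom" — a rough sketch only (this is Yii 1.x-style code, and the ticket table name is a placeholder for my real one):

// Raw SQL through the connection, so the LIMIT clause can be written by hand.
// "ticket" and the cid column are placeholders, not my real schema.
$affected = Yii::app()->db->createCommand(
    "UPDATE ticket SET used = '1' WHERE cid = :cid LIMIT 1"
)->execute(array(':cid' => $model->cid));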
From my point of view there is not much sense in omitting LIMIT 1 from the update for a single model, because (if I'm right) it usually affects exactly one record (in my case, for sure)…
If someone has an idea how to optimize this, please write it here…
I doubt LIMIT 1 will solve this issue. I would remove the framework from the equation, grab the generated SQL, and try it in an SQL console directly. I imagine it's already using the primary key index to find the record in the first place, and as such it might not be possible to speed it up while all that data is in one large table.
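For example, in the MySQL console (table and column names here are guesses — substitute your own generated SQL):

-- Check that the lookup really uses the primary key index.
EXPLAIN SELECT * FROM ticket WHERE cid = '5269841';

-- Then time both variants of the update directly, outside the framework.
UPDATE ticket SET used = '1' WHERE cid = '5269841';
UPDATE ticket SET used = '1' WHERE cid = '5269841' LIMIT 1;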
Databases optimize queries internally as much as possible. I’m pretty sure all of them apply LIMIT 1 to the query automatically when you select/update by the primary key.
@jacmoe - I think there is a big difference between using LIMIT in an UPDATE query and not using it. It doesn't matter much on a small amount of data, but on a big table it can make a really big difference. When you run UPDATE without LIMIT, the DBMS tries to find every record that matches the WHERE clause, and on a big table like mine (6,000,000 rows) it has to go through all of them to be sure they match or don't match. With LIMIT it stops as soon as it finds the number of rows specified.
@Say_Ten - I ran some tests and there is a big difference. For example, without LIMIT the query takes about 0.0116 sec, but the same query with LIMIT added takes 0.0007 sec.
I know these don't look like big numbers, but combined with the rest of the queries it can become a problem. I would also like to know how to control this, because I usually use LIMIT everywhere I know I need exactly that many rows; each one is a small speedup, but added together they bring a good benefit to overall site speed.
UPDATE table SET cid='5269841', name='d441402f548a3ac377a2d161b6f0d48a', used='1' WHERE table.cid='5269841'
This query is taken straight from the log, and in this case it took almost 1.5 seconds to execute. That may be because of another, really bad query we had on the site, which made everything slow, but since this query shows up in the slow query log I would like to optimize it somehow…
So all I need is some common way to set LIMIT before calling $model->save();. If there is nothing like that, I will change this part and use the other way, with a command and conditions, but I would like to stay with save() because it is a really clear and self-descriptive approach…
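For anyone who hits the same thing, the "command and conditions" route would be roughly this (a sketch in Yii 1.x style; Ticket is a placeholder model class, and you should check in the query log that the criteria's limit really ends up in the generated UPDATE):

// Sketch only: a criteria with limit = 1, applied through updateAll().
// Ticket is a hypothetical model class; cid is the key column from my log.
$criteria = new CDbCriteria(array(
    'condition' => 'cid = :cid',
    'params'    => array(':cid' => $cid),
    'limit'     => 1,
));
Ticket::model()->updateAll(array('used' => '1'), $criteria);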