Searching DB query on relational records

I’m having a bit of a problem with relational queries against the database. I’ll try to be brief, so some of you might have the patience to read; I can clarify further if there are questions. :)

Problem

So, I’m searching for records through a $model->with('relatedThing')->findAll() scheme.

Now then, if I want to search for values contained in the related model through the relatedThing relation… like, for instance, relatedThing.name, it only works if:

  1. relatedThing is a relation to one (BELONGS_TO or HAS_ONE).

This way the ActiveFinder will join relatedThing’s table with the query, and all is well.

  2. I do the with('relatedThing')->together() trick, and join everything into one query (both variants are sketched below).
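For the record, here’s roughly what those two variants look like. A sketch only: the Post model and the column names are made up, and I’m assuming the relation name doubles as the table alias in the condition:

```php
// Sketch only: 'Post' and the column names are hypothetical; 'relatedThing'
// is the relation from above, assumed to serve as the joined table's alias.
$criteria = new CDbCriteria();
$criteria->condition = 'relatedThing.name = :name';
$criteria->params = array(':name' => 'foo');

// Variant 1: relatedThing is BELONGS_TO or HAS_ONE, so the finder joins its
// table into the main query and the condition works as-is:
$records = Post::model()->with('relatedThing')->findAll($criteria);

// Variant 2: relatedThing is HAS_MANY, but together() forces a single
// joined query, so the condition can still see relatedThing.name:
$records = Post::model()->with('relatedThing')->together()->findAll($criteria);
```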

But this leaves me cold as well: I need to list one row per record found, while searching across all the related records. :)

By doing a together() I’m left with a result set of who-knows-how-many rows from the DB, meaning, for example, that I have no way of predicting where to cut off for fixed-size pagination.

Also, with a few with()-joins the result size might get out of hand without a limit (and of course having a limit means I’m not guaranteed to get all the records I searched for).

Heeelp please

Has anyone done this? Have you any suggestions how to go about doing it? Thanks.

Use findBySql if the thing gets too complex or the overhead of joining every unneeded row is too large.
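For example, something along these lines (a sketch with made-up table and column names):

```php
// findAllBySql() runs the hand-written join and populates Post models
// from the main table's columns of each returned row (names are made up):
$sql = 'SELECT p.* FROM post p
        LEFT JOIN related_thing r ON r.post_id = p.id
        WHERE r.name = :name';
$records = Post::model()->findAllBySql($sql, array(':name' => 'foo'));
```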

The thing is, even with findBySql I’m faced with the same problem.

I do my query and it results in a large number of rows, be it through a relational find or through findBySql. It might be, say, 300 rows for 100 records of a table joined with some other tables (meaning that among those 300 rows are 100 distinct records of the original table I’m searching for, plus duplicates with differing values from the LEFT JOINed tables).

These rows are then used to populate the AR models of those 100 distinct records.

But to show those 100 records, how do I set a DB limit? Perhaps I’d have to do multiple queries: first to find out how many rows I need, then to fetch those rows… feels somehow wrong. :expressionless:

If I set the DB limit to 100, then 200 of the 300 result rows are left out, and thus some of the 100 desired records are never retrieved.

I’ll need some DB limit, since joining a lot of tables might produce a really, really big result set and a huge load time. But I’d like my DB limit to give predictable results: not cutting off before all the desired records are there, yet not fetching the whole database at once.

I’m leaning towards making some pre-emptive queries to map out what’s going to be fetched, before the actual query that populates the records in the end.
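Something like this two-step sketch, maybe (same made-up schema as above): first select only the distinct primary keys with the LIMIT applied, then let AR eager-load the relations for exactly those keys:

```php
// Step 1: fetch only the distinct primary keys of the matching records,
// so that LIMIT/OFFSET count records rather than joined rows:
$command = Yii::app()->db->createCommand(
    'SELECT DISTINCT p.id FROM post p
     LEFT JOIN related_thing r ON r.post_id = p.id
     WHERE r.name = :name
     ORDER BY p.id
     LIMIT 100 OFFSET 0'
);
$command->bindValue(':name', 'foo');
$ids = $command->queryColumn();

// Step 2: eager-load the relation for exactly those records:
$records = Post::model()->with('relatedThing')->findAllByPk($ids);
```

That’s two round trips per page, but at least the LIMIT now cuts on records instead of rows.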

You can provide LIMIT and OFFSET in your SQL statement.

To avoid duplicate rows you can use GROUP BY within the statement.
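For instance (same assumed schema as earlier in the thread; note that selecting p.* while grouping by p.id is MySQL-style behavior):

```php
// GROUP BY collapses the joined duplicates to one row per record, so the
// LIMIT counts records; note the joined relatedThing values are lost here,
// so the relation would need a separate fetch if it must be displayed:
$sql = 'SELECT p.* FROM post p
        LEFT JOIN related_thing r ON r.post_id = p.id
        WHERE r.name = :name
        GROUP BY p.id
        LIMIT 100 OFFSET 0';
$records = Post::model()->findAllBySql($sql, array(':name' => 'foo'));
```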