What to do when you need to execute a large number of similar, complex SQL queries


Let's imagine a scenario:

We have a hypothetical database from which we periodically need to retrieve data about, say, users.

Users are selected by some set of filters; nothing exciting so far.

Now let's add, for each user, a set of attributes that are somewhat laborious to retrieve. For example, attach to each user the number of unique visits to the site by other users recorded since that user's last visit. Or something similar, i.e. information that requires extra work and JOINs.

Let's assume the project has a number of places where we need to fetch user information from the database, and each place needs a different (though strongly overlapping) combination of returned fields.

How should we deal with this situation? I see the following approaches:

  1. Write a single query that fetches all the information, and use it everywhere. Not the smartest approach: it consumes a lot of resources even when a much lighter query would do. It may be viable with aggressive caching of results.

  2. Write a separate query for each specific task. The most performant option, but the code starts to fill up with nearly identical methods with similar names and bloats considerably. It may be viable as long as the number of required field combinations stays small.

  3. Generate the query automatically in a single method, passing it the list of fields that should be returned. We get maximally efficient queries for the full range of field combinations, but we can (and probably will) end up with a hefty, hard-to-debug, complex method that, once written, nobody wants to touch.

  4. Create a set of methods with the simplest queries and combine them in the application language (here, PHP). That is, first select all users by the filters, then attach the required data to each of them... This smells like an indecently large number of queries (the classic N+1 problem) and a performance collapse on large row counts. There will also be trouble whenever the filters refer to the additional returned fields.
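To make option 3 concrete, here is a minimal sketch of such a generator. It is in Python for brevity (the idea ports directly to PHP), and every table and column name in it is made up for illustration: each allowed field maps to its SQL expression and to the JOINs it requires, so the method only assembles whitelisted pieces.

```python
# Minimal sketch of option 3: building a SELECT from a whitelist of fields.
# All table and column names here are hypothetical.

# Each allowed field maps to (SQL expression, list of JOINs it requires).
FIELDS = {
    "id":         ("u.id", []),
    "name":       ("u.name", []),
    "last_visit": ("u.last_visit", []),
    "visit_count": (
        "COUNT(DISTINCT v.visitor_id) AS visit_count",
        ["LEFT JOIN visits v ON v.visited_at > u.last_visit"],
    ),
}

def build_user_query(requested):
    """Return a SELECT covering exactly the requested fields."""
    unknown = set(requested) - set(FIELDS)
    if unknown:
        raise ValueError(f"unknown fields: {sorted(unknown)}")

    columns, joins = [], []
    for name in requested:
        expr, required_joins = FIELDS[name]
        columns.append(expr)
        for join in required_joins:
            if join not in joins:      # add each JOIN only once
                joins.append(join)

    sql = "SELECT " + ", ".join(columns) + " FROM users u"
    if joins:
        sql += " " + " ".join(joins)
        sql += " GROUP BY u.id"        # aggregated fields need grouping
    return sql

print(build_user_query(["id", "name"]))
# SELECT u.id, u.name FROM users u
```

The method stays diagnosable precisely because the mapping table, not string concatenation scattered through the code, is the single place that defines what can appear in the query.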

In general, I would like to hear your opinions on the question, perhaps with mention of some pitfalls I have not noticed.

UPD: Thoughts on the dynamic query generation option are of particular interest. Perhaps there are materials where this question has been covered? I have always been extremely negative about automatic generation of any executable code (SQL as well as JS), but in this context the approach looks quite decent if done carefully. What do you think?
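On the safety side of generated SQL: the danger usually comes from splicing user data into the query string, not from generating the query's structure. If the generator only assembles fragments from a fixed whitelist and returns all values separately for the database driver to bind, it is no more dangerous than a hand-written prepared statement. A hypothetical sketch (Python, made-up filter names):

```python
# Hypothetical filter whitelist: filter name -> SQL fragment with a placeholder.
FILTERS = {
    "min_age":   "u.age >= %s",
    "city":      "u.city = %s",
    "is_active": "u.is_active = %s",
}

def build_where(filters):
    """Build a WHERE clause and its parameter list.

    Only whitelisted structure goes into the SQL string; the values
    are returned separately, to be bound by the database driver.
    """
    clauses, params = [], []
    for name, value in filters.items():
        if name not in FILTERS:
            raise ValueError(f"unknown filter: {name}")
        clauses.append(FILTERS[name])
        params.append(value)
    where = " AND ".join(clauses) if clauses else "1=1"
    return where, params

where, params = build_where({"city": "Moscow", "min_age": 18})
# where  -> "u.city = %s AND u.age >= %s"
# params -> ["Moscow", 18]
```

In PHP the same separation is what PDO prepared statements give you: the generated string contains only placeholders, and the values travel through bind parameters.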


My opinion is as follows. Choosing among the listed options is exactly the kind of decision for which a developer is needed on the project. That is partly why the developer is involved in solving the problem: knowing the specifics of the project and drawing on experience, they can pick the best option.
