How to improve query performance in MongoDB on collections with millions of documents, each with many searchable attributes?


There is a collection of approximately 20 million documents, each with about 200 searchable attributes.


```
{ atrib001: "abc", atrib002: "123", atrib003: "1x3", ... atrib200: "1zz" }
```

The search application allows queries to be assembled dynamically, according to the options the user selects.

What is the recommended way to create indexes in MongoDB to improve the performance of this type of search? Also, is it feasible to create an index for each attribute (200 indexes in this case) and trust MongoDB to choose the best one?


(Disclaimer: I'm not a DBA. Test everything on a separate copy of the database before applying it in production.)

When it comes to indexes, MongoDB is not that different from relational databases: its indexes are typically B-trees.
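As a minimal sketch, a single-field index on one of these attributes can be created from the mongosh shell; the collection name `items` is an assumption, not from the question:

```
// Hypothetical collection name "items"; run in mongosh.
// Creates an ascending B-tree index on attribute atrib001.
db.items.createIndex({ atrib001: 1 })

// Check that an equality match actually uses the index
// (look for an IXSCAN stage in the output):
db.items.find({ atrib001: "abc" }).explain("executionStats")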

If the query is dynamic and the user can choose which attributes to include in the search, separate single-field indexes (one per attribute) let MongoDB's query planner pick a suitable index for each search. Be aware, though, that MongoDB limits a collection to 64 indexes, so 200 single-field indexes are not actually possible, and every additional index slows down writes and consumes storage. With this many arbitrary attributes, a single wildcard index is usually the more practical choice.
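MongoDB's wildcard index (available since version 4.2) covers every field with a single index, which avoids maintaining one index per attribute when users can filter on any of them. A sketch, assuming a collection named `items` (and, for the second variant, a reshaped collection named `items_kv`):

```
// Hypothetical collection "items"; run in mongosh.
// A wildcard index on all fields lets MongoDB use the index
// for an equality query on any single attribute.
db.items.createIndex({ "$**": 1 })

// An older alternative is the attribute pattern: reshape documents
// as an array of { k, v } pairs and index both parts with one
// compound index, e.g. { attribs: [ { k: "atrib001", v: "abc" }, ... ] }
db.items_kv.createIndex({ "attribs.k": 1, "attribs.v": 1 })
db.items_kv.find({ attribs: { $elemMatch: { k: "atrib001", v: "abc" } } })
```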

If there were a pattern, for example if the user always included both attributes 1 and 2 in the search, you could create a compound index that covers both attributes at once. But since there isn't one, the indexes must be created separately.
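For completeness, a compound index for that hypothetical always-together pattern would look like this (collection name `items` is again an assumption):

```
// Hypothetical: if queries always filter on atrib001 and atrib002
// together, one compound index serves both predicates at once.
db.items.createIndex({ atrib001: 1, atrib002: 1 })

// By the index prefix rule, this index also supports queries that
// filter on atrib001 alone, but not on atrib002 alone.
db.items.find({ atrib001: "abc", atrib002: "123" })
```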
