Hi Dojo,
What is a reasonable number of DOM instances before any significant performance issue becomes noticeable? Are we talking about a few hundred, thousands, or a few tens of thousands?
What about the size of each DOM definition? The docs are very vague here, indicating only that they should be "as small as possible", with no hard suggestion. Are 50 fields too many? Performance-wise, is it better to have 1 field with a long string that we interpret in code than 50 small fields that each have a meaning of their own?
Is it possible to update the docs with this information as well? Right now the only concrete number is "number of DOM modules should be kept well below 50", but even that is quite vague. I understand it is difficult to give exact values, but even a ballpark would be helpful.
Is it better to have 1 instance containing the same section 20 times, or 20 instances containing it only once?
Thank you,
Cheers
Hi Edib,
You're right—the current DOM documentation is quite vague and avoids giving specific numbers. That’s pretty much intentional, as the ideal setup really depends on many factors, including the acceptable performance for your specific use case. For example, a DOM instance with 200 fields might be too large for a high-performance system but perfectly fine for sporadic use in an LCA where load times of one second are acceptable. The best way to determine if something is "too much" is by testing and checking whether it meets your expected performance benchmarks.
That said, here are a few pointers that may already be helpful:
DOM Instances:
You can safely and performantly store millions of DOM instances. We've tested up to 10 million without significant performance degradation. Just ensure that your database has sufficient resources allocated (if you're self-hosting; no issues on STaaS 😉). It's also important to account for this large volume in your solution design (scripts, ad-hoc data sources, etc.). Avoid reading all instances at once; always use paging and limits to retrieve only the necessary data.
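To make the paging advice concrete, here is a minimal sketch of a paged read loop. The `DomClient` interface and `readDomInstances` method are hypothetical placeholders (not the platform's actual API); the point is the pattern of fetching a bounded page, processing it, and moving on.

```typescript
// Hypothetical client and method names for illustration only – not the platform's real API.
interface DomInstance {
  id: string;
  [field: string]: unknown;
}

interface DomClient {
  // Returns at most `limit` instances starting at `offset`.
  readDomInstances(model: string, offset: number, limit: number): Promise<DomInstance[]>;
}

// Process a large instance set page by page instead of loading everything at once.
async function processAllInstances(
  client: DomClient,
  model: string,
  handleBatch: (batch: DomInstance[]) => Promise<void>,
  pageSize = 500,
): Promise<void> {
  let offset = 0;
  while (true) {
    const batch = await client.readDomInstances(model, offset, pageSize);
    if (batch.length === 0) break; // no more instances to read
    await handleBatch(batch);      // work on this page only, then move on
    offset += batch.length;
  }
}
```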
Fields per Instance:
While the guidance to "keep it as small as possible" may sound vague, it’s still very relevant. The more data you store in a DOM instance, the more it affects read performance, almost linearly. Doubling the data can halve the performance, as it's constrained by throughput and serialization/deserialization costs. A good strategy is to store only critical, frequently used data in the main object, and move additional or less frequently needed data to a metadata object. This keeps the main object lightweight and improves performance. However, remember that joining this extra data later will also have a performance impact, so it's crucial to strike the right balance.
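As an illustration of that split (the object and field names below are made up for the example), a lightweight main object next to a metadata object that is only read when needed could look like this:

```typescript
// Main object: only the critical, frequently read fields.
interface OrderMain {
  id: string;
  customerId: string;
  status: "open" | "shipped" | "closed";
  totalAmount: number;
}

// Metadata object: bulky or rarely used data, joined only when a screen or script needs it.
interface OrderMetadata {
  orderId: string;                         // reference back to the main object
  internalNotes: string;
  auditTrail: string[];
  shippingPreferences: Record<string, string>;
}
```

Reading a list of orders then only touches the main object; the join to the metadata object is only paid on the detail views that actually need it.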
Long Strings:
Generally, we recommend avoiding the use of long strings to compactly store data. While it may reduce the field count, it significantly hinders readability and maintainability. It's better to first test performance using regular fields. Only consider long strings if they offer a clear and significant performance benefit, which is rarely the case.
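To illustrate the trade-off (again with purely illustrative names): the packed variant saves fields, but its meaning lives only in code that knows the delimiter and position of each value.

```typescript
// Packed into one long string: fewer fields, but parsing and ordering are a convention in code.
const packedOrder = "EDIB;2024-05-01;OPEN;1499.95";
const [customer, orderDate, status, amount] = packedOrder.split(";");

// Discrete fields: self-describing, individually filterable and validatable.
interface OrderFields {
  customer: string;
  orderDate: string; // ISO date
  status: string;
  amount: number;
}
const order: OrderFields = {
  customer: "EDIB",
  orderDate: "2024-05-01",
  status: "OPEN",
  amount: 1499.95,
};
```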
Sections:
Regarding the use of sections: we recommend keeping the number limited. If you’re dealing with around 20 sections, that might already justify splitting them into separate DOM instances. Again, it depends on when and how often you need to read the data. If these instances need to be joined every time the main DOM instance is accessed, you could end up with worse performance instead of gains.
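Sketched side by side (with invented names, just to show the shape of the two options from your last question):

```typescript
interface InvoiceLine {
  description: string;
  amount: number;
}

// Option A: one instance that embeds the repeated section – a single read returns
// everything, but the instance grows with every occurrence.
interface InvoiceEmbedded {
  id: string;
  lines: InvoiceLine[]; // the section repeated up to 20 times inside one instance
}

// Option B: each occurrence of the section as its own instance – the main object
// stays small, but showing the full picture needs an extra (joined or paged) read.
interface InvoiceHeader {
  id: string;
}
interface InvoiceLineInstance extends InvoiceLine {
  invoiceId: string; // reference to the header instance
}
```

Which option wins depends on how often you need the full picture versus just the header, which is exactly where testing against your own access patterns pays off.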
Let me know if you need any more clarification or help!

Hi Thomas,
Thank you for your answer. I notice that there is a heavy focus on testing different approaches; however, it feels to me that this is not a trivial thing to do and would more or less mean double (or triple) the amount of work, depending on the number of ideas. Is there a standard solution that can help with this testing so we can avoid duplicate work?