Quote from: danielw at Feb 05, 2010, 09:52 AM
I increased the memory limit in my php.ini to 64M and now it seems to work.
@OpenGeek: Any reason for this? I know it is far from ideal, but I can’t think of another way.
It simply takes that much memory to list all of those Resources in a single node of the tree. Combine the memory needed to query, process, and build a list of all those Resources with what the system already consumes on every request: check the size of your assets/cache/siteCache.idx.php file and consider that it must be loaded on each request, along with the required MODx core files.
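For reference, the fix danielw describes is a one-line change in php.ini (64M worked in his case; the right ceiling depends on your site's size and available server RAM):

```ini
; php.ini — raise the per-request memory ceiling for PHP.
; 64M resolved the tree-loading error here; tune to your install.
memory_limit = 64M
```

Remember to restart your web server (or PHP-FPM) after editing php.ini so the new value takes effect.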
Quote from: danielw at Feb 05, 2010, 09:52 AM
If we were talking about 20,000 documents (which is insane) then I would probably have been forced to find a bespoke solution, but since 6000 docs is roughly around the MODx "limit" of 5000, I thought I would give it a go, and so far it is proving pretty robust.
Personally, I consider any data set of 100 or more items to be best represented by a custom model and presented by custom Snippets, which can create an endless variety of flexible, custom views into your data. To me, MODx is a Content Management System: it is great at helping you organize your distinct views, but it is not meant for managing data of any significant quantity as Resources. Even blog posts work better as an external data model; there is no better way to create the multi-faceted views into your posts that a good blogging system should allow (see WordPress). It also makes code/content maintenance and changes to your data model much easier, IMO. YMMV.