    • 16192
    • 2 Posts
    I have several categories of news, and each category has thousands of news articles.

    Can Ditto help me?
      • 727
      • 502 Posts
      I think you might run into performance problems with that many documents. Probably a better approach would be to create a custom MySQL table, import your articles and then modify Ditto or write your own snippet to display them. Shouldn’t be hard.

      Andy
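      As a rough sketch of the custom-table idea above: a MODx snippet is plain PHP that returns markup, so it could query a separate table directly. The table name news_articles, its columns, and the connection details below are all made up for illustration, and this uses generic PDO rather than any particular MODx database API.

      <?php
      // Hypothetical snippet: read the latest articles for one category from a
      // custom table instead of the MODx document tree.
      $pdo = new PDO('mysql:host=localhost;dbname=mysite;charset=utf8', 'user', 'pass');

      $stmt = $pdo->prepare(
          'SELECT title, intro, published_at
             FROM news_articles
            WHERE category_id = :cat
            ORDER BY published_at DESC
            LIMIT 20'
      );
      $stmt->execute([':cat' => 3]);

      $output = '';
      foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $row) {
          // Build the markup the snippet hands back to the page.
          $output .= '<h3>' . htmlspecialchars($row['title']) . '</h3>'
                   . '<p>'  . htmlspecialchars($row['intro']) . '</p>';
      }
      return $output;

      A snippet normally returns its output as a string, which is why the sketch ends with return $output.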
        • 18397
        • 3,250 Posts
        Ditto 1.1b1 or later will work fine with this. Hopefully, by 1.1 performance will be even better.

        You might consider paging the results to make it easier on the visitor, and on your db server.
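        One way to page results at the database level is a LIMIT/OFFSET query; the sketch below continues the hypothetical news_articles table from the earlier example, and the page query parameter is likewise invented.

        <?php
        // Hypothetical paging: 20 articles per page, page number taken from the URL.
        $pdo     = new PDO('mysql:host=localhost;dbname=mysite;charset=utf8', 'user', 'pass');
        $perPage = 20;
        $page    = max(1, (int) ($_GET['page'] ?? 1));
        $offset  = ($page - 1) * $perPage;

        // $offset and $perPage are cast to integers above, so interpolating them
        // into the LIMIT clause is safe here.
        $stmt = $pdo->prepare(
            'SELECT title, intro
               FROM news_articles
              WHERE category_id = :cat
              ORDER BY published_at DESC
              LIMIT ' . $offset . ', ' . $perPage
        );
        $stmt->execute([':cat' => 3]);
        $rows = $stmt->fetchAll(PDO::FETCH_ASSOC); // render as in the earlier sketch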
          • 727
          • 502 Posts
          Are you saying Ditto will work fine with 1000s of documents, or that it will work fine with a custom table?
          I thought that MODx itself had performance problems above 5000 documents, so I was referring to that rather than any performance problems with Ditto.

          Andy
            • 18397
            • 3,250 Posts
            Ditto will work with as many documents as MODx supports. So, if he has fewer than 5k he will be fine. If he has more than that, it is possible to modify Ditto to use an alternate table, but it would require considerable changes to two core functions.
            • Quote from: ajayre at Dec 09, 2006, 03:07 AM

              Are you saying Ditto will work fine with 1000s of documents, or that it will work fine with a custom table?
              I thought that MODx itself had performance problems above 5000 documents, so I was referring to that rather than any performance problems with Ditto.
              Andy, this has nothing specifically to do with Ditto actually, though I’m sure Ditto adds some additional overhead to the problem. The real problem is the current approach to caching the entire site structure within a single cache file in MODx. Sites with thousands of documents will have very large siteCache files which will tax the web server more by requiring more memory, more time to load the file from the filesystem, and more time to process it with the PHP interpreter. In addition, thousands of pageCache files will also exist in a single directory, which can also increase resource loads for the web server when reading the cache directory contents searching for a cached file. This reduces performance exponentially.

              All of this is being addressed at the core level, and hopefully we’ll have some additional scalability in this respect with the next minor release of MODx.
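              To make the two costs above concrete, here is a purely illustrative sketch; the file names and array shape are invented and are not MODx's actual cache code. It shows one monolithic site-structure file that has to be read and parsed on every request, plus one flat directory that has to be searched for a per-page cache file.

              <?php
              // Invented illustration of the pattern described above; not real MODx internals.

              // 1) A single site-structure cache: the whole map of every document is loaded
              //    and parsed on each request, so memory and parse time grow with site size.
              $siteStructure = unserialize(file_get_contents('cache/site_structure.cache'));

              // 2) A flat cache directory: finding one page's cache file means scanning a
              //    directory that holds an entry for every cached document.
              $docId    = 42;
              $matches  = glob('cache/doc_' . $docId . '.*');
              $pageHtml = $matches ? file_get_contents($matches[0]) : null;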
                • 727
                • 502 Posts
                Yes, I understand; that’s why I mentioned I was referring to MODx as the limitation, not Ditto. There isn’t much point trying to use Ditto with 5k+ documents if MODx can’t handle it well.

                Andy
                  • 20204
                  • 6 Posts
                  Quote from: OpenGeek at Dec 09, 2006, 03:30 AM

                  Quote from: ajayre at Dec 09, 2006, 03:07 AM

                  Are you saying Ditto will work fine with 1000s of documents, or that it will work fine with a custom table?
                  I thought that MODx itself had performance problems above 5000 documents, so I was referring to that rather than any performance problems with Ditto.
                  Andy, this has nothing specifically to do with Ditto actually, though I’m sure Ditto adds some additional overhead to the problem. The real problem is the current approach to caching the entire site structure within a single cache file in MODx. Sites with thousands of documents will have very large siteCache files which will tax the web server more by requiring more memory, more time to load the file from the filesystem, and more time to process it with the PHP interpreter. In addition, thousands of pageCache files will also exist in a single directory, which can also increase resource loads for the web server when reading the cache directory contents searching for a cached file. This reduces performance exponentially.

                  All of this is being addressed at the core level, and hopefully we’ll have some additional scalability in this respect with the next minor release of MODx.

                  I’m curious: can you get around the 5K limit on total documents by turning off caching?

                  Thanks,
                  Dan
                  • Quote from: mooreds at Jan 02, 2007, 09:12 PM

                    Quote from: OpenGeek at Dec 09, 2006, 03:30 AM

                    All of this is being addressed at the core level, and hopefully we’ll have some additional scalability in this respect with the next minor release of MODx.

                    I’m curious: can you get around the 5K limit on total documents by turning off caching?

                    Thanks,
                    Dan
                    Not yet, Dan; part of the bottleneck is a large index of document metadata that is loaded on every request, and a large part of the current parsing engine depends on it. As I mentioned, though, these issues are being addressed.
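                    In other words (a hedged, invented illustration, not MODx's real data structures): even with page caching switched off, the parser still loads a map of every document's metadata on each request, presumably to resolve things like aliases and parent/child relationships, so the cost scales with the total document count either way.

                    <?php
                    // Invented sketch of a per-request metadata index; the file name and
                    // array shape are made up for illustration only.
                    $documentIndex = unserialize(file_get_contents('cache/document_index.cache'));

                    // Typical per-request lookup: friendly-URL alias to document id.
                    $requestAlias = 'news/some-article';
                    $docId = $documentIndex['aliasMap'][$requestAlias] ?? null;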