  • Every month or two I seem to get into a heated debate with another MODX developer about why I feel strongly that compress_css and compress_js should:

    • default to false
    • never have been added in the first place

    So, I'd like to shed some light on this subject and get a discussion going. Now, before we get into the performance tests and numbers (we'll do that later) let's establish a few things first.

    The MODX Manager is NOT a website. In fact, it is the exact opposite of one. Let's compare the user experience of each to illustrate how different they are.

    Website A:

    • is public facing
    • has short (<30s) page visits
    • has a high bounce rate
    • has many inbound users visiting for short periods of time
    • is commonly viewed on mobile
    • serves the same CSS/JS, for every user and maybe every page
    • runs on the same controlled server for all your users
    • has some form of dependency management

    Your MODX Manager:

    • is not public facing
    • has longer page visits
    • has an incredibly low bounce rate
    • has fewer users logging in for longer periods of time
    • is commonly accessed from high-bandwidth desktops
    • may serve different CSS/JS for different users/pages
    • runs on a server MODX has no control of (with the exception of MODX Cloud of course)
    • has no dependency management at all

    We're starting to realize that our website is very much like a drive-thru fast food restaurant. We have lots of users in the queue, all looking to eat and leave the establishment quickly.

    The MODX Manager is more like a gourmet kitchen. You take your coat off, punch your time card, stay a while, and work on something really nice for people to efficiently enjoy at a later time. You and your sous chefs will probably be back tomorrow.

    It is our responsibility as developers to examine the user experience(s) of what we are building first, and apply critical thinking to how we go about implementing, or not implementing given features.

    User Trends
    On the web, we have to build very snappy pages because our users may visit one page and never come back to our site again. Since we have control over the assets we'll be loading, we can optimize them accordingly. With something like a CMS Manager, you can ensure that users will both be back and stay a while. The performance benefit of compress_css and compress_js is only noticeable on the first visit to a given page. After that, the assets are cached in the browser.
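To make the first-visit-only benefit concrete, here is a toy model in Node. The 500 kB combined payload and the 50-page-view session are made-up illustrative numbers, not measurements:

```javascript
// Toy model: bytes a user downloads over a Manager session,
// with and without browser caching of CSS/JS assets.
const ASSET_BYTES = 500 * 1024; // assumed combined CSS/JS payload

function bytesTransferred(pageViews, cached) {
  // With caching, only the first view pays the asset cost;
  // every later view is served from the browser cache.
  return cached ? ASSET_BYTES : ASSET_BYTES * pageViews;
}

// A Manager session easily spans dozens of page views.
console.log(bytesTransferred(50, false)); // every view re-downloads everything
console.log(bytesTransferred(50, true));  // only the first view downloads
```

The point of the sketch: once assets are cacheable, how small they were on that single first download matters far less than whether subsequent views can reuse them.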

    So performance doesn't matter?
    Performance absolutely matters. It is very important that MODX performs well. There are countless threads and tickets related to compress_css and/or compress_js completely breaking the entire MODX Manager. Most commonly, this is blamed on what we call server misconfiguration. "It's your server's fault," we tell users. No, it's our fault for shipping such fragile software.

    By allowing the MODX Manager to trip on its own shoelaces and fall flat on its face, all in the name of the performance benefits of compression, we've established the assumption that said performance is *critical* to our user experience. My questions are as follows:


    • Is this supposed benefit critical to the user experience?
    • Have we measured this mission critical performance benefit?
    • Have we gained more users with "snappiness" than those we have lost who gave up installation at the white screen of death?
    • Should servers ever concat & minify front end assets?

    Now, I actually have tested 2.2 extensively in this regard. The intent of this post is to prove that the benefit, if any, is irrelevant, so I am leaving those numbers out for now.

    Personally, I lose sleep over the thought of some random GoDaddy server compressing JavaScript. It is the exact opposite of a test-driven workflow. Did MODX test compress_css and compress_js on the server you are running? Nope.

    A few other questions to ponder:

    • Are there no instances where a server takes longer to process, concat, and minify an array of files than it would take to serve them flat?
    • Do we really deserve to play the performance card?
    • Is this methodology future-proof?

    In my opinion, we have no right to play the performance card. Not because ExtJS is slower than molasses in the winter time, but because the MODX Manager has no dependency management. Install a couple of Extras, premium or not, and you are probably loading five separate versions of jQuery. I thought we were talking about saving bytes here...

    If the intent of compress_css and compress_js was to save bytes, practical features like dependency management should have been looked at instead. If the intent was to improve the reliability and performance of the software, we failed.
      jpdevries
    • This thread lost me, simply because it promised performance tests and numbers and failed to deliver. If you just want to discuss the merits, that's fair game, but if you want to sway me I need to see numbers showing that it doesn't do what it promises, or even makes things worse.

      I also strongly disagree that we "don't deserve to play the performance card" just because there's no real dependency management. Yes, that is a problem that results in extra bytes being downloaded, but that doesn't mean we can skip any optimisation whatsoever. Especially with performance, and when other parts (ExtJS) are not particularly fast, every little bit helps.

      One thing you're skipping is the difference between loading 25 or 5 files. Or with a couple of extras loading their own assets, 40 or 10 files.

      If the minifier is harming performance, it needs to be taken out. If you have stats and test cases for proving that it does, spill 'm.
        Mark Hamstra • Developer spending his days working on Premium Extras and a MODX Site Dashboard with remote management features to make the MODX world a little better.

        Tweet me @mark_hamstra, check my infrequent blog at markhamstra.com, my slightly more frequent ramblings at MODX.today or see code at Github.
      • Quote from: markh at Jun 15, 2014, 12:12 PM
        This thread lost me, simply because it promised performance tests and numbers and failed to deliver.

        Does it?

        Yes, that is a problem that results in extra bytes being downloaded, but that doesn't mean we can just skip any optimisation whatsoever.

        Let's work on proving that we are in fact optimizing at all. Packing assets into page-specific bundles is extremely optimal for high-bounce-rate websites. It is thus, by design, extremely un-optimal for low-bounce-rate web apps such as the MODX Manager. For example, say you have compression on and page A is loading 500 kB worth of JavaScript, mostly from Extras. Page B utilizes 90% of the same utilities packaged with page A, but adds a JavaScript file that is 0 bytes in size. It's empty. When we go from page A to page B we are re-loading 450 kB of JavaScript that is already sitting in the user's browser cache. How is that optimal?
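The page A / page B scenario above can be sketched as a quick calculation. The 500 kB payload and 90% overlap are the hypothetical figures from the example, not real measurements:

```javascript
// Sketch of the page A -> page B navigation: with page-specific bundles,
// shared code is re-downloaded because each page's bundle has a
// different URL and therefore a separate browser cache entry.
const KB = 1024;
const pageA = { bundleBytes: 500 * KB };

// Page B reuses 90% of page A's code plus one empty (0-byte) extra file.
const sharedFraction = 0.9;
const pageB = { bundleBytes: pageA.bundleBytes * sharedFraction + 0 };

// Bytes fetched when navigating A -> B under each strategy:
const perPageBundles = pageA.bundleBytes + pageB.bundleBytes; // B's bundle misses the cache
const sharedFiles = pageA.bundleBytes + 0; // shared files hit the cache; the extra file is empty

console.log(`page-specific bundles:      ${perPageBundles / KB} kB`);
console.log(`individually cached files:  ${sharedFiles / KB} kB`);
```

Under these assumptions the per-page bundling strategy nearly doubles the bytes transferred for the two-page visit, even though almost all of the code was already in the cache.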

        One thing you're skipping is the difference between loading 25 or 5 files. Or with a couple of extras loading their own assets, 40 or 10 files.

        Loading lots of files is slow. Very slow. The first time. Once they are cached it is virtually irrelevant. So yes, I'm a proponent of maybe adding 0.5s to the first page load time after login.

        If the minifier is harming performance, it needs to be taken out. If you have stats and test cases for proving that it does, spill 'm.

        As I said above, I'd rather this not turn immediately into a numbers game. There are more important issues at hand, such as test-driven development, or whether servers should even be concatenating front-end assets in the first place. While numbers are accurate, they can also very easily be skewed to your liking. For example, it takes only a few minutes to whip up some numbers to argue either side.
          jpdevries
        • Hey JP,

          I agree. Your argument is a philosophical one. In my opinion numbers are not necessary.

          My question is this, and it comes from little understanding of MODX. I've been working exclusively with Node stacks for a while now, and we do compress CSS/JS, then serve it static so nothing is concatenated at runtime. I've seen it drop hundreds of kB.

          However, when I see hundreds of kB drop off, it's a bad sign and invariably means there has been poor attention given to semantic CSS. Most often it is a dev problem, meaning the developer (plural or not) depended so much on compression at the expense of proper, non-duplicating CSS/JS.

          Is there a bootstrap of some kind that can modify itself per server environment and at the base stack level, serve the compressed code specific to the rules of the server environment?

          • Philosophy aside, whether the two settings typically speed up or slow down the MODX Manager is an empirical question, and I would also like to see some numbers.

            It would be interesting to gin up a benchmark suite that performs some standard Manager operations on a fresh site (say, with cURL and MODX Cloud where you can spin up a fresh site very fast) and see what you get with and without the compression. Such a suite would be really handy for testing future options for the Manager (maybe it exists already).

            That said, I turn both settings off when I create a new site, and have never seen a noticeable performance hit. I also agree that having a small but significant percentage of new MODX users see a trashed Manager or a blank screen is a serious issue that is probably more important than speeding things up unless the speed difference is very noticeable.

            It also seems to me that compressing and especially concatenating JS that includes code injected via extras written by people of various abilities is asking for trouble. In addition, if moving around in the Manager results in different concatenated chunks of JS with a large overlap in the contents, it's probably counter-productive.

            Finally, it's completely irrelevant to the discussion, but it's "sous chef" (I have a friend who works as one and she would never forgive me if I didn't offer the correction).

              Get my Book: MODX:The Official Guide
              MODX info for everyone: http://bobsguides.com/MODx.html
              My MODX Extras
              Bob's Guides is now hosted at A2 MODX Hosting
            • I never did any real measurements, but a "mississippi" count with and without it actually has a localhost taking about half the count without. I have my browser set to clear most cookies and all of its cache every time it gets shut down, and I shut it down or clear my cache and cookies frequently.
                Studying MODX in the desert - http://sottwell.com
                Tips and Tricks from the MODX Forums and Slack Channels - http://modxcookbook.com
                Join the Slack Community - http://modx.org
              • Quote from: joshpope at Jun 15, 2014, 03:29 PM
                Hey JP,

                I agree. Your argument is a philosophical one. In my opinion numbers are not necessary.

                My question is this, and it comes from little understanding of MODX. I've been working exclusively with Node stacks for a while now, and we do compress CSS/JS, then serve it static so nothing is concatenated at runtime. I've seen it drop hundreds of kB.

                However, when I see hundreds of kB drop off, it's a bad sign and invariably means there has been poor attention given to semantic CSS. Most often it is a dev problem, meaning the developer (plural or not) depended so much on compression at the expense of proper, non-duplicating CSS/JS.

                Is there a bootstrap of some kind that can modify itself per server environment and at the base stack level, serve the compressed code specific to the rules of the server environment?


                2.3 introduces Grunt (and thus Node) to the development workflow. It mostly did so for Sass support, but of course we can do more. I'm working on a PR that lifts some of the, well, grunt work off of server-side minification and moves it into the build process.
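As a rough illustration of what moving minification into the Grunt build might look like — a hypothetical Gruntfile sketch, not the actual PR. The task names come from the real grunt-contrib-concat, grunt-contrib-uglify, and grunt-contrib-cssmin plugins, but the file paths are made up:

```javascript
// Hypothetical Gruntfile sketch: concatenate and minify the Manager's
// assets once at build time, so nothing is compressed at request time.
module.exports = function (grunt) {
  grunt.initConfig({
    concat: {
      js: {
        src: ['manager/assets/js/**/*.js'], // made-up path
        dest: 'manager/assets/build/manager.js'
      }
    },
    uglify: {
      js: {
        src: 'manager/assets/build/manager.js',
        dest: 'manager/assets/build/manager.min.js'
      }
    },
    cssmin: {
      css: {
        src: ['manager/assets/css/**/*.css'], // made-up path
        dest: 'manager/assets/build/manager.min.css'
      }
    }
  });

  grunt.loadNpmTasks('grunt-contrib-concat');
  grunt.loadNpmTasks('grunt-contrib-uglify');
  grunt.loadNpmTasks('grunt-contrib-cssmin');
  grunt.registerTask('build', ['concat', 'uglify', 'cssmin']);
};
```

The design point: the expensive, fragile work happens once on the developer's machine, under test, instead of on every user's server at runtime.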

                Part of this challenge is determining which versions to target. I think how much this particular area can, or should, be improved upon for 2.x is still up in the air.
                  jpdevries
                • For my template .css files I use a combination of the RO.IDEs concept - resources with Ace for the IDE, and the cssSweet extra. No need for installing all kinds of cra... er, stuff on my machine. Just sweet, sweet, MODX all the way.
                  • Quote from: BobRay at Jun 16, 2014, 06:00 AM
                    That said, I turn both settings off when I create a new site, and have never seen a noticeable performance hit.

                    Ditto. The moral of this story really is to point out that yes, there is a big bag of performance tips and best practices we can leverage, but they don't all apply everywhere.

                    I think that if we could figure out a way to add CDN support to the core, the API, or common libraries somehow, it would have much more of a performance benefit, one that would be seen across all of your MODX installations.

                    I think it's great that there are people out there logging into 10+ MODX Managers a day, and we should look at ways to make better use of their cache(s).

                    Quote from: BobRay at Jun 16, 2014, 06:00 AM

                    It also seems to me that compressing and especially concatenating JS that includes code injected via extras written by people of various abilities is asking for trouble. In addition, if moving around in the Manager results in different concatenated chunks of JS with a large overlap in the contents, it's probably counter-productive.

                    I agree with your concerns here. I'm not overly familiar with how the "other crowds" handle this, but it seems like separating the assets based on whether they are shareable libraries or parts of an Extra would be beneficial. Having every Extra load its own jQuery isn't a performance benefit, no matter how many HTTP requests are being saved.

                      jpdevries
                    • I just have to bump this.

                      I support OP 100%. Using compressed JS/CSS the way we do today makes no sense in the Manager. What would actually make sense would be to do a build for every version, with ONE minified JS file and ONE minified CSS file. These two files would then be included at every page load in the Manager.

                      It would:
                      1. Cache once, fetch locally afterwards
                      2. Put no load on the server, because the minification is already done

                      Win-win imo.