  • Actually, I've got a little further with this...

    Following this tutorial, http://brandonsummers.name/blog/2012/02/10/using-bitbucket-for-automated-deployments/, I've set up a deployment script that pulls from the Bitbucket repo on each update (thanks to Bitbucket's POST hook).

    This worked! Now when I push to the Bitbucket repo, the cloud is updated. Ideal!

    Now I just need to work out a model for working locally, and I'm getting closer.

    I will update you as I go...
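    For reference, the deploy script that such a POST hook triggers boils down to a pull inside the webroot checkout. A minimal sketch, where the repo path argument and the `origin`/`master` names are placeholders for whatever your Cloud checkout actually uses:

    ```shell
    # deploy: update a working copy the way a Bitbucket POST-hook script does.
    # The path argument is hypothetical -- a real hook script would hard-code
    # the webroot it pulls into.
    deploy() {
        repo_dir="$1"
        # Fast-forward the checkout to the latest master on the remote.
        ( cd "$repo_dir" && git pull -q origin master )
    }
    ```

    The hook endpoint (a small PHP script in the tutorial) would simply invoke this with its own path each time Bitbucket posts to it.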
      • 36561
      • 38 Posts
      git pull seems broken at the moment. I have an open ticket with the Cloud Team and will report back.

      For now, you can do:
      git fetch origin master

      followed by
      git merge origin/master


      The strange thing is that pull doesn't work from the command line, but you're reporting that it works when run from inside a PHP script. Can you confirm that?

      Pushing directly to the Cloud:
      Last time I asked, it wasn't possible to set up a remote tracking branch on MODX Cloud, so the intermediary step with Bitbucket or GitHub is needed.

      Cloud as a remote DB:
      Not yet possible. You can't do SSH tunneling to the DB either. I heard they're working on that one, but for now it's not implemented because of security concerns.
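      Spelled out, the workaround above is just pull's two halves run separately: the fetch updates the remote-tracking branch `origin/master`, and the merge fast-forwards your local branch onto it. A sketch, with the repo path as a placeholder:

      ```shell
      # Fetch-then-merge workaround for the broken `git pull`.
      update_checkout() {
          repo_dir="$1"
          ( cd "$repo_dir" \
              && git fetch -q origin master \
              && git merge -q origin/master )
      }
      ```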

      • Quote from: saschame at Mar 21, 2013, 12:06 PM
        git pull seems broken at the moment. I have an open ticket with the Cloud Team and will report back.

        The strange thing is that pull doesn't work from the command line, but you're reporting that it works when run from inside a PHP script. Can you confirm that?


        Yep, this does seem to be happening. Odd, I know.
        • Hi - Ok, an update (and a question)...

          I have my cloud pulling from bitbucket automatically - that's all great.

          I'm now working on a solution to work locally, before pushing my changes to bitbucket (and ultimately, moments later, the cloud).

          The way we work (which is slightly different from saschame's solution) is to have a '/site/' folder in the root, which contains all our static JS/CSS/chunks/snippets and so on. This is essentially all that is stored in our Bitbucket repo.

          Locally I have installed a copy of MODX (2.2.6), let's say 'C:\xampp\htdocs\sites\modx\modx-2.2.6-pl\'. I intend to "share" this installation with all the projects I run locally (provided they're all on the same version). Any new version of MODX will get a new, separate install on my machine.

          So, I also have a projects folder, which contains all my projects, in 'C:\xampp\htdocs\sites\modx\projects\project-name'.

          'project-name' contains the two php files, index.php & config.core.php, as well as the PULLED repo. Inside config.core.php I have set the following:

          
          define('MODX_CORE_PATH', 'C:/xampp/htdocs/sites/modx/modx-2.2.6-pl/core/');
          
          define('MODX_PROCESSORS_PATH', 'C:/xampp/htdocs/sites/modx/modx-2.2.6-pl/core/model/modx/processors/');
          define('MODX_CONNECTORS_PATH', 'C:/xampp/htdocs/sites/modx/modx-2.2.6-pl/connectors/');
          
          define('MODX_MANAGER_PATH', 'C:/xampp/htdocs/sites/modx/modx-2.2.6-pl/manager/');
          define('MODX_MANAGER_URL', '/manager/');
          
          define('MODX_BASE_PATH', 'C:/xampp/htdocs/sites/modx/modx-2.2.6-pl/');
          define('MODX_BASE_URL', '/');
          
          define('MODX_ASSETS_PATH', 'C:/xampp/htdocs/sites/modx/modx-2.2.6-pl/assets/');
          define('MODX_ASSETS_URL', '/assets/');
          
          define('MODX_CACHE_DISABLED', true);
          
          define('MODX_CONFIG_KEY', 'config');
          
          define('MODX_DB_OVERRIDE', true);
          define('MODX_DB_SERVER', 'localhost');
          define('MODX_DB_USER', 'root');
          define('MODX_DB_PASSWORD', '');
          define('MODX_DB_NAME', 'modx-2.2.6-pl');
          define('MODX_DB_PREFIX', 'modx_');
          define('MODX_DB_DSN', 'mysql:host=localhost;dbname=modx-2.2.6-pl;charset=latin1');
          
          


          You'll notice some new definitions there! That's because I've modified my LOCAL modx-2.2.6-pl config.inc.php (naughty, but it'll never get upgraded). There is now a new section right after the DB properties:

          
          /******************************/
          /* ADD TO YOUR config.inc.php */
          if (defined('MODX_DB_OVERRIDE') && MODX_DB_OVERRIDE) {
          	$database_server = MODX_DB_SERVER;
          	$database_user = MODX_DB_USER;
          	$database_password = MODX_DB_PASSWORD;
          	$dbase = MODX_DB_NAME;
          	$table_prefix = MODX_DB_PREFIX;
          	$database_dsn = MODX_DB_DSN;
          }
          /* ADD TO YOUR config.inc.php */
          /******************************/
          
          


          So now my local projects can all dynamically set the DB they use. Each project has its own DB, but I only need to install MODX once (per version of MODX).

          So, I can go to 'http://localhost:81/projects/project-name/' and I will be using the DB for my site. But 'http://localhost:81/modx-2.2.6-pl/manager/' will still use the "original" DB.

          Is there a way I can get 'http://localhost:81/projects/project-name/manager/' to route correctly? (As '/manager/' doesn't exist there, the request fails.) That way I could have the manager using a different database each time too. Is this a good idea? I know different DBs will have different modules installed, but they'll ALL have the source (whether it's used or not).

          I would LIKE this DB to be the cloud DB. Therefore, I can use the manager in the cloud, but build locally. If I install a module in the cloud, I just need to ensure it's also installed in my local MODX.

          I feel I'm rambling here, and possibly getting myself into a mess. I'd be interested to hear people's thoughts.
            • 36561
            • 38 Posts
            I don't see what you're gaining by installing MODX only once and sharing the files between your sites.

            Since you're on Cloud, you can go the other way around:
            Create a snapshot of a good base install and inject it into new Dev Clouds.
            Export that Cloud to run locally.

            Since you have to change the DB config anyway, the speed-gain is minimal.
            • I think you're right. I was looking for some clever way of running multiple projects on my dev machine through one manager. Perhaps this was a bad idea. It seemed reasonable, but maybe unattainable.

              However, I think this is all in vain. Unless I can access the cloud DB locally this is never going to be worthwhile.

              The problem I HAVE to get past is being able to work locally on a DB that's hosted in the cloud, as there's no way I can work locally with a DB that doesn't match the one on the cloud. I could be doing loads of work locally, deploying the files, then having to replicate everything in the cloud (static snippets, resources, etc...)

              I would really like to know how people are managing this. How are others working day-to-day and then deploying, especially when working with a live and a staging cloud?
                • 36561
                • 38 Posts
                What I do is dump the DB and import it to the Cloud.

                That way I can work locally until I have a version of the site I want to show to a client.
                Then I export the local DB, reimport it on the Cloud and push the changes to files (like CSS or JS) via git.

                But there's at least one catch:
                When you install an extra on your local machine, for example, you need to synchronize the DB and also upload all the files in /assets/components/the_extra/ and /core/components/the_extra/

                Which leaves us with the old conclusion:
                There's no easy, automated method to keep two MODX installs in sync.

                But as long as you remember to sync the DB, /core/components/ and /assets/ you're ok.
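                That manual dump-and-reimport routine can be scripted. A rough sketch under stated assumptions: `user@cloud.example` and the database name are placeholders, and it assumes shell access to the target end (on MODX Cloud the import may have to go through the dashboard instead). Setting `DRY_RUN=1` prints the commands rather than running them:

                ```shell
                # Push a local MySQL DB to a remote host: dump, copy, reimport.
                # Hostnames and DB names below are placeholders.
                sync_db() {
                    db="$1"        # local database name
                    remote="$2"    # SSH target, e.g. user@cloud.example
                    # With DRY_RUN=1, print each command instead of executing it.
                    run() { if [ "${DRY_RUN:-0}" = "1" ]; then echo "$@"; else "$@"; fi; }
                    run mysqldump "$db" -r "/tmp/$db.sql"          # dump locally
                    run scp "/tmp/$db.sql" "$remote:/tmp/$db.sql"  # copy across
                    run ssh "$remote" "mysql $db < /tmp/$db.sql"   # reimport
                }
                ```

                The file sync for /core/components/ and /assets/components/ would still go via git (or rsync) as described above.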
                • For me there's too much reliance on knowing how pieces fit together to be effective or efficient in a team. I don't want developers having to manually import databases and such.

                  Also, we tend to push changes to a demo/staging site several times a day for our clients to review, so we couldn't keep doing this each time.

                  Also, what happens when you go live with changes on an already-live site? Those databases need to 'merge' in some way, not be fully replaced.
                    • 36561
                    • 38 Posts
                    Well, that is exactly our problem, too. This workflow works on small sites without constant updates, or for CSS and JS updates.
                    But it completely falls apart with multiple developers or editors working on the same DB.

                    I saw your new thread and am hoping to get a few answers there, too.
                      • 34120
                      • 236 Posts
                      I'm fairly new to using git; I'm still getting to grips with it really and probably lacking some fundamental knowledge. I'm working on my own on a development site rather than a live site, which probably avoids some of the issues mentioned above. Currently my repo is on the Cloud; I don't have a remote tracking branch (Bitbucket). Is this a bad idea?

                      I'm not running a local copy of MODX because I felt it negated the advantages of the Cloud. Working directly on the Cloud means I don't need to sync sites, and my sites actually run faster on the Cloud than locally on XAMPP. I'm just working on assets and static files locally; these are uploaded automatically on save.

                      So far this is working well for me and helping me get to grips with git. Is there anything glaringly wrong with this setup?

                      @chrischerrett I will check out your new thread.