
that0n3guy

@abass, I think you're overthinking it. If keeping your theme in git "scares" you, don't do it. Treat it like your uploads folder and gitignore it. You then do all your changes through the web interface instead of locally.

Then you just keep backups of your themes folder the same way you keep backups of an uploads folder. No need to overcomplicate it.
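
If you go that route, the ignore rule itself is a one-liner (the theme name here is just an example):

    # .gitignore
    themes/yourtheme/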

abass

@that0n3guy, but if you make all your changes through the web interface then you are essentially editing the live server for every single change. My current workflow is to make changes on a local server and then push them to production, and I prefer it because layout tweaks don't touch the live site; not to mention that you can't use a CSS preprocessor through the web interface. I even have a development server for some larger clients, so if I worked only through the web interface I'd have to repeat everything I do in the themes folder on each server.

I want absolutely everything version controlled and available locally EXCEPT for the content (unless of course there is some way to automatically push any changes made on the server to a separate git branch in the repo whenever a client edits the content, or, again, to have the content managed via a database instead of flat files).

@Blackpig Creative, I really don't see the use of Static Pages (unless I'm missing something). It's still a flat-file structure rather than a database, so what's the point of using Static Pages over the default CMS section's "Pages", "Layouts", "Partials", and "Content"? They seemed for the most part identical when I was playing around with that plugin. I already have a WYSIWYG interface within the default site's content section for clients to edit, or I just use the "editable" plugin, for example. I really can't see what Static Pages provides over the default setup..?

I just want my clients to ONLY be able to edit the content, not screw up the layout of the site, and I want to be able to retrieve the content changes they have made, preferably without adding large portions of the site to .gitignore. Honestly, if the "Content" section under "CMS" just saved to the database instead of to flat files, that would be the perfect solution. Is there maybe a plugin that works that way? Or am I going to have to do a git pull from Heroku every time I go to edit the site, just to pull in the most recent content changes so I don't overwrite them when pushing an update from localhost?

Not sure if I'm explaining myself well, but for example @that0n3guy, you mentioned a few ways you might overcome this problem, such as having the server automatically git add to a separate branch whenever changes were made on it. It was all quite complex, so I'm just wondering what you ultimately decided on and/or what workflow you feel would work best for this?

I can just foresee a bunch of problems trying to juggle content that my client can edit while still keeping the theme version controlled (because having the theme's layout properly version controlled is really important).


that0n3guy

@abass

How about not tracking the "themes/yourtheme/content" folder in git? Then the client can add content (via Static Pages or otherwise) and the layout (partials, pages, layouts) stays in VCS. Maybe symlink your content folder somewhere else so it feels more "out" of git.
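
Roughly, that would look like this (the theme name and the external path are just examples):

    # .gitignore - keep pages/layouts/partials tracked, drop the content folder
    themes/yourtheme/content/

    # optionally move the real content folder somewhere else and symlink it back
    mv themes/yourtheme/content /var/www/shared/content
    ln -s /var/www/shared/content themes/yourtheme/content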

Blackpig Creative

Yep, my bad - I'd not played with the pages plug-in - I assumed it was db driven like the blog. Sorry.

abass

@that0n3guy

That's what I'm mostly leaning towards, but in practice I'm finding it to be a huge problem. I use Heroku for hosting, which, if you're unfamiliar with it, deploys via git. I have it tied to the project's GitHub repo so it automatically pulls whenever the GitHub repo is updated.

If I .gitignore the themes/yourtheme/content folder for GitHub, then it will also be ignored by Heroku's git (which is how everything is stored). Now if I completely ignored that directory and ONLY added content from within website.com/backend, I could of course start adding content and it would show up on the site, but Heroku's git would never actually commit the changes, so if I did git pull heroku master the content directory would never be accounted for. There would be no real way to back up the content sitting in the site's backend.

Blog posts, for example, are working beautifully, but that's because a blog post's content is saved in the database, so I don't have to .gitignore any files or worry about clients' changes getting lost, overwritten, or left out of backups.

I guess a file-based CMS, VCS, and client-managed content are a bad mixture at the end of the day, at least when it comes to backing up the most recent version of the site. Maybe if it could be set up so that whenever the client makes a change, it automatically commits the change to Heroku's git, or even better to the GitHub repo on a spare branch, that would be awesome.
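
(Just to sketch that idea: on a host where you have SSH access and the site is a persistent git clone, a cron job could snapshot the folder to a spare branch. The path, theme name, and branch name below are made up, and it wouldn't help on Heroku itself since a dyno's filesystem isn't a persistent git clone:)

    # every 15 minutes: commit any content changes and push them to a "content-snapshots" branch
    */15 * * * * cd /var/www/site && git add themes/yourtheme/content && git commit -q -m "Content snapshot" && git push -q origin HEAD:content-snapshots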

I can't be the only one running into this problem, though. Even if I weren't using Heroku and were just on a general hosting provider with SSH access, ignoring the content folder and never being able to properly back up the client's content is still a pretty risky way to manage a site.

Please let me know if I'm overlooking something, but I've thought this through countless times and it just doesn't click as a proper solution, unless someone releases a plugin that functions exactly like the content folder but is entirely database driven. A fully database-driven pages plugin (which I think Autumn Pages is?) doesn't really fix it either, because the structure and layout of the page should not be in the database; you should still manage and edit those locally, and ONLY the actual content blocks should be in the database.


ScDraft

My five cents on this great discussion: I think the best approach is the one described by @BENFREKE in the third post, and I use the same one. In short:

I have a production branch on the production server, a repository for the project (for example Bitbucket), and my local copy of the site. The deployment steps are:

  • commit all changes on production
  • push them to rep.production (e.g. Bitbucket or your own repository)
  • pull them from rep.production into local.master (merge if needed)
  • push local.master to rep.master
  • deploy: pull rep.master into production.production (ff-only)

The trick is to use Envoy to automate those tasks, so even a long deploy (when users have changed content on production) comes down to about five commands on my local machine:

envoy run status (i.e. get the status on production)
envoy run commit (i.e. commit all changes on production and push them to the production branch of the main repository)
git pull origin production
git push origin master
envoy run deploy (i.e. pull rep.master on production)
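
For reference, a minimal Envoy.blade.php along those lines might look like the sketch below (the server address, site path, and commit message are placeholders, not my real setup):

    @servers(['production' => 'deploy@example.com'])

    @task('status', ['on' => 'production'])
        cd /var/www/site
        git status
    @endtask

    @task('commit', ['on' => 'production'])
        # commit whatever was changed on production and push it to the production branch
        cd /var/www/site
        git add -A
        git commit -m "Content changes made on production" || echo "Nothing to commit"
        git push origin production
    @endtask

    @task('deploy', ['on' => 'production'])
        # fast-forward production to rep.master
        cd /var/www/site
        git pull --ff-only origin master
    @endtask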

That's all.

abass

So I have finally decided on a fairly simple solution that I am pretty happy with. This is a detailed explanation of how to do it if you are using Envoyer.io. If you aren't, all you really need to pay attention to is the part about the Backup Manager plugin, because Envoyer.io is a little funky in how it does zero-downtime deployment, and that was one of the bigger hurdles for me to overcome.

I use Envoyer.io to deploy and manage my projects, and it is set up so each deploy goes into a completely new folder; it's a zero-downtime deployment service. Essentially it uploads the most recent commit to a folder labelled with the date and time of the upload, and a folder called "current" automatically switches over to the newest release. Anyway, that's beside the point; go look into Envoyer, it's pretty nifty and has tons of features.

So anyway, Envoyer keeps the "storage" folder outside of the site folder and pulls it in as a "linked folder" (a shortcut directory), so it stays put when you push new repo updates and doesn't get re-uploaded each time.

So what you can do is make the theme's content directory a linked folder as well and have it sit inside that storage directory (outside of the site folder). In Envoyer, go under Deployment Hooks and set your content folder up as a linked folder:

FROM: themes/theme-name/content
TO: storage/content

(Keep in mind that the storage folder actually lives outside of the site folder that the VCS repo pushes to, so the setting above creates a shortcut folder inside the site that points out to storage.) I then added the themes/theme-name/content directory to .gitignore, so whatever I push or do to the site, the content sits outside of the site's GitHub repo directory and is never touched or changed.
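
(For anyone reading along, the linked folder boils down to a symlink plus a gitignore entry; the paths below are only illustrative of what Envoyer does for you on each release:)

    # .gitignore
    themes/theme-name/content

    # roughly what the linked-folder hook does inside each new release folder
    ln -sfn /path/to/app/storage/content /path/to/release/themes/theme-name/content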

So that's great. If you aren't using Envoyer, you can pretty much ignore everything above this point (it probably sounds like gibberish if you haven't played with Envoyer and seen how it works); you'd just remove the content directory from your git repo.

Now the important part: use the Backup Manager plugin and tell it to back up the storage/content directory (or, if you aren't using Envoyer, the themes/theme-name/content directory), and set it to run every 5 minutes, 15 minutes, hourly, daily, or whatever you want:

[screenshot: Backup Manager backup settings, with the content path and an "include root path" entry]

(You would just remove the "include root path" entry from the example above if you only want it to back up the content and not the entire site; since the site is in VCS, you really shouldn't need it for anything but the content anyway.) It's also nice that this plugin regularly backs up your database, so you end up using it for the content (which is no longer in your VCS) and your site's database.

And boom: you give the client full control of the "content" directory within the site, anything they change in there is backed up by the plugin as many times a day as you want, and no matter what repo updates you push, the content isn't affected.
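
(If you would rather not rely on the plugin for the content part, the same effect can be had with a plain cron job; the schedule and paths below are just an example, not the plugin's configuration:)

    # archive the linked content folder every hour into a dated tarball
    0 * * * * tar -czf /home/backups/content-$(date +\%Y\%m\%d\%H\%M).tar.gz -C /path/to/app/storage content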


info14532

How do you apply the contents of the backup to VCS?

I'm following a similar process by using the backup plugin to write to Dropbox. I have the Dropbox app on my laptop with my local project root inside Dropbox, so as soon as the backup runs it is copied to Dropbox and automatically synced down to my machine, and I have the most recent version of the content locally.

It's explained in this How To Handle Content Updates On Production in October CMS guide.

