Hi there -
EspoCRM has been a critical part of our company for a few years now.
However, I have struggled to come up with a flow that I really liked...until now.
My problem before was the ugliness of making changes in production that landed in Custom, while Modules were the 'preferred' way to load in customizations. I found the prescribed workflow clumsy, and it was hard to feel confident that my version control system (git) stayed in sync with changes in the database.
I finally bit the bullet and moved from 5.8 to 8.0 (yes, that's how long it took me to want to bite the bullet; there were too many breaking changes and I didn't have the development capacity to handle them). Couple that with my fanatical fear of data loss or corruption, and I just kept kicking the can down the road.
But, Docker.
Docker is amazing, and the team has had an official image for a long time. I wish I had taken the initiative to use it sooner (https://docs.espocrm.com/administrat.../installation/).
So, my flow and my stack have become:
1. Use the docker-compose.yml file provided and officially supported to run the application.
2. Inside of docker-compose, mount custom, client/custom, and data/logs from the host machine into the espocrm volume. custom and client/custom go into version control, as do my Dockerfile (which installs vim in the container) and my docker-compose file.
3. I keep my database outside of the docker container and update the espocrm environment variables to point at the IP address of my host machine, accessing the database there on port 3306. This lets me set up my hourly backups directly to Amazon S3 more easily than trying to figure out how to connect to the MySQL instance inside the container. I use simplebackups.io for backups. It also allows me to swap the database out for Amazon RDS (which I intend to do) to facilitate scaling without having to manage infrastructure.
4. I keep an active production-edits branch on my server; when I need to do work, I merge it into master, push to GitHub, and pull down to my local environment.
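To make steps 1–3 concrete, here is a rough sketch of what that compose file can look like. The container paths, host IP, and credentials below are placeholders; the ESPOCRM_DATABASE_* variable names come from the official image, but verify them against the version you're running:

```yaml
# docker-compose.yml (sketch - adjust paths, IP, and credentials)
services:
  espocrm:
    build: .                       # Dockerfile layered on the official image (adds vim)
    environment:
      # Step 3: point at MySQL on the host machine, not a container.
      ESPOCRM_DATABASE_HOST: 192.168.1.10    # example host IP
      ESPOCRM_DATABASE_NAME: espocrm
      ESPOCRM_DATABASE_USER: espocrm
      ESPOCRM_DATABASE_PASSWORD: change-me
    volumes:
      # Step 2: only these live on the host; custom/ and client/custom/
      # are the version-controlled ones, data/logs is for convenience.
      - ./custom:/var/www/html/custom
      - ./client/custom:/var/www/html/client/custom
      - ./data/logs:/var/www/html/data/logs
    ports:
      - "8080:80"
```

The nice side effect of mounting only those three paths is that the rest of the application stays inside the image, so upgrading is just pulling a newer image.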
On my local development environment, then, I create a new branch and get to work. If I'm not doing anything that edits the database, I point my local docker-compose database settings at my production database, so I have access to that data in real time.
But, I also have a staging database that mirrors the schema of the production database that I can use if I'm doing database updates or rebuilds, so I can see how they work.
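One way to express that production/staging switch without touching the main compose file is a local-only override file (the hostnames below are hypothetical):

```yaml
# docker-compose.override.yml (local only, not committed)
# Swap the database target per task: staging for schema changes and
# rebuilds, production (read-mostly work) otherwise.
services:
  espocrm:
    environment:
      ESPOCRM_DATABASE_HOST: staging-db.internal   # or the production host IP
      ESPOCRM_DATABASE_NAME: espocrm_staging
```

docker-compose merges the override on top of docker-compose.yml automatically, so switching databases is just editing (or swapping) this one small file.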
Finally, when I'm done, I push my changes up to GitHub and pull down again on the server. I complete the merge, then recreate my production-edits branch, which serves as a catchall for all the GUI updates until the next time I want to release a feature or an update.
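The branch cycle above can be sketched as a few git commands. This runs against a throwaway local repo so the steps are easy to follow; the branch names match my setup, but the GitHub push/pull steps are only indicated in comments:

```shell
#!/bin/sh
# Demo of the production-edits release cycle in a throwaway repo.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email demo@example.com
git config user.name demo
echo base > file.txt
git add . && git commit -qm "initial"
git branch -M master

# On the server: GUI tweaks accumulate on production-edits
git checkout -qb production-edits
echo "gui tweak" >> file.txt
git commit -qam "admin panel change"

# Release time: fold production-edits into master
git checkout -q master
git merge -q production-edits
# (here: git push origin master, then pull on the dev machine, branch, work,
#  push, pull back to the server, and complete the merge)

# Recreate production-edits as a fresh catchall for the next cycle
git branch -D production-edits
git checkout -qb production-edits
```

Deleting and recreating the branch (rather than merging master back into it) keeps production-edits as a clean slate that only ever contains GUI changes made since the last release.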
As an aside, if I DO need to work on the server, I just SSH in from VSCode, but I try to avoid this.
I'm sure there are better approaches, but this one has really been working well for me and has allowed me to get a lot done much more quickly. Hope it helps some of you.