The fuss
I am the maintainer of an 11-year-old PHP application. It's a monolithic application with only 5 classes, but it works. It was built on MySQL 5 and PHP 5.3, and it now keeps working with MariaDB 10 and PHP 7.4.
Static code analysis
The tools I use most to maintain the code and add new functionality are:
- git, to track changes in the code;
- Robot Framework, to create test cases for the web application's functionality;
- Postman, to test web services and APIs;
- PhpStorm and PHPStan, which provide a lot of suggestions to avoid deprecated functionality and improve code readability (see the sketch after this list);
- SonarQube, to track how code quality changes over time and to find ugly pieces of code and forgotten parts.
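For instance, a PHPStan run looks more or less like this (the src path and rule level are assumptions, not the real project configuration):

```bash
# a minimal sketch: analyse the application sources with PHPStan
# (the src/ path and --level value are assumptions, adjust to the real layout)
composer require --dev phpstan/phpstan
vendor/bin/phpstan analyse src --level=5
```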
Deploy the changes
Until now, deploys were done manually, comparing the code base with Beyond Compare. Today I tried a new approach using git.
I copied the .git folder into the stage and production environments, keeping only the stage and main branches in each of the two environments.
Every change is a new branch; when I have to publish it, I merge the branch into the stage or main branch, roughly as in the sketch below.
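On my machine the workflow is something like this (the branch name is just an example, and the "origin" remote name is an assumption):

```bash
# a rough sketch of the branching workflow (branch name is illustrative)
git switch -c fix-invoice-totals   # start a new branch for the change
# ... edit, commit, test ...
git switch stage                   # merge into stage first
git merge fix-invoice-totals
git push origin stage
git switch main                    # then promote the same change to main
git merge fix-invoice-totals
git push origin main
```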
On the server I run:

```bash
git fetch   # get the changes from the remote
git merge   # merge them into the checked-out branch
```
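Before merging I can also preview what is about to land, with something like this (assuming the production checkout tracks origin/main; adjust remote and branch names):

```bash
# a sketch: preview the incoming changes before merging
git fetch
git log --oneline HEAD..origin/main   # commits that would be merged
git diff --stat HEAD origin/main      # files that would change
```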
If some script has local modifications, the last command shows the list of files that are not aligned.
I can restore the files with:

```bash
git restore <file_name>
```
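If several files are out of line and the local edits on the server are really unwanted, something like this should clean them up in one go:

```bash
# a sketch: list the locally modified files, then discard those edits
# (only safe if the local changes on the server are really unwanted)
git status --short   # show which files are not aligned
git restore .        # restore every modified tracked file
```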
It works, but I'm afraid of creating a complete mess.
Any suggestions for an alternative strategy?