DEV Community

Automatic Deployment via good ol' FTP

Andreas on September 03, 2020

Since their release, GitHub Actions have been on my long-term todo list for increasing automation of my workflows. Thanks to DEV's GitHub Actions Hackatho...
Michael "lampe" Lazarski

Just one thing to keep in mind:
FTP was not built to be secure.
FTP sends usernames and passwords in clear text over the network.
FTP is vulnerable to sniffing, spoofing, and brute-force attacks, among other basic attack methods.
SFTP or SSH should be used instead.

Andreas

Totally agree, thank you for this addition. git-ftp supports FTP, FTPS and SFTP; you just have to specify the protocol at the beginning of the URL, e.g.:

ftps://your.server.com
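For example (a minimal sketch — server name, path and user are placeholders), the protocol can be set once via git-ftp's config keys so every push uses SFTP:

```shell
# placeholders: replace server, path and user with your own values
git config git-ftp.url "sftp://your.server.com/htdocs"
git config git-ftp.user "deploy-user"

# first-time upload of everything; afterwards only changed files are pushed
git ftp init
git ftp push
```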
Andreas

Added a corresponding security hint in the article. Thanks again 😊

Dennis Keirsgieter

You should go a step further and let the building (and testing) also be done via GitHub Actions. I've been doing that for a while now and it feels great knowing all I have to do is push my code (and/or release, depending on the repo) for it to spin up, do its thing and deploy it to FTP.

Andreas

Of course, this would be the logical next step. Do you also use GitHub Actions for that? How do you handle failed tests?

Dennis Keirsgieter

I do use GH Actions for that, yes. And if a test fails I get notified via email and it will simply stop deploying :)
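A minimal sketch of that idea (assuming npm-based tests and a git-ftp deploy; the server URL and secret names are placeholders) — a failing test step aborts the job, so the deploy step never runs:

```yaml
name: Test and deploy
on: push
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0
      - run: npm ci
      - run: npm test   # a failure here stops the job before deployment
      - name: Deploy via git-ftp
        run: |
          sudo apt-get update && sudo apt-get install -y git-ftp
          git ftp push --user ${{ secrets.FTP_USER }} --passwd ${{ secrets.FTP_PASSWORD }} sftp://your.server.com/htdocs
```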

Andreas

Cool! I looked through your GitHub repos and found one of your yml files - I will take it as a starting point to increase automation of my own workflows - thanks again 😊

Dennis Keirsgieter

Yeah, I think all that is missing from that repo is the testing part. Just run a test command and if it fails, the workflow should stop altogether. The git reset --hard command is there because I had an issue earlier with the FTP plugin also uploading files I didn't want on my FTP server. Not sure if that is still needed. No problem and have fun with it!

Andreas

Thank you for sharing your workflow 👍🏻

Shaiju T

Nice 😄. In some cases I upload to GoDaddy Windows hosting manually using the FileZilla FTP client. So you mean using your workflow I can automate that? Also, how can I specify the FTP port number?

Andreas

Exactly, you can automate that. The port number is appended to the server URL, separated by a colon, e.g. ftps://your.ftp-server.com:21.

Jukka-Pekka Keisala

I tried to use FTP deploy but it did not fit my needs. I think it is OK for a few files, but when you need to move thousands of small files it is way too slow. FTP's weakness is that it transfers each file individually. Perhaps you could have a process that zips the directory, then FTP transfers the zip, and then some other process on the server unzips it... But then why not just hook directly into Git hooks or something alike?
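That workaround could look roughly like this (a sketch only — it assumes SSH access to the server and placeholder paths, which goes beyond plain FTP but matches the "other process on the server" idea):

```shell
# bundle locally, transfer once, unpack remotely (all names are placeholders)
zip -r site.zip dist/
scp site.zip deploy@your.server.com:/var/www/
ssh deploy@your.server.com 'cd /var/www && unzip -o site.zip && rm site.zip'
```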

Andreas

Oh, I know exactly what you mean! If only FTP had some kind of built-in transport compression/bundling, it would be so much faster.

The good thing about git-ftp is that you only have to upload all your files once. Every following upload will only contain the files that changed since the last upload (which I assume is not that many files). You can even run git ftp catchup if your files already exist on your server. But I agree that there are some limitations at some point. So it depends on the project, I guess.
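The catchup case looks like this (assuming the files on the server already match your current working tree):

```shell
# records the current commit as "already uploaded" without transferring anything,
# so the next git ftp push only sends changes made after this point
git ftp catchup
```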

sauln

If you don't increase the fetch depth, you'll get screwed here on more than a few commits. You'll end up re-uploading your whole repo, not just the changes.

Andreas

Yes, you're right. At least the last 2 commits are required in order to determine the differences, so you can either increase the fetch depth or deploy on every commit. I'm not sure if increasing the fetch depth has any side effects though...
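In a workflow, increasing the fetch depth is done via the actions/checkout input (fetch-depth: 0 means "no shallow clone", i.e. the full history):

```yaml
# fetch full history so git-ftp can diff against the last deployed commit
- uses: actions/checkout@v4
  with:
    fetch-depth: 0
```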

sauln

Whatever they are, I'd bet they're worth not risking 2-hour deploys by accident if you commit a few too many times before pushing.

Vitalii

Will this approach work for WordPress websites?

Andreas

It will. However, the first deployment could take some time, since all files have to be uploaded.