When I started developing, particularly on servers, I began writing scripts to get them set up, which mainly stemmed from the fact I didn’t know what I was doing. I would boot up a virtual machine or VPS, start tinkering with Apache or Nginx, and nonchalantly change a setting without thinking about it. Ten minutes later I’d be desperately trying to find which setting had changed, most of the time leaving me feeling utterly defeated.
I thought there had to be a better way, and there is! Enter Bash scripts!
Using Linux at the time made it easy to write scripts to install everything I needed for a new instance. I started to take what I called the destructible approach whereby I don’t install anything on an instance unless it’s in the script. What that meant is I could tear down my server and get it back to where it was in less than 10 minutes. Beautiful.
Then it clicked. Why not use this pattern in my normal setup? 🤯🤯🤯
Below I will explain how I achieved this on Linux and Windows.
Linux 💻
I got to it and started creating what I now know as “Linux Tools” which is my set of scripts that do the above and more.
Installs ⚡
Linux is command-line driven, which means apps can be installed from the command line. The most common approach is to use a package manager; the one that comes with Ubuntu is called apt.
To install git you will issue the following commands (in an Ubuntu flavoured distribution):
sudo apt update
sudo apt install git
- The first command refreshes the package lists from the configured repositories
- The second installs git
The advantage of installing programs this way is that it is scriptable. We can put the above into a script (.sh) file and then run it from the command line.
If we save the above into a file called “installGit.sh”, two commands get the script ready to execute and then run it.
sudo chmod u+x ./installGit.sh
./installGit.sh
The first command makes the script executable and only needs to be run once on each script.
The second command runs the script as if we had run the commands in the terminal ourselves.
Any command that can be executed in the terminal can be scripted this way allowing us to set up our environment extremely quickly.
Functions 🛠
I use functions to organise my code. I call update once at the start of the script to avoid having to call it in each function:
sudo apt-get update

# function definition
GIT_INSTALL () {
    sudo apt-get install git
}

# function call
GIT_INSTALL
This may seem like a lot of boilerplate, but the advantage is that installing an app can involve many steps, and a function encapsulates them all in one place; the above is the simplest example.
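To illustrate, here’s a sketch of a multi-step install wrapped in a single function. The package names, key URL, and function name are my illustrative assumptions, not an exact recipe:

```shell
#!/bin/bash
# A sketch of a multi-step install wrapped in one function.
# The URL and package names below are illustrative assumptions.
DOCKER_INSTALL () {
    # 1. prerequisites for fetching the vendor's key
    sudo apt-get install -y ca-certificates curl
    # 2. add the vendor's signing key to apt
    curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -
    # 3. install the package itself
    sudo apt-get install -y docker-ce
}
```

Calling DOCKER_INSTALL then runs all three steps as one unit, so a rebuild stays a single command.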
Dotfiles 📁
The above explains automating package installs, but what blew my mind when first getting into this was being able to restore apps to the state I had them in before a reinstall 🤯 This is where Dotfiles come in. I’m not going to go into lots of detail about Dotfiles, as you can find that in this awesome post:
How I increased my productivity using dotfiles [updated]
Mpho Mphego ・ Mar 3 '19
Simply put, Dotfiles are user configuration files 🔧. They hold any changes that you, the user, have made to the settings of the program/app in question. One of the most common you may have come across is .gitconfig, which stores info about your git account on your local system. When you run:
git config user.email "julianiaquinandi@gmail.com"
It will update your .gitconfig with:
[user]
    email = julianiaquinandi@gmail.com
As mentioned, I have a set of files that get copied to where they need to go as part of my install; you can find my dotfiles here
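As a sketch, that copy step can be a single helper function; the repo path, backup scheme, and file names below are my assumptions for illustration:

```shell
#!/bin/bash
# Sketch: restore dotfiles from a local clone of a dotfiles repo.
# DOTFILES_DIR and the file names are assumptions for illustration.
DOTFILES_DIR="$HOME/dotfiles"

restore_dotfile () {
    # keep a backup of any existing file before overwriting it
    if [ -f "$HOME/$1" ]; then
        cp "$HOME/$1" "$HOME/$1.bak"
    fi
    cp "$DOTFILES_DIR/$1" "$HOME/$1"
}

if [ -d "$DOTFILES_DIR" ]; then
    restore_dotfile .gitconfig
    restore_dotfile .bash_aliases
fi
```

The backup step means a bad restore never costs you the settings you already had.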
Aliases 👻
I use aliases to quickly perform commands without having to type the whole thing out. In this scenario, you could map the alias “installOs” to run your installer script. It’s not a practical example, as you’re not likely to need to run your installer script all the time, although I thought it was worth mentioning because:
- Aliases are awesome and save me lots of time
- Part of my install script copies my aliases file from a repo to my local install ready to be consumed by the command line.
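For a flavour of what that file holds, here is a minimal sketch of an aliases file. The alias names and the script path are my own illustrative choices; on Ubuntu, the default ~/.bashrc sources ~/.bash_aliases automatically if it exists:

```shell
# ~/.bash_aliases - a few illustrative entries (names and paths are examples)
alias gs='git status'
alias update='sudo apt update && sudo apt upgrade -y'
# the "installOs" example from above, pointing at a hypothetical script path
alias installOs='bash ~/linux-tools/install.sh'
```

Once the file is in place, open a new terminal (or run `source ~/.bashrc`) and the shortcuts are ready to use.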
If you haven’t used them before I would recommend checking this article for a primer:
Bash Aliases (aka part 1 of 2000 of how to get more done in less time)
Aniket Kadam ・ May 31 '19
Windows 💻
Granted I have a lot more developer experience on Linux but I was able to recreate what I do to automate my OS on Windows with less effort than expected.
Installs ⚡
Unfortunately, Windows does not ship with a package manager, but there is a way to achieve command-line installs. Chocolatey touts itself as “The Package Manager for Windows”.
To replicate the git install from above we will run the following:
choco install git
That’s it.
Seems easy, and it is. The biggest issue I’ve found with Chocolatey is that it doesn’t have everything I need. You can search the package repository here
To turn it into a script, you could put the above into a PowerShell script, “installGit.ps1”, and run it from the command line like any other script:
./installGit.ps1
Dotfiles 📁
Dotfiles work the same in Windows as they do in Linux. They are user configuration files that I save to a repo and then copy to their required location to restore an app’s settings.
To back up your git config on Windows it’s as simple as copying C:\Users\YOUR_USERNAME\.gitconfig to a backup location/repo. If you want to restore it, simply copy it back to its original location 😉
Aliases 👻 & Functions 🛠
I found creating aliases a bit confusing at first, but then I realised you can use functions defined in your default PowerShell profile.
Here’s a couple of examples from mine. The first is used to install packages from Chocolatey by typing “install”. The second clones a repo by typing its name:
function install() {
    choco install -y $args
}

function clone() {
    clear
    echo "Enter repo name"
    echo "-----------------"
    $repo = Read-Host -Prompt '|=>'
    $start = $repo.Substring(0, 4)
    clear
    if ($start -eq "http") {
        git clone $repo
    } else {
        git clone git@github.com:kensleDev/$repo
    }
}
With the above in your PowerShell profile, you can simply type the name of the function to call it, e.g.
install git
clone
Wrapping Up 🏁
I love scripting away menial tasks and this was a little look into how I achieve that within my OS. I am constantly refining my scripts and finding better ways to save time and concentrate on what’s important.
TLDR - An overview of how I automate the setup/maintenance of my OS/Environment via Bash/Powershell scripts.
Top comments (14)
This is a nice article, but the title can seem misleading to some. When I think of automating an OS I think Shell/Python/AutoKey in Linux and AuotHotkey/AutoIt/Selenium and just about any other language in Windows.
Hi Ian. Thanks for commenting and your feedback! Funnily enough, my next post is about using AutoHotkey and Bash 🤣 I'm a massive fan of both. It's a tricky one (for me anyway, second ever post 😜) maybe "How I automate my OS installs" is more fitting but I'm not sure.
edit Also bonus points for being the first person to comment on one of my posts 🎉🎉🎉
You're welcome. I just finished my first post on AHK here last night, but I've been reading articles here for a little while now. Love this place.
I'm a long time lurker and LOVE this community! I also love AHK, use it for all sorts and have a program that lets you create your own custom quick chat messages for a game called Rocket League. Got an email the other day and it cost some poor guy 3 of his steam accounts! (Some games class macros as cheating). So if you play games on your dev machine be careful!
I only cheat in single player games and only because I don't have time to fully play through all the thousands of games I want to play lol. The fact that people use it a lot for cheating is one of the reasons it has a bad name. It's more than a game cheating tool; far more.
I love your perspective here. As a developer it seems like the list on install and configure steps to get my workstation up and running is ever growing. I have been tinkering on ideas on how to automate the process and came to a similar solution. Now I am starting to work on a project to make this concept easy to manage, store in the cloud and sync between machines regardless of OS. I would love feedback on my idea and obviously any contributions from the community!
So far I mainly have worked on writing a wiki with my ideas and put together a basic setup to test with as I formalized my design. If anyone is interested please check out the wiki for my repo: github.com/itlackey/drifter
Not trying to spam your comments but am honestly looking for people in the community that might be interested in a tool like this and/or would like to help make it happen.
Thanks for writing this up and inspiring others to automate all the things!
Thanks! I feel like my career in automating all the things is just beginning! 😉
No worries for posting your project, I've tried implementing something similar to what you are proposing with drifter.
I've settled on syncing everything via Github. I have a couple of bash scripts that either pull new changes in or push them back to the repo. So when changing a dotfile I run the appropriate script. My next step was to add some sort of file watcher to do this automatically but haven't got round to it.
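A minimal sketch of what those two scripts might look like as functions (the repo path is an assumption on my part):

```shell
#!/bin/bash
# Sketch of two-way dotfiles sync via a git repo (path is an assumption)
DOTFILES_REPO="$HOME/dotfiles"

pull_dotfiles () {
    # bring down changes made on another machine
    git -C "$DOTFILES_REPO" pull
}

push_dotfiles () {
    # stage, commit and push any local edits, stamped with today's date
    git -C "$DOTFILES_REPO" add -A
    git -C "$DOTFILES_REPO" commit -m "Sync dotfiles $(date +%F)"
    git -C "$DOTFILES_REPO" push
}
```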
For installs I have a bash script that generates a menu from the names of files in a scripts folder. I generally write a bash script for each package to install and can pick and choose what to install in the script.
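That menu idea can be sketched with bash's built-in select, which numbers whatever filenames it is given (the directory path here is an assumption):

```shell
#!/bin/bash
# Sketch: build an interactive menu from the filenames in a scripts folder.
# SCRIPTS_DIR is an assumed path for illustration.
SCRIPTS_DIR="$HOME/linux-tools/installs"

run_menu () {
    # select prints a numbered menu of the matched files plus "quit"
    select script in "$SCRIPTS_DIR"/*.sh quit; do
        if [ "$script" = "quit" ]; then
            break
        elif [ -n "$script" ]; then
            bash "$script"
        fi
    done
}
```

Typing the number next to a script runs it; choosing quit exits the loop.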
For VSCode I use Settings Sync which I'm more than happy with but I am interested in integrating this into my script.
Have a look at my repo Linux Tools, most of the good stuff is in the installs folder. It's nothing special but it works for me!
That is great stuff!! It seems like our approaches are more similar than I originally thought. Basically drifter just adds a layer on top of what you are doing to help package the scripts for each app so they can be easily shared and used by others, similar to npm, apt or package manager x. I started down that road for my own organization and sanity and then added the idea of syncing the associated configurations along with the app packages.
In theory this could provide the same feature as Windows 10's settings synchronization but across all machines/OSes.
Anyway, I can rant all day about this stuff. So I will just say if you or anyone is interested in working together to make something like this please contact me. I'd love to work together with other automation junkies and am not opposed to ditching my project to contribute to something else that solves the same problem.
Very nice article.
I do use the Windows profile file (well, actually through Cmder/ConEmu, but that doesn't change much) to automate a few common tasks for basic development without Docker or other "virtualwares":
And then my favorite
Thanks, I appreciate it. I like Cmder/ConEmu, it's a really good terminal for Windows, but I'm back on Linux for the foreseeable future I think.
I've not yet tried the new Windows Terminal but I've heard good things.
I'm team ansible. Sorry. ...but not so sorry lol.
My boss is switching from Windows to Linux when I buy my next workstation tomorrow. Watch me bootstrap his whole rig with ansible 😂
Never heard of ansible but it looks interesting. Will definitely be checking it out. Thanks for bringing this to my attention 😀
If you've never heard of Ansible, you may as well look up puppet and chef too. They're all similar tools - configuration management and machine provisioning.
I like ansible because it's yaml and I had a smoother time doing things on my local machine (install this. Setup this project. Do these other things I don't feel like doing). ...but the others may be more "you".
But... If you're bashing apt installs, I almost promise you'll go for one of them 😉
I've heard of puppet and chef mentioned by the environment guys at work but wasn't sure what they did or in what context they were being used.
My bash scripts have done me proud and I've learnt a lot but I'm going to have a look at them all and see what's what.
Thanks for sharing 🙏