Recently I had to do this. I found a shell script that cloned all repositories matching a search query and edited it to suit my purposes.
You’ll need
- A working installation of Git with Git Bash.
- A copy of jq in a directory that's in your PATH (or in the directory where you're running the script).
- Git set up to work with SSH. Alternatively, you can try changing "ssh_url" below to "clone_url", which should pull the HTTPS URL instead.
- An approximate count of the number of repos you'll be cloning. Set the end of the loop in the first line to that number divided by 100, rounded up to the next whole number (or compute it automatically, as in the sketch after this list). For example, I had a little over 100 public repos to clone, so my loop is `1..2`.
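
If you'd rather not count by hand, the GitHub users API reports a `public_repos` count, so you can work out the number of pages with a little shell arithmetic. This is just a minimal sketch, assuming curl and jq are available as above:

```bash
# Sketch: compute how many pages of 100 the loop needs.
# Replace [your user name] just like in the script below.
count=$(curl -s "https://api.github.com/users/[your user name]" | jq -r '.public_repos')
pages=$(( (count + 99) / 100 ))   # divide by 100, rounding up
echo "Loop should be {1..$pages}"
```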
The Script
```bash
# Collect the clone URLs for each page of up to 100 repos, then clone them 8 at a time.
for i in {1..2}
do
  curl "https://api.github.com/users/[your user name]/repos?per_page=100&page=$i" \
    | jq -r '.[].ssh_url' >> urls.txt
done
cat urls.txt | xargs -P8 -L1 git clone
```
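
As a side note, here's an untested variant of the same idea that keeps requesting pages until the API returns an empty one, so you don't need to know the repo count up front (swap `ssh_url` for `clone_url` here too if you prefer HTTPS):

```bash
# Variant sketch: page until the API returns an empty list instead of hard-coding the page count.
page=1
while :; do
  urls=$(curl -s "https://api.github.com/users/[your user name]/repos?per_page=100&page=$page" \
    | jq -r '.[].ssh_url')
  [ -z "$urls" ] && break          # an empty page means we've seen every repo
  echo "$urls" >> urls.txt
  page=$((page + 1))
done
cat urls.txt | xargs -P8 -L1 git clone
```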
Then just run the script in Git Bash: `sh [scriptname]`
And that did it for me. Hope it helps.
FYI, I have so many repos because I had to mass fork a bunch of repos for work to do some batch operations, then mass commit, push, and submit pull requests.