I encountered an error when pushing my database to GitHub, and Git suggested checking out Git LFS (Large File Storage).
Quoting from the Atlassian article:
Git LFS does this by replacing large files in your repository with tiny pointer files. During normal usage, you'll never see these pointer files as they are handled automatically by Git LFS
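For a rough idea of what that means, a Git LFS pointer file is just a few lines of plain text; the hash and size below are placeholder values:
version https://git-lfs.github.com/spec/v1
oid sha256:e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855
size 1048576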
It's easy to apply, even to an existing large file; in my case, the database.
# for the very first time, run install
git lfs install
# add the large files you want to track
git lfs track '*.sqlite3'
# without argument, you can see the patterns being tracked
git lfs track
# show the files being tracked
git lfs ls-files
Note that after running git lfs track '*.sqlite3', it will automatically create a .gitattributes file that records the patterns.
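If you open .gitattributes, the entry it adds should look roughly like this:
*.sqlite3 filter=lfs diff=lfs merge=lfs -text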
Then commit your large file and .gitattributes, push to GitHub, and you're done.
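As a minimal sketch of that step (db.sqlite3 and main are placeholder names for my file and branch; use your own):
# stage the attributes file and the large file
git add .gitattributes db.sqlite3
git commit -m "track the database with Git LFS"
git push origin main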
When others want to pull the remote repo, after running git lfs install (only the very first time), they can git pull as usual.
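Their side might look roughly like this (the repo URL is a placeholder):
# one-time setup on their machine
git lfs install
# clone or pull as usual; Git LFS swaps the pointers for the real files
git clone https://github.com/<user>/<repo>.git
# or, in an existing clone
git pull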
reference:
UPDATE:
After adding more data to the database, GitHub reaches its limit even with LFS.
Let's just keep the database as a local file.
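If you also want Git to stop tracking it, one way (assuming the file is named db.sqlite3) is:
# remove the file from the index but keep it on disk
git rm --cached db.sqlite3
# ignore it from now on
echo "db.sqlite3" >> .gitignore
git add .gitignore
git commit -m "stop tracking the database"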
(^._.^)ﾉ