In my previous company we also used a database dump for e2e tests: the dump was imported into the Postgres database before running the tests. This way we made sure the tests actually passed against a real database. The dump was then updated from time to time on an ad hoc basis.
That's a good idea that brings your tests even closer to the production environment. Good point. I wonder how big the dump was, though, and how big the resulting containers were?
Actually, our dump is from staging (we don't want to meddle with production data; imagine a test email arriving in a customer's inbox 😅), and the database is not that big: as I recall, it was less than 100MB. And since the dump is only updated once in a while, it's baked into the Dockerfile rather than imported every time docker-compose runs.
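For anyone curious what "baked into the Dockerfile" can look like: the official Postgres image runs any `*.sql`, `*.sql.gz`, or `*.sh` file placed in `/docker-entrypoint-initdb.d/` the first time the data directory is initialized. A minimal sketch (the image tag and the dump filename `staging_dump.sql.gz` are assumptions, not the poster's actual setup):

```dockerfile
# Sketch: bake a staging dump into a custom Postgres image.
FROM postgres:16

# Files in this directory are executed automatically by the official
# image's entrypoint on first initialization of the data directory.
COPY staging_dump.sql.gz /docker-entrypoint-initdb.d/
```

Since the seed scripts only run when the data volume is first created, refreshing the data means rebuilding the image and recreating the volume, which matches the "updated once in a while" workflow rather than importing on every `docker-compose up`.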