# Importing a database dump into a staging environment

Sometimes it is useful to import the database from a production environment into a staging environment for testing. The procedure below assumes you have SSH+sudo access to both the production environment and the staging VM.
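
Both the streaming dump and the directory copy below run `sudo` over SSH without a terminal, so `sudo` typically needs to work non-interactively on both hosts. A quick sanity check, assuming the example hostnames `gitlab.example.com` and `staging-vm` used throughout this document:

```shell
# On LOCAL MACHINE: verify that SSH works and that sudo succeeds
# without prompting (-n makes sudo fail instead of asking for a password).
for host in gitlab.example.com staging-vm; do
  ssh "$host" sudo -n true && echo "$host: OK" || echo "$host: check SSH/sudo"
done
```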

**Destroy your staging VM when you are done with it. It is important to avoid data leaks.**

On the staging VM, add the following line to `/etc/gitlab/gitlab.rb` to speed up large database imports.

```shell
# On STAGING
echo "postgresql['checkpoint_segments'] = 64" | sudo tee -a /etc/gitlab/gitlab.rb
sudo touch /etc/gitlab/skip-auto-migrations
sudo gitlab-ctl reconfigure
sudo gitlab-ctl stop unicorn
sudo gitlab-ctl stop sidekiq
```
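
Stopping Unicorn and Sidekiq ensures nothing writes to the database while the import runs. If you want to confirm both services are really down before importing, `gitlab-ctl status` reports the state of every service:

```shell
# On STAGING: both services should be reported as "down".
sudo gitlab-ctl status | grep -E 'unicorn|sidekiq'
```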

Next, we let the production environment stream a compressed SQL dump to our local machine via SSH, and redirect this stream to a psql client on the staging VM.

```shell
# On LOCAL MACHINE
ssh -C gitlab.example.com sudo -u gitlab-psql /opt/gitlab/embedded/bin/pg_dump -Cc gitlabhq_production |\
  ssh -C staging-vm sudo -u gitlab-psql /opt/gitlab/embedded/bin/psql -d template1
```
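
If the connection drops partway through, the direct stream has to restart from scratch. As an alternative sketch, you can land the compressed dump in a local file first and replay it in a second step; the filename `gitlabhq_production.sql.gz` is only an example:

```shell
# On LOCAL MACHINE: save the dump locally first...
ssh -C gitlab.example.com sudo -u gitlab-psql /opt/gitlab/embedded/bin/pg_dump -Cc gitlabhq_production |
  gzip > gitlabhq_production.sql.gz

# ...then replay it against the staging VM.
gunzip -c gitlabhq_production.sql.gz |
  ssh staging-vm sudo -u gitlab-psql /opt/gitlab/embedded/bin/psql -d template1
```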

## Recreating directory structure

If you need to re-create some directory structure on the staging server, you can use this procedure.

First, on the production server, create a list of directories you want to re-create.

```shell
# On PRODUCTION
(umask 077; sudo find /var/opt/gitlab/git-data/repositories -maxdepth 1 -type d -print0 > directories.txt)
```
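
Because of `-print0` the list is NUL-delimited, so it appears as one long line in a pager. To inspect or count its entries:

```shell
# On PRODUCTION: convert NUL separators to newlines for inspection.
tr '\0' '\n' < directories.txt | head
tr '\0' '\n' < directories.txt | wc -l
```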

Copy `directories.txt` to the staging server and create the directories there.

```shell
# On STAGING
sudo -u git xargs -0 mkdir -p < directories.txt
```
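
To verify the result, you can compare the number of entries in the list with the number of directories now present on staging; this sketch assumes staging started empty and uses the same `/var/opt/gitlab/git-data/repositories` path. The two counts should match:

```shell
# On STAGING: both commands should print the same number.
tr '\0' '\n' < directories.txt | wc -l
sudo find /var/opt/gitlab/git-data/repositories -maxdepth 1 -type d | wc -l
```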