Development
This project is composed of multiple services, which are deployed with docker compose. The
docker/compose.dev.yaml file is a compose file suitable for development.
[!TIP]
When the development docker stack is up and running, run docker compose commands with this incantation:
docker compose -f docker/compose.dev.yaml <docker-command> <service-name>
This makes it easier to scope commands to this project.
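For example, to follow the logs of a given service:
docker compose -f docker/compose.dev.yaml logs --follow <service-name>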
The most relevant services are:
- webapp - the main web application, implemented with starlette, sqlmodel, jinja and datastar
- processing-worker - the background worker that performs most processing and state-changing DB modifications; it is a dramatiq worker
- message-broker - a redis instance that passes messages between the webapp and the processing worker
- web-gateway - a traefik instance that acts as a reverse proxy for the system
- auth-webapp - an authentik instance that takes care of user authentication
- caddy-file-server - a caddy instance that serves local datasets via HTTP
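You can check which of these services are up, and their current state, with docker compose's ps command:
docker compose -f docker/compose.dev.yaml ps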
Environment set up
Start by retrieving the sample datasets that have been made available by the client. These are contained in the file
ipma/2025-marine-data-catalog/sample-data/20251125_datasample01_restored_data.tar.gz
which is available on the internal knowledge base platform (you should know how to retrieve it).
Create a base directory for the project's datasets (for example at ~/data/seis-lab-data)
and then get the archive and extract it inside this directory:
mkdir -p ~/data/seis-lab-data
cd ~/data/seis-lab-data
# now get the tar archive into this dir and extract it
tar -xvf 20251125_datasample01_restored_data.tar.gz
# remove the archive after extraction
rm 20251125_datasample01_restored_data.tar.gz
You should get something that looks like this (this is an abbreviated listing):
ricardo@tygra:~/data/seis-lab-data/$ tree -L 4
.
└── prr_eolicas
└── base-final
└── surveys
└── owf-2025
Now clone this repo locally:
cd ~/dev # or wherever you want to store the code
git clone https://github.com/NaturalGIS/seis-lab-data.git
cd seis-lab-data
In order to simplify mounting the data directory inside docker services, the project assumes
there is a sample-data directory under the root of the repo. As such, make a symlink
pointing to the data directory you created above:
# assuming your data directory lives at `~/data/seis-lab-data`
ln -s ~/data/seis-lab-data sample-data
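If you want to confirm that the symlink points where you expect, a quick check is:
ls -ld sample-data
# should show something like: sample-data -> /home/<you>/data/seis-lab-data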
Ensure you have docker and uv installed on your machine. Then use uv to install the project locally:
uv sync --group dev --locked
Install the included pre-commit hooks:
uv run pre-commit install
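Optionally, run the hooks once against the whole codebase to confirm everything is set up correctly:
uv run pre-commit run --all-files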
Pull the project docker images from their respective registries. You will likely need to log in
to the ghcr.io registry.
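If you are not logged in yet, something along these lines should work (this assumes you authenticate with a GitHub personal access token that has the read:packages scope):
# log in to the GitHub container registry
docker login ghcr.io -u <your-github-username>
# when prompted for a password, paste your personal access token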
docker compose -f docker/compose.dev.yaml pull
Then launch the stack:
docker compose -f docker/compose.dev.yaml up -d
You should now be able to access the webapp at
http://localhost:8888
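As a quick smoke test from the command line, you can also check that the webapp answers HTTP requests (the exact status code will depend on routing and authentication, so just confirm you get a response):
curl -I http://localhost:8888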
Please continue with the bootstrapping section.
Bootstrapping a fresh installation
The bootstrap process consists of:
- Creating/upgrading the database;
- Loading all default variables into the appropriate DB tables;
- Optionally adding some default projects, survey missions and survey-related records.
Bootstrapping is done with the seis-lab-data CLI, which is available in the webapp
service. It provides many commands and can be called like this:
docker compose -f docker/compose.dev.yaml exec -ti webapp uv run seis-lab-data --help
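Since this incantation gets long, you may want to wrap it in a small shell function. This is purely optional and the name sld below is just a suggestion; note that it must be called from the root of the repo, since the compose file path is relative:
# add to your ~/.bashrc or ~/.zshrc (optional convenience)
sld() {
    docker compose -f docker/compose.dev.yaml exec -ti webapp uv run seis-lab-data "$@"
}
# example usage, from the repo root:
sld --help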
Run the following commands:
# initialize the DB
docker compose -f docker/compose.dev.yaml exec -ti webapp uv run seis-lab-data db upgrade
# add default data
docker compose -f docker/compose.dev.yaml exec -ti webapp uv run seis-lab-data bootstrap all
# optionally, load sample records
docker compose -f docker/compose.dev.yaml exec -ti webapp uv run seis-lab-data dev load-all-samples
# also optionally, generate a large number of synthetic records
# (this is perhaps more useful when working on the web UI)
docker compose -f docker/compose.dev.yaml exec -ti webapp uv run seis-lab-data dev generate-many-projects --num-projects=50
Additional notes
The docker image used for development uses docker's latest tag and is rebuilt whenever there are commits to the
repo's main branch. As such, be sure to run
docker compose -f docker/compose.dev.yaml pull webapp
docker compose -f docker/compose.dev.yaml up -d
whenever you know there have been recent merges.
Building the docker image locally
Most of the time you will be using a prebuilt docker image. However, you will need to build it locally whenever you add a new Python dependency to the project. In that case, build the image with:
docker build \
--tag ghcr.io/naturalgis/seis-lab-data/seis-lab-data:$(git branch --show-current) \
--file docker/Dockerfile \
.
Then stand up the docker compose stack with:
CURRENT_GIT_BRANCH=$(git branch --show-current) docker compose -f docker/compose.dev.yaml up -d --force-recreate
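If you want to confirm which image each service ended up using (for example, to check that your locally built tag was picked up), docker compose can list them:
docker compose -f docker/compose.dev.yaml images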
[!NOTE]
Getting translations to work correctly in your local dev environment
Because the docker compose file used for dev bind mounts the entire src directory, it will
mask the container's own compiled *.mo files. This means that after running
seis-lab-data translations compile you need to restart the webapp service for the changes to take effect.
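In practice, that means something like this (the compile command is the one mentioned above, run inside the webapp service, followed by a plain restart of that service):
docker compose -f docker/compose.dev.yaml exec -ti webapp uv run seis-lab-data translations compile
docker compose -f docker/compose.dev.yaml restart webapp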
Running tests
Normal tests can be run from inside the webapp compose container, after installing the required dependencies:
docker compose --file docker/compose.dev.yaml exec webapp uv sync --locked --group gdal --group dev
docker compose --file docker/compose.dev.yaml exec webapp uv run pytest
Integration tests can be run with the following incantation:
docker compose --file docker/compose.dev.yaml exec webapp uv run pytest -m integration
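Conversely, if you want a run that excludes integration tests, pytest's marker expressions can be used (this assumes the integration marker is registered in the project's pytest configuration, which the command above suggests):
docker compose --file docker/compose.dev.yaml exec webapp uv run pytest -m "not integration"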
End to end tests
End to end tests are run from outside the docker stack. They require playwright to be installed locally, which you can do with:
uv run playwright install --with-deps chromium
Then tests can be run with:
uv run pytest \
tests/e2e/ \
-m e2e \
--confcutdir tests/e2e \
--user-email akadmin@email.com \
--user-password admin123 \
--base-url http://localhost:8888
The previous incantation will run all end to end tests in headless mode. To run them in headed mode, you can use:
uv run pytest \
tests/e2e/ \
-m e2e \
--confcutdir tests/e2e \
--user-email akadmin@email.com \
--user-password admin123 \
--base-url http://localhost:8888 \
--headed \
--slowmo 1500