Docker Compose files to spin up an instance of {{ cookiecutter.__service_slug.capitalize() }}.
# How to run
On your deployment machine create the necessary Docker context to connect to and control the Docker daemon on whatever target host you'll be using, for example:
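A minimal sketch, assuming SSH access as `root` (the host name is a placeholder):

```sh
docker context create fully.qualified.domain.name \
    --docker 'host=ssh://root@fully.qualified.domain.name'
```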
Add a `COMPOSE_ENV` file and save its location as a shell variable, along with the location where this repo lives, here for example `/opt/containers/{{ cookiecutter.__project_slug }}`, plus all other variables. At [env/fqdn_context.env.example](env/fqdn_context.env.example) you'll find an example environment file.
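A minimal sketch, assuming this repo lives at `/opt/containers/{{ cookiecutter.__project_slug }}` on the deployment machine and that your env file follows the example's naming scheme (variable names are illustrative):

```sh
# Illustrative names and paths, adjust to your environment
export COMPOSE_CTX='fully.qualified.domain.name'                         # Docker context created above
export COMPOSE_DIR='/opt/containers/{{ cookiecutter.__project_slug }}'   # where this repo lives
export COMPOSE_ENV="${COMPOSE_DIR}/env/fully.qualified.domain.name.env"  # your copy of env/fqdn_context.env.example
```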
When everything's ready, start {{ cookiecutter.__service_slug.capitalize() }} with Docker Compose; otherwise head down to [Initial setup](#initial-setup) first.
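Starting the service might then look like this, a sketch reusing the illustrative variables from above:

```sh
docker --context "$COMPOSE_CTX" compose \
    --project-directory "$COMPOSE_DIR" \
    --env-file "$COMPOSE_ENV" \
    up --detach
```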
{%- set components = cookiecutter.__component_list_slug.split(',') -%}
{% for component in components %}
{%- if loop.first %}
We build the `{{ cookiecutter.__service_slug }}` image locally. Our only adjustment to the official image is the addition of `/tmp/{{ cookiecutter.__service_slug }}`, see [build-context/{{ cookiecutter.__service_slug }}/Dockerfile](build-context/{{ cookiecutter.__service_slug }}/Dockerfile). We use `/tmp/{{ cookiecutter.__service_slug }}` to bind-mount a dedicated ZFS dataset for the application's `tmpdir` location.
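Should you want to (re)build the image by itself, a sketch reusing the illustrative variables from above (assuming the Compose service is named `{{ cookiecutter.__service_slug }}`):

```sh
docker --context "$COMPOSE_CTX" compose \
    --project-directory "$COMPOSE_DIR" \
    --env-file "$COMPOSE_ENV" \
    build {{ cookiecutter.__service_slug }}
```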
We're assuming you run Docker Compose workloads with ZFS-based bind mounts. ZFS management, that is creating a zpool and setting adequate properties on its datasets, is outside the scope of this document.
## Datasets
Create ZFS datasets and set permissions as needed.
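A sketch of what that might look like, with hypothetical zpool and dataset names and a hypothetical runtime UID/GID, adjust to your layout:

```sh
# Hypothetical pool/dataset names, "-p" creates missing parent datasets
zfs create -p 'zpool/data/{{ cookiecutter.__project_slug }}/{{ cookiecutter.__service_slug }}-tmp'

# The dataset's mountpoint gets bind-mounted to /tmp/{{ cookiecutter.__service_slug }} in the container,
# so the application's runtime user (UID/GID 1000 here, purely as an example) must be able to write to it
chown -R 1000:1000 '/zpool/data/{{ cookiecutter.__project_slug }}/{{ cookiecutter.__service_slug }}-tmp'
```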