# Logbook Service

Optional knowledge management and document ingestion service. Logbook processes uploaded documents (PDF, DOCX, images, and more), extracts text, classifies content, and generates knowledge base articles for your team.
## Docker Compose

Add the Logbook service and its PostgreSQL database to your `docker-compose.yml`:
```yaml
services:
  windshift:
    image: ghcr.io/windshiftapp/windshift:latest
    environment:
      - BASE_URL=https://windshift.example.com
      - SSO_SECRET=${SSO_SECRET}
      - LOGBOOK_ENDPOINT=http://logbook:8090
    depends_on:
      logbook:
        condition: service_healthy

  logbook:
    image: ghcr.io/windshiftapp/logbook:latest
    container_name: windshift-logbook
    restart: unless-stopped
    environment:
      - LOGBOOK_DATABASE_URL=postgresql://logbook:${LOGBOOK_POSTGRES_PASSWORD}@logbook-postgres:5432/logbook?sslmode=disable
      - LOGBOOK_LLM_ENDPOINT=http://llm:8081
      - LOGBOOK_ARTICLE_ENDPOINT=http://windshift:8080/api/internal/llm
      - SSO_SECRET=${SSO_SECRET}
      - LOGBOOK_PORT=8090
      - LOGBOOK_STORAGE_PATH=/data/logbook
    volumes:
      - logbook-data:/data/logbook
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:8090/api/logbook/health"]
      interval: 30s
      timeout: 10s
      retries: 3
    depends_on:
      logbook-postgres:
        condition: service_healthy

  logbook-postgres:
    image: postgres:18
    container_name: windshift-logbook-postgres
    restart: unless-stopped
    environment:
      - POSTGRES_USER=logbook
      - POSTGRES_PASSWORD=${LOGBOOK_POSTGRES_PASSWORD}
      - POSTGRES_DB=logbook
    volumes:
      - logbook-postgres-data:/var/lib/postgresql/data
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U logbook"]
      interval: 5s
      timeout: 5s
      retries: 5

volumes:
  logbook-data:
  logbook-postgres-data:
```

Add `LOGBOOK_POSTGRES_PASSWORD` to your `.env` file:

```
LOGBOOK_POSTGRES_PASSWORD=secure-logbook-password
```

## Environment Variables
| Variable | Default | Description |
|---|---|---|
| `LOGBOOK_DATABASE_URL` | - | PostgreSQL connection string |
| `LOGBOOK_LLM_ENDPOINT` | - | Direct LLM service endpoint for article generation |
| `LOGBOOK_ARTICLE_ENDPOINT` | - | Fallback LLM endpoint proxied through the main Windshift server |
| `SSO_SECRET` | - | Must match the main Windshift service for authentication |
| `LOGBOOK_PORT` | `8090` | Server port |
| `LOGBOOK_STORAGE_PATH` | - | Directory for uploaded document files |
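The `secure-logbook-password` placeholder in `.env` should be replaced with a strong random value. One simple way to generate one (any password generator works; this uses Python's stdlib CSPRNG):

```python
# Print a ready-to-paste .env line with a random 64-character hex password.
import secrets

password = secrets.token_hex(32)  # 32 random bytes -> 64 hex characters
print(f"LOGBOOK_POSTGRES_PASSWORD={password}")
```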
## Connecting to Windshift

Set `LOGBOOK_ENDPOINT` on your main Windshift service:

```yaml
windshift:
  environment:
    - LOGBOOK_ENDPOINT=http://logbook:8090
  depends_on:
    logbook:
      condition: service_healthy
```

The main Windshift server authenticates users and proxies requests to Logbook with trusted headers. The `SSO_SECRET` must match between both services.
See Environment Variables for the full configuration reference.
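To illustrate why the shared secret matters: a common trusted-header pattern is for the proxy to sign the forwarded identity with the shared secret, and for the downstream service to verify that signature before trusting the header. The sketch below shows the generic pattern only; the actual header names and signing scheme are internal to Windshift and assumed here:

```python
# Generic trusted-header signing sketch (hypothetical scheme; Windshift's
# internal implementation may differ). Both sides use the same SSO_SECRET.
import hashlib
import hmac

SSO_SECRET = b"example-shared-secret"  # must be identical in both services

def sign_user_header(user_id: str) -> str:
    """Proxy side: sign the forwarded user id with the shared secret."""
    return hmac.new(SSO_SECRET, user_id.encode(), hashlib.sha256).hexdigest()

def verify_user_header(user_id: str, signature: str) -> bool:
    """Downstream side: reject requests whose signature doesn't match."""
    expected = sign_user_header(user_id)
    return hmac.compare_digest(expected, signature)

sig = sign_user_header("alice")
print(verify_user_header("alice", sig))    # True
print(verify_user_header("mallory", sig))  # False
```

If the secrets diverge between the two containers, every verification fails, which is why both services read the same `${SSO_SECRET}` from the environment.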
## How It Works

Logbook processes documents through a multi-stage pipeline:
1. **Upload** - Users upload files through the Windshift UI (max 64 MB per file)
2. **Text Extraction** - Extracts text using poppler-utils (PDF) and kreuzberg-cli (DOCX, images, etc.)
3. **Classification** - Categorizes the document as knowledge, record, or correspondence
4. **Content Cleaning** - Strips formatting artifacts and normalizes the extracted text
5. **Article Generation** - Uses the LLM service to generate a structured knowledge base article
6. **Storage** - Chunks and stores the article for search and retrieval
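The stage ordering above can be sketched as a simple sequential pipeline. Everything in this sketch is illustrative (the function names, stand-in bodies, and chunk size are assumptions, not Logbook's actual code):

```python
# Toy sketch of the ingestion pipeline's control flow (not the real implementation).

def extract_text(raw: bytes) -> str:
    # The real service shells out to poppler-utils / kreuzberg-cli; decoding stands in here.
    return raw.decode("utf-8", errors="ignore")

def classify(text: str) -> str:
    return "knowledge"  # stand-in; the real classifier inspects the content

def clean(text: str) -> str:
    return " ".join(text.split())  # normalize whitespace and formatting artifacts

def generate_article(text: str) -> str:
    return f"# Article\n\n{text}"  # stand-in for the LLM call

def chunk_and_store(article: str, size: int = 64) -> list[str]:
    return [article[i:i + size] for i in range(0, len(article), size)]

def ingest(raw: bytes) -> list[str]:
    text = extract_text(raw)          # 2. text extraction
    doc_type = classify(text)         # 3. classification
    text = clean(text)                # 4. content cleaning
    article = generate_article(text)  # 5. article generation
    return chunk_and_store(article)   # 6. storage

chunks = ingest(b"Restart the gateway after rotating certificates.")
```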
## Document Types

Logbook classifies each uploaded document and processes it accordingly:
| Type | Description | Processing |
|---|---|---|
| Knowledge | Technical docs, guides, procedures | Full article generated with structured content |
| Correspondence | Emails, letters, memos | Brief summary extracted, no full article |
| Record | Invoices, receipts, forms | Metadata indexed, no article generated |
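The per-type behavior in the table amounts to a dispatch on the classification result. A compact illustration (the type names come from the table; the action names are hypothetical):

```python
# Illustrative dispatch from document type to processing outcome.
PROCESSING = {
    "knowledge": "generate_full_article",   # structured article + chunking
    "correspondence": "extract_summary",    # brief summary only
    "record": "index_metadata",             # metadata indexing, no article
}

def process(doc_type: str) -> str:
    try:
        return PROCESSING[doc_type]
    except KeyError:
        raise ValueError(f"unknown document type: {doc_type}")

print(process("record"))  # index_metadata
```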
## Buckets

Knowledge base content is organized into buckets - containers with their own permissions, retention policies, and optional approval workflows. Buckets allow teams to maintain separate knowledge bases (e.g., Engineering, Support, HR) with independent access controls.
## Requirements

- **Database**: Requires its own PostgreSQL instance (separate from the main Windshift database)
- **Text Extraction**: The Docker image includes poppler-utils and kreuzberg-cli for document processing
- **LLM**: Article generation requires either the LLM Inference Service or an external LLM provider
- **File Size**: Maximum upload size is 64 MB per file
- **Storage**: Uploaded files are stored at the configured `LOGBOOK_STORAGE_PATH`
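The 64 MB limit is enforced by the service; a client-side pre-check before uploading can be sketched like this (the limit value comes from the requirements above, the helper is hypothetical, and the binary interpretation of "64 MB" is an assumption):

```python
# Pre-check a file against the 64 MB upload limit before uploading.
import os
import tempfile

MAX_UPLOAD_BYTES = 64 * 1024 * 1024  # 64 MB, per the requirements above

def within_upload_limit(path: str) -> bool:
    return os.path.getsize(path) <= MAX_UPLOAD_BYTES

# Example with a small temporary file:
with tempfile.NamedTemporaryFile(delete=False, suffix=".pdf") as f:
    f.write(b"%PDF-1.4 minimal")
    tmp_path = f.name

ok = within_upload_limit(tmp_path)
print(ok)  # True
os.remove(tmp_path)
```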