# L4D2 Workshop Overlays Implementation Plan

> **Approval gate:** This plan may be written and refined without further approval. Do not implement code changes from this plan until the user explicitly approves implementation.

**Goal:** Implement the workshop overlay feature per `docs/superpowers/specs/2026-05-07-l4d2-workshop-overlays-design.md`. Add a `WorkshopItem` registry, a typed `Overlay.type` column with a builder registry, a workshop builder that downloads from the Steam Web API and manages symlinks into a deduplicated cache, and the supporting routes, templates, jobs, and tests.

**Architecture:** Keep the v1 single-process Flask architecture. New code is additive: a `WorkshopBuilder` class registered in a builder dispatcher, a `steam_workshop` service module for the Steam Web API and downloader, two new database tables and one extended one, and two new job operations on the existing in-process worker. fuse-overlayfs mount handling in `l4d2host` is unchanged — workshop content arrives at overlay paths the same way externals do today.

---

## Locked Decisions

See `docs/superpowers/specs/2026-05-07-l4d2-workshop-overlays-design.md` for the design rationale. Implementation-relevant decisions:

- Typed overlays: `external` (existing rows; no-op builder) and `workshop` (new); future types deferred.
- No JSON `source_config` blob; per-type structured data in proper tables.
- `WorkshopItem` is a global deduplicated registry keyed on `steam_id`. Cache at `/var/lib/left4me/workshop_cache/{steam_id}.vpk`.
- Overlay symlinks are absolute, named `{steam_id}.vpk`; no Steam filename appears in any on-disk path.
- `overlay_workshop_items` is a pure association; toggle = remove/re-add.
- Collections are atomic UI bulk-imports; the DB never tracks collection attribution.
- Single global admin "Refresh all workshop items" button.
- No cache GC in v1.
- `Overlay.user_id` is the scope (NULL = system, set = private); independent of `type`.
- Workshop overlays default to private; existing externals stay system-wide.
- One unified Create-overlay button with a type radio; no path field — paths are always `str(overlay_id)`.
- `consumer_app_id == 550` validated at fetch/add; not stored.
- The input field accepts a numeric ID, a full Workshop URL, or a multi-line batch.
- Auto-rebuild after add/remove, with build coalescing.
- HTTPS for all Steam Web API calls.
- `Overlay.id` uses `AUTOINCREMENT`; `create_overlay_directory` uses `exist_ok=False`.
- Two partial unique indexes for overlay names: `(name) WHERE user_id IS NULL` and `(name, user_id) WHERE user_id IS NOT NULL`.

---

## Current Gap

- `Overlay` rows have `id`, `name`, and `path`: no type, no scope.
- The web app cannot download anything from Steam; users must SFTP `.vpk` files into prepared overlay directories.
- The job worker has no operations for overlay builds or workshop refreshes.
- The mount/build pipeline assumes overlay directories are externally populated.
- There is no UI affordance to add or list workshop content.

---

## Task 1: Tests First — Schema Migration And Models

**Files:**

- Create: `l4d2web/tests/test_workshop_overlay_models.py`
- Modify: `l4d2web/tests/test_models.py` (extend with partial unique index behavior)

Write tests against fresh SQLite schemas asserting:

- An `Overlay` migration round-trip: existing rows acquire `type='external'` and `user_id=NULL`; their `name` values remain unique under the partial index.
- After migration, two externals (both `user_id=NULL`) with the same name are rejected by the system partial unique index.
- After migration, two users may both own a workshop overlay named `"my-maps"` (per-user partial unique index).
- `WorkshopItem.steam_id` is unique; concurrent inserts of the same `steam_id` raise integrity errors.
- `overlay_workshop_items` enforces `UNIQUE(overlay_id, workshop_item_id)`.
- `Overlay` deletion cascades to `overlay_workshop_items` rows but does not delete `WorkshopItem` rows (`ON DELETE RESTRICT`).
- `Job.overlay_id` is nullable and references `overlays(id)`.
- `Overlay.id` does not reuse a deleted ID after the migration (AUTOINCREMENT).

Verification command:

```bash
pytest l4d2web/tests/test_workshop_overlay_models.py l4d2web/tests/test_models.py -q
```

Expected before implementation: FAIL.

---

## Task 2: Schema Migration And ORM Mappings

**Files:**

- Create: `l4d2web/alembic/versions/0002_workshop_overlays.py`
- Modify: `l4d2web/models.py`

Migration `0002_workshop_overlays` (`down_revision = "b2c684fddbd3"`):

1. `op.batch_alter_table("overlays")`:
   - Add `type VARCHAR(16) NOT NULL DEFAULT 'external'` (server_default during migration; remove after backfill).
   - Add `user_id INTEGER NULL REFERENCES users(id)`.
   - Drop the existing `unique=True` on `name`.
   - Add index `ix_overlays_type_user_id` on `(type, user_id)`.
   - Switch `id` to `AUTOINCREMENT`.
2. After the batch alter, create the two partial unique indexes via raw `op.create_index(..., postgresql_where=..., sqlite_where=...)`:
   - `uq_overlay_name_system` on `(name)` `WHERE user_id IS NULL`.
   - `uq_overlay_name_per_user` on `(name, user_id)` `WHERE user_id IS NOT NULL`.
3. `op.create_table("workshop_items", ...)` per the spec's data-model section.
4. `op.create_table("overlay_workshop_items", ...)` with the unique constraint and the reverse-lookup index.
5. `op.batch_alter_table("jobs")`: add `overlay_id INTEGER NULL REFERENCES overlays(id)`.

ORM (`models.py`):

- Extend `Overlay`: add `type` and `user_id`; drop `unique=True` on `name`; set `__table_args__` with the two partial indexes and `ix_overlays_type_user_id`.
- Extend `Job`: add an `overlay_id` mapped column with the FK.
- Add `WorkshopItem` and `OverlayWorkshopItem` classes per the spec, and an `Overlay.workshop_items` relationship through the association.

Verification command:

```bash
pytest l4d2web/tests/test_workshop_overlay_models.py l4d2web/tests/test_models.py -q
```

Expected after implementation: PASS.
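The two partial unique indexes are the subtle part of this migration. A minimal sketch in raw SQLite (an illustration of the intended index semantics, not the Alembic migration itself) shows why both are needed — system names stay globally unique while private names are only unique per owner:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE overlays (
    id      INTEGER PRIMARY KEY AUTOINCREMENT,
    name    TEXT NOT NULL,
    type    TEXT NOT NULL DEFAULT 'external',
    user_id INTEGER NULL
);
-- system scope: one name across all rows with user_id IS NULL
CREATE UNIQUE INDEX uq_overlay_name_system
    ON overlays (name) WHERE user_id IS NULL;
-- private scope: one name per owner
CREATE UNIQUE INDEX uq_overlay_name_per_user
    ON overlays (name, user_id) WHERE user_id IS NOT NULL;
""")

# Two users may both own a workshop overlay named "my-maps".
conn.execute("INSERT INTO overlays (name, type, user_id) VALUES ('my-maps', 'workshop', 1)")
conn.execute("INSERT INTO overlays (name, type, user_id) VALUES ('my-maps', 'workshop', 2)")

# A second system overlay with a duplicate name is rejected.
conn.execute("INSERT INTO overlays (name) VALUES ('shared')")
try:
    conn.execute("INSERT INTO overlays (name) VALUES ('shared')")
    system_dup_rejected = False
except sqlite3.IntegrityError:
    system_dup_rejected = True

# The same user reusing one of their own names is rejected.
try:
    conn.execute("INSERT INTO overlays (name, type, user_id) VALUES ('my-maps', 'workshop', 1)")
    per_user_dup_rejected = False
except sqlite3.IntegrityError:
    per_user_dup_rejected = True

print(system_dup_rejected, per_user_dup_rejected)  # True True
```

A plain `UNIQUE(name, user_id)` would not work here: SQLite treats NULLs as distinct in unique indexes, so two system rows with the same name would both be accepted.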
Run alembic against a fresh test DB to verify that upgrade and downgrade both succeed.

---

## Task 3: Tests First — Steam Web API And Downloader

**Files:**

- Create: `l4d2web/tests/test_steam_workshop.py`

Mock HTTP with `responses` or `pytest-httpserver`. Cover:

- `parse_workshop_input` accepts a single numeric ID, a single Workshop URL (`steamcommunity.com/sharedfiles/filedetails/?id=N`), and a multi-line whitespace-separated batch of either; returns a deduplicated ordered list of digit-only IDs.
- `parse_workshop_input` rejects garbage, paths outside `?id=`, and non-digit IDs.
- `resolve_collection` POSTs to the HTTPS endpoint with the form-encoded payload and returns the `publishedfileid` children.
- `fetch_metadata_batch` POSTs once with `itemcount=N`; returns a parsed `WorkshopMetadata` per item; captures `result != 1` into `last_error`; raises `WorkshopValidationError` when any `consumer_app_id != 550` during user-add; logs and skips in refresh mode.
- `WorkshopMetadata.preview_url` is captured.
- `download_to_cache` writes `cache_root/{steam_id}.vpk.partial`, then `os.replace` to the final name; sets `os.utime(file, (time_updated, time_updated))`.
- `download_to_cache` is idempotent: a second call where the on-disk `(mtime, size)` matches `(time_updated, file_size)` is a no-op (no HTTP request issued).
- `refresh_all` runs downloads via `ThreadPoolExecutor(max_workers=8)` and reports per-item errors without aborting the batch.
- All Steam API URLs use `https://`.

Verification command:

```bash
pytest l4d2web/tests/test_steam_workshop.py -q
```

Expected before implementation: FAIL.

---

## Task 4: Steam Workshop Service Module

**Files:**

- Create: `l4d2web/services/steam_workshop.py`

Public surface:

```python
def parse_workshop_input(raw: str) -> list[str]: ...
def resolve_collection(collection_id: str) -> list[str]: ...
def fetch_metadata_batch(steam_ids: list[str], *, mode: Literal["add", "refresh"]) -> list[WorkshopMetadata]: ...
def download_to_cache(meta: WorkshopMetadata, cache_root: Path, *, on_progress=None, should_cancel=None) -> Path: ...
def refresh_all(items: list[WorkshopItem], cache_root: Path, executor_workers: int = 8) -> RefreshReport: ...
```

Implementation rules:

- Endpoints are HTTPS:
  - `https://api.steampowered.com/ISteamRemoteStorage/GetCollectionDetails/v1/`
  - `https://api.steampowered.com/ISteamRemoteStorage/GetPublishedFileDetails/v1/`
- Form-encoded POSTs with `itemcount=N` / `collectioncount=N` and `publishedfileids[i]=…` per index.
- Per-request timeout of 30s; per-item ceiling of 5min. No retry or backoff in v1.
- `consumer_app_id != 550`:
  - In `mode="add"`: raise `WorkshopValidationError` with the offending `steam_id`.
  - In `mode="refresh"`: log and skip; do not abort other items.
- `result != 1`: capture Steam's result code in the item's `last_error`; do not download; do not abort siblings.
- Cooperative cancellation: `download_to_cache` checks `should_cancel()` between chunked reads; `refresh_all`'s executor checks before each task.
- `WorkshopMetadata` is a dataclass with `steam_id, title, filename, file_url, file_size, time_updated, preview_url, consumer_app_id, result`.
- `RefreshReport` aggregates per-item outcomes for the caller's job log.
- Use a single `requests.Session` per call site for connection reuse.

Verification command:

```bash
pytest l4d2web/tests/test_steam_workshop.py -q
```

Expected after implementation: PASS.

---

## Task 5: Tests First — Path Helpers And Overlay Creation

**Files:**

- Create: `l4d2web/tests/test_workshop_paths.py`
- Create: `l4d2web/tests/test_overlay_creation.py`

Cover:

- `workshop_cache_root()` returns `LEFT4ME_ROOT/workshop_cache`.
- `cache_path(steam_id)` returns `cache_root / f"{steam_id}.vpk"` for valid digit strings; rejects non-digits, slashes, and dot-dot.
- `generate_overlay_path(overlay_id)` returns `str(overlay_id)`; passes `validate_overlay_ref` from `l4d2host.paths`.
- `create_overlay_directory(overlay)` creates `LEFT4ME_ROOT/overlays/{path}/` with `exist_ok=False`. Calling it twice raises (DB/disk drift surfaced loudly).

Verification command:

```bash
pytest l4d2web/tests/test_workshop_paths.py l4d2web/tests/test_overlay_creation.py -q
```

Expected before implementation: FAIL.

---

## Task 6: Path Helpers And Overlay Creation

**Files:**

- Create: `l4d2web/services/workshop_paths.py`
- Create: `l4d2web/services/overlay_creation.py`

`workshop_paths`:

```python
def workshop_cache_root() -> Path: ...  # LEFT4ME_ROOT/workshop_cache
def cache_path(steam_id: str) -> Path: ...  # validates digits-only; returns cache_root/{steam_id}.vpk
```

`overlay_creation`:

```python
def generate_overlay_path(overlay_id: int) -> str: ...  # str(overlay_id) + validate_overlay_ref
def create_overlay_directory(overlay: Overlay) -> None: ...  # makedirs(..., exist_ok=False)
```

Verification command:

```bash
pytest l4d2web/tests/test_workshop_paths.py l4d2web/tests/test_overlay_creation.py -q
```

Expected after implementation: PASS.

---

## Task 7: Tests First — Overlay Builders

**Files:**

- Create: `l4d2web/tests/test_overlay_builders.py`

Cover with `tmp_path`:

- The `BUILDERS` dict resolves `"external"` and `"workshop"` to instances; unknown types raise `KeyError` (the caller's error).
- `ExternalBuilder.build()` is a no-op: it makes the overlay directory if missing, writes one log line, and returns. Existing files in the directory are untouched.
- `WorkshopBuilder.build()` against a fixture overlay with three associated `WorkshopItem` rows (two with cache files present, one without):
  - Creates `left4dead2/addons/` if missing.
  - Creates symlinks `addons/{steam_id_a}.vpk → cache_root/{steam_id_a}.vpk` for items with cache files. Symlinks are absolute.
  - Skips the uncached item and emits a warning log line. Does not create a dangling symlink.
  - On a re-run with the same associations: no FS changes; logs report `unchanged=2 skipped(uncached)=1`.
  - On a re-run after one association is removed: removes only the obsolete symlink; leaves cache files alone.
  - On a re-run after one item is added: adds only the new symlink.
  - Files in `addons/` that aren't symlinks into the cache are left untouched.
  - `should_cancel` mid-build: stops between filesystem ops; the partial state is consistent and a re-run heals it.

Verification command:

```bash
pytest l4d2web/tests/test_overlay_builders.py -q
```

Expected before implementation: FAIL.

---

## Task 8: Overlay Builders And Dispatcher

**Files:**

- Create: `l4d2web/services/overlay_builders.py`

```python
class OverlayBuilder(Protocol):
    def build(self, overlay: Overlay, *, on_stdout, on_stderr, should_cancel) -> None: ...

class ExternalBuilder: ...
class WorkshopBuilder: ...

BUILDERS: dict[str, OverlayBuilder] = {
    "external": ExternalBuilder(),
    "workshop": WorkshopBuilder(),
}
```

`WorkshopBuilder.build()`:

1. Load the overlay's `WorkshopItem` rows.
2. `os.makedirs(overlay_root / "left4dead2/addons", exist_ok=True)`.
3. Compute `desired = {f"{steam_id}.vpk": cache_path(steam_id)}` for items where `last_downloaded_at IS NOT NULL` and the cache file exists. Skip and warn for items missing a cache file.
4. Inspect existing entries in `addons/` via `os.scandir`: keep entries that are not symlinks into `workshop_cache`; otherwise diff against `desired` and apply changes via `os.unlink` and `os.symlink(absolute_target, link_path)`.
5. Emit a `created N, removed M, unchanged K, skipped (uncached) S` log line.
6. Check `should_cancel()` between filesystem ops.

Verification command:

```bash
pytest l4d2web/tests/test_overlay_builders.py -q
```

Expected after implementation: PASS.

---

## Task 9: Tests First — Worker Scheduler Truth Table And Coalescing

**Files:**

- Modify: `l4d2web/tests/test_job_worker.py`

Add coverage:

- Truth table for `can_start`:
  - `install` is not claimed while `refresh_workshop_items`, any `build_overlay`, or any server job is running.
  - `refresh_workshop_items` is not claimed while `install`, any `build_overlay`, or any server job is running.
  - `build_overlay(N)` is not claimed while `install`, `refresh_workshop_items`, or another `build_overlay(N)` is running. Two `build_overlay` jobs for **different** overlay IDs claim concurrently.
- Server start/init blocks if `refresh_workshop_items` is running or if any `build_overlay(N)` is running where N ∈ overlays of the server's blueprint.
- `enqueue_build_overlay(overlay_id)`:
  - Inserts a new queued job when no pending job exists.
  - Returns the existing pending job when one is already queued (coalescing).
  - Does not coalesce against running jobs (a new add after build start gets a fresh queued job).
- `refresh_workshop_items` post-completion enqueues `build_overlay` only for overlays whose items had `time_updated` advance or `filename` change; each such enqueue uses the coalescing helper.

Verification command:

```bash
pytest l4d2web/tests/test_job_worker.py -q
```

Expected before implementation: FAIL.

---

## Task 10: Worker Scheduler And New Operations

**Files:**

- Modify: `l4d2web/services/job_worker.py`

Changes:

- Define `OVERLAY_OPERATIONS = {"build_overlay"}` and `GLOBAL_OPERATIONS = {"install", "refresh_workshop_items"}`. Update `malformed_server_job` to allow `server_id IS NULL` for these.
- Extend `SchedulerState` with `running_overlays: set[int]` and `refresh_running: bool`.
- Update `claim_next_job()`:
  - Compute `running_overlays` from queries against `running` jobs of operation `build_overlay`.
  - Apply the truth-table rules above.
  - Continue using `created_at, id` ordering for a deterministic claim.
- Add an `enqueue_build_overlay(overlay_id: int) -> Job` helper:
  - Look for a `queued` `build_overlay` job with the same `overlay_id`; return it if present.
  - Otherwise insert a new queued job with `overlay_id` set, `server_id=None`, `operation="build_overlay"`.
- Update the `run_job` dispatch:
  - `build_overlay` → load the `Overlay`, dispatch to `BUILDERS[overlay.type].build(overlay, on_stdout=on_stdout, on_stderr=on_stderr, should_cancel=should_cancel)` (keyword-only per the protocol).
  - `refresh_workshop_items` → call `steam_workshop.refresh_all(...)`. After completion, call `enqueue_build_overlay(overlay_id)` for each affected overlay.

Verification command:

```bash
pytest l4d2web/tests/test_job_worker.py -q
```

Expected after implementation: PASS.

---

## Task 11: Tests First — Routes, Permissions, And Auto-Rebuild

**Files:**

- Modify: `l4d2web/tests/test_overlays.py`
- Create: `l4d2web/tests/test_workshop_routes.py`

Cover:

- `POST /overlays` with `type='workshop'` and a `name` succeeds for any logged-in user; `path` is auto-generated; `user_id` is set; the directory exists at `LEFT4ME_ROOT/overlays/{id}`.
- `POST /overlays` with `type='external'` succeeds only for admins; `user_id` is NULL.
- A duplicate workshop name within the same user is rejected; duplicate names across users are accepted.
- A duplicate external name is rejected.
- When listing overlays, non-admins see only `type='external' OR user_id=current_user.id`.
- `POST /overlays/{id}/items` with one numeric ID adds an association and enqueues a coalesced `build_overlay`. The response is an HTMX fragment of the updated item table.
- `POST /overlays/{id}/items` with a multi-line batch (a mix of IDs and URLs) adds all of them and enqueues one coalesced job for the batch.
- `POST /overlays/{id}/items` with a collection ID resolves the members and adds N associations.
- Adding a non-L4D2 item (`consumer_app_id != 550`) returns HTTP 400 with a useful message; no association is created.
- Adding an item already in the overlay returns "already in overlay" (not a 500).
- `POST /overlays/{id}/items/{item_id}/delete` removes the association and enqueues a coalesced build.
- `POST /overlays/{id}/build` enqueues the manual rebuild and redirects to the job page.
- `POST /admin/workshop/refresh` is admin-only; non-admins receive 403.
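Several of these route cases hinge on the input parsing from Task 4. A hypothetical sketch of `parse_workshop_input` — the helper name comes from the plan, but the body and the `WorkshopInputError` exception are illustrative assumptions, not the real module:

```python
import re
from urllib.parse import urlparse, parse_qs

# Matches Workshop URLs like steamcommunity.com/sharedfiles/filedetails/?id=N
_URL_RE = re.compile(r"^https?://(?:www\.)?steamcommunity\.com/sharedfiles/filedetails/")

class WorkshopInputError(ValueError):
    """Raised for input that is neither a numeric ID nor a Workshop URL."""

def parse_workshop_input(raw: str) -> list[str]:
    """Parse a numeric ID, a Workshop URL, or a whitespace-separated batch of
    either into a deduplicated, order-preserving list of digit-only IDs."""
    ids: list[str] = []
    for token in raw.split():  # splits on spaces and newlines alike
        if token.isdigit():
            steam_id = token
        elif _URL_RE.match(token):
            candidates = parse_qs(urlparse(token).query).get("id", [])
            if len(candidates) != 1 or not candidates[0].isdigit():
                raise WorkshopInputError(f"no numeric ?id= in {token!r}")
            steam_id = candidates[0]
        else:
            raise WorkshopInputError(f"not a workshop ID or URL: {token!r}")
        if steam_id not in ids:  # dedupe, keep first-seen order
            ids.append(steam_id)
    return ids

print(parse_workshop_input(
    "123 https://steamcommunity.com/sharedfiles/filedetails/?id=456 123"
))  # ['123', '456']
```

Rejecting unknown tokens outright (rather than silently dropping them) matches the test expectation that garbage input produces an error the route can turn into an HTTP 400.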
Mock the `steam_workshop` HTTP layer for these tests.

Verification command:

```bash
pytest l4d2web/tests/test_overlays.py l4d2web/tests/test_workshop_routes.py -q
```

Expected before implementation: FAIL.

---

## Task 12: Routes And Templates

**Files:**

- Modify: `l4d2web/routes/overlay_routes.py`
- Create: `l4d2web/routes/workshop_routes.py`
- Modify: `l4d2web/routes/page_routes.py`
- Modify: `l4d2web/templates/overlays.html`
- Modify: `l4d2web/templates/overlay_detail.html`
- Create: `l4d2web/templates/_overlay_item_table.html`
- Modify: `l4d2web/templates/admin.html`
- Modify: `l4d2web/app.py` (register the workshop blueprint)

`overlay_routes.py`:

- `create_overlay`: read `type` and `name` from the form. No `path` field is accepted.
  - `type='external'`: admin-only; `user_id=NULL`. After insert, set `path = generate_overlay_path(id)`; call `create_overlay_directory(overlay)`.
  - `type='workshop'`: any logged-in user; `user_id=current_user.id`. After insert, set `path = generate_overlay_path(id)`; call `create_overlay_directory(overlay)`.
- `update_overlay`: forbid changing `type` and `path`. Workshop: owner or admin can edit `name`. External: admin-only `name` edits.
- `delete_overlay`: after the row deletes, `shutil.rmtree(LEFT4ME_ROOT/overlays/{path})` only if `overlay.path == str(overlay.id)` (legacy externals are left alone). The cache is untouched.

`workshop_routes.py`:

- `POST /overlays/{id}/items`: parse the input via `parse_workshop_input`; if it is a collection ID, resolve the members; batch-fetch metadata in `mode="add"`; reject non-550 items with HTTP 400; upsert `WorkshopItem` via SQLite `INSERT ... ON CONFLICT DO UPDATE` on `steam_id`; bulk-add associations, catching `(overlay_id, workshop_item_id)` unique violations; call `enqueue_build_overlay(overlay_id)`; return the rendered `_overlay_item_table.html` fragment.
- `POST /overlays/{id}/items/{item_id}/delete`: ownership check; remove the association; call `enqueue_build_overlay(overlay_id)`; return the updated fragment.
- `POST /overlays/{id}/build`: ownership check; enqueue (coalesced); redirect to `/jobs/{job_id}`.
- `POST /admin/workshop/refresh`: `@require_admin`; insert a `refresh_workshop_items` queued job; redirect to `/admin/jobs`.

`page_routes.py`:

- `overlays()`: admins see all; non-admins see `type='external' OR user_id=current_user.id`.
- `overlay_detail()`: load `WorkshopItem` rows for workshop-type overlays.

Templates:

- `overlays.html`: add a Type column. The modal has a type radio (External | Workshop) and a name field. No path field.
- `overlay_detail.html`: branch on `overlay.type`.
  - External view: read-only path display, name edit (admin only).
  - Workshop view: an `