Adds two managed system overlays (l4d2center-maps, cedapug-maps) that fetch curated map archives from upstream sources and reconcile addons symlinks for non-Steam maps. A daily systemd timer enqueues a coalesced refresh_global_overlays worker job; downloads, extraction, and rebuilds run in the existing job worker and surface in the job log UI. Schema: GlobalOverlaySource / GlobalOverlayItem / GlobalOverlayItemFile plus nullable Job.user_id so system jobs render as "system" in the UI. The new builder reconciles symlinks against the per-source vpk cache and leaves foreign symlinks untouched. Initialize-time guard refuses to mount a partial overlay if any expected vpk is missing from cache. Refresh service uses shutil.move to handle EXDEV when /tmp and the cache live on different filesystems. Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
# L4D2 Global Map Overlays Implementation Plan

> **For agentic workers:** REQUIRED SUB-SKILL: Use superpowers:subagent-driven-development (recommended) or superpowers:executing-plans to implement this plan task-by-task. Steps use checkbox (`- [ ]`) syntax for tracking.

**Goal:** Add daily-refreshed, system-wide `l4d2center-maps` and `cedapug-maps` overlays populated from upstream map sources.

**Architecture:** Keep the host library unchanged. Add managed global overlay source rows, source-specific manifest parsers, a global-map cache, a shared global-map overlay builder, a coalesced `refresh_global_overlays` worker operation, and a systemd timer that only enqueues the job via Flask CLI. Global map overlays are `Overlay.user_id = NULL`, visible and blueprint-selectable for every authenticated user, but their managed types are not available in normal overlay creation.

**Tech Stack:** Python 3.12+, Flask CLI, SQLAlchemy, Alembic, pytest, requests, py7zr, zipfile, systemd timer units.

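The summary also calls for an initialize-time guard that refuses to mount a partial overlay when any expected vpk is missing from the cache. A minimal sketch of that check, with hypothetical function and parameter names (the real guard belongs in `l4d2web/services/l4d2_facade.py` and would read expected names from `GlobalOverlayItemFile` rows):

```python
from pathlib import Path


def assert_overlay_cache_complete(expected_vpks: list[str], cache_dir: Path) -> None:
    """Refuse to mount a global map overlay whose cache is missing any expected vpk."""
    missing = [name for name in expected_vpks if not (cache_dir / name).is_file()]
    if missing:
        raise RuntimeError(f"global overlay cache incomplete, refusing to mount: {missing}")
```

Failing loudly here keeps a half-downloaded source from silently shipping an incomplete addons set to servers.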
---

## Source Design

- `docs/superpowers/specs/2026-05-07-l4d2-global-map-overlays-design.md`

## File Map

- `l4d2web/models.py`: make `Job.user_id` nullable; add `GlobalOverlaySource`, `GlobalOverlayItem`, and `GlobalOverlayItemFile` ORM classes.
- `l4d2web/alembic/versions/0003_global_map_overlays.py`: schema migration for nullable system jobs and global overlay metadata tables.
- `l4d2web/services/global_overlays.py`: constants, singleton seeding, coalesced `refresh_global_overlays` enqueue helper, and managed-type policy helpers.
- `l4d2web/services/global_map_sources.py`: parse/fetch L4D2Center CSV and CEDAPUG custom page manifests.
- `l4d2web/services/global_map_cache.py`: cache paths, safe archive extraction, atomic downloads, and verification helpers.
- `l4d2web/services/global_overlay_refresh.py`: refresh both global sources, update DB metadata, download/extract changed items, and invoke builders directly.
- `l4d2web/services/overlay_builders.py`: register `l4d2center_maps` and `cedapug_maps`; add `GlobalMapOverlayBuilder`.
- `l4d2web/services/l4d2_facade.py`: add initialize-time missing-cache guard for global map overlays.
- `l4d2web/services/job_worker.py`: add `refresh_global_overlays` operation, nullable job-owner support for overlay jobs, and scheduler blocking rules.
- `l4d2web/cli.py`: add `refresh-global-overlays` Flask CLI command.
- `l4d2web/routes/page_routes.py`: admin manual refresh route; nullable-owner job joins; global overlay visibility.
- `l4d2web/routes/job_routes.py`: nullable-owner job access and display data.
- `l4d2web/routes/overlay_routes.py`: reject managed singleton types in create/update/delete policy.
- `l4d2web/templates/*.html`: render system jobs as `system`; show global overlay source metadata; admin refresh button.
- `l4d2web/pyproject.toml`: add `py7zr` dependency.
- `deploy/files/usr/local/lib/systemd/system/left4me-refresh-global-overlays.service`: timer-triggered enqueue service.
- `deploy/files/usr/local/lib/systemd/system/left4me-refresh-global-overlays.timer`: daily persistent timer.
- `deploy/deploy-test-server.sh`: provision cache directory, install timer units, enable timer.
- `deploy/README.md`: document global overlay cache and timer behavior.
- Tests under `l4d2web/tests/` and `deploy/tests/` as listed in each task.

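The `global_map_cache.py` bullet mentions atomic downloads, and the summary notes `shutil.move` as the EXDEV fallback when `/tmp` and the cache live on different filesystems. The usual shape of that pattern (function name hypothetical): download to a temp file, then publish it with `os.replace`, which is atomic on one filesystem but raises `EXDEV` across filesystems, where `shutil.move` (copy + delete) takes over:

```python
import errno
import os
import shutil
from pathlib import Path


def move_into_cache(tmp_file: Path, final_path: Path) -> None:
    """Publish a downloaded file into the cache, tolerating cross-filesystem moves."""
    final_path.parent.mkdir(parents=True, exist_ok=True)
    try:
        os.replace(tmp_file, final_path)  # atomic rename on the same filesystem
    except OSError as exc:
        if exc.errno != errno.EXDEV:
            raise
        shutil.move(str(tmp_file), str(final_path))  # copy+delete across filesystems
```

Note the fallback is not atomic, so the refresh service should still verify size/md5 after the move before recording the file as cached.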
Do not use git worktrees; `AGENTS.md` explicitly forbids them. Do not create commits unless the user explicitly asks for commits.

---

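The two deploy unit files listed in the file map never run the refresh themselves; they only enqueue the job via Flask CLI. A sketch of the pair, where the `ExecStart` invocation, working directory, and user are assumptions to be matched to the real deployment:

```ini
# left4me-refresh-global-overlays.service (sketch; ExecStart/user are assumptions)
[Unit]
Description=Enqueue left4me global overlay refresh

[Service]
Type=oneshot
ExecStart=/usr/bin/flask --app l4d2web refresh-global-overlays

# left4me-refresh-global-overlays.timer (sketch)
[Unit]
Description=Daily left4me global overlay refresh

[Timer]
OnCalendar=daily
Persistent=true

[Install]
WantedBy=timers.target
```

`Persistent=true` makes systemd fire a missed run at boot, and because the job is coalesced in the worker, a duplicate trigger is harmless.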
## Task 1: Schema Tests For System Jobs And Global Overlay Metadata

**Files:**

- Create: `l4d2web/tests/test_global_overlay_models.py`
- Modify: `l4d2web/tests/test_job_logs.py`

- [ ] **Step 1: Write failing model tests**

Create `l4d2web/tests/test_global_overlay_models.py`:

```python
from sqlalchemy.exc import IntegrityError

from l4d2web.db import init_db, session_scope
from l4d2web.models import (
    GlobalOverlayItem,
    GlobalOverlayItemFile,
    GlobalOverlaySource,
    Job,
    Overlay,
    User,
)


def test_system_job_allows_null_user_id(tmp_path, monkeypatch):
    monkeypatch.setenv("DATABASE_URL", f"sqlite:///{tmp_path/'models.db'}")
    init_db()

    with session_scope() as db:
        job = Job(user_id=None, server_id=None, overlay_id=None, operation="refresh_global_overlays")
        db.add(job)
        db.flush()
        assert job.id is not None
        assert job.user_id is None


def test_global_overlay_source_uniqueness(tmp_path, monkeypatch):
    monkeypatch.setenv("DATABASE_URL", f"sqlite:///{tmp_path/'sources.db'}")
    init_db()

    with session_scope() as db:
        overlay = Overlay(name="l4d2center-maps", path="1", type="l4d2center_maps", user_id=None)
        db.add(overlay)
        db.flush()
        db.add(
            GlobalOverlaySource(
                overlay_id=overlay.id,
                source_key="l4d2center-maps",
                source_type="l4d2center_csv",
                source_url="https://l4d2center.com/maps/servers/index.csv",
            )
        )

    try:
        with session_scope() as db:
            other = Overlay(name="cedapug-maps", path="2", type="cedapug_maps", user_id=None)
            db.add(other)
            db.flush()
            db.add(
                GlobalOverlaySource(
                    overlay_id=other.id,
                    source_key="l4d2center-maps",
                    source_type="l4d2center_csv",
                    source_url="https://example.invalid/duplicate",
                )
            )
    except IntegrityError:
        pass
    else:
        raise AssertionError("duplicate source_key must fail")


def test_global_overlay_items_and_files_are_unique_per_parent(tmp_path, monkeypatch):
    monkeypatch.setenv("DATABASE_URL", f"sqlite:///{tmp_path/'items.db'}")
    init_db()

    with session_scope() as db:
        overlay = Overlay(name="cedapug-maps", path="1", type="cedapug_maps", user_id=None)
        db.add(overlay)
        db.flush()
        source = GlobalOverlaySource(
            overlay_id=overlay.id,
            source_key="cedapug-maps",
            source_type="cedapug_custom_page",
            source_url="https://cedapug.com/custom",
        )
        db.add(source)
        db.flush()
        item = GlobalOverlayItem(
            source_id=source.id,
            item_key="FatalFreight.zip",
            display_name="Fatal Freight",
            download_url="https://cedapug.com/maps/FatalFreight.zip",
            expected_vpk_name="FatalFreight.vpk",
        )
        db.add(item)
        db.flush()
        db.add(
            GlobalOverlayItemFile(
                item_id=item.id,
                vpk_name="FatalFreight.vpk",
                cache_path="cedapug-maps/vpks/FatalFreight.vpk",
                size=123,
                md5="",
            )
        )

    try:
        with session_scope() as db:
            source = db.query(GlobalOverlaySource).filter_by(source_key="cedapug-maps").one()
            db.add(
                GlobalOverlayItem(
                    source_id=source.id,
                    item_key="FatalFreight.zip",
                    display_name="Fatal Freight duplicate",
                    download_url="https://cedapug.com/maps/FatalFreight.zip",
                    expected_vpk_name="FatalFreight.vpk",
                )
            )
    except IntegrityError:
        pass
    else:
        raise AssertionError("duplicate item_key per source must fail")


def test_normal_user_rows_still_require_real_users(tmp_path, monkeypatch):
    monkeypatch.setenv("DATABASE_URL", f"sqlite:///{tmp_path/'users.db'}")
    init_db()

    with session_scope() as db:
        user = User(username="alice", password_digest="digest", admin=False)
        db.add(user)
        db.flush()
        db.add(Job(user_id=user.id, server_id=None, operation="install", state="queued"))
```

- [ ] **Step 2: Extend job log tests for nullable user jobs**

Append this test to `l4d2web/tests/test_job_logs.py`:

```python
def test_system_job_logs_persist(db_session):
    from l4d2web.models import Job, JobLog
    from l4d2web.services.job_worker import append_job_log

    job = Job(user_id=None, server_id=None, operation="refresh_global_overlays", state="queued")
    db_session.add(job)
    db_session.flush()

    seq = append_job_log(db_session, job.id, "stdout", "queued by system timer")
    db_session.flush()

    row = db_session.query(JobLog).filter_by(job_id=job.id).one()
    assert seq == 1
    assert row.line == "queued by system timer"
```

- [ ] **Step 3: Run tests and verify failure**

Run: `pytest l4d2web/tests/test_global_overlay_models.py l4d2web/tests/test_job_logs.py -q`

Expected: FAIL with missing `GlobalOverlaySource` import or `jobs.user_id` nullability failure.

---

## Task 2: Schema Migration And ORM Models

**Files:**

- Modify: `l4d2web/models.py`
- Create: `l4d2web/alembic/versions/0003_global_map_overlays.py`
- Test: `l4d2web/tests/test_global_overlay_models.py`
- Test: `l4d2web/tests/test_job_logs.py`

- [ ] **Step 1: Add ORM classes and nullable job owner**

Modify `l4d2web/models.py`:

```python
class GlobalOverlaySource(Base):
    __tablename__ = "global_overlay_sources"
    __table_args__ = (
        Index("ix_global_overlay_sources_type", "source_type"),
    )

    id: Mapped[int] = mapped_column(Integer, primary_key=True)
    overlay_id: Mapped[int] = mapped_column(
        ForeignKey("overlays.id", ondelete="CASCADE"), unique=True, nullable=False
    )
    source_key: Mapped[str] = mapped_column(String(64), unique=True, nullable=False)
    source_type: Mapped[str] = mapped_column(String(32), nullable=False)
    source_url: Mapped[str] = mapped_column(Text, nullable=False)
    last_manifest_hash: Mapped[str] = mapped_column(String(64), default="", nullable=False)
    last_refreshed_at: Mapped[datetime | None] = mapped_column(DateTime, nullable=True)
    last_error: Mapped[str] = mapped_column(Text, default="", nullable=False)
    created_at: Mapped[datetime] = mapped_column(DateTime, default=now_utc, nullable=False)
    updated_at: Mapped[datetime] = mapped_column(DateTime, default=now_utc, nullable=False)


class GlobalOverlayItem(Base):
    __tablename__ = "global_overlay_items"
    __table_args__ = (
        UniqueConstraint("source_id", "item_key", name="uq_global_overlay_item_source_key"),
        Index("ix_global_overlay_items_source", "source_id"),
    )

    id: Mapped[int] = mapped_column(Integer, primary_key=True)
    source_id: Mapped[int] = mapped_column(
        ForeignKey("global_overlay_sources.id", ondelete="CASCADE"), nullable=False
    )
    item_key: Mapped[str] = mapped_column(String(255), nullable=False)
    display_name: Mapped[str] = mapped_column(String(255), default="", nullable=False)
    download_url: Mapped[str] = mapped_column(Text, nullable=False)
    expected_vpk_name: Mapped[str] = mapped_column(String(255), default="", nullable=False)
    expected_size: Mapped[int | None] = mapped_column(BigInteger, nullable=True)
    expected_md5: Mapped[str] = mapped_column(String(32), default="", nullable=False)
    etag: Mapped[str] = mapped_column(String(255), default="", nullable=False)
    last_modified: Mapped[str] = mapped_column(String(255), default="", nullable=False)
    content_length: Mapped[int | None] = mapped_column(BigInteger, nullable=True)
    last_downloaded_at: Mapped[datetime | None] = mapped_column(DateTime, nullable=True)
    last_error: Mapped[str] = mapped_column(Text, default="", nullable=False)
    created_at: Mapped[datetime] = mapped_column(DateTime, default=now_utc, nullable=False)
    updated_at: Mapped[datetime] = mapped_column(DateTime, default=now_utc, nullable=False)


class GlobalOverlayItemFile(Base):
    __tablename__ = "global_overlay_item_files"
    __table_args__ = (
        UniqueConstraint("item_id", "vpk_name", name="uq_global_overlay_item_file_name"),
        Index("ix_global_overlay_item_files_item", "item_id"),
    )

    id: Mapped[int] = mapped_column(Integer, primary_key=True)
    item_id: Mapped[int] = mapped_column(
        ForeignKey("global_overlay_items.id", ondelete="CASCADE"), nullable=False
    )
    vpk_name: Mapped[str] = mapped_column(String(255), nullable=False)
    cache_path: Mapped[str] = mapped_column(Text, nullable=False)
    size: Mapped[int] = mapped_column(BigInteger, nullable=False)
    md5: Mapped[str] = mapped_column(String(32), default="", nullable=False)
    created_at: Mapped[datetime] = mapped_column(DateTime, default=now_utc, nullable=False)
    updated_at: Mapped[datetime] = mapped_column(DateTime, default=now_utc, nullable=False)
```

Also change the `Job` mapping:

```python
user_id: Mapped[int | None] = mapped_column(ForeignKey("users.id"), nullable=True)
```

- [ ] **Step 2: Add Alembic migration**

Create `l4d2web/alembic/versions/0003_global_map_overlays.py`:

```python
"""global map overlays

Revision ID: 0003_global_map_overlays
Revises: 0002_workshop_overlays
Create Date: 2026-05-07
"""
from typing import Sequence, Union

from alembic import op
import sqlalchemy as sa


revision: str = "0003_global_map_overlays"
down_revision: Union[str, Sequence[str], None] = "0002_workshop_overlays"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None


def upgrade() -> None:
    with op.batch_alter_table("jobs") as batch_op:
        batch_op.alter_column("user_id", existing_type=sa.Integer(), nullable=True)

    op.create_table(
        "global_overlay_sources",
        sa.Column("id", sa.Integer(), primary_key=True),
        sa.Column("overlay_id", sa.Integer(), sa.ForeignKey("overlays.id", ondelete="CASCADE"), nullable=False, unique=True),
        sa.Column("source_key", sa.String(length=64), nullable=False, unique=True),
        sa.Column("source_type", sa.String(length=32), nullable=False),
        sa.Column("source_url", sa.Text(), nullable=False),
        sa.Column("last_manifest_hash", sa.String(length=64), nullable=False, server_default=""),
        sa.Column("last_refreshed_at", sa.DateTime(), nullable=True),
        sa.Column("last_error", sa.Text(), nullable=False, server_default=""),
        sa.Column("created_at", sa.DateTime(), nullable=False),
        sa.Column("updated_at", sa.DateTime(), nullable=False),
    )
    op.create_index("ix_global_overlay_sources_type", "global_overlay_sources", ["source_type"])

    op.create_table(
        "global_overlay_items",
        sa.Column("id", sa.Integer(), primary_key=True),
        sa.Column("source_id", sa.Integer(), sa.ForeignKey("global_overlay_sources.id", ondelete="CASCADE"), nullable=False),
        sa.Column("item_key", sa.String(length=255), nullable=False),
        sa.Column("display_name", sa.String(length=255), nullable=False, server_default=""),
        sa.Column("download_url", sa.Text(), nullable=False),
        sa.Column("expected_vpk_name", sa.String(length=255), nullable=False, server_default=""),
        sa.Column("expected_size", sa.BigInteger(), nullable=True),
        sa.Column("expected_md5", sa.String(length=32), nullable=False, server_default=""),
        sa.Column("etag", sa.String(length=255), nullable=False, server_default=""),
        sa.Column("last_modified", sa.String(length=255), nullable=False, server_default=""),
        sa.Column("content_length", sa.BigInteger(), nullable=True),
        sa.Column("last_downloaded_at", sa.DateTime(), nullable=True),
        sa.Column("last_error", sa.Text(), nullable=False, server_default=""),
        sa.Column("created_at", sa.DateTime(), nullable=False),
        sa.Column("updated_at", sa.DateTime(), nullable=False),
        sa.UniqueConstraint("source_id", "item_key", name="uq_global_overlay_item_source_key"),
    )
    op.create_index("ix_global_overlay_items_source", "global_overlay_items", ["source_id"])

    op.create_table(
        "global_overlay_item_files",
        sa.Column("id", sa.Integer(), primary_key=True),
        sa.Column("item_id", sa.Integer(), sa.ForeignKey("global_overlay_items.id", ondelete="CASCADE"), nullable=False),
        sa.Column("vpk_name", sa.String(length=255), nullable=False),
        sa.Column("cache_path", sa.Text(), nullable=False),
        sa.Column("size", sa.BigInteger(), nullable=False),
        sa.Column("md5", sa.String(length=32), nullable=False, server_default=""),
        sa.Column("created_at", sa.DateTime(), nullable=False),
        sa.Column("updated_at", sa.DateTime(), nullable=False),
        sa.UniqueConstraint("item_id", "vpk_name", name="uq_global_overlay_item_file_name"),
    )
    op.create_index("ix_global_overlay_item_files_item", "global_overlay_item_files", ["item_id"])


def downgrade() -> None:
    op.drop_index("ix_global_overlay_item_files_item", table_name="global_overlay_item_files")
    op.drop_table("global_overlay_item_files")
    op.drop_index("ix_global_overlay_items_source", table_name="global_overlay_items")
    op.drop_table("global_overlay_items")
    op.drop_index("ix_global_overlay_sources_type", table_name="global_overlay_sources")
    op.drop_table("global_overlay_sources")

    with op.batch_alter_table("jobs") as batch_op:
        batch_op.alter_column("user_id", existing_type=sa.Integer(), nullable=False)
```

- [ ] **Step 3: Run model tests**

Run: `pytest l4d2web/tests/test_global_overlay_models.py l4d2web/tests/test_job_logs.py -q`

Expected: PASS.

- [ ] **Step 4: Run migration smoke test**

Run: `DATABASE_URL=sqlite:////tmp/left4me-global-overlays-plan.db alembic -c l4d2web/alembic.ini upgrade head`

Expected: command exits 0 and creates the new tables.

---

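To confirm the smoke test really created the tables, querying `sqlite_master` directly is enough. A sketch using an in-memory stand-in for the migrated database file (point `connect()` at the real path from the command above when running it for real):

```python
import sqlite3

# Stand-in DB with one table created inline; against the migrated file this
# query should return all three global overlay tables.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE global_overlay_sources (id INTEGER PRIMARY KEY)")
names = {row[0] for row in conn.execute("SELECT name FROM sqlite_master WHERE type='table'")}
assert "global_overlay_sources" in names
```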
## Task 3: Global Overlay Seeding And Managed-Type Policy

**Files:**

- Create: `l4d2web/services/global_overlays.py`
- Create: `l4d2web/tests/test_global_overlays.py`
- Modify: `l4d2web/routes/overlay_routes.py`
- Modify: `l4d2web/tests/test_overlays.py`

- [ ] **Step 1: Write failing seeding tests**

Create `l4d2web/tests/test_global_overlays.py`:

```python
from l4d2web.db import init_db, session_scope
from l4d2web.models import GlobalOverlaySource, Job, Overlay


def test_ensure_global_overlays_creates_singletons(tmp_path, monkeypatch):
    monkeypatch.setenv("DATABASE_URL", f"sqlite:///{tmp_path/'seed.db'}")
    monkeypatch.setenv("LEFT4ME_ROOT", str(tmp_path))
    init_db()

    from l4d2web.services.global_overlays import ensure_global_overlays

    with session_scope() as db:
        created = ensure_global_overlays(db)
        again = ensure_global_overlays(db)

    with session_scope() as db:
        overlays = db.query(Overlay).order_by(Overlay.name).all()
        sources = db.query(GlobalOverlaySource).order_by(GlobalOverlaySource.source_key).all()

        assert created == {"cedapug-maps", "l4d2center-maps"}
        assert again == set()
        assert [overlay.name for overlay in overlays] == ["cedapug-maps", "l4d2center-maps"]
        assert {overlay.type for overlay in overlays} == {"cedapug_maps", "l4d2center_maps"}
        assert all(overlay.user_id is None for overlay in overlays)
        assert {source.source_key for source in sources} == {"cedapug-maps", "l4d2center-maps"}
        assert (tmp_path / "overlays" / overlays[0].path).is_dir()
        assert (tmp_path / "overlays" / overlays[1].path).is_dir()


def test_enqueue_refresh_global_overlays_coalesces_queued_and_running(tmp_path, monkeypatch):
    monkeypatch.setenv("DATABASE_URL", f"sqlite:///{tmp_path/'jobs.db'}")
    monkeypatch.setenv("LEFT4ME_ROOT", str(tmp_path))
    init_db()

    from l4d2web.services.global_overlays import enqueue_refresh_global_overlays

    with session_scope() as db:
        first = enqueue_refresh_global_overlays(db, user_id=None)
        second = enqueue_refresh_global_overlays(db, user_id=None)
        assert first.id == second.id

    with session_scope() as db:
        job = db.query(Job).filter_by(operation="refresh_global_overlays").one()
        job.state = "running"

    with session_scope() as db:
        running = enqueue_refresh_global_overlays(db, user_id=None)
        assert running.state == "running"
        assert db.query(Job).filter_by(operation="refresh_global_overlays").count() == 1


def test_managed_global_types_are_not_creatable():
    from l4d2web.services.global_overlays import is_creatable_overlay_type

    assert is_creatable_overlay_type("workshop", admin=False) is True
    assert is_creatable_overlay_type("external", admin=False) is False
    assert is_creatable_overlay_type("external", admin=True) is True
    assert is_creatable_overlay_type("l4d2center_maps", admin=True) is False
    assert is_creatable_overlay_type("cedapug_maps", admin=True) is False
```

- [ ] **Step 2: Extend overlay route test for managed type rejection**

Append to `l4d2web/tests/test_overlays.py`:

```python
def test_admin_cannot_create_managed_global_overlay_type(admin_client) -> None:
    response = admin_client.post(
        "/overlays",
        data={"name": "duplicate-center", "type": "l4d2center_maps"},
        headers={"X-CSRF-Token": "test-token"},
    )
    assert response.status_code == 400
    assert "unknown overlay type" in response.get_data(as_text=True)
```

- [ ] **Step 3: Run tests and verify failure**

Run: `pytest l4d2web/tests/test_global_overlays.py l4d2web/tests/test_overlays.py -q`

Expected: FAIL with missing `l4d2web.services.global_overlays`.

- [ ] **Step 4: Implement seeding helper and type policy**

Create `l4d2web/services/global_overlays.py`:

```python
from __future__ import annotations

from dataclasses import dataclass
from pathlib import Path

from sqlalchemy import select
from sqlalchemy.orm import Session

from l4d2host.paths import get_left4me_root

from l4d2web.models import GlobalOverlaySource, Job, Overlay, now_utc
from l4d2web.services.overlay_creation import generate_overlay_path


@dataclass(frozen=True, slots=True)
class ManagedGlobalOverlay:
    name: str
    overlay_type: str
    source_type: str
    source_url: str


GLOBAL_OVERLAYS: tuple[ManagedGlobalOverlay, ...] = (
    ManagedGlobalOverlay(
        name="l4d2center-maps",
        overlay_type="l4d2center_maps",
        source_type="l4d2center_csv",
        source_url="https://l4d2center.com/maps/servers/index.csv",
    ),
    ManagedGlobalOverlay(
        name="cedapug-maps",
        overlay_type="cedapug_maps",
        source_type="cedapug_custom_page",
        source_url="https://cedapug.com/custom",
    ),
)

MANAGED_GLOBAL_OVERLAY_TYPES = {entry.overlay_type for entry in GLOBAL_OVERLAYS}
USER_CREATABLE_TYPES = {"workshop"}
ADMIN_CREATABLE_TYPES = {"external", "workshop"}


def is_creatable_overlay_type(overlay_type: str, *, admin: bool) -> bool:
    allowed = ADMIN_CREATABLE_TYPES if admin else USER_CREATABLE_TYPES
    return overlay_type in allowed


def ensure_global_overlays(session: Session) -> set[str]:
    created: set[str] = set()
    for entry in GLOBAL_OVERLAYS:
        overlay = session.scalar(select(Overlay).where(Overlay.name == entry.name, Overlay.user_id.is_(None)))
        if overlay is None:
            overlay = Overlay(name=entry.name, path="", type=entry.overlay_type, user_id=None)
            session.add(overlay)
            session.flush()
            overlay.path = generate_overlay_path(overlay.id)
            session.flush()
            _overlay_root(overlay).mkdir(parents=True, exist_ok=False)
            created.add(entry.name)
        else:
            overlay.type = entry.overlay_type
            overlay.user_id = None
            if not overlay.path:
                overlay.path = generate_overlay_path(overlay.id)
            _overlay_root(overlay).mkdir(parents=True, exist_ok=True)

        source = session.scalar(select(GlobalOverlaySource).where(GlobalOverlaySource.source_key == entry.name))
        if source is None:
            source = GlobalOverlaySource(
                overlay_id=overlay.id,
                source_key=entry.name,
                source_type=entry.source_type,
                source_url=entry.source_url,
            )
            session.add(source)
        else:
            source.overlay_id = overlay.id
            source.source_type = entry.source_type
            source.source_url = entry.source_url
            source.updated_at = now_utc()
        session.flush()
    return created


def enqueue_refresh_global_overlays(session: Session, *, user_id: int | None) -> Job:
    existing = session.scalar(
        select(Job)
        .where(
            Job.operation == "refresh_global_overlays",
            Job.state.in_(["queued", "running", "cancelling"]),
        )
        .order_by(Job.created_at)
    )
    if existing is not None:
        return existing
    job = Job(user_id=user_id, server_id=None, overlay_id=None, operation="refresh_global_overlays", state="queued")
    session.add(job)
    session.flush()
    return job


def _overlay_root(overlay: Overlay) -> Path:
    return get_left4me_root() / "overlays" / overlay.path
```

- [ ] **Step 5: Use policy in overlay route**

Modify `l4d2web/routes/overlay_routes.py`:

```python
from l4d2web.services.global_overlays import is_creatable_overlay_type


VALID_TYPES = {"external", "workshop"}
```

Replace the create-route type check with:

```python
if not is_creatable_overlay_type(overlay_type, admin=user.admin):
    return Response(f"unknown overlay type: {overlay_type}", status=400)
```

Keep the existing external/workshop scope behavior unchanged.

- [ ] **Step 6: Run tests and verify pass**

Run: `pytest l4d2web/tests/test_global_overlays.py l4d2web/tests/test_overlays.py -q`

Expected: PASS.

---

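The summary says the new builder reconciles addons symlinks against the per-source vpk cache while leaving foreign symlinks untouched. The builder itself lands in `overlay_builders.py` in a later task; a minimal sketch of that reconciliation loop (function and parameter names are hypothetical) keeps three invariants: managed symlinks point into the cache, stale managed links are removed, and anything not pointing into the cache is someone else's and is left alone:

```python
import os
from pathlib import Path


def reconcile_symlinks(addons_dir: Path, cache_dir: Path, expected_vpks: set[str]) -> None:
    """Ensure addons_dir has exactly one symlink per expected vpk into cache_dir."""
    addons_dir.mkdir(parents=True, exist_ok=True)
    for entry in list(addons_dir.iterdir()):
        if not entry.is_symlink():
            continue
        target = Path(os.readlink(entry))
        if cache_dir not in target.parents:
            continue  # foreign symlink: not managed by this builder, leave it alone
        if entry.name not in expected_vpks:
            entry.unlink()  # stale managed symlink for a removed map
    for name in expected_vpks:
        link = addons_dir / name
        if not link.is_symlink():
            link.symlink_to(cache_dir / name)
```

The real builder would derive `expected_vpks` from `GlobalOverlayItemFile` rows and run only after the cache guard has confirmed every file exists.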
## Task 4: Source Manifest Parsers

**Files:**

- Create: `l4d2web/services/global_map_sources.py`
- Create: `l4d2web/tests/test_global_map_sources.py`

- [ ] **Step 1: Write failing parser tests**

Create `l4d2web/tests/test_global_map_sources.py`:

```python
from l4d2web.services.global_map_sources import (
    GlobalMapManifestItem,
    parse_cedapug_custom_html,
    parse_l4d2center_csv,
)


def test_parse_l4d2center_csv_semicolon_manifest():
    raw = """Name;Size;md5;Download link
carriedoff.vpk;128660532;0380e12c57156574e17a96da1252cf21;https://l4d2center.com/maps/servers/carriedoff.7z
"""

    items = parse_l4d2center_csv(raw)

    assert items == [
        GlobalMapManifestItem(
            item_key="carriedoff.vpk",
            display_name="carriedoff.vpk",
            download_url="https://l4d2center.com/maps/servers/carriedoff.7z",
            expected_vpk_name="carriedoff.vpk",
            expected_size=128660532,
            expected_md5="0380e12c57156574e17a96da1252cf21",
        )
    ]


def test_parse_l4d2center_rejects_missing_header():
    try:
        parse_l4d2center_csv("bad,data\n")
    except ValueError as exc:
        assert "Name;Size;md5;Download link" in str(exc)
    else:
        raise AssertionError("bad header must fail")


def test_parse_cedapug_custom_html_extracts_relative_zip_links():
    # Raw string keeps the JSON \/ escapes literal without Python SyntaxWarning.
    html = r"""
<script>renderCustomMapDownloads([
["c1m1_hotel","<span style='color: #977d4c;'>Dead Center<\/span>"],
["l4d2_ff01_woods","<span style='color: #854C34;'>Fatal Freight<\/span>","\/maps\/FatalFreight.zip"],
["external","External","https://steamcommunity.com/sharedfiles/filedetails/?id=123"]
])</script>
"""

    items = parse_cedapug_custom_html(html)

    assert items == [
        GlobalMapManifestItem(
            item_key="FatalFreight.zip",
            display_name="Fatal Freight",
            download_url="https://cedapug.com/maps/FatalFreight.zip",
            expected_vpk_name="",
            expected_size=None,
            expected_md5="",
        )
    ]


def test_parse_cedapug_custom_html_rejects_missing_data():
    try:
        parse_cedapug_custom_html("<html></html>")
    except ValueError as exc:
        assert "renderCustomMapDownloads" in str(exc)
    else:
        raise AssertionError("missing embedded data must fail")
```

- [ ] **Step 2: Run tests and verify failure**

Run: `pytest l4d2web/tests/test_global_map_sources.py -q`

Expected: FAIL with missing module.

- [ ] **Step 3: Implement manifest parser module**

Create `l4d2web/services/global_map_sources.py`:

|
```python
|
|
from __future__ import annotations
|
|
|
|
import csv
|
|
from dataclasses import dataclass
|
|
import hashlib
|
|
import html as html_lib
|
|
import io
|
|
import json
|
|
from urllib.parse import urljoin, urlparse
|
|
import re
|
|
|
|
import requests
|
|
|
|
|
|
REQUEST_TIMEOUT_SECONDS = 30
|
|
L4D2CENTER_CSV_URL = "https://l4d2center.com/maps/servers/index.csv"
|
|
CEDAPUG_CUSTOM_URL = "https://cedapug.com/custom"
|
|
|
|
|
|
@dataclass(frozen=True, slots=True)
|
|
class GlobalMapManifestItem:
|
|
item_key: str
|
|
display_name: str
|
|
download_url: str
|
|
expected_vpk_name: str = ""
|
|
expected_size: int | None = None
|
|
expected_md5: str = ""
|
|
|
|
|
|
def fetch_l4d2center_manifest() -> tuple[str, list[GlobalMapManifestItem]]:
|
|
response = requests.get(L4D2CENTER_CSV_URL, timeout=REQUEST_TIMEOUT_SECONDS)
|
|
response.raise_for_status()
|
|
text = response.text
|
|
return _sha256(text), parse_l4d2center_csv(text)
|
|
|
|
|
|
def fetch_cedapug_manifest() -> tuple[str, list[GlobalMapManifestItem]]:
|
|
response = requests.get(CEDAPUG_CUSTOM_URL, timeout=REQUEST_TIMEOUT_SECONDS)
|
|
response.raise_for_status()
|
|
text = response.text
|
|
return _sha256(text), parse_cedapug_custom_html(text)
|
|
|
|
|
|
def parse_l4d2center_csv(raw: str) -> list[GlobalMapManifestItem]:
|
|
reader = csv.DictReader(io.StringIO(raw), delimiter=";")
|
|
expected = ["Name", "Size", "md5", "Download link"]
|
|
if reader.fieldnames != expected:
|
|
raise ValueError("expected L4D2Center CSV header: Name;Size;md5;Download link")
|
|
items: list[GlobalMapManifestItem] = []
|
|
for row in reader:
|
|
name = (row.get("Name") or "").strip()
|
|
size_raw = (row.get("Size") or "").strip()
|
|
md5 = (row.get("md5") or "").strip().lower()
|
|
url = (row.get("Download link") or "").strip()
|
|
if not name or not url:
|
|
continue
|
|
items.append(
|
|
GlobalMapManifestItem(
|
|
item_key=name,
|
|
display_name=name,
|
|
download_url=url,
|
|
expected_vpk_name=name,
|
|
expected_size=int(size_raw) if size_raw else None,
|
|
expected_md5=md5,
|
|
)
|
|
)
|
|
return items
|
|
|
|
|
|
def parse_cedapug_custom_html(raw: str) -> list[GlobalMapManifestItem]:
|
|
match = re.search(r"renderCustomMapDownloads\((\[.*?\])\)</script>", raw, re.DOTALL)
|
|
if match is None:
|
|
raise ValueError("CEDAPUG page did not contain renderCustomMapDownloads data")
|
|
rows = json.loads(match.group(1))
|
|
items: list[GlobalMapManifestItem] = []
|
|
for row in rows:
|
|
if len(row) < 3:
|
|
continue
|
|
label = str(row[1])
|
|
link = str(row[2])
|
|
if link.startswith("http"):
|
|
continue
|
|
if not link:
|
|
continue
|
|
url = urljoin(CEDAPUG_CUSTOM_URL, link)
|
|
parsed = urlparse(url)
|
|
basename = parsed.path.rsplit("/", 1)[-1]
|
|
items.append(
|
|
GlobalMapManifestItem(
|
|
item_key=basename,
|
|
display_name=_strip_html(label),
|
|
download_url=url,
|
|
)
|
|
)
|
|
return items
|
|
|
|
|
|
def _strip_html(raw: str) -> str:
|
|
no_tags = re.sub(r"<[^>]+>", "", raw)
|
|
return html_lib.unescape(no_tags).strip()
|
|
|
|
|
|
def _sha256(raw: str) -> str:
|
|
return hashlib.sha256(raw.encode("utf-8")).hexdigest()
|
|
```

- [ ] **Step 4: Run tests and verify pass**

Run: `pytest l4d2web/tests/test_global_map_sources.py -q`

Expected: PASS.

---

## Task 5: Global Map Cache, Download, And Extraction Helpers

**Files:**
- Create: `l4d2web/services/global_map_cache.py`
- Create: `l4d2web/tests/test_global_map_cache.py`
- Modify: `l4d2web/pyproject.toml`

- [ ] **Step 1: Add dependency**

Modify `l4d2web/pyproject.toml` dependencies:

```toml
"py7zr>=0.21",
```

- [ ] **Step 2: Write failing cache helper tests**

Create `l4d2web/tests/test_global_map_cache.py`:

```python
from zipfile import ZipFile

from l4d2web.services.global_map_cache import (
    extracted_vpk_md5,
    global_overlay_cache_root,
    safe_extract_zip_vpks,
    source_cache_root,
)


def test_global_overlay_cache_paths(tmp_path, monkeypatch):
    monkeypatch.setenv("LEFT4ME_ROOT", str(tmp_path))

    assert global_overlay_cache_root() == tmp_path / "global_overlay_cache"
    assert source_cache_root("l4d2center-maps") == tmp_path / "global_overlay_cache" / "l4d2center-maps"


def test_safe_extract_zip_vpks_extracts_only_vpks(tmp_path):
    archive = tmp_path / "maps.zip"
    with ZipFile(archive, "w") as zf:
        zf.writestr("FatalFreight.vpk", b"vpk-bytes")
        zf.writestr("readme.txt", b"ignore")

    out_dir = tmp_path / "out"
    files = safe_extract_zip_vpks(archive, out_dir)

    assert files == [out_dir / "FatalFreight.vpk"]
    assert (out_dir / "FatalFreight.vpk").read_bytes() == b"vpk-bytes"
    assert not (out_dir / "readme.txt").exists()


def test_safe_extract_zip_vpks_rejects_path_traversal(tmp_path):
    archive = tmp_path / "bad.zip"
    with ZipFile(archive, "w") as zf:
        zf.writestr("../evil.vpk", b"bad")

    try:
        safe_extract_zip_vpks(archive, tmp_path / "out")
    except ValueError as exc:
        assert "unsafe archive member" in str(exc)
    else:
        raise AssertionError("path traversal must fail")


def test_extracted_vpk_md5(tmp_path):
    p = tmp_path / "x.vpk"
    p.write_bytes(b"abc")
    assert extracted_vpk_md5(p) == "900150983cd24fb0d6963f7d28e17f72"
```

- [ ] **Step 3: Run tests and verify failure**

Run: `pytest l4d2web/tests/test_global_map_cache.py -q`

Expected: FAIL with missing module.

- [ ] **Step 4: Implement cache helpers**

Create `l4d2web/services/global_map_cache.py`:
```python
from __future__ import annotations

import hashlib
import os
import shutil
import tempfile
from pathlib import Path
from zipfile import ZipFile

import py7zr
import requests

from l4d2host.paths import get_left4me_root


REQUEST_TIMEOUT_SECONDS = 30
DOWNLOAD_CHUNK_BYTES = 1_048_576


def global_overlay_cache_root() -> Path:
    return get_left4me_root() / "global_overlay_cache"


def source_cache_root(source_key: str) -> Path:
    if "/" in source_key or ".." in source_key or not source_key:
        raise ValueError(f"invalid source_key: {source_key!r}")
    return global_overlay_cache_root() / source_key


def archive_dir(source_key: str) -> Path:
    return source_cache_root(source_key) / "archives"


def vpk_dir(source_key: str) -> Path:
    return source_cache_root(source_key) / "vpks"


def download_archive(url: str, target: Path, *, should_cancel=None) -> tuple[str, str, int | None]:
    target.parent.mkdir(parents=True, exist_ok=True)
    partial = target.with_suffix(target.suffix + ".partial")
    response = requests.get(url, stream=True, timeout=REQUEST_TIMEOUT_SECONDS)
    response.raise_for_status()
    etag = response.headers.get("ETag", "")
    last_modified = response.headers.get("Last-Modified", "")
    content_length_raw = response.headers.get("Content-Length")
    content_length = int(content_length_raw) if content_length_raw and content_length_raw.isdigit() else None
    try:
        with open(partial, "wb") as f:
            for chunk in response.iter_content(chunk_size=DOWNLOAD_CHUNK_BYTES):
                if should_cancel is not None and should_cancel():
                    raise InterruptedError("download cancelled")
                if chunk:
                    f.write(chunk)
        os.replace(partial, target)
    except BaseException:
        partial.unlink(missing_ok=True)
        raise
    return etag, last_modified, content_length


def safe_extract_zip_vpks(archive_path: Path, output_dir: Path) -> list[Path]:
    output_dir.mkdir(parents=True, exist_ok=True)
    extracted: list[Path] = []
    with ZipFile(archive_path) as zf:
        for member in zf.infolist():
            name = Path(member.filename)
            if name.is_absolute() or any(part in {"", ".", ".."} for part in name.parts):
                raise ValueError(f"unsafe archive member: {member.filename}")
            if name.suffix.lower() != ".vpk":
                continue
            target = output_dir / name.name
            with zf.open(member) as src, open(target, "wb") as dst:
                shutil.copyfileobj(src, dst)
            extracted.append(target)
    if not extracted:
        raise ValueError(f"archive {archive_path} did not contain any .vpk files")
    return sorted(extracted)


def safe_extract_7z_vpks(archive_path: Path, output_dir: Path) -> list[Path]:
    output_dir.mkdir(parents=True, exist_ok=True)
    with tempfile.TemporaryDirectory(prefix="left4me-7z-") as raw_tmp:
        raw_dir = Path(raw_tmp)
        with py7zr.SevenZipFile(archive_path, mode="r") as archive:
            names = archive.getnames()
            for name in names:
                p = Path(name)
                if p.is_absolute() or any(part in {"", ".", ".."} for part in p.parts):
                    raise ValueError(f"unsafe archive member: {name}")
            archive.extractall(path=raw_dir)
        extracted: list[Path] = []
        for candidate in raw_dir.rglob("*.vpk"):
            target = output_dir / candidate.name
            # shutil.move survives EXDEV when the temp dir and the cache live
            # on different filesystems; os.replace would fail there.
            shutil.move(str(candidate), str(target))
            extracted.append(target)
    if not extracted:
        raise ValueError(f"archive {archive_path} did not contain any .vpk files")
    return sorted(extracted)


def extracted_vpk_md5(path: Path) -> str:
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1024 * 1024), b""):
            digest.update(chunk)
    return digest.hexdigest()
```
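
The member-name guard shared by both extractors can be sanity-checked in isolation. This sketch reimplements just the predicate (the helper name is illustrative):

```python
from pathlib import Path


def is_unsafe_member(name: str) -> bool:
    # Mirrors the guard in safe_extract_zip_vpks / safe_extract_7z_vpks:
    # reject absolute paths and any '..' (or degenerate) path component.
    p = Path(name)
    return p.is_absolute() or any(part in {"", ".", ".."} for part in p.parts)


assert is_unsafe_member("../evil.vpk")
assert is_unsafe_member("/abs/evil.vpk")
assert not is_unsafe_member("maps/FatalFreight.vpk")
```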

- [ ] **Step 5: Run tests and verify pass**

Run: `pytest l4d2web/tests/test_global_map_cache.py -q`

Expected: PASS.

---

## Task 6: Global Map Overlay Builder

**Files:**
- Modify: `l4d2web/services/overlay_builders.py`
- Create: `l4d2web/tests/test_global_overlay_builders.py`

- [ ] **Step 1: Write failing builder tests**

Create `l4d2web/tests/test_global_overlay_builders.py`:

```python
import os
from pathlib import Path

from l4d2web.db import init_db, session_scope
from l4d2web.models import GlobalOverlayItem, GlobalOverlayItemFile, GlobalOverlaySource, Overlay
from l4d2web.services.overlay_builders import BUILDERS


def seed_source(tmp_path: Path, monkeypatch) -> int:
    monkeypatch.setenv("DATABASE_URL", f"sqlite:///{tmp_path/'builder.db'}")
    monkeypatch.setenv("LEFT4ME_ROOT", str(tmp_path))
    init_db()
    cache_vpk = tmp_path / "global_overlay_cache" / "l4d2center-maps" / "vpks" / "carriedoff.vpk"
    cache_vpk.parent.mkdir(parents=True, exist_ok=True)
    cache_vpk.write_bytes(b"vpk")
    with session_scope() as db:
        overlay = Overlay(name="l4d2center-maps", path="7", type="l4d2center_maps", user_id=None)
        db.add(overlay)
        db.flush()
        source = GlobalOverlaySource(
            overlay_id=overlay.id,
            source_key="l4d2center-maps",
            source_type="l4d2center_csv",
            source_url="https://l4d2center.com/maps/servers/index.csv",
        )
        db.add(source)
        db.flush()
        item = GlobalOverlayItem(
            source_id=source.id,
            item_key="carriedoff.vpk",
            display_name="carriedoff.vpk",
            download_url="https://example.invalid/carriedoff.7z",
            expected_vpk_name="carriedoff.vpk",
        )
        db.add(item)
        db.flush()
        db.add(
            GlobalOverlayItemFile(
                item_id=item.id,
                vpk_name="carriedoff.vpk",
                cache_path="l4d2center-maps/vpks/carriedoff.vpk",
                size=3,
                md5="",
            )
        )
        db.flush()
        return overlay.id


def test_registry_contains_global_map_builders():
    assert "l4d2center_maps" in BUILDERS
    assert "cedapug_maps" in BUILDERS


def test_global_builder_creates_absolute_symlink(tmp_path, monkeypatch):
    overlay_id = seed_source(tmp_path, monkeypatch)
    out: list[str] = []
    err: list[str] = []
    with session_scope() as db:
        overlay = db.query(Overlay).filter_by(id=overlay_id).one()
        BUILDERS["l4d2center_maps"].build(overlay, on_stdout=out.append, on_stderr=err.append, should_cancel=lambda: False)

    link = tmp_path / "overlays" / "7" / "left4dead2" / "addons" / "carriedoff.vpk"
    assert link.is_symlink()
    assert os.path.isabs(os.readlink(link))
    assert link.resolve() == (tmp_path / "global_overlay_cache" / "l4d2center-maps" / "vpks" / "carriedoff.vpk").resolve()
    assert any("global overlay" in line for line in out)


def test_global_builder_removes_obsolete_managed_symlink_but_keeps_foreign(tmp_path, monkeypatch):
    overlay_id = seed_source(tmp_path, monkeypatch)
    addons = tmp_path / "overlays" / "7" / "left4dead2" / "addons"
    addons.mkdir(parents=True, exist_ok=True)
    foreign_target = tmp_path / "foreign.vpk"
    foreign_target.write_bytes(b"foreign")
    os.symlink(str(foreign_target), addons / "foreign.vpk")

    with session_scope() as db:
        overlay = db.query(Overlay).filter_by(id=overlay_id).one()
        BUILDERS["l4d2center_maps"].build(overlay, on_stdout=lambda line: None, on_stderr=lambda line: None, should_cancel=lambda: False)
        source = db.query(GlobalOverlaySource).filter_by(source_key="l4d2center-maps").one()
        db.query(GlobalOverlayItem).filter_by(source_id=source.id).delete()

    with session_scope() as db:
        overlay = db.query(Overlay).filter_by(id=overlay_id).one()
        BUILDERS["l4d2center_maps"].build(overlay, on_stdout=lambda line: None, on_stderr=lambda line: None, should_cancel=lambda: False)

    assert not (addons / "carriedoff.vpk").exists()
    assert (addons / "foreign.vpk").is_symlink()
```
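
The absolute-target assertion above matters because a relative symlink target is resolved against the link's own directory, so relocating an addons dir would silently re-point relative links. A standalone illustration (all paths here are throwaway temp paths):

```python
import os
import tempfile
from pathlib import Path

with tempfile.TemporaryDirectory() as tmp:
    root = Path(tmp)
    target = root / "cache" / "map.vpk"
    target.parent.mkdir()
    target.write_bytes(b"vpk")
    link = root / "addons" / "map.vpk"
    link.parent.mkdir()
    os.symlink(str(target), link)  # absolute target, as the builder does
    # readlink returns the literal stored target; resolve() follows it.
    is_abs = os.path.isabs(os.readlink(link))
    resolves = link.resolve() == target.resolve()

assert is_abs
assert resolves
```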

- [ ] **Step 2: Run tests and verify failure**

Run: `pytest l4d2web/tests/test_global_overlay_builders.py -q`

Expected: FAIL because the registry lacks the global map overlay types.

- [ ] **Step 3: Implement `GlobalMapOverlayBuilder`**

Modify `l4d2web/services/overlay_builders.py` to add these imports:

```python
from l4d2web.models import GlobalOverlayItem, GlobalOverlayItemFile, GlobalOverlaySource
from l4d2web.services.global_map_cache import global_overlay_cache_root
from l4d2web.services.global_overlays import MANAGED_GLOBAL_OVERLAY_TYPES
```

Add the builder class before `BUILDERS`:
```python
class GlobalMapOverlayBuilder:
    """Reconcile symlinks for managed global map overlays."""

    def build(
        self,
        overlay: Overlay,
        *,
        on_stdout: LogSink,
        on_stderr: LogSink,
        should_cancel: CancelCheck,
    ) -> None:
        addons_dir = _overlay_root(overlay) / "left4dead2" / "addons"
        addons_dir.mkdir(parents=True, exist_ok=True)

        with session_scope() as db:
            source = db.scalar(select(GlobalOverlaySource).where(GlobalOverlaySource.overlay_id == overlay.id))
            if source is None:
                raise ValueError(f"global overlay source for overlay {overlay.id} not found")
            rows = db.execute(
                select(GlobalOverlayItemFile.vpk_name, GlobalOverlayItemFile.cache_path)
                .join(GlobalOverlayItem, GlobalOverlayItem.id == GlobalOverlayItemFile.item_id)
                .where(GlobalOverlayItem.source_id == source.id)
            ).all()
            source_key = source.source_key

        cache_root = global_overlay_cache_root().resolve()
        source_vpk_root = (global_overlay_cache_root() / source_key / "vpks").resolve()
        desired: dict[str, Path] = {}
        skipped = 0
        for vpk_name, cache_path_value in rows:
            target = (global_overlay_cache_root() / cache_path_value).resolve()
            if not _is_under(target, source_vpk_root) or not target.exists():
                on_stderr(f"global overlay {overlay.name!r}: missing cache file for {vpk_name}")
                skipped += 1
                continue
            desired[vpk_name] = target

        existing: dict[str, Path] = {}
        for entry in os.scandir(addons_dir):
            if not entry.is_symlink():
                continue
            try:
                resolved = Path(os.readlink(entry.path)).resolve(strict=False)
            except OSError:
                continue
            if _is_under(resolved, source_vpk_root):
                existing[entry.name] = resolved
            elif _is_under(resolved, cache_root):
                on_stderr(f"global overlay {overlay.name!r}: leaving foreign cache symlink {entry.name}")

        created = 0
        removed = 0
        unchanged = 0
        for name, current_target in existing.items():
            if should_cancel():
                on_stderr("global overlay build cancelled mid-removal")
                return
            desired_target = desired.get(name)
            if desired_target is None:
                os.unlink(addons_dir / name)
                removed += 1
            elif current_target == desired_target:
                unchanged += 1
            else:
                os.unlink(addons_dir / name)

        current_names = {
            name for name, current_target in existing.items() if name in desired and current_target == desired[name]
        }
        for name, target in desired.items():
            if should_cancel():
                on_stderr("global overlay build cancelled mid-creation")
                return
            if name in current_names:
                continue
            link_path = addons_dir / name
            if link_path.exists() and not link_path.is_symlink():
                on_stderr(f"refusing to overwrite non-symlink at {link_path}")
                continue
            if link_path.is_symlink():
                on_stderr(f"refusing to overwrite foreign symlink at {link_path}")
                continue
            os.symlink(str(target), str(link_path))
            created += 1

        on_stdout(
            f"global overlay {overlay.name!r}: created={created} removed={removed} "
            f"unchanged={unchanged} skipped(missing)={skipped}"
        )
```

Extend `BUILDERS`:

```python
BUILDERS: dict[str, OverlayBuilder] = {
    "external": ExternalBuilder(),
    "workshop": WorkshopBuilder(),
    "l4d2center_maps": GlobalMapOverlayBuilder(),
    "cedapug_maps": GlobalMapOverlayBuilder(),
}
```
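
At its core, the builder's reconcile pass is a set difference between desired and existing managed links. The self-contained sketch below shows only that core; unlike the real builder it deliberately skips the foreign-symlink, cache-root, and cancellation checks, and all names are illustrative:

```python
import os
import tempfile
from pathlib import Path


def reconcile(addons: Path, desired: dict[str, Path]) -> tuple[int, int]:
    """Create missing symlinks; drop links that are no longer desired."""
    created = removed = 0
    for entry in list(addons.iterdir()):
        if entry.is_symlink() and entry.name not in desired:
            entry.unlink()
            removed += 1
    for name, target in desired.items():
        link = addons / name
        if not link.is_symlink():
            os.symlink(str(target), link)
            created += 1
    return created, removed


with tempfile.TemporaryDirectory() as tmp:
    root = Path(tmp)
    (root / "vpks").mkdir()
    (root / "vpks" / "a.vpk").write_bytes(b"a")
    addons = root / "addons"
    addons.mkdir()
    # A stale managed link from a previous run; its target no longer exists.
    os.symlink(str(root / "vpks" / "stale.vpk"), addons / "stale.vpk")
    created, removed = reconcile(addons, {"a.vpk": root / "vpks" / "a.vpk"})
    assert (created, removed) == (1, 1)
```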

- [ ] **Step 4: Run tests and verify pass**

Run: `pytest l4d2web/tests/test_global_overlay_builders.py l4d2web/tests/test_overlay_builders.py -q`

Expected: PASS.

---

## Task 7: Refresh Service For Global Map Sources

**Files:**
- Create: `l4d2web/services/global_overlay_refresh.py`
- Create: `l4d2web/tests/test_global_overlay_refresh.py`

- [ ] **Step 1: Write failing refresh tests**

Create `l4d2web/tests/test_global_overlay_refresh.py`:

```python
from l4d2web.db import init_db, session_scope
from l4d2web.models import GlobalOverlayItem, GlobalOverlayItemFile, GlobalOverlaySource
from l4d2web.services.global_map_sources import GlobalMapManifestItem


def test_refresh_global_overlays_updates_manifest_items_and_invokes_builders(tmp_path, monkeypatch):
    monkeypatch.setenv("DATABASE_URL", f"sqlite:///{tmp_path/'refresh.db'}")
    monkeypatch.setenv("LEFT4ME_ROOT", str(tmp_path))
    init_db()

    from l4d2web.services import global_overlay_refresh
    monkeypatch.setattr(
        global_overlay_refresh,
        "fetch_l4d2center_manifest",
        lambda: ("hash-center", [GlobalMapManifestItem("carriedoff.vpk", "carriedoff.vpk", "https://example.invalid/carriedoff.7z", "carriedoff.vpk", 3, "")]),
    )
    monkeypatch.setattr(
        global_overlay_refresh,
        "fetch_cedapug_manifest",
        lambda: ("hash-ceda", [GlobalMapManifestItem("FatalFreight.zip", "Fatal Freight", "https://example.invalid/FatalFreight.zip")]),
    )

    def fake_download_and_extract(source_key, item, *, should_cancel):
        target = tmp_path / "global_overlay_cache" / source_key / "vpks" / (item.expected_vpk_name or item.item_key.replace(".zip", ".vpk"))
        target.parent.mkdir(parents=True, exist_ok=True)
        target.write_bytes(b"vpk")
        return [(target.name, f"{source_key}/vpks/{target.name}", 3, "")], "etag", "last-modified", 3

    built: list[str] = []
    monkeypatch.setattr(global_overlay_refresh, "download_and_extract_item", fake_download_and_extract)
    monkeypatch.setattr(global_overlay_refresh, "build_global_overlay", lambda overlay, **kwargs: built.append(overlay.name))

    out: list[str] = []
    result = global_overlay_refresh.refresh_global_overlays(on_stdout=out.append, on_stderr=out.append, should_cancel=lambda: False)

    assert result == ["cedapug-maps", "l4d2center-maps"]
    assert set(built) == {"cedapug-maps", "l4d2center-maps"}
    with session_scope() as db:
        assert db.query(GlobalOverlaySource).count() == 2
        assert db.query(GlobalOverlayItem).count() == 2
        assert db.query(GlobalOverlayItemFile).count() == 2


def test_refresh_removes_items_absent_from_manifest(tmp_path, monkeypatch):
    monkeypatch.setenv("DATABASE_URL", f"sqlite:///{tmp_path/'remove.db'}")
    monkeypatch.setenv("LEFT4ME_ROOT", str(tmp_path))
    init_db()

    from l4d2web.services import global_overlay_refresh
    from l4d2web.services.global_overlays import ensure_global_overlays

    with session_scope() as db:
        ensure_global_overlays(db)
        source = db.query(GlobalOverlaySource).filter_by(source_key="l4d2center-maps").one()
        item = GlobalOverlayItem(source_id=source.id, item_key="old.vpk", display_name="old.vpk", download_url="https://example.invalid/old.7z")
        db.add(item)
        db.flush()
        db.add(GlobalOverlayItemFile(item_id=item.id, vpk_name="old.vpk", cache_path="l4d2center-maps/vpks/old.vpk", size=3))

    monkeypatch.setattr(global_overlay_refresh, "fetch_l4d2center_manifest", lambda: ("empty-center", []))
    monkeypatch.setattr(global_overlay_refresh, "fetch_cedapug_manifest", lambda: ("empty-ceda", []))
    monkeypatch.setattr(global_overlay_refresh, "build_global_overlay", lambda overlay, **kwargs: None)

    global_overlay_refresh.refresh_global_overlays(on_stdout=lambda line: None, on_stderr=lambda line: None, should_cancel=lambda: False)

    with session_scope() as db:
        assert db.query(GlobalOverlayItem).filter_by(item_key="old.vpk").count() == 0
```

- [ ] **Step 2: Run tests and verify failure**

Run: `pytest l4d2web/tests/test_global_overlay_refresh.py -q`

Expected: FAIL with missing module.

- [ ] **Step 3: Implement refresh service**

Create `l4d2web/services/global_overlay_refresh.py`:
```python
from __future__ import annotations

import shutil
import tempfile
from datetime import UTC, datetime
from pathlib import Path

from sqlalchemy import select

from l4d2web.db import session_scope
from l4d2web.models import GlobalOverlayItem, GlobalOverlayItemFile, GlobalOverlaySource, Overlay
from l4d2web.services.global_map_cache import (
    archive_dir,
    download_archive,
    extracted_vpk_md5,
    safe_extract_7z_vpks,
    safe_extract_zip_vpks,
    vpk_dir,
)
from l4d2web.services.global_map_sources import (
    GlobalMapManifestItem,
    fetch_cedapug_manifest,
    fetch_l4d2center_manifest,
)
from l4d2web.services.global_overlays import ensure_global_overlays


def refresh_global_overlays(*, on_stdout, on_stderr, should_cancel) -> list[str]:
    with session_scope() as db:
        ensure_global_overlays(db)

    refreshed: list[str] = []
    for source_key, fetcher in (
        ("l4d2center-maps", fetch_l4d2center_manifest),
        ("cedapug-maps", fetch_cedapug_manifest),
    ):
        if should_cancel():
            on_stderr("global overlay refresh cancelled before manifest fetch")
            return refreshed
        manifest_hash, manifest_items = fetcher()
        on_stdout(f"{source_key}: fetched manifest with {len(manifest_items)} item(s)")
        overlay = _refresh_source(
            source_key,
            manifest_hash,
            manifest_items,
            on_stdout=on_stdout,
            on_stderr=on_stderr,
            should_cancel=should_cancel,
        )
        build_global_overlay(overlay, on_stdout=on_stdout, on_stderr=on_stderr, should_cancel=should_cancel)
        refreshed.append(source_key)
    return sorted(refreshed)


def _refresh_source(source_key: str, manifest_hash: str, manifest_items: list[GlobalMapManifestItem], *, on_stdout, on_stderr, should_cancel) -> Overlay:
    now = datetime.now(UTC)
    desired_keys = {item.item_key for item in manifest_items}
    with session_scope() as db:
        source = db.scalar(select(GlobalOverlaySource).where(GlobalOverlaySource.source_key == source_key))
        if source is None:
            raise ValueError(f"global overlay source {source_key!r} not found")
        overlay = db.scalar(select(Overlay).where(Overlay.id == source.overlay_id))
        if overlay is None:
            raise ValueError(f"overlay for source {source_key!r} not found")
        existing_items = {item.item_key: item for item in db.scalars(select(GlobalOverlayItem).where(GlobalOverlayItem.source_id == source.id)).all()}
        for old_key, old_item in list(existing_items.items()):
            if old_key not in desired_keys:
                db.delete(old_item)
        for manifest_item in manifest_items:
            item = existing_items.get(manifest_item.item_key)
            if item is None:
                item = GlobalOverlayItem(source_id=source.id, item_key=manifest_item.item_key, download_url=manifest_item.download_url)
                db.add(item)
                db.flush()
            item.display_name = manifest_item.display_name
            item.download_url = manifest_item.download_url
            item.expected_vpk_name = manifest_item.expected_vpk_name
            item.expected_size = manifest_item.expected_size
            item.expected_md5 = manifest_item.expected_md5
            item.updated_at = now
        source.last_manifest_hash = manifest_hash
        source.last_refreshed_at = now
        source.last_error = ""
        source.updated_at = now
        db.expunge(overlay)

    for manifest_item in manifest_items:
        if should_cancel():
            on_stderr(f"{source_key}: refresh cancelled during downloads")
            return overlay
        _refresh_item(source_key, manifest_item, on_stdout=on_stdout, on_stderr=on_stderr, should_cancel=should_cancel)
    return overlay


def _refresh_item(source_key: str, manifest_item: GlobalMapManifestItem, *, on_stdout, on_stderr, should_cancel) -> None:
    try:
        files, etag, last_modified, content_length = download_and_extract_item(source_key, manifest_item, should_cancel=should_cancel)
    except Exception as exc:
        with session_scope() as db:
            source = db.scalar(select(GlobalOverlaySource).where(GlobalOverlaySource.source_key == source_key))
            if source is not None:
                item = db.scalar(select(GlobalOverlayItem).where(GlobalOverlayItem.source_id == source.id, GlobalOverlayItem.item_key == manifest_item.item_key))
                if item is not None:
                    item.last_error = str(exc)
        on_stderr(f"{source_key}: {manifest_item.item_key}: {exc}")
        return

    now = datetime.now(UTC)
    with session_scope() as db:
        source = db.scalar(select(GlobalOverlaySource).where(GlobalOverlaySource.source_key == source_key))
        if source is None:
            raise ValueError(f"global overlay source {source_key!r} not found")
        item = db.scalar(select(GlobalOverlayItem).where(GlobalOverlayItem.source_id == source.id, GlobalOverlayItem.item_key == manifest_item.item_key))
        if item is None:
            raise ValueError(f"global overlay item {manifest_item.item_key!r} not found")
        db.query(GlobalOverlayItemFile).filter_by(item_id=item.id).delete()
        for vpk_name, cache_path, size, md5 in files:
            db.add(GlobalOverlayItemFile(item_id=item.id, vpk_name=vpk_name, cache_path=cache_path, size=size, md5=md5))
        item.etag = etag
        item.last_modified = last_modified
        item.content_length = content_length
        item.last_downloaded_at = now
        item.last_error = ""
        item.updated_at = now
    on_stdout(f"{source_key}: refreshed {manifest_item.item_key} ({len(files)} vpk file(s))")


def download_and_extract_item(source_key: str, item: GlobalMapManifestItem, *, should_cancel) -> tuple[list[tuple[str, str, int, str]], str, str, int | None]:
    archives = archive_dir(source_key)
    vpks = vpk_dir(source_key)
    archives.mkdir(parents=True, exist_ok=True)
    vpks.mkdir(parents=True, exist_ok=True)
    archive_name = item.download_url.rsplit("/", 1)[-1]
    archive_path = archives / archive_name
    etag, last_modified, content_length = download_archive(item.download_url, archive_path, should_cancel=should_cancel)
    with tempfile.TemporaryDirectory(prefix="left4me-global-map-") as tmp:
        tmp_dir = Path(tmp)
        if archive_name.lower().endswith(".7z"):
            extracted = safe_extract_7z_vpks(archive_path, tmp_dir)
        elif archive_name.lower().endswith(".zip"):
            extracted = safe_extract_zip_vpks(archive_path, tmp_dir)
        else:
            raise ValueError(f"unsupported archive extension for {archive_name}")
        results: list[tuple[str, str, int, str]] = []
        for path in extracted:
            if item.expected_vpk_name and path.name != item.expected_vpk_name:
                continue
            size = path.stat().st_size
            md5 = extracted_vpk_md5(path)
            if item.expected_size is not None and size != item.expected_size:
                raise ValueError(f"{path.name} size mismatch: expected {item.expected_size}, got {size}")
            if item.expected_md5 and md5 != item.expected_md5:
                raise ValueError(f"{path.name} md5 mismatch: expected {item.expected_md5}, got {md5}")
            final = vpks / path.name
            # shutil.move, not os.replace: the temp dir may sit on a different
            # filesystem than the cache, where os.replace raises EXDEV.
            shutil.move(str(path), str(final))
            results.append((path.name, f"{source_key}/vpks/{path.name}", size, md5))
    if not results:
        raise ValueError(f"no expected .vpk files extracted from {archive_name}")
    return results, etag, last_modified, content_length


def build_global_overlay(overlay: Overlay, *, on_stdout, on_stderr, should_cancel) -> None:
    from l4d2web.services.overlay_builders import BUILDERS

    builder = BUILDERS.get(overlay.type)
    if builder is None:
        raise ValueError(f"no builder registered for overlay type {overlay.type!r}")
    builder.build(overlay, on_stdout=on_stdout, on_stderr=on_stderr, should_cancel=should_cancel)
```
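
The `shutil.move` call above exists because `os.replace` is atomic only within one filesystem; when `/tmp` and the cache live on different mounts it raises `OSError(errno.EXDEV)`, while `shutil.move` falls back to copy-plus-unlink. A standalone sketch of the general pattern (the helper name is illustrative; within one filesystem the fast atomic path is taken):

```python
import errno
import os
import shutil
import tempfile
from pathlib import Path


def move_into_cache(src: Path, dst: Path) -> None:
    # Try the atomic rename first; fall back to a cross-device move.
    try:
        os.replace(src, dst)
    except OSError as exc:
        if exc.errno != errno.EXDEV:
            raise
        shutil.move(str(src), str(dst))


with tempfile.TemporaryDirectory() as tmp:
    src = Path(tmp) / "x.vpk"
    src.write_bytes(b"vpk")
    dst = Path(tmp) / "cached.vpk"
    move_into_cache(src, dst)
    assert dst.read_bytes() == b"vpk"
    assert not src.exists()
```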

- [ ] **Step 4: Run tests and verify pass**

Run: `pytest l4d2web/tests/test_global_overlay_refresh.py -q`

Expected: PASS.

---

## Task 8: Worker Operation, Scheduler Rules, And CLI

**Files:**
- Modify: `l4d2web/services/job_worker.py`
- Modify: `l4d2web/cli.py`
- Modify: `l4d2web/tests/test_job_worker.py`
- Create: `l4d2web/tests/test_global_overlay_cli.py`

- [ ] **Step 1: Add failing worker scheduler tests**

Append to `l4d2web/tests/test_job_worker.py`:
```python
def test_refresh_global_overlays_blocks_install_build_refresh_and_servers() -> None:
    from l4d2web.services.job_worker import SchedulerState, can_start

    state = SchedulerState(refresh_global_overlays_running=True)
    assert can_start(DummyJob(operation="install"), state) is False
    assert can_start(DummyJob(operation="refresh_workshop_items"), state) is False
    assert can_start(DummyJob(operation="build_overlay", overlay_id=1), state) is False
    assert can_start(DummyJob(operation="start", server_id=1), state) is False


def test_refresh_global_overlays_waits_for_active_work() -> None:
    from l4d2web.services.job_worker import SchedulerState, can_start

    assert can_start(DummyJob(operation="refresh_global_overlays"), SchedulerState(install_running=True)) is False
    assert can_start(DummyJob(operation="refresh_global_overlays"), SchedulerState(refresh_running=True)) is False
    state = SchedulerState()
    state.running_overlays.add(1)
    assert can_start(DummyJob(operation="refresh_global_overlays"), state) is False
    state = SchedulerState()
    state.running_servers.add(1)
    assert can_start(DummyJob(operation="refresh_global_overlays"), state) is False
```

Append a dispatch test:

```python
def test_run_worker_once_dispatches_refresh_global_overlays(seeded_worker, monkeypatch):
    from l4d2web.db import session_scope
    from l4d2web.models import Job
    from l4d2web.services import job_worker

    called = []

    def fake_refresh(*, on_stdout, on_stderr, should_cancel):
        called.append("refresh")
        on_stdout("global refresh complete")
        return ["l4d2center-maps"]

    monkeypatch.setattr(job_worker, "_run_refresh_global_overlays", fake_refresh)
    with session_scope() as db:
        job = Job(user_id=None, server_id=None, operation="refresh_global_overlays", state="queued")
        db.add(job)

    assert job_worker.run_worker_once() is True
    assert called == ["refresh"]
```

- [ ] **Step 2: Add failing CLI test**

Create `l4d2web/tests/test_global_overlay_cli.py`:

```python
from l4d2web.app import create_app
from l4d2web.db import init_db, session_scope
from l4d2web.models import Job


def test_refresh_global_overlays_cli_enqueues_system_job(tmp_path, monkeypatch):
    db_url = f"sqlite:///{tmp_path/'cli.db'}"
    monkeypatch.setenv("DATABASE_URL", db_url)
    monkeypatch.setenv("LEFT4ME_ROOT", str(tmp_path))
    app = create_app({"TESTING": True, "DATABASE_URL": db_url, "SECRET_KEY": "test"})
    init_db()

    result = app.test_cli_runner().invoke(args=["refresh-global-overlays"])

    assert result.exit_code == 0
    assert "queued refresh_global_overlays job" in result.output
    with session_scope() as db:
        job = db.query(Job).filter_by(operation="refresh_global_overlays").one()
        assert job.user_id is None
```

- [ ] **Step 3: Run tests and verify failure**

Run: `pytest l4d2web/tests/test_job_worker.py l4d2web/tests/test_global_overlay_cli.py -q`

Expected: FAIL because the worker and CLI do not know `refresh_global_overlays`.

- [ ] **Step 4: Update scheduler state and `can_start`**

Modify `l4d2web/services/job_worker.py`:

```python
GLOBAL_OPERATIONS = {"install", "refresh_workshop_items", "refresh_global_overlays"}
```

Extend `SchedulerState`:

```python
refresh_global_overlays_running: bool = False
```

In `can_start`, add:

```python
if job.operation == "refresh_global_overlays":
    return (
        not state.install_running
        and not state.refresh_running
        and not state.refresh_global_overlays_running
        and len(state.running_servers) == 0
        and len(state.running_overlays) == 0
    )
```

Update all other branches that already check `install_running` or `refresh_running` so they also check `refresh_global_overlays_running`.
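
The intent is that every existing gate treats the global refresh like the other exclusive operations. A standalone mirror of that predicate shape (the field names follow the snippets above; the helper name and `State` stand-in are illustrative):

```python
from dataclasses import dataclass, field


@dataclass
class State:  # stand-in for SchedulerState
    install_running: bool = False
    refresh_running: bool = False
    refresh_global_overlays_running: bool = False
    running_servers: set[int] = field(default_factory=set)
    running_overlays: set[int] = field(default_factory=set)


def exclusive_work_active(state: State) -> bool:
    # Every non-global branch should consult a combined check like this, so a
    # running global refresh blocks installs, builds, and server jobs alike.
    return (
        state.install_running
        or state.refresh_running
        or state.refresh_global_overlays_running
    )


assert exclusive_work_active(State(refresh_global_overlays_running=True))
assert not exclusive_work_active(State())
```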
In `build_scheduler_state`, add:

```python
elif job.operation == "refresh_global_overlays":
    state.refresh_global_overlays_running = True
```

- [ ] **Step 5: Dispatch worker operation**

In `run_job`, add before the `build_overlay` branch:

```python
elif operation == "refresh_global_overlays":
    _run_with_boundaries(
        "refresh",
        "global overlays",
        _run_refresh_global_overlays,
        on_stdout=on_stdout,
        on_stderr=on_stderr,
        should_cancel=should_cancel,
    )
```

Add helper:

```python
def _run_refresh_global_overlays(*, on_stdout, on_stderr, should_cancel) -> list[str]:
    from l4d2web.services.global_overlay_refresh import refresh_global_overlays

    return refresh_global_overlays(
        on_stdout=on_stdout,
        on_stderr=on_stderr,
        should_cancel=should_cancel,
    )
```

Change the `enqueue_build_overlay` signature so system jobs (with no owning user) can be enqueued:

```python
def enqueue_build_overlay(session: Session, *, overlay_id: int, user_id: int | None) -> Job:
```

- [ ] **Step 6: Add CLI command**

Modify `l4d2web/cli.py`:

```python
@click.command("refresh-global-overlays")
def refresh_global_overlays_command() -> None:
    from l4d2web.services.global_overlays import ensure_global_overlays, enqueue_refresh_global_overlays

    with session_scope() as db:
        ensure_global_overlays(db)
        job = enqueue_refresh_global_overlays(db, user_id=None)
        click.echo(f"queued refresh_global_overlays job #{job.id}")
```

Register it:

```python
app.cli.add_command(refresh_global_overlays_command)
```
|
|
|
|
- [ ] **Step 7: Run tests and verify pass**

Run: `pytest l4d2web/tests/test_job_worker.py l4d2web/tests/test_global_overlay_cli.py -q`

Expected: PASS.

---

## Task 9: Nullable Job Owner UI And Authorization

**Files:**
- Modify: `l4d2web/routes/job_routes.py`
- Modify: `l4d2web/routes/page_routes.py`
- Modify: `l4d2web/templates/_job_table.html`
- Modify: `l4d2web/templates/job_detail.html`
- Modify: `l4d2web/tests/test_pages.py`
- Modify: `l4d2web/tests/test_job_logs.py`

- [ ] **Step 1: Write failing page/access tests**

Append to `l4d2web/tests/test_pages.py`:

```python
def test_admin_jobs_page_renders_system_job(admin_client):
    from l4d2web.db import session_scope
    from l4d2web.models import Job

    with session_scope() as db:
        db.add(Job(user_id=None, server_id=None, operation="refresh_global_overlays", state="queued"))

    response = admin_client.get("/admin/jobs")
    text = response.get_data(as_text=True)

    assert response.status_code == 200
    assert "refresh_global_overlays" in text
    assert "system" in text


def test_non_admin_cannot_view_system_job(user_client_with_overlay):
    from l4d2web.db import session_scope
    from l4d2web.models import Job

    with session_scope() as db:
        job = Job(user_id=None, server_id=None, operation="refresh_global_overlays", state="queued")
        db.add(job)
        db.flush()
        job_id = job.id

    response = user_client_with_overlay.get(f"/jobs/{job_id}")
    assert response.status_code == 403
```

- [ ] **Step 2: Run tests and verify failure**

Run: `pytest l4d2web/tests/test_pages.py l4d2web/tests/test_job_logs.py -q`

Expected: FAIL because job queries use inner joins to `User` or templates access `owner.username`.

- [ ] **Step 3: Update job authorization and joins**

Modify `l4d2web/routes/job_routes.py`:

```python
def can_access_job(job: Job, user: User) -> bool:
    if user.admin:
        return True
    if job.user_id is None:
        return False
    return job.user_id == user.id
```

Change the job detail query join:

```python
.outerjoin(User, User.id == Job.user_id)
```

The template render remains `owner=owner`; `owner` may now be `None`.

- [ ] **Step 4: Update page route joins**

Modify each job-row query in `l4d2web/routes/page_routes.py` to use outer joins:

```python
.outerjoin(User, User.id == Job.user_id)
```

This applies to admin jobs, server detail recent jobs, and the server jobs page.

- [ ] **Step 5: Update templates to render `system`**

Modify `l4d2web/templates/_job_table.html`:

```jinja2
{% if show_user %}<td>{{ user.username if user else "system" }}</td>{% endif %}
```

Modify `l4d2web/templates/job_detail.html`:

```jinja2
<tr><th>User</th><td>{{ owner.username if owner else "system" }}</td></tr>
```

- [ ] **Step 6: Run tests and verify pass**

Run: `pytest l4d2web/tests/test_pages.py l4d2web/tests/test_job_logs.py -q`

Expected: PASS.

---

## Task 10: Global Overlay Visibility, Detail UI, And Manual Admin Refresh

**Files:**
- Modify: `l4d2web/routes/page_routes.py`
- Modify: `l4d2web/routes/overlay_routes.py`
- Modify: `l4d2web/routes/blueprint_routes.py`
- Modify: `l4d2web/templates/admin.html`
- Modify: `l4d2web/templates/overlay_detail.html`
- Modify: `l4d2web/tests/test_overlays.py`
- Modify: `l4d2web/tests/test_blueprints.py`

- [ ] **Step 1: Write failing visibility and admin refresh tests**

Append to `l4d2web/tests/test_overlays.py`:

```python
def test_global_map_overlays_visible_to_non_admin(user_client_with_overlay):
    from l4d2web.db import session_scope
    from l4d2web.models import GlobalOverlaySource, Overlay

    with session_scope() as db:
        overlay = Overlay(name="l4d2center-maps", path="7", type="l4d2center_maps", user_id=None)
        db.add(overlay)
        db.flush()
        db.add(
            GlobalOverlaySource(
                overlay_id=overlay.id,
                source_key="l4d2center-maps",
                source_type="l4d2center_csv",
                source_url="https://l4d2center.com/maps/servers/index.csv",
            )
        )

    response = user_client_with_overlay.get("/overlays")
    text = response.get_data(as_text=True)
    assert response.status_code == 200
    assert "l4d2center-maps" in text


def test_managed_global_overlay_detail_is_not_editable(admin_client):
    from l4d2web.db import session_scope
    from l4d2web.models import GlobalOverlaySource, Overlay

    with session_scope() as db:
        overlay = Overlay(name="cedapug-maps", path="8", type="cedapug_maps", user_id=None)
        db.add(overlay)
        db.flush()
        overlay_id = overlay.id
        db.add(
            GlobalOverlaySource(
                overlay_id=overlay.id,
                source_key="cedapug-maps",
                source_type="cedapug_custom_page",
                source_url="https://cedapug.com/custom",
            )
        )

    response = admin_client.get(f"/overlays/{overlay_id}")
    text = response.get_data(as_text=True)
    assert response.status_code == 200
    assert "https://cedapug.com/custom" in text
    assert f'action="/overlays/{overlay_id}"' not in text
    assert "delete-overlay-modal" not in text


def test_admin_can_enqueue_refresh_global_overlays(admin_client):
    response = admin_client.post("/admin/global-overlays/refresh", headers={"X-CSRF-Token": "test-token"})
    assert response.status_code == 302
    assert response.headers["Location"] == "/admin/jobs"
```

- [ ] **Step 2: Run tests and verify failure**

Run: `pytest l4d2web/tests/test_overlays.py -q`

Expected: FAIL because the non-admin overlay list currently shows only external or owned workshop overlays, and the admin refresh route is missing.
- [ ] **Step 3: Update overlay visibility**

Modify the overlay list query in `l4d2web/routes/page_routes.py`:

```python
if not user.admin:
    query = query.where(
        (Overlay.user_id.is_(None)) | (Overlay.user_id == user.id)
    )
```

Modify `overlay_detail` visibility:

```python
if not user.admin and overlay.user_id is not None and overlay.user_id != user.id:
    return Response(status=403)
```

Load global source metadata:

```python
global_source = None
if overlay.type in {"l4d2center_maps", "cedapug_maps"}:
    global_source = db.scalar(select(GlobalOverlaySource).where(GlobalOverlaySource.overlay_id == overlay.id))
```

Pass `global_source=global_source` into `render_template`.

- [ ] **Step 4: Protect managed overlays from edit/delete**

Modify `l4d2web/routes/overlay_routes.py`:

```python
from l4d2web.services.global_overlays import MANAGED_GLOBAL_OVERLAY_TYPES
```

Update `_can_edit_overlay`:

```python
if overlay.type in MANAGED_GLOBAL_OVERLAY_TYPES:
    return False
```

Keep the existing external/workshop checks after that.

- [ ] **Step 5: Add admin manual refresh route**

Modify `l4d2web/routes/page_routes.py`:

```python
@bp.post("/admin/global-overlays/refresh")
@require_admin
def enqueue_global_overlay_refresh() -> Response:
    user = current_user()
    assert user is not None
    from l4d2web.services.global_overlays import ensure_global_overlays, enqueue_refresh_global_overlays

    with session_scope() as db:
        ensure_global_overlays(db)
        enqueue_refresh_global_overlays(db, user_id=user.id)
    return redirect("/admin/jobs")
```
- [ ] **Step 6: Update templates**

Modify the `can_edit` line in `l4d2web/templates/overlay_detail.html`:

```jinja2
{% set managed_global = overlay.type in ['l4d2center_maps', 'cedapug_maps'] %}
{% set can_edit = (not managed_global) and (g.user.admin or (overlay.type == 'workshop' and overlay.user_id == g.user.id)) %}
```

Add source metadata below the definition table:

```jinja2
{% if global_source %}
<section class="panel">
  <h2>Managed source</h2>
  <table class="definition-table">
    <tbody>
      <tr><th>Source</th><td>{{ global_source.source_key }}</td></tr>
      <tr><th>URL</th><td><a href="{{ global_source.source_url }}">{{ global_source.source_url }}</a></td></tr>
      <tr><th>Last refreshed</th><td>{{ global_source.last_refreshed_at or "-" }}</td></tr>
      <tr><th>Last error</th><td>{{ global_source.last_error or "-" }}</td></tr>
    </tbody>
  </table>
</section>
{% endif %}
```

Modify `l4d2web/templates/admin.html` to add:

```jinja2
<section class="panel">
  <h2>Global map overlays</h2>
  <p class="muted">Queue a refresh for managed L4D2Center and CEDAPUG map overlays.</p>
  <form method="post" action="/admin/global-overlays/refresh">
    <input type="hidden" name="csrf_token" value="{{ session.get('csrf_token', '') }}">
    <button type="submit">Refresh global overlays</button>
  </form>
</section>
```

- [ ] **Step 7: Limit blueprint selection to system and own overlays**

Modify the blueprint detail overlay query in `l4d2web/routes/page_routes.py`:

```python
all_overlays = db.scalars(
    select(Overlay)
    .where((Overlay.user_id.is_(None)) | (Overlay.user_id == user.id))
    .order_by(Overlay.name)
).all()
```

- [ ] **Step 8: Run tests and verify pass**

Run: `pytest l4d2web/tests/test_overlays.py l4d2web/tests/test_blueprints.py l4d2web/tests/test_pages.py -q`

Expected: PASS.

---

## Task 11: Initialize-Time Guard For Global Map Cache Files

**Files:**
- Modify: `l4d2web/services/l4d2_facade.py`
- Modify: `l4d2web/tests/test_l4d2_facade.py`

- [ ] **Step 1: Write failing initialize guard test**

Append to `l4d2web/tests/test_l4d2_facade.py`:

```python
def test_initialize_fails_when_global_overlay_cache_file_missing(tmp_path, monkeypatch):
    from l4d2web.db import init_db, session_scope
    from l4d2web.models import (
        Blueprint,
        BlueprintOverlay,
        GlobalOverlayItem,
        GlobalOverlayItemFile,
        GlobalOverlaySource,
        Overlay,
        Server,
        User,
    )
    from l4d2web.services.l4d2_facade import initialize_server

    monkeypatch.setenv("DATABASE_URL", f"sqlite:///{tmp_path / 'facade-global.db'}")
    monkeypatch.setenv("LEFT4ME_ROOT", str(tmp_path))
    init_db()

    with session_scope() as db:
        user = User(username="alice", password_digest="digest")
        db.add(user)
        db.flush()
        overlay = Overlay(name="l4d2center-maps", path="7", type="l4d2center_maps", user_id=None)
        db.add(overlay)
        db.flush()
        source = GlobalOverlaySource(
            overlay_id=overlay.id,
            source_key="l4d2center-maps",
            source_type="l4d2center_csv",
            source_url="https://l4d2center.com/maps/servers/index.csv",
        )
        db.add(source)
        db.flush()
        item = GlobalOverlayItem(
            source_id=source.id,
            item_key="carriedoff.vpk",
            display_name="carriedoff.vpk",
            download_url="https://example.invalid/carriedoff.7z",
        )
        db.add(item)
        db.flush()
        db.add(GlobalOverlayItemFile(item_id=item.id, vpk_name="carriedoff.vpk", cache_path="l4d2center-maps/vpks/carriedoff.vpk", size=123))
        blueprint = Blueprint(user_id=user.id, name="bp", arguments="[]", config="[]")
        db.add(blueprint)
        db.flush()
        db.add(BlueprintOverlay(blueprint_id=blueprint.id, overlay_id=overlay.id, position=0))
        server = Server(user_id=user.id, blueprint_id=blueprint.id, name="alpha", port=27015)
        db.add(server)
        db.flush()
        server_id = server.id

    monkeypatch.setattr("l4d2web.services.host_commands.run_command", lambda *args, **kwargs: None)

    try:
        initialize_server(server_id)
    except RuntimeError as exc:
        assert "carriedoff.vpk" in str(exc)
        assert "l4d2center-maps" in str(exc)
    else:
        raise AssertionError("missing global overlay cache file must fail")
```

- [ ] **Step 2: Run test and verify failure**

Run: `pytest l4d2web/tests/test_l4d2_facade.py -q`

Expected: FAIL because the global cache guard is missing.
- [ ] **Step 3: Implement global cache guard**

Modify the imports in `l4d2web/services/l4d2_facade.py`:

```python
from l4d2web.models import GlobalOverlayItem, GlobalOverlayItemFile, GlobalOverlaySource
from l4d2web.services.global_map_cache import global_overlay_cache_root
```

After `_check_workshop_overlay_caches(blueprint_id=blueprint.id)`, call:

```python
_check_global_overlay_caches(blueprint_id=blueprint.id)
```

Add helper:

```python
def _check_global_overlay_caches(*, blueprint_id: int) -> None:
    with session_scope() as db:
        rows = db.execute(
            select(Overlay.name, GlobalOverlayItemFile.vpk_name, GlobalOverlayItemFile.cache_path)
            .join(BlueprintOverlay, BlueprintOverlay.overlay_id == Overlay.id)
            .join(GlobalOverlaySource, GlobalOverlaySource.overlay_id == Overlay.id)
            .join(GlobalOverlayItem, GlobalOverlayItem.source_id == GlobalOverlaySource.id)
            .join(GlobalOverlayItemFile, GlobalOverlayItemFile.item_id == GlobalOverlayItem.id)
            .where(BlueprintOverlay.blueprint_id == blueprint_id)
        ).all()

    missing: dict[str, list[str]] = {}
    root = global_overlay_cache_root()
    for overlay_name, vpk_name, cache_path_value in rows:
        if not (root / cache_path_value).exists():
            missing.setdefault(overlay_name, []).append(vpk_name)

    if not missing:
        return

    details = []
    for overlay_name, names in sorted(missing.items()):
        details.append(f"overlay {overlay_name!r}: missing {', '.join(sorted(names))}")
    raise RuntimeError("global overlay content missing — " + "; ".join(details))
```
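`global_overlay_cache_root` comes from the `l4d2web/services/global_map_cache.py` service created earlier in this plan. As a rough sketch of its expected shape only: the `LEFT4ME_ROOT` derivation and the fallback path are assumptions drawn from the test fixture above (which sets `LEFT4ME_ROOT`) and the deploy layout (`/var/lib/left4me/global_overlay_cache`), not a confirmed signature:

```python
import os
from pathlib import Path


def global_overlay_cache_root() -> Path:
    # Sketch only: derive the cache root from LEFT4ME_ROOT, falling back
    # to the deployed default used by the systemd units and deploy script.
    root = os.environ.get("LEFT4ME_ROOT", "/var/lib/left4me")
    return Path(root) / "global_overlay_cache"
```

Keeping the guard's path resolution in that one helper is what lets the facade test above redirect the cache under `tmp_path` with a single `monkeypatch.setenv`.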
- [ ] **Step 4: Run tests and verify pass**

Run: `pytest l4d2web/tests/test_l4d2_facade.py -q`

Expected: PASS.

---

## Task 12: Deployment Timer And Cache Provisioning

**Files:**
- Create: `deploy/files/usr/local/lib/systemd/system/left4me-refresh-global-overlays.service`
- Create: `deploy/files/usr/local/lib/systemd/system/left4me-refresh-global-overlays.timer`
- Modify: `deploy/deploy-test-server.sh`
- Modify: `deploy/README.md`
- Modify: `deploy/tests/test_deploy_artifacts.py`

- [ ] **Step 1: Write failing deployment tests**

Add constants to `deploy/tests/test_deploy_artifacts.py`:

```python
GLOBAL_REFRESH_SERVICE = DEPLOY / "files/usr/local/lib/systemd/system/left4me-refresh-global-overlays.service"
GLOBAL_REFRESH_TIMER = DEPLOY / "files/usr/local/lib/systemd/system/left4me-refresh-global-overlays.timer"
```

Add tests:

```python
def test_global_refresh_timer_units_exist_and_enqueue_only():
    service = GLOBAL_REFRESH_SERVICE.read_text()
    timer = GLOBAL_REFRESH_TIMER.read_text()

    assert "User=left4me" in service
    assert "EnvironmentFile=/etc/left4me/host.env" in service
    assert "EnvironmentFile=/etc/left4me/web.env" in service
    assert "flask --app l4d2web.app:create_app refresh-global-overlays" in service
    assert "OnCalendar=daily" in timer
    assert "Persistent=true" in timer
    assert "WantedBy=timers.target" in timer


def test_deploy_script_installs_and_enables_global_refresh_timer():
    script = DEPLOY_SCRIPT.read_text()

    assert "/var/lib/left4me/global_overlay_cache" in script
    assert "left4me-refresh-global-overlays.service" in script
    assert "left4me-refresh-global-overlays.timer" in script
    assert "systemctl enable --now left4me-refresh-global-overlays.timer" in script
```

- [ ] **Step 2: Run tests and verify failure**

Run: `pytest deploy/tests/test_deploy_artifacts.py -q`

Expected: FAIL because the timer units do not exist yet.
- [ ] **Step 3: Add systemd service**

Create `deploy/files/usr/local/lib/systemd/system/left4me-refresh-global-overlays.service`:

```ini
[Unit]
Description=left4me refresh global map overlays
After=network-online.target left4me-web.service
Wants=network-online.target

[Service]
Type=oneshot
User=left4me
Group=left4me
WorkingDirectory=/opt/left4me
Environment=HOME=/var/lib/left4me
Environment=PATH=/opt/left4me/.venv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
EnvironmentFile=/etc/left4me/host.env
EnvironmentFile=/etc/left4me/web.env
ExecStart=/opt/left4me/.venv/bin/flask --app l4d2web.app:create_app refresh-global-overlays
ProtectSystem=full
ReadWritePaths=/var/lib/left4me
```

- [ ] **Step 4: Add systemd timer**

Create `deploy/files/usr/local/lib/systemd/system/left4me-refresh-global-overlays.timer`:

```ini
[Unit]
Description=Daily left4me global map overlay refresh

[Timer]
OnCalendar=daily
Persistent=true
Unit=left4me-refresh-global-overlays.service

[Install]
WantedBy=timers.target
```
- [ ] **Step 5: Update deploy script**

Modify the directory creation block in `deploy/deploy-test-server.sh` to include:

```sh
/var/lib/left4me/global_overlay_cache \
```

Modify the chown block to include:

```sh
/var/lib/left4me/global_overlay_cache \
```

Copy the units:

```sh
$sudo_cmd cp /opt/left4me/deploy/files/usr/local/lib/systemd/system/left4me-refresh-global-overlays.service /usr/local/lib/systemd/system/left4me-refresh-global-overlays.service
$sudo_cmd cp /opt/left4me/deploy/files/usr/local/lib/systemd/system/left4me-refresh-global-overlays.timer /usr/local/lib/systemd/system/left4me-refresh-global-overlays.timer
```

Enable the timer after the daemon reload:

```sh
$sudo_cmd systemctl enable --now left4me-refresh-global-overlays.timer
```
- [ ] **Step 6: Update deploy README**

Modify the target layout in `deploy/README.md` to include:

```markdown
- `/var/lib/left4me/global_overlay_cache`: cache of non-Steam map archives and extracted `.vpk` files used by managed global map overlays.
```

Add a timer note:

```markdown
`left4me-refresh-global-overlays.timer` runs daily with `Persistent=true`. It invokes `flask refresh-global-overlays`, which only enqueues a `refresh_global_overlays` job; downloads and rebuilds run in the web worker and are visible in the normal job log UI.
```
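One provisioning detail is worth spelling out: downloads land in `/tmp`, which on many hosts is a tmpfs on a different filesystem from `/var/lib/left4me`, so a plain `os.rename` into the cache fails with `OSError` (`errno.EXDEV`). The refresh service therefore moves files with `shutil.move`, which falls back to copy-and-delete across filesystem boundaries. A minimal sketch (`move_into_cache` is an illustrative name, not a function defined by this plan):

```python
import shutil
from pathlib import Path


def move_into_cache(downloaded: Path, cache_path: Path) -> None:
    # shutil.move handles the cross-filesystem case (/tmp on tmpfs, cache
    # under /var/lib) that os.rename rejects with EXDEV: it copies the
    # file to the destination and then removes the source.
    cache_path.parent.mkdir(parents=True, exist_ok=True)
    shutil.move(str(downloaded), str(cache_path))
```

This keeps the cache layout independent of wherever the host mounts `/tmp`, so no extra deploy configuration is needed for same- vs cross-filesystem setups.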
- [ ] **Step 7: Run deploy tests and shell syntax check**

Run: `pytest deploy/tests/test_deploy_artifacts.py -q`

Expected: PASS.

Run: `bash -n deploy/deploy-test-server.sh`

Expected: no output (the script parses cleanly).

---

## Task 13: Full Verification

**Files:**
- Modify: `docs/superpowers/specs/2026-05-07-l4d2-global-map-overlays-design.md` if implementation differs from the approved contract.
- Modify: `l4d2web/README.md` if user-facing behavior needs component documentation.

- [ ] **Step 1: Run focused global overlay tests**

Run:

```bash
pytest \
  l4d2web/tests/test_global_overlay_models.py \
  l4d2web/tests/test_global_overlays.py \
  l4d2web/tests/test_global_map_sources.py \
  l4d2web/tests/test_global_map_cache.py \
  l4d2web/tests/test_global_overlay_builders.py \
  l4d2web/tests/test_global_overlay_refresh.py \
  l4d2web/tests/test_global_overlay_cli.py \
  -q
```

Expected: PASS.

- [ ] **Step 2: Run affected web tests**

Run:

```bash
pytest \
  l4d2web/tests/test_job_worker.py \
  l4d2web/tests/test_job_logs.py \
  l4d2web/tests/test_overlays.py \
  l4d2web/tests/test_blueprints.py \
  l4d2web/tests/test_pages.py \
  l4d2web/tests/test_l4d2_facade.py \
  -q
```

Expected: PASS.

- [ ] **Step 3: Run deployment artifact tests**

Run: `pytest deploy/tests/test_deploy_artifacts.py -q`

Expected: PASS.

- [ ] **Step 4: Run full web suite**

Run: `pytest l4d2web/tests -q`

Expected: PASS.

- [ ] **Step 5: Run full host suite to confirm no host regression**

Run: `pytest l4d2host/tests -q`

Expected: PASS.

- [ ] **Step 6: Run whitespace check**

Run: `git diff --check`

Expected: no output.

---

## Manual Test Plan

1. Deploy to the test server and confirm `systemctl list-timers left4me-refresh-global-overlays.timer` shows the timer enabled.
2. Run `/opt/left4me/.venv/bin/flask --app l4d2web.app:create_app refresh-global-overlays` as `left4me`; confirm it prints a job id and does not download anything directly.
3. In the admin UI, open Jobs and confirm the owner of timer-created jobs displays as `system`.
4. Click Admin -> Refresh global overlays; confirm it reuses an active queued/running refresh job instead of creating a duplicate.
5. Wait for the worker to finish on a test instance; confirm cache files exist under `/var/lib/left4me/global_overlay_cache`.
6. Open `/overlays`; confirm both `l4d2center-maps` and `cedapug-maps` are visible to admin and non-admin users.
7. Create a user blueprint and select either global map overlay; confirm server initialize uses the overlay path in the generated spec.
8. Delete one managed symlink from `overlays/{id}/left4dead2/addons`; run a refresh; confirm the symlink is restored.
9. Remove one map from a copied parser fixture in a local test run; confirm managed symlink reconciliation removes the obsolete symlink.
10. Add a foreign file to the addons directory; run a build; confirm it remains in place and the job log mentions a foreign entry.

---

## Commit Strategy

Do not create commits unless the user explicitly asks for commits. If commits are approved, use these boundaries:

1. `feat(l4d2-web): add global overlay metadata schema`
2. `feat(l4d2-web): seed managed global map overlays`
3. `feat(l4d2-web): parse global map source manifests`
4. `feat(l4d2-web): cache and build global map overlays`
5. `feat(l4d2-web): refresh global overlays through worker jobs`
6. `feat(l4d2-web): expose global overlays in admin and blueprint ui`
7. `feat(deploy): add global overlay refresh timer`

---

## Self-Review Checklist

- Spec coverage: managed singleton overlays, nullable system jobs, daily timer, exact reconciliation, source parsing, cache separation, no host changes, visibility and create policy, admin manual refresh, initialize-time guard.
- Red flag scan: no banned placeholder markers or vague deferred-work instructions.
- Type consistency: overlay types are `l4d2center_maps` and `cedapug_maps`; the job operation is `refresh_global_overlays`; source keys are `l4d2center-maps` and `cedapug-maps`.
- Verification commands: every task has a concrete pytest or alembic command and an expected outcome.
|