Compare commits

...

8 Commits

Author SHA1 Message Date
87e4bb8941 Merge pull request 'develop' (#1) from develop into main
Reviewed-on: TheFurya/nuzlocke-tracker#1
2026-02-10 12:31:19 +01:00
Julian Tabel
7e8d55ec06 Skip CI on bean-only changes
All checks were successful
CI / backend-lint (push) Successful in 7s
CI / frontend-lint (push) Successful in 29s
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-10 12:29:52 +01:00
Julian Tabel
29f0b930f8 Mark lint cleanup bean as completed
All checks were successful
CI / backend-lint (push) Successful in 7s
CI / frontend-lint (push) Successful in 28s
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-10 12:28:45 +01:00
Julian Tabel
e4111c67bc Fix linting errors across backend and frontend
All checks were successful
CI / backend-lint (push) Successful in 7s
CI / frontend-lint (push) Successful in 29s
Backend: auto-fix and format all ruff issues, manually fix B904/B023/
SIM117/B007/E741/F841 errors, suppress B008 (FastAPI Depends) and F821
(SQLAlchemy forward refs) in config. Frontend: allow constant exports,
disable React compiler-specific rules (set-state-in-effect,
preserve-manual-memoization).

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-10 12:26:57 +01:00
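Of the manual fixes this commit lists, B904 is the most common pitfall. A minimal sketch of the before/after shape of a B904 fix, using a hypothetical function (not code from this repository):

```python
# Hypothetical example of a B904 fix. ruff's B904 flags raising a new
# exception inside an `except` block without chaining, which discards
# the original traceback.

def parse_port(value: str) -> int:
    try:
        return int(value)
    except ValueError as exc:
        # Before the fix this would read `raise RuntimeError(...)` (B904).
        # Chaining with `from exc` preserves the original cause.
        raise RuntimeError(f"invalid port: {value!r}") from exc
```

`raise ... from None` is the alternative form when the original cause should be deliberately suppressed.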
Julian Tabel
7f8890086f Add CI and deploy workflows for Gitea Actions
Some checks failed
CI / backend-lint (push) Failing after 1m43s
CI / frontend-lint (push) Failing after 1m6s
CI runs ruff and eslint/tsc on push to develop and PRs. Deploy
workflow is manual (workflow_dispatch) and builds, pushes, and
deploys images to Unraid via SSH.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-10 12:17:20 +01:00
Julian Tabel
0c4cc815be Remove cron job setup from deploy script
Backup scheduling will be handled via the Unraid User Scripts plugin
instead, which persists across reboots.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-10 12:02:35 +01:00
Julian Tabel
58475d9cba Add database backup script with daily cron and 7-day retention
pg_dump-based backup script deployed alongside compose file. Deploy
script now installs a daily cron job (03:00) on Unraid automatically.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-10 11:55:27 +01:00
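The backup script itself is not shown in this diff view. A sketch of what a `pg_dump` script with 7-day retention typically looks like — the container name, database name, and backup path are assumptions, not taken from the repository:

```shell
#!/bin/bash
# Hypothetical sketch of backup.sh; the container/database names and the
# BACKUP_DIR default are illustrative, not from the repository.
set -euo pipefail

BACKUP_DIR="${BACKUP_DIR:-/mnt/user/appdata/nuzlocke-tracker/backups}"

dump_db() {
  # Dump through the running postgres container and compress the result.
  mkdir -p "$BACKUP_DIR"
  docker exec nuzlocke-tracker-db pg_dump -U postgres nuzlocke \
    | gzip > "$BACKUP_DIR/nuzlocke_$(date +%Y%m%d_%H%M%S).sql.gz"
}

prune_old_backups() {
  # 7-day retention: remove dumps older than seven days.
  find "$BACKUP_DIR" -name "nuzlocke_*.sql.gz" -mtime +7 -delete
}

# On the Unraid host this would run both steps:
# dump_db && prune_old_backups
```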
Julian Tabel
7b383dd982 Set up branching structure and add branching rules to CLAUDE.md
Create develop branch from main and document the branching strategy
(main/develop/feature/*) in CLAUDE.md to enforce the workflow.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-10 11:50:11 +01:00
58 changed files with 1375 additions and 903 deletions

View File

@@ -1,10 +1,11 @@
 ---
 # nuzlocke-tracker-3c9l
 title: Set up branching structure
-status: todo
+status: completed
 type: task
+priority: normal
 created_at: 2026-02-09T15:30:35Z
-updated_at: 2026-02-09T15:30:35Z
+updated_at: 2026-02-10T10:49:55Z
 parent: nuzlocke-tracker-ahza
 ---

View File

@@ -1,10 +1,11 @@
 ---
 # nuzlocke-tracker-48ds
 title: Database backup strategy
-status: todo
+status: completed
 type: task
+priority: normal
 created_at: 2026-02-09T15:30:55Z
-updated_at: 2026-02-09T15:30:55Z
+updated_at: 2026-02-10T10:55:15Z
 parent: nuzlocke-tracker-ahza
 ---

View File

@@ -1,11 +1,11 @@
 ---
 # nuzlocke-tracker-765i
 title: Update CLAUDE.md with branching rules
-status: todo
+status: completed
 type: task
 priority: normal
 created_at: 2026-02-09T15:30:38Z
-updated_at: 2026-02-09T15:31:15Z
+updated_at: 2026-02-10T10:49:56Z
 parent: nuzlocke-tracker-ahza
 blocking:
   - nuzlocke-tracker-3c9l

View File

@@ -45,8 +45,8 @@ Define and implement a deployment strategy for running the nuzlocke-tracker in p
 ## Checklist
-- [ ] **Set up branching structure** — create `develop` branch from `main`, establish the `main`/`develop`/`feature/*` workflow
-- [ ] **Update CLAUDE.md with branching rules** — once the branching structure is in place, add instructions to CLAUDE.md that the branching strategy must be adhered to (always work on feature branches, never commit directly to `main`, merge flow is `feature/*` → `develop` → `main`)
+- [x] **Set up branching structure** — create `develop` branch from `main`, establish the `main`/`develop`/`feature/*` workflow
+- [x] **Update CLAUDE.md with branching rules** — once the branching structure is in place, add instructions to CLAUDE.md that the branching strategy must be adhered to (always work on feature branches, never commit directly to `main`, merge flow is `feature/*` → `develop` → `main`)
 - [ ] **Configure Gitea container registry** — create an access token with `read:package` and `write:package` scopes, verify `docker login gitea.nerdboden.de` works, test pushing and pulling an image as a user-level package
 - [x] **Create production docker-compose file** (`docker-compose.prod.yml`) — uses images from the Gitea container registry, production env vars, no source volume mounts, proper restart policies
 - [x] **Create production Dockerfiles (or multi-stage builds)** — ensure frontend is built and served statically (e.g., via the API or a lightweight nginx container), API runs without debug mode
@@ -54,5 +54,5 @@ Define and implement a deployment strategy for running the nuzlocke-tracker in p
 - [x] **Configure Nginx Proxy Manager** — add proxy host entries for Gitea and the nuzlocke-tracker frontend/API on the appropriate ports
 - [x] **Environment & secrets management** — deploy script auto-generates `.env` with `POSTGRES_PASSWORD` on Unraid if missing; file lives at `/mnt/user/appdata/nuzlocke-tracker/.env`
 - [ ] **Implement Gitea Actions CI/CD pipeline** — set up Gitea Actions runner on Unraid, create CI workflow (lint/test on `develop`) and deploy workflow (build/push/deploy on `main`); uses GitHub Actions-compatible syntax for portability
-- [ ] **Database backup strategy** — set up a simple scheduled backup for the PostgreSQL data (e.g., cron + `pg_dump` script on Unraid)
+- [x] **Database backup strategy** — set up a simple scheduled backup for the PostgreSQL data (e.g., cron + `pg_dump` script on Unraid)
 - [ ] **Document the deployment workflow** — README or docs covering how to deploy, redeploy, rollback, and manage the production instance

View File

@@ -1,10 +1,11 @@
 ---
 # nuzlocke-tracker-jlzs
 title: Implement Gitea Actions CI/CD pipeline
-status: draft
+status: in-progress
 type: task
+priority: normal
 created_at: 2026-02-10T09:38:15Z
-updated_at: 2026-02-10T09:38:15Z
+updated_at: 2026-02-10T11:12:32Z
 parent: nuzlocke-tracker-ahza
 ---
@@ -14,15 +15,15 @@ Set up Gitea Actions as the CI/CD pipeline for the nuzlocke-tracker. Gitea Actio
 - Gitea is already running on Unraid behind Nginx Proxy Manager (`gitea.nerdboden.de`)
 - Images are currently built locally and pushed to the Gitea container registry via `deploy.sh`
-- Gitea Actions can automate building, pushing images, and triggering deployment on push to `main`
+- A Gitea Actions runner is already deployed on Unraid and connected to the Gitea instance
 - The workflow syntax is compatible with GitHub Actions, so the same `.github/workflows/` files work on both platforms
 
 ## Checklist
-- [ ] **Enable Gitea Actions on the Gitea instance** — ensure the Actions feature is enabled in `app.ini` (`[actions] ENABLED = true`) and restart Gitea
+- [x] **Enable Gitea Actions on the Gitea instance** — Actions feature is enabled and runner is connected
-- [ ] **Set up a Gitea Actions runner** — deploy an `act_runner` container on Unraid (or the same host as Gitea), register it with the Gitea instance, and verify it picks up jobs
+- [x] **Set up a Gitea Actions runner** — `act_runner` is deployed on Unraid and registered with Gitea
-- [ ] **Create CI workflow** (`.github/workflows/ci.yml`) — on push to `develop` and PRs: lint, run tests (backend + frontend), and report status
+- [x] **Create CI workflow** (`.github/workflows/ci.yml`) — on push to `develop` and PRs: run `ruff check` + `ruff format --check` for backend, `eslint` + `tsc` for frontend. Tests can be added later when they exist.
-- [ ] **Create deploy workflow** (`.github/workflows/deploy.yml`) — on push to `main`: build Docker images (linux/amd64), push to the Gitea container registry, and trigger redeployment on Unraid via SSH
+- [x] **Create deploy workflow** (`.github/workflows/deploy.yml`) — triggered via `workflow_dispatch` on `main`: build Docker images (linux/amd64), push to the Gitea container registry, deploy to Unraid via SSH (`docker compose pull && docker compose up -d`)
-- [ ] **Configure secrets in Gitea** — add repository or org-level secrets for registry credentials, SSH key/host for deployment, and any other sensitive values the workflows need
+- [ ] **Configure secrets in Gitea** — generate a new SSH keypair, add the public key to Unraid root user's `authorized_keys`, add the private key as a Gitea repo secret (`DEPLOY_SSH_KEY`). Also add any registry credentials or other sensitive values the workflows need.
-- [ ] **Test the full pipeline** — push a change through `feature/*` → `develop` → `main` and verify the CI and deploy workflows run successfully end-to-end
+- [ ] **Test the full pipeline** — push a change through `feature/*` → `develop` (verify CI runs), then merge `develop` → `main` and trigger the deploy workflow via `workflow_dispatch` to verify end-to-end
 - [ ] **Update deployment docs** — document the Gitea Actions setup, how to manage the runner, and how CI/CD fits into the deployment workflow

View File

@@ -0,0 +1,36 @@
+---
+# nuzlocke-tracker-ve9f
+title: Fix linting errors across backend and frontend
+status: completed
+type: task
+priority: normal
+created_at: 2026-02-10T11:21:24Z
+updated_at: 2026-02-10T11:28:08Z
+---
+
+The CI pipeline is now running but linting fails on both backend and frontend. Clean up all lint errors so CI passes green.
+
+## Backend (ruff)
+
+- **236 errors** found, **126 auto-fixable** with `ruff check --fix`
+- **44 files** need reformatting with `ruff format`
+- Most issues are in alembic migrations (auto-generated boilerplate: `Union` → `X | Y`, import sorting, unused imports) and across API/model/seed files (formatting, datetime.UTC, loop variable issues)
+- Fix approach:
+  1. Run `ruff check --fix backend/` to auto-fix 126 issues
+  2. Run `ruff format backend/` to reformat 44 files
+  3. Manually fix remaining ~110 issues (B023 loop variable binding, SIM117, etc.)
+
+## Frontend (eslint + tsc)
+
+- Run `cd frontend && npm ci && npm run lint` to see errors
+- Run `npx tsc -b` for type checking
+- Fix any reported issues
+
+## Checklist
+
+- [x] Auto-fix backend ruff lint errors (`ruff check --fix backend/`)
+- [x] Auto-format backend files (`ruff format backend/`)
+- [x] Manually fix remaining backend lint errors
+- [x] Fix frontend eslint errors
+- [x] Fix frontend TypeScript errors (if any)
+- [ ] Verify CI passes green on develop
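Of the manual fixes listed, B023 (loop-variable binding) is worth a concrete illustration. A self-contained sketch of the rule, not code from this repository:

```python
# Illustration of ruff's B023: a closure created in a loop captures the
# loop *variable*, not its value at creation time.
names = ["bulbasaur", "charmander"]

# Buggy: every lambda sees the final value of `name` after the loop ends.
buggy = [lambda: name for name in names]

# Fixed: bind the current value through a default argument (the usual B023 fix).
fixed = [lambda name=name: name for name in names]
```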

.github/workflows/ci.yml (new file, 42 additions)

@@ -0,0 +1,42 @@
+name: CI
+
+on:
+  push:
+    branches: [develop]
+    paths-ignore:
+      - ".beans/**"
+  pull_request:
+    branches: [develop]
+    paths-ignore:
+      - ".beans/**"
+
+jobs:
+  backend-lint:
+    runs-on: ubuntu-latest
+    steps:
+      - uses: actions/checkout@v4
+      - uses: actions/setup-python@v5
+        with:
+          python-version: "3.12"
+      - run: pip install ruff
+      - name: Check linting
+        run: ruff check backend/
+      - name: Check formatting
+        run: ruff format --check backend/
+
+  frontend-lint:
+    runs-on: ubuntu-latest
+    steps:
+      - uses: actions/checkout@v4
+      - uses: actions/setup-node@v4
+        with:
+          node-version: "24"
+      - name: Install dependencies
+        run: npm ci
+        working-directory: frontend
+      - name: Lint
+        run: npm run lint
+        working-directory: frontend
+      - name: Type check
+        run: npx tsc -b
+        working-directory: frontend

.github/workflows/deploy.yml (new file, 42 additions)

@@ -0,0 +1,42 @@
+name: Deploy
+
+on:
+  workflow_dispatch:
+
+jobs:
+  deploy:
+    runs-on: ubuntu-latest
+    if: github.ref == 'refs/heads/main'
+    steps:
+      - uses: actions/checkout@v4
+      - name: Login to Gitea registry
+        run: echo "${{ secrets.REGISTRY_PASSWORD }}" | docker login gitea.nerdboden.de -u "${{ secrets.REGISTRY_USERNAME }}" --password-stdin
+      - name: Build and push API image
+        run: |
+          docker build --platform linux/amd64 \
+            -t gitea.nerdboden.de/thefurya/nuzlocke-tracker-api:latest \
+            -f backend/Dockerfile.prod ./backend
+          docker push gitea.nerdboden.de/thefurya/nuzlocke-tracker-api:latest
+      - name: Build and push frontend image
+        run: |
+          docker build --platform linux/amd64 \
+            -t gitea.nerdboden.de/thefurya/nuzlocke-tracker-frontend:latest \
+            -f frontend/Dockerfile.prod ./frontend
+          docker push gitea.nerdboden.de/thefurya/nuzlocke-tracker-frontend:latest
+      - name: Deploy to Unraid
+        run: |
+          mkdir -p ~/.ssh
+          echo "${{ secrets.DEPLOY_SSH_KEY }}" > ~/.ssh/deploy_key
+          chmod 600 ~/.ssh/deploy_key
+          SSH_CMD="ssh -o StrictHostKeyChecking=no -i ~/.ssh/deploy_key root@192.168.1.10"
+          SCP_CMD="scp -o StrictHostKeyChecking=no -i ~/.ssh/deploy_key"
+          DEPLOY_DIR="/mnt/user/appdata/nuzlocke-tracker"
+          $SCP_CMD docker-compose.prod.yml "root@192.168.1.10:${DEPLOY_DIR}/docker-compose.yml"
+          $SCP_CMD backup.sh "root@192.168.1.10:${DEPLOY_DIR}/backup.sh"
+          $SSH_CMD "chmod +x '${DEPLOY_DIR}/backup.sh'"
+          $SSH_CMD "cd '${DEPLOY_DIR}' && docker compose pull && docker compose up -d"

View File

@@ -1,3 +1,10 @@
+# Branching Strategy
+
+- **Never commit directly to `main`.** `main` is always production-ready.
+- Day-to-day work happens on `develop`.
+- New work is done on `feature/*` branches off `develop`.
+- Merge flow: `feature/*` → `develop` → `main`.
+
 # Instructions
 - After completing a task, always ask the user if they'd like to commit the changes.
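The merge flow documented above can be exercised end-to-end in a scratch repository; the feature-branch name here is illustrative:

```shell
#!/bin/bash
# Demonstrates the feature/* -> develop -> main flow in a throwaway repo.
set -eu
repo="$(mktemp -d)"
cd "$repo"
git init -q -b main
git config user.email demo@example.com
git config user.name demo
git commit -q --allow-empty -m "init"
git branch develop

# New work starts on a feature branch cut from develop.
git switch -q -c feature/encounter-filters develop
git commit -q --allow-empty -m "add encounter filters"

# Merge flow: feature/* -> develop -> main (never commit to main directly).
git switch -q develop
git merge -q --no-ff -m "merge feature" feature/encounter-filters
git switch -q main
git merge -q --no-ff -m "merge develop" develop
```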

View File

@@ -47,8 +47,12 @@ select = [
 ]
 ignore = [
     "E501",  # line too long (handled by formatter)
+    "B008",  # Depends() in defaults — standard FastAPI pattern
 ]
 
+[tool.ruff.lint.per-file-ignores]
+"src/app/models/*.py" = ["F821"]  # forward refs in SQLAlchemy relationships
+
 [tool.ruff.lint.isort]
 known-first-party = ["app"]

View File

@@ -1,16 +1,14 @@
 import asyncio
 from logging.config import fileConfig
 
+from alembic import context
 from sqlalchemy import pool
 from sqlalchemy.ext.asyncio import async_engine_from_config
-from alembic import context
-
-from app.core.config import settings
-from app.core.database import Base, _get_async_url
 
 # Import all models so Base.metadata is populated
 import app.models  # noqa: F401
+from app.core.config import settings
+from app.core.database import Base, _get_async_url
 
 config = context.config

View File

@@ -6,18 +6,17 @@ Create Date: 2026-02-05 13:27:47.649534
 
 """
 
-from typing import Sequence, Union
+from collections.abc import Sequence
 
 import sqlalchemy as sa
-from sqlalchemy.dialects import postgresql
 from alembic import op
+from sqlalchemy.dialects import postgresql
 
 # revision identifiers, used by Alembic.
 revision: str = "03e5f186a9d5"
-down_revision: Union[str, Sequence[str], None] = None
-branch_labels: Union[str, Sequence[str], None] = None
-depends_on: Union[str, Sequence[str], None] = None
+down_revision: str | Sequence[str] | None = None
+branch_labels: str | Sequence[str] | None = None
+depends_on: str | Sequence[str] | None = None
 
 
 def upgrade() -> None:
@@ -36,9 +35,7 @@
         "routes",
         sa.Column("id", sa.Integer(), primary_key=True),
         sa.Column("name", sa.String(100), nullable=False),
-        sa.Column(
-            "game_id", sa.Integer(), sa.ForeignKey("games.id"), nullable=False
-        ),
+        sa.Column("game_id", sa.Integer(), sa.ForeignKey("games.id"), nullable=False),
         sa.Column("order", sa.SmallInteger(), nullable=False),
     )
     op.create_index("ix_routes_game_id", "routes", ["game_id"])
@@ -46,22 +43,16 @@
     op.create_table(
         "pokemon",
         sa.Column("id", sa.Integer(), primary_key=True),
-        sa.Column(
-            "national_dex", sa.SmallInteger(), nullable=False, unique=True
-        ),
+        sa.Column("national_dex", sa.SmallInteger(), nullable=False, unique=True),
         sa.Column("name", sa.String(50), nullable=False),
-        sa.Column(
-            "types", postgresql.ARRAY(sa.String(20)), nullable=False
-        ),
+        sa.Column("types", postgresql.ARRAY(sa.String(20)), nullable=False),
         sa.Column("sprite_url", sa.String(500), nullable=True),
     )
 
     op.create_table(
         "route_encounters",
         sa.Column("id", sa.Integer(), primary_key=True),
-        sa.Column(
-            "route_id", sa.Integer(), sa.ForeignKey("routes.id"), nullable=False
-        ),
+        sa.Column("route_id", sa.Integer(), sa.ForeignKey("routes.id"), nullable=False),
         sa.Column(
             "pokemon_id",
             sa.Integer(),
@@ -77,9 +68,7 @@
             name="uq_route_pokemon_method",
         ),
     )
-    op.create_index(
-        "ix_route_encounters_route_id", "route_encounters", ["route_id"]
-    )
+    op.create_index("ix_route_encounters_route_id", "route_encounters", ["route_id"])
     op.create_index(
         "ix_route_encounters_pokemon_id", "route_encounters", ["pokemon_id"]
     )
@@ -87,30 +76,20 @@
     op.create_table(
         "nuzlocke_runs",
         sa.Column("id", sa.Integer(), primary_key=True),
-        sa.Column(
-            "game_id", sa.Integer(), sa.ForeignKey("games.id"), nullable=False
-        ),
+        sa.Column("game_id", sa.Integer(), sa.ForeignKey("games.id"), nullable=False),
         sa.Column("name", sa.String(100), nullable=False),
         sa.Column("status", sa.String(20), nullable=False),
-        sa.Column(
-            "rules", postgresql.JSONB(), nullable=False, server_default="{}"
-        ),
+        sa.Column("rules", postgresql.JSONB(), nullable=False, server_default="{}"),
         sa.Column(
             "started_at",
             sa.DateTime(timezone=True),
             nullable=False,
             server_default=sa.func.now(),
         ),
-        sa.Column(
-            "completed_at", sa.DateTime(timezone=True), nullable=True
-        ),
+        sa.Column("completed_at", sa.DateTime(timezone=True), nullable=True),
     )
-    op.create_index(
-        "ix_nuzlocke_runs_game_id", "nuzlocke_runs", ["game_id"]
-    )
-    op.create_index(
-        "ix_nuzlocke_runs_status", "nuzlocke_runs", ["status"]
-    )
+    op.create_index("ix_nuzlocke_runs_game_id", "nuzlocke_runs", ["game_id"])
+    op.create_index("ix_nuzlocke_runs_status", "nuzlocke_runs", ["status"])
 
     op.create_table(
         "encounters",
@@ -121,9 +100,7 @@
             sa.ForeignKey("nuzlocke_runs.id"),
             nullable=False,
         ),
-        sa.Column(
-            "route_id", sa.Integer(), sa.ForeignKey("routes.id"), nullable=False
-        ),
+        sa.Column("route_id", sa.Integer(), sa.ForeignKey("routes.id"), nullable=False),
         sa.Column(
             "pokemon_id",
             sa.Integer(),

View File

@@ -5,28 +5,27 @@ Revises: 03e5f186a9d5
 Create Date: 2026-02-05 13:01:30.631978
 
 """
 
-from typing import Sequence, Union
+from collections.abc import Sequence
 
 from alembic import op
-import sqlalchemy as sa
 
 # revision identifiers, used by Alembic.
-revision: str = '694df688fb02'
-down_revision: Union[str, Sequence[str], None] = '03e5f186a9d5'
-branch_labels: Union[str, Sequence[str], None] = None
-depends_on: Union[str, Sequence[str], None] = None
+revision: str = "694df688fb02"
+down_revision: str | Sequence[str] | None = "03e5f186a9d5"
+branch_labels: str | Sequence[str] | None = None
+depends_on: str | Sequence[str] | None = None
 
 
 def upgrade() -> None:
     """Upgrade schema."""
     # ### commands auto generated by Alembic - please adjust! ###
-    op.create_unique_constraint('uq_routes_game_name', 'routes', ['game_id', 'name'])
+    op.create_unique_constraint("uq_routes_game_name", "routes", ["game_id", "name"])
     # ### end Alembic commands ###
 
 
 def downgrade() -> None:
     """Downgrade schema."""
     # ### commands auto generated by Alembic - please adjust! ###
-    op.drop_constraint('uq_routes_game_name', 'routes', type_='unique')
+    op.drop_constraint("uq_routes_game_name", "routes", type_="unique")
     # ### end Alembic commands ###

View File

@@ -5,30 +5,36 @@ Revises: 694df688fb02
 Create Date: 2026-02-05 13:32:35.559499
 
 """
 
-from typing import Sequence, Union
+from collections.abc import Sequence
 
-from alembic import op
 import sqlalchemy as sa
+from alembic import op
 
 # revision identifiers, used by Alembic.
-revision: str = '9afcbafe9888'
-down_revision: Union[str, Sequence[str], None] = '694df688fb02'
-branch_labels: Union[str, Sequence[str], None] = None
-depends_on: Union[str, Sequence[str], None] = None
+revision: str = "9afcbafe9888"
+down_revision: str | Sequence[str] | None = "694df688fb02"
+branch_labels: str | Sequence[str] | None = None
+depends_on: str | Sequence[str] | None = None
 
 
 def upgrade() -> None:
     """Upgrade schema."""
-    op.add_column('route_encounters', sa.Column('min_level', sa.SmallInteger(), nullable=False, server_default='0'))
-    op.add_column('route_encounters', sa.Column('max_level', sa.SmallInteger(), nullable=False, server_default='0'))
-    op.alter_column('route_encounters', 'min_level', server_default=None)
-    op.alter_column('route_encounters', 'max_level', server_default=None)
+    op.add_column(
+        "route_encounters",
+        sa.Column("min_level", sa.SmallInteger(), nullable=False, server_default="0"),
+    )
+    op.add_column(
+        "route_encounters",
+        sa.Column("max_level", sa.SmallInteger(), nullable=False, server_default="0"),
+    )
+    op.alter_column("route_encounters", "min_level", server_default=None)
+    op.alter_column("route_encounters", "max_level", server_default=None)
 
 
 def downgrade() -> None:
     """Downgrade schema."""
     # ### commands auto generated by Alembic - please adjust! ###
-    op.drop_column('route_encounters', 'max_level')
-    op.drop_column('route_encounters', 'min_level')
+    op.drop_column("route_encounters", "max_level")
+    op.drop_column("route_encounters", "min_level")
     # ### end Alembic commands ###

View File

@@ -5,22 +5,22 @@ Revises: 9afcbafe9888
 Create Date: 2026-02-05 17:00:00.000000
 
 """
 
-from typing import Sequence, Union
+from collections.abc import Sequence
 
-from alembic import op
 import sqlalchemy as sa
+from alembic import op
 
 # revision identifiers, used by Alembic.
-revision: str = 'a1b2c3d4e5f6'
-down_revision: Union[str, Sequence[str], None] = '9afcbafe9888'
-branch_labels: Union[str, Sequence[str], None] = None
-depends_on: Union[str, Sequence[str], None] = None
+revision: str = "a1b2c3d4e5f6"
+down_revision: str | Sequence[str] | None = "9afcbafe9888"
+branch_labels: str | Sequence[str] | None = None
+depends_on: str | Sequence[str] | None = None
 
 
 def upgrade() -> None:
-    op.add_column('encounters', sa.Column('death_cause', sa.String(100), nullable=True))
+    op.add_column("encounters", sa.Column("death_cause", sa.String(100), nullable=True))
 
 
 def downgrade() -> None:
-    op.drop_column('encounters', 'death_cause')
+    op.drop_column("encounters", "death_cause")

View File

@@ -5,25 +5,25 @@ Revises: f6a7b8c9d0e1
 Create Date: 2026-02-07 12:00:00.000000
 
 """
 
-from typing import Sequence, Union
+from collections.abc import Sequence
 
-from alembic import op
 import sqlalchemy as sa
+from alembic import op
 
 # revision identifiers, used by Alembic.
-revision: str = 'a1b2c3d4e5f7'
-down_revision: Union[str, Sequence[str], None] = 'f6a7b8c9d0e1'
-branch_labels: Union[str, Sequence[str], None] = None
-depends_on: Union[str, Sequence[str], None] = None
+revision: str = "a1b2c3d4e5f7"
+down_revision: str | Sequence[str] | None = "f6a7b8c9d0e1"
+branch_labels: str | Sequence[str] | None = None
+depends_on: str | Sequence[str] | None = None
 
 
 def upgrade() -> None:
     op.add_column(
-        'routes',
-        sa.Column('pinwheel_zone', sa.SmallInteger(), nullable=True),
+        "routes",
+        sa.Column("pinwheel_zone", sa.SmallInteger(), nullable=True),
     )
 
 
 def downgrade() -> None:
-    op.drop_column('routes', 'pinwheel_zone')
+    op.drop_column("routes", "pinwheel_zone")

View File

@@ -5,25 +5,25 @@ Revises: f6a7b8c9d0e1
 Create Date: 2026-02-09 12:00:00.000000
 
 """
 
-from typing import Sequence, Union
+from collections.abc import Sequence
 
-from alembic import op
 import sqlalchemy as sa
+from alembic import op
 
 # revision identifiers, used by Alembic.
-revision: str = 'a1b2c3d4e5f8'
-down_revision: Union[str, Sequence[str], None] = 'f6a7b8c9d0e1'
-branch_labels: Union[str, Sequence[str], None] = None
-depends_on: Union[str, Sequence[str], None] = None
+revision: str = "a1b2c3d4e5f8"
+down_revision: str | Sequence[str] | None = "f6a7b8c9d0e1"
+branch_labels: str | Sequence[str] | None = None
+depends_on: str | Sequence[str] | None = None
 
 
 def upgrade() -> None:
     op.add_column(
-        'games',
-        sa.Column('category', sa.String(20), nullable=True),
+        "games",
+        sa.Column("category", sa.String(20), nullable=True),
    )
 
 
 def downgrade() -> None:
-    op.drop_column('games', 'category')
+    op.drop_column("games", "category")

View File

@@ -5,22 +5,24 @@ Revises: f5a6b7c8d9e0
 Create Date: 2026-02-08 21:00:00.000000
 
 """
 
-from typing import Sequence, Union
+from collections.abc import Sequence
 
 import sqlalchemy as sa
 from alembic import op
 
 # revision identifiers, used by Alembic.
-revision: str = 'a6b7c8d9e0f1'
-down_revision: Union[str, Sequence[str], None] = 'f5a6b7c8d9e0'
-branch_labels: Union[str, Sequence[str], None] = None
-depends_on: Union[str, Sequence[str], None] = None
+revision: str = "a6b7c8d9e0f1"
+down_revision: str | Sequence[str] | None = "f5a6b7c8d9e0"
+branch_labels: str | Sequence[str] | None = None
+depends_on: str | Sequence[str] | None = None
 
 
 def upgrade() -> None:
-    op.add_column('boss_battles', sa.Column('specialty_type', sa.String(20), nullable=True))
+    op.add_column(
+        "boss_battles", sa.Column("specialty_type", sa.String(20), nullable=True)
+    )
 
 
 def downgrade() -> None:
-    op.drop_column('boss_battles', 'specialty_type')
+    op.drop_column("boss_battles", "specialty_type")

View File

@@ -5,25 +5,27 @@ Revises: a1b2c3d4e5f7
 Create Date: 2026-02-07 18:00:00.000000
 
 """
 
-from typing import Sequence, Union
+from collections.abc import Sequence
 
-from alembic import op
 import sqlalchemy as sa
+from alembic import op
 
 # revision identifiers, used by Alembic.
-revision: str = 'b1c2d3e4f5a6'
-down_revision: Union[str, Sequence[str], None] = 'a1b2c3d4e5f7'
-branch_labels: Union[str, Sequence[str], None] = None
-depends_on: Union[str, Sequence[str], None] = None
+revision: str = "b1c2d3e4f5a6"
+down_revision: str | Sequence[str] | None = "a1b2c3d4e5f7"
+branch_labels: str | Sequence[str] | None = None
+depends_on: str | Sequence[str] | None = None
 
 
 def upgrade() -> None:
     op.add_column(
-        'encounters',
-        sa.Column('is_shiny', sa.Boolean(), nullable=False, server_default=sa.text('false')),
+        "encounters",
+        sa.Column(
+            "is_shiny", sa.Boolean(), nullable=False, server_default=sa.text("false")
+        ),
     )
 
 
 def downgrade() -> None:
-    op.drop_column('encounters', 'is_shiny')
+    op.drop_column("encounters", "is_shiny")

View File

@@ -5,38 +5,56 @@ Revises: a1b2c3d4e5f6
 Create Date: 2026-02-05 18:00:00.000000
 
 """
 
-from typing import Sequence, Union
+from collections.abc import Sequence
 
-from alembic import op
 import sqlalchemy as sa
+from alembic import op
 
 # revision identifiers, used by Alembic.
-revision: str = 'b2c3d4e5f6a7'
-down_revision: Union[str, Sequence[str], None] = 'a1b2c3d4e5f6'
-branch_labels: Union[str, Sequence[str], None] = None
-depends_on: Union[str, Sequence[str], None] = None
+revision: str = "b2c3d4e5f6a7"
+down_revision: str | Sequence[str] | None = "a1b2c3d4e5f6"
+branch_labels: str | Sequence[str] | None = None
+depends_on: str | Sequence[str] | None = None
 
 
 def upgrade() -> None:
     op.create_table(
-        'evolutions',
-        sa.Column('id', sa.Integer(), primary_key=True),
-        sa.Column('from_pokemon_id', sa.Integer(), sa.ForeignKey('pokemon.id'), nullable=False, index=True),
-        sa.Column('to_pokemon_id', sa.Integer(), sa.ForeignKey('pokemon.id'), nullable=False, index=True),
-        sa.Column('trigger', sa.String(30), nullable=False),
-        sa.Column('min_level', sa.SmallInteger(), nullable=True),
-        sa.Column('item', sa.String(50), nullable=True),
-        sa.Column('held_item', sa.String(50), nullable=True),
-        sa.Column('condition', sa.String(200), nullable=True),
+        "evolutions",
+        sa.Column("id", sa.Integer(), primary_key=True),
+        sa.Column(
+            "from_pokemon_id",
+            sa.Integer(),
+            sa.ForeignKey("pokemon.id"),
+            nullable=False,
+            index=True,
+        ),
+        sa.Column(
+            "to_pokemon_id",
+            sa.Integer(),
+            sa.ForeignKey("pokemon.id"),
+            nullable=False,
+            index=True,
+        ),
+        sa.Column("trigger", sa.String(30), nullable=False),
+        sa.Column("min_level", sa.SmallInteger(), nullable=True),
+        sa.Column("item", sa.String(50), nullable=True),
+        sa.Column("held_item", sa.String(50), nullable=True),
+        sa.Column("condition", sa.String(200), nullable=True),
     )
     op.add_column(
-        'encounters',
-        sa.Column('current_pokemon_id', sa.Integer(), sa.ForeignKey('pokemon.id'), nullable=True, index=True),
+        "encounters",
+        sa.Column(
+            "current_pokemon_id",
+            sa.Integer(),
+            sa.ForeignKey("pokemon.id"),
+            nullable=True,
+            index=True,
+        ),
    )
 
 
 def downgrade() -> None:
-    op.drop_column('encounters', 'current_pokemon_id')
-    op.drop_table('evolutions')
+    op.drop_column("encounters", "current_pokemon_id")
+    op.drop_table("evolutions")

View File

@@ -5,42 +5,65 @@ Revises: a1b2c3d4e5f8, b7c8d9e0f1a2
Create Date: 2026-02-09 14:00:00.000000
"""
from collections.abc import Sequence

import sqlalchemy as sa
from alembic import op
from sqlalchemy.dialects.postgresql import JSONB

# revision identifiers, used by Alembic.
revision: str = "b2c3d4e5f6a8"
down_revision: str | Sequence[str] | None = ("a1b2c3d4e5f8", "b7c8d9e0f1a2")
branch_labels: str | Sequence[str] | None = None
depends_on: str | Sequence[str] | None = None


def upgrade() -> None:
    op.create_table(
        "genlockes",
        sa.Column("id", sa.Integer(), primary_key=True),
        sa.Column("name", sa.String(100), nullable=False),
        sa.Column("status", sa.String(20), nullable=False, index=True),
        sa.Column("genlocke_rules", JSONB(), nullable=False, server_default="{}"),
        sa.Column("nuzlocke_rules", JSONB(), nullable=False, server_default="{}"),
        sa.Column(
            "created_at",
            sa.DateTime(timezone=True),
            server_default=sa.func.now(),
            nullable=False,
        ),
    )
    op.create_table(
        "genlocke_legs",
        sa.Column("id", sa.Integer(), primary_key=True),
        sa.Column(
            "genlocke_id",
            sa.Integer(),
            sa.ForeignKey("genlockes.id", ondelete="CASCADE"),
            nullable=False,
            index=True,
        ),
        sa.Column(
            "game_id",
            sa.Integer(),
            sa.ForeignKey("games.id"),
            nullable=False,
            index=True,
        ),
        sa.Column(
            "run_id",
            sa.Integer(),
            sa.ForeignKey("nuzlocke_runs.id"),
            nullable=True,
            index=True,
        ),
        sa.Column("leg_order", sa.SmallInteger(), nullable=False),
        sa.UniqueConstraint("genlocke_id", "leg_order", name="uq_genlocke_legs_order"),
    )


def downgrade() -> None:
    op.drop_table("genlocke_legs")
    op.drop_table("genlockes")

View File

@@ -5,22 +5,24 @@ Revises: a6b7c8d9e0f1
Create Date: 2026-02-08 22:00:00.000000
"""
from collections.abc import Sequence

import sqlalchemy as sa
from alembic import op

# revision identifiers, used by Alembic.
revision: str = "b7c8d9e0f1a2"
down_revision: str | Sequence[str] | None = "a6b7c8d9e0f1"
branch_labels: str | Sequence[str] | None = None
depends_on: str | Sequence[str] | None = None


def upgrade() -> None:
    op.add_column(
        "boss_pokemon", sa.Column("condition_label", sa.String(100), nullable=True)
    )


def downgrade() -> None:
    op.drop_column("boss_pokemon", "condition_label")

View File

@@ -5,57 +5,95 @@ Revises: b1c2d3e4f5a6
Create Date: 2026-02-08 12:00:00.000000
"""
from collections.abc import Sequence

import sqlalchemy as sa
from alembic import op

# revision identifiers, used by Alembic.
revision: str = "c2d3e4f5a6b7"
down_revision: str | Sequence[str] | None = "b1c2d3e4f5a6"
branch_labels: str | Sequence[str] | None = None
depends_on: str | Sequence[str] | None = None


def upgrade() -> None:
    op.create_table(
        "boss_battles",
        sa.Column("id", sa.Integer(), primary_key=True),
        sa.Column(
            "game_id",
            sa.Integer(),
            sa.ForeignKey("games.id"),
            nullable=False,
            index=True,
        ),
        sa.Column("name", sa.String(100), nullable=False),
        sa.Column("boss_type", sa.String(20), nullable=False),
        sa.Column("badge_name", sa.String(100), nullable=True),
        sa.Column("badge_image_url", sa.String(500), nullable=True),
        sa.Column("level_cap", sa.SmallInteger(), nullable=False),
        sa.Column("order", sa.SmallInteger(), nullable=False),
        sa.Column(
            "after_route_id",
            sa.Integer(),
            sa.ForeignKey("routes.id"),
            nullable=True,
            index=True,
        ),
        sa.Column("location", sa.String(200), nullable=False),
        sa.Column("sprite_url", sa.String(500), nullable=True),
    )
    op.create_table(
        "boss_pokemon",
        sa.Column("id", sa.Integer(), primary_key=True),
        sa.Column(
            "boss_battle_id",
            sa.Integer(),
            sa.ForeignKey("boss_battles.id", ondelete="CASCADE"),
            nullable=False,
            index=True,
        ),
        sa.Column(
            "pokemon_id",
            sa.Integer(),
            sa.ForeignKey("pokemon.id"),
            nullable=False,
            index=True,
        ),
        sa.Column("level", sa.SmallInteger(), nullable=False),
        sa.Column("order", sa.SmallInteger(), nullable=False),
    )
    op.create_table(
        "boss_results",
        sa.Column("id", sa.Integer(), primary_key=True),
        sa.Column(
            "run_id",
            sa.Integer(),
            sa.ForeignKey("nuzlocke_runs.id", ondelete="CASCADE"),
            nullable=False,
            index=True,
        ),
        sa.Column(
            "boss_battle_id",
            sa.Integer(),
            sa.ForeignKey("boss_battles.id"),
            nullable=False,
            index=True,
        ),
        sa.Column("result", sa.String(10), nullable=False),
        sa.Column("attempts", sa.SmallInteger(), nullable=False, server_default="1"),
        sa.Column("completed_at", sa.DateTime(timezone=True), nullable=True),
        sa.UniqueConstraint(
            "run_id", "boss_battle_id", name="uq_boss_results_run_boss"
        ),
    )


def downgrade() -> None:
    op.drop_table("boss_results")
    op.drop_table("boss_pokemon")
    op.drop_table("boss_battles")

View File

@@ -5,26 +5,26 @@ Revises: b2c3d4e5f6a7
Create Date: 2026-02-06 12:00:00.000000
"""
from collections.abc import Sequence

import sqlalchemy as sa
from alembic import op

# revision identifiers, used by Alembic.
revision: str = "c3d4e5f6a7b8"
down_revision: str | Sequence[str] | None = "b2c3d4e5f6a7"
branch_labels: str | Sequence[str] | None = None
depends_on: str | Sequence[str] | None = None


def upgrade() -> None:
    op.add_column(
        "routes",
        sa.Column(
            "parent_route_id",
            sa.Integer(),
            sa.ForeignKey("routes.id", ondelete="CASCADE"),
            nullable=True,
            index=True,
        ),
@@ -32,4 +32,4 @@ def upgrade() -> None:
def downgrade() -> None:
    op.drop_column("routes", "parent_route_id")

View File

@@ -5,26 +5,26 @@ Revises: b2c3d4e5f6a8
Create Date: 2026-02-09 18:00:00.000000
"""
from collections.abc import Sequence

import sqlalchemy as sa
from alembic import op
from sqlalchemy.dialects.postgresql import JSONB

# revision identifiers, used by Alembic.
revision: str = "c3d4e5f6a7b9"
down_revision: str | Sequence[str] | None = "b2c3d4e5f6a8"
branch_labels: str | Sequence[str] | None = None
depends_on: str | Sequence[str] | None = None


def upgrade() -> None:
    op.add_column(
        "genlocke_legs",
        sa.Column("retired_pokemon_ids", JSONB(), nullable=True),
    )


def downgrade() -> None:
    op.drop_column("genlocke_legs", "retired_pokemon_ids")

View File

@@ -5,28 +5,28 @@ Revises: c2d3e4f5a6b7
Create Date: 2026-02-08 14:00:00.000000
"""
import json
from collections.abc import Sequence
from pathlib import Path

import sqlalchemy as sa
from alembic import op

# revision identifiers, used by Alembic.
revision: str = "d3e4f5a6b7c8"
down_revision: str | Sequence[str] | None = "c2d3e4f5a6b7"
branch_labels: str | Sequence[str] | None = None
depends_on: str | Sequence[str] | None = None


def upgrade() -> None:
    # 1. Create version_groups table
    op.create_table(
        "version_groups",
        sa.Column("id", sa.Integer(), primary_key=True),
        sa.Column("name", sa.String(100), nullable=False),
        sa.Column("slug", sa.String(100), nullable=False, unique=True),
    )

    # 2. Populate version groups from seed data
@@ -36,10 +36,10 @@ def upgrade() -> None:
    conn = op.get_bind()

    vg_table = sa.table(
        "version_groups",
        sa.column("id", sa.Integer),
        sa.column("name", sa.String),
        sa.column("slug", sa.String),
    )

    # Build slug -> id mapping and game_slug -> vg_id mapping
@@ -49,8 +49,7 @@ def upgrade() -> None:
        vg_id = vg_idx
        # Use the slug as a readable name (e.g., "red-blue" -> "Red / Blue")
        vg_name = " / ".join(
            g["name"].replace("Pokemon ", "") for g in vg_info["games"].values()
        )
        conn.execute(vg_table.insert().values(id=vg_id, name=vg_name, slug=vg_slug))
        slug_to_vg_id[vg_slug] = vg_id
@@ -58,16 +57,23 @@ def upgrade() -> None:
        game_slug_to_vg_id[game_slug] = vg_id

    # 3. Add version_group_id to games (nullable initially)
    op.add_column(
        "games",
        sa.Column(
            "version_group_id",
            sa.Integer(),
            sa.ForeignKey("version_groups.id"),
            nullable=True,
        ),
    )
    op.create_index("ix_games_version_group_id", "games", ["version_group_id"])

    # Populate games.version_group_id from the mapping
    games_table = sa.table(
        "games",
        sa.column("id", sa.Integer),
        sa.column("slug", sa.String),
        sa.column("version_group_id", sa.Integer),
    )
    rows = conn.execute(sa.select(games_table.c.id, games_table.c.slug)).fetchall()
    for game_id, game_slug in rows:
@@ -80,21 +86,23 @@ def upgrade() -> None:
    )

    # 4. Add game_id to route_encounters (nullable initially), populate from routes.game_id
    op.add_column(
        "route_encounters",
        sa.Column("game_id", sa.Integer(), sa.ForeignKey("games.id"), nullable=True),
    )
    op.create_index("ix_route_encounters_game_id", "route_encounters", ["game_id"])

    routes_table = sa.table(
        "routes",
        sa.column("id", sa.Integer),
        sa.column("name", sa.String),
        sa.column("game_id", sa.Integer),
    )
    re_table = sa.table(
        "route_encounters",
        sa.column("id", sa.Integer),
        sa.column("route_id", sa.Integer),
        sa.column("game_id", sa.Integer),
    )

    # Populate route_encounters.game_id from routes.game_id via join
    conn.execute(
@@ -104,10 +112,11 @@ def upgrade() -> None:
    )

    # 5. Drop old unique constraint on route_encounters, add new one with game_id
    op.drop_constraint("uq_route_pokemon_method", "route_encounters", type_="unique")
    op.create_unique_constraint(
        "uq_route_pokemon_method_game",
        "route_encounters",
        ["route_id", "pokemon_id", "encounter_method", "game_id"],
    )

    # 6. Deduplicate routes within version groups
@@ -115,15 +124,15 @@ def upgrade() -> None:
    # and re-point route_encounters, encounters, and boss_battles to canonical routes
    encounters_table = sa.table(
        "encounters",
        sa.column("id", sa.Integer),
        sa.column("route_id", sa.Integer),
    )
    boss_battles_table = sa.table(
        "boss_battles",
        sa.column("id", sa.Integer),
        sa.column("game_id", sa.Integer),
        sa.column("after_route_id", sa.Integer),
    )

    # Get all version groups that have more than one game
@@ -149,16 +158,18 @@ def upgrade() -> None:
        # Get canonical routes (by name)
        canonical_routes = conn.execute(
            sa.select(routes_table.c.id, routes_table.c.name).where(
                routes_table.c.game_id == canonical_game_id
            )
        ).fetchall()
        canonical_name_to_id = {name: rid for rid, name in canonical_routes}

        # For each non-canonical game, re-point references to canonical routes
        for nc_game_id in non_canonical_game_ids:
            nc_routes = conn.execute(
                sa.select(routes_table.c.id, routes_table.c.name).where(
                    routes_table.c.game_id == nc_game_id
                )
            ).fetchall()
            for old_route_id, route_name in nc_routes:
@@ -192,29 +203,36 @@ def upgrade() -> None:
        conn.execute(
            sa.text(
                "DELETE FROM routes WHERE parent_route_id IS NOT NULL AND game_id IN :nc_ids"
            ).bindparams(sa.bindparam("nc_ids", expanding=True)),
            {"nc_ids": non_canonical_game_ids},
        )
        # Then delete parent routes
        conn.execute(
            sa.text("DELETE FROM routes WHERE game_id IN :nc_ids").bindparams(
                sa.bindparam("nc_ids", expanding=True)
            ),
            {"nc_ids": non_canonical_game_ids},
        )

    # 7. Add version_group_id to routes (nullable), populate from games.version_group_id
    op.add_column(
        "routes",
        sa.Column(
            "version_group_id",
            sa.Integer(),
            sa.ForeignKey("version_groups.id"),
            nullable=True,
        ),
    )
    op.create_index("ix_routes_version_group_id", "routes", ["version_group_id"])

    # Need to re-declare routes_table with version_group_id
    routes_table_v2 = sa.table(
        "routes",
        sa.column("id", sa.Integer),
        sa.column("name", sa.String),
        sa.column("game_id", sa.Integer),
        sa.column("version_group_id", sa.Integer),
    )

    # Populate routes.version_group_id from the game's version_group_id
@@ -225,24 +243,32 @@ def upgrade() -> None:
    )

    # 8. Drop routes.game_id, drop old unique constraint, add new one
    op.drop_constraint("uq_routes_game_name", "routes", type_="unique")
    op.drop_index("ix_routes_game_id", "routes")
    op.drop_column("routes", "game_id")
    op.create_unique_constraint(
        "uq_routes_version_group_name", "routes", ["version_group_id", "name"]
    )

    # 9. Add version_group_id to boss_battles (nullable), populate from games.version_group_id
    op.add_column(
        "boss_battles",
        sa.Column(
            "version_group_id",
            sa.Integer(),
            sa.ForeignKey("version_groups.id"),
            nullable=True,
        ),
    )
    op.create_index(
        "ix_boss_battles_version_group_id", "boss_battles", ["version_group_id"]
    )
    bb_table_v2 = sa.table(
        "boss_battles",
        sa.column("id", sa.Integer),
        sa.column("game_id", sa.Integer),
        sa.column("version_group_id", sa.Integer),
    )
    conn.execute(
@@ -252,14 +278,14 @@ def upgrade() -> None:
    )

    # 10. Drop boss_battles.game_id
    op.drop_index("ix_boss_battles_game_id", "boss_battles")
    op.drop_column("boss_battles", "game_id")

    # 11. Make columns non-nullable
    op.alter_column("route_encounters", "game_id", nullable=False)
    op.alter_column("routes", "version_group_id", nullable=False)
    op.alter_column("boss_battles", "version_group_id", nullable=False)
    op.alter_column("games", "version_group_id", nullable=False)


def downgrade() -> None:

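Step 6's cleanup leans on two core-SQLAlchemy idioms worth calling out: ad-hoc `sa.table()`/`sa.column()` descriptions (so the migration never imports ORM models that may drift) and `bindparam(..., expanding=True)` for IN-list parameters in textual SQL. A self-contained sketch against in-memory SQLite, with invented sample data:

```python
# Sketch of the data-migration pattern above: lightweight table
# constructs for UPDATEs plus an expanding IN-list for the deletes.
import sqlalchemy as sa

engine = sa.create_engine("sqlite://")
with engine.begin() as conn:
    conn.execute(
        sa.text("CREATE TABLE routes (id INTEGER PRIMARY KEY, game_id INTEGER)")
    )
    conn.execute(sa.text("INSERT INTO routes VALUES (1, 10), (2, 20), (3, 30)"))

    # Ad-hoc table description: no metadata, no ORM model required
    routes_table = sa.table(
        "routes",
        sa.column("id", sa.Integer),
        sa.column("game_id", sa.Integer),
    )
    # Re-point rows via the lightweight construct
    conn.execute(
        sa.update(routes_table)
        .where(routes_table.c.game_id == 20)
        .values(game_id=10)
    )
    # DELETE with an expanding IN-list, as in the non-canonical cleanup
    conn.execute(
        sa.text("DELETE FROM routes WHERE game_id IN :ids").bindparams(
            sa.bindparam("ids", expanding=True)
        ),
        {"ids": [30]},
    )
    remaining = conn.execute(
        sa.select(routes_table.c.id).order_by(routes_table.c.id)
    ).scalars().all()
```

`expanding=True` lets the driver receive a Python list and render one placeholder per element at execution time, which is why the migration can pass `non_canonical_game_ids` straight through.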
View File

@@ -5,25 +5,25 @@ Revises: c3d4e5f6a7b8
Create Date: 2026-02-06 14:00:00.000000
"""
from collections.abc import Sequence

import sqlalchemy as sa
from alembic import op

# revision identifiers, used by Alembic.
revision: str = "d4e5f6a7b8c9"
down_revision: str | Sequence[str] | None = "c3d4e5f6a7b8"
branch_labels: str | Sequence[str] | None = None
depends_on: str | Sequence[str] | None = None


def upgrade() -> None:
    op.add_column(
        "games",
        sa.Column("color", sa.String(7), nullable=True),
    )


def downgrade() -> None:
    op.drop_column("games", "color")

View File

@@ -5,26 +5,26 @@ Revises: c3d4e5f6a7b9
Create Date: 2026-02-09 20:00:00.000000
"""
from collections.abc import Sequence

import sqlalchemy as sa
from alembic import op
from sqlalchemy.dialects.postgresql import JSONB

# revision identifiers, used by Alembic.
revision: str = "d4e5f6a7b9c0"
down_revision: str | Sequence[str] | None = "c3d4e5f6a7b9"
branch_labels: str | Sequence[str] | None = None
depends_on: str | Sequence[str] | None = None


def upgrade() -> None:
    op.add_column(
        "nuzlocke_runs",
        sa.Column("hof_encounter_ids", JSONB(), nullable=True),
    )


def downgrade() -> None:
    op.drop_column("nuzlocke_runs", "hof_encounter_ids")

View File

@@ -5,25 +5,27 @@ Revises: d3e4f5a6b7c8
Create Date: 2026-02-08 18:00:00.000000
"""
from collections.abc import Sequence

from alembic import op

# revision identifiers, used by Alembic.
revision: str = "e4f5a6b7c8d9"
down_revision: str | Sequence[str] | None = "d3e4f5a6b7c8"
branch_labels: str | Sequence[str] | None = None
depends_on: str | Sequence[str] | None = None


def upgrade() -> None:
    op.create_unique_constraint(
        "uq_boss_battles_version_group_order",
        "boss_battles",
        ["version_group_id", "order"],
    )


def downgrade() -> None:
    op.drop_constraint(
        "uq_boss_battles_version_group_order", "boss_battles", type_="unique"
    )

View File

@@ -5,24 +5,25 @@ Revises: d4e5f6a7b8c9
Create Date: 2026-02-07 10:00:00.000000
"""
from collections.abc import Sequence

import sqlalchemy as sa
from alembic import op

# revision identifiers, used by Alembic.
revision: str = "e5f6a7b8c9d0"
down_revision: str | Sequence[str] | None = "d4e5f6a7b8c9"
branch_labels: str | Sequence[str] | None = None
depends_on: str | Sequence[str] | None = None


def upgrade() -> None:
    # Rename national_dex -> pokeapi_id and widen to Integer
    op.alter_column(
        "pokemon",
        "national_dex",
        new_column_name="pokeapi_id",
        type_=sa.Integer(),
        existing_type=sa.SmallInteger(),
        existing_nullable=False,
@@ -30,23 +31,26 @@ def upgrade() -> None:
    # Add real national_dex column (shared between forms and base species)
    op.add_column(
        "pokemon",
        sa.Column(
            "national_dex", sa.SmallInteger(), nullable=False, server_default="0"
        ),
    )

    # Populate national_dex = pokeapi_id for all existing rows
    # (correct for base species; forms will be fixed by re-seeding)
    op.execute("UPDATE pokemon SET national_dex = pokeapi_id")

    # Remove the default now that all rows are populated
    op.alter_column("pokemon", "national_dex", server_default=None)


def downgrade() -> None:
    op.drop_column("pokemon", "national_dex")
    op.alter_column(
        "pokemon",
        "pokeapi_id",
        new_column_name="national_dex",
        type_=sa.SmallInteger(),
        existing_type=sa.Integer(),
        existing_nullable=False,

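The `national_dex` addition follows the standard three-step pattern for introducing a NOT NULL column on a populated table: add it with a `server_default`, backfill the real values, then drop the default. A sketch of the first two steps with plain SQL on in-memory SQLite and invented sample rows (the real migration does this through `op.add_column`/`op.execute`/`op.alter_column`):

```python
import sqlalchemy as sa

engine = sa.create_engine("sqlite://")
with engine.begin() as conn:
    # Existing, populated table (invented sample data)
    conn.execute(
        sa.text("CREATE TABLE pokemon (id INTEGER PRIMARY KEY, pokeapi_id INTEGER NOT NULL)")
    )
    conn.execute(sa.text("INSERT INTO pokemon VALUES (1, 25), (2, 10025)"))

    # Step 1: the DEFAULT lets NOT NULL hold for rows that already exist
    conn.execute(
        sa.text("ALTER TABLE pokemon ADD COLUMN national_dex SMALLINT NOT NULL DEFAULT 0")
    )
    # Step 2: backfill the real value from the existing column
    conn.execute(sa.text("UPDATE pokemon SET national_dex = pokeapi_id"))
    # (Step 3, dropping the default, goes through op.alter_column in the
    # migration; SQLite's ALTER TABLE cannot express it directly.)

    rows = conn.execute(
        sa.text("SELECT id, national_dex FROM pokemon ORDER BY id")
    ).all()
```

Without the temporary default, adding a NOT NULL column to a non-empty table fails outright, which is exactly what the migration avoids.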
View File

@@ -5,32 +5,55 @@ Revises: d4e5f6a7b9c0
Create Date: 2026-02-09 22:00:00.000000
"""
from collections.abc import Sequence

import sqlalchemy as sa
from alembic import op

# revision identifiers, used by Alembic.
revision: str = "e5f6a7b9c0d1"
down_revision: str | Sequence[str] | None = "d4e5f6a7b9c0"
branch_labels: str | Sequence[str] | None = None
depends_on: str | Sequence[str] | None = None


def upgrade() -> None:
    op.create_table(
        "genlocke_transfers",
        sa.Column("id", sa.Integer(), primary_key=True),
        sa.Column(
            "genlocke_id",
            sa.Integer(),
            sa.ForeignKey("genlockes.id", ondelete="CASCADE"),
            nullable=False,
            index=True,
        ),
        sa.Column(
            "source_encounter_id",
            sa.Integer(),
            sa.ForeignKey("encounters.id"),
            nullable=False,
            index=True,
        ),
        sa.Column(
            "target_encounter_id",
            sa.Integer(),
            sa.ForeignKey("encounters.id"),
            nullable=False,
            unique=True,
        ),
        sa.Column("source_leg_order", sa.SmallInteger(), nullable=False),
        sa.Column("target_leg_order", sa.SmallInteger(), nullable=False),
        sa.Column(
            "created_at",
            sa.DateTime(timezone=True),
            server_default=sa.func.now(),
            nullable=False,
        ),
        sa.UniqueConstraint("target_encounter_id", name="uq_genlocke_transfers_target"),
    )


def downgrade() -> None:
    op.drop_table("genlocke_transfers")

View File

@@ -5,22 +5,22 @@ Revises: e4f5a6b7c8d9
Create Date: 2026-02-08 20:00:00.000000

"""

from collections.abc import Sequence

import sqlalchemy as sa
from alembic import op

# revision identifiers, used by Alembic.
revision: str = "f5a6b7c8d9e0"
down_revision: str | Sequence[str] | None = "e4f5a6b7c8d9"
branch_labels: str | Sequence[str] | None = None
depends_on: str | Sequence[str] | None = None


def upgrade() -> None:
    op.add_column("boss_battles", sa.Column("section", sa.String(100), nullable=True))


def downgrade() -> None:
    op.drop_column("boss_battles", "section")

View File

@@ -5,25 +5,25 @@ Revises: e5f6a7b8c9d0
Create Date: 2026-02-07 12:00:00.000000

"""

from collections.abc import Sequence

import sqlalchemy as sa
from alembic import op

# revision identifiers, used by Alembic.
revision: str = "f6a7b8c9d0e1"
down_revision: str | Sequence[str] | None = "e5f6a7b8c9d0"
branch_labels: str | Sequence[str] | None = None
depends_on: str | Sequence[str] | None = None


def upgrade() -> None:
    op.add_column(
        "evolutions",
        sa.Column("region", sa.String(30), nullable=True),
    )


def downgrade() -> None:
    op.drop_column("evolutions", "region")

View File

@@ -1,4 +1,4 @@
from datetime import UTC, datetime

from fastapi import APIRouter, Depends, HTTPException, Response
from sqlalchemy import select
@@ -33,7 +33,9 @@ async def _get_version_group_id(session: AsyncSession, game_id: int) -> int:
    if game is None:
        raise HTTPException(status_code=404, detail="Game not found")
    if game.version_group_id is None:
        raise HTTPException(
            status_code=400, detail="Game has no version group assigned"
        )
    return game.version_group_id
@@ -41,9 +43,7 @@ async def _get_version_group_id(session: AsyncSession, game_id: int) -> int:
@router.get("/games/{game_id}/bosses", response_model=list[BossBattleResponse])
async def list_bosses(game_id: int, session: AsyncSession = Depends(get_session)):
    vg_id = await _get_version_group_id(session, game_id)
    result = await session.execute(
@@ -72,7 +72,9 @@ async def reorder_bosses(
    bosses = {b.id: b for b in result.scalars().all()}
    if len(bosses) != len(boss_ids):
        raise HTTPException(
            status_code=400, detail="Some boss IDs not found in this game"
        )

    # Phase 1: set temporary negative orders to avoid unique constraint violations
    for i, item in enumerate(data.bosses):
@@ -94,7 +96,9 @@ async def reorder_bosses(
    return result.scalars().all()


@router.post(
    "/games/{game_id}/bosses", response_model=BossBattleResponse, status_code=201
)
async def create_boss(
    game_id: int,
    data: BossBattleCreate,
@@ -157,7 +161,9 @@ async def delete_boss(
    vg_id = await _get_version_group_id(session, game_id)
    result = await session.execute(
        select(BossBattle).where(
            BossBattle.id == boss_id, BossBattle.version_group_id == vg_id
        )
    )
    boss = result.scalar_one_or_none()
    if boss is None:
@@ -188,9 +194,13 @@ async def bulk_import_bosses(
    bosses_data = [item.model_dump() for item in items]
    try:
        count = await upsert_bosses(
            session, vg_id, bosses_data, dex_to_id, route_name_to_id
        )
    except Exception as e:
        raise HTTPException(
            status_code=400, detail=f"Failed to import bosses: {e}"
        ) from e

    await session.commit()
    return BulkImportResult(created=count, updated=0, errors=[])
@@ -252,22 +262,20 @@ async def set_boss_team(
@router.get("/runs/{run_id}/boss-results", response_model=list[BossResultResponse])
async def list_boss_results(run_id: int, session: AsyncSession = Depends(get_session)):
    run = await session.get(NuzlockeRun, run_id)
    if run is None:
        raise HTTPException(status_code=404, detail="Run not found")

    result = await session.execute(
        select(BossResult).where(BossResult.run_id == run_id).order_by(BossResult.id)
    )
    return result.scalars().all()


@router.post(
    "/runs/{run_id}/boss-results", response_model=BossResultResponse, status_code=201
)
async def create_boss_result(
    run_id: int,
    data: BossResultCreate,
@@ -293,14 +301,14 @@ async def create_boss_result(
    if result:
        result.result = data.result
        result.attempts = data.attempts
        result.completed_at = datetime.now(UTC) if data.result == "won" else None
    else:
        result = BossResult(
            run_id=run_id,
            boss_battle_id=data.boss_battle_id,
            result=data.result,
            attempts=data.attempts,
            completed_at=datetime.now(UTC) if data.result == "won" else None,
        )
        session.add(result)
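The `reorder_bosses` hunk above sidesteps the unique constraint on boss order by first writing temporary negative orders, then the final ones, so no two rows ever hold the same order mid-update. A minimal sketch of that two-phase pattern, with plain dicts standing in for the ORM rows (names here are illustrative, not from the codebase):

```python
def two_phase_reorder(rows: dict[int, dict], new_order: list[int]) -> None:
    """Reassign unique 'order' values without ever duplicating one,
    mimicking how a DB unique constraint violation is avoided mid-update."""
    # Phase 1: park every row at a temporary negative order
    for i, row_id in enumerate(new_order):
        rows[row_id]["order"] = -(i + 1)
    # Phase 2: write the final positive orders
    for i, row_id in enumerate(new_order):
        rows[row_id]["order"] = i + 1


rows = {10: {"order": 1}, 11: {"order": 2}, 12: {"order": 3}}
two_phase_reorder(rows, [12, 10, 11])
print([rows[i]["order"] for i in (10, 11, 12)])  # → [2, 3, 1]
```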

View File

@@ -8,8 +8,8 @@ from sqlalchemy.orm import joinedload, selectinload
from app.core.database import get_session
from app.models.encounter import Encounter
from app.models.evolution import Evolution
from app.models.genlocke import GenlockeLeg
from app.models.genlocke_transfer import GenlockeTransfer
from app.models.nuzlocke_run import NuzlockeRun
from app.models.pokemon import Pokemon
from app.models.route import Route
@@ -60,7 +60,11 @@ async def create_encounter(
    # Shiny clause: shiny encounters bypass the route-lock check
    shiny_clause_on = run.rules.get("shinyClause", True) if run.rules else True
    skip_route_lock = (data.is_shiny and shiny_clause_on) or data.origin in (
        "shed_evolution",
        "egg",
        "transfer",
    )

    # If this route has a parent, check if sibling already has an encounter
    if route.parent_route_id is not None and not skip_route_lock:
@@ -78,7 +82,8 @@ async def create_encounter(
        # Zone-aware: only check siblings in the same zone (null treated as 0)
        my_zone = route.pinwheel_zone if route.pinwheel_zone is not None else 0
        sibling_ids = [
            s.id
            for s in siblings
            if (s.pinwheel_zone if s.pinwheel_zone is not None else 0) == my_zone
        ]
    else:
@@ -89,8 +94,7 @@ async def create_encounter(
        # Exclude transfer-target encounters so they don't block the starter
        transfer_target_ids = select(GenlockeTransfer.target_encounter_id)
        existing_encounter = await session.execute(
            select(Encounter).where(
                Encounter.run_id == run_id,
                Encounter.route_id.in_(sibling_ids),
                ~Encounter.id.in_(transfer_target_ids),
@@ -197,6 +201,7 @@ async def bulk_randomize_encounters(
    # 2. Get version_group_id from game
    from app.models.game import Game

    game = await session.get(Game, game_id)
    if game is None or game.version_group_id is None:
        raise HTTPException(status_code=400, detail="Game has no version group")
@@ -257,8 +262,7 @@ async def bulk_randomize_encounters(
    leg = leg_result.scalar_one_or_none()
    if leg:
        genlocke_result = await session.execute(
            select(GenlockeLeg.retired_pokemon_ids).where(
                GenlockeLeg.genlocke_id == leg.genlocke_id,
                GenlockeLeg.leg_order < leg.leg_order,
                GenlockeLeg.retired_pokemon_ids.isnot(None),
@@ -268,7 +272,6 @@ async def bulk_randomize_encounters(
            duped.update(retired_ids)

    # 8. Organize routes: identify top-level and children
    top_level = [r for r in all_routes if r.parent_route_id is None]
    children_by_parent: dict[int, list[Route]] = {}
    for r in all_routes:
@@ -289,7 +292,11 @@ async def bulk_randomize_encounters(
        if parent_route.id in encountered_route_ids:
            continue
        available = route_pokemon.get(parent_route.id, [])
        eligible = (
            [p for p in available if p not in duped]
            if dupes_clause_on
            else available
        )
        if not eligible:
            skipped += 1
            continue
@@ -335,7 +342,11 @@ async def bulk_randomize_encounters(
                if p not in zone_pokemon:
                    zone_pokemon.append(p)
            eligible = (
                [p for p in zone_pokemon if p not in duped]
                if dupes_clause_on
                else zone_pokemon
            )
            if not eligible:
                skipped += 1
                continue
@@ -371,7 +382,11 @@ async def bulk_randomize_encounters(
                if p not in group_pokemon:
                    group_pokemon.append(p)
            eligible = (
                [p for p in group_pokemon if p not in duped]
                if dupes_clause_on
                else group_pokemon
            )
            if not eligible:
                skipped += 1
                continue

View File

@@ -26,17 +26,18 @@ async def list_evolutions(
    offset: int = Query(0, ge=0),
    session: AsyncSession = Depends(get_session),
):
    base_query = select(Evolution).options(
        joinedload(Evolution.from_pokemon), joinedload(Evolution.to_pokemon)
    )

    if search:
        search_lower = search.lower()
        # Join pokemon to search by name
        from_pokemon = (
            select(Pokemon.id)
            .where(func.lower(Pokemon.name).contains(search_lower))
            .scalar_subquery()
        )
        base_query = base_query.where(
            or_(
                Evolution.from_pokemon_id.in_(from_pokemon),
@@ -52,9 +53,11 @@ async def list_evolutions(
    count_base = select(Evolution)
    if search:
        search_lower = search.lower()
        from_pokemon = (
            select(Pokemon.id)
            .where(func.lower(Pokemon.name).contains(search_lower))
            .scalar_subquery()
        )
        count_base = count_base.where(
            or_(
                Evolution.from_pokemon_id.in_(from_pokemon),
@@ -68,7 +71,11 @@ async def list_evolutions(
    count_query = select(func.count()).select_from(count_base.subquery())
    total = (await session.execute(count_query)).scalar() or 0

    items_query = (
        base_query.order_by(Evolution.from_pokemon_id, Evolution.to_pokemon_id)
        .offset(offset)
        .limit(limit)
    )
    result = await session.execute(items_query)
    items = result.scalars().unique().all()
@@ -209,7 +216,9 @@ async def bulk_import_evolutions(
            session.add(evolution)
            created += 1
        except Exception as e:
            errors.append(
                f"Evolution {item.from_pokeapi_id} -> {item.to_pokeapi_id}: {e}"
            )

    await session.commit()
    return BulkImportResult(created=created, updated=updated, errors=errors)
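The search hunks above match an evolution when either endpoint's Pokemon name contains the query, using one scalar subquery on both sides of an `or_`. The same logic in plain Python, with hypothetical toy data in place of the real schema:

```python
def matching_evolutions(evolutions, names, search):
    """Keep (from_id, to_id) pairs where either endpoint's name contains search."""
    search = search.lower()
    hits = {pid for pid, name in names.items() if search in name.lower()}
    return [(f, t) for f, t in evolutions if f in hits or t in hits]


names = {1: "Bulbasaur", 2: "Ivysaur", 4: "Charmander"}
evos = [(1, 2), (4, 5)]
print(matching_evolutions(evos, names, "ivy"))  # → [(1, 2)]
```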

View File

@@ -20,9 +20,7 @@ router = APIRouter()
@router.get("/games")
async def export_games(session: AsyncSession = Depends(get_session)):
    """Export all games in seed JSON format."""
    result = await session.execute(select(Game).order_by(Game.name))
    games = result.scalars().all()
    return [
        {
@@ -154,7 +152,11 @@ async def export_game_bosses(
                    "pokemon_name": bp.pokemon.name,
                    "level": bp.level,
                    "order": bp.order,
                    **(
                        {"condition_label": bp.condition_label}
                        if bp.condition_label
                        else {}
                    ),
                }
                for bp in sorted(b.pokemon, key=lambda p: p.order)
            ],
@@ -167,9 +169,7 @@ async def export_game_bosses(
@router.get("/pokemon")
async def export_pokemon(session: AsyncSession = Depends(get_session)):
    """Export all pokemon in seed JSON format."""
    result = await session.execute(select(Pokemon).order_by(Pokemon.pokeapi_id))
    pokemon_list = result.scalars().all()
    return [
        {

View File

@@ -40,7 +40,9 @@ async def _get_game_or_404(session: AsyncSession, game_id: int) -> Game:
async def _get_version_group_id(session: AsyncSession, game_id: int) -> int:
    game = await _get_game_or_404(session, game_id)
    if game.version_group_id is None:
        raise HTTPException(
            status_code=400, detail="Game has no version group assigned"
        )
    return game.version_group_id
@@ -68,16 +70,18 @@ async def list_games_by_region(session: AsyncSession = Depends(get_session)):
    for region in regions_data:
        region_games = games_by_region.get(region["name"], [])
        defaults = region["genlocke_defaults"]
        response.append(
            {
                "name": region["name"],
                "generation": region["generation"],
                "order": region["order"],
                "genlocke_defaults": {
                    "true_genlocke": defaults["true"],
                    "normal_genlocke": defaults["normal"],
                },
                "games": region_games,
            }
        )
    return response
@@ -89,9 +93,7 @@ async def get_game(game_id: int, session: AsyncSession = Depends(get_session)):
    # Load routes via version_group_id
    result = await session.execute(
        select(Route).where(Route.version_group_id == vg_id).order_by(Route.order)
    )
    routes = result.scalars().all()
@@ -149,10 +151,13 @@ async def list_game_routes(
    def route_to_dict(route: Route) -> dict:
        # Only show encounter methods for the requested game
        methods = sorted(
            {
                re.encounter_method
                for re in route.route_encounters
                if re.game_id == game_id
            }
        )
        return {
            "id": route.id,
            "name": route.name,
@@ -193,14 +198,12 @@ async def list_game_routes(
@router.post("", response_model=GameResponse, status_code=201)
async def create_game(data: GameCreate, session: AsyncSession = Depends(get_session)):
    existing = await session.execute(select(Game).where(Game.slug == data.slug))
    if existing.scalar_one_or_none() is not None:
        raise HTTPException(
            status_code=409, detail="Game with this slug already exists"
        )

    game = Game(**data.model_dump())
    session.add(game)
@@ -223,7 +226,9 @@ async def update_game(
            select(Game).where(Game.slug == update_data["slug"], Game.id != game_id)
        )
        if existing.scalar_one_or_none() is not None:
            raise HTTPException(
                status_code=409, detail="Game with this slug already exists"
            )

    for field, value in update_data.items():
        setattr(game, field, value)
@@ -234,9 +239,7 @@ async def update_game(
@router.delete("/{game_id}", status_code=204)
async def delete_game(game_id: int, session: AsyncSession = Depends(get_session)):
    result = await session.execute(
        select(Game).where(Game.id == game_id).options(selectinload(Game.runs))
    )
@@ -393,7 +396,9 @@ async def bulk_import_routes(
    try:
        route_name_to_id = await upsert_routes(session, vg_id, routes_data)
    except Exception as e:
        raise HTTPException(
            status_code=400, detail=f"Failed to import routes: {e}"
        ) from e

    # Upsert encounters for each route
    encounter_count = 0
@@ -406,8 +411,11 @@ async def bulk_import_routes(
        if item.encounters:
            try:
                count = await upsert_route_encounters(
                    session,
                    route_id,
                    [e.model_dump() for e in item.encounters],
                    dex_to_id,
                    game_id,
                )
                encounter_count += count
            except Exception as e:
@@ -422,8 +430,11 @@ async def bulk_import_routes(
            if child.encounters:
                try:
                    count = await upsert_route_encounters(
                        session,
                        child_id,
                        [e.model_dump() for e in child.encounters],
                        dex_to_id,
                        game_id,
                    )
                    encounter_count += count
                except Exception as e:

View File

@@ -1,6 +1,8 @@
from fastapi import APIRouter, Depends, HTTPException
from pydantic import BaseModel
from sqlalchemy import delete as sa_delete
from sqlalchemy import func, select
from sqlalchemy import update as sa_update
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy.orm import selectinload
@@ -9,9 +11,9 @@ from app.models.encounter import Encounter
from app.models.evolution import Evolution
from app.models.game import Game
from app.models.genlocke import Genlocke, GenlockeLeg
from app.models.genlocke_transfer import GenlockeTransfer
from app.models.nuzlocke_run import NuzlockeRun
from app.models.pokemon import Pokemon
from app.models.route import Route
from app.schemas.genlocke import (
    AddLegRequest,
@@ -74,9 +76,7 @@ async def list_genlockes(session: AsyncSession = Depends(get_session)):
@router.get("/{genlocke_id}", response_model=GenlockeDetailResponse)
async def get_genlocke(genlocke_id: int, session: AsyncSession = Depends(get_session)):
    result = await session.execute(
        select(Genlocke)
        .where(Genlocke.id == genlocke_id)
@@ -112,7 +112,9 @@ async def get_genlocke(
    legs_completed = 0
    for leg in genlocke.legs:
        run_status = leg.run.status if leg.run else None
        enc_count, death_count = (
            stats_by_run.get(leg.run_id, (0, 0)) if leg.run_id else (0, 0)
        )
        total_encounters += enc_count
        total_deaths += death_count
        if run_status == "completed":
@@ -254,7 +256,9 @@ async def get_genlocke_graveyard(
        )
    )
    deadliest = (
        max(deaths_per_leg, key=lambda s: s.death_count) if deaths_per_leg else None
    )

    return GenlockeGraveyardResponse(
        entries=entries,
@@ -285,9 +289,7 @@ async def get_genlocke_lineages(
    # Query all transfers for this genlocke
    transfer_result = await session.execute(
        select(GenlockeTransfer).where(GenlockeTransfer.genlocke_id == genlocke_id)
    )
    transfers = transfer_result.scalars().all()
@@ -302,7 +304,11 @@ async def get_genlocke_lineages(
        backward.add(t.target_encounter_id)

    # Find roots: sources that are NOT targets
    roots = [
        t.source_encounter_id
        for t in transfers
        if t.source_encounter_id not in backward
    ]
    # Deduplicate while preserving order
    seen_roots: set[int] = set()
    unique_roots: list[int] = []
@@ -421,7 +427,7 @@ async def get_genlocke_lineages(
    )

    # Sort by first leg order, then by encounter ID
    lineages.sort(key=lambda lin: (lin.legs[0].leg_order, lin.legs[0].encounter_id))

    return GenlockeLineageResponse(
@@ -440,15 +446,11 @@ async def create_genlocke(
        raise HTTPException(status_code=400, detail="Name is required")

    # Validate all game_ids exist
    result = await session.execute(select(Game).where(Game.id.in_(data.game_ids)))
    found_games = {g.id: g for g in result.scalars().all()}
    missing = [gid for gid in data.game_ids if gid not in found_games]
    if missing:
        raise HTTPException(status_code=404, detail=f"Games not found: {missing}")

    # Create genlocke
    genlocke = Genlocke(
@@ -578,9 +580,7 @@ async def advance_leg(
        raise HTTPException(status_code=404, detail="Genlocke not found")
    if genlocke.status != "active":
        raise HTTPException(status_code=400, detail="Genlocke is not active")

    # Find the current leg
    current_leg = None
@@ -596,9 +596,7 @@ async def advance_leg(
    # Verify current leg's run is completed
    if current_leg.run_id is None:
        raise HTTPException(status_code=400, detail="Current leg has no run")
    current_run = await session.get(NuzlockeRun, current_leg.run_id)
    if current_run is None or current_run.status != "completed":
        raise HTTPException(
@@ -606,14 +604,10 @@ async def advance_leg(
) )
if next_leg is None: if next_leg is None:
raise HTTPException( raise HTTPException(status_code=400, detail="No next leg to advance to")
status_code=400, detail="No next leg to advance to"
)
if next_leg.run_id is not None: if next_leg.run_id is not None:
raise HTTPException( raise HTTPException(status_code=400, detail="Next leg already has a run")
status_code=400, detail="Next leg already has a run"
)
# Compute retired Pokemon families if retireHoF is enabled # Compute retired Pokemon families if retireHoF is enabled
if genlocke.genlocke_rules.get("retireHoF", False): if genlocke.genlocke_rules.get("retireHoF", False):
@@ -807,10 +801,12 @@ async def get_retired_families(
for leg in legs: for leg in legs:
ids = leg.retired_pokemon_ids or [] ids = leg.retired_pokemon_ids or []
cumulative.update(ids) cumulative.update(ids)
by_leg.append(RetiredLegResponse( by_leg.append(
leg_order=leg.leg_order, RetiredLegResponse(
retired_pokemon_ids=ids, leg_order=leg.leg_order,
)) retired_pokemon_ids=ids,
)
)
return RetiredFamiliesResponse( return RetiredFamiliesResponse(
retired_pokemon_ids=sorted(cumulative), retired_pokemon_ids=sorted(cumulative),
@@ -837,12 +833,15 @@ async def update_genlocke(
update_data = data.model_dump(exclude_unset=True) update_data = data.model_dump(exclude_unset=True)
if "status" in update_data: if "status" in update_data and update_data["status"] not in (
if update_data["status"] not in ("active", "completed", "failed"): "active",
raise HTTPException( "completed",
status_code=400, "failed",
detail="Status must be one of: active, completed, failed", ):
) raise HTTPException(
status_code=400,
detail="Status must be one of: active, completed, failed",
)
for field, value in update_data.items(): for field, value in update_data.items():
setattr(genlocke, field, value) setattr(genlocke, field, value)
@@ -871,8 +870,7 @@ async def delete_genlocke(
# Delete legs explicitly to avoid ORM cascade issues # Delete legs explicitly to avoid ORM cascade issues
# (genlocke_id is non-nullable, so SQLAlchemy can't nullify it) # (genlocke_id is non-nullable, so SQLAlchemy can't nullify it)
await session.execute( await session.execute(
sa_delete(GenlockeLeg) sa_delete(GenlockeLeg).where(GenlockeLeg.genlocke_id == genlocke_id)
.where(GenlockeLeg.genlocke_id == genlocke_id)
) )
await session.delete(genlocke) await session.delete(genlocke)

View File

@@ -8,7 +8,6 @@ from app.models.evolution import Evolution
 from app.models.pokemon import Pokemon
 from app.models.route import Route
 from app.models.route_encounter import RouteEncounter
-from app.models.game import Game
 from app.schemas.pokemon import (
     BulkImportItem,
     BulkImportResult,
@@ -40,9 +39,7 @@ async def list_pokemon(
     # Build base query with optional search filter
     base_query = select(Pokemon)
     if search:
-        base_query = base_query.where(
-            func.lower(Pokemon.name).contains(search.lower())
-        )
+        base_query = base_query.where(func.lower(Pokemon.name).contains(search.lower()))
     if type:
         base_query = base_query.where(Pokemon.types.any(type))
@@ -51,7 +48,11 @@
     total = (await session.execute(count_query)).scalar() or 0

     # Get paginated items
-    items_query = base_query.order_by(Pokemon.national_dex, Pokemon.name).offset(offset).limit(limit)
+    items_query = (
+        base_query.order_by(Pokemon.national_dex, Pokemon.name)
+        .offset(offset)
+        .limit(limit)
+    )
     result = await session.execute(items_query)
     items = result.scalars().all()
@@ -156,9 +157,7 @@ async def get_pokemon_families(
 @router.get("/pokemon/{pokemon_id}", response_model=PokemonResponse)
-async def get_pokemon(
-    pokemon_id: int, session: AsyncSession = Depends(get_session)
-):
+async def get_pokemon(pokemon_id: int, session: AsyncSession = Depends(get_session)):
     pokemon = await session.get(Pokemon, pokemon_id)
     if pokemon is None:
         raise HTTPException(status_code=404, detail="Pokemon not found")
@@ -258,7 +257,8 @@ async def get_pokemon_evolution_chain(
     # Filter evolutions to only those in the family
     family_evo_ids = [
-        evo.id for evo in evolutions
+        evo.id
+        for evo in evolutions
         if evo.from_pokemon_id in family and evo.to_pokemon_id in family
     ]
@@ -294,9 +294,7 @@ async def get_pokemon_evolutions(
         .options(joinedload(Evolution.to_pokemon))
     )
     if region is not None:
-        query = query.where(
-            or_(Evolution.region.is_(None), Evolution.region == region)
-        )
+        query = query.where(or_(Evolution.region.is_(None), Evolution.region == region))
     result = await session.execute(query)
     evolutions = result.scalars().unique().all()
@@ -309,7 +307,8 @@ async def get_pokemon_evolutions(
     }
     if regional_keys:
         evolutions = [
-            e for e in evolutions
+            e
+            for e in evolutions
             if e.region is not None or (e.trigger, e.item) not in regional_keys
         ]
@@ -349,9 +348,7 @@ async def update_pokemon(
 @router.delete("/pokemon/{pokemon_id}", status_code=204)
-async def delete_pokemon(
-    pokemon_id: int, session: AsyncSession = Depends(get_session)
-):
+async def delete_pokemon(pokemon_id: int, session: AsyncSession = Depends(get_session)):
     result = await session.execute(
         select(Pokemon)
         .where(Pokemon.id == pokemon_id)

View File

@@ -1,6 +1,17 @@
 from fastapi import APIRouter

-from app.api import bosses, encounters, evolutions, export, games, genlockes, health, pokemon, runs, stats
+from app.api import (
+    bosses,
+    encounters,
+    evolutions,
+    export,
+    games,
+    genlockes,
+    health,
+    pokemon,
+    runs,
+    stats,
+)

 api_router = APIRouter()
 api_router.include_router(health.router)

View File

@@ -1,4 +1,4 @@
-from datetime import datetime, timezone
+from datetime import UTC, datetime

 from fastapi import APIRouter, Depends, HTTPException, Response
 from sqlalchemy import func, select
@@ -9,18 +9,22 @@ from app.core.database import get_session
 from app.models.boss_result import BossResult
 from app.models.encounter import Encounter
 from app.models.game import Game
-from app.models.genlocke import Genlocke, GenlockeLeg
+from app.models.genlocke import GenlockeLeg
 from app.models.genlocke_transfer import GenlockeTransfer
 from app.models.nuzlocke_run import NuzlockeRun
-from app.schemas.run import RunCreate, RunDetailResponse, RunGenlockeContext, RunResponse, RunUpdate
+from app.schemas.run import (
+    RunCreate,
+    RunDetailResponse,
+    RunGenlockeContext,
+    RunResponse,
+    RunUpdate,
+)

 router = APIRouter()

 @router.post("", response_model=RunResponse, status_code=201)
-async def create_run(
-    data: RunCreate, session: AsyncSession = Depends(get_session)
-):
+async def create_run(data: RunCreate, session: AsyncSession = Depends(get_session)):
     # Validate game exists
     game = await session.get(Game, data.game_id)
     if game is None:
@@ -53,12 +57,9 @@ async def get_run(run_id: int, session: AsyncSession = Depends(get_session)):
         .where(NuzlockeRun.id == run_id)
         .options(
             joinedload(NuzlockeRun.game),
-            selectinload(NuzlockeRun.encounters)
-            .joinedload(Encounter.pokemon),
-            selectinload(NuzlockeRun.encounters)
-            .joinedload(Encounter.current_pokemon),
-            selectinload(NuzlockeRun.encounters)
-            .joinedload(Encounter.route),
+            selectinload(NuzlockeRun.encounters).joinedload(Encounter.pokemon),
+            selectinload(NuzlockeRun.encounters).joinedload(Encounter.current_pokemon),
+            selectinload(NuzlockeRun.encounters).joinedload(Encounter.route),
         )
     )
     run = result.scalar_one_or_none()
@@ -134,7 +135,10 @@ async def update_run(
     update_data = data.model_dump(exclude_unset=True)

     # Validate hof_encounter_ids if provided
-    if "hof_encounter_ids" in update_data and update_data["hof_encounter_ids"] is not None:
+    if (
+        "hof_encounter_ids" in update_data
+        and update_data["hof_encounter_ids"] is not None
+    ):
         hof_ids = update_data["hof_encounter_ids"]
         if len(hof_ids) > 6:
             raise HTTPException(
@@ -156,7 +160,8 @@ async def update_run(
                 detail=f"Encounters not found in this run: {missing}",
             )
         not_alive = [
-            eid for eid, e in found.items()
+            eid
+            for eid, e in found.items()
             if e.status != "caught" or e.faint_level is not None
         ]
         if not_alive:
@@ -168,13 +173,15 @@ async def update_run(
     # Auto-set completed_at when ending a run
     if "status" in update_data and update_data["status"] in ("completed", "failed"):
         if run.status != "active":
-            raise HTTPException(
-                status_code=400, detail="Only active runs can be ended"
-            )
-        update_data["completed_at"] = datetime.now(timezone.utc)
+            raise HTTPException(status_code=400, detail="Only active runs can be ended")
+        update_data["completed_at"] = datetime.now(UTC)

     # Block reactivating a completed/failed run that belongs to a genlocke
-    if "status" in update_data and update_data["status"] == "active" and run.status != "active":
+    if (
+        "status" in update_data
+        and update_data["status"] == "active"
+        and run.status != "active"
+    ):
         leg_result = await session.execute(
             select(GenlockeLeg).where(GenlockeLeg.run_id == run_id)
         )
@@ -215,9 +222,7 @@ async def update_run(
 @router.delete("/{run_id}", status_code=204)
-async def delete_run(
-    run_id: int, session: AsyncSession = Depends(get_session)
-):
+async def delete_run(run_id: int, session: AsyncSession = Depends(get_session)):
     run = await session.get(NuzlockeRun, run_id)
     if run is None:
         raise HTTPException(status_code=404, detail="Run not found")

View File

@@ -84,8 +84,12 @@ async def get_stats(session: AsyncSession = Depends(get_session)):
     fainted_count = enc.fainted
     missed_count = enc.missed
-    catch_rate = round(caught_count / total_encounters, 4) if total_encounters > 0 else None
-    avg_encounters_per_run = round(total_encounters / total_runs, 1) if total_runs > 0 else None
+    catch_rate = (
+        round(caught_count / total_encounters, 4) if total_encounters > 0 else None
+    )
+    avg_encounters_per_run = (
+        round(total_encounters / total_runs, 1) if total_runs > 0 else None
+    )

     # --- Top caught pokemon (top 10) ---
     top_caught_q = await session.execute(
@@ -102,7 +106,9 @@ async def get_stats(session: AsyncSession = Depends(get_session)):
         .limit(10)
     )
     top_caught_pokemon = [
-        PokemonRanking(pokemon_id=r.id, name=r.name, sprite_url=r.sprite_url, count=r.count)
+        PokemonRanking(
+            pokemon_id=r.id, name=r.name, sprite_url=r.sprite_url, count=r.count
+        )
         for r in top_caught_q.all()
     ]
@@ -120,7 +126,9 @@ async def get_stats(session: AsyncSession = Depends(get_session)):
         .limit(10)
     )
     top_encountered_pokemon = [
-        PokemonRanking(pokemon_id=r.id, name=r.name, sprite_url=r.sprite_url, count=r.count)
+        PokemonRanking(
+            pokemon_id=r.id, name=r.name, sprite_url=r.sprite_url, count=r.count
+        )
         for r in top_enc_q.all()
     ]
@@ -149,8 +157,7 @@ async def get_stats(session: AsyncSession = Depends(get_session)):
         .limit(5)
     )
     top_death_causes = [
-        DeathCause(cause=r.death_cause, count=r.count)
-        for r in death_causes_q.all()
+        DeathCause(cause=r.death_cause, count=r.count) for r in death_causes_q.all()
    ]

     # Average levels
@@ -179,8 +186,7 @@ async def get_stats(session: AsyncSession = Depends(get_session)):
         .order_by(func.count().desc())
     )
     type_distribution = [
-        TypeCount(type=r.type_name, count=r.count)
-        for r in type_q.all()
+        TypeCount(type=r.type_name, count=r.count) for r in type_q.all()
     ]

     return StatsResponse(

View File

@@ -7,7 +7,9 @@ from app.core.database import Base
 class BossBattle(Base):
     __tablename__ = "boss_battles"
     __table_args__ = (
-        UniqueConstraint("version_group_id", "order", name="uq_boss_battles_version_group_order"),
+        UniqueConstraint(
+            "version_group_id", "order", name="uq_boss_battles_version_group_order"
+        ),
     )

     id: Mapped[int] = mapped_column(primary_key=True)
@@ -15,8 +17,12 @@ class BossBattle(Base):
         ForeignKey("version_groups.id"), index=True
     )
     name: Mapped[str] = mapped_column(String(100))
-    boss_type: Mapped[str] = mapped_column(String(20))  # gym_leader, elite_four, champion, rival, evil_team, other
-    specialty_type: Mapped[str | None] = mapped_column(String(20), default=None)  # pokemon type specialty (e.g. rock, water)
+    boss_type: Mapped[str] = mapped_column(
+        String(20)
+    )  # gym_leader, elite_four, champion, rival, evil_team, other
+    specialty_type: Mapped[str | None] = mapped_column(
+        String(20), default=None
+    )  # pokemon type specialty (e.g. rock, water)
     badge_name: Mapped[str | None] = mapped_column(String(100))
     badge_image_url: Mapped[str | None] = mapped_column(String(500))
     level_cap: Mapped[int] = mapped_column(SmallInteger)
@@ -28,13 +34,13 @@ class BossBattle(Base):
     section: Mapped[str | None] = mapped_column(String(100), default=None)
     sprite_url: Mapped[str | None] = mapped_column(String(500))

-    version_group: Mapped["VersionGroup"] = relationship(
-        back_populates="boss_battles"
-    )
+    version_group: Mapped["VersionGroup"] = relationship(back_populates="boss_battles")
     after_route: Mapped["Route | None"] = relationship()
     pokemon: Mapped[list["BossPokemon"]] = relationship(
         back_populates="boss_battle", cascade="all, delete-orphan"
     )

     def __repr__(self) -> str:
-        return f"<BossBattle(id={self.id}, name='{self.name}', type='{self.boss_type}')>"
+        return (
+            f"<BossBattle(id={self.id}, name='{self.name}', type='{self.boss_type}')>"
+        )

View File

@@ -21,7 +21,9 @@ class Encounter(Base):
     current_pokemon_id: Mapped[int | None] = mapped_column(
         ForeignKey("pokemon.id"), index=True
     )
-    is_shiny: Mapped[bool] = mapped_column(Boolean, default=False, server_default=text("false"))
+    is_shiny: Mapped[bool] = mapped_column(
+        Boolean, default=False, server_default=text("false")
+    )
     caught_at: Mapped[datetime] = mapped_column(
         DateTime(timezone=True), server_default=func.now()
     )

View File

@@ -14,7 +14,9 @@ class Evolution(Base):
     min_level: Mapped[int | None] = mapped_column(SmallInteger)
     item: Mapped[str | None] = mapped_column(String(50))  # e.g. thunder-stone
     held_item: Mapped[str | None] = mapped_column(String(50))
-    condition: Mapped[str | None] = mapped_column(String(200))  # catch-all for other conditions
+    condition: Mapped[str | None] = mapped_column(
+        String(200)
+    )  # catch-all for other conditions
     region: Mapped[str | None] = mapped_column(String(30))

     from_pokemon: Mapped["Pokemon"] = relationship(foreign_keys=[from_pokemon_id])

View File

@@ -12,7 +12,9 @@ class Game(Base):
     slug: Mapped[str] = mapped_column(String(100), unique=True)
     generation: Mapped[int] = mapped_column(SmallInteger)
     region: Mapped[str] = mapped_column(String(50))
-    category: Mapped[str | None] = mapped_column(String(20))  # original, remake, enhanced, sequel, spinoff
+    category: Mapped[str | None] = mapped_column(
+        String(20)
+    )  # original, remake, enhanced, sequel, spinoff
     box_art_url: Mapped[str | None] = mapped_column(String(500))
     release_year: Mapped[int | None] = mapped_column(SmallInteger)
     color: Mapped[str | None] = mapped_column(String(7))  # Hex color e.g. #FF0000
@@ -20,9 +22,7 @@ class Game(Base):
         ForeignKey("version_groups.id"), index=True
     )

-    version_group: Mapped["VersionGroup | None"] = relationship(
-        back_populates="games"
-    )
+    version_group: Mapped["VersionGroup | None"] = relationship(back_populates="games")
     runs: Mapped[list["NuzlockeRun"]] = relationship(back_populates="game")

     def __repr__(self) -> str:

View File

@@ -13,7 +13,9 @@ class Genlocke(Base):
     id: Mapped[int] = mapped_column(primary_key=True)
     name: Mapped[str] = mapped_column(String(100))
-    status: Mapped[str] = mapped_column(String(20), index=True)  # active, completed, failed
+    status: Mapped[str] = mapped_column(
+        String(20), index=True
+    )  # active, completed, failed
     genlocke_rules: Mapped[dict] = mapped_column(JSONB, default=dict)
     nuzlocke_rules: Mapped[dict] = mapped_column(JSONB, default=dict)
     created_at: Mapped[datetime] = mapped_column(

View File

@@ -13,7 +13,9 @@ class NuzlockeRun(Base):
     id: Mapped[int] = mapped_column(primary_key=True)
     game_id: Mapped[int] = mapped_column(ForeignKey("games.id"), index=True)
     name: Mapped[str] = mapped_column(String(100))
-    status: Mapped[str] = mapped_column(String(20), index=True)  # active, completed, failed
+    status: Mapped[str] = mapped_column(
+        String(20), index=True
+    )  # active, completed, failed
     rules: Mapped[dict] = mapped_column(JSONB, default=dict)
     started_at: Mapped[datetime] = mapped_column(
         DateTime(timezone=True), server_default=func.now()
@@ -26,4 +28,6 @@ class NuzlockeRun(Base):
     boss_results: Mapped[list["BossResult"]] = relationship(back_populates="run")

     def __repr__(self) -> str:
-        return f"<NuzlockeRun(id={self.id}, name='{self.name}', status='{self.status}')>"
+        return (
+            f"<NuzlockeRun(id={self.id}, name='{self.name}', status='{self.status}')>"
+        )

View File

@@ -7,7 +7,9 @@ from app.core.database import Base
 class Route(Base):
     __tablename__ = "routes"
     __table_args__ = (
-        UniqueConstraint("version_group_id", "name", name="uq_routes_version_group_name"),
+        UniqueConstraint(
+            "version_group_id", "name", name="uq_routes_version_group_name"
+        ),
     )

     id: Mapped[int] = mapped_column(primary_key=True)

View File

@@ -8,8 +8,11 @@ class RouteEncounter(Base):
     __tablename__ = "route_encounters"
     __table_args__ = (
         UniqueConstraint(
-            "route_id", "pokemon_id", "encounter_method", "game_id",
-            name="uq_route_pokemon_method_game"
+            "route_id",
+            "pokemon_id",
+            "encounter_method",
+            "game_id",
+            name="uq_route_pokemon_method_game",
         ),
     )
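(Aside: the constraint reformatted above is purely cosmetic — the semantics stay the same: a row is rejected only when the full four-column tuple `(route_id, pokemon_id, encounter_method, game_id)` repeats, so the same pokemon can appear on a route under a different method or game. A simplified stand-in using stdlib `sqlite3` — the real model runs through SQLAlchemy on PostgreSQL, so this is just an illustration of composite-unique behavior:)

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    """
    CREATE TABLE route_encounters (
        id INTEGER PRIMARY KEY,
        route_id INTEGER,
        pokemon_id INTEGER,
        encounter_method TEXT,
        game_id INTEGER,
        UNIQUE (route_id, pokemon_id, encounter_method, game_id)
    )
    """
)
conn.execute("INSERT INTO route_encounters VALUES (NULL, 1, 25, 'walk', 1)")
# Same route and pokemon, different method: allowed by the composite constraint.
conn.execute("INSERT INTO route_encounters VALUES (NULL, 1, 25, 'surf', 1)")
# Exact duplicate of the first tuple: rejected.
try:
    conn.execute("INSERT INTO route_encounters VALUES (NULL, 1, 25, 'walk', 1)")
    duplicate_rejected = False
except sqlite3.IntegrityError:
    duplicate_rejected = True
assert duplicate_rejected
```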

View File

@@ -14,7 +14,6 @@ from app.schemas.encounter import (
     EncounterResponse,
     EncounterUpdate,
 )
-from app.schemas.genlocke import GenlockeCreate, GenlockeResponse, GenlockeLegResponse
 from app.schemas.game import (
     GameCreate,
     GameDetailResponse,
@@ -25,6 +24,7 @@ from app.schemas.game import (
     RouteResponse,
     RouteUpdate,
 )
+from app.schemas.genlocke import GenlockeCreate, GenlockeLegResponse, GenlockeResponse
 from app.schemas.pokemon import (
     BulkImportItem,
     BulkImportResult,
@@ -37,7 +37,13 @@ from app.schemas.pokemon import (
     RouteEncounterResponse,
     RouteEncounterUpdate,
 )
-from app.schemas.run import RunCreate, RunDetailResponse, RunGenlockeContext, RunResponse, RunUpdate
+from app.schemas.run import (
+    RunCreate,
+    RunDetailResponse,
+    RunGenlockeContext,
+    RunResponse,
+    RunUpdate,
+)

 __all__ = [
     "BossBattleCreate",

View File

@@ -6,7 +6,7 @@ Usage:
 import asyncio
 import random
-from datetime import datetime, timedelta, timezone
+from datetime import UTC, datetime, timedelta

 from sqlalchemy import delete, select
 from sqlalchemy.ext.asyncio import AsyncSession
@@ -16,18 +16,52 @@ from app.models.encounter import Encounter
 from app.models.evolution import Evolution
 from app.models.game import Game
 from app.models.nuzlocke_run import NuzlockeRun
-from app.models.pokemon import Pokemon
 from app.models.route import Route

 random.seed(42)  # reproducible data

 # --- Nicknames pool ---
 NICKNAMES = [
-    "Blaze", "Thunder", "Shadow", "Luna", "Spike", "Rex", "Cinder", "Misty",
-    "Rocky", "Breeze", "Fang", "Nova", "Scout", "Atlas", "Pepper", "Storm",
-    "Bandit", "Echo", "Maple", "Titan", "Ziggy", "Bolt", "Rusty", "Pearl",
-    "Ivy", "Ghost", "Sunny", "Dash", "Ember", "Frost", "Jade", "Onyx",
-    "Willow", "Tank", "Pip", "Mochi", "Salem", "Patches", "Bean", "Rocket",
+    "Blaze",
+    "Thunder",
+    "Shadow",
+    "Luna",
+    "Spike",
+    "Rex",
+    "Cinder",
+    "Misty",
+    "Rocky",
+    "Breeze",
+    "Fang",
+    "Nova",
+    "Scout",
+    "Atlas",
+    "Pepper",
+    "Storm",
+    "Bandit",
+    "Echo",
+    "Maple",
+    "Titan",
+    "Ziggy",
+    "Bolt",
+    "Rusty",
+    "Pearl",
+    "Ivy",
+    "Ghost",
+    "Sunny",
+    "Dash",
+    "Ember",
+    "Frost",
+    "Jade",
+    "Onyx",
+    "Willow",
+    "Tank",
+    "Pip",
+    "Mochi",
+    "Salem",
+    "Patches",
+    "Bean",
+    "Rocket",
 ]

 DEATH_CAUSES = [
@@ -129,20 +163,18 @@ async def get_leaf_routes(session: AsyncSession, game_id: int) -> list[Route]:
     """Get routes that can have encounters (no children)."""
     # Get all routes for the game
     result = await session.execute(
-        select(Route)
-        .where(Route.game_id == game_id)
-        .order_by(Route.order)
+        select(Route).where(Route.game_id == game_id).order_by(Route.order)
     )
     all_routes = result.scalars().all()

-    parent_ids = {r.parent_route_id for r in all_routes if r.parent_route_id is not None}
+    parent_ids = {
+        r.parent_route_id for r in all_routes if r.parent_route_id is not None
+    }
     leaf_routes = [r for r in all_routes if r.id not in parent_ids]
     return leaf_routes

-async def get_encounterables(
-    session: AsyncSession, game_id: int
-) -> list[int]:
+async def get_encounterables(session: AsyncSession, game_id: int) -> list[int]:
     """Get pokemon IDs that appear in route encounters for this game."""
     from app.models.route_encounter import RouteEncounter
@@ -157,16 +189,16 @@ async def get_encounterables(
 async def get_evolution_map(session: AsyncSession) -> dict[int, list[int]]:
     """Return {from_pokemon_id: [to_pokemon_id, ...]} for all evolutions."""
-    result = await session.execute(select(Evolution.from_pokemon_id, Evolution.to_pokemon_id))
+    result = await session.execute(
+        select(Evolution.from_pokemon_id, Evolution.to_pokemon_id)
+    )
     evo_map: dict[int, list[int]] = {}
     for from_id, to_id in result:
         evo_map.setdefault(from_id, []).append(to_id)
     return evo_map

-def pick_routes_for_run(
-    leaf_routes: list[Route], progress: float
-) -> list[Route]:
+def pick_routes_for_run(leaf_routes: list[Route], progress: float) -> list[Route]:
     """Pick a subset of leaf routes respecting one-per-group.

     For routes with a parent, only one sibling per parent_route_id is chosen.
@@ -257,74 +289,73 @@ async def inject():
     """Clear existing runs and inject test data."""
     print("Injecting test data...")

-    async with async_session() as session:
-        async with session.begin():
-            # Clear existing runs and encounters
-            await session.execute(delete(Encounter))
-            await session.execute(delete(NuzlockeRun))
-            print("Cleared existing runs and encounters")
-
-            evo_map = await get_evolution_map(session)
-            now = datetime.now(timezone.utc)
-
-            total_runs = 0
-            total_encounters = 0
-
-            for run_def in RUN_DEFS:
-                game = await get_game_by_slug(session, run_def["game_slug"])
-                if game is None:
-                    print(f"  Warning: game '{run_def['game_slug']}' not found, skipping")
-                    continue
-
-                # Build rules
-                rules = {**DEFAULT_RULES, **run_def["rules"]}
-
-                # Compute dates
-                started_at = now - timedelta(days=run_def["started_days_ago"])
-                completed_at = None
-                if run_def["ended_days_ago"] is not None:
-                    completed_at = now - timedelta(days=run_def["ended_days_ago"])
-
-                run = NuzlockeRun(
-                    game_id=game.id,
-                    name=run_def["name"],
-                    status=run_def["status"],
-                    rules=rules,
-                    started_at=started_at,
-                    completed_at=completed_at,
-                )
-                session.add(run)
-                await session.flush()  # get run.id
-
-                # Get routes and pokemon for this game
-                leaf_routes = await get_leaf_routes(session, game.id)
-                pokemon_ids = await get_encounterables(session, game.id)
-                if not leaf_routes or not pokemon_ids:
-                    print(f"  {run_def['name']}: no routes or pokemon, skipping encounters")
-                    total_runs += 1
-                    continue
-
-                chosen_routes = pick_routes_for_run(leaf_routes, run_def["progress"])
-                used_pokemon: set[int] = set()
-                run_encounters = 0
-                for i, route in enumerate(chosen_routes):
-                    enc = generate_encounter(
-                        run.id, route, pokemon_ids, evo_map, used_pokemon, i
-                    )
-                    session.add(enc)
-                    run_encounters += 1
-
-                total_runs += 1
-                total_encounters += run_encounters
-                print(
-                    f"  {run_def['name']} ({game.name}, {run_def['status']}): "
-                    f"{run_encounters} encounters across {len(chosen_routes)} routes"
-                )
+    async with async_session() as session, session.begin():
+        # Clear existing runs and encounters
+        await session.execute(delete(Encounter))
+        await session.execute(delete(NuzlockeRun))
+        print("Cleared existing runs and encounters")
+
+        evo_map = await get_evolution_map(session)
+        now = datetime.now(UTC)
+
+        total_runs = 0
+        total_encounters = 0
+
+        for run_def in RUN_DEFS:
+            game = await get_game_by_slug(session, run_def["game_slug"])
+            if game is None:
+                print(f"  Warning: game '{run_def['game_slug']}' not found, skipping")
+                continue
+
+            # Build rules
+            rules = {**DEFAULT_RULES, **run_def["rules"]}
+
+            # Compute dates
+            started_at = now - timedelta(days=run_def["started_days_ago"])
+            completed_at = None
+            if run_def["ended_days_ago"] is not None:
+                completed_at = now - timedelta(days=run_def["ended_days_ago"])
+
+            run = NuzlockeRun(
+                game_id=game.id,
+                name=run_def["name"],
+                status=run_def["status"],
+                rules=rules,
+                started_at=started_at,
+                completed_at=completed_at,
+            )
+            session.add(run)
+            await session.flush()  # get run.id
+
+            # Get routes and pokemon for this game
+            leaf_routes = await get_leaf_routes(session, game.id)
+            pokemon_ids = await get_encounterables(session, game.id)
+            if not leaf_routes or not pokemon_ids:
+                print(f"  {run_def['name']}: no routes or pokemon, skipping encounters")
+                total_runs += 1
+                continue
+
+            chosen_routes = pick_routes_for_run(leaf_routes, run_def["progress"])
+            used_pokemon: set[int] = set()
+            run_encounters = 0
+            for i, route in enumerate(chosen_routes):
+                enc = generate_encounter(
+                    run.id, route, pokemon_ids, evo_map, used_pokemon, i
+                )
+                session.add(enc)
+                run_encounters += 1
+
+            total_runs += 1
+            total_encounters += run_encounters
+            print(
+                f"  {run_def['name']} ({game.name}, {run_def['status']}): "
+                f"{run_encounters} encounters across {len(chosen_routes)} routes"
+            )

     print(f"\nCreated {total_runs} runs with {total_encounters} total encounters")
     print("Test data injection complete!")
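(Aside on the SIM117 rewrite in this file: `async with async_session() as session, session.begin():` is equivalent to the original nested `async with` blocks — the managers are entered left to right and exited in reverse, exactly as when nested. A sketch with synchronous managers, since the ordering guarantee is the same:)

```python
# Demonstrates that `with a(), b():` preserves the enter/exit order
# of the nested form `with a(): with b():` that ruff's SIM117 collapses.
from contextlib import contextmanager

events = []

@contextmanager
def cm(name):
    events.append(f"enter {name}")
    try:
        yield name
    finally:
        events.append(f"exit {name}")

# Nested form (the code before the lint fix).
with cm("session"):
    with cm("begin"):
        pass
nested_order = list(events)

events.clear()
# Combined form (the code after the lint fix).
with cm("session"), cm("begin"):
    pass
combined_order = list(events)

assert nested_order == combined_order
```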

View File

@@ -21,15 +21,18 @@ async def upsert_version_groups(
    """Upsert version group records, return {slug: id} mapping."""
    for vg_slug, vg_info in vg_data.items():
        vg_name = " / ".join(
            g["name"].replace("Pokemon ", "") for g in vg_info["games"].values()
        )
        stmt = (
            insert(VersionGroup)
            .values(
                name=vg_name,
                slug=vg_slug,
            )
            .on_conflict_do_update(
                index_elements=["slug"],
                set_={"name": vg_name},
            )
        )
        await session.execute(stmt)
@@ -69,9 +72,13 @@ async def upsert_games(
            values["version_group_id"] = vg_id
            update_set["version_group_id"] = vg_id
        stmt = (
            insert(Game)
            .values(**values)
            .on_conflict_do_update(
                index_elements=["slug"],
                set_=update_set,
            )
        )
        await session.execute(stmt)
@@ -81,23 +88,29 @@ async def upsert_games(
    return {row.slug: row.id for row in result}


async def upsert_pokemon(
    session: AsyncSession, pokemon_list: list[dict]
) -> dict[int, int]:
    """Upsert pokemon records, return {pokeapi_id: id} mapping."""
    for poke in pokemon_list:
        stmt = (
            insert(Pokemon)
            .values(
                pokeapi_id=poke["pokeapi_id"],
                national_dex=poke["national_dex"],
                name=poke["name"],
                types=poke["types"],
                sprite_url=poke.get("sprite_url"),
            )
            .on_conflict_do_update(
                index_elements=["pokeapi_id"],
                set_={
                    "national_dex": poke["national_dex"],
                    "name": poke["name"],
                    "types": poke["types"],
                    "sprite_url": poke.get("sprite_url"),
                },
            )
        )
        await session.execute(stmt)
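The hunks above are ruff-format restyling of one recurring pattern: a PostgreSQL upsert built from `insert(...).values(...).on_conflict_do_update(...)`. A minimal runnable sketch of that pattern, transposed to SQLAlchemy's SQLite dialect so it needs no Postgres server (the seed script uses `sqlalchemy.dialects.postgresql.insert`; the table and column names below are illustrative, not from the project):

```python
from sqlalchemy import Column, Integer, String, create_engine, select
from sqlalchemy.dialects.sqlite import insert  # seed script uses the postgresql dialect
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()


class Pokemon(Base):
    __tablename__ = "pokemon"
    id = Column(Integer, primary_key=True)
    pokeapi_id = Column(Integer, unique=True, nullable=False)  # conflict target
    name = Column(String, nullable=False)


engine = create_engine("sqlite://")
Base.metadata.create_all(engine)

with Session(engine) as session:
    for name in ("bulbasaur", "bulbasaur-fixed"):
        # Insert, or update the name when pokeapi_id already exists
        stmt = (
            insert(Pokemon)
            .values(pokeapi_id=1, name=name)
            .on_conflict_do_update(
                index_elements=["pokeapi_id"],
                set_={"name": name},
            )
        )
        session.execute(stmt)
    session.commit()
    rows = session.execute(select(Pokemon.name)).scalars().all()

print(rows)  # a single row, carrying the second (updated) name
```

Running the same statement twice leaves one row whose `name` reflects the last write, which is exactly why the seed can be re-run idempotently.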
@@ -119,14 +132,18 @@ async def upsert_routes(
    """
    # First pass: upsert all parent routes (without parent_route_id)
    for route in routes:
        stmt = (
            insert(Route)
            .values(
                name=route["name"],
                version_group_id=version_group_id,
                order=route["order"],
                parent_route_id=None,  # Parent routes have no parent
            )
            .on_conflict_do_update(
                constraint="uq_routes_version_group_name",
                set_={"order": route["order"], "parent_route_id": None},
            )
        )
        await session.execute(stmt)
@@ -146,19 +163,23 @@ async def upsert_routes(
        parent_id = name_to_id[route["name"]]
        for child in children:
            stmt = (
                insert(Route)
                .values(
                    name=child["name"],
                    version_group_id=version_group_id,
                    order=child["order"],
                    parent_route_id=parent_id,
                    pinwheel_zone=child.get("pinwheel_zone"),
                )
                .on_conflict_do_update(
                    constraint="uq_routes_version_group_name",
                    set_={
                        "order": child["order"],
                        "parent_route_id": parent_id,
                        "pinwheel_zone": child.get("pinwheel_zone"),
                    },
                )
            )
            await session.execute(stmt)
@@ -186,21 +207,25 @@ async def upsert_route_encounters(
            print(f"  Warning: no pokemon_id for pokeapi_id {enc['pokeapi_id']}")
            continue
        stmt = (
            insert(RouteEncounter)
            .values(
                route_id=route_id,
                pokemon_id=pokemon_id,
                game_id=game_id,
                encounter_method=enc["method"],
                encounter_rate=enc["encounter_rate"],
                min_level=enc["min_level"],
                max_level=enc["max_level"],
            )
            .on_conflict_do_update(
                constraint="uq_route_pokemon_method_game",
                set_={
                    "encounter_rate": enc["encounter_rate"],
                    "min_level": enc["min_level"],
                    "max_level": enc["max_level"],
                },
            )
        )
        await session.execute(stmt)
        count += 1
@@ -224,37 +249,44 @@ async def upsert_bosses(
        if after_route_name and route_name_to_id:
            after_route_id = route_name_to_id.get(after_route_name)
            if after_route_id is None:
                print(
                    f"  Warning: route '{after_route_name}' not found for boss '{boss['name']}'"
                )

        # Upsert the boss battle on (version_group_id, order) conflict
        stmt = (
            insert(BossBattle)
            .values(
                version_group_id=version_group_id,
                name=boss["name"],
                boss_type=boss["boss_type"],
                specialty_type=boss.get("specialty_type"),
                badge_name=boss.get("badge_name"),
                badge_image_url=boss.get("badge_image_url"),
                level_cap=boss["level_cap"],
                order=boss["order"],
                after_route_id=after_route_id,
                location=boss["location"],
                section=boss.get("section"),
                sprite_url=boss.get("sprite_url"),
            )
            .on_conflict_do_update(
                constraint="uq_boss_battles_version_group_order",
                set_={
                    "name": boss["name"],
                    "boss_type": boss["boss_type"],
                    "specialty_type": boss.get("specialty_type"),
                    "badge_name": boss.get("badge_name"),
                    "badge_image_url": boss.get("badge_image_url"),
                    "level_cap": boss["level_cap"],
                    "after_route_id": after_route_id,
                    "location": boss["location"],
                    "section": boss.get("section"),
                    "sprite_url": boss.get("sprite_url"),
                },
            )
            .returning(BossBattle.id)
        )
        result = await session.execute(stmt)
        boss_id = result.scalar_one()
@@ -267,13 +299,15 @@ async def upsert_bosses(
            if pokemon_id is None:
                print(f"  Warning: no pokemon_id for pokeapi_id {bp['pokeapi_id']}")
                continue
            session.add(
                BossPokemon(
                    boss_battle_id=boss_id,
                    pokemon_id=pokemon_id,
                    level=bp["level"],
                    order=bp["order"],
                    condition_label=bp.get("condition_label"),
                )
            )
            count += 1

View File

@@ -42,130 +42,139 @@ async def seed():
    """Run the full seed process."""
    print("Starting seed...")

    async with async_session() as session, session.begin():
        # 1. Upsert version groups
        with open(VG_JSON) as f:
            vg_data = json.load(f)
        vg_slug_to_id = await upsert_version_groups(session, vg_data)
        print(f"Version Groups: {len(vg_slug_to_id)} upserted")

        # Build game_slug -> vg_id mapping
        game_slug_to_vg_id: dict[str, int] = {}
        for vg_slug, vg_info in vg_data.items():
            vg_id = vg_slug_to_id[vg_slug]
            for game_slug in vg_info["games"]:
                game_slug_to_vg_id[game_slug] = vg_id

        # 2. Upsert games (with version_group_id)
        games_data = load_json("games.json")
        slug_to_id = await upsert_games(session, games_data, game_slug_to_vg_id)
        print(f"Games: {len(slug_to_id)} upserted")

        # 3. Upsert Pokemon
        pokemon_data = load_json("pokemon.json")
        dex_to_id = await upsert_pokemon(session, pokemon_data)
        print(f"Pokemon: {len(dex_to_id)} upserted")

        # 4. Per version group: upsert routes once, then encounters per game
        total_routes = 0
        total_encounters = 0
        route_maps_by_vg: dict[int, dict[str, int]] = {}
        for vg_slug, vg_info in vg_data.items():
            vg_id = vg_slug_to_id[vg_slug]
            game_slugs = list(vg_info["games"].keys())

            # Use the first game's route JSON for the shared route structure
            first_game_slug = game_slugs[0]
            routes_file = DATA_DIR / f"{first_game_slug}.json"
            if not routes_file.exists():
                print(f"  {vg_slug}: no route data ({first_game_slug}.json), skipping")
                continue
            routes_data = load_json(f"{first_game_slug}.json")
            if not routes_data:
                print(f"  {vg_slug}: empty route data, skipping")
                continue

            # Upsert routes once per version group
            route_map = await upsert_routes(session, vg_id, routes_data)
            route_maps_by_vg[vg_id] = route_map
            total_routes += len(route_map)
            print(f"  {vg_slug}: {len(route_map)} routes")

            # Upsert encounters per game (each game may have different encounters)
            for game_slug in game_slugs:
                game_id = slug_to_id.get(game_slug)
                if game_id is None:
                    print(f"  Warning: game '{game_slug}' not found, skipping")
                    continue
                game_routes_file = DATA_DIR / f"{game_slug}.json"
                if not game_routes_file.exists():
                    continue
                game_routes_data = load_json(f"{game_slug}.json")
                for route in game_routes_data:
                    route_id = route_map.get(route["name"])
                    if route_id is None:
                        print(f"  Warning: route '{route['name']}' not found")
                        continue
                    # Parent routes may have empty encounters
                    if route["encounters"]:
                        enc_count = await upsert_route_encounters(
                            session,
                            route_id,
                            route["encounters"],
                            dex_to_id,
                            game_id,
                        )
                        total_encounters += enc_count

                    # Handle child routes
                    for child in route.get("children", []):
                        child_id = route_map.get(child["name"])
                        if child_id is None:
                            print(
                                f"  Warning: child route '{child['name']}' not found"
                            )
                            continue
                        enc_count = await upsert_route_encounters(
                            session,
                            child_id,
                            child["encounters"],
                            dex_to_id,
                            game_id,
                        )
                        total_encounters += enc_count
                print(f"  {game_slug}: encounters loaded")

        print(f"\nTotal routes: {total_routes}")
        print(f"Total encounters: {total_encounters}")

        # 5. Per version group: upsert bosses
        total_bosses = 0
        for vg_slug, vg_info in vg_data.items():
            vg_id = vg_slug_to_id[vg_slug]
            first_game_slug = list(vg_info["games"].keys())[0]
            bosses_file = DATA_DIR / f"{first_game_slug}-bosses.json"
            if not bosses_file.exists():
                continue
            bosses_data = load_json(f"{first_game_slug}-bosses.json")
            if not bosses_data:
                continue
            route_name_to_id = route_maps_by_vg.get(vg_id, {})
            boss_count = await upsert_bosses(
                session, vg_id, bosses_data, dex_to_id, route_name_to_id
            )
            total_bosses += boss_count
            print(f"  {vg_slug}: {boss_count} bosses")

        print(f"Total bosses: {total_bosses}")

        # 6. Upsert evolutions
        evolutions_path = DATA_DIR / "evolutions.json"
        if evolutions_path.exists():
            evolutions_data = load_json("evolutions.json")
            evo_count = await upsert_evolutions(session, evolutions_data, dex_to_id)
            print(f"Evolutions: {evo_count} upserted")
        else:
            print("No evolutions.json found, skipping evolutions")

    print("Seed complete!")
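The headline change in this hunk is ruff's SIM117 fix: the nested `async with async_session() as session:` / `async with session.begin():` pair becomes a single combined `async with` statement. A minimal sketch showing the two forms are equivalent, using stand-in context managers rather than the real session objects (the names below are illustrative only):

```python
from contextlib import contextmanager

events: list[str] = []


@contextmanager
def managed(name: str):
    # Record enter/exit order so both forms can be compared
    events.append(f"enter {name}")
    yield name
    events.append(f"exit {name}")


# Nested form (what the seed script used before the fix)
with managed("session"):
    with managed("transaction"):
        events.append("work")

nested = list(events)
events.clear()

# Combined form (what ruff's SIM117 fix produces)
with managed("session"), managed("transaction"):
    events.append("work")

assert events == nested  # identical enter/work/exit ordering either way
```

The combined form enters left to right and exits right to left, exactly like the nested version, which is why the fix only changes indentation, not behavior.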
@@ -180,7 +189,9 @@ async def verify():
        games_count = (await session.execute(select(func.count(Game.id)))).scalar()
        pokemon_count = (await session.execute(select(func.count(Pokemon.id)))).scalar()
        routes_count = (await session.execute(select(func.count(Route.id)))).scalar()
        enc_count = (
            await session.execute(select(func.count(RouteEncounter.id)))
        ).scalar()
        boss_count = (await session.execute(select(func.count(BossBattle.id)))).scalar()

        print(f"Version Groups: {vg_count}")
@@ -328,7 +339,7 @@ async def _export_routes(session: AsyncSession, vg_data: dict):
    games_by_slug = {g.slug: g for g in game_result.scalars().all()}

    exported = 0
    for _vg_slug, vg_info in vg_data.items():
        for game_slug in vg_info["games"]:
            game = games_by_slug.get(game_slug)
            if game is None or game.version_group_id is None:
@@ -356,11 +367,9 @@ async def _export_routes(session: AsyncSession, vg_data: dict):
                if r.parent_route_id is not None:
                    children_by_parent.setdefault(r.parent_route_id, []).append(r)

            def format_encounters(route: Route, _game: Game = game) -> list[dict]:
                game_encounters = [
                    enc for enc in route.route_encounters if enc.game_id == _game.id
                ]
                return [
                    {
@@ -384,17 +393,20 @@ async def _export_routes(session: AsyncSession, vg_data: dict):
                    data["pinwheel_zone"] = route.pinwheel_zone
                return data

            def format_route(
                route: Route,
                _children_by_parent: dict[int, list[Route]] = children_by_parent,
            ) -> dict:
                data: dict = {
                    "name": route.name,
                    "order": route.order,
                    "encounters": format_encounters(route),
                }
                children = _children_by_parent.get(route.id, [])
                if children:
                    data["children"] = [
                        format_child(c)
                        for c in sorted(children, key=lambda route: route.order)
                    ]
                return data
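The `_game: Game = game` and `_children_by_parent=...` default arguments in the two hunks above are the standard fix for ruff B023 (a closure capturing a loop variable binds it late). A minimal sketch of the pitfall and the fix, with plain integers standing in for the ORM objects:

```python
# Late-bound closures: each lambda looks up `i` when called, after the loop ended
late_bound = [lambda: i for i in range(3)]

# Default-argument fix: `i=i` snapshots the current value at definition time
pinned = [lambda i=i: i for i in range(3)]

print([f() for f in late_bound])  # [2, 2, 2]
print([f() for f in pinned])      # [0, 1, 2]
```

The export code has the same shape: `format_encounters` is defined inside a loop over games, so without the default argument every closure would see whichever `game` the loop finished on.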
@@ -444,7 +456,9 @@ def _download_image(
    if filename not in downloaded:
        output_dir.mkdir(parents=True, exist_ok=True)
        req = urllib.request.Request(
            url, headers={"User-Agent": "nuzlocke-tracker/1.0"}
        )
        try:
            with urllib.request.urlopen(req, timeout=30) as resp:
                dest.write_bytes(resp.read())
@@ -496,37 +510,45 @@ async def _export_bosses(session: AsyncSession, vg_data: dict):
            if badge_image_url and b.badge_name:
                badge_slug = _slugify(b.badge_name)
                badge_image_url = _download_image(
                    badge_image_url,
                    badge_dir,
                    badge_slug,
                    downloaded_badges,
                )
            if sprite_url:
                sprite_slug = _slugify(b.name)
                sprite_url = _download_image(
                    sprite_url,
                    sprite_dir,
                    sprite_slug,
                    downloaded_sprites,
                )
            data.append(
                {
                    "name": b.name,
                    "boss_type": b.boss_type,
                    "specialty_type": b.specialty_type,
                    "badge_name": b.badge_name,
                    "badge_image_url": badge_image_url,
                    "level_cap": b.level_cap,
                    "order": b.order,
                    "after_route_name": b.after_route.name if b.after_route else None,
                    "location": b.location,
                    "section": b.section,
                    "sprite_url": sprite_url,
                    "pokemon": [
                        {
                            "pokeapi_id": bp.pokemon.pokeapi_id,
                            "pokemon_name": bp.pokemon.name,
                            "level": bp.level,
                            "order": bp.order,
                        }
                        for bp in sorted(b.pokemon, key=lambda p: p.order)
                    ],
                }
            )
        _write_json(f"{first_game_slug}-bosses.json", data)
        exported += 1

backup.sh (new executable file, 33 lines)
View File

@@ -0,0 +1,33 @@
#!/usr/bin/env bash
set -euo pipefail
# ── Configuration ──────────────────────────────────────────────
DEPLOY_DIR="/mnt/user/appdata/nuzlocke-tracker"
BACKUP_DIR="${DEPLOY_DIR}/backups"
RETENTION_DAYS=7
DB_SERVICE="db"
DB_NAME="nuzlocke"
DB_USER="postgres"
TIMESTAMP=$(date +%Y%m%d-%H%M%S)
BACKUP_FILE="${BACKUP_DIR}/nuzlocke-${TIMESTAMP}.sql.gz"
# ── Create backup directory ───────────────────────────────────
mkdir -p "$BACKUP_DIR"
# ── Dump database ─────────────────────────────────────────────
cd "$DEPLOY_DIR"
docker compose exec -T "$DB_SERVICE" pg_dump -U "$DB_USER" "$DB_NAME" | gzip > "$BACKUP_FILE"
echo "Backup created: ${BACKUP_FILE}"
# ── Rotate old backups ────────────────────────────────────────
find "$BACKUP_DIR" -name "nuzlocke-*.sql.gz" -mtime +${RETENTION_DAYS} -delete
REMAINING=$(find "$BACKUP_DIR" -name "nuzlocke-*.sql.gz" | wc -l)
echo "Backups retained: ${REMAINING}"
# ── Restore procedure ────────────────────────────────────────
# To restore from a backup:
# cd /mnt/user/appdata/nuzlocke-tracker
# gunzip -c backups/nuzlocke-YYYYMMDD-HHMMSS.sql.gz | \
# docker compose exec -T db psql -U postgres nuzlocke

View File

@@ -55,10 +55,13 @@ done
info "All images built and pushed."

# ── Sync compose file to Unraid ──────────────────────────────────
info "Copying docker-compose.prod.yml and backup.sh to Unraid..."
scp docker-compose.prod.yml "${UNRAID_SSH}:${UNRAID_DEPLOY_DIR}/docker-compose.yml" \
    || error "Failed to copy compose file to Unraid."
scp backup.sh "${UNRAID_SSH}:${UNRAID_DEPLOY_DIR}/backup.sh" \
    || error "Failed to copy backup script to Unraid."
ssh "${UNRAID_SSH}" "chmod +x '${UNRAID_DEPLOY_DIR}/backup.sh'"
info "Compose file and backup script synced."

# ── Ensure .env with Postgres password exists ────────────────────
info "Checking for .env on Unraid..."

View File

@@ -21,5 +21,13 @@ export default defineConfig([
      ecmaVersion: 2020,
      globals: globals.browser,
    },
    rules: {
      'react-refresh/only-export-components': [
        'warn',
        { allowConstantExport: true },
      ],
      'react-hooks/set-state-in-effect': 'off',
      'react-hooks/preserve-manual-memoization': 'off',
    },
  },
])