Compare commits

..

121 Commits

Author SHA1 Message Date
2a7881ac6e chore: bump version to 2.0.0
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 1m7s
2026-03-24 12:56:40 +01:00
0950018b38 fix: add autoComplete=off on password fields to suppress WebKit autofill error
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-24 12:49:02 +01:00
bc796f4ee5 feat: multi-user reading progress & backoffice impersonation
- Scope all reading progress (books, series, stats) by user via
  Option<Extension<AuthUser>> — admin sees aggregate, read token sees own data
- Fix duplicate book rows when admin views lists (IS NOT NULL guard on JOIN)
- Add X-As-User header support: admin can impersonate any user from backoffice
- UserSwitcher dropdown in nav header (persisted via as_user_id cookie)
- Per-user filter pills on "Currently reading" and "Recently read" dashboard sections
- Inline username editing (UsernameEdit component with optimistic update)
- PATCH /admin/users/:id endpoint to rename a user
- Unassigned read tokens row in users table
- Komga sync now requires a user_id — reading progress attributed to selected user
- Migration 0051: add user_id column to komga_sync_reports
- Nav breakpoints: icons-only from md, labels from xl, hamburger until md

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-24 12:47:58 +01:00
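The per-request user scoping described above can be sketched as follows. The X-As-User header, admin-aggregate view, and read-token behavior come from the commit text; the function shape and names are hypothetical (the real implementation is Rust/axum extractors).

```typescript
// Hypothetical sketch: resolve which user's reading progress a request sees.
// Assumption: admin with no X-As-User header gets the aggregate (null = all users).
type AuthUser = { id: number; isAdmin: boolean };

function effectiveUserId(auth: AuthUser | null, asUserHeader: string | null): number | null {
  if (auth === null) return null; // unauthenticated: no scoping decision here
  if (auth.isAdmin) {
    // Admin may impersonate any user via the X-As-User header.
    return asUserHeader !== null ? Number(asUserHeader) : null; // null = aggregate view
  }
  return auth.id; // read token: always sees its own data
}
```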
232ecdda41 feat: add backoffice authentication with login page
- Add login page with logo background, glassmorphism card
- Add session management via JWT (jose) with httpOnly cookie
- Add Next.js proxy middleware to protect all routes
- Add logout button in nav
- Restructure app into (app) route group to isolate login layout
- Add ADMIN_USERNAME, ADMIN_PASSWORD, SESSION_SECRET env vars

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-24 08:48:01 +01:00
32d13984a1 chore: bump version to 1.28.0
2026-03-23 19:00:30 +01:00
eab7f2e21b feat: filter metadata refresh to ongoing series & improve job action buttons
- Metadata refresh now skips series with ended/cancelled status
- Add xs size to Button component
- Unify view/cancel button sizes (h-7) with icons (eye & cross)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-23 18:59:33 +01:00
b6422fbf3e feat: enhance jobs list stats with tooltips, icons, and refresh count
- Add Tooltip UI component for styled hover tooltips
- Replace native title attributes with Tooltip on all job stats
- Add refresh icon (green) showing actual refreshed count for metadata refresh
- Add icon+tooltip to scanned files stat
- Add icon prop to StatBox component
- Add refreshed field to stats_json types
- Distinct tooltip labels for total links vs refreshed count

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-23 18:56:42 +01:00
6dbd0c80e6 feat: improve Telegram notification UI with better formatting
Add visual separators, contextual emojis, bold labels, structured
result sections, and conditional error lines for cleaner messages.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-23 18:46:25 +01:00
0c42a9ed04 fix: add API job poller to process scheduler-created metadata jobs
The scheduler (indexer) created metadata_refresh/metadata_batch jobs in the DB,
but the indexer excluded them (API_ONLY_JOB_TYPES) and the API only processed
jobs created via its own REST endpoints, so scheduler-created jobs stayed
pending forever.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-22 21:05:42 +01:00
95a6e54d06 chore: bump version to 1.27.1 2026-03-22 21:05:23 +01:00
e26219989f feat: add job runs chart and scrollable reading lists on dashboard
- Add multi-line chart showing job runs over time by type (scan,
  rebuild, thumbnails, other) with the same day/week/month toggle
- Limit currently reading and recently read lists to 3 visible items
  with a scrollbar for overflow
- Fix NUMERIC→BIGINT cast for SUM/COALESCE in jobs SQL queries

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-22 10:43:45 +01:00
5d33a35407 chore: bump version to 1.27.0 2026-03-22 10:43:25 +01:00
d53572dc33 chore: bump version to 1.26.0
2026-03-22 10:27:59 +01:00
cf1953d11f feat: add day/week/month period toggle for dashboard line charts
Add a period selector (day, week, month) to the reading activity and
books added charts. The API now accepts a ?period= query param and
returns gap-filled data using generate_series so all time slots appear
even with zero values. Labels are locale-aware (short month, weekday).

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-22 10:27:24 +01:00
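The commit says the API does this gap filling in SQL with generate_series. A minimal TypeScript equivalent of the same idea, for illustration only (function name and data shape are assumptions):

```typescript
// Fill missing daily buckets with zero so every time slot appears in the chart.
type Point = { date: string; count: number };

function fillDailyGaps(points: Point[], from: Date, to: Date): Point[] {
  const byDate = new Map(points.map((p) => [p.date, p.count]));
  const out: Point[] = [];
  // Walk one UTC day at a time, inclusive of both endpoints.
  for (let d = new Date(from); d <= to; d.setUTCDate(d.getUTCDate() + 1)) {
    const key = d.toISOString().slice(0, 10); // YYYY-MM-DD bucket key
    out.push({ date: key, count: byDate.get(key) ?? 0 }); // zero for empty slots
  }
  return out;
}
```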
6f663eaee7 docs: add MIT license
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-22 10:08:15 +01:00
ee65c6263a perf: add ETag and server-side caching for thumbnail proxy
Add ETag header to API thumbnail responses for 304 Not Modified support.
Forward If-None-Match/ETag through the Next.js proxy route handler and
add next.revalidate for 24h server-side fetch caching to reduce
SSR-to-API round trips on the libraries page.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-22 06:52:47 +01:00
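The conditional-request handshake this commit adds can be sketched as below. Only the If-None-Match/304 mechanics come from the commit; the ETag format is an assumption.

```typescript
// Sketch of the 304 Not Modified decision (ETag format is hypothetical).
function thumbnailEtag(bookId: number, mtimeMs: number): string {
  return `"thumb-${bookId}-${mtimeMs}"`; // stable per book + thumbnail file mtime
}

function conditionalStatus(ifNoneMatch: string | null, etag: string): 200 | 304 {
  // 304 when the client already holds the current bytes; 200 otherwise.
  return ifNoneMatch === etag ? 304 : 200;
}
```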
691b6b22ab chore: bump version to 1.25.0 2026-03-22 06:52:02 +01:00
11c80a16a3 docs: add Telegram notifications and updated dashboard to README and FEATURES
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-22 06:40:34 +01:00
c366b44c54 chore: bump version to 1.24.1 2026-03-22 06:39:23 +01:00
92f80542e6 perf: skip Next.js image re-optimization and stream proxy responses
Thumbnails are already optimized (WebP) by the API, so disable Next.js
image optimization to avoid redundant CPU work. Switch route handlers
from buffering (arrayBuffer) to streaming (response.body) to reduce
memory usage and latency.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-22 06:38:46 +01:00
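The buffering-to-streaming switch can be sketched with the web Response API; this is a simplified shape, not the actual route handler.

```typescript
// Pass the upstream body through as a stream instead of buffering it
// with arrayBuffer() first — lower memory use and time-to-first-byte.
function streamProxy(upstream: Response): Response {
  return new Response(upstream.body, {
    status: upstream.status,
    headers: {
      "content-type": upstream.headers.get("content-type") ?? "application/octet-stream",
    },
  });
}
```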
3a25e42a20 chore: bump version to 1.24.0
2026-03-22 06:31:56 +01:00
24763bf5a7 fix: show absolute date/time in jobs "created" column
Replace relative time formatting (which incorrectly showed "just now"
for many jobs due to negative time diffs from server/client timezone
mismatch) with absolute locale-formatted date/time.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-22 06:31:37 +01:00
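The replacement formatting can be sketched like this; the specific Intl option choices below are assumptions, only "absolute locale-formatted date/time" comes from the commit.

```typescript
// Absolute, locale-explicit formatting instead of relative "just now" labels.
function formatJobDate(iso: string, locale: string): string {
  return new Date(iso).toLocaleString(locale, {
    dateStyle: "medium",
    timeStyle: "short",
    timeZone: "UTC", // pinning the zone also sidesteps server/client drift (assumption)
  });
}
```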
08f0397029 feat: add reading stats and replace dashboard charts with recharts
Add currently reading, recently read, and reading activity sections to
the dashboard. Replace all custom SVG/CSS charts with recharts library
(donut, area, stacked bar, horizontal bar). Reorganize layout: libraries
and popular series side by side, books added chart full width below.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-22 06:26:45 +01:00
766e3a01b2 chore: bump version to 1.23.0
2026-03-21 17:43:11 +01:00
626e2e035d feat: send book thumbnails in Telegram notifications
Use the Telegram sendPhoto API for conversion and metadata-approved events
when a book thumbnail is available on disk. Falls back to a text message
if the photo upload fails.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-21 17:43:01 +01:00
cfd2321db2 chore: bump version to 1.22.0 2026-03-21 17:40:22 +01:00
1b715033ce fix: add missing Next.js route handler for Telegram test endpoint
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-21 17:39:46 +01:00
81d1586501 feat: add Telegram notification system with granular event toggles
Add notifications crate shared between API and indexer to send Telegram
messages on scan/thumbnail/conversion completion/failure, metadata linking,
batch and refresh events. Configurable via a new Notifications tab in the
backoffice settings with per-event toggle switches grouped by category.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-21 17:24:43 +01:00
bd74c9e3e3 docs: add comprehensive features list to README and docs/FEATURES.md
Replace the minimal README features section with a concise categorized
summary and link to a detailed docs/FEATURES.md covering all features,
business rules, API endpoints, and integrations.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-21 14:34:36 +01:00
41228430cf chore: bump version to 1.21.2 2026-03-21 14:34:32 +01:00
6a4ba06fac fix: include series_metadata authors in authors listing and detail pages
Authors were only sourced from the books.authors/books.author fields, which
are often empty. The endpoint now also aggregates authors from
series_metadata.authors (populated by metadata providers like bedetheque).
Adds an author filter to the /series endpoint and updates the author detail
page to use it.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-21 14:34:11 +01:00
e5c3542d3f refactor: split books.rs into books+series, reorganize OpenAPI tags and fix access control
- Extract series code from books.rs into dedicated series.rs module
- Reorganize OpenAPI tags: split overloaded "books" tag into books, series, search, stats
- Add missing endpoints to OpenAPI: metadata_batch, metadata_refresh, komga, update_metadata_provider
- Add missing schemas: MissingVolumeInput, Komga/Batch/Refresh DTOs
- Fix access control: move GET /libraries and POST /libraries/:id/scan to read routes
  so non-admin tokens can list libraries and trigger scans

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-21 14:23:19 +01:00
24516f1069 chore: bump version to 1.21.1
2026-03-21 13:42:17 +01:00
5383cdef60 feat: allow batch metadata and refresh metadata on all libraries
When no specific library is selected, iterate over all libraries and
trigger a job for each one, skipping libraries with metadata disabled.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-21 13:42:08 +01:00
be5c3f7a34 fix: pass explicit locale to date formatting to prevent hydration mismatch
Server and client could use different default locales for
toLocaleDateString/toLocaleString, causing React hydration errors.
Pass the user locale explicitly in JobsList and SettingsPage.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-21 13:36:35 +01:00
caa9922ff9 chore: bump version to 1.21.0 2026-03-21 13:34:47 +01:00
135f000c71 refactor: switch JobsIndicator from polling to SSE and fix stream endpoint
Replace fetch polling in JobsIndicator with EventSource connected to
/api/jobs/stream. Fix the SSE route to return all jobs (via
/index/status) instead of only active ones, since JobsList also
consumes this stream for the full job history. JobsIndicator now
filters active jobs client-side. SSE server-side uses adaptive
interval (2s active, 15s idle) and only sends when data changes.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-21 13:33:58 +01:00
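The server-side pacing described in this commit can be sketched as two small pure functions; names and shapes are hypothetical, the 2s/15s intervals and change-only sending come from the commit text.

```typescript
// Adaptive SSE pacing: tight loop while jobs run, slow when idle.
function nextSseDelayMs(activeJobs: number): number {
  return activeJobs > 0 ? 2_000 : 15_000; // 2s active, 15s idle
}

// Only push an SSE event when the serialized payload actually changed.
function shouldSend(prev: string | null, next: string): boolean {
  return prev !== next;
}
```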
d9e50a4235 chore: bump version to 1.20.1
2026-03-21 13:13:39 +01:00
5f6eb5a5cb perf: add selective fetch caching for stable API endpoints
Make apiFetch support Next.js revalidate option instead of
hardcoding cache: no-store on every request. Stable endpoints
(libraries, settings, stats, series statuses) now use time-based
revalidation while dynamic data (books, search, jobs) stays uncached.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-21 13:13:28 +01:00
41c77fca2e chore: bump version to 1.20.0
2026-03-21 13:06:28 +01:00
49621f3fb1 perf: wrap BookCard and BookImage with React.memo
Prevent unnecessary re-renders of book grid items when parent
components update without changing book data.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-21 13:03:24 +01:00
6df743b2e6 perf: lazy-load heavy modal components with next/dynamic
Dynamic import EditBookForm, EditSeriesForm, MetadataSearchModal, and
ProwlarrSearchModal so their code is split into separate chunks and
only fetched when the user interacts with them.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-21 13:02:10 +01:00
edfefc0128 perf: optimize JobsIndicator polling with visibility API and adaptive interval
Pause polling when the tab is hidden, refetch immediately when it
becomes visible again, and use a 30s interval when no jobs are active
instead of polling every 2s unconditionally.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-21 12:59:06 +01:00
b0185abefe perf: enable Next.js image optimization across backoffice
Remove `unoptimized` flag from all thumbnail/cover Image components
and add proper responsive `sizes` props. Convert raw `<img>` tags on
the libraries page to next/image. Add 24h minimumCacheTTL for
optimized images. BookPreview keeps `unoptimized` since the API
already returns optimized WebP.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-21 12:57:10 +01:00
b9e54cbfd8 chore: bump version to 1.19.1
2026-03-21 12:47:31 +01:00
3f0bd783cd feat: include series_count and thumbnail_book_ids in libraries API response
Eliminates N+1 sequential fetchSeries calls on the libraries page by
returning series count and up to 5 thumbnail book IDs (one per series)
directly from GET /libraries.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-21 12:47:10 +01:00
fc8856c83f chore: bump version to 1.19.0
2026-03-21 08:12:19 +01:00
bd09f3d943 feat: persist filter state in localStorage across pages
Save/restore filter values in LiveSearchForm using localStorage keyed
by basePath (e.g. filters:/books, filters:/series). Filters are restored
on mount when the URL has no active filters, and cleared when the user
clicks the Clear button.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-21 08:12:10 +01:00
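The save/restore logic above can be sketched as follows. The `filters:<basePath>` key scheme and "URL filters win" rule come from the commit; the injected storage interface (so the same code works with window.localStorage or a test double) is an implementation assumption.

```typescript
type StringStore = {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
};

function saveFilters(store: StringStore, basePath: string, filters: Record<string, string>): void {
  store.setItem(`filters:${basePath}`, JSON.stringify(filters)); // e.g. filters:/books
}

function restoreFilters(store: StringStore, basePath: string, urlHasFilters: boolean): Record<string, string> | null {
  if (urlHasFilters) return null; // active URL filters win over persisted ones
  const raw = store.getItem(`filters:${basePath}`);
  return raw === null ? null : (JSON.parse(raw) as Record<string, string>);
}
```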
1f434c3d67 feat: add format and metadata filters to books page
Add two new filters to the books listing page:
- Format filter (CBZ/CBR/PDF/EPUB) using existing API support
- Metadata linked/unlinked filter with new API support via
  LEFT JOIN on external_metadata_links (using DISTINCT ON CTE
  matching the series endpoint pattern)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-21 08:09:37 +01:00
4972a403df chore: bump version to 1.18.0
2026-03-21 07:47:52 +01:00
629708cdd0 feat: redesign libraries page UI with fan thumbnails and modal settings
- Replace thumbnail mosaic with fan/arc layout using series covers as background
- Move library settings from dropdown to full-page portal modal with sections
- Move FolderPicker modal to portal for proper z-index stacking
- Add descriptions to each setting for better clarity
- Move delete button to card header, compact config tags
- Add i18n keys for new labels and descriptions (en/fr)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-21 07:47:36 +01:00
560087a897 chore: bump version to 1.17.0
2026-03-21 07:23:52 +01:00
27f553b005 feat: add rescan job type and improve full rebuild UX
Add "Deep rescan" job type that clears directory mtimes to force
re-walking all directories, discovering newly supported formats (e.g.
EPUB) without deleting existing data or metadata.

Also improve full rebuild button: red destructive styling instead of
warning, and FR description explicitly mentions metadata/reading status
loss. Rename FR rebuild label to "Mise à jour".

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-21 07:23:38 +01:00
ed7665248e chore: bump version to 1.16.0
2026-03-21 07:06:28 +01:00
736b8aedc0 feat: add EPUB format support with spine-aware image extraction
Parse EPUB structure (container.xml → OPF → spine → XHTML) to extract
images in reading order. Zero new dependencies — reuses zip + regex
crates with pre-compiled regexes and per-file index cache for
performance. Falls back to CBZ-style image listing when spine contains
no images. Includes DB migration, API/indexer/backoffice updates.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-21 07:05:47 +01:00
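A tiny sketch of the regex-based spine walk this commit describes. The real implementation is Rust with pre-compiled regexes; the patterns below are illustrative only, not the actual ones.

```typescript
// Extract spine itemref idrefs from an OPF document, in reading order.
function spineIdrefs(opfXml: string): string[] {
  const spine = opfXml.match(/<spine[\s\S]*?<\/spine>/)?.[0] ?? "";
  return [...spine.matchAll(/idref="([^"]+)"/g)].map((m) => m[1]);
}
```

Each idref is then resolved against the manifest to an XHTML file, whose images are collected in order; when the spine yields no images, the commit falls back to CBZ-style image listing.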
3daa49ae6c feat: add live refresh to job detail page via SSE
The job detail page was only server-rendered with no live updates,
unlike the jobs list page. Add a lightweight JobDetailLive client
component that subscribes to the existing SSE endpoint and calls
router.refresh() on each update, keeping the page in sync while
a job is running.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-21 06:52:57 +01:00
5fb24188e1 chore: bump version to 1.15.0
2026-03-20 13:35:36 +01:00
54f972db17 chore: bump version to 1.14.0
2026-03-20 12:48:14 +01:00
acd8b62382 chore: bump version to 1.13.0 2026-03-20 12:44:54 +01:00
cc65e3d1ad feat: highlight missing volumes in Prowlarr search results
API extracts volume numbers from release titles and matches them against
missing volumes sent by the frontend. Matched results are highlighted in
green with badges indicating which missing volumes were found.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-20 12:44:35 +01:00
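The volume extraction and matching can be sketched like this; the regex is an assumption for illustration, not the API's actual pattern.

```typescript
// Pull volume numbers out of a release title (e.g. "T05", "Tome 12", "Vol. 3")
// and intersect them with the series' missing-volume list.
function matchMissingVolumes(releaseTitle: string, missing: number[]): number[] {
  const found = new Set<number>();
  for (const m of releaseTitle.matchAll(/\b(?:vol(?:ume)?|tome|t)\.?\s*0*(\d+)/gi)) {
    found.add(Number(m[1]));
  }
  return missing.filter((v) => found.has(v));
}
```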
70889ca955 chore: bump version to 1.12.0
2026-03-20 11:43:34 +01:00
4ad6d57271 feat: add authors page to backoffice with dedicated API endpoint
Add a new GET /authors endpoint that aggregates unique authors from books
with book/series counts, pagination and search. Add author filter to
GET /books. Backoffice gets a list page with search/sort and a detail
page showing the author's series and books.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-20 11:43:22 +01:00
fe5de3d5c1 feat: add scheduled metadata refresh for libraries
Add metadata_refresh_mode (manual/hourly/daily/weekly) to libraries,
with automatic scheduling via the indexer. Includes API support,
backoffice UI controls, i18n translations, and DB migration.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-20 10:51:52 +01:00
5a224c48c0 chore: bump version to 1.11.1
2026-03-20 10:46:34 +01:00
d08fe31b1b fix: pass metadata_refresh_mode through backoffice proxy to API
The Next.js monitoring route was dropping metadata_refresh_mode from the
request body, so the value was never forwarded to the Rust API and
reverted on reload.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-20 10:46:22 +01:00
4d69ed91c5 chore: bump version to 1.11.0
2026-03-20 09:46:29 +01:00
c6ddd3e6c7 chore: bump version to 1.10.1
2026-03-19 22:33:52 +01:00
504185f31f feat: add editable search input to Prowlarr modal with scrollable badges
- Add text input for custom search queries in Prowlarr modal
- Quick search badges pre-fill the input and trigger search
- Default query uses quoted series name for exact match
- Add custom_query support to backend API
- Limit badge area height with vertical scroll
- Add debug logging for Prowlarr API responses

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-19 22:33:40 +01:00
acd0cce3f8 fix: reorder Prowlarr button, add collection progress bar, remove redundant missing badge
- Move Prowlarr search button before Metadata button
- Add amber collection progress bar showing owned/expected books ratio
- Remove yellow missing count badge from MetadataSearchModal (now shown in progress bar)
- Fix i18n plural parameter for series read count

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-19 22:17:49 +01:00
e14da4fc8d chore: bump version to 1.10.0
2026-03-19 21:51:45 +01:00
c04d4fb618 feat: add qBittorrent download client integration
Send Prowlarr search results directly to qBittorrent from the modal.
Backend authenticates via SID cookie (login + add torrent endpoints).

- Backend: qbittorrent module with add and test endpoints
- Migration: add qbittorrent settings (url, username, password)
- Settings UI: qBittorrent config card with test connection
- ProwlarrSearchModal: send-to-qBittorrent button per result row
  with spinner/checkmark state progression
- Button only shown when qBittorrent is configured

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-19 21:51:28 +01:00
57bc82703d feat: add Prowlarr integration for manual release search
Add Prowlarr indexer integration (step 1: config + manual search).
Allows searching for comics/ebooks releases on Prowlarr indexers
directly from the series detail page, with download links and
per-volume search for missing books.

- Backend: new prowlarr module with search and test endpoints
- Migration: add prowlarr settings (url, api_key, categories)
- Settings UI: Prowlarr config card with test connection button
- ProwlarrSearchModal: auto-search on open, missing volumes shortcuts
- Fix series.readCount i18n plural parameter on series pages

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-19 21:43:34 +01:00
e6aa7ebed0 chore: bump version to 1.9.2
2026-03-19 13:22:41 +01:00
c44b51d6ef fix: unmap status mappings instead of deleting, store unmapped provider statuses
- Make mapped_status nullable so unmapping (X button) sets NULL instead of
  deleting the row — provider statuses never disappear from the UI
- normalize_series_status now returns the raw provider status (lowercased)
  when no mapping exists, so all statuses are stored in series_metadata
- Fix series_statuses query crash caused by NULL mapped_status values
- Fix metadata batch/refresh server actions crashing page on 400 errors
- StatusMappingDto.mapped_status is now string | null in the backoffice

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-19 13:22:31 +01:00
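The post-fix normalization behavior can be sketched as follows. The nullable mapping and raw-lowercased pass-through come from the commit; the shapes are hypothetical (the real code is Rust).

```typescript
// Unmapped provider statuses pass through lowercased instead of being dropped.
// A NULL mapping (unmapped via the X button) behaves like no mapping at all.
function normalizeSeriesStatus(raw: string, mappings: Map<string, string | null>): string {
  const key = raw.trim().toLowerCase();
  const mapped = mappings.get(key);
  return mapped ?? key; // ?? covers both "no row" (undefined) and "unmapped" (null)
}
```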
d4c48de780 chore: bump version to 1.9.1 2026-03-19 12:59:31 +01:00
8948f75d62 fix: ignore unknown provider statuses instead of storing them
normalize_series_status now returns None when no mapping exists,
so unknown provider statuses won't pollute series_metadata.status.
Users can see unmapped statuses in Settings and assign them before
they get stored.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-19 12:58:55 +01:00
d304877a83 fix: re-normalize series statuses with UI-added mappings
Migration 0041 re-applies status normalization using all current
status_mappings entries, including those added via the UI after the
initial migration 0039 (e.g. "one shot" → "ended").

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-19 12:57:14 +01:00
9cec32ba3e fix: normalize series status casing to avoid duplicates
- LOWER() all series_metadata.status values in the statuses endpoint
  to prevent "One shot" / "one shot" appearing as separate targets
- Migration 0040: lowercase all existing status values in DB
- Use LOWER() in series status filter queries for consistency

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-19 12:56:02 +01:00
e8768dfad7 chore: bump version to 1.9.0 2026-03-19 12:44:30 +01:00
cfc98819ab feat: add configurable status mappings for metadata providers
Add a status_mappings table to replace hardcoded provider status
normalization. Users can now configure how provider statuses (e.g.
"releasing", "finie") map to target statuses (e.g. "ongoing", "ended")
via the Settings > Integrations page.

- Migration 0038: status_mappings table with pre-seeded mappings
- Migration 0039: re-normalize existing series_metadata.status values
- API: CRUD endpoints for status mappings, DB-based normalize function
- API: new GET /series/provider-statuses endpoint
- Backoffice: StatusMappingsCard component with create target, assign,
  and delete capabilities
- Fix all clippy warnings across the API crate
- Fix missing OpenAPI schema refs (MetadataStats, ProviderCount)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-19 12:44:22 +01:00
bfc1c76fe2 chore: repair deploy
2026-03-19 11:19:50 +01:00
39e9f35acb chore: push deploy stack local with dockerhub images
Some checks failed
Deploy with Docker Compose / deploy (push) Failing after 8s
2026-03-19 11:16:29 +01:00
36987f59b9 chore: bump version to 1.8.1 2026-03-19 11:12:06 +01:00
931d0e06f4 feat: redesign search bars with prominent search input and compact filters
Restructure LiveSearchForm: full-width search input with magnifying glass
icon, filters in a compact row below with contextual icons per field
(library, status, sort, etc.) and inline labels. Remove per-field
className overrides from series and books pages.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-19 11:12:00 +01:00
741a4da878 feat: redesign jobs page action bar with grouped layout
Replace flat button row + separate reference card with a single card
organized in 3 visual groups (Indexation, Thumbnails, Metadata).
Each action is a card-like button with inline description.
Destructive actions have distinct warning styling.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-19 11:03:08 +01:00
e28b78d0e6 chore: bump version to 1.8.0 2026-03-19 09:09:27 +01:00
163dc3698c feat: add metadata refresh job to re-download metadata for linked series
Adds a new job type that refreshes metadata from external providers for
all series already linked via approved external_metadata_links. Tracks
and displays per-field diffs (series and book level), respects locked
fields, and provides a detailed change report in the job detail page.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-19 09:09:10 +01:00
818bd82e0f chore: bump version to 1.7.0 2026-03-18 22:26:15 +01:00
76c8bcbf2c chore: bump version to 1.6.5 2026-03-18 22:20:02 +01:00
00094b22c6 feat: add metadata statistics to dashboard
Add a new metadata row to the dashboard with three cards:
- Series metadata coverage (linked vs unlinked donut)
- Provider breakdown (donut by provider)
- Book metadata quality (summary and ISBN fill rates)

Includes API changes (stats.rs), frontend types, and FR/EN translations.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-18 22:19:53 +01:00
1e4d9acebe fix: normalize French articles in Bedetheque confidence scoring
Bedetheque uses "Légendaires (Les) - Résistance" while local series
names are "Les légendaires - Résistance". Add normalize_title() that
strips leading articles and articles in parentheses before comparing,
so these forms correctly produce a 100% confidence match.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-18 22:04:58 +01:00
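The normalize_title() idea can be illustrated in TypeScript (the real code is Rust; the article list and regexes below are assumptions):

```typescript
// Strip leading French articles and "(Les)"-style parenthesized articles
// so both title forms compare equal.
function normalizeTitle(title: string): string {
  return title
    .toLowerCase()
    .replace(/\((?:les|le|la|l')\)/g, " ") // "légendaires (les)" → "légendaires"
    .replace(/^(?:les|le|la)\s+|^l'/, "")  // "les légendaires" → "légendaires"
    .replace(/\s+/g, " ")
    .trim();
}
```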
b226aa3a35 chore: bump version to 1.6.4 2026-03-18 22:04:40 +01:00
d913be9d2a chore: bump version to 1.6.3 2026-03-18 21:44:37 +01:00
e9bb951d97 feat: auto-match metadata when 100% confidence and matching book count
When multiple provider results exist but the best has 100% confidence,
compare local book count with external total_volumes. If they match,
treat it as an auto-match and link+sync series and book metadata
automatically instead of requiring manual review.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-18 21:44:31 +01:00
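The auto-match rule reduces to a small predicate; the threshold and book-count comparison come from the commit text, the shapes are assumptions.

```typescript
// Auto-link only when the best result is a perfect-confidence match AND
// the local book count equals the provider's total_volumes.
function shouldAutoMatch(bestConfidence: number, localBookCount: number, externalTotalVolumes: number | null): boolean {
  return bestConfidence === 100
    && externalTotalVolumes !== null
    && localBookCount === externalTotalVolumes;
}
```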
037ede2750 chore: bump version to 1.6.2 2026-03-18 21:36:44 +01:00
06a245d90a feat: add metadata provider filter to series page
- Add `metadata_provider` query param to series API endpoints (linked/unlinked/specific provider)
- Return `metadata_provider` field in series response
- Add metadata filter dropdown on series page with all provider options
- Show small provider icon badge on linked series cards
- LiveSearchForm now wraps filters on two rows when needed

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-18 21:35:38 +01:00
63d5fcaa13 chore: bump version to 1.6.1 2026-03-18 21:19:38 +01:00
020cb6baae feat: make series names clickable in batch metadata report
Links series names to their detail page where the metadata search
modal can be triggered for quick provider lookup.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-18 21:07:21 +01:00
6db8042ffe chore: bump version to 1.6.0 2026-03-18 19:39:10 +01:00
d4f87c4044 feat: add i18n support (FR/EN) to backoffice with English as default
Implement full internationalization for the Next.js backoffice:
- i18n infrastructure: type-safe dictionaries (fr.ts/en.ts), cookie-based locale detection, React Context for client components, server-side translation helper
- Language selector in Settings page (General tab) with cookie + DB persistence
- All ~35 pages and components translated via t() / useTranslation()
- Default locale set to English, French available via settings

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-18 19:39:01 +01:00
055c376222 fix: complete French translation for settings page
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-18 18:26:59 +01:00
1cc5d049ea chore: bump version to 1.5.6 2026-03-18 18:26:50 +01:00
b955c2697c feat: add batch metadata jobs, series filters, and translate backoffice to French
- Add metadata_batch job type with background processing via tokio::spawn
- Auto-apply metadata only when single result at 100% confidence
- Support primary + fallback provider per library, "none" to opt out
- Add batch report/results API endpoints and job detail UI
- Add series_status and has_missing filters to both series listing pages
- Add GET /series/statuses endpoint for dynamic filter options
- Normalize series_metadata status values (migration 0036)
- Hide ComicVine provider tab when no API key configured
- Translate entire backoffice UI from English to French

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-18 18:26:44 +01:00
9a8c1577af chore: bump version to 1.5.5 2026-03-18 16:11:08 +01:00
52b9b0e00e feat: add series status, improve providers & e2e tests
- Add series status concept (ongoing/ended/hiatus/cancelled/upcoming)
  with normalization across all providers
- Add status field to series_metadata table (migration 0033)
- AniList: use chapters as fallback for volume count on ongoing series,
  add books_message when both volumes and chapters are null
- Bedetheque: extract description from meta tag, genres, parution status,
  origin/language; rewrite book parsing with itemprop microdata for
  clean ISBN, dates, page counts, covers; filter placeholder authors
- Add comprehensive e2e provider tests with field coverage reporting
- Wire status into EditSeriesForm, MetadataSearchModal, and series page

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-18 16:10:45 +01:00
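Status normalization across providers, as described above, amounts to mapping each provider's raw vocabulary onto one shared enum. A hedged sketch; the exact raw strings accepted per provider are assumptions, not taken from the project's source:

```rust
// Shared status enum matching the five values named in the commit.
#[derive(Debug, PartialEq)]
enum SeriesStatus {
    Ongoing,
    Ended,
    Hiatus,
    Cancelled,
    Upcoming,
}

// Map assumed provider vocabulary (e.g. AniList's RELEASING) onto it.
fn normalize_status(raw: &str) -> Option<SeriesStatus> {
    match raw.trim().to_ascii_lowercase().as_str() {
        "releasing" | "ongoing" => Some(SeriesStatus::Ongoing),
        "finished" | "ended" | "completed" => Some(SeriesStatus::Ended),
        "hiatus" => Some(SeriesStatus::Hiatus),
        "cancelled" | "canceled" => Some(SeriesStatus::Cancelled),
        "not_yet_released" | "upcoming" => Some(SeriesStatus::Upcoming),
        _ => None,
    }
}
```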
51ef2fa725 chore: bump version to 1.5.4 2026-03-18 15:27:29 +01:00
7d53babc84 chore: bump version to 1.5.3 2026-03-18 15:23:54 +01:00
00f4445924 fix: use sort-order position as fallback volume for book matching
When books have no volume number, use their 1-based position in the
backoffice sort order (volume ASC NULLS LAST, natural title sort) as
effective volume for matching against external provider books.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-18 15:21:32 +01:00
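The fallback above can be sketched like this, assuming the books arrive already in backoffice sort order (volume ASC NULLS LAST, natural title sort); `LocalBook` is an illustrative type:

```rust
// A book without an explicit volume gets its 1-based position in the
// sorted list as its effective volume for provider matching.
struct LocalBook {
    volume: Option<u32>,
}

fn effective_volumes(sorted_books: &[LocalBook]) -> Vec<u32> {
    sorted_books
        .iter()
        .enumerate()
        .map(|(i, book)| book.volume.unwrap_or(i as u32 + 1))
        .collect()
}
```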
1a91c051b5 chore: bump version to 1.5.2 2026-03-18 15:16:21 +01:00
48ca9d0a8b chore: bump version to 1.5.1 2026-03-18 15:12:44 +01:00
f75d795215 fix: improve book matching in metadata sync with bidirectional title search
Pre-fetch all local books in one query instead of N queries per external
book. Match by volume number first, then bidirectional title containment
(external in local OR local in external). Track matched IDs to prevent
double-matching.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-18 15:12:36 +01:00
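The bidirectional containment check described above can be sketched as a single predicate; volume matching and matched-ID tracking happen around this check in the real sync loop, which is not shown here:

```rust
// An external book title matches a local one if either lowercased
// title contains the other (external in local OR local in external).
fn titles_match(external: &str, local: &str) -> bool {
    let (e, l) = (external.to_lowercase(), local.to_lowercase());
    e.contains(&l) || l.contains(&e)
}
```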
ac13f53124 chore: bump version to 1.5.0 2026-03-18 15:03:49 +01:00
c9ccf5cd90 feat: add external metadata sync system with multiple providers
Add a complete metadata synchronization system allowing users to search
and sync series/book metadata from external providers (Google Books,
Open Library, ComicVine, AniList, Bédéthèque). Each library can use a
different provider. Matching requires manual approval with detailed sync
reports showing what was updated or skipped (locked fields protection).

Key changes:
- DB migrations: external_metadata_links, external_book_metadata tables,
  library metadata_provider column, locked_fields, total_volumes, book
  metadata fields (summary, isbn, publish_date)
- Rust API: MetadataProvider trait + 5 provider implementations,
  7 metadata endpoints (search, match, approve, reject, links, missing,
  delete), sync report system, provider language preference support
- Backoffice: MetadataSearchModal, ProviderIcon, SafeHtml components,
  settings UI for provider/language config, enriched book detail page,
  edit forms with locked fields support, API proxy routes
- OpenAPI/Swagger documentation for all new endpoints and schemas

Closes #3

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-18 14:59:24 +01:00
a99bfb5a91 perf(docker): optimize Rust build times with dependency layer caching
Replace sccache with a two-stage build strategy: dummy source files cache
dependency compilation in a separate layer, and --mount=type=cache for
Cargo registry/git/target directories. Source-only changes now skip
full dependency recompilation.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-18 11:06:11 +01:00
389d71b42f refactor: replace Meilisearch with PostgreSQL full-text search
Remove Meilisearch dependency entirely. Search is now handled by
PostgreSQL ILIKE with pg_trgm indexes, joining series_metadata for
series-level authors. No external search engine needed.

- Replace search.rs Meilisearch HTTP calls with PostgreSQL queries
- Remove meili.rs from indexer, sync_meili call from job pipeline
- Remove MEILI_URL/MEILI_MASTER_KEY from config, state, env files
- Remove meilisearch service from docker-compose.yml
- Add migration 0027: drop sync_metadata, enable pg_trgm, add indexes
- Remove search resync button/endpoint (no longer needed)
- Update all documentation (CLAUDE.md, README.md, AGENTS.md, PLAN.md)

API contract unchanged — same SearchResponse shape returned.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-18 10:59:25 +01:00
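One practical detail when serving search through `ILIKE` patterns is escaping the `%` and `_` wildcards in user input before it is interpolated into a pattern such as `'%' || $1 || '%'`. A minimal sketch, not the project's actual query code:

```rust
// Escape LIKE/ILIKE wildcards so user input is matched literally.
// PostgreSQL's default escape character is the backslash.
fn escape_like(input: &str) -> String {
    input
        .replace('\\', "\\\\")
        .replace('%', "\\%")
        .replace('_', "\\_")
}
```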
2985ef5561 chore: bump version to 1.4.0 2026-03-18 10:59:12 +01:00
4be8177683 feat: fix author search, add edit modals, settings tabs & search resync
- Fix Meilisearch indexing to use authors[] array instead of scalar author field
- Join series_metadata to include series-level authors in search documents
- Configure searchable attributes (title, authors, series) in Meilisearch
- Convert EditSeriesForm and EditBookForm from inline forms to modals
- Add tabbed navigation (General / Integrations) to Settings page
- Add Force Search Resync button (POST /settings/search/resync)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-18 10:45:36 +01:00
a675dcd2a4 chore: bump version to 1.3.0 2026-03-18 10:45:16 +01:00
127cd8a42c feat(komga): add Komga read-status sync with reports and history
Adds Komga sync feature to import read status from a Komga server.
Books are matched by title (case-insensitive) with series+title primary
match and title-only fallback. Sync reports are persisted with matched,
newly marked, and unmatched book lists. UI shows check icon for newly
marked books, sorted to top. Credentials (URL+username) are saved
between sessions. Uses HashSet for O(1) lookups to handle large libraries.

Closes #2

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-16 22:04:19 +01:00
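The HashSet-based lookup strategy described above can be sketched as two sets built from the Komga read list, one keyed by (series, title) for the primary match and one by title alone for the fallback, both lowercased; the pair-of-strings input is an illustrative simplification:

```rust
use std::collections::HashSet;

// Build O(1) membership sets for case-insensitive matching:
// primary key = (series, title), fallback key = title only.
fn build_read_sets(
    komga_read: &[(String, String)], // (series, title) pairs
) -> (HashSet<(String, String)>, HashSet<String>) {
    let primary = komga_read
        .iter()
        .map(|(series, title)| (series.to_lowercase(), title.to_lowercase()))
        .collect();
    let fallback = komga_read
        .iter()
        .map(|(_, title)| title.to_lowercase())
        .collect();
    (primary, fallback)
}
```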
1b9f2d3915 chore: bump version to 1.2.2 2026-03-16 22:04:04 +01:00
f095bf050b chore: bump version to 1.2.1 2026-03-16 21:12:25 +01:00
198 changed files with 23657 additions and 4496 deletions


@@ -9,13 +9,16 @@
# REQUIRED - Change these values in production!
# =============================================================================
# Master key for Meilisearch authentication (required)
MEILI_MASTER_KEY=change-me-in-production
# Bootstrap token for initial API admin access (required)
# Use this token for the first API calls before creating proper API tokens
API_BOOTSTRAP_TOKEN=change-me-in-production
# Backoffice admin credentials (required)
ADMIN_USERNAME=admin
ADMIN_PASSWORD=change-me-in-production
# Secret for signing session JWTs (min 32 chars, required)
SESSION_SECRET=change-me-in-production-use-32-chars-min
# =============================================================================
# Service Configuration
# =============================================================================
@@ -28,9 +31,6 @@ API_BASE_URL=http://api:7080
INDEXER_LISTEN_ADDR=0.0.0.0:7081
INDEXER_SCAN_INTERVAL_SECONDS=5
# Meilisearch Search Engine
MEILI_URL=http://meilisearch:7700
# PostgreSQL Database
DATABASE_URL=postgres://stripstream:stripstream@postgres:5432/stripstream
@@ -77,5 +77,4 @@ THUMBNAILS_HOST_PATH=./data/thumbnails
# - API: change "7080:7080" to "YOUR_PORT:7080"
# - Indexer: change "7081:7081" to "YOUR_PORT:7081"
# - Backoffice: change "7082:7082" to "YOUR_PORT:7082"
# - Meilisearch: change "7700:7700" to "YOUR_PORT:7700"
# - PostgreSQL: change "6432:5432" to "YOUR_PORT:5432"


@@ -0,0 +1,17 @@
name: Deploy with Docker Compose
on:
push:
branches:
- main # adapt to whichever branch you want to deploy
jobs:
deploy:
runs-on: mac-orbstack-runner # the name you gave the runner
steps:
- name: Deploy stack
env:
DOCKER_BUILDKIT: 1
COMPOSE_DOCKER_CLI_BUILD: 1
run: |
BUILDKIT_PROGRESS=plain cd /Users/julienfroidefond/Sites/docker-stack && docker pull julienfroidefond32/stripstream-backoffice && docker pull julienfroidefond32/stripstream-api && docker pull julienfroidefond32/stripstream-indexer && ./scripts/stack.sh up stripstream


@@ -77,7 +77,7 @@ sqlx migrate add -r migration_name
```bash
# Start infrastructure only
docker compose up -d postgres meilisearch
docker compose up -d postgres
# Start full stack
docker compose up -d


@@ -10,7 +10,6 @@ Gestionnaire de bibliothèque de bandes dessinées/ebooks. Workspace Cargo multi
| Indexer (background) | `apps/indexer/` | 7081 |
| Backoffice (Next.js) | `apps/backoffice/` | 7082 |
| PostgreSQL | infra | 6432 |
| Meilisearch | infra | 7700 |
Shared crates: `crates/core` (env config), `crates/parsers` (CBZ/CBR/PDF).
@@ -31,7 +30,7 @@ cargo test
cargo test -p parsers
# Infra (dependencies only), docker-compose.yml lives at the repo root
docker compose up -d postgres meilisearch
docker compose up -d postgres
# Backoffice dev
cd apps/backoffice && npm install && npm run dev # http://localhost:7082
@@ -46,7 +45,7 @@ sqlx migrate run # DATABASE_URL doit être défini
cp .env.example .env # then edit the REQUIRED values
```
Variables **required** at startup: `DATABASE_URL`, `MEILI_URL`, `MEILI_MASTER_KEY`, `API_BOOTSTRAP_TOKEN`.
Variables **required** at startup: `DATABASE_URL`, `API_BOOTSTRAP_TOKEN`.
## Gotchas
@@ -56,6 +55,7 @@ Variables **requises** au démarrage : `DATABASE_URL`, `MEILI_URL`, `MEILI_MASTE
- **Thumbnails**: stored in `THUMBNAIL_DIRECTORY` (default `/data/thumbnails`), generated by **the API** (not the indexer); the indexer triggers a checkup via `POST /index/jobs/:id/thumbnails/checkup`.
- **Cargo workspace**: external dependencies are declared in the root `Cargo.toml`, not in individual crates.
- **Migrations**: `infra/migrations/` directory, managed by sqlx. Always run migrations before starting the services.
- **Search**: full-text via PostgreSQL (`ILIKE` + `pg_trgm`), no external search engine.
## Key files
@@ -64,6 +64,7 @@ Variables **requises** au démarrage : `DATABASE_URL`, `MEILI_URL`, `MEILI_MASTE
| `crates/core/src/config.rs` | Config from env (API, Indexer, AdminUI) |
| `crates/parsers/src/lib.rs` | Format detection, metadata extraction |
| `apps/api/src/books.rs` | Book CRUD endpoints |
| `apps/api/src/search.rs` | PostgreSQL full-text search |
| `apps/api/src/pages.rs` | Page rendering + LRU cache |
| `apps/indexer/src/scanner.rs` | Filesystem scan |
| `infra/migrations/*.sql` | DB schema |

Cargo.lock (generated)

@@ -19,6 +19,19 @@ dependencies = [
"cpufeatures",
]
[[package]]
name = "ahash"
version = "0.8.12"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5a15f179cd60c4584b8a8c596927aadc462e27f2ca70c04e0071964a73ba7a75"
dependencies = [
"cfg-if",
"getrandom 0.3.4",
"once_cell",
"version_check",
"zerocopy",
]
[[package]]
name = "aho-corasick"
version = "1.1.4"
@@ -51,7 +64,7 @@ checksum = "7f202df86484c868dbad7eaa557ef785d5c66295e41b460ef922eca0723b842c"
[[package]]
name = "api"
version = "1.2.0"
version = "2.0.0"
dependencies = [
"anyhow",
"argon2",
@@ -63,9 +76,12 @@ dependencies = [
"image",
"jpeg-decoder",
"lru",
"notifications",
"parsers",
"rand 0.8.5",
"regex",
"reqwest",
"scraper",
"serde",
"serde_json",
"sha2",
@@ -463,6 +479,29 @@ dependencies = [
"typenum",
]
[[package]]
name = "cssparser"
version = "0.34.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b7c66d1cd8ed61bf80b38432613a7a2f09401ab8d0501110655f8b341484a3e3"
dependencies = [
"cssparser-macros",
"dtoa-short",
"itoa",
"phf",
"smallvec",
]
[[package]]
name = "cssparser-macros"
version = "0.6.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "13b588ba4ac1a99f7f2964d24b3d896ddc6bf847ee3855dbd4366f058cfcd331"
dependencies = [
"quote",
"syn 2.0.117",
]
[[package]]
name = "der"
version = "0.7.10"
@@ -483,6 +522,17 @@ dependencies = [
"powerfmt",
]
[[package]]
name = "derive_more"
version = "0.99.20"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6edb4b64a43d977b8e99788fe3a04d483834fba1215a7e02caa415b626497f7f"
dependencies = [
"proc-macro2",
"quote",
"syn 2.0.117",
]
[[package]]
name = "digest"
version = "0.10.7"
@@ -512,6 +562,21 @@ version = "0.15.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1aaf95b3e5c8f23aa320147307562d361db0ae0d51242340f558153b4eb2439b"
[[package]]
name = "dtoa"
version = "1.0.11"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4c3cf4824e2d5f025c7b531afcb2325364084a16806f6d47fbc1f5fbd9960590"
[[package]]
name = "dtoa-short"
version = "0.3.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "cd1511a7b6a56299bd043a9c167a6d2bfb37bf84a6dfceaba651168adfb43c87"
dependencies = [
"dtoa",
]
[[package]]
name = "ecb"
version = "0.1.2"
@@ -521,6 +586,12 @@ dependencies = [
"cipher",
]
[[package]]
name = "ego-tree"
version = "0.9.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7c6ba7d4eec39eaa9ab24d44a0e73a7949a1095a8b3f3abb11eddf27dbb56a53"
[[package]]
name = "either"
version = "1.15.0"
@@ -629,6 +700,16 @@ dependencies = [
"percent-encoding",
]
[[package]]
name = "futf"
version = "0.1.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "df420e2e84819663797d1ec6544b13c5be84629e7bb00dc960d6917db2987843"
dependencies = [
"mac",
"new_debug_unreachable",
]
[[package]]
name = "futures"
version = "0.3.32"
@@ -728,6 +809,15 @@ dependencies = [
"slab",
]
[[package]]
name = "fxhash"
version = "0.2.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c31b6d751ae2c7f11320402d34e41349dd1016f8d5d45e48c4312bc8625af50c"
dependencies = [
"byteorder",
]
[[package]]
name = "generic-array"
version = "0.14.7"
@@ -738,6 +828,15 @@ dependencies = [
"version_check",
]
[[package]]
name = "getopts"
version = "0.2.24"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "cfe4fbac503b8d1f88e6676011885f34b7174f46e59956bba534ba83abded4df"
dependencies = [
"unicode-width",
]
[[package]]
name = "getrandom"
version = "0.2.17"
@@ -855,6 +954,18 @@ dependencies = [
"windows-sys 0.61.2",
]
[[package]]
name = "html5ever"
version = "0.29.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "3b7410cae13cbc75623c98ac4cbfd1f0bedddf3227afc24f370cf0f50a44a11c"
dependencies = [
"log",
"mac",
"markup5ever",
"match_token",
]
[[package]]
name = "http"
version = "1.4.0"
@@ -1122,7 +1233,7 @@ dependencies = [
[[package]]
name = "indexer"
version = "1.2.0"
version = "2.0.0"
dependencies = [
"anyhow",
"axum",
@@ -1130,6 +1241,7 @@ dependencies = [
"futures",
"image",
"jpeg-decoder",
"notifications",
"num_cpus",
"parsers",
"reqwest",
@@ -1406,6 +1518,37 @@ version = "0.1.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "112b39cec0b298b6c1999fee3e31427f74f676e4cb9879ed1a121b43661a4154"
[[package]]
name = "mac"
version = "0.1.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c41e0c4fef86961ac6d6f8a82609f55f31b05e4fce149ac5710e439df7619ba4"
[[package]]
name = "markup5ever"
version = "0.14.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c7a7213d12e1864c0f002f52c2923d4556935a43dec5e71355c2760e0f6e7a18"
dependencies = [
"log",
"phf",
"phf_codegen",
"string_cache",
"string_cache_codegen",
"tendril",
]
[[package]]
name = "match_token"
version = "0.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "88a9689d8d44bf9964484516275f5cd4c9b59457a6940c1d5d0ecbb94510a36b"
dependencies = [
"proc-macro2",
"quote",
"syn 2.0.117",
]
[[package]]
name = "matchers"
version = "0.2.0"
@@ -1496,6 +1639,12 @@ version = "1.0.9"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "308d96db8debc727c3fd9744aac51751243420e46edf401010908da7f8d5e57c"
[[package]]
name = "new_debug_unreachable"
version = "1.0.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "650eef8c711430f1a879fdd01d4745a7deea475becfb90269c06775983bbf086"
[[package]]
name = "nom"
version = "8.0.0"
@@ -1516,6 +1665,19 @@ dependencies = [
"nom",
]
[[package]]
name = "notifications"
version = "2.0.0"
dependencies = [
"anyhow",
"reqwest",
"serde",
"serde_json",
"sqlx",
"tokio",
"tracing",
]
[[package]]
name = "nu-ansi-term"
version = "0.50.3"
@@ -1624,7 +1786,7 @@ dependencies = [
[[package]]
name = "parsers"
version = "1.2.0"
version = "2.0.0"
dependencies = [
"anyhow",
"flate2",
@@ -1690,6 +1852,58 @@ version = "2.3.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9b4f627cb1b25917193a259e49bdad08f671f8d9708acfd5fe0a8c1455d87220"
[[package]]
name = "phf"
version = "0.11.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1fd6780a80ae0c52cc120a26a1a42c1ae51b247a253e4e06113d23d2c2edd078"
dependencies = [
"phf_macros",
"phf_shared",
]
[[package]]
name = "phf_codegen"
version = "0.11.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "aef8048c789fa5e851558d709946d6d79a8ff88c0440c587967f8e94bfb1216a"
dependencies = [
"phf_generator",
"phf_shared",
]
[[package]]
name = "phf_generator"
version = "0.11.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "3c80231409c20246a13fddb31776fb942c38553c51e871f8cbd687a4cfb5843d"
dependencies = [
"phf_shared",
"rand 0.8.5",
]
[[package]]
name = "phf_macros"
version = "0.11.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f84ac04429c13a7ff43785d75ad27569f2951ce0ffd30a3321230db2fc727216"
dependencies = [
"phf_generator",
"phf_shared",
"proc-macro2",
"quote",
"syn 2.0.117",
]
[[package]]
name = "phf_shared"
version = "0.11.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "67eabc2ef2a60eb7faa00097bd1ffdb5bd28e62bf39990626a582201b7a754e5"
dependencies = [
"siphasher",
]
[[package]]
name = "pin-project-lite"
version = "0.2.17"
@@ -1793,6 +2007,12 @@ dependencies = [
"zerocopy",
]
[[package]]
name = "precomputed-hash"
version = "0.1.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "925383efa346730478fb4838dbe9137d2a47675ad789c546d150a6e1dd4ab31c"
[[package]]
name = "prettyplease"
version = "0.2.37"
@@ -2065,6 +2285,7 @@ dependencies = [
"base64",
"bytes",
"futures-core",
"futures-util",
"http",
"http-body",
"http-body-util",
@@ -2073,6 +2294,7 @@ dependencies = [
"hyper-util",
"js-sys",
"log",
"mime_guess",
"percent-encoding",
"pin-project-lite",
"quinn",
@@ -2230,6 +2452,41 @@ version = "1.2.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "94143f37725109f92c262ed2cf5e59bce7498c01bcc1502d7b9afe439a4e9f49"
[[package]]
name = "scraper"
version = "0.21.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b0e749d29b2064585327af5038a5a8eb73aeebad4a3472e83531a436563f7208"
dependencies = [
"ahash",
"cssparser",
"ego-tree",
"getopts",
"html5ever",
"precomputed-hash",
"selectors",
"tendril",
]
[[package]]
name = "selectors"
version = "0.26.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "fd568a4c9bb598e291a08244a5c1f5a8a6650bee243b5b0f8dbb3d9cc1d87fe8"
dependencies = [
"bitflags",
"cssparser",
"derive_more",
"fxhash",
"log",
"new_debug_unreachable",
"phf",
"phf_codegen",
"precomputed-hash",
"servo_arc",
"smallvec",
]
[[package]]
name = "semver"
version = "1.0.27"
@@ -2302,6 +2559,15 @@ dependencies = [
"serde",
]
[[package]]
name = "servo_arc"
version = "0.4.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "170fb83ab34de17dc69aa7c67482b22218ddb85da56546f9bd6b929e32a05930"
dependencies = [
"stable_deref_trait",
]
[[package]]
name = "sha1"
version = "0.10.6"
@@ -2365,6 +2631,12 @@ version = "0.3.8"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e320a6c5ad31d271ad523dcf3ad13e2767ad8b1cb8f047f75a8aeaf8da139da2"
[[package]]
name = "siphasher"
version = "1.0.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b2aa850e253778c88a04c3d7323b043aeda9d3e30d5971937c1855769763678e"
[[package]]
name = "slab"
version = "0.4.12"
@@ -2613,6 +2885,31 @@ version = "1.2.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6ce2be8dc25455e1f91df71bfa12ad37d7af1092ae736f3a6cd0e37bc7810596"
[[package]]
name = "string_cache"
version = "0.8.9"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "bf776ba3fa74f83bf4b63c3dcbbf82173db2632ed8452cb2d891d33f459de70f"
dependencies = [
"new_debug_unreachable",
"parking_lot",
"phf_shared",
"precomputed-hash",
"serde",
]
[[package]]
name = "string_cache_codegen"
version = "0.5.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c711928715f1fe0fe509c53b43e993a9a557babc2d0a3567d0a3006f1ac931a0"
dependencies = [
"phf_generator",
"phf_shared",
"proc-macro2",
"quote",
]
[[package]]
name = "stringprep"
version = "0.1.5"
@@ -2626,7 +2923,7 @@ dependencies = [
[[package]]
name = "stripstream-core"
version = "1.2.0"
version = "2.0.0"
dependencies = [
"anyhow",
"serde",
@@ -2679,6 +2976,17 @@ dependencies = [
"syn 2.0.117",
]
[[package]]
name = "tendril"
version = "0.4.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d24a120c5fc464a3458240ee02c299ebcb9d67b5249c8848b09d639dca8d7bb0"
dependencies = [
"futf",
"mac",
"utf-8",
]
[[package]]
name = "thiserror"
version = "2.0.18"
@@ -2991,6 +3299,12 @@ version = "0.1.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7df058c713841ad818f1dc5d3fd88063241cc61f49f5fbea4b951e8cf5a8d71d"
[[package]]
name = "unicode-width"
version = "0.2.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b4ac048d71ede7ee76d585517add45da530660ef4390e49b098733c6e897f254"
[[package]]
name = "unicode-xid"
version = "0.2.6"
@@ -3038,6 +3352,12 @@ dependencies = [
"serde",
]
[[package]]
name = "utf-8"
version = "0.7.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "09cc8ee72d2a9becf2f2febe0205bbed8fc6615b7cb429ad062dc7b7ddd036a9"
[[package]]
name = "utf16string"
version = "0.2.0"


@@ -3,13 +3,14 @@ members = [
"apps/api",
"apps/indexer",
"crates/core",
"crates/notifications",
"crates/parsers",
]
resolver = "2"
[workspace.package]
edition = "2021"
version = "1.2.0"
version = "2.0.0"
license = "MIT"
[workspace.dependencies]
@@ -22,7 +23,7 @@ image = { version = "0.25", default-features = false, features = ["jpeg", "png",
jpeg-decoder = "0.3"
lru = "0.12"
rayon = "1.10"
reqwest = { version = "0.12", default-features = false, features = ["json", "rustls-tls"] }
reqwest = { version = "0.12", default-features = false, features = ["json", "multipart", "rustls-tls"] }
rand = "0.8"
serde = { version = "1.0", features = ["derive"] }
serde_json = "1.0"
@@ -41,3 +42,4 @@ walkdir = "2.5"
webp = "0.3"
utoipa = "4.0"
utoipa-swagger-ui = "6.0"
scraper = "0.21"

LICENSE (new file)

@@ -0,0 +1,21 @@
MIT License
Copyright (c) 2026 Julien Froidefond
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

PLAN.md

@@ -12,7 +12,7 @@ Construire un serveur ultra performant pour indexer et servir des bibliotheques
- Backend/API: Rust (`axum`)
- Indexing: dedicated Rust service (`indexer`)
- DB: PostgreSQL
- Search: Meilisearch
- Search: PostgreSQL full-text (ILIKE + pg_trgm)
- Deployment: Docker Compose
- Auth: env bootstrap token + admin tokens in DB (creatable/revocable)
- Admin token expiration: none by default (manual revocation)
@@ -33,7 +33,7 @@ Construire un serveur ultra performant pour indexer et servir des bibliotheques
**DoD:** Crate builds OK.
### T2 - Infra Docker Compose
- [x] Define services `postgres`, `meilisearch`, `api`, `indexer`
- [x] Define services `postgres`, `api`, `indexer`
- [x] Persistent volumes
- [x] Healthchecks
@@ -114,7 +114,7 @@ Construire un serveur ultra performant pour indexer et servir des bibliotheques
**DoD:** Pagination/filters working.
### T13 - Search
- [x] Projection to Meilisearch
- [x] PostgreSQL full-text search
- [x] `GET /search?q=...&library_id=...&type=...`
- [x] Fuzzy + filters
@@ -264,10 +264,10 @@ Construire un serveur ultra performant pour indexer et servir des bibliotheques
- Bootstrap token = break-glass (can be disabled later)
## Journal
- 2026-03-05: `docker compose up -d --build` validated, full stack healthy (`postgres`, `meilisearch`, `api`, `indexer`, `admin-ui`).
- 2026-03-05: `docker compose up -d --build` validated, full stack healthy (`postgres`, `api`, `indexer`, `admin-ui`).
- 2026-03-05: infra adjustments applied for stable startup (`unrar` -> `unrar-free`, `rust:1-bookworm` image, `127.0.0.1` healthchecks).
- 2026-03-05: added a `migrate` service in Compose to automatically run `infra/migrations/0001_init.sql` at startup.
- 2026-03-05: Batch 2 done (jobs, incremental scan, `cbz/cbr/pdf` parsers, books API, Meilisearch sync + search).
- 2026-03-05: Batch 2 done (jobs, incremental scan, `cbz/cbr/pdf` parsers, books API, PostgreSQL search).
- 2026-03-05: end-to-end verification OK on a test library (`/libraries/demo`) with indexing, `/books` listing and `/search` queries (1 CBZ detected).
- 2026-03-05: Batch 3 progress: pages endpoint (`/books/:id/pages/:n`) live with LRU cache, ETag/Cache-Control, render concurrency limit and timeouts.
- 2026-03-05: API hardening: readiness exposed without auth via `route_layer`, simple `/metrics` metrics, read rate limiting (120 req/s).


@@ -9,7 +9,7 @@ The project consists of the following components:
- **API** (`apps/api/`) - Rust-based REST API service
- **Indexer** (`apps/indexer/`) - Rust-based background indexing service
- **Backoffice** (`apps/backoffice/`) - Next.js web administration interface
- **Infrastructure** (`infra/`) - Docker Compose setup with PostgreSQL and Meilisearch
- **Infrastructure** (`infra/`) - Docker Compose setup with PostgreSQL
## Quick Start
@@ -27,19 +27,16 @@ The project consists of the following components:
```
2. Edit `.env` and set secure values for:
- `MEILI_MASTER_KEY` - Master key for Meilisearch
- `API_BOOTSTRAP_TOKEN` - Bootstrap token for initial API authentication
### Running with Docker
```bash
cd infra
docker compose up -d
```
This will start:
- PostgreSQL (port 6432)
- Meilisearch (port 7700)
- API service (port 7080)
- Indexer service (port 7081)
- Backoffice web UI (port 7082)
@@ -48,7 +45,6 @@ This will start:
- **Backoffice**: http://localhost:7082
- **API**: http://localhost:7080
- **Meilisearch**: http://localhost:7700
### Default Credentials
@@ -62,8 +58,7 @@ The default bootstrap token is configured in your `.env` file. Use this for init
```bash
# Start dependencies
cd infra
docker compose up -d postgres meilisearch
docker compose up -d postgres
# Run API
cd apps/api
@@ -86,28 +81,66 @@ The backoffice will be available at http://localhost:7082
## Features
### Libraries Management
- Create and manage multiple libraries
- Configure automatic scanning schedules (hourly, daily, weekly)
- Real-time file watcher for instant indexing
- Full and incremental rebuild options
> For the full feature list, business rules, and API details, see [docs/FEATURES.md](docs/FEATURES.md).
### Books Management
- Support for CBZ, CBR, and PDF formats
- Automatic metadata extraction
- Series and volume detection
- Full-text search with Meilisearch
### Libraries
- Multi-library management with per-library configuration
- Incremental and full scanning, real-time filesystem watcher
- Per-library metadata provider selection (Google Books, ComicVine, Bédéthèque, AniList, Open Library)
### Jobs Monitoring
- Real-time job progress tracking
- Detailed statistics (scanned, indexed, removed, errors)
- Job history and logs
- Cancel pending jobs
### Books & Series
- **Formats**: CBZ, CBR, PDF, EPUB
- Automatic metadata extraction (title, series, volume, authors, page count) from filenames and directory structure
- Series aggregation with missing volume detection
- Thumbnail generation (WebP/JPEG/PNG) with lazy generation and bulk rebuild
- CBR → CBZ conversion
### Search
- Full-text search across titles, authors, and series
- Library filtering
- Real-time suggestions
### Reading Progress
- Per-book tracking: unread / reading / read with current page
- Series-level aggregated reading status
- Bulk mark-as-read for series
### Search & Discovery
- Full-text search across titles, authors, and series (PostgreSQL `pg_trgm`)
- Author listing with book/series counts
- Filtering by reading status, series status, format, metadata provider
### External Metadata
- Search, match, approve/reject workflow with confidence scoring
- Batch auto-matching and scheduled metadata refresh
- Field locking to protect manual edits from sync
### Notifications
- **Telegram**: real-time notifications via Telegram Bot API
- 12 granular event toggles (scans, thumbnails, conversions, metadata)
- Book thumbnail images included in notifications where applicable
- Test connection from settings
### External Integrations
- **Komga**: import reading progress
- **Prowlarr**: search for missing volumes
- **qBittorrent**: add torrents directly from search results
### Background Jobs
- Rebuild, rescan, thumbnail generation, metadata batch, CBR conversion
- Real-time progress via Server-Sent Events (SSE)
- Job history, error tracking, cancellation
### Page Rendering
- On-demand page extraction from all formats
- Image processing (format, quality, max width, resampling filter)
- LRU in-memory + disk cache
### Security
- Token-based auth (`admin` / `read` scopes) with Argon2 hashing
- Rate limiting, token expiration and revocation
### Web UI (Backoffice)
- Dashboard with statistics, interactive charts (recharts), and reading progress
- Currently reading & recently read sections
- Library, book, series, author management
- Live job monitoring, metadata search modals, settings panel
- Notification settings with per-event toggle configuration
## Environment Variables
@@ -118,8 +151,6 @@ Variables marquées **required** doivent être définies. Les autres ont une val
| Variable | Description | Default |
|----------|-------------|--------|
| `DATABASE_URL` | **required** — PostgreSQL connection | — |
| `MEILI_URL` | **required** — Meilisearch URL | — |
| `MEILI_MASTER_KEY` | **required** — Meilisearch master key | — |
### API
@@ -165,7 +196,6 @@ stripstream-librarian/
│ ├── indexer/ # Rust background indexer
│ └── backoffice/ # Next.js web UI
├── infra/
│ ├── docker-compose.yml
│ └── migrations/ # SQL database migrations
├── libraries/ # Book storage (mounted volume)
└── .env # Environment configuration
@@ -207,11 +237,6 @@ services:
volumes:
- postgres_data:/var/lib/postgresql/data
meilisearch:
image: getmeili/meilisearch:v1.12
environment:
MEILI_MASTER_KEY: your_meili_master_key # required — change this
api:
image: julienfroidefond32/stripstream-api:latest
ports:
@@ -222,8 +247,6 @@ services:
environment:
# --- Required ---
DATABASE_URL: postgres://stripstream:stripstream@postgres:5432/stripstream
MEILI_URL: http://meilisearch:7700
MEILI_MASTER_KEY: your_meili_master_key # must match meilisearch above
API_BOOTSTRAP_TOKEN: your_bootstrap_token # required — change this
# --- Optional (defaults shown) ---
# API_LISTEN_ADDR: 0.0.0.0:7080
@@ -238,8 +261,6 @@ services:
environment:
# --- Required ---
DATABASE_URL: postgres://stripstream:stripstream@postgres:5432/stripstream
MEILI_URL: http://meilisearch:7700
MEILI_MASTER_KEY: your_meili_master_key # must match meilisearch above
# --- Optional (defaults shown) ---
# INDEXER_LISTEN_ADDR: 0.0.0.0:7081
# INDEXER_SCAN_INTERVAL_SECONDS: 5
@@ -266,4 +287,4 @@ volumes:
## License
[Your License Here]
This project is licensed under the [MIT License](LICENSE).


@@ -15,10 +15,12 @@ futures = "0.3"
image.workspace = true
jpeg-decoder.workspace = true
lru.workspace = true
notifications = { path = "../../crates/notifications" }
stripstream-core = { path = "../../crates/core" }
parsers = { path = "../../crates/parsers" }
rand.workspace = true
tokio-stream = "0.1"
regex = "1"
reqwest.workspace = true
serde.workspace = true
serde_json.workspace = true
@@ -33,3 +35,4 @@ uuid.workspace = true
utoipa.workspace = true
utoipa-swagger-ui = { workspace = true, features = ["axum"] }
webp.workspace = true
scraper.workspace = true


@@ -1,25 +1,42 @@
FROM rust:1-bookworm AS builder
WORKDIR /app
# Install sccache for faster builds
RUN cargo install sccache --locked
ENV RUSTC_WRAPPER=sccache
ENV SCCACHE_DIR=/sccache
# Copy workspace manifests and create dummy source files to cache dependency builds
COPY Cargo.toml ./
COPY apps/api/Cargo.toml apps/api/Cargo.toml
COPY apps/indexer/Cargo.toml apps/indexer/Cargo.toml
COPY crates/core/Cargo.toml crates/core/Cargo.toml
COPY crates/notifications/Cargo.toml crates/notifications/Cargo.toml
COPY crates/parsers/Cargo.toml crates/parsers/Cargo.toml
RUN mkdir -p apps/api/src apps/indexer/src crates/core/src crates/notifications/src crates/parsers/src && \
echo "fn main() {}" > apps/api/src/main.rs && \
echo "fn main() {}" > apps/indexer/src/main.rs && \
echo "" > apps/indexer/src/lib.rs && \
echo "" > crates/core/src/lib.rs && \
echo "" > crates/notifications/src/lib.rs && \
echo "" > crates/parsers/src/lib.rs
# Build dependencies only (cached as long as Cargo.toml files don't change)
RUN --mount=type=cache,target=/usr/local/cargo/registry \
--mount=type=cache,target=/usr/local/cargo/git \
--mount=type=cache,target=/app/target \
cargo build --release -p api && \
cargo install sqlx-cli --no-default-features --features postgres --locked
# Copy real source code and build
COPY apps/api/src apps/api/src
COPY apps/indexer/src apps/indexer/src
COPY crates/core/src crates/core/src
COPY crates/notifications/src crates/notifications/src
COPY crates/parsers/src crates/parsers/src
# Build with sccache (cache persisted between builds via Docker cache mount)
RUN --mount=type=cache,target=/sccache \
RUN --mount=type=cache,target=/usr/local/cargo/registry \
--mount=type=cache,target=/usr/local/cargo/git \
--mount=type=cache,target=/app/target \
touch apps/api/src/main.rs crates/core/src/lib.rs crates/notifications/src/lib.rs crates/parsers/src/lib.rs && \
cargo build --release -p api && \
cargo install sqlx-cli --no-default-features --features postgres --locked
cp /app/target/release/api /usr/local/bin/api
FROM debian:bookworm-slim
@@ -42,7 +59,7 @@ RUN ARCH=$(dpkg --print-architecture) && \
cp /tmp/lib/libpdfium.so /usr/local/lib/ && \
rm -rf /tmp/pdfium.tgz /tmp/lib /tmp/include && \
ldconfig
COPY --from=builder /app/target/release/api /usr/local/bin/api
COPY --from=builder /usr/local/bin/api /usr/local/bin/api
COPY --from=builder /usr/local/cargo/bin/sqlx /usr/local/bin/sqlx
COPY infra/migrations /app/migrations
COPY apps/api/entrypoint.sh /usr/local/bin/entrypoint.sh


@@ -10,10 +10,15 @@ use sqlx::Row;
use crate::{error::ApiError, state::AppState};
#[derive(Clone, Debug)]
pub struct AuthUser {
pub user_id: uuid::Uuid,
}
#[derive(Clone, Debug)]
pub enum Scope {
Admin,
Read,
Read { user_id: uuid::Uuid },
}
pub async fn require_admin(
@@ -40,6 +45,20 @@ pub async fn require_read(
let token = bearer_token(&req).ok_or_else(|| ApiError::unauthorized("missing bearer token"))?;
let scope = authenticate(&state, token).await?;
if let Scope::Read { user_id } = &scope {
req.extensions_mut().insert(AuthUser { user_id: *user_id });
} else if matches!(scope, Scope::Admin) {
// Admin can impersonate a user via the X-As-User header
if let Some(as_user_id) = req
.headers()
.get("X-As-User")
.and_then(|v| v.to_str().ok())
.and_then(|v| uuid::Uuid::parse_str(v).ok())
{
req.extensions_mut().insert(AuthUser { user_id: as_user_id });
}
}
req.extensions_mut().insert(scope);
Ok(next.run(req).await)
}
@@ -60,8 +79,7 @@ async fn authenticate(state: &AppState, token: &str) -> Result<Scope, ApiError>
let maybe_row = sqlx::query(
r#"
SELECT id, token_hash, scope
FROM api_tokens
SELECT id, token_hash, scope, user_id FROM api_tokens
WHERE prefix = $1 AND revoked_at IS NULL AND (expires_at IS NULL OR expires_at > NOW())
"#,
)
@@ -88,7 +106,12 @@ async fn authenticate(state: &AppState, token: &str) -> Result<Scope, ApiError>
let scope: String = row.try_get("scope").map_err(|_| ApiError::unauthorized("invalid token"))?;
match scope.as_str() {
"admin" => Ok(Scope::Admin),
"read" => Ok(Scope::Read),
"read" => {
let user_id: uuid::Uuid = row
.try_get("user_id")
.map_err(|_| ApiError::unauthorized("read token missing user_id"))?;
Ok(Scope::Read { user_id })
}
_ => Err(ApiError::unauthorized("invalid token scope")),
}
}
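
The middleware above resolves an effective user from the token scope plus the optional X-As-User header. A minimal std-only sketch of that decision, under two stated assumptions: `String` ids stand in for `uuid::Uuid`, and the header value arrives pre-extracted instead of being read from the request:

```rust
// Simplified model of require_read's effective-user resolution.
// Assumptions for illustration: String ids instead of uuid::Uuid,
// and the X-As-User header value is passed in directly.
#[derive(Clone, Debug, PartialEq)]
enum Scope {
    Admin,
    Read { user_id: String },
}

/// Which user should the request be scoped to?
/// Read tokens are always bound to their own user; an admin only
/// gets a user scope by opting in via the X-As-User header.
fn effective_user(scope: &Scope, as_user_header: Option<&str>) -> Option<String> {
    match scope {
        Scope::Read { user_id } => Some(user_id.clone()),
        Scope::Admin => as_user_header.map(|v| v.to_string()),
    }
}

fn main() {
    let read = Scope::Read { user_id: "u1".into() };
    // A read token ignores X-As-User entirely.
    assert_eq!(effective_user(&read, Some("u2")), Some("u1".into()));
    // Admin with no header: no AuthUser extension, aggregate view.
    assert_eq!(effective_user(&Scope::Admin, None), None);
    // Admin with the header impersonates that user.
    assert_eq!(effective_user(&Scope::Admin, Some("u2")), Some("u2".into()));
}
```

Note that in the real middleware an admin header that fails UUID parsing is silently ignored, which falls back to the aggregate (no `AuthUser`) view.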

apps/api/src/authors.rs Normal file

@@ -0,0 +1,178 @@
use axum::{extract::{Query, State}, Json};
use serde::{Deserialize, Serialize};
use sqlx::Row;
use utoipa::ToSchema;
use crate::{error::ApiError, state::AppState};
#[derive(Deserialize, ToSchema)]
pub struct ListAuthorsQuery {
#[schema(value_type = Option<String>, example = "batman")]
pub q: Option<String>,
#[schema(value_type = Option<i64>, example = 1)]
pub page: Option<i64>,
#[schema(value_type = Option<i64>, example = 20)]
pub limit: Option<i64>,
/// Sort order: "name" (default), "books" (most books first)
#[schema(value_type = Option<String>, example = "books")]
pub sort: Option<String>,
}
#[derive(Serialize, ToSchema)]
pub struct AuthorItem {
pub name: String,
pub book_count: i64,
pub series_count: i64,
}
#[derive(Serialize, ToSchema)]
pub struct AuthorsPageResponse {
pub items: Vec<AuthorItem>,
pub total: i64,
pub page: i64,
pub limit: i64,
}
/// List all unique authors with book/series counts
#[utoipa::path(
get,
path = "/authors",
tag = "authors",
params(
("q" = Option<String>, Query, description = "Search by author name"),
("page" = Option<i64>, Query, description = "Page number (1-based)"),
("limit" = Option<i64>, Query, description = "Items per page (max 100)"),
("sort" = Option<String>, Query, description = "Sort: name (default) or books"),
),
responses(
(status = 200, body = AuthorsPageResponse),
(status = 401, description = "Unauthorized"),
),
security(("Bearer" = []))
)]
pub async fn list_authors(
State(state): State<AppState>,
Query(query): Query<ListAuthorsQuery>,
) -> Result<Json<AuthorsPageResponse>, ApiError> {
let page = query.page.unwrap_or(1).max(1);
let limit = query.limit.unwrap_or(20).clamp(1, 100);
let offset = (page - 1) * limit;
let sort = query.sort.as_deref().unwrap_or("name");
let order_clause = match sort {
"books" => "book_count DESC, name ASC",
_ => "name ASC",
};
let q_pattern = query.q.as_deref()
.filter(|s| !s.trim().is_empty())
.map(|s| format!("%{s}%"));
// Aggregate unique authors from books.authors + books.author + series_metadata.authors
let sql = format!(
r#"
WITH all_authors AS (
SELECT DISTINCT UNNEST(
COALESCE(
NULLIF(authors, '{{}}'),
CASE WHEN author IS NOT NULL AND author != '' THEN ARRAY[author] ELSE ARRAY[]::text[] END
)
) AS name
FROM books
UNION
SELECT DISTINCT UNNEST(authors) AS name
FROM series_metadata
WHERE authors != '{{}}'
),
filtered AS (
SELECT name FROM all_authors
WHERE ($1::text IS NULL OR name ILIKE $1)
),
book_counts AS (
SELECT
f.name AS author_name,
COUNT(DISTINCT b.id) AS book_count
FROM filtered f
LEFT JOIN books b ON (
f.name = ANY(
COALESCE(
NULLIF(b.authors, '{{}}'),
CASE WHEN b.author IS NOT NULL AND b.author != '' THEN ARRAY[b.author] ELSE ARRAY[]::text[] END
)
)
)
GROUP BY f.name
),
series_counts AS (
SELECT
f.name AS author_name,
COUNT(DISTINCT (sm.library_id, sm.name)) AS series_count
FROM filtered f
LEFT JOIN series_metadata sm ON (
f.name = ANY(sm.authors) AND sm.authors != '{{}}'
)
GROUP BY f.name
)
SELECT
f.name,
COALESCE(bc.book_count, 0) AS book_count,
COALESCE(sc.series_count, 0) AS series_count
FROM filtered f
LEFT JOIN book_counts bc ON bc.author_name = f.name
LEFT JOIN series_counts sc ON sc.author_name = f.name
ORDER BY {order_clause}
LIMIT $2 OFFSET $3
"#
);
let count_sql = r#"
WITH all_authors AS (
SELECT DISTINCT UNNEST(
COALESCE(
NULLIF(authors, '{}'),
CASE WHEN author IS NOT NULL AND author != '' THEN ARRAY[author] ELSE ARRAY[]::text[] END
)
) AS name
FROM books
UNION
SELECT DISTINCT UNNEST(authors) AS name
FROM series_metadata
WHERE authors != '{}'
)
SELECT COUNT(*) AS total
FROM all_authors
WHERE ($1::text IS NULL OR name ILIKE $1)
"#;
let (rows, count_row) = tokio::join!(
sqlx::query(&sql)
.bind(q_pattern.as_deref())
.bind(limit)
.bind(offset)
.fetch_all(&state.pool),
sqlx::query(count_sql)
.bind(q_pattern.as_deref())
.fetch_one(&state.pool)
);
let rows = rows.map_err(|e| ApiError::internal(format!("authors query failed: {e}")))?;
let total: i64 = count_row
.map_err(|e| ApiError::internal(format!("authors count failed: {e}")))?
.get("total");
let items: Vec<AuthorItem> = rows
.iter()
.map(|r| AuthorItem {
name: r.get("name"),
book_count: r.get("book_count"),
series_count: r.get("series_count"),
})
.collect();
Ok(Json(AuthorsPageResponse {
items,
total,
page,
limit,
}))
}
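
The `COALESCE(NULLIF(authors, '{}'), ...)` expression used throughout the query above (prefer the `authors[]` array, else fall back to the legacy single `author` column) can be expressed as a small pure function. A sketch for illustration only; the real logic lives in Postgres:

```rust
// Mirrors the SQL fallback in list_authors:
//   COALESCE(NULLIF(authors, '{}'),
//            CASE WHEN author IS NOT NULL AND author != ''
//                 THEN ARRAY[author] ELSE ARRAY[]::text[] END)
fn effective_authors(authors: &[String], legacy_author: Option<&str>) -> Vec<String> {
    if !authors.is_empty() {
        // A non-empty authors[] array always wins.
        return authors.to_vec();
    }
    // Fall back to the legacy single-author column, if set and non-empty.
    match legacy_author {
        Some(a) if !a.is_empty() => vec![a.to_string()],
        _ => Vec::new(),
    }
}

fn main() {
    // The legacy column is ignored once the array is populated.
    assert_eq!(
        effective_authors(&["Alan Moore".to_string()], Some("Ignored")),
        vec!["Alan Moore".to_string()]
    );
    assert_eq!(
        effective_authors(&[], Some("Frank Miller")),
        vec!["Frank Miller".to_string()]
    );
    assert!(effective_authors(&[], None).is_empty());
    assert!(effective_authors(&[], Some("")).is_empty());
}
```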

File diff suppressed because it is too large


@@ -83,3 +83,9 @@ impl From<std::io::Error> for ApiError {
Self::internal(format!("IO error: {err}"))
}
}
impl From<reqwest::Error> for ApiError {
fn from(err: reqwest::Error) -> Self {
Self::internal(format!("HTTP client error: {err}"))
}
}


@@ -16,6 +16,10 @@ pub struct RebuildRequest {
pub library_id: Option<Uuid>,
#[schema(value_type = Option<bool>, example = false)]
pub full: Option<bool>,
/// Deep rescan: clears directory mtimes to force re-walking all directories,
/// discovering newly supported formats without deleting existing data.
#[schema(value_type = Option<bool>, example = false)]
pub rescan: Option<bool>,
}
#[derive(Serialize, ToSchema)]
@@ -117,7 +121,8 @@ pub async fn enqueue_rebuild(
) -> Result<Json<IndexJobResponse>, ApiError> {
let library_id = payload.as_ref().and_then(|p| p.0.library_id);
let is_full = payload.as_ref().and_then(|p| p.0.full).unwrap_or(false);
let job_type = if is_full { "full_rebuild" } else { "rebuild" };
let is_rescan = payload.as_ref().and_then(|p| p.0.rescan).unwrap_or(false);
let job_type = if is_full { "full_rebuild" } else if is_rescan { "rescan" } else { "rebuild" };
let id = Uuid::new_v4();
sqlx::query(

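The `full` / `rescan` precedence in the hunk above is easy to get wrong: `full` must win even when both flags are set. A minimal sketch of that selection, assuming the same flag semantics as `enqueue_rebuild`:

```rust
// Mirrors the job_type selection in enqueue_rebuild / scan_library:
// `full` takes precedence over `rescan`; with neither flag set, a plain
// incremental rebuild is enqueued.
fn job_type(full: bool, rescan: bool) -> &'static str {
    if full {
        "full_rebuild"
    } else if rescan {
        "rescan"
    } else {
        "rebuild"
    }
}

fn main() {
    assert_eq!(job_type(false, false), "rebuild");
    assert_eq!(job_type(false, true), "rescan");
    // full wins even when a rescan is also requested
    assert_eq!(job_type(true, true), "full_rebuild");
}
```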
apps/api/src/job_poller.rs Normal file

@@ -0,0 +1,134 @@
use std::time::Duration;
use sqlx::{PgPool, Row};
use tracing::{error, info, trace};
use uuid::Uuid;
use crate::{metadata_batch, metadata_refresh};
/// Poll for pending API-only jobs (`metadata_batch`, `metadata_refresh`) and process them.
/// This mirrors the indexer's worker loop but for job types handled by the API.
pub async fn run_job_poller(pool: PgPool, interval_seconds: u64) {
let wait = Duration::from_secs(interval_seconds.max(1));
loop {
match claim_next_api_job(&pool).await {
Ok(Some((job_id, job_type, library_id))) => {
info!("[JOB_POLLER] Claimed {job_type} job {job_id} library={library_id}");
let pool_clone = pool.clone();
let library_name: Option<String> =
sqlx::query_scalar("SELECT name FROM libraries WHERE id = $1")
.bind(library_id)
.fetch_optional(&pool)
.await
.ok()
.flatten();
tokio::spawn(async move {
let result = match job_type.as_str() {
"metadata_refresh" => {
metadata_refresh::process_metadata_refresh(
&pool_clone,
job_id,
library_id,
)
.await
}
"metadata_batch" => {
metadata_batch::process_metadata_batch(
&pool_clone,
job_id,
library_id,
)
.await
}
_ => Err(format!("Unknown API job type: {job_type}")),
};
if let Err(e) = result {
error!("[JOB_POLLER] {job_type} job {job_id} failed: {e}");
let _ = sqlx::query(
"UPDATE index_jobs SET status = 'failed', error_opt = $2, finished_at = NOW() WHERE id = $1",
)
.bind(job_id)
.bind(e.to_string())
.execute(&pool_clone)
.await;
match job_type.as_str() {
"metadata_refresh" => {
notifications::notify(
pool_clone,
notifications::NotificationEvent::MetadataRefreshFailed {
library_name,
error: e.to_string(),
},
);
}
"metadata_batch" => {
notifications::notify(
pool_clone,
notifications::NotificationEvent::MetadataBatchFailed {
library_name,
error: e.to_string(),
},
);
}
_ => {}
}
}
});
}
Ok(None) => {
trace!("[JOB_POLLER] No pending API jobs, waiting...");
tokio::time::sleep(wait).await;
}
Err(err) => {
error!("[JOB_POLLER] Error claiming job: {err}");
tokio::time::sleep(wait).await;
}
}
}
}
const API_JOB_TYPES: &[&str] = &["metadata_batch", "metadata_refresh"];
async fn claim_next_api_job(pool: &PgPool) -> Result<Option<(Uuid, String, Uuid)>, sqlx::Error> {
let mut tx = pool.begin().await?;
let row = sqlx::query(
r#"
SELECT id, type, library_id
FROM index_jobs
WHERE status = 'pending'
AND type = ANY($1)
AND library_id IS NOT NULL
ORDER BY created_at ASC
FOR UPDATE SKIP LOCKED
LIMIT 1
"#,
)
.bind(API_JOB_TYPES)
.fetch_optional(&mut *tx)
.await?;
let Some(row) = row else {
tx.commit().await?;
return Ok(None);
};
let id: Uuid = row.get("id");
let job_type: String = row.get("type");
let library_id: Uuid = row.get("library_id");
sqlx::query(
"UPDATE index_jobs SET status = 'running', started_at = NOW(), error_opt = NULL WHERE id = $1",
)
.bind(id)
.execute(&mut *tx)
.await?;
tx.commit().await?;
Ok(Some((id, job_type, library_id)))
}
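
`claim_next_api_job` relies on `FOR UPDATE SKIP LOCKED` so that concurrent pollers never claim the same row. Its single-process semantics can be sketched with a plain in-memory queue; this is an illustrative model only (integer ids instead of `Uuid`, no transactions):

```rust
use std::collections::VecDeque;

// Std-only sketch of claim_next_api_job's semantics: take the oldest
// pending job whose type the API handles, and mark it running.
// The real implementation does this atomically in Postgres with
// FOR UPDATE SKIP LOCKED inside a transaction.
#[derive(Debug, PartialEq)]
struct Job {
    id: u32,
    job_type: &'static str,
    status: &'static str,
}

const API_JOB_TYPES: &[&str] = &["metadata_batch", "metadata_refresh"];

fn claim_next(queue: &mut VecDeque<Job>) -> Option<u32> {
    // Oldest first, skipping job types owned by the indexer.
    let pos = queue
        .iter()
        .position(|j| j.status == "pending" && API_JOB_TYPES.contains(&j.job_type))?;
    let job = &mut queue[pos];
    job.status = "running";
    Some(job.id)
}

fn main() {
    let mut q = VecDeque::from(vec![
        Job { id: 1, job_type: "rebuild", status: "pending" }, // indexer job: skipped
        Job { id: 2, job_type: "metadata_refresh", status: "pending" },
    ]);
    assert_eq!(claim_next(&mut q), Some(2)); // oldest API-handled job wins
    assert_eq!(claim_next(&mut q), None);    // now running, nothing left to claim
}
```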

apps/api/src/komga.rs Normal file

@@ -0,0 +1,410 @@
use axum::{extract::State, Json};
use chrono::{DateTime, Utc};
use serde::{Deserialize, Serialize};
use sqlx::Row;
use std::collections::HashMap;
use utoipa::ToSchema;
use uuid::Uuid;
use crate::{error::ApiError, state::AppState};
// ─── Komga API types ─────────────────────────────────────────────────────────
#[derive(Deserialize)]
struct KomgaBooksResponse {
content: Vec<KomgaBook>,
#[serde(rename = "totalPages")]
total_pages: i32,
number: i32,
}
#[derive(Deserialize)]
struct KomgaBook {
name: String,
#[serde(rename = "seriesTitle")]
series_title: String,
metadata: KomgaBookMetadata,
}
#[derive(Deserialize)]
struct KomgaBookMetadata {
title: String,
}
// ─── Request / Response ──────────────────────────────────────────────────────
#[derive(Deserialize, ToSchema)]
pub struct KomgaSyncRequest {
pub url: String,
pub username: String,
pub password: String,
#[schema(value_type = String)]
pub user_id: Uuid,
}
#[derive(Serialize, ToSchema)]
pub struct KomgaSyncResponse {
#[schema(value_type = String)]
pub id: Uuid,
pub komga_url: String,
#[schema(value_type = Option<String>)]
pub user_id: Option<Uuid>,
pub total_komga_read: i64,
pub matched: i64,
pub already_read: i64,
pub newly_marked: i64,
pub matched_books: Vec<String>,
pub newly_marked_books: Vec<String>,
pub unmatched: Vec<String>,
#[schema(value_type = String)]
pub created_at: DateTime<Utc>,
}
#[derive(Serialize, ToSchema)]
pub struct KomgaSyncReportSummary {
#[schema(value_type = String)]
pub id: Uuid,
pub komga_url: String,
#[schema(value_type = Option<String>)]
pub user_id: Option<Uuid>,
pub total_komga_read: i64,
pub matched: i64,
pub already_read: i64,
pub newly_marked: i64,
pub unmatched_count: i32,
#[schema(value_type = String)]
pub created_at: DateTime<Utc>,
}
// ─── Handlers ────────────────────────────────────────────────────────────────
/// Sync read books from a Komga server
#[utoipa::path(
post,
path = "/komga/sync",
tag = "komga",
request_body = KomgaSyncRequest,
responses(
(status = 200, body = KomgaSyncResponse),
(status = 400, description = "Bad request"),
(status = 401, description = "Unauthorized"),
(status = 500, description = "Komga connection or sync error"),
),
security(("Bearer" = []))
)]
pub async fn sync_komga_read_books(
State(state): State<AppState>,
Json(body): Json<KomgaSyncRequest>,
) -> Result<Json<KomgaSyncResponse>, ApiError> {
let url = body.url.trim_end_matches('/').to_string();
if url.is_empty() {
return Err(ApiError::bad_request("url is required"));
}
// Build HTTP client with basic auth
let client = reqwest::Client::builder()
.timeout(std::time::Duration::from_secs(30))
.build()
.map_err(|e| ApiError::internal(format!("failed to build HTTP client: {e}")))?;
// Paginate through all READ books from Komga
let mut komga_books: Vec<(String, String)> = Vec::new(); // (series_title, title)
let mut page = 0;
let page_size = 100;
let max_pages = 500;
loop {
let resp = client
.post(format!("{url}/api/v1/books/list?page={page}&size={page_size}"))
.basic_auth(&body.username, Some(&body.password))
.header("Content-Type", "application/json")
.json(&serde_json::json!({ "condition": { "readStatus": { "operator": "is", "value": "READ" } } }))
.send()
.await
.map_err(|e| ApiError::internal(format!("Komga request failed: {e}")))?;
if !resp.status().is_success() {
let status = resp.status();
let text = resp.text().await.unwrap_or_default();
return Err(ApiError::internal(format!(
"Komga returned {status}: {text}"
)));
}
let data: KomgaBooksResponse = resp
.json()
.await
.map_err(|e| ApiError::internal(format!("Failed to parse Komga response: {e}")))?;
for book in &data.content {
let title = if !book.metadata.title.is_empty() {
&book.metadata.title
} else {
&book.name
};
komga_books.push((book.series_title.clone(), title.clone()));
}
if data.number >= data.total_pages - 1 || page >= max_pages {
break;
}
page += 1;
}
let total_komga_read = komga_books.len() as i64;
// Build local lookup maps
let rows = sqlx::query(
"SELECT id, title, COALESCE(series, '') as series, LOWER(title) as title_lower, LOWER(COALESCE(series, '')) as series_lower FROM books",
)
.fetch_all(&state.pool)
.await?;
type BookEntry = (Uuid, String, String);
// Primary: (series_lower, title_lower) -> Vec<(Uuid, title, series)>
let mut primary_map: HashMap<(String, String), Vec<BookEntry>> = HashMap::new();
// Secondary: title_lower -> Vec<(Uuid, title, series)>
let mut secondary_map: HashMap<String, Vec<BookEntry>> = HashMap::new();
for row in &rows {
let id: Uuid = row.get("id");
let title: String = row.get("title");
let series: String = row.get("series");
let title_lower: String = row.get("title_lower");
let series_lower: String = row.get("series_lower");
let entry = (id, title, series);
primary_map
.entry((series_lower, title_lower.clone()))
.or_default()
.push(entry.clone());
secondary_map.entry(title_lower).or_default().push(entry);
}
// Match Komga books to local books
let mut matched_entries: Vec<(Uuid, String)> = Vec::new(); // (id, display_title)
let mut unmatched: Vec<String> = Vec::new();
for (series_title, title) in &komga_books {
let title_lower = title.to_lowercase();
let series_lower = series_title.to_lowercase();
let found = if let Some(entries) = primary_map.get(&(series_lower.clone(), title_lower.clone())) {
Some(entries)
} else {
secondary_map.get(&title_lower)
};
if let Some(entries) = found {
for (id, local_title, local_series) in entries {
let display = if local_series.is_empty() {
local_title.clone()
} else {
format!("{local_series} - {local_title}")
};
matched_entries.push((*id, display));
}
} else if series_title.is_empty() {
unmatched.push(title.clone());
} else {
unmatched.push(format!("{series_title} - {title}"));
}
}
// Deduplicate by ID
matched_entries.sort_by(|a, b| a.0.cmp(&b.0));
matched_entries.dedup_by(|a, b| a.0 == b.0);
let matched_ids: Vec<Uuid> = matched_entries.iter().map(|(id, _)| *id).collect();
let matched = matched_ids.len() as i64;
let mut already_read: i64 = 0;
let mut already_read_ids: std::collections::HashSet<Uuid> = std::collections::HashSet::new();
if !matched_ids.is_empty() {
// Get already-read book IDs for this user
let ar_rows = sqlx::query(
"SELECT book_id FROM book_reading_progress WHERE book_id = ANY($1) AND user_id = $2 AND status = 'read'",
)
.bind(&matched_ids)
.bind(body.user_id)
.fetch_all(&state.pool)
.await?;
for row in &ar_rows {
already_read_ids.insert(row.get("book_id"));
}
already_read = already_read_ids.len() as i64;
// Bulk upsert all matched books as read for this user
sqlx::query(
r#"
INSERT INTO book_reading_progress (book_id, user_id, status, current_page, last_read_at, updated_at)
SELECT unnest($1::uuid[]), $2, 'read', NULL, NOW(), NOW()
ON CONFLICT (book_id, user_id) DO UPDATE
SET status = 'read',
current_page = NULL,
last_read_at = NOW(),
updated_at = NOW()
WHERE book_reading_progress.status != 'read'
"#,
)
.bind(&matched_ids)
.bind(body.user_id)
.execute(&state.pool)
.await?;
}
let newly_marked = matched - already_read;
// Build matched_books and newly_marked_books lists
let mut newly_marked_books: Vec<String> = Vec::new();
let mut matched_books: Vec<String> = Vec::new();
for (id, title) in &matched_entries {
if !already_read_ids.contains(id) {
newly_marked_books.push(title.clone());
}
matched_books.push(title.clone());
}
// Sort: newly marked first, then alphabetical
let newly_marked_set: std::collections::HashSet<&str> =
newly_marked_books.iter().map(|s| s.as_str()).collect();
matched_books.sort_by(|a, b| {
let a_new = newly_marked_set.contains(a.as_str());
let b_new = newly_marked_set.contains(b.as_str());
b_new.cmp(&a_new).then(a.cmp(b))
});
newly_marked_books.sort();
// Save sync report
let unmatched_json = serde_json::to_value(&unmatched).unwrap_or_default();
let matched_books_json = serde_json::to_value(&matched_books).unwrap_or_default();
let newly_marked_books_json = serde_json::to_value(&newly_marked_books).unwrap_or_default();
let report_row = sqlx::query(
r#"
INSERT INTO komga_sync_reports (komga_url, user_id, total_komga_read, matched, already_read, newly_marked, matched_books, newly_marked_books, unmatched)
VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9)
RETURNING id, created_at
"#,
)
.bind(&url)
.bind(body.user_id)
.bind(total_komga_read)
.bind(matched)
.bind(already_read)
.bind(newly_marked)
.bind(&matched_books_json)
.bind(&newly_marked_books_json)
.bind(&unmatched_json)
.fetch_one(&state.pool)
.await?;
Ok(Json(KomgaSyncResponse {
id: report_row.get("id"),
komga_url: url,
user_id: Some(body.user_id),
total_komga_read,
matched,
already_read,
newly_marked,
matched_books,
newly_marked_books,
unmatched,
created_at: report_row.get("created_at"),
}))
}
/// List Komga sync reports (most recent first)
#[utoipa::path(
get,
path = "/komga/reports",
tag = "komga",
responses(
(status = 200, body = Vec<KomgaSyncReportSummary>),
(status = 401, description = "Unauthorized"),
),
security(("Bearer" = []))
)]
pub async fn list_sync_reports(
State(state): State<AppState>,
) -> Result<Json<Vec<KomgaSyncReportSummary>>, ApiError> {
let rows = sqlx::query(
r#"
SELECT id, komga_url, user_id, total_komga_read, matched, already_read, newly_marked,
jsonb_array_length(unmatched) as unmatched_count, created_at
FROM komga_sync_reports
ORDER BY created_at DESC
LIMIT 20
"#,
)
.fetch_all(&state.pool)
.await?;
let reports: Vec<KomgaSyncReportSummary> = rows
.iter()
.map(|row| KomgaSyncReportSummary {
id: row.get("id"),
komga_url: row.get("komga_url"),
user_id: row.get("user_id"),
total_komga_read: row.get("total_komga_read"),
matched: row.get("matched"),
already_read: row.get("already_read"),
newly_marked: row.get("newly_marked"),
unmatched_count: row.get("unmatched_count"),
created_at: row.get("created_at"),
})
.collect();
Ok(Json(reports))
}
/// Get a specific sync report with full unmatched list
#[utoipa::path(
get,
path = "/komga/reports/{id}",
tag = "komga",
params(("id" = String, Path, description = "Report UUID")),
responses(
(status = 200, body = KomgaSyncResponse),
(status = 404, description = "Report not found"),
(status = 401, description = "Unauthorized"),
),
security(("Bearer" = []))
)]
pub async fn get_sync_report(
State(state): State<AppState>,
axum::extract::Path(id): axum::extract::Path<Uuid>,
) -> Result<Json<KomgaSyncResponse>, ApiError> {
let row = sqlx::query(
r#"
SELECT id, komga_url, user_id, total_komga_read, matched, already_read, newly_marked, matched_books, newly_marked_books, unmatched, created_at
FROM komga_sync_reports
WHERE id = $1
"#,
)
.bind(id)
.fetch_optional(&state.pool)
.await?;
let row = row.ok_or_else(|| ApiError::not_found("report not found"))?;
let matched_books_json: serde_json::Value = row.try_get("matched_books").unwrap_or(serde_json::Value::Array(vec![]));
let matched_books: Vec<String> = serde_json::from_value(matched_books_json).unwrap_or_default();
let newly_marked_books_json: serde_json::Value = row.try_get("newly_marked_books").unwrap_or(serde_json::Value::Array(vec![]));
let newly_marked_books: Vec<String> = serde_json::from_value(newly_marked_books_json).unwrap_or_default();
let unmatched_json: serde_json::Value = row.get("unmatched");
let unmatched: Vec<String> = serde_json::from_value(unmatched_json).unwrap_or_default();
Ok(Json(KomgaSyncResponse {
id: row.get("id"),
komga_url: row.get("komga_url"),
user_id: row.get("user_id"),
total_komga_read: row.get("total_komga_read"),
matched: row.get("matched"),
already_read: row.get("already_read"),
newly_marked: row.get("newly_marked"),
matched_books,
newly_marked_books,
unmatched,
created_at: row.get("created_at"),
}))
}
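
The sync handler's two-tier matching (exact lowercased `(series, title)` first, then a title-only fallback) can be sketched as a standalone function. `u32` ids stand in for `uuid::Uuid` here, an assumption for illustration:

```rust
use std::collections::HashMap;

// Two-tier lookup mirroring sync_komga_read_books:
// 1) exact (series_lower, title_lower) match in the primary map,
// 2) fall back to title_lower alone in the secondary map.
// A fallback hit may return several candidate books.
fn match_book(
    primary: &HashMap<(String, String), Vec<u32>>,
    secondary: &HashMap<String, Vec<u32>>,
    series: &str,
    title: &str,
) -> Option<Vec<u32>> {
    let key = (series.to_lowercase(), title.to_lowercase());
    if let Some(ids) = primary.get(&key) {
        return Some(ids.clone());
    }
    secondary.get(&key.1).cloned()
}

fn main() {
    let mut primary = HashMap::new();
    primary.insert(("watchmen".to_string(), "chapter i".to_string()), vec![1u32]);
    let secondary = HashMap::from([("chapter i".to_string(), vec![1u32, 2])]);

    // Series + title matches take priority and stay unambiguous.
    assert_eq!(match_book(&primary, &secondary, "Watchmen", "Chapter I"), Some(vec![1]));
    // Unknown series falls back to title-only matching.
    assert_eq!(match_book(&primary, &secondary, "Unknown", "Chapter I"), Some(vec![1, 2]));
    // No match at either tier.
    assert_eq!(match_book(&primary, &secondary, "X", "Y"), None);
}
```

The dedup-by-id step in the real handler exists precisely because the title-only tier can match the same local book several times.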


@@ -21,6 +21,15 @@ pub struct LibraryResponse {
#[schema(value_type = Option<String>)]
pub next_scan_at: Option<chrono::DateTime<chrono::Utc>>,
pub watcher_enabled: bool,
pub metadata_provider: Option<String>,
pub fallback_metadata_provider: Option<String>,
pub metadata_refresh_mode: String,
#[schema(value_type = Option<String>)]
pub next_metadata_refresh_at: Option<chrono::DateTime<chrono::Utc>>,
pub series_count: i64,
/// First book IDs from up to 5 distinct series (for thumbnail fan display)
#[schema(value_type = Vec<String>)]
pub thumbnail_book_ids: Vec<Uuid>,
}
#[derive(Deserialize, ToSchema)]
@@ -39,14 +48,27 @@ pub struct CreateLibraryRequest {
responses(
(status = 200, body = Vec<LibraryResponse>),
(status = 401, description = "Unauthorized"),
(status = 403, description = "Forbidden - Admin scope required"),
),
security(("Bearer" = []))
)]
pub async fn list_libraries(State(state): State<AppState>) -> Result<Json<Vec<LibraryResponse>>, ApiError> {
let rows = sqlx::query(
"SELECT l.id, l.name, l.root_path, l.enabled, l.monitor_enabled, l.scan_mode, l.next_scan_at, l.watcher_enabled,
(SELECT COUNT(*) FROM books b WHERE b.library_id = l.id) as book_count
"SELECT l.id, l.name, l.root_path, l.enabled, l.monitor_enabled, l.scan_mode, l.next_scan_at, l.watcher_enabled, l.metadata_provider, l.fallback_metadata_provider, l.metadata_refresh_mode, l.next_metadata_refresh_at,
(SELECT COUNT(*) FROM books b WHERE b.library_id = l.id) as book_count,
(SELECT COUNT(DISTINCT COALESCE(NULLIF(b.series, ''), 'unclassified')) FROM books b WHERE b.library_id = l.id) as series_count,
COALESCE((
SELECT ARRAY_AGG(first_id ORDER BY series_name)
FROM (
SELECT DISTINCT ON (COALESCE(NULLIF(b.series, ''), 'unclassified'))
COALESCE(NULLIF(b.series, ''), 'unclassified') as series_name,
b.id as first_id
FROM books b
WHERE b.library_id = l.id
ORDER BY COALESCE(NULLIF(b.series, ''), 'unclassified'),
b.volume NULLS LAST, b.title ASC
LIMIT 5
) sub
), ARRAY[]::uuid[]) as thumbnail_book_ids
FROM libraries l ORDER BY l.created_at DESC"
)
.fetch_all(&state.pool)
@@ -60,10 +82,16 @@ pub async fn list_libraries(State(state): State<AppState>) -> Result<Json<Vec<Li
root_path: row.get("root_path"),
enabled: row.get("enabled"),
book_count: row.get("book_count"),
series_count: row.get("series_count"),
monitor_enabled: row.get("monitor_enabled"),
scan_mode: row.get("scan_mode"),
next_scan_at: row.get("next_scan_at"),
watcher_enabled: row.get("watcher_enabled"),
metadata_provider: row.get("metadata_provider"),
fallback_metadata_provider: row.get("fallback_metadata_provider"),
metadata_refresh_mode: row.get("metadata_refresh_mode"),
next_metadata_refresh_at: row.get("next_metadata_refresh_at"),
thumbnail_book_ids: row.get("thumbnail_book_ids"),
})
.collect();
@@ -111,10 +139,16 @@ pub async fn create_library(
root_path,
enabled: true,
book_count: 0,
series_count: 0,
monitor_enabled: false,
scan_mode: "manual".to_string(),
next_scan_at: None,
watcher_enabled: false,
metadata_provider: None,
fallback_metadata_provider: None,
metadata_refresh_mode: "manual".to_string(),
next_metadata_refresh_at: None,
thumbnail_book_ids: vec![],
}))
}
@@ -186,7 +220,6 @@ use crate::index_jobs::{IndexJobResponse, RebuildRequest};
(status = 200, body = IndexJobResponse),
(status = 404, description = "Library not found"),
(status = 401, description = "Unauthorized"),
(status = 403, description = "Forbidden - Admin scope required"),
),
security(("Bearer" = []))
)]
@@ -206,7 +239,8 @@ pub async fn scan_library(
}
let is_full = payload.as_ref().and_then(|p| p.full).unwrap_or(false);
let job_type = if is_full { "full_rebuild" } else { "rebuild" };
let is_rescan = payload.as_ref().and_then(|p| p.rescan).unwrap_or(false);
let job_type = if is_full { "full_rebuild" } else if is_rescan { "rescan" } else { "rebuild" };
// Create indexing job for this library
let job_id = Uuid::new_v4();
@@ -235,6 +269,8 @@ pub struct UpdateMonitoringRequest {
#[schema(value_type = String, example = "hourly")]
pub scan_mode: String, // 'manual', 'hourly', 'daily', 'weekly'
pub watcher_enabled: Option<bool>,
#[schema(value_type = Option<String>, example = "daily")]
pub metadata_refresh_mode: Option<String>, // 'manual', 'hourly', 'daily', 'weekly'
}
/// Update monitoring settings for a library
@@ -265,6 +301,12 @@ pub async fn update_monitoring(
return Err(ApiError::bad_request("scan_mode must be one of: manual, hourly, daily, weekly"));
}
// Validate metadata_refresh_mode
let metadata_refresh_mode = input.metadata_refresh_mode.as_deref().unwrap_or("manual");
if !valid_modes.contains(&metadata_refresh_mode) {
return Err(ApiError::bad_request("metadata_refresh_mode must be one of: manual, hourly, daily, weekly"));
}
// Calculate next_scan_at if monitoring is enabled
let next_scan_at = if input.monitor_enabled {
let interval_minutes = match input.scan_mode.as_str() {
@@ -278,16 +320,31 @@ pub async fn update_monitoring(
None
};
// Calculate next_metadata_refresh_at
let next_metadata_refresh_at = if metadata_refresh_mode != "manual" {
let interval_minutes = match metadata_refresh_mode {
"hourly" => 60,
"daily" => 1440,
"weekly" => 10080,
_ => 1440,
};
Some(chrono::Utc::now() + chrono::Duration::minutes(interval_minutes))
} else {
None
};
let watcher_enabled = input.watcher_enabled.unwrap_or(false);
let result = sqlx::query(
"UPDATE libraries SET monitor_enabled = $2, scan_mode = $3, next_scan_at = $4, watcher_enabled = $5 WHERE id = $1 RETURNING id, name, root_path, enabled, monitor_enabled, scan_mode, next_scan_at, watcher_enabled"
"UPDATE libraries SET monitor_enabled = $2, scan_mode = $3, next_scan_at = $4, watcher_enabled = $5, metadata_refresh_mode = $6, next_metadata_refresh_at = $7 WHERE id = $1 RETURNING id, name, root_path, enabled, monitor_enabled, scan_mode, next_scan_at, watcher_enabled, metadata_provider, fallback_metadata_provider, metadata_refresh_mode, next_metadata_refresh_at"
)
.bind(library_id)
.bind(input.monitor_enabled)
.bind(input.scan_mode)
.bind(next_scan_at)
.bind(watcher_enabled)
.bind(metadata_refresh_mode)
.bind(next_metadata_refresh_at)
.fetch_optional(&state.pool)
.await?;
@@ -300,15 +357,121 @@ pub async fn update_monitoring(
.fetch_one(&state.pool)
.await?;
let series_count: i64 = sqlx::query_scalar("SELECT COUNT(DISTINCT COALESCE(NULLIF(series, ''), 'unclassified')) FROM books WHERE library_id = $1")
.bind(library_id)
.fetch_one(&state.pool)
.await?;
let thumbnail_book_ids: Vec<Uuid> = sqlx::query_scalar(
"SELECT b.id FROM books b
WHERE b.library_id = $1
ORDER BY COALESCE(NULLIF(b.series, ''), 'unclassified'), b.volume NULLS LAST, b.title ASC
LIMIT 5"
)
.bind(library_id)
.fetch_all(&state.pool)
.await
.unwrap_or_default();
Ok(Json(LibraryResponse {
id: row.get("id"),
name: row.get("name"),
root_path: row.get("root_path"),
enabled: row.get("enabled"),
book_count,
series_count,
monitor_enabled: row.get("monitor_enabled"),
scan_mode: row.get("scan_mode"),
next_scan_at: row.get("next_scan_at"),
watcher_enabled: row.get("watcher_enabled"),
metadata_provider: row.get("metadata_provider"),
fallback_metadata_provider: row.get("fallback_metadata_provider"),
metadata_refresh_mode: row.get("metadata_refresh_mode"),
next_metadata_refresh_at: row.get("next_metadata_refresh_at"),
thumbnail_book_ids,
}))
}
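The mode-to-interval mapping used for `next_scan_at` and `next_metadata_refresh_at` above can be exercised in isolation. This is a minimal sketch; the `interval_for_mode` name is an illustrative assumption, not a function in the codebase:

```rust
// Illustrative helper mirroring the match arms in update_monitoring:
// maps a refresh/scan mode to an interval in minutes, defaulting to daily.
fn interval_for_mode(mode: &str) -> i64 {
    match mode {
        "hourly" => 60,
        "daily" => 1440,
        "weekly" => 10080,
        _ => 1440, // unknown modes fall back to daily, matching the handler
    }
}

fn main() {
    assert_eq!(interval_for_mode("weekly"), 10080);
    assert_eq!(interval_for_mode("bogus"), 1440);
}
```

In the handler itself, a mode of `"manual"` short-circuits to `None` before this mapping is consulted.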
#[derive(Deserialize, ToSchema)]
pub struct UpdateMetadataProviderRequest {
pub metadata_provider: Option<String>,
pub fallback_metadata_provider: Option<String>,
}
/// Update the metadata provider for a library
#[utoipa::path(
patch,
path = "/libraries/{id}/metadata-provider",
tag = "libraries",
params(
("id" = String, Path, description = "Library UUID"),
),
request_body = UpdateMetadataProviderRequest,
responses(
(status = 200, body = LibraryResponse),
(status = 404, description = "Library not found"),
(status = 401, description = "Unauthorized"),
(status = 403, description = "Forbidden - Admin scope required"),
),
security(("Bearer" = []))
)]
pub async fn update_metadata_provider(
State(state): State<AppState>,
AxumPath(library_id): AxumPath<Uuid>,
Json(input): Json<UpdateMetadataProviderRequest>,
) -> Result<Json<LibraryResponse>, ApiError> {
let provider = input.metadata_provider.as_deref().filter(|s| !s.is_empty());
let fallback = input.fallback_metadata_provider.as_deref().filter(|s| !s.is_empty());
let result = sqlx::query(
"UPDATE libraries SET metadata_provider = $2, fallback_metadata_provider = $3 WHERE id = $1 RETURNING id, name, root_path, enabled, monitor_enabled, scan_mode, next_scan_at, watcher_enabled, metadata_provider, fallback_metadata_provider, metadata_refresh_mode, next_metadata_refresh_at"
)
.bind(library_id)
.bind(provider)
.bind(fallback)
.fetch_optional(&state.pool)
.await?;
let Some(row) = result else {
return Err(ApiError::not_found("library not found"));
};
let book_count: i64 = sqlx::query_scalar("SELECT COUNT(*) FROM books WHERE library_id = $1")
.bind(library_id)
.fetch_one(&state.pool)
.await?;
let series_count: i64 = sqlx::query_scalar("SELECT COUNT(DISTINCT COALESCE(NULLIF(series, ''), 'unclassified')) FROM books WHERE library_id = $1")
.bind(library_id)
.fetch_one(&state.pool)
.await?;
let thumbnail_book_ids: Vec<Uuid> = sqlx::query_scalar(
"SELECT b.id FROM books b
WHERE b.library_id = $1
ORDER BY COALESCE(NULLIF(b.series, ''), 'unclassified'), b.volume NULLS LAST, b.title ASC
LIMIT 5"
)
.bind(library_id)
.fetch_all(&state.pool)
.await
.unwrap_or_default();
Ok(Json(LibraryResponse {
id: row.get("id"),
name: row.get("name"),
root_path: row.get("root_path"),
enabled: row.get("enabled"),
book_count,
series_count,
monitor_enabled: row.get("monitor_enabled"),
scan_mode: row.get("scan_mode"),
next_scan_at: row.get("next_scan_at"),
watcher_enabled: row.get("watcher_enabled"),
metadata_provider: row.get("metadata_provider"),
fallback_metadata_provider: row.get("fallback_metadata_provider"),
metadata_refresh_mode: row.get("metadata_refresh_mode"),
next_metadata_refresh_at: row.get("next_metadata_refresh_at"),
thumbnail_book_ids,
}))
}


@@ -1,19 +1,31 @@
mod auth;
mod authors;
mod books;
mod error;
mod handlers;
mod index_jobs;
mod job_poller;
mod komga;
mod libraries;
mod metadata;
mod metadata_batch;
mod metadata_refresh;
mod metadata_providers;
mod api_middleware;
mod openapi;
mod pages;
mod prowlarr;
mod qbittorrent;
mod reading_progress;
mod search;
mod series;
mod settings;
mod state;
mod stats;
mod telegram;
mod thumbnails;
mod tokens;
mod users;
use std::sync::Arc;
use std::time::Instant;
@@ -67,8 +79,6 @@ async fn main() -> anyhow::Result<()> {
let state = AppState {
pool,
bootstrap_token: Arc::from(config.api_bootstrap_token),
meili_url: Arc::from(config.meili_url),
meili_master_key: Arc::from(config.meili_master_key),
page_cache: Arc::new(Mutex::new(LruCache::new(NonZeroUsize::new(512).expect("non-zero")))),
page_render_limit: Arc::new(Semaphore::new(concurrent_renders)),
metrics: Arc::new(Metrics::new()),
@@ -80,13 +90,13 @@ async fn main() -> anyhow::Result<()> {
};
let admin_routes = Router::new()
.route("/libraries", get(libraries::list_libraries).post(libraries::create_library))
.route("/libraries", axum::routing::post(libraries::create_library))
.route("/libraries/:id", delete(libraries::delete_library))
.route("/libraries/:id/scan", axum::routing::post(libraries::scan_library))
.route("/libraries/:id/monitoring", axum::routing::patch(libraries::update_monitoring))
.route("/libraries/:id/metadata-provider", axum::routing::patch(libraries::update_metadata_provider))
.route("/books/:id", axum::routing::patch(books::update_book))
.route("/books/:id/convert", axum::routing::post(books::convert_book))
.route("/libraries/:library_id/series/:name", axum::routing::patch(books::update_series))
.route("/libraries/:library_id/series/:name", axum::routing::patch(series::update_series))
.route("/index/rebuild", axum::routing::post(index_jobs::enqueue_rebuild))
.route("/index/thumbnails/rebuild", axum::routing::post(thumbnails::start_thumbnails_rebuild))
.route("/index/thumbnails/regenerate", axum::routing::post(thumbnails::start_thumbnails_regenerate))
@@ -97,9 +107,31 @@ async fn main() -> anyhow::Result<()> {
.route("/index/jobs/:id/errors", get(index_jobs::get_job_errors))
.route("/index/cancel/:id", axum::routing::post(index_jobs::cancel_job))
.route("/folders", get(index_jobs::list_folders))
.route("/admin/users", get(users::list_users).post(users::create_user))
.route("/admin/users/:id", delete(users::delete_user).patch(users::update_user))
.route("/admin/tokens", get(tokens::list_tokens).post(tokens::create_token))
.route("/admin/tokens/:id", delete(tokens::revoke_token))
.route("/admin/tokens/:id", delete(tokens::revoke_token).patch(tokens::update_token))
.route("/admin/tokens/:id/delete", axum::routing::post(tokens::delete_token))
.route("/prowlarr/search", axum::routing::post(prowlarr::search_prowlarr))
.route("/prowlarr/test", get(prowlarr::test_prowlarr))
.route("/qbittorrent/add", axum::routing::post(qbittorrent::add_torrent))
.route("/qbittorrent/test", get(qbittorrent::test_qbittorrent))
.route("/telegram/test", get(telegram::test_telegram))
.route("/komga/sync", axum::routing::post(komga::sync_komga_read_books))
.route("/komga/reports", get(komga::list_sync_reports))
.route("/komga/reports/:id", get(komga::get_sync_report))
.route("/metadata/search", axum::routing::post(metadata::search_metadata))
.route("/metadata/match", axum::routing::post(metadata::create_metadata_match))
.route("/metadata/approve/:id", axum::routing::post(metadata::approve_metadata))
.route("/metadata/reject/:id", axum::routing::post(metadata::reject_metadata))
.route("/metadata/links", get(metadata::get_metadata_links))
.route("/metadata/missing/:id", get(metadata::get_missing_books))
.route("/metadata/links/:id", delete(metadata::delete_metadata_link))
.route("/metadata/batch", axum::routing::post(metadata_batch::start_batch))
.route("/metadata/batch/:id/report", get(metadata_batch::get_batch_report))
.route("/metadata/batch/:id/results", get(metadata_batch::get_batch_results))
.route("/metadata/refresh", axum::routing::post(metadata_refresh::start_refresh))
.route("/metadata/refresh/:id/report", get(metadata_refresh::get_refresh_report))
.merge(settings::settings_routes())
.route_layer(middleware::from_fn_with_state(
state.clone(),
@@ -107,17 +139,22 @@ async fn main() -> anyhow::Result<()> {
));
let read_routes = Router::new()
.route("/libraries", get(libraries::list_libraries))
.route("/libraries/:id/scan", axum::routing::post(libraries::scan_library))
.route("/books", get(books::list_books))
.route("/books/ongoing", get(books::ongoing_books))
.route("/books/ongoing", get(series::ongoing_books))
.route("/books/:id", get(books::get_book))
.route("/books/:id/thumbnail", get(books::get_thumbnail))
.route("/books/:id/pages/:n", get(pages::get_page))
.route("/books/:id/progress", get(reading_progress::get_reading_progress).patch(reading_progress::update_reading_progress))
.route("/libraries/:library_id/series", get(books::list_series))
.route("/libraries/:library_id/series/:name/metadata", get(books::get_series_metadata))
.route("/series", get(books::list_all_series))
.route("/series/ongoing", get(books::ongoing_series))
.route("/libraries/:library_id/series", get(series::list_series))
.route("/libraries/:library_id/series/:name/metadata", get(series::get_series_metadata))
.route("/series", get(series::list_all_series))
.route("/series/ongoing", get(series::ongoing_series))
.route("/series/statuses", get(series::series_statuses))
.route("/series/provider-statuses", get(series::provider_statuses))
.route("/series/mark-read", axum::routing::post(reading_progress::mark_series_read))
.route("/authors", get(authors::list_authors))
.route("/stats", get(stats::get_stats))
.route("/search", get(search::search_books))
.route_layer(middleware::from_fn_with_state(state.clone(), api_middleware::read_rate_limit))
@@ -126,6 +163,9 @@ async fn main() -> anyhow::Result<()> {
auth::require_read,
));
// Clone pool before state is moved into the router
let poller_pool = state.pool.clone();
let app = Router::new()
.route("/health", get(handlers::health))
.route("/ready", get(handlers::ready))
@@ -137,6 +177,11 @@ async fn main() -> anyhow::Result<()> {
.layer(middleware::from_fn_with_state(state.clone(), api_middleware::request_counter))
.with_state(state);
// Start background poller for API-only jobs (metadata_batch, metadata_refresh)
tokio::spawn(async move {
job_poller::run_job_poller(poller_pool, 5).await;
});
let listener = tokio::net::TcpListener::bind(&config.listen_addr).await?;
info!(addr = %config.listen_addr, "api listening");
axum::serve(listener, app).await?;

apps/api/src/metadata.rs (new file, 1097 lines)

File diff suppressed because it is too large

@@ -0,0 +1,342 @@
use super::{BookCandidate, MetadataProvider, ProviderConfig, SeriesCandidate};
pub struct AniListProvider;
impl MetadataProvider for AniListProvider {
fn name(&self) -> &str {
"anilist"
}
fn search_series(
&self,
query: &str,
config: &ProviderConfig,
) -> std::pin::Pin<
Box<dyn std::future::Future<Output = Result<Vec<SeriesCandidate>, String>> + Send + '_>,
> {
let query = query.to_string();
let config = config.clone();
Box::pin(async move { search_series_impl(&query, &config).await })
}
fn get_series_books(
&self,
external_id: &str,
config: &ProviderConfig,
) -> std::pin::Pin<
Box<dyn std::future::Future<Output = Result<Vec<BookCandidate>, String>> + Send + '_>,
> {
let external_id = external_id.to_string();
let config = config.clone();
Box::pin(async move { get_series_books_impl(&external_id, &config).await })
}
}
const SEARCH_QUERY: &str = r#"
query ($search: String) {
Page(perPage: 20) {
media(search: $search, type: MANGA, sort: SEARCH_MATCH) {
id
title { romaji english native }
description(asHtml: false)
coverImage { large medium }
startDate { year }
status
volumes
chapters
staff { edges { node { name { full } } role } }
siteUrl
genres
}
}
}
"#;
const DETAIL_QUERY: &str = r#"
query ($id: Int) {
Media(id: $id, type: MANGA) {
id
title { romaji english native }
description(asHtml: false)
coverImage { large medium }
startDate { year }
status
volumes
chapters
staff { edges { node { name { full } } role } }
siteUrl
genres
}
}
"#;
async fn graphql_request(
client: &reqwest::Client,
query: &str,
variables: serde_json::Value,
) -> Result<serde_json::Value, String> {
let resp = client
.post("https://graphql.anilist.co")
.header("Content-Type", "application/json")
.json(&serde_json::json!({
"query": query,
"variables": variables,
}))
.send()
.await
.map_err(|e| format!("AniList request failed: {e}"))?;
if !resp.status().is_success() {
let status = resp.status();
let text = resp.text().await.unwrap_or_default();
return Err(format!("AniList returned {status}: {text}"));
}
resp.json()
.await
.map_err(|e| format!("Failed to parse AniList response: {e}"))
}
async fn search_series_impl(
query: &str,
_config: &ProviderConfig,
) -> Result<Vec<SeriesCandidate>, String> {
let client = reqwest::Client::builder()
.timeout(std::time::Duration::from_secs(15))
.build()
.map_err(|e| format!("failed to build HTTP client: {e}"))?;
let data = graphql_request(
&client,
SEARCH_QUERY,
serde_json::json!({ "search": query }),
)
.await?;
let media = match data
.get("data")
.and_then(|d| d.get("Page"))
.and_then(|p| p.get("media"))
.and_then(|m| m.as_array())
{
Some(media) => media,
None => return Ok(vec![]),
};
let query_lower = query.to_lowercase();
let mut candidates: Vec<SeriesCandidate> = media
.iter()
.filter_map(|m| {
let id = m.get("id").and_then(|id| id.as_i64())?;
let title_obj = m.get("title")?;
let title = title_obj
.get("english")
.and_then(|t| t.as_str())
.or_else(|| title_obj.get("romaji").and_then(|t| t.as_str()))?
.to_string();
let description = m
.get("description")
.and_then(|d| d.as_str())
.map(|d| d.replace("\\n", "\n").trim().to_string())
.filter(|d| !d.is_empty());
let cover_url = m
.get("coverImage")
.and_then(|ci| ci.get("large").or_else(|| ci.get("medium")))
.and_then(|u| u.as_str())
.map(String::from);
let start_year = m
.get("startDate")
.and_then(|sd| sd.get("year"))
.and_then(|y| y.as_i64())
.map(|y| y as i32);
let volumes = m
.get("volumes")
.and_then(|v| v.as_i64())
.map(|v| v as i32);
let chapters = m
.get("chapters")
.and_then(|v| v.as_i64())
.map(|v| v as i32);
let status = m
.get("status")
.and_then(|s| s.as_str())
.unwrap_or("UNKNOWN")
.to_string();
let site_url = m
.get("siteUrl")
.and_then(|u| u.as_str())
.map(String::from);
let authors = extract_authors(m);
let confidence = compute_confidence(&title, &query_lower);
// Use volumes if known, otherwise fall back to chapters count
let (total_volumes, volume_source) = match volumes {
Some(v) => (Some(v), "volumes"),
None => match chapters {
Some(c) => (Some(c), "chapters"),
None => (None, "unknown"),
},
};
Some(SeriesCandidate {
external_id: id.to_string(),
title,
authors,
description,
publishers: vec![],
start_year,
total_volumes,
cover_url,
external_url: site_url,
confidence,
metadata_json: serde_json::json!({
"status": status,
"chapters": chapters,
"volumes": volumes,
"volume_source": volume_source,
}),
})
})
.collect();
candidates.sort_by(|a, b| b.confidence.partial_cmp(&a.confidence).unwrap_or(std::cmp::Ordering::Equal));
candidates.truncate(10);
Ok(candidates)
}
async fn get_series_books_impl(
external_id: &str,
_config: &ProviderConfig,
) -> Result<Vec<BookCandidate>, String> {
let id: i64 = external_id
.parse()
.map_err(|_| "invalid AniList ID".to_string())?;
let client = reqwest::Client::builder()
.timeout(std::time::Duration::from_secs(15))
.build()
.map_err(|e| format!("failed to build HTTP client: {e}"))?;
let data = graphql_request(
&client,
DETAIL_QUERY,
serde_json::json!({ "id": id }),
)
.await?;
let media = match data.get("data").and_then(|d| d.get("Media")) {
Some(m) => m,
None => return Ok(vec![]),
};
let title_obj = media.get("title").cloned().unwrap_or(serde_json::json!({}));
let title = title_obj
.get("english")
.and_then(|t| t.as_str())
.or_else(|| title_obj.get("romaji").and_then(|t| t.as_str()))
.unwrap_or("")
.to_string();
let volumes = media
.get("volumes")
.and_then(|v| v.as_i64())
.map(|v| v as i32);
let chapters = media
.get("chapters")
.and_then(|v| v.as_i64())
.map(|v| v as i32);
// Use volumes if known, otherwise fall back to chapters count
let total = volumes.or(chapters);
let cover_url = media
.get("coverImage")
.and_then(|ci| ci.get("large").or_else(|| ci.get("medium")))
.and_then(|u| u.as_str())
.map(String::from);
let description = media
.get("description")
.and_then(|d| d.as_str())
.map(|d| d.replace("\\n", "\n").trim().to_string());
let authors = extract_authors(media);
// AniList doesn't have per-volume data — generate entries from volumes count (or chapters as fallback)
let mut books = Vec::new();
if let Some(total) = total {
for vol in 1..=total {
books.push(BookCandidate {
external_book_id: format!("{}-vol-{}", external_id, vol),
title: format!("{} Vol. {}", title, vol),
volume_number: Some(vol),
authors: authors.clone(),
isbn: None,
summary: if vol == 1 { description.clone() } else { None },
cover_url: if vol == 1 { cover_url.clone() } else { None },
page_count: None,
language: Some("ja".to_string()),
publish_date: None,
metadata_json: serde_json::json!({}),
});
}
}
Ok(books)
}
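The per-volume fan-out above (one synthetic entry per volume, with the series-level cover and summary attached only to volume 1) can be sketched with plain tuples. The `fan_out` helper and the tuple shape are illustrative assumptions; the real code builds `BookCandidate` values:

```rust
// Sketch of the AniList per-volume fan-out: AniList has no per-volume data,
// so entries are synthesized from the total volume (or chapter) count.
// The bool marks whether the entry carries the series cover/summary (vol. 1 only).
fn fan_out(external_id: &str, title: &str, total: i32) -> Vec<(String, String, bool)> {
    (1..=total)
        .map(|vol| {
            (
                format!("{}-vol-{}", external_id, vol),
                format!("{} Vol. {}", title, vol),
                vol == 1,
            )
        })
        .collect()
}

fn main() {
    let books = fan_out("12345", "Example Series", 3);
    assert_eq!(books.len(), 3);
    assert_eq!(books[0].0, "12345-vol-1");
    assert!(books[0].2 && !books[1].2);
}
```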
fn extract_authors(media: &serde_json::Value) -> Vec<String> {
let mut authors = Vec::new();
if let Some(edges) = media
.get("staff")
.and_then(|s| s.get("edges"))
.and_then(|e| e.as_array())
{
for edge in edges {
let role = edge
.get("role")
.and_then(|r| r.as_str())
.unwrap_or("");
let role_lower = role.to_lowercase();
if role_lower.contains("story") || role_lower.contains("art") || role_lower.contains("original") {
if let Some(name) = edge
.get("node")
.and_then(|n| n.get("name"))
.and_then(|n| n.get("full"))
.and_then(|f| f.as_str())
{
if !authors.contains(&name.to_string()) {
authors.push(name.to_string());
}
}
}
}
}
authors
}
fn compute_confidence(title: &str, query: &str) -> f32 {
let title_lower = title.to_lowercase();
if title_lower == query {
1.0
} else if title_lower.starts_with(query) || query.starts_with(&title_lower) {
0.8
} else if title_lower.contains(query) || query.contains(&title_lower) {
0.7
} else {
let common: usize = query.chars().filter(|c| title_lower.contains(*c)).count();
let max_len = query.len().max(title_lower.len()).max(1);
(common as f32 / max_len as f32).clamp(0.1, 0.6)
}
}
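The confidence heuristic can be exercised standalone (a verbatim copy for illustration; note that callers pass `query` already lowercased):

```rust
// Verbatim copy of the AniList title-match confidence heuristic.
// Exact match > prefix match > substring match > character-overlap fallback.
fn compute_confidence(title: &str, query: &str) -> f32 {
    let title_lower = title.to_lowercase();
    if title_lower == query {
        1.0
    } else if title_lower.starts_with(query) || query.starts_with(&title_lower) {
        0.8
    } else if title_lower.contains(query) || query.contains(&title_lower) {
        0.7
    } else {
        let common: usize = query.chars().filter(|c| title_lower.contains(*c)).count();
        let max_len = query.len().max(title_lower.len()).max(1);
        (common as f32 / max_len as f32).clamp(0.1, 0.6)
    }
}

fn main() {
    assert_eq!(compute_confidence("Blacksad", "blacksad"), 1.0);
    assert_eq!(compute_confidence("Blacksad Integrale", "blacksad"), 0.8);
    assert_eq!(compute_confidence("The Blacksad", "blacksad"), 0.7);
}
```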


@@ -0,0 +1,671 @@
use scraper::{Html, Selector};
use super::{BookCandidate, MetadataProvider, ProviderConfig, SeriesCandidate};
pub struct BedethequeProvider;
impl MetadataProvider for BedethequeProvider {
fn name(&self) -> &str {
"bedetheque"
}
fn search_series(
&self,
query: &str,
config: &ProviderConfig,
) -> std::pin::Pin<
Box<dyn std::future::Future<Output = Result<Vec<SeriesCandidate>, String>> + Send + '_>,
> {
let query = query.to_string();
let config = config.clone();
Box::pin(async move { search_series_impl(&query, &config).await })
}
fn get_series_books(
&self,
external_id: &str,
config: &ProviderConfig,
) -> std::pin::Pin<
Box<dyn std::future::Future<Output = Result<Vec<BookCandidate>, String>> + Send + '_>,
> {
let external_id = external_id.to_string();
let config = config.clone();
Box::pin(async move { get_series_books_impl(&external_id, &config).await })
}
}
fn build_client() -> Result<reqwest::Client, String> {
reqwest::Client::builder()
.timeout(std::time::Duration::from_secs(20))
.user_agent("Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:108.0) Gecko/20100101 Firefox/108.0")
.default_headers({
let mut h = reqwest::header::HeaderMap::new();
h.insert(
reqwest::header::ACCEPT,
"text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"
.parse()
.unwrap(),
);
h.insert(
reqwest::header::ACCEPT_LANGUAGE,
"fr-FR,fr;q=0.9,en;q=0.5".parse().unwrap(),
);
h.insert(reqwest::header::REFERER, "https://www.bedetheque.com/".parse().unwrap());
h
})
.build()
.map_err(|e| format!("failed to build HTTP client: {e}"))
}
/// Remove diacritics for URL construction (bedetheque uses ASCII slugs)
fn normalize_for_url(s: &str) -> String {
s.chars()
.map(|c| match c {
'é' | 'è' | 'ê' | 'ë' | 'É' | 'È' | 'Ê' | 'Ë' => 'e',
'à' | 'â' | 'ä' | 'À' | 'Â' | 'Ä' => 'a',
'ù' | 'û' | 'ü' | 'Ù' | 'Û' | 'Ü' => 'u',
'ô' | 'ö' | 'Ô' | 'Ö' => 'o',
'î' | 'ï' | 'Î' | 'Ï' => 'i',
'ç' | 'Ç' => 'c',
'ñ' | 'Ñ' => 'n',
_ => c,
})
.collect()
}
fn urlencoded(s: &str) -> String {
let mut result = String::new();
for byte in s.bytes() {
match byte {
b'A'..=b'Z' | b'a'..=b'z' | b'0'..=b'9' | b'-' | b'_' | b'.' | b'~' => {
result.push(byte as char);
}
b' ' => result.push('+'),
_ => result.push_str(&format!("%{:02X}", byte)),
}
}
result
}
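The two URL helpers compose as in `search_series_impl`: diacritics are stripped first, then the result is form-encoded. A standalone sketch, with an abridged diacritic table for brevity:

```rust
// Abridged copy of the diacritic stripper (bedetheque uses ASCII slugs).
fn normalize_for_url(s: &str) -> String {
    s.chars()
        .map(|c| match c {
            'é' | 'è' | 'ê' | 'ë' => 'e',
            'à' | 'â' | 'ä' => 'a',
            'ç' => 'c',
            _ => c,
        })
        .collect()
}

// Verbatim copy of the form-style percent-encoder: unreserved bytes pass
// through, spaces become '+', everything else is %XX-escaped.
fn urlencoded(s: &str) -> String {
    let mut result = String::new();
    for byte in s.bytes() {
        match byte {
            b'A'..=b'Z' | b'a'..=b'z' | b'0'..=b'9' | b'-' | b'_' | b'.' | b'~' => {
                result.push(byte as char)
            }
            b' ' => result.push('+'),
            _ => result.push_str(&format!("%{:02X}", byte)),
        }
    }
    result
}

fn main() {
    assert_eq!(urlencoded(&normalize_for_url("Astérix & Obélix")), "Asterix+%26+Obelix");
}
```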
// ---------------------------------------------------------------------------
// Search
// ---------------------------------------------------------------------------
async fn search_series_impl(
query: &str,
_config: &ProviderConfig,
) -> Result<Vec<SeriesCandidate>, String> {
let client = build_client()?;
// Use the full-text search page
let url = format!(
"https://www.bedetheque.com/search/tout?RechTexte={}&RechWhere=0",
urlencoded(&normalize_for_url(query))
);
let resp = client
.get(&url)
.send()
.await
.map_err(|e| format!("Bedetheque request failed: {e}"))?;
if !resp.status().is_success() {
let status = resp.status();
return Err(format!("Bedetheque returned {status}"));
}
let html = resp
.text()
.await
.map_err(|e| format!("Failed to read Bedetheque response: {e}"))?;
// Detect IP blacklist
if html.contains("<title></title>") || html.contains("<title> </title>") {
return Err("Bedetheque: IP may be rate-limited, please retry later".to_string());
}
// Parse HTML in a block so the non-Send Html type is dropped before any .await
let candidates = {
let document = Html::parse_document(&html);
let link_sel =
Selector::parse("a[href*='/serie-']").map_err(|e| format!("selector error: {e}"))?;
let query_lower = query.to_lowercase();
let mut seen = std::collections::HashSet::new();
let mut candidates = Vec::new();
for el in document.select(&link_sel) {
let href = match el.value().attr("href") {
Some(h) => h.to_string(),
None => continue,
};
let (series_id, _slug) = match parse_serie_href(&href) {
Some(v) => v,
None => continue,
};
if !seen.insert(series_id.clone()) {
continue;
}
let title = el.text().collect::<String>().trim().to_string();
if title.is_empty() {
continue;
}
let confidence = compute_confidence(&title, &query_lower);
let cover_url = format!(
"https://www.bedetheque.com/cache/thb_series/PlancheS_{}.jpg",
series_id
);
candidates.push(SeriesCandidate {
external_id: series_id.clone(),
title: title.clone(),
authors: vec![],
description: None,
publishers: vec![],
start_year: None,
total_volumes: None,
cover_url: Some(cover_url),
external_url: Some(href),
confidence,
metadata_json: serde_json::json!({}),
});
}
candidates.sort_by(|a, b| {
b.confidence
.partial_cmp(&a.confidence)
.unwrap_or(std::cmp::Ordering::Equal)
});
candidates.truncate(10);
candidates
}; // document is dropped here — safe to .await below
// For the top candidates, fetch series details to enrich metadata
// (limit to top 3 to avoid hammering the site)
let mut enriched = Vec::new();
for mut c in candidates {
if enriched.len() < 3 {
if let Ok(details) = fetch_series_details(&client, &c.external_id, c.external_url.as_deref()).await {
if let Some(desc) = details.description {
c.description = Some(desc);
}
if !details.authors.is_empty() {
c.authors = details.authors;
}
if !details.publishers.is_empty() {
c.publishers = details.publishers;
}
if let Some(year) = details.start_year {
c.start_year = Some(year);
}
if let Some(count) = details.album_count {
c.total_volumes = Some(count);
}
c.metadata_json = serde_json::json!({
"description": c.description,
"authors": c.authors,
"publishers": c.publishers,
"start_year": c.start_year,
"genres": details.genres,
"status": details.status,
"origin": details.origin,
"language": details.language,
});
}
}
enriched.push(c);
}
Ok(enriched)
}
/// Parse serie URL to extract (id, slug)
fn parse_serie_href(href: &str) -> Option<(String, String)> {
// Patterns:
// https://www.bedetheque.com/serie-3-BD-Blacksad.html
// /serie-3-BD-Blacksad.html
let re = regex::Regex::new(r"/serie-(\d+)-[A-Za-z]+-(.+?)(?:__\d+)?\.html").ok()?;
let caps = re.captures(href)?;
Some((caps[1].to_string(), caps[2].to_string()))
}
struct SeriesDetails {
description: Option<String>,
authors: Vec<String>,
publishers: Vec<String>,
start_year: Option<i32>,
album_count: Option<i32>,
genres: Vec<String>,
status: Option<String>,
origin: Option<String>,
language: Option<String>,
}
async fn fetch_series_details(
client: &reqwest::Client,
series_id: &str,
series_url: Option<&str>,
) -> Result<SeriesDetails, String> {
// Build URL — append __10000 to get all albums on one page
let url = match series_url {
Some(u) => {
// Replace .html with __10000.html
u.replace(".html", "__10000.html")
}
None => format!(
"https://www.bedetheque.com/serie-{}-BD-Serie__10000.html",
series_id
),
};
let resp = client
.get(&url)
.send()
.await
.map_err(|e| format!("Failed to fetch series page: {e}"))?;
if !resp.status().is_success() {
return Err(format!("Series page returned {}", resp.status()));
}
let html = resp
.text()
.await
.map_err(|e| format!("Failed to read series page: {e}"))?;
let doc = Html::parse_document(&html);
let mut details = SeriesDetails {
description: None,
authors: vec![],
publishers: vec![],
start_year: None,
album_count: None,
genres: vec![],
status: None,
origin: None,
language: None,
};
// Description from <meta name="description"> — format: "Tout sur la série {name} : {description}"
if let Ok(sel) = Selector::parse(r#"meta[name="description"]"#) {
if let Some(el) = doc.select(&sel).next() {
if let Some(content) = el.value().attr("content") {
let desc = content.trim().to_string();
// Strip the "Tout sur la série ... : " prefix
let cleaned = if let Some(pos) = desc.find(" : ") {
desc[pos + 3..].trim().to_string()
} else {
desc
};
if !cleaned.is_empty() {
details.description = Some(cleaned);
}
}
}
}
// Extract authors from itemprop="author" and itemprop="illustrator" (deduplicated)
{
let mut authors_set = std::collections::HashSet::new();
for attr in ["author", "illustrator"] {
if let Ok(sel) = Selector::parse(&format!(r#"[itemprop="{attr}"]"#)) {
for el in doc.select(&sel) {
let name = el.text().collect::<String>().trim().to_string();
// Names are "Last, First" — normalize to "First Last"
let normalized = if let Some((last, first)) = name.split_once(',') {
format!("{} {}", first.trim(), last.trim())
} else {
name
};
if !normalized.is_empty() && is_real_author(&normalized) {
authors_set.insert(normalized);
}
}
}
}
details.authors = authors_set.into_iter().collect();
details.authors.sort();
}
// Extract publishers from itemprop="publisher" (deduplicated)
{
let mut publishers_set = std::collections::HashSet::new();
if let Ok(sel) = Selector::parse(r#"[itemprop="publisher"]"#) {
for el in doc.select(&sel) {
let name = el.text().collect::<String>().trim().to_string();
if !name.is_empty() {
publishers_set.insert(name);
}
}
}
details.publishers = publishers_set.into_iter().collect();
details.publishers.sort();
}
// Extract series-level info from <li><label>X :</label>value</li> blocks
// Genre: <li><label>Genre :</label><span class="style-serie">Animalier, Aventure, Humour</span></li>
if let Ok(sel) = Selector::parse("span.style-serie") {
if let Some(el) = doc.select(&sel).next() {
let text = el.text().collect::<String>();
details.genres = text
.split(',')
.map(|s| s.trim().to_string())
.filter(|s| !s.is_empty())
.collect();
}
}
// Parution: <li><label>Parution :</label><span class="parution-serie">Série finie</span></li>
if let Ok(sel) = Selector::parse("span.parution-serie") {
if let Some(el) = doc.select(&sel).next() {
let text = el.text().collect::<String>().trim().to_string();
if !text.is_empty() {
details.status = Some(text);
}
}
}
// Origine and Langue from page text (no dedicated CSS class)
let page_text = doc.root_element().text().collect::<String>();
if let Some(val) = extract_info_value(&page_text, "Origine") {
let val = val.lines().next().unwrap_or(val).trim();
if !val.is_empty() {
details.origin = Some(val.to_string());
}
}
if let Some(val) = extract_info_value(&page_text, "Langue") {
let val = val.lines().next().unwrap_or(val).trim();
if !val.is_empty() {
details.language = Some(val.to_string());
}
}
// Album count from serie-info text (e.g. "Tomes : 8")
if let Ok(re) = regex::Regex::new(r"Tomes?\s*:\s*(\d+)") {
if let Some(caps) = re.captures(&page_text) {
if let Ok(n) = caps[1].parse::<i32>() {
details.album_count = Some(n);
}
}
}
// Start year from first <meta itemprop="datePublished" content="YYYY-MM-DD">
if let Ok(sel) = Selector::parse(r#"[itemprop="datePublished"]"#) {
if let Some(el) = doc.select(&sel).next() {
if let Some(content) = el.value().attr("content") {
// content is "YYYY-MM-DD"
if let Some(year_str) = content.split('-').next() {
if let Ok(year) = year_str.parse::<i32>() {
details.start_year = Some(year);
}
}
}
}
}
Ok(details)
}
/// Extract value after a label like "Scénario : Jean-Claude" → "Jean-Claude"
fn extract_info_value<'a>(text: &'a str, label: &str) -> Option<&'a str> {
// Handle both "Label :" and "Label:"
let patterns = [
format!("{} :", label),
format!("{}:", label),
format!("{} :", &label.to_lowercase()),
];
for pat in &patterns {
if let Some(pos) = text.find(pat.as_str()) {
let val = text[pos + pat.len()..].trim();
if !val.is_empty() {
return Some(val);
}
}
}
None
}
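`extract_info_value` can be exercised on a snippet of page text (a verbatim copy for illustration; as in the caller, only the first line of the returned tail is meaningful):

```rust
// Verbatim copy of the label-value extractor used on Bedetheque series pages.
fn extract_info_value<'a>(text: &'a str, label: &str) -> Option<&'a str> {
    // Handle "Label :", "Label:" and a lowercased variant
    let patterns = [
        format!("{} :", label),
        format!("{}:", label),
        format!("{} :", label.to_lowercase()),
    ];
    for pat in &patterns {
        if let Some(pos) = text.find(pat.as_str()) {
            let val = text[pos + pat.len()..].trim();
            if !val.is_empty() {
                return Some(val);
            }
        }
    }
    None
}

fn main() {
    let page = "Origine : Europe\nLangue : Francais";
    let origin = extract_info_value(page, "Origine")
        .and_then(|v| v.lines().next())
        .map(str::trim);
    assert_eq!(origin, Some("Europe"));
}
```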
// ---------------------------------------------------------------------------
// Get series books
// ---------------------------------------------------------------------------
async fn get_series_books_impl(
external_id: &str,
_config: &ProviderConfig,
) -> Result<Vec<BookCandidate>, String> {
let client = build_client()?;
// external_id is the numeric bedetheque series ID. Fetch the series page
// directly (with __10000 appended) so the full album list fits on one page.
let url = format!(
"https://www.bedetheque.com/serie-{}-BD-Serie__10000.html",
external_id
);
let resp = client
.get(&url)
.send()
.await
.map_err(|e| format!("Failed to fetch series: {e}"))?;
// If the generic slug fails, try without the slug part (bedetheque redirects)
let html = if resp.status().is_success() {
resp.text().await.map_err(|e| format!("Failed to read: {e}"))?
} else {
// Try alternative URL pattern
let alt_url = format!(
"https://www.bedetheque.com/serie-{}__10000.html",
external_id
);
let resp2 = client
.get(&alt_url)
.send()
.await
.map_err(|e| format!("Failed to fetch series (alt): {e}"))?;
if !resp2.status().is_success() {
return Err(format!("Series page not found for id {external_id}"));
}
resp2.text().await.map_err(|e| format!("Failed to read: {e}"))?
};
if html.contains("<title></title>") {
return Err("Bedetheque: IP may be rate-limited".to_string());
}
let doc = Html::parse_document(&html);
let mut books = Vec::new();
// Each album block is anchored on a .album-main div. The cover image
// (<img itemprop="image">) is a sibling OUTSIDE .album-main, so the simplest
// approach is to parse the itemprop elements relative to each .album-main
// and collect the covers separately, matching them up by index.
let album_sel = Selector::parse(".album-main").map_err(|e| format!("selector: {e}"))?;
// Pre-collect cover images — they appear in <img itemprop="image"> before each .album-main
// and link to an album URL containing the book ID
let cover_sel = Selector::parse(r#"img[itemprop="image"]"#).map_err(|e| format!("selector: {e}"))?;
let covers: Vec<String> = doc.select(&cover_sel)
.filter_map(|el| el.value().attr("src").map(|s| {
if s.starts_with("http") { s.to_string() } else { format!("https://www.bedetheque.com{}", s) }
}))
.collect();
static RE_TOME: std::sync::LazyLock<regex::Regex> =
std::sync::LazyLock::new(|| regex::Regex::new(r"(?i)-Tome-\d+-").unwrap());
static RE_BOOK_ID: std::sync::LazyLock<regex::Regex> =
std::sync::LazyLock::new(|| regex::Regex::new(r"-(\d+)\.html").unwrap());
static RE_VOLUME: std::sync::LazyLock<regex::Regex> =
std::sync::LazyLock::new(|| regex::Regex::new(r"(?i)Tome-(\d+)-").unwrap());
for (idx, album_el) in doc.select(&album_sel).enumerate() {
// Title from <a class="titre" title="..."> — the title attribute is clean
let title_sel = Selector::parse("a.titre").ok();
let title_el = title_sel.as_ref().and_then(|s| album_el.select(s).next());
let title = title_el
.and_then(|el| el.value().attr("title"))
.unwrap_or("")
.trim()
.to_string();
if title.is_empty() {
continue;
}
// External book ID from album URL (e.g. "...-1063.html")
let album_url = title_el.and_then(|el| el.value().attr("href")).unwrap_or("");
// Only keep main tomes — their URLs contain "Tome-{N}-"
// Skip hors-série (HS), intégrales (INT/INTFL), romans, coffrets, etc.
if !RE_TOME.is_match(album_url) {
continue;
}
let external_book_id = RE_BOOK_ID
.captures(album_url)
.map(|c| c[1].to_string())
.unwrap_or_default();
// Volume number from the URL pattern "Tome-{N}-", falling back to the title
let volume_number = RE_VOLUME
.captures(album_url)
.and_then(|c| c[1].parse::<i32>().ok())
.or_else(|| extract_volume_from_title(&title));
// Authors from itemprop="author" and itemprop="illustrator"
let mut authors = Vec::new();
let author_sel = Selector::parse(r#"[itemprop="author"]"#).ok();
let illustrator_sel = Selector::parse(r#"[itemprop="illustrator"]"#).ok();
for sel in [&author_sel, &illustrator_sel].into_iter().flatten() {
for el in album_el.select(sel) {
let name = el.text().collect::<String>().trim().to_string();
// Names are "Last, First" format — normalize to "First Last"
let normalized = if let Some((last, first)) = name.split_once(',') {
format!("{} {}", first.trim(), last.trim())
} else {
name
};
if !normalized.is_empty() && is_real_author(&normalized) && !authors.contains(&normalized) {
authors.push(normalized);
}
}
}
// ISBN from <span itemprop="isbn">
let isbn = Selector::parse(r#"[itemprop="isbn"]"#)
.ok()
.and_then(|s| album_el.select(&s).next())
.map(|el| el.text().collect::<String>().trim().to_string())
.filter(|s| !s.is_empty());
// Page count from <span itemprop="numberOfPages">
let page_count = Selector::parse(r#"[itemprop="numberOfPages"]"#)
.ok()
.and_then(|s| album_el.select(&s).next())
.and_then(|el| el.text().collect::<String>().trim().parse::<i32>().ok());
// Publish date from <meta itemprop="datePublished" content="YYYY-MM-DD">
let publish_date = Selector::parse(r#"[itemprop="datePublished"]"#)
.ok()
.and_then(|s| album_el.select(&s).next())
.and_then(|el| el.value().attr("content").map(|c| c.trim().to_string()))
.filter(|s| !s.is_empty());
// Cover from pre-collected covers (same index)
let cover_url = covers.get(idx).cloned();
books.push(BookCandidate {
external_book_id,
title,
volume_number,
authors,
isbn,
summary: None,
cover_url,
page_count,
language: Some("fr".to_string()),
publish_date,
metadata_json: serde_json::json!({}),
});
}
books.sort_by_key(|b| b.volume_number.unwrap_or(999));
Ok(books)
}
/// Filter out placeholder author names from Bédéthèque
fn is_real_author(name: &str) -> bool {
!name.starts_with('<') && !name.ends_with('>') && name != "Collectif"
}
fn extract_volume_from_title(title: &str) -> Option<i32> {
let patterns = [
r"(?i)(?:tome|t\.)\s*(\d+)",
r"(?i)(?:vol(?:ume)?\.?)\s*(\d+)",
r"#\s*(\d+)",
];
for pattern in &patterns {
if let Ok(re) = regex::Regex::new(pattern) {
if let Some(caps) = re.captures(title) {
if let Ok(n) = caps[1].parse::<i32>() {
return Some(n);
}
}
}
}
None
}
/// Normalize a title by removing French articles (leading or in parentheses)
/// and collapsing extra whitespace, so that "Les Légendaires - Résistance"
/// and "Légendaires (Les) - Résistance" produce the same canonical form.
fn normalize_title(s: &str) -> String {
let lower = s.to_lowercase();
// Remove articles in parentheses: "(les)", "(la)", "(le)", "(l')", "(un)", "(une)", "(des)"
let re_parens = regex::Regex::new(r"\s*\((?:les?|la|l'|une?|des|du|d')\)").unwrap();
let cleaned = re_parens.replace_all(&lower, "");
// Remove leading articles: "les ", "la ", "le ", "l'", "un ", "une ", "des ", "du ", "d'"
let re_leading = regex::Regex::new(r"^(?:les?|la|l'|une?|des|du|d')\s+").unwrap();
let cleaned = re_leading.replace(&cleaned, "");
// Collapse runs of whitespace into a single space
let re_spaces = regex::Regex::new(r"\s+").unwrap();
re_spaces.replace_all(cleaned.trim(), " ").to_string()
}
fn compute_confidence(title: &str, query: &str) -> f32 {
let title_lower = title.to_lowercase();
let query_lower = query.to_lowercase();
if title_lower == query_lower {
return 1.0;
}
// Try with normalized forms (handles Bedetheque's "Name (Article)" convention)
let title_norm = normalize_title(title);
let query_norm = normalize_title(query);
if title_norm == query_norm {
return 1.0;
}
if title_lower.starts_with(&query_lower) || query_lower.starts_with(&title_lower)
|| title_norm.starts_with(&query_norm) || query_norm.starts_with(&title_norm)
{
0.85
} else if title_lower.contains(&query_lower) || query_lower.contains(&title_lower)
|| title_norm.contains(&query_norm) || query_norm.contains(&title_norm)
{
0.7
} else {
let common: usize = query_lower
.chars()
.filter(|c| title_lower.contains(*c))
.count();
let max_len = query_lower.len().max(title_lower.len()).max(1);
(common as f32 / max_len as f32).clamp(0.1, 0.6)
}
}
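The author handling in the loop above (placeholder filtering via `is_real_author`, then reordering "Last, First" credits) is easy to check in isolation. In this minimal sketch, `is_real_author` is copied verbatim, while `normalize_author` is a hypothetical name wrapping the inline `split_once(',')` logic from the loop:

```rust
/// Copied from the provider: drops Bedetheque placeholder credits.
fn is_real_author(name: &str) -> bool {
    !name.starts_with('<') && !name.ends_with('>') && name != "Collectif"
}

/// Hypothetical helper wrapping the inline "Last, First" -> "First Last"
/// normalization applied to each itemprop="author"/"illustrator" element.
fn normalize_author(name: &str) -> String {
    match name.split_once(',') {
        Some((last, first)) => format!("{} {}", first.trim(), last.trim()),
        None => name.to_string(),
    }
}

fn main() {
    assert_eq!(normalize_author("Guarnido, Juanjo"), "Juanjo Guarnido");
    // Names without a comma pass through unchanged
    assert_eq!(normalize_author("Juanjo Guarnido"), "Juanjo Guarnido");
    assert!(!is_real_author("<Indéterminé>"));
    assert!(!is_real_author("Collectif"));
    assert!(is_real_author("Juanjo Guarnido"));
}
```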


@@ -0,0 +1,267 @@
use super::{BookCandidate, MetadataProvider, ProviderConfig, SeriesCandidate};
pub struct ComicVineProvider;
impl MetadataProvider for ComicVineProvider {
fn name(&self) -> &str {
"comicvine"
}
fn search_series(
&self,
query: &str,
config: &ProviderConfig,
) -> std::pin::Pin<
Box<dyn std::future::Future<Output = Result<Vec<SeriesCandidate>, String>> + Send + '_>,
> {
let query = query.to_string();
let config = config.clone();
Box::pin(async move { search_series_impl(&query, &config).await })
}
fn get_series_books(
&self,
external_id: &str,
config: &ProviderConfig,
) -> std::pin::Pin<
Box<dyn std::future::Future<Output = Result<Vec<BookCandidate>, String>> + Send + '_>,
> {
let external_id = external_id.to_string();
let config = config.clone();
Box::pin(async move { get_series_books_impl(&external_id, &config).await })
}
}
fn build_client() -> Result<reqwest::Client, String> {
reqwest::Client::builder()
.timeout(std::time::Duration::from_secs(15))
.user_agent("StripstreamLibrarian/1.0")
.build()
.map_err(|e| format!("failed to build HTTP client: {e}"))
}
async fn search_series_impl(
query: &str,
config: &ProviderConfig,
) -> Result<Vec<SeriesCandidate>, String> {
let api_key = config
.api_key
.as_deref()
.filter(|k| !k.is_empty())
.ok_or_else(|| "ComicVine requires an API key. Configure it in Settings > Integrations.".to_string())?;
let client = build_client()?;
let url = format!(
"https://comicvine.gamespot.com/api/search/?api_key={}&format=json&resources=volume&query={}&limit=20",
api_key,
urlencoded(query)
);
let resp = client
.get(&url)
.send()
.await
.map_err(|e| format!("ComicVine request failed: {e}"))?;
if !resp.status().is_success() {
let status = resp.status();
let text = resp.text().await.unwrap_or_default();
return Err(format!("ComicVine returned {status}: {text}"));
}
let data: serde_json::Value = resp
.json()
.await
.map_err(|e| format!("Failed to parse ComicVine response: {e}"))?;
let results = match data.get("results").and_then(|r| r.as_array()) {
Some(results) => results,
None => return Ok(vec![]),
};
let query_lower = query.to_lowercase();
let mut candidates: Vec<SeriesCandidate> = results
.iter()
.filter_map(|vol| {
let name = vol.get("name").and_then(|n| n.as_str())?.to_string();
let id = vol.get("id").and_then(|id| id.as_i64())?;
let description = vol
.get("description")
.and_then(|d| d.as_str())
.map(strip_html);
let publisher = vol
.get("publisher")
.and_then(|p| p.get("name"))
.and_then(|n| n.as_str())
.map(String::from);
let start_year = vol
.get("start_year")
.and_then(|y| y.as_str())
.and_then(|y| y.parse::<i32>().ok());
let count_of_issues = vol
.get("count_of_issues")
.and_then(|c| c.as_i64())
.map(|c| c as i32);
let cover_url = vol
.get("image")
.and_then(|img| img.get("medium_url").or_else(|| img.get("small_url")))
.and_then(|u| u.as_str())
.map(String::from);
let site_url = vol
.get("site_detail_url")
.and_then(|u| u.as_str())
.map(String::from);
let confidence = compute_confidence(&name, &query_lower);
Some(SeriesCandidate {
external_id: id.to_string(),
title: name,
authors: vec![],
description,
publishers: publisher.into_iter().collect(),
start_year,
total_volumes: count_of_issues,
cover_url,
external_url: site_url,
confidence,
metadata_json: serde_json::json!({}),
})
})
.collect();
candidates.sort_by(|a, b| b.confidence.partial_cmp(&a.confidence).unwrap_or(std::cmp::Ordering::Equal));
candidates.truncate(10);
Ok(candidates)
}
async fn get_series_books_impl(
external_id: &str,
config: &ProviderConfig,
) -> Result<Vec<BookCandidate>, String> {
let api_key = config
.api_key
.as_deref()
.filter(|k| !k.is_empty())
.ok_or_else(|| "ComicVine requires an API key".to_string())?;
let client = build_client()?;
let url = format!(
"https://comicvine.gamespot.com/api/issues/?api_key={}&format=json&filter=volume:{}&sort=issue_number:asc&limit=100&field_list=id,name,issue_number,description,image,cover_date,site_detail_url",
api_key,
external_id
);
let resp = client
.get(&url)
.send()
.await
.map_err(|e| format!("ComicVine request failed: {e}"))?;
if !resp.status().is_success() {
let status = resp.status();
let text = resp.text().await.unwrap_or_default();
return Err(format!("ComicVine returned {status}: {text}"));
}
let data: serde_json::Value = resp
.json()
.await
.map_err(|e| format!("Failed to parse ComicVine response: {e}"))?;
let results = match data.get("results").and_then(|r| r.as_array()) {
Some(results) => results,
None => return Ok(vec![]),
};
let books: Vec<BookCandidate> = results
.iter()
.filter_map(|issue| {
let id = issue.get("id").and_then(|id| id.as_i64())?;
let name = issue
.get("name")
.and_then(|n| n.as_str())
.unwrap_or("")
.to_string();
let issue_number = issue
.get("issue_number")
.and_then(|n| n.as_str())
.and_then(|n| n.parse::<f64>().ok())
.map(|n| n as i32);
let description = issue
.get("description")
.and_then(|d| d.as_str())
.map(strip_html);
let cover_url = issue
.get("image")
.and_then(|img| img.get("medium_url").or_else(|| img.get("small_url")))
.and_then(|u| u.as_str())
.map(String::from);
let cover_date = issue
.get("cover_date")
.and_then(|d| d.as_str())
.map(String::from);
Some(BookCandidate {
external_book_id: id.to_string(),
title: name,
volume_number: issue_number,
authors: vec![],
isbn: None,
summary: description,
cover_url,
page_count: None,
language: None,
publish_date: cover_date,
metadata_json: serde_json::json!({}),
})
})
.collect();
Ok(books)
}
fn strip_html(s: &str) -> String {
let mut result = String::new();
let mut in_tag = false;
for ch in s.chars() {
match ch {
'<' => in_tag = true,
'>' => in_tag = false,
_ if !in_tag => result.push(ch),
_ => {}
}
}
result.trim().to_string()
}
fn compute_confidence(title: &str, query: &str) -> f32 {
let title_lower = title.to_lowercase();
if title_lower == query {
1.0
} else if title_lower.starts_with(query) || query.starts_with(&title_lower) {
0.8
} else if title_lower.contains(query) || query.contains(&title_lower) {
0.7
} else {
let common: usize = query.chars().filter(|c| title_lower.contains(*c)).count();
let max_len = query.len().max(title_lower.len()).max(1);
(common as f32 / max_len as f32).clamp(0.1, 0.6)
}
}
fn urlencoded(s: &str) -> String {
let mut result = String::new();
for byte in s.bytes() {
match byte {
b'A'..=b'Z' | b'a'..=b'z' | b'0'..=b'9' | b'-' | b'_' | b'.' | b'~' => {
result.push(byte as char);
}
_ => result.push_str(&format!("%{:02X}", byte)),
}
}
result
}
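Both small helpers in this file are dependency-free, so they can be sanity-checked standalone; this sketch copies `strip_html` and `urlencoded` verbatim from above.

```rust
// Copied from the provider: naive tag stripper (no HTML entity decoding).
fn strip_html(s: &str) -> String {
    let mut result = String::new();
    let mut in_tag = false;
    for ch in s.chars() {
        match ch {
            '<' => in_tag = true,
            '>' => in_tag = false,
            _ if !in_tag => result.push(ch),
            _ => {}
        }
    }
    result.trim().to_string()
}

// Copied from the provider: RFC 3986 unreserved bytes pass through,
// everything else is percent-encoded.
fn urlencoded(s: &str) -> String {
    let mut result = String::new();
    for byte in s.bytes() {
        match byte {
            b'A'..=b'Z' | b'a'..=b'z' | b'0'..=b'9' | b'-' | b'_' | b'.' | b'~' => {
                result.push(byte as char)
            }
            _ => result.push_str(&format!("%{:02X}", byte)),
        }
    }
    result
}

fn main() {
    assert_eq!(strip_html("<p>Private eye <b>Blacksad</b></p>"), "Private eye Blacksad");
    assert_eq!(urlencoded("De Cape et de Crocs"), "De%20Cape%20et%20de%20Crocs");
}
```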


@@ -0,0 +1,472 @@
use super::{BookCandidate, MetadataProvider, ProviderConfig, SeriesCandidate};
pub struct GoogleBooksProvider;
impl MetadataProvider for GoogleBooksProvider {
fn name(&self) -> &str {
"google_books"
}
fn search_series(
&self,
query: &str,
config: &ProviderConfig,
) -> std::pin::Pin<
Box<dyn std::future::Future<Output = Result<Vec<SeriesCandidate>, String>> + Send + '_>,
> {
let query = query.to_string();
let config = config.clone();
Box::pin(async move { search_series_impl(&query, &config).await })
}
fn get_series_books(
&self,
external_id: &str,
config: &ProviderConfig,
) -> std::pin::Pin<
Box<dyn std::future::Future<Output = Result<Vec<BookCandidate>, String>> + Send + '_>,
> {
let external_id = external_id.to_string();
let config = config.clone();
Box::pin(async move { get_series_books_impl(&external_id, &config).await })
}
}
async fn search_series_impl(
query: &str,
config: &ProviderConfig,
) -> Result<Vec<SeriesCandidate>, String> {
let client = reqwest::Client::builder()
.timeout(std::time::Duration::from_secs(15))
.build()
.map_err(|e| format!("failed to build HTTP client: {e}"))?;
let search_query = format!("intitle:{}", query);
let mut url = format!(
"https://www.googleapis.com/books/v1/volumes?q={}&maxResults=20&printType=books&langRestrict={}",
urlencoded(&search_query),
urlencoded(&config.language),
);
if let Some(ref key) = config.api_key {
url.push_str(&format!("&key={}", key));
}
let resp = client
.get(&url)
.send()
.await
.map_err(|e| format!("Google Books request failed: {e}"))?;
if !resp.status().is_success() {
let status = resp.status();
let text = resp.text().await.unwrap_or_default();
return Err(format!("Google Books returned {status}: {text}"));
}
let data: serde_json::Value = resp
.json()
.await
.map_err(|e| format!("Failed to parse Google Books response: {e}"))?;
let items = match data.get("items").and_then(|i| i.as_array()) {
Some(items) => items,
None => return Ok(vec![]),
};
// Group volumes by series name to produce series candidates
let query_lower = query.to_lowercase();
let mut series_map: std::collections::HashMap<String, SeriesCandidateBuilder> =
std::collections::HashMap::new();
for item in items {
let volume_info = match item.get("volumeInfo") {
Some(vi) => vi,
None => continue,
};
let title = volume_info
.get("title")
.and_then(|t| t.as_str())
.unwrap_or("")
.to_string();
let authors: Vec<String> = volume_info
.get("authors")
.and_then(|a| a.as_array())
.map(|arr| {
arr.iter()
.filter_map(|v| v.as_str().map(String::from))
.collect()
})
.unwrap_or_default();
let publisher = volume_info
.get("publisher")
.and_then(|p| p.as_str())
.map(String::from);
let published_date = volume_info
.get("publishedDate")
.and_then(|d| d.as_str())
.map(String::from);
let description = volume_info
.get("description")
.and_then(|d| d.as_str())
.map(String::from);
// Extract series info from title or seriesInfo
let series_name = volume_info
.get("seriesInfo")
.and_then(|si| si.get("title"))
.and_then(|t| t.as_str())
.map(String::from)
.unwrap_or_else(|| extract_series_name(&title));
let cover_url = volume_info
.get("imageLinks")
.and_then(|il| {
il.get("thumbnail")
.or_else(|| il.get("smallThumbnail"))
})
.and_then(|u| u.as_str())
.map(|s| s.replace("http://", "https://"));
let google_id = item
.get("id")
.and_then(|id| id.as_str())
.unwrap_or("")
.to_string();
let entry = series_map
.entry(series_name.clone())
.or_insert_with(|| SeriesCandidateBuilder {
title: series_name.clone(),
authors: vec![],
description: None,
publishers: vec![],
start_year: None,
volume_count: 0,
cover_url: None,
external_id: google_id.clone(),
external_url: None,
metadata_json: serde_json::json!({}),
});
entry.volume_count += 1;
// Merge authors
for a in &authors {
if !entry.authors.contains(a) {
entry.authors.push(a.clone());
}
}
// Set description if not yet set
if entry.description.is_none() {
entry.description = description;
}
// Merge publisher
if let Some(ref pub_name) = publisher {
if !entry.publishers.contains(pub_name) {
entry.publishers.push(pub_name.clone());
}
}
// Extract year
if let Some(ref date) = published_date {
if let Some(year) = extract_year(date) {
if entry.start_year.is_none() || entry.start_year.unwrap() > year {
entry.start_year = Some(year);
}
}
}
if entry.cover_url.is_none() {
entry.cover_url = cover_url;
}
// Keep external_url aligned with external_id (the first grouped volume)
if entry.external_url.is_none() {
entry.external_url = Some(format!(
"https://books.google.com/books?id={}",
google_id
));
}
}
let mut candidates: Vec<SeriesCandidate> = series_map
.into_values()
.map(|b| {
let confidence = compute_confidence(&b.title, &query_lower);
SeriesCandidate {
external_id: b.external_id,
title: b.title,
authors: b.authors,
description: b.description,
publishers: b.publishers,
start_year: b.start_year,
total_volumes: if b.volume_count > 1 {
Some(b.volume_count)
} else {
None
},
cover_url: b.cover_url,
external_url: b.external_url,
confidence,
metadata_json: b.metadata_json,
}
})
.collect();
candidates.sort_by(|a, b| b.confidence.partial_cmp(&a.confidence).unwrap_or(std::cmp::Ordering::Equal));
candidates.truncate(10);
Ok(candidates)
}
async fn get_series_books_impl(
external_id: &str,
config: &ProviderConfig,
) -> Result<Vec<BookCandidate>, String> {
let client = reqwest::Client::builder()
.timeout(std::time::Duration::from_secs(15))
.build()
.map_err(|e| format!("failed to build HTTP client: {e}"))?;
// First fetch the volume to get its series info
let mut url = format!(
"https://www.googleapis.com/books/v1/volumes/{}",
external_id
);
if let Some(ref key) = config.api_key {
url.push_str(&format!("?key={}", key));
}
let resp = client
.get(&url)
.send()
.await
.map_err(|e| format!("Google Books request failed: {e}"))?;
if !resp.status().is_success() {
let status = resp.status();
let text = resp.text().await.unwrap_or_default();
return Err(format!("Google Books returned {status}: {text}"));
}
let volume: serde_json::Value = resp
.json()
.await
.map_err(|e| format!("Failed to parse Google Books response: {e}"))?;
let volume_info = volume.get("volumeInfo").cloned().unwrap_or(serde_json::json!({}));
let title = volume_info
.get("title")
.and_then(|t| t.as_str())
.unwrap_or("");
// Search for more volumes in this series
let series_name = extract_series_name(title);
let search_query = format!("intitle:{}", series_name);
let mut search_url = format!(
"https://www.googleapis.com/books/v1/volumes?q={}&maxResults=40&printType=books&langRestrict={}",
urlencoded(&search_query),
urlencoded(&config.language),
);
if let Some(ref key) = config.api_key {
search_url.push_str(&format!("&key={}", key));
}
let resp = client
.get(&search_url)
.send()
.await
.map_err(|e| format!("Google Books search failed: {e}"))?;
if !resp.status().is_success() {
// Return just the single volume as a book
return Ok(vec![volume_to_book_candidate(&volume)]);
}
let data: serde_json::Value = resp
.json()
.await
.map_err(|e| format!("Failed to parse search response: {e}"))?;
let items = match data.get("items").and_then(|i| i.as_array()) {
Some(items) => items,
None => return Ok(vec![volume_to_book_candidate(&volume)]),
};
let mut books: Vec<BookCandidate> = items
.iter()
.map(volume_to_book_candidate)
.collect();
// Sort by volume number
books.sort_by_key(|b| b.volume_number.unwrap_or(999));
Ok(books)
}
fn volume_to_book_candidate(item: &serde_json::Value) -> BookCandidate {
let volume_info = item.get("volumeInfo").cloned().unwrap_or(serde_json::json!({}));
let title = volume_info
.get("title")
.and_then(|t| t.as_str())
.unwrap_or("")
.to_string();
let authors: Vec<String> = volume_info
.get("authors")
.and_then(|a| a.as_array())
.map(|arr| {
arr.iter()
.filter_map(|v| v.as_str().map(String::from))
.collect()
})
.unwrap_or_default();
let isbn = volume_info
.get("industryIdentifiers")
.and_then(|ids| ids.as_array())
.and_then(|arr| {
arr.iter()
.find(|id| {
id.get("type")
.and_then(|t| t.as_str())
.map(|t| t == "ISBN_13" || t == "ISBN_10")
.unwrap_or(false)
})
.and_then(|id| id.get("identifier").and_then(|i| i.as_str()))
})
.map(String::from);
let summary = volume_info
.get("description")
.and_then(|d| d.as_str())
.map(String::from);
let cover_url = volume_info
.get("imageLinks")
.and_then(|il| il.get("thumbnail").or_else(|| il.get("smallThumbnail")))
.and_then(|u| u.as_str())
.map(|s| s.replace("http://", "https://"));
let page_count = volume_info
.get("pageCount")
.and_then(|p| p.as_i64())
.map(|p| p as i32);
let language = volume_info
.get("language")
.and_then(|l| l.as_str())
.map(String::from);
let publish_date = volume_info
.get("publishedDate")
.and_then(|d| d.as_str())
.map(String::from);
let google_id = item
.get("id")
.and_then(|id| id.as_str())
.unwrap_or("")
.to_string();
let volume_number = extract_volume_number(&title);
BookCandidate {
external_book_id: google_id,
title,
volume_number,
authors,
isbn,
summary,
cover_url,
page_count,
language,
publish_date,
metadata_json: serde_json::json!({}),
}
}
fn extract_series_name(title: &str) -> String {
// Remove trailing volume indicators like "Vol. 1", "Tome 2", "#3", "- Volume 1"
let re_patterns = [
r"(?i)\s*[-–—]\s*(?:vol(?:ume)?\.?\s*|tome\s*|t\.\s*|#)\s*\d+.*$",
r"(?i)\s*,?\s*(?:vol(?:ume)?\.?\s*|tome\s*|t\.\s*|#)\s*\d+.*$",
r"\s*\(\d+\)\s*$",
r"\s+\d+\s*$",
];
let mut result = title.to_string();
for pattern in &re_patterns {
if let Ok(re) = regex::Regex::new(pattern) {
let cleaned = re.replace(&result, "").to_string();
// Only stop once a pattern actually matched and stripped something;
// otherwise replace() returns the input unchanged and we must try the next.
if !cleaned.is_empty() && cleaned != result {
result = cleaned;
break;
}
}
}
result.trim().to_string()
}
fn extract_volume_number(title: &str) -> Option<i32> {
let patterns = [
r"(?i)(?:vol(?:ume)?\.?\s*|tome\s*|t\.\s*|#)\s*(\d+)",
r"\((\d+)\)\s*$",
r"\b(\d+)\s*$",
];
for pattern in &patterns {
if let Ok(re) = regex::Regex::new(pattern) {
if let Some(caps) = re.captures(title) {
if let Some(num) = caps.get(1).and_then(|m| m.as_str().parse::<i32>().ok()) {
return Some(num);
}
}
}
}
None
}
fn extract_year(date: &str) -> Option<i32> {
date.get(..4).and_then(|s| s.parse::<i32>().ok())
}
fn compute_confidence(title: &str, query: &str) -> f32 {
let title_lower = title.to_lowercase();
if title_lower == query {
1.0
} else if title_lower.starts_with(query) || query.starts_with(&title_lower) {
0.8
} else if title_lower.contains(query) || query.contains(&title_lower) {
0.7
} else {
// Simple character overlap ratio
let common: usize = query
.chars()
.filter(|c| title_lower.contains(*c))
.count();
let max_len = query.len().max(title_lower.len()).max(1);
(common as f32 / max_len as f32).clamp(0.1, 0.6)
}
}
fn urlencoded(s: &str) -> String {
let mut result = String::new();
for byte in s.bytes() {
match byte {
b'A'..=b'Z' | b'a'..=b'z' | b'0'..=b'9' | b'-' | b'_' | b'.' | b'~' => {
result.push(byte as char);
}
_ => {
result.push_str(&format!("%{:02X}", byte));
}
}
}
result
}
struct SeriesCandidateBuilder {
title: String,
authors: Vec<String>,
description: Option<String>,
publishers: Vec<String>,
start_year: Option<i32>,
volume_count: i32,
cover_url: Option<String>,
external_id: String,
external_url: Option<String>,
metadata_json: serde_json::Value,
}
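The confidence tiers above assume the caller passes an already-lowercased query (as `search_series_impl` does with `query_lower`). A standalone copy of the function, with a note on the fallback's byte-vs-char subtlety:

```rust
// Copied from the provider; `query` must already be lowercased by the caller.
fn compute_confidence(title: &str, query: &str) -> f32 {
    let title_lower = title.to_lowercase();
    if title_lower == query {
        1.0
    } else if title_lower.starts_with(query) || query.starts_with(&title_lower) {
        0.8
    } else if title_lower.contains(query) || query.contains(&title_lower) {
        0.7
    } else {
        // Fallback: character-overlap ratio. Note len() counts bytes, so
        // accented queries slightly underestimate; the clamp bounds the effect.
        let common: usize = query.chars().filter(|c| title_lower.contains(*c)).count();
        let max_len = query.len().max(title_lower.len()).max(1);
        (common as f32 / max_len as f32).clamp(0.1, 0.6)
    }
}

fn main() {
    assert_eq!(compute_confidence("Blacksad", "blacksad"), 1.0);
    assert_eq!(compute_confidence("Blacksad: Amarillo", "blacksad"), 0.8);
    assert_eq!(compute_confidence("The World of Blacksad", "blacksad"), 0.7);
    // Unrelated titles fall into the clamped overlap fallback
    assert!(compute_confidence("Tintin", "blacksad") <= 0.6);
}
```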


@@ -0,0 +1,295 @@
pub mod anilist;
pub mod bedetheque;
pub mod comicvine;
pub mod google_books;
pub mod open_library;
use serde::{Deserialize, Serialize};
/// Configuration passed to providers (API keys, etc.)
#[derive(Debug, Clone, Default)]
pub struct ProviderConfig {
pub api_key: Option<String>,
/// Preferred language for metadata results (ISO 639-1: "en", "fr", "es"). Defaults to "en".
pub language: String,
}
/// A candidate series returned by a provider search
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct SeriesCandidate {
pub external_id: String,
pub title: String,
pub authors: Vec<String>,
pub description: Option<String>,
pub publishers: Vec<String>,
pub start_year: Option<i32>,
pub total_volumes: Option<i32>,
pub cover_url: Option<String>,
pub external_url: Option<String>,
pub confidence: f32,
pub metadata_json: serde_json::Value,
}
/// A candidate book within a series
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct BookCandidate {
pub external_book_id: String,
pub title: String,
pub volume_number: Option<i32>,
pub authors: Vec<String>,
pub isbn: Option<String>,
pub summary: Option<String>,
pub cover_url: Option<String>,
pub page_count: Option<i32>,
pub language: Option<String>,
pub publish_date: Option<String>,
pub metadata_json: serde_json::Value,
}
/// Trait that all metadata providers must implement
pub trait MetadataProvider: Send + Sync {
#[allow(dead_code)]
fn name(&self) -> &str;
fn search_series(
&self,
query: &str,
config: &ProviderConfig,
) -> std::pin::Pin<
Box<dyn std::future::Future<Output = Result<Vec<SeriesCandidate>, String>> + Send + '_>,
>;
fn get_series_books(
&self,
external_id: &str,
config: &ProviderConfig,
) -> std::pin::Pin<
Box<dyn std::future::Future<Output = Result<Vec<BookCandidate>, String>> + Send + '_>,
>;
}
/// Factory function to get a provider by name
pub fn get_provider(name: &str) -> Option<Box<dyn MetadataProvider>> {
match name {
"google_books" => Some(Box::new(google_books::GoogleBooksProvider)),
"open_library" => Some(Box::new(open_library::OpenLibraryProvider)),
"comicvine" => Some(Box::new(comicvine::ComicVineProvider)),
"anilist" => Some(Box::new(anilist::AniListProvider)),
"bedetheque" => Some(Box::new(bedetheque::BedethequeProvider)),
_ => None,
}
}
// ---------------------------------------------------------------------------
// End-to-end provider tests
//
// These tests hit real external APIs — run them explicitly with:
// cargo test -p api providers_e2e -- --ignored --nocapture
// ---------------------------------------------------------------------------
#[cfg(test)]
mod providers_e2e {
use super::*;
fn config_fr() -> ProviderConfig {
ProviderConfig { api_key: None, language: "fr".to_string() }
}
fn config_en() -> ProviderConfig {
ProviderConfig { api_key: None, language: "en".to_string() }
}
fn print_candidate(name: &str, c: &SeriesCandidate) {
println!("\n=== {name} — best candidate ===");
println!(" title: {:?}", c.title);
println!(" external_id: {:?}", c.external_id);
println!(" authors: {:?}", c.authors);
println!(" description: {:?}", c.description.as_deref().map(|d| d.chars().take(120).collect::<String>()));
println!(" publishers: {:?}", c.publishers);
println!(" start_year: {:?}", c.start_year);
println!(" total_volumes: {:?}", c.total_volumes);
println!(" cover_url: {}", c.cover_url.is_some());
println!(" external_url: {}", c.external_url.is_some());
println!(" confidence: {:.2}", c.confidence);
println!(" metadata_json: {}", serde_json::to_string_pretty(&c.metadata_json).unwrap_or_default());
}
fn print_books(name: &str, books: &[BookCandidate]) {
println!("\n=== {name} — {} books ===", books.len());
for (i, b) in books.iter().take(5).enumerate() {
println!(
" [{}] vol={:?} title={:?} authors={} isbn={:?} pages={:?} lang={:?} date={:?} cover={}",
i, b.volume_number, b.title, b.authors.len(), b.isbn, b.page_count, b.language, b.publish_date, b.cover_url.is_some()
);
}
if books.len() > 5 { println!(" ... and {} more", books.len() - 5); }
let with_vol = books.iter().filter(|b| b.volume_number.is_some()).count();
let with_isbn = books.iter().filter(|b| b.isbn.is_some()).count();
let with_authors = books.iter().filter(|b| !b.authors.is_empty()).count();
let with_date = books.iter().filter(|b| b.publish_date.is_some()).count();
let with_cover = books.iter().filter(|b| b.cover_url.is_some()).count();
let with_pages = books.iter().filter(|b| b.page_count.is_some()).count();
println!(" --- field coverage ---");
println!(" volume_number: {with_vol}/{}", books.len());
println!(" isbn: {with_isbn}/{}", books.len());
println!(" authors: {with_authors}/{}", books.len());
println!(" publish_date: {with_date}/{}", books.len());
println!(" cover_url: {with_cover}/{}", books.len());
println!(" page_count: {with_pages}/{}", books.len());
}
// --- Google Books ---
#[tokio::test]
#[ignore]
async fn google_books_search_and_books() {
let p = get_provider("google_books").unwrap();
let cfg = config_en();
let candidates = p.search_series("Blacksad", &cfg).await.unwrap();
assert!(!candidates.is_empty(), "google_books: no results for Blacksad");
print_candidate("google_books", &candidates[0]);
let books = p.get_series_books(&candidates[0].external_id, &cfg).await.unwrap();
print_books("google_books", &books);
assert!(!books.is_empty(), "google_books: no books returned");
}
// --- Open Library ---
#[tokio::test]
#[ignore]
async fn open_library_search_and_books() {
let p = get_provider("open_library").unwrap();
let cfg = config_en();
let candidates = p.search_series("Sandman Neil Gaiman", &cfg).await.unwrap();
assert!(!candidates.is_empty(), "open_library: no results for Sandman");
print_candidate("open_library", &candidates[0]);
let books = p.get_series_books(&candidates[0].external_id, &cfg).await.unwrap();
print_books("open_library", &books);
assert!(!books.is_empty(), "open_library: no books returned");
}
// --- AniList ---
#[tokio::test]
#[ignore]
async fn anilist_search_finished() {
let p = get_provider("anilist").unwrap();
let cfg = config_fr();
let candidates = p.search_series("Death Note", &cfg).await.unwrap();
assert!(!candidates.is_empty(), "anilist: no results for Death Note");
print_candidate("anilist (finished)", &candidates[0]);
let best = &candidates[0];
assert!(best.total_volumes.is_some(), "anilist: finished series should have total_volumes");
assert!(best.description.is_some(), "anilist: should have description");
assert!(!best.authors.is_empty(), "anilist: should have authors");
let status = best.metadata_json.get("status").and_then(|s| s.as_str());
assert_eq!(status, Some("FINISHED"), "anilist: Death Note should be FINISHED");
let books = p.get_series_books(&best.external_id, &cfg).await.unwrap();
print_books("anilist (Death Note)", &books);
assert!(books.len() >= 12, "anilist: Death Note should have ≥12 volumes, got {}", books.len());
}
#[tokio::test]
#[ignore]
async fn anilist_search_ongoing() {
let p = get_provider("anilist").unwrap();
let cfg = config_fr();
let candidates = p.search_series("One Piece", &cfg).await.unwrap();
assert!(!candidates.is_empty(), "anilist: no results for One Piece");
print_candidate("anilist (ongoing)", &candidates[0]);
let best = &candidates[0];
let status = best.metadata_json.get("status").and_then(|s| s.as_str());
assert_eq!(status, Some("RELEASING"), "anilist: One Piece should be RELEASING");
let volume_source = best.metadata_json.get("volume_source").and_then(|s| s.as_str());
println!(" volume_source: {:?}", volume_source);
println!(" total_volumes: {:?}", best.total_volumes);
}
// --- Bédéthèque ---
#[tokio::test]
#[ignore]
async fn bedetheque_search_and_books() {
let p = get_provider("bedetheque").unwrap();
let cfg = config_fr();
let candidates = p.search_series("De Cape et de Crocs", &cfg).await.unwrap();
assert!(!candidates.is_empty(), "bedetheque: no results");
print_candidate("bedetheque", &candidates[0]);
let best = &candidates[0];
assert!(best.description.is_some(), "bedetheque: should have description");
assert!(!best.authors.is_empty(), "bedetheque: should have authors");
assert!(!best.publishers.is_empty(), "bedetheque: should have publishers");
assert!(best.start_year.is_some(), "bedetheque: should have start_year");
assert!(best.total_volumes.is_some(), "bedetheque: should have total_volumes");
// Enriched metadata_json
let mj = &best.metadata_json;
assert!(mj.get("genres").and_then(|g| g.as_array()).map(|a| !a.is_empty()).unwrap_or(false), "bedetheque: should have genres");
assert!(mj.get("status").and_then(|s| s.as_str()).is_some(), "bedetheque: should have status");
let books = p.get_series_books(&best.external_id, &cfg).await.unwrap();
print_books("bedetheque", &books);
assert!(books.len() >= 12, "bedetheque: De Cape et de Crocs should have ≥12 volumes, got {}", books.len());
}
// --- ComicVine (needs API key) ---
#[tokio::test]
#[ignore]
async fn comicvine_no_key() {
let p = get_provider("comicvine").unwrap();
let cfg = config_en();
let result = p.search_series("Batman", &cfg).await;
println!("\n=== comicvine (no key) ===");
match result {
Ok(c) => println!(" returned {} candidates (unexpected without key)", c.len()),
Err(e) => println!(" expected error: {e}"),
}
}
// --- Cross-provider comparison ---
#[tokio::test]
#[ignore]
async fn cross_provider_blacksad() {
println!("\n{}", "=".repeat(60));
println!(" Cross-provider comparison: Blacksad");
println!("{}\n", "=".repeat(60));
let providers: Vec<(&str, ProviderConfig)> = vec![
("google_books", config_en()),
("open_library", config_en()),
("anilist", config_fr()),
("bedetheque", config_fr()),
];
for (name, cfg) in &providers {
let p = get_provider(name).unwrap();
match p.search_series("Blacksad", cfg).await {
Ok(candidates) if !candidates.is_empty() => {
let b = &candidates[0];
println!("[{name}] title={:?} authors={} desc={} pubs={} year={:?} vols={:?} cover={} url={} conf={:.2}",
b.title, b.authors.len(), b.description.is_some(), b.publishers.len(),
b.start_year, b.total_volumes, b.cover_url.is_some(), b.external_url.is_some(), b.confidence);
}
Ok(_) => println!("[{name}] no results"),
Err(e) => println!("[{name}] error: {e}"),
}
}
}
}


@@ -0,0 +1,351 @@
use super::{BookCandidate, MetadataProvider, ProviderConfig, SeriesCandidate};
pub struct OpenLibraryProvider;
impl MetadataProvider for OpenLibraryProvider {
fn name(&self) -> &str {
"open_library"
}
fn search_series(
&self,
query: &str,
config: &ProviderConfig,
) -> std::pin::Pin<
Box<dyn std::future::Future<Output = Result<Vec<SeriesCandidate>, String>> + Send + '_>,
> {
let query = query.to_string();
let config = config.clone();
Box::pin(async move { search_series_impl(&query, &config).await })
}
fn get_series_books(
&self,
external_id: &str,
config: &ProviderConfig,
) -> std::pin::Pin<
Box<dyn std::future::Future<Output = Result<Vec<BookCandidate>, String>> + Send + '_>,
> {
let external_id = external_id.to_string();
let config = config.clone();
Box::pin(async move { get_series_books_impl(&external_id, &config).await })
}
}
async fn search_series_impl(
query: &str,
config: &ProviderConfig,
) -> Result<Vec<SeriesCandidate>, String> {
let client = reqwest::Client::builder()
.timeout(std::time::Duration::from_secs(15))
.build()
.map_err(|e| format!("failed to build HTTP client: {e}"))?;
// Open Library uses 3-letter language codes
let ol_lang = match config.language.as_str() {
"fr" => "fre",
"es" => "spa",
_ => "eng",
};
let url = format!(
"https://openlibrary.org/search.json?title={}&limit=20&language={}",
urlencoded(query),
ol_lang,
);
let resp = client
.get(&url)
.send()
.await
.map_err(|e| format!("Open Library request failed: {e}"))?;
if !resp.status().is_success() {
let status = resp.status();
let text = resp.text().await.unwrap_or_default();
return Err(format!("Open Library returned {status}: {text}"));
}
let data: serde_json::Value = resp
.json()
.await
.map_err(|e| format!("Failed to parse Open Library response: {e}"))?;
let docs = match data.get("docs").and_then(|d| d.as_array()) {
Some(docs) => docs,
None => return Ok(vec![]),
};
let query_lower = query.to_lowercase();
let mut series_map: std::collections::HashMap<String, SeriesCandidateBuilder> =
std::collections::HashMap::new();
for doc in docs {
let title = doc
.get("title")
.and_then(|t| t.as_str())
.unwrap_or("")
.to_string();
let authors: Vec<String> = doc
.get("author_name")
.and_then(|a| a.as_array())
.map(|arr| arr.iter().filter_map(|v| v.as_str().map(String::from)).collect())
.unwrap_or_default();
let publishers: Vec<String> = doc
.get("publisher")
.and_then(|a| a.as_array())
.map(|arr| {
let mut pubs: Vec<String> = arr.iter().filter_map(|v| v.as_str().map(String::from)).collect();
pubs.truncate(3);
pubs
})
.unwrap_or_default();
let first_publish_year = doc
.get("first_publish_year")
.and_then(|y| y.as_i64())
.map(|y| y as i32);
let cover_i = doc.get("cover_i").and_then(|c| c.as_i64());
let cover_url = cover_i.map(|id| format!("https://covers.openlibrary.org/b/id/{}-M.jpg", id));
let key = doc
.get("key")
.and_then(|k| k.as_str())
.unwrap_or("")
.to_string();
let series_name = extract_series_name(&title);
let entry = series_map
.entry(series_name.clone())
.or_insert_with(|| SeriesCandidateBuilder {
title: series_name.clone(),
authors: vec![],
description: None,
publishers: vec![],
start_year: None,
volume_count: 0,
cover_url: None,
external_id: key.clone(),
external_url: if key.is_empty() {
None
} else {
Some(format!("https://openlibrary.org{}", key))
},
});
entry.volume_count += 1;
for a in &authors {
if !entry.authors.contains(a) {
entry.authors.push(a.clone());
}
}
for p in &publishers {
if !entry.publishers.contains(p) {
entry.publishers.push(p.clone());
}
}
// Keep the earliest first_publish_year seen across all editions
if let Some(y) = first_publish_year {
if entry.start_year.map_or(true, |cur| y < cur) {
entry.start_year = Some(y);
}
}
if entry.cover_url.is_none() {
entry.cover_url = cover_url;
}
}
let mut candidates: Vec<SeriesCandidate> = series_map
.into_values()
.map(|b| {
let confidence = compute_confidence(&b.title, &query_lower);
SeriesCandidate {
external_id: b.external_id,
title: b.title,
authors: b.authors,
description: b.description,
publishers: b.publishers,
start_year: b.start_year,
total_volumes: if b.volume_count > 1 { Some(b.volume_count) } else { None },
cover_url: b.cover_url,
external_url: b.external_url,
confidence,
metadata_json: serde_json::json!({}),
}
})
.collect();
candidates.sort_by(|a, b| b.confidence.partial_cmp(&a.confidence).unwrap_or(std::cmp::Ordering::Equal));
candidates.truncate(10);
Ok(candidates)
}
async fn get_series_books_impl(
external_id: &str,
_config: &ProviderConfig,
) -> Result<Vec<BookCandidate>, String> {
let client = reqwest::Client::builder()
.timeout(std::time::Duration::from_secs(15))
.build()
.map_err(|e| format!("failed to build HTTP client: {e}"))?;
// Fetch the work to get its title for series search
let url = format!("https://openlibrary.org{}.json", external_id);
let resp = client.get(&url).send().await.map_err(|e| format!("Open Library request failed: {e}"))?;
let work: serde_json::Value = if resp.status().is_success() {
resp.json().await.map_err(|e| format!("Failed to parse response: {e}"))?
} else {
serde_json::json!({})
};
let title = work.get("title").and_then(|t| t.as_str()).unwrap_or("");
let series_name = extract_series_name(title);
// Search for editions of this series
let search_url = format!(
"https://openlibrary.org/search.json?title={}&limit=40",
urlencoded(&series_name)
);
let resp = client.get(&search_url).send().await.map_err(|e| format!("Open Library search failed: {e}"))?;
if !resp.status().is_success() {
return Ok(vec![]);
}
let data: serde_json::Value = resp.json().await.map_err(|e| format!("Failed to parse response: {e}"))?;
let docs = match data.get("docs").and_then(|d| d.as_array()) {
Some(docs) => docs,
None => return Ok(vec![]),
};
let mut books: Vec<BookCandidate> = docs
.iter()
.map(|doc| {
let title = doc.get("title").and_then(|t| t.as_str()).unwrap_or("").to_string();
let authors: Vec<String> = doc
.get("author_name")
.and_then(|a| a.as_array())
.map(|arr| arr.iter().filter_map(|v| v.as_str().map(String::from)).collect())
.unwrap_or_default();
let isbn = doc
.get("isbn")
.and_then(|a| a.as_array())
.and_then(|arr| arr.first())
.and_then(|v| v.as_str())
.map(String::from);
let page_count = doc
.get("number_of_pages_median")
.and_then(|n| n.as_i64())
.map(|n| n as i32);
let cover_i = doc.get("cover_i").and_then(|c| c.as_i64());
let cover_url = cover_i.map(|id| format!("https://covers.openlibrary.org/b/id/{}-M.jpg", id));
let language = doc
.get("language")
.and_then(|a| a.as_array())
.and_then(|arr| arr.first())
.and_then(|v| v.as_str())
.map(String::from);
let publish_date = doc
.get("first_publish_year")
.and_then(|y| y.as_i64())
.map(|y| y.to_string());
let key = doc.get("key").and_then(|k| k.as_str()).unwrap_or("").to_string();
let volume_number = extract_volume_number(&title);
BookCandidate {
external_book_id: key,
title,
volume_number,
authors,
isbn,
summary: None,
cover_url,
page_count,
language,
publish_date,
metadata_json: serde_json::json!({}),
}
})
.collect();
books.sort_by_key(|b| b.volume_number.unwrap_or(999));
Ok(books)
}
fn extract_series_name(title: &str) -> String {
let re_patterns = [
r"(?i)\s*[-–—]\s*(?:vol(?:ume)?\.?\s*|tome\s*|t\.\s*|#)\s*\d+.*$",
r"(?i)\s*,?\s*(?:vol(?:ume)?\.?\s*|tome\s*|t\.\s*|#)\s*\d+.*$",
r"\s*\(\d+\)\s*$",
r"\s+\d+\s*$",
];
let mut result = title.to_string();
for pattern in &re_patterns {
if let Ok(re) = regex::Regex::new(pattern) {
let cleaned = re.replace(&result, "").to_string();
// Only accept a pattern that actually stripped something,
// otherwise fall through to the next pattern
if !cleaned.is_empty() && cleaned != result {
result = cleaned;
break;
}
}
}
result.trim().to_string()
}
fn extract_volume_number(title: &str) -> Option<i32> {
let patterns = [
r"(?i)(?:vol(?:ume)?\.?\s*|tome\s*|t\.\s*|#)\s*(\d+)",
r"\((\d+)\)\s*$",
r"\b(\d+)\s*$",
];
for pattern in &patterns {
if let Ok(re) = regex::Regex::new(pattern) {
if let Some(caps) = re.captures(title) {
if let Some(num) = caps.get(1).and_then(|m| m.as_str().parse::<i32>().ok()) {
return Some(num);
}
}
}
}
None
}
fn compute_confidence(title: &str, query: &str) -> f32 {
let title_lower = title.to_lowercase();
if title_lower == query {
1.0
} else if title_lower.starts_with(query) || query.starts_with(&title_lower) {
0.8
} else if title_lower.contains(query) || query.contains(&title_lower) {
0.7
} else {
let common: usize = query.chars().filter(|c| title_lower.contains(*c)).count();
let max_len = query.len().max(title_lower.len()).max(1);
(common as f32 / max_len as f32).clamp(0.1, 0.6)
}
}
fn urlencoded(s: &str) -> String {
let mut result = String::new();
for byte in s.bytes() {
match byte {
b'A'..=b'Z' | b'a'..=b'z' | b'0'..=b'9' | b'-' | b'_' | b'.' | b'~' => {
result.push(byte as char);
}
_ => result.push_str(&format!("%{:02X}", byte)),
}
}
result
}
struct SeriesCandidateBuilder {
title: String,
authors: Vec<String>,
description: Option<String>,
publishers: Vec<String>,
start_year: Option<i32>,
volume_count: i32,
cover_url: Option<String>,
external_id: String,
external_url: Option<String>,
}
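The confidence-scoring and percent-encoding helpers above are pure functions, so they can be exercised in isolation. A standalone sketch, copying both verbatim:

```rust
// Standalone copies of the two pure helpers from the Open Library provider,
// exercised without any network or provider context.

fn compute_confidence(title: &str, query: &str) -> f32 {
    // `query` is expected to already be lowercased by the caller.
    let title_lower = title.to_lowercase();
    if title_lower == query {
        1.0
    } else if title_lower.starts_with(query) || query.starts_with(&title_lower) {
        0.8
    } else if title_lower.contains(query) || query.contains(&title_lower) {
        0.7
    } else {
        // Crude character-overlap fallback, clamped to [0.1, 0.6]
        let common: usize = query.chars().filter(|c| title_lower.contains(*c)).count();
        let max_len = query.len().max(title_lower.len()).max(1);
        (common as f32 / max_len as f32).clamp(0.1, 0.6)
    }
}

fn urlencoded(s: &str) -> String {
    let mut result = String::new();
    for byte in s.bytes() {
        match byte {
            // RFC 3986 unreserved characters pass through unchanged
            b'A'..=b'Z' | b'a'..=b'z' | b'0'..=b'9' | b'-' | b'_' | b'.' | b'~' => {
                result.push(byte as char);
            }
            _ => result.push_str(&format!("%{:02X}", byte)),
        }
    }
    result
}

fn main() {
    // Exact match scores 1.0; a prefix match scores 0.8.
    assert_eq!(compute_confidence("Blacksad", "blacksad"), 1.0);
    assert_eq!(compute_confidence("Blacksad: Amarillo", "blacksad"), 0.8);
    // Spaces (and all other non-unreserved bytes) are percent-encoded.
    assert_eq!(urlencoded("De Cape et de Crocs"), "De%20Cape%20et%20de%20Crocs");
    println!("ok");
}
```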


@@ -0,0 +1,836 @@
use axum::{
extract::{Path as AxumPath, State},
Json,
};
use serde::{Deserialize, Serialize};
use sqlx::{PgPool, Row};
use uuid::Uuid;
use utoipa::ToSchema;
use tracing::{info, warn};
use crate::{error::ApiError, metadata_providers, notifications, state::AppState};
use crate::metadata_batch::{load_provider_config_from_pool, is_job_cancelled, update_progress};
// ---------------------------------------------------------------------------
// DTOs
// ---------------------------------------------------------------------------
#[derive(Deserialize, ToSchema)]
pub struct MetadataRefreshRequest {
pub library_id: String,
}
/// A single field change: old → new
#[derive(Serialize, Clone)]
struct FieldDiff {
field: String,
#[serde(skip_serializing_if = "Option::is_none")]
old: Option<serde_json::Value>,
#[serde(skip_serializing_if = "Option::is_none")]
new: Option<serde_json::Value>,
}
/// Per-book changes
#[derive(Serialize, Clone)]
struct BookDiff {
book_id: String,
title: String,
volume: Option<i32>,
changes: Vec<FieldDiff>,
}
/// Per-series change report
#[derive(Serialize, Clone)]
struct SeriesRefreshResult {
series_name: String,
provider: String,
status: String, // "updated", "unchanged", "error"
series_changes: Vec<FieldDiff>,
book_changes: Vec<BookDiff>,
#[serde(skip_serializing_if = "Option::is_none")]
error: Option<String>,
}
/// Response DTO for the report endpoint
#[derive(Serialize, ToSchema)]
pub struct MetadataRefreshReportDto {
#[schema(value_type = String)]
pub job_id: Uuid,
pub status: String,
pub total_links: i64,
pub refreshed: i64,
pub unchanged: i64,
pub errors: i64,
pub changes: serde_json::Value,
}
// ---------------------------------------------------------------------------
// POST /metadata/refresh — Trigger a metadata refresh job
// ---------------------------------------------------------------------------
#[utoipa::path(
post,
path = "/metadata/refresh",
tag = "metadata",
request_body = MetadataRefreshRequest,
responses(
(status = 200, description = "Job created"),
(status = 400, description = "Bad request"),
),
security(("Bearer" = []))
)]
pub async fn start_refresh(
State(state): State<AppState>,
Json(body): Json<MetadataRefreshRequest>,
) -> Result<Json<serde_json::Value>, ApiError> {
let library_id: Uuid = body
.library_id
.parse()
.map_err(|_| ApiError::bad_request("invalid library_id"))?;
// Verify library exists
sqlx::query("SELECT 1 FROM libraries WHERE id = $1")
.bind(library_id)
.fetch_optional(&state.pool)
.await?
.ok_or_else(|| ApiError::not_found("library not found"))?;
// Check no existing running metadata_refresh job for this library
let existing: Option<Uuid> = sqlx::query_scalar(
"SELECT id FROM index_jobs WHERE library_id = $1 AND type = 'metadata_refresh' AND status IN ('pending', 'running') LIMIT 1",
)
.bind(library_id)
.fetch_optional(&state.pool)
.await?;
if let Some(existing_id) = existing {
return Ok(Json(serde_json::json!({
"id": existing_id.to_string(),
"status": "already_running",
})));
}
// Check there are approved links to refresh (only ongoing series)
let link_count: i64 = sqlx::query_scalar(
r#"
SELECT COUNT(*) FROM external_metadata_links eml
LEFT JOIN series_metadata sm
ON sm.library_id = eml.library_id AND sm.name = eml.series_name
WHERE eml.library_id = $1
AND eml.status = 'approved'
AND COALESCE(sm.status, 'ongoing') NOT IN ('ended', 'cancelled')
"#,
)
.bind(library_id)
.fetch_one(&state.pool)
.await?;
if link_count == 0 {
return Err(ApiError::bad_request("No approved metadata links to refresh for this library"));
}
let job_id = Uuid::new_v4();
sqlx::query(
"INSERT INTO index_jobs (id, library_id, type, status, started_at) VALUES ($1, $2, 'metadata_refresh', 'running', NOW())",
)
.bind(job_id)
.bind(library_id)
.execute(&state.pool)
.await?;
// Spawn the background processing task (status already 'running' to avoid poller race)
let pool = state.pool.clone();
let library_name: Option<String> = sqlx::query_scalar("SELECT name FROM libraries WHERE id = $1")
.bind(library_id)
.fetch_optional(&state.pool)
.await
.ok()
.flatten();
tokio::spawn(async move {
if let Err(e) = process_metadata_refresh(&pool, job_id, library_id).await {
warn!("[METADATA_REFRESH] job {job_id} failed: {e}");
let _ = sqlx::query(
"UPDATE index_jobs SET status = 'failed', error_opt = $2, finished_at = NOW() WHERE id = $1",
)
.bind(job_id)
.bind(e.to_string())
.execute(&pool)
.await;
notifications::notify(
pool.clone(),
notifications::NotificationEvent::MetadataRefreshFailed {
library_name,
error: e.to_string(),
},
);
}
});
Ok(Json(serde_json::json!({
"id": job_id.to_string(),
"status": "pending",
})))
}
// ---------------------------------------------------------------------------
// GET /metadata/refresh/:id/report — Refresh report from stats_json
// ---------------------------------------------------------------------------
#[utoipa::path(
get,
path = "/metadata/refresh/{id}/report",
tag = "metadata",
params(("id" = String, Path, description = "Job UUID")),
responses(
(status = 200, body = MetadataRefreshReportDto),
(status = 404, description = "Job not found"),
),
security(("Bearer" = []))
)]
pub async fn get_refresh_report(
State(state): State<AppState>,
AxumPath(job_id): AxumPath<Uuid>,
) -> Result<Json<MetadataRefreshReportDto>, ApiError> {
let row = sqlx::query(
"SELECT status, stats_json, total_files FROM index_jobs WHERE id = $1 AND type = 'metadata_refresh'",
)
.bind(job_id)
.fetch_optional(&state.pool)
.await?
.ok_or_else(|| ApiError::not_found("job not found"))?;
let job_status: String = row.get("status");
let stats: Option<serde_json::Value> = row.get("stats_json");
let total_files: Option<i32> = row.get("total_files");
let (refreshed, unchanged, errors, changes) = if let Some(ref s) = stats {
(
s.get("refreshed").and_then(|v| v.as_i64()).unwrap_or(0),
s.get("unchanged").and_then(|v| v.as_i64()).unwrap_or(0),
s.get("errors").and_then(|v| v.as_i64()).unwrap_or(0),
s.get("changes").cloned().unwrap_or(serde_json::json!([])),
)
} else {
(0, 0, 0, serde_json::json!([]))
};
Ok(Json(MetadataRefreshReportDto {
job_id,
status: job_status,
total_links: total_files.unwrap_or(0) as i64,
refreshed,
unchanged,
errors,
changes,
}))
}
// ---------------------------------------------------------------------------
// Background processing
// ---------------------------------------------------------------------------
pub(crate) async fn process_metadata_refresh(
pool: &PgPool,
job_id: Uuid,
library_id: Uuid,
) -> Result<(), String> {
// Set job to running
sqlx::query("UPDATE index_jobs SET status = 'running', started_at = NOW() WHERE id = $1")
.bind(job_id)
.execute(pool)
.await
.map_err(|e| e.to_string())?;
// Get approved links for this library, only for ongoing series (not ended/cancelled)
let links: Vec<(Uuid, String, String, String)> = sqlx::query_as(
r#"
SELECT eml.id, eml.series_name, eml.provider, eml.external_id
FROM external_metadata_links eml
LEFT JOIN series_metadata sm
ON sm.library_id = eml.library_id AND sm.name = eml.series_name
WHERE eml.library_id = $1
AND eml.status = 'approved'
AND COALESCE(sm.status, 'ongoing') NOT IN ('ended', 'cancelled')
ORDER BY eml.series_name
"#,
)
.bind(library_id)
.fetch_all(pool)
.await
.map_err(|e| e.to_string())?;
let total = links.len() as i32;
sqlx::query("UPDATE index_jobs SET total_files = $2 WHERE id = $1")
.bind(job_id)
.bind(total)
.execute(pool)
.await
.map_err(|e| e.to_string())?;
let mut processed = 0i32;
let mut refreshed = 0i32;
let mut unchanged = 0i32;
let mut errors = 0i32;
let mut all_results: Vec<SeriesRefreshResult> = Vec::new();
for (link_id, series_name, provider_name, external_id) in &links {
// Check cancellation
if is_job_cancelled(pool, job_id).await {
sqlx::query(
"UPDATE index_jobs SET status = 'cancelled', finished_at = NOW() WHERE id = $1",
)
.bind(job_id)
.execute(pool)
.await
.map_err(|e| e.to_string())?;
return Ok(());
}
match refresh_link(pool, *link_id, library_id, series_name, provider_name, external_id).await {
Ok(result) => {
if result.status == "updated" {
refreshed += 1;
info!("[METADATA_REFRESH] job={job_id} updated series='{series_name}' via {provider_name}");
} else {
unchanged += 1;
}
all_results.push(result);
}
Err(e) => {
errors += 1;
warn!("[METADATA_REFRESH] job={job_id} error on series='{series_name}': {e}");
all_results.push(SeriesRefreshResult {
series_name: series_name.clone(),
provider: provider_name.clone(),
status: "error".to_string(),
series_changes: vec![],
book_changes: vec![],
error: Some(e),
});
}
}
processed += 1;
update_progress(pool, job_id, processed, total, series_name).await;
// Rate limit: 1s delay between provider calls
tokio::time::sleep(std::time::Duration::from_secs(1)).await;
}
// Only keep series that have changes or errors (filter out "unchanged")
let changes_only: Vec<&SeriesRefreshResult> = all_results
.iter()
.filter(|r| r.status != "unchanged")
.collect();
// Build stats summary
let stats = serde_json::json!({
"total_links": total,
"refreshed": refreshed,
"unchanged": unchanged,
"errors": errors,
"changes": changes_only,
});
sqlx::query(
"UPDATE index_jobs SET status = 'success', finished_at = NOW(), progress_percent = 100, stats_json = $2 WHERE id = $1",
)
.bind(job_id)
.bind(stats)
.execute(pool)
.await
.map_err(|e| e.to_string())?;
info!("[METADATA_REFRESH] job={job_id} completed: {refreshed} updated, {unchanged} unchanged, {errors} errors");
let library_name: Option<String> = sqlx::query_scalar("SELECT name FROM libraries WHERE id = $1")
.bind(library_id)
.fetch_optional(pool)
.await
.ok()
.flatten();
notifications::notify(
pool.clone(),
notifications::NotificationEvent::MetadataRefreshCompleted {
library_name,
refreshed,
unchanged,
errors,
},
);
Ok(())
}
/// Refresh a single approved metadata link: re-fetch from provider, compare, sync, return diff
async fn refresh_link(
pool: &PgPool,
link_id: Uuid,
library_id: Uuid,
series_name: &str,
provider_name: &str,
external_id: &str,
) -> Result<SeriesRefreshResult, String> {
let provider = metadata_providers::get_provider(provider_name)
.ok_or_else(|| format!("Unknown provider: {provider_name}"))?;
let config = load_provider_config_from_pool(pool, provider_name).await;
let mut series_changes: Vec<FieldDiff> = Vec::new();
let mut book_changes: Vec<BookDiff> = Vec::new();
// ── Series-level refresh ──────────────────────────────────────────────
let candidates = provider
.search_series(series_name, &config)
.await
.map_err(|e| format!("provider search error: {e}"))?;
let candidate = candidates
.iter()
.find(|c| c.external_id == external_id)
.or_else(|| candidates.first());
if let Some(candidate) = candidate {
// Update link metadata_json
sqlx::query(
r#"
UPDATE external_metadata_links
SET metadata_json = $2,
total_volumes_external = $3,
updated_at = NOW()
WHERE id = $1
"#,
)
.bind(link_id)
.bind(&candidate.metadata_json)
.bind(candidate.total_volumes)
.execute(pool)
.await
.map_err(|e| e.to_string())?;
// Diff + sync series metadata
series_changes = sync_series_with_diff(pool, library_id, series_name, candidate).await?;
}
// ── Book-level refresh ────────────────────────────────────────────────
let books = provider
.get_series_books(external_id, &config)
.await
.map_err(|e| format!("provider books error: {e}"))?;
// Delete existing external_book_metadata for this link
sqlx::query("DELETE FROM external_book_metadata WHERE link_id = $1")
.bind(link_id)
.execute(pool)
.await
.map_err(|e| e.to_string())?;
// Pre-fetch local books
let local_books: Vec<(Uuid, Option<i32>, String)> = sqlx::query_as(
r#"
SELECT id, volume, title FROM books
WHERE library_id = $1
AND COALESCE(NULLIF(series, ''), 'unclassified') = $2
ORDER BY volume NULLS LAST,
REGEXP_REPLACE(LOWER(title), '[0-9].*$', ''),
COALESCE((REGEXP_MATCH(LOWER(title), '\d+'))[1]::int, 0),
title ASC
"#,
)
.bind(library_id)
.bind(series_name)
.fetch_all(pool)
.await
.map_err(|e| e.to_string())?;
let local_books_with_pos: Vec<(Uuid, i32, String)> = local_books
.iter()
.enumerate()
.map(|(idx, (id, vol, title))| (*id, vol.unwrap_or((idx + 1) as i32), title.clone()))
.collect();
let mut matched_local_ids = std::collections::HashSet::new();
for (ext_idx, book) in books.iter().enumerate() {
let ext_vol = book.volume_number.unwrap_or((ext_idx + 1) as i32);
// Match by volume number
let mut local_book_id: Option<Uuid> = local_books_with_pos
.iter()
.find(|(id, v, _)| *v == ext_vol && !matched_local_ids.contains(id))
.map(|(id, _, _)| *id);
// Match by title containment
if local_book_id.is_none() {
let ext_title_lower = book.title.to_lowercase();
local_book_id = local_books_with_pos
.iter()
.find(|(id, _, local_title)| {
if matched_local_ids.contains(id) {
return false;
}
let local_lower = local_title.to_lowercase();
local_lower.contains(&ext_title_lower) || ext_title_lower.contains(&local_lower)
})
.map(|(id, _, _)| *id);
}
if let Some(id) = local_book_id {
matched_local_ids.insert(id);
}
// Insert external_book_metadata
sqlx::query(
r#"
INSERT INTO external_book_metadata
(link_id, book_id, external_book_id, volume_number, title, authors, isbn, summary, cover_url, page_count, language, publish_date, metadata_json)
VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9, $10, $11, $12, $13)
"#,
)
.bind(link_id)
.bind(local_book_id)
.bind(&book.external_book_id)
.bind(book.volume_number)
.bind(&book.title)
.bind(&book.authors)
.bind(&book.isbn)
.bind(&book.summary)
.bind(&book.cover_url)
.bind(book.page_count)
.bind(&book.language)
.bind(&book.publish_date)
.bind(&book.metadata_json)
.execute(pool)
.await
.map_err(|e| e.to_string())?;
// Diff + push metadata to matched local book
if let Some(book_id) = local_book_id {
let diffs = sync_book_with_diff(pool, book_id, book).await?;
if !diffs.is_empty() {
let local_title = local_books_with_pos
.iter()
.find(|(id, _, _)| *id == book_id)
.map(|(_, _, t)| t.clone())
.unwrap_or_default();
book_changes.push(BookDiff {
book_id: book_id.to_string(),
title: local_title,
volume: book.volume_number,
changes: diffs,
});
}
}
}
// Update synced_at on the link
sqlx::query("UPDATE external_metadata_links SET synced_at = NOW(), updated_at = NOW() WHERE id = $1")
.bind(link_id)
.execute(pool)
.await
.map_err(|e| e.to_string())?;
let has_changes = !series_changes.is_empty() || !book_changes.is_empty();
Ok(SeriesRefreshResult {
series_name: series_name.to_string(),
provider: provider_name.to_string(),
status: if has_changes { "updated".to_string() } else { "unchanged".to_string() },
series_changes,
book_changes,
error: None,
})
}
// ---------------------------------------------------------------------------
// Diff helpers
// ---------------------------------------------------------------------------
/// Compare old/new for a nullable string field. Returns Some(FieldDiff) only if value actually changed.
fn diff_opt_str(field: &str, old: Option<&str>, new: Option<&str>) -> Option<FieldDiff> {
let new_val = new.filter(|s| !s.is_empty());
// Only report a change if there is a new non-empty value AND it differs from old
match (old, new_val) {
(Some(o), Some(n)) if o != n => Some(FieldDiff {
field: field.to_string(),
old: Some(serde_json::Value::String(o.to_string())),
new: Some(serde_json::Value::String(n.to_string())),
}),
(None, Some(n)) => Some(FieldDiff {
field: field.to_string(),
old: None,
new: Some(serde_json::Value::String(n.to_string())),
}),
_ => None,
}
}
fn diff_opt_i32(field: &str, old: Option<i32>, new: Option<i32>) -> Option<FieldDiff> {
match (old, new) {
(Some(o), Some(n)) if o != n => Some(FieldDiff {
field: field.to_string(),
old: Some(serde_json::json!(o)),
new: Some(serde_json::json!(n)),
}),
(None, Some(n)) => Some(FieldDiff {
field: field.to_string(),
old: None,
new: Some(serde_json::json!(n)),
}),
_ => None,
}
}
fn diff_str_vec(field: &str, old: &[String], new: &[String]) -> Option<FieldDiff> {
if new.is_empty() {
return None;
}
if old != new {
Some(FieldDiff {
field: field.to_string(),
old: Some(serde_json::json!(old)),
new: Some(serde_json::json!(new)),
})
} else {
None
}
}
// ---------------------------------------------------------------------------
// Series sync with diff tracking
// ---------------------------------------------------------------------------
async fn sync_series_with_diff(
pool: &PgPool,
library_id: Uuid,
series_name: &str,
candidate: &metadata_providers::SeriesCandidate,
) -> Result<Vec<FieldDiff>, String> {
let new_description = candidate.metadata_json
.get("description")
.and_then(|d| d.as_str())
.or(candidate.description.as_deref());
let new_authors = &candidate.authors;
let new_publishers = &candidate.publishers;
let new_start_year = candidate.start_year;
let new_total_volumes = candidate.total_volumes;
let new_status = if let Some(raw) = candidate.metadata_json.get("status").and_then(|s| s.as_str()) {
Some(crate::metadata::normalize_series_status(pool, raw).await)
} else {
None
};
let new_status = new_status.as_deref();
// Fetch existing series metadata for diffing
let existing = sqlx::query(
r#"SELECT description, publishers, start_year, total_volumes, status, authors, locked_fields
FROM series_metadata WHERE library_id = $1 AND name = $2"#,
)
.bind(library_id)
.bind(series_name)
.fetch_optional(pool)
.await
.map_err(|e| e.to_string())?;
let locked = existing
.as_ref()
.map(|r| r.get::<serde_json::Value, _>("locked_fields"))
.unwrap_or(serde_json::json!({}));
let is_locked = |field: &str| -> bool {
locked.get(field).and_then(|v| v.as_bool()).unwrap_or(false)
};
// Build diffs (only for unlocked fields that actually change)
let mut diffs: Vec<FieldDiff> = Vec::new();
if !is_locked("description") {
let old_desc: Option<String> = existing.as_ref().and_then(|r| r.get("description"));
if let Some(d) = diff_opt_str("description", old_desc.as_deref(), new_description) {
diffs.push(d);
}
}
if !is_locked("authors") {
let old_authors: Vec<String> = existing.as_ref().map(|r| r.get("authors")).unwrap_or_default();
if let Some(d) = diff_str_vec("authors", &old_authors, new_authors) {
diffs.push(d);
}
}
if !is_locked("publishers") {
let old_publishers: Vec<String> = existing.as_ref().map(|r| r.get("publishers")).unwrap_or_default();
if let Some(d) = diff_str_vec("publishers", &old_publishers, new_publishers) {
diffs.push(d);
}
}
if !is_locked("start_year") {
let old_year: Option<i32> = existing.as_ref().and_then(|r| r.get("start_year"));
if let Some(d) = diff_opt_i32("start_year", old_year, new_start_year) {
diffs.push(d);
}
}
if !is_locked("total_volumes") {
let old_vols: Option<i32> = existing.as_ref().and_then(|r| r.get("total_volumes"));
if let Some(d) = diff_opt_i32("total_volumes", old_vols, new_total_volumes) {
diffs.push(d);
}
}
if !is_locked("status") {
let old_status: Option<String> = existing.as_ref().and_then(|r| r.get("status"));
if let Some(d) = diff_opt_str("status", old_status.as_deref(), new_status) {
diffs.push(d);
}
}
// Now do the actual upsert
sqlx::query(
r#"
INSERT INTO series_metadata (library_id, name, description, publishers, start_year, total_volumes, status, authors, created_at, updated_at)
VALUES ($1, $2, $3, $4, $5, $6, $7, $8, NOW(), NOW())
ON CONFLICT (library_id, name)
DO UPDATE SET
description = CASE
WHEN (series_metadata.locked_fields->>'description')::boolean IS TRUE THEN series_metadata.description
ELSE COALESCE(NULLIF(EXCLUDED.description, ''), series_metadata.description)
END,
publishers = CASE
WHEN (series_metadata.locked_fields->>'publishers')::boolean IS TRUE THEN series_metadata.publishers
WHEN array_length(EXCLUDED.publishers, 1) > 0 THEN EXCLUDED.publishers
ELSE series_metadata.publishers
END,
start_year = CASE
WHEN (series_metadata.locked_fields->>'start_year')::boolean IS TRUE THEN series_metadata.start_year
ELSE COALESCE(EXCLUDED.start_year, series_metadata.start_year)
END,
total_volumes = CASE
WHEN (series_metadata.locked_fields->>'total_volumes')::boolean IS TRUE THEN series_metadata.total_volumes
ELSE COALESCE(EXCLUDED.total_volumes, series_metadata.total_volumes)
END,
status = CASE
WHEN (series_metadata.locked_fields->>'status')::boolean IS TRUE THEN series_metadata.status
ELSE COALESCE(EXCLUDED.status, series_metadata.status)
END,
authors = CASE
WHEN (series_metadata.locked_fields->>'authors')::boolean IS TRUE THEN series_metadata.authors
WHEN array_length(EXCLUDED.authors, 1) > 0 THEN EXCLUDED.authors
ELSE series_metadata.authors
END,
updated_at = NOW()
"#,
)
.bind(library_id)
.bind(series_name)
.bind(new_description)
.bind(new_publishers)
.bind(new_start_year)
.bind(new_total_volumes)
.bind(new_status)
.bind(new_authors)
.execute(pool)
.await
.map_err(|e| e.to_string())?;
Ok(diffs)
}
// ---------------------------------------------------------------------------
// Book sync with diff tracking
// ---------------------------------------------------------------------------
async fn sync_book_with_diff(
pool: &PgPool,
book_id: Uuid,
ext_book: &metadata_providers::BookCandidate,
) -> Result<Vec<FieldDiff>, String> {
// Fetch current book state
let current = sqlx::query(
"SELECT summary, isbn, publish_date, language, authors, locked_fields FROM books WHERE id = $1",
)
.bind(book_id)
.fetch_one(pool)
.await
.map_err(|e| e.to_string())?;
let locked = current.get::<serde_json::Value, _>("locked_fields");
let is_locked = |field: &str| -> bool {
locked.get(field).and_then(|v| v.as_bool()).unwrap_or(false)
};
// Build diffs
let mut diffs: Vec<FieldDiff> = Vec::new();
if !is_locked("summary") {
let old: Option<String> = current.get("summary");
if let Some(d) = diff_opt_str("summary", old.as_deref(), ext_book.summary.as_deref()) {
diffs.push(d);
}
}
if !is_locked("isbn") {
let old: Option<String> = current.get("isbn");
if let Some(d) = diff_opt_str("isbn", old.as_deref(), ext_book.isbn.as_deref()) {
diffs.push(d);
}
}
if !is_locked("publish_date") {
let old: Option<String> = current.get("publish_date");
if let Some(d) = diff_opt_str("publish_date", old.as_deref(), ext_book.publish_date.as_deref()) {
diffs.push(d);
}
}
if !is_locked("language") {
let old: Option<String> = current.get("language");
if let Some(d) = diff_opt_str("language", old.as_deref(), ext_book.language.as_deref()) {
diffs.push(d);
}
}
if !is_locked("authors") {
let old: Vec<String> = current.get("authors");
if let Some(d) = diff_str_vec("authors", &old, &ext_book.authors) {
diffs.push(d);
}
}
// Do the actual update
sqlx::query(
r#"
UPDATE books SET
summary = CASE
WHEN (locked_fields->>'summary')::boolean IS TRUE THEN summary
ELSE COALESCE(NULLIF($2, ''), summary)
END,
isbn = CASE
WHEN (locked_fields->>'isbn')::boolean IS TRUE THEN isbn
ELSE COALESCE(NULLIF($3, ''), isbn)
END,
publish_date = CASE
WHEN (locked_fields->>'publish_date')::boolean IS TRUE THEN publish_date
ELSE COALESCE(NULLIF($4, ''), publish_date)
END,
language = CASE
WHEN (locked_fields->>'language')::boolean IS TRUE THEN language
ELSE COALESCE(NULLIF($5, ''), language)
END,
authors = CASE
WHEN (locked_fields->>'authors')::boolean IS TRUE THEN authors
WHEN CARDINALITY($6::text[]) > 0 THEN $6
ELSE authors
END,
author = CASE
WHEN (locked_fields->>'authors')::boolean IS TRUE THEN author
WHEN CARDINALITY($6::text[]) > 0 THEN $6[1]
ELSE author
END,
updated_at = NOW()
WHERE id = $1
"#,
)
.bind(book_id)
.bind(&ext_book.summary)
.bind(&ext_book.isbn)
.bind(&ext_book.publish_date)
.bind(&ext_book.language)
.bind(&ext_book.authors)
.execute(pool)
.await
.map_err(|e| e.to_string())?;
Ok(diffs)
}
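The `FieldDiff`, `diff_opt_str`, and `diff_str_vec` helpers this function calls are not part of this hunk; a minimal sketch consistent with the call sites above (names and semantics assumed, not the project's actual definitions) might be:

```rust
// Hypothetical sketch of the diff helpers called above. A diff is only
// reported when the incoming value is present and differs from the old one,
// mirroring the COALESCE(NULLIF(...)) behavior of the UPDATE statement.
#[derive(Debug, PartialEq)]
struct FieldDiff {
    field: String,
    old: Option<String>,
    new: Option<String>,
}

fn diff_opt_str(field: &str, old: Option<&str>, new: Option<&str>) -> Option<FieldDiff> {
    match new {
        Some(n) if old != Some(n) => Some(FieldDiff {
            field: field.to_string(),
            old: old.map(str::to_string),
            new: Some(n.to_string()),
        }),
        _ => None,
    }
}

fn diff_str_vec(field: &str, old: &[String], new: &[String]) -> Option<FieldDiff> {
    // An empty incoming list means "no data from the provider", not a clear.
    if !new.is_empty() && old != new {
        Some(FieldDiff {
            field: field.to_string(),
            old: Some(old.join(", ")),
            new: Some(new.join(", ")),
        })
    } else {
        None
    }
}
```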


@@ -10,14 +10,14 @@ use utoipa::OpenApi;
crate::reading_progress::update_reading_progress,
crate::reading_progress::mark_series_read,
crate::books::get_thumbnail,
crate::books::list_series,
crate::books::list_all_series,
crate::books::ongoing_series,
crate::books::ongoing_books,
crate::series::list_series,
crate::series::list_all_series,
crate::series::ongoing_series,
crate::series::ongoing_books,
crate::books::convert_book,
crate::books::update_book,
crate::books::get_series_metadata,
crate::books::update_series,
crate::series::get_series_metadata,
crate::series::update_series,
crate::pages::get_page,
crate::search::search_books,
crate::index_jobs::enqueue_rebuild,
@@ -35,10 +35,12 @@ use utoipa::OpenApi;
crate::libraries::delete_library,
crate::libraries::scan_library,
crate::libraries::update_monitoring,
crate::libraries::update_metadata_provider,
crate::tokens::list_tokens,
crate::tokens::create_token,
crate::tokens::revoke_token,
crate::tokens::delete_token,
crate::authors::list_authors,
crate::stats::get_stats,
crate::settings::get_settings,
crate::settings::get_setting,
@@ -46,6 +48,30 @@ use utoipa::OpenApi;
crate::settings::clear_cache,
crate::settings::get_cache_stats,
crate::settings::get_thumbnail_stats,
crate::metadata::search_metadata,
crate::metadata::create_metadata_match,
crate::metadata::approve_metadata,
crate::metadata::reject_metadata,
crate::metadata::get_metadata_links,
crate::metadata::get_missing_books,
crate::metadata::delete_metadata_link,
crate::series::series_statuses,
crate::series::provider_statuses,
crate::settings::list_status_mappings,
crate::settings::upsert_status_mapping,
crate::settings::delete_status_mapping,
crate::prowlarr::search_prowlarr,
crate::prowlarr::test_prowlarr,
crate::qbittorrent::add_torrent,
crate::qbittorrent::test_qbittorrent,
crate::metadata_batch::start_batch,
crate::metadata_batch::get_batch_report,
crate::metadata_batch::get_batch_results,
crate::metadata_refresh::start_refresh,
crate::metadata_refresh::get_refresh_report,
crate::komga::sync_komga_read_books,
crate::komga::list_sync_reports,
crate::komga::get_sync_report,
),
components(
schemas(
@@ -57,14 +83,14 @@ use utoipa::OpenApi;
crate::reading_progress::UpdateReadingProgressRequest,
crate::reading_progress::MarkSeriesReadRequest,
crate::reading_progress::MarkSeriesReadResponse,
crate::books::SeriesItem,
crate::books::SeriesPage,
crate::books::ListAllSeriesQuery,
crate::books::OngoingQuery,
crate::series::SeriesItem,
crate::series::SeriesPage,
crate::series::ListAllSeriesQuery,
crate::series::OngoingQuery,
crate::books::UpdateBookRequest,
crate::books::SeriesMetadata,
crate::books::UpdateSeriesRequest,
crate::books::UpdateSeriesResponse,
crate::series::SeriesMetadata,
crate::series::UpdateSeriesRequest,
crate::series::UpdateSeriesResponse,
crate::pages::PageQuery,
crate::search::SearchQuery,
crate::search::SearchResponse,
@@ -79,6 +105,7 @@ use utoipa::OpenApi;
crate::libraries::LibraryResponse,
crate::libraries::CreateLibraryRequest,
crate::libraries::UpdateMonitoringRequest,
crate::libraries::UpdateMetadataProviderRequest,
crate::tokens::CreateTokenRequest,
crate::tokens::TokenResponse,
crate::tokens::CreatedTokenResponse,
@@ -86,6 +113,11 @@ use utoipa::OpenApi;
crate::settings::ClearCacheResponse,
crate::settings::CacheStats,
crate::settings::ThumbnailStats,
crate::settings::StatusMappingDto,
crate::settings::UpsertStatusMappingRequest,
crate::authors::ListAuthorsQuery,
crate::authors::AuthorItem,
crate::authors::AuthorsPageResponse,
crate::stats::StatsResponse,
crate::stats::StatsOverview,
crate::stats::ReadingStatusStats,
@@ -94,6 +126,37 @@ use utoipa::OpenApi;
crate::stats::LibraryStats,
crate::stats::TopSeries,
crate::stats::MonthlyAdditions,
crate::stats::MetadataStats,
crate::stats::ProviderCount,
crate::metadata::ApproveRequest,
crate::metadata::ApproveResponse,
crate::metadata::SyncReport,
crate::metadata::SeriesSyncReport,
crate::metadata::BookSyncReport,
crate::metadata::FieldChange,
crate::metadata::MetadataSearchRequest,
crate::metadata::SeriesCandidateDto,
crate::metadata::MetadataMatchRequest,
crate::metadata::ExternalMetadataLinkDto,
crate::metadata::MissingBooksDto,
crate::metadata::MissingBookItem,
crate::qbittorrent::QBittorrentAddRequest,
crate::qbittorrent::QBittorrentAddResponse,
crate::qbittorrent::QBittorrentTestResponse,
crate::prowlarr::ProwlarrSearchRequest,
crate::prowlarr::ProwlarrRelease,
crate::prowlarr::ProwlarrCategory,
crate::prowlarr::ProwlarrSearchResponse,
crate::prowlarr::MissingVolumeInput,
crate::prowlarr::ProwlarrTestResponse,
crate::metadata_batch::MetadataBatchRequest,
crate::metadata_batch::MetadataBatchReportDto,
crate::metadata_batch::MetadataBatchResultDto,
crate::metadata_refresh::MetadataRefreshRequest,
crate::metadata_refresh::MetadataRefreshReportDto,
crate::komga::KomgaSyncRequest,
crate::komga::KomgaSyncResponse,
crate::komga::KomgaSyncReportSummary,
ErrorResponse,
)
),
@@ -101,12 +164,20 @@ use utoipa::OpenApi;
("Bearer" = [])
),
tags(
(name = "books", description = "Read-only endpoints for browsing and searching books"),
(name = "books", description = "Book browsing, details and management"),
(name = "series", description = "Series browsing, filtering and management"),
(name = "search", description = "Full-text search across books and series"),
(name = "reading-progress", description = "Reading progress tracking per book"),
(name = "libraries", description = "Library management endpoints (Admin only)"),
(name = "authors", description = "Author browsing and listing"),
(name = "stats", description = "Collection statistics and dashboard data"),
(name = "libraries", description = "Library listing, scanning, and management (create/delete/settings: Admin only)"),
(name = "indexing", description = "Search index management and job control (Admin only)"),
(name = "metadata", description = "External metadata providers and matching (Admin only)"),
(name = "komga", description = "Komga read-status sync (Admin only)"),
(name = "tokens", description = "API token management (Admin only)"),
(name = "settings", description = "Application settings and cache management (Admin only)"),
(name = "prowlarr", description = "Prowlarr indexer integration (Admin only)"),
(name = "qbittorrent", description = "qBittorrent download client integration (Admin only)"),
),
modifiers(&SecurityAddon)
)]


@@ -277,7 +277,17 @@ pub async fn get_page(
let cache_dir2 = cache_dir_path.clone();
let format2 = format;
tokio::spawn(async move {
prefetch_page(state2, book_id, &abs_path2, next_page, format2, quality, width, filter, timeout_secs, &cache_dir2).await;
prefetch_page(state2, &PrefetchParams {
book_id,
abs_path: &abs_path2,
page: next_page,
format: format2,
quality,
width,
filter,
timeout_secs,
cache_dir: &cache_dir2,
}).await;
});
}
@@ -290,19 +300,30 @@ pub async fn get_page(
}
}
/// Prefetch a single page into disk+memory cache (best-effort, ignores errors).
async fn prefetch_page(
state: AppState,
struct PrefetchParams<'a> {
book_id: Uuid,
abs_path: &str,
abs_path: &'a str,
page: u32,
format: OutputFormat,
quality: u8,
width: u32,
filter: image::imageops::FilterType,
timeout_secs: u64,
cache_dir: &Path,
) {
cache_dir: &'a Path,
}
/// Prefetch a single page into disk+memory cache (best-effort, ignores errors).
async fn prefetch_page(state: AppState, params: &PrefetchParams<'_>) {
let book_id = params.book_id;
let page = params.page;
let format = params.format;
let quality = params.quality;
let width = params.width;
let filter = params.filter;
let timeout_secs = params.timeout_secs;
let abs_path = params.abs_path;
let cache_dir = params.cache_dir;
let mem_key = format!("{book_id}:{page}:{}:{quality}:{width}", format.extension());
// Already in memory cache?
if state.page_cache.lock().await.contains(&mem_key) {
@@ -330,6 +351,7 @@ async fn prefetch_page(
Some(ref e) if e == "cbz" => "cbz",
Some(ref e) if e == "cbr" => "cbr",
Some(ref e) if e == "pdf" => "pdf",
Some(ref e) if e == "epub" => "epub",
_ => return,
}
.to_string();
@@ -458,6 +480,7 @@ fn render_page(
"cbz" => parsers::BookFormat::Cbz,
"cbr" => parsers::BookFormat::Cbr,
"pdf" => parsers::BookFormat::Pdf,
"epub" => parsers::BookFormat::Epub,
_ => return Err(ApiError::bad_request("unsupported source format")),
};

apps/api/src/prowlarr.rs (new file, 363 lines)

@@ -0,0 +1,363 @@
use axum::{extract::State, Json};
use serde::{Deserialize, Serialize};
use sqlx::Row;
use utoipa::ToSchema;
use crate::{error::ApiError, state::AppState};
// ─── Types ──────────────────────────────────────────────────────────────────
#[derive(Deserialize, ToSchema)]
pub struct MissingVolumeInput {
pub volume_number: Option<i32>,
#[allow(dead_code)]
pub title: Option<String>,
}
#[derive(Deserialize, ToSchema)]
pub struct ProwlarrSearchRequest {
pub series_name: String,
pub volume_number: Option<i32>,
pub custom_query: Option<String>,
pub missing_volumes: Option<Vec<MissingVolumeInput>>,
}
#[derive(Serialize, Deserialize, ToSchema)]
#[serde(rename_all = "camelCase")]
pub struct ProwlarrRawRelease {
pub guid: String,
pub title: String,
pub size: i64,
pub download_url: Option<String>,
pub indexer: Option<String>,
pub seeders: Option<i32>,
pub leechers: Option<i32>,
pub publish_date: Option<String>,
pub protocol: Option<String>,
pub info_url: Option<String>,
pub categories: Option<Vec<ProwlarrCategory>>,
}
#[derive(Serialize, ToSchema)]
#[serde(rename_all = "camelCase")]
pub struct ProwlarrRelease {
pub guid: String,
pub title: String,
pub size: i64,
pub download_url: Option<String>,
pub indexer: Option<String>,
pub seeders: Option<i32>,
pub leechers: Option<i32>,
pub publish_date: Option<String>,
pub protocol: Option<String>,
pub info_url: Option<String>,
pub categories: Option<Vec<ProwlarrCategory>>,
#[serde(skip_serializing_if = "Option::is_none")]
pub matched_missing_volumes: Option<Vec<i32>>,
}
#[derive(Serialize, Deserialize, ToSchema)]
#[serde(rename_all = "camelCase")]
pub struct ProwlarrCategory {
pub id: i32,
pub name: Option<String>,
}
#[derive(Serialize, ToSchema)]
pub struct ProwlarrSearchResponse {
pub results: Vec<ProwlarrRelease>,
pub query: String,
}
#[derive(Serialize, ToSchema)]
pub struct ProwlarrTestResponse {
pub success: bool,
pub message: String,
pub indexer_count: Option<i32>,
}
// ─── Config helper ──────────────────────────────────────────────────────────
#[derive(Deserialize)]
struct ProwlarrConfig {
url: String,
api_key: String,
categories: Option<Vec<i32>>,
}
async fn load_prowlarr_config(
pool: &sqlx::PgPool,
) -> Result<(String, String, Vec<i32>), ApiError> {
let row = sqlx::query("SELECT value FROM app_settings WHERE key = 'prowlarr'")
.fetch_optional(pool)
.await?;
let row = row.ok_or_else(|| ApiError::bad_request("Prowlarr is not configured"))?;
let value: serde_json::Value = row.get("value");
let config: ProwlarrConfig = serde_json::from_value(value)
.map_err(|e| ApiError::internal(format!("invalid prowlarr config: {e}")))?;
if config.url.is_empty() || config.api_key.is_empty() {
return Err(ApiError::bad_request(
"Prowlarr URL and API key must be configured in settings",
));
}
let url = config.url.trim_end_matches('/').to_string();
let categories = config.categories.unwrap_or_else(|| vec![7030, 7020]);
Ok((url, config.api_key, categories))
}
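The settings row this helper reads is assumed to look like the following (key names taken from the `ProwlarrConfig` struct above; the values are illustrative only):

```sql
-- Illustrative shape of the 'prowlarr' row in app_settings.
-- 7030/7020 are the fallback categories applied when none are configured.
INSERT INTO app_settings (key, value) VALUES (
  'prowlarr',
  '{"url": "http://prowlarr:9696", "api_key": "<your-api-key>", "categories": [7030, 7020]}'
);
```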
// ─── Volume matching ─────────────────────────────────────────────────────────
/// Extract volume numbers from a release title.
/// Looks for patterns like: T01, Tome 01, Vol. 01, v01, #01,
/// or standalone numbers that appear after common separators.
fn extract_volumes_from_title(title: &str) -> Vec<i32> {
let lower = title.to_lowercase();
let mut volumes = Vec::new();
// Patterns: T01, Tome 01, Tome01, Vol 01, Vol.01, v01, #01
let prefixes = ["tome", "vol.", "vol ", "t", "v", "#"];
let chars: Vec<char> = lower.chars().collect();
let len = chars.len();
for prefix in &prefixes {
let mut start = 0;
while let Some(pos) = lower[start..].find(prefix) {
let abs_pos = start + pos;
let after = abs_pos + prefix.len();
// For single-char prefixes (t, v, #), ensure it's at a word boundary
if prefix.len() == 1 && *prefix != "#" {
if abs_pos > 0 && chars[abs_pos - 1].is_alphanumeric() {
start = after;
continue;
}
}
// Skip optional spaces after prefix
let mut i = after;
while i < len && chars[i] == ' ' {
i += 1;
}
// Read digits
let digit_start = i;
while i < len && chars[i].is_ascii_digit() {
i += 1;
}
if i > digit_start {
if let Ok(num) = lower[digit_start..i].parse::<i32>() {
if !volumes.contains(&num) {
volumes.push(num);
}
}
}
start = after;
}
}
volumes
}
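To make the intended behavior concrete, here is a deliberately simplified re-statement of the scanner above, handling only the single-letter markers (`t`, `v`, `#`) with an optional space before the digits; the real function also handles `tome` and `vol.`:

```rust
// Simplified, hypothetical version of extract_volumes_from_title:
// single-letter markers at a word boundary, optional spaces, then digits.
fn volumes(title: &str) -> Vec<i32> {
    let lower = title.to_lowercase();
    let chars: Vec<char> = lower.chars().collect();
    let mut out = Vec::new();
    let mut i = 0;
    while i < chars.len() {
        if (chars[i] == 't' || chars[i] == 'v' || chars[i] == '#')
            && (i == 0 || !chars[i - 1].is_alphanumeric())
        {
            let mut j = i + 1;
            while j < chars.len() && chars[j] == ' ' {
                j += 1;
            }
            let start = j;
            while j < chars.len() && chars[j].is_ascii_digit() {
                j += 1;
            }
            if j > start {
                // Collect via chars to stay safe on non-ASCII titles.
                let num: String = chars[start..j].iter().collect();
                if let Ok(n) = num.parse::<i32>() {
                    if !out.contains(&n) {
                        out.push(n);
                    }
                }
            }
        }
        i += 1;
    }
    out
}
```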
/// Match releases against missing volume numbers.
fn match_missing_volumes(
releases: Vec<ProwlarrRawRelease>,
missing: &[MissingVolumeInput],
) -> Vec<ProwlarrRelease> {
let missing_numbers: Vec<i32> = missing
.iter()
.filter_map(|m| m.volume_number)
.collect();
releases
.into_iter()
.map(|r| {
let matched = if missing_numbers.is_empty() {
None
} else {
let title_volumes = extract_volumes_from_title(&r.title);
let matched: Vec<i32> = title_volumes
.into_iter()
.filter(|v| missing_numbers.contains(v))
.collect();
if matched.is_empty() {
None
} else {
Some(matched)
}
};
ProwlarrRelease {
guid: r.guid,
title: r.title,
size: r.size,
download_url: r.download_url,
indexer: r.indexer,
seeders: r.seeders,
leechers: r.leechers,
publish_date: r.publish_date,
protocol: r.protocol,
info_url: r.info_url,
categories: r.categories,
matched_missing_volumes: matched,
}
})
.collect()
}
// ─── Handlers ───────────────────────────────────────────────────────────────
/// Search for releases on Prowlarr
#[utoipa::path(
post,
path = "/prowlarr/search",
tag = "prowlarr",
request_body = ProwlarrSearchRequest,
responses(
(status = 200, body = ProwlarrSearchResponse),
(status = 400, description = "Bad request or Prowlarr not configured"),
(status = 401, description = "Unauthorized"),
(status = 500, description = "Prowlarr connection error"),
),
security(("Bearer" = []))
)]
pub async fn search_prowlarr(
State(state): State<AppState>,
Json(body): Json<ProwlarrSearchRequest>,
) -> Result<Json<ProwlarrSearchResponse>, ApiError> {
let (url, api_key, categories) = load_prowlarr_config(&state.pool).await?;
let query = if let Some(custom) = &body.custom_query {
custom.clone()
} else if let Some(vol) = body.volume_number {
format!("\"{}\" {}", body.series_name, vol)
} else {
format!("\"{}\"", body.series_name)
};
let client = reqwest::Client::builder()
.timeout(std::time::Duration::from_secs(30))
.build()
.map_err(|e| ApiError::internal(format!("failed to build HTTP client: {e}")))?;
let mut params: Vec<(&str, String)> = vec![
("query", query.clone()),
("type", "search".to_string()),
];
for cat in &categories {
params.push(("categories", cat.to_string()));
}
let resp = client
.get(format!("{url}/api/v1/search"))
.query(&params)
.header("X-Api-Key", &api_key)
.send()
.await
.map_err(|e| ApiError::internal(format!("Prowlarr request failed: {e}")))?;
if !resp.status().is_success() {
let status = resp.status();
let text = resp.text().await.unwrap_or_default();
return Err(ApiError::internal(format!(
"Prowlarr returned {status}: {text}"
)));
}
let raw_text = resp
.text()
.await
.map_err(|e| ApiError::internal(format!("Failed to read Prowlarr response: {e}")))?;
tracing::debug!("Prowlarr raw response length: {} chars", raw_text.len());
let raw_releases: Vec<ProwlarrRawRelease> = serde_json::from_str(&raw_text)
.map_err(|e| {
tracing::error!("Failed to parse Prowlarr response: {e}");
tracing::error!("Raw response (first 500 chars): {}", raw_text.chars().take(500).collect::<String>());
ApiError::internal(format!("Failed to parse Prowlarr response: {e}"))
})?;
let results = if let Some(missing) = &body.missing_volumes {
match_missing_volumes(raw_releases, missing)
} else {
raw_releases
.into_iter()
.map(|r| ProwlarrRelease {
guid: r.guid,
title: r.title,
size: r.size,
download_url: r.download_url,
indexer: r.indexer,
seeders: r.seeders,
leechers: r.leechers,
publish_date: r.publish_date,
protocol: r.protocol,
info_url: r.info_url,
categories: r.categories,
matched_missing_volumes: None,
})
.collect()
};
Ok(Json(ProwlarrSearchResponse { results, query }))
}
/// Test connection to Prowlarr
#[utoipa::path(
get,
path = "/prowlarr/test",
tag = "prowlarr",
responses(
(status = 200, body = ProwlarrTestResponse),
(status = 400, description = "Prowlarr not configured"),
(status = 401, description = "Unauthorized"),
),
security(("Bearer" = []))
)]
pub async fn test_prowlarr(
State(state): State<AppState>,
) -> Result<Json<ProwlarrTestResponse>, ApiError> {
let (url, api_key, _categories) = load_prowlarr_config(&state.pool).await?;
let client = reqwest::Client::builder()
.timeout(std::time::Duration::from_secs(10))
.build()
.map_err(|e| ApiError::internal(format!("failed to build HTTP client: {e}")))?;
let resp = client
.get(format!("{url}/api/v1/indexer"))
.header("X-Api-Key", &api_key)
.send()
.await;
match resp {
Ok(r) if r.status().is_success() => {
let indexers: Vec<serde_json::Value> = r.json().await.unwrap_or_default();
Ok(Json(ProwlarrTestResponse {
success: true,
message: format!("Connected successfully ({} indexers)", indexers.len()),
indexer_count: Some(indexers.len() as i32),
}))
}
Ok(r) => {
let status = r.status();
let text = r.text().await.unwrap_or_default();
Ok(Json(ProwlarrTestResponse {
success: false,
message: format!("Prowlarr returned {status}: {text}"),
indexer_count: None,
}))
}
Err(e) => Ok(Json(ProwlarrTestResponse {
success: false,
message: format!("Connection failed: {e}"),
indexer_count: None,
})),
}
}

apps/api/src/qbittorrent.rs (new file, 218 lines)

@@ -0,0 +1,218 @@
use axum::{extract::State, Json};
use serde::{Deserialize, Serialize};
use sqlx::Row;
use utoipa::ToSchema;
use crate::{error::ApiError, state::AppState};
// ─── Types ──────────────────────────────────────────────────────────────────
#[derive(Deserialize, ToSchema)]
pub struct QBittorrentAddRequest {
pub url: String,
}
#[derive(Serialize, ToSchema)]
pub struct QBittorrentAddResponse {
pub success: bool,
pub message: String,
}
#[derive(Serialize, ToSchema)]
pub struct QBittorrentTestResponse {
pub success: bool,
pub message: String,
pub version: Option<String>,
}
// ─── Config helper ──────────────────────────────────────────────────────────
#[derive(Deserialize)]
struct QBittorrentConfig {
url: String,
username: String,
password: String,
}
async fn load_qbittorrent_config(
pool: &sqlx::PgPool,
) -> Result<(String, String, String), ApiError> {
let row = sqlx::query("SELECT value FROM app_settings WHERE key = 'qbittorrent'")
.fetch_optional(pool)
.await?;
let row = row.ok_or_else(|| ApiError::bad_request("qBittorrent is not configured"))?;
let value: serde_json::Value = row.get("value");
let config: QBittorrentConfig = serde_json::from_value(value)
.map_err(|e| ApiError::internal(format!("invalid qbittorrent config: {e}")))?;
if config.url.is_empty() || config.username.is_empty() {
return Err(ApiError::bad_request(
"qBittorrent URL and username must be configured in settings",
));
}
let url = config.url.trim_end_matches('/').to_string();
Ok((url, config.username, config.password))
}
// ─── Login helper ───────────────────────────────────────────────────────────
async fn qbittorrent_login(
client: &reqwest::Client,
base_url: &str,
username: &str,
password: &str,
) -> Result<String, ApiError> {
let resp = client
.post(format!("{base_url}/api/v2/auth/login"))
.form(&[("username", username), ("password", password)])
.send()
.await
.map_err(|e| ApiError::internal(format!("qBittorrent login request failed: {e}")))?;
if !resp.status().is_success() {
let status = resp.status();
let text = resp.text().await.unwrap_or_default();
return Err(ApiError::internal(format!(
"qBittorrent login failed ({status}): {text}"
)));
}
// Extract SID from Set-Cookie header
let cookie_header = resp
.headers()
.get("set-cookie")
.and_then(|v| v.to_str().ok())
.unwrap_or("");
let sid = cookie_header
.split(';')
.next()
.and_then(|s| s.strip_prefix("SID="))
.ok_or_else(|| ApiError::internal("Failed to get SID cookie from qBittorrent"))?
.to_string();
Ok(sid)
}
// ─── Handlers ───────────────────────────────────────────────────────────────
/// Add a torrent to qBittorrent
#[utoipa::path(
post,
path = "/qbittorrent/add",
tag = "qbittorrent",
request_body = QBittorrentAddRequest,
responses(
(status = 200, body = QBittorrentAddResponse),
(status = 400, description = "Bad request or qBittorrent not configured"),
(status = 401, description = "Unauthorized"),
(status = 500, description = "qBittorrent connection error"),
),
security(("Bearer" = []))
)]
pub async fn add_torrent(
State(state): State<AppState>,
Json(body): Json<QBittorrentAddRequest>,
) -> Result<Json<QBittorrentAddResponse>, ApiError> {
if body.url.is_empty() {
return Err(ApiError::bad_request("url is required"));
}
let (base_url, username, password) = load_qbittorrent_config(&state.pool).await?;
let client = reqwest::Client::builder()
.timeout(std::time::Duration::from_secs(10))
.build()
.map_err(|e| ApiError::internal(format!("failed to build HTTP client: {e}")))?;
let sid = qbittorrent_login(&client, &base_url, &username, &password).await?;
let resp = client
.post(format!("{base_url}/api/v2/torrents/add"))
.header("Cookie", format!("SID={sid}"))
.form(&[("urls", &body.url)])
.send()
.await
.map_err(|e| ApiError::internal(format!("qBittorrent add request failed: {e}")))?;
if resp.status().is_success() {
Ok(Json(QBittorrentAddResponse {
success: true,
message: "Torrent added to qBittorrent".to_string(),
}))
} else {
let status = resp.status();
let text = resp.text().await.unwrap_or_default();
Ok(Json(QBittorrentAddResponse {
success: false,
message: format!("qBittorrent returned {status}: {text}"),
}))
}
}
/// Test connection to qBittorrent
#[utoipa::path(
get,
path = "/qbittorrent/test",
tag = "qbittorrent",
responses(
(status = 200, body = QBittorrentTestResponse),
(status = 400, description = "qBittorrent not configured"),
(status = 401, description = "Unauthorized"),
),
security(("Bearer" = []))
)]
pub async fn test_qbittorrent(
State(state): State<AppState>,
) -> Result<Json<QBittorrentTestResponse>, ApiError> {
let (base_url, username, password) = load_qbittorrent_config(&state.pool).await?;
let client = reqwest::Client::builder()
.timeout(std::time::Duration::from_secs(10))
.build()
.map_err(|e| ApiError::internal(format!("failed to build HTTP client: {e}")))?;
let sid = match qbittorrent_login(&client, &base_url, &username, &password).await {
Ok(sid) => sid,
Err(e) => {
return Ok(Json(QBittorrentTestResponse {
success: false,
message: format!("Login failed: {}", e.message),
version: None,
}));
}
};
let resp = client
.get(format!("{base_url}/api/v2/app/version"))
.header("Cookie", format!("SID={sid}"))
.send()
.await;
match resp {
Ok(r) if r.status().is_success() => {
let version = r.text().await.unwrap_or_default();
Ok(Json(QBittorrentTestResponse {
success: true,
message: format!("Connected successfully ({})", version.trim()),
version: Some(version.trim().to_string()),
}))
}
Ok(r) => {
let status = r.status();
let text = r.text().await.unwrap_or_default();
Ok(Json(QBittorrentTestResponse {
success: false,
message: format!("qBittorrent returned {status}: {text}"),
version: None,
}))
}
Err(e) => Ok(Json(QBittorrentTestResponse {
success: false,
message: format!("Connection failed: {e}"),
version: None,
})),
}
}


@@ -1,11 +1,11 @@
use axum::{extract::{Path, State}, Json};
use axum::{extract::{Extension, Path, State}, Json};
use chrono::{DateTime, Utc};
use serde::{Deserialize, Serialize};
use sqlx::Row;
use uuid::Uuid;
use utoipa::ToSchema;
use crate::{error::ApiError, state::AppState};
use crate::{auth::AuthUser, error::ApiError, state::AppState};
#[derive(Serialize, ToSchema)]
pub struct ReadingProgressResponse {
@@ -42,8 +42,10 @@ pub struct UpdateReadingProgressRequest {
)]
pub async fn get_reading_progress(
State(state): State<AppState>,
user: Option<Extension<AuthUser>>,
Path(id): Path<Uuid>,
) -> Result<Json<ReadingProgressResponse>, ApiError> {
let auth_user = user.ok_or_else(|| ApiError::bad_request("admin tokens cannot track reading progress"))?.0;
// Verify book exists
let exists: bool = sqlx::query_scalar("SELECT EXISTS(SELECT 1 FROM books WHERE id = $1)")
.bind(id)
@@ -55,9 +57,10 @@ pub async fn get_reading_progress(
}
let row = sqlx::query(
"SELECT status, current_page, last_read_at FROM book_reading_progress WHERE book_id = $1",
"SELECT status, current_page, last_read_at FROM book_reading_progress WHERE book_id = $1 AND user_id = $2",
)
.bind(id)
.bind(auth_user.user_id)
.fetch_optional(&state.pool)
.await?;
@@ -96,9 +99,11 @@ pub async fn get_reading_progress(
)]
pub async fn update_reading_progress(
State(state): State<AppState>,
user: Option<Extension<AuthUser>>,
Path(id): Path<Uuid>,
Json(body): Json<UpdateReadingProgressRequest>,
) -> Result<Json<ReadingProgressResponse>, ApiError> {
let auth_user = user.ok_or_else(|| ApiError::bad_request("admin tokens cannot track reading progress"))?.0;
// Validate status value
if !["unread", "reading", "read"].contains(&body.status.as_str()) {
return Err(ApiError::bad_request(format!(
@@ -143,9 +148,9 @@ pub async fn update_reading_progress(
let row = sqlx::query(
r#"
INSERT INTO book_reading_progress (book_id, status, current_page, last_read_at, updated_at)
VALUES ($1, $2, $3, NOW(), NOW())
ON CONFLICT (book_id) DO UPDATE
INSERT INTO book_reading_progress (book_id, user_id, status, current_page, last_read_at, updated_at)
VALUES ($1, $2, $3, $4, NOW(), NOW())
ON CONFLICT (book_id, user_id) DO UPDATE
SET status = EXCLUDED.status,
current_page = EXCLUDED.current_page,
last_read_at = NOW(),
@@ -154,6 +159,7 @@ pub async fn update_reading_progress(
"#,
)
.bind(id)
.bind(auth_user.user_id)
.bind(&body.status)
.bind(current_page)
.fetch_one(&state.pool)
@@ -194,8 +200,10 @@ pub struct MarkSeriesReadResponse {
)]
pub async fn mark_series_read(
State(state): State<AppState>,
user: Option<Extension<AuthUser>>,
Json(body): Json<MarkSeriesReadRequest>,
) -> Result<Json<MarkSeriesReadResponse>, ApiError> {
let auth_user = user.ok_or_else(|| ApiError::bad_request("admin tokens cannot track reading progress"))?.0;
if !["read", "unread"].contains(&body.status.as_str()) {
return Err(ApiError::bad_request(
"status must be 'read' or 'unread'",
@@ -209,24 +217,50 @@ pub async fn mark_series_read(
};
let sql = if body.status == "unread" {
// Delete progress records to reset to unread
// Delete progress records to reset to unread (scoped to this user)
if body.series == "unclassified" {
format!(
r#"
WITH target_books AS (
SELECT id FROM books WHERE {series_filter}
)
DELETE FROM book_reading_progress
WHERE book_id IN (SELECT id FROM target_books)
WHERE book_id IN (SELECT id FROM target_books) AND user_id = $1
"#
)
} else {
format!(
r#"
INSERT INTO book_reading_progress (book_id, status, current_page, last_read_at, updated_at)
SELECT id, 'read', NULL, NOW(), NOW()
WITH target_books AS (
SELECT id FROM books WHERE {series_filter}
)
DELETE FROM book_reading_progress
WHERE book_id IN (SELECT id FROM target_books) AND user_id = $2
"#
)
}
} else if body.series == "unclassified" {
format!(
r#"
INSERT INTO book_reading_progress (book_id, user_id, status, current_page, last_read_at, updated_at)
SELECT id, $1, 'read', NULL, NOW(), NOW()
FROM books
WHERE {series_filter}
ON CONFLICT (book_id) DO UPDATE
ON CONFLICT (book_id, user_id) DO UPDATE
SET status = 'read',
current_page = NULL,
last_read_at = NOW(),
updated_at = NOW()
"#
)
} else {
format!(
r#"
INSERT INTO book_reading_progress (book_id, user_id, status, current_page, last_read_at, updated_at)
SELECT id, $2, 'read', NULL, NOW(), NOW()
FROM books
WHERE {series_filter}
ON CONFLICT (book_id, user_id) DO UPDATE
SET status = 'read',
current_page = NULL,
last_read_at = NOW(),
@@ -236,9 +270,18 @@ pub async fn mark_series_read(
};
let result = if body.series == "unclassified" {
sqlx::query(&sql).execute(&state.pool).await?
// $1 = user_id (no series bind needed)
sqlx::query(&sql)
.bind(auth_user.user_id)
.execute(&state.pool)
.await?
} else {
sqlx::query(&sql).bind(&body.series).execute(&state.pool).await?
// $1 = series, $2 = user_id
sqlx::query(&sql)
.bind(&body.series)
.bind(auth_user.user_id)
.execute(&state.pool)
.await?
};
Ok(Json(MarkSeriesReadResponse {
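These queries rely on `book_reading_progress` carrying a `user_id` column with a composite unique key on `(book_id, user_id)`. The backing migration is not shown in this diff; its shape would be roughly as follows (constraint names hypothetical, backfill of existing rows omitted):

```sql
-- Assumed shape of the migration backing per-user reading progress.
ALTER TABLE book_reading_progress
    ADD COLUMN user_id UUID NOT NULL REFERENCES users(id),
    DROP CONSTRAINT book_reading_progress_book_id_key,        -- hypothetical old unique
    ADD CONSTRAINT book_reading_progress_book_user_key UNIQUE (book_id, user_id);
```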


@@ -39,15 +39,15 @@ pub struct SearchResponse {
pub processing_time_ms: Option<u64>,
}
/// Search books across all libraries using Meilisearch
/// Search books across all libraries
#[utoipa::path(
get,
path = "/search",
tag = "books",
tag = "search",
params(
("q" = String, Query, description = "Search query (books via Meilisearch + series via ILIKE)"),
("q" = String, Query, description = "Search query (books + series via PostgreSQL full-text)"),
("library_id" = Option<String>, Query, description = "Filter by library ID"),
("type" = Option<String>, Query, description = "Filter by type (cbz, cbr, pdf)"),
("type" = Option<String>, Query, description = "Filter by type (cbz, cbr, pdf, epub)"),
("kind" = Option<String>, Query, description = "Filter by kind (alias for type)"),
("limit" = Option<usize>, Query, description = "Max results per type (max 100)"),
),
@@ -65,34 +65,38 @@ pub async fn search_books(
return Err(ApiError::bad_request("q is required"));
}
let mut filters: Vec<String> = Vec::new();
if let Some(library_id) = query.library_id.as_deref() {
filters.push(format!("library_id = '{}'", library_id.replace('"', "")));
}
let kind_filter = query.r#type.as_deref().or(query.kind.as_deref());
if let Some(kind) = kind_filter {
filters.push(format!("kind = '{}'", kind.replace('"', "")));
}
let body = serde_json::json!({
"q": query.q,
"limit": query.limit.unwrap_or(20).clamp(1, 100),
"filter": if filters.is_empty() { serde_json::Value::Null } else { serde_json::Value::String(filters.join(" AND ")) }
});
let limit_val = query.limit.unwrap_or(20).clamp(1, 100);
let limit_val = query.limit.unwrap_or(20).clamp(1, 100) as i64;
let q_pattern = format!("%{}%", query.q);
let library_id_uuid: Option<uuid::Uuid> = query.library_id.as_deref()
let library_id_uuid: Option<Uuid> = query.library_id.as_deref()
.and_then(|s| s.parse().ok());
let kind_filter: Option<&str> = query.r#type.as_deref().or(query.kind.as_deref());
// Meilisearch search (books) + PG series query in parallel
let client = reqwest::Client::new();
let url = format!("{}/indexes/books/search", state.meili_url.trim_end_matches('/'));
let meili_fut = client
.post(&url)
.header("Authorization", format!("Bearer {}", state.meili_master_key))
.json(&body)
.send();
let start = std::time::Instant::now();
// Book search via PostgreSQL ILIKE on title, authors, series
let books_sql = r#"
SELECT b.id, b.library_id, b.kind, b.title,
COALESCE(b.authors, CASE WHEN b.author IS NOT NULL AND b.author != '' THEN ARRAY[b.author] ELSE ARRAY[]::text[] END) as authors,
b.series, b.volume, b.language
FROM books b
LEFT JOIN series_metadata sm
ON sm.library_id = b.library_id
AND sm.name = COALESCE(NULLIF(b.series, ''), 'unclassified')
WHERE (
b.title ILIKE $1
OR b.series ILIKE $1
OR EXISTS (SELECT 1 FROM unnest(
COALESCE(b.authors, CASE WHEN b.author IS NOT NULL AND b.author != '' THEN ARRAY[b.author] ELSE ARRAY[]::text[] END)
|| COALESCE(sm.authors, ARRAY[]::text[])
) AS a WHERE a ILIKE $1)
)
AND ($2::uuid IS NULL OR b.library_id = $2)
AND ($3::text IS NULL OR b.kind = $3)
ORDER BY
CASE WHEN b.title ILIKE $1 THEN 0 ELSE 1 END,
b.title ASC
LIMIT $4
"#;
let series_sql = r#"
WITH sorted_books AS (
@@ -108,7 +112,7 @@ pub async fn search_books(
title ASC
) as rn
FROM books
WHERE ($1::uuid IS NULL OR library_id = $1)
WHERE ($2::uuid IS NULL OR library_id = $2)
),
series_counts AS (
SELECT
@@ -123,39 +127,49 @@ pub async fn search_books(
SELECT sc.library_id, sc.name, sc.book_count, sc.books_read_count, sb.id as first_book_id
FROM series_counts sc
JOIN sorted_books sb ON sb.library_id = sc.library_id AND sb.name = sc.name AND sb.rn = 1
-WHERE sc.name ILIKE $2
+WHERE sc.name ILIKE $1
ORDER BY sc.name ASC
-LIMIT $3
+LIMIT $4
"#;
-let series_fut = sqlx::query(series_sql)
-    .bind(library_id_uuid)
-    .bind(limit_val as i64)
-    .fetch_all(&state.pool);
+let (books_rows, series_rows) = tokio::join!(
+    sqlx::query(books_sql)
+        .bind(&q_pattern)
+        .bind(library_id_uuid)
+        .bind(kind_filter)
+        .bind(limit_val)
+        .fetch_all(&state.pool),
+    sqlx::query(series_sql)
+        .bind(&q_pattern)
+        .bind(library_id_uuid)
+        .bind(kind_filter) // unused in series query but keeps bind positions consistent
+        .bind(limit_val)
+        .fetch_all(&state.pool)
+);
-let (meili_resp, series_rows) = tokio::join!(meili_fut, series_fut);
+let elapsed_ms = start.elapsed().as_millis() as u64;
-// Process the Meilisearch response
-let meili_resp = meili_resp.map_err(|e| ApiError::internal(format!("meili request failed: {e}")))?;
-let (hits, estimated_total_hits, processing_time_ms) = if !meili_resp.status().is_success() {
-    let body = meili_resp.text().await.unwrap_or_default();
-    if body.contains("index_not_found") {
-        (serde_json::json!([]), Some(0u64), Some(0u64))
-    } else {
-        return Err(ApiError::internal(format!("meili error: {body}")));
-    }
-} else {
-    let payload: serde_json::Value = meili_resp.json().await
-        .map_err(|e| ApiError::internal(format!("invalid meili response: {e}")))?;
-    (
-        payload.get("hits").cloned().unwrap_or_else(|| serde_json::json!([])),
-        payload.get("estimatedTotalHits").and_then(|v| v.as_u64()),
-        payload.get("processingTimeMs").and_then(|v| v.as_u64()),
-    )
-};
// Build book hits as JSON array (same shape as before)
let books_rows = books_rows.map_err(|e| ApiError::internal(format!("book search failed: {e}")))?;
let hits: Vec<serde_json::Value> = books_rows
.iter()
.map(|row| {
serde_json::json!({
"id": row.get::<Uuid, _>("id").to_string(),
"library_id": row.get::<Uuid, _>("library_id").to_string(),
"kind": row.get::<String, _>("kind"),
"title": row.get::<String, _>("title"),
"authors": row.get::<Vec<String>, _>("authors"),
"series": row.get::<Option<String>, _>("series"),
"volume": row.get::<Option<i32>, _>("volume"),
"language": row.get::<Option<String>, _>("language"),
})
})
.collect();
-// Process series
+let estimated_total_hits = hits.len() as u64;
+// Series hits
let series_hits: Vec<SeriesHit> = series_rows
.unwrap_or_default()
.iter()
@@ -169,9 +183,9 @@ pub async fn search_books(
.collect();
Ok(Json(SearchResponse {
-hits,
+hits: serde_json::Value::Array(hits),
series_hits,
-estimated_total_hits,
-processing_time_ms,
+estimated_total_hits: Some(estimated_total_hits),
+processing_time_ms: Some(elapsed_ms),
}))
}
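The handler above normalizes three query parameters before binding: the limit is clamped to 1..=100, the search term is wrapped into an ILIKE pattern, and `type` falls back to the legacy `kind` alias. A standalone sketch of that normalization (`SearchQuery` and `normalize` are illustrative stand-ins, not the handler's real types):

```rust
// Sketch of search_books' query-parameter normalization.
// `SearchQuery` is a simplified stand-in for the real extractor type.
struct SearchQuery {
    q: String,
    limit: Option<i64>,
    r#type: Option<String>,
    kind: Option<String>,
}

fn normalize(query: &SearchQuery) -> (i64, String, Option<&str>) {
    // Default to 20 hits; never fewer than 1 or more than 100.
    let limit = query.limit.unwrap_or(20).clamp(1, 100);
    // ILIKE pattern matches the query as a case-insensitive substring.
    let pattern = format!("%{}%", query.q);
    // `type` takes precedence; `kind` is accepted as an alias.
    let kind = query.r#type.as_deref().or(query.kind.as_deref());
    (limit, pattern, kind)
}

fn main() {
    let q = SearchQuery { q: "dune".into(), limit: Some(500), r#type: None, kind: Some("comic".into()) };
    // Out-of-range limit is clamped, kind alias is honored.
    assert_eq!(normalize(&q), (100, "%dune%".to_string(), Some("comic")));
    println!("ok");
}
```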

apps/api/src/series.rs (new file, 1043 lines; diff suppressed because it is too large)


@@ -1,11 +1,12 @@
use axum::{
-    extract::State,
-    routing::{get, post},
+    extract::{Path as AxumPath, State},
+    routing::{delete, get, post},
    Json, Router,
};
use serde::{Deserialize, Serialize};
use serde_json::Value;
use sqlx::Row;
use uuid::Uuid;
use utoipa::ToSchema;
use crate::{error::ApiError, state::{AppState, load_dynamic_settings}};
@@ -42,6 +43,14 @@ pub fn settings_routes() -> Router<AppState> {
.route("/settings/cache/clear", post(clear_cache))
.route("/settings/cache/stats", get(get_cache_stats))
.route("/settings/thumbnail/stats", get(get_thumbnail_stats))
.route(
"/settings/status-mappings",
get(list_status_mappings).post(upsert_status_mapping),
)
.route(
"/settings/status-mappings/:id",
delete(delete_status_mapping),
)
}
/// List all settings
@@ -324,3 +333,125 @@ pub async fn get_thumbnail_stats(State(_state): State<AppState>) -> Result<Json<
Ok(Json(stats))
}
// ---------------------------------------------------------------------------
// Status Mappings
// ---------------------------------------------------------------------------
#[derive(Debug, Clone, Serialize, Deserialize, ToSchema)]
pub struct StatusMappingDto {
pub id: String,
pub provider_status: String,
pub mapped_status: Option<String>,
}
#[derive(Debug, Clone, Deserialize, ToSchema)]
pub struct UpsertStatusMappingRequest {
pub provider_status: String,
pub mapped_status: String,
}
/// List all status mappings
#[utoipa::path(
get,
path = "/settings/status-mappings",
tag = "settings",
responses(
(status = 200, body = Vec<StatusMappingDto>),
(status = 401, description = "Unauthorized"),
),
security(("Bearer" = []))
)]
pub async fn list_status_mappings(
State(state): State<AppState>,
) -> Result<Json<Vec<StatusMappingDto>>, ApiError> {
let rows = sqlx::query(
"SELECT id, provider_status, mapped_status FROM status_mappings ORDER BY mapped_status NULLS LAST, provider_status",
)
.fetch_all(&state.pool)
.await?;
let mappings = rows
.iter()
.map(|row| StatusMappingDto {
id: row.get::<Uuid, _>("id").to_string(),
provider_status: row.get("provider_status"),
mapped_status: row.get::<Option<String>, _>("mapped_status"),
})
.collect();
Ok(Json(mappings))
}
/// Create or update a status mapping
#[utoipa::path(
post,
path = "/settings/status-mappings",
tag = "settings",
request_body = UpsertStatusMappingRequest,
responses(
(status = 200, body = StatusMappingDto),
(status = 401, description = "Unauthorized"),
),
security(("Bearer" = []))
)]
pub async fn upsert_status_mapping(
State(state): State<AppState>,
Json(body): Json<UpsertStatusMappingRequest>,
) -> Result<Json<StatusMappingDto>, ApiError> {
let provider_status = body.provider_status.to_lowercase();
let row = sqlx::query(
r#"
INSERT INTO status_mappings (provider_status, mapped_status)
VALUES ($1, $2)
ON CONFLICT (provider_status)
DO UPDATE SET mapped_status = $2, updated_at = NOW()
RETURNING id, provider_status, mapped_status
"#,
)
.bind(&provider_status)
.bind(&body.mapped_status)
.fetch_one(&state.pool)
.await?;
Ok(Json(StatusMappingDto {
id: row.get::<Uuid, _>("id").to_string(),
provider_status: row.get("provider_status"),
mapped_status: row.get::<Option<String>, _>("mapped_status"),
}))
}
/// Unmap a status mapping (sets mapped_status to NULL, keeps the provider status known)
#[utoipa::path(
delete,
path = "/settings/status-mappings/{id}",
tag = "settings",
params(("id" = String, Path, description = "Mapping UUID")),
responses(
(status = 200, body = StatusMappingDto),
(status = 401, description = "Unauthorized"),
(status = 404, description = "Not found"),
),
security(("Bearer" = []))
)]
pub async fn delete_status_mapping(
State(state): State<AppState>,
AxumPath(id): AxumPath<Uuid>,
) -> Result<Json<StatusMappingDto>, ApiError> {
let row = sqlx::query(
"UPDATE status_mappings SET mapped_status = NULL, updated_at = NOW() WHERE id = $1 RETURNING id, provider_status, mapped_status",
)
.bind(id)
.fetch_optional(&state.pool)
.await?;
match row {
Some(row) => Ok(Json(StatusMappingDto {
id: row.get::<Uuid, _>("id").to_string(),
provider_status: row.get("provider_status"),
mapped_status: row.get::<Option<String>, _>("mapped_status"),
})),
None => Err(ApiError::not_found("status mapping not found")),
}
}
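Two behaviors of the handlers above are easy to miss: the upsert lowercases `provider_status`, so differently-cased provider values hit the same `ON CONFLICT` row, and "delete" only clears `mapped_status` while keeping the provider status known. A toy in-memory model of that contract (the `StatusMappings` struct is illustrative; the real code keys the unmap on the row UUID, not the status):

```rust
use std::collections::HashMap;

// Toy model of the status_mappings table, mirroring the handlers above:
// upsert is keyed on the lowercased provider status, and "delete" only
// clears the mapping so the provider status stays known.
#[derive(Default)]
struct StatusMappings(HashMap<String, Option<String>>);

impl StatusMappings {
    fn upsert(&mut self, provider_status: &str, mapped_status: &str) {
        // Lowercasing makes the ON CONFLICT key case-insensitive.
        self.0.insert(provider_status.to_lowercase(), Some(mapped_status.to_string()));
    }
    fn unmap(&mut self, provider_status: &str) {
        if let Some(v) = self.0.get_mut(&provider_status.to_lowercase()) {
            *v = None; // row kept, mapping cleared
        }
    }
}

fn main() {
    let mut m = StatusMappings::default();
    m.upsert("ONGOING", "reading");
    m.upsert("ongoing", "read"); // same conflict key: overwrites
    assert_eq!(m.0.get("ongoing"), Some(&Some("read".to_string())));
    m.unmap("Ongoing");
    assert_eq!(m.0.get("ongoing"), Some(&None)); // still present, unmapped
    println!("ok");
}
```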


@@ -12,8 +12,6 @@ use tokio::sync::{Mutex, RwLock, Semaphore};
pub struct AppState {
pub pool: sqlx::PgPool,
pub bootstrap_token: Arc<str>,
-pub meili_url: Arc<str>,
-pub meili_master_key: Arc<str>,
pub page_cache: Arc<Mutex<LruCache<String, Arc<Vec<u8>>>>>,
pub page_render_limit: Arc<Semaphore>,
pub metrics: Arc<Metrics>,


@@ -1,9 +1,18 @@
-use axum::{extract::State, Json};
-use serde::Serialize;
+use axum::{
+    extract::{Extension, Query, State},
+    Json,
+};
+use serde::{Deserialize, Serialize};
use sqlx::Row;
-use utoipa::ToSchema;
+use utoipa::{IntoParams, ToSchema};
-use crate::{error::ApiError, state::AppState};
+use crate::{auth::AuthUser, error::ApiError, state::AppState};
#[derive(Deserialize, IntoParams)]
pub struct StatsQuery {
/// Granularity: "day", "week" or "month" (default: "month")
pub period: Option<String>,
}
#[derive(Serialize, ToSchema)]
pub struct StatsOverview {
@@ -58,22 +67,86 @@ pub struct MonthlyAdditions {
pub books_added: i64,
}
#[derive(Serialize, ToSchema)]
pub struct MetadataStats {
pub total_series: i64,
pub series_linked: i64,
pub series_unlinked: i64,
pub books_with_summary: i64,
pub books_with_isbn: i64,
pub by_provider: Vec<ProviderCount>,
}
#[derive(Serialize, ToSchema)]
pub struct ProviderCount {
pub provider: String,
pub count: i64,
}
#[derive(Serialize, ToSchema)]
pub struct CurrentlyReadingItem {
pub book_id: String,
pub title: String,
pub series: Option<String>,
pub current_page: i32,
pub page_count: i32,
pub username: Option<String>,
}
#[derive(Serialize, ToSchema)]
pub struct RecentlyReadItem {
pub book_id: String,
pub title: String,
pub series: Option<String>,
pub last_read_at: String,
pub username: Option<String>,
}
#[derive(Serialize, ToSchema)]
pub struct MonthlyReading {
pub month: String,
pub books_read: i64,
}
#[derive(Serialize, ToSchema)]
pub struct UserMonthlyReading {
pub month: String,
pub username: String,
pub books_read: i64,
}
#[derive(Serialize, ToSchema)]
pub struct JobTimePoint {
pub label: String,
pub scan: i64,
pub rebuild: i64,
pub thumbnail: i64,
pub other: i64,
}
#[derive(Serialize, ToSchema)]
pub struct StatsResponse {
pub overview: StatsOverview,
pub reading_status: ReadingStatusStats,
pub currently_reading: Vec<CurrentlyReadingItem>,
pub recently_read: Vec<RecentlyReadItem>,
pub reading_over_time: Vec<MonthlyReading>,
pub by_format: Vec<FormatCount>,
pub by_language: Vec<LanguageCount>,
pub by_library: Vec<LibraryStats>,
pub top_series: Vec<TopSeries>,
pub additions_over_time: Vec<MonthlyAdditions>,
pub jobs_over_time: Vec<JobTimePoint>,
pub metadata: MetadataStats,
pub users_reading_over_time: Vec<UserMonthlyReading>,
}
/// Get collection statistics for the dashboard
#[utoipa::path(
get,
path = "/stats",
-tag = "books",
+tag = "stats",
params(StatsQuery),
responses(
(status = 200, body = StatsResponse),
(status = 401, description = "Unauthorized"),
@@ -82,7 +155,11 @@ pub struct StatsResponse {
)]
pub async fn get_stats(
State(state): State<AppState>,
Query(query): Query<StatsQuery>,
user: Option<Extension<AuthUser>>,
) -> Result<Json<StatsResponse>, ApiError> {
let user_id: Option<uuid::Uuid> = user.map(|u| u.0.user_id);
let period = query.period.as_deref().unwrap_or("month");
// Overview + reading status in one query
let overview_row = sqlx::query(
r#"
@@ -100,9 +177,10 @@ pub async fn get_stats(
COUNT(*) FILTER (WHERE brp.status = 'reading') AS reading,
COUNT(*) FILTER (WHERE brp.status = 'read') AS read
FROM books b
-LEFT JOIN book_reading_progress brp ON brp.book_id = b.id
+LEFT JOIN book_reading_progress brp ON brp.book_id = b.id AND ($1::uuid IS NULL OR brp.user_id = $1)
"#,
)
.bind(user_id)
.fetch_one(&state.pool)
.await?;
@@ -190,7 +268,7 @@ pub async fn get_stats(
COUNT(*) FILTER (WHERE COALESCE(brp.status, 'unread') = 'unread') AS unread_count
FROM libraries l
LEFT JOIN books b ON b.library_id = l.id
-LEFT JOIN book_reading_progress brp ON brp.book_id = b.id
+LEFT JOIN book_reading_progress brp ON brp.book_id = b.id AND ($1::uuid IS NULL OR brp.user_id = $1)
LEFT JOIN LATERAL (
SELECT size_bytes FROM book_files WHERE book_id = b.id ORDER BY updated_at DESC LIMIT 1
) bf ON TRUE
@@ -198,6 +276,7 @@ pub async fn get_stats(
ORDER BY book_count DESC
"#,
)
.bind(user_id)
.fetch_all(&state.pool)
.await?;
@@ -222,13 +301,14 @@ pub async fn get_stats(
COUNT(*) FILTER (WHERE brp.status = 'read') AS read_count,
COALESCE(SUM(b.page_count), 0)::BIGINT AS total_pages
FROM books b
-LEFT JOIN book_reading_progress brp ON brp.book_id = b.id
+LEFT JOIN book_reading_progress brp ON brp.book_id = b.id AND ($1::uuid IS NULL OR brp.user_id = $1)
WHERE b.series IS NOT NULL AND b.series != ''
GROUP BY b.series
ORDER BY book_count DESC
LIMIT 10
"#,
)
.bind(user_id)
.fetch_all(&state.pool)
.await?;
@@ -242,20 +322,74 @@ pub async fn get_stats(
})
.collect();
-// Additions over time (last 12 months)
-let additions_rows = sqlx::query(
+// Additions over time (with gap filling)
+let additions_rows = match period {
"day" => {
sqlx::query(
r#"
SELECT
-TO_CHAR(DATE_TRUNC('month', created_at), 'YYYY-MM') AS month,
-COUNT(*) AS books_added
+TO_CHAR(d.dt, 'YYYY-MM-DD') AS month,
+COALESCE(cnt.books_added, 0) AS books_added
+FROM generate_series(CURRENT_DATE - INTERVAL '6 days', CURRENT_DATE, '1 day') AS d(dt)
+LEFT JOIN (
+    SELECT created_at::date AS dt, COUNT(*) AS books_added
FROM books
-WHERE created_at >= DATE_TRUNC('month', NOW()) - INTERVAL '11 months'
-GROUP BY DATE_TRUNC('month', created_at)
+    WHERE created_at >= CURRENT_DATE - INTERVAL '6 days'
+    GROUP BY created_at::date
+) cnt ON cnt.dt = d.dt
ORDER BY month ASC
"#,
)
.fetch_all(&state.pool)
-.await?;
+.await?
}
"week" => {
sqlx::query(
r#"
SELECT
TO_CHAR(d.dt, 'YYYY-MM-DD') AS month,
COALESCE(cnt.books_added, 0) AS books_added
FROM generate_series(
DATE_TRUNC('week', NOW() - INTERVAL '2 months'),
DATE_TRUNC('week', NOW()),
'1 week'
) AS d(dt)
LEFT JOIN (
SELECT DATE_TRUNC('week', created_at) AS dt, COUNT(*) AS books_added
FROM books
WHERE created_at >= DATE_TRUNC('week', NOW() - INTERVAL '2 months')
GROUP BY DATE_TRUNC('week', created_at)
) cnt ON cnt.dt = d.dt
ORDER BY month ASC
"#,
)
.fetch_all(&state.pool)
.await?
}
_ => {
sqlx::query(
r#"
SELECT
TO_CHAR(d.dt, 'YYYY-MM') AS month,
COALESCE(cnt.books_added, 0) AS books_added
FROM generate_series(
DATE_TRUNC('month', NOW()) - INTERVAL '11 months',
DATE_TRUNC('month', NOW()),
'1 month'
) AS d(dt)
LEFT JOIN (
SELECT DATE_TRUNC('month', created_at) AS dt, COUNT(*) AS books_added
FROM books
WHERE created_at >= DATE_TRUNC('month', NOW()) - INTERVAL '11 months'
GROUP BY DATE_TRUNC('month', created_at)
) cnt ON cnt.dt = d.dt
ORDER BY month ASC
"#,
)
.fetch_all(&state.pool)
.await?
}
};
let additions_over_time: Vec<MonthlyAdditions> = additions_rows
.iter()
@@ -265,13 +399,421 @@ pub async fn get_stats(
})
.collect();
// Metadata stats
let meta_row = sqlx::query(
r#"
SELECT
(SELECT COUNT(DISTINCT NULLIF(series, '')) FROM books) AS total_series,
(SELECT COUNT(DISTINCT series_name) FROM external_metadata_links WHERE status = 'approved') AS series_linked,
(SELECT COUNT(*) FROM books WHERE summary IS NOT NULL AND summary != '') AS books_with_summary,
(SELECT COUNT(*) FROM books WHERE isbn IS NOT NULL AND isbn != '') AS books_with_isbn
"#,
)
.fetch_one(&state.pool)
.await?;
let meta_total_series: i64 = meta_row.get("total_series");
let meta_series_linked: i64 = meta_row.get("series_linked");
let provider_rows = sqlx::query(
r#"
SELECT provider, COUNT(DISTINCT series_name) AS count
FROM external_metadata_links
WHERE status = 'approved'
GROUP BY provider
ORDER BY count DESC
"#,
)
.fetch_all(&state.pool)
.await?;
let by_provider: Vec<ProviderCount> = provider_rows
.iter()
.map(|r| ProviderCount {
provider: r.get("provider"),
count: r.get("count"),
})
.collect();
let metadata = MetadataStats {
total_series: meta_total_series,
series_linked: meta_series_linked,
series_unlinked: meta_total_series - meta_series_linked,
books_with_summary: meta_row.get("books_with_summary"),
books_with_isbn: meta_row.get("books_with_isbn"),
by_provider,
};
// Currently reading books
let reading_rows = sqlx::query(
r#"
SELECT b.id AS book_id, b.title, b.series, brp.current_page, b.page_count, u.username
FROM book_reading_progress brp
JOIN books b ON b.id = brp.book_id
LEFT JOIN users u ON u.id = brp.user_id
WHERE brp.status = 'reading' AND brp.current_page IS NOT NULL
AND ($1::uuid IS NULL OR brp.user_id = $1)
ORDER BY brp.updated_at DESC
LIMIT 20
"#,
)
.bind(user_id)
.fetch_all(&state.pool)
.await?;
let currently_reading: Vec<CurrentlyReadingItem> = reading_rows
.iter()
.map(|r| {
let id: uuid::Uuid = r.get("book_id");
CurrentlyReadingItem {
book_id: id.to_string(),
title: r.get("title"),
series: r.get("series"),
current_page: r.get::<Option<i32>, _>("current_page").unwrap_or(0),
page_count: r.get::<Option<i32>, _>("page_count").unwrap_or(0),
username: r.get("username"),
}
})
.collect();
// Recently read books
let recent_rows = sqlx::query(
r#"
SELECT b.id AS book_id, b.title, b.series,
TO_CHAR(brp.last_read_at, 'YYYY-MM-DD') AS last_read_at,
u.username
FROM book_reading_progress brp
JOIN books b ON b.id = brp.book_id
LEFT JOIN users u ON u.id = brp.user_id
WHERE brp.status = 'read' AND brp.last_read_at IS NOT NULL
AND ($1::uuid IS NULL OR brp.user_id = $1)
ORDER BY brp.last_read_at DESC
LIMIT 10
"#,
)
.bind(user_id)
.fetch_all(&state.pool)
.await?;
let recently_read: Vec<RecentlyReadItem> = recent_rows
.iter()
.map(|r| {
let id: uuid::Uuid = r.get("book_id");
RecentlyReadItem {
book_id: id.to_string(),
title: r.get("title"),
series: r.get("series"),
last_read_at: r.get::<Option<String>, _>("last_read_at").unwrap_or_default(),
username: r.get("username"),
}
})
.collect();
// Reading activity over time (with gap filling)
let reading_time_rows = match period {
"day" => {
sqlx::query(
r#"
SELECT
TO_CHAR(d.dt, 'YYYY-MM-DD') AS month,
COALESCE(cnt.books_read, 0) AS books_read
FROM generate_series(CURRENT_DATE - INTERVAL '6 days', CURRENT_DATE, '1 day') AS d(dt)
LEFT JOIN (
SELECT brp.last_read_at::date AS dt, COUNT(*) AS books_read
FROM book_reading_progress brp
WHERE brp.status = 'read'
AND brp.last_read_at >= CURRENT_DATE - INTERVAL '6 days'
AND ($1::uuid IS NULL OR brp.user_id = $1)
GROUP BY brp.last_read_at::date
) cnt ON cnt.dt = d.dt
ORDER BY month ASC
"#,
)
.bind(user_id)
.fetch_all(&state.pool)
.await?
}
"week" => {
sqlx::query(
r#"
SELECT
TO_CHAR(d.dt, 'YYYY-MM-DD') AS month,
COALESCE(cnt.books_read, 0) AS books_read
FROM generate_series(
DATE_TRUNC('week', NOW() - INTERVAL '2 months'),
DATE_TRUNC('week', NOW()),
'1 week'
) AS d(dt)
LEFT JOIN (
SELECT DATE_TRUNC('week', brp.last_read_at) AS dt, COUNT(*) AS books_read
FROM book_reading_progress brp
WHERE brp.status = 'read'
AND brp.last_read_at >= DATE_TRUNC('week', NOW() - INTERVAL '2 months')
AND ($1::uuid IS NULL OR brp.user_id = $1)
GROUP BY DATE_TRUNC('week', brp.last_read_at)
) cnt ON cnt.dt = d.dt
ORDER BY month ASC
"#,
)
.bind(user_id)
.fetch_all(&state.pool)
.await?
}
_ => {
sqlx::query(
r#"
SELECT
TO_CHAR(d.dt, 'YYYY-MM') AS month,
COALESCE(cnt.books_read, 0) AS books_read
FROM generate_series(
DATE_TRUNC('month', NOW()) - INTERVAL '11 months',
DATE_TRUNC('month', NOW()),
'1 month'
) AS d(dt)
LEFT JOIN (
SELECT DATE_TRUNC('month', brp.last_read_at) AS dt, COUNT(*) AS books_read
FROM book_reading_progress brp
WHERE brp.status = 'read'
AND brp.last_read_at >= DATE_TRUNC('month', NOW()) - INTERVAL '11 months'
AND ($1::uuid IS NULL OR brp.user_id = $1)
GROUP BY DATE_TRUNC('month', brp.last_read_at)
) cnt ON cnt.dt = d.dt
ORDER BY month ASC
"#,
)
.bind(user_id)
.fetch_all(&state.pool)
.await?
}
};
let reading_over_time: Vec<MonthlyReading> = reading_time_rows
.iter()
.map(|r| MonthlyReading {
month: r.get::<Option<String>, _>("month").unwrap_or_default(),
books_read: r.get("books_read"),
})
.collect();
// Per-user reading over time (admin view — always all users, no user_id filter)
let users_reading_time_rows = match period {
"day" => {
sqlx::query(
r#"
SELECT
TO_CHAR(d.dt, 'YYYY-MM-DD') AS month,
u.username,
COALESCE(cnt.books_read, 0) AS books_read
FROM generate_series(CURRENT_DATE - INTERVAL '6 days', CURRENT_DATE, '1 day') AS d(dt)
CROSS JOIN users u
LEFT JOIN (
SELECT brp.last_read_at::date AS dt, brp.user_id, COUNT(*) AS books_read
FROM book_reading_progress brp
WHERE brp.status = 'read'
AND brp.last_read_at >= CURRENT_DATE - INTERVAL '6 days'
GROUP BY brp.last_read_at::date, brp.user_id
) cnt ON cnt.dt = d.dt AND cnt.user_id = u.id
ORDER BY month ASC, u.username
"#,
)
.fetch_all(&state.pool)
.await?
}
"week" => {
sqlx::query(
r#"
SELECT
TO_CHAR(d.dt, 'YYYY-MM-DD') AS month,
u.username,
COALESCE(cnt.books_read, 0) AS books_read
FROM generate_series(
DATE_TRUNC('week', NOW() - INTERVAL '2 months'),
DATE_TRUNC('week', NOW()),
'1 week'
) AS d(dt)
CROSS JOIN users u
LEFT JOIN (
SELECT DATE_TRUNC('week', brp.last_read_at) AS dt, brp.user_id, COUNT(*) AS books_read
FROM book_reading_progress brp
WHERE brp.status = 'read'
AND brp.last_read_at >= DATE_TRUNC('week', NOW() - INTERVAL '2 months')
GROUP BY DATE_TRUNC('week', brp.last_read_at), brp.user_id
) cnt ON cnt.dt = d.dt AND cnt.user_id = u.id
ORDER BY month ASC, u.username
"#,
)
.fetch_all(&state.pool)
.await?
}
_ => {
sqlx::query(
r#"
SELECT
TO_CHAR(d.dt, 'YYYY-MM') AS month,
u.username,
COALESCE(cnt.books_read, 0) AS books_read
FROM generate_series(
DATE_TRUNC('month', NOW()) - INTERVAL '11 months',
DATE_TRUNC('month', NOW()),
'1 month'
) AS d(dt)
CROSS JOIN users u
LEFT JOIN (
SELECT DATE_TRUNC('month', brp.last_read_at) AS dt, brp.user_id, COUNT(*) AS books_read
FROM book_reading_progress brp
WHERE brp.status = 'read'
AND brp.last_read_at >= DATE_TRUNC('month', NOW()) - INTERVAL '11 months'
GROUP BY DATE_TRUNC('month', brp.last_read_at), brp.user_id
) cnt ON cnt.dt = d.dt AND cnt.user_id = u.id
ORDER BY month ASC, u.username
"#,
)
.fetch_all(&state.pool)
.await?
}
};
let users_reading_over_time: Vec<UserMonthlyReading> = users_reading_time_rows
.iter()
.map(|r| UserMonthlyReading {
month: r.get::<Option<String>, _>("month").unwrap_or_default(),
username: r.get("username"),
books_read: r.get("books_read"),
})
.collect();
// Jobs over time (with gap filling, grouped by type category)
let jobs_rows = match period {
"day" => {
sqlx::query(
r#"
SELECT
TO_CHAR(d.dt, 'YYYY-MM-DD') AS label,
COALESCE(SUM(cnt.c) FILTER (WHERE cnt.cat = 'scan'), 0)::BIGINT AS scan,
COALESCE(SUM(cnt.c) FILTER (WHERE cnt.cat = 'rebuild'), 0)::BIGINT AS rebuild,
COALESCE(SUM(cnt.c) FILTER (WHERE cnt.cat = 'thumbnail'), 0)::BIGINT AS thumbnail,
COALESCE(SUM(cnt.c) FILTER (WHERE cnt.cat = 'other'), 0)::BIGINT AS other
FROM generate_series(CURRENT_DATE - INTERVAL '6 days', CURRENT_DATE, '1 day') AS d(dt)
LEFT JOIN (
SELECT
finished_at::date AS dt,
CASE
WHEN type = 'scan' THEN 'scan'
WHEN type IN ('rebuild', 'full_rebuild', 'rescan') THEN 'rebuild'
WHEN type IN ('thumbnail_rebuild', 'thumbnail_regenerate') THEN 'thumbnail'
ELSE 'other'
END AS cat,
COUNT(*) AS c
FROM index_jobs
WHERE status IN ('success', 'failed')
AND finished_at >= CURRENT_DATE - INTERVAL '6 days'
GROUP BY finished_at::date, cat
) cnt ON cnt.dt = d.dt
GROUP BY d.dt
ORDER BY label ASC
"#,
)
.fetch_all(&state.pool)
.await?
}
"week" => {
sqlx::query(
r#"
SELECT
TO_CHAR(d.dt, 'YYYY-MM-DD') AS label,
COALESCE(SUM(cnt.c) FILTER (WHERE cnt.cat = 'scan'), 0)::BIGINT AS scan,
COALESCE(SUM(cnt.c) FILTER (WHERE cnt.cat = 'rebuild'), 0)::BIGINT AS rebuild,
COALESCE(SUM(cnt.c) FILTER (WHERE cnt.cat = 'thumbnail'), 0)::BIGINT AS thumbnail,
COALESCE(SUM(cnt.c) FILTER (WHERE cnt.cat = 'other'), 0)::BIGINT AS other
FROM generate_series(
DATE_TRUNC('week', NOW() - INTERVAL '2 months'),
DATE_TRUNC('week', NOW()),
'1 week'
) AS d(dt)
LEFT JOIN (
SELECT
DATE_TRUNC('week', finished_at) AS dt,
CASE
WHEN type = 'scan' THEN 'scan'
WHEN type IN ('rebuild', 'full_rebuild', 'rescan') THEN 'rebuild'
WHEN type IN ('thumbnail_rebuild', 'thumbnail_regenerate') THEN 'thumbnail'
ELSE 'other'
END AS cat,
COUNT(*) AS c
FROM index_jobs
WHERE status IN ('success', 'failed')
AND finished_at >= DATE_TRUNC('week', NOW() - INTERVAL '2 months')
GROUP BY DATE_TRUNC('week', finished_at), cat
) cnt ON cnt.dt = d.dt
GROUP BY d.dt
ORDER BY label ASC
"#,
)
.fetch_all(&state.pool)
.await?
}
_ => {
sqlx::query(
r#"
SELECT
TO_CHAR(d.dt, 'YYYY-MM') AS label,
COALESCE(SUM(cnt.c) FILTER (WHERE cnt.cat = 'scan'), 0)::BIGINT AS scan,
COALESCE(SUM(cnt.c) FILTER (WHERE cnt.cat = 'rebuild'), 0)::BIGINT AS rebuild,
COALESCE(SUM(cnt.c) FILTER (WHERE cnt.cat = 'thumbnail'), 0)::BIGINT AS thumbnail,
COALESCE(SUM(cnt.c) FILTER (WHERE cnt.cat = 'other'), 0)::BIGINT AS other
FROM generate_series(
DATE_TRUNC('month', NOW()) - INTERVAL '11 months',
DATE_TRUNC('month', NOW()),
'1 month'
) AS d(dt)
LEFT JOIN (
SELECT
DATE_TRUNC('month', finished_at) AS dt,
CASE
WHEN type = 'scan' THEN 'scan'
WHEN type IN ('rebuild', 'full_rebuild', 'rescan') THEN 'rebuild'
WHEN type IN ('thumbnail_rebuild', 'thumbnail_regenerate') THEN 'thumbnail'
ELSE 'other'
END AS cat,
COUNT(*) AS c
FROM index_jobs
WHERE status IN ('success', 'failed')
AND finished_at >= DATE_TRUNC('month', NOW()) - INTERVAL '11 months'
GROUP BY DATE_TRUNC('month', finished_at), cat
) cnt ON cnt.dt = d.dt
GROUP BY d.dt
ORDER BY label ASC
"#,
)
.fetch_all(&state.pool)
.await?
}
};
let jobs_over_time: Vec<JobTimePoint> = jobs_rows
.iter()
.map(|r| JobTimePoint {
label: r.get("label"),
scan: r.get("scan"),
rebuild: r.get("rebuild"),
thumbnail: r.get("thumbnail"),
other: r.get("other"),
})
.collect();
Ok(Json(StatsResponse {
overview,
reading_status,
currently_reading,
recently_read,
reading_over_time,
by_format,
by_language,
by_library,
top_series,
additions_over_time,
jobs_over_time,
metadata,
users_reading_over_time,
}))
}
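Every gap-filled time series in `get_stats` branches on `period` the same three ways: `"day"` yields 7 daily points, `"week"` roughly two months of weekly points, and anything else falls through to 12 monthly points. A sketch of that dispatch (the `period_buckets` function and its tuple are illustrative; the real code branches directly into SQL):

```rust
// Illustrative mapping from the `period` query parameter to the bucket
// size and point count that each gap-filling SQL query hard-codes.
fn period_buckets(period: &str) -> (&'static str, u32) {
    match period {
        "day" => ("1 day", 7),    // last 7 daily points
        "week" => ("1 week", 9),  // roughly 2 months of weekly points
        _ => ("1 month", 12),     // default: last 12 monthly points
    }
}

fn main() {
    assert_eq!(period_buckets("day"), ("1 day", 7));
    // Unknown values fall through to the monthly view, like the `_` arm.
    assert_eq!(period_buckets("quarter"), ("1 month", 12));
    println!("ok");
}
```

Keeping the fallback in the `_` arm means an unrecognized `period` degrades to the default monthly chart instead of erroring.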

apps/api/src/telegram.rs (new file, 46 lines)

@@ -0,0 +1,46 @@
use axum::{extract::State, Json};
use serde::Serialize;
use utoipa::ToSchema;
use crate::{error::ApiError, notifications, state::AppState};
#[derive(Serialize, ToSchema)]
pub struct TelegramTestResponse {
pub success: bool,
pub message: String,
}
/// Test Telegram connection by sending a test message
#[utoipa::path(
get,
path = "/telegram/test",
tag = "notifications",
responses(
(status = 200, body = TelegramTestResponse),
(status = 400, description = "Telegram not configured"),
(status = 401, description = "Unauthorized"),
),
security(("Bearer" = []))
)]
pub async fn test_telegram(
State(state): State<AppState>,
) -> Result<Json<TelegramTestResponse>, ApiError> {
let config = notifications::load_telegram_config(&state.pool)
.await
.ok_or_else(|| {
ApiError::bad_request(
"Telegram is not configured or disabled. Set bot_token, chat_id, and enable it.",
)
})?;
match notifications::send_test_message(&config).await {
Ok(()) => Ok(Json(TelegramTestResponse {
success: true,
message: "Test message sent successfully".to_string(),
})),
Err(e) => Ok(Json(TelegramTestResponse {
success: false,
message: format!("Failed to send: {e}"),
})),
}
}


@@ -16,6 +16,8 @@ pub struct CreateTokenRequest {
pub name: String,
#[schema(value_type = Option<String>, example = "read")]
pub scope: Option<String>,
#[schema(value_type = Option<String>)]
pub user_id: Option<Uuid>,
}
#[derive(Serialize, ToSchema)]
@@ -26,6 +28,9 @@ pub struct TokenResponse {
pub scope: String,
pub prefix: String,
#[schema(value_type = Option<String>)]
pub user_id: Option<Uuid>,
pub username: Option<String>,
#[schema(value_type = Option<String>)]
pub last_used_at: Option<DateTime<Utc>>,
#[schema(value_type = Option<String>)]
pub revoked_at: Option<DateTime<Utc>>,
@@ -71,6 +76,10 @@ pub async fn create_token(
_ => return Err(ApiError::bad_request("scope must be 'admin' or 'read'")),
};
if scope == "read" && input.user_id.is_none() {
return Err(ApiError::bad_request("user_id is required for read-scoped tokens"));
}
let mut random = [0u8; 24];
OsRng.fill_bytes(&mut random);
let secret = URL_SAFE_NO_PAD.encode(random);
@@ -85,13 +94,14 @@ pub async fn create_token(
let id = Uuid::new_v4();
sqlx::query(
-"INSERT INTO api_tokens (id, name, prefix, token_hash, scope) VALUES ($1, $2, $3, $4, $5)",
+"INSERT INTO api_tokens (id, name, prefix, token_hash, scope, user_id) VALUES ($1, $2, $3, $4, $5, $6)",
)
.bind(id)
.bind(input.name.trim())
.bind(&prefix)
.bind(token_hash)
.bind(scope)
.bind(input.user_id)
.execute(&state.pool)
.await?;
@@ -118,7 +128,13 @@ pub async fn create_token(
)]
pub async fn list_tokens(State(state): State<AppState>) -> Result<Json<Vec<TokenResponse>>, ApiError> {
let rows = sqlx::query(
-"SELECT id, name, scope, prefix, last_used_at, revoked_at, created_at FROM api_tokens ORDER BY created_at DESC",
+r#"
+SELECT t.id, t.name, t.scope, t.prefix, t.user_id, u.username,
+       t.last_used_at, t.revoked_at, t.created_at
+FROM api_tokens t
+LEFT JOIN users u ON u.id = t.user_id
+ORDER BY t.created_at DESC
+"#,
)
.fetch_all(&state.pool)
.await?;
@@ -130,6 +146,8 @@ pub async fn list_tokens(State(state): State<AppState>) -> Result<Json<Vec<Token
name: row.get("name"),
scope: row.get("scope"),
prefix: row.get("prefix"),
user_id: row.get("user_id"),
username: row.get("username"),
last_used_at: row.get("last_used_at"),
revoked_at: row.get("revoked_at"),
created_at: row.get("created_at"),
@@ -171,6 +189,47 @@ pub async fn revoke_token(
Ok(Json(serde_json::json!({"revoked": true, "id": id})))
}
#[derive(Deserialize, ToSchema)]
pub struct UpdateTokenRequest {
#[schema(value_type = Option<String>)]
pub user_id: Option<Uuid>,
}
/// Update a token's assigned user
#[utoipa::path(
patch,
path = "/admin/tokens/{id}",
tag = "tokens",
params(
("id" = String, Path, description = "Token UUID"),
),
request_body = UpdateTokenRequest,
responses(
(status = 200, description = "Token updated"),
(status = 404, description = "Token not found"),
(status = 401, description = "Unauthorized"),
(status = 403, description = "Forbidden - Admin scope required"),
),
security(("Bearer" = []))
)]
pub async fn update_token(
State(state): State<AppState>,
Path(id): Path<Uuid>,
Json(input): Json<UpdateTokenRequest>,
) -> Result<Json<serde_json::Value>, ApiError> {
let result = sqlx::query("UPDATE api_tokens SET user_id = $1 WHERE id = $2")
.bind(input.user_id)
.bind(id)
.execute(&state.pool)
.await?;
if result.rows_affected() == 0 {
return Err(ApiError::not_found("token not found"));
}
Ok(Json(serde_json::json!({"updated": true, "id": id})))
}
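The token endpoints above enforce one invariant in two places: a `read`-scoped token must carry a `user_id` so reading progress can be attributed to a user. A simplified model of that validation (standalone sketch; the real handler works on the request struct and a UUID):

```rust
// Simplified model of create_token's scope/user validation: read-scoped
// tokens must be bound to a user so progress can be attributed.
fn validate_token(scope: &str, user_id: Option<&str>) -> Result<(), &'static str> {
    if scope != "admin" && scope != "read" {
        return Err("scope must be 'admin' or 'read'");
    }
    if scope == "read" && user_id.is_none() {
        return Err("user_id is required for read-scoped tokens");
    }
    Ok(())
}

fn main() {
    assert!(validate_token("read", Some("42")).is_ok());
    assert_eq!(
        validate_token("read", None),
        Err("user_id is required for read-scoped tokens")
    );
    // Admin tokens are not tied to a user.
    assert!(validate_token("admin", None).is_ok());
    println!("ok");
}
```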
/// Permanently delete a revoked API token
#[utoipa::path(
post,

apps/api/src/users.rs (new file, 195 lines)

@@ -0,0 +1,195 @@
use axum::{extract::{Path, State}, Json};
use chrono::{DateTime, Utc};
use serde::{Deserialize, Serialize};
use sqlx::Row;
use uuid::Uuid;
use utoipa::ToSchema;
use crate::{error::ApiError, state::AppState};
#[derive(Serialize, ToSchema)]
pub struct UserResponse {
#[schema(value_type = String)]
pub id: Uuid,
pub username: String,
pub token_count: i64,
pub books_read: i64,
pub books_reading: i64,
#[schema(value_type = String)]
pub created_at: DateTime<Utc>,
}
#[derive(Deserialize, ToSchema)]
pub struct CreateUserRequest {
pub username: String,
}
/// List all reader users with their associated token count
#[utoipa::path(
get,
path = "/admin/users",
tag = "users",
responses(
(status = 200, body = Vec<UserResponse>),
(status = 401, description = "Unauthorized"),
(status = 403, description = "Forbidden - Admin scope required"),
),
security(("Bearer" = []))
)]
pub async fn list_users(State(state): State<AppState>) -> Result<Json<Vec<UserResponse>>, ApiError> {
let rows = sqlx::query(
r#"
SELECT u.id, u.username, u.created_at,
COUNT(DISTINCT t.id) AS token_count,
COUNT(DISTINCT brp.book_id) FILTER (WHERE brp.status = 'read') AS books_read,
COUNT(DISTINCT brp.book_id) FILTER (WHERE brp.status = 'reading') AS books_reading
FROM users u
LEFT JOIN api_tokens t ON t.user_id = u.id AND t.revoked_at IS NULL
LEFT JOIN book_reading_progress brp ON brp.user_id = u.id
GROUP BY u.id, u.username, u.created_at
ORDER BY u.created_at DESC
"#,
)
.fetch_all(&state.pool)
.await?;
let items = rows
.into_iter()
.map(|row| UserResponse {
id: row.get("id"),
username: row.get("username"),
token_count: row.get("token_count"),
books_read: row.get("books_read"),
books_reading: row.get("books_reading"),
created_at: row.get("created_at"),
})
.collect();
Ok(Json(items))
}
/// Create a new reader user
#[utoipa::path(
post,
path = "/admin/users",
tag = "users",
request_body = CreateUserRequest,
responses(
(status = 200, body = UserResponse, description = "User created"),
(status = 400, description = "Invalid input"),
(status = 401, description = "Unauthorized"),
(status = 403, description = "Forbidden - Admin scope required"),
),
security(("Bearer" = []))
)]
pub async fn create_user(
State(state): State<AppState>,
Json(input): Json<CreateUserRequest>,
) -> Result<Json<UserResponse>, ApiError> {
if input.username.trim().is_empty() {
return Err(ApiError::bad_request("username is required"));
}
let id = Uuid::new_v4();
let row = sqlx::query(
"INSERT INTO users (id, username) VALUES ($1, $2) RETURNING id, username, created_at",
)
.bind(id)
.bind(input.username.trim())
.fetch_one(&state.pool)
.await
.map_err(|e| {
if let sqlx::Error::Database(ref db_err) = e {
if db_err.constraint() == Some("users_username_key") {
return ApiError::bad_request("username already exists");
}
}
ApiError::from(e)
})?;
Ok(Json(UserResponse {
id: row.get("id"),
username: row.get("username"),
token_count: 0,
books_read: 0,
books_reading: 0,
created_at: row.get("created_at"),
}))
}
/// Update a reader user's username
#[utoipa::path(
patch,
path = "/admin/users/{id}",
tag = "users",
request_body = CreateUserRequest,
params(
("id" = String, Path, description = "User UUID"),
),
responses(
(status = 200, description = "User updated"),
(status = 400, description = "Invalid input"),
(status = 404, description = "User not found"),
(status = 401, description = "Unauthorized"),
(status = 403, description = "Forbidden - Admin scope required"),
),
security(("Bearer" = []))
)]
pub async fn update_user(
State(state): State<AppState>,
Path(id): Path<Uuid>,
Json(input): Json<CreateUserRequest>,
) -> Result<Json<serde_json::Value>, ApiError> {
if input.username.trim().is_empty() {
return Err(ApiError::bad_request("username is required"));
}
let result = sqlx::query("UPDATE users SET username = $1 WHERE id = $2")
.bind(input.username.trim())
.bind(id)
.execute(&state.pool)
.await
.map_err(|e| {
if let sqlx::Error::Database(ref db_err) = e {
if db_err.constraint() == Some("users_username_key") {
return ApiError::bad_request("username already exists");
}
}
ApiError::from(e)
})?;
if result.rows_affected() == 0 {
return Err(ApiError::not_found("user not found"));
}
Ok(Json(serde_json::json!({"updated": true, "id": id})))
}
/// Delete a reader user (cascades to tokens and reading progress)
#[utoipa::path(
delete,
path = "/admin/users/{id}",
tag = "users",
params(
("id" = String, Path, description = "User UUID"),
),
responses(
(status = 200, description = "User deleted"),
(status = 404, description = "User not found"),
(status = 401, description = "Unauthorized"),
(status = 403, description = "Forbidden - Admin scope required"),
),
security(("Bearer" = []))
)]
pub async fn delete_user(
State(state): State<AppState>,
Path(id): Path<Uuid>,
) -> Result<Json<serde_json::Value>, ApiError> {
let result = sqlx::query("DELETE FROM users WHERE id = $1")
.bind(id)
.execute(&state.pool)
.await?;
if result.rows_affected() == 0 {
return Err(ApiError::not_found("user not found"));
}
Ok(Json(serde_json::json!({"deleted": true, "id": id})))
}
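The create/update handlers above trim the username and reject empty input with a 400; a client can mirror that check before calling the API to keep round trips rare. A minimal sketch in TypeScript (`validateUsername` is a hypothetical helper, not part of this changeset):

```typescript
// Hypothetical client-side mirror of the server's username check:
// trim the input and reject empty strings before PATCH /admin/users/:id.
function validateUsername(raw: string): string | null {
  const username = raw.trim();
  return username.length > 0 ? username : null;
}
```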


@@ -0,0 +1,135 @@
import { fetchBooks, fetchAllSeries, BooksPageDto, SeriesPageDto, getBookCoverUrl } from "@/lib/api";
import { getServerTranslations } from "@/lib/i18n/server";
import { BooksGrid } from "@/app/components/BookCard";
import { OffsetPagination } from "@/app/components/ui";
import Image from "next/image";
import Link from "next/link";
export const dynamic = "force-dynamic";
export default async function AuthorDetailPage({
params,
searchParams,
}: {
params: Promise<{ name: string }>;
searchParams: Promise<{ [key: string]: string | string[] | undefined }>;
}) {
const { t } = await getServerTranslations();
const { name: encodedName } = await params;
const authorName = decodeURIComponent(encodedName);
const searchParamsAwaited = await searchParams;
const page = typeof searchParamsAwaited.page === "string" ? parseInt(searchParamsAwaited.page, 10) || 1 : 1;
const limit = typeof searchParamsAwaited.limit === "string" ? parseInt(searchParamsAwaited.limit, 10) || 20 : 20;
// Fetch books by this author (server-side filtering via API) and series by this author
const [booksPage, seriesPage] = await Promise.all([
fetchBooks(undefined, undefined, page, limit, undefined, undefined, authorName).catch(
() => ({ items: [], total: 0, page: 1, limit }) as BooksPageDto
),
fetchAllSeries(undefined, undefined, undefined, 1, 200, undefined, undefined, undefined, undefined, authorName).catch(
() => ({ items: [], total: 0, page: 1, limit: 200 }) as SeriesPageDto
),
]);
const totalPages = Math.ceil(booksPage.total / limit);
const authorSeries = seriesPage.items;
return (
<>
{/* Breadcrumb */}
<nav className="flex items-center gap-2 text-sm text-muted-foreground mb-6">
<Link href="/authors" className="hover:text-foreground transition-colors">
{t("authors.title")}
</Link>
<span>/</span>
<span className="text-foreground font-medium">{authorName}</span>
</nav>
{/* Author Header */}
<div className="flex items-center gap-4 mb-8">
<div className="w-16 h-16 rounded-full bg-accent/50 flex items-center justify-center flex-shrink-0">
<span className="text-2xl font-bold text-accent-foreground">
{authorName.charAt(0).toUpperCase()}
</span>
</div>
<div>
<h1 className="text-3xl font-bold text-foreground">{authorName}</h1>
<div className="flex items-center gap-4 mt-1">
<span className="text-sm text-muted-foreground">
{t("authors.bookCount", { count: String(booksPage.total), plural: booksPage.total !== 1 ? "s" : "" })}
</span>
{authorSeries.length > 0 && (
<span className="text-sm text-muted-foreground">
{t("authors.seriesCount", { count: String(authorSeries.length), plural: authorSeries.length !== 1 ? "s" : "" })}
</span>
)}
</div>
</div>
</div>
{/* Series Section */}
{authorSeries.length > 0 && (
<section className="mb-8">
<h2 className="text-xl font-semibold text-foreground mb-4">
{t("authors.seriesBy", { name: authorName })}
</h2>
<div className="grid grid-cols-2 sm:grid-cols-3 md:grid-cols-4 lg:grid-cols-5 xl:grid-cols-6 gap-4">
{authorSeries.map((s) => (
<Link
key={`${s.library_id}-${s.name}`}
href={`/libraries/${s.library_id}/series/${encodeURIComponent(s.name)}`}
className="group"
>
<div className="bg-card rounded-xl shadow-sm border border-border/60 overflow-hidden hover:shadow-md hover:-translate-y-1 transition-all duration-200">
<div className="aspect-[2/3] relative bg-muted/50">
<Image
src={getBookCoverUrl(s.first_book_id)}
alt={s.name}
fill
className="object-cover"
sizes="(max-width: 640px) 50vw, (max-width: 768px) 33vw, (max-width: 1024px) 25vw, 16vw"
/>
</div>
<div className="p-3">
<h3 className="font-medium text-foreground truncate text-sm" title={s.name}>
{s.name}
</h3>
<p className="text-xs text-muted-foreground mt-1">
{t("authors.bookCount", { count: String(s.book_count), plural: s.book_count !== 1 ? "s" : "" })}
</p>
</div>
</div>
</Link>
))}
</div>
</section>
)}
{/* Books Section */}
{booksPage.items.length > 0 && (
<section>
<h2 className="text-xl font-semibold text-foreground mb-4">
{t("authors.booksBy", { name: authorName })}
</h2>
<BooksGrid books={booksPage.items} />
<OffsetPagination
currentPage={page}
totalPages={totalPages}
pageSize={limit}
totalItems={booksPage.total}
/>
</section>
)}
{/* Empty State */}
{booksPage.items.length === 0 && authorSeries.length === 0 && (
<div className="flex flex-col items-center justify-center py-16 text-center">
<p className="text-muted-foreground text-lg">
{t("authors.noResults")}
</p>
</div>
)}
</>
);
}
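Query parameters like `page` and `limit` arrive as strings (or string arrays) and need a NaN-safe parse. One possible helper, shown as a sketch (`parsePageParam` is a hypothetical name, not part of this changeset):

```typescript
// Hypothetical NaN-safe query-param parser: explicit radix,
// fallback on malformed input, and a floor of 1.
function parsePageParam(value: string | string[] | undefined, fallback = 1): number {
  if (typeof value !== "string") return fallback;
  const n = parseInt(value, 10);
  return Number.isFinite(n) && n >= 1 ? n : fallback;
}
```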


@@ -0,0 +1,122 @@
import { fetchAuthors, AuthorsPageDto } from "@/lib/api";
import { getServerTranslations } from "@/lib/i18n/server";
import { LiveSearchForm } from "@/app/components/LiveSearchForm";
import { Card, CardContent, OffsetPagination } from "@/app/components/ui";
import Link from "next/link";
export const dynamic = "force-dynamic";
export default async function AuthorsPage({
searchParams,
}: {
searchParams: Promise<{ [key: string]: string | string[] | undefined }>;
}) {
const { t } = await getServerTranslations();
const searchParamsAwaited = await searchParams;
const searchQuery = typeof searchParamsAwaited.q === "string" ? searchParamsAwaited.q : "";
const sort = typeof searchParamsAwaited.sort === "string" ? searchParamsAwaited.sort : undefined;
const page = typeof searchParamsAwaited.page === "string" ? parseInt(searchParamsAwaited.page, 10) || 1 : 1;
const limit = typeof searchParamsAwaited.limit === "string" ? parseInt(searchParamsAwaited.limit, 10) || 20 : 20;
const authorsPage = await fetchAuthors(
searchQuery || undefined,
page,
limit,
sort,
).catch(() => ({ items: [], total: 0, page: 1, limit }) as AuthorsPageDto);
const totalPages = Math.ceil(authorsPage.total / limit);
const hasFilters = searchQuery || sort;
const sortOptions = [
{ value: "", label: t("authors.sortName") },
{ value: "books", label: t("authors.sortBooks") },
];
return (
<>
<div className="mb-6">
<h1 className="text-3xl font-bold text-foreground flex items-center gap-3">
<svg className="w-8 h-8 text-violet-500" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M17 20h5v-2a3 3 0 00-5.356-1.857M17 20H7m10 0v-2c0-.656-.126-1.283-.356-1.857M7 20H2v-2a3 3 0 015.356-1.857M7 20v-2c0-.656.126-1.283.356-1.857m0 0a5.002 5.002 0 019.288 0M15 7a3 3 0 11-6 0 3 3 0 016 0zm6 3a2 2 0 11-4 0 2 2 0 014 0zM7 10a2 2 0 11-4 0 2 2 0 014 0z" />
</svg>
{t("authors.title")}
</h1>
</div>
<Card className="mb-6">
<CardContent className="pt-6">
<LiveSearchForm
basePath="/authors"
fields={[
{ name: "q", type: "text", label: t("common.search"), placeholder: t("authors.searchPlaceholder") },
{ name: "sort", type: "select", label: t("books.sort"), options: sortOptions },
]}
/>
</CardContent>
</Card>
{/* Results count */}
<p className="text-sm text-muted-foreground mb-4">
{authorsPage.total} {t("authors.title").toLowerCase()}
{searchQuery && <> {t("authors.matchingQuery")} &quot;{searchQuery}&quot;</>}
</p>
{/* Authors List */}
{authorsPage.items.length > 0 ? (
<>
<div className="grid grid-cols-1 sm:grid-cols-2 lg:grid-cols-3 xl:grid-cols-4 gap-4">
{authorsPage.items.map((author) => (
<Link
key={author.name}
href={`/authors/${encodeURIComponent(author.name)}`}
className="group"
>
<div className="bg-card rounded-xl shadow-sm border border-border/60 overflow-hidden hover:shadow-md hover:-translate-y-1 transition-all duration-200 p-4">
<div className="flex items-center gap-3">
<div className="w-10 h-10 rounded-full bg-accent/50 flex items-center justify-center flex-shrink-0">
<span className="text-lg font-semibold text-violet-500">
{author.name.charAt(0).toUpperCase()}
</span>
</div>
<div className="min-w-0">
<h3 className="font-medium text-foreground truncate text-sm group-hover:text-violet-500 transition-colors" title={author.name}>
{author.name}
</h3>
<div className="flex items-center gap-3 mt-0.5">
<span className="text-xs text-muted-foreground">
{t("authors.bookCount", { count: String(author.book_count), plural: author.book_count !== 1 ? "s" : "" })}
</span>
<span className="text-xs text-muted-foreground">
{t("authors.seriesCount", { count: String(author.series_count), plural: author.series_count !== 1 ? "s" : "" })}
</span>
</div>
</div>
</div>
</div>
</Link>
))}
</div>
<OffsetPagination
currentPage={page}
totalPages={totalPages}
pageSize={limit}
totalItems={authorsPage.total}
/>
</>
) : (
<div className="flex flex-col items-center justify-center py-16 text-center">
<div className="w-16 h-16 mb-4 text-muted-foreground/30">
<svg fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={1.5} d="M17 20h5v-2a3 3 0 00-5.356-1.857M17 20H7m10 0v-2c0-.656-.126-1.283-.356-1.857M7 20H2v-2a3 3 0 015.356-1.857M7 20v-2c0-.656.126-1.283.356-1.857m0 0a5.002 5.002 0 019.288 0M15 7a3 3 0 11-6 0 3 3 0 016 0zm6 3a2 2 0 11-4 0 2 2 0 014 0zM7 10a2 2 0 11-4 0 2 2 0 014 0z" />
</svg>
</div>
<p className="text-muted-foreground text-lg">
{hasFilters ? t("authors.noResults") : t("authors.noAuthors")}
</p>
</div>
)}
</>
);
}
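`totalPages` above is computed as `Math.ceil(total / limit)`, which yields `Infinity` for a zero `limit` and `0` for an empty result set. A guarded variant, as a sketch (`computeTotalPages` is a hypothetical helper):

```typescript
// Hypothetical guarded page-count computation: never divides by zero
// and always reports at least one page for pagination widgets.
function computeTotalPages(total: number, limit: number): number {
  if (limit <= 0) return 1;
  return Math.max(1, Math.ceil(total / limit));
}
```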


@@ -0,0 +1,242 @@
import { fetchLibraries, getBookCoverUrl, BookDto, apiFetch, ReadingStatus } from "@/lib/api";
import { BookPreview } from "@/app/components/BookPreview";
import { ConvertButton } from "@/app/components/ConvertButton";
import { MarkBookReadButton } from "@/app/components/MarkBookReadButton";
import nextDynamic from "next/dynamic";
import { SafeHtml } from "@/app/components/SafeHtml";
import { getServerTranslations } from "@/lib/i18n/server";
import Image from "next/image";
import Link from "next/link";
import { notFound } from "next/navigation";
const EditBookForm = nextDynamic(
() => import("@/app/components/EditBookForm").then(m => m.EditBookForm)
);
export const dynamic = "force-dynamic";
const readingStatusClassNames: Record<ReadingStatus, string> = {
unread: "bg-muted/60 text-muted-foreground border border-border",
reading: "bg-amber-500/15 text-amber-600 dark:text-amber-400 border border-amber-500/30",
read: "bg-green-500/15 text-green-600 dark:text-green-400 border border-green-500/30",
};
async function fetchBook(bookId: string): Promise<BookDto | null> {
try {
return await apiFetch<BookDto>(`/books/${bookId}`);
} catch {
return null;
}
}
export default async function BookDetailPage({
params
}: {
params: Promise<{ id: string }>;
}) {
const { id } = await params;
const [book, libraries] = await Promise.all([
fetchBook(id),
fetchLibraries().catch(() => [] as { id: string; name: string }[])
]);
if (!book) {
notFound();
}
const { t, locale } = await getServerTranslations();
const library = libraries.find(l => l.id === book.library_id);
const formatBadge = (book.format ?? book.kind).toUpperCase();
const formatColor =
formatBadge === "CBZ" ? "bg-success/10 text-success border-success/30" :
formatBadge === "CBR" ? "bg-warning/10 text-warning border-warning/30" :
formatBadge === "PDF" ? "bg-destructive/10 text-destructive border-destructive/30" :
"bg-muted/50 text-muted-foreground border-border";
const statusLabel = t(`status.${book.reading_status}` as "status.unread" | "status.reading" | "status.read");
const statusClassName = readingStatusClassNames[book.reading_status];
return (
<div className="space-y-6">
{/* Breadcrumb */}
<div className="flex items-center gap-2 text-sm">
<Link href="/libraries" className="text-muted-foreground hover:text-primary transition-colors">
{t("bookDetail.libraries")}
</Link>
<span className="text-muted-foreground">/</span>
{library && (
<>
<Link
href={`/libraries/${book.library_id}/series`}
className="text-muted-foreground hover:text-primary transition-colors"
>
{library.name}
</Link>
<span className="text-muted-foreground">/</span>
</>
)}
{book.series && (
<>
<Link
href={`/libraries/${book.library_id}/series/${encodeURIComponent(book.series)}`}
className="text-muted-foreground hover:text-primary transition-colors"
>
{book.series}
</Link>
<span className="text-muted-foreground">/</span>
</>
)}
<span className="text-foreground font-medium truncate">{book.title}</span>
</div>
{/* Hero */}
<div className="flex flex-col sm:flex-row gap-6">
{/* Cover */}
<div className="flex-shrink-0">
<div className="w-48 aspect-[2/3] relative rounded-xl overflow-hidden shadow-card border border-border">
<Image
src={getBookCoverUrl(book.id)}
alt={t("bookDetail.coverOf", { title: book.title })}
fill
className="object-cover"
sizes="192px"
loading="lazy"
/>
</div>
</div>
{/* Info */}
<div className="flex-1 space-y-4">
<div className="flex items-start justify-between gap-4">
<div>
<h1 className="text-3xl font-bold text-foreground">{book.title}</h1>
{book.author && (
<p className="text-base text-muted-foreground mt-1">{book.author}</p>
)}
</div>
<EditBookForm book={book} />
</div>
{/* Series + Volume link */}
{book.series && (
<div className="flex items-center gap-2 text-sm">
<Link
href={`/libraries/${book.library_id}/series/${encodeURIComponent(book.series)}`}
className="text-primary hover:text-primary/80 transition-colors font-medium"
>
{book.series}
</Link>
{book.volume != null && (
<span className="px-2 py-0.5 bg-primary/10 text-primary rounded-md text-xs font-semibold">
Vol. {book.volume}
</span>
)}
</div>
)}
{/* Reading status + actions */}
<div className="flex flex-wrap items-center gap-3">
<span className={`inline-flex items-center px-2.5 py-1 rounded-full text-xs font-semibold ${statusClassName}`}>
{statusLabel}
{book.reading_status === "reading" && book.reading_current_page != null && ` · p. ${book.reading_current_page}`}
</span>
{book.reading_last_read_at && (
<span className="text-xs text-muted-foreground">
{new Date(book.reading_last_read_at).toLocaleDateString(locale)}
</span>
)}
<MarkBookReadButton bookId={book.id} currentStatus={book.reading_status} />
{book.file_format === "cbr" && <ConvertButton bookId={book.id} />}
</div>
{/* Metadata pills */}
<div className="flex flex-wrap items-center gap-2">
<span className={`inline-flex px-2.5 py-1 rounded-full text-xs font-semibold border ${formatColor}`}>
{formatBadge}
</span>
{book.page_count != null && book.page_count > 0 && (
<span className="inline-flex px-2.5 py-1 rounded-full text-xs font-medium bg-muted/50 text-muted-foreground border border-border">
{book.page_count} {t("dashboard.pages").toLowerCase()}
</span>
)}
{book.language && (
<span className="inline-flex px-2.5 py-1 rounded-full text-xs font-medium bg-muted/50 text-muted-foreground border border-border">
{book.language.toUpperCase()}
</span>
)}
{book.isbn && (
<span className="inline-flex px-2.5 py-1 rounded-full text-xs font-mono font-medium bg-muted/50 text-muted-foreground border border-border">
ISBN {book.isbn}
</span>
)}
{book.publish_date && (
<span className="inline-flex px-2.5 py-1 rounded-full text-xs font-medium bg-muted/50 text-muted-foreground border border-border">
{book.publish_date}
</span>
)}
</div>
{/* Description */}
{book.summary && (
<SafeHtml html={book.summary} className="text-sm text-muted-foreground leading-relaxed" />
)}
</div>
</div>
{/* Technical info (collapsible) */}
<details className="group">
<summary className="cursor-pointer text-xs text-muted-foreground hover:text-foreground transition-colors select-none flex items-center gap-1.5">
<svg className="w-3.5 h-3.5 transition-transform group-open:rotate-90" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M9 5l7 7-7 7" />
</svg>
{t("bookDetail.technicalInfo")}
</summary>
<div className="mt-3 p-4 rounded-lg bg-muted/30 border border-border/50 space-y-2 text-xs">
{book.file_path && (
<div className="flex flex-col gap-0.5">
<span className="text-muted-foreground">{t("bookDetail.file")}</span>
<code className="font-mono text-foreground break-all">{book.file_path}</code>
</div>
)}
{book.file_format && (
<div className="flex items-center justify-between">
<span className="text-muted-foreground">{t("bookDetail.fileFormat")}</span>
<span className="text-foreground">{book.file_format.toUpperCase()}</span>
</div>
)}
{book.file_parse_status && (
<div className="flex items-center justify-between">
<span className="text-muted-foreground">{t("bookDetail.parsing")}</span>
<span className={`inline-flex px-2 py-0.5 rounded-full text-xs font-medium ${
book.file_parse_status === "success" ? "bg-success/10 text-success" :
book.file_parse_status === "failed" ? "bg-destructive/10 text-destructive" :
"bg-muted/50 text-muted-foreground"
}`}>
{book.file_parse_status}
</span>
</div>
)}
<div className="flex items-center justify-between">
<span className="text-muted-foreground">Book ID</span>
<code className="font-mono text-foreground">{book.id}</code>
</div>
<div className="flex items-center justify-between">
<span className="text-muted-foreground">Library ID</span>
<code className="font-mono text-foreground">{book.library_id}</code>
</div>
{book.updated_at && (
<div className="flex items-center justify-between">
<span className="text-muted-foreground">{t("bookDetail.updatedAt")}</span>
<span className="text-foreground">{new Date(book.updated_at).toLocaleString(locale)}</span>
</div>
)}
</div>
</details>
{/* Book Preview */}
{book.page_count != null && book.page_count > 0 && (
<BookPreview bookId={book.id} pageCount={book.page_count} />
)}
</div>
);
}
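The chained ternary that picks `formatColor` above can also be written as a lookup table with a default, keeping the badge-to-class mapping in one place. A sketch (`FORMAT_COLORS` and `formatColorFor` are hypothetical names, not part of this changeset):

```typescript
// Hypothetical table-driven replacement for the formatColor ternary chain.
const FORMAT_COLORS: Record<string, string> = {
  CBZ: "bg-success/10 text-success border-success/30",
  CBR: "bg-warning/10 text-warning border-warning/30",
  PDF: "bg-destructive/10 text-destructive border-destructive/30",
};

function formatColorFor(badge: string): string {
  // Unknown formats (e.g. EPUB) fall back to the neutral muted style.
  return FORMAT_COLORS[badge] ?? "bg-muted/50 text-muted-foreground border-border";
}
```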


@@ -1,9 +1,10 @@
import { fetchBooks, searchBooks, fetchLibraries, BookDto, LibraryDto, SeriesHitDto, getBookCoverUrl } from "../../lib/api";
import { BooksGrid, EmptyState } from "../components/BookCard";
import { LiveSearchForm } from "../components/LiveSearchForm";
import { Card, CardContent, OffsetPagination } from "../components/ui";
import { fetchBooks, searchBooks, fetchLibraries, BookDto, LibraryDto, SeriesHitDto, getBookCoverUrl } from "@/lib/api";
import { BooksGrid, EmptyState } from "@/app/components/BookCard";
import { LiveSearchForm } from "@/app/components/LiveSearchForm";
import { Card, CardContent, OffsetPagination } from "@/app/components/ui";
import Link from "next/link";
import Image from "next/image";
import { getServerTranslations } from "@/lib/i18n/server";
export const dynamic = "force-dynamic";
@@ -12,10 +13,13 @@ export default async function BooksPage({
}: {
searchParams: Promise<{ [key: string]: string | string[] | undefined }>;
}) {
const { t } = await getServerTranslations();
const searchParamsAwaited = await searchParams;
const libraryId = typeof searchParamsAwaited.library === "string" ? searchParamsAwaited.library : undefined;
const searchQuery = typeof searchParamsAwaited.q === "string" ? searchParamsAwaited.q : "";
const readingStatus = typeof searchParamsAwaited.status === "string" ? searchParamsAwaited.status : undefined;
const format = typeof searchParamsAwaited.format === "string" ? searchParamsAwaited.format : undefined;
const metadataProvider = typeof searchParamsAwaited.metadata === "string" ? searchParamsAwaited.metadata : undefined;
const sort = typeof searchParamsAwaited.sort === "string" ? searchParamsAwaited.sort : undefined;
const page = typeof searchParamsAwaited.page === "string" ? parseInt(searchParamsAwaited.page, 10) || 1 : 1;
const limit = typeof searchParamsAwaited.limit === "string" ? parseInt(searchParamsAwaited.limit, 10) || 20 : 20;
@@ -39,8 +43,8 @@ export default async function BooksPage({
library_id: hit.library_id,
kind: hit.kind,
title: hit.title,
author: hit.author,
authors: [],
author: hit.authors?.[0] ?? null,
authors: hit.authors ?? [],
series: hit.series,
volume: hit.volume,
language: hit.language,
@@ -53,11 +57,14 @@ export default async function BooksPage({
reading_status: "unread" as const,
reading_current_page: null,
reading_last_read_at: null,
summary: null,
isbn: null,
publish_date: null,
}));
totalHits = searchResponse.estimated_total_hits;
}
} else {
const booksPage = await fetchBooks(libraryId, undefined, page, limit, readingStatus, sort).catch(() => ({
const booksPage = await fetchBooks(libraryId, undefined, page, limit, readingStatus, sort, undefined, format, metadataProvider).catch(() => ({
items: [] as BookDto[],
total: 0,
page: 1,
@@ -75,23 +82,37 @@ export default async function BooksPage({
const totalPages = Math.ceil(total / limit);
const libraryOptions = [
{ value: "", label: "All libraries" },
{ value: "", label: t("books.allLibraries") },
...libraries.map((lib) => ({ value: lib.id, label: lib.name })),
];
const statusOptions = [
{ value: "", label: "All" },
{ value: "unread", label: "Unread" },
{ value: "reading", label: "In progress" },
{ value: "read", label: "Read" },
{ value: "", label: t("common.all") },
{ value: "unread", label: t("status.unread") },
{ value: "reading", label: t("status.reading") },
{ value: "read", label: t("status.read") },
];
const formatOptions = [
{ value: "", label: t("books.allFormats") },
{ value: "cbz", label: "CBZ" },
{ value: "cbr", label: "CBR" },
{ value: "pdf", label: "PDF" },
{ value: "epub", label: "EPUB" },
];
const metadataOptions = [
{ value: "", label: t("series.metadataAll") },
{ value: "linked", label: t("series.metadataLinked") },
{ value: "unlinked", label: t("series.metadataUnlinked") },
];
const sortOptions = [
{ value: "", label: "Title" },
{ value: "latest", label: "Latest added" },
{ value: "", label: t("books.sortTitle") },
{ value: "latest", label: t("books.sortLatest") },
];
const hasFilters = searchQuery || libraryId || readingStatus || sort;
const hasFilters = searchQuery || libraryId || readingStatus || format || metadataProvider || sort;
return (
<>
@@ -100,7 +121,7 @@ export default async function BooksPage({
<svg className="w-8 h-8 text-success" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M12 6.253v13m0-13C10.832 5.477 9.246 5 7.5 5S4.168 5.477 3 6.253v13C4.168 18.477 5.754 18 7.5 18s3.332.477 4.5 1.253m0-13C13.168 5.477 14.754 5 16.5 5c1.747 0 3.332.477 4.5 1.253v13C19.832 18.477 18.247 18 16.5 18c-1.746 0-3.332.477-4.5 1.253" />
</svg>
Books
{t("books.title")}
</h1>
</div>
@@ -109,10 +130,12 @@ export default async function BooksPage({
<LiveSearchForm
basePath="/books"
fields={[
{ name: "q", type: "text", label: "Search", placeholder: "Search by title, author, series...", className: "flex-1 w-full" },
{ name: "library", type: "select", label: "Library", options: libraryOptions, className: "w-full sm:w-48" },
{ name: "status", type: "select", label: "Status", options: statusOptions, className: "w-full sm:w-40" },
{ name: "sort", type: "select", label: "Sort", options: sortOptions, className: "w-full sm:w-40" },
{ name: "q", type: "text", label: t("common.search"), placeholder: t("books.searchPlaceholder") },
{ name: "library", type: "select", label: t("books.library"), options: libraryOptions },
{ name: "status", type: "select", label: t("books.status"), options: statusOptions },
{ name: "format", type: "select", label: t("books.format"), options: formatOptions },
{ name: "metadata", type: "select", label: t("series.metadata"), options: metadataOptions },
{ name: "sort", type: "select", label: t("books.sort"), options: sortOptions },
]}
/>
</CardContent>
@@ -121,18 +144,18 @@ export default async function BooksPage({
{/* Results */}
{searchQuery && totalHits !== null ? (
<p className="text-sm text-muted-foreground mb-4">
Found {totalHits} result{totalHits !== 1 ? 's' : ''} for &quot;{searchQuery}&quot;
{t("books.resultCountFor", { count: String(totalHits), plural: totalHits !== 1 ? "s" : "", query: searchQuery })}
</p>
) : !searchQuery && (
<p className="text-sm text-muted-foreground mb-4">
{total} book{total !== 1 ? 's' : ''}
{t("books.resultCount", { count: String(total), plural: total !== 1 ? "s" : "" })}
</p>
)}
{/* Matching series */}
{seriesHits.length > 0 && (
<div className="mb-8">
<h2 className="text-lg font-semibold text-foreground mb-3">Series</h2>
<h2 className="text-lg font-semibold text-foreground mb-3">{t("books.seriesHeading")}</h2>
<div className="grid grid-cols-2 sm:grid-cols-3 md:grid-cols-4 lg:grid-cols-6 gap-4">
{seriesHits.map((s) => (
<Link
@@ -144,18 +167,18 @@ export default async function BooksPage({
<div className="aspect-[2/3] relative bg-muted/50">
<Image
src={getBookCoverUrl(s.first_book_id)}
alt={`Cover of ${s.name}`}
alt={t("books.coverOf", { name: s.name })}
fill
className="object-cover"
unoptimized
sizes="(max-width: 640px) 50vw, (max-width: 768px) 33vw, (max-width: 1024px) 25vw, 16vw"
/>
</div>
<div className="p-2">
<h3 className="font-medium text-foreground truncate text-sm" title={s.name}>
{s.name === "unclassified" ? "Unclassified" : s.name}
{s.name === "unclassified" ? t("books.unclassified") : s.name}
</h3>
<p className="text-xs text-muted-foreground mt-0.5">
{s.book_count} book{s.book_count !== 1 ? 's' : ''}
{t("books.bookCount", { count: String(s.book_count), plural: s.book_count !== 1 ? "s" : "" })}
</p>
</div>
</div>
@@ -168,7 +191,7 @@ export default async function BooksPage({
{/* Books grid */}
{displayBooks.length > 0 ? (
<>
{searchQuery && <h2 className="text-lg font-semibold text-foreground mb-3">Books</h2>}
{searchQuery && <h2 className="text-lg font-semibold text-foreground mb-3">{t("books.title")}</h2>}
<BooksGrid books={displayBooks} />
{!searchQuery && (
@@ -181,7 +204,7 @@ export default async function BooksPage({
)}
</>
) : (
<EmptyState message={searchQuery ? `No books found for "${searchQuery}"` : "No books available"} />
<EmptyState message={searchQuery ? t("books.noResults", { query: searchQuery }) : t("books.noBooks")} />
)}
</>
);
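The repeated `typeof searchParamsAwaited.x === "string"` guards in these pages can be collapsed into one helper that also handles repeated query keys (Next.js delivers those as arrays). A sketch (`firstString` is a hypothetical helper, not part of this changeset):

```typescript
// Hypothetical extractor for Next.js searchParams values:
// returns the string as-is, the first element of an array, or undefined.
function firstString(v: string | string[] | undefined): string | undefined {
  if (typeof v === "string") return v;
  if (Array.isArray(v)) return v[0];
  return undefined;
}
```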


@@ -0,0 +1,809 @@
import { notFound } from "next/navigation";
import Link from "next/link";
import { apiFetch, getMetadataBatchReport, getMetadataBatchResults, getMetadataRefreshReport, MetadataBatchReportDto, MetadataBatchResultDto, MetadataRefreshReportDto } from "@/lib/api";
import {
Card, CardHeader, CardTitle, CardDescription, CardContent,
StatusBadge, JobTypeBadge, StatBox, ProgressBar
} from "@/app/components/ui";
import { JobDetailLive } from "@/app/components/JobDetailLive";
import { getServerTranslations } from "@/lib/i18n/server";
export const dynamic = "force-dynamic";
interface JobDetailPageProps {
params: Promise<{ id: string }>;
}
interface JobDetails {
id: string;
library_id: string | null;
book_id: string | null;
type: string;
status: string;
created_at: string;
started_at: string | null;
finished_at: string | null;
phase2_started_at: string | null;
generating_thumbnails_started_at: string | null;
current_file: string | null;
progress_percent: number | null;
processed_files: number | null;
total_files: number | null;
stats_json: {
scanned_files: number;
indexed_files: number;
removed_files: number;
errors: number;
warnings: number;
} | null;
error_opt: string | null;
}
interface JobError {
id: string;
file_path: string;
error_message: string;
created_at: string;
}
async function getJobDetails(jobId: string): Promise<JobDetails | null> {
try {
return await apiFetch<JobDetails>(`/index/jobs/${jobId}`);
} catch {
return null;
}
}
async function getJobErrors(jobId: string): Promise<JobError[]> {
try {
return await apiFetch<JobError[]>(`/index/jobs/${jobId}/errors`);
} catch {
return [];
}
}
function formatDuration(start: string, end: string | null): string {
const startDate = new Date(start);
const endDate = end ? new Date(end) : new Date();
const diff = endDate.getTime() - startDate.getTime();
if (diff < 60000) return `${Math.floor(diff / 1000)}s`;
if (diff < 3600000) return `${Math.floor(diff / 60000)}m ${Math.floor((diff % 60000) / 1000)}s`;
return `${Math.floor(diff / 3600000)}h ${Math.floor((diff % 3600000) / 60000)}m`;
}
function formatSpeed(count: number, durationMs: number): string {
if (durationMs === 0 || count === 0) return "-";
return `${(count / (durationMs / 1000)).toFixed(1)}/s`;
}
export default async function JobDetailPage({ params }: JobDetailPageProps) {
const { id } = await params;
const [job, errors] = await Promise.all([
getJobDetails(id),
getJobErrors(id),
]);
if (!job) {
notFound();
}
const { t, locale } = await getServerTranslations();
const JOB_TYPE_INFO: Record<string, { label: string; description: string | null; isThumbnailOnly: boolean }> = {
rebuild: {
label: t("jobType.rebuildLabel"),
description: t("jobType.rebuildDesc"),
isThumbnailOnly: false,
},
full_rebuild: {
label: t("jobType.full_rebuildLabel"),
description: t("jobType.full_rebuildDesc"),
isThumbnailOnly: false,
},
rescan: {
label: t("jobType.rescanLabel"),
description: t("jobType.rescanDesc"),
isThumbnailOnly: false,
},
thumbnail_rebuild: {
label: t("jobType.thumbnail_rebuildLabel"),
description: t("jobType.thumbnail_rebuildDesc"),
isThumbnailOnly: true,
},
thumbnail_regenerate: {
label: t("jobType.thumbnail_regenerateLabel"),
description: t("jobType.thumbnail_regenerateDesc"),
isThumbnailOnly: true,
},
cbr_to_cbz: {
label: t("jobType.cbr_to_cbzLabel"),
description: t("jobType.cbr_to_cbzDesc"),
isThumbnailOnly: false,
},
metadata_batch: {
label: t("jobType.metadata_batchLabel"),
description: t("jobType.metadata_batchDesc"),
isThumbnailOnly: false,
},
metadata_refresh: {
label: t("jobType.metadata_refreshLabel"),
description: t("jobType.metadata_refreshDesc"),
isThumbnailOnly: false,
},
};
const isMetadataBatch = job.type === "metadata_batch";
const isMetadataRefresh = job.type === "metadata_refresh";
// Fetch batch report & results for metadata_batch jobs
let batchReport: MetadataBatchReportDto | null = null;
let batchResults: MetadataBatchResultDto[] = [];
if (isMetadataBatch) {
[batchReport, batchResults] = await Promise.all([
getMetadataBatchReport(id).catch(() => null),
getMetadataBatchResults(id).catch(() => []),
]);
}
// Fetch refresh report for metadata_refresh jobs
let refreshReport: MetadataRefreshReportDto | null = null;
if (isMetadataRefresh) {
refreshReport = await getMetadataRefreshReport(id).catch(() => null);
}
const typeInfo = JOB_TYPE_INFO[job.type] ?? {
label: job.type,
description: null,
isThumbnailOnly: false,
};
// Elapsed time in ms; falls back to the current time while the job is still running
const durationMs = job.started_at
? new Date(job.finished_at || new Date()).getTime() - new Date(job.started_at).getTime()
: 0;
const isCompleted = job.status === "success";
const isFailed = job.status === "failed";
const isCancelled = job.status === "cancelled";
const isTerminal = isCompleted || isFailed || isCancelled;
const isExtractingPages = job.status === "extracting_pages";
const isThumbnailPhase = job.status === "generating_thumbnails";
const isPhase2 = isExtractingPages || isThumbnailPhase;
const { isThumbnailOnly } = typeInfo;
// Which label to use for the progress card
const progressTitle = isMetadataBatch
? t("jobDetail.metadataSearch")
: isMetadataRefresh
? t("jobDetail.metadataRefresh")
: isThumbnailOnly
? t("jobType.thumbnail_rebuild")
: isExtractingPages
? t("jobDetail.phase2a")
: isThumbnailPhase
? t("jobDetail.phase2b")
: t("jobDetail.phase1");
const progressDescription = isMetadataBatch
? t("jobDetail.metadataSearchDesc")
: isMetadataRefresh
? t("jobDetail.metadataRefreshDesc")
: isThumbnailOnly
? undefined
: isExtractingPages
? t("jobDetail.phase2aDesc")
: isThumbnailPhase
? t("jobDetail.phase2bDesc")
: t("jobDetail.phase1Desc");
// Speed metric: thumbnail count for thumbnail jobs, scanned files for index jobs
const speedCount = isThumbnailOnly
? (job.processed_files ?? 0)
: (job.stats_json?.scanned_files ?? 0);
// Show the progress card only when the job is active or finished and there is something to display
const showProgressCard =
(isCompleted || isFailed || job.status === "running" || isPhase2) &&
(job.total_files != null || !!job.current_file);
return (
<>
<JobDetailLive jobId={id} isTerminal={isTerminal} />
<div className="mb-6">
<Link
href="/jobs"
className="inline-flex items-center text-sm text-muted-foreground hover:text-primary transition-colors duration-200"
>
<svg className="w-4 h-4 mr-1" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M15 19l-7-7 7-7" />
</svg>
{t("jobDetail.backToJobs")}
</Link>
<h1 className="text-3xl font-bold text-foreground mt-2">{t("jobDetail.title")}</h1>
</div>
{/* Summary banner — completed */}
{isCompleted && job.started_at && (
<div className="mb-6 p-4 rounded-xl bg-success/10 border border-success/30 flex items-start gap-3">
<svg className="w-5 h-5 text-success mt-0.5 shrink-0" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M9 12l2 2 4-4m6 2a9 9 0 11-18 0 9 9 0 0118 0z" />
</svg>
<div className="text-sm text-success">
<span className="font-semibold">{t("jobDetail.completedIn", { duration: formatDuration(job.started_at, job.finished_at) })}</span>
{isMetadataBatch && batchReport && (
<span className="ml-2 text-success/80">
{batchReport.auto_matched} {t("jobDetail.autoMatched").toLowerCase()}, {batchReport.already_linked} {t("jobDetail.alreadyLinked").toLowerCase()}, {batchReport.no_results} {t("jobDetail.noResults").toLowerCase()}, {batchReport.errors} {t("jobDetail.errors").toLowerCase()}
</span>
)}
{isMetadataRefresh && refreshReport && (
<span className="ml-2 text-success/80">
{refreshReport.refreshed} {t("jobDetail.refreshed").toLowerCase()}, {refreshReport.unchanged} {t("jobDetail.unchanged").toLowerCase()}, {refreshReport.errors} {t("jobDetail.errors").toLowerCase()}
</span>
)}
{!isMetadataBatch && !isMetadataRefresh && job.stats_json && (
<span className="ml-2 text-success/80">
{job.stats_json.scanned_files} {t("jobDetail.scanned").toLowerCase()}, {job.stats_json.indexed_files} {t("jobDetail.indexed").toLowerCase()}
{job.stats_json.removed_files > 0 && `, ${job.stats_json.removed_files} ${t("jobDetail.removed").toLowerCase()}`}
{(job.stats_json.warnings ?? 0) > 0 && `, ${job.stats_json.warnings} ${t("jobDetail.warnings").toLowerCase()}`}
{job.stats_json.errors > 0 && `, ${job.stats_json.errors} ${t("jobDetail.errors").toLowerCase()}`}
{job.total_files != null && job.total_files > 0 && `, ${job.total_files} ${t("jobType.thumbnail_rebuild").toLowerCase()}`}
</span>
)}
{!isMetadataBatch && !isMetadataRefresh && !job.stats_json && isThumbnailOnly && job.total_files != null && (
<span className="ml-2 text-success/80">
{job.processed_files ?? job.total_files} {t("jobDetail.generated").toLowerCase()}
</span>
)}
</div>
</div>
)}
{/* Summary banner — failed */}
{isFailed && (
<div className="mb-6 p-4 rounded-xl bg-destructive/10 border border-destructive/30 flex items-start gap-3">
<svg className="w-5 h-5 text-destructive mt-0.5 shrink-0" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M12 8v4m0 4h.01M21 12a9 9 0 11-18 0 9 9 0 0118 0z" />
</svg>
<div className="text-sm text-destructive">
<span className="font-semibold">{t("jobDetail.jobFailed")}</span>
{job.started_at && (
<span className="ml-2 text-destructive/80">{t("jobDetail.failedAfter", { duration: formatDuration(job.started_at, job.finished_at) })}</span>
)}
{job.error_opt && (
<p className="mt-1 text-destructive/70 font-mono text-xs break-all">{job.error_opt}</p>
)}
</div>
</div>
)}
{/* Summary banner — cancelled */}
{isCancelled && (
<div className="mb-6 p-4 rounded-xl bg-muted border border-border flex items-start gap-3">
<svg className="w-5 h-5 text-muted-foreground mt-0.5 shrink-0" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M18.364 18.364A9 9 0 005.636 5.636m12.728 12.728A9 9 0 015.636 5.636m12.728 12.728L5.636 5.636" />
</svg>
<span className="text-sm text-muted-foreground">
<span className="font-semibold">{t("jobDetail.cancelled")}</span>
{job.started_at && (
<span className="ml-2">{t("jobDetail.failedAfter", { duration: formatDuration(job.started_at, job.finished_at) })}</span>
)}
</span>
</div>
)}
<div className="grid grid-cols-1 lg:grid-cols-2 gap-6">
{/* Overview Card */}
<Card>
<CardHeader>
<CardTitle>{t("jobDetail.overview")}</CardTitle>
{typeInfo.description && (
<CardDescription>{typeInfo.description}</CardDescription>
)}
</CardHeader>
<CardContent className="space-y-3">
<div className="flex items-center justify-between py-2 border-b border-border/60">
<span className="text-sm text-muted-foreground">ID</span>
<code className="px-2 py-1 bg-muted rounded font-mono text-sm text-foreground">{job.id}</code>
</div>
<div className="flex items-center justify-between py-2 border-b border-border/60">
<span className="text-sm text-muted-foreground">{t("jobsList.type")}</span>
<div className="flex items-center gap-2">
<JobTypeBadge type={job.type} />
<span className="text-sm text-muted-foreground">{typeInfo.label}</span>
</div>
</div>
<div className="flex items-center justify-between py-2 border-b border-border/60">
<span className="text-sm text-muted-foreground">{t("jobsList.status")}</span>
<StatusBadge status={job.status} />
</div>
<div className={`flex items-center justify-between py-2 ${(job.book_id || job.started_at) ? "border-b border-border/60" : ""}`}>
<span className="text-sm text-muted-foreground">{t("jobDetail.library")}</span>
<span className="text-sm text-foreground">{job.library_id || t("jobDetail.allLibraries")}</span>
</div>
{job.book_id && (
<div className={`flex items-center justify-between py-2 ${job.started_at ? "border-b border-border/60" : ""}`}>
<span className="text-sm text-muted-foreground">{t("jobDetail.book")}</span>
<Link
href={`/books/${job.book_id}`}
className="text-sm text-primary hover:text-primary/80 font-mono hover:underline"
>
{job.book_id.slice(0, 8)}
</Link>
</div>
)}
{job.started_at && (
<div className="flex items-center justify-between py-2">
<span className="text-sm text-muted-foreground">{t("jobsList.duration")}</span>
<span className="text-sm font-semibold text-foreground">
{formatDuration(job.started_at, job.finished_at)}
</span>
</div>
)}
</CardContent>
</Card>
{/* Timeline Card */}
<Card>
<CardHeader>
<CardTitle>{t("jobDetail.timeline")}</CardTitle>
</CardHeader>
<CardContent>
<div className="relative">
{/* Vertical line */}
<div className="absolute left-[7px] top-2 bottom-2 w-px bg-border" />
<div className="space-y-5">
{/* Created */}
<div className="flex items-start gap-4">
<div className="w-3.5 h-3.5 rounded-full mt-0.5 bg-muted border-2 border-border shrink-0 z-10" />
<div className="flex-1 min-w-0">
<span className="text-sm font-medium text-foreground">{t("jobDetail.created")}</span>
<p className="text-xs text-muted-foreground">{new Date(job.created_at).toLocaleString(locale)}</p>
</div>
</div>
{/* Phase 1 start — for index jobs that have two phases */}
{job.started_at && job.phase2_started_at && (
<div className="flex items-start gap-4">
<div className="w-3.5 h-3.5 rounded-full mt-0.5 bg-primary shrink-0 z-10" />
<div className="flex-1 min-w-0">
<span className="text-sm font-medium text-foreground">{t("jobDetail.phase1")}</span>
<p className="text-xs text-muted-foreground">{new Date(job.started_at).toLocaleString(locale)}</p>
<p className="text-xs text-primary/80 font-medium mt-0.5">
{t("jobDetail.duration", { duration: formatDuration(job.started_at, job.phase2_started_at) })}
{job.stats_json && (
<span className="text-muted-foreground font-normal ml-1">
· {job.stats_json.scanned_files} {t("jobDetail.scanned").toLowerCase()}, {job.stats_json.indexed_files} {t("jobDetail.indexed").toLowerCase()}
{job.stats_json.removed_files > 0 && `, ${job.stats_json.removed_files} ${t("jobDetail.removed").toLowerCase()}`}
{(job.stats_json.warnings ?? 0) > 0 && `, ${job.stats_json.warnings} ${t("jobDetail.warnings").toLowerCase()}`}
</span>
)}
</p>
</div>
</div>
)}
{/* Phase 2a — Extracting pages (index jobs with phase2) */}
{job.phase2_started_at && !isThumbnailOnly && (
<div className="flex items-start gap-4">
<div className={`w-3.5 h-3.5 rounded-full mt-0.5 shrink-0 z-10 ${
job.generating_thumbnails_started_at || job.finished_at ? "bg-primary" : "bg-primary animate-pulse"
}`} />
<div className="flex-1 min-w-0">
<span className="text-sm font-medium text-foreground">{t("jobDetail.phase2a")}</span>
<p className="text-xs text-muted-foreground">{new Date(job.phase2_started_at).toLocaleString(locale)}</p>
<p className="text-xs text-primary/80 font-medium mt-0.5">
{t("jobDetail.duration", { duration: formatDuration(job.phase2_started_at, job.generating_thumbnails_started_at ?? job.finished_at ?? null) })}
{!job.generating_thumbnails_started_at && !job.finished_at && isExtractingPages && (
<span className="text-muted-foreground font-normal ml-1">· {t("jobDetail.inProgress")}</span>
)}
</p>
</div>
</div>
)}
{/* Phase 2b — Generating thumbnails */}
{(job.generating_thumbnails_started_at || (job.phase2_started_at && isThumbnailOnly)) && (
<div className="flex items-start gap-4">
<div className={`w-3.5 h-3.5 rounded-full mt-0.5 shrink-0 z-10 ${
job.finished_at ? "bg-primary" : "bg-primary animate-pulse"
}`} />
<div className="flex-1 min-w-0">
<span className="text-sm font-medium text-foreground">
{isThumbnailOnly ? t("jobType.thumbnail_rebuild") : t("jobDetail.phase2b")}
</span>
<p className="text-xs text-muted-foreground">
{(job.generating_thumbnails_started_at ? new Date(job.generating_thumbnails_started_at) : job.phase2_started_at ? new Date(job.phase2_started_at) : null)?.toLocaleString(locale)}
</p>
{(job.generating_thumbnails_started_at || job.finished_at) && (
<p className="text-xs text-primary/80 font-medium mt-0.5">
{t("jobDetail.duration", { duration: formatDuration(
job.generating_thumbnails_started_at ?? job.phase2_started_at!,
job.finished_at ?? null
) })}
{job.total_files != null && job.total_files > 0 && (
<span className="text-muted-foreground font-normal ml-1">
· {job.processed_files ?? job.total_files} {t("jobType.thumbnail_rebuild").toLowerCase()}
</span>
)}
</p>
)}
{!job.finished_at && isThumbnailPhase && (
<span className="text-xs text-muted-foreground">{t("jobDetail.inProgress")}</span>
)}
</div>
</div>
)}
{/* Started — for jobs without phase2 (cbr_to_cbz, or no phase yet) */}
{job.started_at && !job.phase2_started_at && (
<div className="flex items-start gap-4">
<div className={`w-3.5 h-3.5 rounded-full mt-0.5 shrink-0 z-10 ${
job.finished_at ? "bg-primary" : "bg-primary animate-pulse"
}`} />
<div className="flex-1 min-w-0">
<span className="text-sm font-medium text-foreground">{t("jobDetail.started")}</span>
<p className="text-xs text-muted-foreground">{new Date(job.started_at).toLocaleString(locale)}</p>
</div>
</div>
)}
{/* Pending — not started yet */}
{!job.started_at && (
<div className="flex items-start gap-4">
<div className="w-3.5 h-3.5 rounded-full mt-0.5 bg-warning shrink-0 z-10" />
<div className="flex-1 min-w-0">
<span className="text-sm font-medium text-foreground">{t("jobDetail.pendingStart")}</span>
</div>
</div>
)}
{/* Finished */}
{job.finished_at && (
<div className="flex items-start gap-4">
<div className={`w-3.5 h-3.5 rounded-full mt-0.5 shrink-0 z-10 ${
isCompleted ? "bg-success" : isFailed ? "bg-destructive" : "bg-muted"
}`} />
<div className="flex-1 min-w-0">
<span className="text-sm font-medium text-foreground">
{isCompleted ? t("jobDetail.finished") : isFailed ? t("jobDetail.failed") : t("jobDetail.cancelled")}
</span>
<p className="text-xs text-muted-foreground">{new Date(job.finished_at).toLocaleString(locale)}</p>
</div>
</div>
)}
</div>
</div>
</CardContent>
</Card>
{/* Progress Card */}
{showProgressCard && (
<Card>
<CardHeader>
<CardTitle>{progressTitle}</CardTitle>
{progressDescription && <CardDescription>{progressDescription}</CardDescription>}
</CardHeader>
<CardContent>
{job.total_files != null && job.total_files > 0 && (
<>
<ProgressBar value={job.progress_percent || 0} showLabel size="lg" className="mb-4" />
<div className="grid grid-cols-3 gap-4">
<StatBox
value={job.processed_files ?? 0}
label={isThumbnailOnly || isPhase2 ? t("jobDetail.generated") : t("jobDetail.processed")}
variant="primary"
/>
<StatBox value={job.total_files} label={t("jobDetail.total")} />
<StatBox
value={Math.max(0, job.total_files - (job.processed_files ?? 0))}
label={t("jobDetail.remaining")}
variant={isCompleted ? "default" : "warning"}
/>
</div>
</>
)}
{job.current_file && (
<div className="mt-4 p-3 bg-muted/50 rounded-lg">
<span className="text-xs text-muted-foreground uppercase tracking-wide">{t("jobDetail.currentFile")}</span>
<code className="block mt-1 text-xs font-mono text-foreground break-all">{job.current_file}</code>
</div>
)}
</CardContent>
</Card>
)}
{/* Index Statistics — index jobs only */}
{job.stats_json && !isThumbnailOnly && !isMetadataBatch && !isMetadataRefresh && (
<Card>
<CardHeader>
<CardTitle>{t("jobDetail.indexStats")}</CardTitle>
{job.started_at && (
<CardDescription>
{formatDuration(job.started_at, job.finished_at)}
{speedCount > 0 && ` · ${formatSpeed(speedCount, durationMs)} scan rate`}
</CardDescription>
)}
</CardHeader>
<CardContent>
<div className="grid grid-cols-2 sm:grid-cols-5 gap-4">
<StatBox value={job.stats_json.scanned_files} label={t("jobDetail.scanned")} variant="success" />
<StatBox value={job.stats_json.indexed_files} label={t("jobDetail.indexed")} variant="primary" />
<StatBox value={job.stats_json.removed_files} label={t("jobDetail.removed")} variant="warning" />
<StatBox value={job.stats_json.warnings ?? 0} label={t("jobDetail.warnings")} variant={(job.stats_json.warnings ?? 0) > 0 ? "warning" : "default"} />
<StatBox value={job.stats_json.errors} label={t("jobDetail.errors")} variant={job.stats_json.errors > 0 ? "error" : "default"} />
</div>
</CardContent>
</Card>
)}
{/* Thumbnail statistics — thumbnail-only jobs, completed */}
{isThumbnailOnly && isCompleted && job.total_files != null && (
<Card>
<CardHeader>
<CardTitle>{t("jobDetail.thumbnailStats")}</CardTitle>
{job.started_at && (
<CardDescription>
{formatDuration(job.started_at, job.finished_at)}
{speedCount > 0 && ` · ${formatSpeed(speedCount, durationMs)} thumbnails/s`}
</CardDescription>
)}
</CardHeader>
<CardContent>
<div className="grid grid-cols-2 gap-4">
<StatBox value={job.processed_files ?? job.total_files} label={t("jobDetail.generated")} variant="success" />
<StatBox value={job.total_files} label={t("jobDetail.total")} />
</div>
</CardContent>
</Card>
)}
{/* Metadata batch report */}
{isMetadataBatch && batchReport && (
<Card>
<CardHeader>
<CardTitle>{t("jobDetail.batchReport")}</CardTitle>
<CardDescription>{t("jobDetail.seriesAnalyzed", { count: String(batchReport.total_series) })}</CardDescription>
</CardHeader>
<CardContent>
<div className="grid grid-cols-2 sm:grid-cols-3 gap-4">
<StatBox value={batchReport.auto_matched} label={t("jobDetail.autoMatched")} variant="success" />
<StatBox value={batchReport.already_linked} label={t("jobDetail.alreadyLinked")} variant="primary" />
<StatBox value={batchReport.no_results} label={t("jobDetail.noResults")} />
<StatBox value={batchReport.too_many_results} label={t("jobDetail.tooManyResults")} variant="warning" />
<StatBox value={batchReport.low_confidence} label={t("jobDetail.lowConfidence")} variant="warning" />
<StatBox value={batchReport.errors} label={t("jobDetail.errors")} variant={batchReport.errors > 0 ? "error" : "default"} />
</div>
</CardContent>
</Card>
)}
{/* Metadata refresh report */}
{isMetadataRefresh && refreshReport && (
<Card>
<CardHeader>
<CardTitle>{t("jobDetail.refreshReport")}</CardTitle>
<CardDescription>{t("jobDetail.refreshReportDesc", { count: String(refreshReport.total_links) })}</CardDescription>
</CardHeader>
<CardContent>
<div className="grid grid-cols-2 sm:grid-cols-4 gap-4">
<StatBox
value={refreshReport.refreshed}
label={t("jobDetail.refreshed")}
variant="success"
icon={
<svg className="w-6 h-6 text-success" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M4 4v5h.582m15.356 2A8.001 8.001 0 004.582 9m0 0H9m11 11v-5h-.581m0 0a8.003 8.003 0 01-15.357-2m15.357 2H15" />
</svg>
}
/>
<StatBox value={refreshReport.unchanged} label={t("jobDetail.unchanged")} />
<StatBox value={refreshReport.errors} label={t("jobDetail.errors")} variant={refreshReport.errors > 0 ? "error" : "default"} />
<StatBox value={refreshReport.total_links} label={t("jobDetail.total")} />
</div>
</CardContent>
</Card>
)}
{/* Metadata refresh changes detail */}
{isMetadataRefresh && refreshReport && refreshReport.changes.length > 0 && (
<Card className="lg:col-span-2">
<CardHeader>
<CardTitle>{t("jobDetail.refreshChanges")}</CardTitle>
<CardDescription>{t("jobDetail.refreshChangesDesc", { count: String(refreshReport.changes.length) })}</CardDescription>
</CardHeader>
<CardContent className="space-y-3 max-h-[600px] overflow-y-auto">
{refreshReport.changes.map((r, idx) => (
<div
key={idx}
className={`p-3 rounded-lg border ${
r.status === "updated" ? "bg-success/10 border-success/20" :
r.status === "error" ? "bg-destructive/10 border-destructive/20" :
"bg-muted/50 border-border/60"
}`}
>
<div className="flex items-center justify-between gap-2">
{job.library_id ? (
<Link
href={`/libraries/${job.library_id}/series/${encodeURIComponent(r.series_name)}`}
className="font-medium text-sm text-primary hover:underline truncate"
>
{r.series_name}
</Link>
) : (
<span className="font-medium text-sm text-foreground truncate">{r.series_name}</span>
)}
<div className="flex items-center gap-2">
<span className="text-[10px] text-muted-foreground">{r.provider}</span>
<span className={`text-[10px] px-1.5 py-0.5 rounded-full font-medium whitespace-nowrap ${
r.status === "updated" ? "bg-success/20 text-success" :
r.status === "error" ? "bg-destructive/20 text-destructive" :
"bg-muted text-muted-foreground"
}`}>
{r.status === "updated" ? t("jobDetail.refreshed") :
r.status === "error" ? t("common.error") :
t("jobDetail.unchanged")}
</span>
</div>
</div>
{r.error && (
<p className="text-xs text-destructive/80 mt-1">{r.error}</p>
)}
{/* Series field changes */}
{r.series_changes.length > 0 && (
<div className="mt-2">
<span className="text-[10px] uppercase tracking-wide text-muted-foreground font-semibold">{t("metadata.seriesLabel")}</span>
<div className="mt-1 space-y-1">
{r.series_changes.map((c, ci) => (
<div key={ci} className="flex items-start gap-2 text-xs">
<span className="font-medium text-foreground shrink-0 w-24">{t(`field.${c.field}` as never) || c.field}</span>
<span className="text-muted-foreground line-through truncate max-w-[200px]" title={String(c.old ?? "—")}>
{c.old != null ? (Array.isArray(c.old) ? (c.old as string[]).join(", ") : String(c.old)) : "—"}
</span>
<span className="text-success shrink-0">→</span>
<span className="text-success truncate max-w-[200px]" title={String(c.new ?? "—")}>
{c.new != null ? (Array.isArray(c.new) ? (c.new as string[]).join(", ") : String(c.new)) : "—"}
</span>
</div>
))}
</div>
</div>
)}
{/* Book field changes */}
{r.book_changes.length > 0 && (
<div className="mt-2">
<span className="text-[10px] uppercase tracking-wide text-muted-foreground font-semibold">
{t("metadata.booksLabel")} ({r.book_changes.length})
</span>
<div className="mt-1 space-y-2">
{r.book_changes.map((b, bi) => (
<div key={bi} className="pl-2 border-l-2 border-border/60">
<Link
href={`/books/${b.book_id}`}
className="text-xs text-primary hover:underline font-medium"
>
{b.volume != null && <span className="text-muted-foreground mr-1">T.{b.volume}</span>}
{b.title}
</Link>
<div className="mt-0.5 space-y-0.5">
{b.changes.map((c, ci) => (
<div key={ci} className="flex items-start gap-2 text-xs">
<span className="font-medium text-foreground shrink-0 w-24">{t(`field.${c.field}` as never) || c.field}</span>
<span className="text-muted-foreground line-through truncate max-w-[150px]" title={String(c.old ?? "—")}>
{c.old != null ? (Array.isArray(c.old) ? (c.old as string[]).join(", ") : String(c.old).substring(0, 60)) : "—"}
</span>
<span className="text-success shrink-0">→</span>
<span className="text-success truncate max-w-[150px]" title={String(c.new ?? "—")}>
{c.new != null ? (Array.isArray(c.new) ? (c.new as string[]).join(", ") : String(c.new).substring(0, 60)) : "—"}
</span>
</div>
))}
</div>
</div>
))}
</div>
</div>
)}
</div>
))}
</CardContent>
</Card>
)}
{/* Metadata batch results */}
{isMetadataBatch && batchResults.length > 0 && (
<Card className="lg:col-span-2">
<CardHeader>
<CardTitle>{t("jobDetail.resultsBySeries")}</CardTitle>
<CardDescription>{t("jobDetail.seriesProcessed", { count: String(batchResults.length) })}</CardDescription>
</CardHeader>
<CardContent className="space-y-2 max-h-[600px] overflow-y-auto">
{batchResults.map((r) => (
<div
key={r.id}
className={`p-3 rounded-lg border ${
r.status === "auto_matched" ? "bg-success/10 border-success/20" :
r.status === "already_linked" ? "bg-primary/10 border-primary/20" :
r.status === "error" ? "bg-destructive/10 border-destructive/20" :
"bg-muted/50 border-border/60"
}`}
>
<div className="flex items-center justify-between gap-2">
{job.library_id ? (
<Link
href={`/libraries/${job.library_id}/series/${encodeURIComponent(r.series_name)}`}
className="font-medium text-sm text-primary hover:underline truncate"
>
{r.series_name}
</Link>
) : (
<span className="font-medium text-sm text-foreground truncate">{r.series_name}</span>
)}
<span className={`text-[10px] px-1.5 py-0.5 rounded-full font-medium whitespace-nowrap ${
r.status === "auto_matched" ? "bg-success/20 text-success" :
r.status === "already_linked" ? "bg-primary/20 text-primary" :
r.status === "no_results" ? "bg-muted text-muted-foreground" :
r.status === "too_many_results" || r.status === "low_confidence" ? "bg-amber-500/15 text-amber-600" :
r.status === "error" ? "bg-destructive/20 text-destructive" :
"bg-muted text-muted-foreground"
}`}>
{r.status === "auto_matched" ? t("jobDetail.autoMatched") :
r.status === "already_linked" ? t("jobDetail.alreadyLinked") :
r.status === "no_results" ? t("jobDetail.noResults") :
r.status === "too_many_results" ? t("jobDetail.tooManyResults") :
r.status === "low_confidence" ? t("jobDetail.lowConfidence") :
r.status === "error" ? t("common.error") :
r.status}
</span>
</div>
<div className="flex items-center gap-3 mt-1 text-xs text-muted-foreground">
{r.provider_used && (
<span>{r.provider_used}{r.fallback_used ? ` ${t("metadata.fallbackUsed")}` : ""}</span>
)}
{r.candidates_count > 0 && (
<span>{r.candidates_count} {t("jobDetail.candidates", { plural: r.candidates_count > 1 ? "s" : "" })}</span>
)}
{r.best_confidence != null && (
<span>{Math.round(r.best_confidence * 100)}% {t("jobDetail.confidence")}</span>
)}
</div>
{r.best_candidate_json && (
<p className="text-xs text-muted-foreground mt-1">
{t("jobDetail.match", { title: (r.best_candidate_json as { title?: string }).title || JSON.stringify(r.best_candidate_json) })}
</p>
)}
{r.error_message && (
<p className="text-xs text-destructive/80 mt-1">{r.error_message}</p>
)}
</div>
))}
</CardContent>
</Card>
)}
{/* File errors */}
{errors.length > 0 && (
<Card className="lg:col-span-2">
<CardHeader>
<CardTitle>{t("jobDetail.fileErrors", { count: String(errors.length) })}</CardTitle>
<CardDescription>{t("jobDetail.fileErrorsDesc")}</CardDescription>
</CardHeader>
<CardContent className="space-y-2 max-h-80 overflow-y-auto">
{errors.map((error) => (
<div key={error.id} className="p-3 bg-destructive/10 rounded-lg border border-destructive/20">
<code className="block text-sm font-mono text-destructive mb-1">{error.file_path}</code>
<p className="text-sm text-destructive/80">{error.error_message}</p>
<span className="text-xs text-muted-foreground">{new Date(error.created_at).toLocaleString(locale)}</span>
</div>
))}
</CardContent>
</Card>
)}
</div>
</>
);
}
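
The component above imports `formatDuration` and `formatSpeed` from elsewhere in the app; their implementations are not part of this diff. A minimal sketch of the behavior the call sites imply — a null `finished_at` means "still running", and `formatSpeed` returns just the rate so callers can append their own unit ("thumbnails/s", "scan rate") — with signatures and bodies that are assumptions, not the project's actual code:

```typescript
// Hypothetical helpers matching the call sites in the component above.
// Names come from the diff; the bodies are illustrative assumptions.

// Human-readable elapsed time between two ISO timestamps.
// A null end timestamp means the job is still running, so measure against now.
export function formatDuration(startedAt: string, finishedAt: string | null): string {
  const end = finishedAt ? new Date(finishedAt).getTime() : Date.now();
  const totalSec = Math.max(0, Math.round((end - new Date(startedAt).getTime()) / 1000));
  const h = Math.floor(totalSec / 3600);
  const m = Math.floor((totalSec % 3600) / 60);
  const s = totalSec % 60;
  if (h > 0) return `${h}h ${m}m ${s}s`;
  if (m > 0) return `${m}m ${s}s`;
  return `${s}s`;
}

// Items processed per second, as a bare number string; the caller appends
// the unit. Guards against a zero or missing duration.
export function formatSpeed(count: number, durationMs: number): string {
  if (durationMs <= 0) return "0";
  return (count / (durationMs / 1000)).toFixed(1);
}
```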


@@ -0,0 +1,269 @@
import { revalidatePath } from "next/cache";
import { redirect } from "next/navigation";
import { listJobs, fetchLibraries, rebuildIndex, rebuildThumbnails, regenerateThumbnails, startMetadataBatch, startMetadataRefresh, IndexJobDto, LibraryDto } from "@/lib/api";
import { JobsList } from "@/app/components/JobsList";
import { Card, CardHeader, CardTitle, CardDescription, CardContent, FormField, FormSelect } from "@/app/components/ui";
import { getServerTranslations } from "@/lib/i18n/server";
export const dynamic = "force-dynamic";
export default async function JobsPage({ searchParams }: { searchParams: Promise<{ highlight?: string }> }) {
const { highlight } = await searchParams;
const { t } = await getServerTranslations();
const [jobs, libraries] = await Promise.all([
listJobs().catch(() => [] as IndexJobDto[]),
fetchLibraries().catch(() => [] as LibraryDto[])
]);
const libraryMap = new Map(libraries.map(l => [l.id, l.name]));
async function triggerRebuild(formData: FormData) {
"use server";
const libraryId = formData.get("library_id") as string;
const result = await rebuildIndex(libraryId || undefined);
revalidatePath("/jobs");
redirect(`/jobs?highlight=${result.id}`);
}
async function triggerFullRebuild(formData: FormData) {
"use server";
const libraryId = formData.get("library_id") as string;
const result = await rebuildIndex(libraryId || undefined, true);
revalidatePath("/jobs");
redirect(`/jobs?highlight=${result.id}`);
}
async function triggerRescan(formData: FormData) {
"use server";
const libraryId = formData.get("library_id") as string;
const result = await rebuildIndex(libraryId || undefined, false, true);
revalidatePath("/jobs");
redirect(`/jobs?highlight=${result.id}`);
}
async function triggerThumbnailsRebuild(formData: FormData) {
"use server";
const libraryId = formData.get("library_id") as string;
const result = await rebuildThumbnails(libraryId || undefined);
revalidatePath("/jobs");
redirect(`/jobs?highlight=${result.id}`);
}
async function triggerThumbnailsRegenerate(formData: FormData) {
"use server";
const libraryId = formData.get("library_id") as string;
const result = await regenerateThumbnails(libraryId || undefined);
revalidatePath("/jobs");
redirect(`/jobs?highlight=${result.id}`);
}
async function triggerMetadataBatch(formData: FormData) {
"use server";
const libraryId = formData.get("library_id") as string;
if (libraryId) {
let result;
try {
result = await startMetadataBatch(libraryId);
} catch {
// Library may have metadata disabled — ignore silently
return;
}
revalidatePath("/jobs");
redirect(`/jobs?highlight=${result.id}`);
} else {
// All libraries — skip those with metadata disabled
const allLibraries = await fetchLibraries().catch(() => [] as LibraryDto[]);
let lastId: string | undefined;
for (const lib of allLibraries) {
if (lib.metadata_provider === "none") continue;
try {
const result = await startMetadataBatch(lib.id);
if (result.status !== "already_running") lastId = result.id;
} catch {
// Library may have metadata disabled or other issue — skip
}
}
revalidatePath("/jobs");
redirect(lastId ? `/jobs?highlight=${lastId}` : "/jobs");
}
}
async function triggerMetadataRefresh(formData: FormData) {
"use server";
const libraryId = formData.get("library_id") as string;
if (libraryId) {
let result;
try {
result = await startMetadataRefresh(libraryId);
} catch {
return;
}
revalidatePath("/jobs");
redirect(`/jobs?highlight=${result.id}`);
} else {
// All libraries — skip those with metadata disabled
const allLibraries = await fetchLibraries().catch(() => [] as LibraryDto[]);
let lastId: string | undefined;
for (const lib of allLibraries) {
if (lib.metadata_provider === "none") continue;
try {
const result = await startMetadataRefresh(lib.id);
if (result.status !== "already_running") lastId = result.id;
} catch {
// Library may have metadata disabled or no approved links — skip
}
}
revalidatePath("/jobs");
redirect(lastId ? `/jobs?highlight=${lastId}` : "/jobs");
}
}
return (
<>
<div className="mb-6">
<h1 className="text-3xl font-bold text-foreground flex items-center gap-3">
<svg className="w-8 h-8 text-warning" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M13 10V3L4 14h7v7l9-11h-7z" />
</svg>
{t("jobs.title")}
</h1>
</div>
<Card className="mb-6">
<CardHeader>
<CardTitle>{t("jobs.startJob")}</CardTitle>
<CardDescription>{t("jobs.startJobDescription")}</CardDescription>
</CardHeader>
<CardContent>
<form>
<div className="mb-6">
<FormField className="max-w-xs">
<FormSelect name="library_id" defaultValue="">
<option value="">{t("jobs.allLibraries")}</option>
{libraries.map((lib) => (
<option key={lib.id} value={lib.id}>{lib.name}</option>
))}
</FormSelect>
</FormField>
</div>
<div className="grid grid-cols-1 lg:grid-cols-3 gap-6">
{/* Indexation group */}
<div className="space-y-3">
<div className="flex items-center gap-2 text-sm font-semibold text-foreground">
<svg className="w-4 h-4 text-primary" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M3 7v10a2 2 0 002 2h14a2 2 0 002-2V9a2 2 0 00-2-2h-6l-2-2H5a2 2 0 00-2 2z" />
</svg>
{t("jobs.groupIndexation")}
</div>
<div className="space-y-2">
<button type="submit" formAction={triggerRebuild}
className="w-full text-left rounded-lg border border-input bg-background p-3 hover:bg-accent/50 transition-colors group cursor-pointer">
<div className="flex items-center gap-2">
<svg className="w-4 h-4 text-primary shrink-0" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M4 4v5h.582m15.356 2A8.001 8.001 0 004.582 9m0 0H9m11 11v-5h-.581m0 0a8.003 8.003 0 01-15.357-2m15.357 2H15" />
</svg>
<span className="font-medium text-sm text-foreground">{t("jobs.rebuild")}</span>
</div>
<p className="text-xs text-muted-foreground mt-1 ml-6">{t("jobs.rebuildShort")}</p>
</button>
<button type="submit" formAction={triggerRescan}
className="w-full text-left rounded-lg border border-input bg-background p-3 hover:bg-accent/50 transition-colors group cursor-pointer">
<div className="flex items-center gap-2">
<svg className="w-4 h-4 text-primary shrink-0" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M21 21l-6-6m2-5a7 7 0 11-14 0 7 7 0 0114 0z" />
</svg>
<span className="font-medium text-sm text-foreground">{t("jobs.rescan")}</span>
</div>
<p className="text-xs text-muted-foreground mt-1 ml-6">{t("jobs.rescanShort")}</p>
</button>
<button type="submit" formAction={triggerFullRebuild}
className="w-full text-left rounded-lg border border-destructive/30 bg-destructive/5 p-3 hover:bg-destructive/10 transition-colors group cursor-pointer">
<div className="flex items-center gap-2">
<svg className="w-4 h-4 text-destructive shrink-0" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M12 9v2m0 4h.01m-6.938 4h13.856c1.54 0 2.502-1.667 1.732-3L13.732 4c-.77-1.333-2.694-1.333-3.464 0L3.34 16c-.77 1.333.192 3 1.732 3z" />
</svg>
<span className="font-medium text-sm text-destructive">{t("jobs.fullRebuild")}</span>
</div>
<p className="text-xs text-muted-foreground mt-1 ml-6">{t("jobs.fullRebuildShort")}</p>
</button>
</div>
</div>
{/* Thumbnails group */}
<div className="space-y-3">
<div className="flex items-center gap-2 text-sm font-semibold text-foreground">
<svg className="w-4 h-4 text-primary" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M4 16l4.586-4.586a2 2 0 012.828 0L16 16m-2-2l1.586-1.586a2 2 0 012.828 0L20 14m-6-6h.01M6 20h12a2 2 0 002-2V6a2 2 0 00-2-2H6a2 2 0 00-2 2v12a2 2 0 002 2z" />
</svg>
{t("jobs.groupThumbnails")}
</div>
<div className="space-y-2">
<button type="submit" formAction={triggerThumbnailsRebuild}
className="w-full text-left rounded-lg border border-input bg-background p-3 hover:bg-accent/50 transition-colors group cursor-pointer">
<div className="flex items-center gap-2">
<svg className="w-4 h-4 text-primary shrink-0" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M12 6v6m0 0v6m0-6h6m-6 0H6" />
</svg>
<span className="font-medium text-sm text-foreground">{t("jobs.generateThumbnails")}</span>
</div>
<p className="text-xs text-muted-foreground mt-1 ml-6">{t("jobs.generateThumbnailsShort")}</p>
</button>
<button type="submit" formAction={triggerThumbnailsRegenerate}
className="w-full text-left rounded-lg border border-warning/30 bg-warning/5 p-3 hover:bg-warning/10 transition-colors group cursor-pointer">
<div className="flex items-center gap-2">
<svg className="w-4 h-4 text-warning shrink-0" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M12 9v2m0 4h.01m-6.938 4h13.856c1.54 0 2.502-1.667 1.732-3L13.732 4c-.77-1.333-2.694-1.333-3.464 0L3.34 16c-.77 1.333.192 3 1.732 3z" />
</svg>
<span className="font-medium text-sm text-warning">{t("jobs.regenerateThumbnails")}</span>
</div>
<p className="text-xs text-muted-foreground mt-1 ml-6">{t("jobs.regenerateThumbnailsShort")}</p>
</button>
</div>
</div>
{/* Metadata group */}
<div className="space-y-3">
<div className="flex items-center gap-2 text-sm font-semibold text-foreground">
<svg className="w-4 h-4 text-primary" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M7 7h.01M7 3h5c.512 0 1.024.195 1.414.586l7 7a2 2 0 010 2.828l-7 7a2 2 0 01-2.828 0l-7-7A1.994 1.994 0 013 12V7a4 4 0 014-4z" />
</svg>
{t("jobs.groupMetadata")}
</div>
<div className="space-y-2">
<button type="submit" formAction={triggerMetadataBatch}
className="w-full text-left rounded-lg border border-input bg-background p-3 hover:bg-accent/50 transition-colors group cursor-pointer disabled:opacity-50 disabled:cursor-not-allowed disabled:hover:bg-background">
<div className="flex items-center gap-2">
<svg className="w-4 h-4 text-primary shrink-0" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M21 21l-6-6m2-5a7 7 0 11-14 0 7 7 0 0114 0z" />
</svg>
<span className="font-medium text-sm text-foreground">{t("jobs.batchMetadata")}</span>
</div>
<p className="text-xs text-muted-foreground mt-1 ml-6">{t("jobs.batchMetadataShort")}</p>
</button>
<button type="submit" formAction={triggerMetadataRefresh}
className="w-full text-left rounded-lg border border-input bg-background p-3 hover:bg-accent/50 transition-colors group cursor-pointer disabled:opacity-50 disabled:cursor-not-allowed disabled:hover:bg-background">
<div className="flex items-center gap-2">
<svg className="w-4 h-4 text-primary shrink-0" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M4 4v5h.582m15.356 2A8.001 8.001 0 004.582 9m0 0H9m11 11v-5h-.581m0 0a8.003 8.003 0 01-15.357-2m15.357 2H15" />
</svg>
<span className="font-medium text-sm text-foreground">{t("jobs.refreshMetadata")}</span>
</div>
<p className="text-xs text-muted-foreground mt-1 ml-6">{t("jobs.refreshMetadataShort")}</p>
</button>
</div>
</div>
</div>
</form>
</CardContent>
</Card>
<JobsList
initialJobs={jobs}
libraries={libraryMap}
highlightJobId={highlight}
/>
</>
);
}

View File

@@ -0,0 +1,127 @@
import Image from "next/image";
import Link from "next/link";
import type { ReactNode } from "react";
import { cookies } from "next/headers";
import { revalidatePath } from "next/cache";
import { ThemeToggle } from "@/app/theme-toggle";
import { JobsIndicator } from "@/app/components/JobsIndicator";
import { NavIcon, Icon } from "@/app/components/ui";
import { LogoutButton } from "@/app/components/LogoutButton";
import { MobileNav } from "@/app/components/MobileNav";
import { UserSwitcher } from "@/app/components/UserSwitcher";
import { fetchUsers } from "@/lib/api";
import { getServerTranslations } from "@/lib/i18n/server";
import type { TranslationKey } from "@/lib/i18n/fr";
type NavItem = {
href: "/" | "/books" | "/series" | "/authors" | "/libraries" | "/jobs" | "/tokens" | "/settings";
labelKey: TranslationKey;
icon: "dashboard" | "books" | "series" | "authors" | "libraries" | "jobs" | "tokens" | "settings";
};
const navItems: NavItem[] = [
{ href: "/", labelKey: "nav.dashboard", icon: "dashboard" },
{ href: "/books", labelKey: "nav.books", icon: "books" },
{ href: "/series", labelKey: "nav.series", icon: "series" },
{ href: "/authors", labelKey: "nav.authors", icon: "authors" },
{ href: "/libraries", labelKey: "nav.libraries", icon: "libraries" },
{ href: "/jobs", labelKey: "nav.jobs", icon: "jobs" },
{ href: "/tokens", labelKey: "nav.tokens", icon: "tokens" },
];
export default async function AppLayout({ children }: { children: ReactNode }) {
const { t } = await getServerTranslations();
const cookieStore = await cookies();
const activeUserId = cookieStore.get("as_user_id")?.value || null;
const users = await fetchUsers().catch(() => []);
async function setActiveUserAction(formData: FormData) {
"use server";
const userId = formData.get("user_id") as string;
const store = await cookies();
if (userId) {
store.set("as_user_id", userId, { path: "/", httpOnly: false, sameSite: "lax" });
} else {
store.delete("as_user_id");
}
revalidatePath("/", "layout");
}
return (
<>
<header className="sticky top-0 z-50 w-full border-b border-border/40 bg-background/70 backdrop-blur-xl backdrop-saturate-150 supports-[backdrop-filter]:bg-background/60">
<nav className="container mx-auto flex h-16 items-center justify-between px-4">
<Link
href="/"
className="flex items-center gap-3 hover:opacity-80 transition-opacity duration-200"
>
<Image src="/logo.png" alt="StripStream" width={36} height={36} className="rounded-lg" />
<div className="flex items-baseline gap-2">
<span className="text-xl font-bold tracking-tight text-foreground">StripStream</span>
<span className="text-sm text-muted-foreground font-medium hidden xl:inline">
{t("common.backoffice")}
</span>
</div>
</Link>
<div className="flex items-center gap-2">
<div className="hidden md:flex items-center gap-1">
{navItems.map((item) => (
<NavLink key={item.href} href={item.href} title={t(item.labelKey)}>
<NavIcon name={item.icon} />
<span className="ml-2 hidden xl:inline">{t(item.labelKey)}</span>
</NavLink>
))}
</div>
<UserSwitcher
users={users}
activeUserId={activeUserId}
setActiveUserAction={setActiveUserAction}
/>
<div className="flex items-center gap-1 pl-4 ml-2 border-l border-border/60">
<JobsIndicator />
<Link
href="/settings"
className="hidden xl:flex p-2 rounded-lg text-muted-foreground hover:text-foreground hover:bg-accent transition-colors"
title={t("nav.settings")}
>
<Icon name="settings" size="md" />
</Link>
<ThemeToggle />
<LogoutButton />
<MobileNav navItems={navItems.map(item => ({ ...item, label: t(item.labelKey) }))} />
</div>
</div>
</nav>
</header>
<main className="container mx-auto px-4 sm:px-6 lg:px-8 py-8 pb-16">
{children}
</main>
</>
);
}
function NavLink({ href, title, children }: { href: NavItem["href"]; title?: string; children: React.ReactNode }) {
return (
<Link
href={href}
title={title}
className="
flex items-center
px-2 lg:px-3 py-2
rounded-lg
text-sm font-medium
text-muted-foreground
hover:text-foreground
hover:bg-accent
transition-colors duration-200
active:scale-[0.98]
"
>
{children}
</Link>
);
}

View File

@@ -1,8 +1,9 @@
import { fetchLibraries, fetchBooks, getBookCoverUrl, LibraryDto, BookDto } from "../../../../lib/api";
import { BooksGrid, EmptyState } from "../../../components/BookCard";
import { LibrarySubPageHeader } from "../../../components/LibrarySubPageHeader";
import { OffsetPagination } from "../../../components/ui";
import { fetchLibraries, fetchBooks, getBookCoverUrl, LibraryDto, BookDto } from "@/lib/api";
import { BooksGrid, EmptyState } from "@/app/components/BookCard";
import { LibrarySubPageHeader } from "@/app/components/LibrarySubPageHeader";
import { OffsetPagination } from "@/app/components/ui";
import { notFound } from "next/navigation";
import { getServerTranslations } from "@/lib/i18n/server";
export const dynamic = "force-dynamic";
@@ -14,6 +15,7 @@ export default async function LibraryBooksPage({
searchParams: Promise<{ [key: string]: string | string[] | undefined }>;
}) {
const { id } = await params;
const { t } = await getServerTranslations();
const searchParamsAwaited = await searchParams;
const page = typeof searchParamsAwaited.page === "string" ? parseInt(searchParamsAwaited.page) : 1;
const series = typeof searchParamsAwaited.series === "string" ? searchParamsAwaited.series : undefined;
@@ -38,14 +40,14 @@ export default async function LibraryBooksPage({
coverUrl: getBookCoverUrl(book.id)
}));
const seriesDisplayName = series === "unclassified" ? "Unclassified" : series;
const seriesDisplayName = series === "unclassified" ? t("books.unclassified") : (series ?? "");
const totalPages = Math.ceil(booksPage.total / limit);
return (
<div className="space-y-6">
<LibrarySubPageHeader
library={library}
title={series ? `Books in "${seriesDisplayName}"` : "All Books"}
title={series ? t("libraryBooks.booksOfSeries", { series: seriesDisplayName }) : t("libraryBooks.allBooks")}
icon={
<svg className="w-8 h-8" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M12 6.253v13m0-13C10.832 5.477 9.246 5 7.5 5S4.168 5.477 3 6.253v13C4.168 18.477 5.754 18 7.5 18s3.332.477 4.5 1.253m0-13C13.168 5.477 14.754 5 16.5 5c1.747 0 3.332.477 4.5 1.253v13C19.832 18.477 18.247 18 16.5 18c-1.746 0-3.332.477-4.5 1.253" />
@@ -53,9 +55,9 @@ export default async function LibraryBooksPage({
}
iconColor="text-success"
filterInfo={series ? {
label: `Showing books from series "${seriesDisplayName}"`,
label: t("libraryBooks.filterLabel", { series: seriesDisplayName }),
clearHref: `/libraries/${id}/books`,
clearLabel: "View all books"
clearLabel: t("libraryBooks.viewAll")
} : undefined}
/>
@@ -71,7 +73,7 @@ export default async function LibraryBooksPage({
/>
</>
) : (
<EmptyState message={series ? `No books in series "${seriesDisplayName}"` : "No books in this library yet"} />
<EmptyState message={series ? t("libraryBooks.noBooksInSeries", { series: seriesDisplayName }) : t("libraryBooks.noBooks")} />
)}
</div>
);

View File

@@ -1,12 +1,24 @@
import { fetchLibraries, fetchBooks, fetchSeriesMetadata, getBookCoverUrl, BookDto, SeriesMetadataDto } from "../../../../../lib/api";
import { BooksGrid, EmptyState } from "../../../../components/BookCard";
import { MarkSeriesReadButton } from "../../../../components/MarkSeriesReadButton";
import { MarkBookReadButton } from "../../../../components/MarkBookReadButton";
import { EditSeriesForm } from "../../../../components/EditSeriesForm";
import { OffsetPagination } from "../../../../components/ui";
import { fetchLibraries, fetchBooks, fetchSeriesMetadata, getBookCoverUrl, getMetadataLink, getMissingBooks, BookDto, SeriesMetadataDto, ExternalMetadataLinkDto, MissingBooksDto } from "@/lib/api";
import { BooksGrid, EmptyState } from "@/app/components/BookCard";
import { MarkSeriesReadButton } from "@/app/components/MarkSeriesReadButton";
import { MarkBookReadButton } from "@/app/components/MarkBookReadButton";
import nextDynamic from "next/dynamic";
import { OffsetPagination } from "@/app/components/ui";
import { SafeHtml } from "@/app/components/SafeHtml";
import Image from "next/image";
import Link from "next/link";
const EditSeriesForm = nextDynamic(
() => import("@/app/components/EditSeriesForm").then(m => m.EditSeriesForm)
);
const MetadataSearchModal = nextDynamic(
() => import("@/app/components/MetadataSearchModal").then(m => m.MetadataSearchModal)
);
const ProwlarrSearchModal = nextDynamic(
() => import("@/app/components/ProwlarrSearchModal").then(m => m.ProwlarrSearchModal)
);
import { notFound } from "next/navigation";
import { getServerTranslations } from "@/lib/i18n/server";
export const dynamic = "force-dynamic";
@@ -18,13 +30,14 @@ export default async function SeriesDetailPage({
searchParams: Promise<{ [key: string]: string | string[] | undefined }>;
}) {
const { id, name } = await params;
const { t } = await getServerTranslations();
const searchParamsAwaited = await searchParams;
const page = typeof searchParamsAwaited.page === "string" ? parseInt(searchParamsAwaited.page) : 1;
const limit = typeof searchParamsAwaited.limit === "string" ? parseInt(searchParamsAwaited.limit) : 50;
const seriesName = decodeURIComponent(name);
const [library, booksPage, seriesMeta] = await Promise.all([
const [library, booksPage, seriesMeta, metadataLinks] = await Promise.all([
fetchLibraries().then((libs) => libs.find((l) => l.id === id)),
fetchBooks(id, seriesName, page, limit).catch(() => ({
items: [] as BookDto[],
@@ -33,8 +46,15 @@ export default async function SeriesDetailPage({
limit,
})),
fetchSeriesMetadata(id, seriesName).catch(() => null as SeriesMetadataDto | null),
getMetadataLink(id, seriesName).catch(() => [] as ExternalMetadataLinkDto[]),
]);
const existingLink = metadataLinks.find((l) => l.status === "approved") ?? metadataLinks[0] ?? null;
let missingData: MissingBooksDto | null = null;
if (existingLink && existingLink.status === "approved") {
missingData = await getMissingBooks(existingLink.id).catch(() => null);
}
if (!library) {
notFound();
}
@@ -46,7 +66,7 @@ export default async function SeriesDetailPage({
const totalPages = Math.ceil(booksPage.total / limit);
const booksReadCount = booksPage.items.filter((b) => b.reading_status === "read").length;
const displayName = seriesName === "unclassified" ? "Non classifié" : seriesName;
const displayName = seriesName === "unclassified" ? t("books.unclassified") : seriesName;
// Use first book cover as series cover
const coverBookId = booksPage.items[0]?.id;
@@ -59,7 +79,7 @@ export default async function SeriesDetailPage({
href="/libraries"
className="text-muted-foreground hover:text-primary transition-colors"
>
Libraries
{t("nav.libraries")}
</Link>
<span className="text-muted-foreground">/</span>
<Link
@@ -79,10 +99,10 @@ export default async function SeriesDetailPage({
<div className="w-40 aspect-[2/3] relative rounded-xl overflow-hidden shadow-card border border-border">
<Image
src={getBookCoverUrl(coverBookId)}
alt={`Cover of ${displayName}`}
alt={t("books.coverOf", { name: displayName })}
fill
className="object-cover"
unoptimized
sizes="160px"
/>
</div>
</div>
@@ -91,12 +111,25 @@ export default async function SeriesDetailPage({
<div className="flex-1 space-y-4">
<h1 className="text-3xl font-bold text-foreground">{displayName}</h1>
<div className="flex flex-wrap items-center gap-3">
{seriesMeta && seriesMeta.authors.length > 0 && (
<p className="text-base text-muted-foreground">{seriesMeta.authors.join(", ")}</p>
)}
{seriesMeta?.status && (
<span className={`inline-flex items-center px-2.5 py-0.5 rounded-full text-xs font-medium ${
seriesMeta.status === "ongoing" ? "bg-blue-500/15 text-blue-600" :
seriesMeta.status === "ended" ? "bg-green-500/15 text-green-600" :
seriesMeta.status === "hiatus" ? "bg-amber-500/15 text-amber-600" :
seriesMeta.status === "cancelled" ? "bg-red-500/15 text-red-600" :
"bg-muted text-muted-foreground"
}`}>
{t(`seriesStatus.${seriesMeta.status}` as any) || seriesMeta.status}
</span>
)}
</div>
{seriesMeta?.description && (
<p className="text-sm text-muted-foreground leading-relaxed">{seriesMeta.description}</p>
<SafeHtml html={seriesMeta.description} className="text-sm text-muted-foreground leading-relaxed" />
)}
<div className="flex flex-wrap items-center gap-4 text-sm">
@@ -110,14 +143,14 @@ export default async function SeriesDetailPage({
)}
{((seriesMeta && seriesMeta.publishers.length > 0) || seriesMeta?.start_year) && <span className="w-px h-4 bg-border" />}
<span className="text-muted-foreground">
<span className="font-semibold text-foreground">{booksPage.total}</span> livre{booksPage.total !== 1 ? "s" : ""}
<span className="font-semibold text-foreground">{booksPage.total}</span> {t("dashboard.books").toLowerCase()}
</span>
<span className="w-px h-4 bg-border" />
<span className="text-muted-foreground">
<span className="font-semibold text-foreground">{booksReadCount}</span>/{booksPage.total} lu{booksPage.total !== 1 ? "s" : ""}
{t("series.readCount", { read: String(booksReadCount), total: String(booksPage.total), plural: booksPage.total !== 1 ? "s" : "" })}
</span>
{/* Progress bar */}
{/* Reading progress bar */}
<div className="flex items-center gap-2 flex-1 min-w-[120px] max-w-[200px]">
<div className="flex-1 h-2 bg-muted rounded-full overflow-hidden">
<div
@@ -126,6 +159,22 @@ export default async function SeriesDetailPage({
/>
</div>
</div>
{/* Collection progress bar (owned / expected) */}
{missingData && missingData.total_external > 0 && (
<>
<span className="w-px h-4 bg-border" />
<span className="text-muted-foreground">
{booksPage.total}/{missingData.total_external} {t("series.missingCount", { count: String(missingData.missing_count), plural: missingData.missing_count !== 1 ? "s" : "" })}
</span>
<div className="w-[150px] h-2 bg-muted rounded-full overflow-hidden">
<div
className="h-full bg-amber-500 rounded-full transition-all"
style={{ width: `${Math.min(100, Math.round((booksPage.total / missingData.total_external) * 100))}%` }}
/>
</div>
</>
)}
</div>
<div className="flex flex-wrap items-center gap-3">
@@ -143,6 +192,19 @@ export default async function SeriesDetailPage({
currentBookLanguage={seriesMeta?.book_language ?? booksPage.items[0]?.language ?? null}
currentDescription={seriesMeta?.description ?? null}
currentStartYear={seriesMeta?.start_year ?? null}
currentTotalVolumes={seriesMeta?.total_volumes ?? null}
currentStatus={seriesMeta?.status ?? null}
currentLockedFields={seriesMeta?.locked_fields ?? {}}
/>
<ProwlarrSearchModal
seriesName={seriesName}
missingBooks={missingData?.missing_books ?? null}
/>
<MetadataSearchModal
libraryId={id}
seriesName={seriesName}
existingLink={existingLink}
initialMissing={missingData}
/>
</div>
</div>
@@ -160,7 +222,7 @@ export default async function SeriesDetailPage({
/>
</>
) : (
<EmptyState message="Aucun livre dans cette série" />
<EmptyState message={t("librarySeries.noBooksInSeries")} />
)}
</div>
);

View File

@@ -0,0 +1,144 @@
import { fetchLibraries, fetchSeries, fetchSeriesStatuses, getBookCoverUrl, LibraryDto, SeriesDto, SeriesPageDto } from "@/lib/api";
import { OffsetPagination } from "@/app/components/ui";
import { MarkSeriesReadButton } from "@/app/components/MarkSeriesReadButton";
import { SeriesFilters } from "@/app/components/SeriesFilters";
import Image from "next/image";
import Link from "next/link";
import { notFound } from "next/navigation";
import { LibrarySubPageHeader } from "@/app/components/LibrarySubPageHeader";
import { getServerTranslations } from "@/lib/i18n/server";
export const dynamic = "force-dynamic";
export default async function LibrarySeriesPage({
params,
searchParams
}: {
params: Promise<{ id: string }>;
searchParams: Promise<{ [key: string]: string | string[] | undefined }>;
}) {
const { id } = await params;
const { t } = await getServerTranslations();
const searchParamsAwaited = await searchParams;
const page = typeof searchParamsAwaited.page === "string" ? parseInt(searchParamsAwaited.page) : 1;
const limit = typeof searchParamsAwaited.limit === "string" ? parseInt(searchParamsAwaited.limit) : 20;
const seriesStatus = typeof searchParamsAwaited.series_status === "string" ? searchParamsAwaited.series_status : undefined;
const hasMissing = searchParamsAwaited.has_missing === "true";
const [library, seriesPage, dbStatuses] = await Promise.all([
fetchLibraries().then(libs => libs.find(l => l.id === id)),
fetchSeries(id, page, limit, seriesStatus, hasMissing).catch(() => ({ items: [] as SeriesDto[], total: 0, page: 1, limit }) as SeriesPageDto),
fetchSeriesStatuses().catch(() => [] as string[]),
]);
if (!library) {
notFound();
}
const series = seriesPage.items;
const totalPages = Math.ceil(seriesPage.total / limit);
const KNOWN_STATUSES: Record<string, string> = {
ongoing: t("seriesStatus.ongoing"),
ended: t("seriesStatus.ended"),
hiatus: t("seriesStatus.hiatus"),
cancelled: t("seriesStatus.cancelled"),
upcoming: t("seriesStatus.upcoming"),
};
const seriesStatusOptions = [
{ value: "", label: t("seriesStatus.allStatuses") },
...dbStatuses.map((s) => ({ value: s, label: KNOWN_STATUSES[s] || s })),
];
return (
<div className="space-y-6">
<LibrarySubPageHeader
library={library}
title={t("series.title")}
icon={
<svg className="w-8 h-8" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M19 11H5m14 0a2 2 0 012 2v6a2 2 0 01-2 2H5a2 2 0 01-2-2v-6a2 2 0 012-2m14 0V9a2 2 0 00-2-2M5 11V9a2 2 0 012-2m0 0V5a2 2 0 012-2h6a2 2 0 012 2v2M7 7h10" />
</svg>
}
iconColor="text-primary"
/>
<SeriesFilters
basePath={`/libraries/${id}/series`}
currentSeriesStatus={seriesStatus}
currentHasMissing={hasMissing}
seriesStatusOptions={seriesStatusOptions}
/>
{series.length > 0 ? (
<>
<div className="grid grid-cols-2 sm:grid-cols-3 md:grid-cols-4 lg:grid-cols-5 gap-6">
{series.map((s) => (
<Link
key={s.name}
href={`/libraries/${id}/series/${encodeURIComponent(s.name)}`}
className="group"
>
<div className={`bg-card rounded-xl shadow-sm border border-border/60 overflow-hidden hover:shadow-md transition-shadow duration-200 ${s.book_count > 0 && s.books_read_count >= s.book_count ? "opacity-50" : ""}`}>
<div className="aspect-[2/3] relative bg-muted/50">
<Image
src={getBookCoverUrl(s.first_book_id)}
alt={t("books.coverOf", { name: s.name })}
fill
className="object-cover"
sizes="(max-width: 640px) 50vw, (max-width: 768px) 33vw, (max-width: 1024px) 25vw, 20vw"
/>
</div>
<div className="p-3">
<h3 className="font-medium text-foreground truncate text-sm" title={s.name}>
{s.name === "unclassified" ? t("books.unclassified") : s.name}
</h3>
<div className="flex items-center justify-between mt-1">
<p className="text-xs text-muted-foreground">
{t("series.readCount", { read: String(s.books_read_count), total: String(s.book_count), plural: s.book_count !== 1 ? "s" : "" })}
</p>
<MarkSeriesReadButton
seriesName={s.name}
bookCount={s.book_count}
booksReadCount={s.books_read_count}
/>
</div>
<div className="flex items-center gap-1 mt-1.5 flex-wrap">
{s.series_status && (
<span className={`text-[10px] px-1.5 py-0.5 rounded-full font-medium ${
s.series_status === "ongoing" ? "bg-blue-500/15 text-blue-600" :
s.series_status === "ended" ? "bg-green-500/15 text-green-600" :
s.series_status === "hiatus" ? "bg-amber-500/15 text-amber-600" :
s.series_status === "cancelled" ? "bg-red-500/15 text-red-600" :
"bg-muted text-muted-foreground"
}`}>
{KNOWN_STATUSES[s.series_status] || s.series_status}
</span>
)}
{s.missing_count != null && s.missing_count > 0 && (
<span className="text-[10px] px-1.5 py-0.5 rounded-full font-medium bg-yellow-500/15 text-yellow-600">
{t("series.missingCount", { count: String(s.missing_count), plural: s.missing_count !== 1 ? "s" : "" })}
</span>
)}
</div>
</div>
</div>
</Link>
))}
</div>
<OffsetPagination
currentPage={page}
totalPages={totalPages}
pageSize={limit}
totalItems={seriesPage.total}
/>
</>
) : (
<div className="text-center py-12 text-muted-foreground">
<p>{t("librarySeries.noSeries")}</p>
</div>
)}
</div>
);
}

View File

@@ -0,0 +1,227 @@
import { revalidatePath } from "next/cache";
import Image from "next/image";
import Link from "next/link";
import { listFolders, createLibrary, deleteLibrary, fetchLibraries, getBookCoverUrl, LibraryDto, FolderItem } from "@/lib/api";
import type { TranslationKey } from "@/lib/i18n/fr";
import { getServerTranslations } from "@/lib/i18n/server";
import { LibraryActions } from "@/app/components/LibraryActions";
import { LibraryForm } from "@/app/components/LibraryForm";
import { ProviderIcon } from "@/app/components/ProviderIcon";
import {
Card, CardHeader, CardTitle, CardDescription, CardContent,
Button, Badge
} from "@/app/components/ui";
export const dynamic = "force-dynamic";
function formatNextScan(nextScanAt: string | null, imminentLabel: string): string {
if (!nextScanAt) return "-";
const date = new Date(nextScanAt);
const now = new Date();
const diff = date.getTime() - now.getTime();
if (diff < 0) return imminentLabel;
// Thresholds in ms: 1 minute, 1 hour, 1 day
if (diff < 60000) return "< 1 min";
if (diff < 3600000) return `${Math.floor(diff / 60000)}m`;
if (diff < 86400000) return `${Math.floor(diff / 3600000)}h`;
return `${Math.floor(diff / 86400000)}d`;
}
export default async function LibrariesPage() {
const { t } = await getServerTranslations();
const [libraries, folders] = await Promise.all([
fetchLibraries().catch(() => [] as LibraryDto[]),
listFolders().catch(() => [] as FolderItem[])
]);
const thumbnailMap = new Map(
libraries.map(lib => [
lib.id,
(lib.thumbnail_book_ids || []).map(bookId => getBookCoverUrl(bookId)),
])
);
async function addLibrary(formData: FormData) {
"use server";
const name = formData.get("name") as string;
const rootPath = formData.get("root_path") as string;
if (name && rootPath) {
await createLibrary(name, rootPath);
revalidatePath("/libraries");
}
}
async function removeLibrary(formData: FormData) {
"use server";
const id = formData.get("id") as string;
await deleteLibrary(id);
revalidatePath("/libraries");
}
return (
<>
<div className="mb-6">
<h1 className="text-3xl font-bold text-foreground flex items-center gap-3">
<svg className="w-8 h-8 text-primary" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M3 7v10a2 2 0 002 2h14a2 2 0 002-2V9a2 2 0 00-2-2h-6l-2-2H5a2 2 0 00-2 2z" />
</svg>
{t("libraries.title")}
</h1>
</div>
{/* Add Library Form */}
<Card className="mb-6">
<CardHeader>
<CardTitle>{t("libraries.addLibrary")}</CardTitle>
<CardDescription>{t("libraries.addLibraryDescription")}</CardDescription>
</CardHeader>
<CardContent>
<LibraryForm initialFolders={folders} action={addLibrary} />
</CardContent>
</Card>
{/* Libraries Grid */}
<div className="grid grid-cols-1 md:grid-cols-2 lg:grid-cols-3 gap-4">
{libraries.map((lib) => {
const thumbnails = thumbnailMap.get(lib.id) || [];
return (
<Card key={lib.id} className="flex flex-col overflow-hidden">
{/* Thumbnail fan */}
{thumbnails.length > 0 ? (
<Link href={`/libraries/${lib.id}/series`} className="block relative h-48 overflow-hidden bg-muted/10">
<Image
src={thumbnails[0]}
alt=""
fill
className="object-cover blur-xl scale-110 opacity-40"
sizes="(max-width: 768px) 100vw, 33vw"
loading="lazy"
/>
<div className="absolute inset-0 flex items-end justify-center">
{thumbnails.map((url, i) => {
// Fan the covers along an arc: each cover sits 12° from its
// neighbor, centered on the middle of the list. Polar coordinates
// (with -90° pointing straight up) give each cover's x/y offset.
const count = thumbnails.length;
const mid = (count - 1) / 2;
const angle = (i - mid) * 12;
const radius = 220;
const rad = ((angle - 90) * Math.PI) / 180;
const cx = Math.cos(rad) * radius;
const cy = Math.sin(rad) * radius;
return (
<Image
key={i}
src={url}
alt=""
width={96}
height={144}
className="absolute object-cover shadow-lg"
style={{
transform: `translate(${cx}px, ${cy}px) rotate(${angle}deg)`,
transformOrigin: 'bottom center',
zIndex: count - Math.abs(Math.round(i - mid)),
bottom: '-185px',
}}
sizes="96px"
loading="lazy"
/>
);
})}
</div>
</Link>
) : (
<div className="h-8 bg-muted/10" />
)}
<CardHeader className="pb-2">
<div className="flex items-start justify-between">
<div>
<CardTitle className="text-lg">{lib.name}</CardTitle>
{!lib.enabled && <Badge variant="muted" className="mt-1">{t("libraries.disabled")}</Badge>}
</div>
<div className="flex items-center gap-1">
<LibraryActions
libraryId={lib.id}
monitorEnabled={lib.monitor_enabled}
scanMode={lib.scan_mode}
watcherEnabled={lib.watcher_enabled}
metadataProvider={lib.metadata_provider}
fallbackMetadataProvider={lib.fallback_metadata_provider}
metadataRefreshMode={lib.metadata_refresh_mode}
/>
<form>
<input type="hidden" name="id" value={lib.id} />
<Button type="submit" variant="ghost" size="sm" formAction={removeLibrary} className="text-muted-foreground hover:text-destructive">
<svg className="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M19 7l-.867 12.142A2 2 0 0116.138 21H7.862a2 2 0 01-1.995-1.858L5 7m5 4v6m4-6v6m1-10V4a1 1 0 00-1-1h-4a1 1 0 00-1 1v3M4 7h16" />
</svg>
</Button>
</form>
</div>
</div>
<code className="text-xs font-mono text-muted-foreground break-all">{lib.root_path}</code>
</CardHeader>
<CardContent className="flex-1 pt-0">
{/* Stats */}
<div className="grid grid-cols-2 gap-3 mb-3">
<Link
href={`/libraries/${lib.id}/books`}
className="text-center p-2.5 bg-muted/50 rounded-lg hover:bg-accent transition-colors duration-200"
>
<span className="block text-2xl font-bold text-primary">{lib.book_count}</span>
<span className="text-xs text-muted-foreground">{t("libraries.books")}</span>
</Link>
<Link
href={`/libraries/${lib.id}/series`}
className="text-center p-2.5 bg-muted/50 rounded-lg hover:bg-accent transition-colors duration-200"
>
<span className="block text-2xl font-bold text-foreground">{lib.series_count}</span>
<span className="text-xs text-muted-foreground">{t("libraries.series")}</span>
</Link>
</div>
{/* Configuration tags */}
<div className="flex flex-wrap gap-1.5">
<span className={`inline-flex items-center gap-1 px-2 py-0.5 rounded-full text-[11px] font-medium ${
lib.monitor_enabled
? 'bg-success/10 text-success'
: 'bg-muted/50 text-muted-foreground'
}`}>
<span className="text-[9px]">{lib.monitor_enabled ? '●' : '○'}</span>
{t("libraries.scanLabel", { mode: t(`monitoring.${lib.scan_mode}` as TranslationKey) })}
</span>
<span className={`inline-flex items-center gap-1 px-2 py-0.5 rounded-full text-[11px] font-medium ${
lib.watcher_enabled
? 'bg-warning/10 text-warning'
: 'bg-muted/50 text-muted-foreground'
}`}>
<span>{lib.watcher_enabled ? '⚡' : '○'}</span>
<span>{t("libraries.watcherLabel")}</span>
</span>
{lib.metadata_provider && lib.metadata_provider !== "none" && (
<span className="inline-flex items-center gap-1 px-2 py-0.5 rounded-full text-[11px] font-medium bg-primary/10 text-primary">
<ProviderIcon provider={lib.metadata_provider} size={11} />
{lib.metadata_provider.replace(/_/g, ' ')}
</span>
)}
{lib.metadata_refresh_mode !== "manual" && (
<span className="inline-flex items-center gap-1 px-2 py-0.5 rounded-full text-[11px] font-medium bg-muted/50 text-muted-foreground">
{t("libraries.metaRefreshLabel", { mode: t(`monitoring.${lib.metadata_refresh_mode}` as TranslationKey) })}
</span>
)}
{lib.monitor_enabled && lib.next_scan_at && (
<span className="inline-flex items-center gap-1 px-2 py-0.5 rounded-full text-[11px] font-medium bg-muted/50 text-muted-foreground">
{t("libraries.nextScan", { time: formatNextScan(lib.next_scan_at, t("libraries.imminent")) })}
</span>
)}
</div>
</CardContent>
</Card>
);
})}
</div>
</>
);
}


@@ -0,0 +1,514 @@
import React from "react";
import { fetchStats, fetchUsers, StatsResponse, UserDto } from "@/lib/api";
import { Card, CardContent, CardHeader, CardTitle } from "@/app/components/ui";
import { RcDonutChart, RcBarChart, RcAreaChart, RcStackedBar, RcHorizontalBar, RcMultiLineChart } from "@/app/components/DashboardCharts";
import { PeriodToggle } from "@/app/components/PeriodToggle";
import { CurrentlyReadingList, RecentlyReadList } from "@/app/components/ReadingUserFilter";
import Link from "next/link";
import { getServerTranslations } from "@/lib/i18n/server";
import type { TranslateFunction } from "@/lib/i18n/dictionaries";
export const dynamic = "force-dynamic";
function formatBytes(bytes: number): string {
if (bytes === 0) return "0 B";
const k = 1024;
const sizes = ["B", "KB", "MB", "GB", "TB"];
const i = Math.floor(Math.log(bytes) / Math.log(k));
return `${(bytes / Math.pow(k, i)).toFixed(1)} ${sizes[i]}`;
}
function formatNumber(n: number, locale: string): string {
return n.toLocaleString(locale === "fr" ? "fr-FR" : "en-US");
}
function formatChartLabel(raw: string, period: "day" | "week" | "month", locale: string): string {
const loc = locale === "fr" ? "fr-FR" : "en-US";
if (period === "month") {
// raw = "YYYY-MM"
const [y, m] = raw.split("-");
const d = new Date(Number(y), Number(m) - 1, 1);
return d.toLocaleDateString(loc, { month: "short" });
}
if (period === "week") {
// raw = "YYYY-MM-DD" (Monday of the week)
const d = new Date(raw + "T00:00:00");
return d.toLocaleDateString(loc, { day: "numeric", month: "short" });
}
// day: raw = "YYYY-MM-DD"
const d = new Date(raw + "T00:00:00");
return d.toLocaleDateString(loc, { weekday: "short", day: "numeric" });
}
// Horizontal progress bar for metadata quality (stays server-rendered, no recharts needed)
function HorizontalBar({ label, value, max, subLabel, color = "var(--color-primary)" }: { label: string; value: number; max: number; subLabel?: string; color?: string }) {
const pct = max > 0 ? (value / max) * 100 : 0;
return (
<div className="space-y-1">
<div className="flex justify-between text-sm">
<span className="font-medium text-foreground truncate">{label}</span>
<span className="text-muted-foreground shrink-0 ml-2">{subLabel || value}</span>
</div>
<div className="h-2 bg-muted rounded-full overflow-hidden">
<div
className="h-full rounded-full transition-all duration-500"
style={{ width: `${pct}%`, backgroundColor: color }}
/>
</div>
</div>
);
}
export default async function DashboardPage({
searchParams,
}: {
searchParams: Promise<{ [key: string]: string | string[] | undefined }>;
}) {
const searchParamsAwaited = await searchParams;
const rawPeriod = searchParamsAwaited.period;
const period = rawPeriod === "day" ? "day" as const : rawPeriod === "week" ? "week" as const : "month" as const;
const { t, locale } = await getServerTranslations();
let stats: StatsResponse | null = null;
let users: UserDto[] = [];
try {
[stats, users] = await Promise.all([
fetchStats(period),
fetchUsers().catch(() => []),
]);
} catch (e) {
console.error("Failed to fetch stats:", e);
}
if (!stats) {
return (
<div className="max-w-5xl mx-auto">
<div className="text-center mb-12">
<h1 className="text-4xl font-bold tracking-tight mb-4 text-foreground">StripStream Backoffice</h1>
<p className="text-lg text-muted-foreground">{t("dashboard.loadError")}</p>
</div>
<QuickLinks t={t} />
</div>
);
}
const {
overview,
reading_status,
currently_reading = [],
recently_read = [],
reading_over_time = [],
users_reading_over_time = [],
by_format,
by_library,
top_series,
additions_over_time,
jobs_over_time = [],
metadata = { total_series: 0, series_linked: 0, series_unlinked: 0, books_with_summary: 0, books_with_isbn: 0, by_provider: [] },
} = stats;
const readingColors = ["hsl(220 13% 70%)", "hsl(45 93% 47%)", "hsl(142 60% 45%)"];
const formatColors = [
"hsl(198 78% 37%)", "hsl(142 60% 45%)", "hsl(45 93% 47%)",
"hsl(2 72% 48%)", "hsl(280 60% 50%)", "hsl(32 80% 50%)",
"hsl(170 60% 45%)", "hsl(220 60% 50%)",
];
const noDataLabel = t("common.noData");
return (
<div className="max-w-7xl mx-auto space-y-6">
{/* Header */}
<div className="mb-2">
<h1 className="text-3xl font-bold text-foreground flex items-center gap-3">
<svg className="w-8 h-8 text-primary" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M9 19v-6a2 2 0 00-2-2H5a2 2 0 00-2 2v6a2 2 0 002 2h2a2 2 0 002-2zm0 0V9a2 2 0 012-2h2a2 2 0 012 2v10m-6 0a2 2 0 002 2h2a2 2 0 002-2m0 0V5a2 2 0 012-2h2a2 2 0 012 2v14a2 2 0 01-2 2h-2a2 2 0 01-2-2z" />
</svg>
{t("dashboard.title")}
</h1>
<p className="text-muted-foreground mt-2 max-w-2xl">
{t("dashboard.subtitle")}
</p>
</div>
{/* Overview stat cards */}
<div className="grid grid-cols-2 md:grid-cols-3 lg:grid-cols-6 gap-4">
<StatCard icon="book" label={t("dashboard.books")} value={formatNumber(overview.total_books, locale)} color="success" />
<StatCard icon="series" label={t("dashboard.series")} value={formatNumber(overview.total_series, locale)} color="primary" />
<StatCard icon="library" label={t("dashboard.libraries")} value={formatNumber(overview.total_libraries, locale)} color="warning" />
<StatCard icon="pages" label={t("dashboard.pages")} value={formatNumber(overview.total_pages, locale)} color="primary" />
<StatCard icon="author" label={t("dashboard.authors")} value={formatNumber(overview.total_authors, locale)} color="success" />
<StatCard icon="size" label={t("dashboard.totalSize")} value={formatBytes(overview.total_size_bytes)} color="warning" />
</div>
{/* Currently reading + Recently read */}
{(currently_reading.length > 0 || recently_read.length > 0) && (
<div className="grid grid-cols-1 lg:grid-cols-2 gap-6">
{/* Currently reading */}
<Card hover={false}>
<CardHeader>
<CardTitle className="text-base">{t("dashboard.currentlyReading")}</CardTitle>
</CardHeader>
<CardContent>
<CurrentlyReadingList
items={currently_reading}
allLabel={t("dashboard.allUsers")}
emptyLabel={t("dashboard.noCurrentlyReading")}
pageProgressTemplate={t("dashboard.pageProgress")}
/>
</CardContent>
</Card>
{/* Recently read */}
<Card hover={false}>
<CardHeader>
<CardTitle className="text-base">{t("dashboard.recentlyRead")}</CardTitle>
</CardHeader>
<CardContent>
<RecentlyReadList
items={recently_read}
allLabel={t("dashboard.allUsers")}
emptyLabel={t("dashboard.noRecentlyRead")}
/>
</CardContent>
</Card>
</div>
)}
{/* Reading activity line chart */}
<Card hover={false}>
<CardHeader className="flex flex-row items-center justify-between space-y-0">
<CardTitle className="text-base">{t("dashboard.readingActivity")}</CardTitle>
<PeriodToggle labels={{ day: t("dashboard.periodDay"), week: t("dashboard.periodWeek"), month: t("dashboard.periodMonth") }} />
</CardHeader>
<CardContent>
{(() => {
const userColors = [
"hsl(142 60% 45%)", "hsl(198 78% 37%)", "hsl(45 93% 47%)",
"hsl(2 72% 48%)", "hsl(280 60% 50%)", "hsl(32 80% 50%)",
];
const usernames = [...new Set(users_reading_over_time.map(r => r.username))];
if (usernames.length === 0) {
return (
<RcAreaChart
noDataLabel={noDataLabel}
data={reading_over_time.map((m) => ({ label: formatChartLabel(m.month, period, locale), value: m.books_read }))}
color="hsl(142 60% 45%)"
/>
);
}
// Pivot: { label, username1: n, username2: n, ... }
const byMonth = new Map<string, Record<string, unknown>>();
for (const row of users_reading_over_time) {
const label = formatChartLabel(row.month, period, locale);
if (!byMonth.has(row.month)) byMonth.set(row.month, { label });
byMonth.get(row.month)![row.username] = row.books_read;
}
const chartData = [...byMonth.values()];
const lines = usernames.map((u, i) => ({
key: u,
label: u,
color: userColors[i % userColors.length],
}));
return <RcMultiLineChart data={chartData} lines={lines} noDataLabel={noDataLabel} />;
})()}
</CardContent>
</Card>
{/* Charts row */}
<div className="grid grid-cols-1 md:grid-cols-2 lg:grid-cols-3 gap-6">
{/* Reading status per user */}
<Card hover={false}>
<CardHeader>
<CardTitle className="text-base">{t("dashboard.readingStatus")}</CardTitle>
</CardHeader>
<CardContent>
{users.length === 0 ? (
<RcDonutChart
noDataLabel={noDataLabel}
data={[
{ name: t("status.unread"), value: reading_status.unread, color: readingColors[0] },
{ name: t("status.reading"), value: reading_status.reading, color: readingColors[1] },
{ name: t("status.read"), value: reading_status.read, color: readingColors[2] },
]}
/>
) : (
<div className="space-y-3">
{users.map((user) => {
const total = overview.total_books;
const read = user.books_read;
const reading = user.books_reading;
const unread = Math.max(0, total - read - reading);
const readPct = total > 0 ? (read / total) * 100 : 0;
const readingPct = total > 0 ? (reading / total) * 100 : 0;
return (
<div key={user.id} className="space-y-1">
<div className="flex items-center justify-between text-sm">
<span className="font-medium text-foreground truncate">{user.username}</span>
<span className="text-xs text-muted-foreground shrink-0 ml-2">
<span className="text-success font-medium">{read}</span>
{reading > 0 && <span className="text-amber-500 font-medium"> · {reading}</span>}
<span className="text-muted-foreground/60"> / {total}</span>
</span>
</div>
<div className="h-2 bg-muted rounded-full overflow-hidden flex">
<div className="h-full bg-success transition-all duration-500" style={{ width: `${readPct}%` }} />
<div className="h-full bg-amber-500 transition-all duration-500" style={{ width: `${readingPct}%` }} />
</div>
</div>
);
})}
</div>
)}
</CardContent>
</Card>
{/* By format donut */}
<Card hover={false}>
<CardHeader>
<CardTitle className="text-base">{t("dashboard.byFormat")}</CardTitle>
</CardHeader>
<CardContent>
<RcDonutChart
noDataLabel={noDataLabel}
data={by_format.slice(0, 6).map((f, i) => ({
name: (f.format || t("dashboard.unknown")).toUpperCase(),
value: f.count,
color: formatColors[i % formatColors.length],
}))}
/>
</CardContent>
</Card>
{/* By library donut */}
<Card hover={false}>
<CardHeader>
<CardTitle className="text-base">{t("dashboard.byLibrary")}</CardTitle>
</CardHeader>
<CardContent>
<RcDonutChart
noDataLabel={noDataLabel}
data={by_library.slice(0, 6).map((l, i) => ({
name: l.library_name,
value: l.book_count,
color: formatColors[i % formatColors.length],
}))}
/>
</CardContent>
</Card>
</div>
{/* Metadata row */}
<div className="grid grid-cols-1 md:grid-cols-2 lg:grid-cols-3 gap-6">
{/* Series metadata coverage donut */}
<Card hover={false}>
<CardHeader>
<CardTitle className="text-base">{t("dashboard.metadataCoverage")}</CardTitle>
</CardHeader>
<CardContent>
<RcDonutChart
noDataLabel={noDataLabel}
data={[
{ name: t("dashboard.seriesLinked"), value: metadata.series_linked, color: "hsl(142 60% 45%)" },
{ name: t("dashboard.seriesUnlinked"), value: metadata.series_unlinked, color: "hsl(220 13% 70%)" },
]}
/>
</CardContent>
</Card>
{/* By provider donut */}
<Card hover={false}>
<CardHeader>
<CardTitle className="text-base">{t("dashboard.byProvider")}</CardTitle>
</CardHeader>
<CardContent>
<RcDonutChart
noDataLabel={noDataLabel}
data={metadata.by_provider.map((p, i) => ({
name: p.provider.replace(/_/g, " ").replace(/\b\w/g, (c) => c.toUpperCase()),
value: p.count,
color: formatColors[i % formatColors.length],
}))}
/>
</CardContent>
</Card>
{/* Book metadata quality */}
<Card hover={false}>
<CardHeader>
<CardTitle className="text-base">{t("dashboard.bookMetadata")}</CardTitle>
</CardHeader>
<CardContent>
<div className="space-y-4">
<HorizontalBar
label={t("dashboard.withSummary")}
value={metadata.books_with_summary}
max={overview.total_books}
subLabel={overview.total_books > 0 ? `${Math.round((metadata.books_with_summary / overview.total_books) * 100)}%` : "0%"}
color="hsl(198 78% 37%)"
/>
<HorizontalBar
label={t("dashboard.withIsbn")}
value={metadata.books_with_isbn}
max={overview.total_books}
subLabel={overview.total_books > 0 ? `${Math.round((metadata.books_with_isbn / overview.total_books) * 100)}%` : "0%"}
color="hsl(280 60% 50%)"
/>
</div>
</CardContent>
</Card>
</div>
{/* Libraries breakdown + Top series */}
<div className="grid grid-cols-1 lg:grid-cols-2 gap-6">
{by_library.length > 0 && (
<Card hover={false}>
<CardHeader>
<CardTitle className="text-base">{t("dashboard.libraries")}</CardTitle>
</CardHeader>
<CardContent>
<RcStackedBar
data={by_library.map((lib) => ({
name: lib.library_name,
read: lib.read_count,
reading: lib.reading_count,
unread: lib.unread_count,
sizeLabel: formatBytes(lib.size_bytes),
}))}
labels={{
read: t("status.read"),
reading: t("status.reading"),
unread: t("status.unread"),
books: t("dashboard.books"),
}}
/>
</CardContent>
</Card>
)}
{/* Top series */}
<Card hover={false}>
<CardHeader>
<CardTitle className="text-base">{t("dashboard.popularSeries")}</CardTitle>
</CardHeader>
<CardContent>
<RcHorizontalBar
noDataLabel={t("dashboard.noSeries")}
data={top_series.slice(0, 8).map((s) => ({
name: s.series,
value: s.book_count,
subLabel: t("dashboard.readCount", { read: s.read_count, total: s.book_count }),
}))}
color="hsl(142 60% 45%)"
/>
</CardContent>
</Card>
</div>
{/* Additions line chart full width */}
<Card hover={false}>
<CardHeader className="flex flex-row items-center justify-between space-y-0">
<CardTitle className="text-base">{t("dashboard.booksAdded")}</CardTitle>
<PeriodToggle labels={{ day: t("dashboard.periodDay"), week: t("dashboard.periodWeek"), month: t("dashboard.periodMonth") }} />
</CardHeader>
<CardContent>
<RcAreaChart
noDataLabel={noDataLabel}
data={additions_over_time.map((m) => ({ label: formatChartLabel(m.month, period, locale), value: m.books_added }))}
color="hsl(198 78% 37%)"
/>
</CardContent>
</Card>
{/* Jobs over time multi-line chart */}
<Card hover={false}>
<CardHeader className="flex flex-row items-center justify-between space-y-0">
<CardTitle className="text-base">{t("dashboard.jobsOverTime")}</CardTitle>
<PeriodToggle labels={{ day: t("dashboard.periodDay"), week: t("dashboard.periodWeek"), month: t("dashboard.periodMonth") }} />
</CardHeader>
<CardContent>
<RcMultiLineChart
noDataLabel={noDataLabel}
data={jobs_over_time.map((j) => ({
label: formatChartLabel(j.label, period, locale),
scan: j.scan,
rebuild: j.rebuild,
thumbnail: j.thumbnail,
other: j.other,
}))}
lines={[
{ key: "scan", label: t("dashboard.jobScan"), color: "hsl(198 78% 37%)" },
{ key: "rebuild", label: t("dashboard.jobRebuild"), color: "hsl(142 60% 45%)" },
{ key: "thumbnail", label: t("dashboard.jobThumbnail"), color: "hsl(45 93% 47%)" },
{ key: "other", label: t("dashboard.jobOther"), color: "hsl(280 60% 50%)" },
]}
/>
</CardContent>
</Card>
{/* Quick links */}
<QuickLinks t={t} />
</div>
);
}
function StatCard({ icon, label, value, color }: { icon: string; label: string; value: string; color: string }) {
const icons: Record<string, React.ReactNode> = {
book: <path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M12 6.253v13m0-13C10.832 5.477 9.246 5 7.5 5S4.168 5.477 3 6.253v13C4.168 18.477 5.754 18 7.5 18s3.332.477 4.5 1.253m0-13C13.168 5.477 14.754 5 16.5 5c1.747 0 3.332.477 4.5 1.253v13C19.832 18.477 18.247 18 16.5 18c-1.746 0-3.332.477-4.5 1.253" />,
series: <path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M19 11H5m14 0a2 2 0 012 2v6a2 2 0 01-2 2H5a2 2 0 01-2-2v-6a2 2 0 012-2m14 0V9a2 2 0 00-2-2M5 11V9a2 2 0 012-2m0 0V5a2 2 0 012-2h6a2 2 0 012 2v2M7 7h10" />,
library: <path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M3 7v10a2 2 0 002 2h14a2 2 0 002-2V9a2 2 0 00-2-2h-6l-2-2H5a2 2 0 00-2 2z" />,
pages: <path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M9 12h6m-6 4h6m2 5H7a2 2 0 01-2-2V5a2 2 0 012-2h5.586a1 1 0 01.707.293l5.414 5.414a1 1 0 01.293.707V19a2 2 0 01-2 2z" />,
author: <path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M16 7a4 4 0 11-8 0 4 4 0 018 0zM12 14a7 7 0 00-7 7h14a7 7 0 00-7-7z" />,
size: <path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M4 7v10c0 2.21 3.582 4 8 4s8-1.79 8-4V7M4 7c0 2.21 3.582 4 8 4s8-1.79 8-4M4 7c0-2.21 3.582-4 8-4s8 1.79 8 4m0 5c0 2.21-3.582 4-8 4s-8-1.79-8-4" />,
};
const colorClasses: Record<string, string> = {
primary: "bg-primary/10 text-primary",
success: "bg-success/10 text-success",
warning: "bg-warning/10 text-warning",
};
return (
<Card hover={false} className="p-4">
<div className="flex items-center gap-3">
<div className={`w-10 h-10 rounded-lg flex items-center justify-center shrink-0 ${colorClasses[color]}`}>
<svg className="w-5 h-5" fill="none" stroke="currentColor" viewBox="0 0 24 24">
{icons[icon]}
</svg>
</div>
<div className="min-w-0">
<p className="text-xl font-bold text-foreground leading-tight">{value}</p>
<p className="text-xs text-muted-foreground">{label}</p>
</div>
</div>
</Card>
);
}
function QuickLinks({ t }: { t: TranslateFunction }) {
const links = [
{ href: "/libraries", label: t("nav.libraries"), bg: "bg-primary/10", text: "text-primary", hoverBg: "group-hover:bg-primary", hoverText: "group-hover:text-primary-foreground", icon: <path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M3 7v10a2 2 0 002 2h14a2 2 0 002-2V9a2 2 0 00-2-2h-6l-2-2H5a2 2 0 00-2 2z" /> },
{ href: "/books", label: t("nav.books"), bg: "bg-success/10", text: "text-success", hoverBg: "group-hover:bg-success", hoverText: "group-hover:text-white", icon: <path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M12 6.253v13m0-13C10.832 5.477 9.246 5 7.5 5S4.168 5.477 3 6.253v13C4.168 18.477 5.754 18 7.5 18s3.332.477 4.5 1.253m0-13C13.168 5.477 14.754 5 16.5 5c1.747 0 3.332.477 4.5 1.253v13C19.832 18.477 18.247 18 16.5 18c-1.746 0-3.332.477-4.5 1.253" /> },
{ href: "/series", label: t("nav.series"), bg: "bg-warning/10", text: "text-warning", hoverBg: "group-hover:bg-warning", hoverText: "group-hover:text-white", icon: <path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M19 11H5m14 0a2 2 0 012 2v6a2 2 0 01-2 2H5a2 2 0 01-2-2v-6a2 2 0 012-2m14 0V9a2 2 0 00-2-2M5 11V9a2 2 0 012-2m0 0V5a2 2 0 012-2h6a2 2 0 012 2v2M7 7h10" /> },
{ href: "/jobs", label: t("nav.jobs"), bg: "bg-destructive/10", text: "text-destructive", hoverBg: "group-hover:bg-destructive", hoverText: "group-hover:text-destructive-foreground", icon: <path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M13 10V3L4 14h7v7l9-11h-7z" /> },
];
return (
<div className="grid grid-cols-2 md:grid-cols-4 gap-4">
{links.map((l) => (
<Link
key={l.href}
href={l.href as any}
className="group p-4 bg-card/80 backdrop-blur-sm rounded-xl border border-border/50 shadow-sm hover:shadow-md hover:-translate-y-0.5 transition-all duration-200 flex items-center gap-3"
>
<div className={`w-9 h-9 rounded-lg flex items-center justify-center transition-colors duration-200 ${l.bg} ${l.hoverBg}`}>
<svg className={`w-5 h-5 ${l.text} ${l.hoverText}`} fill="none" stroke="currentColor" viewBox="0 0 24 24">
{l.icon}
</svg>
</div>
<span className="font-medium text-foreground text-sm">{l.label}</span>
</Link>
))}
</div>
);
}


@@ -0,0 +1,208 @@
import { fetchAllSeries, fetchLibraries, fetchSeriesStatuses, LibraryDto, SeriesDto, SeriesPageDto, getBookCoverUrl } from "@/lib/api";
import { getServerTranslations } from "@/lib/i18n/server";
import { MarkSeriesReadButton } from "@/app/components/MarkSeriesReadButton";
import { LiveSearchForm } from "@/app/components/LiveSearchForm";
import { Card, CardContent, OffsetPagination } from "@/app/components/ui";
import Image from "next/image";
import Link from "next/link";
import { ProviderIcon } from "@/app/components/ProviderIcon";
export const dynamic = "force-dynamic";
export default async function SeriesPage({
searchParams,
}: {
searchParams: Promise<{ [key: string]: string | string[] | undefined }>;
}) {
const { t } = await getServerTranslations();
const searchParamsAwaited = await searchParams;
const libraryId = typeof searchParamsAwaited.library === "string" ? searchParamsAwaited.library : undefined;
const searchQuery = typeof searchParamsAwaited.q === "string" ? searchParamsAwaited.q : "";
const readingStatus = typeof searchParamsAwaited.status === "string" ? searchParamsAwaited.status : undefined;
const sort = typeof searchParamsAwaited.sort === "string" ? searchParamsAwaited.sort : undefined;
const seriesStatus = typeof searchParamsAwaited.series_status === "string" ? searchParamsAwaited.series_status : undefined;
const hasMissing = searchParamsAwaited.has_missing === "true";
const metadataProvider = typeof searchParamsAwaited.metadata_provider === "string" ? searchParamsAwaited.metadata_provider : undefined;
const page = typeof searchParamsAwaited.page === "string" ? parseInt(searchParamsAwaited.page) : 1;
const limit = typeof searchParamsAwaited.limit === "string" ? parseInt(searchParamsAwaited.limit) : 20;
const [libraries, seriesPage, dbStatuses] = await Promise.all([
fetchLibraries().catch(() => [] as LibraryDto[]),
fetchAllSeries(libraryId, searchQuery || undefined, readingStatus, page, limit, sort, seriesStatus, hasMissing, metadataProvider).catch(
() => ({ items: [] as SeriesDto[], total: 0, page: 1, limit }) as SeriesPageDto
),
fetchSeriesStatuses().catch(() => [] as string[]),
]);
const series = seriesPage.items;
const totalPages = Math.ceil(seriesPage.total / limit);
const sortOptions = [
{ value: "", label: t("books.sortTitle") },
{ value: "latest", label: t("books.sortLatest") },
];
const hasFilters = searchQuery || libraryId || readingStatus || sort || seriesStatus || hasMissing || metadataProvider;
const libraryOptions = [
{ value: "", label: t("books.allLibraries") },
...libraries.map((lib) => ({ value: lib.id, label: lib.name })),
];
const statusOptions = [
{ value: "", label: t("common.all") },
{ value: "unread", label: t("status.unread") },
{ value: "reading", label: t("status.reading") },
{ value: "read", label: t("status.read") },
];
const KNOWN_STATUSES: Record<string, string> = {
ongoing: t("seriesStatus.ongoing"),
ended: t("seriesStatus.ended"),
hiatus: t("seriesStatus.hiatus"),
cancelled: t("seriesStatus.cancelled"),
upcoming: t("seriesStatus.upcoming"),
};
const seriesStatusOptions = [
{ value: "", label: t("seriesStatus.allStatuses") },
...dbStatuses.map((s) => ({ value: s, label: KNOWN_STATUSES[s] || s })),
];
const missingOptions = [
{ value: "", label: t("common.all") },
{ value: "true", label: t("series.missingBooks") },
];
const metadataOptions = [
{ value: "", label: t("series.metadataAll") },
{ value: "linked", label: t("series.metadataLinked") },
{ value: "unlinked", label: t("series.metadataUnlinked") },
{ value: "google_books", label: "Google Books" },
{ value: "open_library", label: "Open Library" },
{ value: "comicvine", label: "ComicVine" },
{ value: "anilist", label: "AniList" },
{ value: "bedetheque", label: "Bédéthèque" },
];
return (
<>
<div className="mb-6">
<h1 className="text-3xl font-bold text-foreground flex items-center gap-3">
<svg className="w-8 h-8 text-warning" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M19 11H5m14 0a2 2 0 012 2v6a2 2 0 01-2 2H5a2 2 0 01-2-2v-6a2 2 0 012-2m14 0V9a2 2 0 00-2-2M5 11V9a2 2 0 012-2m0 0V5a2 2 0 012-2h6a2 2 0 012 2v2M7 7h10" />
</svg>
{t("series.title")}
</h1>
</div>
<Card className="mb-6">
<CardContent className="pt-6">
<LiveSearchForm
basePath="/series"
fields={[
{ name: "q", type: "text", label: t("common.search"), placeholder: t("series.searchPlaceholder") },
{ name: "library", type: "select", label: t("books.library"), options: libraryOptions },
{ name: "status", type: "select", label: t("series.reading"), options: statusOptions },
{ name: "series_status", type: "select", label: t("editSeries.status"), options: seriesStatusOptions },
{ name: "has_missing", type: "select", label: t("series.missing"), options: missingOptions },
{ name: "metadata_provider", type: "select", label: t("series.metadata"), options: metadataOptions },
{ name: "sort", type: "select", label: t("books.sort"), options: sortOptions },
]}
/>
</CardContent>
</Card>
{/* Results count */}
<p className="text-sm text-muted-foreground mb-4">
{seriesPage.total} {t("series.title").toLowerCase()}
{searchQuery && <> {t("series.matchingQuery")} &quot;{searchQuery}&quot;</>}
</p>
{/* Series Grid */}
{series.length > 0 ? (
<>
<div className="grid grid-cols-2 sm:grid-cols-3 md:grid-cols-4 lg:grid-cols-5 xl:grid-cols-6 gap-4">
{series.map((s) => (
<Link
key={s.name}
href={`/libraries/${s.library_id}/series/${encodeURIComponent(s.name)}`}
className="group"
>
<div
className={`bg-card rounded-xl shadow-sm border border-border/60 overflow-hidden hover:shadow-md hover:-translate-y-1 transition-all duration-200 ${
s.books_read_count >= s.book_count ? "opacity-50" : ""
}`}
>
<div className="aspect-[2/3] relative bg-muted/50">
<Image
src={getBookCoverUrl(s.first_book_id)}
alt={t("books.coverOf", { name: s.name })}
fill
className="object-cover"
sizes="(max-width: 640px) 50vw, (max-width: 768px) 33vw, (max-width: 1024px) 25vw, 16vw"
/>
</div>
<div className="p-3">
<h3 className="font-medium text-foreground truncate text-sm" title={s.name}>
{s.name === "unclassified" ? t("books.unclassified") : s.name}
</h3>
<div className="flex items-center justify-between mt-1">
<p className="text-xs text-muted-foreground">
{t("series.readCount", { read: String(s.books_read_count), total: String(s.book_count), plural: s.book_count !== 1 ? "s" : "" })}
</p>
<MarkSeriesReadButton
seriesName={s.name}
bookCount={s.book_count}
booksReadCount={s.books_read_count}
/>
</div>
<div className="flex items-center gap-1 mt-1.5 flex-wrap">
{s.series_status && (
<span className={`text-[10px] px-1.5 py-0.5 rounded-full font-medium ${
s.series_status === "ongoing" ? "bg-blue-500/15 text-blue-600" :
s.series_status === "ended" ? "bg-green-500/15 text-green-600" :
s.series_status === "hiatus" ? "bg-amber-500/15 text-amber-600" :
s.series_status === "cancelled" ? "bg-red-500/15 text-red-600" :
"bg-muted text-muted-foreground"
}`}>
{KNOWN_STATUSES[s.series_status] || s.series_status}
</span>
)}
{s.missing_count != null && s.missing_count > 0 && (
<span className="text-[10px] px-1.5 py-0.5 rounded-full font-medium bg-yellow-500/15 text-yellow-600">
{t("series.missingCount", { count: String(s.missing_count), plural: s.missing_count > 1 ? "s" : "" })}
</span>
)}
{s.metadata_provider && (
<span className="text-[10px] px-1.5 py-0.5 rounded-full font-medium bg-purple-500/15 text-purple-600 inline-flex items-center gap-0.5">
<ProviderIcon provider={s.metadata_provider} size={10} />
</span>
)}
</div>
</div>
</div>
</Link>
))}
</div>
<OffsetPagination
currentPage={page}
totalPages={totalPages}
pageSize={limit}
totalItems={seriesPage.total}
/>
</>
) : (
<div className="flex flex-col items-center justify-center py-16 text-center">
<div className="w-16 h-16 mb-4 text-muted-foreground/30">
<svg fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={1.5} d="M19 11H5m14 0a2 2 0 012 2v6a2 2 0 01-2 2H5a2 2 0 01-2-2v-6a2 2 0 012-2m14 0V9a2 2 0 00-2-2M5 11V9a2 2 0 012-2m0 0V5a2 2 0 012-2h6a2 2 0 012 2v2M7 7h10" />
</svg>
</div>
<p className="text-muted-foreground text-lg">
{hasFilters ? t("series.noResults") : t("series.noSeries")}
</p>
</div>
)}
</>
);
}

File diff suppressed because it is too large


@@ -1,4 +1,4 @@
-import { getSettings, getCacheStats, getThumbnailStats } from "../../lib/api";
+import { getSettings, getCacheStats, getThumbnailStats, fetchUsers } from "@/lib/api";
 import SettingsPage from "./SettingsPage";
 export const dynamic = "force-dynamic";
@@ -23,5 +23,7 @@ export default async function SettingsPageWrapper() {
     directory: "/data/thumbnails"
   }));
-  return <SettingsPage initialSettings={settings} initialCacheStats={cacheStats} initialThumbnailStats={thumbnailStats} />;
+  const users = await fetchUsers().catch(() => []);
+  return <SettingsPage initialSettings={settings} initialCacheStats={cacheStats} initialThumbnailStats={thumbnailStats} users={users} />;
 }


@@ -0,0 +1,316 @@
import { revalidatePath } from "next/cache";
import { redirect } from "next/navigation";
import { listTokens, createToken, revokeToken, deleteToken, updateToken, fetchUsers, createUser, deleteUser, updateUser, TokenDto, UserDto } from "@/lib/api";
import { Card, CardHeader, CardTitle, CardDescription, CardContent, Button, Badge, FormField, FormInput, FormSelect, FormRow } from "@/app/components/ui";
import { TokenUserSelect } from "@/app/components/TokenUserSelect";
import { UsernameEdit } from "@/app/components/UsernameEdit";
import { getServerTranslations } from "@/lib/i18n/server";
export const dynamic = "force-dynamic";
export default async function TokensPage({
searchParams
}: {
searchParams: Promise<{ created?: string }>;
}) {
const { t } = await getServerTranslations();
const params = await searchParams;
const tokens = await listTokens().catch(() => [] as TokenDto[]);
const users = await fetchUsers().catch(() => [] as UserDto[]);
async function createTokenAction(formData: FormData) {
"use server";
const name = formData.get("name") as string;
const scope = formData.get("scope") as string;
const userId = (formData.get("user_id") as string) || undefined;
if (name) {
const result = await createToken(name, scope, userId);
revalidatePath("/tokens");
redirect(`/tokens?created=${encodeURIComponent(result.token)}`);
}
}
async function revokeTokenAction(formData: FormData) {
"use server";
const id = formData.get("id") as string;
await revokeToken(id);
revalidatePath("/tokens");
}
async function deleteTokenAction(formData: FormData) {
"use server";
const id = formData.get("id") as string;
await deleteToken(id);
revalidatePath("/tokens");
}
async function createUserAction(formData: FormData) {
"use server";
const username = formData.get("username") as string;
if (username) {
await createUser(username);
revalidatePath("/tokens");
}
}
async function deleteUserAction(formData: FormData) {
"use server";
const id = formData.get("id") as string;
await deleteUser(id);
revalidatePath("/tokens");
}
async function renameUserAction(formData: FormData) {
"use server";
const id = formData.get("id") as string;
const username = formData.get("username") as string;
if (username?.trim()) {
await updateUser(id, username.trim());
revalidatePath("/tokens");
}
}
async function reassignTokenAction(formData: FormData) {
"use server";
const id = formData.get("id") as string;
const userId = (formData.get("user_id") as string) || null;
await updateToken(id, userId);
revalidatePath("/tokens");
}
return (
<>
<div className="mb-6">
<h1 className="text-3xl font-bold text-foreground flex items-center gap-3">
<svg className="w-8 h-8 text-destructive" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M15 7a2 2 0 012 2m4 0a6 6 0 01-7.743 5.743L11 17H9v2H7v2H4a1 1 0 01-1-1v-2.586a1 1 0 01.293-.707l5.964-5.964A6 6 0 1121 9z" />
</svg>
{t("tokens.title")}
</h1>
</div>
{/* ── Users ─────────────────────────────────────────── */}
<div className="mb-2">
<h2 className="text-xl font-semibold text-foreground">{t("users.title")}</h2>
</div>
<Card className="mb-6">
<CardHeader>
<CardTitle>{t("users.createNew")}</CardTitle>
<CardDescription>{t("users.createDescription")}</CardDescription>
</CardHeader>
<CardContent>
<form action={createUserAction}>
<FormRow>
<FormField className="flex-1 min-w-48">
<FormInput name="username" placeholder={t("users.username")} required autoComplete="off" />
</FormField>
<Button type="submit">{t("users.createButton")}</Button>
</FormRow>
</form>
</CardContent>
</Card>
<Card className="overflow-hidden mb-10">
<div className="overflow-x-auto">
<table className="w-full">
<thead>
<tr className="border-b border-border/60 bg-muted/50">
<th className="px-4 py-3 text-left text-xs font-semibold text-muted-foreground uppercase tracking-wider">{t("users.name")}</th>
<th className="px-4 py-3 text-left text-xs font-semibold text-muted-foreground uppercase tracking-wider">{t("users.tokenCount")}</th>
<th className="px-4 py-3 text-left text-xs font-semibold text-muted-foreground uppercase tracking-wider">{t("status.read")}</th>
<th className="px-4 py-3 text-left text-xs font-semibold text-muted-foreground uppercase tracking-wider">{t("status.reading")}</th>
<th className="px-4 py-3 text-left text-xs font-semibold text-muted-foreground uppercase tracking-wider">{t("users.createdAt")}</th>
<th className="px-4 py-3 text-left text-xs font-semibold text-muted-foreground uppercase tracking-wider">{t("users.actions")}</th>
</tr>
</thead>
<tbody className="divide-y divide-border/60">
{/* Synthetic admin row */}
<tr className="hover:bg-accent/50 transition-colors bg-destructive/5">
<td className="px-4 py-3 text-sm font-medium text-foreground flex items-center gap-2">
{process.env.ADMIN_USERNAME ?? "admin"}
<Badge variant="destructive">{t("tokens.scopeAdmin")}</Badge>
</td>
<td className="px-4 py-3 text-sm text-muted-foreground">
{tokens.filter(tok => tok.scope === "admin" && !tok.revoked_at).length}
</td>
<td className="px-4 py-3 text-sm text-muted-foreground/50"></td>
<td className="px-4 py-3 text-sm text-muted-foreground/50"></td>
<td className="px-4 py-3 text-sm text-muted-foreground/50"></td>
<td className="px-4 py-3 text-sm text-muted-foreground/50"></td>
</tr>
{/* Unassigned read tokens row */}
{(() => {
const unassigned = tokens.filter(tok => tok.scope === "read" && !tok.user_id && !tok.revoked_at);
if (unassigned.length === 0) return null;
return (
<tr className="hover:bg-accent/50 transition-colors bg-warning/5">
<td className="px-4 py-3 text-sm font-medium text-muted-foreground italic">
{t("tokens.noUser")}
</td>
<td className="px-4 py-3 text-sm text-warning font-medium">{unassigned.length}</td>
<td className="px-4 py-3 text-sm text-muted-foreground/50"></td>
<td className="px-4 py-3 text-sm text-muted-foreground/50"></td>
<td className="px-4 py-3 text-sm text-muted-foreground/50"></td>
<td className="px-4 py-3 text-sm text-muted-foreground/50"></td>
</tr>
);
})()}
{users.map((user) => (
<tr key={user.id} className="hover:bg-accent/50 transition-colors">
<td className="px-4 py-3">
<UsernameEdit userId={user.id} currentUsername={user.username} action={renameUserAction} />
</td>
<td className="px-4 py-3 text-sm text-muted-foreground">{user.token_count}</td>
<td className="px-4 py-3 text-sm">
{user.books_read > 0
? <span className="font-medium text-success">{user.books_read}</span>
: <span className="text-muted-foreground/50"></span>}
</td>
<td className="px-4 py-3 text-sm">
{user.books_reading > 0
? <span className="font-medium text-amber-500">{user.books_reading}</span>
: <span className="text-muted-foreground/50"></span>}
</td>
<td className="px-4 py-3 text-sm text-muted-foreground">
{new Date(user.created_at).toLocaleDateString()}
</td>
<td className="px-4 py-3">
<form action={deleteUserAction}>
<input type="hidden" name="id" value={user.id} />
<Button type="submit" variant="destructive" size="sm">
<svg className="w-4 h-4 mr-1" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M19 7l-.867 12.142A2 2 0 0116.138 21H7.862a2 2 0 01-1.995-1.858L5 7m5 4v6m4-6v6m1-10V4a1 1 0 00-1-1h-4a1 1 0 00-1 1v3M4 7h16" />
</svg>
{t("common.delete")}
</Button>
</form>
</td>
</tr>
))}
</tbody>
</table>
</div>
</Card>
{/* ── Tokens API ───────────────────────────────────────── */}
<div className="mb-2">
<h2 className="text-xl font-semibold text-foreground">{t("tokens.apiTokens")}</h2>
</div>
{params.created ? (
<Card className="mb-6 border-success/50 bg-success/5">
<CardHeader>
<CardTitle className="text-success">{t("tokens.created")}</CardTitle>
<CardDescription>{t("tokens.createdDescription")}</CardDescription>
</CardHeader>
<CardContent>
<pre className="p-4 bg-background rounded-lg text-sm font-mono text-foreground overflow-x-auto border">{params.created}</pre>
</CardContent>
</Card>
) : null}
<Card className="mb-6">
<CardHeader>
<CardTitle>{t("tokens.createNew")}</CardTitle>
<CardDescription>{t("tokens.createDescription")}</CardDescription>
</CardHeader>
<CardContent>
<form action={createTokenAction}>
<FormRow>
<FormField className="flex-1 min-w-48">
<FormInput name="name" placeholder={t("tokens.tokenName")} required autoComplete="off" />
</FormField>
<FormField className="w-32">
<FormSelect name="scope" defaultValue="read">
<option value="read">{t("tokens.scopeRead")}</option>
<option value="admin">{t("tokens.scopeAdmin")}</option>
</FormSelect>
</FormField>
<FormField className="w-48">
<FormSelect name="user_id" defaultValue="">
<option value="">{t("tokens.noUser")}</option>
{users.map((user) => (
<option key={user.id} value={user.id}>{user.username}</option>
))}
</FormSelect>
</FormField>
<Button type="submit">{t("tokens.createButton")}</Button>
</FormRow>
</form>
</CardContent>
</Card>
<Card className="overflow-hidden">
<div className="overflow-x-auto">
<table className="w-full">
<thead>
<tr className="border-b border-border/60 bg-muted/50">
<th className="px-4 py-3 text-left text-xs font-semibold text-muted-foreground uppercase tracking-wider">{t("tokens.name")}</th>
<th className="px-4 py-3 text-left text-xs font-semibold text-muted-foreground uppercase tracking-wider">{t("tokens.user")}</th>
<th className="px-4 py-3 text-left text-xs font-semibold text-muted-foreground uppercase tracking-wider">{t("tokens.scope")}</th>
<th className="px-4 py-3 text-left text-xs font-semibold text-muted-foreground uppercase tracking-wider">{t("tokens.prefix")}</th>
<th className="px-4 py-3 text-left text-xs font-semibold text-muted-foreground uppercase tracking-wider">{t("tokens.status")}</th>
<th className="px-4 py-3 text-left text-xs font-semibold text-muted-foreground uppercase tracking-wider">{t("tokens.actions")}</th>
</tr>
</thead>
<tbody className="divide-y divide-border/60">
{tokens.map((token) => (
<tr key={token.id} className="hover:bg-accent/50 transition-colors">
<td className="px-4 py-3 text-sm text-foreground">{token.name}</td>
<td className="px-4 py-3 text-sm">
<TokenUserSelect
tokenId={token.id}
currentUserId={token.user_id}
users={users}
action={reassignTokenAction}
noUserLabel={t("tokens.noUser")}
/>
</td>
<td className="px-4 py-3 text-sm">
<Badge variant={token.scope === "admin" ? "destructive" : "secondary"}>
{token.scope}
</Badge>
</td>
<td className="px-4 py-3 text-sm">
<code className="px-2 py-1 bg-muted rounded font-mono text-foreground">{token.prefix}</code>
</td>
<td className="px-4 py-3 text-sm">
{token.revoked_at ? (
<Badge variant="error">{t("tokens.revoked")}</Badge>
) : (
<Badge variant="success">{t("tokens.active")}</Badge>
)}
</td>
<td className="px-4 py-3">
{!token.revoked_at ? (
<form action={revokeTokenAction}>
<input type="hidden" name="id" value={token.id} />
<Button type="submit" variant="destructive" size="sm">
<svg className="w-4 h-4 mr-1" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M10 14l2-2m0 0l2-2m-2 2l-2-2m2 2l2 2m7-2a9 9 0 11-18 0 9 9 0 0118 0z" />
</svg>
{t("tokens.revoke")}
</Button>
</form>
) : (
<form action={deleteTokenAction}>
<input type="hidden" name="id" value={token.id} />
<Button type="submit" variant="destructive" size="sm">
<svg className="w-4 h-4 mr-1" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M19 7l-.867 12.142A2 2 0 0116.138 21H7.862a2 2 0 01-1.995-1.858L5 7m5 4v6m4-6v6m1-10V4a1 1 0 00-1-1h-4a1 1 0 00-1 1v3M4 7h16" />
</svg>
{t("common.delete")}
</Button>
</form>
)}
</td>
</tr>
))}
</tbody>
</table>
</div>
</Card>
</>
);
}
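The rename action above trims and validates the submitted username before calling updateUser. That guard can be factored into a small helper — a sketch with a hypothetical name, not part of this diff:

```typescript
// Hypothetical helper: extract and normalize a username from a submitted form.
// Returns null when the field is missing, not a string, or blank after trimming.
function normalizeUsername(formData: FormData): string | null {
  const raw = formData.get("username");
  if (typeof raw !== "string") return null;
  const trimmed = raw.trim();
  return trimmed.length > 0 ? trimmed : null;
}
```

With this in place, renameUserAction would reduce to a null-check followed by updateUser and revalidatePath.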

View File

@@ -0,0 +1,31 @@
import { NextRequest, NextResponse } from "next/server";
import { createSessionToken, SESSION_COOKIE } from "@/lib/session";
export async function POST(req: NextRequest) {
const body = await req.json().catch(() => null);
if (!body || typeof body.username !== "string" || typeof body.password !== "string") {
return NextResponse.json({ error: "Invalid request" }, { status: 400 });
}
const expectedUsername = process.env.ADMIN_USERNAME || "admin";
const expectedPassword = process.env.ADMIN_PASSWORD;
if (!expectedPassword) {
return NextResponse.json({ error: "Server misconfiguration" }, { status: 500 });
}
if (body.username !== expectedUsername || body.password !== expectedPassword) {
return NextResponse.json({ error: "Invalid credentials" }, { status: 401 });
}
const token = await createSessionToken();
const response = NextResponse.json({ success: true });
response.cookies.set(SESSION_COOKIE, token, {
httpOnly: true,
secure: process.env.NODE_ENV === "production",
sameSite: "lax",
maxAge: 7 * 24 * 60 * 60,
path: "/",
});
return response;
}
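The cookie options above are the standard hardening set for a session cookie; a small builder (hypothetical name, mirroring the values in the route) makes the intent of each flag explicit:

```typescript
// Hypothetical builder for the session-cookie options used in the login route.
interface SessionCookieOptions {
  httpOnly: boolean;
  secure: boolean;
  sameSite: "lax" | "strict" | "none";
  maxAge: number; // seconds, not milliseconds
  path: string;
}

function sessionCookieOptions(isProduction: boolean): SessionCookieOptions {
  return {
    httpOnly: true,           // invisible to client-side JS (mitigates XSS token theft)
    secure: isProduction,     // HTTPS-only outside local development
    sameSite: "lax",          // sent on top-level navigations, withheld on cross-site POSTs
    maxAge: 7 * 24 * 60 * 60, // 7 days
    path: "/",
  };
}
```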

View File

@@ -0,0 +1,8 @@
import { NextResponse } from "next/server";
import { SESSION_COOKIE } from "@/lib/session";
export async function POST() {
const response = NextResponse.json({ success: true });
response.cookies.delete(SESSION_COOKIE);
return response;
}

View File

@@ -28,12 +28,9 @@ export async function GET(
});
}
// Grab the content-type and stream the image body through with it
const contentType = response.headers.get("content-type") || "image/webp";
return new NextResponse(response.body, {
headers: {
"Content-Type": contentType,
"Cache-Control": "public, max-age=300",

View File

@@ -9,10 +9,25 @@ export async function GET(
try {
const { baseUrl, token } = config();
const ifNoneMatch = request.headers.get("if-none-match");
const fetchHeaders: Record<string, string> = {
Authorization: `Bearer ${token}`,
};
if (ifNoneMatch) {
fetchHeaders["If-None-Match"] = ifNoneMatch;
}
const response = await fetch(`${baseUrl}/books/${bookId}/thumbnail`, {
headers: fetchHeaders,
next: { revalidate: 86400 },
});
// Forward 304 Not Modified as-is
if (response.status === 304) {
return new NextResponse(null, { status: 304 });
}
if (!response.ok) {
return new NextResponse(`Failed to fetch thumbnail: ${response.status}`, {
status: response.status
@@ -20,14 +35,17 @@ export async function GET(
}
const contentType = response.headers.get("content-type") || "image/webp";
const etag = response.headers.get("etag");
const headers: Record<string, string> = {
"Content-Type": contentType,
"Cache-Control": "public, max-age=31536000, immutable",
};
if (etag) {
headers["ETag"] = etag;
}
return new NextResponse(response.body, { headers });
} catch (error) {
console.error("Error fetching thumbnail:", error);
return new NextResponse("Failed to fetch thumbnail", { status: 500 });

View File

@@ -11,6 +11,7 @@ export async function GET(request: NextRequest) {
let lastData: string | null = null;
let isActive = true;
let consecutiveErrors = 0;
let intervalId: ReturnType<typeof setInterval> | null = null;
const fetchJobs = async () => {
if (!isActive) return;
@@ -25,23 +26,28 @@ export async function GET(request: NextRequest) {
const data = await response.json();
const dataStr = JSON.stringify(data);
// Send only if data changed
if (dataStr !== lastData && isActive) {
lastData = dataStr;
try {
controller.enqueue(
new TextEncoder().encode(`data: ${dataStr}\n\n`)
);
} catch {
// Controller closed; stop polling
isActive = false;
}
}
// Adapt interval: 2s when active jobs exist, 15s when idle
const hasActiveJobs = data.some((j: { status: string }) =>
j.status === "running" || j.status === "pending" || j.status === "extracting_pages" || j.status === "generating_thumbnails"
);
const nextInterval = hasActiveJobs ? 2000 : 15000;
restartInterval(nextInterval);
}
} catch (error) {
if (isActive) {
consecutiveErrors++;
// Only log first failure and every 30th to avoid spam
if (consecutiveErrors === 1 || consecutiveErrors % 30 === 0) {
console.warn(`SSE fetch error (${consecutiveErrors} consecutive):`, error);
}
@@ -49,22 +55,18 @@ export async function GET(request: NextRequest) {
}
};
const restartInterval = (ms: number) => {
if (intervalId !== null) clearInterval(intervalId);
intervalId = setInterval(fetchJobs, ms);
};
// Initial fetch + start polling
await fetchJobs();
// Cleanup
request.signal.addEventListener("abort", () => {
isActive = false;
if (intervalId !== null) clearInterval(intervalId);
controller.close();
});
},
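The adaptive-interval rule buried in the stream handler is easiest to see in isolation. A sketch of just that decision, with names assumed from the route above:

```typescript
// Sketch: choose the next poll interval from the current job list.
// Active work polls every 2 s; an idle list backs off to 15 s.
const ACTIVE_STATUSES = new Set([
  "running",
  "pending",
  "extracting_pages",
  "generating_thumbnails",
]);

function nextPollInterval(jobs: { status: string }[]): number {
  const hasActiveJobs = jobs.some((j) => ACTIVE_STATUSES.has(j.status));
  return hasActiveJobs ? 2000 : 15000;
}
```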

View File

@@ -0,0 +1,16 @@
import { NextResponse, NextRequest } from "next/server";
import { getKomgaReport } from "@/lib/api";
export async function GET(
_request: NextRequest,
{ params }: { params: Promise<{ id: string }> },
) {
try {
const { id } = await params;
const data = await getKomgaReport(id);
return NextResponse.json(data);
} catch (error) {
const message = error instanceof Error ? error.message : "Failed to fetch report";
return NextResponse.json({ error: message }, { status: 500 });
}
}

View File

@@ -0,0 +1,12 @@
import { NextResponse } from "next/server";
import { listKomgaReports } from "@/lib/api";
export async function GET() {
try {
const data = await listKomgaReports();
return NextResponse.json(data);
} catch (error) {
const message = error instanceof Error ? error.message : "Failed to fetch reports";
return NextResponse.json({ error: message }, { status: 500 });
}
}
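The proxy routes in this diff all repeat the same try/catch envelope: call apiFetch, return JSON on success, and collapse failures into an { error } body with status 500. A generic sketch of that pattern (hypothetical helper, not present in the codebase):

```typescript
// Hypothetical helper: the shared success/error envelope used by the proxy routes.
async function proxyJson<T>(
  fetcher: () => Promise<T>,
  fallbackMessage: string,
): Promise<{ status: number; body: T | { error: string } }> {
  try {
    return { status: 200, body: await fetcher() };
  } catch (error) {
    const message = error instanceof Error ? error.message : fallbackMessage;
    return { status: 500, body: { error: message } };
  }
}
```

Each route would then reduce to one call: wrap the result of proxyJson in NextResponse.json with the computed status.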

View File

@@ -0,0 +1,16 @@
import { NextResponse, NextRequest } from "next/server";
import { apiFetch } from "@/lib/api";
export async function POST(request: NextRequest) {
try {
const body = await request.json();
const data = await apiFetch("/komga/sync", {
method: "POST",
body: JSON.stringify(body),
});
return NextResponse.json(data);
} catch (error) {
const message = error instanceof Error ? error.message : "Failed to sync with Komga";
return NextResponse.json({ error: message }, { status: 500 });
}
}

View File

@@ -0,0 +1,20 @@
import { NextRequest, NextResponse } from "next/server";
import { apiFetch, LibraryDto } from "@/lib/api";
export async function PATCH(
request: NextRequest,
{ params }: { params: Promise<{ id: string }> }
) {
const { id } = await params;
try {
const body = await request.json();
const data = await apiFetch<LibraryDto>(`/libraries/${id}/metadata-provider`, {
method: "PATCH",
body: JSON.stringify(body),
});
return NextResponse.json(data);
} catch (error) {
const message = error instanceof Error ? error.message : "Failed to update metadata provider";
return NextResponse.json({ error: message }, { status: 500 });
}
}

View File

@@ -7,8 +7,8 @@ export async function PATCH(
) {
const { id } = await params;
try {
const { monitor_enabled, scan_mode, watcher_enabled, metadata_refresh_mode } = await request.json();
const data = await updateLibraryMonitoring(id, monitor_enabled, scan_mode, watcher_enabled, metadata_refresh_mode);
return NextResponse.json(data);
} catch (error) {
const message = error instanceof Error ? error.message : "Failed to update monitoring settings";

View File

@@ -0,0 +1,17 @@
import { NextRequest, NextResponse } from "next/server";
import { apiFetch } from "@/lib/api";
export async function POST(request: NextRequest) {
try {
const body = await request.json();
const { id, ...rest } = body;
const data = await apiFetch<{ status: string; books_synced: number }>(`/metadata/approve/${id}`, {
method: "POST",
body: JSON.stringify(rest),
});
return NextResponse.json(data);
} catch (error) {
const message = error instanceof Error ? error.message : "Failed to approve metadata";
return NextResponse.json({ error: message }, { status: 500 });
}
}

View File

@@ -0,0 +1,17 @@
import { NextRequest, NextResponse } from "next/server";
import { apiFetch, MetadataBatchReportDto } from "@/lib/api";
export async function GET(request: NextRequest) {
try {
const { searchParams } = new URL(request.url);
const id = searchParams.get("id");
if (!id) {
return NextResponse.json({ error: "id is required" }, { status: 400 });
}
const data = await apiFetch<MetadataBatchReportDto>(`/metadata/batch/${id}/report`);
return NextResponse.json(data);
} catch (error) {
const message = error instanceof Error ? error.message : "Failed to fetch report";
return NextResponse.json({ error: message }, { status: 500 });
}
}

View File

@@ -0,0 +1,19 @@
import { NextRequest, NextResponse } from "next/server";
import { apiFetch, MetadataBatchResultDto } from "@/lib/api";
export async function GET(request: NextRequest) {
try {
const { searchParams } = new URL(request.url);
const id = searchParams.get("id");
if (!id) {
return NextResponse.json({ error: "id is required" }, { status: 400 });
}
const status = searchParams.get("status") || "";
const params = status ? `?status=${status}` : "";
const data = await apiFetch<MetadataBatchResultDto[]>(`/metadata/batch/${id}/results${params}`);
return NextResponse.json(data);
} catch (error) {
const message = error instanceof Error ? error.message : "Failed to fetch results";
return NextResponse.json({ error: message }, { status: 500 });
}
}

View File

@@ -0,0 +1,16 @@
import { NextRequest, NextResponse } from "next/server";
import { apiFetch } from "@/lib/api";
export async function POST(request: NextRequest) {
try {
const body = await request.json();
const data = await apiFetch<{ id: string; status: string }>("/metadata/batch", {
method: "POST",
body: JSON.stringify(body),
});
return NextResponse.json(data);
} catch (error) {
const message = error instanceof Error ? error.message : "Failed to start batch";
return NextResponse.json({ error: message }, { status: 500 });
}
}

View File

@@ -0,0 +1,35 @@
import { NextRequest, NextResponse } from "next/server";
import { apiFetch, ExternalMetadataLinkDto } from "@/lib/api";
export async function GET(request: NextRequest) {
try {
const { searchParams } = new URL(request.url);
const libraryId = searchParams.get("library_id") || "";
const seriesName = searchParams.get("series_name") || "";
const params = new URLSearchParams();
if (libraryId) params.set("library_id", libraryId);
if (seriesName) params.set("series_name", seriesName);
const data = await apiFetch<ExternalMetadataLinkDto[]>(`/metadata/links?${params.toString()}`);
return NextResponse.json(data);
} catch (error) {
const message = error instanceof Error ? error.message : "Failed to fetch metadata links";
return NextResponse.json({ error: message }, { status: 500 });
}
}
export async function DELETE(request: NextRequest) {
try {
const { searchParams } = new URL(request.url);
const id = searchParams.get("id");
if (!id) {
return NextResponse.json({ error: "id is required" }, { status: 400 });
}
const data = await apiFetch<{ deleted: boolean }>(`/metadata/links/${id}`, {
method: "DELETE",
});
return NextResponse.json(data);
} catch (error) {
const message = error instanceof Error ? error.message : "Failed to delete metadata link";
return NextResponse.json({ error: message }, { status: 500 });
}
}

View File

@@ -0,0 +1,16 @@
import { NextRequest, NextResponse } from "next/server";
import { apiFetch, ExternalMetadataLinkDto } from "@/lib/api";
export async function POST(request: NextRequest) {
try {
const body = await request.json();
const data = await apiFetch<ExternalMetadataLinkDto>("/metadata/match", {
method: "POST",
body: JSON.stringify(body),
});
return NextResponse.json(data);
} catch (error) {
const message = error instanceof Error ? error.message : "Failed to create metadata match";
return NextResponse.json({ error: message }, { status: 500 });
}
}

View File

@@ -0,0 +1,17 @@
import { NextRequest, NextResponse } from "next/server";
import { apiFetch, MissingBooksDto } from "@/lib/api";
export async function GET(request: NextRequest) {
try {
const { searchParams } = new URL(request.url);
const id = searchParams.get("id");
if (!id) {
return NextResponse.json({ error: "id is required" }, { status: 400 });
}
const data = await apiFetch<MissingBooksDto>(`/metadata/missing/${id}`);
return NextResponse.json(data);
} catch (error) {
const message = error instanceof Error ? error.message : "Failed to fetch missing books";
return NextResponse.json({ error: message }, { status: 500 });
}
}

View File

@@ -0,0 +1,16 @@
import { NextRequest, NextResponse } from "next/server";
import { apiFetch } from "@/lib/api";
export async function GET(request: NextRequest) {
try {
const jobId = request.nextUrl.searchParams.get("job_id");
if (!jobId) {
return NextResponse.json({ error: "job_id required" }, { status: 400 });
}
const data = await apiFetch(`/metadata/refresh/${jobId}/report`);
return NextResponse.json(data);
} catch (error) {
const message = error instanceof Error ? error.message : "Failed to get report";
return NextResponse.json({ error: message }, { status: 500 });
}
}

View File

@@ -0,0 +1,16 @@
import { NextRequest, NextResponse } from "next/server";
import { apiFetch } from "@/lib/api";
export async function POST(request: NextRequest) {
try {
const body = await request.json();
const data = await apiFetch<{ id: string; status: string }>("/metadata/refresh", {
method: "POST",
body: JSON.stringify(body),
});
return NextResponse.json(data);
} catch (error) {
const message = error instanceof Error ? error.message : "Failed to start refresh";
return NextResponse.json({ error: message }, { status: 500 });
}
}

View File

@@ -0,0 +1,15 @@
import { NextRequest, NextResponse } from "next/server";
import { apiFetch } from "@/lib/api";
export async function POST(request: NextRequest) {
try {
const body = await request.json();
const data = await apiFetch<{ status: string }>(`/metadata/reject/${body.id}`, {
method: "POST",
});
return NextResponse.json(data);
} catch (error) {
const message = error instanceof Error ? error.message : "Failed to reject metadata";
return NextResponse.json({ error: message }, { status: 500 });
}
}

View File

@@ -0,0 +1,16 @@
import { NextRequest, NextResponse } from "next/server";
import { apiFetch, SeriesCandidateDto } from "@/lib/api";
export async function POST(request: NextRequest) {
try {
const body = await request.json();
const data = await apiFetch<SeriesCandidateDto[]>("/metadata/search", {
method: "POST",
body: JSON.stringify(body),
});
return NextResponse.json(data);
} catch (error) {
const message = error instanceof Error ? error.message : "Failed to search metadata";
return NextResponse.json({ error: message }, { status: 500 });
}
}

View File

@@ -0,0 +1,16 @@
import { NextResponse, NextRequest } from "next/server";
import { apiFetch } from "@/lib/api";
export async function POST(request: NextRequest) {
try {
const body = await request.json();
const data = await apiFetch("/prowlarr/search", {
method: "POST",
body: JSON.stringify(body),
});
return NextResponse.json(data);
} catch (error) {
const message = error instanceof Error ? error.message : "Failed to search Prowlarr";
return NextResponse.json({ error: message }, { status: 500 });
}
}

View File

@@ -0,0 +1,12 @@
import { NextResponse } from "next/server";
import { apiFetch } from "@/lib/api";
export async function GET() {
try {
const data = await apiFetch("/prowlarr/test");
return NextResponse.json(data);
} catch (error) {
const message = error instanceof Error ? error.message : "Failed to test Prowlarr connection";
return NextResponse.json({ error: message }, { status: 500 });
}
}

View File

@@ -0,0 +1,16 @@
import { NextResponse, NextRequest } from "next/server";
import { apiFetch } from "@/lib/api";
export async function POST(request: NextRequest) {
try {
const body = await request.json();
const data = await apiFetch("/qbittorrent/add", {
method: "POST",
body: JSON.stringify(body),
});
return NextResponse.json(data);
} catch (error) {
const message = error instanceof Error ? error.message : "Failed to add torrent";
return NextResponse.json({ error: message }, { status: 500 });
}
}

View File

@@ -0,0 +1,12 @@
import { NextResponse } from "next/server";
import { apiFetch } from "@/lib/api";
export async function GET() {
try {
const data = await apiFetch("/qbittorrent/test");
return NextResponse.json(data);
} catch (error) {
const message = error instanceof Error ? error.message : "Failed to test qBittorrent";
return NextResponse.json({ error: message }, { status: 500 });
}
}

View File

@@ -0,0 +1,11 @@
import { NextResponse } from "next/server";
import { apiFetch } from "@/lib/api";
export async function GET() {
try {
const data = await apiFetch<string[]>("/series/provider-statuses");
return NextResponse.json(data);
} catch {
return NextResponse.json([], { status: 200 });
}
}

View File

@@ -0,0 +1,11 @@
import { NextResponse } from "next/server";
import { apiFetch } from "@/lib/api";
export async function GET() {
try {
const data = await apiFetch<string[]>("/series/statuses");
return NextResponse.json(data);
} catch {
return NextResponse.json([], { status: 200 });
}
}

View File

@@ -0,0 +1,17 @@
import { NextRequest, NextResponse } from "next/server";
import { apiFetch } from "@/lib/api";
export async function DELETE(
_request: NextRequest,
{ params }: { params: Promise<{ id: string }> }
) {
const { id } = await params;
try {
const data = await apiFetch<unknown>(`/settings/status-mappings/${id}`, {
method: "DELETE",
});
return NextResponse.json(data);
} catch {
return NextResponse.json({ error: "Failed to delete status mapping" }, { status: 500 });
}
}

View File

@@ -0,0 +1,24 @@
import { NextRequest, NextResponse } from "next/server";
import { apiFetch } from "@/lib/api";
export async function GET() {
try {
const data = await apiFetch<unknown>("/settings/status-mappings");
return NextResponse.json(data);
} catch {
return NextResponse.json({ error: "Failed to fetch status mappings" }, { status: 500 });
}
}
export async function POST(request: NextRequest) {
try {
const body = await request.json();
const data = await apiFetch<unknown>("/settings/status-mappings", {
method: "POST",
body: JSON.stringify(body),
});
return NextResponse.json(data);
} catch {
return NextResponse.json({ error: "Failed to save status mapping" }, { status: 500 });
}
}

View File

@@ -0,0 +1,12 @@
import { NextResponse } from "next/server";
import { apiFetch } from "@/lib/api";
export async function GET() {
try {
const data = await apiFetch("/telegram/test");
return NextResponse.json(data);
} catch (error) {
const message = error instanceof Error ? error.message : "Failed to test Telegram connection";
return NextResponse.json({ error: message }, { status: 500 });
}
}

View File

@@ -1,226 +0,0 @@
import { fetchLibraries, getBookCoverUrl, BookDto, apiFetch, ReadingStatus } from "../../../lib/api";
import { BookPreview } from "../../components/BookPreview";
import { ConvertButton } from "../../components/ConvertButton";
import { MarkBookReadButton } from "../../components/MarkBookReadButton";
import { EditBookForm } from "../../components/EditBookForm";
import Image from "next/image";
import Link from "next/link";
import { notFound } from "next/navigation";
export const dynamic = "force-dynamic";
const readingStatusConfig: Record<ReadingStatus, { label: string; className: string }> = {
unread: { label: "Non lu", className: "bg-muted/60 text-muted-foreground border border-border" },
reading: { label: "En cours", className: "bg-amber-500/15 text-amber-600 dark:text-amber-400 border border-amber-500/30" },
read: { label: "Lu", className: "bg-green-500/15 text-green-600 dark:text-green-400 border border-green-500/30" },
};
function ReadingStatusBadge({
status,
currentPage,
lastReadAt,
}: {
status: ReadingStatus;
currentPage: number | null;
lastReadAt: string | null;
}) {
const { label, className } = readingStatusConfig[status];
return (
<div className="flex items-center gap-2">
<span className={`inline-flex items-center px-2.5 py-0.5 rounded-full text-xs font-semibold ${className}`}>
{label}
{status === "reading" && currentPage != null && ` · p. ${currentPage}`}
</span>
{lastReadAt && (
<span className="text-xs text-muted-foreground">
{new Date(lastReadAt).toLocaleDateString()}
</span>
)}
</div>
);
}
async function fetchBook(bookId: string): Promise<BookDto | null> {
try {
return await apiFetch<BookDto>(`/books/${bookId}`);
} catch {
return null;
}
}
export default async function BookDetailPage({
params
}: {
params: Promise<{ id: string }>;
}) {
const { id } = await params;
const [book, libraries] = await Promise.all([
fetchBook(id),
fetchLibraries().catch(() => [] as { id: string; name: string }[])
]);
if (!book) {
notFound();
}
const library = libraries.find(l => l.id === book.library_id);
return (
<>
<div className="mb-6">
<Link href="/books" className="inline-flex items-center text-sm text-muted-foreground hover:text-primary transition-colors">
Back to books
</Link>
</div>
<div className="flex flex-col lg:flex-row gap-8">
<div className="flex-shrink-0">
<div className="bg-card rounded-xl shadow-card border border-border p-4 inline-block">
<Image
src={getBookCoverUrl(book.id)}
alt={`Cover of ${book.title}`}
width={300}
height={440}
className="w-auto h-auto max-w-[300px] rounded-lg"
unoptimized
loading="lazy"
/>
</div>
</div>
<div className="flex-1">
<div className="bg-card rounded-xl shadow-sm border border-border p-6">
<div className="flex items-start justify-between gap-4 mb-2">
<h1 className="text-3xl font-bold text-foreground">{book.title}</h1>
<EditBookForm book={book} />
</div>
{book.author && (
<p className="text-lg text-muted-foreground mb-4">by {book.author}</p>
)}
{book.series && (
<p className="text-sm text-muted-foreground mb-6">
{book.series}
{book.volume && <span className="ml-2 px-2 py-1 bg-primary/10 text-primary rounded text-xs">Volume {book.volume}</span>}
</p>
)}
<div className="space-y-3">
{book.reading_status && (
<div className="flex items-center justify-between py-2 border-b border-border">
<span className="text-sm text-muted-foreground">Lecture :</span>
<div className="flex items-center gap-3">
<ReadingStatusBadge
status={book.reading_status}
currentPage={book.reading_current_page ?? null}
lastReadAt={book.reading_last_read_at ?? null}
/>
<MarkBookReadButton bookId={book.id} currentStatus={book.reading_status} />
</div>
</div>
)}
<div className="flex items-center justify-between py-2 border-b border-border">
<span className="text-sm text-muted-foreground">Format:</span>
<span className={`inline-flex px-2.5 py-1 rounded-full text-xs font-semibold ${
(book.format ?? book.kind) === 'cbz' ? 'bg-success/10 text-success' :
(book.format ?? book.kind) === 'cbr' ? 'bg-warning/10 text-warning' :
(book.format ?? book.kind) === 'pdf' ? 'bg-destructive/10 text-destructive' :
'bg-muted/50 text-muted-foreground'
}`}>
{(book.format ?? book.kind).toUpperCase()}
</span>
</div>
{book.volume && (
<div className="flex items-center justify-between py-2 border-b border-border">
<span className="text-sm text-muted-foreground">Volume:</span>
<span className="text-sm text-foreground">{book.volume}</span>
</div>
)}
{book.language && (
<div className="flex items-center justify-between py-2 border-b border-border">
<span className="text-sm text-muted-foreground">Language:</span>
<span className="text-sm text-foreground">{book.language.toUpperCase()}</span>
</div>
)}
{book.page_count && (
<div className="flex items-center justify-between py-2 border-b border-border">
<span className="text-sm text-muted-foreground">Pages:</span>
<span className="text-sm text-foreground">{book.page_count}</span>
</div>
)}
<div className="flex items-center justify-between py-2 border-b border-border">
<span className="text-sm text-muted-foreground">Library:</span>
<span className="text-sm text-foreground">{library?.name || book.library_id}</span>
</div>
{book.series && (
<div className="flex items-center justify-between py-2 border-b border-border">
<span className="text-sm text-muted-foreground">Series:</span>
<span className="text-sm text-foreground">{book.series}</span>
</div>
)}
{book.file_format && (
<div className="flex items-center justify-between py-2 border-b border-border">
<span className="text-sm text-muted-foreground">File Format:</span>
<div className="flex items-center gap-3">
<span className="text-sm text-foreground">{book.file_format.toUpperCase()}</span>
{book.file_format === "cbr" && <ConvertButton bookId={book.id} />}
</div>
</div>
)}
{book.file_parse_status && (
<div className="flex items-center justify-between py-2 border-b border-border">
<span className="text-sm text-muted-foreground">Parse Status:</span>
<span className={`inline-flex px-2.5 py-1 rounded-full text-xs font-semibold ${
book.file_parse_status === 'success' ? 'bg-success/10 text-success' :
book.file_parse_status === 'failed' ? 'bg-destructive/10 text-destructive' : 'bg-muted/50 text-muted-foreground'
}`}>
{book.file_parse_status}
</span>
</div>
)}
{book.file_path && (
<div className="flex flex-col py-2 border-b border-border">
<span className="text-sm text-muted-foreground mb-1">File Path:</span>
<code className="text-xs font-mono text-foreground break-all">{book.file_path}</code>
</div>
)}
<div className="flex flex-col py-2 border-b border-border">
<span className="text-sm text-muted-foreground mb-1">Book ID:</span>
<code className="text-xs font-mono text-foreground break-all">{book.id}</code>
</div>
<div className="flex flex-col py-2 border-b border-border">
<span className="text-sm text-muted-foreground mb-1">Library ID:</span>
<code className="text-xs font-mono text-foreground break-all">{book.library_id}</code>
</div>
{book.updated_at && (
<div className="flex items-center justify-between py-2">
<span className="text-sm text-muted-foreground">Updated:</span>
<span className="text-sm text-foreground">{new Date(book.updated_at).toLocaleString()}</span>
</div>
)}
</div>
</div>
</div>
</div>
{book.page_count && book.page_count > 0 && (
<div className="mt-8">
<BookPreview bookId={book.id} pageCount={book.page_count} />
</div>
)}
</>
);
}

View File

@@ -1,14 +1,15 @@
"use client";
import { useState } from "react";
import { memo, useState } from "react";
import Image from "next/image";
import Link from "next/link";
import { BookDto, ReadingStatus } from "../../lib/api";
import { useTranslation } from "../../lib/i18n/context";
const readingStatusOverlay: Record<ReadingStatus, { label: string; className: string } | null> = {
const readingStatusOverlayClasses: Record<ReadingStatus, string | null> = {
unread: null,
reading: { label: "En cours", className: "bg-amber-500/90 text-white" },
read: { label: "Lu", className: "bg-green-600/90 text-white" },
reading: "bg-amber-500/90 text-white",
read: "bg-green-600/90 text-white",
};
interface BookCardProps {
@@ -16,7 +17,7 @@ interface BookCardProps {
readingStatus?: ReadingStatus;
}
function BookImage({ src, alt }: { src: string; alt: string }) {
const BookImage = memo(function BookImage({ src, alt }: { src: string; alt: string }) {
const [isLoaded, setIsLoaded] = useState(false);
const [hasError, setHasError] = useState(false);
@@ -50,16 +51,21 @@ function BookImage({ src, alt }: { src: string; alt: string }) {
sizes="(max-width: 640px) 50vw, (max-width: 768px) 33vw, (max-width: 1024px) 25vw, 16vw"
onLoad={() => setIsLoaded(true)}
onError={() => setHasError(true)}
unoptimized
/>
</div>
);
}
});
export function BookCard({ book, readingStatus }: BookCardProps) {
export const BookCard = memo(function BookCard({ book, readingStatus }: BookCardProps) {
const { t } = useTranslation();
const coverUrl = book.coverUrl || `/api/books/${book.id}/thumbnail`;
const status = readingStatus ?? book.reading_status;
const overlay = status ? readingStatusOverlay[status] : null;
const overlayClass = status ? readingStatusOverlayClasses[status] : null;
const statusLabels: Record<ReadingStatus, string> = {
unread: t("status.unread"),
reading: t("status.reading"),
read: t("status.read"),
};
const isRead = status === "read";
@@ -71,11 +77,11 @@ export function BookCard({ book, readingStatus }: BookCardProps) {
<div className="relative">
<BookImage
src={coverUrl}
alt={`Cover of ${book.title}`}
alt={t("books.coverOf", { name: book.title })}
/>
{overlay && (
<span className={`absolute bottom-2 left-2 px-2 py-0.5 rounded-full text-[10px] font-bold tracking-wide ${overlay.className}`}>
{overlay.label}
{overlayClass && status && (
<span className={`absolute bottom-2 left-2 px-2 py-0.5 rounded-full text-[10px] font-bold tracking-wide ${overlayClass}`}>
{statusLabels[status]}
</span>
)}
</div>
@@ -108,6 +114,7 @@ export function BookCard({ book, readingStatus }: BookCardProps) {
${(book.format ?? book.kind) === 'cbz' ? 'bg-success/10 text-success' : ''}
${(book.format ?? book.kind) === 'cbr' ? 'bg-warning/10 text-warning' : ''}
${(book.format ?? book.kind) === 'pdf' ? 'bg-destructive/10 text-destructive' : ''}
${(book.format ?? book.kind) === 'epub' ? 'bg-info/10 text-info' : ''}
`}>
{book.format ?? book.kind}
</span>
@@ -121,7 +128,7 @@ export function BookCard({ book, readingStatus }: BookCardProps) {
</div>
</Link>
);
}
});
interface BooksGridProps {
books: (BookDto & { coverUrl?: string })[];

View File

@@ -2,10 +2,12 @@
import { useState } from "react";
import Image from "next/image";
import { useTranslation } from "../../lib/i18n/context";
const PAGE_SIZE = 5;
export function BookPreview({ bookId, pageCount }: { bookId: string; pageCount: number }) {
const { t } = useTranslation();
const [offset, setOffset] = useState(0);
const pages = Array.from({ length: PAGE_SIZE }, (_, i) => offset + i + 1).filter(
@@ -16,9 +18,9 @@ export function BookPreview({ bookId, pageCount }: { bookId: string; pageCount:
<div className="bg-card rounded-xl border border-border p-6">
<div className="flex items-center justify-between mb-4">
<h2 className="text-lg font-semibold text-foreground">
Preview
{t("bookPreview.preview")}
<span className="ml-2 text-sm font-normal text-muted-foreground">
pages {offset + 1}–{Math.min(offset + PAGE_SIZE, pageCount)} / {pageCount}
{t("bookPreview.pages", { start: offset + 1, end: Math.min(offset + PAGE_SIZE, pageCount), total: pageCount })}
</span>
</h2>
<div className="flex gap-2">
@@ -27,14 +29,14 @@ export function BookPreview({ bookId, pageCount }: { bookId: string; pageCount:
disabled={offset === 0}
className="px-3 py-1.5 text-sm rounded-lg border border-border bg-muted/50 text-foreground hover:bg-muted disabled:opacity-40 disabled:cursor-not-allowed transition-colors"
>
Prev
{t("bookPreview.prev")}
</button>
<button
onClick={() => setOffset((o) => Math.min(o + PAGE_SIZE, pageCount - 1))}
disabled={offset + PAGE_SIZE >= pageCount}
className="px-3 py-1.5 text-sm rounded-lg border border-border bg-muted/50 text-foreground hover:bg-muted disabled:opacity-40 disabled:cursor-not-allowed transition-colors"
>
Next
{t("bookPreview.next")}
</button>
</div>
</div>

View File

@@ -3,6 +3,7 @@
import { useState } from "react";
import Link from "next/link";
import { Button } from "./ui";
import { useTranslation } from "../../lib/i18n/context";
interface ConvertButtonProps {
bookId: string;
@@ -15,6 +16,7 @@ type ConvertState =
| { type: "error"; message: string };
export function ConvertButton({ bookId }: ConvertButtonProps) {
const { t } = useTranslation();
const [state, setState] = useState<ConvertState>({ type: "idle" });
const handleConvert = async () => {
@@ -23,22 +25,22 @@ export function ConvertButton({ bookId }: ConvertButtonProps) {
const res = await fetch(`/api/books/${bookId}/convert`, { method: "POST" });
if (!res.ok) {
const body = await res.json().catch(() => ({ error: res.statusText }));
setState({ type: "error", message: body.error || "Conversion failed" });
setState({ type: "error", message: body.error || t("convert.failed") });
return;
}
const job = await res.json();
setState({ type: "success", jobId: job.id });
} catch (err) {
setState({ type: "error", message: err instanceof Error ? err.message : "Unknown error" });
setState({ type: "error", message: err instanceof Error ? err.message : t("convert.unknownError") });
}
};
if (state.type === "success") {
return (
<div className="flex items-center gap-2 text-sm text-success">
<span>Conversion started.</span>
<span>{t("convert.started")}</span>
<Link href={`/jobs/${state.jobId}`} className="text-primary hover:underline font-medium">
View job
{t("convert.viewJob")}
</Link>
</div>
);
@@ -52,7 +54,7 @@ export function ConvertButton({ bookId }: ConvertButtonProps) {
className="text-xs text-muted-foreground hover:underline text-left"
onClick={() => setState({ type: "idle" })}
>
Dismiss
{t("common.close")}
</button>
</div>
);
@@ -65,7 +67,7 @@ export function ConvertButton({ bookId }: ConvertButtonProps) {
onClick={handleConvert}
disabled={state.type === "loading"}
>
{state.type === "loading" ? "Converting" : "Convert to CBZ"}
{state.type === "loading" ? t("convert.converting") : t("convert.convertToCbz")}
</Button>
);
}

View File

@@ -0,0 +1,231 @@
"use client";
import {
PieChart, Pie, Cell, ResponsiveContainer, Tooltip,
BarChart, Bar, XAxis, YAxis, CartesianGrid,
AreaChart, Area, Line, LineChart,
Legend,
} from "recharts";
// ---------------------------------------------------------------------------
// Donut
// ---------------------------------------------------------------------------
export function RcDonutChart({
data,
noDataLabel,
}: {
data: { name: string; value: number; color: string }[];
noDataLabel?: string;
}) {
const total = data.reduce((s, d) => s + d.value, 0);
if (total === 0) return <p className="text-muted-foreground text-sm text-center py-8">{noDataLabel}</p>;
return (
<div className="flex items-center gap-4">
<ResponsiveContainer width={130} height={130}>
<PieChart>
<Pie
data={data}
cx="50%"
cy="50%"
innerRadius={32}
outerRadius={55}
dataKey="value"
strokeWidth={0}
>
{data.map((d, i) => (
<Cell key={i} fill={d.color} />
))}
</Pie>
<Tooltip
formatter={(value) => value}
contentStyle={{ backgroundColor: "var(--color-card)", border: "1px solid var(--color-border)", borderRadius: 8, fontSize: 12 }}
/>
</PieChart>
</ResponsiveContainer>
<div className="flex flex-col gap-1.5 min-w-0">
{data.map((d, i) => (
<div key={i} className="flex items-center gap-2 text-sm">
<span className="w-3 h-3 rounded-full shrink-0" style={{ backgroundColor: d.color }} />
<span className="text-muted-foreground truncate">{d.name}</span>
<span className="font-medium text-foreground ml-auto">{d.value}</span>
</div>
))}
</div>
</div>
);
}
// ---------------------------------------------------------------------------
// Bar chart
// ---------------------------------------------------------------------------
export function RcBarChart({
data,
color = "hsl(198 78% 37%)",
noDataLabel,
}: {
data: { label: string; value: number }[];
color?: string;
noDataLabel?: string;
}) {
if (data.length === 0) return <p className="text-muted-foreground text-sm text-center py-8">{noDataLabel}</p>;
return (
<ResponsiveContainer width="100%" height={180}>
<BarChart data={data} margin={{ top: 5, right: 5, bottom: 0, left: -20 }}>
<CartesianGrid strokeDasharray="3 3" vertical={false} stroke="var(--color-border)" opacity={0.3} />
<XAxis dataKey="label" tick={{ fontSize: 11, fill: "var(--color-muted-foreground)" }} axisLine={false} tickLine={false} />
<YAxis tick={{ fontSize: 11, fill: "var(--color-muted-foreground)" }} axisLine={false} tickLine={false} allowDecimals={false} />
<Tooltip
contentStyle={{ backgroundColor: "var(--color-card)", border: "1px solid var(--color-border)", borderRadius: 8, fontSize: 12 }}
/>
<Bar dataKey="value" fill={color} radius={[4, 4, 0, 0]} />
</BarChart>
</ResponsiveContainer>
);
}
// ---------------------------------------------------------------------------
// Area / Line chart
// ---------------------------------------------------------------------------
export function RcAreaChart({
data,
color = "hsl(142 60% 45%)",
noDataLabel,
}: {
data: { label: string; value: number }[];
color?: string;
noDataLabel?: string;
}) {
if (data.length === 0) return <p className="text-muted-foreground text-sm text-center py-8">{noDataLabel}</p>;
return (
<ResponsiveContainer width="100%" height={180}>
<AreaChart data={data} margin={{ top: 5, right: 5, bottom: 0, left: -20 }}>
<defs>
<linearGradient id="areaGradient" x1="0" y1="0" x2="0" y2="1">
<stop offset="0%" stopColor={color} stopOpacity={0.3} />
<stop offset="100%" stopColor={color} stopOpacity={0} />
</linearGradient>
</defs>
<CartesianGrid strokeDasharray="3 3" vertical={false} stroke="var(--color-border)" opacity={0.3} />
<XAxis dataKey="label" tick={{ fontSize: 11, fill: "var(--color-muted-foreground)" }} axisLine={false} tickLine={false} />
<YAxis tick={{ fontSize: 11, fill: "var(--color-muted-foreground)" }} axisLine={false} tickLine={false} allowDecimals={false} />
<Tooltip
contentStyle={{ backgroundColor: "var(--color-card)", border: "1px solid var(--color-border)", borderRadius: 8, fontSize: 12 }}
/>
<Area type="monotone" dataKey="value" stroke={color} strokeWidth={2} fill="url(#areaGradient)" dot={{ r: 3, fill: color }} />
</AreaChart>
</ResponsiveContainer>
);
}
// ---------------------------------------------------------------------------
// Horizontal stacked bar (libraries breakdown)
// ---------------------------------------------------------------------------
export function RcStackedBar({
data,
labels,
}: {
data: { name: string; read: number; reading: number; unread: number; sizeLabel: string }[];
labels: { read: string; reading: string; unread: string; books: string };
}) {
if (data.length === 0) return null;
return (
<ResponsiveContainer width="100%" height={data.length * 60 + 30}>
<BarChart data={data} layout="vertical" margin={{ top: 0, right: 5, bottom: 0, left: 5 }}>
<CartesianGrid strokeDasharray="3 3" horizontal={false} stroke="var(--color-border)" opacity={0.3} />
<XAxis type="number" tick={{ fontSize: 11, fill: "var(--color-muted-foreground)" }} axisLine={false} tickLine={false} allowDecimals={false} />
<YAxis type="category" dataKey="name" tick={{ fontSize: 12, fill: "var(--color-foreground)" }} axisLine={false} tickLine={false} width={120} />
<Tooltip
contentStyle={{ backgroundColor: "var(--color-card)", border: "1px solid var(--color-border)", borderRadius: 8, fontSize: 12 }}
/>
<Legend
wrapperStyle={{ fontSize: 11 }}
formatter={(value: string) => <span className="text-muted-foreground">{value}</span>}
/>
<Bar dataKey="read" stackId="a" fill="hsl(142 60% 45%)" name={labels.read} radius={[0, 0, 0, 0]} />
<Bar dataKey="reading" stackId="a" fill="hsl(45 93% 47%)" name={labels.reading} />
<Bar dataKey="unread" stackId="a" fill="hsl(220 13% 70%)" name={labels.unread} radius={[0, 4, 4, 0]} />
</BarChart>
</ResponsiveContainer>
);
}
// ---------------------------------------------------------------------------
// Horizontal bar chart (top series)
// ---------------------------------------------------------------------------
export function RcHorizontalBar({
data,
color = "hsl(142 60% 45%)",
noDataLabel,
}: {
data: { name: string; value: number; subLabel: string }[];
color?: string;
noDataLabel?: string;
}) {
if (data.length === 0) return <p className="text-muted-foreground text-sm text-center py-4">{noDataLabel}</p>;
return (
<ResponsiveContainer width="100%" height={data.length * 40 + 10}>
<BarChart data={data} layout="vertical" margin={{ top: 0, right: 5, bottom: 0, left: 5 }}>
<CartesianGrid strokeDasharray="3 3" horizontal={false} stroke="var(--color-border)" opacity={0.3} />
<XAxis type="number" tick={{ fontSize: 11, fill: "var(--color-muted-foreground)" }} axisLine={false} tickLine={false} allowDecimals={false} />
<YAxis type="category" dataKey="name" tick={{ fontSize: 11, fill: "var(--color-foreground)" }} axisLine={false} tickLine={false} width={120} />
<Tooltip
contentStyle={{ backgroundColor: "var(--color-card)", border: "1px solid var(--color-border)", borderRadius: 8, fontSize: 12 }}
/>
<Bar dataKey="value" fill={color} radius={[0, 4, 4, 0]} />
</BarChart>
</ResponsiveContainer>
);
}
// ---------------------------------------------------------------------------
// Multi-line chart (jobs over time)
// ---------------------------------------------------------------------------
export function RcMultiLineChart({
data,
lines,
noDataLabel,
}: {
data: Record<string, unknown>[];
lines: { key: string; label: string; color: string }[];
noDataLabel?: string;
}) {
const hasData = data.some((d) => lines.some((l) => (d[l.key] as number) > 0));
if (data.length === 0 || !hasData)
return <p className="text-muted-foreground text-sm text-center py-8">{noDataLabel}</p>;
return (
<ResponsiveContainer width="100%" height={180}>
<LineChart data={data} margin={{ top: 5, right: 5, bottom: 0, left: -20 }}>
<CartesianGrid strokeDasharray="3 3" vertical={false} stroke="var(--color-border)" opacity={0.3} />
<XAxis dataKey="label" tick={{ fontSize: 11, fill: "var(--color-muted-foreground)" }} axisLine={false} tickLine={false} />
<YAxis tick={{ fontSize: 11, fill: "var(--color-muted-foreground)" }} axisLine={false} tickLine={false} allowDecimals={false} />
<Tooltip
contentStyle={{ backgroundColor: "var(--color-card)", border: "1px solid var(--color-border)", borderRadius: 8, fontSize: 12 }}
/>
<Legend wrapperStyle={{ fontSize: 11 }} />
{lines.map((l) => (
<Line
key={l.key}
type="monotone"
dataKey={l.key}
name={l.label}
stroke={l.color}
strokeWidth={2}
dot={{ r: 3, fill: l.color }}
/>
))}
</LineChart>
</ResponsiveContainer>
);
}

View File

@@ -1,18 +1,56 @@
"use client";
import { useState, useTransition } from "react";
import { useState, useTransition, useEffect, useCallback } from "react";
import { createPortal } from "react-dom";
import { useRouter } from "next/navigation";
import { BookDto } from "@/lib/api";
import { FormField, FormLabel, FormInput } from "./ui/Form";
import { useTranslation } from "../../lib/i18n/context";
function LockButton({
locked,
onToggle,
disabled,
}: {
locked: boolean;
onToggle: () => void;
disabled?: boolean;
}) {
const { t } = useTranslation();
return (
<button
type="button"
onClick={onToggle}
disabled={disabled}
className={`p-1 rounded transition-colors ${
locked
? "text-amber-500 hover:text-amber-600"
: "text-muted-foreground/40 hover:text-muted-foreground"
}`}
title={locked ? t("editBook.lockedField") : t("editBook.clickToLock")}
>
{locked ? (
<svg className="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M12 15v2m-6 4h12a2 2 0 002-2v-6a2 2 0 00-2-2H6a2 2 0 00-2 2v6a2 2 0 002 2zm10-10V7a4 4 0 00-8 0v4h8z" />
</svg>
) : (
<svg className="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M8 11V7a4 4 0 118 0m-4 8v2m-6 4h12a2 2 0 002-2v-6a2 2 0 00-2-2H6a2 2 0 00-2 2v6a2 2 0 002 2z" />
</svg>
)}
</button>
);
}
interface EditBookFormProps {
book: BookDto;
}
export function EditBookForm({ book }: EditBookFormProps) {
const { t } = useTranslation();
const router = useRouter();
const [isPending, startTransition] = useTransition();
const [isEditing, setIsEditing] = useState(false);
const [isOpen, setIsOpen] = useState(false);
const [error, setError] = useState<string | null>(null);
const [title, setTitle] = useState(book.title);
@@ -22,6 +60,14 @@ export function EditBookForm({ book }: EditBookFormProps) {
const [series, setSeries] = useState(book.series ?? "");
const [volume, setVolume] = useState(book.volume?.toString() ?? "");
const [language, setLanguage] = useState(book.language ?? "");
const [summary, setSummary] = useState(book.summary ?? "");
const [isbn, setIsbn] = useState(book.isbn ?? "");
const [publishDate, setPublishDate] = useState(book.publish_date ?? "");
const [lockedFields, setLockedFields] = useState<Record<string, boolean>>(book.locked_fields ?? {});
const toggleLock = (field: string) => {
setLockedFields((prev) => ({ ...prev, [field]: !prev[field] }));
};
const addAuthor = () => {
const v = authorInput.trim();
@@ -43,16 +89,29 @@ export function EditBookForm({ book }: EditBookFormProps) {
}
};
const handleCancel = () => {
const handleClose = useCallback(() => {
setTitle(book.title);
setAuthors(book.authors ?? []);
setAuthorInput("");
setSeries(book.series ?? "");
setVolume(book.volume?.toString() ?? "");
setLanguage(book.language ?? "");
setSummary(book.summary ?? "");
setIsbn(book.isbn ?? "");
setPublishDate(book.publish_date ?? "");
setLockedFields(book.locked_fields ?? {});
setError(null);
setIsEditing(false);
setIsOpen(false);
}, [book]);
useEffect(() => {
if (!isOpen) return;
const handleKeyDown = (e: KeyboardEvent) => {
if (e.key === "Escape" && !isPending) handleClose();
};
document.addEventListener("keydown", handleKeyDown);
return () => document.removeEventListener("keydown", handleKeyDown);
}, [isOpen, isPending, handleClose]);
const handleSubmit = (e: React.FormEvent) => {
e.preventDefault();
@@ -75,50 +134,73 @@ export function EditBookForm({ book }: EditBookFormProps) {
series: series.trim() || null,
volume: volume.trim() ? parseInt(volume.trim(), 10) : null,
language: language.trim() || null,
summary: summary.trim() || null,
isbn: isbn.trim() || null,
publish_date: publishDate.trim() || null,
locked_fields: lockedFields,
}),
});
if (!res.ok) {
const data = await res.json();
setError(data.error ?? "Erreur lors de la sauvegarde");
setError(data.error ?? t("editBook.saveError"));
return;
}
setIsEditing(false);
setIsOpen(false);
router.refresh();
} catch {
setError("Erreur réseau");
setError(t("common.networkError"));
}
});
};
if (!isEditing) {
return (
const modal = isOpen ? createPortal(
<>
{/* Backdrop */}
<div
className="fixed inset-0 bg-black/30 backdrop-blur-sm z-50"
onClick={() => !isPending && handleClose()}
/>
{/* Modal */}
<div className="fixed inset-0 flex items-center justify-center z-50 p-4">
<div className="bg-card border border-border/50 rounded-xl shadow-2xl w-full max-w-2xl max-h-[90vh] overflow-y-auto animate-in fade-in zoom-in-95 duration-200">
{/* Header */}
<div className="flex items-center justify-between px-5 py-4 border-b border-border/50 bg-muted/30 sticky top-0 z-10">
<h3 className="font-semibold text-foreground">{t("editBook.editMetadata")}</h3>
<button
onClick={() => setIsEditing(true)}
className="inline-flex items-center gap-1.5 px-3 py-1.5 rounded-lg border border-border bg-card text-sm font-medium text-muted-foreground hover:text-foreground hover:border-primary transition-colors"
type="button"
onClick={handleClose}
disabled={isPending}
className="text-muted-foreground hover:text-foreground transition-colors p-1 hover:bg-accent rounded"
>
<span></span> Modifier
<svg className="w-5 h-5" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M6 18L18 6M6 6l12 12" />
</svg>
</button>
);
}
return (
<form onSubmit={handleSubmit} className="mt-4 p-4 border border-border rounded-xl bg-muted/30 space-y-4">
<h3 className="text-sm font-semibold text-foreground">Modifier les métadonnées</h3>
</div>
{/* Body */}
<form onSubmit={handleSubmit} className="p-5 space-y-5">
<div className="grid grid-cols-1 sm:grid-cols-2 gap-3">
<FormField className="sm:col-span-2">
<FormLabel required>Titre</FormLabel>
<div className="flex items-center gap-1">
<FormLabel required>{t("editBook.title")}</FormLabel>
<LockButton locked={!!lockedFields.title} onToggle={() => toggleLock("title")} disabled={isPending} />
</div>
<FormInput
value={title}
onChange={(e) => setTitle(e.target.value)}
disabled={isPending}
placeholder="Titre du livre"
placeholder={t("editBook.titlePlaceholder")}
/>
</FormField>
{/* Authors (multi-value) */}
<FormField className="sm:col-span-2">
<FormLabel>Auteur(s)</FormLabel>
<div className="flex items-center gap-1">
<FormLabel>{t("editBook.authors")}</FormLabel>
<LockButton locked={!!lockedFields.authors} onToggle={() => toggleLock("authors")} disabled={isPending} />
</div>
<div className="space-y-2">
{authors.length > 0 && (
<div className="flex flex-wrap gap-1.5">
@@ -133,7 +215,7 @@ export function EditBookForm({ book }: EditBookFormProps) {
onClick={() => removeAuthor(i)}
disabled={isPending}
className="hover:text-destructive transition-colors ml-0.5"
aria-label={`Supprimer ${a}`}
aria-label={t("editBook.removeAuthor", { name: a })}
>
×
</button>
@@ -148,7 +230,7 @@ export function EditBookForm({ book }: EditBookFormProps) {
onChange={(e) => setAuthorInput(e.target.value)}
onKeyDown={handleAuthorKeyDown}
disabled={isPending}
placeholder="Ajouter un auteur (Entrée pour valider)"
placeholder={t("editBook.addAuthor")}
className="flex h-10 flex-1 rounded-md border border-input bg-background px-3 py-2 text-sm shadow-sm transition-colors placeholder:text-muted-foreground/90 focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-ring focus-visible:ring-offset-2 disabled:cursor-not-allowed disabled:opacity-50"
/>
<button
@@ -164,59 +246,136 @@ export function EditBookForm({ book }: EditBookFormProps) {
</FormField>
<FormField>
<FormLabel>Langue</FormLabel>
<div className="flex items-center gap-1">
<FormLabel>{t("editBook.language")}</FormLabel>
<LockButton locked={!!lockedFields.language} onToggle={() => toggleLock("language")} disabled={isPending} />
</div>
<FormInput
value={language}
onChange={(e) => setLanguage(e.target.value)}
disabled={isPending}
placeholder="ex : fr, en, jp"
placeholder={t("editBook.languagePlaceholder")}
/>
</FormField>
<FormField>
<FormLabel>Série</FormLabel>
<div className="flex items-center gap-1">
<FormLabel>{t("editBook.series")}</FormLabel>
<LockButton locked={!!lockedFields.series} onToggle={() => toggleLock("series")} disabled={isPending} />
</div>
<FormInput
value={series}
onChange={(e) => setSeries(e.target.value)}
disabled={isPending}
placeholder="Nom de la série"
placeholder={t("editBook.seriesPlaceholder")}
/>
</FormField>
<FormField>
<FormLabel>Volume</FormLabel>
<div className="flex items-center gap-1">
<FormLabel>{t("editBook.volume")}</FormLabel>
<LockButton locked={!!lockedFields.volume} onToggle={() => toggleLock("volume")} disabled={isPending} />
</div>
<FormInput
type="number"
min="1"
value={volume}
onChange={(e) => setVolume(e.target.value)}
disabled={isPending}
placeholder="Numéro de volume"
placeholder={t("editBook.volumePlaceholder")}
/>
</FormField>
<FormField>
<div className="flex items-center gap-1">
<FormLabel>{t("editBook.isbn")}</FormLabel>
<LockButton locked={!!lockedFields.isbn} onToggle={() => toggleLock("isbn")} disabled={isPending} />
</div>
<FormInput
value={isbn}
onChange={(e) => setIsbn(e.target.value)}
disabled={isPending}
placeholder="ISBN"
/>
</FormField>
<FormField>
<div className="flex items-center gap-1">
<FormLabel>{t("editBook.publishDate")}</FormLabel>
<LockButton locked={!!lockedFields.publish_date} onToggle={() => toggleLock("publish_date")} disabled={isPending} />
</div>
<FormInput
value={publishDate}
onChange={(e) => setPublishDate(e.target.value)}
disabled={isPending}
placeholder={t("editBook.publishDatePlaceholder")}
/>
</FormField>
<FormField className="sm:col-span-2">
<div className="flex items-center gap-1">
<FormLabel>{t("editBook.description")}</FormLabel>
<LockButton locked={!!lockedFields.summary} onToggle={() => toggleLock("summary")} disabled={isPending} />
</div>
<textarea
value={summary}
onChange={(e) => setSummary(e.target.value)}
disabled={isPending}
placeholder={t("editBook.descriptionPlaceholder")}
rows={4}
className="flex w-full rounded-md border border-input bg-background px-3 py-2 text-sm shadow-sm transition-colors placeholder:text-muted-foreground/90 focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-ring focus-visible:ring-offset-2 disabled:cursor-not-allowed disabled:opacity-50 resize-y"
/>
</FormField>
</div>
{/* Lock legend */}
{Object.values(lockedFields).some(Boolean) && (
<p className="text-xs text-amber-500 flex items-center gap-1.5">
<svg className="w-3.5 h-3.5 shrink-0" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M12 15v2m-6 4h12a2 2 0 002-2v-6a2 2 0 00-2-2H6a2 2 0 00-2 2v6a2 2 0 002 2zm10-10V7a4 4 0 00-8 0v4h8z" />
</svg>
{t("editBook.lockedFieldsNote")}
</p>
)}
{error && (
<p className="text-xs text-destructive">{error}</p>
)}
<div className="flex items-center gap-2">
{/* Footer */}
<div className="flex items-center justify-end gap-2 pt-2 border-t border-border/50">
<button
type="button"
onClick={handleClose}
disabled={isPending}
className="px-4 py-1.5 rounded-lg border border-border bg-card text-sm font-medium text-muted-foreground hover:text-foreground transition-colors"
>
{t("common.cancel")}
</button>
<button
type="submit"
disabled={isPending || !title.trim()}
className="px-4 py-1.5 rounded-lg bg-primary text-white text-sm font-medium hover:bg-primary/90 disabled:opacity-50 disabled:cursor-not-allowed transition-colors"
>
{isPending ? "Sauvegarde…" : "Sauvegarder"}
</button>
<button
type="button"
onClick={handleCancel}
disabled={isPending}
className="px-4 py-1.5 rounded-lg border border-border bg-card text-sm font-medium text-muted-foreground hover:text-foreground transition-colors"
>
Annuler
{isPending ? t("editBook.savingLabel") : t("editBook.saveLabel")}
</button>
</div>
</form>
</div>
</div>
</>,
document.body
) : null;
return (
<>
<button
onClick={() => setIsOpen(true)}
className="inline-flex items-center gap-1.5 px-3 py-1.5 rounded-lg border border-border bg-card text-sm font-medium text-muted-foreground hover:text-foreground hover:border-primary transition-colors"
>
<span></span> {t("editBook.editMetadata")}
</button>
{modal}
</>
);
}

View File

@@ -1,8 +1,47 @@
"use client";
import { useState, useTransition } from "react";
import { useState, useTransition, useEffect, useCallback } from "react";
import { createPortal } from "react-dom";
import { useRouter } from "next/navigation";
import { FormField, FormLabel, FormInput } from "./ui/Form";
import { useTranslation } from "../../lib/i18n/context";
function LockButton({
locked,
onToggle,
disabled,
}: {
locked: boolean;
onToggle: () => void;
disabled?: boolean;
}) {
const { t } = useTranslation();
return (
<button
type="button"
onClick={onToggle}
disabled={disabled}
className={`p-1 rounded transition-colors ${
locked
? "text-amber-500 hover:text-amber-600"
: "text-muted-foreground/40 hover:text-muted-foreground"
}`}
title={locked ? t("editBook.lockedField") : t("editBook.clickToLock")}
>
{locked ? (
<svg className="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M12 15v2m-6 4h12a2 2 0 002-2v-6a2 2 0 00-2-2H6a2 2 0 00-2 2v6a2 2 0 002 2zm10-10V7a4 4 0 00-8 0v4h8z" />
</svg>
) : (
<svg className="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M8 11V7a4 4 0 118 0m-4 8v2m-6 4h12a2 2 0 002-2v-6a2 2 0 00-2-2H6a2 2 0 00-2 2v6a2 2 0 002 2z" />
</svg>
)}
</button>
);
}
const SERIES_STATUS_VALUES = ["", "ongoing", "ended", "hiatus", "cancelled", "upcoming"] as const;
interface EditSeriesFormProps {
libraryId: string;
@@ -13,6 +52,9 @@ interface EditSeriesFormProps {
currentBookLanguage: string | null;
currentDescription: string | null;
currentStartYear: number | null;
currentTotalVolumes: number | null;
currentStatus: string | null;
currentLockedFields: Record<string, boolean>;
}
export function EditSeriesForm({
@@ -24,10 +66,14 @@ export function EditSeriesForm({
currentBookLanguage,
currentDescription,
currentStartYear,
currentTotalVolumes,
currentStatus,
currentLockedFields,
}: EditSeriesFormProps) {
const { t } = useTranslation();
const router = useRouter();
const [isPending, startTransition] = useTransition();
const [isEditing, setIsEditing] = useState(false);
const [isOpen, setIsOpen] = useState(false);
const [error, setError] = useState<string | null>(null);
// Series-specific fields
@@ -40,12 +86,21 @@ export function EditSeriesForm({
const [publisherInputEl, setPublisherInputEl] = useState<HTMLInputElement | null>(null);
const [description, setDescription] = useState(currentDescription ?? "");
const [startYear, setStartYear] = useState(currentStartYear?.toString() ?? "");
const [totalVolumes, setTotalVolumes] = useState(currentTotalVolumes?.toString() ?? "");
const [status, setStatus] = useState(currentStatus ?? "");
// Lock states
const [lockedFields, setLockedFields] = useState<Record<string, boolean>>(currentLockedFields);
// Propagation to books (opt-in via button)
const [bookAuthor, setBookAuthor] = useState(currentBookAuthor ?? "");
const [bookLanguage, setBookLanguage] = useState(currentBookLanguage ?? "");
const [showApplyToBooks, setShowApplyToBooks] = useState(false);
const toggleLock = (field: string) => {
setLockedFields((prev) => ({ ...prev, [field]: !prev[field] }));
};
const addAuthor = () => {
const v = authorInput.trim();
if (v && !authors.includes(v)) {
@@ -86,7 +141,7 @@ export function EditSeriesForm({
}
};
const handleCancel = () => {
const handleClose = useCallback(() => {
setNewName(seriesName === "unclassified" ? "" : seriesName);
setAuthors(currentAuthors);
setAuthorInput("");
@@ -94,12 +149,24 @@ export function EditSeriesForm({
setPublisherInput("");
setDescription(currentDescription ?? "");
setStartYear(currentStartYear?.toString() ?? "");
setTotalVolumes(currentTotalVolumes?.toString() ?? "");
setStatus(currentStatus ?? "");
setLockedFields(currentLockedFields);
setShowApplyToBooks(false);
setBookAuthor(currentBookAuthor ?? "");
setBookLanguage(currentBookLanguage ?? "");
setError(null);
setIsEditing(false);
setIsOpen(false);
}, [seriesName, currentAuthors, currentPublishers, currentDescription, currentStartYear, currentTotalVolumes, currentBookAuthor, currentBookLanguage, currentLockedFields]);
useEffect(() => {
if (!isOpen) return;
const handleKeyDown = (e: KeyboardEvent) => {
if (e.key === "Escape" && !isPending) handleClose();
};
document.addEventListener("keydown", handleKeyDown);
return () => document.removeEventListener("keydown", handleKeyDown);
}, [isOpen, isPending, handleClose]);
const handleSubmit = (e: React.FormEvent) => {
e.preventDefault();
@@ -123,6 +190,9 @@ export function EditSeriesForm({
publishers: finalPublishers,
description: description.trim() || null,
start_year: startYear.trim() ? parseInt(startYear.trim(), 10) : null,
total_volumes: totalVolumes.trim() ? parseInt(totalVolumes.trim(), 10) : null,
status: status || null,
locked_fields: lockedFields,
};
if (showApplyToBooks) {
body.author = bookAuthor.trim() || null;
@@ -139,10 +209,10 @@ export function EditSeriesForm({
);
if (!res.ok) {
const data = await res.json();
setError(data.error ?? "Erreur lors de la sauvegarde");
setError(data.error ?? t("editBook.saveError"));
return;
}
setIsEditing(false);
setIsOpen(false);
if (effectiveName !== seriesName) {
router.push(`/libraries/${libraryId}/series/${encodeURIComponent(effectiveName)}` as any);
@@ -150,39 +220,55 @@ export function EditSeriesForm({
router.refresh();
}
} catch {
setError("Erreur réseau");
setError(t("common.networkError"));
}
});
};
if (!isEditing) {
return (
const modal = isOpen ? createPortal(
<>
{/* Backdrop */}
<div
className="fixed inset-0 bg-black/30 backdrop-blur-sm z-50"
onClick={() => !isPending && handleClose()}
/>
{/* Modal */}
<div className="fixed inset-0 flex items-center justify-center z-50 p-4">
<div className="bg-card border border-border/50 rounded-xl shadow-2xl w-full max-w-2xl max-h-[90vh] overflow-y-auto animate-in fade-in zoom-in-95 duration-200">
{/* Header */}
<div className="flex items-center justify-between px-5 py-4 border-b border-border/50 bg-muted/30 sticky top-0 z-10">
<h3 className="font-semibold text-foreground">{t("editSeries.title")}</h3>
<button
onClick={() => setIsEditing(true)}
className="inline-flex items-center gap-1.5 px-3 py-1.5 rounded-lg border border-border bg-card text-sm font-medium text-muted-foreground hover:text-foreground hover:border-primary transition-colors"
type="button"
onClick={handleClose}
disabled={isPending}
className="text-muted-foreground hover:text-foreground transition-colors p-1 hover:bg-accent rounded"
>
<span></span> Modifier la série
<svg className="w-5 h-5" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M6 18L18 6M6 6l12 12" />
</svg>
</button>
);
}
return (
<form onSubmit={handleSubmit} className="w-full p-4 border border-border rounded-xl bg-muted/30 space-y-5">
<h3 className="text-sm font-semibold text-foreground">Modifier les métadonnées de la série</h3>
</div>
{/* Body */}
<form onSubmit={handleSubmit} className="p-5 space-y-5">
<div className="grid grid-cols-1 sm:grid-cols-2 gap-3">
<FormField>
<FormLabel required>Nom</FormLabel>
<FormLabel required>{t("editSeries.name")}</FormLabel>
<FormInput
value={newName}
onChange={(e) => setNewName(e.target.value)}
disabled={isPending}
placeholder="Nom de la série"
placeholder={t("editSeries.namePlaceholder")}
/>
</FormField>
<FormField>
<FormLabel>Année de début</FormLabel>
<div className="flex items-center gap-1">
<FormLabel>{t("editSeries.startYear")}</FormLabel>
<LockButton locked={!!lockedFields.start_year} onToggle={() => toggleLock("start_year")} disabled={isPending} />
</div>
<FormInput
type="number"
min="1900"
@@ -190,13 +276,50 @@ export function EditSeriesForm({
value={startYear}
onChange={(e) => setStartYear(e.target.value)}
disabled={isPending}
placeholder="ex : 1990"
placeholder={t("editSeries.startYearPlaceholder")}
/>
</FormField>
<FormField>
<div className="flex items-center gap-1">
<FormLabel>{t("editSeries.totalVolumes")}</FormLabel>
<LockButton locked={!!lockedFields.total_volumes} onToggle={() => toggleLock("total_volumes")} disabled={isPending} />
</div>
<FormInput
type="number"
min="1"
value={totalVolumes}
onChange={(e) => setTotalVolumes(e.target.value)}
disabled={isPending}
placeholder="12"
/>
</FormField>
<FormField>
<div className="flex items-center gap-1">
<FormLabel>{t("editSeries.status")}</FormLabel>
<LockButton locked={!!lockedFields.status} onToggle={() => toggleLock("status")} disabled={isPending} />
</div>
<select
value={status}
onChange={(e) => setStatus(e.target.value)}
disabled={isPending}
className="w-full rounded-lg border border-border bg-background px-3 py-2 text-sm text-foreground focus:outline-none focus:ring-2 focus:ring-primary/40"
>
{SERIES_STATUS_VALUES.map((v) => (
<option key={v} value={v}>
{v === "" ? t("seriesStatus.notDefined") : t(`seriesStatus.${v}` as any)}
</option>
))}
</select>
</FormField>
{/* Authors (multi-value) */}
<FormField className="sm:col-span-2">
<FormLabel>Auteur(s)</FormLabel>
<div className="flex items-center gap-1">
<FormLabel>{t("editSeries.authors")}</FormLabel>
<LockButton locked={!!lockedFields.authors} onToggle={() => toggleLock("authors")} disabled={isPending} />
</div>
<div className="space-y-2">
{authors.length > 0 && (
<div className="flex flex-wrap gap-1.5">
@@ -211,7 +334,7 @@ export function EditSeriesForm({
onClick={() => removeAuthor(i)}
disabled={isPending}
className="hover:text-destructive transition-colors ml-0.5"
aria-label={`Supprimer ${a}`}
aria-label={t("editBook.removeAuthor", { name: a })}
>
×
</button>
@@ -226,7 +349,7 @@ export function EditSeriesForm({
onChange={(e) => setAuthorInput(e.target.value)}
onKeyDown={handleAuthorKeyDown}
disabled={isPending}
placeholder="Ajouter un auteur (Entrée pour valider)"
placeholder={t("editBook.addAuthor")}
className="flex h-10 flex-1 rounded-md border border-input bg-background px-3 py-2 text-sm shadow-sm transition-colors placeholder:text-muted-foreground/90 focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-ring focus-visible:ring-offset-2 disabled:cursor-not-allowed disabled:opacity-50"
/>
<button
@@ -246,9 +369,9 @@ export function EditSeriesForm({
? "border-primary bg-primary/10 text-primary"
: "border-border bg-card text-muted-foreground hover:text-foreground"
}`}
title="Appliquer auteur et langue à tous les livres de la série"
title={t("editSeries.applyToBooksTitle")}
>
livres
{t("editSeries.applyToBooks")}
</button>
</div>
</div>
@@ -257,21 +380,21 @@ export function EditSeriesForm({
{showApplyToBooks && (
<div className="sm:col-span-2 grid grid-cols-1 sm:grid-cols-2 gap-3 pl-4 border-l-2 border-primary/30">
<FormField>
<FormLabel>Auteur (livres)</FormLabel>
<FormLabel>{t("editSeries.bookAuthor")}</FormLabel>
<FormInput
value={bookAuthor}
onChange={(e) => setBookAuthor(e.target.value)}
disabled={isPending}
placeholder="Écrase le champ auteur de chaque livre"
placeholder={t("editSeries.bookAuthorPlaceholder")}
/>
</FormField>
<FormField>
<FormLabel>Langue (livres)</FormLabel>
<FormLabel>{t("editSeries.bookLanguage")}</FormLabel>
<FormInput
value={bookLanguage}
onChange={(e) => setBookLanguage(e.target.value)}
disabled={isPending}
placeholder="ex : fr, en, jp"
placeholder={t("editBook.languagePlaceholder")}
/>
</FormField>
</div>
@@ -279,7 +402,10 @@ export function EditSeriesForm({
{/* Publishers (multi-value) */}
<FormField className="sm:col-span-2">
<FormLabel>Éditeur(s)</FormLabel>
<div className="flex items-center gap-1">
<FormLabel>{t("editSeries.publishers")}</FormLabel>
<LockButton locked={!!lockedFields.publishers} onToggle={() => toggleLock("publishers")} disabled={isPending} />
</div>
<div className="space-y-2">
{publishers.length > 0 && (
<div className="flex flex-wrap gap-1.5">
@@ -294,7 +420,7 @@ export function EditSeriesForm({
onClick={() => removePublisher(i)}
disabled={isPending}
className="hover:text-destructive transition-colors ml-0.5"
aria-label={`Supprimer ${p}`}
aria-label={t("editBook.removeAuthor", { name: p })}
>
×
</button>
@@ -309,7 +435,7 @@ export function EditSeriesForm({
onChange={(e) => setPublisherInput(e.target.value)}
onKeyDown={handlePublisherKeyDown}
disabled={isPending}
placeholder="Ajouter un éditeur (Entrée pour valider)"
placeholder={t("editSeries.addPublisher")}
className="flex h-10 flex-1 rounded-md border border-input bg-background px-3 py-2 text-sm shadow-sm transition-colors placeholder:text-muted-foreground/90 focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-ring focus-visible:ring-offset-2 disabled:cursor-not-allowed disabled:opacity-50"
/>
<button
@@ -325,37 +451,67 @@ export function EditSeriesForm({
</FormField>
<FormField className="sm:col-span-2">
<FormLabel>Description</FormLabel>
<div className="flex items-center gap-1">
<FormLabel>{t("editBook.description")}</FormLabel>
<LockButton locked={!!lockedFields.description} onToggle={() => toggleLock("description")} disabled={isPending} />
</div>
<textarea
value={description}
onChange={(e) => setDescription(e.target.value)}
disabled={isPending}
rows={3}
placeholder="Synopsis ou description de la série…"
placeholder={t("editSeries.descriptionPlaceholder")}
className="flex w-full rounded-md border border-input bg-background px-3 py-2 text-sm shadow-sm transition-colors placeholder:text-muted-foreground/90 focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-ring focus-visible:ring-offset-2 disabled:cursor-not-allowed disabled:opacity-50 resize-none"
/>
</FormField>
</div>
{/* Lock legend */}
{Object.values(lockedFields).some(Boolean) && (
<p className="text-xs text-amber-500 flex items-center gap-1.5">
<svg className="w-3.5 h-3.5 shrink-0" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M12 15v2m-6 4h12a2 2 0 002-2v-6a2 2 0 00-2-2H6a2 2 0 00-2 2v6a2 2 0 002 2zm10-10V7a4 4 0 00-8 0v4h8z" />
</svg>
{t("editBook.lockedFieldsNote")}
</p>
)}
{error && <p className="text-xs text-destructive">{error}</p>}
<div className="flex items-center gap-2">
{/* Footer */}
<div className="flex items-center justify-end gap-2 pt-2 border-t border-border/50">
<button
type="button"
onClick={handleClose}
disabled={isPending}
className="px-4 py-1.5 rounded-lg border border-border bg-card text-sm font-medium text-muted-foreground hover:text-foreground transition-colors"
>
{t("common.cancel")}
</button>
<button
type="submit"
disabled={isPending || (!newName.trim() && seriesName !== "unclassified")}
className="px-4 py-1.5 rounded-lg bg-primary text-white text-sm font-medium hover:bg-primary/90 disabled:opacity-50 disabled:cursor-not-allowed transition-colors"
>
{isPending ? "Sauvegarde…" : "Sauvegarder"}
</button>
<button
type="button"
onClick={handleCancel}
disabled={isPending}
className="px-4 py-1.5 rounded-lg border border-border bg-card text-sm font-medium text-muted-foreground hover:text-foreground transition-colors"
>
Annuler
{isPending ? t("common.saving") : t("common.save")}
</button>
</div>
</form>
</div>
</div>
</>,
document.body
) : null;
return (
<>
<button
onClick={() => setIsOpen(true)}
className="inline-flex items-center gap-1.5 px-3 py-1.5 rounded-lg border border-border bg-card text-sm font-medium text-muted-foreground hover:text-foreground hover:border-primary transition-colors"
>
<span></span> {t("editSeries.title")}
</button>
{modal}
</>
);
}
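The submit handler above maps empty inputs to `null` and numeric strings to integers before PATCHing the series. A minimal sketch of that normalization as pure helpers (hypothetical names; the real handler builds the body inline, and the field names are taken from the diff):

```typescript
// Hypothetical helper mirroring the inline body-building in handleSubmit:
// whitespace-only input maps to null, otherwise parse base-10.
function parseIntOrNull(input: string): number | null {
  const v = input.trim();
  return v ? parseInt(v, 10) : null;
}

// Sketch of the PATCH body shape assembled from form state
// (start_year, total_volumes, status, locked_fields per the diff).
function buildSeriesBody(form: {
  startYear: string;
  totalVolumes: string;
  status: string;
  lockedFields: Record<string, boolean>;
}) {
  return {
    start_year: parseIntOrNull(form.startYear),
    total_volumes: parseIntOrNull(form.totalVolumes),
    status: form.status || null,
    locked_fields: form.lockedFields,
  };
}
```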

View File

@@ -2,6 +2,7 @@
import { useState, useCallback } from "react";
import { FolderItem } from "../../lib/api";
import { useTranslation } from "../../lib/i18n/context";
interface TreeNode extends FolderItem {
children?: TreeNode[];
@@ -15,6 +16,7 @@ interface FolderBrowserProps {
}
export function FolderBrowser({ initialFolders, selectedPath, onSelect }: FolderBrowserProps) {
const { t } = useTranslation();
// Convert initial folders to tree structure
const [tree, setTree] = useState<TreeNode[]>(
initialFolders.map(f => ({ ...f, children: f.has_children ? [] : undefined }))
@@ -173,7 +175,7 @@ export function FolderBrowser({ initialFolders, selectedPath, onSelect }: Folder
<div className="max-h-80 overflow-y-auto">
{tree.length === 0 ? (
<div className="px-3 py-8 text-sm text-muted-foreground text-center">
No folders found
{t("folder.noFolders")}
</div>
) : (
tree.map(node => renderNode(node))

View File

@@ -1,9 +1,11 @@
"use client";
import { useState } from "react";
import { createPortal } from "react-dom";
import { FolderBrowser } from "./FolderBrowser";
import { FolderItem } from "../../lib/api";
import { Button } from "./ui";
import { useTranslation } from "../../lib/i18n/context";
interface FolderPickerProps {
initialFolders: FolderItem[];
@@ -13,6 +15,7 @@ interface FolderPickerProps {
export function FolderPicker({ initialFolders, selectedPath, onSelect }: FolderPickerProps) {
const [isOpen, setIsOpen] = useState(false);
const { t } = useTranslation();
const handleSelect = (path: string) => {
onSelect(path);
@@ -27,7 +30,7 @@ export function FolderPicker({ initialFolders, selectedPath, onSelect }: FolderP
<input
type="text"
readOnly
value={selectedPath || "Select a folder..."}
value={selectedPath || t("folder.selectFolder")}
className={`
w-full px-3 py-2 rounded-lg border bg-card
text-sm font-mono
@@ -57,12 +60,12 @@ export function FolderPicker({ initialFolders, selectedPath, onSelect }: FolderP
<svg className="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M3 7v10a2 2 0 002 2h14a2 2 0 002-2V9a2 2 0 00-2-2h-6l-2-2H5a2 2 0 00-2 2z" />
</svg>
Browse
{t("common.browse")}
</Button>
</div>
{/* Popup Modal */}
{isOpen && (
{isOpen && createPortal(
<>
{/* Backdrop */}
<div
@@ -79,7 +82,7 @@ export function FolderPicker({ initialFolders, selectedPath, onSelect }: FolderP
<svg className="w-5 h-5 text-primary" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M3 7v10a2 2 0 002 2h14a2 2 0 002-2V9a2 2 0 00-2-2h-6l-2-2H5a2 2 0 00-2 2z" />
</svg>
<span className="font-medium">Select Folder</span>
<span className="font-medium">{t("folder.selectFolderTitle")}</span>
</div>
<button
type="button"
@@ -104,7 +107,7 @@ export function FolderPicker({ initialFolders, selectedPath, onSelect }: FolderP
{/* Footer */}
<div className="flex items-center justify-between px-4 py-3 border-t border-border/50 bg-muted/30">
<span className="text-xs text-muted-foreground">
Click a folder to select it
{t("folder.clickToSelect")}
</span>
<div className="flex gap-2">
<Button
@@ -113,13 +116,14 @@ export function FolderPicker({ initialFolders, selectedPath, onSelect }: FolderP
size="sm"
onClick={() => setIsOpen(false)}
>
Cancel
{t("common.cancel")}
</Button>
</div>
</div>
</div>
</div>
</>
</>,
document.body
)}
</div>
);

View File

@@ -0,0 +1,44 @@
"use client";
import { useEffect, useRef } from "react";
import { useRouter } from "next/navigation";
interface JobDetailLiveProps {
jobId: string;
isTerminal: boolean;
}
export function JobDetailLive({ jobId, isTerminal }: JobDetailLiveProps) {
const router = useRouter();
const isTerminalRef = useRef(isTerminal);
isTerminalRef.current = isTerminal;
useEffect(() => {
if (isTerminalRef.current) return;
const eventSource = new EventSource(`/api/jobs/${jobId}/stream`);
eventSource.onmessage = (event) => {
try {
const data = JSON.parse(event.data);
router.refresh();
if (data.status === "success" || data.status === "failed" || data.status === "cancelled") {
eventSource.close();
}
} catch {
// ignore parse errors
}
};
eventSource.onerror = () => {
eventSource.close();
};
return () => {
eventSource.close();
};
}, [jobId, router]);
return null;
}
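The stream handler above refreshes the router on every event and closes the `EventSource` once the job reaches a terminal status. Factored as a pure predicate (a sketch; the component checks the three statuses inline):

```typescript
// Terminal statuses after which the SSE connection can be closed,
// as enumerated in JobDetailLive's onmessage handler.
const TERMINAL_STATUSES = ["success", "failed", "cancelled"] as const;

function isTerminalStatus(status: string): boolean {
  return (TERMINAL_STATUSES as readonly string[]).includes(status);
}
```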

View File

@@ -1,6 +1,7 @@
"use client";
import { useEffect, useState } from "react";
import { useTranslation } from "../../lib/i18n/context";
import { StatusBadge, Badge, ProgressBar } from "./ui";
interface ProgressEvent {
@@ -24,6 +25,7 @@ interface JobProgressProps {
}
export function JobProgress({ jobId, onComplete }: JobProgressProps) {
const { t } = useTranslation();
const [progress, setProgress] = useState<ProgressEvent | null>(null);
const [error, setError] = useState<string | null>(null);
const [isComplete, setIsComplete] = useState(false);
@@ -53,25 +55,25 @@ export function JobProgress({ jobId, onComplete }: JobProgressProps) {
onComplete?.();
}
} catch (err) {
setError("Failed to parse SSE data");
setError(t("jobProgress.sseError"));
}
};
eventSource.onerror = (err) => {
console.error("SSE error:", err);
eventSource.close();
setError("Connection lost");
setError(t("jobProgress.connectionLost"));
};
return () => {
eventSource.close();
};
}, [jobId, onComplete]);
}, [jobId, onComplete, t]);
if (error) {
return (
<div className="p-4 bg-destructive/10 text-error rounded-lg text-sm">
Error: {error}
{t("jobProgress.error", { message: error })}
</div>
);
}
@@ -79,7 +81,7 @@ export function JobProgress({ jobId, onComplete }: JobProgressProps) {
if (!progress) {
return (
<div className="p-4 text-muted-foreground text-sm">
Loading progress...
{t("jobProgress.loadingProgress")}
</div>
);
}
@@ -88,14 +90,14 @@ export function JobProgress({ jobId, onComplete }: JobProgressProps) {
const processed = progress.processed_files ?? 0;
const total = progress.total_files ?? 0;
const isPhase2 = progress.status === "extracting_pages" || progress.status === "generating_thumbnails";
const unitLabel = progress.status === "extracting_pages" ? "pages" : progress.status === "generating_thumbnails" ? "thumbnails" : "files";
const unitLabel = progress.status === "extracting_pages" ? t("jobProgress.pages") : progress.status === "generating_thumbnails" ? t("jobProgress.thumbnails") : t("jobProgress.filesUnit");
return (
<div className="p-4 bg-card rounded-lg border border-border">
<div className="flex items-center justify-between mb-3">
<StatusBadge status={progress.status} />
{isComplete && (
<Badge variant="success">Complete</Badge>
<Badge variant="success">{t("jobProgress.done")}</Badge>
)}
</div>
@@ -105,20 +107,20 @@ export function JobProgress({ jobId, onComplete }: JobProgressProps) {
<span>{processed} / {total} {unitLabel}</span>
{progress.current_file && (
<span className="truncate max-w-md" title={progress.current_file}>
Current: {progress.current_file.length > 40
{t("jobProgress.currentFile", { file: progress.current_file.length > 40
? progress.current_file.substring(0, 40) + "..."
: progress.current_file}
: progress.current_file })}
</span>
)}
</div>
{progress.stats_json && !isPhase2 && (
<div className="flex flex-wrap gap-3 text-xs">
<Badge variant="primary">Scanned: {progress.stats_json.scanned_files}</Badge>
<Badge variant="success">Indexed: {progress.stats_json.indexed_files}</Badge>
<Badge variant="warning">Removed: {progress.stats_json.removed_files}</Badge>
<Badge variant="primary">{t("jobProgress.scanned", { count: progress.stats_json.scanned_files })}</Badge>
<Badge variant="success">{t("jobProgress.indexed", { count: progress.stats_json.indexed_files })}</Badge>
<Badge variant="warning">{t("jobProgress.removed", { count: progress.stats_json.removed_files })}</Badge>
{progress.stats_json.errors > 0 && (
<Badge variant="error">Errors: {progress.stats_json.errors}</Badge>
<Badge variant="error">{t("jobProgress.errors", { count: progress.stats_json.errors })}</Badge>
)}
</div>
)}
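The `unitLabel` ternary above picks the progress unit from the job status. The same mapping as a small lookup (a sketch; the translation keys are assumed from the diff):

```typescript
// Maps a phase-2 job status to its i18n unit key, defaulting to files.
// Keys (jobProgress.pages / .thumbnails / .filesUnit) come from the diff above.
function unitKeyFor(status: string): string {
  switch (status) {
    case "extracting_pages":
      return "jobProgress.pages";
    case "generating_thumbnails":
      return "jobProgress.thumbnails";
    default:
      return "jobProgress.filesUnit";
  }
}
```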

View File

@@ -2,8 +2,9 @@
import { useState } from "react";
import Link from "next/link";
import { useTranslation } from "../../lib/i18n/context";
import { JobProgress } from "./JobProgress";
import { StatusBadge, JobTypeBadge, Button, MiniProgressBar } from "./ui";
import { StatusBadge, JobTypeBadge, Button, MiniProgressBar, Icon, Tooltip } from "./ui";
interface JobRowProps {
job: {
@@ -20,6 +21,7 @@ interface JobRowProps {
indexed_files: number;
removed_files: number;
errors: number;
refreshed?: number;
} | null;
progress_percent: number | null;
processed_files: number | null;
@@ -33,6 +35,7 @@ interface JobRowProps {
}
export function JobRow({ job, libraryName, highlighted, onCancel, formatDate, formatDuration }: JobRowProps) {
const { t } = useTranslation();
const isActive = job.status === "running" || job.status === "pending" || job.status === "extracting_pages" || job.status === "generating_thumbnails";
const [showProgress, setShowProgress] = useState(highlighted || isActive);
@@ -57,28 +60,11 @@ export function JobRow({ job, libraryName, highlighted, onCancel, formatDate, fo
const isThumbnailJob = job.type === "thumbnail_rebuild" || job.type === "thumbnail_regenerate";
const hasThumbnailPhase = isPhase2 || isThumbnailJob;
// Files column: index-phase stats only (Phase 1 discovery)
const filesDisplay =
job.status === "running" && !isPhase2
? job.total_files != null
? `${job.processed_files ?? 0}/${job.total_files}`
: scanned > 0
? `${scanned} scanned`
: "-"
: job.status === "success" && (indexed > 0 || removed > 0 || errors > 0)
? null // rendered below as ✓ / / ⚠
: scanned > 0
? `${scanned} scanned`
: "—";
const isMetadataBatch = job.type === "metadata_batch";
const isMetadataRefresh = job.type === "metadata_refresh";
// Thumbnails column (Phase 2: extracting_pages + generating_thumbnails)
// Thumbnails progress (Phase 2: extracting_pages + generating_thumbnails)
const thumbInProgress = hasThumbnailPhase && (job.status === "running" || isPhase2);
const thumbDisplay =
thumbInProgress && job.total_files != null
? `${job.processed_files ?? 0}/${job.total_files}`
: job.status === "success" && job.total_files != null && hasThumbnailPhase
? `${job.total_files}`
: "—";
return (
<>
@@ -113,32 +99,99 @@ export function JobRow({ job, libraryName, highlighted, onCancel, formatDate, fo
className="text-xs text-primary hover:text-primary/80 hover:underline"
onClick={() => setShowProgress(!showProgress)}
>
{showProgress ? "Hide" : "Show"} progress
{showProgress ? t("jobRow.hideProgress") : t("jobRow.showProgress")}
</button>
)}
</div>
</td>
<td className="px-4 py-3">
<div className="flex flex-col gap-1">
{filesDisplay !== null ? (
<span className="text-sm text-foreground">{filesDisplay}</span>
) : (
<div className="flex items-center gap-2 text-xs">
<span className="text-success"> {indexed}</span>
{removed > 0 && <span className="text-warning"> {removed}</span>}
{errors > 0 && <span className="text-error"> {errors}</span>}
</div>
)}
{job.status === "running" && !isPhase2 && job.total_files != null && (
<MiniProgressBar value={job.processed_files ?? 0} max={job.total_files} className="w-24" />
)}
</div>
</td>
<td className="px-4 py-3">
{/* Running progress */}
{isActive && job.total_files != null && (
<div className="flex flex-col gap-1">
<span className="text-sm text-foreground">{thumbDisplay}</span>
{thumbInProgress && job.total_files != null && (
<span className="text-sm text-foreground">{job.processed_files ?? 0}/{job.total_files}</span>
<MiniProgressBar value={job.processed_files ?? 0} max={job.total_files} className="w-24" />
</div>
)}
{/* Completed stats with icons */}
{!isActive && (
<div className="flex items-center gap-3 text-xs">
{/* Files: indexed count */}
{indexed > 0 && (
<Tooltip label={t("jobRow.filesIndexed", { count: indexed })}>
<span className="inline-flex items-center gap-1 text-success">
<Icon name="document" size="sm" />
{indexed}
</span>
</Tooltip>
)}
{/* Removed files */}
{removed > 0 && (
<Tooltip label={t("jobRow.filesRemoved", { count: removed })}>
<span className="inline-flex items-center gap-1 text-warning">
<Icon name="trash" size="sm" />
{removed}
</span>
</Tooltip>
)}
{/* Thumbnails */}
{hasThumbnailPhase && job.total_files != null && job.total_files > 0 && (
<Tooltip label={t("jobRow.thumbnailsGenerated", { count: job.total_files })}>
<span className="inline-flex items-center gap-1 text-primary">
<Icon name="image" size="sm" />
{job.total_files}
</span>
</Tooltip>
)}
{/* Metadata batch: series processed */}
{isMetadataBatch && job.total_files != null && job.total_files > 0 && (
<Tooltip label={t("jobRow.metadataProcessed", { count: job.total_files })}>
<span className="inline-flex items-center gap-1 text-info">
<Icon name="tag" size="sm" />
{job.total_files}
</span>
</Tooltip>
)}
{/* Metadata refresh: total links + refreshed count */}
{isMetadataRefresh && job.total_files != null && job.total_files > 0 && (
<Tooltip label={t("jobRow.metadataLinks", { count: job.total_files })}>
<span className="inline-flex items-center gap-1 text-info">
<Icon name="tag" size="sm" />
{job.total_files}
</span>
</Tooltip>
)}
{isMetadataRefresh && job.stats_json?.refreshed != null && job.stats_json.refreshed > 0 && (
<Tooltip label={t("jobRow.metadataRefreshed", { count: job.stats_json.refreshed })}>
<span className="inline-flex items-center gap-1 text-success">
<Icon name="refresh" size="sm" />
{job.stats_json.refreshed}
</span>
</Tooltip>
)}
{/* Errors */}
{errors > 0 && (
<Tooltip label={t("jobRow.errors", { count: errors })}>
<span className="inline-flex items-center gap-1 text-error">
<Icon name="warning" size="sm" />
{errors}
</span>
</Tooltip>
)}
{/* Scanned only (no other stats) */}
{indexed === 0 && removed === 0 && errors === 0 && !hasThumbnailPhase && !isMetadataBatch && !isMetadataRefresh && scanned > 0 && (
<Tooltip label={t("jobRow.scanned", { count: scanned })}>
<span className="inline-flex items-center gap-1 text-muted-foreground">
<Icon name="search" size="sm" />
{scanned}
</span>
</Tooltip>
)}
{/* Nothing to show */}
{indexed === 0 && removed === 0 && errors === 0 && scanned === 0 && !hasThumbnailPhase && !isMetadataBatch && !isMetadataRefresh && (
<span className="text-sm text-muted-foreground">—</span>
)}
</div>
)}
</div>
</td>
@@ -152,17 +205,24 @@ export function JobRow({ job, libraryName, highlighted, onCancel, formatDate, fo
<div className="flex items-center gap-2">
<Link
href={`/jobs/${job.id}`}
className="inline-flex items-center px-3 py-1.5 text-xs font-medium rounded-lg bg-primary text-white hover:bg-primary/90 transition-colors"
className="inline-flex items-center justify-center gap-1.5 h-7 px-2.5 text-xs font-medium rounded-md bg-primary text-white hover:bg-primary/90 transition-colors"
>
View
<svg className="w-3.5 h-3.5" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M15 12a3 3 0 11-6 0 3 3 0 016 0z" />
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M2.458 12C3.732 7.943 7.523 5 12 5c4.478 0 8.268 2.943 9.542 7-1.274 4.057-5.064 7-9.542 7-4.477 0-8.268-2.943-9.542-7z" />
</svg>
{t("jobRow.view")}
</Link>
{(job.status === "pending" || job.status === "running" || job.status === "extracting_pages" || job.status === "generating_thumbnails") && (
<Button
variant="danger"
size="sm"
size="xs"
onClick={() => onCancel(job.id)}
>
Cancel
<svg className="w-3.5 h-3.5 mr-1.5" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M6 18L18 6M6 6l12 12" />
</svg>
{t("common.cancel")}
</Button>
)}
</div>
@@ -170,7 +230,7 @@ export function JobRow({ job, libraryName, highlighted, onCancel, formatDate, fo
</tr>
{showProgress && isActive && (
<tr>
<td colSpan={9} className="px-4 py-3 bg-muted/50">
<td colSpan={8} className="px-4 py-3 bg-muted/50">
<JobProgress
jobId={job.id}
onComplete={handleComplete}

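JobRow derives `isActive` from four in-flight statuses and gates both the inline progress bar and the Cancel button on it. The same check as a set-based helper (a sketch of the predicate, not the component's actual code):

```typescript
// Statuses during which a job is still in flight, per JobRow's isActive
// check (pending, running, and the two phase-2 thumbnail statuses).
const ACTIVE_STATUSES = new Set([
  "pending",
  "running",
  "extracting_pages",
  "generating_thumbnails",
]);

function isActiveStatus(status: string): boolean {
  return ACTIVE_STATUSES.has(status);
}
```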
Some files were not shown because too many files have changed in this diff.