Compare commits

...

48 Commits

Author SHA1 Message Date
ee65c6263a perf: add ETag and server-side caching for thumbnail proxy
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 49s
Add ETag header to API thumbnail responses for 304 Not Modified support.
Forward If-None-Match/ETag through the Next.js proxy route handler and
add next.revalidate for 24h server-side fetch caching to reduce
SSR-to-API round trips on the libraries page.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-22 06:52:47 +01:00
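The conditional-request flow this commit describes can be sketched as follows (illustrative TypeScript; the helper and route names are not from the repository, and `next.revalidate` is the Next.js-specific fetch extension the message refers to):

```typescript
// If-None-Match comparison used to decide whether to answer 304 Not Modified.
// Handles the wildcard form and comma-separated (possibly weak) tag lists.
export function etagMatches(ifNoneMatch: string | null, etag: string): boolean {
  if (!ifNoneMatch) return false;
  if (ifNoneMatch.trim() === "*") return true;
  return ifNoneMatch
    .split(",")
    .map((t) => t.trim().replace(/^W\//, ""))
    .includes(etag.replace(/^W\//, ""));
}

// Hypothetical proxy route handler: forward If-None-Match upstream so the
// API can short-circuit with 304, and cache the server-side fetch for 24h.
export async function proxyThumbnail(req: Request, apiUrl: string): Promise<Response> {
  const init = {
    headers: { "if-none-match": req.headers.get("if-none-match") ?? "" },
    next: { revalidate: 86_400 }, // Next.js fetch extension: 24h revalidation
  };
  const upstream = await fetch(apiUrl, init as any);
  if (upstream.status === 304) return new Response(null, { status: 304 });
  return upstream;
}
```

The `etagMatches` helper is the reusable piece; in a real route the API computes the ETag and the proxy only forwards headers in both directions.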
691b6b22ab chore: bump version to 1.25.0 2026-03-22 06:52:02 +01:00
11c80a16a3 docs: add Telegram notifications and updated dashboard to README and FEATURES
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 46s
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-22 06:40:34 +01:00
c366b44c54 chore: bump version to 1.24.1 2026-03-22 06:39:23 +01:00
92f80542e6 perf: skip Next.js image re-optimization and stream proxy responses
Thumbnails are already optimized (WebP) by the API, so disable Next.js
image optimization to avoid redundant CPU work. Switch route handlers
from buffering (arrayBuffer) to streaming (response.body) to reduce
memory usage and latency.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-22 06:38:46 +01:00
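The buffering-to-streaming switch above can be sketched like this (a minimal illustration using the standard Fetch API; the header allow-list is an assumption, not the repository's actual list):

```typescript
// Headers worth forwarding from the upstream API response (illustrative).
const PASSTHROUGH = ["content-type", "content-length", "etag", "cache-control"];

export function passthroughHeaders(upstream: Headers): Headers {
  const out = new Headers();
  for (const name of PASSTHROUGH) {
    const value = upstream.get(name);
    if (value !== null) out.set(name, value);
  }
  return out;
}

export function proxy(upstream: Response): Response {
  // Before: new Response(await upstream.arrayBuffer(), ...) — buffers the
  // whole body in memory before replying.
  // After: hand the upstream ReadableStream straight to the proxied
  // Response, so bytes flow through as they arrive.
  return new Response(upstream.body, {
    status: upstream.status,
    headers: passthroughHeaders(upstream.headers),
  });
}
```

Streaming avoids holding each thumbnail in memory twice and cuts time-to-first-byte, which matters most for large pages proxied through the Next.js server.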
3a25e42a20 chore: bump version to 1.24.0
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 1m7s
2026-03-22 06:31:56 +01:00
24763bf5a7 fix: show absolute date/time in jobs "created" column
Replace relative time formatting (which incorrectly showed "just now"
for many jobs due to negative time diffs from server/client timezone
mismatch) with absolute locale-formatted date/time.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-22 06:31:37 +01:00
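The fix can be sketched as a small formatter (illustrative name; the explicit locale and timeZone keep server and client output identical, which is also what makes relative "x minutes ago" strings unnecessary):

```typescript
// Absolute, locale-aware timestamp for the jobs "created" column.
// Passing locale and timeZone explicitly avoids the negative-diff
// "just now" artifacts that relative formatting produced.
export function formatCreatedAt(
  iso: string,
  locale = "en-GB",
  timeZone = "UTC",
): string {
  return new Intl.DateTimeFormat(locale, {
    dateStyle: "short",
    timeStyle: "short",
    timeZone,
  }).format(new Date(iso));
}
```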
08f0397029 feat: add reading stats and replace dashboard charts with recharts
Add currently reading, recently read, and reading activity sections to
the dashboard. Replace all custom SVG/CSS charts with recharts library
(donut, area, stacked bar, horizontal bar). Reorganize layout: libraries
and popular series side by side, books added chart full width below.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-22 06:26:45 +01:00
766e3a01b2 chore: bump version to 1.23.0
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 45s
2026-03-21 17:43:11 +01:00
626e2e035d feat: send book thumbnails in Telegram notifications
Use Telegram sendPhoto API for conversion and metadata-approved events
when a book thumbnail is available on disk. Falls back to text message
if photo upload fails.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-21 17:43:01 +01:00
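The control flow here — try sendPhoto, fall back to plain text — can be sketched as follows. The real implementation lives in the Rust notifications crate; this TypeScript version only illustrates the fallback logic, with `send` standing in for a Telegram Bot API call:

```typescript
type SendFn = (method: "sendPhoto" | "sendMessage") => Promise<boolean>;

// Returns which delivery path succeeded: a photo message when a thumbnail
// exists and the upload works, otherwise a plain text message.
export async function notifyWithThumbnail(
  hasThumbnail: boolean,
  send: SendFn,
): Promise<"photo" | "text"> {
  if (hasThumbnail) {
    if (await send("sendPhoto")) return "photo";
    // Photo upload failed — fall through to the text fallback.
  }
  await send("sendMessage");
  return "text";
}
```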
cfd2321db2 chore: bump version to 1.22.0 2026-03-21 17:40:22 +01:00
1b715033ce fix: add missing Next.js route handler for Telegram test endpoint
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-21 17:39:46 +01:00
81d1586501 feat: add Telegram notification system with granular event toggles
Add notifications crate shared between API and indexer to send Telegram
messages on scan/thumbnail/conversion completion/failure, metadata linking,
batch and refresh events. Configurable via a new Notifications tab in the
backoffice settings with per-event toggle switches grouped by category.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-21 17:24:43 +01:00
bd74c9e3e3 docs: add comprehensive features list to README and docs/FEATURES.md
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 1m1s
Replace the minimal README features section with a concise categorized
summary and link to a detailed docs/FEATURES.md covering all features,
business rules, API endpoints, and integrations.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-21 14:34:36 +01:00
41228430cf chore: bump version to 1.21.2 2026-03-21 14:34:32 +01:00
6a4ba06fac fix: include series_metadata authors in authors listing and detail pages
Authors were only sourced from the books.authors/books.author fields, which are
often empty. Now also aggregates authors from series_metadata.authors
(populated by metadata providers like bedetheque). Adds an author filter to
the /series endpoint and updates the author detail page to use it.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-21 14:34:11 +01:00
e5c3542d3f refactor: split books.rs into books+series, reorganize OpenAPI tags and fix access control
- Extract series code from books.rs into dedicated series.rs module
- Reorganize OpenAPI tags: split overloaded "books" tag into books, series, search, stats
- Add missing endpoints to OpenAPI: metadata_batch, metadata_refresh, komga, update_metadata_provider
- Add missing schemas: MissingVolumeInput, Komga/Batch/Refresh DTOs
- Fix access control: move GET /libraries and POST /libraries/:id/scan to read routes
  so non-admin tokens can list libraries and trigger scans

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-21 14:23:19 +01:00
24516f1069 chore: bump version to 1.21.1
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 41s
2026-03-21 13:42:17 +01:00
5383cdef60 feat: allow batch metadata and refresh metadata on all libraries
When no specific library is selected, iterate over all libraries and
trigger a job for each one, skipping libraries with metadata disabled.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-21 13:42:08 +01:00
be5c3f7a34 fix: pass explicit locale to date formatting to prevent hydration mismatch
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 41s
Server and client could use different default locales for
toLocaleDateString/toLocaleString, causing React hydration errors.
Pass the user locale explicitly in JobsList and SettingsPage.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-21 13:36:35 +01:00
caa9922ff9 chore: bump version to 1.21.0 2026-03-21 13:34:47 +01:00
135f000c71 refactor: switch JobsIndicator from polling to SSE and fix stream endpoint
Replace fetch polling in JobsIndicator with EventSource connected to
/api/jobs/stream. Fix the SSE route to return all jobs (via
/index/status) instead of only active ones, since JobsList also
consumes this stream for the full job history. JobsIndicator now
filters active jobs client-side. SSE server-side uses adaptive
interval (2s active, 15s idle) and only sends when data changes.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-21 13:33:58 +01:00
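The server-side cadence and change detection described above can be sketched with two small helpers (illustrative names; the 2s/15s values come from the commit message):

```typescript
// Poll the job store every 2s while jobs are running, back off to 15s idle.
export function ssePollInterval(activeJobs: number): number {
  return activeJobs > 0 ? 2_000 : 15_000;
}

// Only emit an SSE event when the serialized payload actually changed.
export function shouldEmit(
  lastPayload: string | null,
  next: unknown,
): { emit: boolean; payload: string } {
  const payload = JSON.stringify(next);
  return { emit: payload !== lastPayload, payload };
}
```

On the client, JobsIndicator then filters the full job list down to active jobs itself instead of relying on a separate active-only stream.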
d9e50a4235 chore: bump version to 1.20.1
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 1m13s
2026-03-21 13:13:39 +01:00
5f6eb5a5cb perf: add selective fetch caching for stable API endpoints
Make apiFetch support Next.js revalidate option instead of
hardcoding cache: no-store on every request. Stable endpoints
(libraries, settings, stats, series statuses) now use time-based
revalidation while dynamic data (books, search, jobs) stays uncached.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-21 13:13:28 +01:00
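The apiFetch change can be sketched as an init builder (the `next` key is the Next.js-specific fetch extension; the type and function names here are illustrative):

```typescript
// Fetch init fragment: time-based revalidation for stable endpoints,
// no-store for dynamic data (books, search, jobs).
type FetchInit = {
  cache?: "no-store";
  next?: { revalidate: number }; // Next.js fetch extension
};

export function cacheInit(revalidate?: number): FetchInit {
  if (revalidate !== undefined) {
    return { next: { revalidate } }; // e.g. libraries, settings, stats
  }
  return { cache: "no-store" }; // previous default for every request
}
```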
41c77fca2e chore: bump version to 1.20.0
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 1m15s
2026-03-21 13:06:28 +01:00
49621f3fb1 perf: wrap BookCard and BookImage with React.memo
Prevent unnecessary re-renders of book grid items when parent
components update without changing book data.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-21 13:03:24 +01:00
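`React.memo`'s default shallow comparison already skips re-renders when props are referentially equal; a custom comparator (sketched below with illustrative field names, attached as `React.memo(BookCardImpl, areBookPropsEqual)`) goes further by treating two book objects as equal when their identifying data matches:

```typescript
type BookProps = { book: { id: string; updated_at: string } };

// Returning true tells React.memo to skip the re-render.
export function areBookPropsEqual(prev: BookProps, next: BookProps): boolean {
  return (
    prev.book.id === next.book.id &&
    prev.book.updated_at === next.book.updated_at
  );
}
```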
6df743b2e6 perf: lazy-load heavy modal components with next/dynamic
Dynamic import EditBookForm, EditSeriesForm, MetadataSearchModal, and
ProwlarrSearchModal so their code is split into separate chunks and
only fetched when the user interacts with them.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-21 13:02:10 +01:00
edfefc0128 perf: optimize JobsIndicator polling with visibility API and adaptive interval
Pause polling when the tab is hidden, refetch immediately when it
becomes visible again, and use a 30s interval when no jobs are active
instead of polling every 2s unconditionally.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-21 12:59:06 +01:00
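The polling policy reduces to a small pure function (a sketch; returning `null` means "don't schedule a timer" — the component instead refetches on the `visibilitychange` event):

```typescript
// No timer while the tab is hidden; 2s while jobs are active; 30s idle.
export function nextPollDelay(
  tabHidden: boolean,
  activeJobs: number,
): number | null {
  if (tabHidden) return null; // paused; refetch on visibilitychange
  return activeJobs > 0 ? 2_000 : 30_000;
}
```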
b0185abefe perf: enable Next.js image optimization across backoffice
Remove `unoptimized` flag from all thumbnail/cover Image components
and add proper responsive `sizes` props. Convert raw `<img>` tags on
the libraries page to next/image. Add 24h minimumCacheTTL for
optimized images. BookPreview keeps `unoptimized` since the API
already returns optimized WebP.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-21 12:57:10 +01:00
b9e54cbfd8 chore: bump version to 1.19.1
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 54s
2026-03-21 12:47:31 +01:00
3f0bd783cd feat: include series_count and thumbnail_book_ids in libraries API response
Eliminates N+1 sequential fetchSeries calls on the libraries page by
returning series count and up to 5 thumbnail book IDs (one per series)
directly from GET /libraries.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-21 12:47:10 +01:00
fc8856c83f chore: bump version to 1.19.0
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 1m19s
2026-03-21 08:12:19 +01:00
bd09f3d943 feat: persist filter state in localStorage across pages
Save/restore filter values in LiveSearchForm using localStorage keyed
by basePath (e.g. filters:/books, filters:/series). Filters are restored
on mount when the URL has no active filters, and cleared when the user
clicks the Clear button.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-21 08:12:10 +01:00
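The persistence scheme can be sketched like this (illustrative function names; a minimal Storage-like interface stands in for `window.localStorage` so the sketch runs outside the browser):

```typescript
interface KV {
  getItem(k: string): string | null;
  setItem(k: string, v: string): void;
  removeItem(k: string): void;
}

// Key is derived from the page's basePath, e.g. "filters:/books".
export const filterKey = (basePath: string) => `filters:${basePath}`;

export function saveFilters(store: KV, basePath: string, filters: Record<string, string>) {
  store.setItem(filterKey(basePath), JSON.stringify(filters));
}

// Restore only when the URL carries no active filters — the URL wins.
export function restoreFilters(
  store: KV,
  basePath: string,
  urlHasFilters: boolean,
): Record<string, string> | null {
  if (urlHasFilters) return null;
  const raw = store.getItem(filterKey(basePath));
  return raw ? (JSON.parse(raw) as Record<string, string>) : null;
}

export function clearFilters(store: KV, basePath: string) {
  store.removeItem(filterKey(basePath));
}
```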
1f434c3d67 feat: add format and metadata filters to books page
Add two new filters to the books listing page:
- Format filter (CBZ/CBR/PDF/EPUB) using existing API support
- Metadata linked/unlinked filter with new API support via
  LEFT JOIN on external_metadata_links (using DISTINCT ON CTE
  matching the series endpoint pattern)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-21 08:09:37 +01:00
4972a403df chore: bump version to 1.18.0
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 1m7s
2026-03-21 07:47:52 +01:00
629708cdd0 feat: redesign libraries page UI with fan thumbnails and modal settings
- Replace thumbnail mosaic with fan/arc layout using series covers as background
- Move library settings from dropdown to full-page portal modal with sections
- Move FolderPicker modal to portal for proper z-index stacking
- Add descriptions to each setting for better clarity
- Move delete button to card header, compact config tags
- Add i18n keys for new labels and descriptions (en/fr)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-21 07:47:36 +01:00
560087a897 chore: bump version to 1.17.0
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 1m12s
2026-03-21 07:23:52 +01:00
27f553b005 feat: add rescan job type and improve full rebuild UX
Add "Deep rescan" job type that clears directory mtimes to force
re-walking all directories, discovering newly supported formats (e.g.
EPUB) without deleting existing data or metadata.

Also improve the full rebuild button: use red destructive styling instead of
warning styling, and make the FR description explicitly mention
metadata/reading-status loss. Rename the FR rebuild label to "Mise à jour".

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-21 07:23:38 +01:00
ed7665248e chore: bump version to 1.16.0
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 1m5s
2026-03-21 07:06:28 +01:00
736b8aedc0 feat: add EPUB format support with spine-aware image extraction
Parse EPUB structure (container.xml → OPF → spine → XHTML) to extract
images in reading order. Zero new dependencies — reuses zip + regex
crates with pre-compiled regexes and per-file index cache for
performance. Falls back to CBZ-style image listing when spine contains
no images. Includes DB migration, API/indexer/backoffice updates.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-21 07:05:47 +01:00
3daa49ae6c feat: add live refresh to job detail page via SSE
The job detail page was only server-rendered with no live updates,
unlike the jobs list page. Add a lightweight JobDetailLive client
component that subscribes to the existing SSE endpoint and calls
router.refresh() on each update, keeping the page in sync while
a job is running.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-21 06:52:57 +01:00
5fb24188e1 chore: bump version to 1.15.0
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 44s
2026-03-20 13:35:36 +01:00
54f972db17 chore: bump version to 1.14.0
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 45s
2026-03-20 12:48:14 +01:00
acd8b62382 chore: bump version to 1.13.0 2026-03-20 12:44:54 +01:00
cc65e3d1ad feat: highlight missing volumes in Prowlarr search results
API extracts volume numbers from release titles and matches them against
missing volumes sent by the frontend. Matched results are highlighted in
green with badges indicating which missing volumes were found.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-20 12:44:35 +01:00
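The matching step can be sketched as follows. The real extraction is in the Rust API and the regex here is illustrative only — release titles use markers like "Vol. 12", "v05", or "Tome 3":

```typescript
// Pull volume numbers out of a release title (assumed marker patterns).
const VOLUME_RE = /\b(?:v(?:ol(?:ume)?)?\.?|t(?:ome)?\.?)\s*(\d{1,4})\b/gi;

export function extractVolumes(title: string): number[] {
  return [...title.matchAll(VOLUME_RE)].map((m) => parseInt(m[1], 10));
}

// Intersect extracted volumes with the series' missing volumes; hits are
// what the UI highlights in green with per-volume badges.
export function matchedMissing(title: string, missing: number[]): number[] {
  const found = new Set(extractVolumes(title));
  return missing.filter((v) => found.has(v));
}
```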
70889ca955 chore: bump version to 1.12.0
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 43s
2026-03-20 11:43:34 +01:00
4ad6d57271 feat: add authors page to backoffice with dedicated API endpoint
Add a new GET /authors endpoint that aggregates unique authors from books
with book/series counts, pagination and search. Add author filter to
GET /books. Backoffice gets a list page with search/sort and a detail
page showing the author's series and books.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-20 11:43:22 +01:00
fe5de3d5c1 feat: add scheduled metadata refresh for libraries
Add metadata_refresh_mode (manual/hourly/daily/weekly) to libraries,
with automatic scheduling via the indexer. Includes API support,
backoffice UI controls, i18n translations, and DB migration.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-20 10:51:52 +01:00
72 changed files with 5660 additions and 1644 deletions

Cargo.lock (generated)

@@ -64,7 +64,7 @@ checksum = "7f202df86484c868dbad7eaa557ef785d5c66295e41b460ef922eca0723b842c"
 [[package]]
 name = "api"
-version = "1.11.1"
+version = "1.25.0"
 dependencies = [
  "anyhow",
  "argon2",
@@ -76,6 +76,7 @@ dependencies = [
  "image",
  "jpeg-decoder",
  "lru",
+ "notifications",
  "parsers",
  "rand 0.8.5",
  "regex",
@@ -1232,7 +1233,7 @@ dependencies = [
 [[package]]
 name = "indexer"
-version = "1.11.1"
+version = "1.25.0"
 dependencies = [
  "anyhow",
  "axum",
@@ -1240,6 +1241,7 @@ dependencies = [
  "futures",
  "image",
  "jpeg-decoder",
+ "notifications",
  "num_cpus",
  "parsers",
  "reqwest",
@@ -1663,6 +1665,19 @@ dependencies = [
  "nom",
 ]

+[[package]]
+name = "notifications"
+version = "1.25.0"
+dependencies = [
+ "anyhow",
+ "reqwest",
+ "serde",
+ "serde_json",
+ "sqlx",
+ "tokio",
+ "tracing",
+]
+
 [[package]]
 name = "nu-ansi-term"
 version = "0.50.3"
@@ -1771,7 +1786,7 @@ dependencies = [
 [[package]]
 name = "parsers"
-version = "1.11.1"
+version = "1.25.0"
 dependencies = [
  "anyhow",
  "flate2",
@@ -2270,6 +2285,7 @@ dependencies = [
  "base64",
  "bytes",
  "futures-core",
+ "futures-util",
  "http",
  "http-body",
  "http-body-util",
@@ -2278,6 +2294,7 @@ dependencies = [
  "hyper-util",
  "js-sys",
  "log",
+ "mime_guess",
  "percent-encoding",
  "pin-project-lite",
  "quinn",
@@ -2906,7 +2923,7 @@ dependencies = [
 [[package]]
 name = "stripstream-core"
-version = "1.11.1"
+version = "1.25.0"
 dependencies = [
  "anyhow",
  "serde",

Cargo.toml

@@ -3,13 +3,14 @@ members = [
     "apps/api",
     "apps/indexer",
     "crates/core",
+    "crates/notifications",
     "crates/parsers",
 ]
 resolver = "2"

 [workspace.package]
 edition = "2021"
-version = "1.11.1"
+version = "1.25.0"
 license = "MIT"

 [workspace.dependencies]
@@ -22,7 +23,7 @@ image = { version = "0.25", default-features = false, features = ["jpeg", "png",
 jpeg-decoder = "0.3"
 lru = "0.12"
 rayon = "1.10"
-reqwest = { version = "0.12", default-features = false, features = ["json", "rustls-tls"] }
+reqwest = { version = "0.12", default-features = false, features = ["json", "multipart", "rustls-tls"] }
 rand = "0.8"
 serde = { version = "1.0", features = ["derive"] }
 serde_json = "1.0"

README.md

@@ -81,28 +81,66 @@ The backoffice will be available at http://localhost:7082
 ## Features

-### Libraries Management
-- Create and manage multiple libraries
-- Configure automatic scanning schedules (hourly, daily, weekly)
-- Real-time file watcher for instant indexing
-- Full and incremental rebuild options
-
-### Books Management
-- Support for CBZ, CBR, and PDF formats
-- Automatic metadata extraction
-- Series and volume detection
-- Full-text search powered by PostgreSQL
-
-### Jobs Monitoring
-- Real-time job progress tracking
-- Detailed statistics (scanned, indexed, removed, errors)
-- Job history and logs
-- Cancel pending jobs
-
-### Search
-- Full-text search across titles, authors, and series
-- Library filtering
-- Real-time suggestions
+> For the full feature list, business rules, and API details, see [docs/FEATURES.md](docs/FEATURES.md).
+
+### Libraries
+- Multi-library management with per-library configuration
+- Incremental and full scanning, real-time filesystem watcher
+- Per-library metadata provider selection (Google Books, ComicVine, BedéThèque, AniList, Open Library)
+
+### Books & Series
+- **Formats**: CBZ, CBR, PDF, EPUB
+- Automatic metadata extraction (title, series, volume, authors, page count) from filenames and directory structure
+- Series aggregation with missing volume detection
+- Thumbnail generation (WebP/JPEG/PNG) with lazy generation and bulk rebuild
+- CBR → CBZ conversion
+
+### Reading Progress
+- Per-book tracking: unread / reading / read with current page
+- Series-level aggregated reading status
+- Bulk mark-as-read for series
+
+### Search & Discovery
+- Full-text search across titles, authors, and series (PostgreSQL `pg_trgm`)
+- Author listing with book/series counts
+- Filtering by reading status, series status, format, metadata provider
+
+### External Metadata
+- Search, match, approve/reject workflow with confidence scoring
+- Batch auto-matching and scheduled metadata refresh
+- Field locking to protect manual edits from sync
+
+### Notifications
+- **Telegram**: real-time notifications via Telegram Bot API
+- 12 granular event toggles (scans, thumbnails, conversions, metadata)
+- Book thumbnail images included in notifications where applicable
+- Test connection from settings
+
+### External Integrations
+- **Komga**: import reading progress
+- **Prowlarr**: search for missing volumes
+- **qBittorrent**: add torrents directly from search results
+
+### Background Jobs
+- Rebuild, rescan, thumbnail generation, metadata batch, CBR conversion
+- Real-time progress via Server-Sent Events (SSE)
+- Job history, error tracking, cancellation
+
+### Page Rendering
+- On-demand page extraction from all formats
+- Image processing (format, quality, max width, resampling filter)
+- LRU in-memory + disk cache
+
+### Security
+- Token-based auth (`admin` / `read` scopes) with Argon2 hashing
+- Rate limiting, token expiration and revocation
+
+### Web UI (Backoffice)
+- Dashboard with statistics, interactive charts (recharts), and reading progress
+- Currently reading & recently read sections
+- Library, book, series, author management
+- Live job monitoring, metadata search modals, settings panel
+- Notification settings with per-event toggle configuration

 ## Environment Variables

apps/api/Cargo.toml

@@ -15,6 +15,7 @@ futures = "0.3"
 image.workspace = true
 jpeg-decoder.workspace = true
 lru.workspace = true
+notifications = { path = "../../crates/notifications" }
 stripstream-core = { path = "../../crates/core" }
 parsers = { path = "../../crates/parsers" }
 rand.workspace = true


@@ -6,13 +6,15 @@ COPY Cargo.toml ./
 COPY apps/api/Cargo.toml apps/api/Cargo.toml
 COPY apps/indexer/Cargo.toml apps/indexer/Cargo.toml
 COPY crates/core/Cargo.toml crates/core/Cargo.toml
+COPY crates/notifications/Cargo.toml crates/notifications/Cargo.toml
 COPY crates/parsers/Cargo.toml crates/parsers/Cargo.toml
-RUN mkdir -p apps/api/src apps/indexer/src crates/core/src crates/parsers/src && \
+RUN mkdir -p apps/api/src apps/indexer/src crates/core/src crates/notifications/src crates/parsers/src && \
     echo "fn main() {}" > apps/api/src/main.rs && \
     echo "fn main() {}" > apps/indexer/src/main.rs && \
     echo "" > apps/indexer/src/lib.rs && \
     echo "" > crates/core/src/lib.rs && \
+    echo "" > crates/notifications/src/lib.rs && \
     echo "" > crates/parsers/src/lib.rs

 # Build dependencies only (cached as long as Cargo.toml files don't change)
@@ -26,12 +28,13 @@ RUN --mount=type=cache,target=/usr/local/cargo/registry \
 COPY apps/api/src apps/api/src
 COPY apps/indexer/src apps/indexer/src
 COPY crates/core/src crates/core/src
+COPY crates/notifications/src crates/notifications/src
 COPY crates/parsers/src crates/parsers/src

 RUN --mount=type=cache,target=/usr/local/cargo/registry \
     --mount=type=cache,target=/usr/local/cargo/git \
     --mount=type=cache,target=/app/target \
-    touch apps/api/src/main.rs crates/core/src/lib.rs crates/parsers/src/lib.rs && \
+    touch apps/api/src/main.rs crates/core/src/lib.rs crates/notifications/src/lib.rs crates/parsers/src/lib.rs && \
     cargo build --release -p api && \
     cp /app/target/release/api /usr/local/bin/api

apps/api/src/authors.rs (new file, 178 lines)

@@ -0,0 +1,178 @@
use axum::{extract::{Query, State}, Json};
use serde::{Deserialize, Serialize};
use sqlx::Row;
use utoipa::ToSchema;
use crate::{error::ApiError, state::AppState};
#[derive(Deserialize, ToSchema)]
pub struct ListAuthorsQuery {
#[schema(value_type = Option<String>, example = "batman")]
pub q: Option<String>,
#[schema(value_type = Option<i64>, example = 1)]
pub page: Option<i64>,
#[schema(value_type = Option<i64>, example = 20)]
pub limit: Option<i64>,
/// Sort order: "name" (default), "books" (most books first)
#[schema(value_type = Option<String>, example = "books")]
pub sort: Option<String>,
}
#[derive(Serialize, ToSchema)]
pub struct AuthorItem {
pub name: String,
pub book_count: i64,
pub series_count: i64,
}
#[derive(Serialize, ToSchema)]
pub struct AuthorsPageResponse {
pub items: Vec<AuthorItem>,
pub total: i64,
pub page: i64,
pub limit: i64,
}
/// List all unique authors with book/series counts
#[utoipa::path(
get,
path = "/authors",
tag = "authors",
params(
("q" = Option<String>, Query, description = "Search by author name"),
("page" = Option<i64>, Query, description = "Page number (1-based)"),
("limit" = Option<i64>, Query, description = "Items per page (max 100)"),
("sort" = Option<String>, Query, description = "Sort: name (default) or books"),
),
responses(
(status = 200, body = AuthorsPageResponse),
(status = 401, description = "Unauthorized"),
),
security(("Bearer" = []))
)]
pub async fn list_authors(
State(state): State<AppState>,
Query(query): Query<ListAuthorsQuery>,
) -> Result<Json<AuthorsPageResponse>, ApiError> {
let page = query.page.unwrap_or(1).max(1);
let limit = query.limit.unwrap_or(20).clamp(1, 100);
let offset = (page - 1) * limit;
let sort = query.sort.as_deref().unwrap_or("name");
let order_clause = match sort {
"books" => "book_count DESC, name ASC",
_ => "name ASC",
};
let q_pattern = query.q.as_deref()
.filter(|s| !s.trim().is_empty())
.map(|s| format!("%{s}%"));
// Aggregate unique authors from books.authors + books.author + series_metadata.authors
let sql = format!(
r#"
WITH all_authors AS (
SELECT DISTINCT UNNEST(
COALESCE(
NULLIF(authors, '{{}}'),
CASE WHEN author IS NOT NULL AND author != '' THEN ARRAY[author] ELSE ARRAY[]::text[] END
)
) AS name
FROM books
UNION
SELECT DISTINCT UNNEST(authors) AS name
FROM series_metadata
WHERE authors != '{{}}'
),
filtered AS (
SELECT name FROM all_authors
WHERE ($1::text IS NULL OR name ILIKE $1)
),
book_counts AS (
SELECT
f.name AS author_name,
COUNT(DISTINCT b.id) AS book_count
FROM filtered f
LEFT JOIN books b ON (
f.name = ANY(
COALESCE(
NULLIF(b.authors, '{{}}'),
CASE WHEN b.author IS NOT NULL AND b.author != '' THEN ARRAY[b.author] ELSE ARRAY[]::text[] END
)
)
)
GROUP BY f.name
),
series_counts AS (
SELECT
f.name AS author_name,
COUNT(DISTINCT (sm.library_id, sm.name)) AS series_count
FROM filtered f
LEFT JOIN series_metadata sm ON (
f.name = ANY(sm.authors) AND sm.authors != '{{}}'
)
GROUP BY f.name
)
SELECT
f.name,
COALESCE(bc.book_count, 0) AS book_count,
COALESCE(sc.series_count, 0) AS series_count
FROM filtered f
LEFT JOIN book_counts bc ON bc.author_name = f.name
LEFT JOIN series_counts sc ON sc.author_name = f.name
ORDER BY {order_clause}
LIMIT $2 OFFSET $3
"#
);
let count_sql = r#"
WITH all_authors AS (
SELECT DISTINCT UNNEST(
COALESCE(
NULLIF(authors, '{}'),
CASE WHEN author IS NOT NULL AND author != '' THEN ARRAY[author] ELSE ARRAY[]::text[] END
)
) AS name
FROM books
UNION
SELECT DISTINCT UNNEST(authors) AS name
FROM series_metadata
WHERE authors != '{}'
)
SELECT COUNT(*) AS total
FROM all_authors
WHERE ($1::text IS NULL OR name ILIKE $1)
"#;
let (rows, count_row) = tokio::join!(
sqlx::query(&sql)
.bind(q_pattern.as_deref())
.bind(limit)
.bind(offset)
.fetch_all(&state.pool),
sqlx::query(count_sql)
.bind(q_pattern.as_deref())
.fetch_one(&state.pool)
);
let rows = rows.map_err(|e| ApiError::internal(format!("authors query failed: {e}")))?;
let total: i64 = count_row
.map_err(|e| ApiError::internal(format!("authors count failed: {e}")))?
.get("total");
let items: Vec<AuthorItem> = rows
.iter()
.map(|r| AuthorItem {
name: r.get("name"),
book_count: r.get("book_count"),
series_count: r.get("series_count"),
})
.collect();
Ok(Json(AuthorsPageResponse {
items,
total,
page,
limit,
}))
}

File diff suppressed because it is too large.

apps/api/src/index_jobs.rs

@@ -16,6 +16,10 @@ pub struct RebuildRequest {
     pub library_id: Option<Uuid>,
     #[schema(value_type = Option<bool>, example = false)]
     pub full: Option<bool>,
+    /// Deep rescan: clears directory mtimes to force re-walking all directories,
+    /// discovering newly supported formats without deleting existing data.
+    #[schema(value_type = Option<bool>, example = false)]
+    pub rescan: Option<bool>,
 }

 #[derive(Serialize, ToSchema)]
@@ -117,7 +121,8 @@ pub async fn enqueue_rebuild(
 ) -> Result<Json<IndexJobResponse>, ApiError> {
     let library_id = payload.as_ref().and_then(|p| p.0.library_id);
     let is_full = payload.as_ref().and_then(|p| p.0.full).unwrap_or(false);
-    let job_type = if is_full { "full_rebuild" } else { "rebuild" };
+    let is_rescan = payload.as_ref().and_then(|p| p.0.rescan).unwrap_or(false);
+    let job_type = if is_full { "full_rebuild" } else if is_rescan { "rescan" } else { "rebuild" };
     let id = Uuid::new_v4();
     sqlx::query(
apps/api/src/libraries.rs

@@ -23,6 +23,13 @@ pub struct LibraryResponse {
pub watcher_enabled: bool, pub watcher_enabled: bool,
pub metadata_provider: Option<String>, pub metadata_provider: Option<String>,
pub fallback_metadata_provider: Option<String>, pub fallback_metadata_provider: Option<String>,
pub metadata_refresh_mode: String,
#[schema(value_type = Option<String>)]
pub next_metadata_refresh_at: Option<chrono::DateTime<chrono::Utc>>,
pub series_count: i64,
/// First book IDs from up to 5 distinct series (for thumbnail fan display)
#[schema(value_type = Vec<String>)]
pub thumbnail_book_ids: Vec<Uuid>,
} }
#[derive(Deserialize, ToSchema)] #[derive(Deserialize, ToSchema)]
@@ -41,14 +48,27 @@ pub struct CreateLibraryRequest {
responses( responses(
(status = 200, body = Vec<LibraryResponse>), (status = 200, body = Vec<LibraryResponse>),
(status = 401, description = "Unauthorized"), (status = 401, description = "Unauthorized"),
(status = 403, description = "Forbidden - Admin scope required"),
), ),
security(("Bearer" = [])) security(("Bearer" = []))
)] )]
pub async fn list_libraries(State(state): State<AppState>) -> Result<Json<Vec<LibraryResponse>>, ApiError> { pub async fn list_libraries(State(state): State<AppState>) -> Result<Json<Vec<LibraryResponse>>, ApiError> {
let rows = sqlx::query( let rows = sqlx::query(
"SELECT l.id, l.name, l.root_path, l.enabled, l.monitor_enabled, l.scan_mode, l.next_scan_at, l.watcher_enabled, l.metadata_provider, l.fallback_metadata_provider, "SELECT l.id, l.name, l.root_path, l.enabled, l.monitor_enabled, l.scan_mode, l.next_scan_at, l.watcher_enabled, l.metadata_provider, l.fallback_metadata_provider, l.metadata_refresh_mode, l.next_metadata_refresh_at,
(SELECT COUNT(*) FROM books b WHERE b.library_id = l.id) as book_count (SELECT COUNT(*) FROM books b WHERE b.library_id = l.id) as book_count,
(SELECT COUNT(DISTINCT COALESCE(NULLIF(b.series, ''), 'unclassified')) FROM books b WHERE b.library_id = l.id) as series_count,
COALESCE((
SELECT ARRAY_AGG(first_id ORDER BY series_name)
FROM (
SELECT DISTINCT ON (COALESCE(NULLIF(b.series, ''), 'unclassified'))
COALESCE(NULLIF(b.series, ''), 'unclassified') as series_name,
b.id as first_id
FROM books b
WHERE b.library_id = l.id
ORDER BY COALESCE(NULLIF(b.series, ''), 'unclassified'),
b.volume NULLS LAST, b.title ASC
LIMIT 5
) sub
), ARRAY[]::uuid[]) as thumbnail_book_ids
FROM libraries l ORDER BY l.created_at DESC" FROM libraries l ORDER BY l.created_at DESC"
) )
.fetch_all(&state.pool) .fetch_all(&state.pool)
@@ -62,12 +82,16 @@ pub async fn list_libraries(State(state): State<AppState>) -> Result<Json<Vec<Li
root_path: row.get("root_path"), root_path: row.get("root_path"),
enabled: row.get("enabled"), enabled: row.get("enabled"),
book_count: row.get("book_count"), book_count: row.get("book_count"),
series_count: row.get("series_count"),
monitor_enabled: row.get("monitor_enabled"),
scan_mode: row.get("scan_mode"),
next_scan_at: row.get("next_scan_at"),
watcher_enabled: row.get("watcher_enabled"),
metadata_provider: row.get("metadata_provider"),
fallback_metadata_provider: row.get("fallback_metadata_provider"),
metadata_refresh_mode: row.get("metadata_refresh_mode"),
next_metadata_refresh_at: row.get("next_metadata_refresh_at"),
thumbnail_book_ids: row.get("thumbnail_book_ids"),
})
.collect();
@@ -115,12 +139,16 @@ pub async fn create_library(
root_path,
enabled: true,
book_count: 0,
series_count: 0,
monitor_enabled: false,
scan_mode: "manual".to_string(),
next_scan_at: None,
watcher_enabled: false,
metadata_provider: None,
fallback_metadata_provider: None,
metadata_refresh_mode: "manual".to_string(),
next_metadata_refresh_at: None,
thumbnail_book_ids: vec![],
}))
}
@@ -192,7 +220,6 @@ use crate::index_jobs::{IndexJobResponse, RebuildRequest};
(status = 200, body = IndexJobResponse),
(status = 404, description = "Library not found"),
(status = 401, description = "Unauthorized"),
(status = 403, description = "Forbidden - Admin scope required"),
),
security(("Bearer" = []))
)]
@@ -212,7 +239,8 @@ pub async fn scan_library(
}
let is_full = payload.as_ref().and_then(|p| p.full).unwrap_or(false);
let is_rescan = payload.as_ref().and_then(|p| p.rescan).unwrap_or(false);
let job_type = if is_full { "full_rebuild" } else if is_rescan { "rescan" } else { "rebuild" };
// Create indexing job for this library
let job_id = Uuid::new_v4();
@@ -241,6 +269,8 @@ pub struct UpdateMonitoringRequest {
#[schema(value_type = String, example = "hourly")]
pub scan_mode: String, // 'manual', 'hourly', 'daily', 'weekly'
pub watcher_enabled: Option<bool>,
#[schema(value_type = Option<String>, example = "daily")]
pub metadata_refresh_mode: Option<String>, // 'manual', 'hourly', 'daily', 'weekly'
}
/// Update monitoring settings for a library
@@ -271,6 +301,12 @@ pub async fn update_monitoring(
return Err(ApiError::bad_request("scan_mode must be one of: manual, hourly, daily, weekly"));
}
// Validate metadata_refresh_mode
let metadata_refresh_mode = input.metadata_refresh_mode.as_deref().unwrap_or("manual");
if !valid_modes.contains(&metadata_refresh_mode) {
return Err(ApiError::bad_request("metadata_refresh_mode must be one of: manual, hourly, daily, weekly"));
}
// Calculate next_scan_at if monitoring is enabled
let next_scan_at = if input.monitor_enabled {
let interval_minutes = match input.scan_mode.as_str() {
@@ -284,16 +320,31 @@ pub async fn update_monitoring(
None
};
// Calculate next_metadata_refresh_at
let next_metadata_refresh_at = if metadata_refresh_mode != "manual" {
let interval_minutes = match metadata_refresh_mode {
"hourly" => 60,
"daily" => 1440,
"weekly" => 10080,
_ => 1440,
};
Some(chrono::Utc::now() + chrono::Duration::minutes(interval_minutes))
} else {
None
};
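The mode-to-interval scheduling above can be condensed into a small helper. This is an illustrative sketch only (the helper name `interval_minutes` is not in the diff): the handler validates the mode beforehand and handles `"manual"` in the surrounding `if`, so the 1440-minute catch-all is a defensive default rather than a normally reachable branch.

```rust
/// Map a refresh mode to its interval in minutes (sketch).
fn interval_minutes(mode: &str) -> Option<i64> {
    match mode {
        "manual" => None, // no automatic refresh is scheduled
        "hourly" => Some(60),
        "daily" => Some(1440),
        "weekly" => Some(10080),
        _ => Some(1440), // defensive default, mirroring the handler's match
    }
}

fn main() {
    assert_eq!(interval_minutes("hourly"), Some(60));
    assert_eq!(interval_minutes("weekly"), Some(10080));
    assert_eq!(interval_minutes("manual"), None);
    println!("ok");
}
```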
let watcher_enabled = input.watcher_enabled.unwrap_or(false);
let result = sqlx::query(
"UPDATE libraries SET monitor_enabled = $2, scan_mode = $3, next_scan_at = $4, watcher_enabled = $5, metadata_refresh_mode = $6, next_metadata_refresh_at = $7 WHERE id = $1 RETURNING id, name, root_path, enabled, monitor_enabled, scan_mode, next_scan_at, watcher_enabled, metadata_provider, fallback_metadata_provider, metadata_refresh_mode, next_metadata_refresh_at"
)
.bind(library_id)
.bind(input.monitor_enabled)
.bind(input.scan_mode)
.bind(next_scan_at)
.bind(watcher_enabled)
.bind(metadata_refresh_mode)
.bind(next_metadata_refresh_at)
.fetch_optional(&state.pool)
.await?;
@@ -306,18 +357,38 @@ pub async fn update_monitoring(
.fetch_one(&state.pool)
.await?;
let series_count: i64 = sqlx::query_scalar("SELECT COUNT(DISTINCT COALESCE(NULLIF(series, ''), 'unclassified')) FROM books WHERE library_id = $1")
.bind(library_id)
.fetch_one(&state.pool)
.await?;
let thumbnail_book_ids: Vec<Uuid> = sqlx::query_scalar(
"SELECT b.id FROM books b
WHERE b.library_id = $1
ORDER BY COALESCE(NULLIF(b.series, ''), 'unclassified'), b.volume NULLS LAST, b.title ASC
LIMIT 5"
)
.bind(library_id)
.fetch_all(&state.pool)
.await
.unwrap_or_default();
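The thumbnail queries above order books by series, then `b.volume NULLS LAST, b.title ASC`. As a sketch of what `NULLS LAST` means for the volume key, here is the equivalent ordering over in-memory `Option<i32>` values (the helper name `sort_volumes` is illustrative, not from the diff):

```rust
// Ascending sort with missing (None) volumes pushed to the end,
// mirroring SQL's `volume NULLS LAST`.
fn sort_volumes(mut v: Vec<Option<i32>>) -> Vec<Option<i32>> {
    // Key (is_none, value): false < true sends None entries last;
    // Some values then compare by their inner number.
    v.sort_by_key(|x| (x.is_none(), *x));
    v
}

fn main() {
    assert_eq!(
        sort_volumes(vec![None, Some(3), Some(1)]),
        vec![Some(1), Some(3), None]
    );
    println!("ok");
}
```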
Ok(Json(LibraryResponse {
id: row.get("id"),
name: row.get("name"),
root_path: row.get("root_path"),
enabled: row.get("enabled"),
book_count,
series_count,
monitor_enabled: row.get("monitor_enabled"),
scan_mode: row.get("scan_mode"),
next_scan_at: row.get("next_scan_at"),
watcher_enabled: row.get("watcher_enabled"),
metadata_provider: row.get("metadata_provider"),
fallback_metadata_provider: row.get("fallback_metadata_provider"),
metadata_refresh_mode: row.get("metadata_refresh_mode"),
next_metadata_refresh_at: row.get("next_metadata_refresh_at"),
thumbnail_book_ids,
}))
}
@@ -353,7 +424,7 @@ pub async fn update_metadata_provider(
let fallback = input.fallback_metadata_provider.as_deref().filter(|s| !s.is_empty());
let result = sqlx::query(
"UPDATE libraries SET metadata_provider = $2, fallback_metadata_provider = $3 WHERE id = $1 RETURNING id, name, root_path, enabled, monitor_enabled, scan_mode, next_scan_at, watcher_enabled, metadata_provider, fallback_metadata_provider, metadata_refresh_mode, next_metadata_refresh_at"
)
.bind(library_id)
.bind(provider)
@@ -370,17 +441,37 @@ pub async fn update_metadata_provider(
.fetch_one(&state.pool)
.await?;
let series_count: i64 = sqlx::query_scalar("SELECT COUNT(DISTINCT COALESCE(NULLIF(series, ''), 'unclassified')) FROM books WHERE library_id = $1")
.bind(library_id)
.fetch_one(&state.pool)
.await?;
let thumbnail_book_ids: Vec<Uuid> = sqlx::query_scalar(
"SELECT b.id FROM books b
WHERE b.library_id = $1
ORDER BY COALESCE(NULLIF(b.series, ''), 'unclassified'), b.volume NULLS LAST, b.title ASC
LIMIT 5"
)
.bind(library_id)
.fetch_all(&state.pool)
.await
.unwrap_or_default();
Ok(Json(LibraryResponse {
id: row.get("id"),
name: row.get("name"),
root_path: row.get("root_path"),
enabled: row.get("enabled"),
book_count,
series_count,
monitor_enabled: row.get("monitor_enabled"),
scan_mode: row.get("scan_mode"),
next_scan_at: row.get("next_scan_at"),
watcher_enabled: row.get("watcher_enabled"),
metadata_provider: row.get("metadata_provider"),
fallback_metadata_provider: row.get("fallback_metadata_provider"),
metadata_refresh_mode: row.get("metadata_refresh_mode"),
next_metadata_refresh_at: row.get("next_metadata_refresh_at"),
thumbnail_book_ids,
}))
}

View File

@@ -1,4 +1,5 @@
mod auth;
mod authors;
mod books;
mod error;
mod handlers;
@@ -16,9 +17,11 @@ mod prowlarr;
mod qbittorrent;
mod reading_progress;
mod search;
mod series;
mod settings;
mod state;
mod stats;
mod telegram;
mod thumbnails;
mod tokens;
@@ -85,14 +88,13 @@ async fn main() -> anyhow::Result<()> {
};
let admin_routes = Router::new()
.route("/libraries", axum::routing::post(libraries::create_library))
.route("/libraries/:id", delete(libraries::delete_library))
.route("/libraries/:id/scan", axum::routing::post(libraries::scan_library))
.route("/libraries/:id/monitoring", axum::routing::patch(libraries::update_monitoring))
.route("/libraries/:id/metadata-provider", axum::routing::patch(libraries::update_metadata_provider))
.route("/books/:id", axum::routing::patch(books::update_book))
.route("/books/:id/convert", axum::routing::post(books::convert_book))
.route("/libraries/:library_id/series/:name", axum::routing::patch(series::update_series))
.route("/index/rebuild", axum::routing::post(index_jobs::enqueue_rebuild))
.route("/index/thumbnails/rebuild", axum::routing::post(thumbnails::start_thumbnails_rebuild))
.route("/index/thumbnails/regenerate", axum::routing::post(thumbnails::start_thumbnails_regenerate))
@@ -110,6 +112,7 @@ async fn main() -> anyhow::Result<()> {
.route("/prowlarr/test", get(prowlarr::test_prowlarr))
.route("/qbittorrent/add", axum::routing::post(qbittorrent::add_torrent))
.route("/qbittorrent/test", get(qbittorrent::test_qbittorrent))
.route("/telegram/test", get(telegram::test_telegram))
.route("/komga/sync", axum::routing::post(komga::sync_komga_read_books))
.route("/komga/reports", get(komga::list_sync_reports))
.route("/komga/reports/:id", get(komga::get_sync_report))
@@ -132,19 +135,22 @@ async fn main() -> anyhow::Result<()> {
));
let read_routes = Router::new()
.route("/libraries", get(libraries::list_libraries))
.route("/libraries/:id/scan", axum::routing::post(libraries::scan_library))
.route("/books", get(books::list_books))
.route("/books/ongoing", get(series::ongoing_books))
.route("/books/:id", get(books::get_book))
.route("/books/:id/thumbnail", get(books::get_thumbnail))
.route("/books/:id/pages/:n", get(pages::get_page))
.route("/books/:id/progress", get(reading_progress::get_reading_progress).patch(reading_progress::update_reading_progress))
.route("/libraries/:library_id/series", get(series::list_series))
.route("/libraries/:library_id/series/:name/metadata", get(series::get_series_metadata))
.route("/series", get(series::list_all_series))
.route("/series/ongoing", get(series::ongoing_series))
.route("/series/statuses", get(series::series_statuses))
.route("/series/provider-statuses", get(series::provider_statuses))
.route("/series/mark-read", axum::routing::post(reading_progress::mark_series_read))
.route("/authors", get(authors::list_authors))
.route("/stats", get(stats::get_stats))
.route("/search", get(search::search_books))
.route_layer(middleware::from_fn_with_state(state.clone(), api_middleware::read_rate_limit))

View File

@@ -369,6 +369,26 @@ pub async fn approve_metadata(
.await?;
}
// Notify via Telegram (with first book thumbnail if available)
let provider_for_notif: String = row.get("provider");
let thumbnail_path: Option<String> = sqlx::query_scalar(
"SELECT thumbnail_path FROM books WHERE library_id = $1 AND series_name = $2 AND thumbnail_path IS NOT NULL ORDER BY sort_order LIMIT 1",
)
.bind(library_id)
.bind(&series_name)
.fetch_optional(&state.pool)
.await
.ok()
.flatten();
notifications::notify(
state.pool.clone(),
notifications::NotificationEvent::MetadataApproved {
series_name: series_name.clone(),
provider: provider_for_notif,
thumbnail_path,
},
);
Ok(Json(ApproveResponse {
status: "approved".to_string(),
report,

View File

@@ -124,6 +124,12 @@ pub async fn start_batch(
// Spawn the background processing task
let pool = state.pool.clone();
let library_name: Option<String> = sqlx::query_scalar("SELECT name FROM libraries WHERE id = $1")
.bind(library_id)
.fetch_optional(&state.pool)
.await
.ok()
.flatten();
tokio::spawn(async move {
if let Err(e) = process_metadata_batch(&pool, job_id, library_id).await {
warn!("[METADATA_BATCH] job {job_id} failed: {e}");
@@ -134,6 +140,13 @@ pub async fn start_batch(
.bind(e.to_string())
.execute(&pool)
.await;
notifications::notify(
pool.clone(),
notifications::NotificationEvent::MetadataBatchFailed {
library_name,
error: e.to_string(),
},
);
}
});
@@ -621,6 +634,21 @@ async fn process_metadata_batch(
info!("[METADATA_BATCH] job={job_id} completed: {processed}/{total} series processed");
let library_name: Option<String> = sqlx::query_scalar("SELECT name FROM libraries WHERE id = $1")
.bind(library_id)
.fetch_optional(pool)
.await
.ok()
.flatten();
notifications::notify(
pool.clone(),
notifications::NotificationEvent::MetadataBatchCompleted {
library_name,
total_series: total,
processed,
},
);
Ok(())
}

View File

@@ -133,6 +133,12 @@ pub async fn start_refresh(
// Spawn the background processing task
let pool = state.pool.clone();
let library_name: Option<String> = sqlx::query_scalar("SELECT name FROM libraries WHERE id = $1")
.bind(library_id)
.fetch_optional(&state.pool)
.await
.ok()
.flatten();
tokio::spawn(async move {
if let Err(e) = process_metadata_refresh(&pool, job_id, library_id).await {
warn!("[METADATA_REFRESH] job {job_id} failed: {e}");
@@ -143,6 +149,13 @@ pub async fn start_refresh(
.bind(e.to_string())
.execute(&pool)
.await;
notifications::notify(
pool.clone(),
notifications::NotificationEvent::MetadataRefreshFailed {
library_name,
error: e.to_string(),
},
);
}
});
@@ -319,6 +332,22 @@ async fn process_metadata_refresh(
info!("[METADATA_REFRESH] job={job_id} completed: {refreshed} updated, {unchanged} unchanged, {errors} errors");
let library_name: Option<String> = sqlx::query_scalar("SELECT name FROM libraries WHERE id = $1")
.bind(library_id)
.fetch_optional(pool)
.await
.ok()
.flatten();
notifications::notify(
pool.clone(),
notifications::NotificationEvent::MetadataRefreshCompleted {
library_name,
refreshed,
unchanged,
errors,
},
);
Ok(())
}

View File

@@ -10,14 +10,14 @@ use utoipa::OpenApi;
crate::reading_progress::update_reading_progress,
crate::reading_progress::mark_series_read,
crate::books::get_thumbnail,
crate::series::list_series,
crate::series::list_all_series,
crate::series::ongoing_series,
crate::series::ongoing_books,
crate::books::convert_book,
crate::books::update_book,
crate::series::get_series_metadata,
crate::series::update_series,
crate::pages::get_page,
crate::search::search_books,
crate::index_jobs::enqueue_rebuild,
@@ -35,10 +35,12 @@ use utoipa::OpenApi;
crate::libraries::delete_library,
crate::libraries::scan_library,
crate::libraries::update_monitoring,
crate::libraries::update_metadata_provider,
crate::tokens::list_tokens,
crate::tokens::create_token,
crate::tokens::revoke_token,
crate::tokens::delete_token,
crate::authors::list_authors,
crate::stats::get_stats,
crate::settings::get_settings,
crate::settings::get_setting,
@@ -53,8 +55,8 @@ use utoipa::OpenApi;
crate::metadata::get_metadata_links,
crate::metadata::get_missing_books,
crate::metadata::delete_metadata_link,
crate::series::series_statuses,
crate::series::provider_statuses,
crate::settings::list_status_mappings,
crate::settings::upsert_status_mapping,
crate::settings::delete_status_mapping,
@@ -62,6 +64,14 @@ use utoipa::OpenApi;
crate::prowlarr::test_prowlarr,
crate::qbittorrent::add_torrent,
crate::qbittorrent::test_qbittorrent,
crate::metadata_batch::start_batch,
crate::metadata_batch::get_batch_report,
crate::metadata_batch::get_batch_results,
crate::metadata_refresh::start_refresh,
crate::metadata_refresh::get_refresh_report,
crate::komga::sync_komga_read_books,
crate::komga::list_sync_reports,
crate::komga::get_sync_report,
),
components(
schemas(
@@ -73,14 +83,14 @@ use utoipa::OpenApi;
crate::reading_progress::UpdateReadingProgressRequest,
crate::reading_progress::MarkSeriesReadRequest,
crate::reading_progress::MarkSeriesReadResponse,
crate::series::SeriesItem,
crate::series::SeriesPage,
crate::series::ListAllSeriesQuery,
crate::series::OngoingQuery,
crate::books::UpdateBookRequest,
crate::series::SeriesMetadata,
crate::series::UpdateSeriesRequest,
crate::series::UpdateSeriesResponse,
crate::pages::PageQuery,
crate::search::SearchQuery,
crate::search::SearchResponse,
@@ -95,6 +105,7 @@ use utoipa::OpenApi;
crate::libraries::LibraryResponse,
crate::libraries::CreateLibraryRequest,
crate::libraries::UpdateMonitoringRequest,
crate::libraries::UpdateMetadataProviderRequest,
crate::tokens::CreateTokenRequest,
crate::tokens::TokenResponse,
crate::tokens::CreatedTokenResponse,
@@ -104,6 +115,9 @@ use utoipa::OpenApi;
crate::settings::ThumbnailStats,
crate::settings::StatusMappingDto,
crate::settings::UpsertStatusMappingRequest,
crate::authors::ListAuthorsQuery,
crate::authors::AuthorItem,
crate::authors::AuthorsPageResponse,
crate::stats::StatsResponse,
crate::stats::StatsOverview,
crate::stats::ReadingStatusStats,
@@ -133,7 +147,16 @@ use utoipa::OpenApi;
crate::prowlarr::ProwlarrRelease,
crate::prowlarr::ProwlarrCategory,
crate::prowlarr::ProwlarrSearchResponse,
crate::prowlarr::MissingVolumeInput,
crate::prowlarr::ProwlarrTestResponse,
crate::metadata_batch::MetadataBatchRequest,
crate::metadata_batch::MetadataBatchReportDto,
crate::metadata_batch::MetadataBatchResultDto,
crate::metadata_refresh::MetadataRefreshRequest,
crate::metadata_refresh::MetadataRefreshReportDto,
crate::komga::KomgaSyncRequest,
crate::komga::KomgaSyncResponse,
crate::komga::KomgaSyncReportSummary,
ErrorResponse,
)
),
@@ -141,10 +164,16 @@ use utoipa::OpenApi;
("Bearer" = [])
),
tags(
(name = "books", description = "Book browsing, details and management"),
(name = "series", description = "Series browsing, filtering and management"),
(name = "search", description = "Full-text search across books and series"),
(name = "reading-progress", description = "Reading progress tracking per book"),
(name = "authors", description = "Author browsing and listing"),
(name = "stats", description = "Collection statistics and dashboard data"),
(name = "libraries", description = "Library listing, scanning, and management (create/delete/settings: Admin only)"),
(name = "indexing", description = "Search index management and job control (Admin only)"),
(name = "metadata", description = "External metadata providers and matching (Admin only)"),
(name = "komga", description = "Komga read-status sync (Admin only)"),
(name = "tokens", description = "API token management (Admin only)"),
(name = "settings", description = "Application settings and cache management (Admin only)"),
(name = "prowlarr", description = "Prowlarr indexer integration (Admin only)"),

View File

@@ -351,6 +351,7 @@ async fn prefetch_page(state: AppState, params: &PrefetchParams<'_>) {
Some(ref e) if e == "cbz" => "cbz",
Some(ref e) if e == "cbr" => "cbr",
Some(ref e) if e == "pdf" => "pdf",
Some(ref e) if e == "epub" => "epub",
_ => return,
}
.to_string();
@@ -479,6 +480,7 @@ fn render_page(
"cbz" => parsers::BookFormat::Cbz,
"cbr" => parsers::BookFormat::Cbr,
"pdf" => parsers::BookFormat::Pdf,
"epub" => parsers::BookFormat::Epub,
_ => return Err(ApiError::bad_request("unsupported source format")),
};

View File

@@ -7,15 +7,39 @@ use crate::{error::ApiError, state::AppState};
// ─── Types ──────────────────────────────────────────────────────────────────
#[derive(Deserialize, ToSchema)]
pub struct MissingVolumeInput {
pub volume_number: Option<i32>,
#[allow(dead_code)]
pub title: Option<String>,
}
#[derive(Deserialize, ToSchema)]
pub struct ProwlarrSearchRequest {
pub series_name: String,
pub volume_number: Option<i32>,
pub custom_query: Option<String>,
pub missing_volumes: Option<Vec<MissingVolumeInput>>,
}
#[derive(Serialize, Deserialize, ToSchema)]
#[serde(rename_all = "camelCase")]
pub struct ProwlarrRawRelease {
pub guid: String,
pub title: String,
pub size: i64,
pub download_url: Option<String>,
pub indexer: Option<String>,
pub seeders: Option<i32>,
pub leechers: Option<i32>,
pub publish_date: Option<String>,
pub protocol: Option<String>,
pub info_url: Option<String>,
pub categories: Option<Vec<ProwlarrCategory>>,
}
#[derive(Serialize, ToSchema)]
#[serde(rename_all = "camelCase")]
pub struct ProwlarrRelease {
pub guid: String,
pub title: String,
@@ -28,6 +52,8 @@ pub struct ProwlarrRelease {
pub protocol: Option<String>,
pub info_url: Option<String>,
pub categories: Option<Vec<ProwlarrCategory>>,
#[serde(skip_serializing_if = "Option::is_none")]
pub matched_missing_volumes: Option<Vec<i32>>,
}
#[derive(Serialize, Deserialize, ToSchema)]
@@ -83,6 +109,107 @@ async fn load_prowlarr_config(
Ok((url, config.api_key, categories))
}
// ─── Volume matching ─────────────────────────────────────────────────────────
/// Extract volume numbers from a release title.
/// Looks for patterns like: T01, Tome 01, Vol. 01, v01, #01,
/// or standalone numbers that appear after common separators.
fn extract_volumes_from_title(title: &str) -> Vec<i32> {
let lower = title.to_lowercase();
let mut volumes = Vec::new();
// Patterns: T01, Tome 01, Tome01, Vol 01, Vol.01, v01, #01
let prefixes = ["tome", "vol.", "vol ", "t", "v", "#"];
let chars: Vec<char> = lower.chars().collect();
let len = chars.len();
for prefix in &prefixes {
let mut start = 0;
while let Some(pos) = lower[start..].find(prefix) {
let abs_pos = start + pos;
let after = abs_pos + prefix.len();
// For single-char prefixes (t, v, #), ensure it's at a word boundary
if prefix.len() == 1 && *prefix != "#" {
if abs_pos > 0 && chars[abs_pos - 1].is_alphanumeric() {
start = after;
continue;
}
}
// Skip optional spaces after prefix
let mut i = after;
while i < len && chars[i] == ' ' {
i += 1;
}
// Read digits
let digit_start = i;
while i < len && chars[i].is_ascii_digit() {
i += 1;
}
if i > digit_start {
if let Ok(num) = lower[digit_start..i].parse::<i32>() {
if !volumes.contains(&num) {
volumes.push(num);
}
}
}
start = after;
}
}
volumes
}
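The prefix-scan above can be exercised in isolation. The following is a simplified, self-contained sketch of the same idea (byte-indexed and ASCII-only for brevity; the function name `volumes` is illustrative, not the diff's `extract_volumes_from_title`):

```rust
// Simplified sketch: scan for known volume prefixes and parse the
// digits that follow, deduplicating results in order of discovery.
fn volumes(title: &str) -> Vec<i32> {
    let lower = title.to_lowercase();
    let bytes = lower.as_bytes();
    let mut out = Vec::new();
    for prefix in ["tome", "vol.", "vol ", "t", "v", "#"] {
        let mut from = 0;
        while let Some(pos) = lower[from..].find(prefix) {
            let at = from + pos;
            let mut i = at + prefix.len();
            // Single-letter prefixes (t, v) must sit at a word boundary.
            if prefix.len() == 1 && prefix != "#" && at > 0 && bytes[at - 1].is_ascii_alphanumeric() {
                from = i;
                continue;
            }
            while i < bytes.len() && bytes[i] == b' ' { i += 1; } // skip spaces
            let start = i;
            while i < bytes.len() && bytes[i].is_ascii_digit() { i += 1; }
            if i > start {
                if let Ok(n) = lower[start..i].parse::<i32>() {
                    if !out.contains(&n) { out.push(n); }
                }
            }
            from = at + prefix.len();
        }
    }
    out
}

fn main() {
    assert_eq!(volumes("One Piece T01-T03"), vec![1, 3]);
    assert_eq!(volumes("Naruto Vol. 12"), vec![12]);
    assert_eq!(volumes("Berserk #14"), vec![14]);
    println!("ok");
}
```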
/// Match releases against missing volume numbers.
fn match_missing_volumes(
releases: Vec<ProwlarrRawRelease>,
missing: &[MissingVolumeInput],
) -> Vec<ProwlarrRelease> {
let missing_numbers: Vec<i32> = missing
.iter()
.filter_map(|m| m.volume_number)
.collect();
releases
.into_iter()
.map(|r| {
let matched = if missing_numbers.is_empty() {
None
} else {
let title_volumes = extract_volumes_from_title(&r.title);
let matched: Vec<i32> = title_volumes
.into_iter()
.filter(|v| missing_numbers.contains(v))
.collect();
if matched.is_empty() {
None
} else {
Some(matched)
}
};
ProwlarrRelease {
guid: r.guid,
title: r.title,
size: r.size,
download_url: r.download_url,
indexer: r.indexer,
seeders: r.seeders,
leechers: r.leechers,
publish_date: r.publish_date,
protocol: r.protocol,
info_url: r.info_url,
categories: r.categories,
matched_missing_volumes: matched,
}
})
.collect()
}
// ─── Handlers ───────────────────────────────────────────────────────────────
/// Search for releases on Prowlarr
@@ -149,13 +276,35 @@ pub async fn search_prowlarr(
tracing::debug!("Prowlarr raw response length: {} chars", raw_text.len());
let raw_releases: Vec<ProwlarrRawRelease> = serde_json::from_str(&raw_text)
.map_err(|e| {
tracing::error!("Failed to parse Prowlarr response: {e}");
tracing::error!("Raw response (first 500 chars): {}", &raw_text[..raw_text.len().min(500)]);
ApiError::internal(format!("Failed to parse Prowlarr response: {e}"))
})?;
let results = if let Some(missing) = &body.missing_volumes {
match_missing_volumes(raw_releases, missing)
} else {
raw_releases
.into_iter()
.map(|r| ProwlarrRelease {
guid: r.guid,
title: r.title,
size: r.size,
download_url: r.download_url,
indexer: r.indexer,
seeders: r.seeders,
leechers: r.leechers,
publish_date: r.publish_date,
protocol: r.protocol,
info_url: r.info_url,
categories: r.categories,
matched_missing_volumes: None,
})
.collect()
};
Ok(Json(ProwlarrSearchResponse { results, query }))
}


@@ -43,11 +43,11 @@ pub struct SearchResponse {
#[utoipa::path(
get,
path = "/search",
tag = "search",
params(
("q" = String, Query, description = "Search query (books + series via PostgreSQL full-text)"),
("library_id" = Option<String>, Query, description = "Filter by library ID"),
("type" = Option<String>, Query, description = "Filter by type (cbz, cbr, pdf, epub)"),
("kind" = Option<String>, Query, description = "Filter by kind (alias for type)"),
("limit" = Option<usize>, Query, description = "Max results per type (max 100)"),
),

apps/api/src/series.rs (new file, 1028 lines; diff suppressed because it is too large)


@@ -74,10 +74,36 @@ pub struct ProviderCount {
pub count: i64,
}
#[derive(Serialize, ToSchema)]
pub struct CurrentlyReadingItem {
pub book_id: String,
pub title: String,
pub series: Option<String>,
pub current_page: i32,
pub page_count: i32,
}
#[derive(Serialize, ToSchema)]
pub struct RecentlyReadItem {
pub book_id: String,
pub title: String,
pub series: Option<String>,
pub last_read_at: String,
}
#[derive(Serialize, ToSchema)]
pub struct MonthlyReading {
pub month: String,
pub books_read: i64,
}
#[derive(Serialize, ToSchema)]
pub struct StatsResponse {
pub overview: StatsOverview,
pub reading_status: ReadingStatusStats,
pub currently_reading: Vec<CurrentlyReadingItem>,
pub recently_read: Vec<RecentlyReadItem>,
pub reading_over_time: Vec<MonthlyReading>,
pub by_format: Vec<FormatCount>,
pub by_language: Vec<LanguageCount>,
pub by_library: Vec<LibraryStats>,
@@ -90,7 +116,7 @@ pub struct StatsResponse {
#[utoipa::path(
get,
path = "/stats",
tag = "stats",
responses(
(status = 200, body = StatsResponse),
(status = 401, description = "Unauthorized"),
@@ -327,9 +353,92 @@ pub async fn get_stats(
by_provider,
};
// Currently reading books
let reading_rows = sqlx::query(
r#"
SELECT b.id AS book_id, b.title, b.series, brp.current_page, b.page_count
FROM book_reading_progress brp
JOIN books b ON b.id = brp.book_id
WHERE brp.status = 'reading' AND brp.current_page IS NOT NULL
ORDER BY brp.updated_at DESC
LIMIT 20
"#,
)
.fetch_all(&state.pool)
.await?;
let currently_reading: Vec<CurrentlyReadingItem> = reading_rows
.iter()
.map(|r| {
let id: uuid::Uuid = r.get("book_id");
CurrentlyReadingItem {
book_id: id.to_string(),
title: r.get("title"),
series: r.get("series"),
current_page: r.get::<Option<i32>, _>("current_page").unwrap_or(0),
page_count: r.get::<Option<i32>, _>("page_count").unwrap_or(0),
}
})
.collect();
// Recently read books
let recent_rows = sqlx::query(
r#"
SELECT b.id AS book_id, b.title, b.series,
TO_CHAR(brp.last_read_at, 'YYYY-MM-DD') AS last_read_at
FROM book_reading_progress brp
JOIN books b ON b.id = brp.book_id
WHERE brp.status = 'read' AND brp.last_read_at IS NOT NULL
ORDER BY brp.last_read_at DESC
LIMIT 10
"#,
)
.fetch_all(&state.pool)
.await?;
let recently_read: Vec<RecentlyReadItem> = recent_rows
.iter()
.map(|r| {
let id: uuid::Uuid = r.get("book_id");
RecentlyReadItem {
book_id: id.to_string(),
title: r.get("title"),
series: r.get("series"),
last_read_at: r.get::<Option<String>, _>("last_read_at").unwrap_or_default(),
}
})
.collect();
// Reading activity over time (last 12 months)
let reading_time_rows = sqlx::query(
r#"
SELECT
TO_CHAR(DATE_TRUNC('month', brp.last_read_at), 'YYYY-MM') AS month,
COUNT(*) AS books_read
FROM book_reading_progress brp
WHERE brp.status = 'read'
AND brp.last_read_at >= DATE_TRUNC('month', NOW()) - INTERVAL '11 months'
GROUP BY DATE_TRUNC('month', brp.last_read_at)
ORDER BY month ASC
"#,
)
.fetch_all(&state.pool)
.await?;
let reading_over_time: Vec<MonthlyReading> = reading_time_rows
.iter()
.map(|r| MonthlyReading {
month: r.get::<Option<String>, _>("month").unwrap_or_default(),
books_read: r.get("books_read"),
})
.collect();
Ok(Json(StatsResponse {
overview,
reading_status,
currently_reading,
recently_read,
reading_over_time,
by_format,
by_language,
by_library,

apps/api/src/telegram.rs (new file, 46 lines)

@@ -0,0 +1,46 @@
use axum::{extract::State, Json};
use serde::Serialize;
use utoipa::ToSchema;
use crate::{error::ApiError, notifications, state::AppState};
#[derive(Serialize, ToSchema)]
pub struct TelegramTestResponse {
pub success: bool,
pub message: String,
}
/// Test Telegram connection by sending a test message
#[utoipa::path(
get,
path = "/telegram/test",
tag = "notifications",
responses(
(status = 200, body = TelegramTestResponse),
(status = 400, description = "Telegram not configured"),
(status = 401, description = "Unauthorized"),
),
security(("Bearer" = []))
)]
pub async fn test_telegram(
State(state): State<AppState>,
) -> Result<Json<TelegramTestResponse>, ApiError> {
let config = notifications::load_telegram_config(&state.pool)
.await
.ok_or_else(|| {
ApiError::bad_request(
"Telegram is not configured or disabled. Set bot_token, chat_id, and enable it.",
)
})?;
match notifications::send_test_message(&config).await {
Ok(()) => Ok(Json(TelegramTestResponse {
success: true,
message: "Test message sent successfully".to_string(),
})),
Err(e) => Ok(Json(TelegramTestResponse {
success: false,
message: format!("Failed to send: {e}"),
})),
}
}


@@ -28,12 +28,9 @@ export async function GET(
});
}
const contentType = response.headers.get("content-type") || "image/webp";
return new NextResponse(response.body, {
headers: {
"Content-Type": contentType,
"Cache-Control": "public, max-age=300",


@@ -9,10 +9,25 @@ export async function GET(
try {
const { baseUrl, token } = config();
const ifNoneMatch = request.headers.get("if-none-match");
const fetchHeaders: Record<string, string> = {
Authorization: `Bearer ${token}`,
};
if (ifNoneMatch) {
fetchHeaders["If-None-Match"] = ifNoneMatch;
}
const response = await fetch(`${baseUrl}/books/${bookId}/thumbnail`, {
headers: fetchHeaders,
next: { revalidate: 86400 },
});
// Forward 304 Not Modified as-is
if (response.status === 304) {
return new NextResponse(null, { status: 304 });
}
if (!response.ok) {
return new NextResponse(`Failed to fetch thumbnail: ${response.status}`, {
status: response.status
@@ -20,14 +35,17 @@ export async function GET(
}
const contentType = response.headers.get("content-type") || "image/webp";
const etag = response.headers.get("etag");
const headers: Record<string, string> = {
"Content-Type": contentType,
"Cache-Control": "public, max-age=31536000, immutable",
};
if (etag) {
headers["ETag"] = etag;
}
return new NextResponse(response.body, { headers });
} catch (error) {
console.error("Error fetching thumbnail:", error);
return new NextResponse("Failed to fetch thumbnail", { status: 500 });


@@ -11,6 +11,7 @@ export async function GET(request: NextRequest) {
let lastData: string | null = null;
let isActive = true;
let consecutiveErrors = 0;
let intervalId: ReturnType<typeof setInterval> | null = null;
const fetchJobs = async () => {
if (!isActive) return;
@@ -25,23 +26,28 @@ export async function GET(request: NextRequest) {
const data = await response.json();
const dataStr = JSON.stringify(data);
// Send only if data changed
if (dataStr !== lastData && isActive) {
lastData = dataStr;
try {
controller.enqueue(
new TextEncoder().encode(`data: ${dataStr}\n\n`)
);
} catch {
// Controller closed, ignore
isActive = false;
}
}
// Adapt interval: 2s when active jobs exist, 15s when idle
const hasActiveJobs = data.some((j: { status: string }) =>
j.status === "running" || j.status === "pending" || j.status === "extracting_pages" || j.status === "generating_thumbnails"
);
const nextInterval = hasActiveJobs ? 2000 : 15000;
restartInterval(nextInterval);
}
} catch (error) {
if (isActive) {
consecutiveErrors++;
// Only log first failure and every 30th to avoid spam
if (consecutiveErrors === 1 || consecutiveErrors % 30 === 0) {
console.warn(`SSE fetch error (${consecutiveErrors} consecutive):`, error);
}
@@ -49,22 +55,18 @@ export async function GET(request: NextRequest) {
}
};
const restartInterval = (ms: number) => {
if (intervalId !== null) clearInterval(intervalId);
intervalId = setInterval(fetchJobs, ms);
};
// Initial fetch + start polling
await fetchJobs();
// Cleanup
request.signal.addEventListener("abort", () => {
isActive = false;
if (intervalId !== null) clearInterval(intervalId);
controller.close();
});
},


@@ -0,0 +1,12 @@
import { NextResponse } from "next/server";
import { apiFetch } from "@/lib/api";
export async function GET() {
try {
const data = await apiFetch("/telegram/test");
return NextResponse.json(data);
} catch (error) {
const message = error instanceof Error ? error.message : "Failed to test Telegram connection";
return NextResponse.json({ error: message }, { status: 500 });
}
}


@@ -0,0 +1,135 @@
import { fetchBooks, fetchAllSeries, BooksPageDto, SeriesPageDto, getBookCoverUrl } from "../../../lib/api";
import { getServerTranslations } from "../../../lib/i18n/server";
import { BooksGrid } from "../../components/BookCard";
import { OffsetPagination } from "../../components/ui";
import Image from "next/image";
import Link from "next/link";
export const dynamic = "force-dynamic";
export default async function AuthorDetailPage({
params,
searchParams,
}: {
params: Promise<{ name: string }>;
searchParams: Promise<{ [key: string]: string | string[] | undefined }>;
}) {
const { t } = await getServerTranslations();
const { name: encodedName } = await params;
const authorName = decodeURIComponent(encodedName);
const searchParamsAwaited = await searchParams;
const page = typeof searchParamsAwaited.page === "string" ? parseInt(searchParamsAwaited.page) : 1;
const limit = typeof searchParamsAwaited.limit === "string" ? parseInt(searchParamsAwaited.limit) : 20;
// Fetch books by this author (server-side filtering via API) and series by this author
const [booksPage, seriesPage] = await Promise.all([
fetchBooks(undefined, undefined, page, limit, undefined, undefined, authorName).catch(
() => ({ items: [], total: 0, page: 1, limit }) as BooksPageDto
),
fetchAllSeries(undefined, undefined, undefined, 1, 200, undefined, undefined, undefined, undefined, authorName).catch(
() => ({ items: [], total: 0, page: 1, limit: 200 }) as SeriesPageDto
),
]);
const totalPages = Math.ceil(booksPage.total / limit);
const authorSeries = seriesPage.items;
return (
<>
{/* Breadcrumb */}
<nav className="flex items-center gap-2 text-sm text-muted-foreground mb-6">
<Link href="/authors" className="hover:text-foreground transition-colors">
{t("authors.title")}
</Link>
<span>/</span>
<span className="text-foreground font-medium">{authorName}</span>
</nav>
{/* Author Header */}
<div className="flex items-center gap-4 mb-8">
<div className="w-16 h-16 rounded-full bg-accent/50 flex items-center justify-center flex-shrink-0">
<span className="text-2xl font-bold text-accent-foreground">
{authorName.charAt(0).toUpperCase()}
</span>
</div>
<div>
<h1 className="text-3xl font-bold text-foreground">{authorName}</h1>
<div className="flex items-center gap-4 mt-1">
<span className="text-sm text-muted-foreground">
{t("authors.bookCount", { count: String(booksPage.total), plural: booksPage.total !== 1 ? "s" : "" })}
</span>
{authorSeries.length > 0 && (
<span className="text-sm text-muted-foreground">
{t("authors.seriesCount", { count: String(authorSeries.length), plural: authorSeries.length !== 1 ? "s" : "" })}
</span>
)}
</div>
</div>
</div>
{/* Series Section */}
{authorSeries.length > 0 && (
<section className="mb-8">
<h2 className="text-xl font-semibold text-foreground mb-4">
{t("authors.seriesBy", { name: authorName })}
</h2>
<div className="grid grid-cols-2 sm:grid-cols-3 md:grid-cols-4 lg:grid-cols-5 xl:grid-cols-6 gap-4">
{authorSeries.map((s) => (
<Link
key={`${s.library_id}-${s.name}`}
href={`/libraries/${s.library_id}/series/${encodeURIComponent(s.name)}`}
className="group"
>
<div className="bg-card rounded-xl shadow-sm border border-border/60 overflow-hidden hover:shadow-md hover:-translate-y-1 transition-all duration-200">
<div className="aspect-[2/3] relative bg-muted/50">
<Image
src={getBookCoverUrl(s.first_book_id)}
alt={s.name}
fill
className="object-cover"
sizes="(max-width: 640px) 50vw, (max-width: 768px) 33vw, (max-width: 1024px) 25vw, 16vw"
/>
</div>
<div className="p-3">
<h3 className="font-medium text-foreground truncate text-sm" title={s.name}>
{s.name}
</h3>
<p className="text-xs text-muted-foreground mt-1">
{t("authors.bookCount", { count: String(s.book_count), plural: s.book_count !== 1 ? "s" : "" })}
</p>
</div>
</div>
</Link>
))}
</div>
</section>
)}
{/* Books Section */}
{booksPage.items.length > 0 && (
<section>
<h2 className="text-xl font-semibold text-foreground mb-4">
{t("authors.booksBy", { name: authorName })}
</h2>
<BooksGrid books={booksPage.items} />
<OffsetPagination
currentPage={page}
totalPages={totalPages}
pageSize={limit}
totalItems={booksPage.total}
/>
</section>
)}
{/* Empty State */}
{booksPage.items.length === 0 && authorSeries.length === 0 && (
<div className="flex flex-col items-center justify-center py-16 text-center">
<p className="text-muted-foreground text-lg">
{t("authors.noResults")}
</p>
</div>
)}
</>
);
}


@@ -0,0 +1,122 @@
import { fetchAuthors, AuthorsPageDto } from "../../lib/api";
import { getServerTranslations } from "../../lib/i18n/server";
import { LiveSearchForm } from "../components/LiveSearchForm";
import { Card, CardContent, OffsetPagination } from "../components/ui";
import Link from "next/link";
export const dynamic = "force-dynamic";
export default async function AuthorsPage({
searchParams,
}: {
searchParams: Promise<{ [key: string]: string | string[] | undefined }>;
}) {
const { t } = await getServerTranslations();
const searchParamsAwaited = await searchParams;
const searchQuery = typeof searchParamsAwaited.q === "string" ? searchParamsAwaited.q : "";
const sort = typeof searchParamsAwaited.sort === "string" ? searchParamsAwaited.sort : undefined;
const page = typeof searchParamsAwaited.page === "string" ? parseInt(searchParamsAwaited.page) : 1;
const limit = typeof searchParamsAwaited.limit === "string" ? parseInt(searchParamsAwaited.limit) : 20;
const authorsPage = await fetchAuthors(
searchQuery || undefined,
page,
limit,
sort,
).catch(() => ({ items: [], total: 0, page: 1, limit }) as AuthorsPageDto);
const totalPages = Math.ceil(authorsPage.total / limit);
const hasFilters = searchQuery || sort;
const sortOptions = [
{ value: "", label: t("authors.sortName") },
{ value: "books", label: t("authors.sortBooks") },
];
return (
<>
<div className="mb-6">
<h1 className="text-3xl font-bold text-foreground flex items-center gap-3">
<svg className="w-8 h-8 text-violet-500" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M17 20h5v-2a3 3 0 00-5.356-1.857M17 20H7m10 0v-2c0-.656-.126-1.283-.356-1.857M7 20H2v-2a3 3 0 015.356-1.857M7 20v-2c0-.656.126-1.283.356-1.857m0 0a5.002 5.002 0 019.288 0M15 7a3 3 0 11-6 0 3 3 0 016 0zm6 3a2 2 0 11-4 0 2 2 0 014 0zM7 10a2 2 0 11-4 0 2 2 0 014 0z" />
</svg>
{t("authors.title")}
</h1>
</div>
<Card className="mb-6">
<CardContent className="pt-6">
<LiveSearchForm
basePath="/authors"
fields={[
{ name: "q", type: "text", label: t("common.search"), placeholder: t("authors.searchPlaceholder") },
{ name: "sort", type: "select", label: t("books.sort"), options: sortOptions },
]}
/>
</CardContent>
</Card>
{/* Results count */}
<p className="text-sm text-muted-foreground mb-4">
{authorsPage.total} {t("authors.title").toLowerCase()}
{searchQuery && <> {t("authors.matchingQuery")} &quot;{searchQuery}&quot;</>}
</p>
{/* Authors List */}
{authorsPage.items.length > 0 ? (
<>
<div className="grid grid-cols-1 sm:grid-cols-2 lg:grid-cols-3 xl:grid-cols-4 gap-4">
{authorsPage.items.map((author) => (
<Link
key={author.name}
href={`/authors/${encodeURIComponent(author.name)}`}
className="group"
>
<div className="bg-card rounded-xl shadow-sm border border-border/60 overflow-hidden hover:shadow-md hover:-translate-y-1 transition-all duration-200 p-4">
<div className="flex items-center gap-3">
<div className="w-10 h-10 rounded-full bg-accent/50 flex items-center justify-center flex-shrink-0">
<span className="text-lg font-semibold text-violet-500">
{author.name.charAt(0).toUpperCase()}
</span>
</div>
<div className="min-w-0">
<h3 className="font-medium text-foreground truncate text-sm group-hover:text-violet-500 transition-colors" title={author.name}>
{author.name}
</h3>
<div className="flex items-center gap-3 mt-0.5">
<span className="text-xs text-muted-foreground">
{t("authors.bookCount", { count: String(author.book_count), plural: author.book_count !== 1 ? "s" : "" })}
</span>
<span className="text-xs text-muted-foreground">
{t("authors.seriesCount", { count: String(author.series_count), plural: author.series_count !== 1 ? "s" : "" })}
</span>
</div>
</div>
</div>
</div>
</Link>
))}
</div>
<OffsetPagination
currentPage={page}
totalPages={totalPages}
pageSize={limit}
totalItems={authorsPage.total}
/>
</>
) : (
<div className="flex flex-col items-center justify-center py-16 text-center">
<div className="w-16 h-16 mb-4 text-muted-foreground/30">
<svg fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={1.5} d="M17 20h5v-2a3 3 0 00-5.356-1.857M17 20H7m10 0v-2c0-.656-.126-1.283-.356-1.857M7 20H2v-2a3 3 0 015.356-1.857M7 20v-2c0-.656.126-1.283.356-1.857m0 0a5.002 5.002 0 019.288 0M15 7a3 3 0 11-6 0 3 3 0 016 0zm6 3a2 2 0 11-4 0 2 2 0 014 0zM7 10a2 2 0 11-4 0 2 2 0 014 0z" />
</svg>
</div>
<p className="text-muted-foreground text-lg">
{hasFilters ? t("authors.noResults") : t("authors.noAuthors")}
</p>
</div>
)}
</>
);
}


@@ -2,11 +2,15 @@ import { fetchLibraries, getBookCoverUrl, BookDto, apiFetch, ReadingStatus } fro
import { BookPreview } from "../../components/BookPreview";
import { ConvertButton } from "../../components/ConvertButton";
import { MarkBookReadButton } from "../../components/MarkBookReadButton";
import nextDynamic from "next/dynamic";
import { SafeHtml } from "../../components/SafeHtml";
import { getServerTranslations } from "../../../lib/i18n/server";
import Image from "next/image";
import Link from "next/link";
const EditBookForm = nextDynamic(
() => import("../../components/EditBookForm").then(m => m.EditBookForm)
);
import { notFound } from "next/navigation";
export const dynamic = "force-dynamic";
@@ -95,7 +99,7 @@ export default async function BookDetailPage({
alt={t("bookDetail.coverOf", { title: book.title })}
fill
className="object-cover"
sizes="192px"
loading="lazy"
/>
</div>


@@ -18,6 +18,8 @@ export default async function BooksPage({
const libraryId = typeof searchParamsAwaited.library === "string" ? searchParamsAwaited.library : undefined;
const searchQuery = typeof searchParamsAwaited.q === "string" ? searchParamsAwaited.q : "";
const readingStatus = typeof searchParamsAwaited.status === "string" ? searchParamsAwaited.status : undefined;
const format = typeof searchParamsAwaited.format === "string" ? searchParamsAwaited.format : undefined;
const metadataProvider = typeof searchParamsAwaited.metadata === "string" ? searchParamsAwaited.metadata : undefined;
const sort = typeof searchParamsAwaited.sort === "string" ? searchParamsAwaited.sort : undefined;
const page = typeof searchParamsAwaited.page === "string" ? parseInt(searchParamsAwaited.page) : 1;
const limit = typeof searchParamsAwaited.limit === "string" ? parseInt(searchParamsAwaited.limit) : 20;
@@ -62,7 +64,7 @@ export default async function BooksPage({
totalHits = searchResponse.estimated_total_hits;
}
} else {
const booksPage = await fetchBooks(libraryId, undefined, page, limit, readingStatus, sort, undefined, format, metadataProvider).catch(() => ({
items: [] as BookDto[],
total: 0,
page: 1,
@@ -91,12 +93,26 @@ export default async function BooksPage({
{ value: "read", label: t("status.read") },
];
const formatOptions = [
{ value: "", label: t("books.allFormats") },
{ value: "cbz", label: "CBZ" },
{ value: "cbr", label: "CBR" },
{ value: "pdf", label: "PDF" },
{ value: "epub", label: "EPUB" },
];
const metadataOptions = [
{ value: "", label: t("series.metadataAll") },
{ value: "linked", label: t("series.metadataLinked") },
{ value: "unlinked", label: t("series.metadataUnlinked") },
];
const sortOptions = [
{ value: "", label: t("books.sortTitle") },
{ value: "latest", label: t("books.sortLatest") },
];
const hasFilters = searchQuery || libraryId || readingStatus || format || metadataProvider || sort;
return (
<>
@@ -117,6 +133,8 @@ export default async function BooksPage({
{ name: "q", type: "text", label: t("common.search"), placeholder: t("books.searchPlaceholder") },
{ name: "library", type: "select", label: t("books.library"), options: libraryOptions },
{ name: "status", type: "select", label: t("books.status"), options: statusOptions },
{ name: "format", type: "select", label: t("books.format"), options: formatOptions },
{ name: "metadata", type: "select", label: t("series.metadata"), options: metadataOptions },
{ name: "sort", type: "select", label: t("books.sort"), options: sortOptions },
]}
/>
@@ -152,7 +170,7 @@ export default async function BooksPage({
alt={t("books.coverOf", { name: s.name })}
fill
className="object-cover"
sizes="(max-width: 640px) 50vw, (max-width: 768px) 33vw, (max-width: 1024px) 25vw, 16vw"
/>
</div>
<div className="p-2">


@@ -1,6 +1,6 @@
"use client";
import { memo, useState } from "react";
import Image from "next/image";
import Link from "next/link";
import { BookDto, ReadingStatus } from "../../lib/api";
@@ -17,7 +17,7 @@ interface BookCardProps {
readingStatus?: ReadingStatus;
}
const BookImage = memo(function BookImage({ src, alt }: { src: string; alt: string }) {
const [isLoaded, setIsLoaded] = useState(false);
const [hasError, setHasError] = useState(false);
@@ -51,13 +51,12 @@ function BookImage({ src, alt }: { src: string; alt: string }) {
sizes="(max-width: 640px) 50vw, (max-width: 768px) 33vw, (max-width: 1024px) 25vw, 16vw"
onLoad={() => setIsLoaded(true)}
onError={() => setHasError(true)}
/>
</div>
);
});
export const BookCard = memo(function BookCard({ book, readingStatus }: BookCardProps) {
const { t } = useTranslation();
const coverUrl = book.coverUrl || `/api/books/${book.id}/thumbnail`;
const status = readingStatus ?? book.reading_status;
@@ -115,6 +114,7 @@ export function BookCard({ book, readingStatus }: BookCardProps) {
${(book.format ?? book.kind) === 'cbz' ? 'bg-success/10 text-success' : ''}
${(book.format ?? book.kind) === 'cbr' ? 'bg-warning/10 text-warning' : ''}
${(book.format ?? book.kind) === 'pdf' ? 'bg-destructive/10 text-destructive' : ''}
${(book.format ?? book.kind) === 'epub' ? 'bg-info/10 text-info' : ''}
`}>
{book.format ?? book.kind}
</span>
@@ -128,7 +128,7 @@ export function BookCard({ book, readingStatus }: BookCardProps) {
</div>
</Link>
);
});
interface BooksGridProps {
books: (BookDto & { coverUrl?: string })[];


@@ -0,0 +1,188 @@
"use client";
import {
PieChart, Pie, Cell, ResponsiveContainer, Tooltip,
BarChart, Bar, XAxis, YAxis, CartesianGrid,
AreaChart, Area,
Legend,
} from "recharts";
// ---------------------------------------------------------------------------
// Donut
// ---------------------------------------------------------------------------
export function RcDonutChart({
data,
noDataLabel,
}: {
data: { name: string; value: number; color: string }[];
noDataLabel?: string;
}) {
const total = data.reduce((s, d) => s + d.value, 0);
if (total === 0) return <p className="text-muted-foreground text-sm text-center py-8">{noDataLabel}</p>;
return (
<div className="flex items-center gap-4">
<ResponsiveContainer width={130} height={130}>
<PieChart>
<Pie
data={data}
cx="50%"
cy="50%"
innerRadius={32}
outerRadius={55}
dataKey="value"
strokeWidth={0}
>
{data.map((d, i) => (
<Cell key={i} fill={d.color} />
))}
</Pie>
<Tooltip
formatter={(value) => value}
contentStyle={{ backgroundColor: "var(--color-card)", border: "1px solid var(--color-border)", borderRadius: 8, fontSize: 12 }}
/>
</PieChart>
</ResponsiveContainer>
<div className="flex flex-col gap-1.5 min-w-0">
{data.map((d, i) => (
<div key={i} className="flex items-center gap-2 text-sm">
<span className="w-3 h-3 rounded-full shrink-0" style={{ backgroundColor: d.color }} />
<span className="text-muted-foreground truncate">{d.name}</span>
<span className="font-medium text-foreground ml-auto">{d.value}</span>
</div>
))}
</div>
</div>
);
}
// ---------------------------------------------------------------------------
// Bar chart
// ---------------------------------------------------------------------------
export function RcBarChart({
data,
color = "hsl(198 78% 37%)",
noDataLabel,
}: {
data: { label: string; value: number }[];
color?: string;
noDataLabel?: string;
}) {
if (data.length === 0) return <p className="text-muted-foreground text-sm text-center py-8">{noDataLabel}</p>;
return (
<ResponsiveContainer width="100%" height={180}>
<BarChart data={data} margin={{ top: 5, right: 5, bottom: 0, left: -20 }}>
<CartesianGrid strokeDasharray="3 3" vertical={false} stroke="var(--color-border)" opacity={0.3} />
<XAxis dataKey="label" tick={{ fontSize: 11, fill: "var(--color-muted-foreground)" }} axisLine={false} tickLine={false} />
<YAxis tick={{ fontSize: 11, fill: "var(--color-muted-foreground)" }} axisLine={false} tickLine={false} allowDecimals={false} />
<Tooltip
contentStyle={{ backgroundColor: "var(--color-card)", border: "1px solid var(--color-border)", borderRadius: 8, fontSize: 12 }}
/>
<Bar dataKey="value" fill={color} radius={[4, 4, 0, 0]} />
</BarChart>
</ResponsiveContainer>
);
}
// ---------------------------------------------------------------------------
// Area / Line chart
// ---------------------------------------------------------------------------
export function RcAreaChart({
data,
color = "hsl(142 60% 45%)",
noDataLabel,
}: {
data: { label: string; value: number }[];
color?: string;
noDataLabel?: string;
}) {
if (data.length === 0) return <p className="text-muted-foreground text-sm text-center py-8">{noDataLabel}</p>;
return (
<ResponsiveContainer width="100%" height={180}>
<AreaChart data={data} margin={{ top: 5, right: 5, bottom: 0, left: -20 }}>
<defs>
<linearGradient id="areaGradient" x1="0" y1="0" x2="0" y2="1">
<stop offset="0%" stopColor={color} stopOpacity={0.3} />
<stop offset="100%" stopColor={color} stopOpacity={0} />
</linearGradient>
</defs>
<CartesianGrid strokeDasharray="3 3" vertical={false} stroke="var(--color-border)" opacity={0.3} />
<XAxis dataKey="label" tick={{ fontSize: 11, fill: "var(--color-muted-foreground)" }} axisLine={false} tickLine={false} />
<YAxis tick={{ fontSize: 11, fill: "var(--color-muted-foreground)" }} axisLine={false} tickLine={false} allowDecimals={false} />
<Tooltip
contentStyle={{ backgroundColor: "var(--color-card)", border: "1px solid var(--color-border)", borderRadius: 8, fontSize: 12 }}
/>
<Area type="monotone" dataKey="value" stroke={color} strokeWidth={2} fill="url(#areaGradient)" dot={{ r: 3, fill: color }} />
</AreaChart>
</ResponsiveContainer>
);
}
// ---------------------------------------------------------------------------
// Horizontal stacked bar (libraries breakdown)
// ---------------------------------------------------------------------------
export function RcStackedBar({
data,
labels,
}: {
data: { name: string; read: number; reading: number; unread: number; sizeLabel: string }[];
labels: { read: string; reading: string; unread: string; books: string };
}) {
if (data.length === 0) return null;
return (
<ResponsiveContainer width="100%" height={data.length * 60 + 30}>
<BarChart data={data} layout="vertical" margin={{ top: 0, right: 5, bottom: 0, left: 5 }}>
<CartesianGrid strokeDasharray="3 3" horizontal={false} stroke="var(--color-border)" opacity={0.3} />
<XAxis type="number" tick={{ fontSize: 11, fill: "var(--color-muted-foreground)" }} axisLine={false} tickLine={false} allowDecimals={false} />
<YAxis type="category" dataKey="name" tick={{ fontSize: 12, fill: "var(--color-foreground)" }} axisLine={false} tickLine={false} width={120} />
<Tooltip
contentStyle={{ backgroundColor: "var(--color-card)", border: "1px solid var(--color-border)", borderRadius: 8, fontSize: 12 }}
/>
<Legend
wrapperStyle={{ fontSize: 11 }}
formatter={(value: string) => <span className="text-muted-foreground">{value}</span>}
/>
<Bar dataKey="read" stackId="a" fill="hsl(142 60% 45%)" name={labels.read} radius={[0, 0, 0, 0]} />
<Bar dataKey="reading" stackId="a" fill="hsl(45 93% 47%)" name={labels.reading} />
<Bar dataKey="unread" stackId="a" fill="hsl(220 13% 70%)" name={labels.unread} radius={[0, 4, 4, 0]} />
</BarChart>
</ResponsiveContainer>
);
}
// ---------------------------------------------------------------------------
// Horizontal bar chart (top series)
// ---------------------------------------------------------------------------
export function RcHorizontalBar({
data,
color = "hsl(142 60% 45%)",
noDataLabel,
}: {
data: { name: string; value: number; subLabel: string }[];
color?: string;
noDataLabel?: string;
}) {
if (data.length === 0) return <p className="text-muted-foreground text-sm text-center py-4">{noDataLabel}</p>;
return (
<ResponsiveContainer width="100%" height={data.length * 40 + 10}>
<BarChart data={data} layout="vertical" margin={{ top: 0, right: 5, bottom: 0, left: 5 }}>
<CartesianGrid strokeDasharray="3 3" horizontal={false} stroke="var(--color-border)" opacity={0.3} />
<XAxis type="number" tick={{ fontSize: 11, fill: "var(--color-muted-foreground)" }} axisLine={false} tickLine={false} allowDecimals={false} />
<YAxis type="category" dataKey="name" tick={{ fontSize: 11, fill: "var(--color-foreground)" }} axisLine={false} tickLine={false} width={120} />
<Tooltip
contentStyle={{ backgroundColor: "var(--color-card)", border: "1px solid var(--color-border)", borderRadius: 8, fontSize: 12 }}
/>
<Bar dataKey="value" fill={color} radius={[0, 4, 4, 0]} />
</BarChart>
</ResponsiveContainer>
);
}


@@ -1,6 +1,7 @@
 "use client";
 
 import { useState } from "react";
+import { createPortal } from "react-dom";
 import { FolderBrowser } from "./FolderBrowser";
 import { FolderItem } from "../../lib/api";
 import { Button } from "./ui";
@@ -64,7 +65,7 @@ export function FolderPicker({ initialFolders, selectedPath, onSelect }: FolderP
       </div>
 
       {/* Popup Modal */}
-      {isOpen && (
+      {isOpen && createPortal(
         <>
           {/* Backdrop */}
           <div
@@ -121,7 +122,8 @@ export function FolderPicker({ initialFolders, selectedPath, onSelect }: FolderP
             </div>
           </div>
         </div>
-        </>
+        </>,
+        document.body
       )}
     </div>
   );


@@ -0,0 +1,44 @@
"use client";
import { useEffect, useRef } from "react";
import { useRouter } from "next/navigation";
interface JobDetailLiveProps {
jobId: string;
isTerminal: boolean;
}
export function JobDetailLive({ jobId, isTerminal }: JobDetailLiveProps) {
const router = useRouter();
const isTerminalRef = useRef(isTerminal);
isTerminalRef.current = isTerminal;
useEffect(() => {
if (isTerminalRef.current) return;
const eventSource = new EventSource(`/api/jobs/${jobId}/stream`);
eventSource.onmessage = (event) => {
try {
const data = JSON.parse(event.data);
router.refresh();
if (data.status === "success" || data.status === "failed" || data.status === "cancelled") {
eventSource.close();
}
} catch {
// ignore parse errors
}
};
eventSource.onerror = () => {
eventSource.close();
};
return () => {
eventSource.close();
};
}, [jobId, router]);
return null;
}
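The new `JobDetailLive` component closes its `EventSource` once the streamed job reaches a terminal status. A minimal sketch of that terminal-status check as a standalone helper (the status strings come from the diff above; the helper name is illustrative, not part of the codebase):

```typescript
// Hypothetical helper mirroring the terminal-status check in JobDetailLive.
// The SSE handler closes the EventSource on exactly these statuses.
function isTerminalStatus(status: string): boolean {
  return status === "success" || status === "failed" || status === "cancelled";
}

console.log(isTerminalStatus("running"));   // false: keep the stream open
console.log(isTerminalStatus("cancelled")); // true: close the stream
```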


@@ -54,21 +54,62 @@ export function JobsIndicator() {
   const [popinStyle, setPopinStyle] = useState<React.CSSProperties>({});
 
   useEffect(() => {
-    const fetchActiveJobs = async () => {
-      try {
-        const response = await fetch("/api/jobs/active");
-        if (response.ok) {
-          const jobs = await response.json();
-          setActiveJobs(jobs);
-        }
-      } catch (error) {
-        console.error("Failed to fetch jobs:", error);
-      }
-    };
-    fetchActiveJobs();
-    const interval = setInterval(fetchActiveJobs, 2000);
-    return () => clearInterval(interval);
+    let eventSource: EventSource | null = null;
+    let reconnectTimeout: ReturnType<typeof setTimeout> | null = null;
+
+    const connect = () => {
+      if (eventSource) {
+        eventSource.close();
+      }
+      eventSource = new EventSource("/api/jobs/stream");
+
+      eventSource.onmessage = (event) => {
+        try {
+          const allJobs: Job[] = JSON.parse(event.data);
+          const active = allJobs.filter(j =>
+            j.status === "running" || j.status === "pending" ||
+            j.status === "extracting_pages" || j.status === "generating_thumbnails"
+          );
+          setActiveJobs(active);
+        } catch {
+          // ignore malformed data
+        }
+      };
+
+      eventSource.onerror = () => {
+        eventSource?.close();
+        eventSource = null;
+        // Reconnect after 5s on error
+        reconnectTimeout = setTimeout(connect, 5000);
+      };
+    };
+
+    const disconnect = () => {
+      if (reconnectTimeout) {
+        clearTimeout(reconnectTimeout);
+        reconnectTimeout = null;
+      }
+      if (eventSource) {
+        eventSource.close();
+        eventSource = null;
+      }
+    };
+
+    const handleVisibilityChange = () => {
+      if (document.hidden) {
+        disconnect();
+      } else {
+        connect();
+      }
+    };
+
+    connect();
+    document.addEventListener("visibilitychange", handleVisibilityChange);
+
+    return () => {
+      disconnect();
+      document.removeEventListener("visibilitychange", handleVisibilityChange);
+    };
   }, []);
 
   // Position the popin relative to the button


@@ -40,34 +40,21 @@ function formatDuration(start: string, end: string | null): string {
   return `${Math.floor(diff / 3600000)}h ${Math.floor((diff % 3600000) / 60000)}m`;
 }
 
-function getDateParts(dateStr: string): { mins: number; hours: number; useDate: boolean; date: Date } {
-  const date = new Date(dateStr);
-  const now = new Date();
-  const diff = now.getTime() - date.getTime();
-  if (diff < 3600000) {
-    const mins = Math.floor(diff / 60000);
-    return { mins, hours: 0, useDate: false, date };
-  }
-  if (diff < 86400000) {
-    const hours = Math.floor(diff / 3600000);
-    return { mins: 0, hours, useDate: false, date };
-  }
-  return { mins: 0, hours: 0, useDate: true, date };
-}
-
 export function JobsList({ initialJobs, libraries, highlightJobId }: JobsListProps) {
-  const { t } = useTranslation();
+  const { t, locale } = useTranslation();
   const [jobs, setJobs] = useState(initialJobs);
 
   const formatDate = (dateStr: string): string => {
-    const parts = getDateParts(dateStr);
-    if (parts.useDate) {
-      return parts.date.toLocaleDateString();
-    }
-    if (parts.mins < 1) return t("time.justNow");
-    if (parts.hours > 0) return t("time.hoursAgo", { count: parts.hours });
-    return t("time.minutesAgo", { count: parts.mins });
+    const date = new Date(dateStr);
+    if (isNaN(date.getTime())) return dateStr;
+    const loc = locale === "fr" ? "fr-FR" : "en-US";
+    return date.toLocaleString(loc, {
+      day: "2-digit",
+      month: "2-digit",
+      year: "numeric",
+      hour: "2-digit",
+      minute: "2-digit",
+    });
   };
 
   // Refresh jobs list via SSE
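The replacement `formatDate` emits an absolute, locale-formatted timestamp instead of a relative one. A standalone sketch of the same formatting, with an explicit `timeZone` added here purely to make the output deterministic (the component itself uses the runtime's local zone, and the function name is illustrative):

```typescript
// Sketch of the absolute date/time formatting used by the new formatDate.
// timeZone: "UTC" is an addition for reproducible output in this example.
function formatAbsolute(dateStr: string, locale: "en" | "fr"): string {
  const date = new Date(dateStr);
  if (isNaN(date.getTime())) return dateStr; // fall back to the raw string
  const loc = locale === "fr" ? "fr-FR" : "en-US";
  return date.toLocaleString(loc, {
    day: "2-digit",
    month: "2-digit",
    year: "numeric",
    hour: "2-digit",
    minute: "2-digit",
    timeZone: "UTC",
  });
}

console.log(formatAbsolute("2026-03-22T06:31:37Z", "en"));
console.log(formatAbsolute("not-a-date", "en")); // "not-a-date"
```

Unlike the old relative formatting, this never produces "just now" for a future-looking timestamp caused by a server/client clock skew.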


@@ -1,6 +1,7 @@
 "use client";
 
-import { useState, useRef, useEffect, useTransition } from "react";
+import { useState, useTransition } from "react";
+import { createPortal } from "react-dom";
 import { Button } from "../components/ui";
 import { ProviderIcon } from "../components/ProviderIcon";
 import { useTranslation } from "../../lib/i18n/context";
@@ -12,6 +13,7 @@ interface LibraryActionsProps {
   watcherEnabled: boolean;
   metadataProvider: string | null;
   fallbackMetadataProvider: string | null;
+  metadataRefreshMode: string;
   onUpdate?: () => void;
 }
@@ -22,23 +24,12 @@ export function LibraryActions({
   watcherEnabled,
   metadataProvider,
   fallbackMetadataProvider,
-  onUpdate
+  metadataRefreshMode,
 }: LibraryActionsProps) {
   const { t } = useTranslation();
   const [isOpen, setIsOpen] = useState(false);
   const [isPending, startTransition] = useTransition();
   const [saveError, setSaveError] = useState<string | null>(null);
-  const dropdownRef = useRef<HTMLDivElement>(null);
-
-  useEffect(() => {
-    const handleClickOutside = (event: MouseEvent) => {
-      if (dropdownRef.current && !dropdownRef.current.contains(event.target as Node)) {
-        setIsOpen(false);
-      }
-    };
-    document.addEventListener("mousedown", handleClickOutside);
-    return () => document.removeEventListener("mousedown", handleClickOutside);
-  }, []);
 
   const handleSubmit = (formData: FormData) => {
     setSaveError(null);
@@ -48,6 +39,7 @@ export function LibraryActions({
       const scanMode = formData.get("scan_mode") as string;
       const newMetadataProvider = (formData.get("metadata_provider") as string) || null;
       const newFallbackProvider = (formData.get("fallback_metadata_provider") as string) || null;
+      const newMetadataRefreshMode = formData.get("metadata_refresh_mode") as string;
 
       try {
         const [response] = await Promise.all([
@@ -58,6 +50,7 @@ export function LibraryActions({
             monitor_enabled: monitorEnabled,
             scan_mode: scanMode,
             watcher_enabled: watcherEnabled,
+            metadata_refresh_mode: newMetadataRefreshMode,
           }),
         }),
         fetch(`/api/libraries/${libraryId}/metadata-provider`, {
@@ -85,11 +78,11 @@ export function LibraryActions({
   };
 
   return (
-    <div className="relative" ref={dropdownRef}>
+    <>
       <Button
         variant="ghost"
         size="sm"
-        onClick={() => setIsOpen(!isOpen)}
+        onClick={() => setIsOpen(true)}
         className={isOpen ? "bg-accent" : ""}
       >
         <svg className="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
@@ -98,12 +91,54 @@ export function LibraryActions({
         </svg>
       </Button>
 
-      {isOpen && (
-        <div className="absolute right-0 top-full mt-2 w-72 bg-card rounded-xl shadow-md border border-border/60 p-4 z-50">
+      {isOpen && createPortal(
+        <>
+          {/* Backdrop */}
+          <div
+            className="fixed inset-0 bg-black/30 backdrop-blur-sm z-50"
+            onClick={() => setIsOpen(false)}
+          />
+          {/* Modal */}
+          <div className="fixed inset-0 flex items-center justify-center z-50 p-4">
+            <div className="bg-card border border-border/50 rounded-xl shadow-2xl w-full max-w-lg overflow-hidden animate-in fade-in zoom-in-95 duration-200">
+              {/* Header */}
+              <div className="flex items-center justify-between px-5 py-4 border-b border-border/50 bg-muted/30">
+                <div className="flex items-center gap-2.5">
+                  <svg className="w-5 h-5 text-primary" fill="none" stroke="currentColor" viewBox="0 0 24 24">
+                    <path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M10.325 4.317c.426-1.756 2.924-1.756 3.35 0a1.724 1.724 0 002.573 1.066c1.543-.94 3.31.826 2.37 2.37a1.724 1.724 0 001.065 2.572c1.756.426 1.756 2.924 0 3.35a1.724 1.724 0 00-1.066 2.573c.94 1.543-.826 3.31-2.37 2.37a1.724 1.724 0 00-2.572 1.065c-.426 1.756-2.924 1.756-3.35 0a1.724 1.724 0 00-2.573-1.066c-1.543.94-3.31-.826-2.37-2.37a1.724 1.724 0 00-1.065-2.572c-1.756-.426-1.756-2.924 0-3.35a1.724 1.724 0 001.066-2.573c-.94-1.543.826-3.31 2.37-2.37.996.608 2.296.07 2.572-1.065z" />
+                    <path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M15 12a3 3 0 11-6 0 3 3 0 016 0z" />
+                  </svg>
+                  <span className="font-semibold text-lg">{t("libraryActions.settingsTitle")}</span>
+                </div>
+                <button
+                  type="button"
+                  onClick={() => setIsOpen(false)}
+                  className="text-muted-foreground hover:text-foreground transition-colors p-1.5 hover:bg-accent rounded-lg"
+                >
+                  <svg className="w-5 h-5" fill="none" stroke="currentColor" viewBox="0 0 24 24">
+                    <path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M6 18L18 6M6 6l12 12" />
+                  </svg>
+                </button>
+              </div>
+
+              {/* Form */}
               <form action={handleSubmit}>
-            <div className="space-y-4">
-              <div className="flex items-center justify-between">
-                <label className="text-sm font-medium text-foreground flex items-center gap-2">
+                <div className="p-6 space-y-8 max-h-[70vh] overflow-y-auto">
+
+                  {/* Section: Indexation */}
+                  <div className="space-y-5">
+                    <h3 className="flex items-center gap-2 text-sm font-semibold text-foreground uppercase tracking-wide">
+                      <svg className="w-4 h-4 text-primary" fill="none" stroke="currentColor" viewBox="0 0 24 24">
+                        <path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M3 7v10a2 2 0 002 2h14a2 2 0 002-2V9a2 2 0 00-2-2h-6l-2-2H5a2 2 0 00-2 2z" />
+                      </svg>
+                      {t("libraryActions.sectionIndexation")}
+                    </h3>
+
+                    {/* Auto scan */}
+                    <div className="flex items-start justify-between gap-4">
+                      <div className="flex-1">
+                        <label className="text-sm font-medium text-foreground flex items-center gap-2 cursor-pointer">
                   <input
                     type="checkbox"
                     name="monitor_enabled"
@@ -113,10 +148,23 @@ export function LibraryActions({
                   />
                   {t("libraryActions.autoScan")}
                 </label>
+                        <p className="text-xs text-muted-foreground mt-1.5 ml-6">{t("libraryActions.autoScanDesc")}</p>
+                      </div>
+                      <select
+                        name="scan_mode"
+                        defaultValue={scanMode}
+                        className="text-sm border border-border rounded-lg px-3 py-1.5 bg-background min-w-[130px] shrink-0"
+                      >
+                        <option value="manual">{t("monitoring.manual")}</option>
+                        <option value="hourly">{t("monitoring.hourly")}</option>
+                        <option value="daily">{t("monitoring.daily")}</option>
+                        <option value="weekly">{t("monitoring.weekly")}</option>
+                      </select>
               </div>
 
-              <div className="flex items-center justify-between">
-                <label className="text-sm font-medium text-foreground flex items-center gap-2">
+                    {/* File watcher */}
+                    <div>
+                      <label className="text-sm font-medium text-foreground flex items-center gap-2 cursor-pointer">
                   <input
                     type="checkbox"
                     name="watcher_enabled"
@@ -126,31 +174,32 @@ export function LibraryActions({
                   />
                   {t("libraryActions.fileWatch")}
                 </label>
+                      <p className="text-xs text-muted-foreground mt-1.5 ml-6">{t("libraryActions.fileWatchDesc")}</p>
+                    </div>
               </div>
 
-              <div className="flex items-center justify-between">
-                <label className="text-sm font-medium text-foreground">{t("libraryActions.schedule")}</label>
-                <select
-                  name="scan_mode"
-                  defaultValue={scanMode}
-                  className="text-sm border border-border rounded-lg px-2 py-1 bg-background"
-                >
-                  <option value="manual">{t("monitoring.manual")}</option>
-                  <option value="hourly">{t("monitoring.hourly")}</option>
-                  <option value="daily">{t("monitoring.daily")}</option>
-                  <option value="weekly">{t("monitoring.weekly")}</option>
-                </select>
-              </div>
+                  <hr className="border-border/40" />
 
-              <div className="flex items-center justify-between">
+                  {/* Section: Metadata */}
+                  <div className="space-y-5">
+                    <h3 className="flex items-center gap-2 text-sm font-semibold text-foreground uppercase tracking-wide">
+                      <svg className="w-4 h-4 text-primary" fill="none" stroke="currentColor" viewBox="0 0 24 24">
+                        <path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M7 7h.01M7 3h5c.512 0 1.024.195 1.414.586l7 7a2 2 0 010 2.828l-7 7a2 2 0 01-2.828 0l-7-7A1.994 1.994 0 013 12V7a4 4 0 014-4z" />
+                      </svg>
+                      {t("libraryActions.sectionMetadata")}
+                    </h3>
+
+                    {/* Provider */}
+                    <div>
+                      <div className="flex items-center justify-between gap-4">
                 <label className="text-sm font-medium text-foreground flex items-center gap-1.5">
-                  {metadataProvider && <ProviderIcon provider={metadataProvider} size={16} />}
+                  {metadataProvider && metadataProvider !== "none" && <ProviderIcon provider={metadataProvider} size={16} />}
                   {t("libraryActions.provider")}
                 </label>
                 <select
                   name="metadata_provider"
                   defaultValue={metadataProvider || ""}
-                  className="text-sm border border-border rounded-lg px-2 py-1 bg-background"
+                  className="text-sm border border-border rounded-lg px-3 py-1.5 bg-background min-w-[160px] shrink-0"
                 >
                   <option value="">{t("libraryActions.default")}</option>
                   <option value="none">{t("libraryActions.none")}</option>
@@ -161,8 +210,12 @@ export function LibraryActions({
                   <option value="bedetheque">Bédéthèque</option>
                 </select>
               </div>
+                      <p className="text-xs text-muted-foreground mt-1.5">{t("libraryActions.providerDesc")}</p>
+                    </div>
 
-              <div className="flex items-center justify-between">
+                    {/* Fallback */}
+                    <div>
+                      <div className="flex items-center justify-between gap-4">
                 <label className="text-sm font-medium text-foreground flex items-center gap-1.5">
                   {fallbackMetadataProvider && fallbackMetadataProvider !== "none" && <ProviderIcon provider={fallbackMetadataProvider} size={16} />}
                   {t("libraryActions.fallback")}
@@ -170,7 +223,7 @@ export function LibraryActions({
                 <select
                   name="fallback_metadata_provider"
                   defaultValue={fallbackMetadataProvider || ""}
-                  className="text-sm border border-border rounded-lg px-2 py-1 bg-background"
+                  className="text-sm border border-border rounded-lg px-3 py-1.5 bg-background min-w-[160px] shrink-0"
                 >
                   <option value="">{t("libraryActions.none")}</option>
                   <option value="google_books">Google Books</option>
@@ -180,17 +233,48 @@ export function LibraryActions({
                   <option value="bedetheque">Bédéthèque</option>
                 </select>
               </div>
+                      <p className="text-xs text-muted-foreground mt-1.5">{t("libraryActions.fallbackDesc")}</p>
+                    </div>
+
+                    {/* Metadata refresh */}
+                    <div>
+                      <div className="flex items-center justify-between gap-4">
+                        <label className="text-sm font-medium text-foreground">{t("libraryActions.metadataRefreshSchedule")}</label>
+                        <select
+                          name="metadata_refresh_mode"
+                          defaultValue={metadataRefreshMode}
+                          className="text-sm border border-border rounded-lg px-3 py-1.5 bg-background min-w-[160px] shrink-0"
+                        >
+                          <option value="manual">{t("monitoring.manual")}</option>
+                          <option value="hourly">{t("monitoring.hourly")}</option>
+                          <option value="daily">{t("monitoring.daily")}</option>
+                          <option value="weekly">{t("monitoring.weekly")}</option>
+                        </select>
+                      </div>
+                      <p className="text-xs text-muted-foreground mt-1.5">{t("libraryActions.metadataRefreshDesc")}</p>
+                    </div>
+                  </div>
 
               {saveError && (
-                <p className="text-xs text-destructive bg-destructive/10 px-2 py-1.5 rounded-lg break-all">
+                <p className="text-sm text-destructive bg-destructive/10 px-3 py-2 rounded-lg break-all">
                   {saveError}
                 </p>
               )}
+                </div>
 
+                {/* Footer */}
+                <div className="flex items-center justify-end gap-2 px-5 py-4 border-t border-border/50 bg-muted/30">
+                  <Button
+                    type="button"
+                    variant="ghost"
+                    size="sm"
+                    onClick={() => setIsOpen(false)}
+                  >
+                    {t("common.cancel")}
+                  </Button>
                 <Button
                   type="submit"
                   size="sm"
-                  className="w-full"
                   disabled={isPending}
                 >
                   {isPending ? t("libraryActions.saving") : t("common.save")}
@@ -198,7 +282,10 @@ export function LibraryActions({
               </div>
             </form>
           </div>
-      )}
         </div>
+        </>,
+        document.body
+      )}
+    </>
   );
 }


@@ -18,6 +18,10 @@ const FILTER_ICONS: Record<string, string> = {
   metadata_provider: "M7 7h.01M7 3h5c.512 0 1.024.195 1.414.586l7 7a2 2 0 010 2.828l-7 7a2 2 0 01-2.828 0l-7-7A1.994 1.994 0 013 12V7a4 4 0 014-4z",
   // Sort - arrows up/down
   sort: "M3 4h13M3 8h9m-9 4h6m4 0l4-4m0 0l4 4m-4-4v12",
+  // Format - document/file
+  format: "M7 21h10a2 2 0 002-2V9.414a1 1 0 00-.293-.707l-5.414-5.414A1 1 0 0012.586 3H7a2 2 0 00-2 2v14a2 2 0 002 2z",
+  // Metadata - link/chain
+  metadata: "M13.828 10.172a4 4 0 00-5.656 0l-4 4a4 4 0 105.656 5.656l1.102-1.101m-.758-4.899a4 4 0 005.656 0l4-4a4 4 0 00-5.656-5.656l-1.1 1.1",
 };
 
 interface FieldDef {
@@ -35,12 +39,17 @@ interface LiveSearchFormProps {
   debounceMs?: number;
 }
 
+const STORAGE_KEY_PREFIX = "filters:";
+
 export function LiveSearchForm({ fields, basePath, debounceMs = 300 }: LiveSearchFormProps) {
   const router = useRouter();
   const searchParams = useSearchParams();
   const { t } = useTranslation();
   const timerRef = useRef<ReturnType<typeof setTimeout> | null>(null);
   const formRef = useRef<HTMLFormElement>(null);
+  const restoredRef = useRef(false);
+  const storageKey = `${STORAGE_KEY_PREFIX}${basePath}`;
 
   const buildUrl = useCallback((): string => {
     if (!formRef.current) return basePath;
@@ -54,16 +63,58 @@ export function LiveSearchForm({ fields, basePath, debounceMs = 300 }: LiveSearc
     return qs ? `${basePath}?${qs}` : basePath;
   }, [basePath]);
 
+  const saveFilters = useCallback(() => {
+    if (!formRef.current) return;
+    const formData = new FormData(formRef.current);
+    const filters: Record<string, string> = {};
+    for (const [key, value] of formData.entries()) {
+      const str = value.toString().trim();
+      if (str) filters[key] = str;
+    }
+    try {
+      localStorage.setItem(storageKey, JSON.stringify(filters));
+    } catch {}
+  }, [storageKey]);
+
   const navigate = useCallback((immediate: boolean) => {
     if (timerRef.current) clearTimeout(timerRef.current);
     if (immediate) {
+      saveFilters();
       router.replace(buildUrl() as any);
     } else {
       timerRef.current = setTimeout(() => {
+        saveFilters();
         router.replace(buildUrl() as any);
       }, debounceMs);
     }
-  }, [router, buildUrl, debounceMs]);
+  }, [router, buildUrl, debounceMs, saveFilters]);
+
+  // Restore filters from localStorage on mount if URL has no filters
+  useEffect(() => {
+    if (restoredRef.current) return;
+    restoredRef.current = true;
+    const hasUrlFilters = fields.some((f) => {
+      const val = searchParams.get(f.name);
+      return val && val.trim() !== "";
+    });
+    if (hasUrlFilters) return;
+    try {
+      const saved = localStorage.getItem(storageKey);
+      if (!saved) return;
+      const filters: Record<string, string> = JSON.parse(saved);
+      const fieldNames = new Set(fields.map((f) => f.name));
+      const params = new URLSearchParams();
+      for (const [key, value] of Object.entries(filters)) {
+        if (fieldNames.has(key) && value) params.set(key, value);
+      }
+      const qs = params.toString();
+      if (qs) {
+        router.replace(`${basePath}?${qs}` as any);
+      }
+    } catch {}
+  }, []); // eslint-disable-line react-hooks/exhaustive-deps
 
   useEffect(() => {
     return () => {
@@ -85,6 +136,7 @@ export function LiveSearchForm({ fields, basePath, debounceMs = 300 }: LiveSearc
       onSubmit={(e) => {
         e.preventDefault();
         if (timerRef.current) clearTimeout(timerRef.current);
+        saveFilters();
         router.replace(buildUrl() as any);
       }}
       className="space-y-4"
@@ -145,7 +197,11 @@ export function LiveSearchForm({ fields, basePath, debounceMs = 300 }: LiveSearc
       {hasFilters && (
         <button
           type="button"
-          onClick={() => router.replace(basePath as any)}
+          onClick={() => {
+            formRef.current?.reset();
+            try { localStorage.removeItem(storageKey); } catch {}
+            router.replace(basePath as any);
+          }}
           className="
             inline-flex items-center gap-1
             h-8 px-2.5
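The restore-on-mount effect above keeps only saved filters whose keys match a declared field, then rebuilds the query string. That filtering step can be sketched as a pure function (the function name and sample data are illustrative, not from the codebase):

```typescript
// Pure sketch of the restore step: drop saved keys that no longer match a
// known field name, then build the query string to navigate to.
function buildRestoredQuery(
  saved: Record<string, string>,
  fieldNames: string[],
): string {
  const known = new Set(fieldNames);
  const params = new URLSearchParams();
  for (const [key, value] of Object.entries(saved)) {
    if (known.has(key) && value) params.set(key, value);
  }
  return params.toString();
}

console.log(buildRestoredQuery({ q: "dune", stale: "x" }, ["q", "library"]));
// "q=dune" — the unknown "stale" key is dropped
```

Filtering against the current field list means a stale localStorage entry from an older build cannot inject unsupported query parameters.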


@@ -7,9 +7,9 @@ import { NavIcon } from "./ui";
 import { useTranslation } from "../../lib/i18n/context";
 
 type NavItem = {
-  href: "/" | "/books" | "/series" | "/libraries" | "/jobs" | "/tokens" | "/settings";
+  href: "/" | "/books" | "/series" | "/authors" | "/libraries" | "/jobs" | "/tokens" | "/settings";
   label: string;
-  icon: "dashboard" | "books" | "series" | "libraries" | "jobs" | "tokens" | "settings";
+  icon: "dashboard" | "books" | "series" | "authors" | "libraries" | "jobs" | "tokens" | "settings";
 };
 
 const HamburgerIcon = () => (


@@ -64,7 +64,11 @@ export function ProwlarrSearchModal({ seriesName, missingBooks }: ProwlarrSearch
       setError(null);
       setResults([]);
       try {
-        const body = { series_name: seriesName, custom_query: searchQuery.trim() };
+        const missing_volumes = missingBooks?.map((b) => ({
+          volume_number: b.volume_number,
+          title: b.title,
+        })) ?? undefined;
+        const body = { series_name: seriesName, custom_query: searchQuery.trim(), missing_volumes };
         const resp = await fetch("/api/prowlarr/search", {
           method: "POST",
           headers: { "Content-Type": "application/json" },
@@ -237,12 +241,23 @@ export function ProwlarrSearchModal({ seriesName, missingBooks }: ProwlarrSearch
               </tr>
             </thead>
             <tbody className="divide-y divide-border">
-              {results.map((release, i) => (
-                <tr key={release.guid || i} className="hover:bg-muted/20 transition-colors">
+              {results.map((release, i) => {
+                const hasMissing = release.matchedMissingVolumes && release.matchedMissingVolumes.length > 0;
+                return (
+                  <tr key={release.guid || i} className={`transition-colors ${hasMissing ? "bg-green-500/10 hover:bg-green-500/20 border-l-2 border-l-green-500" : "hover:bg-muted/20"}`}>
                   <td className="px-3 py-2 max-w-[400px]">
                     <span className="truncate block" title={release.title}>
                       {release.title}
                     </span>
+                    {hasMissing && (
+                      <div className="flex items-center gap-1 mt-1">
+                        {release.matchedMissingVolumes!.map((vol) => (
+                          <span key={vol} className="inline-flex items-center px-1.5 py-0.5 rounded text-[10px] font-medium bg-green-500/20 text-green-600">
+                            {t("prowlarr.missingVol", { vol })}
+                          </span>
+                        ))}
+                      </div>
+                    )}
                   </td>
                   <td className="px-3 py-2 text-muted-foreground whitespace-nowrap">
                     {release.indexer || "—"}
@@ -325,7 +340,8 @@ export function ProwlarrSearchModal({ seriesName, missingBooks }: ProwlarrSearch
                   </div>
                 </td>
               </tr>
-              ))}
+                );
+              })}
             </tbody>
           </table>
         </div>
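The request-body change above forwards only `volume_number` and `title` per missing book. A minimal, standalone sketch of that mapping, with a hypothetical `MissingBook` type (the real prop type may carry more fields; the `isbn` field here is purely illustrative):

```typescript
// Hypothetical minimal shape of a missing book passed to the modal.
interface MissingBook { volume_number: number; title: string; isbn?: string }

// Mirror of the diff's mapping: forward only the two fields the search
// endpoint needs, and omit the key entirely when no list is provided.
function buildMissingVolumes(missingBooks?: MissingBook[]) {
  return missingBooks?.map((b) => ({
    volume_number: b.volume_number,
    title: b.title,
  })) ?? undefined;
}

console.log(buildMissingVolumes([{ volume_number: 3, title: "Vol. 3", isbn: "x" }]));
// extra fields such as isbn are stripped
console.log(buildMissingVolumes(undefined)); // undefined
```

Sending `undefined` (rather than an empty array) keeps the `missing_volumes` key out of the serialized JSON body when there is nothing to match.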


@@ -93,6 +93,7 @@ export function StatusBadge({ status, className = "" }: StatusBadgeProps) {
 // Job type badge
 const jobTypeVariants: Record<string, BadgeVariant> = {
   rebuild: "primary",
+  rescan: "primary",
   full_rebuild: "warning",
   thumbnail_rebuild: "secondary",
   thumbnail_regenerate: "warning",
@@ -109,6 +110,7 @@ export function JobTypeBadge({ type, className = "" }: JobTypeBadgeProps) {
   const variant = jobTypeVariants[key] || "default";
   const jobTypeLabels: Record<string, string> = {
     rebuild: t("jobType.rebuild"),
+    rescan: t("jobType.rescan"),
     full_rebuild: t("jobType.full_rebuild"),
     thumbnail_rebuild: t("jobType.thumbnail_rebuild"),
     thumbnail_regenerate: t("jobType.thumbnail_regenerate"),


@@ -33,7 +33,9 @@ type IconName =
   | "spinner"
   | "warning"
   | "tag"
-  | "document";
+  | "document"
+  | "authors"
+  | "bell";
 type IconSize = "sm" | "md" | "lg" | "xl";
@@ -86,6 +88,8 @@ const icons: Record<IconName, string> = {
   warning: "M12 9v2m0 4h.01m-6.938 4h13.856c1.54 0 2.502-1.667 1.732-3L13.732 4c-.77-1.333-2.694-1.333-3.464 0L3.34 16c-.77 1.333.192 3 1.732 3z",
   tag: "M7 7h.01M7 3h5a1.99 1.99 0 011.414.586l7 7a2 2 0 010 2.828l-7 7a2 2 0 01-2.828 0l-7-7A1.994 1.994 0 013 12V7a4 4 0 014-4z",
   document: "M9 12h6m-6 4h6m2 5H7a2 2 0 01-2-2V5a2 2 0 012-2h5.586a1 1 0 01.707.293l5.414 5.414a1 1 0 01.293.707V19a2 2 0 01-2 2z",
+  authors: "M17 20h5v-2a3 3 0 00-5.356-1.857M17 20H7m10 0v-2c0-.656-.126-1.283-.356-1.857M7 20H2v-2a3 3 0 015.356-1.857M7 20v-2c0-.656.126-1.283.356-1.857m0 0a5.002 5.002 0 019.288 0M15 7a3 3 0 11-6 0 3 3 0 016 0zm6 3a2 2 0 11-4 0 2 2 0 014 0zM7 10a2 2 0 11-4 0 2 2 0 014 0z",
+  bell: "M15 17h5l-1.405-1.405A2.032 2.032 0 0118 14.158V11a6.002 6.002 0 00-4-5.659V5a2 2 0 10-4 0v.341C7.67 6.165 6 8.388 6 11v3.159c0 .538-.214 1.055-.595 1.436L4 17h5m6 0v1a3 3 0 11-6 0v-1m6 0H9",
 };
 const colorClasses: Partial<Record<IconName, string>> = {
@@ -99,6 +103,7 @@ const colorClasses: Partial<Record<IconName, string>> = {
   image: "text-primary",
   cache: "text-warning",
   performance: "text-success",
+  authors: "text-violet-500",
 };
 export function Icon({ name, size = "md", className = "" }: IconProps) {


@@ -1,3 +1,5 @@
+export const dynamic = "force-dynamic";
 import { notFound } from "next/navigation";
 import Link from "next/link";
 import { apiFetch, getMetadataBatchReport, getMetadataBatchResults, getMetadataRefreshReport, MetadataBatchReportDto, MetadataBatchResultDto, MetadataRefreshReportDto } from "../../../lib/api";
@@ -5,6 +7,7 @@ import {
   Card, CardHeader, CardTitle, CardDescription, CardContent,
   StatusBadge, JobTypeBadge, StatBox, ProgressBar
 } from "../../components/ui";
+import { JobDetailLive } from "../../components/JobDetailLive";
 import { getServerTranslations } from "../../../lib/i18n/server";
 interface JobDetailPageProps {
@@ -99,6 +102,11 @@ export default async function JobDetailPage({ params }: JobDetailPageProps) {
     description: t("jobType.full_rebuildDesc"),
     isThumbnailOnly: false,
   },
+  rescan: {
+    label: t("jobType.rescanLabel"),
+    description: t("jobType.rescanDesc"),
+    isThumbnailOnly: false,
+  },
   thumbnail_rebuild: {
     label: t("jobType.thumbnail_rebuildLabel"),
     description: t("jobType.thumbnail_rebuildDesc"),
@@ -158,6 +166,7 @@ export default async function JobDetailPage({ params }: JobDetailPageProps) {
   const isCompleted = job.status === "success";
   const isFailed = job.status === "failed";
   const isCancelled = job.status === "cancelled";
+  const isTerminal = isCompleted || isFailed || isCancelled;
   const isExtractingPages = job.status === "extracting_pages";
   const isThumbnailPhase = job.status === "generating_thumbnails";
   const isPhase2 = isExtractingPages || isThumbnailPhase;
@@ -199,6 +208,7 @@ export default async function JobDetailPage({ params }: JobDetailPageProps) {
   return (
     <>
+      <JobDetailLive jobId={id} isTerminal={isTerminal} />
       <div className="mb-6">
         <Link
           href="/jobs"

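The job detail page above passes `isTerminal` into `JobDetailLive` so that client-side refreshing can stop once a job reaches a final state. `JobDetailLive` itself is not part of this diff; a minimal sketch of the terminal check it depends on, mirroring the status flags computed in the page:

```typescript
// Terminal statuses mirror the isCompleted/isFailed/isCancelled flags in the
// job detail page; anything else (pending, extracting_pages,
// generating_thumbnails, ...) may still change and warrants another poll.
function isTerminalStatus(status: string): boolean {
  return status === "success" || status === "failed" || status === "cancelled";
}
```

A live component would typically poll (or subscribe) while `!isTerminalStatus(job.status)` and clear its timer once this returns true.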

@@ -33,6 +33,14 @@ export default async function JobsPage({ searchParams }: { searchParams: Promise
     redirect(`/jobs?highlight=${result.id}`);
   }
+  async function triggerRescan(formData: FormData) {
+    "use server";
+    const libraryId = formData.get("library_id") as string;
+    const result = await rebuildIndex(libraryId || undefined, false, true);
+    revalidatePath("/jobs");
+    redirect(`/jobs?highlight=${result.id}`);
+  }
   async function triggerThumbnailsRebuild(formData: FormData) {
     "use server";
     const libraryId = formData.get("library_id") as string;
@@ -52,7 +60,7 @@ export default async function JobsPage({ searchParams }: { searchParams: Promise
   async function triggerMetadataBatch(formData: FormData) {
     "use server";
     const libraryId = formData.get("library_id") as string;
-    if (!libraryId) return;
+    if (libraryId) {
     let result;
     try {
       result = await startMetadataBatch(libraryId);
@@ -62,12 +70,28 @@ export default async function JobsPage({ searchParams }: { searchParams: Promise
     }
     revalidatePath("/jobs");
     redirect(`/jobs?highlight=${result.id}`);
+    } else {
+      // All libraries — skip those with metadata disabled
+      const allLibraries = await fetchLibraries().catch(() => [] as LibraryDto[]);
+      let lastId: string | undefined;
+      for (const lib of allLibraries) {
+        if (lib.metadata_provider === "none") continue;
+        try {
+          const result = await startMetadataBatch(lib.id);
+          if (result.status !== "already_running") lastId = result.id;
+        } catch {
+          // Library may have metadata disabled or other issue — skip
+        }
+      }
+      revalidatePath("/jobs");
+      redirect(lastId ? `/jobs?highlight=${lastId}` : "/jobs");
+    }
   }
   async function triggerMetadataRefresh(formData: FormData) {
     "use server";
     const libraryId = formData.get("library_id") as string;
-    if (!libraryId) return;
+    if (libraryId) {
     let result;
     try {
       result = await startMetadataRefresh(libraryId);
@@ -76,6 +100,22 @@ export default async function JobsPage({ searchParams }: { searchParams: Promise
     }
     revalidatePath("/jobs");
     redirect(`/jobs?highlight=${result.id}`);
+    } else {
+      // All libraries — skip those with metadata disabled
+      const allLibraries = await fetchLibraries().catch(() => [] as LibraryDto[]);
+      let lastId: string | undefined;
+      for (const lib of allLibraries) {
+        if (lib.metadata_provider === "none") continue;
+        try {
+          const result = await startMetadataRefresh(lib.id);
+          if (result.status !== "already_running") lastId = result.id;
+        } catch {
+          // Library may have metadata disabled or no approved links — skip
+        }
+      }
+      revalidatePath("/jobs");
+      redirect(lastId ? `/jobs?highlight=${lastId}` : "/jobs");
+    }
   }
   return (
@@ -127,13 +167,23 @@ export default async function JobsPage({ searchParams }: { searchParams: Promise
             </div>
             <p className="text-xs text-muted-foreground mt-1 ml-6">{t("jobs.rebuildShort")}</p>
           </button>
-          <button type="submit" formAction={triggerFullRebuild}
-            className="w-full text-left rounded-lg border border-warning/30 bg-warning/5 p-3 hover:bg-warning/10 transition-colors group cursor-pointer">
-            <div className="flex items-center gap-2">
-              <svg className="w-4 h-4 text-warning shrink-0" fill="none" stroke="currentColor" viewBox="0 0 24 24">
+          <button type="submit" formAction={triggerRescan}
+            className="w-full text-left rounded-lg border border-input bg-background p-3 hover:bg-accent/50 transition-colors group cursor-pointer">
+            <div className="flex items-center gap-2">
+              <svg className="w-4 h-4 text-primary shrink-0" fill="none" stroke="currentColor" viewBox="0 0 24 24">
+                <path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M21 21l-6-6m2-5a7 7 0 11-14 0 7 7 0 0114 0z" />
+              </svg>
+              <span className="font-medium text-sm text-foreground">{t("jobs.rescan")}</span>
+            </div>
+            <p className="text-xs text-muted-foreground mt-1 ml-6">{t("jobs.rescanShort")}</p>
+          </button>
+          <button type="submit" formAction={triggerFullRebuild}
+            className="w-full text-left rounded-lg border border-destructive/30 bg-destructive/5 p-3 hover:bg-destructive/10 transition-colors group cursor-pointer">
+            <div className="flex items-center gap-2">
+              <svg className="w-4 h-4 text-destructive shrink-0" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                 <path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M12 9v2m0 4h.01m-6.938 4h13.856c1.54 0 2.502-1.667 1.732-3L13.732 4c-.77-1.333-2.694-1.333-3.464 0L3.34 16c-.77 1.333.192 3 1.732 3z" />
               </svg>
-              <span className="font-medium text-sm text-warning">{t("jobs.fullRebuild")}</span>
+              <span className="font-medium text-sm text-destructive">{t("jobs.fullRebuild")}</span>
            </div>
            <p className="text-xs text-muted-foreground mt-1 ml-6">{t("jobs.fullRebuildShort")}</p>
          </button>
@@ -179,7 +229,6 @@ export default async function JobsPage({ searchParams }: { searchParams: Promise
             <path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M7 7h.01M7 3h5c.512 0 1.024.195 1.414.586l7 7a2 2 0 010 2.828l-7 7a2 2 0 01-2.828 0l-7-7A1.994 1.994 0 013 12V7a4 4 0 014-4z" />
           </svg>
           {t("jobs.groupMetadata")}
-          <span className="text-xs font-normal text-muted-foreground">({t("jobs.requiresLibrary")})</span>
         </div>
         <div className="space-y-2">
           <button type="submit" formAction={triggerMetadataBatch}

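The two new `else` branches in the jobs page above share the same fan-out shape: start one job per eligible library, skip libraries with metadata disabled, and remember the last id actually started so it can be highlighted. A minimal standalone sketch of that pattern; `startForAllLibraries`, `LibraryLike`, and `JobResult` are stand-ins for the real `LibraryDto` type and the `startMetadataBatch`/`startMetadataRefresh` API helpers:

```typescript
// Stand-in shapes for LibraryDto and the job-start API response.
interface LibraryLike { id: string; metadata_provider: string; }
interface JobResult { id: string; status: string; }

// Start a job per eligible library; return the last id actually started.
async function startForAllLibraries(
  libraries: LibraryLike[],
  start: (libraryId: string) => Promise<JobResult>,
): Promise<string | undefined> {
  let lastId: string | undefined;
  for (const lib of libraries) {
    if (lib.metadata_provider === "none") continue; // metadata disabled
    try {
      const result = await start(lib.id);
      if (result.status !== "already_running") lastId = result.id;
    } catch {
      // Library may reject the job (disabled provider, no approved links): skip it.
    }
  }
  return lastId;
}
```

The caller then redirects to `/jobs?highlight=${lastId}` when a job was started, or plain `/jobs` when every library was skipped.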

@@ -18,15 +18,16 @@ export const metadata: Metadata = {
 };
 type NavItem = {
-  href: "/" | "/books" | "/series" | "/libraries" | "/jobs" | "/tokens" | "/settings";
+  href: "/" | "/books" | "/series" | "/authors" | "/libraries" | "/jobs" | "/tokens" | "/settings";
   labelKey: TranslationKey;
-  icon: "dashboard" | "books" | "series" | "libraries" | "jobs" | "tokens" | "settings";
+  icon: "dashboard" | "books" | "series" | "authors" | "libraries" | "jobs" | "tokens" | "settings";
 };
 const navItems: NavItem[] = [
   { href: "/", labelKey: "nav.dashboard", icon: "dashboard" },
   { href: "/books", labelKey: "nav.books", icon: "books" },
   { href: "/series", labelKey: "nav.series", icon: "series" },
+  { href: "/authors", labelKey: "nav.authors", icon: "authors" },
   { href: "/libraries", labelKey: "nav.libraries", icon: "libraries" },
   { href: "/jobs", labelKey: "nav.jobs", icon: "jobs" },
   { href: "/tokens", labelKey: "nav.tokens", icon: "tokens" },


@@ -2,13 +2,21 @@ import { fetchLibraries, fetchBooks, fetchSeriesMetadata, getBookCoverUrl, getMe
 import { BooksGrid, EmptyState } from "../../../../components/BookCard";
 import { MarkSeriesReadButton } from "../../../../components/MarkSeriesReadButton";
 import { MarkBookReadButton } from "../../../../components/MarkBookReadButton";
-import { EditSeriesForm } from "../../../../components/EditSeriesForm";
-import { MetadataSearchModal } from "../../../../components/MetadataSearchModal";
-import { ProwlarrSearchModal } from "../../../../components/ProwlarrSearchModal";
+import nextDynamic from "next/dynamic";
 import { OffsetPagination } from "../../../../components/ui";
 import { SafeHtml } from "../../../../components/SafeHtml";
 import Image from "next/image";
 import Link from "next/link";
+const EditSeriesForm = nextDynamic(
+  () => import("../../../../components/EditSeriesForm").then(m => m.EditSeriesForm)
+);
+const MetadataSearchModal = nextDynamic(
+  () => import("../../../../components/MetadataSearchModal").then(m => m.MetadataSearchModal)
+);
+const ProwlarrSearchModal = nextDynamic(
+  () => import("../../../../components/ProwlarrSearchModal").then(m => m.ProwlarrSearchModal)
+);
 import { notFound } from "next/navigation";
 import { getServerTranslations } from "../../../../../lib/i18n/server";
@@ -94,7 +102,7 @@ export default async function SeriesDetailPage({
                 alt={t("books.coverOf", { name: displayName })}
                 fill
                 className="object-cover"
-                unoptimized
+                sizes="160px"
               />
             </div>
           </div>


@@ -86,7 +86,7 @@ export default async function LibrarySeriesPage({
                   alt={t("books.coverOf", { name: s.name })}
                   fill
                   className="object-cover"
-                  unoptimized
+                  sizes="(max-width: 640px) 50vw, (max-width: 768px) 33vw, (max-width: 1024px) 25vw, 20vw"
                 />
               </div>
               <div className="p-3">


@@ -1,9 +1,12 @@
 import { revalidatePath } from "next/cache";
+import Image from "next/image";
 import Link from "next/link";
-import { listFolders, createLibrary, deleteLibrary, fetchLibraries, fetchSeries, scanLibrary, startMetadataBatch, LibraryDto, FolderItem } from "../../lib/api";
+import { listFolders, createLibrary, deleteLibrary, fetchLibraries, getBookCoverUrl, LibraryDto, FolderItem } from "../../lib/api";
+import type { TranslationKey } from "../../lib/i18n/fr";
 import { getServerTranslations } from "../../lib/i18n/server";
 import { LibraryActions } from "../components/LibraryActions";
 import { LibraryForm } from "../components/LibraryForm";
+import { ProviderIcon } from "../components/ProviderIcon";
 import {
   Card, CardHeader, CardTitle, CardDescription, CardContent,
   Button, Badge
@@ -31,19 +34,13 @@ export default async function LibrariesPage() {
     listFolders().catch(() => [] as FolderItem[])
   ]);
-  const seriesCounts = await Promise.all(
-    libraries.map(async (lib) => {
-      try {
-        const seriesPage = await fetchSeries(lib.id);
-        return { id: lib.id, count: seriesPage.items.length };
-      } catch {
-        return { id: lib.id, count: 0 };
-      }
-    })
-  );
-  const seriesCountMap = new Map(seriesCounts.map(s => [s.id, s.count]));
+  const thumbnailMap = new Map(
+    libraries.map(lib => [
+      lib.id,
+      (lib.thumbnail_book_ids || []).map(bookId => getBookCoverUrl(bookId)),
+    ])
+  );
   async function addLibrary(formData: FormData) {
     "use server";
     const name = formData.get("name") as string;
@@ -61,35 +58,6 @@ export default async function LibrariesPage() {
     revalidatePath("/libraries");
   }
-  async function scanLibraryAction(formData: FormData) {
-    "use server";
-    const id = formData.get("id") as string;
-    await scanLibrary(id);
-    revalidatePath("/libraries");
-    revalidatePath("/jobs");
-  }
-  async function scanLibraryFullAction(formData: FormData) {
-    "use server";
-    const id = formData.get("id") as string;
-    await scanLibrary(id, true);
-    revalidatePath("/libraries");
-    revalidatePath("/jobs");
-  }
-  async function batchMetadataAction(formData: FormData) {
-    "use server";
-    const id = formData.get("id") as string;
-    try {
-      await startMetadataBatch(id);
-    } catch {
-      // Library may have metadata disabled — ignore silently
-      return;
-    }
-    revalidatePath("/libraries");
-    revalidatePath("/jobs");
-  }
   return (
     <>
       <div className="mb-6">
@@ -115,15 +83,61 @@ export default async function LibrariesPage() {
       {/* Libraries Grid */}
       <div className="grid grid-cols-1 md:grid-cols-2 lg:grid-cols-3 gap-4">
         {libraries.map((lib) => {
-          const seriesCount = seriesCountMap.get(lib.id) || 0;
+          const thumbnails = thumbnailMap.get(lib.id) || [];
           return (
-            <Card key={lib.id} className="flex flex-col">
+            <Card key={lib.id} className="flex flex-col overflow-hidden">
+              {/* Thumbnail fan */}
+              {thumbnails.length > 0 ? (
+                <Link href={`/libraries/${lib.id}/series`} className="block relative h-48 overflow-hidden bg-muted/10">
+                  <Image
+                    src={thumbnails[0]}
+                    alt=""
+                    fill
+                    className="object-cover blur-xl scale-110 opacity-40"
+                    sizes="(max-width: 768px) 100vw, 33vw"
+                    loading="lazy"
+                  />
+                  <div className="absolute inset-0 flex items-end justify-center">
+                    {thumbnails.map((url, i) => {
+                      const count = thumbnails.length;
+                      const mid = (count - 1) / 2;
+                      const angle = (i - mid) * 12;
+                      const radius = 220;
+                      const rad = ((angle - 90) * Math.PI) / 180;
+                      const cx = Math.cos(rad) * radius;
+                      const cy = Math.sin(rad) * radius;
+                      return (
+                        <Image
+                          key={i}
+                          src={url}
+                          alt=""
+                          width={96}
+                          height={144}
+                          className="absolute object-cover shadow-lg"
+                          style={{
+                            transform: `translate(${cx}px, ${cy}px) rotate(${angle}deg)`,
+                            transformOrigin: 'bottom center',
+                            zIndex: count - Math.abs(Math.round(i - mid)),
+                            bottom: '-185px',
+                          }}
+                          sizes="96px"
+                          loading="lazy"
+                        />
+                      );
+                    })}
+                  </div>
+                </Link>
+              ) : (
+                <div className="h-8 bg-muted/10" />
+              )}
               <CardHeader className="pb-2">
                 <div className="flex items-start justify-between">
                   <div>
                     <CardTitle className="text-lg">{lib.name}</CardTitle>
                     {!lib.enabled && <Badge variant="muted" className="mt-1">{t("libraries.disabled")}</Badge>}
                   </div>
+                  <div className="flex items-center gap-1">
                   <LibraryActions
                     libraryId={lib.id}
                     monitorEnabled={lib.monitor_enabled}
@@ -131,85 +145,78 @@ export default async function LibrariesPage() {
                     watcherEnabled={lib.watcher_enabled}
                     metadataProvider={lib.metadata_provider}
                     fallbackMetadataProvider={lib.fallback_metadata_provider}
+                    metadataRefreshMode={lib.metadata_refresh_mode}
                   />
+                  <form>
+                    <input type="hidden" name="id" value={lib.id} />
+                    <Button type="submit" variant="ghost" size="sm" formAction={removeLibrary} className="text-muted-foreground hover:text-destructive">
+                      <svg className="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
+                        <path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M19 7l-.867 12.142A2 2 0 0116.138 21H7.862a2 2 0 01-1.995-1.858L5 7m5 4v6m4-6v6m1-10V4a1 1 0 00-1-1h-4a1 1 0 00-1 1v3M4 7h16" />
+                      </svg>
+                    </Button>
+                  </form>
+                  </div>
                 </div>
+                <code className="text-xs font-mono text-muted-foreground break-all">{lib.root_path}</code>
               </CardHeader>
               <CardContent className="flex-1 pt-0">
-                {/* Path */}
-                <code className="text-xs font-mono text-muted-foreground mb-4 break-all block">{lib.root_path}</code>
                 {/* Stats */}
-                <div className="grid grid-cols-2 gap-3 mb-4">
+                <div className="grid grid-cols-2 gap-3 mb-3">
                   <Link
                     href={`/libraries/${lib.id}/books`}
-                    className="text-center p-3 bg-muted/50 rounded-lg hover:bg-accent transition-colors duration-200"
+                    className="text-center p-2.5 bg-muted/50 rounded-lg hover:bg-accent transition-colors duration-200"
                   >
                     <span className="block text-2xl font-bold text-primary">{lib.book_count}</span>
                     <span className="text-xs text-muted-foreground">{t("libraries.books")}</span>
                   </Link>
                   <Link
                     href={`/libraries/${lib.id}/series`}
-                    className="text-center p-3 bg-muted/50 rounded-lg hover:bg-accent transition-colors duration-200"
+                    className="text-center p-2.5 bg-muted/50 rounded-lg hover:bg-accent transition-colors duration-200"
                   >
-                    <span className="block text-2xl font-bold text-foreground">{seriesCount}</span>
+                    <span className="block text-2xl font-bold text-foreground">{lib.series_count}</span>
                     <span className="text-xs text-muted-foreground">{t("libraries.series")}</span>
                   </Link>
                 </div>
-                {/* Status */}
-                <div className="flex items-center gap-3 mb-4 text-sm">
-                  <span className={`flex items-center gap-1 ${lib.monitor_enabled ? 'text-success' : 'text-muted-foreground'}`}>
-                    {lib.monitor_enabled ? '●' : '○'} {lib.monitor_enabled ? t("libraries.auto") : t("libraries.manual")}
-                  </span>
-                  {lib.watcher_enabled && (
-                    <span className="text-warning" title="Surveillance de fichiers active"></span>
+                {/* Configuration tags */}
+                <div className="flex flex-wrap gap-1.5">
+                  <span className={`inline-flex items-center gap-1 px-2 py-0.5 rounded-full text-[11px] font-medium ${
+                    lib.monitor_enabled
+                      ? 'bg-success/10 text-success'
+                      : 'bg-muted/50 text-muted-foreground'
+                  }`}>
+                    <span className="text-[9px]">{lib.monitor_enabled ? '●' : '○'}</span>
+                    {t("libraries.scanLabel", { mode: t(`monitoring.${lib.scan_mode}` as TranslationKey) })}
+                  </span>
+                  <span className={`inline-flex items-center gap-1 px-2 py-0.5 rounded-full text-[11px] font-medium ${
+                    lib.watcher_enabled
+                      ? 'bg-warning/10 text-warning'
+                      : 'bg-muted/50 text-muted-foreground'
+                  }`}>
+                    <span>{lib.watcher_enabled ? '⚡' : '○'}</span>
+                    <span>{t("libraries.watcherLabel")}</span>
+                  </span>
+                  {lib.metadata_provider && lib.metadata_provider !== "none" && (
+                    <span className="inline-flex items-center gap-1 px-2 py-0.5 rounded-full text-[11px] font-medium bg-primary/10 text-primary">
+                      <ProviderIcon provider={lib.metadata_provider} size={11} />
+                      {lib.metadata_provider.replace('_', ' ')}
+                    </span>
                   )}
+                  {lib.metadata_refresh_mode !== "manual" && (
+                    <span className="inline-flex items-center gap-1 px-2 py-0.5 rounded-full text-[11px] font-medium bg-muted/50 text-muted-foreground">
+                      {t("libraries.metaRefreshLabel", { mode: t(`monitoring.${lib.metadata_refresh_mode}` as TranslationKey) })}
+                    </span>
+                  )}
                   {lib.monitor_enabled && lib.next_scan_at && (
-                    <span className="text-xs text-muted-foreground ml-auto">
+                    <span className="inline-flex items-center gap-1 px-2 py-0.5 rounded-full text-[11px] font-medium bg-muted/50 text-muted-foreground">
                       {t("libraries.nextScan", { time: formatNextScan(lib.next_scan_at, t("libraries.imminent")) })}
                     </span>
                   )}
                 </div>
-                {/* Actions */}
-                <div className="flex items-center gap-2">
-                  <form className="flex-1">
-                    <input type="hidden" name="id" value={lib.id} />
-                    <Button type="submit" variant="default" size="sm" className="w-full" formAction={scanLibraryAction}>
-                      <svg className="w-4 h-4 mr-1" fill="none" stroke="currentColor" viewBox="0 0 24 24">
-                        <path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M4 4v5h.582m15.356 2A8.001 8.001 0 004.582 9m0 0H9m11 11v-5h-.581m0 0a8.003 8.003 0 01-15.357-2m15.357 2H15" />
-                      </svg>
-                      {t("libraries.index")}
-                    </Button>
-                  </form>
-                  <form className="flex-1">
-                    <input type="hidden" name="id" value={lib.id} />
-                    <Button type="submit" variant="secondary" size="sm" className="w-full" formAction={scanLibraryFullAction}>
-                      <svg className="w-4 h-4 mr-1" fill="none" stroke="currentColor" viewBox="0 0 24 24">
-                        <path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M4 4v5h.582m15.356 2A8.001 8.001 0 004.582 9m0 0H9m11 11v-5h-.581m0 0a8.003 8.003 0 01-15.357-2m15.357 2H15" />
-                      </svg>
-                      {t("libraries.fullIndex")}
-                    </Button>
-                  </form>
-                  {lib.metadata_provider !== "none" && (
-                    <form>
-                      <input type="hidden" name="id" value={lib.id} />
-                      <Button type="submit" variant="secondary" size="sm" formAction={batchMetadataAction} title={t("libraries.batchMetadata")}>
-                        <svg className="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
-                          <path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M21 21l-6-6m2-5a7 7 0 11-14 0 7 7 0 0114 0z" />
-                        </svg>
-                      </Button>
-                    </form>
-                  )}
-                  <form>
-                    <input type="hidden" name="id" value={lib.id} />
-                    <Button type="submit" variant="destructive" size="sm" formAction={removeLibrary}>
-                      <svg className="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
-                        <path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M19 7l-.867 12.142A2 2 0 0116.138 21H7.862a2 2 0 01-1.995-1.858L5 7m5 4v6m4-6v6m1-10V4a1 1 0 00-1-1h-4a1 1 0 00-1 1v3M4 7h16" />
-                      </svg>
-                    </Button>
-                  </form>
-                </div>
               </CardContent>
             </Card>
           );

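The cover-fan layout in the libraries page above places each thumbnail on a circular arc: covers sit 12 degrees apart around the middle one, offset along a 220px radius from a pivot anchored below the card, with z-index stacking the center cover on top. The same computation as a pure function (the `fanLayout` name and `FanSlot` shape are illustrative):

```typescript
// One slot per cover: rotation angle, translate offsets, and stacking order.
interface FanSlot { angle: number; x: number; y: number; z: number; }

function fanLayout(count: number, radius = 220, step = 12): FanSlot[] {
  const mid = (count - 1) / 2;
  return Array.from({ length: count }, (_, i) => {
    const angle = (i - mid) * step;              // spread covers around the midpoint
    const rad = ((angle - 90) * Math.PI) / 180;  // shift by -90° so angle 0 points up
    return {
      angle,
      x: Math.cos(rad) * radius,                 // translateX in px
      y: Math.sin(rad) * radius,                 // translateY in px (negative = lifted up)
      z: count - Math.abs(Math.round(i - mid)),  // center cover gets the highest z-index
    };
  });
}
```

For three covers this yields angles -12, 0, and 12, with the middle cover translated straight up by the radius and stacked above its neighbors, matching the `transform`/`zIndex` values in the JSX.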

@@ -1,6 +1,8 @@
import React from "react"; import React from "react";
import { fetchStats, StatsResponse } from "../lib/api"; import { fetchStats, StatsResponse, getBookCoverUrl } from "../lib/api";
import { Card, CardContent, CardHeader, CardTitle } from "./components/ui"; import { Card, CardContent, CardHeader, CardTitle } from "./components/ui";
import { RcDonutChart, RcBarChart, RcAreaChart, RcStackedBar, RcHorizontalBar } from "./components/DashboardCharts";
import Image from "next/image";
import Link from "next/link"; import Link from "next/link";
import { getServerTranslations } from "../lib/i18n/server"; import { getServerTranslations } from "../lib/i18n/server";
import type { TranslateFunction } from "../lib/i18n/dictionaries"; import type { TranslateFunction } from "../lib/i18n/dictionaries";
@@ -19,84 +21,7 @@ function formatNumber(n: number, locale: string): string {
return n.toLocaleString(locale === "fr" ? "fr-FR" : "en-US"); return n.toLocaleString(locale === "fr" ? "fr-FR" : "en-US");
} }
// Donut chart via SVG // Horizontal progress bar for metadata quality (stays server-rendered, no recharts needed)
function DonutChart({ data, colors, noDataLabel, locale = "fr" }: { data: { label: string; value: number; color: string }[]; colors?: string[]; noDataLabel?: string; locale?: string }) {
const total = data.reduce((sum, d) => sum + d.value, 0);
if (total === 0) return <p className="text-muted-foreground text-sm text-center py-8">{noDataLabel}</p>;
const radius = 40;
const circumference = 2 * Math.PI * radius;
let offset = 0;
return (
<div className="flex items-center gap-6">
<svg viewBox="0 0 100 100" className="w-32 h-32 shrink-0">
{data.map((d, i) => {
const pct = d.value / total;
const dashLength = pct * circumference;
const currentOffset = offset;
offset += dashLength;
return (
<circle
key={i}
cx="50"
cy="50"
r={radius}
fill="none"
stroke={d.color}
strokeWidth="16"
strokeDasharray={`${dashLength} ${circumference - dashLength}`}
strokeDashoffset={-currentOffset}
transform="rotate(-90 50 50)"
className="transition-all duration-500"
/>
);
})}
<text x="50" y="50" textAnchor="middle" dominantBaseline="central" className="fill-foreground text-[10px] font-bold">
{formatNumber(total, locale)}
</text>
</svg>
<div className="flex flex-col gap-1.5 min-w-0">
{data.map((d, i) => (
<div key={i} className="flex items-center gap-2 text-sm">
<span className="w-3 h-3 rounded-full shrink-0" style={{ backgroundColor: d.color }} />
<span className="text-muted-foreground truncate">{d.label}</span>
<span className="font-medium text-foreground ml-auto">{d.value}</span>
</div>
))}
</div>
</div>
);
}
// Bar chart via pure CSS
function BarChart({ data, color = "var(--color-primary)", noDataLabel }: { data: { label: string; value: number }[]; color?: string; noDataLabel?: string }) {
const max = Math.max(...data.map((d) => d.value), 1);
if (data.length === 0) return <p className="text-muted-foreground text-sm text-center py-8">{noDataLabel}</p>;
return (
<div className="flex items-end gap-1.5 h-40">
{data.map((d, i) => (
<div key={i} className="flex-1 flex flex-col items-center gap-1 min-w-0">
<span className="text-[10px] text-muted-foreground font-medium">{d.value || ""}</span>
<div
className="w-full rounded-t-sm transition-all duration-500 min-h-[2px]"
style={{
height: `${(d.value / max) * 100}%`,
backgroundColor: color,
opacity: d.value === 0 ? 0.2 : 1,
}}
/>
<span className="text-[10px] text-muted-foreground truncate w-full text-center">
{d.label}
</span>
</div>
))}
</div>
);
}
// Horizontal progress bar for library breakdown
function HorizontalBar({ label, value, max, subLabel, color = "var(--color-primary)" }: { label: string; value: number; max: number; subLabel?: string; color?: string }) {
const pct = max > 0 ? (value / max) * 100 : 0;
return (
@@ -137,7 +62,7 @@ export default async function DashboardPage() {
);
}
- const { overview, reading_status, by_format, by_language, by_library, top_series, additions_over_time, metadata } = stats;
+ const { overview, reading_status, currently_reading = [], recently_read = [], reading_over_time = [], by_format, by_library, top_series, additions_over_time, metadata } = stats;
const readingColors = ["hsl(220 13% 70%)", "hsl(45 93% 47%)", "hsl(142 60% 45%)"];
const formatColors = [
@@ -146,7 +71,6 @@ export default async function DashboardPage() {
"hsl(170 60% 45%)", "hsl(220 60% 50%)",
];
- const maxLibBooks = Math.max(...by_library.map((l) => l.book_count), 1);
const noDataLabel = t("common.noData");
return (
@@ -174,6 +98,98 @@ export default async function DashboardPage() {
<StatCard icon="size" label={t("dashboard.totalSize")} value={formatBytes(overview.total_size_bytes)} color="warning" />
</div>
{/* Currently reading + Recently read */}
{(currently_reading.length > 0 || recently_read.length > 0) && (
<div className="grid grid-cols-1 lg:grid-cols-2 gap-6">
{/* Currently reading */}
<Card hover={false}>
<CardHeader>
<CardTitle className="text-base">{t("dashboard.currentlyReading")}</CardTitle>
</CardHeader>
<CardContent>
{currently_reading.length === 0 ? (
<p className="text-muted-foreground text-sm text-center py-4">{t("dashboard.noCurrentlyReading")}</p>
) : (
<div className="space-y-3">
{currently_reading.slice(0, 8).map((book) => {
const pct = book.page_count > 0 ? Math.round((book.current_page / book.page_count) * 100) : 0;
return (
<Link key={book.book_id} href={`/books/${book.book_id}` as any} className="flex items-center gap-3 group">
<Image
src={getBookCoverUrl(book.book_id)}
alt={book.title}
width={40}
height={56}
className="w-10 h-14 object-cover rounded shadow-sm shrink-0 bg-muted"
/>
<div className="min-w-0 flex-1">
<p className="text-sm font-medium text-foreground truncate group-hover:text-primary transition-colors">{book.title}</p>
{book.series && <p className="text-xs text-muted-foreground truncate">{book.series}</p>}
<div className="mt-1.5 flex items-center gap-2">
<div className="h-1.5 flex-1 bg-muted rounded-full overflow-hidden">
<div className="h-full bg-warning rounded-full transition-all" style={{ width: `${pct}%` }} />
</div>
<span className="text-[10px] text-muted-foreground shrink-0">{pct}%</span>
</div>
<p className="text-[10px] text-muted-foreground mt-0.5">{t("dashboard.pageProgress", { current: book.current_page, total: book.page_count })}</p>
</div>
</Link>
);
})}
</div>
)}
</CardContent>
</Card>
{/* Recently read */}
<Card hover={false}>
<CardHeader>
<CardTitle className="text-base">{t("dashboard.recentlyRead")}</CardTitle>
</CardHeader>
<CardContent>
{recently_read.length === 0 ? (
<p className="text-muted-foreground text-sm text-center py-4">{t("dashboard.noRecentlyRead")}</p>
) : (
<div className="space-y-3">
{recently_read.map((book) => (
<Link key={book.book_id} href={`/books/${book.book_id}` as any} className="flex items-center gap-3 group">
<Image
src={getBookCoverUrl(book.book_id)}
alt={book.title}
width={40}
height={56}
className="w-10 h-14 object-cover rounded shadow-sm shrink-0 bg-muted"
/>
<div className="min-w-0 flex-1">
<p className="text-sm font-medium text-foreground truncate group-hover:text-primary transition-colors">{book.title}</p>
{book.series && <p className="text-xs text-muted-foreground truncate">{book.series}</p>}
</div>
<span className="text-xs text-muted-foreground shrink-0">{book.last_read_at}</span>
</Link>
))}
</div>
)}
</CardContent>
</Card>
</div>
)}
{/* Reading activity line chart */}
{reading_over_time.length > 0 && (
<Card hover={false}>
<CardHeader>
<CardTitle className="text-base">{t("dashboard.readingActivity")}</CardTitle>
</CardHeader>
<CardContent>
<RcAreaChart
noDataLabel={noDataLabel}
data={reading_over_time.map((m) => ({ label: m.month.slice(5), value: m.books_read }))}
color="hsl(142 60% 45%)"
/>
</CardContent>
</Card>
)}
{/* Charts row */}
<div className="grid grid-cols-1 md:grid-cols-2 lg:grid-cols-3 gap-6">
{/* Reading status donut */}
@@ -182,13 +198,12 @@ export default async function DashboardPage() {
<CardTitle className="text-base">{t("dashboard.readingStatus")}</CardTitle>
</CardHeader>
<CardContent>
- <DonutChart
- locale={locale}
+ <RcDonutChart
noDataLabel={noDataLabel}
data={[
- { label: t("status.unread"), value: reading_status.unread, color: readingColors[0] },
- { label: t("status.reading"), value: reading_status.reading, color: readingColors[1] },
- { label: t("status.read"), value: reading_status.read, color: readingColors[2] },
+ { name: t("status.unread"), value: reading_status.unread, color: readingColors[0] },
+ { name: t("status.reading"), value: reading_status.reading, color: readingColors[1] },
+ { name: t("status.read"), value: reading_status.read, color: readingColors[2] },
]}
/>
</CardContent>
@@ -200,11 +215,10 @@ export default async function DashboardPage() {
<CardTitle className="text-base">{t("dashboard.byFormat")}</CardTitle>
</CardHeader>
<CardContent>
- <DonutChart
- locale={locale}
+ <RcDonutChart
noDataLabel={noDataLabel}
data={by_format.slice(0, 6).map((f, i) => ({
- label: (f.format || t("dashboard.unknown")).toUpperCase(),
+ name: (f.format || t("dashboard.unknown")).toUpperCase(),
value: f.count,
color: formatColors[i % formatColors.length],
}))}
@@ -218,11 +232,10 @@ export default async function DashboardPage() {
<CardTitle className="text-base">{t("dashboard.byLibrary")}</CardTitle>
</CardHeader>
<CardContent>
- <DonutChart
- locale={locale}
+ <RcDonutChart
noDataLabel={noDataLabel}
data={by_library.slice(0, 6).map((l, i) => ({
- label: l.library_name,
+ name: l.library_name,
value: l.book_count,
color: formatColors[i % formatColors.length],
}))}
@@ -239,12 +252,11 @@ export default async function DashboardPage() {
<CardTitle className="text-base">{t("dashboard.metadataCoverage")}</CardTitle>
</CardHeader>
<CardContent>
- <DonutChart
- locale={locale}
+ <RcDonutChart
noDataLabel={noDataLabel}
data={[
- { label: t("dashboard.seriesLinked"), value: metadata.series_linked, color: "hsl(142 60% 45%)" },
- { label: t("dashboard.seriesUnlinked"), value: metadata.series_unlinked, color: "hsl(220 13% 70%)" },
+ { name: t("dashboard.seriesLinked"), value: metadata.series_linked, color: "hsl(142 60% 45%)" },
+ { name: t("dashboard.seriesUnlinked"), value: metadata.series_unlinked, color: "hsl(220 13% 70%)" },
]}
/>
</CardContent>
@@ -256,11 +268,10 @@ export default async function DashboardPage() {
<CardTitle className="text-base">{t("dashboard.byProvider")}</CardTitle>
</CardHeader>
<CardContent>
- <DonutChart
- locale={locale}
+ <RcDonutChart
noDataLabel={noDataLabel}
data={metadata.by_provider.map((p, i) => ({
- label: p.provider.replace(/_/g, " ").replace(/\b\w/g, (c) => c.toUpperCase()),
+ name: p.provider.replace(/_/g, " ").replace(/\b\w/g, (c) => c.toUpperCase()),
value: p.count,
color: formatColors[i % formatColors.length],
}))}
@@ -294,24 +305,32 @@ export default async function DashboardPage() {
</Card>
</div>
- {/* Second row */}
+ {/* Libraries breakdown + Top series */}
<div className="grid grid-cols-1 lg:grid-cols-2 gap-6">
- {/* Monthly additions bar chart */}
+ {by_library.length > 0 && (
<Card hover={false}>
<CardHeader>
- <CardTitle className="text-base">{t("dashboard.booksAdded")}</CardTitle>
+ <CardTitle className="text-base">{t("dashboard.libraries")}</CardTitle>
</CardHeader>
<CardContent>
- <BarChart
- noDataLabel={noDataLabel}
- data={additions_over_time.map((m) => ({
- label: m.month.slice(5), // "MM" from "YYYY-MM"
- value: m.books_added,
- }))}
- color="hsl(198 78% 37%)"
- />
+ <RcStackedBar
+ data={by_library.map((lib) => ({
+ name: lib.library_name,
+ read: lib.read_count,
+ reading: lib.reading_count,
+ unread: lib.unread_count,
+ sizeLabel: formatBytes(lib.size_bytes),
+ }))}
+ labels={{
+ read: t("status.read"),
+ reading: t("status.reading"),
+ unread: t("status.unread"),
+ books: t("dashboard.books"),
+ }}
+ />
</CardContent>
</Card>
+ )}
{/* Top series */}
<Card hover={false}>
@@ -319,67 +338,32 @@
<CardTitle className="text-base">{t("dashboard.popularSeries")}</CardTitle>
</CardHeader>
<CardContent>
- <div className="space-y-3">
- {top_series.slice(0, 8).map((s, i) => (
- <HorizontalBar
- key={i}
- label={s.series}
- value={s.book_count}
- max={top_series[0]?.book_count || 1}
- subLabel={t("dashboard.readCount", { read: s.read_count, total: s.book_count })}
- color="hsl(142 60% 45%)"
- />
- ))}
- {top_series.length === 0 && (
- <p className="text-muted-foreground text-sm text-center py-4">{t("dashboard.noSeries")}</p>
- )}
- </div>
+ <RcHorizontalBar
+ noDataLabel={t("dashboard.noSeries")}
+ data={top_series.slice(0, 8).map((s) => ({
+ name: s.series,
+ value: s.book_count,
+ subLabel: t("dashboard.readCount", { read: s.read_count, total: s.book_count }),
+ }))}
+ color="hsl(142 60% 45%)"
+ />
</CardContent>
</Card>
</div>
- {/* Libraries breakdown */}
- {by_library.length > 0 && (
+ {/* Monthly additions line chart full width */}
<Card hover={false}>
<CardHeader>
- <CardTitle className="text-base">{t("dashboard.libraries")}</CardTitle>
+ <CardTitle className="text-base">{t("dashboard.booksAdded")}</CardTitle>
</CardHeader>
<CardContent>
- <div className="grid grid-cols-1 md:grid-cols-2 gap-x-8 gap-y-4">
- {by_library.map((lib, i) => (
- <div key={i} className="space-y-2">
- <div className="flex justify-between items-baseline">
- <span className="font-medium text-foreground text-sm">{lib.library_name}</span>
- <span className="text-xs text-muted-foreground">{formatBytes(lib.size_bytes)}</span>
- </div>
- <div className="h-3 bg-muted rounded-full overflow-hidden flex">
- <div
- className="h-full transition-all duration-500"
- style={{ width: `${(lib.read_count / Math.max(lib.book_count, 1)) * 100}%`, backgroundColor: "hsl(142 60% 45%)" }}
- title={`${t("status.read")} : ${lib.read_count}`}
- />
- <div
- className="h-full transition-all duration-500"
- style={{ width: `${(lib.reading_count / Math.max(lib.book_count, 1)) * 100}%`, backgroundColor: "hsl(45 93% 47%)" }}
- title={`${t("status.reading")} : ${lib.reading_count}`}
- />
- <div
- className="h-full transition-all duration-500"
- style={{ width: `${(lib.unread_count / Math.max(lib.book_count, 1)) * 100}%`, backgroundColor: "hsl(220 13% 70%)" }}
- title={`${t("status.unread")} : ${lib.unread_count}`}
- />
- </div>
- <div className="flex gap-3 text-[11px] text-muted-foreground">
- <span>{lib.book_count} {t("dashboard.books").toLowerCase()}</span>
- <span className="text-success">{lib.read_count} {t("status.read").toLowerCase()}</span>
- <span className="text-warning">{lib.reading_count} {t("status.reading").toLowerCase()}</span>
- </div>
- </div>
- ))}
- </div>
+ <RcAreaChart
+ noDataLabel={noDataLabel}
+ data={additions_over_time.map((m) => ({ label: m.month.slice(5), value: m.books_added }))}
+ color="hsl(198 78% 37%)"
+ />
</CardContent>
</Card>
- )}
{/* Quick links */}
<QuickLinks t={t} />

View File

@@ -138,7 +138,7 @@ export default async function SeriesPage({
alt={t("books.coverOf", { name: s.name })}
fill
className="object-cover"
- unoptimized
+ sizes="(max-width: 640px) 50vw, (max-width: 768px) 33vw, (max-width: 1024px) 25vw, 16vw"
/>
</div>
<div className="p-3">

View File

@@ -150,11 +150,12 @@ export default function SettingsPage({ initialSettings, initialCacheStats, initi
}
}
- const [activeTab, setActiveTab] = useState<"general" | "integrations">("general");
+ const [activeTab, setActiveTab] = useState<"general" | "integrations" | "notifications">("general");
const tabs = [
{ id: "general" as const, label: t("settings.general"), icon: "settings" as const },
{ id: "integrations" as const, label: t("settings.integrations"), icon: "refresh" as const },
+ { id: "notifications" as const, label: t("settings.notifications"), icon: "bell" as const },
];
return (
@@ -734,7 +735,7 @@ export default function SettingsPage({ initialSettings, initialCacheStats, initi
>
<div className="flex items-center justify-between">
<span className="text-sm font-medium text-foreground">
- {new Date(r.created_at).toLocaleString()}
+ {new Date(r.created_at).toLocaleString(locale)}
</span>
<span className="text-xs text-muted-foreground truncate ml-2" title={r.komga_url}>
{r.komga_url}
@@ -826,6 +827,11 @@ export default function SettingsPage({ initialSettings, initialCacheStats, initi
</CardContent>
</Card>
</>)}
+ {activeTab === "notifications" && (<>
+ {/* Telegram Notifications */}
+ <TelegramCard handleUpdateSetting={handleUpdateSetting} />
+ </>)}
</>
);
}
@@ -1480,3 +1486,254 @@ function QBittorrentCard({ handleUpdateSetting }: { handleUpdateSetting: (key: s
</Card> </Card>
); );
} }
// ---------------------------------------------------------------------------
// Telegram Notifications sub-component
// ---------------------------------------------------------------------------
const DEFAULT_EVENTS = {
scan_completed: true,
scan_failed: true,
scan_cancelled: true,
thumbnail_completed: true,
thumbnail_failed: true,
conversion_completed: true,
conversion_failed: true,
metadata_approved: true,
metadata_batch_completed: true,
metadata_batch_failed: true,
metadata_refresh_completed: true,
metadata_refresh_failed: true,
};
function TelegramCard({ handleUpdateSetting }: { handleUpdateSetting: (key: string, value: unknown) => Promise<void> }) {
const { t } = useTranslation();
const [botToken, setBotToken] = useState("");
const [chatId, setChatId] = useState("");
const [enabled, setEnabled] = useState(false);
const [events, setEvents] = useState(DEFAULT_EVENTS);
const [isTesting, setIsTesting] = useState(false);
const [testResult, setTestResult] = useState<{ success: boolean; message: string } | null>(null);
const [showHelp, setShowHelp] = useState(false);
useEffect(() => {
fetch("/api/settings/telegram")
.then((r) => (r.ok ? r.json() : null))
.then((data) => {
if (data) {
if (data.bot_token) setBotToken(data.bot_token);
if (data.chat_id) setChatId(data.chat_id);
if (data.enabled !== undefined) setEnabled(data.enabled);
if (data.events) setEvents({ ...DEFAULT_EVENTS, ...data.events });
}
})
.catch(() => {});
}, []);
function saveTelegram(token?: string, chat?: string, en?: boolean, ev?: typeof events) {
handleUpdateSetting("telegram", {
bot_token: token ?? botToken,
chat_id: chat ?? chatId,
enabled: en ?? enabled,
events: ev ?? events,
});
}
async function handleTestConnection() {
setIsTesting(true);
setTestResult(null);
try {
const resp = await fetch("/api/telegram/test");
const data = await resp.json();
if (data.error) {
setTestResult({ success: false, message: data.error });
} else {
setTestResult(data);
}
} catch {
setTestResult({ success: false, message: "Failed to connect" });
} finally {
setIsTesting(false);
}
}
return (
<Card className="mb-6">
<CardHeader>
<CardTitle className="flex items-center gap-2">
<Icon name="bell" size="md" />
{t("settings.telegram")}
</CardTitle>
<CardDescription>{t("settings.telegramDesc")}</CardDescription>
</CardHeader>
<CardContent>
<div className="space-y-4">
{/* Setup guide */}
<div>
<button
type="button"
onClick={() => setShowHelp(!showHelp)}
className="text-sm text-primary hover:text-primary/80 flex items-center gap-1 transition-colors"
>
<Icon name={showHelp ? "chevronDown" : "chevronRight"} size="sm" />
{t("settings.telegramHelp")}
</button>
{showHelp && (
<div className="mt-3 p-4 rounded-lg bg-muted/30 space-y-3 text-sm text-foreground">
<div>
<p className="font-medium mb-1">1. Bot Token</p>
<p className="text-muted-foreground" dangerouslySetInnerHTML={{ __html: t("settings.telegramHelpBot") }} />
</div>
<div>
<p className="font-medium mb-1">2. Chat ID</p>
<p className="text-muted-foreground" dangerouslySetInnerHTML={{ __html: t("settings.telegramHelpChat") }} />
</div>
<div>
<p className="font-medium mb-1">3. Group chat</p>
<p className="text-muted-foreground" dangerouslySetInnerHTML={{ __html: t("settings.telegramHelpGroup") }} />
</div>
</div>
)}
</div>
<div className="flex items-center gap-3">
<label className="relative inline-flex items-center cursor-pointer">
<input
type="checkbox"
checked={enabled}
onChange={(e) => {
setEnabled(e.target.checked);
saveTelegram(undefined, undefined, e.target.checked);
}}
className="sr-only peer"
/>
<div className="w-11 h-6 bg-muted rounded-full peer peer-checked:after:translate-x-full peer-checked:after:border-white after:content-[''] after:absolute after:top-[2px] after:left-[2px] after:bg-white after:border-gray-300 after:border after:rounded-full after:h-5 after:w-5 after:transition-all peer-checked:bg-primary"></div>
</label>
<span className="text-sm font-medium text-foreground">{t("settings.telegramEnabled")}</span>
</div>
<FormRow>
<FormField className="flex-1">
<label className="text-sm font-medium text-muted-foreground mb-1 block">{t("settings.botToken")}</label>
<FormInput
type="password"
placeholder={t("settings.botTokenPlaceholder")}
value={botToken}
onChange={(e) => setBotToken(e.target.value)}
onBlur={() => saveTelegram()}
/>
</FormField>
</FormRow>
<FormRow>
<FormField className="flex-1">
<label className="text-sm font-medium text-muted-foreground mb-1 block">{t("settings.chatId")}</label>
<FormInput
type="text"
placeholder={t("settings.chatIdPlaceholder")}
value={chatId}
onChange={(e) => setChatId(e.target.value)}
onBlur={() => saveTelegram()}
/>
</FormField>
</FormRow>
{/* Event toggles grouped by category */}
<div className="border-t border-border/50 pt-4">
<h4 className="text-sm font-medium text-foreground mb-4">{t("settings.telegramEvents")}</h4>
<div className="grid grid-cols-2 gap-x-6 gap-y-5">
{([
{
category: t("settings.eventCategoryScan"),
icon: "search" as const,
items: [
{ key: "scan_completed" as const, label: t("settings.eventCompleted") },
{ key: "scan_failed" as const, label: t("settings.eventFailed") },
{ key: "scan_cancelled" as const, label: t("settings.eventCancelled") },
],
},
{
category: t("settings.eventCategoryThumbnail"),
icon: "image" as const,
items: [
{ key: "thumbnail_completed" as const, label: t("settings.eventCompleted") },
{ key: "thumbnail_failed" as const, label: t("settings.eventFailed") },
],
},
{
category: t("settings.eventCategoryConversion"),
icon: "refresh" as const,
items: [
{ key: "conversion_completed" as const, label: t("settings.eventCompleted") },
{ key: "conversion_failed" as const, label: t("settings.eventFailed") },
],
},
{
category: t("settings.eventCategoryMetadata"),
icon: "tag" as const,
items: [
{ key: "metadata_approved" as const, label: t("settings.eventLinked") },
{ key: "metadata_batch_completed" as const, label: t("settings.eventBatchCompleted") },
{ key: "metadata_batch_failed" as const, label: t("settings.eventBatchFailed") },
{ key: "metadata_refresh_completed" as const, label: t("settings.eventRefreshCompleted") },
{ key: "metadata_refresh_failed" as const, label: t("settings.eventRefreshFailed") },
],
},
]).map(({ category, icon, items }) => (
<div key={category}>
<p className="text-xs font-medium text-muted-foreground uppercase tracking-wide mb-2 flex items-center gap-1.5">
<Icon name={icon} size="sm" className="text-muted-foreground" />
{category}
</p>
<div className="space-y-1">
{items.map(({ key, label }) => (
<label key={key} className="flex items-center justify-between py-1.5 cursor-pointer group">
<span className="text-sm text-foreground group-hover:text-foreground/80">{label}</span>
<div className="relative">
<input
type="checkbox"
checked={events[key]}
onChange={(e) => {
const updated = { ...events, [key]: e.target.checked };
setEvents(updated);
saveTelegram(undefined, undefined, undefined, updated);
}}
className="sr-only peer"
/>
<div className="w-9 h-5 bg-muted rounded-full peer peer-checked:after:translate-x-full peer-checked:after:border-white after:content-[''] after:absolute after:top-[2px] after:left-[2px] after:bg-white after:border-gray-300 after:border after:rounded-full after:h-4 after:w-4 after:transition-all peer-checked:bg-primary" />
</div>
</label>
))}
</div>
</div>
))}
</div>
</div>
<div className="flex items-center gap-3">
<Button
onClick={handleTestConnection}
disabled={isTesting || !botToken || !chatId || !enabled}
>
{isTesting ? (
<>
<Icon name="spinner" size="sm" className="animate-spin -ml-1 mr-2" />
{t("settings.testing")}
</>
) : (
<>
<Icon name="refresh" size="sm" className="mr-2" />
{t("settings.testConnection")}
</>
)}
</Button>
{testResult && (
<span className={`text-sm font-medium ${testResult.success ? "text-success" : "text-destructive"}`}>
{testResult.message}
</span>
)}
</div>
</div>
</CardContent>
</Card>
);
}

View File

@@ -12,6 +12,8 @@ export type LibraryDto = {
fallback_metadata_provider: string | null;
metadata_refresh_mode: string;
next_metadata_refresh_at: string | null;
+ series_count: number;
+ thumbnail_book_ids: string[];
};
export type IndexJobDto = {
@@ -139,7 +141,7 @@ export function config() {
export async function apiFetch<T>(
path: string,
- init?: RequestInit,
+ init?: RequestInit & { next?: { revalidate?: number; tags?: string[] } },
): Promise<T> {
const { baseUrl, token } = config();
const headers = new Headers(init?.headers || {});
@@ -148,10 +150,12 @@ export async function apiFetch<T>(
headers.set("Content-Type", "application/json");
}
+ const { next: nextOptions, ...restInit } = init ?? {};
const res = await fetch(`${baseUrl}${path}`, {
- ...init,
+ ...restInit,
headers,
- cache: "no-store",
+ ...(nextOptions ? { next: nextOptions } : { cache: "no-store" as const }),
});
if (!res.ok) {
@@ -166,7 +170,7 @@ export async function apiFetch<T>(
}
export async function fetchLibraries() {
- return apiFetch<LibraryDto[]>("/libraries");
+ return apiFetch<LibraryDto[]>("/libraries", { next: { revalidate: 30 } });
}
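The `apiFetch` change in the hunk above boils down to splitting the Next.js-specific `next` option out of the caller-supplied init object, then either forwarding it (so Next.js can cache and revalidate the server-side fetch) or falling back to the previous always-fresh `cache: "no-store"` behavior. A minimal, framework-free sketch of just that option-splitting step (`buildFetchInit` is a hypothetical helper name; the real code inlines this logic and the `fetch` call is omitted):

```typescript
// Sketch of the cache-option handling added to apiFetch. Callers may pass a
// Next.js `next.revalidate` hint; requests without one stay uncached.
type NextFetchOptions = { revalidate?: number; tags?: string[] };
type InitLike = { method?: string; body?: string; cache?: string };
type ApiInit = InitLike & { next?: NextFetchOptions };

function buildFetchInit(init?: ApiInit): InitLike & { next?: NextFetchOptions } {
  // Separate the Next.js extension from the standard fetch options.
  const { next: nextOptions, ...restInit } = init ?? {};
  return {
    ...restInit,
    // With a `next` hint, let the framework cache and revalidate server-side;
    // without one, preserve the old always-fresh behavior.
    ...(nextOptions ? { next: nextOptions } : { cache: "no-store" }),
  };
}
```

Usage mirrors the call sites in the diff: `buildFetchInit({ next: { revalidate: 30 } })` yields `{ next: { revalidate: 30 } }` with no `cache` key, while `buildFetchInit({ method: "POST" })` yields `{ method: "POST", cache: "no-store" }`.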
export async function createLibrary(name: string, rootPath: string) {
@@ -221,10 +225,11 @@ export async function listJobs() {
return apiFetch<IndexJobDto[]>("/index/status");
}
- export async function rebuildIndex(libraryId?: string, full?: boolean) {
- const body: { library_id?: string; full?: boolean } = {};
+ export async function rebuildIndex(libraryId?: string, full?: boolean, rescan?: boolean) {
+ const body: { library_id?: string; full?: boolean; rescan?: boolean } = {};
if (libraryId) body.library_id = libraryId;
if (full) body.full = true;
+ if (rescan) body.rescan = true;
return apiFetch<IndexJobDto>("/index/rebuild", {
method: "POST",
body: JSON.stringify(body),
@@ -284,12 +289,18 @@ export async function fetchBooks(
limit: number = 50,
readingStatus?: string,
sort?: string,
+ author?: string,
+ format?: string,
+ metadataProvider?: string,
): Promise<BooksPageDto> {
const params = new URLSearchParams();
if (libraryId) params.set("library_id", libraryId);
if (series) params.set("series", series);
if (readingStatus) params.set("reading_status", readingStatus);
if (sort) params.set("sort", sort);
+ if (author) params.set("author", author);
+ if (format) params.set("format", format);
+ if (metadataProvider) params.set("metadata_provider", metadataProvider);
params.set("page", page.toString());
params.set("limit", limit.toString());
@@ -331,6 +342,7 @@ export async function fetchAllSeries(
seriesStatus?: string,
hasMissing?: boolean,
metadataProvider?: string,
+ author?: string,
): Promise<SeriesPageDto> {
const params = new URLSearchParams();
if (libraryId) params.set("library_id", libraryId);
@@ -340,6 +352,7 @@ export async function fetchAllSeries(
if (seriesStatus) params.set("series_status", seriesStatus);
if (hasMissing) params.set("has_missing", "true");
if (metadataProvider) params.set("metadata_provider", metadataProvider);
+ if (author) params.set("author", author);
params.set("page", page.toString());
params.set("limit", limit.toString());
@@ -347,7 +360,7 @@
}
export async function fetchSeriesStatuses(): Promise<string[]> {
- return apiFetch<string[]>("/series/statuses");
+ return apiFetch<string[]>("/series/statuses", { next: { revalidate: 300 } });
}
export async function searchBooks(
@@ -412,7 +425,7 @@ export type ThumbnailStats = {
};
export async function getSettings() {
- return apiFetch<Settings>("/settings");
+ return apiFetch<Settings>("/settings", { next: { revalidate: 60 } });
}
export async function updateSetting(key: string, value: unknown) {
@@ -423,7 +436,7 @@ export async function updateSetting(key: string, value: unknown) {
}
export async function getCacheStats() {
- return apiFetch<CacheStats>("/settings/cache/stats");
+ return apiFetch<CacheStats>("/settings/cache/stats", { next: { revalidate: 30 } });
}
export async function clearCache() {
@@ -433,7 +446,7 @@ export async function clearCache() {
}
export async function getThumbnailStats() {
- return apiFetch<ThumbnailStats>("/settings/thumbnail/stats");
+ return apiFetch<ThumbnailStats>("/settings/thumbnail/stats", { next: { revalidate: 30 } });
}
// Status mappings
@@ -444,7 +457,7 @@ export type StatusMappingDto = {
};
export async function fetchStatusMappings(): Promise<StatusMappingDto[]> {
- return apiFetch<StatusMappingDto[]>("/settings/status-mappings");
+ return apiFetch<StatusMappingDto[]>("/settings/status-mappings", { next: { revalidate: 60 } });
}
export async function upsertStatusMapping(provider_status: string, mapped_status: string): Promise<StatusMappingDto> {
@@ -537,9 +550,32 @@ export type MetadataStats = {
by_provider: ProviderCount[];
};
export type CurrentlyReadingItem = {
book_id: string;
title: string;
series: string | null;
current_page: number;
page_count: number;
};
export type RecentlyReadItem = {
book_id: string;
title: string;
series: string | null;
last_read_at: string;
};
export type MonthlyReading = {
month: string;
books_read: number;
};
export type StatsResponse = {
overview: StatsOverview;
reading_status: ReadingStatusStats;
+ currently_reading: CurrentlyReadingItem[];
+ recently_read: RecentlyReadItem[];
+ reading_over_time: MonthlyReading[];
by_format: FormatCount[];
by_language: LanguageCount[];
by_library: LibraryStatsItem[];
@@ -549,7 +585,39 @@ export type StatsResponse = {
};
export async function fetchStats() {
- return apiFetch<StatsResponse>("/stats");
+ return apiFetch<StatsResponse>("/stats", { next: { revalidate: 30 } });
}
// ---------------------------------------------------------------------------
// Authors
// ---------------------------------------------------------------------------
export type AuthorDto = {
name: string;
book_count: number;
series_count: number;
};
export type AuthorsPageDto = {
items: AuthorDto[];
total: number;
page: number;
limit: number;
};
export async function fetchAuthors(
q?: string,
page: number = 1,
limit: number = 20,
sort?: string,
): Promise<AuthorsPageDto> {
const params = new URLSearchParams();
if (q) params.set("q", q);
if (sort) params.set("sort", sort);
params.set("page", page.toString());
params.set("limit", limit.toString());
return apiFetch<AuthorsPageDto>(`/authors?${params.toString()}`);
}
export type UpdateBookRequest = { export type UpdateBookRequest = {
@@ -905,6 +973,7 @@ export type ProwlarrRelease = {
protocol: string | null; protocol: string | null;
infoUrl: string | null; infoUrl: string | null;
categories: ProwlarrCategory[] | null; categories: ProwlarrCategory[] | null;
matchedMissingVolumes: number[] | null;
};
export type ProwlarrSearchResponse = {


@@ -82,6 +82,12 @@ const en: Record<TranslationKey, string> = {
"dashboard.bookMetadata": "Book metadata",
"dashboard.withSummary": "With summary",
"dashboard.withIsbn": "With ISBN",
"dashboard.currentlyReading": "Currently reading",
"dashboard.recentlyRead": "Recently read",
"dashboard.readingActivity": "Reading activity (last 12 months)",
"dashboard.pageProgress": "p. {{current}} / {{total}}",
"dashboard.noCurrentlyReading": "No books in progress",
"dashboard.noRecentlyRead": "No books read recently",
// Books page
"books.title": "Books",
@@ -100,6 +106,8 @@ const en: Record<TranslationKey, string> = {
"books.noResults": "No books found for \"{{query}}\"",
"books.noBooks": "No books available",
"books.coverOf": "Cover of {{name}}",
"books.format": "Format",
"books.allFormats": "All formats",
// Series page
"series.title": "Series",
@@ -113,6 +121,20 @@ const en: Record<TranslationKey, string> = {
"series.missingCount": "{{count}} missing",
"series.readCount": "{{read}}/{{total}} read",
// Authors page
"nav.authors": "Authors",
"authors.title": "Authors",
"authors.searchPlaceholder": "Search by author name...",
"authors.bookCount": "{{count}} book{{plural}}",
"authors.seriesCount": "{{count}} serie{{plural}}",
"authors.noResults": "No authors found matching your filters",
"authors.noAuthors": "No authors available",
"authors.matchingQuery": "matching",
"authors.sortName": "Name",
"authors.sortBooks": "Book count",
"authors.booksBy": "Books by {{name}}",
"authors.seriesBy": "Series by {{name}}",
// Libraries page
"libraries.title": "Libraries",
"libraries.addLibrary": "Add a library",
@@ -124,6 +146,11 @@ const en: Record<TranslationKey, string> = {
"libraries.manual": "Manual",
"libraries.nextScan": "Next: {{time}}",
"libraries.imminent": "Imminent",
"libraries.nextMetadataRefresh": "Next metadata refresh: {{time}}",
"libraries.nextMetadataRefreshShort": "Meta.: {{time}}",
"libraries.scanLabel": "Scan: {{mode}}",
"libraries.watcherLabel": "File watch",
"libraries.metaRefreshLabel": "Meta refresh: {{mode}}",
"libraries.index": "Index",
"libraries.fullIndex": "Full",
"libraries.batchMetadata": "Batch metadata",
@@ -141,13 +168,22 @@ const en: Record<TranslationKey, string> = {
"librarySeries.noBooksInSeries": "No books in this series",
// Library actions
-"libraryActions.autoScan": "Auto scan",
-"libraryActions.fileWatch": "File watch ⚡",
-"libraryActions.schedule": "📅 Schedule",
"libraryActions.settingsTitle": "Library settings",
"libraryActions.sectionIndexation": "Indexation",
"libraryActions.sectionMetadata": "Metadata",
"libraryActions.autoScan": "Scheduled scan",
"libraryActions.autoScanDesc": "Automatically scan for new and modified files",
"libraryActions.fileWatch": "Real-time file watch",
"libraryActions.fileWatchDesc": "Detect file changes instantly via filesystem events",
"libraryActions.schedule": "Frequency",
"libraryActions.provider": "Provider",
-"libraryActions.fallback": "Fallback",
"libraryActions.providerDesc": "Source used to fetch series and volume metadata",
"libraryActions.fallback": "Fallback provider",
"libraryActions.fallbackDesc": "Used when the primary provider returns no results",
"libraryActions.default": "Default",
"libraryActions.none": "None",
"libraryActions.metadataRefreshSchedule": "Auto-refresh",
"libraryActions.metadataRefreshDesc": "Periodically re-fetch metadata for existing series",
"libraryActions.saving": "Saving...",
// Library sub-page header
@@ -169,6 +205,7 @@ const en: Record<TranslationKey, string> = {
"jobs.startJobDescription": "Select a library (or all) and choose the action to perform.",
"jobs.allLibraries": "All libraries",
"jobs.rebuild": "Rebuild",
"jobs.rescan": "Deep rescan",
"jobs.fullRebuild": "Full rebuild",
"jobs.generateThumbnails": "Generate thumbnails",
"jobs.regenerateThumbnails": "Regenerate thumbnails",
@@ -181,12 +218,14 @@ const en: Record<TranslationKey, string> = {
"jobs.groupMetadata": "Metadata",
"jobs.requiresLibrary": "Requires a specific library",
"jobs.rebuildShort": "Scan new & modified files",
"jobs.rescanShort": "Re-walk all directories to discover new formats",
"jobs.fullRebuildShort": "Delete all & re-scan from scratch",
"jobs.generateThumbnailsShort": "Missing thumbnails only",
"jobs.regenerateThumbnailsShort": "Recreate all thumbnails",
"jobs.batchMetadataShort": "Auto-match unlinked series",
"jobs.refreshMetadataShort": "Update existing linked series",
"jobs.rebuildDescription": "Incremental scan: detects files added, modified, or deleted since the last scan, indexes them, and generates missing thumbnails. Existing unmodified data is preserved. This is the most common and fastest action.",
"jobs.rescanDescription": "Re-walks all directories regardless of whether they changed, discovering files in newly supported formats (e.g. EPUB). Existing books and metadata are fully preserved — only genuinely new files are added. Slower than a rebuild but safe for your data.",
"jobs.fullRebuildDescription": "Deletes all indexed data (books, series, thumbnails) then performs a full scan from scratch. Useful if the database is out of sync or corrupted. Long and destructive operation: reading statuses and manual metadata will be lost.",
"jobs.generateThumbnailsDescription": "Generates thumbnails only for books that don't have one yet. Existing thumbnails are not affected. Useful after an import or if some thumbnails are missing.",
"jobs.regenerateThumbnailsDescription": "Regenerates all thumbnails from scratch, replacing existing ones. Useful if thumbnail quality or size has changed in the configuration, or if thumbnails are corrupted.",
@@ -293,6 +332,7 @@ const en: Record<TranslationKey, string> = {
// Job types
"jobType.rebuild": "Indexing",
"jobType.rescan": "Deep rescan",
"jobType.full_rebuild": "Full indexing",
"jobType.thumbnail_rebuild": "Thumbnails",
"jobType.thumbnail_regenerate": "Regen. thumbnails",
@@ -301,6 +341,8 @@ const en: Record<TranslationKey, string> = {
"jobType.metadata_refresh": "Refresh meta.",
"jobType.rebuildLabel": "Incremental indexing",
"jobType.rebuildDesc": "Scans new/modified files, analyzes them, and generates missing thumbnails.",
"jobType.rescanLabel": "Deep rescan",
"jobType.rescanDesc": "Re-walks all directories to discover files in newly supported formats (e.g. EPUB). Existing data is preserved — only new files are added.",
"jobType.full_rebuildLabel": "Full reindexing",
"jobType.full_rebuildDesc": "Deletes all existing data then performs a full scan, re-analysis, and thumbnail generation.",
"jobType.thumbnail_rebuildLabel": "Thumbnail rebuild",
@@ -497,6 +539,7 @@ const en: Record<TranslationKey, string> = {
"prowlarr.sending": "Sending...",
"prowlarr.sentSuccess": "Sent to qBittorrent",
"prowlarr.sentError": "Failed to send to qBittorrent",
"prowlarr.missingVol": "Vol. {{vol}} missing",
// Settings - qBittorrent
"settings.qbittorrent": "qBittorrent",
@@ -506,6 +549,33 @@ const en: Record<TranslationKey, string> = {
"settings.qbittorrentUsername": "Username",
"settings.qbittorrentPassword": "Password",
// Settings - Telegram Notifications
"settings.notifications": "Notifications",
"settings.telegram": "Telegram",
"settings.telegramDesc": "Receive Telegram notifications for scans, errors, and metadata linking.",
"settings.botToken": "Bot Token",
"settings.botTokenPlaceholder": "123456:ABC-DEF1234ghIkl-zyx57W2v1u123ew11",
"settings.chatId": "Chat ID",
"settings.chatIdPlaceholder": "123456789",
"settings.telegramEnabled": "Enable Telegram notifications",
"settings.telegramEvents": "Events",
"settings.eventCategoryScan": "Scans",
"settings.eventCategoryThumbnail": "Thumbnails",
"settings.eventCategoryConversion": "CBR → CBZ Conversion",
"settings.eventCategoryMetadata": "Metadata",
"settings.eventCompleted": "Completed",
"settings.eventFailed": "Failed",
"settings.eventCancelled": "Cancelled",
"settings.eventLinked": "Linked",
"settings.eventBatchCompleted": "Batch completed",
"settings.eventBatchFailed": "Batch failed",
"settings.eventRefreshCompleted": "Refresh completed",
"settings.eventRefreshFailed": "Refresh failed",
"settings.telegramHelp": "How to get the required information?",
"settings.telegramHelpBot": "Open Telegram, search for <b>@BotFather</b>, send <code>/newbot</code> and follow the instructions. Copy the token it gives you.",
"settings.telegramHelpChat": "Send a message to your bot, then open <code>https://api.telegram.org/bot&lt;TOKEN&gt;/getUpdates</code> in your browser. The <b>chat id</b> is in <code>message.chat.id</code>.",
"settings.telegramHelpGroup": "For a group: add the bot to the group, send a message, then check the same URL. Group IDs are negative (e.g. <code>-123456789</code>).",
// Settings - Language
"settings.language": "Language",
"settings.languageDesc": "Choose the interface language",


@@ -80,6 +80,12 @@ const fr = {
"dashboard.bookMetadata": "Métadonnées livres",
"dashboard.withSummary": "Avec résumé",
"dashboard.withIsbn": "Avec ISBN",
"dashboard.currentlyReading": "En cours de lecture",
"dashboard.recentlyRead": "Derniers livres lus",
"dashboard.readingActivity": "Activité de lecture (12 derniers mois)",
"dashboard.pageProgress": "p. {{current}} / {{total}}",
"dashboard.noCurrentlyReading": "Aucun livre en cours",
"dashboard.noRecentlyRead": "Aucun livre lu récemment",
// Books page
"books.title": "Livres",
@@ -98,6 +104,8 @@ const fr = {
"books.noResults": "Aucun livre trouvé pour \"{{query}}\"",
"books.noBooks": "Aucun livre disponible",
"books.coverOf": "Couverture de {{name}}",
"books.format": "Format",
"books.allFormats": "Tous les formats",
// Series page
"series.title": "Séries",
@@ -111,6 +119,20 @@ const fr = {
"series.missingCount": "{{count}} manquant{{plural}}",
"series.readCount": "{{read}}/{{total}} lu{{plural}}",
// Authors page
"nav.authors": "Auteurs",
"authors.title": "Auteurs",
"authors.searchPlaceholder": "Rechercher par nom d'auteur...",
"authors.bookCount": "{{count}} livre{{plural}}",
"authors.seriesCount": "{{count}} série{{plural}}",
"authors.noResults": "Aucun auteur trouvé correspondant à vos filtres",
"authors.noAuthors": "Aucun auteur disponible",
"authors.matchingQuery": "correspondant à",
"authors.sortName": "Nom",
"authors.sortBooks": "Nombre de livres",
"authors.booksBy": "Livres de {{name}}",
"authors.seriesBy": "Séries de {{name}}",
// Libraries page
"libraries.title": "Bibliothèques",
"libraries.addLibrary": "Ajouter une bibliothèque",
@@ -122,6 +144,11 @@ const fr = {
"libraries.manual": "Manuel",
"libraries.nextScan": "Prochain : {{time}}",
"libraries.imminent": "Imminent",
"libraries.nextMetadataRefresh": "Prochain rafraîchissement méta. : {{time}}",
"libraries.nextMetadataRefreshShort": "Méta. : {{time}}",
"libraries.scanLabel": "Scan : {{mode}}",
"libraries.watcherLabel": "Surveillance fichiers",
"libraries.metaRefreshLabel": "Rafraîch. méta. : {{mode}}",
"libraries.index": "Indexer",
"libraries.fullIndex": "Complet",
"libraries.batchMetadata": "Métadonnées en lot",
@@ -139,13 +166,22 @@ const fr = {
"librarySeries.noBooksInSeries": "Aucun livre dans cette série",
// Library actions
-"libraryActions.autoScan": "Scan auto",
-"libraryActions.fileWatch": "Surveillance fichiers ⚡",
-"libraryActions.schedule": "📅 Planification",
"libraryActions.settingsTitle": "Paramètres de la bibliothèque",
"libraryActions.sectionIndexation": "Indexation",
"libraryActions.sectionMetadata": "Métadonnées",
"libraryActions.autoScan": "Scan planifié",
"libraryActions.autoScanDesc": "Scanner automatiquement les fichiers nouveaux et modifiés",
"libraryActions.fileWatch": "Surveillance en temps réel",
"libraryActions.fileWatchDesc": "Détecter les changements de fichiers instantanément",
"libraryActions.schedule": "Fréquence",
"libraryActions.provider": "Fournisseur",
-"libraryActions.fallback": "Secours",
"libraryActions.providerDesc": "Source utilisée pour récupérer les métadonnées des séries",
"libraryActions.fallback": "Fournisseur de secours",
"libraryActions.fallbackDesc": "Utilisé quand le fournisseur principal ne retourne aucun résultat",
"libraryActions.default": "Par défaut",
"libraryActions.none": "Aucun",
"libraryActions.metadataRefreshSchedule": "Rafraîchissement auto",
"libraryActions.metadataRefreshDesc": "Re-télécharger périodiquement les métadonnées existantes",
"libraryActions.saving": "Enregistrement...",
// Library sub-page header
@@ -166,8 +202,9 @@ const fr = {
"jobs.startJob": "Lancer une tâche",
"jobs.startJobDescription": "Sélectionnez une bibliothèque (ou toutes) et choisissez l'action à effectuer.",
"jobs.allLibraries": "Toutes les bibliothèques",
-"jobs.rebuild": "Reconstruction",
-"jobs.fullRebuild": "Reconstruction complète",
"jobs.rebuild": "Mise à jour",
"jobs.rescan": "Rescan complet",
"jobs.fullRebuild": "Reconstruction complète (destructif)",
"jobs.generateThumbnails": "Générer les miniatures",
"jobs.regenerateThumbnails": "Regénérer les miniatures",
"jobs.batchMetadata": "Métadonnées en lot",
@@ -179,12 +216,14 @@ const fr = {
"jobs.groupMetadata": "Métadonnées",
"jobs.requiresLibrary": "Requiert une bibliothèque spécifique",
"jobs.rebuildShort": "Scanner les fichiers nouveaux et modifiés",
-"jobs.fullRebuildShort": "Tout supprimer et re-scanner depuis zéro",
"jobs.rescanShort": "Re-parcourir tous les dossiers pour découvrir de nouveaux formats",
"jobs.fullRebuildShort": "Tout supprimer et re-scanner depuis zéro. Les métadonnées, statuts de lecture et liens seront perdus.",
"jobs.generateThumbnailsShort": "Miniatures manquantes uniquement",
"jobs.regenerateThumbnailsShort": "Recréer toutes les miniatures",
"jobs.batchMetadataShort": "Lier automatiquement les séries non liées",
"jobs.refreshMetadataShort": "Mettre à jour les séries déjà liées",
"jobs.rebuildDescription": "Scan incrémental : détecte les fichiers ajoutés, modifiés ou supprimés depuis le dernier scan, les indexe et génère les miniatures manquantes. Les données existantes non modifiées sont conservées. C'est l'action la plus courante et la plus rapide.",
"jobs.rescanDescription": "Re-parcourt tous les dossiers même s'ils n'ont pas changé, pour découvrir les fichiers dans les formats nouvellement supportés (ex. EPUB). Les livres et métadonnées existants sont entièrement préservés — seuls les fichiers réellement nouveaux sont ajoutés. Plus lent qu'un rebuild mais sans risque pour vos données.",
"jobs.fullRebuildDescription": "Supprime toutes les données indexées (livres, séries, miniatures) puis effectue un scan complet depuis zéro. Utile si la base de données est désynchronisée ou corrompue. Opération longue et destructive : les statuts de lecture et les métadonnées manuelles seront perdus.",
"jobs.generateThumbnailsDescription": "Génère les miniatures uniquement pour les livres qui n'en ont pas encore. Les miniatures existantes ne sont pas touchées. Utile après un import ou si certaines miniatures sont manquantes.",
"jobs.regenerateThumbnailsDescription": "Regénère toutes les miniatures depuis zéro, en remplaçant les existantes. Utile si la qualité ou la taille des miniatures a changé dans la configuration, ou si des miniatures sont corrompues.",
@@ -291,6 +330,7 @@ const fr = {
// Job types
"jobType.rebuild": "Indexation",
"jobType.rescan": "Rescan complet",
"jobType.full_rebuild": "Indexation complète",
"jobType.thumbnail_rebuild": "Miniatures",
"jobType.thumbnail_regenerate": "Régén. miniatures",
@@ -299,6 +339,8 @@ const fr = {
"jobType.metadata_refresh": "Rafraîchir méta.",
"jobType.rebuildLabel": "Indexation incrémentale",
"jobType.rebuildDesc": "Scanne les fichiers nouveaux/modifiés, les analyse et génère les miniatures manquantes.",
"jobType.rescanLabel": "Rescan complet",
"jobType.rescanDesc": "Re-parcourt tous les dossiers pour découvrir les fichiers dans les formats nouvellement supportés (ex. EPUB). Les données existantes sont préservées — seuls les nouveaux fichiers sont ajoutés.",
"jobType.full_rebuildLabel": "Réindexation complète",
"jobType.full_rebuildDesc": "Supprime toutes les données existantes puis effectue un scan complet, une ré-analyse et la génération des miniatures.",
"jobType.thumbnail_rebuildLabel": "Reconstruction des miniatures",
@@ -495,6 +537,7 @@ const fr = {
"prowlarr.sending": "Envoi...",
"prowlarr.sentSuccess": "Envoyé à qBittorrent",
"prowlarr.sentError": "Échec de l'envoi à qBittorrent",
"prowlarr.missingVol": "T{{vol}} manquant",
// Settings - qBittorrent
"settings.qbittorrent": "qBittorrent",
@@ -504,6 +547,33 @@ const fr = {
"settings.qbittorrentUsername": "Nom d'utilisateur",
"settings.qbittorrentPassword": "Mot de passe",
// Settings - Telegram Notifications
"settings.notifications": "Notifications",
"settings.telegram": "Telegram",
"settings.telegramDesc": "Recevoir des notifications Telegram lors des scans, erreurs et liaisons de métadonnées.",
"settings.botToken": "Bot Token",
"settings.botTokenPlaceholder": "123456:ABC-DEF1234ghIkl-zyx57W2v1u123ew11",
"settings.chatId": "Chat ID",
"settings.chatIdPlaceholder": "123456789",
"settings.telegramEnabled": "Activer les notifications Telegram",
"settings.telegramEvents": "Événements",
"settings.eventCategoryScan": "Scans",
"settings.eventCategoryThumbnail": "Miniatures",
"settings.eventCategoryConversion": "Conversion CBR → CBZ",
"settings.eventCategoryMetadata": "Métadonnées",
"settings.eventCompleted": "Terminé",
"settings.eventFailed": "Échoué",
"settings.eventCancelled": "Annulé",
"settings.eventLinked": "Liée",
"settings.eventBatchCompleted": "Batch terminé",
"settings.eventBatchFailed": "Batch échoué",
"settings.eventRefreshCompleted": "Rafraîchissement terminé",
"settings.eventRefreshFailed": "Rafraîchissement échoué",
"settings.telegramHelp": "Comment obtenir les informations ?",
"settings.telegramHelpBot": "Ouvrez Telegram, recherchez <b>@BotFather</b>, envoyez <code>/newbot</code> et suivez les instructions. Copiez le token fourni.",
"settings.telegramHelpChat": "Envoyez un message à votre bot, puis ouvrez <code>https://api.telegram.org/bot&lt;TOKEN&gt;/getUpdates</code> dans votre navigateur. Le <b>chat id</b> apparaît dans <code>message.chat.id</code>.",
"settings.telegramHelpGroup": "Pour un groupe : ajoutez le bot au groupe, envoyez un message, puis consultez la même URL. Les IDs de groupe sont négatifs (ex: <code>-123456789</code>).",
// Settings - Language
"settings.language": "Langue",
"settings.languageDesc": "Choisir la langue de l'interface",


@@ -1,7 +1,11 @@
/** @type {import('next').NextConfig} */
const nextConfig = {
output: "standalone",
-typedRoutes: true
typedRoutes: true,
images: {
minimumCacheTTL: 86400,
unoptimized: true,
},
};
export default nextConfig;
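Applied to the file above, this hunk yields the following config (reconstructed from the diff; it assumes the file contains no other fields). `unoptimized: true` turns off Next.js image re-optimization, since the API already serves optimized WebP thumbnails, and `minimumCacheTTL: 86400` keeps any optimized entries cached for 24 hours:

```javascript
/** @type {import('next').NextConfig} */
const nextConfig = {
  output: "standalone",   // self-contained build for the Docker Compose deploy
  typedRoutes: true,
  images: {
    minimumCacheTTL: 86400, // cache optimized images for 24h (seconds)
    unoptimized: true,      // serve thumbnails as-is; API already emits WebP
  },
};

export default nextConfig;
```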


@@ -1,17 +1,18 @@
{
"name": "stripstream-backoffice",
-"version": "1.4.0",
"version": "1.23.0",
"lockfileVersion": 3,
"requires": true,
"packages": {
"": {
"name": "stripstream-backoffice",
-"version": "1.4.0",
"version": "1.23.0",
"dependencies": {
"next": "^16.1.6",
"next-themes": "^0.4.6",
"react": "19.0.0",
"react-dom": "19.0.0",
"recharts": "^3.8.0",
"sanitize-html": "^2.17.1"
},
"devDependencies": {
@@ -759,6 +760,54 @@
"node": ">= 10"
}
},
"node_modules/@reduxjs/toolkit": {
"version": "2.11.2",
"resolved": "https://registry.npmjs.org/@reduxjs/toolkit/-/toolkit-2.11.2.tgz",
"integrity": "sha512-Kd6kAHTA6/nUpp8mySPqj3en3dm0tdMIgbttnQ1xFMVpufoj+ADi8pXLBsd4xzTRHQa7t/Jv8W5UnCuW4kuWMQ==",
"license": "MIT",
"dependencies": {
"@standard-schema/spec": "^1.0.0",
"@standard-schema/utils": "^0.3.0",
"immer": "^11.0.0",
"redux": "^5.0.1",
"redux-thunk": "^3.1.0",
"reselect": "^5.1.0"
},
"peerDependencies": {
"react": "^16.9.0 || ^17.0.0 || ^18 || ^19",
"react-redux": "^7.2.1 || ^8.1.3 || ^9.0.0"
},
"peerDependenciesMeta": {
"react": {
"optional": true
},
"react-redux": {
"optional": true
}
}
},
"node_modules/@reduxjs/toolkit/node_modules/immer": {
"version": "11.1.4",
"resolved": "https://registry.npmjs.org/immer/-/immer-11.1.4.tgz",
"integrity": "sha512-XREFCPo6ksxVzP4E0ekD5aMdf8WMwmdNaz6vuvxgI40UaEiu6q3p8X52aU6GdyvLY3XXX/8R7JOTXStz/nBbRw==",
"license": "MIT",
"funding": {
"type": "opencollective",
"url": "https://opencollective.com/immer"
}
},
"node_modules/@standard-schema/spec": {
"version": "1.1.0",
"resolved": "https://registry.npmjs.org/@standard-schema/spec/-/spec-1.1.0.tgz",
"integrity": "sha512-l2aFy5jALhniG5HgqrD6jXLi/rUWrKvqN/qJx6yoJsgKhblVd+iqqU4RCXavm/jPityDo5TCvKMnpjKnOriy0w==",
"license": "MIT"
},
"node_modules/@standard-schema/utils": {
"version": "0.3.0",
"resolved": "https://registry.npmjs.org/@standard-schema/utils/-/utils-0.3.0.tgz",
"integrity": "sha512-e7Mew686owMaPJVNNLs55PUvgz371nKgwsc4vxE49zsODpJEnxgxRo2y/OKrqueavXgZNMDVj3DdHFlaSAeU8g==",
"license": "MIT"
},
"node_modules/@swc/helpers": {
"version": "0.5.15",
"resolved": "https://registry.npmjs.org/@swc/helpers/-/helpers-0.5.15.tgz",
@@ -1051,6 +1100,69 @@
"tailwindcss": "4.2.1"
}
},
"node_modules/@types/d3-array": {
"version": "3.2.2",
"resolved": "https://registry.npmjs.org/@types/d3-array/-/d3-array-3.2.2.tgz",
"integrity": "sha512-hOLWVbm7uRza0BYXpIIW5pxfrKe0W+D5lrFiAEYR+pb6w3N2SwSMaJbXdUfSEv+dT4MfHBLtn5js0LAWaO6otw==",
"license": "MIT"
},
"node_modules/@types/d3-color": {
"version": "3.1.3",
"resolved": "https://registry.npmjs.org/@types/d3-color/-/d3-color-3.1.3.tgz",
"integrity": "sha512-iO90scth9WAbmgv7ogoq57O9YpKmFBbmoEoCHDB2xMBY0+/KVrqAaCDyCE16dUspeOvIxFFRI+0sEtqDqy2b4A==",
"license": "MIT"
},
"node_modules/@types/d3-ease": {
"version": "3.0.2",
"resolved": "https://registry.npmjs.org/@types/d3-ease/-/d3-ease-3.0.2.tgz",
"integrity": "sha512-NcV1JjO5oDzoK26oMzbILE6HW7uVXOHLQvHshBUW4UMdZGfiY6v5BeQwh9a9tCzv+CeefZQHJt5SRgK154RtiA==",
"license": "MIT"
},
"node_modules/@types/d3-interpolate": {
"version": "3.0.4",
"resolved": "https://registry.npmjs.org/@types/d3-interpolate/-/d3-interpolate-3.0.4.tgz",
"integrity": "sha512-mgLPETlrpVV1YRJIglr4Ez47g7Yxjl1lj7YKsiMCb27VJH9W8NVM6Bb9d8kkpG/uAQS5AmbA48q2IAolKKo1MA==",
"license": "MIT",
"dependencies": {
"@types/d3-color": "*"
}
},
"node_modules/@types/d3-path": {
"version": "3.1.1",
"resolved": "https://registry.npmjs.org/@types/d3-path/-/d3-path-3.1.1.tgz",
"integrity": "sha512-VMZBYyQvbGmWyWVea0EHs/BwLgxc+MKi1zLDCONksozI4YJMcTt8ZEuIR4Sb1MMTE8MMW49v0IwI5+b7RmfWlg==",
"license": "MIT"
},
"node_modules/@types/d3-scale": {
"version": "4.0.9",
"resolved": "https://registry.npmjs.org/@types/d3-scale/-/d3-scale-4.0.9.tgz",
"integrity": "sha512-dLmtwB8zkAeO/juAMfnV+sItKjlsw2lKdZVVy6LRr0cBmegxSABiLEpGVmSJJ8O08i4+sGR6qQtb6WtuwJdvVw==",
"license": "MIT",
"dependencies": {
"@types/d3-time": "*"
}
},
"node_modules/@types/d3-shape": {
"version": "3.1.8",
"resolved": "https://registry.npmjs.org/@types/d3-shape/-/d3-shape-3.1.8.tgz",
"integrity": "sha512-lae0iWfcDeR7qt7rA88BNiqdvPS5pFVPpo5OfjElwNaT2yyekbM0C9vK+yqBqEmHr6lDkRnYNoTBYlAgJa7a4w==",
"license": "MIT",
"dependencies": {
"@types/d3-path": "*"
}
},
"node_modules/@types/d3-time": {
"version": "3.0.4",
"resolved": "https://registry.npmjs.org/@types/d3-time/-/d3-time-3.0.4.tgz",
"integrity": "sha512-yuzZug1nkAAaBlBBikKZTgzCeA+k1uy4ZFwWANOfKw5z5LRhV0gNA7gNkKm7HoK+HRN0wX3EkxGk0fpbWhmB7g==",
"license": "MIT"
},
"node_modules/@types/d3-timer": {
"version": "3.0.2",
"resolved": "https://registry.npmjs.org/@types/d3-timer/-/d3-timer-3.0.2.tgz",
"integrity": "sha512-Ps3T8E8dZDam6fUyNiMkekK3XUsaUEik+idO9/YjPtfj2qruF8tFBXS7XhtE4iIXBLxhmLjP3SXpLhVf21I9Lw==",
"license": "MIT"
},
"node_modules/@types/node": {
"version": "22.13.14",
"resolved": "https://registry.npmjs.org/@types/node/-/node-22.13.14.tgz",
@@ -1065,7 +1177,7 @@
"version": "19.0.12",
"resolved": "https://registry.npmjs.org/@types/react/-/react-19.0.12.tgz",
"integrity": "sha512-V6Ar115dBDrjbtXSrS+/Oruobc+qVbbUxDFC1RSbRqLt5SYvxxyIDrSC85RWml54g+jfNeEMZhEj7wW07ONQhA==",
-"dev": true,
"devOptional": true,
"license": "MIT",
"dependencies": {
"csstype": "^3.0.2"
@@ -1124,6 +1236,12 @@
"entities": "^7.0.1"
}
},
"node_modules/@types/use-sync-external-store": {
"version": "0.0.6",
"resolved": "https://registry.npmjs.org/@types/use-sync-external-store/-/use-sync-external-store-0.0.6.tgz",
"integrity": "sha512-zFDAD+tlpf2r4asuHEj0XH6pY6i0g5NeAHPn+15wk3BV6JA69eERFXC1gyGThDkVa1zCyKr5jox1+2LbV/AMLg==",
"license": "MIT"
},
"node_modules/autoprefixer": {
"version": "10.4.27",
"resolved": "https://registry.npmjs.org/autoprefixer/-/autoprefixer-10.4.27.tgz",
@@ -1233,11 +1351,147 @@
"integrity": "sha512-IV3Ou0jSMzZrd3pZ48nLkT9DA7Ag1pnPzaiQhpW7c3RbcqqzvzzVu+L8gfqMp/8IM2MQtSiqaCxrrcfu8I8rMA==",
"license": "MIT"
},
"node_modules/clsx": {
"version": "2.1.1",
"resolved": "https://registry.npmjs.org/clsx/-/clsx-2.1.1.tgz",
"integrity": "sha512-eYm0QWBtUrBWZWG0d386OGAw16Z995PiOVo2B7bjWSbHedGl5e0ZWaq65kOGgUSNesEIDkB9ISbTg/JK9dhCZA==",
"license": "MIT",
"engines": {
"node": ">=6"
}
},
"node_modules/csstype": {
"version": "3.2.3",
"resolved": "https://registry.npmjs.org/csstype/-/csstype-3.2.3.tgz",
"integrity": "sha512-z1HGKcYy2xA8AGQfwrn0PAy+PB7X/GSj3UVJW9qKyn43xWa+gl5nXmU4qqLMRzWVLFC8KusUX8T/0kCiOYpAIQ==",
-"dev": true,
+"devOptional": true,
"license": "MIT"
},
"node_modules/d3-array": {
"version": "3.2.4",
"resolved": "https://registry.npmjs.org/d3-array/-/d3-array-3.2.4.tgz",
"integrity": "sha512-tdQAmyA18i4J7wprpYq8ClcxZy3SC31QMeByyCFyRt7BVHdREQZ5lpzoe5mFEYZUWe+oq8HBvk9JjpibyEV4Jg==",
"license": "ISC",
"dependencies": {
"internmap": "1 - 2"
},
"engines": {
"node": ">=12"
}
},
"node_modules/d3-color": {
"version": "3.1.0",
"resolved": "https://registry.npmjs.org/d3-color/-/d3-color-3.1.0.tgz",
"integrity": "sha512-zg/chbXyeBtMQ1LbD/WSoW2DpC3I0mpmPdW+ynRTj/x2DAWYrIY7qeZIHidozwV24m4iavr15lNwIwLxRmOxhA==",
"license": "ISC",
"engines": {
"node": ">=12"
}
},
"node_modules/d3-ease": {
"version": "3.0.1",
"resolved": "https://registry.npmjs.org/d3-ease/-/d3-ease-3.0.1.tgz",
"integrity": "sha512-wR/XK3D3XcLIZwpbvQwQ5fK+8Ykds1ip7A2Txe0yxncXSdq1L9skcG7blcedkOX+ZcgxGAmLX1FrRGbADwzi0w==",
"license": "BSD-3-Clause",
"engines": {
"node": ">=12"
}
},
"node_modules/d3-format": {
"version": "3.1.2",
"resolved": "https://registry.npmjs.org/d3-format/-/d3-format-3.1.2.tgz",
"integrity": "sha512-AJDdYOdnyRDV5b6ArilzCPPwc1ejkHcoyFarqlPqT7zRYjhavcT3uSrqcMvsgh2CgoPbK3RCwyHaVyxYcP2Arg==",
"license": "ISC",
"engines": {
"node": ">=12"
}
},
"node_modules/d3-interpolate": {
"version": "3.0.1",
"resolved": "https://registry.npmjs.org/d3-interpolate/-/d3-interpolate-3.0.1.tgz",
"integrity": "sha512-3bYs1rOD33uo8aqJfKP3JWPAibgw8Zm2+L9vBKEHJ2Rg+viTR7o5Mmv5mZcieN+FRYaAOWX5SJATX6k1PWz72g==",
"license": "ISC",
"dependencies": {
"d3-color": "1 - 3"
},
"engines": {
"node": ">=12"
}
},
"node_modules/d3-path": {
"version": "3.1.0",
"resolved": "https://registry.npmjs.org/d3-path/-/d3-path-3.1.0.tgz",
"integrity": "sha512-p3KP5HCf/bvjBSSKuXid6Zqijx7wIfNW+J/maPs+iwR35at5JCbLUT0LzF1cnjbCHWhqzQTIN2Jpe8pRebIEFQ==",
"license": "ISC",
"engines": {
"node": ">=12"
}
},
"node_modules/d3-scale": {
"version": "4.0.2",
"resolved": "https://registry.npmjs.org/d3-scale/-/d3-scale-4.0.2.tgz",
"integrity": "sha512-GZW464g1SH7ag3Y7hXjf8RoUuAFIqklOAq3MRl4OaWabTFJY9PN/E1YklhXLh+OQ3fM9yS2nOkCoS+WLZ6kvxQ==",
"license": "ISC",
"dependencies": {
"d3-array": "2.10.0 - 3",
"d3-format": "1 - 3",
"d3-interpolate": "1.2.0 - 3",
"d3-time": "2.1.1 - 3",
"d3-time-format": "2 - 4"
},
"engines": {
"node": ">=12"
}
},
"node_modules/d3-shape": {
"version": "3.2.0",
"resolved": "https://registry.npmjs.org/d3-shape/-/d3-shape-3.2.0.tgz",
"integrity": "sha512-SaLBuwGm3MOViRq2ABk3eLoxwZELpH6zhl3FbAoJ7Vm1gofKx6El1Ib5z23NUEhF9AsGl7y+dzLe5Cw2AArGTA==",
"license": "ISC",
"dependencies": {
"d3-path": "^3.1.0"
},
"engines": {
"node": ">=12"
}
},
"node_modules/d3-time": {
"version": "3.1.0",
"resolved": "https://registry.npmjs.org/d3-time/-/d3-time-3.1.0.tgz",
"integrity": "sha512-VqKjzBLejbSMT4IgbmVgDjpkYrNWUYJnbCGo874u7MMKIWsILRX+OpX/gTk8MqjpT1A/c6HY2dCA77ZN0lkQ2Q==",
"license": "ISC",
"dependencies": {
"d3-array": "2 - 3"
},
"engines": {
"node": ">=12"
}
},
"node_modules/d3-time-format": {
"version": "4.1.0",
"resolved": "https://registry.npmjs.org/d3-time-format/-/d3-time-format-4.1.0.tgz",
"integrity": "sha512-dJxPBlzC7NugB2PDLwo9Q8JiTR3M3e4/XANkreKSUxF8vvXKqm1Yfq4Q5dl8budlunRVlUUaDUgFt7eA8D6NLg==",
"license": "ISC",
"dependencies": {
"d3-time": "1 - 3"
},
"engines": {
"node": ">=12"
}
},
"node_modules/d3-timer": {
"version": "3.0.1",
"resolved": "https://registry.npmjs.org/d3-timer/-/d3-timer-3.0.1.tgz",
"integrity": "sha512-ndfJ/JxxMd3nw31uyKoY2naivF+r29V+Lc0svZxe1JvvIRmi8hUsrMvdOwgS1o6uBHmiz91geQ0ylPP0aj1VUA==",
"license": "ISC",
"engines": {
"node": ">=12"
}
},
"node_modules/decimal.js-light": {
"version": "2.5.1",
"resolved": "https://registry.npmjs.org/decimal.js-light/-/decimal.js-light-2.5.1.tgz",
"integrity": "sha512-qIMFpTMZmny+MMIitAB6D7iVPEorVw6YQRWkvarTkT4tBeSLLiHzcwj6q0MmYSFCiVpiqPJTJEYIrpcPzVEIvg==",
"license": "MIT"
},
"node_modules/deepmerge": {
@@ -1347,6 +1601,16 @@
"url": "https://github.com/fb55/entities?sponsor=1"
}
},
"node_modules/es-toolkit": {
"version": "1.45.1",
"resolved": "https://registry.npmjs.org/es-toolkit/-/es-toolkit-1.45.1.tgz",
"integrity": "sha512-/jhoOj/Fx+A+IIyDNOvO3TItGmlMKhtX8ISAHKE90c4b/k1tqaqEZ+uUqfpU8DMnW5cgNJv606zS55jGvza0Xw==",
"license": "MIT",
"workspaces": [
"docs",
"benchmarks"
]
},
"node_modules/escalade": {
"version": "3.2.0",
"resolved": "https://registry.npmjs.org/escalade/-/escalade-3.2.0.tgz",
@@ -1369,6 +1633,12 @@
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/eventemitter3": {
"version": "5.0.4",
"resolved": "https://registry.npmjs.org/eventemitter3/-/eventemitter3-5.0.4.tgz",
"integrity": "sha512-mlsTRyGaPBjPedk6Bvw+aqbsXDtoAyAzm5MO7JgU+yVRyMQ5O8bD4Kcci7BS85f93veegeCPkL8R4GLClnjLFw==",
"license": "MIT"
},
"node_modules/fraction.js": {
"version": "5.3.4",
"resolved": "https://registry.npmjs.org/fraction.js/-/fraction.js-5.3.4.tgz",
@@ -1409,6 +1679,25 @@
"entities": "^4.4.0"
}
},
"node_modules/immer": {
"version": "10.2.0",
"resolved": "https://registry.npmjs.org/immer/-/immer-10.2.0.tgz",
"integrity": "sha512-d/+XTN3zfODyjr89gM3mPq1WNX2B8pYsu7eORitdwyA2sBubnTl3laYlBk4sXY5FUa5qTZGBDPJICVbvqzjlbw==",
"license": "MIT",
"funding": {
"type": "opencollective",
"url": "https://opencollective.com/immer"
}
},
"node_modules/internmap": {
"version": "2.0.3",
"resolved": "https://registry.npmjs.org/internmap/-/internmap-2.0.3.tgz",
"integrity": "sha512-5Hh7Y1wQbvY5ooGgPbDaL5iYLAPzMTUrjMulskHLH6wnv/A+1q5rgEaiuqEjB+oxGXIVZs1FF+R/KPN3ZSQYYg==",
"license": "ISC",
"engines": {
"node": ">=12"
}
},
"node_modules/is-plain-object": {
"version": "5.0.0",
"resolved": "https://registry.npmjs.org/is-plain-object/-/is-plain-object-5.0.0.tgz",
@@ -1895,6 +2184,87 @@
"react": "^19.0.0"
}
},
"node_modules/react-is": {
"version": "19.2.4",
"resolved": "https://registry.npmjs.org/react-is/-/react-is-19.2.4.tgz",
"integrity": "sha512-W+EWGn2v0ApPKgKKCy/7s7WHXkboGcsrXE+2joLyVxkbyVQfO3MUEaUQDHoSmb8TFFrSKYa9mw64WZHNHSDzYA==",
"license": "MIT",
"peer": true
},
"node_modules/react-redux": {
"version": "9.2.0",
"resolved": "https://registry.npmjs.org/react-redux/-/react-redux-9.2.0.tgz",
"integrity": "sha512-ROY9fvHhwOD9ySfrF0wmvu//bKCQ6AeZZq1nJNtbDC+kk5DuSuNX/n6YWYF/SYy7bSba4D4FSz8DJeKY/S/r+g==",
"license": "MIT",
"dependencies": {
"@types/use-sync-external-store": "^0.0.6",
"use-sync-external-store": "^1.4.0"
},
"peerDependencies": {
"@types/react": "^18.2.25 || ^19",
"react": "^18.0 || ^19",
"redux": "^5.0.0"
},
"peerDependenciesMeta": {
"@types/react": {
"optional": true
},
"redux": {
"optional": true
}
}
},
"node_modules/recharts": {
"version": "3.8.0",
"resolved": "https://registry.npmjs.org/recharts/-/recharts-3.8.0.tgz",
"integrity": "sha512-Z/m38DX3L73ExO4Tpc9/iZWHmHnlzWG4njQbxsF5aSjwqmHNDDIm0rdEBArkwsBvR8U6EirlEHiQNYWCVh9sGQ==",
"license": "MIT",
"workspaces": [
"www"
],
"dependencies": {
"@reduxjs/toolkit": "^1.9.0 || 2.x.x",
"clsx": "^2.1.1",
"decimal.js-light": "^2.5.1",
"es-toolkit": "^1.39.3",
"eventemitter3": "^5.0.1",
"immer": "^10.1.1",
"react-redux": "8.x.x || 9.x.x",
"reselect": "5.1.1",
"tiny-invariant": "^1.3.3",
"use-sync-external-store": "^1.2.2",
"victory-vendor": "^37.0.2"
},
"engines": {
"node": ">=18"
},
"peerDependencies": {
"react": "^16.8.0 || ^17.0.0 || ^18.0.0 || ^19.0.0",
"react-dom": "^16.0.0 || ^17.0.0 || ^18.0.0 || ^19.0.0",
"react-is": "^16.8.0 || ^17.0.0 || ^18.0.0 || ^19.0.0"
}
},
"node_modules/redux": {
"version": "5.0.1",
"resolved": "https://registry.npmjs.org/redux/-/redux-5.0.1.tgz",
"integrity": "sha512-M9/ELqF6fy8FwmkpnF0S3YKOqMyoWJ4+CS5Efg2ct3oY9daQvd/Pc71FpGZsVsbl3Cpb+IIcjBDUnnyBdQbq4w==",
"license": "MIT"
},
"node_modules/redux-thunk": {
"version": "3.1.0",
"resolved": "https://registry.npmjs.org/redux-thunk/-/redux-thunk-3.1.0.tgz",
"integrity": "sha512-NW2r5T6ksUKXCabzhL9z+h206HQw/NJkcLm1GPImRQ8IzfXwRGqjVhKJGauHirT0DAuyy6hjdnMZaRoAcy0Klw==",
"license": "MIT",
"peerDependencies": {
"redux": "^5.0.0"
}
},
"node_modules/reselect": {
"version": "5.1.1",
"resolved": "https://registry.npmjs.org/reselect/-/reselect-5.1.1.tgz",
"integrity": "sha512-K/BG6eIky/SBpzfHZv/dd+9JBFiS4SWV7FIujVyJRux6e45+73RaUHXLmIR1f7WOMaQ0U1km6qwklRQxpJJY0w==",
"license": "MIT"
},
"node_modules/sanitize-html": {
"version": "2.17.1",
"resolved": "https://registry.npmjs.org/sanitize-html/-/sanitize-html-2.17.1.tgz",
@@ -2026,6 +2396,12 @@
"url": "https://opencollective.com/webpack"
}
},
"node_modules/tiny-invariant": {
"version": "1.3.3",
"resolved": "https://registry.npmjs.org/tiny-invariant/-/tiny-invariant-1.3.3.tgz",
"integrity": "sha512-+FbBPE1o9QAYvviau/qC5SE3caw21q3xkvWKBtja5vgqOWIHHJ3ioaq1VPfn/Szqctz2bU/oYeKd9/z5BL+PVg==",
"license": "MIT"
},
"node_modules/tslib": {
"version": "2.8.1",
"resolved": "https://registry.npmjs.org/tslib/-/tslib-2.8.1.tgz",
@@ -2083,6 +2459,37 @@
"peerDependencies": {
"browserslist": ">= 4.21.0"
}
},
"node_modules/use-sync-external-store": {
"version": "1.6.0",
"resolved": "https://registry.npmjs.org/use-sync-external-store/-/use-sync-external-store-1.6.0.tgz",
"integrity": "sha512-Pp6GSwGP/NrPIrxVFAIkOQeyw8lFenOHijQWkUTrDvrF4ALqylP2C/KCkeS9dpUM3KvYRQhna5vt7IL95+ZQ9w==",
"license": "MIT",
"peerDependencies": {
"react": "^16.8.0 || ^17.0.0 || ^18.0.0 || ^19.0.0"
}
},
"node_modules/victory-vendor": {
"version": "37.3.6",
"resolved": "https://registry.npmjs.org/victory-vendor/-/victory-vendor-37.3.6.tgz",
"integrity": "sha512-SbPDPdDBYp+5MJHhBCAyI7wKM3d5ivekigc2Dk2s7pgbZ9wIgIBYGVw4zGHBml/qTFbexrofXW6Gu4noGxrOwQ==",
"license": "MIT AND ISC",
"dependencies": {
"@types/d3-array": "^3.0.3",
"@types/d3-ease": "^3.0.0",
"@types/d3-interpolate": "^3.0.1",
"@types/d3-scale": "^4.0.2",
"@types/d3-shape": "^3.1.0",
"@types/d3-time": "^3.0.0",
"@types/d3-timer": "^3.0.0",
"d3-array": "^3.1.6",
"d3-ease": "^3.0.1",
"d3-interpolate": "^3.0.1",
"d3-scale": "^4.0.2",
"d3-shape": "^3.1.0",
"d3-time": "^3.0.0",
"d3-timer": "^3.0.1"
}
}
}
}

View File

@@ -1,6 +1,6 @@
{
"name": "stripstream-backoffice",
-"version": "1.11.1",
+"version": "1.25.0",
"private": true,
"scripts": {
"dev": "next dev -p 7082",
@@ -12,6 +12,7 @@
"next-themes": "^0.4.6",
"react": "19.0.0",
"react-dom": "19.0.0",
"recharts": "^3.8.0",
"sanitize-html": "^2.17.1"
},
"devDependencies": {

View File

@@ -14,6 +14,7 @@ futures = "0.3"
image.workspace = true
jpeg-decoder.workspace = true
num_cpus.workspace = true
notifications = { path = "../../crates/notifications" }
parsers = { path = "../../crates/parsers" }
reqwest.workspace = true
serde.workspace = true

View File

@@ -6,13 +6,15 @@ COPY Cargo.toml ./
COPY apps/api/Cargo.toml apps/api/Cargo.toml
COPY apps/indexer/Cargo.toml apps/indexer/Cargo.toml
COPY crates/core/Cargo.toml crates/core/Cargo.toml
COPY crates/notifications/Cargo.toml crates/notifications/Cargo.toml
COPY crates/parsers/Cargo.toml crates/parsers/Cargo.toml
-RUN mkdir -p apps/api/src apps/indexer/src crates/core/src crates/parsers/src && \
+RUN mkdir -p apps/api/src apps/indexer/src crates/core/src crates/notifications/src crates/parsers/src && \
echo "fn main() {}" > apps/api/src/main.rs && \
echo "fn main() {}" > apps/indexer/src/main.rs && \
echo "" > apps/indexer/src/lib.rs && \
echo "" > crates/core/src/lib.rs && \
echo "" > crates/notifications/src/lib.rs && \
echo "" > crates/parsers/src/lib.rs
# Build dependencies only (cached as long as Cargo.toml files don't change)
@@ -25,12 +27,13 @@ RUN --mount=type=cache,target=/usr/local/cargo/registry \
COPY apps/api/src apps/api/src
COPY apps/indexer/src apps/indexer/src
COPY crates/core/src crates/core/src
COPY crates/notifications/src crates/notifications/src
COPY crates/parsers/src crates/parsers/src
RUN --mount=type=cache,target=/usr/local/cargo/registry \
--mount=type=cache,target=/usr/local/cargo/git \
--mount=type=cache,target=/app/target \
-touch apps/indexer/src/main.rs crates/core/src/lib.rs crates/parsers/src/lib.rs && \
+touch apps/indexer/src/main.rs crates/core/src/lib.rs crates/notifications/src/lib.rs crates/parsers/src/lib.rs && \
cargo build --release -p indexer && \
cp /app/target/release/indexer /usr/local/bin/indexer

View File

@@ -290,6 +290,7 @@ fn book_format_from_str(s: &str) -> Option<BookFormat> {
"cbz" => Some(BookFormat::Cbz),
"cbr" => Some(BookFormat::Cbr),
"pdf" => Some(BookFormat::Pdf),
"epub" => Some(BookFormat::Epub),
_ => None,
}
}

View File

@@ -43,6 +43,7 @@ const API_ONLY_JOB_TYPES: &[&str] = &["metadata_batch", "metadata_refresh"];
const EXCLUSIVE_JOB_TYPES: &[&str] = &[
"rebuild",
"full_rebuild",
"rescan",
"scan",
"thumbnail_rebuild",
"thumbnail_regenerate",
@@ -211,11 +212,29 @@ pub async fn process_job(
}
let is_full_rebuild = job_type == "full_rebuild";
let is_rescan = job_type == "rescan";
info!(
-"[JOB] {} type={} full_rebuild={}",
-job_id, job_type, is_full_rebuild
+"[JOB] {} type={} full_rebuild={} rescan={}",
+job_id, job_type, is_full_rebuild, is_rescan
);
// Rescan: clear directory mtimes to force re-walking all directories,
// but keep existing data intact (unlike full_rebuild)
if is_rescan {
if let Some(library_id) = target_library_id {
let _ = sqlx::query("DELETE FROM directory_mtimes WHERE library_id = $1")
.bind(library_id)
.execute(&state.pool)
.await;
info!("[JOB] Rescan: cleared directory mtimes for library {}", library_id);
} else {
let _ = sqlx::query("DELETE FROM directory_mtimes")
.execute(&state.pool)
.await;
info!("[JOB] Rescan: cleared all directory mtimes");
}
}
// Full rebuild: delete existing data first
if is_full_rebuild {
info!("[JOB] Full rebuild: deleting existing data");
@@ -258,7 +277,7 @@ pub async fn process_job(
// For full rebuilds, the DB is already cleared, so we must walk the filesystem.
let library_ids: Vec<uuid::Uuid> = libraries.iter().map(|r| r.get("id")).collect();
-let total_files: usize = if !is_full_rebuild {
+let total_files: usize = if !is_full_rebuild && !is_rescan {
let count: i64 = sqlx::query_scalar(
"SELECT COUNT(*) FROM book_files bf JOIN books b ON b.id = bf.book_id WHERE b.library_id = ANY($1)"
)
@@ -309,6 +328,7 @@ pub async fn process_job(
removed_files: 0,
errors: 0,
warnings: 0,
new_series: 0,
};
let mut total_processed_count = 0i32;

View File

@@ -14,6 +14,7 @@ use crate::{
utils,
AppState,
};
use std::collections::HashSet;
#[derive(Serialize)]
pub struct JobStats {
@@ -22,6 +23,7 @@ pub struct JobStats {
pub removed_files: usize,
pub errors: usize,
pub warnings: usize,
pub new_series: usize,
}
const BATCH_SIZE: usize = 100;
@@ -106,6 +108,18 @@ pub async fn scan_library_discovery(
HashMap::new()
};
// Track existing series names for new_series counting
let existing_series: HashSet<String> = sqlx::query_scalar(
"SELECT DISTINCT COALESCE(NULLIF(series, ''), 'unclassified') FROM books WHERE library_id = $1",
)
.bind(library_id)
.fetch_all(&state.pool)
.await
.unwrap_or_default()
.into_iter()
.collect();
let mut seen_new_series: HashSet<String> = HashSet::new();
let mut seen: HashMap<String, bool> = HashMap::new();
let mut library_processed_count = 0i32;
let mut last_progress_update = std::time::Instant::now();
@@ -382,6 +396,12 @@ pub async fn scan_library_discovery(
let book_id = Uuid::new_v4();
let file_id = Uuid::new_v4();
// Track new series
let series_key = parsed.series.as_deref().unwrap_or("unclassified").to_string();
if !existing_series.contains(&series_key) && seen_new_series.insert(series_key) {
stats.new_series += 1;
}
books_to_insert.push(BookInsert {
book_id,
library_id,

View File

@@ -65,3 +65,65 @@ pub async fn check_and_schedule_auto_scans(pool: &PgPool) -> Result<()> {
Ok(())
}
pub async fn check_and_schedule_metadata_refreshes(pool: &PgPool) -> Result<()> {
let libraries = sqlx::query(
r#"
SELECT id, metadata_refresh_mode
FROM libraries
WHERE metadata_refresh_mode != 'manual'
AND (
next_metadata_refresh_at IS NULL
OR next_metadata_refresh_at <= NOW()
)
AND NOT EXISTS (
SELECT 1 FROM index_jobs
WHERE library_id = libraries.id
AND type = 'metadata_refresh'
AND status IN ('pending', 'running')
)
AND EXISTS (
SELECT 1 FROM external_metadata_links
WHERE library_id = libraries.id
AND status = 'approved'
)
"#
)
.fetch_all(pool)
.await?;
for row in libraries {
let library_id: Uuid = row.get("id");
let refresh_mode: String = row.get("metadata_refresh_mode");
info!("[SCHEDULER] Auto-refreshing metadata for library {} (mode: {})", library_id, refresh_mode);
let job_id = Uuid::new_v4();
sqlx::query(
"INSERT INTO index_jobs (id, library_id, type, status) VALUES ($1, $2, 'metadata_refresh', 'pending')"
)
.bind(job_id)
.bind(library_id)
.execute(pool)
.await?;
let interval_minutes = match refresh_mode.as_str() {
"hourly" => 60,
"daily" => 1440,
"weekly" => 10080,
_ => 1440,
};
sqlx::query(
"UPDATE libraries SET last_metadata_refresh_at = NOW(), next_metadata_refresh_at = NOW() + INTERVAL '1 minute' * $2 WHERE id = $1"
)
.bind(library_id)
.bind(interval_minutes)
.execute(pool)
.await?;
info!("[SCHEDULER] Created metadata_refresh job {} for library {}", job_id, library_id);
}
Ok(())
}

View File

@@ -40,7 +40,7 @@ pub fn compute_fingerprint(path: &Path, size: u64, mtime: &DateTime<Utc>) -> Res
pub fn kind_from_format(format: BookFormat) -> &'static str {
match format {
-BookFormat::Pdf => "ebook",
+BookFormat::Pdf | BookFormat::Epub => "ebook",
BookFormat::Cbz | BookFormat::Cbr => "comic",
}
}

View File

@@ -1,5 +1,7 @@
use std::time::Duration;
use sqlx::Row;
use tracing::{error, info, trace};
use uuid::Uuid;
use crate::{job, scheduler, watcher, AppState};
pub async fn run_worker(state: AppState, interval_seconds: u64) {
@@ -27,25 +29,190 @@ pub async fn run_worker(state: AppState, interval_seconds: u64) {
if let Err(err) = scheduler::check_and_schedule_auto_scans(&scheduler_state.pool).await {
error!("[SCHEDULER] Error: {}", err);
}
if let Err(err) = scheduler::check_and_schedule_metadata_refreshes(&scheduler_state.pool).await {
error!("[SCHEDULER] Metadata refresh error: {}", err);
}
tokio::time::sleep(scheduler_wait).await;
}
});
struct JobInfo {
job_type: String,
library_name: Option<String>,
book_title: Option<String>,
thumbnail_path: Option<String>,
}
async fn load_job_info(
pool: &sqlx::PgPool,
job_id: Uuid,
library_id: Option<Uuid>,
) -> JobInfo {
let row = sqlx::query("SELECT type, book_id FROM index_jobs WHERE id = $1")
.bind(job_id)
.fetch_optional(pool)
.await
.ok()
.flatten();
let (job_type, book_id): (String, Option<Uuid>) = match row {
Some(r) => (r.get("type"), r.get("book_id")),
None => ("unknown".to_string(), None),
};
let library_name: Option<String> = if let Some(lib_id) = library_id {
sqlx::query_scalar("SELECT name FROM libraries WHERE id = $1")
.bind(lib_id)
.fetch_optional(pool)
.await
.ok()
.flatten()
} else {
None
};
let (book_title, thumbnail_path): (Option<String>, Option<String>) = if let Some(bid) = book_id {
let row = sqlx::query("SELECT title, thumbnail_path FROM books WHERE id = $1")
.bind(bid)
.fetch_optional(pool)
.await
.ok()
.flatten();
match row {
Some(r) => (r.get("title"), r.get("thumbnail_path")),
None => (None, None),
}
} else {
(None, None)
};
JobInfo { job_type, library_name, book_title, thumbnail_path }
}
async fn load_scan_stats(pool: &sqlx::PgPool, job_id: Uuid) -> notifications::ScanStats {
let row = sqlx::query("SELECT stats_json FROM index_jobs WHERE id = $1")
.bind(job_id)
.fetch_optional(pool)
.await
.ok()
.flatten();
if let Some(row) = row {
if let Ok(val) = row.try_get::<serde_json::Value, _>("stats_json") {
return notifications::ScanStats {
scanned_files: val.get("scanned_files").and_then(|v| v.as_u64()).unwrap_or(0) as usize,
indexed_files: val.get("indexed_files").and_then(|v| v.as_u64()).unwrap_or(0) as usize,
removed_files: val.get("removed_files").and_then(|v| v.as_u64()).unwrap_or(0) as usize,
new_series: val.get("new_series").and_then(|v| v.as_u64()).unwrap_or(0) as usize,
errors: val.get("errors").and_then(|v| v.as_u64()).unwrap_or(0) as usize,
};
}
}
notifications::ScanStats {
scanned_files: 0,
indexed_files: 0,
removed_files: 0,
new_series: 0,
errors: 0,
}
}
fn build_completed_event(
job_type: &str,
library_name: Option<String>,
book_title: Option<String>,
thumbnail_path: Option<String>,
stats: notifications::ScanStats,
duration_seconds: u64,
) -> notifications::NotificationEvent {
match notifications::job_type_category(job_type) {
"thumbnail" => notifications::NotificationEvent::ThumbnailCompleted {
job_type: job_type.to_string(),
library_name,
duration_seconds,
},
"conversion" => notifications::NotificationEvent::ConversionCompleted {
library_name,
book_title,
thumbnail_path,
},
_ => notifications::NotificationEvent::ScanCompleted {
job_type: job_type.to_string(),
library_name,
stats,
duration_seconds,
},
}
}
fn build_failed_event(
job_type: &str,
library_name: Option<String>,
book_title: Option<String>,
thumbnail_path: Option<String>,
error: String,
) -> notifications::NotificationEvent {
match notifications::job_type_category(job_type) {
"thumbnail" => notifications::NotificationEvent::ThumbnailFailed {
job_type: job_type.to_string(),
library_name,
error,
},
"conversion" => notifications::NotificationEvent::ConversionFailed {
library_name,
book_title,
thumbnail_path,
error,
},
_ => notifications::NotificationEvent::ScanFailed {
job_type: job_type.to_string(),
library_name,
error,
},
}
}
loop {
match job::claim_next_job(&state.pool).await {
Ok(Some((job_id, library_id))) => {
info!("[INDEXER] Starting job {} library={:?}", job_id, library_id);
let started_at = std::time::Instant::now();
let info = load_job_info(&state.pool, job_id, library_id).await;
if let Err(err) = job::process_job(&state, job_id, library_id).await {
let err_str = err.to_string();
if err_str.contains("cancelled") || err_str.contains("Cancelled") {
info!("[INDEXER] Job {} was cancelled by user", job_id);
-// Status is already 'cancelled' in DB, don't change it
+notifications::notify(
state.pool.clone(),
notifications::NotificationEvent::ScanCancelled {
job_type: info.job_type.clone(),
library_name: info.library_name.clone(),
},
);
} else {
error!("[INDEXER] Job {} failed: {}", job_id, err);
let _ = job::fail_job(&state.pool, job_id, &err_str).await;
notifications::notify(
state.pool.clone(),
build_failed_event(&info.job_type, info.library_name.clone(), info.book_title.clone(), info.thumbnail_path.clone(), err_str),
);
}
} else {
info!("[INDEXER] Job {} completed", job_id);
let stats = load_scan_stats(&state.pool, job_id).await;
notifications::notify(
state.pool.clone(),
build_completed_event(
&info.job_type,
info.library_name.clone(),
info.book_title.clone(),
info.thumbnail_path.clone(),
stats,
started_at.elapsed().as_secs(),
),
);
}
}
Ok(None) => {

View File

@@ -0,0 +1,13 @@
[package]
name = "notifications"
version.workspace = true
edition.workspace = true
[dependencies]
anyhow.workspace = true
reqwest.workspace = true
serde.workspace = true
serde_json.workspace = true
sqlx.workspace = true
tokio.workspace = true
tracing.workspace = true

View File

@@ -0,0 +1,513 @@
use anyhow::Result;
use serde::Deserialize;
use sqlx::PgPool;
use tracing::{info, warn};
// ---------------------------------------------------------------------------
// Config
// ---------------------------------------------------------------------------
#[derive(Debug, Deserialize)]
pub struct TelegramConfig {
pub bot_token: String,
pub chat_id: String,
#[serde(default)]
pub enabled: bool,
#[serde(default = "default_events")]
pub events: EventToggles,
}
#[derive(Debug, Deserialize)]
pub struct EventToggles {
#[serde(default = "default_true")]
pub scan_completed: bool,
#[serde(default = "default_true")]
pub scan_failed: bool,
#[serde(default = "default_true")]
pub scan_cancelled: bool,
#[serde(default = "default_true")]
pub thumbnail_completed: bool,
#[serde(default = "default_true")]
pub thumbnail_failed: bool,
#[serde(default = "default_true")]
pub conversion_completed: bool,
#[serde(default = "default_true")]
pub conversion_failed: bool,
#[serde(default = "default_true")]
pub metadata_approved: bool,
#[serde(default = "default_true")]
pub metadata_batch_completed: bool,
#[serde(default = "default_true")]
pub metadata_batch_failed: bool,
#[serde(default = "default_true")]
pub metadata_refresh_completed: bool,
#[serde(default = "default_true")]
pub metadata_refresh_failed: bool,
}
fn default_true() -> bool {
true
}
fn default_events() -> EventToggles {
EventToggles {
scan_completed: true,
scan_failed: true,
scan_cancelled: true,
thumbnail_completed: true,
thumbnail_failed: true,
conversion_completed: true,
conversion_failed: true,
metadata_approved: true,
metadata_batch_completed: true,
metadata_batch_failed: true,
metadata_refresh_completed: true,
metadata_refresh_failed: true,
}
}
/// Load the Telegram config from `app_settings` (key = "telegram").
/// Returns `None` when the row is missing, disabled, or has empty credentials.
pub async fn load_telegram_config(pool: &PgPool) -> Option<TelegramConfig> {
let row = sqlx::query_scalar::<_, serde_json::Value>(
"SELECT value FROM app_settings WHERE key = 'telegram'",
)
.fetch_optional(pool)
.await
.ok()??;
let config: TelegramConfig = serde_json::from_value(row).ok()?;
if !config.enabled || config.bot_token.is_empty() || config.chat_id.is_empty() {
return None;
}
Some(config)
}
// ---------------------------------------------------------------------------
// Telegram HTTP
// ---------------------------------------------------------------------------
fn build_client() -> Result<reqwest::Client> {
Ok(reqwest::Client::builder()
.timeout(std::time::Duration::from_secs(10))
.build()?)
}
async fn send_telegram(config: &TelegramConfig, text: &str) -> Result<()> {
let url = format!(
"https://api.telegram.org/bot{}/sendMessage",
config.bot_token
);
let body = serde_json::json!({
"chat_id": config.chat_id,
"text": text,
"parse_mode": "HTML",
});
let resp = build_client()?.post(&url).json(&body).send().await?;
if !resp.status().is_success() {
let status = resp.status();
let text = resp.text().await.unwrap_or_default();
anyhow::bail!("Telegram API returned {status}: {text}");
}
Ok(())
}
async fn send_telegram_photo(config: &TelegramConfig, caption: &str, photo_path: &str) -> Result<()> {
let url = format!(
"https://api.telegram.org/bot{}/sendPhoto",
config.bot_token
);
let photo_bytes = tokio::fs::read(photo_path).await?;
let filename = std::path::Path::new(photo_path)
.file_name()
.unwrap_or_default()
.to_string_lossy()
.to_string();
let mime = if filename.ends_with(".webp") {
"image/webp"
} else if filename.ends_with(".png") {
"image/png"
} else {
"image/jpeg"
};
let part = reqwest::multipart::Part::bytes(photo_bytes)
.file_name(filename)
.mime_str(mime)?;
let form = reqwest::multipart::Form::new()
.text("chat_id", config.chat_id.clone())
.text("caption", caption.to_string())
.text("parse_mode", "HTML")
.part("photo", part);
let resp = build_client()?.post(&url).multipart(form).send().await?;
if !resp.status().is_success() {
let status = resp.status();
let text = resp.text().await.unwrap_or_default();
anyhow::bail!("Telegram API returned {status}: {text}");
}
Ok(())
}
/// Send a test message. Returns the result directly (not fire-and-forget).
pub async fn send_test_message(config: &TelegramConfig) -> Result<()> {
send_telegram(config, "🔔 <b>Stripstream Librarian</b>\nTest notification — connection OK!").await
}
// ---------------------------------------------------------------------------
// Notification events
// ---------------------------------------------------------------------------
pub struct ScanStats {
pub scanned_files: usize,
pub indexed_files: usize,
pub removed_files: usize,
pub new_series: usize,
pub errors: usize,
}
pub enum NotificationEvent {
// Scan jobs (rebuild, full_rebuild, rescan, scan)
ScanCompleted {
job_type: String,
library_name: Option<String>,
stats: ScanStats,
duration_seconds: u64,
},
ScanFailed {
job_type: String,
library_name: Option<String>,
error: String,
},
ScanCancelled {
job_type: String,
library_name: Option<String>,
},
// Thumbnail jobs (thumbnail_rebuild, thumbnail_regenerate)
ThumbnailCompleted {
job_type: String,
library_name: Option<String>,
duration_seconds: u64,
},
ThumbnailFailed {
job_type: String,
library_name: Option<String>,
error: String,
},
// CBR→CBZ conversion
ConversionCompleted {
library_name: Option<String>,
book_title: Option<String>,
thumbnail_path: Option<String>,
},
ConversionFailed {
library_name: Option<String>,
book_title: Option<String>,
thumbnail_path: Option<String>,
error: String,
},
// Metadata manual approve
MetadataApproved {
series_name: String,
provider: String,
thumbnail_path: Option<String>,
},
// Metadata batch (auto-match)
MetadataBatchCompleted {
library_name: Option<String>,
total_series: i32,
processed: i32,
},
MetadataBatchFailed {
library_name: Option<String>,
error: String,
},
// Metadata refresh
MetadataRefreshCompleted {
library_name: Option<String>,
refreshed: i32,
unchanged: i32,
errors: i32,
},
MetadataRefreshFailed {
library_name: Option<String>,
error: String,
},
}
/// Classify an indexer job_type string into the right event constructor category.
/// Returns "scan", "thumbnail", or "conversion".
pub fn job_type_category(job_type: &str) -> &'static str {
match job_type {
"thumbnail_rebuild" | "thumbnail_regenerate" => "thumbnail",
"cbr_to_cbz" => "conversion",
_ => "scan",
}
}
fn format_event(event: &NotificationEvent) -> String {
match event {
NotificationEvent::ScanCompleted {
job_type,
library_name,
stats,
duration_seconds,
} => {
let lib = library_name.as_deref().unwrap_or("All libraries");
let duration = format_duration(*duration_seconds);
format!(
"📚 <b>Scan completed</b>\n\
Library: {lib}\n\
Type: {job_type}\n\
New books: {}\n\
New series: {}\n\
Files scanned: {}\n\
Removed: {}\n\
Errors: {}\n\
Duration: {duration}",
stats.indexed_files,
stats.new_series,
stats.scanned_files,
stats.removed_files,
stats.errors,
)
}
NotificationEvent::ScanFailed {
job_type,
library_name,
error,
} => {
let lib = library_name.as_deref().unwrap_or("All libraries");
let err = truncate(error, 200);
format!(
"❌ <b>Scan failed</b>\n\
Library: {lib}\n\
Type: {job_type}\n\
Error: {err}"
)
}
NotificationEvent::ScanCancelled {
job_type,
library_name,
} => {
let lib = library_name.as_deref().unwrap_or("All libraries");
format!(
"⏹ <b>Scan cancelled</b>\n\
Library: {lib}\n\
Type: {job_type}"
)
}
NotificationEvent::ThumbnailCompleted {
job_type,
library_name,
duration_seconds,
} => {
let lib = library_name.as_deref().unwrap_or("All libraries");
let duration = format_duration(*duration_seconds);
format!(
"🖼 <b>Thumbnails completed</b>\n\
Library: {lib}\n\
Type: {job_type}\n\
Duration: {duration}"
)
}
NotificationEvent::ThumbnailFailed {
job_type,
library_name,
error,
} => {
let lib = library_name.as_deref().unwrap_or("All libraries");
let err = truncate(error, 200);
format!(
"❌ <b>Thumbnails failed</b>\n\
Library: {lib}\n\
Type: {job_type}\n\
Error: {err}"
)
}
NotificationEvent::ConversionCompleted {
library_name,
book_title,
..
} => {
let lib = library_name.as_deref().unwrap_or("Unknown");
let title = book_title.as_deref().unwrap_or("Unknown");
format!(
"🔄 <b>CBR→CBZ conversion completed</b>\n\
Library: {lib}\n\
Book: {title}"
)
}
NotificationEvent::ConversionFailed {
library_name,
book_title,
error,
..
} => {
let lib = library_name.as_deref().unwrap_or("Unknown");
let title = book_title.as_deref().unwrap_or("Unknown");
let err = truncate(error, 200);
format!(
"❌ <b>CBR→CBZ conversion failed</b>\n\
Library: {lib}\n\
Book: {title}\n\
Error: {err}"
)
}
NotificationEvent::MetadataApproved {
series_name,
provider,
..
} => {
format!(
"🔗 <b>Metadata linked</b>\n\
Series: {series_name}\n\
Provider: {provider}"
)
}
NotificationEvent::MetadataBatchCompleted {
library_name,
total_series,
processed,
} => {
let lib = library_name.as_deref().unwrap_or("All libraries");
format!(
"🔍 <b>Metadata batch completed</b>\n\
Library: {lib}\n\
Series processed: {processed}/{total_series}"
)
}
NotificationEvent::MetadataBatchFailed {
library_name,
error,
} => {
let lib = library_name.as_deref().unwrap_or("All libraries");
let err = truncate(error, 200);
format!(
"❌ <b>Metadata batch failed</b>\n\
Library: {lib}\n\
Error: {err}"
)
}
NotificationEvent::MetadataRefreshCompleted {
library_name,
refreshed,
unchanged,
errors,
} => {
let lib = library_name.as_deref().unwrap_or("All libraries");
format!(
"🔄 <b>Metadata refresh completed</b>\n\
Library: {lib}\n\
Updated: {refreshed}\n\
Unchanged: {unchanged}\n\
Errors: {errors}"
)
}
NotificationEvent::MetadataRefreshFailed {
library_name,
error,
} => {
let lib = library_name.as_deref().unwrap_or("All libraries");
let err = truncate(error, 200);
format!(
"❌ <b>Metadata refresh failed</b>\n\
Library: {lib}\n\
Error: {err}"
)
}
}
}
fn truncate(s: &str, max: usize) -> String {
    if s.len() <= max {
        return s.to_string();
    }
    // Back up to a char boundary so slicing can't panic on multi-byte UTF-8
    let mut end = max;
    while !s.is_char_boundary(end) {
        end -= 1;
    }
    s[..end].to_string()
}
fn format_duration(secs: u64) -> String {
if secs < 60 {
format!("{secs}s")
} else {
let m = secs / 60;
let s = secs % 60;
format!("{m}m{s}s")
}
}
// ---------------------------------------------------------------------------
// Public entry point — fire & forget
// ---------------------------------------------------------------------------
/// Returns whether this event type is enabled in the config.
fn is_event_enabled(config: &TelegramConfig, event: &NotificationEvent) -> bool {
match event {
NotificationEvent::ScanCompleted { .. } => config.events.scan_completed,
NotificationEvent::ScanFailed { .. } => config.events.scan_failed,
NotificationEvent::ScanCancelled { .. } => config.events.scan_cancelled,
NotificationEvent::ThumbnailCompleted { .. } => config.events.thumbnail_completed,
NotificationEvent::ThumbnailFailed { .. } => config.events.thumbnail_failed,
NotificationEvent::ConversionCompleted { .. } => config.events.conversion_completed,
NotificationEvent::ConversionFailed { .. } => config.events.conversion_failed,
NotificationEvent::MetadataApproved { .. } => config.events.metadata_approved,
NotificationEvent::MetadataBatchCompleted { .. } => config.events.metadata_batch_completed,
NotificationEvent::MetadataBatchFailed { .. } => config.events.metadata_batch_failed,
NotificationEvent::MetadataRefreshCompleted { .. } => config.events.metadata_refresh_completed,
NotificationEvent::MetadataRefreshFailed { .. } => config.events.metadata_refresh_failed,
}
}
/// Extract thumbnail path from event if present and file exists on disk.
fn event_thumbnail(event: &NotificationEvent) -> Option<&str> {
let path = match event {
NotificationEvent::ConversionCompleted { thumbnail_path, .. } => thumbnail_path.as_deref(),
NotificationEvent::ConversionFailed { thumbnail_path, .. } => thumbnail_path.as_deref(),
NotificationEvent::MetadataApproved { thumbnail_path, .. } => thumbnail_path.as_deref(),
_ => None,
};
path.filter(|p| std::path::Path::new(p).exists())
}
/// Load config + format + send in a spawned task. Errors are only logged.
pub fn notify(pool: PgPool, event: NotificationEvent) {
tokio::spawn(async move {
let config = match load_telegram_config(&pool).await {
Some(c) => c,
None => return, // disabled or not configured
};
if !is_event_enabled(&config, &event) {
return;
}
let text = format_event(&event);
let sent = if let Some(photo) = event_thumbnail(&event) {
match send_telegram_photo(&config, &text, photo).await {
Ok(()) => Ok(()),
Err(e) => {
warn!("[TELEGRAM] Photo send failed, falling back to text: {e}");
send_telegram(&config, &text).await
}
}
} else {
send_telegram(&config, &text).await
};
match sent {
Ok(()) => info!("[TELEGRAM] Notification sent"),
Err(e) => warn!("[TELEGRAM] Failed to send notification: {e}"),
}
});
}


@@ -9,6 +9,7 @@ pub enum BookFormat {
     Cbz,
     Cbr,
     Pdf,
+    Epub,
 }
 impl BookFormat {
@@ -17,6 +18,7 @@ impl BookFormat {
             Self::Cbz => "cbz",
             Self::Cbr => "cbr",
             Self::Pdf => "pdf",
+            Self::Epub => "epub",
         }
     }
 }
@@ -35,6 +37,7 @@ pub fn detect_format(path: &Path) -> Option<BookFormat> {
         "cbz" => Some(BookFormat::Cbz),
         "cbr" => Some(BookFormat::Cbr),
         "pdf" => Some(BookFormat::Pdf),
+        "epub" => Some(BookFormat::Epub),
         _ => None,
     }
 }
@@ -144,6 +147,7 @@ pub fn parse_metadata(
         BookFormat::Cbz => parse_cbz_page_count(path).ok(),
         BookFormat::Cbr => parse_cbr_page_count(path).ok(),
         BookFormat::Pdf => parse_pdf_page_count(path).ok(),
+        BookFormat::Epub => parse_epub_page_count(path).ok(),
     };
     Ok(meta)
@@ -156,6 +160,7 @@ pub fn analyze_book(path: &Path, format: BookFormat, pdf_render_scale: u32) -> R
         BookFormat::Cbz => analyze_cbz(path, true),
         BookFormat::Cbr => analyze_cbr(path, true),
         BookFormat::Pdf => analyze_pdf(path, pdf_render_scale),
+        BookFormat::Epub => analyze_epub(path),
     }
 }
@@ -530,6 +535,7 @@ pub fn list_archive_images(path: &Path, format: BookFormat) -> Result<Vec<String
         BookFormat::Cbz => list_cbz_images(path),
         BookFormat::Cbr => list_cbr_images(path),
         BookFormat::Pdf => Err(anyhow::anyhow!("list_archive_images not applicable for PDF")),
+        BookFormat::Epub => get_epub_image_index(path),
     }
 }
@@ -629,6 +635,7 @@ pub fn extract_image_by_name(path: &Path, format: BookFormat, image_name: &str)
         BookFormat::Cbz => extract_cbz_by_name(path, image_name),
         BookFormat::Cbr => extract_cbr_by_name(path, image_name),
         BookFormat::Pdf => Err(anyhow::anyhow!("use extract_page for PDF")),
+        BookFormat::Epub => extract_cbz_by_name(path, image_name),
     }
 }
@@ -721,6 +728,7 @@ pub fn extract_page(path: &Path, format: BookFormat, page_number: u32, pdf_rende
             let width = if pdf_render_width == 0 { 1200 } else { pdf_render_width };
             render_pdf_page_n(path, page_number, width)
         }
+        BookFormat::Epub => extract_epub_page(path, page_number),
     }
 }
@@ -894,6 +902,340 @@ fn render_pdf_page_n(path: &Path, page_number: u32, width: u32) -> Result<Vec<u8
 }
// ============================================================
// EPUB support — spine-aware image index with cache
// ============================================================
/// Cache of ordered image paths per EPUB file. Avoids re-parsing OPF/XHTML on every page request.
static EPUB_INDEX_CACHE: OnceLock<Mutex<HashMap<PathBuf, Vec<String>>>> = OnceLock::new();
fn epub_index_cache() -> &'static Mutex<HashMap<PathBuf, Vec<String>>> {
EPUB_INDEX_CACHE.get_or_init(|| Mutex::new(HashMap::new()))
}
// Pre-compiled regex patterns for EPUB XML parsing (compiled once on first use)
static RE_EPUB_ROOTFILE: OnceLock<regex::Regex> = OnceLock::new();
static RE_EPUB_ITEM: OnceLock<regex::Regex> = OnceLock::new();
static RE_EPUB_ITEMREF: OnceLock<regex::Regex> = OnceLock::new();
static RE_EPUB_IMG_SRC: OnceLock<regex::Regex> = OnceLock::new();
static RE_EPUB_SVG_HREF: OnceLock<regex::Regex> = OnceLock::new();
static RE_EPUB_ATTR_ID: OnceLock<regex::Regex> = OnceLock::new();
static RE_EPUB_ATTR_HREF: OnceLock<regex::Regex> = OnceLock::new();
static RE_EPUB_ATTR_MEDIA: OnceLock<regex::Regex> = OnceLock::new();
struct EpubManifestItem {
href: String,
media_type: String,
}
/// Build the ordered list of image paths for an EPUB file.
/// Walks the OPF spine to determine reading order, parses XHTML/SVG pages
/// for image references, and falls back to CBZ-style listing if no
/// images are found through the spine.
fn build_epub_image_index(path: &Path) -> Result<Vec<String>> {
let file = std::fs::File::open(path)
.with_context(|| format!("cannot open epub: {}", path.display()))?;
let mut archive = zip::ZipArchive::new(file)
.with_context(|| format!("invalid epub zip: {}", path.display()))?;
// 1. Find OPF path from META-INF/container.xml
let opf_path = {
let mut entry = archive
.by_name("META-INF/container.xml")
.context("missing META-INF/container.xml — not a valid EPUB")?;
let mut buf = Vec::new();
entry.read_to_end(&mut buf)?;
let xml = String::from_utf8_lossy(&buf);
let re = RE_EPUB_ROOTFILE.get_or_init(|| {
regex::Regex::new(r#"<(?:\w+:)?rootfile[^>]+full-path="([^"]+)""#).unwrap()
});
re.captures(&xml)
.and_then(|c| c.get(1))
.map(|m| decode_xml_entities(m.as_str()))
.context("no rootfile found in container.xml")?
};
let opf_dir = std::path::Path::new(&opf_path)
.parent()
.map(|p| p.to_string_lossy().to_string())
.unwrap_or_default();
// 2. Parse OPF manifest + spine
let (manifest, spine_idrefs) = {
let mut entry = archive
.by_name(&opf_path)
.with_context(|| format!("missing OPF file: {}", opf_path))?;
let mut buf = Vec::new();
entry.read_to_end(&mut buf)?;
let xml = String::from_utf8_lossy(&buf);
parse_epub_opf(&xml, &opf_dir)?
};
// 3. Walk spine entries to build ordered image list
let re_img = RE_EPUB_IMG_SRC.get_or_init(|| {
regex::Regex::new(r#"(?i)<img\s[^>]*src=["']([^"']+)["']"#).unwrap()
});
let re_svg = RE_EPUB_SVG_HREF.get_or_init(|| {
regex::Regex::new(r#"(?i)<image\s[^>]*(?:xlink:)?href=["']([^"']+)["']"#).unwrap()
});
let mut images: Vec<String> = Vec::new();
let mut seen = std::collections::HashSet::new();
for idref in &spine_idrefs {
let item = match manifest.get(idref.as_str()) {
Some(item) => item,
None => continue,
};
// Direct raster image in spine (rare but possible)
if item.media_type.starts_with("image/") && !item.media_type.contains("svg") {
if seen.insert(item.href.clone()) {
images.push(item.href.clone());
}
continue;
}
// Read XHTML/SVG content — entry is dropped at end of match arm, releasing archive borrow
let content = match archive.by_name(&item.href) {
Ok(mut entry) => {
let mut buf = Vec::new();
match entry.read_to_end(&mut buf) {
Ok(_) => String::from_utf8_lossy(&buf).to_string(),
Err(_) => continue,
}
}
Err(_) => continue,
};
let content_dir = std::path::Path::new(&item.href)
.parent()
.map(|p| p.to_string_lossy().to_string())
.unwrap_or_default();
// Extract <img src="..."> and <image [xlink:]href="...">
for re in [re_img, re_svg] {
for cap in re.captures_iter(&content) {
if let Some(src) = cap.get(1) {
let src_str = src.as_str();
if src_str.starts_with("data:") {
continue;
}
let decoded = decode_xml_entities(&percent_decode_epub(src_str));
let resolved = resolve_epub_path(&content_dir, &decoded);
if seen.insert(resolved.clone()) {
images.push(resolved);
}
}
}
}
}
// 4. Fallback: no images from spine → list all images in ZIP (CBZ-style)
if images.is_empty() {
for i in 0..archive.len() {
if let Ok(entry) = archive.by_index(i) {
let name = entry.name().to_string();
if is_image_name(&name.to_ascii_lowercase()) && seen.insert(name.clone()) {
images.push(name);
}
}
}
images.sort_by(|a, b| natord::compare(a, b));
}
if images.is_empty() {
return Err(anyhow::anyhow!("no images found in epub: {}", path.display()));
}
Ok(images)
}
fn parse_epub_opf(
xml: &str,
opf_dir: &str,
) -> Result<(HashMap<String, EpubManifestItem>, Vec<String>)> {
let re_item = RE_EPUB_ITEM.get_or_init(|| {
regex::Regex::new(r#"(?s)<(?:\w+:)?item\s([^>]+?)/?>"#).unwrap()
});
let re_itemref = RE_EPUB_ITEMREF.get_or_init(|| {
regex::Regex::new(r#"<(?:\w+:)?itemref\s[^>]*idref="([^"]+)""#).unwrap()
});
let re_id = RE_EPUB_ATTR_ID.get_or_init(|| {
regex::Regex::new(r#"(?:^|\s)id="([^"]+)""#).unwrap()
});
let re_href = RE_EPUB_ATTR_HREF.get_or_init(|| {
regex::Regex::new(r#"(?:^|\s)href="([^"]+)""#).unwrap()
});
let re_media = RE_EPUB_ATTR_MEDIA.get_or_init(|| {
regex::Regex::new(r#"media-type="([^"]+)""#).unwrap()
});
let mut manifest: HashMap<String, EpubManifestItem> = HashMap::new();
for cap in re_item.captures_iter(xml) {
if let Some(attrs) = cap.get(1) {
let a = attrs.as_str();
let id = re_id.captures(a).and_then(|c| c.get(1));
let href = re_href.captures(a).and_then(|c| c.get(1));
let media = re_media.captures(a).and_then(|c| c.get(1));
if let (Some(id), Some(href), Some(media)) = (id, href, media) {
let decoded_href = decode_xml_entities(&percent_decode_epub(href.as_str()));
let resolved = resolve_epub_path(opf_dir, &decoded_href);
manifest.insert(
id.as_str().to_string(),
EpubManifestItem {
href: resolved,
media_type: media.as_str().to_string(),
},
);
}
}
}
let spine_idrefs: Vec<String> = re_itemref
.captures_iter(xml)
.filter_map(|c| c.get(1).map(|m| m.as_str().to_string()))
.collect();
Ok((manifest, spine_idrefs))
}
/// Get the cached image index for an EPUB, building it on first access.
fn get_epub_image_index(path: &Path) -> Result<Vec<String>> {
{
let cache = epub_index_cache().lock().unwrap();
if let Some(names) = cache.get(path) {
return Ok(names.clone());
}
}
let images = build_epub_image_index(path)?;
{
let mut cache = epub_index_cache().lock().unwrap();
cache.insert(path.to_path_buf(), images.clone());
}
Ok(images)
}
fn parse_epub_page_count(path: &Path) -> Result<i32> {
let images = build_epub_image_index(path)?;
Ok(images.len() as i32)
}
fn analyze_epub(path: &Path) -> Result<(i32, Vec<u8>)> {
let images = get_epub_image_index(path)?;
let count = images.len() as i32;
let file = std::fs::File::open(path)
.with_context(|| format!("cannot open epub: {}", path.display()))?;
let mut archive = zip::ZipArchive::new(file)?;
for img_path in &images {
if let Ok(mut entry) = archive.by_name(img_path) {
let mut buf = Vec::new();
if entry.read_to_end(&mut buf).is_ok() && !buf.is_empty() {
return Ok((count, buf));
}
}
}
Err(anyhow::anyhow!(
"no readable images in epub: {}",
path.display()
))
}
fn extract_epub_page(path: &Path, page_number: u32) -> Result<Vec<u8>> {
    let images = get_epub_image_index(path)?;
    // Pages are 1-indexed; checked_sub guards against underflow on page 0
    let index = (page_number as usize)
        .checked_sub(1)
        .context("page numbers are 1-indexed")?;
let img_path = images
.get(index)
.with_context(|| {
format!(
"page {} out of range (total: {})",
page_number,
images.len()
)
})?;
let file = std::fs::File::open(path)
.with_context(|| format!("cannot open epub: {}", path.display()))?;
let mut archive = zip::ZipArchive::new(file)?;
let mut entry = archive
.by_name(img_path)
.with_context(|| format!("image '{}' not found in epub", img_path))?;
let mut buf = Vec::new();
entry.read_to_end(&mut buf)?;
Ok(buf)
}
// --- EPUB path/encoding helpers ---
fn resolve_epub_path(base_dir: &str, href: &str) -> String {
if let Some(stripped) = href.strip_prefix('/') {
return normalize_epub_path(stripped);
}
if base_dir.is_empty() {
return normalize_epub_path(href);
}
normalize_epub_path(&format!("{}/{}", base_dir, href))
}
fn normalize_epub_path(path: &str) -> String {
let mut parts: Vec<&str> = Vec::new();
for part in path.split('/') {
match part {
".." => {
parts.pop();
}
"." | "" => {}
_ => parts.push(part),
}
}
parts.join("/")
}
fn percent_decode_epub(s: &str) -> String {
if !s.contains('%') {
return s.to_string();
}
let bytes = s.as_bytes();
let mut result = Vec::with_capacity(bytes.len());
let mut i = 0;
while i < bytes.len() {
if bytes[i] == b'%' && i + 2 < bytes.len() {
if let (Some(h), Some(l)) = (epub_hex_val(bytes[i + 1]), epub_hex_val(bytes[i + 2])) {
result.push(h * 16 + l);
i += 3;
continue;
}
}
result.push(bytes[i]);
i += 1;
}
String::from_utf8_lossy(&result).to_string()
}
fn epub_hex_val(b: u8) -> Option<u8> {
match b {
b'0'..=b'9' => Some(b - b'0'),
b'a'..=b'f' => Some(b - b'a' + 10),
b'A'..=b'F' => Some(b - b'A' + 10),
_ => None,
}
}
fn decode_xml_entities(s: &str) -> String {
if !s.contains('&') {
return s.to_string();
}
s.replace("&amp;", "&")
.replace("&lt;", "<")
.replace("&gt;", ">")
.replace("&quot;", "\"")
.replace("&apos;", "'")
}
 /// Convert a CBR file to CBZ in-place (same directory, same stem).
 ///
 /// The conversion is safe: a `.cbz.tmp` file is written first, verified, then

docs/FEATURES.md

@@ -0,0 +1,341 @@
# Stripstream Librarian — Features & Business Rules
## Libraries
### Multi-Library Management
- Create and manage multiple independent libraries, each with its own root path
- Enable/disable libraries individually
- Delete a library cascades to all its books, jobs, and metadata
### Scanning & Indexing
- **Incremental scan**: uses directory mtime tracking to skip unchanged directories
- **Full rebuild**: force re-walk all directories, ignoring cached mtimes
- **Rescan**: deep rescan to discover newly supported formats
- **Two-phase pipeline**:
- Phase 1 (Discovery): fast filename-based metadata extraction (no archive I/O)
- Phase 2 (Analysis): extract page counts, first page image from archives
### Real-Time Monitoring
- **Automatic periodic scanning**: configurable interval (default 5 seconds)
- **Filesystem watcher**: real-time detection of file changes for instant indexing
- Each can be toggled per library (`monitor_enabled`, `watcher_enabled`)
---
## Books
### Format Support
- **CBZ** (ZIP-based comic archives)
- **CBR** (RAR-based comic archives)
- **PDF**
- **EPUB**
- Automatic format detection from file extension and magic bytes
### Metadata Extraction
- **Title**: derived from filename or external metadata
- **Series**: derived from directory structure (first directory level under library root)
- **Volume**: extracted from filename with pattern detection:
- `T##` (Tome) — most common for French comics
- `Vol.##`, `Vol ##`, `Volume ##`
- `###` (standalone number)
- `-## ` (dash-separated)
- **Author(s)**: single scalar and array support
- **Page count**: extracted from archive analysis
- **Language**, **kind** (ebook, comic, bd)
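The volume patterns above can be sketched as a small token matcher (a hypothetical, dependency-free sketch — the scanner's actual patterns and precedence may differ):

```rust
/// Extract a volume number from a filename using the documented patterns,
/// tried strongest-first: "T##", "Vol.##"/"Vol ##"/"Volume ##", "-##",
/// then a bare standalone number. Hypothetical sketch, not the real parser.
fn extract_volume(name: &str) -> Option<u32> {
    let tokens: Vec<&str> = name
        .split(|c: char| c.is_whitespace() || c == '.' || c == '_')
        .filter(|t| !t.is_empty())
        .collect();
    for (i, tok) in tokens.iter().enumerate() {
        let lower = tok.to_ascii_lowercase();
        // "T##" (Tome) — 't' immediately followed by digits
        if let Some(rest) = lower.strip_prefix('t') {
            if !rest.is_empty() && rest.chars().all(|c| c.is_ascii_digit()) {
                return rest.parse().ok();
            }
        }
        // "Vol.##" / "Vol ##" / "Volume ##" — keyword then a number token
        if lower == "vol" || lower == "volume" {
            if let Some(n) = tokens.get(i + 1).and_then(|t| t.parse::<u32>().ok()) {
                return Some(n);
            }
        }
        // "-##" — dash-separated number
        if let Some(rest) = lower.strip_prefix('-') {
            if !rest.is_empty() && rest.chars().all(|c| c.is_ascii_digit()) {
                return rest.parse().ok();
            }
        }
    }
    // Weakest signal last: a bare standalone number ("###")
    tokens
        .iter()
        .find(|t| t.len() <= 3 && t.chars().all(|c| c.is_ascii_digit()))
        .and_then(|t| t.parse().ok())
}
```

Trying the most specific pattern first (`T##`) avoids misreading a year or page count as a volume when a stronger marker is present.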
### Thumbnails
- Generated from the first page of each archive
- Output format configurable: WebP (default), JPEG, PNG
- Configurable dimensions (default 300×400)
- Lazy generation: created on first access if missing
- Bulk operations: rebuild missing or regenerate all
### CBR to CBZ Conversion
- Convert RAR archives to ZIP format
- Tracked as background job with progress
---
## Series
### Automatic Aggregation
- Series derived from directory structure during scanning
- Books without series grouped as "unclassified"
### Series Metadata
- Description, publisher, start year, status (`ongoing`, `ended`, `completed`, `on_hold`, `hiatus`)
- Total volume count (from external providers)
- Authors (aggregated from books or metadata)
### Filtering & Discovery
- Filter by: series name (partial match), reading status, series status, metadata provider linkage
- Sort by: name, reading status, book count
- **Missing books detection**: identifies gaps in volume numbering within a series
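Gap detection can be sketched as follows (assuming gaps are computed between the lowest and highest owned volume — the real query may differ):

```rust
/// Find gaps in a series' volume numbering, e.g. owned [1, 2, 4, 7]
/// yields missing [3, 5, 6]. Sketch only: assumes the range of interest
/// is bounded by the owned volumes, not by provider metadata.
fn missing_volumes(owned: &[u32]) -> Vec<u32> {
    let (min, max) = match (owned.iter().min(), owned.iter().max()) {
        (Some(&min), Some(&max)) => (min, max),
        _ => return Vec::new(), // no books, nothing to report
    };
    let have: std::collections::HashSet<u32> = owned.iter().copied().collect();
    (min..=max).filter(|v| !have.contains(v)).collect()
}
```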
---
## Reading Progress
### Per-Book Tracking
- Three states: `unread` (default), `reading`, `read`
- Current page tracking when status is `reading`
- `last_read_at` timestamp auto-updated
### Series-Level Status
- Calculated from book statuses:
- All read → series `read`
- None read → series `unread`
- Mixed → series `reading`
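One plausible reading of these rules as code (a sketch; in particular, treating any in-progress book as "mixed" is an assumption here):

```rust
#[derive(Debug, Clone, Copy, PartialEq)]
enum ReadStatus {
    Unread,
    Reading,
    Read,
}

/// Derive the series-level status from its books' statuses, per the rules
/// above: all read -> Read, none started -> Unread, anything else -> Reading.
fn series_status(books: &[ReadStatus]) -> ReadStatus {
    if books.is_empty() {
        return ReadStatus::Unread;
    }
    if books.iter().all(|s| *s == ReadStatus::Read) {
        ReadStatus::Read
    } else if books.iter().all(|s| *s == ReadStatus::Unread) {
        ReadStatus::Unread
    } else {
        ReadStatus::Reading
    }
}
```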
### Bulk Operations
- Mark entire series as read (updates all books)
---
## Search & Discovery
### Full-Text Search
- PostgreSQL-based (`ILIKE` + `pg_trgm`)
- Searches across: book titles, series names, authors (scalar and array fields), series metadata authors
- Case-insensitive partial matching
- Library-scoped filtering
### Results
- Book hits: title, authors, series, volume, language, kind
- Series hits: name, book count, read count, first book (for linking)
- Processing time included in response
---
## Authors
- Unique author aggregation from books and series metadata
- Per-author book and series count
- Searchable by name (partial match)
- Sortable by name or book count
---
## External Metadata
### Supported Providers
| Provider | Focus |
|----------|-------|
| Google Books | General books (default fallback) |
| ComicVine | Comics |
| Bédéthèque | Franco-Belgian comics |
| AniList | Manga/anime |
| Open Library | General books |
### Provider Configuration
- Global default provider with library-level override
- Fallback provider if primary is unavailable
### Matching Workflow
1. **Search**: query a provider, get candidates with confidence scores
2. **Match**: link a series to an external result (status `pending`)
3. **Approve**: validate and sync metadata to series and books
4. **Reject**: discard a match
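The workflow above amounts to a small state machine, sketched here (hypothetical function and action names; the real service may allow further transitions such as re-matching a rejected link):

```rust
#[derive(Debug, Clone, Copy, PartialEq)]
enum LinkStatus {
    Pending,
    Approved,
    Rejected,
}

/// Apply a workflow action to a metadata link. Only pending links can be
/// approved or rejected; everything else is an error.
fn apply_action(status: LinkStatus, action: &str) -> Result<LinkStatus, String> {
    match (status, action) {
        (LinkStatus::Pending, "approve") => Ok(LinkStatus::Approved),
        (LinkStatus::Pending, "reject") => Ok(LinkStatus::Rejected),
        (s, a) => Err(format!("cannot '{a}' a link in state {s:?}")),
    }
}
```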
### Batch Processing
- Auto-match all series in a library via `metadata_batch` job
- Configurable confidence threshold
- Result statuses: `auto_matched`, `no_results`, `too_many_results`, `low_confidence`, `already_linked`
### Metadata Refresh
- Update approved links with latest data from providers
- Change tracking reports per series/book
- Non-destructive: only updates when provider has new data
### Field Locking
- Individual book fields can be locked to prevent external sync from overwriting manual edits
---
## External Integrations
### Komga Sync
- Import reading progress from a Komga server
- Matches local series/books by name
- Detailed sync report: matched, already read, newly marked, unmatched
### Prowlarr (Indexer Search)
- Search Prowlarr for missing volumes in a series
- Volume pattern matching against release titles
- Results: title, size, seeders/leechers, download URL, matched missing volumes
### qBittorrent
- Add torrents directly from Prowlarr search results
- Connection test endpoint
---
## Notifications
### Telegram
- Real-time notifications via Telegram Bot API (`sendMessage` and `sendPhoto`)
- Configuration: bot token, chat ID, enable/disable toggle
- Test connection button in settings
### Granular Event Toggles
12 individually configurable notification events grouped by category:
| Category | Events |
|----------|--------|
| Scans | `scan_completed`, `scan_failed`, `scan_cancelled` |
| Thumbnails | `thumbnail_completed`, `thumbnail_failed` |
| Conversion | `conversion_completed`, `conversion_failed` |
| Metadata | `metadata_approved`, `metadata_batch_completed`, `metadata_batch_failed`, `metadata_refresh_completed`, `metadata_refresh_failed` |
### Thumbnail Images in Notifications
- Book cover thumbnails attached to applicable notifications (conversion, metadata approval)
- Uses `sendPhoto` multipart upload with fallback to text-only `sendMessage`
### Implementation
- Shared `crates/notifications` crate used by both API and indexer
- Fire-and-forget: notification failures are logged but never block the main operation
- Messages formatted in HTML with event-specific icons
---
## Page Rendering & Caching
### Page Extraction
- Render any page from supported archive formats
- 1-indexed page numbers
### Image Processing
- Output formats: original, JPEG, PNG, WebP
- Quality parameter (1–100)
- Max width parameter (1–2160 px)
- Configurable resampling filter: lanczos3, nearest, triangle/bilinear
- Concurrent render limit (default 8) with semaphore
### Caching
- **LRU in-memory cache**: 512 entries
- **Disk cache**: SHA256-keyed, two-level directory structure
- Cache key = hash(path + page + format + quality + width)
- Configurable cache directory and max size
- Manual cache clear via settings
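The key derivation and two-level layout can be illustrated like this (the real key is SHA-256; this dependency-free sketch substitutes std's `DefaultHasher` purely to show the fan-out):

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

/// Build a cache key from (path, page, format, quality, width) and place it
/// in a two-level directory layout "ab/cd/abcd…" so no single directory
/// grows huge. NOTE: DefaultHasher stands in for the real SHA-256 hash.
fn cache_path(path: &str, page: u32, format: &str, quality: u8, width: u32) -> String {
    let mut h = DefaultHasher::new();
    (path, page, format, quality, width).hash(&mut h);
    let hex = format!("{:016x}", h.finish());
    // First two hex pairs become the directory levels
    format!("{}/{}/{}", &hex[0..2], &hex[2..4], hex)
}
```

Because every rendering parameter is part of the key, changing quality or width produces a distinct cache entry rather than overwriting the old one.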
---
## Background Jobs
### Job Types
| Type | Description |
|------|-------------|
| `rebuild` | Incremental scan |
| `full_rebuild` | Full filesystem rescan |
| `rescan` | Deep rescan for new formats |
| `thumbnail_rebuild` | Generate missing thumbnails |
| `thumbnail_regenerate` | Clear and regenerate all thumbnails |
| `cbr_to_cbz` | Convert RAR to ZIP |
| `metadata_batch` | Auto-match series to metadata |
| `metadata_refresh` | Update approved metadata links |
### Job Lifecycle
- Status flow: `pending` → `running` → `success` | `failed` | `cancelled`
- Intermediate statuses: `extracting_pages`, `generating_thumbnails`
- Real-time progress via **Server-Sent Events** (SSE)
- Per-file error tracking (non-fatal: job continues on errors)
- Cancellation support for pending/running jobs
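A sketch of the allowed transitions (intermediate statuses folded into `Running`; this guard is an illustration, not the service's actual code):

```rust
#[derive(Debug, Clone, Copy, PartialEq)]
enum JobStatus {
    Pending,
    Running,
    Success,
    Failed,
    Cancelled,
}

/// Whether a job may move from one status to another, following the
/// documented flow. Terminal states (Success/Failed/Cancelled) have no exits.
fn can_transition(from: JobStatus, to: JobStatus) -> bool {
    use JobStatus::*;
    matches!(
        (from, to),
        (Pending, Running)
            | (Pending, Cancelled)
            | (Running, Success)
            | (Running, Failed)
            | (Running, Cancelled)
    )
}
```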
### Progress Tracking
- Percentage (0–100), current file, processed/total counts
- Timing: started_at, finished_at, phase2_started_at
- Stats JSON blob with job-specific metrics
---
## Authentication & Security
### Token System
- **Bootstrap token**: admin token via `API_BOOTSTRAP_TOKEN` env var
- **API tokens**: create, list, revoke with scopes
- Token format: `stl_{prefix}_{secret}` with Argon2 hashing
- Expiration dates, last usage tracking, revocation
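Token verification starts by splitting the documented format (a sketch; the prefix is then presumably looked up in the DB and the secret verified against its stored Argon2 hash, which is out of scope here):

```rust
/// Split a token of the form `stl_{prefix}_{secret}` into (prefix, secret).
/// Returns None for anything that doesn't match the format.
/// Hypothetical helper illustrating the format only.
fn parse_token(token: &str) -> Option<(&str, &str)> {
    let rest = token.strip_prefix("stl_")?;
    // The prefix contains no underscores, so the first '_' is the separator
    let (prefix, secret) = rest.split_once('_')?;
    if prefix.is_empty() || secret.is_empty() {
        return None;
    }
    Some((prefix, secret))
}
```

Keeping a short non-secret prefix in the token lets the server find the matching row without hashing, while only the Argon2 hash of the secret half is ever stored.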
### Access Control
- **Two scopes**: `admin` (full access) and `read` (read-only)
- Route-level middleware enforcement
- Rate limiting: configurable sliding window (default 120 req/s)
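A minimal sliding-window limiter illustrating the idea (the real middleware is presumably keyed per client and tuned differently):

```rust
use std::collections::VecDeque;
use std::time::{Duration, Instant};

/// Allow at most `limit` events per `window`, using the timestamps of the
/// most recent accepted events. Sketch only — not the production middleware.
struct SlidingWindow {
    window: Duration,
    limit: usize,
    hits: VecDeque<Instant>,
}

impl SlidingWindow {
    fn new(limit: usize, window: Duration) -> Self {
        Self { window, limit, hits: VecDeque::new() }
    }

    fn allow(&mut self, now: Instant) -> bool {
        // Evict hits that have fallen out of the window
        while self.hits.front().map_or(false, |&t| now - t >= self.window) {
            self.hits.pop_front();
        }
        if self.hits.len() < self.limit {
            self.hits.push_back(now);
            true
        } else {
            false
        }
    }
}
```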
---
## Backoffice (Web UI)
### Dashboard
- Statistics cards: books, series, authors, libraries, pages, total size
- Interactive charts (recharts): donut, area, stacked bar, horizontal bar
- Reading status breakdown, format distribution, library distribution
- Currently reading section with progress bars
- Recently read section with cover thumbnails
- Reading activity over time (area chart)
- Books added over time (area chart)
- Per-library stacked reading progress
- Top series by book count
- Metadata coverage and provider breakdown
### Pages
- **Libraries**: list, create, delete, configure monitoring and metadata provider
- **Books**: global list with filtering/sorting, detail view with metadata and page rendering
- **Series**: global list, per-library view, detail with metadata management
- **Authors**: list with book/series counts, detail with author's books
- **Jobs**: history, live progress via SSE, error details
- **Tokens**: create, list, revoke API tokens
- **Settings**: image processing, cache, thumbnails, external services (Prowlarr, qBittorrent), notifications (Telegram)
### Interactive Features
- Real-time search with suggestions
- Metadata search and matching modals
- Prowlarr search modal for missing volumes
- Folder browser/picker for library paths
- Book/series editing forms
- Quick reading status toggles
- CBR to CBZ conversion trigger
---
## API
### Documentation
- OpenAPI/Swagger UI available at `/swagger-ui`
- Health check (`/health`), readiness (`/ready`), Prometheus metrics (`/metrics`)
### Public Endpoints (no auth)
- `GET /health`, `GET /ready`, `GET /metrics`, `GET /swagger-ui`
### Read Endpoints (read scope)
- Libraries, books, series, authors listing and detail
- Book pages and thumbnails
- Reading progress get/update
- Full-text search, collection statistics
### Admin Endpoints (admin scope)
- Library CRUD and configuration
- Book metadata editing, CBR conversion
- Series metadata editing
- Indexing job management (trigger, cancel, stream)
- API token management
- Metadata operations (search, match, approve, reject, batch, refresh)
- External integrations (Prowlarr, qBittorrent, Komga)
- Application settings and cache management
---
## Database
### Key Design Decisions
- PostgreSQL with `pg_trgm` for full-text search (no external search engine)
- All deletions cascade from libraries
- Unique constraints: file paths, token prefixes, metadata links (library + series + provider)
- Directory mtime caching for incremental scan optimization
- Connection pool: 10 (API), 20 (indexer)
### Archive Resilience
- CBZ: fallback streaming reader if central directory corrupted
- CBR: RAR extraction via system `unar`, fallback to CBZ parsing
- PDF: `pdfinfo` for page count, `pdftoppm` for rendering
- EPUB: ZIP-based extraction
- FD exhaustion detection: aborts if too many consecutive IO errors


@@ -0,0 +1,4 @@
ALTER TABLE libraries
ADD COLUMN metadata_refresh_mode TEXT NOT NULL DEFAULT 'manual',
ADD COLUMN last_metadata_refresh_at TIMESTAMPTZ,
ADD COLUMN next_metadata_refresh_at TIMESTAMPTZ;


@@ -0,0 +1,10 @@
-- Add EPUB to allowed format values in book_files and books tables.
-- PostgreSQL CHECK constraints are dropped+recreated (no ALTER CONSTRAINT).
-- book_files.format
ALTER TABLE book_files DROP CONSTRAINT IF EXISTS book_files_format_check;
ALTER TABLE book_files ADD CONSTRAINT book_files_format_check CHECK (format IN ('pdf', 'cbz', 'cbr', 'epub'));
-- books.format (denormalized column added in 0020)
ALTER TABLE books DROP CONSTRAINT IF EXISTS books_format_check;
ALTER TABLE books ADD CONSTRAINT books_format_check CHECK (format IN ('pdf', 'cbz', 'cbr', 'epub'));


@@ -0,0 +1,7 @@
-- Add rescan job type: clears directory mtimes to force re-walking all directories
-- while preserving existing data (unlike full_rebuild which deletes everything).
-- Useful for discovering newly supported formats (e.g. EPUB) without losing metadata.
ALTER TABLE index_jobs
DROP CONSTRAINT IF EXISTS index_jobs_type_check,
ADD CONSTRAINT index_jobs_type_check
CHECK (type IN ('scan', 'rebuild', 'full_rebuild', 'rescan', 'thumbnail_rebuild', 'thumbnail_regenerate', 'cbr_to_cbz', 'metadata_batch', 'metadata_refresh'));


@@ -0,0 +1,3 @@
INSERT INTO app_settings (key, value) VALUES
('telegram', '{"bot_token": "", "chat_id": "", "enabled": false, "events": {"job_completed": true, "job_failed": true, "job_cancelled": true, "metadata_approved": true}}')
ON CONFLICT DO NOTHING;


@@ -0,0 +1,8 @@
-- Update telegram events from 4 generic toggles to 12 granular toggles
UPDATE app_settings
SET value = jsonb_set(
value,
'{events}',
'{"scan_completed": true, "scan_failed": true, "scan_cancelled": true, "thumbnail_completed": true, "thumbnail_failed": true, "conversion_completed": true, "conversion_failed": true, "metadata_approved": true, "metadata_batch_completed": true, "metadata_batch_failed": true, "metadata_refresh_completed": true, "metadata_refresh_failed": true}'::jsonb
)
WHERE key = 'telegram';