Compare commits
4 commits, 24516f1069...bd74c9e3e3:

| SHA1 |
|---|
| bd74c9e3e3 |
| 41228430cf |
| 6a4ba06fac |
| e5c3542d3f |
Cargo.lock — 8 changed lines (generated):

@@ -64,7 +64,7 @@ checksum = "7f202df86484c868dbad7eaa557ef785d5c66295e41b460ef922eca0723b842c"
 
 [[package]]
 name = "api"
-version = "1.21.1"
+version = "1.21.2"
 dependencies = [
  "anyhow",
  "argon2",
@@ -1232,7 +1232,7 @@ dependencies = [
 
 [[package]]
 name = "indexer"
-version = "1.21.1"
+version = "1.21.2"
 dependencies = [
  "anyhow",
  "axum",
@@ -1771,7 +1771,7 @@ dependencies = [
 
 [[package]]
 name = "parsers"
-version = "1.21.1"
+version = "1.21.2"
 dependencies = [
  "anyhow",
  "flate2",
@@ -2906,7 +2906,7 @@ dependencies = [
 
 [[package]]
 name = "stripstream-core"
-version = "1.21.1"
+version = "1.21.2"
 dependencies = [
  "anyhow",
  "serde",

Cargo.toml (workspace manifest):

@@ -9,7 +9,7 @@ resolver = "2"
 
 [workspace.package]
 edition = "2021"
-version = "1.21.1"
+version = "1.21.2"
 license = "MIT"
 
 [workspace.dependencies]
README.md — 68 changed lines:

@@ -81,28 +81,58 @@ The backoffice will be available at http://localhost:7082
 
 ## Features
 
-### Libraries Management
-- Create and manage multiple libraries
-- Configure automatic scanning schedules (hourly, daily, weekly)
-- Real-time file watcher for instant indexing
-- Full and incremental rebuild options
+> For the full feature list, business rules, and API details, see [docs/FEATURES.md](docs/FEATURES.md).
 
-### Books Management
-- Support for CBZ, CBR, and PDF formats
-- Automatic metadata extraction
-- Series and volume detection
-- Full-text search powered by PostgreSQL
+### Libraries
+- Multi-library management with per-library configuration
+- Incremental and full scanning, real-time filesystem watcher
+- Per-library metadata provider selection (Google Books, ComicVine, BedéThèque, AniList, Open Library)
 
-### Jobs Monitoring
-- Real-time job progress tracking
-- Detailed statistics (scanned, indexed, removed, errors)
-- Job history and logs
-- Cancel pending jobs
+### Books & Series
+- **Formats**: CBZ, CBR, PDF, EPUB
+- Automatic metadata extraction (title, series, volume, authors, page count) from filenames and directory structure
+- Series aggregation with missing volume detection
+- Thumbnail generation (WebP/JPEG/PNG) with lazy generation and bulk rebuild
+- CBR → CBZ conversion
 
-### Search
-- Full-text search across titles, authors, and series
-- Library filtering
-- Real-time suggestions
+### Reading Progress
+- Per-book tracking: unread / reading / read with current page
+- Series-level aggregated reading status
+- Bulk mark-as-read for series
 
+### Search & Discovery
+- Full-text search across titles, authors, and series (PostgreSQL `pg_trgm`)
+- Author listing with book/series counts
+- Filtering by reading status, series status, format, metadata provider
+
+### External Metadata
+- Search, match, approve/reject workflow with confidence scoring
+- Batch auto-matching and scheduled metadata refresh
+- Field locking to protect manual edits from sync
+
+### External Integrations
+- **Komga**: import reading progress
+- **Prowlarr**: search for missing volumes
+- **qBittorrent**: add torrents directly from search results
+
+### Background Jobs
+- Rebuild, rescan, thumbnail generation, metadata batch, CBR conversion
+- Real-time progress via Server-Sent Events (SSE)
+- Job history, error tracking, cancellation
+
+### Page Rendering
+- On-demand page extraction from all formats
+- Image processing (format, quality, max width, resampling filter)
+- LRU in-memory + disk cache
+
+### Security
+- Token-based auth (`admin` / `read` scopes) with Argon2 hashing
+- Rate limiting, token expiration and revocation
+
+### Web UI (Backoffice)
+- Dashboard with statistics, charts, and reading progress
+- Library, book, series, author management
+- Live job monitoring, metadata search modals, settings panel
 
 ## Environment Variables
apps/api/src/authors.rs:

@@ -68,7 +68,7 @@ pub async fn list_authors(
         .filter(|s| !s.trim().is_empty())
         .map(|s| format!("%{s}%"));
 
-    // Aggregate unique authors from books.authors + books.author
+    // Aggregate unique authors from books.authors + books.author + series_metadata.authors
     let sql = format!(
         r#"
         WITH all_authors AS (
@@ -79,18 +79,21 @@ pub async fn list_authors(
             )
           ) AS name
           FROM books
+          UNION
+          SELECT DISTINCT UNNEST(authors) AS name
+          FROM series_metadata
+          WHERE authors != '{{}}'
         ),
         filtered AS (
           SELECT name FROM all_authors
           WHERE ($1::text IS NULL OR name ILIKE $1)
         ),
-        counted AS (
+        book_counts AS (
           SELECT
-            f.name,
-            COUNT(DISTINCT b.id) AS book_count,
-            COUNT(DISTINCT NULLIF(b.series, '')) AS series_count
+            f.name AS author_name,
+            COUNT(DISTINCT b.id) AS book_count
           FROM filtered f
-          JOIN books b ON (
+          LEFT JOIN books b ON (
             f.name = ANY(
               COALESCE(
                 NULLIF(b.authors, '{{}}'),
@@ -99,9 +102,24 @@ pub async fn list_authors(
               )
             )
           )
           GROUP BY f.name
+        ),
+        series_counts AS (
+          SELECT
+            f.name AS author_name,
+            COUNT(DISTINCT (sm.library_id, sm.name)) AS series_count
+          FROM filtered f
+          LEFT JOIN series_metadata sm ON (
+            f.name = ANY(sm.authors) AND sm.authors != '{{}}'
+          )
+          GROUP BY f.name
         )
-        SELECT name, book_count, series_count
-        FROM counted
+        SELECT
+          f.name,
+          COALESCE(bc.book_count, 0) AS book_count,
+          COALESCE(sc.series_count, 0) AS series_count
+        FROM filtered f
+        LEFT JOIN book_counts bc ON bc.author_name = f.name
+        LEFT JOIN series_counts sc ON sc.author_name = f.name
         ORDER BY {order_clause}
         LIMIT $2 OFFSET $3
         "#
@@ -116,6 +134,10 @@ pub async fn list_authors(
             )
           ) AS name
           FROM books
+          UNION
+          SELECT DISTINCT UNNEST(authors) AS name
+          FROM series_metadata
+          WHERE authors != '{}'
         )
         SELECT COUNT(*) AS total
         FROM all_authors
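The query above splits the old `counted` CTE into independent `book_counts` and `series_counts` CTEs and merges them with `COALESCE(..., 0)` defaults, so an author present on only one side still gets a row. A minimal Rust sketch of that merge step, with illustrative data and names (not the project's actual types):

```rust
use std::collections::HashMap;

// Per-author counts are computed independently (like the book_counts /
// series_counts CTEs), then merged with COALESCE-style zero defaults.
fn merge_counts(
    authors: &[&str],
    book_counts: &HashMap<&str, u64>,
    series_counts: &HashMap<&str, u64>,
) -> Vec<(String, u64, u64)> {
    authors
        .iter()
        .map(|a| {
            (
                a.to_string(),
                *book_counts.get(a).unwrap_or(&0),   // COALESCE(bc.book_count, 0)
                *series_counts.get(a).unwrap_or(&0), // COALESCE(sc.series_count, 0)
            )
        })
        .collect()
}

fn main() {
    let mut books = HashMap::new();
    books.insert("Hergé", 24u64);
    let mut series = HashMap::new();
    // An author that appears only in series_metadata would be dropped by a
    // plain inner JOIN on books; the LEFT JOIN + COALESCE form keeps it.
    series.insert("Moebius", 3u64);
    let rows = merge_counts(&["Hergé", "Moebius"], &books, &series);
    assert_eq!(rows[1], ("Moebius".to_string(), 0, 3));
}
```

Counting books and series in separate CTEs also avoids the row multiplication that a single query joining both `books` and `series_metadata` would feed into the `COUNT(DISTINCT ...)` aggregates.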
File diff suppressed because it is too large.
apps/api/src/libraries.rs:

@@ -48,7 +48,6 @@ pub struct CreateLibraryRequest {
     responses(
         (status = 200, body = Vec<LibraryResponse>),
         (status = 401, description = "Unauthorized"),
-        (status = 403, description = "Forbidden - Admin scope required"),
     ),
     security(("Bearer" = []))
 )]
@@ -221,7 +220,6 @@ use crate::index_jobs::{IndexJobResponse, RebuildRequest};
         (status = 200, body = IndexJobResponse),
         (status = 404, description = "Library not found"),
         (status = 401, description = "Unauthorized"),
-        (status = 403, description = "Forbidden - Admin scope required"),
     ),
     security(("Bearer" = []))
 )]
apps/api/src/main.rs:

@@ -17,6 +17,7 @@ mod prowlarr;
 mod qbittorrent;
 mod reading_progress;
 mod search;
+mod series;
 mod settings;
 mod state;
 mod stats;
@@ -86,14 +87,13 @@ async fn main() -> anyhow::Result<()> {
     };
 
     let admin_routes = Router::new()
-        .route("/libraries", get(libraries::list_libraries).post(libraries::create_library))
+        .route("/libraries", axum::routing::post(libraries::create_library))
         .route("/libraries/:id", delete(libraries::delete_library))
-        .route("/libraries/:id/scan", axum::routing::post(libraries::scan_library))
         .route("/libraries/:id/monitoring", axum::routing::patch(libraries::update_monitoring))
         .route("/libraries/:id/metadata-provider", axum::routing::patch(libraries::update_metadata_provider))
         .route("/books/:id", axum::routing::patch(books::update_book))
         .route("/books/:id/convert", axum::routing::post(books::convert_book))
-        .route("/libraries/:library_id/series/:name", axum::routing::patch(books::update_series))
+        .route("/libraries/:library_id/series/:name", axum::routing::patch(series::update_series))
         .route("/index/rebuild", axum::routing::post(index_jobs::enqueue_rebuild))
         .route("/index/thumbnails/rebuild", axum::routing::post(thumbnails::start_thumbnails_rebuild))
         .route("/index/thumbnails/regenerate", axum::routing::post(thumbnails::start_thumbnails_regenerate))
@@ -133,18 +133,20 @@ async fn main() -> anyhow::Result<()> {
     ));
 
     let read_routes = Router::new()
+        .route("/libraries", get(libraries::list_libraries))
+        .route("/libraries/:id/scan", axum::routing::post(libraries::scan_library))
         .route("/books", get(books::list_books))
-        .route("/books/ongoing", get(books::ongoing_books))
+        .route("/books/ongoing", get(series::ongoing_books))
         .route("/books/:id", get(books::get_book))
         .route("/books/:id/thumbnail", get(books::get_thumbnail))
         .route("/books/:id/pages/:n", get(pages::get_page))
         .route("/books/:id/progress", get(reading_progress::get_reading_progress).patch(reading_progress::update_reading_progress))
-        .route("/libraries/:library_id/series", get(books::list_series))
+        .route("/libraries/:library_id/series", get(series::list_series))
-        .route("/libraries/:library_id/series/:name/metadata", get(books::get_series_metadata))
+        .route("/libraries/:library_id/series/:name/metadata", get(series::get_series_metadata))
-        .route("/series", get(books::list_all_series))
+        .route("/series", get(series::list_all_series))
-        .route("/series/ongoing", get(books::ongoing_series))
+        .route("/series/ongoing", get(series::ongoing_series))
-        .route("/series/statuses", get(books::series_statuses))
+        .route("/series/statuses", get(series::series_statuses))
-        .route("/series/provider-statuses", get(books::provider_statuses))
+        .route("/series/provider-statuses", get(series::provider_statuses))
         .route("/series/mark-read", axum::routing::post(reading_progress::mark_series_read))
         .route("/authors", get(authors::list_authors))
         .route("/stats", get(stats::get_stats))
OpenAPI definition (use utoipa::OpenApi):

@@ -10,14 +10,14 @@ use utoipa::OpenApi;
        crate::reading_progress::update_reading_progress,
        crate::reading_progress::mark_series_read,
        crate::books::get_thumbnail,
-        crate::books::list_series,
+        crate::series::list_series,
-        crate::books::list_all_series,
+        crate::series::list_all_series,
-        crate::books::ongoing_series,
+        crate::series::ongoing_series,
-        crate::books::ongoing_books,
+        crate::series::ongoing_books,
        crate::books::convert_book,
        crate::books::update_book,
-        crate::books::get_series_metadata,
+        crate::series::get_series_metadata,
-        crate::books::update_series,
+        crate::series::update_series,
        crate::pages::get_page,
        crate::search::search_books,
        crate::index_jobs::enqueue_rebuild,
@@ -35,6 +35,7 @@ use utoipa::OpenApi;
        crate::libraries::delete_library,
        crate::libraries::scan_library,
        crate::libraries::update_monitoring,
+        crate::libraries::update_metadata_provider,
        crate::tokens::list_tokens,
        crate::tokens::create_token,
        crate::tokens::revoke_token,
@@ -54,8 +55,8 @@ use utoipa::OpenApi;
        crate::metadata::get_metadata_links,
        crate::metadata::get_missing_books,
        crate::metadata::delete_metadata_link,
-        crate::books::series_statuses,
+        crate::series::series_statuses,
-        crate::books::provider_statuses,
+        crate::series::provider_statuses,
        crate::settings::list_status_mappings,
        crate::settings::upsert_status_mapping,
        crate::settings::delete_status_mapping,
@@ -63,6 +64,14 @@ use utoipa::OpenApi;
        crate::prowlarr::test_prowlarr,
        crate::qbittorrent::add_torrent,
        crate::qbittorrent::test_qbittorrent,
+        crate::metadata_batch::start_batch,
+        crate::metadata_batch::get_batch_report,
+        crate::metadata_batch::get_batch_results,
+        crate::metadata_refresh::start_refresh,
+        crate::metadata_refresh::get_refresh_report,
+        crate::komga::sync_komga_read_books,
+        crate::komga::list_sync_reports,
+        crate::komga::get_sync_report,
    ),
    components(
        schemas(
@@ -74,14 +83,14 @@ use utoipa::OpenApi;
            crate::reading_progress::UpdateReadingProgressRequest,
            crate::reading_progress::MarkSeriesReadRequest,
            crate::reading_progress::MarkSeriesReadResponse,
-            crate::books::SeriesItem,
+            crate::series::SeriesItem,
-            crate::books::SeriesPage,
+            crate::series::SeriesPage,
-            crate::books::ListAllSeriesQuery,
+            crate::series::ListAllSeriesQuery,
-            crate::books::OngoingQuery,
+            crate::series::OngoingQuery,
            crate::books::UpdateBookRequest,
-            crate::books::SeriesMetadata,
+            crate::series::SeriesMetadata,
-            crate::books::UpdateSeriesRequest,
+            crate::series::UpdateSeriesRequest,
-            crate::books::UpdateSeriesResponse,
+            crate::series::UpdateSeriesResponse,
            crate::pages::PageQuery,
            crate::search::SearchQuery,
            crate::search::SearchResponse,
@@ -96,6 +105,7 @@ use utoipa::OpenApi;
            crate::libraries::LibraryResponse,
            crate::libraries::CreateLibraryRequest,
            crate::libraries::UpdateMonitoringRequest,
+            crate::libraries::UpdateMetadataProviderRequest,
            crate::tokens::CreateTokenRequest,
            crate::tokens::TokenResponse,
            crate::tokens::CreatedTokenResponse,
@@ -137,7 +147,16 @@ use utoipa::OpenApi;
            crate::prowlarr::ProwlarrRelease,
            crate::prowlarr::ProwlarrCategory,
            crate::prowlarr::ProwlarrSearchResponse,
+            crate::prowlarr::MissingVolumeInput,
            crate::prowlarr::ProwlarrTestResponse,
+            crate::metadata_batch::MetadataBatchRequest,
+            crate::metadata_batch::MetadataBatchReportDto,
+            crate::metadata_batch::MetadataBatchResultDto,
+            crate::metadata_refresh::MetadataRefreshRequest,
+            crate::metadata_refresh::MetadataRefreshReportDto,
+            crate::komga::KomgaSyncRequest,
+            crate::komga::KomgaSyncResponse,
+            crate::komga::KomgaSyncReportSummary,
            ErrorResponse,
        )
    ),
@@ -145,11 +164,16 @@ use utoipa::OpenApi;
        ("Bearer" = [])
    ),
    tags(
-        (name = "authors", description = "Author browsing and listing"),
+        (name = "books", description = "Book browsing, details and management"),
-        (name = "books", description = "Read-only endpoints for browsing and searching books"),
+        (name = "series", description = "Series browsing, filtering and management"),
+        (name = "search", description = "Full-text search across books and series"),
        (name = "reading-progress", description = "Reading progress tracking per book"),
-        (name = "libraries", description = "Library management endpoints (Admin only)"),
+        (name = "authors", description = "Author browsing and listing"),
+        (name = "stats", description = "Collection statistics and dashboard data"),
+        (name = "libraries", description = "Library listing, scanning, and management (create/delete/settings: Admin only)"),
        (name = "indexing", description = "Search index management and job control (Admin only)"),
+        (name = "metadata", description = "External metadata providers and matching (Admin only)"),
+        (name = "komga", description = "Komga read-status sync (Admin only)"),
        (name = "tokens", description = "API token management (Admin only)"),
        (name = "settings", description = "Application settings and cache management (Admin only)"),
        (name = "prowlarr", description = "Prowlarr indexer integration (Admin only)"),
apps/api/src/search.rs:

@@ -43,7 +43,7 @@ pub struct SearchResponse {
 #[utoipa::path(
     get,
     path = "/search",
-    tag = "books",
+    tag = "search",
     params(
         ("q" = String, Query, description = "Search query (books + series via PostgreSQL full-text)"),
         ("library_id" = Option<String>, Query, description = "Filter by library ID"),
apps/api/src/series.rs — 1028 lines (new file); file diff suppressed because it is too large.
apps/api/src/stats.rs:

@@ -90,7 +90,7 @@ pub struct StatsResponse {
 #[utoipa::path(
     get,
     path = "/stats",
-    tag = "books",
+    tag = "stats",
     responses(
         (status = 200, body = StatsResponse),
         (status = 401, description = "Unauthorized"),
Backoffice author detail page (AuthorDetailPage):

@@ -21,26 +21,19 @@ export default async function AuthorDetailPage({
   const page = typeof searchParamsAwaited.page === "string" ? parseInt(searchParamsAwaited.page) : 1;
   const limit = typeof searchParamsAwaited.limit === "string" ? parseInt(searchParamsAwaited.limit) : 20;
 
-  // Fetch books by this author (server-side filtering via API) and series
+  // Fetch books by this author (server-side filtering via API) and series by this author
   const [booksPage, seriesPage] = await Promise.all([
     fetchBooks(undefined, undefined, page, limit, undefined, undefined, authorName).catch(
       () => ({ items: [], total: 0, page: 1, limit }) as BooksPageDto
     ),
-    fetchAllSeries(undefined, undefined, undefined, 1, 200).catch(
+    fetchAllSeries(undefined, undefined, undefined, 1, 200, undefined, undefined, undefined, undefined, authorName).catch(
       () => ({ items: [], total: 0, page: 1, limit: 200 }) as SeriesPageDto
     ),
   ]);
 
   const totalPages = Math.ceil(booksPage.total / limit);
 
-  // Extract unique series names from this author's books
-  const authorSeriesNames = new Set(
-    booksPage.items
-      .map((b) => b.series)
-      .filter((s): s is string => s != null && s !== "")
-  );
-
-  const authorSeries = seriesPage.items.filter((s) => authorSeriesNames.has(s.name));
+  const authorSeries = seriesPage.items;
 
   return (
     <>
Backoffice API client (fetchAllSeries):

@@ -342,6 +342,7 @@ export async function fetchAllSeries(
   seriesStatus?: string,
   hasMissing?: boolean,
   metadataProvider?: string,
+  author?: string,
 ): Promise<SeriesPageDto> {
   const params = new URLSearchParams();
   if (libraryId) params.set("library_id", libraryId);
@@ -351,6 +352,7 @@ export async function fetchAllSeries(
   if (seriesStatus) params.set("series_status", seriesStatus);
   if (hasMissing) params.set("has_missing", "true");
   if (metadataProvider) params.set("metadata_provider", metadataProvider);
+  if (author) params.set("author", author);
   params.set("page", page.toString());
   params.set("limit", limit.toString());
package.json (stripstream-backoffice):

@@ -1,6 +1,6 @@
 {
   "name": "stripstream-backoffice",
-  "version": "1.21.1",
+  "version": "1.21.2",
   "private": true,
   "scripts": {
     "dev": "next dev -p 7082",
docs/FEATURES.md — 310 lines (new file):

@@ -0,0 +1,310 @@
# Stripstream Librarian — Features & Business Rules

## Libraries

### Multi-Library Management
- Create and manage multiple independent libraries, each with its own root path
- Enable/disable libraries individually
- Deleting a library cascades to all of its books, jobs, and metadata

### Scanning & Indexing
- **Incremental scan**: uses directory mtime tracking to skip unchanged directories
- **Full rebuild**: force re-walk of all directories, ignoring cached mtimes
- **Rescan**: deep rescan to discover newly supported formats
- **Two-phase pipeline**:
  - Phase 1 (Discovery): fast filename-based metadata extraction (no archive I/O)
  - Phase 2 (Analysis): extract page counts and the first page image from archives

### Real-Time Monitoring
- **Automatic periodic scanning**: configurable interval (default 5 seconds)
- **Filesystem watcher**: real-time detection of file changes for instant indexing
- Each can be toggled per library (`monitor_enabled`, `watcher_enabled`)
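The incremental-scan rule above can be sketched as a pure function: a directory is re-walked only when its current mtime differs from the cached one, unless a full rebuild bypasses the cache. The names and shapes here are illustrative, not the indexer's actual types:

```rust
use std::collections::HashMap;

// Decide which directories need re-walking, given cached mtimes.
fn dirs_to_walk<'a>(
    dirs: &[(&'a str, u64)],     // (path, current mtime), illustrative
    cache: &HashMap<&str, u64>,  // last seen mtime per directory
    full_rebuild: bool,
) -> Vec<&'a str> {
    dirs.iter()
        .filter(|&&(path, mtime)| full_rebuild || cache.get(path) != Some(&mtime))
        .map(|&(path, _)| path)
        .collect()
}

fn main() {
    let mut cache = HashMap::new();
    cache.insert("a", 1u64);
    // Incremental: "a" is unchanged and skipped; "b" is new.
    assert_eq!(dirs_to_walk(&[("a", 1), ("b", 2)], &cache, false), vec!["b"]);
    // Full rebuild ignores the cache entirely.
    assert_eq!(dirs_to_walk(&[("a", 1), ("b", 2)], &cache, true), vec!["a", "b"]);
}
```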
---

## Books

### Format Support
- **CBZ** (ZIP-based comic archives)
- **CBR** (RAR-based comic archives)
- **PDF**
- **EPUB**
- Automatic format detection from file extension and magic bytes

### Metadata Extraction
- **Title**: derived from the filename or external metadata
- **Series**: derived from the directory structure (first directory level under the library root)
- **Volume**: extracted from the filename with pattern detection:
  - `T##` (Tome) — most common for French comics
  - `Vol.##`, `Vol ##`, `Volume ##`
  - `###` (standalone number)
  - `-## ` (dash-separated)
- **Author(s)**: single scalar and array support
- **Page count**: extracted during archive analysis
- **Language**, **kind** (ebook, comic, bd)
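The volume patterns listed above can be sketched with a stdlib-only parser. The real parsers crate's tokenization and precedence rules are not shown in this diff, so the ordering here (prefixed forms before a bare trailing number) is an assumption:

```rust
// Illustrative sketch of filename volume detection: T##, Vol.##/Vol ##/
// Volume ##, falling back to a bare number. Not the project's parser.
fn parse_volume(stem: &str) -> Option<u32> {
    let tokens: Vec<&str> = stem
        .split(|c: char| c == ' ' || c == '-' || c == '_')
        .filter(|t| !t.is_empty())
        .collect();
    // Prefixed forms take priority over a bare number.
    for t in &tokens {
        for prefix in ["T", "Vol.", "Volume", "Vol"] {
            if let Some(rest) = t.strip_prefix(prefix) {
                if !rest.is_empty() && rest.chars().all(|c| c.is_ascii_digit()) {
                    return rest.parse().ok();
                }
            }
        }
    }
    // Last resort: the last purely numeric token.
    tokens
        .iter()
        .rev()
        .find(|t| t.chars().all(|c| c.is_ascii_digit()))
        .and_then(|t| t.parse().ok())
}

fn main() {
    assert_eq!(parse_volume("Asterix T05"), Some(5));
    assert_eq!(parse_volume("Naruto Vol.12"), Some(12));
    assert_eq!(parse_volume("One Piece 103"), Some(103));
    assert_eq!(parse_volume("Standalone Book"), None);
}
```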
### Thumbnails
- Generated from the first page of each archive
- Output format configurable: WebP (default), JPEG, PNG
- Configurable dimensions (default 300×400)
- Lazy generation: created on first access if missing
- Bulk operations: rebuild missing or regenerate all

### CBR to CBZ Conversion
- Converts RAR archives to ZIP format
- Tracked as a background job with progress

---

## Series

### Automatic Aggregation
- Series are derived from the directory structure during scanning
- Books without a series are grouped as "unclassified"

### Series Metadata
- Description, publisher, start year, status (`ongoing`, `ended`, `completed`, `on_hold`, `hiatus`)
- Total volume count (from external providers)
- Authors (aggregated from books or metadata)

### Filtering & Discovery
- Filter by: series name (partial match), reading status, series status, metadata provider linkage
- Sort by: name, reading status, book count
- **Missing books detection**: identifies gaps in volume numbering within a series

---
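The missing-books rule above ("gaps in volume numbering within a series") amounts to finding the holes between the lowest and highest owned volume. A minimal sketch:

```rust
// Gap detection between the lowest and highest owned volume numbers.
// The real rules (e.g. comparing against a provider's total volume
// count) may differ; this only illustrates the in-series gap check.
fn missing_volumes(mut owned: Vec<u32>) -> Vec<u32> {
    owned.sort_unstable();
    owned.dedup();
    match (owned.first(), owned.last()) {
        (Some(&lo), Some(&hi)) => (lo..=hi).filter(|v| !owned.contains(v)).collect(),
        _ => Vec::new(),
    }
}

fn main() {
    assert_eq!(missing_volumes(vec![1, 2, 5]), vec![3, 4]);
    assert_eq!(missing_volumes(vec![]), Vec::<u32>::new());
}
```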
## Reading Progress

### Per-Book Tracking
- Three states: `unread` (default), `reading`, `read`
- Current page tracking while status is `reading`
- `last_read_at` timestamp auto-updated

### Series-Level Status
- Calculated from book statuses:
  - All read → series `read`
  - None read → series `unread`
  - Mixed → series `reading`

### Bulk Operations
- Mark an entire series as read (updates all of its books)

---
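The series-level aggregation rule above can be written directly as a pure function (the enum is illustrative, not the project's actual type; how an empty series is reported is an assumption):

```rust
#[derive(PartialEq, Clone, Copy, Debug)]
enum BookStatus { Unread, Reading, Read }

// All read → read; none read → unread; anything mixed → reading.
// An empty series trivially satisfies "all read" here.
fn series_status(books: &[BookStatus]) -> BookStatus {
    if books.iter().all(|s| *s == BookStatus::Read) {
        BookStatus::Read
    } else if books.iter().all(|s| *s == BookStatus::Unread) {
        BookStatus::Unread
    } else {
        BookStatus::Reading
    }
}

fn main() {
    assert_eq!(series_status(&[BookStatus::Read, BookStatus::Read]), BookStatus::Read);
    assert_eq!(series_status(&[BookStatus::Read, BookStatus::Unread]), BookStatus::Reading);
    assert_eq!(series_status(&[BookStatus::Unread, BookStatus::Unread]), BookStatus::Unread);
}
```

Note that a series containing only `reading` books also falls into the "mixed" branch, which matches the documented rule: it is neither all read nor none read.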
## Search & Discovery

### Full-Text Search
- PostgreSQL-based (`ILIKE` + `pg_trgm`)
- Searches across book titles, series names, authors (scalar and array fields), and series metadata authors
- Case-insensitive partial matching
- Library-scoped filtering

### Results
- Book hits: title, authors, series, volume, language, kind
- Series hits: name, book count, read count, first book (for linking)
- Processing time included in the response

---

## Authors

- Unique author aggregation from books and series metadata
- Per-author book and series counts
- Searchable by name (partial match)
- Sortable by name or book count

---
|
## External Metadata
|
||||||
|
|
||||||
|
### Supported Providers
|
||||||
|
| Provider | Focus |
|
||||||
|
|----------|-------|
|
||||||
|
| Google Books | General books (default fallback) |
|
||||||
|
| ComicVine | Comics |
|
||||||
|
| BedéThèque | Franco-Belgian comics |
|
||||||
|
| AniList | Manga/anime |
|
||||||
|
| Open Library | General books |
|
||||||
|
|
||||||
|
### Provider Configuration

- Global default provider with library-level override
- Fallback provider if primary is unavailable

### Matching Workflow

1. **Search**: query a provider, get candidates with confidence scores
2. **Match**: link a series to an external result (status `pending`)
3. **Approve**: validate and sync metadata to series and books
4. **Reject**: discard a match

### Batch Processing

- Auto-match all series in a library via `metadata_batch` job
- Configurable confidence threshold
- Result statuses: `auto_matched`, `no_results`, `too_many_results`, `low_confidence`, `already_linked`

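
The result statuses suggest a per-series decision of roughly this shape; a hypothetical sketch (the actual threshold, candidate cap, and tie-breaking in the batch job may differ):

```rust
#[derive(Debug, PartialEq)]
enum BatchOutcome {
    AutoMatched,
    NoResults,
    TooManyResults,
    LowConfidence,
    AlreadyLinked,
}

/// Classify one series' provider candidates against a confidence threshold.
fn classify(already_linked: bool, scores: &[f32], threshold: f32, max_candidates: usize) -> BatchOutcome {
    if already_linked {
        return BatchOutcome::AlreadyLinked; // skip series with an existing link
    }
    if scores.is_empty() {
        return BatchOutcome::NoResults;
    }
    if scores.len() > max_candidates {
        return BatchOutcome::TooManyResults; // too ambiguous to auto-match
    }
    let best = scores.iter().fold(f32::MIN, |a, &b| a.max(b));
    if best >= threshold {
        BatchOutcome::AutoMatched
    } else {
        BatchOutcome::LowConfidence
    }
}
```
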

### Metadata Refresh

- Update approved links with latest data from providers
- Change tracking reports per series/book
- Non-destructive: only updates when provider has new data

### Field Locking

- Individual book fields can be locked to prevent external sync from overwriting manual edits

---

## External Integrations

### Komga Sync

- Import reading progress from a Komga server
- Matches local series/books by name
- Detailed sync report: matched, already read, newly marked, unmatched

### Prowlarr (Indexer Search)

- Search Prowlarr for missing volumes in a series
- Volume pattern matching against release titles
- Results: title, size, seeders/leechers, download URL, matched missing volumes

### qBittorrent

- Add torrents directly from Prowlarr search results
- Connection test endpoint

---

## Page Rendering & Caching

### Page Extraction

- Render any page from supported archive formats
- 1-indexed page numbers

### Image Processing

- Output formats: original, JPEG, PNG, WebP
- Quality parameter (1–100)
- Max width parameter (1–2160 px)
- Configurable resampling filter: lanczos3, nearest, triangle/bilinear
- Concurrent render limit (default 8) with semaphore

### Caching

- **LRU in-memory cache**: 512 entries
- **Disk cache**: SHA256-keyed, two-level directory structure
- Cache key = hash(path + page + format + quality + width)
- Configurable cache directory and max size
- Manual cache clear via settings

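
With a two-level layout, a digest beginning `abcd…` lands under `ab/cd/`; a sketch of the path mapping, taking the hex digest as input (computing the SHA-256 of the key material, e.g. via the `sha2` crate, is omitted, and `cache_path` is a hypothetical name):

```rust
use std::path::PathBuf;

/// Map a hex cache key (the SHA-256 of path + page + format + quality +
/// width) to a two-level directory layout: `<root>/ab/cd/abcdef….webp`.
fn cache_path(root: &str, digest_hex: &str, ext: &str) -> PathBuf {
    // First two hex pairs pick the two directory levels, which keeps any
    // single directory from accumulating too many entries.
    let level1 = &digest_hex[..2];
    let level2 = &digest_hex[2..4];
    PathBuf::from(root)
        .join(level1)
        .join(level2)
        .join(format!("{digest_hex}.{ext}"))
}
```
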
---

## Background Jobs

### Job Types

| Type | Description |
|------|-------------|
| `rebuild` | Incremental scan |
| `full_rebuild` | Full filesystem rescan |
| `rescan` | Deep rescan for new formats |
| `thumbnail_rebuild` | Generate missing thumbnails |
| `thumbnail_regenerate` | Clear and regenerate all thumbnails |
| `cbr_to_cbz` | Convert RAR to ZIP |
| `metadata_batch` | Auto-match series to metadata |
| `metadata_refresh` | Update approved metadata links |

### Job Lifecycle

- Status flow: `pending` → `running` → `success` | `failed` | `cancelled`
- Intermediate statuses: `extracting_pages`, `generating_thumbnails`
- Real-time progress via **Server-Sent Events** (SSE)
- Per-file error tracking (non-fatal: job continues on errors)
- Cancellation support for pending/running jobs

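
The flow above implies a small transition table; a hypothetical sketch in which the intermediate scan statuses are treated as sub-states of `running` (which may not match the actual state machine):

```rust
/// Whether a job may move from one status to another, per the flow above.
fn can_transition(from: &str, to: &str) -> bool {
    matches!(
        (from, to),
        ("pending", "running" | "cancelled")
            // A running job may enter an intermediate phase...
            | ("running", "extracting_pages" | "generating_thumbnails")
            // ...and any active state may terminate.
            | (
                "running" | "extracting_pages" | "generating_thumbnails",
                "success" | "failed" | "cancelled"
            )
    )
}
```
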

### Progress Tracking

- Percentage (0–100), current file, processed/total counts
- Timing: `started_at`, `finished_at`, `phase2_started_at`
- Stats JSON blob with job-specific metrics

---

## Authentication & Security

### Token System

- **Bootstrap token**: admin token via `API_BOOTSTRAP_TOKEN` env var
- **API tokens**: create, list, revoke with scopes
- Token format: `stl_{prefix}_{secret}` with Argon2 hashing
- Expiration dates, last usage tracking, revocation

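
A token in that format splits into a lookup prefix (stored in clear to locate the token row) and a secret (only its Argon2 hash is persisted); a minimal parsing sketch, with the hashing and constant-time verification omitted:

```rust
/// Split `stl_{prefix}_{secret}` into (prefix, secret).
/// The secret would then be verified against its stored Argon2 hash.
fn parse_token(token: &str) -> Option<(&str, &str)> {
    let rest = token.strip_prefix("stl_")?;
    let (prefix, secret) = rest.split_once('_')?;
    if prefix.is_empty() || secret.is_empty() {
        return None; // malformed: both parts must be present
    }
    Some((prefix, secret))
}
```
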

### Access Control

- **Two scopes**: `admin` (full access) and `read` (read-only)
- Route-level middleware enforcement
- Rate limiting: configurable sliding window (default 120 req/s)

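
A sliding-window limiter of this kind can be sketched with a queue of accepted-request timestamps; illustrative only (the real middleware is per-client and handles concurrency):

```rust
use std::collections::VecDeque;

/// Allow at most `limit` requests per `window_ms` milliseconds.
struct SlidingWindow {
    limit: usize,
    window_ms: u64,
    hits: VecDeque<u64>, // timestamps (ms) of accepted requests
}

impl SlidingWindow {
    fn new(limit: usize, window_ms: u64) -> Self {
        Self { limit, window_ms, hits: VecDeque::new() }
    }

    /// Returns true if a request arriving at `now_ms` is allowed.
    fn allow(&mut self, now_ms: u64) -> bool {
        // Evict timestamps that have slid out of the window.
        while self.hits.front().map_or(false, |&t| now_ms - t >= self.window_ms) {
            self.hits.pop_front();
        }
        if self.hits.len() < self.limit {
            self.hits.push_back(now_ms);
            true
        } else {
            false
        }
    }
}
```
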
---

## Backoffice (Web UI)

### Dashboard

- Statistics cards: books, series, authors, libraries
- Donut charts: reading status breakdown, format distribution
- Bar charts: books per language
- Per-library reading progress bars
- Top series by book/page count
- Monthly addition timeline
- Metadata coverage stats

### Pages

- **Libraries**: list, create, delete, configure monitoring and metadata provider
- **Books**: global list with filtering/sorting, detail view with metadata and page rendering
- **Series**: global list, per-library view, detail with metadata management
- **Authors**: list with book/series counts, detail with author's books
- **Jobs**: history, live progress via SSE, error details
- **Tokens**: create, list, revoke API tokens
- **Settings**: image processing, cache, thumbnails, external services (Prowlarr, qBittorrent)

### Interactive Features

- Real-time search with suggestions
- Metadata search and matching modals
- Prowlarr search modal for missing volumes
- Folder browser/picker for library paths
- Book/series editing forms
- Quick reading status toggles
- CBR to CBZ conversion trigger

---

## API

### Documentation

- OpenAPI/Swagger UI available at `/swagger-ui`
- Health check (`/health`), readiness (`/ready`), Prometheus metrics (`/metrics`)

### Public Endpoints (no auth)

- `GET /health`, `GET /ready`, `GET /metrics`, `GET /swagger-ui`

### Read Endpoints (read scope)

- Libraries, books, series, authors listing and detail
- Book pages and thumbnails
- Reading progress get/update
- Full-text search, collection statistics

### Admin Endpoints (admin scope)

- Library CRUD and configuration
- Book metadata editing, CBR conversion
- Series metadata editing
- Indexing job management (trigger, cancel, stream)
- API token management
- Metadata operations (search, match, approve, reject, batch, refresh)
- External integrations (Prowlarr, qBittorrent, Komga)
- Application settings and cache management

---

## Database

### Key Design Decisions

- PostgreSQL with `pg_trgm` for full-text search (no external search engine)
- All deletions cascade from libraries
- Unique constraints: file paths, token prefixes, metadata links (library + series + provider)
- Directory mtime caching for incremental scan optimization
- Connection pool sizes: 10 (API), 20 (indexer)

### Archive Resilience

- CBZ: fallback streaming reader if the central directory is corrupted
- CBR: RAR extraction via system `unar`, with fallback to CBZ parsing
- PDF: `pdfinfo` for page count, `pdftoppm` for rendering
- EPUB: ZIP-based extraction
- FD exhaustion detection: aborts if too many consecutive IO errors occur