Compare commits

...

86 Commits

Author SHA1 Message Date
3d4803f975 perf(realtime+data): implement perf-data-optimization and perf-realtime-scale
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 3m33s
## perf-data-optimization
- Add @@index([name]) on User model (migration)
- Add WEATHER_HISTORY_LIMIT=90 constant, apply take/orderBy on weather history queries
- Replace deep includes with explicit select on all 6 list service queries
- Add unstable_cache layer with revalidateTag on all list service functions
- Add cache-tags.ts helpers (sessionTag, sessionsListTag, userStatsTag)
- Invalidate sessionsListTag in all create/delete Server Actions

## perf-realtime-scale
- Create src/lib/broadcast.ts: generic createBroadcaster factory with shared polling
  (one interval per active session, starts on first subscriber, stops on last)
- Migrate all 6 SSE routes to use createBroadcaster — removes per-connection setInterval
- Add broadcastToXxx() calls in all Server Actions after mutations for immediate push
- Add SESSIONS_PAGE_SIZE=20, pagination on sessions page with loadMoreSessions action
- Add "Charger plus" button with loading state and "X sur Y" counter in WorkshopTabs

## Tests
- Add 19 unit tests for broadcast.ts (polling lifecycle, userId filtering,
  formatEvent, error resilience, session isolation)

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-10 15:30:54 +01:00
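The shared-polling design described above (one interval per active session, started on the first subscriber and stopped on the last) can be sketched as follows. This is a minimal illustration of the idea, not the actual contents of `src/lib/broadcast.ts`; the real factory's signature and event plumbing may differ.

```typescript
type Listener<T> = (event: T) => void;

// Hypothetical sketch of a createBroadcaster-style factory with shared polling.
function createBroadcaster<T>(poll: () => T[], intervalMs = 2000) {
  const listeners = new Set<Listener<T>>();
  let timer: ReturnType<typeof setInterval> | null = null;

  return {
    subscribe(listener: Listener<T>): () => void {
      listeners.add(listener);
      // One shared interval: start polling on the first subscriber only.
      if (listeners.size === 1) {
        timer = setInterval(() => {
          for (const event of poll()) {
            for (const l of listeners) l(event);
          }
        }, intervalMs);
      }
      return () => {
        listeners.delete(listener);
        // Stop polling when the last subscriber disconnects.
        if (listeners.size === 0 && timer !== null) {
          clearInterval(timer);
          timer = null;
        }
      };
    },
    isPolling: () => timer !== null,
  };
}
```

With this shape, each SSE route subscribes a connection's writer instead of creating its own `setInterval`, which is what removes the per-connection interval.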
5b45f18ad9 chore: add Vitest for testing and coverage support
- Introduced new test scripts in package.json: "test", "test:watch", and "test:coverage".
- Added Vitest and vite-tsconfig-paths as dependencies for improved testing capabilities.
- Updated pnpm-lock.yaml to reflect new dependencies and their versions.
2026-03-10 08:38:40 +01:00
f9ed732f1c test: add unit test coverage for services and lib
- 255 tests across 14 files (was 70 tests in 4 files)
- src/services/__tests__: auth (registerUser, updateUserPassword, updateUserProfile), okrs (calculateOKRProgress, createOKR, updateKeyResult, updateOKR), teams (createTeam, addTeamMember, isAdminOfUser, getTeamMemberIdsForAdminTeams, getUserTeams), weather (getPreviousWeatherEntriesForUsers, shareWeatherSessionToTeam, getWeatherSessionsHistory), workshops (createSwotItem, duplicateSwotItem, updateAction, createMotivatorSession, updateCardInfluence, addGifMoodItem, shareGifMoodSessionToTeam, getLatestEventTimestamp, cleanupOldEvents)
- src/lib/__tests__: date-utils, weather-utils, okr-utils, gravatar, workshops, share-utils
- Update vitest coverage to include src/lib/**

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-10 08:37:32 +01:00
a8c05aa841 perf(quick-wins): batch collaborator resolution, debounce SSE refresh, loading states
- Eliminate N+1 on resolveCollaborator: add batchResolveCollaborators() in
  auth.ts (2 DB queries max regardless of session count), update all 4
  workshop services to use post-batch mapping
- Debounce router.refresh() in useLive.ts (300ms) to group simultaneous
  SSE events and avoid cascade re-renders
- Call cleanupOldEvents fire-and-forget in createEvent to purge old SSE
  events inline without blocking the response
- Add loading.tsx skeletons on /sessions and /users matching actual page
  layout (PageHeader + content structure)
- Lazy-load ShareModal via next/dynamic in BaseSessionLiveWrapper to reduce
  initial JS bundle

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-10 08:07:22 +01:00
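The 300ms debounce on `router.refresh()` can be sketched as a trailing-edge debounce: every event in a burst resets the timer, so a flurry of simultaneous SSE events produces a single refresh. This is an assumed shape for the helper in `useLive.ts`, not the actual implementation.

```typescript
// Trailing-edge debounce: fires once, after the burst settles.
function debounce<A extends unknown[]>(fn: (...args: A) => void, waitMs: number) {
  let timer: ReturnType<typeof setTimeout> | null = null;
  return (...args: A) => {
    if (timer !== null) clearTimeout(timer); // each new event resets the timer
    timer = setTimeout(() => {
      timer = null;
      fn(...args);
    }, waitMs);
  };
}

// Grouping simultaneous SSE events into one refresh:
let refreshCount = 0;
const refresh = debounce(() => { refreshCount++; }, 300);
refresh(); refresh(); refresh(); // three events in the same tick
```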
2d266f89f9 feat(perf): implement performance optimizations for session handling
- Introduced a new configuration file `config.yaml` for specifying project context and artifact rules.
- Added `.openspec.yaml` files for tracking changes related to performance improvements.
- Created design documents outlining the context, goals, decisions, and migration plans for optimizing session performance.
- Proposed changes include batching database queries, debouncing event refreshes, purging old events, and implementing loading states for better user experience.
- Added tasks and specifications to ensure proper implementation and validation of the new features.

These enhancements aim to improve the scalability and responsiveness of the application during collaborative sessions.
2026-03-10 08:06:47 +01:00
6baa9bfada feat(opsx): add new commands for workflow management
- Introduced `OPSX: Apply` to implement tasks from OpenSpec changes.
- Added `OPSX: Archive` for archiving completed changes in the experimental workflow.
- Created `OPSX: Explore` for a thinking partner mode to investigate ideas and clarify requirements.
- Implemented `OPSX: Propose` to generate change proposals and associated artifacts in one step.
- Developed skills for `openspec-apply-change` and `openspec-archive-change` to facilitate task implementation and archiving processes.

These additions enhance the workflow capabilities and provide structured approaches for managing changes within the OpenSpec framework.
2026-03-09 21:31:05 +01:00
f2c1b195b3 Fix UserStats typing in users page counters
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 2m58s
2026-03-04 17:04:09 +01:00
367eea6ee8 Persist sessions view mode in localStorage 2026-03-04 17:04:03 +01:00
dcc769a930 fix(users): include all workshop types in user stats 2026-03-04 08:44:07 +01:00
313ad53e2e refactor(weather): move top disclosures below board
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 3m3s
2026-03-04 08:39:19 +01:00
8bff21bede feat(weather): show trend indicators on team averages 2026-03-04 08:34:23 +01:00
4aea17124e feat(ui): allow per-disclosure emoji icons 2026-03-04 08:32:19 +01:00
db7a0cef96 refactor(ui): unify low-level controls and expand design system
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 2m57s
2026-03-03 15:50:15 +01:00
9a43980412 refactor: extract Icons and InlineFormActions UI components
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 3m28s
- Add Icons.tsx: IconEdit, IconTrash, IconDuplicate, IconPlus, IconCheck, IconClose
- Add InlineFormActions.tsx: unified Annuler/Ajouter-Enregistrer button pair
- Replace inline SVGs in SwotCard, YearReviewCard, WeeklyCheckInCard, SwotQuadrant,
  YearReviewSection, WeeklyCheckInSection, EditableTitle, Modal, GifMoodCard

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-03 14:25:35 +01:00
09a849279b refactor: add SessionPageHeader and apply to all 6 session detail pages
- Create SessionPageHeader component (breadcrumb + editable title + collaborator + badges + date)
- Embed UPDATE_FN map internally, keyed by workshopType — no prop drilling
- Replace duplicated header blocks in sessions, motivators, year-review, weather, weekly-checkin, gif-mood

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-03 14:15:43 +01:00
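The internal `UPDATE_FN` map keyed by `workshopType` replaces prop drilling with dispatch by key. The map name comes from the commit; the entry shapes below are hypothetical placeholders, not the component's actual update actions.

```typescript
type UpdateTitleFn = (sessionId: string, title: string) => void;

// Hypothetical dispatch map: one update function per workshop type.
const UPDATE_FN: Record<string, UpdateTitleFn> = {
  swot: (_id, _title) => { /* would call the SWOT update Server Action */ },
  weather: (_id, _title) => { /* would call the weather update Server Action */ },
  "gif-mood": (_id, _title) => { /* would call the GIF mood update Server Action */ },
};

function updateTitle(workshopType: string, id: string, title: string): void {
  const fn = UPDATE_FN[workshopType];
  if (!fn) throw new Error(`Unknown workshop type: ${workshopType}`);
  fn(id, title);
}
```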
b1ba43fd30 refactor: merge 6 EditableTitle wrappers into one file
Replace EditableSessionTitle, EditableMotivatorTitle, EditableYearReviewTitle,
EditableWeatherTitle, EditableWeeklyCheckInTitle, EditableGifMoodTitle individual
files with a single EditableTitles.tsx using spread props. Same public API.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-03 14:06:45 +01:00
2e00522bfc feat: add PageHeader component and centralize page spacing
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 3m1s
- Create reusable PageHeader component (emoji + title + subtitle + actions)
- Use PageHeader in sessions, teams, users, objectives pages
- Centralize vertical padding in layout (py-6) and remove per-page py-* values

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-03 14:01:07 +01:00
66ac190c15 feat: redesign sessions dashboard with multi-view layout and sortable table
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 3m17s
- Redesign session cards with colored left border (Figma-style), improved
  visual hierarchy, hover states, and stats in footer
- Add 4 switchable view modes: grid, list, sortable table, and timeline
- Table view: unified flat table with clickable column headers for sorting
  (Type, Titre, Créateur, Participant, Stats, Date)
- Add Créateur column showing the workshop owner with Gravatar avatar
- Widen Type column to 160px for better readability
- Improve tabs navigation with pill-shaped active state and shadow
- Fix TypeFilterDropdown to exclude 'Équipe' from type list
- Make filter tabs visually distinct with bg-card + border + shadow-sm
- Split WorkshopTabs.tsx into 4 focused modules:
  workshop-session-types.ts, workshop-session-helpers.ts,
  SessionCard.tsx, WorkshopTabs.tsx

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-03 13:54:23 +01:00
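A clickable-header sort like the table view's can be sketched with a generic comparator plus a direction toggle. The field names and string-based comparison below are illustrative assumptions, not the component's actual types.

```typescript
type SortDir = "asc" | "desc";

// Sort a copy of the rows by one column; string comparison keeps the sketch simple.
function sortBy<T>(rows: T[], key: keyof T, dir: SortDir): T[] {
  const sign = dir === "asc" ? 1 : -1;
  return [...rows].sort((a, b) => sign * String(a[key]).localeCompare(String(b[key])));
}

// Clicking the already-active header flips the direction.
function toggleDir(d: SortDir): SortDir {
  return d === "asc" ? "desc" : "asc";
}
```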
7be296231c feat: add weather trend chart showing indicator averages over time
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 3m6s
Adds a collapsible SVG line graph on weather session pages displaying
the evolution of all 4 indicators (Performance, Moral, Flux, Création
de valeur) across sessions, with per-session average scores, hover
tooltips, and a marker on the current session.

Also fixes pre-existing lint errors: non-null assertion on optional
chain in Header and eslint-disable for intentional hydration pattern
in ThemeToggle.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-03 11:45:19 +01:00
c3b653601c fix: notes patching in weather
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 2m58s
2026-03-03 11:18:11 +01:00
8de4c1985f feat: update gif mood board column options to 4/5/6
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 2m58s
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-03 10:14:36 +01:00
766f3d5a59 feat: add GIF Mood Board workshop
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 4m5s
- New workshop where each team member shares up to 5 GIFs with notes to express their weekly mood
- Per-user week rating (1-5 stars) visible next to each member's section
- Masonry-style grid with adjustable column count (3/4/5) toggle
- Handwriting font (Caveat) for GIF notes
- Full real-time collaboration via SSE
- Clean migration (add_gif_mood_workshop) safe for production deploy
- DB backup via cp before each migration in docker-entrypoint

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-03 10:04:56 +01:00
7c68fb81e3 fix: prevent ThemeToggle hydration mismatch by deferring icon render
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 4m32s
Server doesn't know localStorage theme, so defer emoji rendering until
after mount to avoid server/client text mismatch.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-26 11:16:31 +01:00
9298eef0cb refactor: make Header a server component to avoid auth flash on load
Some checks failed
Deploy with Docker Compose / deploy (push) Has been cancelled
Move session check from client-side useSession() to server-side auth(),
so the authenticated state is known at initial render. Extract interactive
parts (ThemeToggle, UserMenu, WorkshopsDropdown, NavLinks) into small
client components.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-26 11:14:27 +01:00
a10205994c refactor: improve team management, OKRs, and session components 2026-02-25 17:29:40 +01:00
c828ab1a48 perf: optimize DB queries, SSE polling, and client rendering
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 4m45s
- Fix resolveCollaborator N+1: replace full User table scan with findFirst
- Fix getAllUsersWithStats N+1: use groupBy instead of per-user count queries
- Cache getTeamMemberIdsForAdminTeams and isAdminOfUser with React.cache
- Increase SSE poll interval from 1s to 2s across all 5 subscribe routes
- Add cleanupOldEvents method to session-share-events for event table TTL
- Add React.memo to all card components (Swot, Motivator, Weather, WeeklyCheckIn, YearReview)
- Fix WeatherCard useEffect+setState lint error with idiomatic prop sync pattern
- Add optimizePackageImports for DnD libs and poweredByHeader:false in next.config
- Add inline theme script in layout.tsx to prevent dark mode FOUC
- Remove unused Next.js template SVGs from public/

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-25 14:04:58 +01:00
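The `groupBy` fix for the stats N+1 replaces one count query per user with a single grouped aggregation. The in-memory sketch below illustrates the shape of that aggregation; the real service runs it in the database (roughly `prisma.session.groupBy({ by: ["userId"], _count: { _all: true } })`, per the Prisma API, though the exact model and call here are assumptions).

```typescript
type SessionRow = { userId: string };

// Single pass over the rows instead of one count query per user.
function countSessionsPerUser(rows: SessionRow[]): Record<string, number> {
  const counts: Record<string, number> = {};
  for (const { userId } of rows) {
    counts[userId] = (counts[userId] ?? 0) + 1;
  }
  return counts;
}
```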
6dfeab5eb8 docs: add CLAUDE.md with project conventions and architecture guide
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-25 13:45:49 +01:00
74b1b2e838 fix: restore WeatherAverageBar component in session header and adjust styling
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 6m12s
Reintroduced the WeatherAverageBar component in the WeatherSessionPage to display team averages. Updated the styling of the WeatherAverageBar for improved spacing. Enhanced the EvolutionIndicator component to use dynamic background colors for better visibility of status indicators.
2026-02-25 07:55:01 +01:00
73219c89fb fix: make evolution indicators visually prominent with badge style
Replace plain text-xs arrows with 20×20px colored circular badges
(green ↑, red ↓, muted →) to ensure they are clearly visible next
to emoji cells. Also widen emoji columns from w-24 → w-28 to give
the badge room without overflow.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-02-24 18:18:28 +01:00
30c2b6cc1e fix: display evolution indicators inline (flex-row) next to emoji instead of below 2026-02-24 17:17:45 +01:00
51bc187374 fix: convert Map to Record for server-client boundary, remove dead currentUser prop
- WeatherBoard: change previousEntries type from Map<string, PreviousEntry> to Record<string, PreviousEntry> and update lookup from .get() to bracket notation
- page.tsx: wrap previousEntries with Object.fromEntries() before passing as prop, remove unused currentUser prop
- WeatherCard: remove spurious eslint-disable-next-line comment for non-existent rule react-hooks/set-state-in-effect

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-02-24 17:12:46 +01:00
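The Map-to-Record conversion matters because a `Map` does not survive serialization across the Next.js server/client boundary, while a plain object does. A minimal sketch of the pattern (the `PreviousEntry` fields here are assumed for illustration):

```typescript
type PreviousEntry = { performance: number; moral: number };

// Server side: a Map is convenient for lookups while building the data...
const byUser = new Map<string, PreviousEntry>([
  ["u1", { performance: 4, moral: 3 }],
]);

// ...but must be flattened to a plain object before being passed as a prop.
const previousEntries: Record<string, PreviousEntry> = Object.fromEntries(byUser);

// Client side: bracket notation replaces Map.get().
const entry = previousEntries["u1"];
```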
3e869bf8ad feat: show evolution indicators per person per axis in weather board
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-02-24 17:02:31 +01:00
3b212d6dda feat: fetch and pass previous weather entries through component tree
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-02-24 17:00:12 +01:00
9b8c9efbd6 feat: display team weather average bar in session header 2026-02-24 16:57:26 +01:00
11c770da9c feat: add WeatherAverageBar component
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-02-24 16:55:57 +01:00
220dcf87b9 feat: add getPreviousWeatherEntriesForUsers service function
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-02-24 16:54:21 +01:00
6b8d3c42f7 refactor: extract WEATHER_EMOJIS and add scoring utils to weather-utils.ts
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-02-24 16:50:26 +01:00
Julien Froidefond
739b0bf87d feat: refactor session components to utilize BaseSessionLiveWrapper, streamlining sharing functionality and reducing code duplication across various session types
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 3m14s
2026-02-18 08:39:15 +01:00
Julien Froidefond
35228441e3 feat: add editable functionality for current quarter OKRs, allowing participants and team admins to modify objectives and key results, enhancing user interaction and collaboration
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 2m26s
2026-02-18 08:31:32 +01:00
Julien Froidefond
ee13f8ba99 feat: enhance dropdown components by integrating useClickOutside hook for improved user experience and accessibility in NewWorkshopDropdown and WorkshopTabs 2026-02-18 08:25:08 +01:00
Julien Froidefond
d50a8a0266 style: update card hover colors in globals.css and page.tsx for improved UI consistency
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 2m29s
2026-02-17 15:13:00 +01:00
Julien Froidefond
520a1f4838 feat: implement auto-sharing functionality for session creation across motivators, weekly check-ins, and year reviews, enhancing collaboration capabilities 2026-02-17 15:11:46 +01:00
Julien Froidefond
d05157d498 feat: integrate user team retrieval into session components, enhancing sharing functionality and user experience across motivators, sessions, weekly check-ins, and year reviews
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 2m35s
2026-02-17 14:47:43 +01:00
Julien Froidefond
4d04d3ede8 feat: refactor session retrieval logic to utilize generic session queries, enhancing code maintainability and reducing duplication across session types 2026-02-17 14:38:54 +01:00
Julien Froidefond
aad4b7f111 feat: enhance session management by implementing edit permissions for team admins and updating session components to reflect new access controls 2026-02-17 14:20:40 +01:00
Julien Froidefond
5e9ae0936f feat: implement getWeekBounds function for calculating ISO week boundaries and integrate it into weather session sharing logic 2026-02-17 14:05:50 +01:00
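An ISO week starts on Monday, so a `getWeekBounds`-style helper can be sketched as below. The function name comes from the commit; the signature and the UTC handling are assumptions.

```typescript
// Returns the Monday and Sunday (UTC) of the ISO week containing `date`.
function getWeekBounds(date: Date): { start: Date; end: Date } {
  const d = new Date(Date.UTC(date.getUTCFullYear(), date.getUTCMonth(), date.getUTCDate()));
  const daysSinceMonday = (d.getUTCDay() + 6) % 7; // getUTCDay: 0=Sun..6=Sat
  const start = new Date(d);
  start.setUTCDate(d.getUTCDate() - daysSinceMonday);
  const end = new Date(start);
  end.setUTCDate(start.getUTCDate() + 6);
  return { start, end };
}
```

Computing bounds once per share lets the weather session logic group entries by week rather than by exact timestamp.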
Julien Froidefond
4e14112ffa feat: add team collaboration sessions for admins, enhancing session management and visibility in the application 2026-02-17 14:03:31 +01:00
Julien Froidefond
7f3eabbdb2 refactor: update application name and related metadata from SWOT Manager to Workshop Manager for consistency across the project 2026-02-17 10:05:51 +01:00
Julien Froidefond
cc7e73ce7b feat: refactor workshop management by centralizing workshop data and improving session navigation across components
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 3m0s
2026-02-17 09:43:08 +01:00
Julien Froidefond
a8f53bfe2a feat: replace individual workshop buttons with a dropdown for creating new workshops in SessionsPage and update WorkshopTabs for improved tab management 2026-02-17 09:30:46 +01:00
Julien Froidefond
e8282bb118 feat: add apple icon to metadata for enhanced application branding
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 2m32s
2026-02-17 08:14:05 +01:00
Julien Froidefond
31d9c00b6d docs: update README.md to enhance feature descriptions and improve clarity on collaboration tools
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 4m57s
2026-02-17 06:58:21 +01:00
Julien Froidefond
7805e8dcd0 feat: add RocketIcon to Header component and update metadata with application icon 2026-02-16 11:44:41 +01:00
Julien Froidefond
390c4c653e fix: update modal title in WeeklyCheckInShareModal for improved clarity 2026-02-16 11:38:09 +01:00
Julien Froidefond
39910f559e feat: improve SSE broadcasting for weather sessions with enhanced error handling and connection management
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 5m49s
2026-02-04 13:18:10 +01:00
Julien Froidefond
057732f00e feat: enhance real-time weather session updates by broadcasting user information and syncing local state in WeatherCard component
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 6m14s
2026-02-04 11:05:33 +01:00
Julien Froidefond
e8ffccd286 refactor: streamline date and title handling in NewWeatherPage and NewWeeklyCheckInPage components for improved user experience
Some checks failed
Deploy with Docker Compose / deploy (push) Has been cancelled
2026-02-04 11:02:52 +01:00
Julien Froidefond
ef0772f894 feat: enhance user experience by adding notifications for OKR updates and improving session timeout handling for better usability
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 3m21s
2026-02-04 10:50:38 +01:00
Julien Froidefond
163caa398c feat: implement Weather Workshop feature with models, UI components, and session management for enhanced team visibility and personal well-being tracking
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 3m16s
2026-02-03 18:08:06 +01:00
Julien Froidefond
3a2eb83197 feat: add comparePeriods utility for sorting OKR periods and refactor ObjectivesPage to utilize it for improved period sorting
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 8m28s
2026-01-28 13:58:01 +01:00
Julien Froidefond
e848e85b63 feat: implement period filtering in OKRsList component with toggle for viewing all OKRs or current quarter's OKRs
Some checks failed
Deploy with Docker Compose / deploy (push) Has been cancelled
2026-01-28 13:54:10 +01:00
Julien Froidefond
53ee344ae7 feat: add Weekly Check-in feature with models, UI components, and session management for enhanced team collaboration
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 6m24s
2026-01-14 10:23:58 +01:00
Julien Froidefond
67d685d346 refactor: update component exports in OKRs, SWOT, Teams, and UI modules for improved organization and clarity 2026-01-13 14:51:50 +01:00
Julien Froidefond
47703db348 refactor: update OKR form and edit page to use new CreateKeyResultInput type for improved type safety and clarity
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 6m54s
2026-01-07 17:32:27 +01:00
Julien Froidefond
86c26b5af8 fix: improve error handling in API routes and update date handling for OKR and Key Result submissions
Some checks failed
Deploy with Docker Compose / deploy (push) Failing after 3m38s
2026-01-07 17:22:33 +01:00
Julien Froidefond
97045342b7 feat: refactor ObjectivesPage to utilize ObjectivesList component for improved rendering and simplify OKR status handling in OKRCard with compact view option
Some checks failed
Deploy with Docker Compose / deploy (push) Failing after 4m17s
2026-01-07 17:18:16 +01:00
Julien Froidefond
ca9b68ebbd feat: enhance OKR management by adding permission checks for editing and deleting, and updating OKR forms to handle key results more effectively
Some checks failed
Deploy with Docker Compose / deploy (push) Failing after 4m44s
2026-01-07 16:48:23 +01:00
Julien Froidefond
5f661c8bfd feat: introduce Teams & OKRs feature with models, types, and UI components for team management and objective tracking
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 12m53s
2026-01-07 10:11:59 +01:00
Julien Froidefond
e3a47dd7e5 chore: update .gitignore to include database files and improve data management 2025-12-16 10:45:35 +01:00
Julien Froidefond
35b9ac8a66 chore: remove obsolete database files from the project to streamline data management 2025-12-16 10:45:30 +01:00
Julien Froidefond
fd65e0d5b9 feat: enhance live collaboration features by introducing useLive hook for real-time event handling across motivators, sessions, and year reviews; refactor existing hooks to utilize this new functionality
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 2m39s
2025-12-16 10:41:16 +01:00
Julien Froidefond
246298dd82 refactor: consolidate editable title components into a unified UI module, removing redundant files and updating imports 2025-12-16 08:58:09 +01:00
Julien Froidefond
56a9c2c3be feat: implement Year Review feature with session management, item categorization, and real-time collaboration
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 6m7s
2025-12-16 08:55:13 +01:00
Julien Froidefond
48ff86fb5f chore: update deploy workflow to rebuild Docker images during deployment
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 5m43s
2025-12-15 13:43:57 +01:00
Julien Froidefond
d735e1c4c5 feat: add linked item management to action updates in SWOT analysis
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 5s
2025-12-15 13:34:09 +01:00
Julien Froidefond
0cf7437efe chore: optimize Dockerfile by adding cache mount for pnpm installation to improve build performance
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 5s
2025-12-13 12:16:14 +01:00
Julien Froidefond
ccb5338aa6 chore: update Dockerfile to set PNPM_HOME environment variable and prepare directory for pnpm installation 2025-12-13 12:16:07 +01:00
Julien Froidefond
fa2879c903 fix: update next to 16.0.10
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 4s
2025-12-13 07:28:00 +01:00
Julien Froidefond
0daade6533 chore: update docker-compose.yml to change data volume path to a relative directory for better portability
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 4s
2025-12-11 11:24:26 +01:00
Julien Froidefond
acdcc37091 chore: update docker-compose.yml to set a specific data volume path for improved data management
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 4s
2025-12-11 11:22:09 +01:00
Julien Froidefond
7a4de67b9c chore: update docker-compose.yml to set a specific data volume path and modify deploy workflow to include DATA_VOLUME_PATH variable
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 6s
2025-12-11 11:21:15 +01:00
Julien Froidefond
27995e7e7f chore: update docker-compose.yml to use dynamic data volume path and modify deploy workflow to reference environment variables
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 9s
2025-12-11 11:18:15 +01:00
Julien Froidefond
8a3966e6a9 chore: update deploy workflow to enable Docker BuildKit and add environment variables for authentication and database configuration
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 13s
2025-12-11 08:56:03 +01:00
Julien Froidefond
e2232ca595 chore: update .gitignore to include data directory and modify docker-compose.yml for external volume mapping
All checks were successful
Deploy with Docker Compose / deploy (push) Successful in 20s
2025-12-11 07:58:58 +01:00
Julien Froidefond
434043041c chore: rename app service to workshop-manager-app in docker-compose.yml for clarity
Some checks failed
Deploy with Docker Compose / deploy (push) Failing after 10m44s
2025-12-10 14:29:28 +01:00
9764402ef2 feat: adapting for server 2025-12-06 13:09:12 +01:00
252 changed files with 26228 additions and 2828 deletions


@@ -0,0 +1,152 @@
---
name: "OPSX: Apply"
description: Implement tasks from an OpenSpec change (Experimental)
category: Workflow
tags: [workflow, artifacts, experimental]
---
Implement tasks from an OpenSpec change.
**Input**: Optionally specify a change name (e.g., `/opsx:apply add-auth`). If omitted, check if it can be inferred from conversation context. If vague or ambiguous, you MUST prompt for available changes.
**Steps**
1. **Select the change**
If a name is provided, use it. Otherwise:
- Infer from conversation context if the user mentioned a change
- Auto-select if only one active change exists
- If ambiguous, run `openspec list --json` to get available changes and use the **AskUserQuestion tool** to let the user select
Always announce: "Using change: <name>" and how to override (e.g., `/opsx:apply <other>`).
2. **Check status to understand the schema**
```bash
openspec status --change "<name>" --json
```
Parse the JSON to understand:
- `schemaName`: The workflow being used (e.g., "spec-driven")
- Which artifact contains the tasks (typically "tasks" for spec-driven, check status for others)
3. **Get apply instructions**
```bash
openspec instructions apply --change "<name>" --json
```
This returns:
- Context file paths (varies by schema)
- Progress (total, complete, remaining)
- Task list with status
- Dynamic instruction based on current state
**Handle states:**
- If `state: "blocked"` (missing artifacts): show message, suggest using `/opsx:continue`
- If `state: "all_done"`: congratulate, suggest archive
- Otherwise: proceed to implementation
4. **Read context files**
Read the files listed in `contextFiles` from the apply instructions output.
The files depend on the schema being used:
- **spec-driven**: proposal, specs, design, tasks
- Other schemas: follow the contextFiles from CLI output
5. **Show current progress**
Display:
- Schema being used
- Progress: "N/M tasks complete"
- Remaining tasks overview
- Dynamic instruction from CLI
6. **Implement tasks (loop until done or blocked)**
For each pending task:
- Show which task is being worked on
- Make the code changes required
- Keep changes minimal and focused
- Mark task complete in the tasks file: `- [ ]` → `- [x]`
- Continue to next task
**Pause if:**
- Task is unclear → ask for clarification
- Implementation reveals a design issue → suggest updating artifacts
- Error or blocker encountered → report and wait for guidance
- User interrupts
7. **On completion or pause, show status**
Display:
- Tasks completed this session
- Overall progress: "N/M tasks complete"
- If all done: suggest archive
- If paused: explain why and wait for guidance
**Output During Implementation**
```
## Implementing: <change-name> (schema: <schema-name>)
Working on task 3/7: <task description>
[...implementation happening...]
✓ Task complete
Working on task 4/7: <task description>
[...implementation happening...]
✓ Task complete
```
**Output On Completion**
```
## Implementation Complete
**Change:** <change-name>
**Schema:** <schema-name>
**Progress:** 7/7 tasks complete ✓
### Completed This Session
- [x] Task 1
- [x] Task 2
...
All tasks complete! You can archive this change with `/opsx:archive`.
```
**Output On Pause (Issue Encountered)**
```
## Implementation Paused
**Change:** <change-name>
**Schema:** <schema-name>
**Progress:** 4/7 tasks complete
### Issue Encountered
<description of the issue>
**Options:**
1. <option 1>
2. <option 2>
3. Other approach
What would you like to do?
```
**Guardrails**
- Keep going through tasks until done or blocked
- Always read context files before starting (from the apply instructions output)
- If task is ambiguous, pause and ask before implementing
- If implementation reveals issues, pause and suggest artifact updates
- Keep code changes minimal and scoped to each task
- Update task checkbox immediately after completing each task
- Pause on errors, blockers, or unclear requirements - don't guess
- Use contextFiles from CLI output, don't assume specific file names
**Fluid Workflow Integration**
This skill supports the "actions on a change" model:
- **Can be invoked anytime**: Before all artifacts are done (if tasks exist), after partial implementation, interleaved with other actions
- **Allows artifact updates**: If implementation reveals design issues, suggest updating artifacts - not phase-locked, work fluidly


@@ -0,0 +1,157 @@
---
name: "OPSX: Archive"
description: Archive a completed change in the experimental workflow
category: Workflow
tags: [workflow, archive, experimental]
---
Archive a completed change in the experimental workflow.
**Input**: Optionally specify a change name after `/opsx:archive` (e.g., `/opsx:archive add-auth`). If omitted, check if it can be inferred from conversation context. If vague or ambiguous, you MUST prompt for available changes.
**Steps**
1. **If no change name provided, prompt for selection**
Run `openspec list --json` to get available changes. Use the **AskUserQuestion tool** to let the user select.
Show only active changes (not already archived).
Include the schema used for each change if available.
**IMPORTANT**: Do NOT guess or auto-select a change. Always let the user choose.
2. **Check artifact completion status**
Run `openspec status --change "<name>" --json` to check artifact completion.
Parse the JSON to understand:
- `schemaName`: The workflow being used
- `artifacts`: List of artifacts with their status (`done` or other)
**If any artifacts are not `done`:**
- Display warning listing incomplete artifacts
- Prompt user for confirmation to continue
- Proceed if user confirms
3. **Check task completion status**
Read the tasks file (typically `tasks.md`) to check for incomplete tasks.
Count tasks marked with `- [ ]` (incomplete) vs `- [x]` (complete).
**If incomplete tasks found:**
- Display warning showing count of incomplete tasks
- Prompt user for confirmation to continue
- Proceed if user confirms
**If no tasks file exists:** Proceed without task-related warning.
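The count in this step can be sketched with standard tools (a sketch, not part of the openspec CLI; the sample task names are hypothetical):
```bash
# Sample tasks file for the sketch (hypothetical task names)
cat > tasks.md <<'EOF'
- [ ] Add login route
- [x] Create user model
- [ ] Write tests
EOF

# grep -c prints 0 but exits 1 when nothing matches, so keep a fallback
incomplete=$(grep -c -- '- \[ \]' tasks.md) || incomplete=0
complete=$(grep -c -- '- \[x\]' tasks.md) || complete=0
echo "$incomplete incomplete, $complete complete"   # 2 incomplete, 1 complete
```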
4. **Assess delta spec sync state**
Check for delta specs at `openspec/changes/<name>/specs/`. If none exist, proceed without sync prompt.
**If delta specs exist:**
- Compare each delta spec with its corresponding main spec at `openspec/specs/<capability>/spec.md`
- Determine what changes would be applied (adds, modifications, removals, renames)
- Show a combined summary before prompting
**Prompt options:**
- If changes needed: "Sync now (recommended)", "Archive without syncing"
- If already synced: "Archive now", "Sync anyway", "Cancel"
If the user chooses sync, use the Task tool (subagent_type: "general-purpose", prompt: "Use Skill tool to invoke openspec-sync-specs for change '<name>'. Delta spec analysis: <include the analyzed delta spec summary>"). Unless the user cancels, proceed to the archive afterwards.
5. **Perform the archive**
Create the archive directory if it doesn't exist:
```bash
mkdir -p openspec/changes/archive
```
Generate target name using current date: `YYYY-MM-DD-<change-name>`
**Check if target already exists:**
- If yes: Fail with error, suggest renaming existing archive or using different date
- If no: Move the change directory to archive
```bash
mv openspec/changes/<name> openspec/changes/archive/YYYY-MM-DD-<name>
```
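Step 5 as a whole can be sketched like this (the change name `add-auth` is hypothetical; the sketch creates the change directory itself so the snippet is self-contained):
```bash
# Hypothetical change, created here only so the sketch runs standalone
name="add-auth"
mkdir -p "openspec/changes/$name"

# Build the dated target name and fail on collision before moving
target="openspec/changes/archive/$(date +%Y-%m-%d)-$name"
if [ -e "$target" ]; then
  echo "Archive target already exists: $target" >&2
  exit 1
fi
mkdir -p openspec/changes/archive
mv "openspec/changes/$name" "$target"
echo "Archived to $target"
```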
6. **Display summary**
Show archive completion summary including:
- Change name
- Schema that was used
- Archive location
- Spec sync status (synced / sync skipped / no delta specs)
- Note about any warnings (incomplete artifacts/tasks)
**Output On Success**
```
## Archive Complete
**Change:** <change-name>
**Schema:** <schema-name>
**Archived to:** openspec/changes/archive/YYYY-MM-DD-<name>/
**Specs:** ✓ Synced to main specs
All artifacts complete. All tasks complete.
```
**Output On Success (No Delta Specs)**
```
## Archive Complete
**Change:** <change-name>
**Schema:** <schema-name>
**Archived to:** openspec/changes/archive/YYYY-MM-DD-<name>/
**Specs:** No delta specs
All artifacts complete. All tasks complete.
```
**Output On Success With Warnings**
```
## Archive Complete (with warnings)
**Change:** <change-name>
**Schema:** <schema-name>
**Archived to:** openspec/changes/archive/YYYY-MM-DD-<name>/
**Specs:** Sync skipped (user chose to skip)
**Warnings:**
- Archived with 2 incomplete artifacts
- Archived with 3 incomplete tasks
- Delta spec sync was skipped (user chose to skip)
Review the archive if this was not intentional.
```
**Output On Error (Archive Exists)**
```
## Archive Failed
**Change:** <change-name>
**Target:** openspec/changes/archive/YYYY-MM-DD-<name>/
Target archive directory already exists.
**Options:**
1. Rename the existing archive
2. Delete the existing archive if it's a duplicate
3. Wait until a different date to archive
```
**Guardrails**
- Always prompt for change selection if not provided
- Use artifact graph (openspec status --json) for completion checking
- Don't block archive on warnings - just inform and confirm
- Preserve .openspec.yaml when moving to archive (it moves with the directory)
- Show clear summary of what happened
- If sync is requested, use the Skill tool to invoke `openspec-sync-specs` (agent-driven)
- If delta specs exist, always run the sync assessment and show the combined summary before prompting


@@ -0,0 +1,173 @@
---
name: "OPSX: Explore"
description: "Enter explore mode - think through ideas, investigate problems, clarify requirements"
category: Workflow
tags: [workflow, explore, experimental, thinking]
---
Enter explore mode. Think deeply. Visualize freely. Follow the conversation wherever it goes.
**IMPORTANT: Explore mode is for thinking, not implementing.** You may read files, search code, and investigate the codebase, but you must NEVER write code or implement features. If the user asks you to implement something, remind them to exit explore mode first and create a change proposal. You MAY create OpenSpec artifacts (proposals, designs, specs) if the user asks—that's capturing thinking, not implementing.
**This is a stance, not a workflow.** There are no fixed steps, no required sequence, no mandatory outputs. You're a thinking partner helping the user explore.
**Input**: The argument after `/opsx:explore` is whatever the user wants to think about. Could be:
- A vague idea: "real-time collaboration"
- A specific problem: "the auth system is getting unwieldy"
- A change name: "add-dark-mode" (to explore in context of that change)
- A comparison: "postgres vs sqlite for this"
- Nothing (just enter explore mode)
---
## The Stance
- **Curious, not prescriptive** - Ask questions that emerge naturally, don't follow a script
- **Open threads, not interrogations** - Surface multiple interesting directions and let the user follow what resonates. Don't funnel them through a single path of questions.
- **Visual** - Use ASCII diagrams liberally when they'd help clarify thinking
- **Adaptive** - Follow interesting threads, pivot when new information emerges
- **Patient** - Don't rush to conclusions, let the shape of the problem emerge
- **Grounded** - Explore the actual codebase when relevant, don't just theorize
---
## What You Might Do
Depending on what the user brings, you might:
**Explore the problem space**
- Ask clarifying questions that emerge from what they said
- Challenge assumptions
- Reframe the problem
- Find analogies
**Investigate the codebase**
- Map existing architecture relevant to the discussion
- Find integration points
- Identify patterns already in use
- Surface hidden complexity
**Compare options**
- Brainstorm multiple approaches
- Build comparison tables
- Sketch tradeoffs
- Recommend a path (if asked)
**Visualize**
```
┌─────────────────────────────────────────┐
│ Use ASCII diagrams liberally │
├─────────────────────────────────────────┤
│ │
│ ┌────────┐ ┌────────┐ │
│ │ State │────────▶│ State │ │
│ │ A │ │ B │ │
│ └────────┘ └────────┘ │
│ │
│ System diagrams, state machines, │
│ data flows, architecture sketches, │
│ dependency graphs, comparison tables │
│ │
└─────────────────────────────────────────┘
```
**Surface risks and unknowns**
- Identify what could go wrong
- Find gaps in understanding
- Suggest spikes or investigations
---
## OpenSpec Awareness
You have full context of the OpenSpec system. Use it naturally, don't force it.
### Check for context
At the start, quickly check what exists:
```bash
openspec list --json
```
This tells you:
- If there are active changes
- Their names, schemas, and status
- What the user might be working on
If the user mentioned a specific change name, read its artifacts for context.
### When no change exists
Think freely. When insights crystallize, you might offer:
- "This feels solid enough to start a change. Want me to create a proposal?"
- Or keep exploring - no pressure to formalize
### When a change exists
If the user mentions a change or you detect one is relevant:
1. **Read existing artifacts for context**
- `openspec/changes/<name>/proposal.md`
- `openspec/changes/<name>/design.md`
- `openspec/changes/<name>/tasks.md`
- etc.
2. **Reference them naturally in conversation**
- "Your design mentions using Redis, but we just realized SQLite fits better..."
- "The proposal scopes this to premium users, but we're now thinking everyone..."
3. **Offer to capture when decisions are made**
| Insight Type | Where to Capture |
|--------------|------------------|
| New requirement discovered | `specs/<capability>/spec.md` |
| Requirement changed | `specs/<capability>/spec.md` |
| Design decision made | `design.md` |
| Scope changed | `proposal.md` |
| New work identified | `tasks.md` |
| Assumption invalidated | Relevant artifact |
Example offers:
- "That's a design decision. Capture it in design.md?"
- "This is a new requirement. Add it to specs?"
- "This changes scope. Update the proposal?"
4. **The user decides** - Offer and move on. Don't pressure. Don't auto-capture.
---
## What You Don't Have To Do
- Follow a script
- Ask the same questions every time
- Produce a specific artifact
- Reach a conclusion
- Stay on topic if a tangent is valuable
- Be brief (this is thinking time)
---
## Ending Discovery
There's no required ending. Discovery might:
- **Flow into a proposal**: "Ready to start? I can create a change proposal."
- **Result in artifact updates**: "Updated design.md with these decisions"
- **Just provide clarity**: User has what they need, moves on
- **Continue later**: "We can pick this up anytime"
When things crystallize, you might offer a summary - but it's optional. Sometimes the thinking IS the value.
---
## Guardrails
- **Don't implement** - Never write code or implement features. Creating OpenSpec artifacts is fine, writing application code is not.
- **Don't fake understanding** - If something is unclear, dig deeper
- **Don't rush** - Discovery is thinking time, not task time
- **Don't force structure** - Let patterns emerge naturally
- **Don't auto-capture** - Offer to save insights, don't just do it
- **Do visualize** - A good diagram is worth many paragraphs
- **Do explore the codebase** - Ground discussions in reality
- **Do question assumptions** - Including the user's and your own


@@ -0,0 +1,106 @@
---
name: "OPSX: Propose"
description: Propose a new change - create it and generate all artifacts in one step
category: Workflow
tags: [workflow, artifacts, experimental]
---
Propose a new change - create the change and generate all artifacts in one step.
I'll create a change with artifacts:
- proposal.md (what & why)
- design.md (how)
- tasks.md (implementation steps)
When ready to implement, run /opsx:apply
---
**Input**: The argument after `/opsx:propose` is the change name (kebab-case), OR a description of what the user wants to build.
**Steps**
1. **If no input provided, ask what they want to build**
Use the **AskUserQuestion tool** (open-ended, no preset options) to ask:
> "What change do you want to work on? Describe what you want to build or fix."
From their description, derive a kebab-case name (e.g., "add user authentication" → `add-user-auth`).
**IMPORTANT**: Do NOT proceed without understanding what the user wants to build.
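The name derivation above can be sketched mechanically (a sketch only; in practice you would also shorten long words, e.g. "authentication" → "auth"):
```bash
# Lowercase, squeeze runs of non-alphanumerics to single dashes,
# then trim any leading/trailing dashes
desc="Add user authentication"
name=$(printf '%s' "$desc" \
  | tr '[:upper:]' '[:lower:]' \
  | tr -cs 'a-z0-9' '-' \
  | sed 's/^-*//; s/-*$//')
echo "$name"   # add-user-authentication
```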
2. **Create the change directory**
```bash
openspec new change "<name>"
```
This creates a scaffolded change at `openspec/changes/<name>/` with `.openspec.yaml`.
3. **Get the artifact build order**
```bash
openspec status --change "<name>" --json
```
Parse the JSON to get:
- `applyRequires`: array of artifact IDs needed before implementation (e.g., `["tasks"]`)
- `artifacts`: list of all artifacts with their status and dependencies
4. **Create artifacts in sequence until apply-ready**
Use the **TodoWrite tool** to track progress through the artifacts.
Loop through artifacts in dependency order (artifacts with no pending dependencies first):
a. **For each artifact that is `ready` (dependencies satisfied)**:
- Get instructions:
```bash
openspec instructions <artifact-id> --change "<name>" --json
```
- The instructions JSON includes:
- `context`: Project background (constraints for you - do NOT include in output)
- `rules`: Artifact-specific rules (constraints for you - do NOT include in output)
- `template`: The structure to use for your output file
- `instruction`: Schema-specific guidance for this artifact type
- `outputPath`: Where to write the artifact
- `dependencies`: Completed artifacts to read for context
- Read any completed dependency files for context
- Create the artifact file using `template` as the structure
- Apply `context` and `rules` as constraints - but do NOT copy them into the file
- Show brief progress: "Created <artifact-id>"
b. **Continue until all `applyRequires` artifacts are complete**
- After creating each artifact, re-run `openspec status --change "<name>" --json`
- Check if every artifact ID in `applyRequires` has `status: "done"` in the artifacts array
- Stop when all `applyRequires` artifacts are done
c. **If an artifact requires user input** (unclear context):
- Use **AskUserQuestion tool** to clarify
- Then continue with creation
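The apply-ready check in step 4b can be sketched with `jq` (assuming `jq` is available; the JSON below is a hypothetical status payload standing in for the real CLI output):
```bash
# Hypothetical payload standing in for:
#   openspec status --change "<name>" --json
status_json='{"applyRequires":["tasks"],"artifacts":[
  {"id":"proposal","status":"done"},{"id":"tasks","status":"done"}]}'

# true only when every applyRequires artifact is present and "done"
ready=$(echo "$status_json" | jq '
  .applyRequires as $req
  | [ .artifacts[] | select(.id as $i | $req | index($i) != null) | .status ]
  | length == ($req | length) and all(. == "done")')
echo "$ready"   # true
```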
5. **Show final status**
```bash
openspec status --change "<name>"
```
**Output**
After completing all artifacts, summarize:
- Change name and location
- List of artifacts created with brief descriptions
- What's ready: "All artifacts created! Ready for implementation."
- Prompt: "Run `/opsx:apply` to start implementing."
**Artifact Creation Guidelines**
- Follow the `instruction` field from `openspec instructions` for each artifact type
- The schema defines what each artifact should contain - follow it
- Read dependency artifacts for context before creating new ones
- Use `template` as the structure for your output file - fill in its sections
- **IMPORTANT**: `context` and `rules` are constraints for YOU, not content for the file
- Do NOT copy `<context>`, `<rules>`, `<project_context>` blocks into the artifact
- These guide what you write, but should never appear in the output
**Guardrails**
- Create ALL artifacts needed for implementation (as defined by schema's `apply.requires`)
- Always read dependency artifacts before creating a new one
- If context is critically unclear, ask the user - but prefer making reasonable decisions to keep momentum
- If a change with that name already exists, ask if user wants to continue it or create a new one
- Verify each artifact file exists after writing before proceeding to next


@@ -0,0 +1,156 @@
---
name: openspec-apply-change
description: Implement tasks from an OpenSpec change. Use when the user wants to start implementing, continue implementation, or work through tasks.
license: MIT
compatibility: Requires openspec CLI.
metadata:
author: openspec
version: "1.0"
generatedBy: "1.2.0"
---
Implement tasks from an OpenSpec change.
**Input**: Optionally specify a change name. If omitted, check if it can be inferred from conversation context. If vague or ambiguous, you MUST prompt for available changes.
**Steps**
1. **Select the change**
If a name is provided, use it. Otherwise:
- Infer from conversation context if the user mentioned a change
- Auto-select if only one active change exists
- If ambiguous, run `openspec list --json` to get available changes and use the **AskUserQuestion tool** to let the user select
Always announce: "Using change: <name>" and how to override (e.g., `/opsx:apply <other>`).
2. **Check status to understand the schema**
```bash
openspec status --change "<name>" --json
```
Parse the JSON to understand:
- `schemaName`: The workflow being used (e.g., "spec-driven")
- Which artifact contains the tasks (typically "tasks" for spec-driven, check status for others)
3. **Get apply instructions**
```bash
openspec instructions apply --change "<name>" --json
```
This returns:
- Context file paths (varies by schema - could be proposal/specs/design/tasks or spec/tests/implementation/docs)
- Progress (total, complete, remaining)
- Task list with status
- Dynamic instruction based on current state
**Handle states:**
- If `state: "blocked"` (missing artifacts): show message, suggest using openspec-continue-change
- If `state: "all_done"`: congratulate, suggest archive
- Otherwise: proceed to implementation
4. **Read context files**
Read the files listed in `contextFiles` from the apply instructions output.
The files depend on the schema being used:
- **spec-driven**: proposal, specs, design, tasks
- Other schemas: follow the contextFiles from CLI output
5. **Show current progress**
Display:
- Schema being used
- Progress: "N/M tasks complete"
- Remaining tasks overview
- Dynamic instruction from CLI
6. **Implement tasks (loop until done or blocked)**
For each pending task:
- Show which task is being worked on
- Make the code changes required
- Keep changes minimal and focused
- Mark task complete in the tasks file: `- [ ]` → `- [x]`
- Continue to next task
**Pause if:**
- Task is unclear → ask for clarification
- Implementation reveals a design issue → suggest updating artifacts
- Error or blocker encountered → report and wait for guidance
- User interrupts
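The checkbox flip can be sketched with `sed` (the task text is hypothetical; note BSD sed needs `-i ''` instead of `-i`):
```bash
# Sample tasks file (hypothetical task names)
printf -- '- [ ] Implement OAuth flow\n- [x] Create user model\n' > tasks.md

# Mark one task complete by matching its exact line
sed -i 's/^- \[ \] Implement OAuth flow$/- [x] Implement OAuth flow/' tasks.md
grep -c -- '- \[x\]' tasks.md   # 2
```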
7. **On completion or pause, show status**
Display:
- Tasks completed this session
- Overall progress: "N/M tasks complete"
- If all done: suggest archive
- If paused: explain why and wait for guidance
**Output During Implementation**
```
## Implementing: <change-name> (schema: <schema-name>)
Working on task 3/7: <task description>
[...implementation happening...]
✓ Task complete
Working on task 4/7: <task description>
[...implementation happening...]
✓ Task complete
```
**Output On Completion**
```
## Implementation Complete
**Change:** <change-name>
**Schema:** <schema-name>
**Progress:** 7/7 tasks complete ✓
### Completed This Session
- [x] Task 1
- [x] Task 2
...
All tasks complete! Ready to archive this change.
```
**Output On Pause (Issue Encountered)**
```
## Implementation Paused
**Change:** <change-name>
**Schema:** <schema-name>
**Progress:** 4/7 tasks complete
### Issue Encountered
<description of the issue>
**Options:**
1. <option 1>
2. <option 2>
3. Other approach
What would you like to do?
```
**Guardrails**
- Keep going through tasks until done or blocked
- Always read context files before starting (from the apply instructions output)
- If task is ambiguous, pause and ask before implementing
- If implementation reveals issues, pause and suggest artifact updates
- Keep code changes minimal and scoped to each task
- Update task checkbox immediately after completing each task
- Pause on errors, blockers, or unclear requirements - don't guess
- Use contextFiles from CLI output, don't assume specific file names
**Fluid Workflow Integration**
This skill supports the "actions on a change" model:
- **Can be invoked anytime**: Before all artifacts are done (if tasks exist), after partial implementation, interleaved with other actions
- **Allows artifact updates**: If implementation reveals design issues, suggest updating artifacts - not phase-locked, work fluidly


@@ -0,0 +1,114 @@
---
name: openspec-archive-change
description: Archive a completed change in the experimental workflow. Use when the user wants to finalize and archive a change after implementation is complete.
license: MIT
compatibility: Requires openspec CLI.
metadata:
author: openspec
version: "1.0"
generatedBy: "1.2.0"
---
Archive a completed change in the experimental workflow.
**Input**: Optionally specify a change name. If omitted, check if it can be inferred from conversation context. If vague or ambiguous, you MUST prompt for available changes.
**Steps**
1. **If no change name provided, prompt for selection**
Run `openspec list --json` to get available changes. Use the **AskUserQuestion tool** to let the user select.
Show only active changes (not already archived).
Include the schema used for each change if available.
**IMPORTANT**: Do NOT guess or auto-select a change. Always let the user choose.
2. **Check artifact completion status**
Run `openspec status --change "<name>" --json` to check artifact completion.
Parse the JSON to understand:
- `schemaName`: The workflow being used
- `artifacts`: List of artifacts with their status (`done` or other)
**If any artifacts are not `done`:**
- Display warning listing incomplete artifacts
- Use **AskUserQuestion tool** to confirm user wants to proceed
- Proceed if user confirms
3. **Check task completion status**
Read the tasks file (typically `tasks.md`) to check for incomplete tasks.
Count tasks marked with `- [ ]` (incomplete) vs `- [x]` (complete).
**If incomplete tasks found:**
- Display warning showing count of incomplete tasks
- Use **AskUserQuestion tool** to confirm user wants to proceed
- Proceed if user confirms
**If no tasks file exists:** Proceed without task-related warning.
4. **Assess delta spec sync state**
Check for delta specs at `openspec/changes/<name>/specs/`. If none exist, proceed without sync prompt.
**If delta specs exist:**
- Compare each delta spec with its corresponding main spec at `openspec/specs/<capability>/spec.md`
- Determine what changes would be applied (adds, modifications, removals, renames)
- Show a combined summary before prompting
**Prompt options:**
- If changes needed: "Sync now (recommended)", "Archive without syncing"
- If already synced: "Archive now", "Sync anyway", "Cancel"
If the user chooses sync, use the Task tool (subagent_type: "general-purpose", prompt: "Use Skill tool to invoke openspec-sync-specs for change '<name>'. Delta spec analysis: <include the analyzed delta spec summary>"). Unless the user cancels, proceed to the archive afterwards.
5. **Perform the archive**
Create the archive directory if it doesn't exist:
```bash
mkdir -p openspec/changes/archive
```
Generate target name using current date: `YYYY-MM-DD-<change-name>`
**Check if target already exists:**
- If yes: Fail with error, suggest renaming existing archive or using different date
- If no: Move the change directory to archive
```bash
mv openspec/changes/<name> openspec/changes/archive/YYYY-MM-DD-<name>
```
6. **Display summary**
Show archive completion summary including:
- Change name
- Schema that was used
- Archive location
- Whether specs were synced (if applicable)
- Note about any warnings (incomplete artifacts/tasks)
**Output On Success**
```
## Archive Complete
**Change:** <change-name>
**Schema:** <schema-name>
**Archived to:** openspec/changes/archive/YYYY-MM-DD-<name>/
**Specs:** ✓ Synced to main specs (or "No delta specs" or "Sync skipped")
All artifacts complete. All tasks complete.
```
**Guardrails**
- Always prompt for change selection if not provided
- Use artifact graph (openspec status --json) for completion checking
- Don't block archive on warnings - just inform and confirm
- Preserve .openspec.yaml when moving to archive (it moves with the directory)
- Show clear summary of what happened
- If sync is requested, use the Skill tool to invoke `openspec-sync-specs` (agent-driven)
- If delta specs exist, always run the sync assessment and show the combined summary before prompting


@@ -0,0 +1,288 @@
---
name: openspec-explore
description: Enter explore mode - a thinking partner for exploring ideas, investigating problems, and clarifying requirements. Use when the user wants to think through something before or during a change.
license: MIT
compatibility: Requires openspec CLI.
metadata:
author: openspec
version: "1.0"
generatedBy: "1.2.0"
---
Enter explore mode. Think deeply. Visualize freely. Follow the conversation wherever it goes.
**IMPORTANT: Explore mode is for thinking, not implementing.** You may read files, search code, and investigate the codebase, but you must NEVER write code or implement features. If the user asks you to implement something, remind them to exit explore mode first and create a change proposal. You MAY create OpenSpec artifacts (proposals, designs, specs) if the user asks—that's capturing thinking, not implementing.
**This is a stance, not a workflow.** There are no fixed steps, no required sequence, no mandatory outputs. You're a thinking partner helping the user explore.
---
## The Stance
- **Curious, not prescriptive** - Ask questions that emerge naturally, don't follow a script
- **Open threads, not interrogations** - Surface multiple interesting directions and let the user follow what resonates. Don't funnel them through a single path of questions.
- **Visual** - Use ASCII diagrams liberally when they'd help clarify thinking
- **Adaptive** - Follow interesting threads, pivot when new information emerges
- **Patient** - Don't rush to conclusions, let the shape of the problem emerge
- **Grounded** - Explore the actual codebase when relevant, don't just theorize
---
## What You Might Do
Depending on what the user brings, you might:
**Explore the problem space**
- Ask clarifying questions that emerge from what they said
- Challenge assumptions
- Reframe the problem
- Find analogies
**Investigate the codebase**
- Map existing architecture relevant to the discussion
- Find integration points
- Identify patterns already in use
- Surface hidden complexity
**Compare options**
- Brainstorm multiple approaches
- Build comparison tables
- Sketch tradeoffs
- Recommend a path (if asked)
**Visualize**
```
┌─────────────────────────────────────────┐
│ Use ASCII diagrams liberally │
├─────────────────────────────────────────┤
│ │
│ ┌────────┐ ┌────────┐ │
│ │ State │────────▶│ State │ │
│ │ A │ │ B │ │
│ └────────┘ └────────┘ │
│ │
│ System diagrams, state machines, │
│ data flows, architecture sketches, │
│ dependency graphs, comparison tables │
│ │
└─────────────────────────────────────────┘
```
**Surface risks and unknowns**
- Identify what could go wrong
- Find gaps in understanding
- Suggest spikes or investigations
---
## OpenSpec Awareness
You have full context of the OpenSpec system. Use it naturally, don't force it.
### Check for context
At the start, quickly check what exists:
```bash
openspec list --json
```
This tells you:
- If there are active changes
- Their names, schemas, and status
- What the user might be working on
### When no change exists
Think freely. When insights crystallize, you might offer:
- "This feels solid enough to start a change. Want me to create a proposal?"
- Or keep exploring - no pressure to formalize
### When a change exists
If the user mentions a change or you detect one is relevant:
1. **Read existing artifacts for context**
- `openspec/changes/<name>/proposal.md`
- `openspec/changes/<name>/design.md`
- `openspec/changes/<name>/tasks.md`
- etc.
2. **Reference them naturally in conversation**
- "Your design mentions using Redis, but we just realized SQLite fits better..."
- "The proposal scopes this to premium users, but we're now thinking everyone..."
3. **Offer to capture when decisions are made**
| Insight Type | Where to Capture |
|--------------|------------------|
| New requirement discovered | `specs/<capability>/spec.md` |
| Requirement changed | `specs/<capability>/spec.md` |
| Design decision made | `design.md` |
| Scope changed | `proposal.md` |
| New work identified | `tasks.md` |
| Assumption invalidated | Relevant artifact |
Example offers:
- "That's a design decision. Capture it in design.md?"
- "This is a new requirement. Add it to specs?"
- "This changes scope. Update the proposal?"
4. **The user decides** - Offer and move on. Don't pressure. Don't auto-capture.
---
## What You Don't Have To Do
- Follow a script
- Ask the same questions every time
- Produce a specific artifact
- Reach a conclusion
- Stay on topic if a tangent is valuable
- Be brief (this is thinking time)
---
## Handling Different Entry Points
**User brings a vague idea:**
```
User: I'm thinking about adding real-time collaboration
You: Real-time collab is a big space. Let me think about this...
COLLABORATION SPECTRUM
════════════════════════════════════════════
Awareness Coordination Sync
│ │ │
▼ ▼ ▼
┌────────┐ ┌────────┐ ┌────────┐
│Presence│ │Cursors │ │ CRDT │
│ "3 │ │ Multi │ │Conflict│
│online" │ │ select │ │ free │
└────────┘ └────────┘ └────────┘
│ │ │
trivial moderate complex
Where's your head at?
```
**User brings a specific problem:**
```
User: The auth system is a mess
You: [reads codebase]
┌─────────────────────────────────────────────┐
│ CURRENT AUTH FLOW │
└─────────────────────────────────────────────┘
┌─────────────┼─────────────┐
▼ ▼ ▼
┌─────────┐ ┌─────────┐ ┌─────────┐
│ Google │ │ GitHub │ │ Email │
│ OAuth │ │ OAuth │ │ Magic │
└────┬────┘ └────┬────┘ └────┬────┘
│ │ │
└─────────────┼─────────────┘
┌───────────┐
│ Session │
└─────┬─────┘
┌───────────┐
│ Perms │
└───────────┘
I see three tangles. Which one's burning?
```
**User is stuck mid-implementation:**
```
User: /opsx:explore add-auth-system
The OAuth integration is more complex than expected
You: [reads change artifacts]
You're on task 4: "Implement OAuth flow"
Let me trace what's involved...
[draws diagram, explores options, suggests paths]
Want to update the design to reflect this?
Or add a spike task to investigate?
```
**User wants to compare options:**
```
User: Should we use Postgres or SQLite?
You: Generic answer is boring. What's the context?
User: A CLI tool that tracks local dev environments
You: That changes everything.
┌─────────────────────────────────────────────────┐
│ CLI TOOL DATA STORAGE │
└─────────────────────────────────────────────────┘
Key constraints:
• No daemon running
• Must work offline
• Single user
SQLite Postgres
Deployment embedded ✓ needs server ✗
Offline yes ✓ no ✗
Single file yes ✓ no ✗
SQLite. Not even close.
Unless... is there a sync component?
```
---
## Ending Discovery
There's no required ending. Discovery might:
- **Flow into a proposal**: "Ready to start? I can create a change proposal."
- **Result in artifact updates**: "Updated design.md with these decisions"
- **Just provide clarity**: User has what they need, moves on
- **Continue later**: "We can pick this up anytime"
When it feels like things are crystallizing, you might summarize:
```
## What We Figured Out
**The problem**: [crystallized understanding]
**The approach**: [if one emerged]
**Open questions**: [if any remain]
**Next steps** (if ready):
- Create a change proposal
- Keep exploring: just keep talking
```
But this summary is optional. Sometimes the thinking IS the value.
---
## Guardrails
- **Don't implement** - Never write code or implement features. Creating OpenSpec artifacts is fine, writing application code is not.
- **Don't fake understanding** - If something is unclear, dig deeper
- **Don't rush** - Discovery is thinking time, not task time
- **Don't force structure** - Let patterns emerge naturally
- **Don't auto-capture** - Offer to save insights, don't just do it
- **Do visualize** - A good diagram is worth many paragraphs
- **Do explore the codebase** - Ground discussions in reality
- **Do question assumptions** - Including the user's and your own


@@ -0,0 +1,110 @@
---
name: openspec-propose
description: Propose a new change with all artifacts generated in one step. Use when the user wants to quickly describe what they want to build and get a complete proposal with design, specs, and tasks ready for implementation.
license: MIT
compatibility: Requires openspec CLI.
metadata:
author: openspec
version: "1.0"
generatedBy: "1.2.0"
---
Propose a new change - create the change and generate all artifacts in one step.
I'll create a change with artifacts:
- proposal.md (what & why)
- design.md (how)
- tasks.md (implementation steps)
When ready to implement, run /opsx:apply
---
**Input**: The user's request should include a change name (kebab-case) OR a description of what they want to build.
**Steps**
1. **If no clear input provided, ask what they want to build**
Use the **AskUserQuestion tool** (open-ended, no preset options) to ask:
> "What change do you want to work on? Describe what you want to build or fix."
From their description, derive a kebab-case name (e.g., "add user authentication" → `add-user-auth`).
**IMPORTANT**: Do NOT proceed without understanding what the user wants to build.
2. **Create the change directory**
```bash
openspec new change "<name>"
```
This creates a scaffolded change at `openspec/changes/<name>/` with `.openspec.yaml`.
3. **Get the artifact build order**
```bash
openspec status --change "<name>" --json
```
Parse the JSON to get:
- `applyRequires`: array of artifact IDs needed before implementation (e.g., `["tasks"]`)
- `artifacts`: list of all artifacts with their status and dependencies
4. **Create artifacts in sequence until apply-ready**
Use the **TodoWrite tool** to track progress through the artifacts.
Loop through artifacts in dependency order (artifacts with no pending dependencies first):
a. **For each artifact that is `ready` (dependencies satisfied)**:
- Get instructions:
```bash
openspec instructions <artifact-id> --change "<name>" --json
```
- The instructions JSON includes:
- `context`: Project background (constraints for you - do NOT include in output)
- `rules`: Artifact-specific rules (constraints for you - do NOT include in output)
- `template`: The structure to use for your output file
- `instruction`: Schema-specific guidance for this artifact type
- `outputPath`: Where to write the artifact
- `dependencies`: Completed artifacts to read for context
- Read any completed dependency files for context
- Create the artifact file using `template` as the structure
- Apply `context` and `rules` as constraints - but do NOT copy them into the file
- Show brief progress: "Created <artifact-id>"
b. **Continue until all `applyRequires` artifacts are complete**
- After creating each artifact, re-run `openspec status --change "<name>" --json`
- Check if every artifact ID in `applyRequires` has `status: "done"` in the artifacts array
- Stop when all `applyRequires` artifacts are done
c. **If an artifact requires user input** (unclear context):
- Use **AskUserQuestion tool** to clarify
- Then continue with creation
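The apply-ready check in step 4b can be sketched with `jq` (assuming `jq` is installed; the JSON shape here is illustrative, not the CLI's exact output):

```bash
# Illustrative status JSON; real data comes from `openspec status --change "<name>" --json`
status_json='{"applyRequires":["tasks"],"artifacts":[{"id":"proposal","status":"done"},{"id":"tasks","status":"pending"}]}'

# Prints "true" only when every ID in applyRequires has status "done" in the artifacts array
check_apply_ready() {
  printf '%s' "$1" |
    jq '[.applyRequires[] as $id | (.artifacts[] | select(.id == $id) | .status == "done")] | all'
}

check_apply_ready "$status_json"   # false -- "tasks" is still pending
```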
5. **Show final status**
```bash
openspec status --change "<name>"
```
**Output**
After completing all artifacts, summarize:
- Change name and location
- List of artifacts created with brief descriptions
- What's ready: "All artifacts created! Ready for implementation."
- Prompt: "Run `/opsx:apply` or ask me to implement to start working on the tasks."
**Artifact Creation Guidelines**
- Follow the `instruction` field from `openspec instructions` for each artifact type
- The schema defines what each artifact should contain - follow it
- Read dependency artifacts for context before creating new ones
- Use `template` as the structure for your output file - fill in its sections
- **IMPORTANT**: `context` and `rules` are constraints for YOU, not content for the file
- Do NOT copy `<context>`, `<rules>`, `<project_context>` blocks into the artifact
- These guide what you write, but should never appear in the output
**Guardrails**
- Create ALL artifacts needed for implementation (as defined by schema's `apply.requires`)
- Always read dependency artifacts before creating a new one
- If context is critically unclear, ask the user - but prefer making reasonable decisions to keep momentum
- If a change with that name already exists, ask if user wants to continue it or create a new one
- Verify each artifact file exists after writing before proceeding to the next

View File

@@ -0,0 +1,156 @@
---
name: openspec-apply-change
description: Implement tasks from an OpenSpec change. Use when the user wants to start implementing, continue implementation, or work through tasks.
license: MIT
compatibility: Requires openspec CLI.
metadata:
author: openspec
version: "1.0"
generatedBy: "1.2.0"
---
Implement tasks from an OpenSpec change.
**Input**: Optionally specify a change name. If omitted, check if it can be inferred from conversation context. If vague or ambiguous, you MUST prompt for available changes.
**Steps**
1. **Select the change**
If a name is provided, use it. Otherwise:
- Infer from conversation context if the user mentioned a change
- Auto-select if only one active change exists
- If ambiguous, run `openspec list --json` to get available changes and use the **AskUserQuestion tool** to let the user select
Always announce: "Using change: <name>" and how to override (e.g., `/opsx:apply <other>`).
2. **Check status to understand the schema**
```bash
openspec status --change "<name>" --json
```
Parse the JSON to understand:
- `schemaName`: The workflow being used (e.g., "spec-driven")
- Which artifact contains the tasks (typically "tasks" for spec-driven, check status for others)
3. **Get apply instructions**
```bash
openspec instructions apply --change "<name>" --json
```
This returns:
- Context file paths (varies by schema - could be proposal/specs/design/tasks or spec/tests/implementation/docs)
- Progress (total, complete, remaining)
- Task list with status
- Dynamic instruction based on current state
**Handle states:**
- If `state: "blocked"` (missing artifacts): show message, suggest using openspec-continue-change
- If `state: "all_done"`: congratulate, suggest archive
- Otherwise: proceed to implementation
4. **Read context files**
Read the files listed in `contextFiles` from the apply instructions output.
The files depend on the schema being used:
- **spec-driven**: proposal, specs, design, tasks
- Other schemas: follow the contextFiles from CLI output
5. **Show current progress**
Display:
- Schema being used
- Progress: "N/M tasks complete"
- Remaining tasks overview
- Dynamic instruction from CLI
6. **Implement tasks (loop until done or blocked)**
For each pending task:
- Show which task is being worked on
- Make the code changes required
- Keep changes minimal and focused
- Mark task complete in the tasks file: `- [ ]` → `- [x]`
- Continue to next task
**Pause if:**
- Task is unclear → ask for clarification
- Implementation reveals a design issue → suggest updating artifacts
- Error or blocker encountered → report and wait for guidance
- User interrupts
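The checkbox update in the loop above can be done with a targeted `sed` edit; this is a minimal sketch against a hypothetical tasks file (the real path comes from the change directory, and the numbering format may differ):

```bash
# Create an illustrative tasks file
tasks_file=$(mktemp)
cat > "$tasks_file" <<'EOF'
- [x] 1. Scaffold the change
- [ ] 2. Implement OAuth flow
- [ ] 3. Add tests
EOF

# Mark task 2 complete: flip the unchecked box on that line only
# (GNU sed; on macOS use `sed -i ''`)
sed -i 's/^- \[ \] 2\./- [x] 2./' "$tasks_file"

grep -c -- '- \[x\]' "$tasks_file"   # 2
```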
7. **On completion or pause, show status**
Display:
- Tasks completed this session
- Overall progress: "N/M tasks complete"
- If all done: suggest archive
- If paused: explain why and wait for guidance
**Output During Implementation**
```
## Implementing: <change-name> (schema: <schema-name>)
Working on task 3/7: <task description>
[...implementation happening...]
✓ Task complete
Working on task 4/7: <task description>
[...implementation happening...]
✓ Task complete
```
**Output On Completion**
```
## Implementation Complete
**Change:** <change-name>
**Schema:** <schema-name>
**Progress:** 7/7 tasks complete ✓
### Completed This Session
- [x] Task 1
- [x] Task 2
...
All tasks complete! Ready to archive this change.
```
**Output On Pause (Issue Encountered)**
```
## Implementation Paused
**Change:** <change-name>
**Schema:** <schema-name>
**Progress:** 4/7 tasks complete
### Issue Encountered
<description of the issue>
**Options:**
1. <option 1>
2. <option 2>
3. Other approach
What would you like to do?
```
**Guardrails**
- Keep going through tasks until done or blocked
- Always read context files before starting (from the apply instructions output)
- If task is ambiguous, pause and ask before implementing
- If implementation reveals issues, pause and suggest artifact updates
- Keep code changes minimal and scoped to each task
- Update task checkbox immediately after completing each task
- Pause on errors, blockers, or unclear requirements - don't guess
- Use contextFiles from CLI output, don't assume specific file names
**Fluid Workflow Integration**
This skill supports the "actions on a change" model:
- **Can be invoked anytime**: Before all artifacts are done (if tasks exist), after partial implementation, interleaved with other actions
- **Allows artifact updates**: If implementation reveals design issues, suggest updating artifacts - not phase-locked, work fluidly

View File

@@ -0,0 +1,114 @@
---
name: openspec-archive-change
description: Archive a completed change in the experimental workflow. Use when the user wants to finalize and archive a change after implementation is complete.
license: MIT
compatibility: Requires openspec CLI.
metadata:
author: openspec
version: "1.0"
generatedBy: "1.2.0"
---
Archive a completed change in the experimental workflow.
**Input**: Optionally specify a change name. If omitted, check if it can be inferred from conversation context. If vague or ambiguous, you MUST prompt for available changes.
**Steps**
1. **If no change name provided, prompt for selection**
Run `openspec list --json` to get available changes. Use the **AskUserQuestion tool** to let the user select.
Show only active changes (not already archived).
Include the schema used for each change if available.
**IMPORTANT**: Do NOT guess or auto-select a change. Always let the user choose.
2. **Check artifact completion status**
Run `openspec status --change "<name>" --json` to check artifact completion.
Parse the JSON to understand:
- `schemaName`: The workflow being used
- `artifacts`: List of artifacts with their status (`done` or other)
**If any artifacts are not `done`:**
- Display warning listing incomplete artifacts
- Use **AskUserQuestion tool** to confirm user wants to proceed
- Proceed if user confirms
3. **Check task completion status**
Read the tasks file (typically `tasks.md`) to check for incomplete tasks.
Count tasks marked with `- [ ]` (incomplete) vs `- [x]` (complete).
**If incomplete tasks found:**
- Display warning showing count of incomplete tasks
- Use **AskUserQuestion tool** to confirm user wants to proceed
- Proceed if user confirms
**If no tasks file exists:** Proceed without task-related warning.
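The count described above can be sketched with `grep -c` against a hypothetical tasks file (the real file lives in the change directory):

```bash
# Illustrative tasks file
tasks_file=$(mktemp)
cat > "$tasks_file" <<'EOF'
- [x] 1. Set up schema
- [ ] 2. Wire up API
- [ ] 3. Write docs
EOF

# grep -c exits non-zero when the count is 0, so guard with `|| true`
incomplete=$(grep -c -- '- \[ \]' "$tasks_file" || true)
complete=$(grep -c -- '- \[x\]' "$tasks_file" || true)

echo "$complete complete, $incomplete incomplete"   # 1 complete, 2 incomplete
```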
4. **Assess delta spec sync state**
Check for delta specs at `openspec/changes/<name>/specs/`. If none exist, proceed without sync prompt.
**If delta specs exist:**
- Compare each delta spec with its corresponding main spec at `openspec/specs/<capability>/spec.md`
- Determine what changes would be applied (adds, modifications, removals, renames)
- Show a combined summary before prompting
**Prompt options:**
- If changes needed: "Sync now (recommended)", "Archive without syncing"
- If already synced: "Archive now", "Sync anyway", "Cancel"
If user chooses sync, use Task tool (subagent_type: "general-purpose", prompt: "Use Skill tool to invoke openspec-sync-specs for change '<name>'. Delta spec analysis: <include the analyzed delta spec summary>"). Proceed to archive regardless of choice.
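A minimal sketch of the per-capability comparison, using hypothetical spec files (the real paths follow the `openspec/changes/<name>/specs/` and `openspec/specs/<capability>/spec.md` layout above):

```bash
# Hypothetical delta and main spec files for one capability
main=$(mktemp)
delta=$(mktemp)
printf '## auth\n- Users log in with email\n' > "$main"
printf '## auth\n- Users log in with email or OAuth\n' > "$delta"

# A non-empty unified diff means there are changes to sync
changes=$(diff -u "$main" "$delta" || true)
if [ -n "$changes" ]; then
  echo "sync needed"
else
  echo "already synced"
fi
```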
5. **Perform the archive**
Create the archive directory if it doesn't exist:
```bash
mkdir -p openspec/changes/archive
```
Generate target name using current date: `YYYY-MM-DD-<change-name>`
**Check if target already exists:**
- If yes: Fail with error, suggest renaming the existing archive or using a different date
- If no: Move the change directory to archive
```bash
mv openspec/changes/<name> openspec/changes/archive/YYYY-MM-DD-<name>
```
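The date-stamped move with the existence check can be sketched as follows, using a throwaway directory in place of the real project root:

```bash
# Illustrative layout -- real paths live under the project's openspec/ directory
root=$(mktemp -d)
mkdir -p "$root/openspec/changes/add-auth" "$root/openspec/changes/archive"

name="add-auth"
target="$root/openspec/changes/archive/$(date +%F)-$name"   # %F = YYYY-MM-DD

if [ -e "$target" ]; then
  echo "Archive target already exists: $target" >&2
else
  mv "$root/openspec/changes/$name" "$target"
fi
```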
6. **Display summary**
Show archive completion summary including:
- Change name
- Schema that was used
- Archive location
- Whether specs were synced (if applicable)
- Note about any warnings (incomplete artifacts/tasks)
**Output On Success**
```
## Archive Complete
**Change:** <change-name>
**Schema:** <schema-name>
**Archived to:** openspec/changes/archive/YYYY-MM-DD-<name>/
**Specs:** ✓ Synced to main specs (or "No delta specs" or "Sync skipped")
All artifacts complete. All tasks complete.
```
**Guardrails**
- Always prompt for change selection if not provided
- Use artifact graph (openspec status --json) for completion checking
- Don't block archive on warnings - just inform and confirm
- Preserve .openspec.yaml when moving to archive (it moves with the directory)
- Show clear summary of what happened
- If sync is requested, use the openspec-sync-specs approach (agent-driven)
- If delta specs exist, always run the sync assessment and show the combined summary before prompting

View File

@@ -0,0 +1,288 @@
---
name: openspec-explore
description: Enter explore mode - a thinking partner for exploring ideas, investigating problems, and clarifying requirements. Use when the user wants to think through something before or during a change.
license: MIT
compatibility: Requires openspec CLI.
metadata:
author: openspec
version: "1.0"
generatedBy: "1.2.0"
---
Enter explore mode. Think deeply. Visualize freely. Follow the conversation wherever it goes.
**IMPORTANT: Explore mode is for thinking, not implementing.** You may read files, search code, and investigate the codebase, but you must NEVER write code or implement features. If the user asks you to implement something, remind them to exit explore mode first and create a change proposal. You MAY create OpenSpec artifacts (proposals, designs, specs) if the user asks—that's capturing thinking, not implementing.
**This is a stance, not a workflow.** There are no fixed steps, no required sequence, no mandatory outputs. You're a thinking partner helping the user explore.
---
## The Stance
- **Curious, not prescriptive** - Ask questions that emerge naturally, don't follow a script
- **Open threads, not interrogations** - Surface multiple interesting directions and let the user follow what resonates. Don't funnel them through a single path of questions.
- **Visual** - Use ASCII diagrams liberally when they'd help clarify thinking
- **Adaptive** - Follow interesting threads, pivot when new information emerges
- **Patient** - Don't rush to conclusions, let the shape of the problem emerge
- **Grounded** - Explore the actual codebase when relevant, don't just theorize
---
## What You Might Do
Depending on what the user brings, you might:
**Explore the problem space**
- Ask clarifying questions that emerge from what they said
- Challenge assumptions
- Reframe the problem
- Find analogies
**Investigate the codebase**
- Map existing architecture relevant to the discussion
- Find integration points
- Identify patterns already in use
- Surface hidden complexity
**Compare options**
- Brainstorm multiple approaches
- Build comparison tables
- Sketch tradeoffs
- Recommend a path (if asked)
**Visualize**
```
┌─────────────────────────────────────────┐
│ Use ASCII diagrams liberally │
├─────────────────────────────────────────┤
│ │
│ ┌────────┐ ┌────────┐ │
│ │ State │────────▶│ State │ │
│ │ A │ │ B │ │
│ └────────┘ └────────┘ │
│ │
│ System diagrams, state machines, │
│ data flows, architecture sketches, │
│ dependency graphs, comparison tables │
│ │
└─────────────────────────────────────────┘
```
**Surface risks and unknowns**
- Identify what could go wrong
- Find gaps in understanding
- Suggest spikes or investigations
---
## OpenSpec Awareness
You have full context of the OpenSpec system. Use it naturally, don't force it.
### Check for context
At the start, quickly check what exists:
```bash
openspec list --json
```
This tells you:
- If there are active changes
- Their names, schemas, and status
- What the user might be working on
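Assuming the list output is roughly an array of change objects (illustrative, not the CLI's exact shape), the active change names can be pulled out with `jq`:

```bash
# Illustrative `openspec list --json` output
list_json='[{"name":"add-auth","schema":"spec-driven","status":"active"},{"name":"fix-perf","schema":"spec-driven","status":"active"}]'

# Names of active changes
names=$(printf '%s' "$list_json" | jq -r '.[] | select(.status == "active") | .name')
echo "$names"
```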
### When no change exists
Think freely. When insights crystallize, you might offer:
- "This feels solid enough to start a change. Want me to create a proposal?"
- Or keep exploring - no pressure to formalize
### When a change exists
If the user mentions a change or you detect one is relevant:
1. **Read existing artifacts for context**
- `openspec/changes/<name>/proposal.md`
- `openspec/changes/<name>/design.md`
- `openspec/changes/<name>/tasks.md`
- etc.
2. **Reference them naturally in conversation**
- "Your design mentions using Redis, but we just realized SQLite fits better..."
- "The proposal scopes this to premium users, but we're now thinking everyone..."
3. **Offer to capture when decisions are made**
| Insight Type | Where to Capture |
|--------------|------------------|
| New requirement discovered | `specs/<capability>/spec.md` |
| Requirement changed | `specs/<capability>/spec.md` |
| Design decision made | `design.md` |
| Scope changed | `proposal.md` |
| New work identified | `tasks.md` |
| Assumption invalidated | Relevant artifact |
Example offers:
- "That's a design decision. Capture it in design.md?"
- "This is a new requirement. Add it to specs?"
- "This changes scope. Update the proposal?"
4. **The user decides** - Offer and move on. Don't pressure. Don't auto-capture.
---
## What You Don't Have To Do
- Follow a script
- Ask the same questions every time
- Produce a specific artifact
- Reach a conclusion
- Stay on topic if a tangent is valuable
- Be brief (this is thinking time)
---
## Handling Different Entry Points
**User brings a vague idea:**
```
User: I'm thinking about adding real-time collaboration
You: Real-time collab is a big space. Let me think about this...
COLLABORATION SPECTRUM
════════════════════════════════════════════
Awareness Coordination Sync
│ │ │
▼ ▼ ▼
┌────────┐ ┌────────┐ ┌────────┐
│Presence│ │Cursors │ │ CRDT │
│ "3 │ │ Multi │ │Conflict│
│online" │ │ select │ │ free │
└────────┘ └────────┘ └────────┘
│ │ │
trivial moderate complex
Where's your head at?
```
**User brings a specific problem:**
```
User: The auth system is a mess
You: [reads codebase]
┌─────────────────────────────────────────────┐
│ CURRENT AUTH FLOW │
└─────────────────────────────────────────────┘
┌─────────────┼─────────────┐
▼ ▼ ▼
┌─────────┐ ┌─────────┐ ┌─────────┐
│ Google │ │ GitHub │ │ Email │
│ OAuth │ │ OAuth │ │ Magic │
└────┬────┘ └────┬────┘ └────┬────┘
│ │ │
└─────────────┼─────────────┘
┌───────────┐
│ Session │
└─────┬─────┘
┌───────────┐
│ Perms │
└───────────┘
I see three tangles. Which one's burning?
```
**User is stuck mid-implementation:**
```
User: /opsx:explore add-auth-system
The OAuth integration is more complex than expected
You: [reads change artifacts]
You're on task 4: "Implement OAuth flow"
Let me trace what's involved...
[draws diagram, explores options, suggests paths]
Want to update the design to reflect this?
Or add a spike task to investigate?
```
**User wants to compare options:**
```
User: Should we use Postgres or SQLite?
You: Generic answer is boring. What's the context?
User: A CLI tool that tracks local dev environments
You: That changes everything.
┌─────────────────────────────────────────────────┐
│ CLI TOOL DATA STORAGE │
└─────────────────────────────────────────────────┘
Key constraints:
• No daemon running
• Must work offline
• Single user
SQLite Postgres
Deployment embedded ✓ needs server ✗
Offline yes ✓ no ✗
Single file yes ✓ no ✗
SQLite. Not even close.
Unless... is there a sync component?
```
---
## Ending Discovery
There's no required ending. Discovery might:
- **Flow into a proposal**: "Ready to start? I can create a change proposal."
- **Result in artifact updates**: "Updated design.md with these decisions"
- **Just provide clarity**: User has what they need, moves on
- **Continue later**: "We can pick this up anytime"
When it feels like things are crystallizing, you might summarize:
```
## What We Figured Out
**The problem**: [crystallized understanding]
**The approach**: [if one emerged]
**Open questions**: [if any remain]
**Next steps** (if ready):
- Create a change proposal
- Keep exploring: just keep talking
```
But this summary is optional. Sometimes the thinking IS the value.
---
## Guardrails
- **Don't implement** - Never write code or implement features. Creating OpenSpec artifacts is fine, writing application code is not.
- **Don't fake understanding** - If something is unclear, dig deeper
- **Don't rush** - Discovery is thinking time, not task time
- **Don't force structure** - Let patterns emerge naturally
- **Don't auto-capture** - Offer to save insights, don't just do it
- **Do visualize** - A good diagram is worth many paragraphs
- **Do explore the codebase** - Ground discussions in reality
- **Do question assumptions** - Including the user's and your own

View File

@@ -0,0 +1,110 @@
---
name: openspec-propose
description: Propose a new change with all artifacts generated in one step. Use when the user wants to quickly describe what they want to build and get a complete proposal with design, specs, and tasks ready for implementation.
license: MIT
compatibility: Requires openspec CLI.
metadata:
author: openspec
version: "1.0"
generatedBy: "1.2.0"
---
Propose a new change - create the change and generate all artifacts in one step.
I'll create a change with artifacts:
- proposal.md (what & why)
- design.md (how)
- tasks.md (implementation steps)
When ready to implement, run `/opsx:apply`
---
**Input**: The user's request should include a change name (kebab-case) OR a description of what they want to build.
**Steps**
1. **If no clear input provided, ask what they want to build**
Use the **AskUserQuestion tool** (open-ended, no preset options) to ask:
> "What change do you want to work on? Describe what you want to build or fix."
From their description, derive a kebab-case name (e.g., "add user authentication" → `add-user-auth`).
**IMPORTANT**: Do NOT proceed without understanding what the user wants to build.
2. **Create the change directory**
```bash
openspec new change "<name>"
```
This creates a scaffolded change at `openspec/changes/<name>/` with `.openspec.yaml`.
3. **Get the artifact build order**
```bash
openspec status --change "<name>" --json
```
Parse the JSON to get:
- `applyRequires`: array of artifact IDs needed before implementation (e.g., `["tasks"]`)
- `artifacts`: list of all artifacts with their status and dependencies
4. **Create artifacts in sequence until apply-ready**
Use the **TodoWrite tool** to track progress through the artifacts.
Loop through artifacts in dependency order (artifacts with no pending dependencies first):
a. **For each artifact that is `ready` (dependencies satisfied)**:
- Get instructions:
```bash
openspec instructions <artifact-id> --change "<name>" --json
```
- The instructions JSON includes:
- `context`: Project background (constraints for you - do NOT include in output)
- `rules`: Artifact-specific rules (constraints for you - do NOT include in output)
- `template`: The structure to use for your output file
- `instruction`: Schema-specific guidance for this artifact type
- `outputPath`: Where to write the artifact
- `dependencies`: Completed artifacts to read for context
- Read any completed dependency files for context
- Create the artifact file using `template` as the structure
- Apply `context` and `rules` as constraints - but do NOT copy them into the file
- Show brief progress: "Created <artifact-id>"
b. **Continue until all `applyRequires` artifacts are complete**
- After creating each artifact, re-run `openspec status --change "<name>" --json`
- Check if every artifact ID in `applyRequires` has `status: "done"` in the artifacts array
- Stop when all `applyRequires` artifacts are done
c. **If an artifact requires user input** (unclear context):
- Use **AskUserQuestion tool** to clarify
- Then continue with creation
5. **Show final status**
```bash
openspec status --change "<name>"
```
**Output**
After completing all artifacts, summarize:
- Change name and location
- List of artifacts created with brief descriptions
- What's ready: "All artifacts created! Ready for implementation."
- Prompt: "Run `/opsx:apply` or ask me to implement to start working on the tasks."
**Artifact Creation Guidelines**
- Follow the `instruction` field from `openspec instructions` for each artifact type
- The schema defines what each artifact should contain - follow it
- Read dependency artifacts for context before creating new ones
- Use `template` as the structure for your output file - fill in its sections
- **IMPORTANT**: `context` and `rules` are constraints for YOU, not content for the file
- Do NOT copy `<context>`, `<rules>`, `<project_context>` blocks into the artifact
- These guide what you write, but should never appear in the output
**Guardrails**
- Create ALL artifacts needed for implementation (as defined by schema's `apply.requires`)
- Always read dependency artifacts before creating a new one
- If context is critically unclear, ask the user - but prefer making reasonable decisions to keep momentum
- If a change with that name already exists, ask if user wants to continue it or create a new one
- Verify each artifact file exists after writing before proceeding to the next

View File

@@ -0,0 +1,152 @@
---
name: /opsx-apply
id: opsx-apply
category: Workflow
description: Implement tasks from an OpenSpec change (Experimental)
---
Implement tasks from an OpenSpec change.
**Input**: Optionally specify a change name (e.g., `/opsx:apply add-auth`). If omitted, check if it can be inferred from conversation context. If vague or ambiguous, you MUST prompt for available changes.
**Steps**
1. **Select the change**
If a name is provided, use it. Otherwise:
- Infer from conversation context if the user mentioned a change
- Auto-select if only one active change exists
- If ambiguous, run `openspec list --json` to get available changes and use the **AskUserQuestion tool** to let the user select
Always announce: "Using change: <name>" and how to override (e.g., `/opsx:apply <other>`).
2. **Check status to understand the schema**
```bash
openspec status --change "<name>" --json
```
Parse the JSON to understand:
- `schemaName`: The workflow being used (e.g., "spec-driven")
- Which artifact contains the tasks (typically "tasks" for spec-driven, check status for others)
3. **Get apply instructions**
```bash
openspec instructions apply --change "<name>" --json
```
This returns:
- Context file paths (varies by schema)
- Progress (total, complete, remaining)
- Task list with status
- Dynamic instruction based on current state
**Handle states:**
- If `state: "blocked"` (missing artifacts): show message, suggest using `/opsx:continue`
- If `state: "all_done"`: congratulate, suggest archive
- Otherwise: proceed to implementation
4. **Read context files**
Read the files listed in `contextFiles` from the apply instructions output.
The files depend on the schema being used:
- **spec-driven**: proposal, specs, design, tasks
- Other schemas: follow the contextFiles from CLI output
5. **Show current progress**
Display:
- Schema being used
- Progress: "N/M tasks complete"
- Remaining tasks overview
- Dynamic instruction from CLI
6. **Implement tasks (loop until done or blocked)**
For each pending task:
- Show which task is being worked on
- Make the code changes required
- Keep changes minimal and focused
- Mark task complete in the tasks file: `- [ ]` → `- [x]`
- Continue to next task
**Pause if:**
- Task is unclear → ask for clarification
- Implementation reveals a design issue → suggest updating artifacts
- Error or blocker encountered → report and wait for guidance
- User interrupts
7. **On completion or pause, show status**
Display:
- Tasks completed this session
- Overall progress: "N/M tasks complete"
- If all done: suggest archive
- If paused: explain why and wait for guidance
**Output During Implementation**
```
## Implementing: <change-name> (schema: <schema-name>)
Working on task 3/7: <task description>
[...implementation happening...]
✓ Task complete
Working on task 4/7: <task description>
[...implementation happening...]
✓ Task complete
```
**Output On Completion**
```
## Implementation Complete
**Change:** <change-name>
**Schema:** <schema-name>
**Progress:** 7/7 tasks complete ✓
### Completed This Session
- [x] Task 1
- [x] Task 2
...
All tasks complete! You can archive this change with `/opsx:archive`.
```
**Output On Pause (Issue Encountered)**
```
## Implementation Paused
**Change:** <change-name>
**Schema:** <schema-name>
**Progress:** 4/7 tasks complete
### Issue Encountered
<description of the issue>
**Options:**
1. <option 1>
2. <option 2>
3. Other approach
What would you like to do?
```
**Guardrails**
- Keep going through tasks until done or blocked
- Always read context files before starting (from the apply instructions output)
- If task is ambiguous, pause and ask before implementing
- If implementation reveals issues, pause and suggest artifact updates
- Keep code changes minimal and scoped to each task
- Update task checkbox immediately after completing each task
- Pause on errors, blockers, or unclear requirements - don't guess
- Use contextFiles from CLI output, don't assume specific file names
**Fluid Workflow Integration**
This skill supports the "actions on a change" model:
- **Can be invoked anytime**: Before all artifacts are done (if tasks exist), after partial implementation, interleaved with other actions
- **Allows artifact updates**: If implementation reveals design issues, suggest updating artifacts - not phase-locked, work fluidly

View File

@@ -0,0 +1,157 @@
---
name: /opsx-archive
id: opsx-archive
category: Workflow
description: Archive a completed change in the experimental workflow
---
Archive a completed change in the experimental workflow.
**Input**: Optionally specify a change name after `/opsx:archive` (e.g., `/opsx:archive add-auth`). If omitted, check if it can be inferred from conversation context. If vague or ambiguous, you MUST prompt for available changes.
**Steps**
1. **If no change name provided, prompt for selection**
Run `openspec list --json` to get available changes. Use the **AskUserQuestion tool** to let the user select.
Show only active changes (not already archived).
Include the schema used for each change if available.
**IMPORTANT**: Do NOT guess or auto-select a change. Always let the user choose.
2. **Check artifact completion status**
Run `openspec status --change "<name>" --json` to check artifact completion.
Parse the JSON to understand:
- `schemaName`: The workflow being used
- `artifacts`: List of artifacts with their status (`done` or other)
**If any artifacts are not `done`:**
- Display warning listing incomplete artifacts
- Prompt user for confirmation to continue
- Proceed if user confirms
3. **Check task completion status**
Read the tasks file (typically `tasks.md`) to check for incomplete tasks.
Count tasks marked with `- [ ]` (incomplete) vs `- [x]` (complete).
**If incomplete tasks found:**
- Display warning showing count of incomplete tasks
- Prompt user for confirmation to continue
- Proceed if user confirms
**If no tasks file exists:** Proceed without task-related warning.
4. **Assess delta spec sync state**
Check for delta specs at `openspec/changes/<name>/specs/`. If none exist, proceed without sync prompt.
**If delta specs exist:**
- Compare each delta spec with its corresponding main spec at `openspec/specs/<capability>/spec.md`
- Determine what changes would be applied (adds, modifications, removals, renames)
- Show a combined summary before prompting
**Prompt options:**
- If changes needed: "Sync now (recommended)", "Archive without syncing"
- If already synced: "Archive now", "Sync anyway", "Cancel"
If user chooses sync, use Task tool (subagent_type: "general-purpose", prompt: "Use Skill tool to invoke openspec-sync-specs for change '<name>'. Delta spec analysis: <include the analyzed delta spec summary>"). Proceed to archive regardless of choice.
5. **Perform the archive**
Create the archive directory if it doesn't exist:
```bash
mkdir -p openspec/changes/archive
```
Generate target name using current date: `YYYY-MM-DD-<change-name>`
**Check if target already exists:**
- If yes: Fail with error, suggest renaming the existing archive or using a different date
- If no: Move the change directory to archive
```bash
mv openspec/changes/<name> openspec/changes/archive/YYYY-MM-DD-<name>
```
6. **Display summary**
Show archive completion summary including:
- Change name
- Schema that was used
- Archive location
- Spec sync status (synced / sync skipped / no delta specs)
- Note about any warnings (incomplete artifacts/tasks)
**Output On Success**
```
## Archive Complete
**Change:** <change-name>
**Schema:** <schema-name>
**Archived to:** openspec/changes/archive/YYYY-MM-DD-<name>/
**Specs:** ✓ Synced to main specs
All artifacts complete. All tasks complete.
```
**Output On Success (No Delta Specs)**
```
## Archive Complete
**Change:** <change-name>
**Schema:** <schema-name>
**Archived to:** openspec/changes/archive/YYYY-MM-DD-<name>/
**Specs:** No delta specs
All artifacts complete. All tasks complete.
```
**Output On Success With Warnings**
```
## Archive Complete (with warnings)
**Change:** <change-name>
**Schema:** <schema-name>
**Archived to:** openspec/changes/archive/YYYY-MM-DD-<name>/
**Specs:** Sync skipped (user chose to skip)
**Warnings:**
- Archived with 2 incomplete artifacts
- Archived with 3 incomplete tasks
- Delta spec sync was skipped (user chose to skip)
Review the archive if this was not intentional.
```
**Output On Error (Archive Exists)**
```
## Archive Failed
**Change:** <change-name>
**Target:** openspec/changes/archive/YYYY-MM-DD-<name>/
Target archive directory already exists.
**Options:**
1. Rename the existing archive
2. Delete the existing archive if it's a duplicate
3. Wait until a different date to archive
```
**Guardrails**
- Always prompt for change selection if not provided
- Use artifact graph (openspec status --json) for completion checking
- Don't block archive on warnings - just inform and confirm
- Preserve .openspec.yaml when moving to archive (it moves with the directory)
- Show clear summary of what happened
- If sync is requested, use the Skill tool to invoke `openspec-sync-specs` (agent-driven)
- If delta specs exist, always run the sync assessment and show the combined summary before prompting


@@ -0,0 +1,173 @@
---
name: /opsx-explore
id: opsx-explore
category: Workflow
description: "Enter explore mode - think through ideas, investigate problems, clarify requirements"
---
Enter explore mode. Think deeply. Visualize freely. Follow the conversation wherever it goes.
**IMPORTANT: Explore mode is for thinking, not implementing.** You may read files, search code, and investigate the codebase, but you must NEVER write code or implement features. If the user asks you to implement something, remind them to exit explore mode first and create a change proposal. You MAY create OpenSpec artifacts (proposals, designs, specs) if the user asks—that's capturing thinking, not implementing.
**This is a stance, not a workflow.** There are no fixed steps, no required sequence, no mandatory outputs. You're a thinking partner helping the user explore.
**Input**: The argument after `/opsx:explore` is whatever the user wants to think about. Could be:
- A vague idea: "real-time collaboration"
- A specific problem: "the auth system is getting unwieldy"
- A change name: "add-dark-mode" (to explore in context of that change)
- A comparison: "postgres vs sqlite for this"
- Nothing (just enter explore mode)
---
## The Stance
- **Curious, not prescriptive** - Ask questions that emerge naturally, don't follow a script
- **Open threads, not interrogations** - Surface multiple interesting directions and let the user follow what resonates. Don't funnel them through a single path of questions.
- **Visual** - Use ASCII diagrams liberally when they'd help clarify thinking
- **Adaptive** - Follow interesting threads, pivot when new information emerges
- **Patient** - Don't rush to conclusions, let the shape of the problem emerge
- **Grounded** - Explore the actual codebase when relevant, don't just theorize
---
## What You Might Do
Depending on what the user brings, you might:
**Explore the problem space**
- Ask clarifying questions that emerge from what they said
- Challenge assumptions
- Reframe the problem
- Find analogies
**Investigate the codebase**
- Map existing architecture relevant to the discussion
- Find integration points
- Identify patterns already in use
- Surface hidden complexity
**Compare options**
- Brainstorm multiple approaches
- Build comparison tables
- Sketch tradeoffs
- Recommend a path (if asked)
**Visualize**
```
┌─────────────────────────────────────────┐
│      Use ASCII diagrams liberally       │
├─────────────────────────────────────────┤
│                                         │
│   ┌────────┐         ┌────────┐         │
│   │ State  │────────▶│ State  │         │
│   │   A    │         │   B    │         │
│   └────────┘         └────────┘         │
│                                         │
│   System diagrams, state machines,      │
│   data flows, architecture sketches,    │
│   dependency graphs, comparison tables  │
│                                         │
└─────────────────────────────────────────┘
```
**Surface risks and unknowns**
- Identify what could go wrong
- Find gaps in understanding
- Suggest spikes or investigations
---
## OpenSpec Awareness
You have full context of the OpenSpec system. Use it naturally, don't force it.
### Check for context
At the start, quickly check what exists:
```bash
openspec list --json
```
This tells you:
- If there are active changes
- Their names, schemas, and status
- What the user might be working on
If the user mentioned a specific change name, read its artifacts for context.
### When no change exists
Think freely. When insights crystallize, you might offer:
- "This feels solid enough to start a change. Want me to create a proposal?"
- Or keep exploring - no pressure to formalize
### When a change exists
If the user mentions a change or you detect one is relevant:
1. **Read existing artifacts for context**
- `openspec/changes/<name>/proposal.md`
- `openspec/changes/<name>/design.md`
- `openspec/changes/<name>/tasks.md`
- etc.
2. **Reference them naturally in conversation**
- "Your design mentions using Redis, but we just realized SQLite fits better..."
- "The proposal scopes this to premium users, but we're now thinking everyone..."
3. **Offer to capture when decisions are made**
| Insight Type | Where to Capture |
|--------------|------------------|
| New requirement discovered | `specs/<capability>/spec.md` |
| Requirement changed | `specs/<capability>/spec.md` |
| Design decision made | `design.md` |
| Scope changed | `proposal.md` |
| New work identified | `tasks.md` |
| Assumption invalidated | Relevant artifact |
Example offers:
- "That's a design decision. Capture it in design.md?"
- "This is a new requirement. Add it to specs?"
- "This changes scope. Update the proposal?"
4. **The user decides** - Offer and move on. Don't pressure. Don't auto-capture.
---
## What You Don't Have To Do
- Follow a script
- Ask the same questions every time
- Produce a specific artifact
- Reach a conclusion
- Stay on topic if a tangent is valuable
- Be brief (this is thinking time)
---
## Ending Exploration
There's no required ending. Exploration might:
- **Flow into a proposal**: "Ready to start? I can create a change proposal."
- **Result in artifact updates**: "Updated design.md with these decisions"
- **Just provide clarity**: User has what they need, moves on
- **Continue later**: "We can pick this up anytime"
When things crystallize, you might offer a summary - but it's optional. Sometimes the thinking IS the value.
---
## Guardrails
- **Don't implement** - Never write code or implement features. Creating OpenSpec artifacts is fine, writing application code is not.
- **Don't fake understanding** - If something is unclear, dig deeper
- **Don't rush** - Exploration is thinking time, not task time
- **Don't force structure** - Let patterns emerge naturally
- **Don't auto-capture** - Offer to save insights, don't just do it
- **Do visualize** - A good diagram is worth many paragraphs
- **Do explore the codebase** - Ground discussions in reality
- **Do question assumptions** - Including the user's and your own


@@ -0,0 +1,106 @@
---
name: /opsx-propose
id: opsx-propose
category: Workflow
description: Propose a new change - create it and generate all artifacts in one step
---
Propose a new change - create the change and generate all artifacts in one step.
I'll create a change with artifacts:
- proposal.md (what & why)
- design.md (how)
- tasks.md (implementation steps)
When ready to implement, run /opsx:apply
---
**Input**: The argument after `/opsx:propose` is the change name (kebab-case), OR a description of what the user wants to build.
**Steps**
1. **If no input provided, ask what they want to build**
Use the **AskUserQuestion tool** (open-ended, no preset options) to ask:
> "What change do you want to work on? Describe what you want to build or fix."
From their description, derive a kebab-case name (e.g., "add user authentication" → `add-user-auth`).
**IMPORTANT**: Do NOT proceed without understanding what the user wants to build.
2. **Create the change directory**
```bash
openspec new change "<name>"
```
This creates a scaffolded change at `openspec/changes/<name>/` with `.openspec.yaml`.
3. **Get the artifact build order**
```bash
openspec status --change "<name>" --json
```
Parse the JSON to get:
- `applyRequires`: array of artifact IDs needed before implementation (e.g., `["tasks"]`)
- `artifacts`: list of all artifacts with their status and dependencies
4. **Create artifacts in sequence until apply-ready**
Use the **TodoWrite tool** to track progress through the artifacts.
Loop through artifacts in dependency order (artifacts with no pending dependencies first):
a. **For each artifact that is `ready` (dependencies satisfied)**:
- Get instructions:
```bash
openspec instructions <artifact-id> --change "<name>" --json
```
- The instructions JSON includes:
- `context`: Project background (constraints for you - do NOT include in output)
- `rules`: Artifact-specific rules (constraints for you - do NOT include in output)
- `template`: The structure to use for your output file
- `instruction`: Schema-specific guidance for this artifact type
- `outputPath`: Where to write the artifact
- `dependencies`: Completed artifacts to read for context
- Read any completed dependency files for context
- Create the artifact file using `template` as the structure
- Apply `context` and `rules` as constraints - but do NOT copy them into the file
- Show brief progress: "Created <artifact-id>"
b. **Continue until all `applyRequires` artifacts are complete**
- After creating each artifact, re-run `openspec status --change "<name>" --json`
- Check if every artifact ID in `applyRequires` has `status: "done"` in the artifacts array
- Stop when all `applyRequires` artifacts are done
c. **If an artifact requires user input** (unclear context):
- Use **AskUserQuestion tool** to clarify
- Then continue with creation
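The apply-ready check in step 4b can be sketched as follows. The JSON here is a hand-written stand-in for `openspec status --change "<name>" --json` output, using only the fields described above; parsing with `python3 -c` is just one convenient option.

```bash
# Sketch of step 4b: every ID in applyRequires must have status "done".
status='{"applyRequires":["tasks"],"artifacts":[{"id":"proposal","status":"done"},{"id":"tasks","status":"done"}]}'
ready=$(printf '%s' "$status" | python3 -c '
import json, sys
s = json.load(sys.stdin)
done = {a["id"] for a in s["artifacts"] if a["status"] == "done"}
print("true" if set(s["applyRequires"]) <= done else "false")
')
echo "$ready"
```

When `$ready` is `true`, the loop in step 4 can stop and hand off to step 5.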
5. **Show final status**
```bash
openspec status --change "<name>"
```
**Output**
After completing all artifacts, summarize:
- Change name and location
- List of artifacts created with brief descriptions
- What's ready: "All artifacts created! Ready for implementation."
- Prompt: "Run `/opsx:apply` to start implementing."
**Artifact Creation Guidelines**
- Follow the `instruction` field from `openspec instructions` for each artifact type
- The schema defines what each artifact should contain - follow it
- Read dependency artifacts for context before creating new ones
- Use `template` as the structure for your output file - fill in its sections
- **IMPORTANT**: `context` and `rules` are constraints for YOU, not content for the file
- Do NOT copy `<context>`, `<rules>`, `<project_context>` blocks into the artifact
- These guide what you write, but should never appear in the output
**Guardrails**
- Create ALL artifacts needed for implementation (as defined by schema's `apply.requires`)
- Always read dependency artifacts before creating a new one
- If context is critically unclear, ask the user - but prefer making reasonable decisions to keep momentum
- If a change with that name already exists, ask if user wants to continue it or create a new one
- Verify each artifact file exists after writing before proceeding to the next


@@ -0,0 +1,156 @@
---
name: openspec-apply-change
description: Implement tasks from an OpenSpec change. Use when the user wants to start implementing, continue implementation, or work through tasks.
license: MIT
compatibility: Requires openspec CLI.
metadata:
  author: openspec
  version: "1.0"
  generatedBy: "1.2.0"
---
Implement tasks from an OpenSpec change.
**Input**: Optionally specify a change name. If omitted, check whether it can be inferred from conversation context. If the reference is vague or ambiguous, you MUST prompt with the available changes.
**Steps**
1. **Select the change**
If a name is provided, use it. Otherwise:
- Infer from conversation context if the user mentioned a change
- Auto-select if only one active change exists
- If ambiguous, run `openspec list --json` to get available changes and use the **AskUserQuestion tool** to let the user select
Always announce: "Using change: <name>" and how to override (e.g., `/opsx:apply <other>`).
2. **Check status to understand the schema**
```bash
openspec status --change "<name>" --json
```
Parse the JSON to understand:
- `schemaName`: The workflow being used (e.g., "spec-driven")
- Which artifact contains the tasks (typically "tasks" for spec-driven, check status for others)
3. **Get apply instructions**
```bash
openspec instructions apply --change "<name>" --json
```
This returns:
- Context file paths (varies by schema - could be proposal/specs/design/tasks or spec/tests/implementation/docs)
- Progress (total, complete, remaining)
- Task list with status
- Dynamic instruction based on current state
**Handle states:**
- If `state: "blocked"` (missing artifacts): show message, suggest using openspec-continue-change
- If `state: "all_done"`: congratulate, suggest archive
- Otherwise: proceed to implementation
4. **Read context files**
Read the files listed in `contextFiles` from the apply instructions output.
The files depend on the schema being used:
- **spec-driven**: proposal, specs, design, tasks
- Other schemas: follow the contextFiles from CLI output
5. **Show current progress**
Display:
- Schema being used
- Progress: "N/M tasks complete"
- Remaining tasks overview
- Dynamic instruction from CLI
6. **Implement tasks (loop until done or blocked)**
For each pending task:
- Show which task is being worked on
- Make the code changes required
- Keep changes minimal and focused
- Mark task complete in the tasks file: `- [ ]` → `- [x]`
- Continue to next task
**Pause if:**
- Task is unclear → ask for clarification
- Implementation reveals a design issue → suggest updating artifacts
- Error or blocker encountered → report and wait for guidance
- User interrupts
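The checkbox update in step 6 can be sketched as a plain text substitution; the tasks.md contents and task name below are illustrative, and a temp copy is edited to keep the sketch self-contained.

```bash
# Sketch: flip one task from "- [ ]" to "- [x]" in the tasks file.
dir="$(mktemp -d)"
printf -- '- [x] Task 1\n- [ ] Task 2\n' > "$dir/tasks.md"
task='Task 2'
# Write to a temp file then move, to stay portable across GNU/BSD sed
sed "s/^- \[ \] $task\$/- [x] $task/" "$dir/tasks.md" > "$dir/tasks.tmp" \
  && mv "$dir/tasks.tmp" "$dir/tasks.md"
cat "$dir/tasks.md"
```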
7. **On completion or pause, show status**
Display:
- Tasks completed this session
- Overall progress: "N/M tasks complete"
- If all done: suggest archive
- If paused: explain why and wait for guidance
**Output During Implementation**
```
## Implementing: <change-name> (schema: <schema-name>)
Working on task 3/7: <task description>
[...implementation happening...]
✓ Task complete
Working on task 4/7: <task description>
[...implementation happening...]
✓ Task complete
```
**Output On Completion**
```
## Implementation Complete
**Change:** <change-name>
**Schema:** <schema-name>
**Progress:** 7/7 tasks complete ✓
### Completed This Session
- [x] Task 1
- [x] Task 2
...
All tasks complete! Ready to archive this change.
```
**Output On Pause (Issue Encountered)**
```
## Implementation Paused
**Change:** <change-name>
**Schema:** <schema-name>
**Progress:** 4/7 tasks complete
### Issue Encountered
<description of the issue>
**Options:**
1. <option 1>
2. <option 2>
3. Other approach
What would you like to do?
```
**Guardrails**
- Keep going through tasks until done or blocked
- Always read context files before starting (from the apply instructions output)
- If task is ambiguous, pause and ask before implementing
- If implementation reveals issues, pause and suggest artifact updates
- Keep code changes minimal and scoped to each task
- Update task checkbox immediately after completing each task
- Pause on errors, blockers, or unclear requirements - don't guess
- Use contextFiles from CLI output, don't assume specific file names
**Fluid Workflow Integration**
This skill supports the "actions on a change" model:
- **Can be invoked anytime**: Before all artifacts are done (if tasks exist), after partial implementation, interleaved with other actions
- **Allows artifact updates**: If implementation reveals design issues, suggest updating artifacts - not phase-locked, work fluidly


@@ -0,0 +1,114 @@
---
name: openspec-archive-change
description: Archive a completed change in the experimental workflow. Use when the user wants to finalize and archive a change after implementation is complete.
license: MIT
compatibility: Requires openspec CLI.
metadata:
  author: openspec
  version: "1.0"
  generatedBy: "1.2.0"
---
Archive a completed change in the experimental workflow.
**Input**: Optionally specify a change name. If omitted, check whether it can be inferred from conversation context. If the reference is vague or ambiguous, you MUST prompt with the available changes.
**Steps**
1. **If no change name provided, prompt for selection**
Run `openspec list --json` to get available changes. Use the **AskUserQuestion tool** to let the user select.
Show only active changes (not already archived).
Include the schema used for each change if available.
**IMPORTANT**: Do NOT guess or auto-select a change. Always let the user choose.
2. **Check artifact completion status**
Run `openspec status --change "<name>" --json` to check artifact completion.
Parse the JSON to understand:
- `schemaName`: The workflow being used
- `artifacts`: List of artifacts with their status (`done` or other)
**If any artifacts are not `done`:**
- Display warning listing incomplete artifacts
- Use **AskUserQuestion tool** to confirm user wants to proceed
- Proceed if user confirms
3. **Check task completion status**
Read the tasks file (typically `tasks.md`) to check for incomplete tasks.
Count tasks marked with `- [ ]` (incomplete) vs `- [x]` (complete).
**If incomplete tasks found:**
- Display warning showing count of incomplete tasks
- Use **AskUserQuestion tool** to confirm user wants to proceed
- Proceed if user confirms
**If no tasks file exists:** Proceed without a task-related warning.
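The count in step 3 can be sketched as follows; the tasks.md contents are illustrative.

```bash
# Sketch: count incomplete vs complete checkboxes in tasks.md.
dir="$(mktemp -d)"
printf -- '- [x] Task 1\n- [ ] Task 2\n- [ ] Task 3\n' > "$dir/tasks.md"
# grep -c prints 0 but exits nonzero on no match, hence "|| true"
incomplete=$(grep -c '^- \[ \]' "$dir/tasks.md" || true)
complete=$(grep -c '^- \[x\]' "$dir/tasks.md" || true)
echo "$complete complete, $incomplete incomplete"
```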
4. **Assess delta spec sync state**
Check for delta specs at `openspec/changes/<name>/specs/`. If none exist, proceed without sync prompt.
**If delta specs exist:**
- Compare each delta spec with its corresponding main spec at `openspec/specs/<capability>/spec.md`
- Determine what changes would be applied (adds, modifications, removals, renames)
- Show a combined summary before prompting
**Prompt options:**
- If changes needed: "Sync now (recommended)", "Archive without syncing"
- If already synced: "Archive now", "Sync anyway", "Cancel"
If user chooses sync, use Task tool (subagent_type: "general-purpose", prompt: "Use Skill tool to invoke openspec-sync-specs for change '<name>'. Delta spec analysis: <include the analyzed delta spec summary>"). Proceed to archive regardless of choice.
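The assessment in step 4 can be sketched as a plain `diff` per capability. The change name, capability, and spec contents below are all made up, and the sketch runs against a temp sandbox rather than a real repo.

```bash
# Sketch of step 4: preview what syncing each delta spec would change.
root="$(mktemp -d)"
name="add-dark-mode"                          # hypothetical change name
mkdir -p "$root/openspec/specs/auth" "$root/openspec/changes/$name/specs/auth"
printf '## Requirement: login\n' > "$root/openspec/specs/auth/spec.md"
printf '## Requirement: login\n## Requirement: logout\n' \
  > "$root/openspec/changes/$name/specs/auth/spec.md"

# Diff each delta spec against its main spec; diff exits 1 on differences
for delta in "$root/openspec/changes/$name"/specs/*/spec.md; do
  cap="$(basename "$(dirname "$delta")")"
  diff -u "$root/openspec/specs/$cap/spec.md" "$delta" || true
done > "$root/sync-preview.diff"
```

Lines prefixed `+` in the preview are additions the sync would apply to the main specs.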
5. **Perform the archive**
Create the archive directory if it doesn't exist:
```bash
mkdir -p openspec/changes/archive
```
Generate target name using current date: `YYYY-MM-DD-<change-name>`
**Check if target already exists:**
- If yes: Fail with error, suggest renaming existing archive or using different date
- If no: Move the change directory to archive
```bash
mv openspec/changes/<name> openspec/changes/archive/YYYY-MM-DD-<name>
```
6. **Display summary**
Show archive completion summary including:
- Change name
- Schema that was used
- Archive location
- Whether specs were synced (if applicable)
- Note about any warnings (incomplete artifacts/tasks)
**Output On Success**
```
## Archive Complete
**Change:** <change-name>
**Schema:** <schema-name>
**Archived to:** openspec/changes/archive/YYYY-MM-DD-<name>/
**Specs:** ✓ Synced to main specs (or "No delta specs" or "Sync skipped")
All artifacts complete. All tasks complete.
```
**Guardrails**
- Always prompt for change selection if not provided
- Use artifact graph (openspec status --json) for completion checking
- Don't block archive on warnings - just inform and confirm
- Preserve .openspec.yaml when moving to archive (it moves with the directory)
- Show clear summary of what happened
- If sync is requested, use openspec-sync-specs approach (agent-driven)
- If delta specs exist, always run the sync assessment and show the combined summary before prompting


@@ -0,0 +1,288 @@
---
name: openspec-explore
description: Enter explore mode - a thinking partner for exploring ideas, investigating problems, and clarifying requirements. Use when the user wants to think through something before or during a change.
license: MIT
compatibility: Requires openspec CLI.
metadata:
  author: openspec
  version: "1.0"
  generatedBy: "1.2.0"
---
Enter explore mode. Think deeply. Visualize freely. Follow the conversation wherever it goes.
**IMPORTANT: Explore mode is for thinking, not implementing.** You may read files, search code, and investigate the codebase, but you must NEVER write code or implement features. If the user asks you to implement something, remind them to exit explore mode first and create a change proposal. You MAY create OpenSpec artifacts (proposals, designs, specs) if the user asks—that's capturing thinking, not implementing.
**This is a stance, not a workflow.** There are no fixed steps, no required sequence, no mandatory outputs. You're a thinking partner helping the user explore.
---
## The Stance
- **Curious, not prescriptive** - Ask questions that emerge naturally, don't follow a script
- **Open threads, not interrogations** - Surface multiple interesting directions and let the user follow what resonates. Don't funnel them through a single path of questions.
- **Visual** - Use ASCII diagrams liberally when they'd help clarify thinking
- **Adaptive** - Follow interesting threads, pivot when new information emerges
- **Patient** - Don't rush to conclusions, let the shape of the problem emerge
- **Grounded** - Explore the actual codebase when relevant, don't just theorize
---
## What You Might Do
Depending on what the user brings, you might:
**Explore the problem space**
- Ask clarifying questions that emerge from what they said
- Challenge assumptions
- Reframe the problem
- Find analogies
**Investigate the codebase**
- Map existing architecture relevant to the discussion
- Find integration points
- Identify patterns already in use
- Surface hidden complexity
**Compare options**
- Brainstorm multiple approaches
- Build comparison tables
- Sketch tradeoffs
- Recommend a path (if asked)
**Visualize**
```
┌─────────────────────────────────────────┐
│      Use ASCII diagrams liberally       │
├─────────────────────────────────────────┤
│                                         │
│   ┌────────┐         ┌────────┐         │
│   │ State  │────────▶│ State  │         │
│   │   A    │         │   B    │         │
│   └────────┘         └────────┘         │
│                                         │
│   System diagrams, state machines,      │
│   data flows, architecture sketches,    │
│   dependency graphs, comparison tables  │
│                                         │
└─────────────────────────────────────────┘
```
**Surface risks and unknowns**
- Identify what could go wrong
- Find gaps in understanding
- Suggest spikes or investigations
---
## OpenSpec Awareness
You have full context of the OpenSpec system. Use it naturally, don't force it.
### Check for context
At the start, quickly check what exists:
```bash
openspec list --json
```
This tells you:
- If there are active changes
- Their names, schemas, and status
- What the user might be working on
### When no change exists
Think freely. When insights crystallize, you might offer:
- "This feels solid enough to start a change. Want me to create a proposal?"
- Or keep exploring - no pressure to formalize
### When a change exists
If the user mentions a change or you detect one is relevant:
1. **Read existing artifacts for context**
- `openspec/changes/<name>/proposal.md`
- `openspec/changes/<name>/design.md`
- `openspec/changes/<name>/tasks.md`
- etc.
2. **Reference them naturally in conversation**
- "Your design mentions using Redis, but we just realized SQLite fits better..."
- "The proposal scopes this to premium users, but we're now thinking everyone..."
3. **Offer to capture when decisions are made**
| Insight Type | Where to Capture |
|--------------|------------------|
| New requirement discovered | `specs/<capability>/spec.md` |
| Requirement changed | `specs/<capability>/spec.md` |
| Design decision made | `design.md` |
| Scope changed | `proposal.md` |
| New work identified | `tasks.md` |
| Assumption invalidated | Relevant artifact |
Example offers:
- "That's a design decision. Capture it in design.md?"
- "This is a new requirement. Add it to specs?"
- "This changes scope. Update the proposal?"
4. **The user decides** - Offer and move on. Don't pressure. Don't auto-capture.
---
## What You Don't Have To Do
- Follow a script
- Ask the same questions every time
- Produce a specific artifact
- Reach a conclusion
- Stay on topic if a tangent is valuable
- Be brief (this is thinking time)
---
## Handling Different Entry Points
**User brings a vague idea:**
```
User: I'm thinking about adding real-time collaboration
You: Real-time collab is a big space. Let me think about this...
          COLLABORATION SPECTRUM
════════════════════════════════════════════
Awareness     Coordination        Sync
    │               │               │
    ▼               ▼               ▼
┌────────┐      ┌────────┐      ┌────────┐
│Presence│      │Cursors │      │  CRDT  │
│   "3   │      │ Multi  │      │Conflict│
│ online"│      │ select │      │  free  │
└────────┘      └────────┘      └────────┘
    │               │               │
 trivial        moderate         complex
Where's your head at?
```
**User brings a specific problem:**
```
User: The auth system is a mess
You: [reads codebase]
┌─────────────────────────────────────────────┐
│              CURRENT AUTH FLOW              │
└─────────────────────────────────────────────┘
             ┌─────────────┼─────────────┐
             ▼             ▼             ▼
        ┌─────────┐   ┌─────────┐   ┌─────────┐
        │ Google  │   │ GitHub  │   │  Email  │
        │  OAuth  │   │  OAuth  │   │  Magic  │
        └────┬────┘   └────┬────┘   └────┬────┘
             │             │             │
             └─────────────┼─────────────┘
                           │
                     ┌───────────┐
                     │  Session  │
                     └─────┬─────┘
                           │
                     ┌───────────┐
                     │   Perms   │
                     └───────────┘
I see three tangles. Which one's burning?
```
**User is stuck mid-implementation:**
```
User: /opsx:explore add-auth-system
The OAuth integration is more complex than expected
You: [reads change artifacts]
You're on task 4: "Implement OAuth flow"
Let me trace what's involved...
[draws diagram, explores options, suggests paths]
Want to update the design to reflect this?
Or add a spike task to investigate?
```
**User wants to compare options:**
```
User: Should we use Postgres or SQLite?
You: Generic answer is boring. What's the context?
User: A CLI tool that tracks local dev environments
You: That changes everything.
┌─────────────────────────────────────────────────┐
│              CLI TOOL DATA STORAGE              │
└─────────────────────────────────────────────────┘
Key constraints:
  • No daemon running
  • Must work offline
  • Single user

                SQLite            Postgres
Deployment      embedded ✓        needs server ✗
Offline         yes ✓             no ✗
Single file     yes ✓             no ✗
SQLite. Not even close.
Unless... is there a sync component?
```
---
## Ending Exploration
There's no required ending. Exploration might:
- **Flow into a proposal**: "Ready to start? I can create a change proposal."
- **Result in artifact updates**: "Updated design.md with these decisions"
- **Just provide clarity**: User has what they need, moves on
- **Continue later**: "We can pick this up anytime"
When it feels like things are crystallizing, you might summarize:
```
## What We Figured Out
**The problem**: [crystallized understanding]
**The approach**: [if one emerged]
**Open questions**: [if any remain]
**Next steps** (if ready):
- Create a change proposal
- Keep exploring: just keep talking
```
But this summary is optional. Sometimes the thinking IS the value.
---
## Guardrails
- **Don't implement** - Never write code or implement features. Creating OpenSpec artifacts is fine, writing application code is not.
- **Don't fake understanding** - If something is unclear, dig deeper
- **Don't rush** - Exploration is thinking time, not task time
- **Don't force structure** - Let patterns emerge naturally
- **Don't auto-capture** - Offer to save insights, don't just do it
- **Do visualize** - A good diagram is worth many paragraphs
- **Do explore the codebase** - Ground discussions in reality
- **Do question assumptions** - Including the user's and your own


@@ -0,0 +1,110 @@
---
name: openspec-propose
description: Propose a new change with all artifacts generated in one step. Use when the user wants to quickly describe what they want to build and get a complete proposal with design, specs, and tasks ready for implementation.
license: MIT
compatibility: Requires openspec CLI.
metadata:
  author: openspec
  version: "1.0"
  generatedBy: "1.2.0"
---
Propose a new change - create the change and generate all artifacts in one step.
I'll create a change with artifacts:
- proposal.md (what & why)
- design.md (how)
- tasks.md (implementation steps)
When ready to implement, run /opsx:apply
---
**Input**: The user's request should include a change name (kebab-case) OR a description of what they want to build.
**Steps**
1. **If no clear input provided, ask what they want to build**
Use the **AskUserQuestion tool** (open-ended, no preset options) to ask:
> "What change do you want to work on? Describe what you want to build or fix."
From their description, derive a kebab-case name (e.g., "add user authentication" → `add-user-auth`).
**IMPORTANT**: Do NOT proceed without understanding what the user wants to build.
2. **Create the change directory**
```bash
openspec new change "<name>"
```
This creates a scaffolded change at `openspec/changes/<name>/` with `.openspec.yaml`.
3. **Get the artifact build order**
```bash
openspec status --change "<name>" --json
```
Parse the JSON to get:
- `applyRequires`: array of artifact IDs needed before implementation (e.g., `["tasks"]`)
- `artifacts`: list of all artifacts with their status and dependencies
4. **Create artifacts in sequence until apply-ready**
Use the **TodoWrite tool** to track progress through the artifacts.
Loop through artifacts in dependency order (artifacts with no pending dependencies first):
a. **For each artifact that is `ready` (dependencies satisfied)**:
- Get instructions:
```bash
openspec instructions <artifact-id> --change "<name>" --json
```
- The instructions JSON includes:
- `context`: Project background (constraints for you - do NOT include in output)
- `rules`: Artifact-specific rules (constraints for you - do NOT include in output)
- `template`: The structure to use for your output file
- `instruction`: Schema-specific guidance for this artifact type
- `outputPath`: Where to write the artifact
- `dependencies`: Completed artifacts to read for context
- Read any completed dependency files for context
- Create the artifact file using `template` as the structure
- Apply `context` and `rules` as constraints - but do NOT copy them into the file
- Show brief progress: "Created <artifact-id>"
b. **Continue until all `applyRequires` artifacts are complete**
- After creating each artifact, re-run `openspec status --change "<name>" --json`
- Check if every artifact ID in `applyRequires` has `status: "done"` in the artifacts array
- Stop when all `applyRequires` artifacts are done
c. **If an artifact requires user input** (unclear context):
- Use **AskUserQuestion tool** to clarify
- Then continue with creation
5. **Show final status**
```bash
openspec status --change "<name>"
```
**Output**
After completing all artifacts, summarize:
- Change name and location
- List of artifacts created with brief descriptions
- What's ready: "All artifacts created! Ready for implementation."
- Prompt: "Run `/opsx:apply` (or just ask me to implement) to start working on the tasks."
**Artifact Creation Guidelines**
- Follow the `instruction` field from `openspec instructions` for each artifact type
- The schema defines what each artifact should contain - follow it
- Read dependency artifacts for context before creating new ones
- Use `template` as the structure for your output file - fill in its sections
- **IMPORTANT**: `context` and `rules` are constraints for YOU, not content for the file
- Do NOT copy `<context>`, `<rules>`, `<project_context>` blocks into the artifact
- These guide what you write, but should never appear in the output
**Guardrails**
- Create ALL artifacts needed for implementation (as defined by schema's `apply.requires`)
- Always read dependency artifacts before creating a new one
- If context is critically unclear, ask the user - but prefer making reasonable decisions to keep momentum
- If a change with that name already exists, ask if user wants to continue it or create a new one
- Verify each artifact file exists after writing before proceeding to the next


@@ -0,0 +1,23 @@
name: Deploy with Docker Compose
on:
  push:
    branches:
      - main  # adapt to whichever branch you want to deploy
jobs:
  deploy:
    runs-on: mac-orbstack-runner  # the name you gave the runner
    steps:
      - name: Checkout
        uses: actions/checkout@v4
      - name: Deploy stack
        env:
          DOCKER_BUILDKIT: 1
          COMPOSE_DOCKER_CLI_BUILD: 1
          AUTH_SECRET: ${{ secrets.AUTH_SECRET }}
          AUTH_URL: ${{ vars.AUTH_URL }}
          DATA_VOLUME_PATH: ${{ vars.DATA_VOLUME_PATH }}
        run: |
          docker compose up -d --build

.gitignore

@@ -41,3 +41,7 @@ yarn-error.log*
 next-env.d.ts
 /src/generated/prisma
+# data
+data/
+*.db


@@ -6,4 +6,3 @@
   "printWidth": 100,
   "plugins": []
 }

CLAUDE.md

@@ -0,0 +1,96 @@
# CLAUDE.md
This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
## Commands
```bash
# Development
pnpm dev # Start dev server (http://localhost:3000)
pnpm build # Production build (standalone output)
pnpm lint # Run ESLint
# Database (use pnpm instead of npx)
pnpm prisma migrate dev --name <name> # Create and apply migration
pnpm prisma generate # Regenerate Prisma client
pnpm prisma studio # Open DB GUI
```
## Stack
- **Next.js 16** (App Router, standalone output)
- **SQLite + Prisma 7** via `@prisma/adapter-better-sqlite3` (not the default SQLite adapter)
- **NextAuth.js v5** (beta.30) — JWT sessions, Credentials provider only
- **Tailwind CSS v4** — CSS Variables theming in `src/app/globals.css`
- **Drag & Drop** — `@dnd-kit` and `@hello-pangea/dnd` (both present)
- **pnpm** — package manager (use pnpm, not npm/yarn)
## Architecture Overview
### Workshop Types
There are 5 workshop types: `swot` (sessions), `motivators`, `year-review`, `weekly-checkin`, `weather`.
**Single source of truth**: `src/lib/workshops.ts` exports `WORKSHOPS`, `WORKSHOP_BY_ID`, and helpers. Every place that lists or routes workshop types must use this file.
**All types and UI config constants** are in `src/lib/types.ts` (e.g. `MOTIVATORS_CONFIG`, `YEAR_REVIEW_SECTIONS`, `SWOT_QUADRANTS`, `EMOTIONS_CONFIG`).
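A minimal sketch of the registry shape (field names here are illustrative assumptions, not the actual `WORKSHOPS` definition):

```typescript
// Hypothetical shape of the workshop registry; the real fields live in src/lib/workshops.ts.
type WorkshopMeta = { id: string; path: string; label: string };

const WORKSHOPS: WorkshopMeta[] = [
  { id: 'swot', path: '/sessions', label: 'SWOT' },
  { id: 'motivators', path: '/moving-motivators', label: 'Moving Motivators' },
  // ...plus year-review, weekly-checkin, weather in the real registry
];

// Derived lookup so routing code never hardcodes workshop ids.
const WORKSHOP_BY_ID: Record<string, WorkshopMeta> = Object.fromEntries(
  WORKSHOPS.map((w) => [w.id, w]),
);
```

Any code that lists or routes workshops reads from these two exports instead of duplicating the list.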
### Layer Structure
```
src/
├── actions/ # Next.js Server Actions ('use server') — call services, revalidate paths
├── services/ # Prisma queries + business logic (server-only)
│ ├── database.ts # Prisma singleton (global for dev HMR)
│ ├── session-permissions.ts # Shared permission factories
│ └── session-share-events.ts # Shared share + SSE event handlers
├── app/
│ ├── api/ # Route handlers (SSE subscribe + auth)
│ ├── (auth)/ # Login/register pages
│ └── [workshop]/ # One folder per workshop type
├── components/
│ └── collaboration/ # BaseSessionLiveWrapper + share/live UI
├── hooks/
│ └── useLive.ts # SSE client hook (EventSource + reconnect)
└── lib/
├── workshops.ts # Workshop metadata registry
├── types.ts # All TypeScript types + UI config
└── share-utils.ts # Shared share types
```
### Real-Time Collaboration (SSE)
Each workshop has `/api/[path]/[id]/subscribe` — a GET route that opens a `ReadableStream` (SSE). The server polls the DB every 1 second for new events and pushes them to connected clients. Server Actions write events to the DB after mutations.
Client side: `useLive` hook (`src/hooks/useLive.ts`) connects to the subscribe endpoint with `EventSource`, filters out events from the current user (to avoid duplicates), and calls `router.refresh()` on incoming events.
`BaseSessionLiveWrapper` (`src/components/collaboration/`) is the shared wrapper component that wires `useLive`, `CollaborationToolbar`, and `ShareModal` for all workshop session pages.
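SSE frames are plain text: `field: value` lines terminated by a blank line. A sketch of the framing the subscribe routes push over the stream (the helper name is illustrative; the actual routes build frames inside their `ReadableStream`):

```typescript
// Illustrative SSE framing helper; a double newline terminates each frame.
function formatSseFrame(event: { id: string; type: string; payload: unknown }): string {
  return [
    `id: ${event.id}`,
    `event: ${event.type}`,
    `data: ${JSON.stringify(event.payload)}`,
    '',
    '',
  ].join('\n');
}
```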
### Shared Permission System
`createSessionPermissionChecks(model)` in `src/services/session-permissions.ts` returns `canAccess`, `canEdit`, `canDelete` for any Prisma model that follows the session shape (has `userId` + `shares` relation). Team admins have implicit access to their members' sessions.
`createShareAndEventHandlers(...)` in `src/services/session-share-events.ts` returns `share`, `removeShare`, `getShares`, `createEvent`, `getEvents` — used by all workshop services.
### Auth
- `src/lib/auth.ts` — NextAuth config (signIn, signOut, auth exports)
- `src/lib/auth.config.ts` — config object (used separately for Edge middleware)
- `src/middleware.ts` — protects all routes except `/api/auth`, `_next/static`, `_next/image`, `favicon.ico`
- Session user ID is available via `auth()` call server-side; token includes `id` field
### Database
Prisma client is a singleton in `src/services/database.ts`. `DATABASE_URL` env var controls the SQLite file path (default: `file:./prisma/dev.db`). Schema is at `prisma/schema.prisma`.
### Adding a New Workshop
Pattern followed by all existing workshops:
1. Add entry to `WORKSHOPS` in `src/lib/workshops.ts`
2. Add Prisma models (Session, Item, Share, Event) following the existing pattern
3. Create service in `src/services/` using `createSessionPermissionChecks` and `createShareAndEventHandlers`
4. Create server actions in `src/actions/`
5. Create API route `src/app/api/[path]/[id]/subscribe/route.ts` (copy from existing)
6. Create pages under `src/app/[path]/`
7. Use `BaseSessionLiveWrapper` for the session live page


@@ -3,12 +3,16 @@
 # ---- Base ----
 FROM node:22-alpine AS base
 RUN corepack enable && corepack prepare pnpm@latest --activate
+ENV PNPM_HOME="/pnpm"
+ENV PATH="$PNPM_HOME:$PATH"
+RUN mkdir -p $PNPM_HOME
 WORKDIR /app
 # ---- Dependencies ----
 FROM base AS deps
 COPY package.json pnpm-lock.yaml ./
-RUN pnpm install --frozen-lockfile
+RUN --mount=type=cache,id=pnpm,target=/pnpm/store \
+    pnpm install --frozen-lockfile
 # ---- Build ----
 FROM base AS builder
@@ -46,7 +50,8 @@ COPY --from=builder /app/prisma.config.ts ./prisma.config.ts
 # Install prisma CLI for migrations + better-sqlite3 (compile native module)
 ENV DATABASE_URL="file:/app/data/prod.db"
-RUN pnpm add prisma @prisma/client @prisma/adapter-better-sqlite3 better-sqlite3 dotenv && \
+RUN --mount=type=cache,id=pnpm,target=/pnpm/store \
+    pnpm add prisma @prisma/client @prisma/adapter-better-sqlite3 better-sqlite3 dotenv && \
     pnpm prisma generate
 # Copy entrypoint script

PERF_OPTIMIZATIONS.md

@@ -0,0 +1,84 @@
# Performance optimizations
## DB queries (critical impact)
### resolveCollaborator: removed the full User table scan
**File:** `src/services/auth.ts`
Before: `findMany` over all users, then a JS `find()` for a case-insensitive name match.
After: `findFirst` with `contains` plus an exact check. O(1) instead of O(N users).
### getAllUsersWithStats: N+1 removed
**File:** `src/services/auth.ts`
Before: 2 `count` queries per user (`Promise.all` over a map).
After: 2 bulk `groupBy` queries plus a Map. 3 queries instead of 2N+1.
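The merge step after the two bulk `groupBy` calls can be sketched as follows (shapes simplified; Prisma's `groupBy` actually returns `_count` objects per group):

```typescript
// Merge two bulk groupBy-style results into per-user stats with one Map lookup each.
type GroupCount = { userId: string; count: number };

function mergeUserStats(userIds: string[], sessionCounts: GroupCount[], shareCounts: GroupCount[]) {
  const sessions = new Map(sessionCounts.map((g) => [g.userId, g.count]));
  const shares = new Map(shareCounts.map((g) => [g.userId, g.count]));
  return userIds.map((id) => ({
    id,
    sessionCount: sessions.get(id) ?? 0,
    shareCount: shares.get(id) ?? 0,
  }));
}
```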
### React.cache on the teams functions
**File:** `src/services/teams.ts`
`getTeamMemberIdsForAdminTeams` and `isAdminOfUser` are wrapped with `React.cache()`.
On the `/sessions` page these functions were called ~10 times per request (5 workshop types × 2). They are now deduplicated down to a single call.
## SSE / real time (high impact)
### Polling interval 1s → 2s
**Files:** 5 routes `src/app/api/*/[id]/subscribe/route.ts`
Cuts the number of real-time DB queries by 50%. Imperceptible UX-wise (most collaboration tools poll every 2-5s).
### Event cleanup
**File:** `src/services/session-share-events.ts`
Added `cleanupOldEvents(maxAgeHours)` to purge stale events. The event tables have no TTL mechanism; this method can be called periodically or on SSE connection.
## Client rendering (high impact)
### React.memo on the card components
**Files:**
- `src/components/swot/SwotCard.tsx`
- `src/components/moving-motivators/MotivatorCard.tsx` (+ `MotivatorCardStatic`)
- `src/components/weather/WeatherCard.tsx`
- `src/components/weekly-checkin/WeeklyCheckInCard.tsx`
- `src/components/year-review/YearReviewCard.tsx`
These components are rendered in lists and all re-rendered on every drag, state change, or SSE-triggered `router.refresh()`.
### WeatherCard: fixed the useEffect + setState pattern
**File:** `src/components/weather/WeatherCard.tsx`
Replaced the `useEffect` that called 5 `setState`s (cascading renders, React 19 lint error) with the idiomatic state-driven prop-sync pattern (direct comparison in the render body).
## Next.js configuration (medium impact)
### next.config.ts
**File:** `next.config.ts`
- `poweredByHeader: false`: removes the `X-Powered-By` header (security)
- `optimizePackageImports`: improved tree-shaking for `@dnd-kit/*` and `@hello-pangea/dnd`
### Dark-mode FOUC fix
**File:** `src/app/layout.tsx`
Inline script in `<head>` that reads `localStorage` and applies the `dark`/`light` class on `<html>` before React hydration. Eliminates the white flash for dark-mode users.
## Cleanup
- Removed 5 unused SVGs from the Next.js template (`file.svg`, `globe.svg`, `next.svg`, `vercel.svg`, `window.svg`)
## Not addressed (for later)
- **DnD migration**: consolidate `@hello-pangea/dnd` and `@dnd-kit` into a single lib (~45KB saved); 3 boards to rewrite
- **Split WorkshopTabs** (879 lines): break into sub-components per workshop type
- **Suspense boundaries** on session detail pages
- **Periodic `cleanupOldEvents` call**: wire up via cron or on SSE connection


@@ -5,18 +5,23 @@ Platform for interactive, collaborative management workshops.
## ✨ Features
### 📊 SWOT analysis
Map your team members' strengths, weaknesses, opportunities and threats.
- Interactive matrix with drag & drop
- Cross actions and development plan
- Real-time collaboration
### 🎯 Moving Motivators
Explore the 10 intrinsic motivators (Management 3.0).
- Ranking by importance
- Positive/negative influence assessment
- Personalized summary
### 🤝 Collaboration
- Session sharing (Editor / Viewer)
- Real-time sync (SSE)
- Saved history

dev.db (binary file, not shown)


@@ -1,6 +1,6 @@
-# SWOT Manager - Development Book
+# Workshop Manager - Development Book
-SWOT workshop management app for management one-on-ones.
+Workshop management app for management one-on-ones.
 ## Technical Stack
@@ -47,6 +47,7 @@
 - [x] Install and configure Prisma
 - [x] Create the database schema:
 ```prisma
 model User {
   id String @id @default(cuid())
@@ -110,10 +111,11 @@
   action Action @relation(fields: [actionId], references: [id], onDelete: Cascade)
   swotItemId String
   swotItem SwotItem @relation(fields: [swotItemId], references: [id], onDelete: Cascade)
   @@unique([actionId, swotItemId])
 }
 ```
 - [x] Generate the Prisma client
 - [x] Create the initial migrations
 - [x] Create the database.ts service (connection pool)
@@ -260,7 +262,7 @@
 ```typescript
 // actions/swot-items.ts
-'use server'
+'use server';
 import { swotService } from '@/services/swot';
 import { revalidatePath } from 'next/cache';
@@ -310,4 +312,3 @@ npm run build
 # Lint
 npm run lint
 ```


@@ -1,16 +1,18 @@
 services:
-  app:
+  workshop-manager-app:
     build:
       context: .
       dockerfile: Dockerfile
     ports:
-      - '3011:3000'
+      - '3009:3000'
     environment:
       - NODE_ENV=production
       - DATABASE_URL=file:/app/data/dev.db
       - AUTH_SECRET=${AUTH_SECRET:-your-secret-key-change-in-production}
       - AUTH_TRUST_HOST=true
-      - AUTH_URL=${AUTH_URL:-http://localhost:3011}
+      - AUTH_URL=${AUTH_URL:-https://workshop-manager.julienfroidefond.com}
     volumes:
-      - ./data:/app/data
+      - ${DATA_VOLUME_PATH:-./data}:/app/data
     restart: unless-stopped
+    labels:
+      - 'com.centurylinklabs.watchtower.enable=false'


@@ -1,6 +1,19 @@
 #!/bin/sh
 set -e
+DB_PATH="/app/data/dev.db"
+BACKUP_DIR="/app/data/backups"
+if [ -f "$DB_PATH" ]; then
+  mkdir -p "$BACKUP_DIR"
+  BACKUP_FILE="$BACKUP_DIR/dev-$(date +%Y%m%d-%H%M%S).db"
+  cp "$DB_PATH" "$BACKUP_FILE"
+  echo "💾 Database backed up to $BACKUP_FILE"
+  # Keep only the 10 most recent backups
+  ls -t "$BACKUP_DIR"/*.db 2>/dev/null | tail -n +11 | xargs rm -f
+fi
 echo "🔄 Running database migrations..."
 pnpm prisma migrate deploy


@@ -1,7 +1,16 @@
-import type { NextConfig } from "next";
+import type { NextConfig } from 'next';
 const nextConfig: NextConfig = {
-  output: "standalone",
+  output: 'standalone',
+  poweredByHeader: false,
+  experimental: {
+    optimizePackageImports: [
+      '@dnd-kit/core',
+      '@dnd-kit/sortable',
+      '@dnd-kit/utilities',
+      '@hello-pangea/dnd',
+    ],
+  },
 };
 export default nextConfig;


@@ -0,0 +1,2 @@
schema: spec-driven
created: 2026-03-09


@@ -0,0 +1,57 @@
## Context
The app loads session collaborators via `resolveCollaborator` called sequentially in a loop (N+1). The `useLive` hook triggers `router.refresh()` on every SSE event received, with no batching, causing cascading re-renders when several events arrive at once. `cleanupOldEvents` exists in `session-share-events.ts` but is never called, so events accumulate indefinitely. The main routes have no `loading.tsx`, which prevents App Router streaming from activating. Modals (`ShareModal`) ship in the initial bundle even though they are rarely used.
## Goals / Non-Goals
**Goals:**
- Eliminate the N+1 in `resolveCollaborator` with a batched fetch
- Batch consecutive SSE refreshes with a debounce
- Purge SSE events as we go (after each `createEvent`)
- Enable navigation streaming with `loading.tsx` on slow-loading routes
- Shrink the initial JS bundle by lazy-loading modals
**Non-Goals:**
- Refactoring the SSE architecture (Phase 2 topic)
- Changing the cache/revalidation strategy (Phase 2 topic)
- Optimizing deep Prisma queries (Phase 3 topic)
- Changing existing functional behavior
## Decisions
### 1. Batch resolveCollaborator via collect + single query
**Decision**: In `session-queries.ts`, collect all collaborator `userId`s across a list of sessions, run a single `prisma.user.findMany({ where: { id: { in: [...ids] } } })`, and map the results in memory.
**Alternatives**: Keep the N+1 but add a per-request in-memory cache → rejected; it does not fix the problem structurally.
### 2. Debounce via useRef + native setTimeout
**Decision**: In `useLive.ts`, store a timer in a `useRef` and use `setTimeout` / `clearTimeout` to debounce at 300ms. No external dependency.
**Alternatives**: `lodash.debounce` library → rejected; not worth a dependency for 5 lines.
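The debounce itself reduces to a few lines; a standalone sketch of the pattern (the hook would keep the timer in a `useRef` rather than a closure, and clear it in the effect cleanup):

```typescript
// Trailing-edge debounce: fn fires once, delayMs after the last call in a burst.
function debounce(fn: () => void, delayMs: number): () => void {
  let timer: ReturnType<typeof setTimeout> | undefined;
  return () => {
    if (timer !== undefined) clearTimeout(timer);
    timer = setTimeout(fn, delayMs);
  };
}
```

In `useLive`, the wrapped `fn` would be `router.refresh`, so a burst of SSE events triggers a single refresh.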
### 3. cleanupOldEvents inline in createEvent
**Decision**: Call `cleanupOldEvents` at the end of every `createEvent` (fire-and-forget, no blocking await). The purge keeps the 50 most recent events per session (current threshold).
**Alternatives**: External cron → too complex for a quick win; interval in the SSE API route → unwanted coupling.
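The fire-and-forget call amounts to detaching the promise while still logging failures; a hedged sketch (the helper name is illustrative):

```typescript
// Run a background maintenance task without blocking the caller; log failures instead of throwing.
function fireAndForget(task: () => Promise<void>, label: string): void {
  task().catch((err) => {
    console.error(`[${label}] background task failed:`, err);
  });
}
```

Under this sketch, `createEvent` would end with something like `fireAndForget(() => cleanupOldEvents(sessionId), 'event-cleanup')` so the mutation's response time is unaffected.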
### 4. loading.tsx with a minimalist skeleton
**Decision**: Create one `loading.tsx` per main route (`/sessions`, `/weather`, `/users`) with a generic skeleton (animated gray bars). The component is static and very light.
### 5. next/dynamic with ssr: false on the modals
**Decision**: Wrap `ShareModal` (and `CollaborationToolbar` if relevant) with `next/dynamic({ ssr: false })`. The parent component handles the loading state.
## Risks / Trade-offs
- **300ms debounce** → slight perceived latency on collaborative updates. Mitigation: value configurable via a constant.
- **cleanupOldEvents fire-and-forget** → if the purge fails, errors are silent. Mitigation: log the error without blocking.
- **Batched resolveCollaborator** → for very large session lists (>500), the `IN` query may be slow. Mitigation: acceptable at current volumes; paginate if needed (Phase 3).
- **next/dynamic ssr: false** → modals are not server-rendered. Acceptable since they are purely interactive.
## Migration Plan
Each optimization is independent and separately deployable. No data migration. Rollback: revert the relevant commit. Recommended order: (1) batch resolveCollaborator, (2) cleanupOldEvents, (3) debounce useLive, (4) loading.tsx, (5) next/dynamic.


@@ -0,0 +1,29 @@
## Why
The main routes suffer from several easily fixable performance problems: an N+1 in `resolveCollaborator`, cascading re-renders in `useLive`, unbounded accumulation of SSE events, and no visual feedback during navigation. These quick wins can be addressed independently, without architectural refactoring.
## What Changes
- **Batch resolveCollaborator**: replace the sequential calls with a single batch in `session-queries.ts` (N+1 elimination)
- **Debounce router.refresh()**: add a ~300ms debounce in `useLive.ts` to batch simultaneous SSE events
- **Call cleanupOldEvents**: invoke `cleanupOldEvents` from `createEvent` to purge old events as we go
- **Add `loading.tsx`**: add `loading.tsx` files on the `/sessions`, `/weather`, `/users` routes to enable App Router streaming
- **Lazy-load modals**: use `next/dynamic` on `ShareModal` and other heavy modals to shrink the initial JS bundle
## Capabilities
### New Capabilities
- `perf-loading-states`: Visual loading feedback on the main routes via `loading.tsx`
### Modified Capabilities
- No existing spec changes; these are purely implementation/performance changes
## Impact
- `src/services/session-queries.ts`: batched resolveCollaborator refactoring
- `src/hooks/useLive.ts`: debounce added on router.refresh
- `src/services/session-share-events.ts`: cleanupOldEvents called from createEvent
- `src/app/sessions/loading.tsx`, `src/app/weather/loading.tsx`, `src/app/users/loading.tsx`: new files
- Components importing `ShareModal`: switched to dynamic import


@@ -0,0 +1,24 @@
## ADDED Requirements
### Requirement: Loading skeleton on main routes
The application SHALL display a skeleton loading state during navigation to `/sessions`, `/weather`, and `/users` routes, activated by Next.js App Router streaming via `loading.tsx` files.
#### Scenario: Navigation to sessions page shows skeleton
- **WHEN** a user navigates to `/sessions`
- **THEN** a loading skeleton SHALL be displayed immediately while the page data loads
#### Scenario: Navigation to weather page shows skeleton
- **WHEN** a user navigates to `/weather`
- **THEN** a loading skeleton SHALL be displayed immediately while the page data loads
#### Scenario: Navigation to users page shows skeleton
- **WHEN** a user navigates to `/users`
- **THEN** a loading skeleton SHALL be displayed immediately while the page data loads
### Requirement: Modal lazy loading
Heavy modal components (ShareModal) SHALL be loaded lazily via `next/dynamic` to reduce the initial JS bundle size.
#### Scenario: ShareModal not in initial bundle
- **WHEN** a page loads that contains a ShareModal trigger
- **THEN** the ShareModal component code SHALL NOT be included in the initial JS bundle
- **THEN** the ShareModal code SHALL be fetched only when first needed


@@ -0,0 +1,33 @@
## 1. Batch resolveCollaborator (N+1 fix)
- [x] 1.1 Read `src/services/session-queries.ts` and find every occurrence of `resolveCollaborator` called in a loop
- [x] 1.2 Create a `batchResolveCollaborators(userIds: string[])` function that runs a single `prisma.user.findMany({ where: { id: { in: userIds } } })`
- [x] 1.3 Replace the N+1 loops with collect IDs → batch query → in-memory mapping
- [x] 1.4 Check that the sessions/weather/etc. pages load correctly
## 2. Debounce router.refresh() in useLive
- [x] 2.1 Read `src/hooks/useLive.ts` and locate the `router.refresh()` call
- [x] 2.2 Add a `useRef<ReturnType<typeof setTimeout>>` for the debounce timer
- [x] 2.3 Wrap the `router.refresh()` call with `clearTimeout` + `setTimeout` at 300ms
- [x] 2.4 Add a `clearTimeout` in the effect cleanup to avoid memory leaks
## 3. Automatic SSE event purge
- [x] 3.1 Read `src/services/session-share-events.ts` and locate `createEvent` and `cleanupOldEvents`
- [x] 3.2 Add a fire-and-forget call to `cleanupOldEvents` at the end of `createEvent` (after the insert)
- [x] 3.3 Wrap the call in a try/catch to log the error without blocking
## 4. Add loading.tsx on the main routes
- [x] 4.1 Create `src/app/sessions/loading.tsx` with a session-list skeleton
- [x] 4.2 Create `src/app/weather/loading.tsx` with a weather-table skeleton
- [x] 4.3 Create `src/app/users/loading.tsx` with a user-list skeleton
- [ ] 4.4 Check that the skeleton shows during navigation (throttle the network in DevTools)
## 5. Lazy-load modals with next/dynamic
- [x] 5.1 Identify every component that imports `ShareModal` directly
- [x] 5.2 Replace each static import with `next/dynamic(() => import(...), { ssr: false })`
- [ ] 5.3 Check that the modals open correctly after lazy-load
- [ ] 5.4 Check in the DevTools Network panel that the modal chunk is not in the initial bundle


@@ -0,0 +1,2 @@
schema: spec-driven
created: 2026-03-09


@@ -0,0 +1,59 @@
## Context
`src/services/weather.ts` uses `findMany` with no `take` or `orderBy`, potentially loading hundreds of entries to compute trends that only use the last 30-90 points. The session services use `include: { items: true, shares: true, events: true }` to build lists, while the card display only needs the title, date, item count, and share status. `User.name` is filtered in admin searches but has no SQLite index. The most-visited pages (`/sessions`, `/users`) recompute their data on every request.
## Goals / Non-Goals
**Goals:**
- Bound weather history loading with a configurable constant
- Shrink the objects returned by list queries (select vs include)
- Add a SQLite index on `User.name`
- Introduce a Next.js cache on list queries with targeted invalidation
**Non-Goals:**
- Changing the Prisma model structure
- Changing page rendering (the selections cover every displayed field)
- Introducing an external cache (Redis, Memcached)
- Optimizing session detail pages (out of scope)
## Decisions
### 1. WEATHER_HISTORY_LIMIT constant in lib/types.ts
**Decision**: Define `WEATHER_HISTORY_LIMIT = 90` in `src/lib/types.ts` (consistent with the other config constants). The query becomes: `findMany({ orderBy: { createdAt: 'desc' }, take: WEATHER_HISTORY_LIMIT })`.
**Alternatives**: URL parameter or env var → over-engineering for a rarely changed threshold.
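The bound is equivalent to keeping the most recent N entries; an in-memory illustration (the real code applies `orderBy`/`take` in the Prisma query itself, so the DB never returns more rows):

```typescript
const WEATHER_HISTORY_LIMIT = 90;

// In-memory equivalent of findMany({ orderBy: { createdAt: 'desc' }, take: WEATHER_HISTORY_LIMIT }).
function latestEntries<T extends { createdAt: Date }>(
  entries: T[],
  limit = WEATHER_HISTORY_LIMIT,
): T[] {
  return [...entries]
    .sort((a, b) => b.createdAt.getTime() - a.createdAt.getTime())
    .slice(0, limit);
}
```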
### 2. Minimal select for lists: dedicated ListItem interface
**Decision**: For each list service, define an `XxxListItem` type in `types.ts` with only the card fields (id, title, createdAt, _count.items, shares.length). Use Prisma `select` to match that type exactly.
**Alternatives**: Keep `include` and filter in TypeScript → identical DB load, zero gain.
### 3. @@index([name]) index on User
**Decision**: Add `@@index([name])` to the `User` model in `schema.prisma`. Create a migration named `add_user_name_index`. Impact: SQLite creates a B-tree index; `LIKE 'x%'` searches benefit from it (prefix match).
**Note**: `LIKE '%x%'` (contains) does not use the index in SQLite; acceptable, since the main use case is prefix search.
### 4. unstable_cache with tags on list queries
**Decision**: Wrap the list service functions (e.g. `getSessionsForUser`, `getUserStats`) with `unstable_cache(fn, [cacheKey], { tags: ['sessions-list:userId'] })`. Server Actions call the matching `revalidateTag` after mutations.
Cache duration: `revalidate: 60` seconds as a fallback, with explicit invalidation taking priority.
**Alternatives**: `React.cache` → per-request only, no persistence across navigations; `fetch` caching → does not apply to Prisma queries.
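The tag helpers this change relies on are plain string builders shared by the cache wrapper and the invalidation calls; a sketch consistent with the helper names in `cache-tags.ts` (the exact tag string formats are assumptions):

```typescript
// Tag helpers: one stable string per cache scope, used by both unstable_cache and revalidateTag.
export const sessionTag = (sessionId: string) => `session:${sessionId}`;
export const sessionsListTag = (userId: string) => `sessions-list:${userId}`;
export const userStatsTag = (userId: string) => `user-stats:${userId}`;
```

Because both sides derive the tag from the same function, a mutation invalidating `sessionsListTag(userId)` is guaranteed to hit the cache entry created with that tag.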
## Risks / Trade-offs
- **Strict select** → if a component reads a non-selected field, it becomes a TypeScript build error (a good thing: caught early).
- **unstable_cache** → a Next.js API marked unstable. Mitigation: isolate it in the services; the wrapper is easy to swap out.
- **User.name index** → slight increase in SQLite file size and write time. Negligible at current volumes.
- **WEATHER_HISTORY_LIMIT** → trend computations must work with N entries or fewer. Verify the algorithm is robust with partial history.
## Migration Plan
1. Prisma migration `add_user_name_index` (non-destructive, can be applied at any time)
2. Add `WEATHER_HISTORY_LIMIT` + update the weather query (independent)
3. Per-service select refactoring (check TypeScript at build after each service)
4. Add the cache layer last (depends on the tags defined in Phase 2 if applicable, otherwise define them locally)


@@ -0,0 +1,29 @@
## Why
The Prisma queries behind the most-visited pages load too much data: `weather.ts` pulls the full unbounded history, the sessions page queries include deep relations the list view never uses, and no caching is applied to queries repeated on every navigation. These optimizations shrink payloads and DB response time without changing behavior.
## What Changes
- **Bounded weather history**: add `take` + `orderBy createdAt DESC` in `src/services/weather.ts`, configurable via a constant (default: 90 entries)
- **Select fields on the sessions list**: replace the deep `include`s with `select`s limited to the fields shown in the list cards
- **`User.name` index**: add `@@index([name])` to `prisma/schema.prisma` + migration
- **Cache on hot queries**: wrap the sessions list and user stats queries with `unstable_cache` + tags, invalidated on mutations
## Capabilities
### New Capabilities
- `query-cache-layer`: Next.js cache on hot list queries with tag-based invalidation
### Modified Capabilities
- No behavioral spec changes; these are transparent internal optimizations
## Impact
- `src/services/weather.ts`: limit + orderBy added
- `src/services/` (all list services): `include` → `select`
- `prisma/schema.prisma`: `@@index([name])` added on `User`
- `prisma/migrations/`: new migration for the index
- `src/services/`: `unstable_cache` wrapping on hot queries
- `src/actions/`: matching `revalidateTag` calls added (complements Phase 2)


@@ -0,0 +1,30 @@
## ADDED Requirements
### Requirement: Cached session list queries
Frequently-called session list queries SHALL be cached using Next.js `unstable_cache` with user-scoped tags, avoiding redundant DB reads on repeated navigations.
#### Scenario: Session list served from cache on repeated navigation
- **WHEN** a user navigates to the sessions page multiple times within the cache window
- **THEN** the session list data SHALL be served from cache on subsequent requests
- **THEN** no additional Prisma query SHALL be executed for cached data
#### Scenario: Cache invalidated after mutation
- **WHEN** a Server Action creates, updates, or deletes a session
- **THEN** the corresponding cache tag SHALL be invalidated via `revalidateTag`
- **THEN** the next request SHALL fetch fresh data from the DB
### Requirement: Weather history bounded query
The weather service SHALL limit historical data loading to a configurable maximum number of entries (default: 90), ordered by most recent first.
#### Scenario: Weather history respects limit
- **WHEN** the weather service fetches historical entries
- **THEN** at most `WEATHER_HISTORY_LIMIT` entries SHALL be returned
- **THEN** entries SHALL be ordered by `createdAt` DESC (most recent first)
### Requirement: Minimal field selection on list queries
Service functions returning lists for display purposes SHALL use Prisma `select` with only the fields required for the list UI, not full `include` of related models.
#### Scenario: Sessions list query returns only display fields
- **WHEN** the sessions list service function is called
- **THEN** the returned objects SHALL contain only fields needed for card display (id, title, createdAt, item count, share status)
- **THEN** full related model objects (items array, events array) SHALL NOT be included


@@ -0,0 +1,30 @@
## 1. User.name index (Prisma migration)
- [x] 1.1 Read `prisma/schema.prisma` and locate the `User` model
- [x] 1.2 Add `@@index([name])` to the `User` model
- [x] 1.3 Run `pnpm prisma migrate dev --name add_user_name_index`
- [x] 1.4 Check that the migration applies cleanly and that `prisma studio` shows the index
## 2. Weather: bound the history load
- [x] 2.1 Add the `WEATHER_HISTORY_LIMIT = 90` constant in `src/lib/types.ts`
- [x] 2.2 Read `src/services/weather.ts` and locate the `findMany` query for history entries
- [x] 2.3 Add `take: WEATHER_HISTORY_LIMIT` and `orderBy: { date: 'desc' }` to the query
- [x] 2.4 Check that trend computations work with partial history
## 3. Select fields on list queries
- [x] 3.1 Read the list services: `src/services/sessions.ts`, `moving-motivators.ts`, `year-review.ts`, `weekly-checkin.ts`, `weather.ts`, `gif-mood.ts`
- [x] 3.2 Identify the `include`s used in list functions (not session detail)
- [x] 3.3 Replace the deep `include`s with `select`s limited to the needed fields in each service
- [x] 3.4 Update `shares: { include: ... }` → `shares: { select: { id, role, user } }` in all 6 services
- [x] 3.5 Check for TypeScript errors and adapt the shared queries
- [x] 3.6 Check that `pnpm build` passes with no TypeScript errors
## 4. Cache layer on hot queries
- [x] 4.1 Create `src/lib/cache-tags.ts` with the tag helpers: `sessionTag(id)`, `sessionsListTag(userId)`, `userStatsTag(userId)`
- [x] 4.2 Wrap each service's sessions list function with `unstable_cache(fn, [key], { tags: [sessionsListTag(userId)], revalidate: 60 })`
- [x] 4.3 `getUserStats` does not exist; task skipped (no matching function in the codebase)
- [x] 4.4 Check that the session create/delete Server Actions call `revalidateTag(sessionsListTag(userId), 'default')`
- [x] 4.5 Build passes and all 255 tests pass; invalidation verified by the build


@@ -0,0 +1,2 @@
schema: spec-driven
created: 2026-03-09


@@ -0,0 +1,65 @@
## Context
Each `/api/*/subscribe` route creates a 1-second `setInterval` that polls the DB for events. If 10 users have the same workshop open, that is 10 queries per second against the same table. The weather workshop already uses an in-process `Map` of subscribers to broadcast events without re-polling, but this pattern is not generalized. Server Actions call `revalidatePath('/sessions')`, which invalidates every sub-segment and forces Next.js to re-render entire pages even for a minor mutation.
## Goals / Non-Goals
**Goals:**
- Cut polling DB queries so they no longer grow with the number of connected clients
- Provide a reusable broadcast module for all workshops
- Shrink the Next.js cache invalidation surface with granular tags
- Limit the data volume loaded on the sessions page via pagination
**Non-Goals:**
- Switching to WebSockets or an external real-time server (Redis, Pusher)
- Changing the Prisma data model for events
- Implementing multi-process / multi-instance SSE (the deployment is standalone, single-process)
## Decisions
### 1. broadcast.ts module: Map<sessionId, Set<subscriber>>
**Decision**: Create `src/lib/broadcast.ts` exposing:
- `subscribe(sessionId, callback)` → returns `unsubscribe()`
- `broadcast(sessionId, event)` → notifies all subscribers
SSE routes subscribe instead of polling. Server Actions call `broadcast()` after each mutation.
**Alternatives**: Node.js EventEmitter → rejected as less typed; BroadcastChannel → rejected because it is limited to same-origin workers and not suited to Next.js route handlers.
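A minimal sketch of this contract (the internals of the real `src/lib/broadcast.ts` may differ; only the two function shapes come from the decision above):

```typescript
// Minimal in-process broadcaster: Map<sessionId, Set<subscriber>>.
type Callback = (event: unknown) => void;

const subscribers = new Map<string, Set<Callback>>();

function subscribe(sessionId: string, cb: Callback): () => void {
  const set = subscribers.get(sessionId) ?? new Set<Callback>();
  subscribers.set(sessionId, set);
  set.add(cb);
  // Returning the unsubscribe function keeps cleanup colocated
  // with the SSE route that registered the callback.
  return () => {
    set.delete(cb);
    if (set.size === 0) subscribers.delete(sessionId);
  };
}

function broadcast(sessionId: string, event: unknown): void {
  subscribers.get(sessionId)?.forEach((cb) => cb(event));
}
```

Because the `Map` is keyed by session id, events stay isolated per session, and dropping the empty `Set` on the last unsubscribe avoids leaking entries for closed sessions.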
### 2. Fallback polling kept, but shared
**Decision**: Keep a single polling loop per active session (the first subscriber starts the interval, the last one stops it). Native broadcast takes priority (it is called from Server Actions); polling remains the fallback for clients that join mid-session.
### 3. revalidateTag with a naming convention
**Decision**: Tag convention:
- `session:<id>`: a specific session
- `sessions-list:<userId>`: a user's sessions list
- `workshop:<type>`: the whole workshop
Each Prisma query in the services is wrapped with `unstable_cache` or uses `cacheTag` (Next.js 15+).
**Alternatives**: Keep `revalidatePath` with more precise paths → less effective than tags.
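The convention reduces to three pure helpers (a sketch mirroring `src/lib/cache-tags.ts`; the exact exported names are assumptions):

```typescript
// Pure tag builders implementing the naming convention above.
const sessionTag = (id: string) => `session:${id}`;
const sessionsListTag = (userId: string) => `sessions-list:${userId}`;
const workshopTag = (type: string) => `workshop:${type}`;

console.log(sessionTag("abc123")); // "session:abc123"
```

Centralizing the builders means a tag typo becomes a compile-time symbol error instead of a silent cache-invalidation miss.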
### 4. Cursor-based pagination on the sessions page
**Decision**: Cursor pagination (keyed on `createdAt` DESC) rather than offset, for list stability under frequent inserts. Initial page size: 20 sessions per workshop type. UI: a "Charger plus" (load more) button, no numbered pagination.
**Alternatives**: Virtual scroll → more complex and adds a client-side JS dependency; offset pagination → unstable when new sessions are inserted between two pages.
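The cursor scheme can be illustrated in-memory (with Prisma this corresponds to the `cursor`, `skip: 1`, and `take` arguments; the row shape here is hypothetical):

```typescript
// Cursor pagination over a createdAt-DESC list: stable under new inserts,
// because the cursor pins the position by id rather than by offset.
interface SessionRow {
  id: string;
  createdAt: number;
}

function pageAfter(rows: SessionRow[], cursor: string | null, limit = 20) {
  const sorted = [...rows].sort((a, b) => b.createdAt - a.createdAt);
  const start =
    cursor === null ? 0 : sorted.findIndex((r) => r.id === cursor) + 1;
  const page = sorted.slice(start, start + limit);
  // A full page means there may be more; a short page is the last one.
  const nextCursor = page.length === limit ? page[page.length - 1].id : null;
  return { page, nextCursor };
}
```

Unlike an offset, the cursor still points at the same row even if newer sessions are inserted between two "Charger plus" clicks, so pages never skip or repeat entries.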
## Risks / Trade-offs
- **In-process broadcast**: only works in a single-process deployment. Acceptable for the current use case (standalone Next.js). Document the limitation.
- **unstable_cache**: the API is marked unstable in Next.js and may change. Mitigation: isolate it in the services, not in the components.
- **Pagination**: changes the sessions page UX (currently everything is visible). Mitigation: keep the total displayed with an "X of Y" indicator.
## Migration Plan
1. Create `src/lib/broadcast.ts` without touching the existing routes
2. Migrate the SSE routes one by one (starting with `weather`, which already has the pattern)
3. Update the Server Actions to call `broadcast()` + `revalidateTag()`
4. Add `cacheTag` to the service queries
5. Add pagination on the sessions page last (a visible UI change)
Rollback: each step is independent; revert per feature.


@@ -0,0 +1,29 @@
## Why
The current real-time layer (SSE + 1-second DB polling) multiplies connections and queries as soon as several users collaborate. Each open tab on a session triggers its own polling, and Server Actions invalidate entire route segments with `revalidatePath`. These scalability problems become visible from 5-10 concurrent users.
## What Changes
- **Shared SSE polling**: a single active interval per session on the server, shared by all clients connected to that session
- **Unified broadcast**: generalize the in-process broadcast pattern (already present in `weather`) to all workshops via a `src/lib/broadcast.ts` module
- **Granular `revalidateTag`**: replace `revalidatePath` in all Server Actions with targeted tags (`session:<id>`, `sessions-list`, etc.)
- **Sessions page pagination**: limit the initial load to N sessions per type with pagination or progressive loading
## Capabilities
### New Capabilities
- `sse-shared-polling`: Shared SSE polling per session (a single interval per active session)
- `unified-broadcast`: Reusable in-process broadcast module for all workshops
### Modified Capabilities
- `sessions-list`: Add pagination/limits to session loading
## Impact
- `src/app/api/*/subscribe/route.ts`: refactor polling onto the shared broadcast module
- `src/lib/broadcast.ts`: new module (Map of active sessions + subscribers)
- `src/actions/*.ts`: replace `revalidatePath` with `revalidateTag` + `unstable_cache`
- `src/app/sessions/page.tsx`: add pagination
- `src/services/`: add cache tags to frequent Prisma queries


@@ -0,0 +1,15 @@
## ADDED Requirements
### Requirement: Paginated sessions list
The sessions page SHALL load sessions in pages rather than fetching all sessions at once, with a default page size of 20 per workshop type.
#### Scenario: Initial load shows first page
- **WHEN** a user visits the sessions page
- **THEN** at most 20 sessions per workshop type SHALL be loaded
- **THEN** a total count SHALL be displayed (e.g., "Showing 20 of 47")
#### Scenario: Load more sessions on demand
- **WHEN** there are more sessions beyond the current page
- **THEN** a "Charger plus" button SHALL be displayed
- **WHEN** the user clicks "Charger plus"
- **THEN** the next page of sessions SHALL be appended to the list


@@ -0,0 +1,17 @@
## ADDED Requirements
### Requirement: Single polling interval per active session
The SSE infrastructure SHALL maintain at most one active DB polling interval per session, regardless of the number of connected clients.
#### Scenario: First client connection starts polling
- **WHEN** the first client connects to a session's SSE endpoint
- **THEN** a single polling interval SHALL be started for that session
#### Scenario: Additional clients share existing polling
- **WHEN** a second or subsequent client connects to the same session's SSE endpoint
- **THEN** no additional polling interval SHALL be created
- **THEN** the new client SHALL receive events from the shared poll
#### Scenario: Last client disconnection stops polling
- **WHEN** all clients disconnect from a session's SSE endpoint
- **THEN** the polling interval for that session SHALL be stopped and cleaned up


@@ -0,0 +1,22 @@
## ADDED Requirements
### Requirement: Centralized broadcast module
The system SHALL provide a centralized `src/lib/broadcast.ts` module used by all workshop SSE routes to push events to connected clients.
#### Scenario: Server Action triggers broadcast
- **WHEN** a Server Action mutates session data and calls `broadcast(sessionId, event)`
- **THEN** all clients subscribed to that session SHALL receive the event immediately without waiting for the next poll cycle
#### Scenario: Broadcast module subscribe/unsubscribe
- **WHEN** an SSE route calls `subscribe(sessionId, callback)`
- **THEN** the callback SHALL be invoked on every subsequent `broadcast(sessionId, ...)` call
- **WHEN** the returned `unsubscribe()` function is called
- **THEN** the callback SHALL no longer receive events
### Requirement: Granular cache invalidation via revalidateTag
Server Actions SHALL use `revalidateTag` with session-scoped tags instead of `revalidatePath` to limit cache invalidation scope.
#### Scenario: Session mutation invalidates only that session's cache
- **WHEN** a Server Action mutates a specific session (e.g., adds an item)
- **THEN** only the cache tagged `session:<id>` SHALL be invalidated
- **THEN** other sessions' cached data SHALL NOT be invalidated


@@ -0,0 +1,36 @@
## 1. broadcast.ts module
- [x] 1.1 Create `src/lib/broadcast.ts` with a `Map<string, Set<(event: unknown) => void>>` and the `subscribe(sessionId, cb)` and `broadcast(sessionId, event)` functions
- [x] 1.2 Add the shared polling logic: `startPolling(sessionId)` / `stopPolling(sessionId)` with a subscriber counter
- [x] 1.3 Run a manual test: open 2 tabs on the same session and verify that a single interval is running (server-side log)
## 2. SSE route migration
- [x] 2.1 Read all the `src/app/api/*/subscribe/route.ts` routes to inventory the current pattern
- [x] 2.2 Migrate the weather route first (it already has a partial version of the pattern) to validate the approach
- [x] 2.3 Migrate the swot, motivators, year-review, and weekly-checkin routes one by one
- [x] 2.4 Verify that the SSE cleanup (abort signal) does call `unsubscribe()` in every migrated route
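The cleanup contract from task 2.4 can be sketched as a stream factory. This is a simplified stand-in for a route handler, not the actual route code: `subscribe` is passed in so the sketch stays self-contained, and the header values follow common SSE practice rather than the real routes:

```typescript
// SSE response that subscribes to the shared broadcaster instead of
// running its own setInterval, and unsubscribes on client disconnect.
type Subscribe = (sessionId: string, cb: (e: unknown) => void) => () => void;

function makeSseResponse(
  sessionId: string,
  signal: AbortSignal,
  subscribe: Subscribe
): Response {
  const encoder = new TextEncoder();
  const stream = new ReadableStream({
    start(controller) {
      const unsubscribe = subscribe(sessionId, (event) => {
        // SSE framing: "data: <json>" followed by a blank line.
        controller.enqueue(encoder.encode(`data: ${JSON.stringify(event)}\n\n`));
      });
      // The abort signal fires when the client disconnects.
      signal.addEventListener("abort", () => {
        unsubscribe();
        controller.close();
      });
    },
  });
  return new Response(stream, {
    headers: {
      "Content-Type": "text/event-stream",
      "Cache-Control": "no-cache",
    },
  });
}
```

Wiring `unsubscribe()` to the abort signal is what guarantees the broadcaster's subscriber set shrinks when a tab closes, which in turn lets the shared polling interval stop.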
## 3. revalidateTag in Server Actions
- [x] 3.1 Define the tag convention in `src/lib/cache-tags.ts` (e.g. `session(id)`, `sessionsList(userId)`)
- [x] 3.2 Add `cacheTag` / `unstable_cache` to the matching service queries
- [x] 3.3 Replace `revalidatePath` with `revalidateTag` in `src/actions/swot.ts`
- [x] 3.4 Replace `revalidatePath` with `revalidateTag` in `src/actions/motivators.ts`
- [x] 3.5 Replace `revalidatePath` with `revalidateTag` in `src/actions/year-review.ts`
- [x] 3.6 Replace `revalidatePath` with `revalidateTag` in `src/actions/weekly-checkin.ts`
- [x] 3.7 Replace `revalidatePath` with `revalidateTag` in `src/actions/weather.ts`
- [x] 3.8 Verify that mutations are reflected correctly in the UI after revalidation
## 4. Broadcast from Server Actions
- [x] 4.1 Add the `broadcast(sessionId, { type: 'update' })` call in every mutating Server Action (after revalidateTag)
- [x] 4.2 Verify that collaborative updates work (open 2 tabs, mutate from one, see the update appear in the other)
## 5. Sessions page pagination
- [x] 5.1 Modify the queries in `src/services/` to accept `cursor` and `limit` (default: 20)
- [x] 5.2 Update `src/app/sessions/page.tsx` to load the first page and display the total
- [x] 5.3 Create a `loadMoreSessions(type, cursor)` Server Action for pagination
- [x] 5.4 Add the "Charger plus" button with a loading state in the sessions list component
- [x] 5.5 Verify the "X of Y sessions" display for each workshop type

openspec/config.yaml

@@ -0,0 +1,20 @@
schema: spec-driven
# Project context (optional)
# This is shown to AI when creating artifacts.
# Add your tech stack, conventions, style guides, domain knowledge, etc.
# Example:
# context: |
# Tech stack: TypeScript, React, Node.js
# We use conventional commits
# Domain: e-commerce platform
# Per-artifact rules (optional)
# Add custom rules for specific artifacts.
# Example:
# rules:
# proposal:
# - Keep proposals under 500 words
# - Always include a "Non-goals" section
# tasks:
# - Break tasks into chunks of max 2 hours


@@ -0,0 +1,18 @@
### Requirement: Loading skeleton on main routes
The application SHALL display a skeleton loading state during navigation to `/sessions` and `/users` routes, activated by Next.js App Router streaming via `loading.tsx` files.
#### Scenario: Navigation to sessions page shows skeleton
- **WHEN** a user navigates to `/sessions`
- **THEN** a loading skeleton SHALL be displayed immediately while the page data loads
#### Scenario: Navigation to users page shows skeleton
- **WHEN** a user navigates to `/users`
- **THEN** a loading skeleton SHALL be displayed immediately while the page data loads
### Requirement: Modal lazy loading
Heavy modal components (ShareModal) SHALL be loaded lazily via `next/dynamic` to reduce the initial JS bundle size.
#### Scenario: ShareModal not in initial bundle
- **WHEN** a page loads that contains a ShareModal trigger
- **THEN** the ShareModal component code SHALL NOT be included in the initial JS bundle
- **THEN** the ShareModal code SHALL be fetched only when first needed


@@ -13,7 +13,11 @@
     "dev": "next dev",
     "build": "next build",
     "start": "next start",
-    "lint": "eslint"
+    "test": "vitest run",
+    "test:watch": "vitest",
+    "test:coverage": "vitest run --coverage",
+    "lint": "eslint",
+    "prettier": "prettier --write ."
   },
   "dependencies": {
     "@dnd-kit/core": "^6.3.1",
@@ -24,7 +28,7 @@
     "@prisma/client": "^7.1.0",
     "bcryptjs": "^3.0.3",
     "better-sqlite3": "^12.4.6",
-    "next": "16.0.7",
+    "next": "16.0.10",
     "next-auth": "5.0.0-beta.30",
     "prisma": "^7.1.0",
     "react": "19.2.0",
@@ -36,12 +40,15 @@
     "@types/node": "^20",
     "@types/react": "^19",
     "@types/react-dom": "^19",
+    "@vitest/coverage-v8": "^4.0.18",
     "dotenv": "^17.2.3",
     "eslint": "^9",
     "eslint-config-next": "16.0.5",
     "eslint-config-prettier": "^10.1.8",
     "prettier": "^3.7.1",
     "tailwindcss": "^4",
-    "typescript": "^5"
+    "typescript": "^5",
+    "vite-tsconfig-paths": "^6.1.1",
+    "vitest": "^4.0.18"
   }
 }

pnpm-lock.yaml (generated)

File diff suppressed because it is too large.


@@ -1,6 +1,6 @@
 const config = {
   plugins: {
-    "@tailwindcss/postcss": {},
+    '@tailwindcss/postcss': {},
   },
 };


@@ -1,14 +1,14 @@
 // This file was generated by Prisma and assumes you have installed the following:
 // npm install --save-dev prisma dotenv
-import "dotenv/config";
-import { defineConfig, env } from "prisma/config";
+import 'dotenv/config';
+import { defineConfig, env } from 'prisma/config';
 export default defineConfig({
-  schema: "prisma/schema.prisma",
+  schema: 'prisma/schema.prisma',
   migrations: {
-    path: "prisma/migrations",
+    path: 'prisma/migrations',
   },
   datasource: {
-    url: env("DATABASE_URL"),
+    url: env('DATABASE_URL'),
   },
 });


@@ -0,0 +1,103 @@
-- CreateEnum
CREATE TABLE "WeeklyCheckInCategory" (
"value" TEXT NOT NULL PRIMARY KEY
);
-- CreateEnum
CREATE TABLE "Emotion" (
"value" TEXT NOT NULL PRIMARY KEY
);
-- InsertEnumValues
INSERT INTO "WeeklyCheckInCategory" ("value") VALUES ('WENT_WELL');
INSERT INTO "WeeklyCheckInCategory" ("value") VALUES ('WENT_WRONG');
INSERT INTO "WeeklyCheckInCategory" ("value") VALUES ('CURRENT_FOCUS');
INSERT INTO "WeeklyCheckInCategory" ("value") VALUES ('NEXT_FOCUS');
-- InsertEnumValues
INSERT INTO "Emotion" ("value") VALUES ('PRIDE');
INSERT INTO "Emotion" ("value") VALUES ('JOY');
INSERT INTO "Emotion" ("value") VALUES ('SATISFACTION');
INSERT INTO "Emotion" ("value") VALUES ('GRATITUDE');
INSERT INTO "Emotion" ("value") VALUES ('CONFIDENCE');
INSERT INTO "Emotion" ("value") VALUES ('FRUSTRATION');
INSERT INTO "Emotion" ("value") VALUES ('WORRY');
INSERT INTO "Emotion" ("value") VALUES ('DISAPPOINTMENT');
INSERT INTO "Emotion" ("value") VALUES ('EXCITEMENT');
INSERT INTO "Emotion" ("value") VALUES ('ANTICIPATION');
INSERT INTO "Emotion" ("value") VALUES ('DETERMINATION');
INSERT INTO "Emotion" ("value") VALUES ('NONE');
-- CreateTable
CREATE TABLE "WeeklyCheckInSession" (
"id" TEXT NOT NULL PRIMARY KEY,
"title" TEXT NOT NULL,
"participant" TEXT NOT NULL,
"date" DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
"userId" TEXT NOT NULL,
"createdAt" DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" DATETIME NOT NULL,
CONSTRAINT "WeeklyCheckInSession_userId_fkey" FOREIGN KEY ("userId") REFERENCES "User" ("id") ON DELETE CASCADE ON UPDATE CASCADE
);
-- CreateTable
CREATE TABLE "WeeklyCheckInItem" (
"id" TEXT NOT NULL PRIMARY KEY,
"content" TEXT NOT NULL,
"category" TEXT NOT NULL,
"emotion" TEXT NOT NULL DEFAULT 'NONE',
"order" INTEGER NOT NULL DEFAULT 0,
"sessionId" TEXT NOT NULL,
"createdAt" DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" DATETIME NOT NULL,
CONSTRAINT "WeeklyCheckInItem_sessionId_fkey" FOREIGN KEY ("sessionId") REFERENCES "WeeklyCheckInSession" ("id") ON DELETE CASCADE ON UPDATE CASCADE,
CONSTRAINT "WeeklyCheckInItem_category_fkey" FOREIGN KEY ("category") REFERENCES "WeeklyCheckInCategory" ("value") ON DELETE RESTRICT ON UPDATE CASCADE,
CONSTRAINT "WeeklyCheckInItem_emotion_fkey" FOREIGN KEY ("emotion") REFERENCES "Emotion" ("value") ON DELETE RESTRICT ON UPDATE CASCADE
);
-- CreateTable
CREATE TABLE "WCISessionShare" (
"id" TEXT NOT NULL PRIMARY KEY,
"sessionId" TEXT NOT NULL,
"userId" TEXT NOT NULL,
"role" TEXT NOT NULL DEFAULT 'EDITOR',
"createdAt" DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
CONSTRAINT "WCISessionShare_sessionId_fkey" FOREIGN KEY ("sessionId") REFERENCES "WeeklyCheckInSession" ("id") ON DELETE CASCADE ON UPDATE CASCADE,
CONSTRAINT "WCISessionShare_userId_fkey" FOREIGN KEY ("userId") REFERENCES "User" ("id") ON DELETE CASCADE ON UPDATE CASCADE
);
-- CreateTable
CREATE TABLE "WCISessionEvent" (
"id" TEXT NOT NULL PRIMARY KEY,
"sessionId" TEXT NOT NULL,
"userId" TEXT NOT NULL,
"type" TEXT NOT NULL,
"payload" TEXT NOT NULL,
"createdAt" DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
CONSTRAINT "WCISessionEvent_sessionId_fkey" FOREIGN KEY ("sessionId") REFERENCES "WeeklyCheckInSession" ("id") ON DELETE CASCADE ON UPDATE CASCADE,
CONSTRAINT "WCISessionEvent_userId_fkey" FOREIGN KEY ("userId") REFERENCES "User" ("id") ON DELETE CASCADE ON UPDATE CASCADE
);
-- CreateIndex
CREATE INDEX "WeeklyCheckInSession_userId_idx" ON "WeeklyCheckInSession"("userId");
-- CreateIndex
CREATE INDEX "WeeklyCheckInSession_date_idx" ON "WeeklyCheckInSession"("date");
-- CreateIndex
CREATE INDEX "WeeklyCheckInItem_sessionId_idx" ON "WeeklyCheckInItem"("sessionId");
-- CreateIndex
CREATE INDEX "WeeklyCheckInItem_sessionId_category_idx" ON "WeeklyCheckInItem"("sessionId", "category");
-- CreateIndex
CREATE INDEX "WCISessionShare_sessionId_idx" ON "WCISessionShare"("sessionId");
-- CreateIndex
CREATE INDEX "WCISessionShare_userId_idx" ON "WCISessionShare"("userId");
-- CreateIndex
CREATE UNIQUE INDEX "WCISessionShare_sessionId_userId_key" ON "WCISessionShare"("sessionId", "userId");
-- CreateIndex
CREATE INDEX "WCISessionEvent_sessionId_createdAt_idx" ON "WCISessionEvent"("sessionId", "createdAt");


@@ -0,0 +1,70 @@
-- CreateTable
CREATE TABLE "YearReviewSession" (
"id" TEXT NOT NULL PRIMARY KEY,
"title" TEXT NOT NULL,
"participant" TEXT NOT NULL,
"year" INTEGER NOT NULL,
"userId" TEXT NOT NULL,
"createdAt" DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" DATETIME NOT NULL,
CONSTRAINT "YearReviewSession_userId_fkey" FOREIGN KEY ("userId") REFERENCES "User" ("id") ON DELETE CASCADE ON UPDATE CASCADE
);
-- CreateTable
CREATE TABLE "YearReviewItem" (
"id" TEXT NOT NULL PRIMARY KEY,
"content" TEXT NOT NULL,
"category" TEXT NOT NULL,
"order" INTEGER NOT NULL DEFAULT 0,
"sessionId" TEXT NOT NULL,
"createdAt" DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" DATETIME NOT NULL,
CONSTRAINT "YearReviewItem_sessionId_fkey" FOREIGN KEY ("sessionId") REFERENCES "YearReviewSession" ("id") ON DELETE CASCADE ON UPDATE CASCADE
);
-- CreateTable
CREATE TABLE "YRSessionShare" (
"id" TEXT NOT NULL PRIMARY KEY,
"sessionId" TEXT NOT NULL,
"userId" TEXT NOT NULL,
"role" TEXT NOT NULL DEFAULT 'EDITOR',
"createdAt" DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
CONSTRAINT "YRSessionShare_sessionId_fkey" FOREIGN KEY ("sessionId") REFERENCES "YearReviewSession" ("id") ON DELETE CASCADE ON UPDATE CASCADE,
CONSTRAINT "YRSessionShare_userId_fkey" FOREIGN KEY ("userId") REFERENCES "User" ("id") ON DELETE CASCADE ON UPDATE CASCADE
);
-- CreateTable
CREATE TABLE "YRSessionEvent" (
"id" TEXT NOT NULL PRIMARY KEY,
"sessionId" TEXT NOT NULL,
"userId" TEXT NOT NULL,
"type" TEXT NOT NULL,
"payload" TEXT NOT NULL,
"createdAt" DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
CONSTRAINT "YRSessionEvent_sessionId_fkey" FOREIGN KEY ("sessionId") REFERENCES "YearReviewSession" ("id") ON DELETE CASCADE ON UPDATE CASCADE,
CONSTRAINT "YRSessionEvent_userId_fkey" FOREIGN KEY ("userId") REFERENCES "User" ("id") ON DELETE CASCADE ON UPDATE CASCADE
);
-- CreateIndex
CREATE INDEX "YearReviewSession_userId_idx" ON "YearReviewSession"("userId");
-- CreateIndex
CREATE INDEX "YearReviewSession_year_idx" ON "YearReviewSession"("year");
-- CreateIndex
CREATE INDEX "YearReviewItem_sessionId_idx" ON "YearReviewItem"("sessionId");
-- CreateIndex
CREATE INDEX "YearReviewItem_sessionId_category_idx" ON "YearReviewItem"("sessionId", "category");
-- CreateIndex
CREATE INDEX "YRSessionShare_sessionId_idx" ON "YRSessionShare"("sessionId");
-- CreateIndex
CREATE INDEX "YRSessionShare_userId_idx" ON "YRSessionShare"("userId");
-- CreateIndex
CREATE UNIQUE INDEX "YRSessionShare_sessionId_userId_key" ON "YRSessionShare"("sessionId", "userId");
-- CreateIndex
CREATE INDEX "YRSessionEvent_sessionId_createdAt_idx" ON "YRSessionEvent"("sessionId", "createdAt");


@@ -0,0 +1,98 @@
-- CreateEnum
CREATE TABLE "TeamRole" (
"value" TEXT NOT NULL PRIMARY KEY
);
INSERT INTO "TeamRole" ("value") VALUES ('ADMIN'), ('MEMBER');
-- CreateEnum
CREATE TABLE "OKRStatus" (
"value" TEXT NOT NULL PRIMARY KEY
);
INSERT INTO "OKRStatus" ("value") VALUES ('NOT_STARTED'), ('IN_PROGRESS'), ('COMPLETED'), ('CANCELLED');
-- CreateEnum
CREATE TABLE "KeyResultStatus" (
"value" TEXT NOT NULL PRIMARY KEY
);
INSERT INTO "KeyResultStatus" ("value") VALUES ('NOT_STARTED'), ('IN_PROGRESS'), ('COMPLETED'), ('AT_RISK');
-- CreateTable
CREATE TABLE "Team" (
"id" TEXT NOT NULL PRIMARY KEY,
"name" TEXT NOT NULL,
"description" TEXT,
"createdById" TEXT NOT NULL,
"createdAt" DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" DATETIME NOT NULL,
CONSTRAINT "Team_createdById_fkey" FOREIGN KEY ("createdById") REFERENCES "User" ("id") ON DELETE CASCADE ON UPDATE CASCADE
);
-- CreateTable
CREATE TABLE "TeamMember" (
"id" TEXT NOT NULL PRIMARY KEY,
"teamId" TEXT NOT NULL,
"userId" TEXT NOT NULL,
"role" TEXT NOT NULL DEFAULT 'MEMBER',
"joinedAt" DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
CONSTRAINT "TeamMember_teamId_fkey" FOREIGN KEY ("teamId") REFERENCES "Team" ("id") ON DELETE CASCADE ON UPDATE CASCADE,
CONSTRAINT "TeamMember_userId_fkey" FOREIGN KEY ("userId") REFERENCES "User" ("id") ON DELETE CASCADE ON UPDATE CASCADE
);
-- CreateTable
CREATE TABLE "OKR" (
"id" TEXT NOT NULL PRIMARY KEY,
"teamMemberId" TEXT NOT NULL,
"objective" TEXT NOT NULL,
"description" TEXT,
"period" TEXT NOT NULL,
"startDate" DATETIME NOT NULL,
"endDate" DATETIME NOT NULL,
"status" TEXT NOT NULL DEFAULT 'NOT_STARTED',
"createdAt" DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" DATETIME NOT NULL,
CONSTRAINT "OKR_teamMemberId_fkey" FOREIGN KEY ("teamMemberId") REFERENCES "TeamMember" ("id") ON DELETE CASCADE ON UPDATE CASCADE
);
-- CreateTable
CREATE TABLE "KeyResult" (
"id" TEXT NOT NULL PRIMARY KEY,
"okrId" TEXT NOT NULL,
"title" TEXT NOT NULL,
"targetValue" REAL NOT NULL,
"currentValue" REAL NOT NULL DEFAULT 0,
"unit" TEXT NOT NULL DEFAULT '%',
"status" TEXT NOT NULL DEFAULT 'NOT_STARTED',
"order" INTEGER NOT NULL DEFAULT 0,
"notes" TEXT,
"createdAt" DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" DATETIME NOT NULL,
CONSTRAINT "KeyResult_okrId_fkey" FOREIGN KEY ("okrId") REFERENCES "OKR" ("id") ON DELETE CASCADE ON UPDATE CASCADE
);
-- CreateIndex
CREATE INDEX "Team_createdById_idx" ON "Team"("createdById");
-- CreateIndex
CREATE INDEX "TeamMember_teamId_idx" ON "TeamMember"("teamId");
-- CreateIndex
CREATE INDEX "TeamMember_userId_idx" ON "TeamMember"("userId");
-- CreateIndex
CREATE UNIQUE INDEX "TeamMember_teamId_userId_key" ON "TeamMember"("teamId", "userId");
-- CreateIndex
CREATE INDEX "OKR_teamMemberId_idx" ON "OKR"("teamMemberId");
-- CreateIndex
CREATE INDEX "OKR_teamMemberId_period_idx" ON "OKR"("teamMemberId", "period");
-- CreateIndex
CREATE INDEX "OKR_status_idx" ON "OKR"("status");
-- CreateIndex
CREATE INDEX "KeyResult_okrId_idx" ON "KeyResult"("okrId");
-- CreateIndex
CREATE INDEX "KeyResult_okrId_order_idx" ON "KeyResult"("okrId", "order");


@@ -0,0 +1,76 @@
-- CreateTable
CREATE TABLE "WeatherSession" (
"id" TEXT NOT NULL PRIMARY KEY,
"title" TEXT NOT NULL,
"date" DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
"userId" TEXT NOT NULL,
"createdAt" DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" DATETIME NOT NULL,
CONSTRAINT "WeatherSession_userId_fkey" FOREIGN KEY ("userId") REFERENCES "User" ("id") ON DELETE CASCADE ON UPDATE CASCADE
);
-- CreateTable
CREATE TABLE "WeatherEntry" (
"id" TEXT NOT NULL PRIMARY KEY,
"sessionId" TEXT NOT NULL,
"userId" TEXT NOT NULL,
"performanceEmoji" TEXT,
"moralEmoji" TEXT,
"fluxEmoji" TEXT,
"valueCreationEmoji" TEXT,
"notes" TEXT,
"createdAt" DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" DATETIME NOT NULL,
CONSTRAINT "WeatherEntry_sessionId_fkey" FOREIGN KEY ("sessionId") REFERENCES "WeatherSession" ("id") ON DELETE CASCADE ON UPDATE CASCADE,
CONSTRAINT "WeatherEntry_userId_fkey" FOREIGN KEY ("userId") REFERENCES "User" ("id") ON DELETE CASCADE ON UPDATE CASCADE
);
-- CreateTable
CREATE TABLE "WeatherSessionShare" (
"id" TEXT NOT NULL PRIMARY KEY,
"sessionId" TEXT NOT NULL,
"userId" TEXT NOT NULL,
"role" TEXT NOT NULL DEFAULT 'EDITOR',
"createdAt" DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
CONSTRAINT "WeatherSessionShare_sessionId_fkey" FOREIGN KEY ("sessionId") REFERENCES "WeatherSession" ("id") ON DELETE CASCADE ON UPDATE CASCADE,
CONSTRAINT "WeatherSessionShare_userId_fkey" FOREIGN KEY ("userId") REFERENCES "User" ("id") ON DELETE CASCADE ON UPDATE CASCADE
);
-- CreateTable
CREATE TABLE "WeatherSessionEvent" (
"id" TEXT NOT NULL PRIMARY KEY,
"sessionId" TEXT NOT NULL,
"userId" TEXT NOT NULL,
"type" TEXT NOT NULL,
"payload" TEXT NOT NULL,
"createdAt" DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
CONSTRAINT "WeatherSessionEvent_sessionId_fkey" FOREIGN KEY ("sessionId") REFERENCES "WeatherSession" ("id") ON DELETE CASCADE ON UPDATE CASCADE,
CONSTRAINT "WeatherSessionEvent_userId_fkey" FOREIGN KEY ("userId") REFERENCES "User" ("id") ON DELETE CASCADE ON UPDATE CASCADE
);
-- CreateIndex
CREATE INDEX "WeatherSession_userId_idx" ON "WeatherSession"("userId");
-- CreateIndex
CREATE INDEX "WeatherSession_date_idx" ON "WeatherSession"("date");
-- CreateIndex
CREATE UNIQUE INDEX "WeatherEntry_sessionId_userId_key" ON "WeatherEntry"("sessionId", "userId");
-- CreateIndex
CREATE INDEX "WeatherEntry_sessionId_idx" ON "WeatherEntry"("sessionId");
-- CreateIndex
CREATE INDEX "WeatherEntry_userId_idx" ON "WeatherEntry"("userId");
-- CreateIndex
CREATE UNIQUE INDEX "WeatherSessionShare_sessionId_userId_key" ON "WeatherSessionShare"("sessionId", "userId");
-- CreateIndex
CREATE INDEX "WeatherSessionShare_sessionId_idx" ON "WeatherSessionShare"("sessionId");
-- CreateIndex
CREATE INDEX "WeatherSessionShare_userId_idx" ON "WeatherSessionShare"("userId");
-- CreateIndex
CREATE INDEX "WeatherSessionEvent_sessionId_createdAt_idx" ON "WeatherSessionEvent"("sessionId", "createdAt");


@@ -0,0 +1,137 @@
-- DropTable
PRAGMA foreign_keys=off;
DROP TABLE "Emotion";
PRAGMA foreign_keys=on;
-- DropTable
PRAGMA foreign_keys=off;
DROP TABLE "KeyResultStatus";
PRAGMA foreign_keys=on;
-- DropTable
PRAGMA foreign_keys=off;
DROP TABLE "OKRStatus";
PRAGMA foreign_keys=on;
-- DropTable
PRAGMA foreign_keys=off;
DROP TABLE "TeamRole";
PRAGMA foreign_keys=on;
-- DropTable
PRAGMA foreign_keys=off;
DROP TABLE "WeeklyCheckInCategory";
PRAGMA foreign_keys=on;
-- CreateTable
CREATE TABLE "GifMoodSession" (
"id" TEXT NOT NULL PRIMARY KEY,
"title" TEXT NOT NULL,
"date" DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
"userId" TEXT NOT NULL,
"createdAt" DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" DATETIME NOT NULL,
CONSTRAINT "GifMoodSession_userId_fkey" FOREIGN KEY ("userId") REFERENCES "User" ("id") ON DELETE CASCADE ON UPDATE CASCADE
);
-- CreateTable
CREATE TABLE "GifMoodUserRating" (
"id" TEXT NOT NULL PRIMARY KEY,
"sessionId" TEXT NOT NULL,
"userId" TEXT NOT NULL,
"rating" INTEGER NOT NULL,
"createdAt" DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" DATETIME NOT NULL,
CONSTRAINT "GifMoodUserRating_sessionId_fkey" FOREIGN KEY ("sessionId") REFERENCES "GifMoodSession" ("id") ON DELETE CASCADE ON UPDATE CASCADE,
CONSTRAINT "GifMoodUserRating_userId_fkey" FOREIGN KEY ("userId") REFERENCES "User" ("id") ON DELETE CASCADE ON UPDATE CASCADE
);
-- CreateTable
CREATE TABLE "GifMoodItem" (
"id" TEXT NOT NULL PRIMARY KEY,
"sessionId" TEXT NOT NULL,
"userId" TEXT NOT NULL,
"gifUrl" TEXT NOT NULL,
"note" TEXT,
"order" INTEGER NOT NULL DEFAULT 0,
"createdAt" DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" DATETIME NOT NULL,
CONSTRAINT "GifMoodItem_sessionId_fkey" FOREIGN KEY ("sessionId") REFERENCES "GifMoodSession" ("id") ON DELETE CASCADE ON UPDATE CASCADE,
CONSTRAINT "GifMoodItem_userId_fkey" FOREIGN KEY ("userId") REFERENCES "User" ("id") ON DELETE CASCADE ON UPDATE CASCADE
);
-- CreateTable
CREATE TABLE "GMSessionShare" (
"id" TEXT NOT NULL PRIMARY KEY,
"sessionId" TEXT NOT NULL,
"userId" TEXT NOT NULL,
"role" TEXT NOT NULL DEFAULT 'EDITOR',
"createdAt" DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
CONSTRAINT "GMSessionShare_sessionId_fkey" FOREIGN KEY ("sessionId") REFERENCES "GifMoodSession" ("id") ON DELETE CASCADE ON UPDATE CASCADE,
CONSTRAINT "GMSessionShare_userId_fkey" FOREIGN KEY ("userId") REFERENCES "User" ("id") ON DELETE CASCADE ON UPDATE CASCADE
);
-- CreateTable
CREATE TABLE "GMSessionEvent" (
"id" TEXT NOT NULL PRIMARY KEY,
"sessionId" TEXT NOT NULL,
"userId" TEXT NOT NULL,
"type" TEXT NOT NULL,
"payload" TEXT NOT NULL,
"createdAt" DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
CONSTRAINT "GMSessionEvent_sessionId_fkey" FOREIGN KEY ("sessionId") REFERENCES "GifMoodSession" ("id") ON DELETE CASCADE ON UPDATE CASCADE,
CONSTRAINT "GMSessionEvent_userId_fkey" FOREIGN KEY ("userId") REFERENCES "User" ("id") ON DELETE CASCADE ON UPDATE CASCADE
);
-- RedefineTables
PRAGMA defer_foreign_keys=ON;
PRAGMA foreign_keys=OFF;
CREATE TABLE "new_WeeklyCheckInItem" (
"id" TEXT NOT NULL PRIMARY KEY,
"content" TEXT NOT NULL,
"category" TEXT NOT NULL,
"emotion" TEXT NOT NULL DEFAULT 'NONE',
"order" INTEGER NOT NULL DEFAULT 0,
"sessionId" TEXT NOT NULL,
"createdAt" DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" DATETIME NOT NULL,
CONSTRAINT "WeeklyCheckInItem_sessionId_fkey" FOREIGN KEY ("sessionId") REFERENCES "WeeklyCheckInSession" ("id") ON DELETE CASCADE ON UPDATE CASCADE
);
INSERT INTO "new_WeeklyCheckInItem" ("category", "content", "createdAt", "emotion", "id", "order", "sessionId", "updatedAt") SELECT "category", "content", "createdAt", "emotion", "id", "order", "sessionId", "updatedAt" FROM "WeeklyCheckInItem";
DROP TABLE "WeeklyCheckInItem";
ALTER TABLE "new_WeeklyCheckInItem" RENAME TO "WeeklyCheckInItem";
CREATE INDEX "WeeklyCheckInItem_sessionId_idx" ON "WeeklyCheckInItem"("sessionId");
CREATE INDEX "WeeklyCheckInItem_sessionId_category_idx" ON "WeeklyCheckInItem"("sessionId", "category");
PRAGMA foreign_keys=ON;
PRAGMA defer_foreign_keys=OFF;
-- CreateIndex
CREATE INDEX "GifMoodSession_userId_idx" ON "GifMoodSession"("userId");
-- CreateIndex
CREATE INDEX "GifMoodSession_date_idx" ON "GifMoodSession"("date");
-- CreateIndex
CREATE INDEX "GifMoodUserRating_sessionId_idx" ON "GifMoodUserRating"("sessionId");
-- CreateIndex
CREATE UNIQUE INDEX "GifMoodUserRating_sessionId_userId_key" ON "GifMoodUserRating"("sessionId", "userId");
-- CreateIndex
CREATE INDEX "GifMoodItem_sessionId_userId_idx" ON "GifMoodItem"("sessionId", "userId");
-- CreateIndex
CREATE INDEX "GifMoodItem_sessionId_idx" ON "GifMoodItem"("sessionId");
-- CreateIndex
CREATE INDEX "GMSessionShare_sessionId_idx" ON "GMSessionShare"("sessionId");
-- CreateIndex
CREATE INDEX "GMSessionShare_userId_idx" ON "GMSessionShare"("userId");
-- CreateIndex
CREATE UNIQUE INDEX "GMSessionShare_sessionId_userId_key" ON "GMSessionShare"("sessionId", "userId");
-- CreateIndex
CREATE INDEX "GMSessionEvent_sessionId_createdAt_idx" ON "GMSessionEvent"("sessionId", "createdAt");


@@ -0,0 +1,2 @@
-- CreateIndex
CREATE INDEX "User_name_idx" ON "User"("name");


@@ -21,8 +21,32 @@ model User {
motivatorSessions MovingMotivatorsSession[]
sharedMotivatorSessions MMSessionShare[]
motivatorSessionEvents MMSessionEvent[]
// Year Review relations
yearReviewSessions YearReviewSession[]
sharedYearReviewSessions YRSessionShare[]
yearReviewSessionEvents YRSessionEvent[]
// Weekly Check-in relations
weeklyCheckInSessions WeeklyCheckInSession[]
sharedWeeklyCheckInSessions WCISessionShare[]
weeklyCheckInSessionEvents WCISessionEvent[]
// Weather Workshop relations
weatherSessions WeatherSession[]
sharedWeatherSessions WeatherSessionShare[]
weatherSessionEvents WeatherSessionEvent[]
weatherEntries WeatherEntry[]
// GIF Mood Board relations
gifMoodSessions GifMoodSession[]
gifMoodItems GifMoodItem[]
sharedGifMoodSessions GMSessionShare[]
gifMoodSessionEvents GMSessionEvent[]
gifMoodRatings GifMoodUserRating[]
// Teams & OKRs relations
createdTeams Team[]
teamMembers TeamMember[]
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
@@index([name])
}
model Session {
@@ -200,3 +224,390 @@ model MMSessionEvent {
@@index([sessionId, createdAt])
}
// ============================================
// Year Review Workshop
// ============================================
enum YearReviewCategory {
ACHIEVEMENTS // Achievements / Accomplishments
CHALLENGES // Challenges / Difficulties encountered
LEARNINGS // Learnings / Skills developed
GOALS // Goals for the following year
MOMENTS // Highlights / Difficult moments
}
model YearReviewSession {
id String @id @default(cuid())
title String
participant String // Participant name
year Int // Year under review (e.g. 2024)
userId String
user User @relation(fields: [userId], references: [id], onDelete: Cascade)
items YearReviewItem[]
shares YRSessionShare[]
events YRSessionEvent[]
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
@@index([userId])
@@index([year])
}
model YearReviewItem {
id String @id @default(cuid())
content String
category YearReviewCategory
order Int @default(0)
sessionId String
session YearReviewSession @relation(fields: [sessionId], references: [id], onDelete: Cascade)
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
@@index([sessionId])
@@index([sessionId, category])
}
model YRSessionShare {
id String @id @default(cuid())
sessionId String
session YearReviewSession @relation(fields: [sessionId], references: [id], onDelete: Cascade)
userId String
user User @relation(fields: [userId], references: [id], onDelete: Cascade)
role ShareRole @default(EDITOR)
createdAt DateTime @default(now())
@@unique([sessionId, userId])
@@index([sessionId])
@@index([userId])
}
model YRSessionEvent {
id String @id @default(cuid())
sessionId String
session YearReviewSession @relation(fields: [sessionId], references: [id], onDelete: Cascade)
userId String
user User @relation(fields: [userId], references: [id], onDelete: Cascade)
type String // ITEM_CREATED, ITEM_UPDATED, ITEM_DELETED, etc.
payload String // JSON payload
createdAt DateTime @default(now())
@@index([sessionId, createdAt])
}
// ============================================
// Teams & OKRs
// ============================================
enum TeamRole {
ADMIN
MEMBER
}
enum OKRStatus {
NOT_STARTED
IN_PROGRESS
COMPLETED
CANCELLED
}
enum KeyResultStatus {
NOT_STARTED
IN_PROGRESS
COMPLETED
AT_RISK
}
model Team {
id String @id @default(cuid())
name String
description String?
createdById String
creator User @relation(fields: [createdById], references: [id], onDelete: Cascade)
members TeamMember[]
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
@@index([createdById])
}
model TeamMember {
id String @id @default(cuid())
teamId String
team Team @relation(fields: [teamId], references: [id], onDelete: Cascade)
userId String
user User @relation(fields: [userId], references: [id], onDelete: Cascade)
role TeamRole @default(MEMBER)
okrs OKR[]
joinedAt DateTime @default(now())
@@unique([teamId, userId])
@@index([teamId])
@@index([userId])
}
model OKR {
id String @id @default(cuid())
teamMemberId String
teamMember TeamMember @relation(fields: [teamMemberId], references: [id], onDelete: Cascade)
objective String
description String?
period String // Q1 2025, Q2 2025, H1 2025, 2025, etc.
startDate DateTime
endDate DateTime
status OKRStatus @default(NOT_STARTED)
keyResults KeyResult[]
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
@@index([teamMemberId])
@@index([teamMemberId, period])
@@index([status])
}
model KeyResult {
id String @id @default(cuid())
okrId String
okr OKR @relation(fields: [okrId], references: [id], onDelete: Cascade)
title String
targetValue Float
currentValue Float @default(0)
unit String @default("%") // %, number, etc.
status KeyResultStatus @default(NOT_STARTED)
order Int @default(0)
notes String?
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
@@index([okrId])
@@index([okrId, order])
}
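The commit log above mentions unit tests for `calculateOKRProgress` over this `KeyResult` shape. A minimal sketch of such a progress computation — the real service in `src/services/okrs` may differ, and the equal weighting and rounding here are assumptions:

```typescript
// Sketch of an OKR progress computation over the KeyResult model above.
// Equal weighting, per-KR capping at 100%, and rounding are assumptions;
// the actual calculateOKRProgress in src/services/okrs may differ.
interface KeyResultLike {
  targetValue: number;
  currentValue: number;
}

function calculateProgress(keyResults: KeyResultLike[]): number {
  if (keyResults.length === 0) return 0;
  const total = keyResults.reduce((sum, kr) => {
    if (kr.targetValue <= 0) return sum; // guard against division by zero
    // Cap each key result at 100% so one overshoot cannot mask the others.
    return sum + Math.min(kr.currentValue / kr.targetValue, 1);
  }, 0);
  return Math.round((total / keyResults.length) * 100);
}
```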
// ============================================
// Weekly Check-in Workshop
// ============================================
enum WeeklyCheckInCategory {
WENT_WELL // What went well
WENT_WRONG // What went wrong
CURRENT_FOCUS // Current priorities (I am focusing on ...)
NEXT_FOCUS // Upcoming priorities
}
enum Emotion {
PRIDE // Pride
JOY // Joy
SATISFACTION // Satisfaction
GRATITUDE // Gratitude
CONFIDENCE // Confidence
FRUSTRATION // Frustration
WORRY // Worry
DISAPPOINTMENT // Disappointment
EXCITEMENT // Excitement
ANTICIPATION // Anticipation
DETERMINATION // Determination
NONE // No emotion
}
model WeeklyCheckInSession {
id String @id @default(cuid())
title String
participant String // Participant name
date DateTime @default(now())
userId String
user User @relation(fields: [userId], references: [id], onDelete: Cascade)
items WeeklyCheckInItem[]
shares WCISessionShare[]
events WCISessionEvent[]
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
@@index([userId])
@@index([date])
}
model WeeklyCheckInItem {
id String @id @default(cuid())
content String
category WeeklyCheckInCategory
emotion Emotion @default(NONE)
order Int @default(0)
sessionId String
session WeeklyCheckInSession @relation(fields: [sessionId], references: [id], onDelete: Cascade)
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
@@index([sessionId])
@@index([sessionId, category])
}
model WCISessionShare {
id String @id @default(cuid())
sessionId String
session WeeklyCheckInSession @relation(fields: [sessionId], references: [id], onDelete: Cascade)
userId String
user User @relation(fields: [userId], references: [id], onDelete: Cascade)
role ShareRole @default(EDITOR)
createdAt DateTime @default(now())
@@unique([sessionId, userId])
@@index([sessionId])
@@index([userId])
}
model WCISessionEvent {
id String @id @default(cuid())
sessionId String
session WeeklyCheckInSession @relation(fields: [sessionId], references: [id], onDelete: Cascade)
userId String
user User @relation(fields: [userId], references: [id], onDelete: Cascade)
type String // ITEM_CREATED, ITEM_UPDATED, ITEM_DELETED, etc.
payload String // JSON payload
createdAt DateTime @default(now())
@@index([sessionId, createdAt])
}
// ============================================
// Weather Workshop
// ============================================
model WeatherSession {
id String @id @default(cuid())
title String
date DateTime @default(now())
userId String
user User @relation(fields: [userId], references: [id], onDelete: Cascade)
entries WeatherEntry[]
shares WeatherSessionShare[]
events WeatherSessionEvent[]
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
@@index([userId])
@@index([date])
}
model WeatherEntry {
id String @id @default(cuid())
sessionId String
session WeatherSession @relation(fields: [sessionId], references: [id], onDelete: Cascade)
userId String
user User @relation(fields: [userId], references: [id], onDelete: Cascade)
performanceEmoji String? // Weather emoji for Performance
moralEmoji String? // Weather emoji for Morale
fluxEmoji String? // Weather emoji for Flow
valueCreationEmoji String? // Weather emoji for Value creation
notes String? // Overall notes
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
@@unique([sessionId, userId]) // One entry per member per session
@@index([sessionId])
@@index([userId])
}
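The perf commit caps reads of these entries with `WEATHER_HISTORY_LIMIT=90` applied via `take`/`orderBy`. In Prisma that is `orderBy: { createdAt: 'desc' }, take: WEATHER_HISTORY_LIMIT`; the equivalent as a pure function, for illustration only:

```typescript
const WEATHER_HISTORY_LIMIT = 90; // constant named in the commit message

interface Dated {
  createdAt: Date;
}

// Pure-function equivalent of the Prisma query shape
// `orderBy: { createdAt: 'desc' }, take: limit` used on weather history.
function latestEntries<T extends Dated>(entries: T[], limit = WEATHER_HISTORY_LIMIT): T[] {
  return [...entries]
    .sort((a, b) => b.createdAt.getTime() - a.createdAt.getTime())
    .slice(0, limit);
}
```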
model WeatherSessionShare {
id String @id @default(cuid())
sessionId String
session WeatherSession @relation(fields: [sessionId], references: [id], onDelete: Cascade)
userId String
user User @relation(fields: [userId], references: [id], onDelete: Cascade)
role ShareRole @default(EDITOR)
createdAt DateTime @default(now())
@@unique([sessionId, userId])
@@index([sessionId])
@@index([userId])
}
model WeatherSessionEvent {
id String @id @default(cuid())
sessionId String
session WeatherSession @relation(fields: [sessionId], references: [id], onDelete: Cascade)
userId String
user User @relation(fields: [userId], references: [id], onDelete: Cascade)
type String // ENTRY_CREATED, ENTRY_UPDATED, ENTRY_DELETED, SESSION_UPDATED, etc.
payload String // JSON payload
createdAt DateTime @default(now())
@@index([sessionId, createdAt])
}
// ============================================
// GIF Mood Board Workshop
// ============================================
model GifMoodSession {
id String @id @default(cuid())
title String
date DateTime @default(now())
userId String
user User @relation(fields: [userId], references: [id], onDelete: Cascade)
items GifMoodItem[]
shares GMSessionShare[]
events GMSessionEvent[]
ratings GifMoodUserRating[]
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
@@index([userId])
@@index([date])
}
model GifMoodUserRating {
id String @id @default(cuid())
sessionId String
session GifMoodSession @relation(fields: [sessionId], references: [id], onDelete: Cascade)
userId String
user User @relation(fields: [userId], references: [id], onDelete: Cascade)
rating Int // 1-5
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
@@unique([sessionId, userId])
@@index([sessionId])
}
model GifMoodItem {
id String @id @default(cuid())
sessionId String
session GifMoodSession @relation(fields: [sessionId], references: [id], onDelete: Cascade)
userId String
user User @relation(fields: [userId], references: [id], onDelete: Cascade)
gifUrl String
note String?
order Int @default(0)
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
@@index([sessionId, userId])
@@index([sessionId])
}
model GMSessionShare {
id String @id @default(cuid())
sessionId String
session GifMoodSession @relation(fields: [sessionId], references: [id], onDelete: Cascade)
userId String
user User @relation(fields: [userId], references: [id], onDelete: Cascade)
role ShareRole @default(EDITOR)
createdAt DateTime @default(now())
@@unique([sessionId, userId])
@@index([sessionId])
@@index([userId])
}
model GMSessionEvent {
id String @id @default(cuid())
sessionId String
session GifMoodSession @relation(fields: [sessionId], references: [id], onDelete: Cascade)
userId String
user User @relation(fields: [userId], references: [id], onDelete: Cascade)
type String // GIF_ADDED, GIF_UPDATED, GIF_DELETED, SESSION_UPDATED
payload String // JSON payload
createdAt DateTime @default(now())
@@index([sessionId, createdAt])
}
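Each `*SessionEvent` row above (a `type` plus a JSON `payload`) is pushed to SSE subscribers. The commit message attributes a `formatEvent` helper to `src/lib/broadcast.ts`; a sketch of what serializing one of these rows into an SSE frame could look like — the exact field set is an assumption:

```typescript
// Sketch of serializing a session event into a Server-Sent Events frame.
// Field names mirror the *SessionEvent models above; the real formatEvent
// in src/lib/broadcast.ts may differ.
interface SessionEventLike {
  type: string;     // e.g. 'GIF_ADDED', 'ITEM_UPDATED'
  payload: unknown; // already-parsed JSON payload
  userId: string;
  createdAt: Date;
}

function formatEvent(event: SessionEventLike): string {
  // An SSE frame is "data: <json>\n\n"; the blank line terminates the frame.
  return `data: ${JSON.stringify({
    type: event.type,
    payload: event.payload,
    userId: event.userId,
    timestamp: event.createdAt.toISOString(),
  })}\n\n`;
}
```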


@@ -1 +0,0 @@
<svg fill="none" viewBox="0 0 16 16" xmlns="http://www.w3.org/2000/svg"><path d="M14.5 13.5V5.41a1 1 0 0 0-.3-.7L9.8.29A1 1 0 0 0 9.08 0H1.5v13.5A2.5 2.5 0 0 0 4 16h8a2.5 2.5 0 0 0 2.5-2.5m-1.5 0v-7H8v-5H3v12a1 1 0 0 0 1 1h8a1 1 0 0 0 1-1M9.5 5V2.12L12.38 5zM5.13 5h-.62v1.25h2.12V5zm-.62 3h7.12v1.25H4.5zm.62 3h-.62v1.25h7.12V11z" clip-rule="evenodd" fill="#666" fill-rule="evenodd"/></svg>



@@ -1 +0,0 @@
<svg fill="none" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 16 16"><g clip-path="url(#a)"><path fill-rule="evenodd" clip-rule="evenodd" d="M10.27 14.1a6.5 6.5 0 0 0 3.67-3.45q-1.24.21-2.7.34-.31 1.83-.97 3.1M8 16A8 8 0 1 0 8 0a8 8 0 0 0 0 16m.48-1.52a7 7 0 0 1-.96 0H7.5a4 4 0 0 1-.84-1.32q-.38-.89-.63-2.08a40 40 0 0 0 3.92 0q-.25 1.2-.63 2.08a4 4 0 0 1-.84 1.31zm2.94-4.76q1.66-.15 2.95-.43a7 7 0 0 0 0-2.58q-1.3-.27-2.95-.43a18 18 0 0 1 0 3.44m-1.27-3.54a17 17 0 0 1 0 3.64 39 39 0 0 1-4.3 0 17 17 0 0 1 0-3.64 39 39 0 0 1 4.3 0m1.1-1.17q1.45.13 2.69.34a6.5 6.5 0 0 0-3.67-3.44q.65 1.26.98 3.1M8.48 1.5l.01.02q.41.37.84 1.31.38.89.63 2.08a40 40 0 0 0-3.92 0q.25-1.2.63-2.08a4 4 0 0 1 .85-1.32 7 7 0 0 1 .96 0m-2.75.4a6.5 6.5 0 0 0-3.67 3.44 29 29 0 0 1 2.7-.34q.31-1.83.97-3.1M4.58 6.28q-1.66.16-2.95.43a7 7 0 0 0 0 2.58q1.3.27 2.95.43a18 18 0 0 1 0-3.44m.17 4.71q-1.45-.12-2.69-.34a6.5 6.5 0 0 0 3.67 3.44q-.65-1.27-.98-3.1" fill="#666"/></g><defs><clipPath id="a"><path fill="#fff" d="M0 0h16v16H0z"/></clipPath></defs></svg>


public/icon.svg Normal file

@@ -0,0 +1,6 @@
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 24 24" fill="none" stroke="#0891b2" stroke-width="2" stroke-linecap="round" stroke-linejoin="round">
<path d="M4.5 16.5c-1.5 1.26-2 5-2 5s3.74-.5 5-2c.71-.84.7-2.13-.09-2.91a2.18 2.18 0 0 0-2.91-.09z" />
<path d="m12 15-3-3a22 22 0 0 1 2-3.95A12.88 12.88 0 0 1 22 2c0 2.72-.78 7.5-6 11a22.35 22.35 0 0 1-4 2z" />
<path d="M9 12H4s.55-3.03 2-4c1.62-1.08 5 0 5 0" />
<path d="M12 15v5s3.03-.55 4-2c1.08-1.62 0-5 0-5" />
</svg>



@@ -1 +0,0 @@
<svg xmlns="http://www.w3.org/2000/svg" fill="none" viewBox="0 0 394 80"><path fill="#000" d="M262 0h68.5v12.7h-27.2v66.6h-13.6V12.7H262V0ZM149 0v12.7H94v20.4h44.3v12.6H94v21h55v12.6H80.5V0h68.7zm34.3 0h-17.8l63.8 79.4h17.9l-32-39.7 32-39.6h-17.9l-23 28.6-23-28.6zm18.3 56.7-9-11-27.1 33.7h17.8l18.3-22.7z"/><path fill="#000" d="M81 79.3 17 0H0v79.3h13.6V17l50.2 62.3H81Zm252.6-.4c-1 0-1.8-.4-2.5-1s-1.1-1.6-1.1-2.6.3-1.8 1-2.5 1.6-1 2.6-1 1.8.3 2.5 1a3.4 3.4 0 0 1 .6 4.3 3.7 3.7 0 0 1-3 1.8zm23.2-33.5h6v23.3c0 2.1-.4 4-1.3 5.5a9.1 9.1 0 0 1-3.8 3.5c-1.6.8-3.5 1.3-5.7 1.3-2 0-3.7-.4-5.3-1s-2.8-1.8-3.7-3.2c-.9-1.3-1.4-3-1.4-5h6c.1.8.3 1.6.7 2.2s1 1.2 1.6 1.5c.7.4 1.5.5 2.4.5 1 0 1.8-.2 2.4-.6a4 4 0 0 0 1.6-1.8c.3-.8.5-1.8.5-3V45.5zm30.9 9.1a4.4 4.4 0 0 0-2-3.3 7.5 7.5 0 0 0-4.3-1.1c-1.3 0-2.4.2-3.3.5-.9.4-1.6 1-2 1.6a3.5 3.5 0 0 0-.3 4c.3.5.7.9 1.3 1.2l1.8 1 2 .5 3.2.8c1.3.3 2.5.7 3.7 1.2a13 13 0 0 1 3.2 1.8 8.1 8.1 0 0 1 3 6.5c0 2-.5 3.7-1.5 5.1a10 10 0 0 1-4.4 3.5c-1.8.8-4.1 1.2-6.8 1.2-2.6 0-4.9-.4-6.8-1.2-2-.8-3.4-2-4.5-3.5a10 10 0 0 1-1.7-5.6h6a5 5 0 0 0 3.5 4.6c1 .4 2.2.6 3.4.6 1.3 0 2.5-.2 3.5-.6 1-.4 1.8-1 2.4-1.7a4 4 0 0 0 .8-2.4c0-.9-.2-1.6-.7-2.2a11 11 0 0 0-2.1-1.4l-3.2-1-3.8-1c-2.8-.7-5-1.7-6.6-3.2a7.2 7.2 0 0 1-2.4-5.7 8 8 0 0 1 1.7-5 10 10 0 0 1 4.3-3.5c2-.8 4-1.2 6.4-1.2 2.3 0 4.4.4 6.2 1.2 1.8.8 3.2 2 4.3 3.4 1 1.4 1.5 3 1.5 5h-5.8z"/></svg>


Binary file not shown.



@@ -1 +0,0 @@
<svg fill="none" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 1155 1000"><path d="m577.3 0 577.4 1000H0z" fill="#fff"/></svg>



@@ -1 +0,0 @@
<svg fill="none" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 16 16"><path fill-rule="evenodd" clip-rule="evenodd" d="M1.5 2.5h13v10a1 1 0 0 1-1 1h-11a1 1 0 0 1-1-1zM0 1h16v11.5a2.5 2.5 0 0 1-2.5 2.5h-11A2.5 2.5 0 0 1 0 12.5zm3.75 4.5a.75.75 0 1 0 0-1.5.75.75 0 0 0 0 1.5M7 4.75a.75.75 0 1 1-1.5 0 .75.75 0 0 1 1.5 0m1.75.75a.75.75 0 1 0 0-1.5.75.75 0 0 0 0 1.5" fill="#666"/></svg>


src/actions/gif-mood.ts Normal file

@@ -0,0 +1,344 @@
'use server';
import { revalidatePath, revalidateTag } from 'next/cache';
import { auth } from '@/lib/auth';
import * as gifMoodService from '@/services/gif-mood';
import { sessionsListTag } from '@/lib/cache-tags';
import { getUserById } from '@/services/auth';
import { broadcastToGifMoodSession } from '@/app/api/gif-mood/[id]/subscribe/route';
// ============================================
// Session Actions
// ============================================
export async function createGifMoodSession(data: { title: string; date?: Date }) {
const session = await auth();
if (!session?.user?.id) {
return { success: false, error: 'Non autorisé' };
}
try {
const gifMoodSession = await gifMoodService.createGifMoodSession(session.user.id, data);
revalidatePath('/gif-mood');
revalidatePath('/sessions');
revalidateTag(sessionsListTag(session.user.id), 'default');
return { success: true, data: gifMoodSession };
} catch (error) {
console.error('Error creating gif mood session:', error);
return { success: false, error: 'Erreur lors de la création' };
}
}
export async function updateGifMoodSession(
sessionId: string,
data: { title?: string; date?: Date }
) {
const authSession = await auth();
if (!authSession?.user?.id) {
return { success: false, error: 'Non autorisé' };
}
try {
await gifMoodService.updateGifMoodSession(sessionId, authSession.user.id, data);
const user = await getUserById(authSession.user.id);
if (!user) {
return { success: false, error: 'Utilisateur non trouvé' };
}
const event = await gifMoodService.createGifMoodSessionEvent(
sessionId,
authSession.user.id,
'SESSION_UPDATED',
data
);
broadcastToGifMoodSession(sessionId, {
type: 'SESSION_UPDATED',
payload: data,
userId: authSession.user.id,
user: { id: user.id, name: user.name, email: user.email },
timestamp: event.createdAt,
});
revalidatePath(`/gif-mood/${sessionId}`);
revalidatePath('/gif-mood');
revalidatePath('/sessions');
revalidateTag(sessionsListTag(authSession.user.id), 'default');
return { success: true };
} catch (error) {
console.error('Error updating gif mood session:', error);
return { success: false, error: 'Erreur lors de la mise à jour' };
}
}
export async function deleteGifMoodSession(sessionId: string) {
const authSession = await auth();
if (!authSession?.user?.id) {
return { success: false, error: 'Non autorisé' };
}
try {
await gifMoodService.deleteGifMoodSession(sessionId, authSession.user.id);
revalidatePath('/gif-mood');
revalidatePath('/sessions');
revalidateTag(sessionsListTag(authSession.user.id), 'default');
return { success: true };
} catch (error) {
console.error('Error deleting gif mood session:', error);
return { success: false, error: 'Erreur lors de la suppression' };
}
}
// ============================================
// Item Actions
// ============================================
export async function addGifMoodItem(
sessionId: string,
data: { gifUrl: string; note?: string }
) {
const authSession = await auth();
if (!authSession?.user?.id) {
return { success: false, error: 'Non autorisé' };
}
const canEdit = await gifMoodService.canEditGifMoodSession(sessionId, authSession.user.id);
if (!canEdit) {
return { success: false, error: 'Permission refusée' };
}
try {
const item = await gifMoodService.addGifMoodItem(sessionId, authSession.user.id, data);
const user = await getUserById(authSession.user.id);
if (!user) {
return { success: false, error: 'Utilisateur non trouvé' };
}
const event = await gifMoodService.createGifMoodSessionEvent(
sessionId,
authSession.user.id,
'GIF_ADDED',
{ itemId: item.id, userId: item.userId, gifUrl: item.gifUrl, note: item.note }
);
broadcastToGifMoodSession(sessionId, {
type: 'GIF_ADDED',
payload: { itemId: item.id, userId: item.userId, gifUrl: item.gifUrl, note: item.note },
userId: authSession.user.id,
user: { id: user.id, name: user.name, email: user.email },
timestamp: event.createdAt,
});
revalidatePath(`/gif-mood/${sessionId}`);
return { success: true, data: item };
} catch (error) {
console.error('Error adding gif mood item:', error);
const message = error instanceof Error ? error.message : "Erreur lors de l'ajout";
return { success: false, error: message };
}
}
export async function updateGifMoodItem(
sessionId: string,
itemId: string,
data: { note?: string; order?: number }
) {
const authSession = await auth();
if (!authSession?.user?.id) {
return { success: false, error: 'Non autorisé' };
}
const canEdit = await gifMoodService.canEditGifMoodSession(sessionId, authSession.user.id);
if (!canEdit) {
return { success: false, error: 'Permission refusée' };
}
try {
await gifMoodService.updateGifMoodItem(itemId, authSession.user.id, data);
const user = await getUserById(authSession.user.id);
if (!user) {
return { success: false, error: 'Utilisateur non trouvé' };
}
const event = await gifMoodService.createGifMoodSessionEvent(
sessionId,
authSession.user.id,
'GIF_UPDATED',
{ itemId, ...data }
);
broadcastToGifMoodSession(sessionId, {
type: 'GIF_UPDATED',
payload: { itemId, ...data },
userId: authSession.user.id,
user: { id: user.id, name: user.name, email: user.email },
timestamp: event.createdAt,
});
revalidatePath(`/gif-mood/${sessionId}`);
return { success: true };
} catch (error) {
console.error('Error updating gif mood item:', error);
return { success: false, error: 'Erreur lors de la mise à jour' };
}
}
export async function deleteGifMoodItem(sessionId: string, itemId: string) {
const authSession = await auth();
if (!authSession?.user?.id) {
return { success: false, error: 'Non autorisé' };
}
const canEdit = await gifMoodService.canEditGifMoodSession(sessionId, authSession.user.id);
if (!canEdit) {
return { success: false, error: 'Permission refusée' };
}
try {
await gifMoodService.deleteGifMoodItem(itemId, authSession.user.id);
const user = await getUserById(authSession.user.id);
if (!user) {
return { success: false, error: 'Utilisateur non trouvé' };
}
const event = await gifMoodService.createGifMoodSessionEvent(
sessionId,
authSession.user.id,
'GIF_DELETED',
{ itemId, userId: authSession.user.id }
);
broadcastToGifMoodSession(sessionId, {
type: 'GIF_DELETED',
payload: { itemId, userId: authSession.user.id },
userId: authSession.user.id,
user: { id: user.id, name: user.name, email: user.email },
timestamp: event.createdAt,
});
revalidatePath(`/gif-mood/${sessionId}`);
return { success: true };
} catch (error) {
console.error('Error deleting gif mood item:', error);
return { success: false, error: 'Erreur lors de la suppression' };
}
}
// ============================================
// Week Rating Actions
// ============================================
export async function setGifMoodUserRating(sessionId: string, rating: number) {
const authSession = await auth();
if (!authSession?.user?.id) {
return { success: false, error: 'Non autorisé' };
}
const canEdit = await gifMoodService.canEditGifMoodSession(sessionId, authSession.user.id);
if (!canEdit) {
return { success: false, error: 'Permission refusée' };
}
try {
await gifMoodService.upsertGifMoodUserRating(sessionId, authSession.user.id, rating);
const user = await getUserById(authSession.user.id);
if (user) {
const event = await gifMoodService.createGifMoodSessionEvent(
sessionId,
authSession.user.id,
'SESSION_UPDATED',
{ rating, userId: authSession.user.id }
);
broadcastToGifMoodSession(sessionId, {
type: 'SESSION_UPDATED',
payload: { rating, userId: authSession.user.id },
userId: authSession.user.id,
user: { id: user.id, name: user.name, email: user.email },
timestamp: event.createdAt,
});
}
revalidatePath(`/gif-mood/${sessionId}`);
return { success: true };
} catch (error) {
console.error('Error setting gif mood user rating:', error);
return { success: false, error: 'Erreur lors de la mise à jour' };
}
}
// ============================================
// Sharing Actions
// ============================================
export async function shareGifMoodSession(
sessionId: string,
targetEmail: string,
role: 'VIEWER' | 'EDITOR' = 'EDITOR'
) {
const authSession = await auth();
if (!authSession?.user?.id) {
return { success: false, error: 'Non autorisé' };
}
try {
const share = await gifMoodService.shareGifMoodSession(
sessionId,
authSession.user.id,
targetEmail,
role
);
revalidatePath(`/gif-mood/${sessionId}`);
return { success: true, data: share };
} catch (error) {
console.error('Error sharing gif mood session:', error);
const message = error instanceof Error ? error.message : 'Erreur lors du partage';
return { success: false, error: message };
}
}
export async function shareGifMoodSessionToTeam(
sessionId: string,
teamId: string,
role: 'VIEWER' | 'EDITOR' = 'EDITOR'
) {
const authSession = await auth();
if (!authSession?.user?.id) {
return { success: false, error: 'Non autorisé' };
}
try {
const shares = await gifMoodService.shareGifMoodSessionToTeam(
sessionId,
authSession.user.id,
teamId,
role
);
revalidatePath(`/gif-mood/${sessionId}`);
return { success: true, data: shares };
} catch (error) {
console.error('Error sharing gif mood session to team:', error);
const message = error instanceof Error ? error.message : "Erreur lors du partage à l'équipe";
return { success: false, error: message };
}
}
export async function removeGifMoodShare(sessionId: string, shareUserId: string) {
const authSession = await auth();
if (!authSession?.user?.id) {
return { success: false, error: 'Non autorisé' };
}
try {
await gifMoodService.removeGifMoodShare(sessionId, authSession.user.id, shareUserId);
revalidatePath(`/gif-mood/${sessionId}`);
return { success: true };
} catch (error) {
console.error('Error removing gif mood share:', error);
return { success: false, error: 'Erreur lors de la suppression du partage' };
}
}
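The `broadcastToGifMoodSession` calls above come from the `createBroadcaster` factory the commit message describes for `src/lib/broadcast.ts`: one shared poll interval per active session, started by the first subscriber and stopped by the last, replacing a per-connection `setInterval`. A simplified sketch under those assumptions — names and signatures are illustrative, not the actual implementation:

```typescript
type Listener<T> = (event: T) => void;

interface SessionState<T> {
  listeners: Set<Listener<T>>;
  timer?: ReturnType<typeof setInterval>;
}

// Sketch of the shared-polling broadcaster: the poll interval is per
// session, not per connection. Assumed API, not the real src/lib/broadcast.ts.
function createBroadcaster<T>(pollMs = 2000) {
  const sessions = new Map<string, SessionState<T>>();

  function subscribe(sessionId: string, listener: Listener<T>, poll?: () => void): () => void {
    let entry = sessions.get(sessionId);
    if (!entry) {
      entry = { listeners: new Set<Listener<T>>() };
      sessions.set(sessionId, entry);
      // First subscriber for this session starts the single shared interval.
      if (poll) entry.timer = setInterval(poll, pollMs);
    }
    const e = entry; // stable reference for the cleanup closure
    e.listeners.add(listener);
    return () => {
      e.listeners.delete(listener);
      if (e.listeners.size === 0) {
        // Last subscriber gone: stop polling and drop the session state.
        if (e.timer) clearInterval(e.timer);
        sessions.delete(sessionId);
      }
    };
  }

  // Called by Server Actions after mutations for immediate push,
  // bypassing the poll loop entirely.
  function broadcast(sessionId: string, event: T): void {
    sessions.get(sessionId)?.listeners.forEach((listener) => listener(event));
  }

  return { subscribe, broadcast };
}
```

Each SSE route would create one broadcaster, subscribe on connection open, and call the returned cleanup when the stream closes; sessions stay isolated because listeners are keyed by `sessionId`.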


@@ -1,8 +1,10 @@
'use server';
import { revalidatePath, revalidateTag } from 'next/cache';
import { auth } from '@/lib/auth';
import * as motivatorsService from '@/services/moving-motivators';
import { sessionsListTag } from '@/lib/cache-tags';
import { broadcastToMotivatorSession } from '@/app/api/motivators/[id]/subscribe/route';
// ============================================
// Session Actions
@@ -16,6 +18,16 @@ export async function createMotivatorSession(data: { title: string; participant:
try {
const motivatorSession = await motivatorsService.createMotivatorSession(session.user.id, data);
try {
await motivatorsService.shareMotivatorSession(
motivatorSession.id,
session.user.id,
data.participant,
'EDITOR'
);
} catch (shareError) {
console.error('Auto-share failed:', shareError);
}
revalidatePath('/motivators');
return { success: true, data: motivatorSession };
} catch (error) {
@@ -44,9 +56,11 @@ export async function updateMotivatorSession(
data
);
broadcastToMotivatorSession(sessionId, { type: 'SESSION_UPDATED' });
revalidatePath(`/motivators/${sessionId}`);
revalidatePath('/motivators');
revalidatePath('/sessions'); // Also revalidate unified workshops page
revalidateTag(sessionsListTag(authSession.user.id), 'default');
return { success: true };
} catch (error) {
console.error('Error updating motivator session:', error);
@@ -64,6 +78,7 @@ export async function deleteMotivatorSession(sessionId: string) {
await motivatorsService.deleteMotivatorSession(sessionId, authSession.user.id);
revalidatePath('/motivators');
revalidatePath('/sessions'); // Also revalidate unified workshops page
revalidateTag(sessionsListTag(authSession.user.id), 'default');
return { success: true };
} catch (error) {
console.error('Error deleting motivator session:', error);
@@ -111,6 +126,7 @@ export async function updateMotivatorCard(
);
}
broadcastToMotivatorSession(sessionId, { type: 'CARD_UPDATED' });
revalidatePath(`/motivators/${sessionId}`);
return { success: true, data: card };
} catch (error) {
@@ -142,6 +158,7 @@ export async function reorderMotivatorCards(sessionId: string, cardIds: string[]
{ cardIds }
);
broadcastToMotivatorSession(sessionId, { type: 'CARDS_REORDERED' });
revalidatePath(`/motivators/${sessionId}`);
return { success: true };
} catch (error) {


@@ -1,8 +1,10 @@
'use server'; 'use server';
import { revalidatePath } from 'next/cache'; import { revalidatePath, revalidateTag } from 'next/cache';
import { auth } from '@/lib/auth'; import { auth } from '@/lib/auth';
import * as sessionsService from '@/services/sessions'; import * as sessionsService from '@/services/sessions';
import { sessionsListTag } from '@/lib/cache-tags';
import { broadcastToSession } from '@/app/api/sessions/[id]/subscribe/route';
export async function updateSessionTitle(sessionId: string, title: string) { export async function updateSessionTitle(sessionId: string, title: string) {
const session = await auth(); const session = await auth();
@@ -28,8 +30,10 @@ export async function updateSessionTitle(sessionId: string, title: string) {
title: title.trim(), title: title.trim(),
}); });
broadcastToSession(sessionId, { type: 'SESSION_UPDATED' });
revalidatePath(`/sessions/${sessionId}`); revalidatePath(`/sessions/${sessionId}`);
revalidatePath('/sessions'); revalidatePath('/sessions');
revalidateTag(sessionsListTag(session.user.id), 'default');
return { success: true }; return { success: true };
} catch (error) { } catch (error) {
console.error('Error updating session title:', error); console.error('Error updating session title:', error);
@@ -61,8 +65,10 @@ export async function updateSessionCollaborator(sessionId: string, collaborator:
collaborator: collaborator.trim(),
});
broadcastToSession(sessionId, { type: 'SESSION_UPDATED' });
revalidatePath(`/sessions/${sessionId}`);
revalidatePath('/sessions');
revalidateTag(sessionsListTag(session.user.id), 'default');
return { success: true };
} catch (error) {
console.error('Error updating session collaborator:', error);
@@ -106,8 +112,10 @@ export async function updateSwotSession(
updateData
);
broadcastToSession(sessionId, { type: 'SESSION_UPDATED' });
revalidatePath(`/sessions/${sessionId}`);
revalidatePath('/sessions');
revalidateTag(sessionsListTag(session.user.id), 'default');
return { success: true };
} catch (error) {
console.error('Error updating session:', error);
@@ -129,6 +137,7 @@ export async function deleteSwotSession(sessionId: string) {
}
revalidatePath('/sessions');
revalidateTag(sessionsListTag(session.user.id), 'default');
return { success: true };
} catch (error) {
console.error('Error deleting session:', error);


@@ -0,0 +1,49 @@
'use server';
import { auth } from '@/lib/auth';
import { SESSIONS_PAGE_SIZE } from '@/lib/types';
import { withWorkshopType } from '@/lib/workshops';
import { getSessionsByUserId } from '@/services/sessions';
import { getMotivatorSessionsByUserId } from '@/services/moving-motivators';
import { getYearReviewSessionsByUserId } from '@/services/year-review';
import { getWeeklyCheckInSessionsByUserId } from '@/services/weekly-checkin';
import { getWeatherSessionsByUserId } from '@/services/weather';
import { getGifMoodSessionsByUserId } from '@/services/gif-mood';
import type { WorkshopTypeId } from '@/lib/workshops';
export async function loadMoreSessions(type: WorkshopTypeId, offset: number) {
const session = await auth();
if (!session?.user?.id) return null;
const userId = session.user.id;
const limit = SESSIONS_PAGE_SIZE;
switch (type) {
case 'swot': {
const all = await getSessionsByUserId(userId);
return { items: withWorkshopType(all.slice(offset, offset + limit), 'swot'), total: all.length };
}
case 'motivators': {
const all = await getMotivatorSessionsByUserId(userId);
return { items: withWorkshopType(all.slice(offset, offset + limit), 'motivators'), total: all.length };
}
case 'year-review': {
const all = await getYearReviewSessionsByUserId(userId);
return { items: withWorkshopType(all.slice(offset, offset + limit), 'year-review'), total: all.length };
}
case 'weekly-checkin': {
const all = await getWeeklyCheckInSessionsByUserId(userId);
return { items: withWorkshopType(all.slice(offset, offset + limit), 'weekly-checkin'), total: all.length };
}
case 'weather': {
const all = await getWeatherSessionsByUserId(userId);
return { items: withWorkshopType(all.slice(offset, offset + limit), 'weather'), total: all.length };
}
case 'gif-mood': {
const all = await getGifMoodSessionsByUserId(userId);
return { items: withWorkshopType(all.slice(offset, offset + limit), 'gif-mood'), total: all.length };
}
default:
return null;
}
}


@@ -3,6 +3,7 @@
import { revalidatePath } from 'next/cache';
import { auth } from '@/lib/auth';
import * as sessionsService from '@/services/sessions';
import { broadcastToSession } from '@/app/api/sessions/[id]/subscribe/route';
import type { SwotCategory } from '@prisma/client';
// ============================================
@@ -17,6 +18,9 @@ export async function createSwotItem(
if (!session?.user?.id) {
return { success: false, error: 'Non autorisé' };
}
if (!(await sessionsService.canEditSession(sessionId, session.user.id))) {
return { success: false, error: 'Non autorisé' };
}
try {
const item = await sessionsService.createSwotItem(sessionId, data);
@@ -28,6 +32,7 @@ export async function createSwotItem(
category: item.category,
});
broadcastToSession(sessionId, { type: 'ITEM_CREATED' });
revalidatePath(`/sessions/${sessionId}`);
return { success: true, data: item };
} catch (error) {
@@ -45,6 +50,9 @@ export async function updateSwotItem(
if (!session?.user?.id) {
return { success: false, error: 'Non autorisé' };
}
if (!(await sessionsService.canEditSession(sessionId, session.user.id))) {
return { success: false, error: 'Non autorisé' };
}
try {
const item = await sessionsService.updateSwotItem(itemId, data);
@@ -55,6 +63,7 @@ export async function updateSwotItem(
...data,
});
broadcastToSession(sessionId, { type: 'ITEM_UPDATED' });
revalidatePath(`/sessions/${sessionId}`);
return { success: true, data: item };
} catch (error) {
@@ -68,6 +77,9 @@ export async function deleteSwotItem(itemId: string, sessionId: string) {
if (!session?.user?.id) {
return { success: false, error: 'Non autorisé' };
}
if (!(await sessionsService.canEditSession(sessionId, session.user.id))) {
return { success: false, error: 'Non autorisé' };
}
try {
await sessionsService.deleteSwotItem(itemId);
@@ -77,6 +89,7 @@ export async function deleteSwotItem(itemId: string, sessionId: string) {
itemId,
});
broadcastToSession(sessionId, { type: 'ITEM_DELETED' });
revalidatePath(`/sessions/${sessionId}`);
return { success: true };
} catch (error) {
@@ -90,6 +103,9 @@ export async function duplicateSwotItem(itemId: string, sessionId: string) {
if (!session?.user?.id) {
return { success: false, error: 'Non autorisé' };
}
if (!(await sessionsService.canEditSession(sessionId, session.user.id))) {
return { success: false, error: 'Non autorisé' };
}
try {
const item = await sessionsService.duplicateSwotItem(itemId);
@@ -102,6 +118,7 @@ export async function duplicateSwotItem(itemId: string, sessionId: string) {
duplicatedFrom: itemId,
});
broadcastToSession(sessionId, { type: 'ITEM_CREATED' });
revalidatePath(`/sessions/${sessionId}`);
return { success: true, data: item };
} catch (error) {
@@ -120,6 +137,9 @@ export async function moveSwotItem(
if (!session?.user?.id) {
return { success: false, error: 'Non autorisé' };
}
if (!(await sessionsService.canEditSession(sessionId, session.user.id))) {
return { success: false, error: 'Non autorisé' };
}
try {
const item = await sessionsService.moveSwotItem(itemId, newCategory, newOrder);
@@ -131,6 +151,7 @@ export async function moveSwotItem(
newOrder,
});
broadcastToSession(sessionId, { type: 'ITEM_MOVED' });
revalidatePath(`/sessions/${sessionId}`);
return { success: true, data: item };
} catch (error) {
@@ -156,6 +177,9 @@ export async function createAction(
if (!session?.user?.id) {
return { success: false, error: 'Non autorisé' };
}
if (!(await sessionsService.canEditSession(sessionId, session.user.id))) {
return { success: false, error: 'Non autorisé' };
}
try {
const action = await sessionsService.createAction(sessionId, data);
@@ -167,6 +191,7 @@ export async function createAction(
linkedItemIds: data.linkedItemIds,
});
broadcastToSession(sessionId, { type: 'ACTION_CREATED' });
revalidatePath(`/sessions/${sessionId}`);
return { success: true, data: action };
} catch (error) {
@@ -183,12 +208,16 @@ export async function updateAction(
description?: string;
priority?: number;
status?: string;
linkedItemIds?: string[];
}
) {
const session = await auth();
if (!session?.user?.id) {
return { success: false, error: 'Non autorisé' };
}
if (!(await sessionsService.canEditSession(sessionId, session.user.id))) {
return { success: false, error: 'Non autorisé' };
}
try {
const action = await sessionsService.updateAction(actionId, data);
@@ -199,6 +228,7 @@ export async function updateAction(
...data,
});
broadcastToSession(sessionId, { type: 'ACTION_UPDATED' });
revalidatePath(`/sessions/${sessionId}`);
return { success: true, data: action };
} catch (error) {
@@ -212,6 +242,9 @@ export async function deleteAction(actionId: string, sessionId: string) {
if (!session?.user?.id) {
return { success: false, error: 'Non autorisé' };
}
if (!(await sessionsService.canEditSession(sessionId, session.user.id))) {
return { success: false, error: 'Non autorisé' };
}
try {
await sessionsService.deleteAction(actionId);
@@ -221,6 +254,7 @@ export async function deleteAction(actionId: string, sessionId: string) {
actionId,
});
broadcastToSession(sessionId, { type: 'ACTION_DELETED' });
revalidatePath(`/sessions/${sessionId}`);
return { success: true };
} catch (error) {

src/actions/weather.ts (new file, 285 lines)

@@ -0,0 +1,285 @@
'use server';
import { revalidatePath, revalidateTag } from 'next/cache';
import { auth } from '@/lib/auth';
import * as weatherService from '@/services/weather';
import { sessionsListTag } from '@/lib/cache-tags';
import { getUserById } from '@/services/auth';
import { broadcastToWeatherSession } from '@/app/api/weather/[id]/subscribe/route';
// ============================================
// Session Actions
// ============================================
export async function createWeatherSession(data: { title: string; date?: Date }) {
const session = await auth();
if (!session?.user?.id) {
return { success: false, error: 'Non autorisé' };
}
try {
const weatherSession = await weatherService.createWeatherSession(session.user.id, data);
revalidatePath('/weather');
revalidatePath('/sessions');
revalidateTag(sessionsListTag(session.user.id), 'default');
return { success: true, data: weatherSession };
} catch (error) {
console.error('Error creating weather session:', error);
return { success: false, error: 'Erreur lors de la création' };
}
}
export async function updateWeatherSession(
sessionId: string,
data: { title?: string; date?: Date }
) {
const authSession = await auth();
if (!authSession?.user?.id) {
return { success: false, error: 'Non autorisé' };
}
try {
await weatherService.updateWeatherSession(sessionId, authSession.user.id, data);
// Get user info for broadcast
const user = await getUserById(authSession.user.id);
if (!user) {
return { success: false, error: 'Utilisateur non trouvé' };
}
// Emit event for real-time sync
const event = await weatherService.createWeatherSessionEvent(
sessionId,
authSession.user.id,
'SESSION_UPDATED',
data
);
// Broadcast immediately via SSE
broadcastToWeatherSession(sessionId, {
type: 'SESSION_UPDATED',
payload: data,
userId: authSession.user.id,
user: { id: user.id, name: user.name, email: user.email },
timestamp: event.createdAt,
});
revalidatePath(`/weather/${sessionId}`);
revalidatePath('/weather');
revalidatePath('/sessions');
revalidateTag(sessionsListTag(authSession.user.id), 'default');
return { success: true };
} catch (error) {
console.error('Error updating weather session:', error);
return { success: false, error: 'Erreur lors de la mise à jour' };
}
}
export async function deleteWeatherSession(sessionId: string) {
const authSession = await auth();
if (!authSession?.user?.id) {
return { success: false, error: 'Non autorisé' };
}
try {
await weatherService.deleteWeatherSession(sessionId, authSession.user.id);
revalidatePath('/weather');
revalidatePath('/sessions');
revalidateTag(sessionsListTag(authSession.user.id), 'default');
return { success: true };
} catch (error) {
console.error('Error deleting weather session:', error);
return { success: false, error: 'Erreur lors de la suppression' };
}
}
// ============================================
// Entry Actions
// ============================================
export async function createOrUpdateWeatherEntry(
sessionId: string,
data: {
performanceEmoji?: string | null;
moralEmoji?: string | null;
fluxEmoji?: string | null;
valueCreationEmoji?: string | null;
notes?: string | null;
}
) {
const authSession = await auth();
if (!authSession?.user?.id) {
return { success: false, error: 'Non autorisé' };
}
// Check edit permission
const canEdit = await weatherService.canEditWeatherSession(sessionId, authSession.user.id);
if (!canEdit) {
return { success: false, error: 'Permission refusée' };
}
try {
const entry = await weatherService.createOrUpdateWeatherEntry(
sessionId,
authSession.user.id,
data
);
// Get user info for broadcast
const user = await getUserById(authSession.user.id);
if (!user) {
return { success: false, error: 'Utilisateur non trouvé' };
}
// Emit event for real-time sync
const eventType =
entry.createdAt.getTime() === entry.updatedAt.getTime() ? 'ENTRY_CREATED' : 'ENTRY_UPDATED';
const event = await weatherService.createWeatherSessionEvent(
sessionId,
authSession.user.id,
eventType,
{
entryId: entry.id,
userId: entry.userId,
...data,
}
);
// Broadcast immediately via SSE
broadcastToWeatherSession(sessionId, {
type: eventType,
payload: {
entryId: entry.id,
userId: entry.userId,
...data,
},
userId: authSession.user.id,
user: { id: user.id, name: user.name, email: user.email },
timestamp: event.createdAt,
});
revalidatePath(`/weather/${sessionId}`);
return { success: true, data: entry };
} catch (error) {
console.error('Error creating/updating weather entry:', error);
return { success: false, error: 'Erreur lors de la sauvegarde' };
}
}
export async function deleteWeatherEntry(sessionId: string) {
const authSession = await auth();
if (!authSession?.user?.id) {
return { success: false, error: 'Non autorisé' };
}
// Check edit permission
const canEdit = await weatherService.canEditWeatherSession(sessionId, authSession.user.id);
if (!canEdit) {
return { success: false, error: 'Permission refusée' };
}
try {
await weatherService.deleteWeatherEntry(sessionId, authSession.user.id);
// Get user info for broadcast
const user = await getUserById(authSession.user.id);
if (!user) {
return { success: false, error: 'Utilisateur non trouvé' };
}
// Emit event for real-time sync
const event = await weatherService.createWeatherSessionEvent(
sessionId,
authSession.user.id,
'ENTRY_DELETED',
{ userId: authSession.user.id }
);
// Broadcast immediately via SSE
broadcastToWeatherSession(sessionId, {
type: 'ENTRY_DELETED',
payload: { userId: authSession.user.id },
userId: authSession.user.id,
user: { id: user.id, name: user.name, email: user.email },
timestamp: event.createdAt,
});
revalidatePath(`/weather/${sessionId}`);
return { success: true };
} catch (error) {
console.error('Error deleting weather entry:', error);
return { success: false, error: 'Erreur lors de la suppression' };
}
}
// ============================================
// Sharing Actions
// ============================================
export async function shareWeatherSession(
sessionId: string,
targetEmail: string,
role: 'VIEWER' | 'EDITOR' = 'EDITOR'
) {
const authSession = await auth();
if (!authSession?.user?.id) {
return { success: false, error: 'Non autorisé' };
}
try {
const share = await weatherService.shareWeatherSession(
sessionId,
authSession.user.id,
targetEmail,
role
);
revalidatePath(`/weather/${sessionId}`);
return { success: true, data: share };
} catch (error) {
console.error('Error sharing weather session:', error);
const message = error instanceof Error ? error.message : 'Erreur lors du partage';
return { success: false, error: message };
}
}
export async function shareWeatherSessionToTeam(
sessionId: string,
teamId: string,
role: 'VIEWER' | 'EDITOR' = 'EDITOR'
) {
const authSession = await auth();
if (!authSession?.user?.id) {
return { success: false, error: 'Non autorisé' };
}
try {
const shares = await weatherService.shareWeatherSessionToTeam(
sessionId,
authSession.user.id,
teamId,
role
);
revalidatePath(`/weather/${sessionId}`);
return { success: true, data: shares };
} catch (error) {
console.error('Error sharing weather session to team:', error);
const message = error instanceof Error ? error.message : "Erreur lors du partage à l'équipe";
return { success: false, error: message };
}
}
export async function removeWeatherShare(sessionId: string, shareUserId: string) {
const authSession = await auth();
if (!authSession?.user?.id) {
return { success: false, error: 'Non autorisé' };
}
try {
await weatherService.removeWeatherShare(sessionId, authSession.user.id, shareUserId);
revalidatePath(`/weather/${sessionId}`);
return { success: true };
} catch (error) {
console.error('Error removing weather share:', error);
return { success: false, error: 'Erreur lors de la suppression du partage' };
}
}


@@ -0,0 +1,354 @@
'use server';
import { revalidatePath, revalidateTag } from 'next/cache';
import { auth } from '@/lib/auth';
import * as weeklyCheckInService from '@/services/weekly-checkin';
import { sessionsListTag } from '@/lib/cache-tags';
import { broadcastToWeeklyCheckInSession } from '@/app/api/weekly-checkin/[id]/subscribe/route';
import type { WeeklyCheckInCategory, Emotion } from '@prisma/client';
// ============================================
// Session Actions
// ============================================
export async function createWeeklyCheckInSession(data: {
title: string;
participant: string;
date?: Date;
}) {
const session = await auth();
if (!session?.user?.id) {
return { success: false, error: 'Non autorisé' };
}
try {
const weeklyCheckInSession = await weeklyCheckInService.createWeeklyCheckInSession(
session.user.id,
data
);
try {
await weeklyCheckInService.shareWeeklyCheckInSession(
weeklyCheckInSession.id,
session.user.id,
data.participant,
'EDITOR'
);
} catch (shareError) {
console.error('Auto-share failed:', shareError);
}
revalidatePath('/weekly-checkin');
revalidatePath('/sessions');
revalidateTag(sessionsListTag(session.user.id), 'default');
return { success: true, data: weeklyCheckInSession };
} catch (error) {
console.error('Error creating weekly check-in session:', error);
return { success: false, error: 'Erreur lors de la création' };
}
}
export async function updateWeeklyCheckInSession(
sessionId: string,
data: { title?: string; participant?: string; date?: Date }
) {
const authSession = await auth();
if (!authSession?.user?.id) {
return { success: false, error: 'Non autorisé' };
}
try {
await weeklyCheckInService.updateWeeklyCheckInSession(sessionId, authSession.user.id, data);
// Emit event for real-time sync
await weeklyCheckInService.createWeeklyCheckInSessionEvent(
sessionId,
authSession.user.id,
'SESSION_UPDATED',
data
);
broadcastToWeeklyCheckInSession(sessionId, { type: 'SESSION_UPDATED' });
revalidatePath(`/weekly-checkin/${sessionId}`);
revalidatePath('/weekly-checkin');
revalidatePath('/sessions');
revalidateTag(sessionsListTag(authSession.user.id), 'default');
return { success: true };
} catch (error) {
console.error('Error updating weekly check-in session:', error);
return { success: false, error: 'Erreur lors de la mise à jour' };
}
}
export async function deleteWeeklyCheckInSession(sessionId: string) {
const authSession = await auth();
if (!authSession?.user?.id) {
return { success: false, error: 'Non autorisé' };
}
try {
await weeklyCheckInService.deleteWeeklyCheckInSession(sessionId, authSession.user.id);
revalidatePath('/weekly-checkin');
revalidatePath('/sessions');
revalidateTag(sessionsListTag(authSession.user.id), 'default');
return { success: true };
} catch (error) {
console.error('Error deleting weekly check-in session:', error);
return { success: false, error: 'Erreur lors de la suppression' };
}
}
// ============================================
// Item Actions
// ============================================
export async function createWeeklyCheckInItem(
sessionId: string,
data: { content: string; category: WeeklyCheckInCategory; emotion?: Emotion }
) {
const authSession = await auth();
if (!authSession?.user?.id) {
return { success: false, error: 'Non autorisé' };
}
// Check edit permission
const canEdit = await weeklyCheckInService.canEditWeeklyCheckInSession(
sessionId,
authSession.user.id
);
if (!canEdit) {
return { success: false, error: 'Permission refusée' };
}
try {
const item = await weeklyCheckInService.createWeeklyCheckInItem(sessionId, data);
// Emit event for real-time sync
await weeklyCheckInService.createWeeklyCheckInSessionEvent(
sessionId,
authSession.user.id,
'ITEM_CREATED',
{
itemId: item.id,
content: item.content,
category: item.category,
emotion: item.emotion,
}
);
broadcastToWeeklyCheckInSession(sessionId, { type: 'ITEM_CREATED' });
revalidatePath(`/weekly-checkin/${sessionId}`);
return { success: true, data: item };
} catch (error) {
console.error('Error creating weekly check-in item:', error);
return { success: false, error: 'Erreur lors de la création' };
}
}
export async function updateWeeklyCheckInItem(
itemId: string,
sessionId: string,
data: { content?: string; category?: WeeklyCheckInCategory; emotion?: Emotion }
) {
const authSession = await auth();
if (!authSession?.user?.id) {
return { success: false, error: 'Non autorisé' };
}
// Check edit permission
const canEdit = await weeklyCheckInService.canEditWeeklyCheckInSession(
sessionId,
authSession.user.id
);
if (!canEdit) {
return { success: false, error: 'Permission refusée' };
}
try {
const item = await weeklyCheckInService.updateWeeklyCheckInItem(itemId, data);
// Emit event for real-time sync
await weeklyCheckInService.createWeeklyCheckInSessionEvent(
sessionId,
authSession.user.id,
'ITEM_UPDATED',
{
itemId: item.id,
...data,
}
);
broadcastToWeeklyCheckInSession(sessionId, { type: 'ITEM_UPDATED' });
revalidatePath(`/weekly-checkin/${sessionId}`);
return { success: true, data: item };
} catch (error) {
console.error('Error updating weekly check-in item:', error);
return { success: false, error: 'Erreur lors de la mise à jour' };
}
}
export async function deleteWeeklyCheckInItem(itemId: string, sessionId: string) {
const authSession = await auth();
if (!authSession?.user?.id) {
return { success: false, error: 'Non autorisé' };
}
// Check edit permission
const canEdit = await weeklyCheckInService.canEditWeeklyCheckInSession(
sessionId,
authSession.user.id
);
if (!canEdit) {
return { success: false, error: 'Permission refusée' };
}
try {
await weeklyCheckInService.deleteWeeklyCheckInItem(itemId);
// Emit event for real-time sync
await weeklyCheckInService.createWeeklyCheckInSessionEvent(
sessionId,
authSession.user.id,
'ITEM_DELETED',
{ itemId }
);
broadcastToWeeklyCheckInSession(sessionId, { type: 'ITEM_DELETED' });
revalidatePath(`/weekly-checkin/${sessionId}`);
return { success: true };
} catch (error) {
console.error('Error deleting weekly check-in item:', error);
return { success: false, error: 'Erreur lors de la suppression' };
}
}
export async function moveWeeklyCheckInItem(
itemId: string,
sessionId: string,
newCategory: WeeklyCheckInCategory,
newOrder: number
) {
const authSession = await auth();
if (!authSession?.user?.id) {
return { success: false, error: 'Non autorisé' };
}
// Check edit permission
const canEdit = await weeklyCheckInService.canEditWeeklyCheckInSession(
sessionId,
authSession.user.id
);
if (!canEdit) {
return { success: false, error: 'Permission refusée' };
}
try {
await weeklyCheckInService.moveWeeklyCheckInItem(itemId, newCategory, newOrder);
// Emit event for real-time sync
await weeklyCheckInService.createWeeklyCheckInSessionEvent(
sessionId,
authSession.user.id,
'ITEM_MOVED',
{
itemId,
category: newCategory,
order: newOrder,
}
);
broadcastToWeeklyCheckInSession(sessionId, { type: 'ITEM_MOVED' });
revalidatePath(`/weekly-checkin/${sessionId}`);
return { success: true };
} catch (error) {
console.error('Error moving weekly check-in item:', error);
return { success: false, error: 'Erreur lors du déplacement' };
}
}
export async function reorderWeeklyCheckInItems(
sessionId: string,
category: WeeklyCheckInCategory,
itemIds: string[]
) {
const authSession = await auth();
if (!authSession?.user?.id) {
return { success: false, error: 'Non autorisé' };
}
// Check edit permission
const canEdit = await weeklyCheckInService.canEditWeeklyCheckInSession(
sessionId,
authSession.user.id
);
if (!canEdit) {
return { success: false, error: 'Permission refusée' };
}
try {
await weeklyCheckInService.reorderWeeklyCheckInItems(sessionId, category, itemIds);
// Emit event for real-time sync
await weeklyCheckInService.createWeeklyCheckInSessionEvent(
sessionId,
authSession.user.id,
'ITEMS_REORDERED',
{ category, itemIds }
);
broadcastToWeeklyCheckInSession(sessionId, { type: 'ITEMS_REORDERED' });
revalidatePath(`/weekly-checkin/${sessionId}`);
return { success: true };
} catch (error) {
console.error('Error reordering weekly check-in items:', error);
return { success: false, error: 'Erreur lors du réordonnancement' };
}
}
// ============================================
// Sharing Actions
// ============================================
export async function shareWeeklyCheckInSession(
sessionId: string,
targetEmail: string,
role: 'VIEWER' | 'EDITOR' = 'EDITOR'
) {
const authSession = await auth();
if (!authSession?.user?.id) {
return { success: false, error: 'Non autorisé' };
}
try {
const share = await weeklyCheckInService.shareWeeklyCheckInSession(
sessionId,
authSession.user.id,
targetEmail,
role
);
revalidatePath(`/weekly-checkin/${sessionId}`);
return { success: true, data: share };
} catch (error) {
console.error('Error sharing weekly check-in session:', error);
const message = error instanceof Error ? error.message : 'Erreur lors du partage';
return { success: false, error: message };
}
}
export async function removeWeeklyCheckInShare(sessionId: string, shareUserId: string) {
const authSession = await auth();
if (!authSession?.user?.id) {
return { success: false, error: 'Non autorisé' };
}
try {
await weeklyCheckInService.removeWeeklyCheckInShare(
sessionId,
authSession.user.id,
shareUserId
);
revalidatePath(`/weekly-checkin/${sessionId}`);
return { success: true };
} catch (error) {
console.error('Error removing weekly check-in share:', error);
return { success: false, error: 'Erreur lors de la suppression du partage' };
}
}

src/actions/year-review.ts (new file, 334 lines)

@@ -0,0 +1,334 @@
'use server';
import { revalidatePath, revalidateTag } from 'next/cache';
import { auth } from '@/lib/auth';
import * as yearReviewService from '@/services/year-review';
import { sessionsListTag } from '@/lib/cache-tags';
import { broadcastToYearReviewSession } from '@/app/api/year-review/[id]/subscribe/route';
import type { YearReviewCategory } from '@prisma/client';
// ============================================
// Session Actions
// ============================================
export async function createYearReviewSession(data: {
title: string;
participant: string;
year: number;
}) {
const session = await auth();
if (!session?.user?.id) {
return { success: false, error: 'Non autorisé' };
}
try {
const yearReviewSession = await yearReviewService.createYearReviewSession(
session.user.id,
data
);
try {
await yearReviewService.shareYearReviewSession(
yearReviewSession.id,
session.user.id,
data.participant,
'EDITOR'
);
} catch (shareError) {
console.error('Auto-share failed:', shareError);
}
revalidatePath('/year-review');
revalidatePath('/sessions');
revalidateTag(sessionsListTag(session.user.id), 'default');
return { success: true, data: yearReviewSession };
} catch (error) {
console.error('Error creating year review session:', error);
return { success: false, error: 'Erreur lors de la création' };
}
}
export async function updateYearReviewSession(
sessionId: string,
data: { title?: string; participant?: string; year?: number }
) {
const authSession = await auth();
if (!authSession?.user?.id) {
return { success: false, error: 'Non autorisé' };
}
try {
await yearReviewService.updateYearReviewSession(sessionId, authSession.user.id, data);
// Emit event for real-time sync
await yearReviewService.createYearReviewSessionEvent(
sessionId,
authSession.user.id,
'SESSION_UPDATED',
data
);
broadcastToYearReviewSession(sessionId, { type: 'SESSION_UPDATED' });
revalidatePath(`/year-review/${sessionId}`);
revalidatePath('/year-review');
revalidatePath('/sessions');
revalidateTag(sessionsListTag(authSession.user.id), 'default');
return { success: true };
} catch (error) {
console.error('Error updating year review session:', error);
return { success: false, error: 'Erreur lors de la mise à jour' };
}
}
export async function deleteYearReviewSession(sessionId: string) {
const authSession = await auth();
if (!authSession?.user?.id) {
return { success: false, error: 'Non autorisé' };
}
try {
await yearReviewService.deleteYearReviewSession(sessionId, authSession.user.id);
revalidatePath('/year-review');
revalidatePath('/sessions');
revalidateTag(sessionsListTag(authSession.user.id), 'default');
return { success: true };
} catch (error) {
console.error('Error deleting year review session:', error);
return { success: false, error: 'Erreur lors de la suppression' };
}
}
// ============================================
// Item Actions
// ============================================
export async function createYearReviewItem(
sessionId: string,
data: { content: string; category: YearReviewCategory }
) {
const authSession = await auth();
if (!authSession?.user?.id) {
return { success: false, error: 'Non autorisé' };
}
// Check edit permission
const canEdit = await yearReviewService.canEditYearReviewSession(sessionId, authSession.user.id);
if (!canEdit) {
return { success: false, error: 'Permission refusée' };
}
try {
const item = await yearReviewService.createYearReviewItem(sessionId, data);
// Emit event for real-time sync
await yearReviewService.createYearReviewSessionEvent(
sessionId,
authSession.user.id,
'ITEM_CREATED',
{
itemId: item.id,
content: item.content,
category: item.category,
}
);
broadcastToYearReviewSession(sessionId, { type: 'ITEM_CREATED' });
revalidatePath(`/year-review/${sessionId}`);
return { success: true, data: item };
} catch (error) {
console.error('Error creating year review item:', error);
return { success: false, error: 'Erreur lors de la création' };
}
}
export async function updateYearReviewItem(
itemId: string,
sessionId: string,
data: { content?: string; category?: YearReviewCategory }
) {
const authSession = await auth();
if (!authSession?.user?.id) {
return { success: false, error: 'Non autorisé' };
}
// Check edit permission
const canEdit = await yearReviewService.canEditYearReviewSession(sessionId, authSession.user.id);
if (!canEdit) {
return { success: false, error: 'Permission refusée' };
}
try {
const item = await yearReviewService.updateYearReviewItem(itemId, data);
// Emit event for real-time sync
await yearReviewService.createYearReviewSessionEvent(
sessionId,
authSession.user.id,
'ITEM_UPDATED',
{
itemId: item.id,
...data,
}
);
broadcastToYearReviewSession(sessionId, { type: 'ITEM_UPDATED' });
revalidatePath(`/year-review/${sessionId}`);
return { success: true, data: item };
} catch (error) {
console.error('Error updating year review item:', error);
return { success: false, error: 'Erreur lors de la mise à jour' };
}
}
export async function deleteYearReviewItem(itemId: string, sessionId: string) {
const authSession = await auth();
if (!authSession?.user?.id) {
return { success: false, error: 'Non autorisé' };
}
// Check edit permission
const canEdit = await yearReviewService.canEditYearReviewSession(sessionId, authSession.user.id);
if (!canEdit) {
return { success: false, error: 'Permission refusée' };
}
try {
await yearReviewService.deleteYearReviewItem(itemId);
// Emit event for real-time sync
await yearReviewService.createYearReviewSessionEvent(
sessionId,
authSession.user.id,
'ITEM_DELETED',
{ itemId }
);
broadcastToYearReviewSession(sessionId, { type: 'ITEM_DELETED' });
revalidatePath(`/year-review/${sessionId}`);
return { success: true };
} catch (error) {
console.error('Error deleting year review item:', error);
return { success: false, error: 'Erreur lors de la suppression' };
}
}
export async function moveYearReviewItem(
itemId: string,
sessionId: string,
newCategory: YearReviewCategory,
newOrder: number
) {
const authSession = await auth();
if (!authSession?.user?.id) {
return { success: false, error: 'Non autorisé' };
}
// Check edit permission
const canEdit = await yearReviewService.canEditYearReviewSession(sessionId, authSession.user.id);
if (!canEdit) {
return { success: false, error: 'Permission refusée' };
}
try {
await yearReviewService.moveYearReviewItem(itemId, newCategory, newOrder);
// Emit event for real-time sync
await yearReviewService.createYearReviewSessionEvent(
sessionId,
authSession.user.id,
'ITEM_MOVED',
{
itemId,
category: newCategory,
order: newOrder,
}
);
broadcastToYearReviewSession(sessionId, { type: 'ITEM_MOVED' });
revalidatePath(`/year-review/${sessionId}`);
return { success: true };
} catch (error) {
console.error('Error moving year review item:', error);
return { success: false, error: 'Erreur lors du déplacement' };
}
}
export async function reorderYearReviewItems(
sessionId: string,
category: YearReviewCategory,
itemIds: string[]
) {
const authSession = await auth();
if (!authSession?.user?.id) {
return { success: false, error: 'Non autorisé' };
}
// Check edit permission
const canEdit = await yearReviewService.canEditYearReviewSession(sessionId, authSession.user.id);
if (!canEdit) {
return { success: false, error: 'Permission refusée' };
}
try {
await yearReviewService.reorderYearReviewItems(sessionId, category, itemIds);
// Emit event for real-time sync
await yearReviewService.createYearReviewSessionEvent(
sessionId,
authSession.user.id,
'ITEMS_REORDERED',
{ category, itemIds }
);
broadcastToYearReviewSession(sessionId, { type: 'ITEMS_REORDERED' });
revalidatePath(`/year-review/${sessionId}`);
return { success: true };
} catch (error) {
console.error('Error reordering year review items:', error);
return { success: false, error: 'Erreur lors du réordonnancement' };
}
}
// ============================================
// Sharing Actions
// ============================================
export async function shareYearReviewSession(
sessionId: string,
targetEmail: string,
role: 'VIEWER' | 'EDITOR' = 'EDITOR'
) {
const authSession = await auth();
if (!authSession?.user?.id) {
return { success: false, error: 'Non autorisé' };
}
try {
const share = await yearReviewService.shareYearReviewSession(
sessionId,
authSession.user.id,
targetEmail,
role
);
revalidatePath(`/year-review/${sessionId}`);
return { success: true, data: share };
} catch (error) {
console.error('Error sharing year review session:', error);
const message = error instanceof Error ? error.message : 'Erreur lors du partage';
return { success: false, error: message };
}
}
export async function removeYearReviewShare(sessionId: string, shareUserId: string) {
const authSession = await auth();
if (!authSession?.user?.id) {
return { success: false, error: 'Non autorisé' };
}
try {
await yearReviewService.removeYearReviewShare(sessionId, authSession.user.id, shareUserId);
revalidatePath(`/year-review/${sessionId}`);
return { success: true };
} catch (error) {
console.error('Error removing year review share:', error);
return { success: false, error: 'Erreur lors de la suppression du partage' };
}
}


@@ -4,6 +4,7 @@ import { useState } from 'react';
 import { signIn } from 'next-auth/react';
 import { useRouter } from 'next/navigation';
 import Link from 'next/link';
+import { Button, Input, RocketIcon } from '@/components/ui';
 
 export default function LoginPage() {
   const router = useRouter();
@@ -44,8 +45,8 @@ export default function LoginPage() {
     <div className="w-full max-w-md">
       <div className="mb-8 text-center">
         <Link href="/" className="inline-flex items-center gap-2">
-          <span className="text-3xl">📊</span>
-          <span className="text-2xl font-bold text-foreground">SWOT Manager</span>
+          <RocketIcon className="h-8 w-8 shrink-0 text-primary" />
+          <span className="text-2xl font-bold text-foreground">Workshop Manager</span>
         </Link>
         <p className="mt-2 text-muted">Connectez-vous à votre compte</p>
       </div>
@@ -61,42 +62,32 @@ export default function LoginPage() {
       )}
       <div className="mb-4">
-        <label htmlFor="email" className="mb-2 block text-sm font-medium text-foreground">
-          Email
-        </label>
-        <input
+        <Input
           id="email"
           name="email"
           type="email"
+          label="Email"
           required
           autoComplete="email"
-          className="w-full rounded-lg border border-input-border bg-input px-4 py-2.5 text-foreground placeholder:text-muted-foreground focus:border-primary focus:outline-none focus:ring-2 focus:ring-primary/20"
           placeholder="vous@exemple.com"
         />
       </div>
       <div className="mb-6">
-        <label htmlFor="password" className="mb-2 block text-sm font-medium text-foreground">
-          Mot de passe
-        </label>
-        <input
+        <Input
           id="password"
           name="password"
           type="password"
+          label="Mot de passe"
           required
           autoComplete="current-password"
-          className="w-full rounded-lg border border-input-border bg-input px-4 py-2.5 text-foreground placeholder:text-muted-foreground focus:border-primary focus:outline-none focus:ring-2 focus:ring-primary/20"
           placeholder="••••••••"
         />
       </div>
-      <button
-        type="submit"
-        disabled={loading}
-        className="w-full rounded-lg bg-primary px-4 py-2.5 font-semibold text-primary-foreground transition-colors hover:bg-primary-hover disabled:cursor-not-allowed disabled:opacity-50"
-      >
+      <Button type="submit" disabled={loading} loading={loading} className="w-full">
         {loading ? 'Connexion...' : 'Se connecter'}
-      </button>
+      </Button>
       <p className="mt-6 text-center text-sm text-muted">
         Pas encore de compte ?{' '}


@@ -4,6 +4,7 @@ import { useState } from 'react';
 import { signIn } from 'next-auth/react';
 import { useRouter } from 'next/navigation';
 import Link from 'next/link';
+import { Button, Input, RocketIcon } from '@/components/ui';
 
 export default function RegisterPage() {
   const router = useRouter();
@@ -73,8 +74,8 @@ export default function RegisterPage() {
     <div className="w-full max-w-md">
       <div className="mb-8 text-center">
         <Link href="/" className="inline-flex items-center gap-2">
-          <span className="text-3xl">📊</span>
-          <span className="text-2xl font-bold text-foreground">SWOT Manager</span>
+          <RocketIcon className="h-8 w-8 shrink-0 text-primary" />
+          <span className="text-2xl font-bold text-foreground">Workshop Manager</span>
         </Link>
         <p className="mt-2 text-muted">Créez votre compte</p>
       </div>
@@ -90,74 +91,55 @@ export default function RegisterPage() {
       )}
       <div className="mb-4">
-        <label htmlFor="name" className="mb-2 block text-sm font-medium text-foreground">
-          Nom
-        </label>
-        <input
+        <Input
           id="name"
           name="name"
           type="text"
+          label="Nom"
           autoComplete="name"
-          className="w-full rounded-lg border border-input-border bg-input px-4 py-2.5 text-foreground placeholder:text-muted-foreground focus:border-primary focus:outline-none focus:ring-2 focus:ring-primary/20"
           placeholder="Jean Dupont"
         />
       </div>
       <div className="mb-4">
-        <label htmlFor="email" className="mb-2 block text-sm font-medium text-foreground">
-          Email
-        </label>
-        <input
+        <Input
           id="email"
           name="email"
           type="email"
+          label="Email"
           required
           autoComplete="email"
-          className="w-full rounded-lg border border-input-border bg-input px-4 py-2.5 text-foreground placeholder:text-muted-foreground focus:border-primary focus:outline-none focus:ring-2 focus:ring-primary/20"
           placeholder="vous@exemple.com"
         />
       </div>
       <div className="mb-4">
-        <label htmlFor="password" className="mb-2 block text-sm font-medium text-foreground">
-          Mot de passe
-        </label>
-        <input
+        <Input
           id="password"
           name="password"
           type="password"
+          label="Mot de passe"
           required
           autoComplete="new-password"
-          className="w-full rounded-lg border border-input-border bg-input px-4 py-2.5 text-foreground placeholder:text-muted-foreground focus:border-primary focus:outline-none focus:ring-2 focus:ring-primary/20"
           placeholder="••••••••"
         />
       </div>
       <div className="mb-6">
-        <label
-          htmlFor="confirmPassword"
-          className="mb-2 block text-sm font-medium text-foreground"
-        >
-          Confirmer le mot de passe
-        </label>
-        <input
+        <Input
           id="confirmPassword"
           name="confirmPassword"
           type="password"
+          label="Confirmer le mot de passe"
           required
           autoComplete="new-password"
-          className="w-full rounded-lg border border-input-border bg-input px-4 py-2.5 text-foreground placeholder:text-muted-foreground focus:border-primary focus:outline-none focus:ring-2 focus:ring-primary/20"
           placeholder="••••••••"
         />
       </div>
-      <button
-        type="submit"
-        disabled={loading}
-        className="w-full rounded-lg bg-primary px-4 py-2.5 font-semibold text-primary-foreground transition-colors hover:bg-primary-hover disabled:cursor-not-allowed disabled:opacity-50"
-      >
+      <Button type="submit" disabled={loading} loading={loading} className="w-full">
         {loading ? 'Création...' : 'Créer mon compte'}
-      </button>
+      </Button>
       <p className="mt-6 text-center text-sm text-muted">
         Déjà un compte ?{' '}


@@ -0,0 +1,65 @@
import { auth } from '@/lib/auth';
import { canAccessGifMoodSession, getGifMoodSessionEvents } from '@/services/gif-mood';
import { createBroadcaster } from '@/lib/broadcast';
export const dynamic = 'force-dynamic';
const { subscribe, broadcast } = createBroadcaster(getGifMoodSessionEvents, (event) => ({
type: event.type,
payload: JSON.parse(event.payload),
userId: event.userId,
user: event.user,
timestamp: event.createdAt,
}));
export { broadcast as broadcastToGifMoodSession };
export async function GET(request: Request, { params }: { params: Promise<{ id: string }> }) {
const { id: sessionId } = await params;
const session = await auth();
if (!session?.user?.id) {
return new Response('Unauthorized', { status: 401 });
}
const hasAccess = await canAccessGifMoodSession(sessionId, session.user.id);
if (!hasAccess) {
return new Response('Forbidden', { status: 403 });
}
const userId = session.user.id;
let unsubscribe: () => void = () => {};
let controller: ReadableStreamDefaultController;
const stream = new ReadableStream({
start(ctrl) {
controller = ctrl;
const encoder = new TextEncoder();
controller.enqueue(
encoder.encode(`data: ${JSON.stringify({ type: 'connected', userId })}\n\n`)
);
unsubscribe = subscribe(sessionId, userId, (event) => {
try {
controller.enqueue(encoder.encode(`data: ${JSON.stringify(event)}\n\n`));
} catch {
unsubscribe();
}
});
},
cancel() {
unsubscribe();
},
});
request.signal.addEventListener('abort', () => {
unsubscribe();
});
return new Response(stream, {
headers: {
'Content-Type': 'text/event-stream',
'Cache-Control': 'no-cache',
Connection: 'keep-alive',
},
});
}
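
This route imports `createBroadcaster` from `src/lib/broadcast.ts`, which is not itself shown in this diff. A minimal sketch of the factory as the commit message describes it (one shared polling interval per active session, started on the first subscriber and stopped on the last, with author filtering and immediate push from Server Actions) might look like the following; the internal names and the exact signature are assumptions, not the committed implementation:

```typescript
// Sketch only: shared-poll broadcaster, one interval per session.
type Listener = (event: object) => void;

interface BaseEvent {
  userId: string;
  createdAt: Date;
}

export function createBroadcaster<E extends BaseEvent>(
  fetchEvents: (sessionId: string, since: Date) => Promise<E[]>,
  format: (event: E) => object,
  pollMs = 1000
) {
  // Per-session state: subscriber set plus the ONE shared polling timer.
  const sessions = new Map<
    string,
    { listeners: Map<Listener, string>; timer: ReturnType<typeof setInterval>; since: Date }
  >();

  function subscribe(sessionId: string, userId: string, listener: Listener): () => void {
    let state = sessions.get(sessionId);
    if (!state) {
      // First subscriber for this session: start the shared poll loop.
      const timer = setInterval(async () => {
        const s = sessions.get(sessionId);
        if (!s) return;
        try {
          const events = await fetchEvents(sessionId, s.since);
          for (const event of events) {
            s.since = event.createdAt;
            for (const [fn, uid] of s.listeners) {
              // Don't echo an event back to the user who created it.
              if (uid !== event.userId) fn(format(event));
            }
          }
        } catch {
          // Transient DB error: keep the loop alive and retry next tick.
        }
      }, pollMs);
      state = { listeners: new Map(), timer, since: new Date() };
      sessions.set(sessionId, state);
    }
    state.listeners.set(listener, userId);
    return () => {
      const s = sessions.get(sessionId);
      if (!s) return;
      s.listeners.delete(listener);
      if (s.listeners.size === 0) {
        // Last subscriber gone: stop polling for this session.
        clearInterval(s.timer);
        sessions.delete(sessionId);
      }
    };
  }

  // Immediate push after a local mutation (called from Server Actions),
  // without waiting for the next poll tick.
  function broadcast(sessionId: string, event: object) {
    const s = sessions.get(sessionId);
    if (!s) return;
    for (const fn of s.listeners.keys()) fn(event);
  }

  return { subscribe, broadcast };
}
```

The key property, versus the per-connection `setInterval` the old routes used, is that N open SSE connections on one session cost one database poll per tick instead of N.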


@@ -1,10 +1,18 @@
 import { auth } from '@/lib/auth';
 import { canAccessMotivatorSession, getMotivatorSessionEvents } from '@/services/moving-motivators';
+import { createBroadcaster } from '@/lib/broadcast';
 
 export const dynamic = 'force-dynamic';
 
-// Store active connections per session
-const connections = new Map<string, Set<ReadableStreamDefaultController>>();
+const { subscribe, broadcast } = createBroadcaster(getMotivatorSessionEvents, (event) => ({
+  type: event.type,
+  payload: JSON.parse(event.payload),
+  userId: event.userId,
+  user: event.user,
+  timestamp: event.createdAt,
+}));
+
+export { broadcast as broadcastToMotivatorSession };
 
 export async function GET(request: Request, { params }: { params: Promise<{ id: string }> }) {
   const { id: sessionId } = await params;
@@ -14,74 +22,37 @@ export async function GET(request: Request, { params }: { params: Promise<{ id:
     return new Response('Unauthorized', { status: 401 });
   }
 
-  // Check access
   const hasAccess = await canAccessMotivatorSession(sessionId, session.user.id);
   if (!hasAccess) {
     return new Response('Forbidden', { status: 403 });
   }
 
   const userId = session.user.id;
-  let lastEventTime = new Date();
+  let unsubscribe: () => void = () => {};
   let controller: ReadableStreamDefaultController;
 
   const stream = new ReadableStream({
     start(ctrl) {
       controller = ctrl;
-      // Register connection
-      if (!connections.has(sessionId)) {
-        connections.set(sessionId, new Set());
-      }
-      connections.get(sessionId)!.add(controller);
-      // Send initial ping
       const encoder = new TextEncoder();
       controller.enqueue(
         encoder.encode(`data: ${JSON.stringify({ type: 'connected', userId })}\n\n`)
       );
+      unsubscribe = subscribe(sessionId, userId, (event) => {
+        try {
+          controller.enqueue(encoder.encode(`data: ${JSON.stringify(event)}\n\n`));
+        } catch {
+          unsubscribe();
+        }
+      });
     },
     cancel() {
-      // Remove connection on close
-      connections.get(sessionId)?.delete(controller);
-      if (connections.get(sessionId)?.size === 0) {
-        connections.delete(sessionId);
-      }
+      unsubscribe();
     },
   });
 
-  // Poll for new events (simple approach, works with any DB)
-  const pollInterval = setInterval(async () => {
-    try {
-      const events = await getMotivatorSessionEvents(sessionId, lastEventTime);
-      if (events.length > 0) {
-        const encoder = new TextEncoder();
-        for (const event of events) {
-          // Don't send events to the user who created them
-          if (event.userId !== userId) {
-            controller.enqueue(
-              encoder.encode(
-                `data: ${JSON.stringify({
-                  type: event.type,
-                  payload: JSON.parse(event.payload),
-                  userId: event.userId,
-                  user: event.user,
-                  timestamp: event.createdAt,
-                })}\n\n`
-              )
-            );
-          }
-          lastEventTime = event.createdAt;
-        }
-      }
-    } catch {
-      // Connection might be closed
-      clearInterval(pollInterval);
-    }
-  }, 1000); // Poll every second
-
-  // Cleanup on abort
   request.signal.addEventListener('abort', () => {
-    clearInterval(pollInterval);
+    unsubscribe();
   });
 
   return new Response(stream, {
@@ -92,20 +63,3 @@ export async function GET(request: Request, { params }: { params: Promise<{ id:
     },
   });
 }
-
-// Helper to broadcast to all connections (called from actions)
-export function broadcastToMotivatorSession(sessionId: string, event: object) {
-  const sessionConnections = connections.get(sessionId);
-  if (!sessionConnections) return;
-  const encoder = new TextEncoder();
-  const message = encoder.encode(`data: ${JSON.stringify(event)}\n\n`);
-  for (const controller of sessionConnections) {
-    try {
-      controller.enqueue(message);
-    } catch {
-      // Connection closed, will be cleaned up
-    }
-  }
-}


@@ -0,0 +1,61 @@
import { NextResponse } from 'next/server';
import { auth } from '@/lib/auth';
import { getOKR, updateKeyResult } from '@/services/okrs';
import { isTeamMember, isTeamAdmin } from '@/services/teams';
export async function PATCH(
request: Request,
{ params }: { params: Promise<{ id: string; krId: string }> }
) {
try {
const { id, krId } = await params;
const session = await auth();
if (!session?.user?.id) {
return NextResponse.json({ error: 'Non autorisé' }, { status: 401 });
}
// Get OKR to check permissions
const okr = await getOKR(id);
if (!okr) {
return NextResponse.json({ error: 'OKR non trouvé' }, { status: 404 });
}
// Check if user is a member of the team
const isMember = await isTeamMember(okr.teamMember.team.id, session.user.id);
if (!isMember) {
return NextResponse.json({ error: 'Accès refusé' }, { status: 403 });
}
// Check if user is admin or the concerned member
const isAdmin = await isTeamAdmin(okr.teamMember.team.id, session.user.id);
const isConcernedMember = okr.teamMember.userId === session.user.id;
if (!isAdmin && !isConcernedMember) {
return NextResponse.json(
{
error:
'Seuls les administrateurs et le membre concerné peuvent mettre à jour les Key Results',
},
{ status: 403 }
);
}
const body = await request.json();
const { currentValue, notes } = body;
if (currentValue === undefined) {
return NextResponse.json({ error: 'Valeur actuelle requise' }, { status: 400 });
}
const updated = await updateKeyResult(krId, Number(currentValue), notes || null);
return NextResponse.json(updated);
} catch (error) {
console.error('Error updating key result:', error);
const errorMessage =
error instanceof Error ? error.message : 'Erreur lors de la mise à jour du Key Result';
return NextResponse.json({ error: errorMessage }, { status: 500 });
}
}


@@ -0,0 +1,149 @@
import { NextResponse } from 'next/server';
import { auth } from '@/lib/auth';
import { getOKR, updateOKR, deleteOKR } from '@/services/okrs';
import { isTeamMember, isTeamAdmin } from '@/services/teams';
import type { UpdateOKRInput } from '@/lib/types';
export async function GET(request: Request, { params }: { params: Promise<{ id: string }> }) {
try {
const { id } = await params;
const session = await auth();
if (!session?.user?.id) {
return NextResponse.json({ error: 'Non autorisé' }, { status: 401 });
}
const okr = await getOKR(id);
if (!okr) {
return NextResponse.json({ error: 'OKR non trouvé' }, { status: 404 });
}
// Check if user is a member of the team
const isMember = await isTeamMember(okr.teamMember.team.id, session.user.id);
if (!isMember) {
return NextResponse.json({ error: 'Accès refusé' }, { status: 403 });
}
// Check permissions
const isAdmin = await isTeamAdmin(okr.teamMember.team.id, session.user.id);
const isConcernedMember = okr.teamMember.userId === session.user.id;
return NextResponse.json({
...okr,
permissions: {
isAdmin,
isConcernedMember,
canEdit: isAdmin || isConcernedMember,
canDelete: isAdmin,
},
});
} catch (error) {
console.error('Error fetching OKR:', error);
return NextResponse.json({ error: "Erreur lors de la récupération de l'OKR" }, { status: 500 });
}
}
export async function PATCH(request: Request, { params }: { params: Promise<{ id: string }> }) {
try {
const { id } = await params;
const session = await auth();
if (!session?.user?.id) {
return NextResponse.json({ error: 'Non autorisé' }, { status: 401 });
}
const okr = await getOKR(id);
if (!okr) {
return NextResponse.json({ error: 'OKR non trouvé' }, { status: 404 });
}
// Check if user is admin of the team or the concerned member
const isAdmin = await isTeamAdmin(okr.teamMember.team.id, session.user.id);
const isConcernedMember = okr.teamMember.userId === session.user.id;
if (!isAdmin && !isConcernedMember) {
return NextResponse.json(
{ error: 'Seuls les administrateurs et le membre concerné peuvent modifier les OKRs' },
{ status: 403 }
);
}
const body: UpdateOKRInput & {
startDate?: string;
endDate?: string;
keyResultsUpdates?: {
create?: Array<{ title: string; targetValue: number; unit: string; order: number }>;
update?: Array<{
id: string;
title?: string;
targetValue?: number;
unit?: string;
order?: number;
}>;
delete?: string[];
};
} = await request.json();
// Strip keyResultsUpdates (not part of UpdateOKRInput), then convert any date strings below
const { keyResultsUpdates, ...okrUpdateData } = body;
const finalUpdateData: UpdateOKRInput = { ...okrUpdateData };
if (finalUpdateData.startDate && typeof finalUpdateData.startDate === 'string') {
finalUpdateData.startDate = new Date(finalUpdateData.startDate);
}
if (finalUpdateData.endDate && typeof finalUpdateData.endDate === 'string') {
finalUpdateData.endDate = new Date(finalUpdateData.endDate);
}
const updated = await updateOKR(id, finalUpdateData, keyResultsUpdates);
return NextResponse.json(updated);
} catch (error) {
console.error('Error updating OKR:', error);
const errorMessage =
error instanceof Error ? error.message : "Erreur lors de la mise à jour de l'OKR";
return NextResponse.json({ error: errorMessage }, { status: 500 });
}
}
export async function DELETE(request: Request, { params }: { params: Promise<{ id: string }> }) {
try {
const { id } = await params;
const session = await auth();
if (!session?.user?.id) {
return NextResponse.json({ error: 'Non autorisé' }, { status: 401 });
}
const okr = await getOKR(id);
if (!okr) {
return NextResponse.json({ error: 'OKR non trouvé' }, { status: 404 });
}
// Check if user is admin of the team
const isAdmin = await isTeamAdmin(okr.teamMember.team.id, session.user.id);
if (!isAdmin) {
return NextResponse.json(
{ error: 'Seuls les administrateurs peuvent supprimer les OKRs' },
{ status: 403 }
);
}
await deleteOKR(id);
return NextResponse.json({ success: true });
} catch (error) {
console.error('Error deleting OKR:', error);
const errorMessage =
error instanceof Error ? error.message : "Erreur lors de la suppression de l'OKR";
return NextResponse.json({ error: errorMessage }, { status: 500 });
}
}

src/app/api/okrs/route.ts (new file)

@@ -0,0 +1,74 @@
import { NextResponse } from 'next/server';
import { auth } from '@/lib/auth';
import { createOKR } from '@/services/okrs';
import { getTeamMemberById, isTeamAdmin } from '@/services/teams';
import type { CreateOKRInput, CreateKeyResultInput } from '@/lib/types';
export async function POST(request: Request) {
try {
const session = await auth();
if (!session?.user?.id) {
return NextResponse.json({ error: 'Non autorisé' }, { status: 401 });
}
const body = await request.json();
const { teamMemberId, objective, description, period, startDate, endDate, keyResults } =
body as CreateOKRInput & {
startDate: string | Date;
endDate: string | Date;
};
if (!teamMemberId || !objective || !period || !startDate || !endDate || !keyResults) {
return NextResponse.json({ error: 'Champs requis manquants' }, { status: 400 });
}
// Get team member to check permissions
const teamMember = await getTeamMemberById(teamMemberId);
if (!teamMember) {
return NextResponse.json({ error: "Membre de l'équipe non trouvé" }, { status: 404 });
}
// Check if user is admin of the team
const isAdmin = await isTeamAdmin(teamMember.team.id, session.user.id);
if (!isAdmin) {
return NextResponse.json(
{ error: 'Seuls les administrateurs peuvent créer des OKRs' },
{ status: 403 }
);
}
// Convert dates to Date objects if they are strings
const startDateObj = startDate instanceof Date ? startDate : new Date(startDate);
const endDateObj = endDate instanceof Date ? endDate : new Date(endDate);
// Validate dates
if (isNaN(startDateObj.getTime()) || isNaN(endDateObj.getTime())) {
return NextResponse.json({ error: 'Dates invalides' }, { status: 400 });
}
// Ensure all key results have a unit and order
const keyResultsWithUnit = keyResults.map((kr: CreateKeyResultInput, index: number) => ({
...kr,
unit: kr.unit || '%',
order: kr.order !== undefined ? kr.order : index,
}));
const okr = await createOKR(
teamMemberId,
objective,
description || null,
period,
startDateObj,
endDateObj,
keyResultsWithUnit
);
return NextResponse.json(okr, { status: 201 });
} catch (error) {
console.error('Error creating OKR:', error);
const errorMessage =
error instanceof Error ? error.message : "Erreur lors de la création de l'OKR";
return NextResponse.json({ error: errorMessage }, { status: 500 });
}
}


@@ -1,10 +1,18 @@
 import { auth } from '@/lib/auth';
 import { canAccessSession, getSessionEvents } from '@/services/sessions';
+import { createBroadcaster } from '@/lib/broadcast';
 
 export const dynamic = 'force-dynamic';
 
-// Store active connections per session
-const connections = new Map<string, Set<ReadableStreamDefaultController>>();
+const { subscribe, broadcast } = createBroadcaster(getSessionEvents, (event) => ({
+  type: event.type,
+  payload: JSON.parse(event.payload),
+  userId: event.userId,
+  user: event.user,
+  timestamp: event.createdAt,
+}));
+
+export { broadcast as broadcastToSession };
 
 export async function GET(request: Request, { params }: { params: Promise<{ id: string }> }) {
   const { id: sessionId } = await params;
@@ -14,74 +22,37 @@ export async function GET(request: Request, { params }: { params: Promise<{ id:
     return new Response('Unauthorized', { status: 401 });
   }
 
-  // Check access
   const hasAccess = await canAccessSession(sessionId, session.user.id);
   if (!hasAccess) {
     return new Response('Forbidden', { status: 403 });
   }
 
   const userId = session.user.id;
-  let lastEventTime = new Date();
+  let unsubscribe: () => void = () => {};
   let controller: ReadableStreamDefaultController;
 
   const stream = new ReadableStream({
     start(ctrl) {
       controller = ctrl;
-      // Register connection
-      if (!connections.has(sessionId)) {
-        connections.set(sessionId, new Set());
-      }
-      connections.get(sessionId)!.add(controller);
-      // Send initial ping
       const encoder = new TextEncoder();
       controller.enqueue(
         encoder.encode(`data: ${JSON.stringify({ type: 'connected', userId })}\n\n`)
       );
+      unsubscribe = subscribe(sessionId, userId, (event) => {
+        try {
+          controller.enqueue(encoder.encode(`data: ${JSON.stringify(event)}\n\n`));
+        } catch {
+          unsubscribe();
+        }
+      });
     },
     cancel() {
-      // Remove connection on close
-      connections.get(sessionId)?.delete(controller);
-      if (connections.get(sessionId)?.size === 0) {
-        connections.delete(sessionId);
-      }
+      unsubscribe();
     },
   });
 
-  // Poll for new events (simple approach, works with any DB)
-  const pollInterval = setInterval(async () => {
-    try {
-      const events = await getSessionEvents(sessionId, lastEventTime);
-      if (events.length > 0) {
-        const encoder = new TextEncoder();
-        for (const event of events) {
-          // Don't send events to the user who created them
-          if (event.userId !== userId) {
-            controller.enqueue(
-              encoder.encode(
-                `data: ${JSON.stringify({
-                  type: event.type,
-                  payload: JSON.parse(event.payload),
-                  userId: event.userId, // Include userId for client-side filtering
-                  user: event.user,
-                  timestamp: event.createdAt,
-                })}\n\n`
-              )
-            );
-          }
-          lastEventTime = event.createdAt;
-        }
-      }
-    } catch {
-      // Connection might be closed
-      clearInterval(pollInterval);
-    }
-  }, 1000); // Poll every second
-
-  // Cleanup on abort
   request.signal.addEventListener('abort', () => {
-    clearInterval(pollInterval);
+    unsubscribe();
   });
 
   return new Response(stream, {
@@ -92,20 +63,3 @@ export async function GET(request: Request, { params }: { params: Promise<{ id:
     },
   });
 }
-
-// Helper to broadcast to all connections (called from actions)
-export function broadcastToSession(sessionId: string, event: object) {
-  const sessionConnections = connections.get(sessionId);
-  if (!sessionConnections) return;
-  const encoder = new TextEncoder();
-  const message = encoder.encode(`data: ${JSON.stringify(event)}\n\n`);
-  for (const controller of sessionConnections) {
-    try {
-      controller.enqueue(message);
-    } catch {
-      // Connection closed, will be cleaned up
-    }
-  }
-}
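
Every frame these routes enqueue has the shape `data: <json>\n\n`. In the browser an `EventSource` parses this natively, but for tests or fetch-based consumers it can be useful to split a raw buffer back into events; the helper below is an illustration only and is not part of the diff:

```typescript
// Illustration: parse a buffered chunk of SSE `data: <json>\n\n` frames
// (the exact format the routes above emit) back into event objects.
function parseSseChunk(chunk: string): object[] {
  return chunk
    .split('\n\n')                                    // one frame per blank-line separator
    .filter((frame) => frame.startsWith('data: '))    // keep only data frames
    .map((frame) => JSON.parse(frame.slice('data: '.length)));
}
```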


@@ -1,6 +1,9 @@
import { NextResponse } from 'next/server';
import { revalidateTag } from 'next/cache';
import { auth } from '@/lib/auth';
import { prisma } from '@/services/database';
import { shareSession } from '@/services/sessions';
import { sessionsListTag } from '@/lib/cache-tags';
export async function GET() {
try {
@@ -56,6 +59,13 @@ export async function POST(request: Request) {
},
});
try {
await shareSession(newSession.id, session.user.id, collaborator, 'EDITOR');
} catch (shareError) {
console.error('Auto-share failed:', shareError);
}
revalidateTag(sessionsListTag(session.user.id), 'default');
return NextResponse.json(newSession, { status: 201 });
} catch (error) {
console.error('Error creating session:', error);
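The `src/lib/cache-tags.ts` module imported here is not included in this compare view. Per the commit message it exposes `sessionTag`, `sessionsListTag`, and `userStatsTag` helpers; a plausible minimal shape is sketched below — the exact tag string format is an assumption:

```typescript
// Hypothetical src/lib/cache-tags.ts — tag string formats are assumptions.
export const sessionTag = (sessionId: string) => `session:${sessionId}`;
export const sessionsListTag = (userId: string) => `sessions-list:${userId}`;
export const userStatsTag = (userId: string) => `user-stats:${userId}`;
```

Keeping the tag construction in one module means the `unstable_cache` wrappers and the `revalidateTag` calls in Server Actions and routes can never drift apart on the tag string.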


@@ -0,0 +1,108 @@
import { NextResponse } from 'next/server';
import { auth } from '@/lib/auth';
import { addTeamMember, removeTeamMember, updateMemberRole, isTeamAdmin } from '@/services/teams';
import type { AddTeamMemberInput, UpdateMemberRoleInput } from '@/lib/types';
export async function POST(request: Request, { params }: { params: Promise<{ id: string }> }) {
try {
const { id } = await params;
const session = await auth();
if (!session?.user?.id) {
return NextResponse.json({ error: 'Non autorisé' }, { status: 401 });
}
// Check if user is admin
const isAdmin = await isTeamAdmin(id, session.user.id);
if (!isAdmin) {
return NextResponse.json(
{ error: 'Seuls les administrateurs peuvent ajouter des membres' },
{ status: 403 }
);
}
const body: AddTeamMemberInput = await request.json();
const { userId, role } = body;
if (!userId) {
return NextResponse.json({ error: 'ID utilisateur requis' }, { status: 400 });
}
const member = await addTeamMember(id, userId, role || 'MEMBER');
return NextResponse.json(member, { status: 201 });
} catch (error) {
console.error('Error adding team member:', error);
const errorMessage =
error instanceof Error ? error.message : "Erreur lors de l'ajout du membre";
return NextResponse.json({ error: errorMessage }, { status: 500 });
}
}
export async function PATCH(request: Request, { params }: { params: Promise<{ id: string }> }) {
try {
const { id } = await params;
const session = await auth();
if (!session?.user?.id) {
return NextResponse.json({ error: 'Non autorisé' }, { status: 401 });
}
// Check if user is admin
const isAdmin = await isTeamAdmin(id, session.user.id);
if (!isAdmin) {
return NextResponse.json(
{ error: 'Seuls les administrateurs peuvent modifier les rôles' },
{ status: 403 }
);
}
const body: UpdateMemberRoleInput & { userId: string } = await request.json();
const { userId, role } = body;
if (!userId || !role) {
return NextResponse.json({ error: 'ID utilisateur et rôle requis' }, { status: 400 });
}
const member = await updateMemberRole(id, userId, role);
return NextResponse.json(member);
} catch (error) {
console.error('Error updating member role:', error);
return NextResponse.json({ error: 'Erreur lors de la mise à jour du rôle' }, { status: 500 });
}
}
export async function DELETE(request: Request, { params }: { params: Promise<{ id: string }> }) {
try {
const { id } = await params;
const session = await auth();
if (!session?.user?.id) {
return NextResponse.json({ error: 'Non autorisé' }, { status: 401 });
}
// Check if user is admin
const isAdmin = await isTeamAdmin(id, session.user.id);
if (!isAdmin) {
return NextResponse.json(
{ error: 'Seuls les administrateurs peuvent retirer des membres' },
{ status: 403 }
);
}
const { searchParams } = new URL(request.url);
const userId = searchParams.get('userId');
if (!userId) {
return NextResponse.json({ error: 'ID utilisateur requis' }, { status: 400 });
}
await removeTeamMember(id, userId);
return NextResponse.json({ success: true });
} catch (error) {
console.error('Error removing team member:', error);
return NextResponse.json({ error: 'Erreur lors de la suppression du membre' }, { status: 500 });
}
}


@@ -0,0 +1,96 @@
import { NextResponse } from 'next/server';
import { auth } from '@/lib/auth';
import { getTeam, updateTeam, deleteTeam, isTeamAdmin, isTeamMember } from '@/services/teams';
import type { UpdateTeamInput } from '@/lib/types';
export async function GET(request: Request, { params }: { params: Promise<{ id: string }> }) {
try {
const { id } = await params;
const session = await auth();
if (!session?.user?.id) {
return NextResponse.json({ error: 'Non autorisé' }, { status: 401 });
}
const team = await getTeam(id);
if (!team) {
return NextResponse.json({ error: 'Équipe non trouvée' }, { status: 404 });
}
// Check if user is a member
const isMember = await isTeamMember(id, session.user.id);
if (!isMember) {
return NextResponse.json({ error: 'Accès refusé' }, { status: 403 });
}
return NextResponse.json(team);
} catch (error) {
console.error('Error fetching team:', error);
return NextResponse.json(
{ error: "Erreur lors de la récupération de l'équipe" },
{ status: 500 }
);
}
}
export async function PATCH(request: Request, { params }: { params: Promise<{ id: string }> }) {
try {
const { id } = await params;
const session = await auth();
if (!session?.user?.id) {
return NextResponse.json({ error: 'Non autorisé' }, { status: 401 });
}
// Check if user is admin
const isAdmin = await isTeamAdmin(id, session.user.id);
if (!isAdmin) {
return NextResponse.json(
{ error: "Seuls les administrateurs peuvent modifier l'équipe" },
{ status: 403 }
);
}
const body: UpdateTeamInput = await request.json();
const team = await updateTeam(id, body);
return NextResponse.json(team);
} catch (error) {
console.error('Error updating team:', error);
return NextResponse.json(
{ error: "Erreur lors de la mise à jour de l'équipe" },
{ status: 500 }
);
}
}
export async function DELETE(request: Request, { params }: { params: Promise<{ id: string }> }) {
try {
const { id } = await params;
const session = await auth();
if (!session?.user?.id) {
return NextResponse.json({ error: 'Non autorisé' }, { status: 401 });
}
// Check if user is admin
const isAdmin = await isTeamAdmin(id, session.user.id);
if (!isAdmin) {
return NextResponse.json(
{ error: "Seuls les administrateurs peuvent supprimer l'équipe" },
{ status: 403 }
);
}
await deleteTeam(id);
return NextResponse.json({ success: true });
} catch (error) {
console.error('Error deleting team:', error);
return NextResponse.json(
{ error: "Erreur lors de la suppression de l'équipe" },
{ status: 500 }
);
}
}


@@ -0,0 +1,31 @@
import { NextResponse } from 'next/server';
import { auth } from '@/lib/auth';
import { getUserTeams } from '@/services/teams';
import { getTeamMembersForShare } from '@/lib/share-utils';
export async function GET() {
try {
const session = await auth();
if (!session?.user?.id) {
return NextResponse.json({ error: 'Non autorisé' }, { status: 401 });
}
const teams = await getUserTeams(session.user.id);
const otherMembers = getTeamMembersForShare(teams, session.user.id);
const currentUser = {
id: session.user.id,
email: session.user.email ?? '',
name: session.user.name ?? null,
};
const members = [currentUser, ...otherMembers];
return NextResponse.json({ members });
} catch (error) {
console.error('Error fetching team members:', error);
return NextResponse.json(
{ error: 'Erreur lors de la récupération des membres' },
{ status: 500 }
);
}
}


@@ -0,0 +1,48 @@
import { NextResponse } from 'next/server';
import { auth } from '@/lib/auth';
import { getUserTeams, createTeam } from '@/services/teams';
import type { CreateTeamInput } from '@/lib/types';
export async function GET() {
try {
const session = await auth();
if (!session?.user?.id) {
return NextResponse.json({ error: 'Non autorisé' }, { status: 401 });
}
const teams = await getUserTeams(session.user.id);
return NextResponse.json(teams);
} catch (error) {
console.error('Error fetching teams:', error);
return NextResponse.json(
{ error: 'Erreur lors de la récupération des équipes' },
{ status: 500 }
);
}
}
export async function POST(request: Request) {
try {
const session = await auth();
if (!session?.user?.id) {
return NextResponse.json({ error: 'Non autorisé' }, { status: 401 });
}
const body: CreateTeamInput = await request.json();
const { name, description } = body;
if (!name) {
return NextResponse.json({ error: "Le nom de l'équipe est requis" }, { status: 400 });
}
const team = await createTeam(name, description || null, session.user.id);
return NextResponse.json(team, { status: 201 });
} catch (error) {
console.error('Error creating team:', error);
return NextResponse.json({ error: "Erreur lors de la création de l'équipe" }, { status: 500 });
}
}


@@ -0,0 +1,32 @@
import { NextResponse } from 'next/server';
import { auth } from '@/lib/auth';
import { prisma } from '@/services/database';
export async function GET() {
try {
const session = await auth();
if (!session?.user?.id) {
return NextResponse.json({ error: 'Non autorisé' }, { status: 401 });
}
const users = await prisma.user.findMany({
select: {
id: true,
email: true,
name: true,
},
orderBy: {
createdAt: 'desc',
},
});
return NextResponse.json(users);
} catch (error) {
console.error('Error fetching users:', error);
return NextResponse.json(
{ error: 'Erreur lors de la récupération des utilisateurs' },
{ status: 500 }
);
}
}


@@ -0,0 +1,65 @@
import { auth } from '@/lib/auth';
import { canAccessWeatherSession, getWeatherSessionEvents } from '@/services/weather';
import { createBroadcaster } from '@/lib/broadcast';
export const dynamic = 'force-dynamic';
const { subscribe, broadcast } = createBroadcaster(getWeatherSessionEvents, (event) => ({
type: event.type,
payload: JSON.parse(event.payload),
userId: event.userId,
user: event.user,
timestamp: event.createdAt,
}));
export { broadcast as broadcastToWeatherSession };
export async function GET(request: Request, { params }: { params: Promise<{ id: string }> }) {
const { id: sessionId } = await params;
const session = await auth();
if (!session?.user?.id) {
return new Response('Unauthorized', { status: 401 });
}
const hasAccess = await canAccessWeatherSession(sessionId, session.user.id);
if (!hasAccess) {
return new Response('Forbidden', { status: 403 });
}
const userId = session.user.id;
let unsubscribe: () => void = () => {};
let controller: ReadableStreamDefaultController;
const stream = new ReadableStream({
start(ctrl) {
controller = ctrl;
const encoder = new TextEncoder();
controller.enqueue(
encoder.encode(`data: ${JSON.stringify({ type: 'connected', userId })}\n\n`)
);
unsubscribe = subscribe(sessionId, userId, (event) => {
try {
controller.enqueue(encoder.encode(`data: ${JSON.stringify(event)}\n\n`));
} catch {
unsubscribe();
}
});
},
cancel() {
unsubscribe();
},
});
request.signal.addEventListener('abort', () => {
unsubscribe();
});
return new Response(stream, {
headers: {
'Content-Type': 'text/event-stream',
'Cache-Control': 'no-cache',
Connection: 'keep-alive',
},
});
}
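These routes emit standard Server-Sent Events frames of the form `data: <json>\n\n` (including the initial `{ type: 'connected', userId }` frame). For reference, a client or unit test can decode such frames with a small parser like the one below — this helper is illustrative and not part of the diff:

```typescript
// Minimal decoder for the `data: <json>\n\n` SSE frames these routes emit.
export function parseSseFrames(chunk: string): unknown[] {
  return chunk
    .split('\n\n')                              // one frame per blank-line separator
    .flatMap((frame) => frame.split('\n'))      // individual field lines
    .filter((line) => line.startsWith('data: ')) // keep only data fields
    .map((line) => JSON.parse(line.slice('data: '.length)));
}
```

In the browser, `EventSource` does this parsing for you; a parser like this is mainly useful when asserting on the raw stream in tests.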


@@ -0,0 +1,68 @@
import { auth } from '@/lib/auth';
import {
canAccessWeeklyCheckInSession,
getWeeklyCheckInSessionEvents,
} from '@/services/weekly-checkin';
import { createBroadcaster } from '@/lib/broadcast';
export const dynamic = 'force-dynamic';
const { subscribe, broadcast } = createBroadcaster(getWeeklyCheckInSessionEvents, (event) => ({
type: event.type,
payload: JSON.parse(event.payload),
userId: event.userId,
user: event.user,
timestamp: event.createdAt,
}));
export { broadcast as broadcastToWeeklyCheckInSession };
export async function GET(request: Request, { params }: { params: Promise<{ id: string }> }) {
const { id: sessionId } = await params;
const session = await auth();
if (!session?.user?.id) {
return new Response('Unauthorized', { status: 401 });
}
const hasAccess = await canAccessWeeklyCheckInSession(sessionId, session.user.id);
if (!hasAccess) {
return new Response('Forbidden', { status: 403 });
}
const userId = session.user.id;
let unsubscribe: () => void = () => {};
let controller: ReadableStreamDefaultController;
const stream = new ReadableStream({
start(ctrl) {
controller = ctrl;
const encoder = new TextEncoder();
controller.enqueue(
encoder.encode(`data: ${JSON.stringify({ type: 'connected', userId })}\n\n`)
);
unsubscribe = subscribe(sessionId, userId, (event) => {
try {
controller.enqueue(encoder.encode(`data: ${JSON.stringify(event)}\n\n`));
} catch {
unsubscribe();
}
});
},
cancel() {
unsubscribe();
},
});
request.signal.addEventListener('abort', () => {
unsubscribe();
});
return new Response(stream, {
headers: {
'Content-Type': 'text/event-stream',
'Cache-Control': 'no-cache',
Connection: 'keep-alive',
},
});
}


@@ -0,0 +1,65 @@
import { auth } from '@/lib/auth';
import { canAccessYearReviewSession, getYearReviewSessionEvents } from '@/services/year-review';
import { createBroadcaster } from '@/lib/broadcast';
export const dynamic = 'force-dynamic';
const { subscribe, broadcast } = createBroadcaster(getYearReviewSessionEvents, (event) => ({
type: event.type,
payload: JSON.parse(event.payload),
userId: event.userId,
user: event.user,
timestamp: event.createdAt,
}));
export { broadcast as broadcastToYearReviewSession };
export async function GET(request: Request, { params }: { params: Promise<{ id: string }> }) {
const { id: sessionId } = await params;
const session = await auth();
if (!session?.user?.id) {
return new Response('Unauthorized', { status: 401 });
}
const hasAccess = await canAccessYearReviewSession(sessionId, session.user.id);
if (!hasAccess) {
return new Response('Forbidden', { status: 403 });
}
const userId = session.user.id;
let unsubscribe: () => void = () => {};
let controller: ReadableStreamDefaultController;
const stream = new ReadableStream({
start(ctrl) {
controller = ctrl;
const encoder = new TextEncoder();
controller.enqueue(
encoder.encode(`data: ${JSON.stringify({ type: 'connected', userId })}\n\n`)
);
unsubscribe = subscribe(sessionId, userId, (event) => {
try {
controller.enqueue(encoder.encode(`data: ${JSON.stringify(event)}\n\n`));
} catch {
unsubscribe();
}
});
},
cancel() {
unsubscribe();
},
});
request.signal.addEventListener('abort', () => {
unsubscribe();
});
return new Response(stream, {
headers: {
'Content-Type': 'text/event-stream',
'Cache-Control': 'no-cache',
Connection: 'keep-alive',
},
});
}


@@ -0,0 +1,525 @@
'use client';
import { useState } from 'react';
import {
Avatar,
Badge,
Button,
Card,
CardContent,
CardDescription,
CardFooter,
CardHeader,
CardTitle,
CollaboratorDisplay,
DateInput,
Disclosure,
DropdownMenu,
EditableGifMoodTitle,
EditableMotivatorTitle,
EditableSessionTitle,
EditableTitle,
EditableWeatherTitle,
EditableWeeklyCheckInTitle,
EditableYearReviewTitle,
InlineFormActions,
Input,
IconCheck,
IconClose,
IconDuplicate,
IconEdit,
IconButton,
IconPlus,
IconTrash,
Modal,
ModalFooter,
PageHeader,
ParticipantInput,
RocketIcon,
Select,
SegmentedControl,
SessionPageHeader,
Textarea,
ToggleGroup,
FormField,
NumberInput,
} from '@/components/ui';
const BUTTON_VARIANTS = [
'primary',
'secondary',
'outline',
'ghost',
'destructive',
'brand',
] as const;
const BUTTON_SIZES = ['sm', 'md', 'lg'] as const;
const BADGE_VARIANTS = [
'default',
'primary',
'strength',
'weakness',
'opportunity',
'threat',
'success',
'warning',
'destructive',
'accent',
] as const;
const SELECT_OPTIONS = [
{ value: 'editor', label: 'Editeur' },
{ value: 'viewer', label: 'Lecteur' },
{ value: 'admin', label: 'Admin' },
];
const SECTION_LINKS = [
{ id: 'buttons', label: 'Buttons' },
{ id: 'badges', label: 'Badges' },
{ id: 'icon-button', label: 'IconButton' },
{ id: 'form-inputs', label: 'Form Inputs' },
{ id: 'select-toggle', label: 'Select & Toggle' },
{ id: 'form-field', label: 'FormField / Date / Number' },
{ id: 'cards', label: 'Cards' },
{ id: 'avatars', label: 'Avatar & Collaborators' },
{ id: 'disclosure-dropdown', label: 'Disclosure & Dropdown' },
{ id: 'menu', label: 'Menu' },
{ id: 'editable-titles', label: 'Editable Titles' },
{ id: 'session-header', label: 'Session Header' },
{ id: 'participant-input', label: 'ParticipantInput' },
{ id: 'icons', label: 'Icons' },
{ id: 'modal', label: 'Modal' },
] as const;
export default function DesignSystemPage() {
const [modalOpen, setModalOpen] = useState(false);
const [toggleValue, setToggleValue] = useState<'cards' | 'table' | 'list'>('cards');
const [selectMd, setSelectMd] = useState('editor');
const [selectSm, setSelectSm] = useState('viewer');
const [selectXs, setSelectXs] = useState('admin');
const [selectLg, setSelectLg] = useState('editor');
const [menuCount, setMenuCount] = useState(0);
return (
<main className="mx-auto max-w-7xl px-4 py-8">
<PageHeader
emoji="🎨"
title="Design System"
subtitle="Guide visuel des composants UI et de leurs variantes"
actions={
<Button variant="brand" size="sm">
Action principale
</Button>
}
/>
<div className="grid items-start gap-8" style={{ gridTemplateColumns: '240px minmax(0, 1fr)' }}>
<aside>
<Card className="sticky top-20 p-4">
<p className="mb-3 text-sm font-medium text-foreground">Menu de la page</p>
<nav className="flex flex-col gap-1.5">
{SECTION_LINKS.map((section) => (
<a
key={section.id}
href={`#${section.id}`}
className="rounded-md px-2.5 py-1.5 text-sm text-muted transition-colors hover:bg-card-hover hover:text-foreground"
>
{section.label}
</a>
))}
</nav>
</Card>
</aside>
<div className="space-y-8">
<Card id="buttons" className="p-6">
<h2 className="mb-4 text-xl font-semibold text-foreground">Buttons</h2>
<div className="space-y-4">
{BUTTON_SIZES.map((size) => (
<div key={size} className="flex flex-wrap items-center gap-3">
<span className="w-12 text-xs uppercase tracking-wide text-muted">{size}</span>
{BUTTON_VARIANTS.map((variant) => (
<Button key={`${size}-${variant}`} variant={variant} size={size}>
{variant}
</Button>
))}
</div>
))}
<div className="flex flex-wrap items-center gap-3 border-t border-border pt-4">
<Button loading>Chargement</Button>
<Button disabled>Desactive</Button>
</div>
</div>
</Card>
<Card id="badges" className="p-6">
<h2 className="mb-4 text-xl font-semibold text-foreground">Badges</h2>
<div className="flex flex-wrap gap-2">
{BADGE_VARIANTS.map((variant) => (
<Badge key={variant} variant={variant}>
{variant}
</Badge>
))}
</div>
</Card>
<Card id="icon-button" className="p-6">
<h2 className="mb-4 text-xl font-semibold text-foreground">IconButton</h2>
<div className="flex flex-wrap items-center gap-2">
<IconButton icon={<IconEdit />} label="Edit" />
<IconButton icon={<IconDuplicate />} label="Duplicate" variant="primary" />
<IconButton icon={<IconTrash />} label="Delete" variant="destructive" />
</div>
</Card>
<Card id="form-inputs" className="p-6">
<h2 className="mb-4 text-xl font-semibold text-foreground">Form Inputs</h2>
<div className="grid gap-4 md:grid-cols-2">
<Input label="Input standard" placeholder="Votre texte" />
<Input label="Input avec erreur" defaultValue="Valeur invalide" error="Champ invalide" />
<Textarea label="Textarea standard" placeholder="Votre description" rows={3} />
<Textarea
label="Textarea avec erreur"
defaultValue="Texte"
rows={3}
error="Description trop courte"
/>
</div>
</Card>
<Card id="select-toggle" className="p-6">
<h2 className="mb-4 text-xl font-semibold text-foreground">Select & ToggleGroup</h2>
<div className="grid gap-4 md:grid-cols-2">
<div className="space-y-3">
<Select
label="Select XS"
size="xs"
value={selectXs}
onChange={(e) => setSelectXs(e.target.value)}
options={SELECT_OPTIONS}
/>
<Select
label="Select SM"
size="sm"
value={selectSm}
onChange={(e) => setSelectSm(e.target.value)}
options={SELECT_OPTIONS}
/>
<Select
label="Select MD"
size="md"
value={selectMd}
onChange={(e) => setSelectMd(e.target.value)}
options={SELECT_OPTIONS}
/>
<Select
label="Select LG"
size="lg"
value={selectLg}
onChange={(e) => setSelectLg(e.target.value)}
options={SELECT_OPTIONS}
/>
</div>
<div className="space-y-3">
<p className="text-sm font-medium text-foreground">Toggle group</p>
<ToggleGroup
value={toggleValue}
onChange={setToggleValue}
options={[
{ value: 'cards', label: 'Cards' },
{ value: 'table', label: 'Table' },
{ value: 'list', label: 'List' },
]}
/>
<p className="text-sm text-muted">Valeur active: {toggleValue}</p>
<p className="pt-2 text-sm font-medium text-foreground">Segmented control</p>
<SegmentedControl
value={toggleValue}
onChange={setToggleValue}
options={[
{ value: 'cards', label: 'Cards' },
{ value: 'table', label: 'Table' },
{ value: 'list', label: 'List' },
]}
/>
</div>
</div>
</Card>
<Card id="form-field" className="p-6">
<h2 className="mb-4 text-xl font-semibold text-foreground">FormField / Date / Number</h2>
<div className="grid gap-4 md:grid-cols-2">
<FormField label="FormField">
<Input placeholder="Control custom" />
</FormField>
<DateInput label="DateInput" defaultValue="2026-03-03" />
<NumberInput label="NumberInput" defaultValue={42} />
</div>
</Card>
<Card id="cards" className="p-6">
<h2 className="mb-4 text-xl font-semibold text-foreground">Cards & Header blocks</h2>
<div className="grid gap-4 md:grid-cols-2">
<Card hover>
<CardHeader>
<CardTitle>Card title</CardTitle>
<CardDescription>Description secondaire</CardDescription>
</CardHeader>
<CardContent>
<p className="text-sm text-muted">Contenu principal de la card.</p>
</CardContent>
<CardFooter className="justify-end">
<Button size="sm" variant="outline">
Annuler
</Button>
<Button size="sm">Valider</Button>
</CardFooter>
</Card>
<Card className="p-4">
<h3 className="mb-3 font-medium text-foreground">Inline actions</h3>
<Input placeholder="Exemple inline" className="mb-2" />
<InlineFormActions
onCancel={() => {}}
onSubmit={() => {}}
isPending={false}
submitLabel="Ajouter"
/>
</Card>
</div>
</Card>
<Card id="avatars" className="p-6">
<h2 className="mb-4 text-xl font-semibold text-foreground">Avatar & Collaborators</h2>
<div className="grid gap-4 md:grid-cols-2">
<div className="flex items-center gap-3">
<Avatar email="jane.doe@example.com" name="Jane Doe" size={40} />
<Avatar email="john.smith@example.com" name="John Smith" size={32} />
<Avatar email="team@example.com" size={24} />
</div>
<div className="space-y-3">
<CollaboratorDisplay
collaborator={{
raw: 'Jane Doe',
matchedUser: {
id: '1',
email: 'jane.doe@example.com',
name: 'Jane Doe',
},
}}
showEmail
/>
<CollaboratorDisplay
collaborator={{
raw: 'Intervenant externe',
matchedUser: null,
}}
/>
</div>
</div>
</Card>
<Card id="disclosure-dropdown" className="p-6">
<h2 className="mb-4 text-xl font-semibold text-foreground">Disclosure & Dropdown</h2>
<div className="space-y-4">
<Disclosure icon="" title="Panneau pliable" subtitle="Composant Disclosure">
<p className="text-sm text-muted">Contenu du panneau.</p>
</Disclosure>
<DropdownMenu
panelClassName="mt-2 w-56 rounded-lg border border-border bg-card p-2 shadow-lg"
trigger={({ open, toggle }) => (
<Button type="button" variant="outline" onClick={toggle}>
Menu demo {open ? '▲' : '▼'}
</Button>
)}
>
{({ close }) => (
<Button
size="sm"
variant="secondary"
onClick={() => {
setMenuCount((prev) => prev + 1);
close();
}}
>
Incrementer ({menuCount})
</Button>
)}
</DropdownMenu>
</div>
</Card>
<Card id="menu" className="p-6">
<h2 className="mb-4 text-xl font-semibold text-foreground">Menu</h2>
<DropdownMenu
panelClassName="mt-2 w-64 overflow-hidden rounded-lg border border-border bg-card py-1 shadow-lg"
trigger={({ open, toggle }) => (
<Button type="button" variant="outline" onClick={toggle}>
Ouvrir le menu {open ? '▲' : '▼'}
</Button>
)}
>
{({ close }) => (
<>
<div className="border-b border-border px-4 py-2">
<p className="text-xs text-muted">MENU DE DEMO</p>
<p className="text-sm font-medium text-foreground">Navigation rapide</p>
</div>
<button
type="button"
onClick={close}
className="block w-full px-4 py-2 text-left text-sm text-foreground hover:bg-card-hover"
>
👤 Mon profil
</button>
<button
type="button"
onClick={close}
className="block w-full px-4 py-2 text-left text-sm text-foreground hover:bg-card-hover"
>
👥 Equipes
</button>
<button
type="button"
onClick={close}
className="block w-full px-4 py-2 text-left text-sm text-destructive hover:bg-card-hover"
>
Se deconnecter
</button>
</>
)}
</DropdownMenu>
</Card>
<Card id="editable-titles" className="p-6">
<h2 className="mb-4 text-xl font-semibold text-foreground">Editable Titles</h2>
<div className="space-y-6">
<div>
<p className="mb-2 text-sm font-medium text-foreground">EditableTitle (base)</p>
<EditableTitle
sessionId="demo-editable-title"
initialTitle="Titre modifiable (cliquez pour tester)"
canEdit
onUpdate={async () => ({ success: true })}
/>
</div>
<div className="grid gap-4 md:grid-cols-2">
<EditableSessionTitle
sessionId="demo-session-title"
initialTitle="Session title wrapper"
canEdit={false}
/>
<EditableMotivatorTitle
sessionId="demo-motivator-title"
initialTitle="Motivator title wrapper"
canEdit={false}
/>
<EditableYearReviewTitle
sessionId="demo-year-review-title"
initialTitle="Year review title wrapper"
canEdit={false}
/>
<EditableWeatherTitle
sessionId="demo-weather-title"
initialTitle="Weather title wrapper"
canEdit={false}
/>
<EditableWeeklyCheckInTitle
sessionId="demo-weekly-checkin-title"
initialTitle="Weekly check-in title wrapper"
canEdit={false}
/>
<EditableGifMoodTitle
sessionId="demo-gif-mood-title"
initialTitle="Gif mood title wrapper"
canEdit={false}
/>
</div>
</div>
</Card>
<Card id="session-header" className="p-6">
<h2 className="mb-4 text-xl font-semibold text-foreground">Session Header</h2>
<SessionPageHeader
workshopType="swot"
sessionId="demo-session"
sessionTitle="Atelier de demonstration"
isOwner={true}
canEdit={false}
ownerUser={{ name: 'Jane Doe', email: 'jane.doe@example.com' }}
date={new Date()}
collaborator={{
raw: 'Jane Doe',
matchedUser: {
id: '1',
email: 'jane.doe@example.com',
name: 'Jane Doe',
},
}}
badges={<Badge variant="primary">DEMO</Badge>}
/>
</Card>
<Card id="participant-input" className="p-6">
<h2 className="mb-4 text-xl font-semibold text-foreground">ParticipantInput</h2>
<ParticipantInput name="participant" />
</Card>
<Card id="icons" className="p-6">
<h2 className="mb-4 text-xl font-semibold text-foreground">Icons</h2>
<div className="flex flex-wrap items-center gap-4 text-foreground">
<div className="flex items-center gap-2">
<IconEdit />
<span className="text-sm text-muted">Edit</span>
</div>
<div className="flex items-center gap-2">
<IconTrash />
<span className="text-sm text-muted">Trash</span>
</div>
<div className="flex items-center gap-2">
<IconDuplicate />
<span className="text-sm text-muted">Duplicate</span>
</div>
<div className="flex items-center gap-2">
<IconPlus />
<span className="text-sm text-muted">Plus</span>
</div>
<div className="flex items-center gap-2">
<IconCheck />
<span className="text-sm text-muted">Check</span>
</div>
<div className="flex items-center gap-2">
<IconClose />
<span className="text-sm text-muted">Close</span>
</div>
<div className="flex items-center gap-2">
<RocketIcon className="h-5 w-5" />
<span className="text-sm text-muted">Rocket</span>
</div>
</div>
</Card>
<Card id="modal" className="p-6">
<h2 className="mb-4 text-xl font-semibold text-foreground">Modal</h2>
<Button onClick={() => setModalOpen(true)}>Ouvrir la popup</Button>
</Card>
</div>
</div>
<Modal isOpen={modalOpen} onClose={() => setModalOpen(false)} title="Exemple de popup" size="md">
<p className="text-sm text-muted">
Ceci est un exemple de modal avec ses actions standardisees.
</p>
<ModalFooter>
<Button variant="outline" onClick={() => setModalOpen(false)}>
Annuler
</Button>
<Button onClick={() => setModalOpen(false)}>Confirmer</Button>
</ModalFooter>
</Modal>
</main>
);
}


@@ -0,0 +1,67 @@
import { notFound } from 'next/navigation';
import { auth } from '@/lib/auth';
import { getGifMoodSessionById } from '@/services/gif-mood';
import { getUserTeams } from '@/services/teams';
import { GifMoodBoard, GifMoodLiveWrapper } from '@/components/gif-mood';
import { Badge, SessionPageHeader } from '@/components/ui';
interface GifMoodSessionPageProps {
params: Promise<{ id: string }>;
}
export default async function GifMoodSessionPage({ params }: GifMoodSessionPageProps) {
const { id } = await params;
const authSession = await auth();
if (!authSession?.user?.id) {
return null;
}
const session = await getGifMoodSessionById(id, authSession.user.id);
if (!session) {
notFound();
}
const userTeams = await getUserTeams(authSession.user.id);
return (
<main className="mx-auto max-w-7xl px-4">
<SessionPageHeader
workshopType="gif-mood"
sessionId={session.id}
sessionTitle={session.title}
isOwner={session.isOwner}
canEdit={session.canEdit}
ownerUser={session.user}
date={session.date}
badges={<Badge variant="primary">{session.items.length} GIFs</Badge>}
/>
{/* Live Wrapper + Board */}
<GifMoodLiveWrapper
sessionId={session.id}
sessionTitle={session.title}
currentUserId={authSession.user.id}
shares={session.shares}
isOwner={session.isOwner}
canEdit={session.canEdit}
userTeams={userTeams}
>
<GifMoodBoard
sessionId={session.id}
currentUserId={authSession.user.id}
items={session.items}
shares={session.shares}
owner={{
id: session.user.id,
name: session.user.name ?? null,
email: session.user.email ?? '',
}}
ratings={session.ratings}
canEdit={session.canEdit}
/>
</GifMoodLiveWrapper>
</main>
);
}


@@ -0,0 +1,119 @@
'use client';
import { useState } from 'react';
import { useRouter } from 'next/navigation';
import {
Card,
CardHeader,
CardTitle,
CardDescription,
CardContent,
Button,
DateInput,
Input,
} from '@/components/ui';
import { createGifMoodSession } from '@/actions/gif-mood';
import { GIF_MOOD_MAX_ITEMS } from '@/lib/types';
export default function NewGifMoodPage() {
const router = useRouter();
const [loading, setLoading] = useState(false);
const [error, setError] = useState<string | null>(null);
const [selectedDate, setSelectedDate] = useState(new Date().toISOString().split('T')[0]);
const [title, setTitle] = useState(
() =>
`GIF Mood - ${new Date().toLocaleDateString('fr-FR', { day: 'numeric', month: 'long', year: 'numeric' })}`
);
async function handleSubmit(e: React.FormEvent<HTMLFormElement>) {
e.preventDefault();
setError(null);
setLoading(true);
const date = selectedDate ? new Date(selectedDate) : undefined;
if (!title) {
setError('Veuillez remplir le titre');
setLoading(false);
return;
}
const result = await createGifMoodSession({ title, date });
if (!result.success) {
setError(result.error || 'Une erreur est survenue');
setLoading(false);
return;
}
router.push(`/gif-mood/${result.data?.id}`);
}
return (
<main className="mx-auto max-w-2xl px-4">
<Card>
<CardHeader>
<CardTitle className="flex items-center gap-2">
<span>🎞</span>
Nouveau GIF Mood Board
</CardTitle>
<CardDescription>
Créez un tableau de bord GIF pour exprimer et partager votre humeur avec votre équipe
</CardDescription>
</CardHeader>
<CardContent>
<form onSubmit={handleSubmit} className="space-y-6">
{error && (
<div className="rounded-lg border border-destructive/20 bg-destructive/10 p-3 text-sm text-destructive">
{error}
</div>
)}
<Input
label="Titre de la session"
name="title"
placeholder="Ex: GIF Mood - Mars 2026"
value={title}
onChange={(e) => setTitle(e.target.value)}
required
/>
<DateInput
id="date"
name="date"
label="Date"
value={selectedDate}
onChange={(e) => setSelectedDate(e.target.value)}
required
/>
<div className="rounded-lg border border-border bg-card-hover p-4">
<h3 className="font-medium text-foreground mb-2">Comment ça marche ?</h3>
<ol className="text-sm text-muted space-y-1 list-decimal list-inside">
<li>Partagez la session avec votre équipe</li>
<li>Chaque membre peut ajouter jusqu&apos;à {GIF_MOOD_MAX_ITEMS} GIFs</li>
<li>Ajoutez une note à chaque GIF pour expliquer votre humeur</li>
<li>Les GIFs apparaissent en temps réel pour tous les membres</li>
</ol>
</div>
<div className="flex gap-3 pt-4">
<Button
type="button"
variant="outline"
onClick={() => router.back()}
disabled={loading}
>
Annuler
</Button>
<Button type="submit" loading={loading} className="flex-1">
Créer le GIF Mood Board
</Button>
</div>
</form>
</CardContent>
</Card>
</main>
);
}


@@ -1,7 +1,7 @@
@import 'tailwindcss';
/* ============================================
SWOT Manager - CSS Variables Theme System
Workshop Manager - CSS Variables Theme System
============================================ */
:root {
@@ -11,7 +11,7 @@
/* Cards & Surfaces */
--card: #ffffff;
--card-hover: #f1f5f9;
--card-hover: #e2e8f0;
--card-border: #e2e8f0;
/* Primary - Cyan/Teal */
@@ -39,6 +39,7 @@
/* Accent Colors */
--accent: #8b5cf6;
--accent-hover: #7c3aed;
--purple: #8b5cf6;
/* Status */
--success: #059669;
@@ -75,7 +76,7 @@
/* Cards & Surfaces */
--card: #1e293b;
--card-hover: #283548;
--card-hover: #334155;
--card-border: #2d3d53;
/* Primary - Cyan/Teal (softened) */
@@ -103,6 +104,7 @@
/* Accent Colors */
--accent: #a78bfa;
--accent-hover: #c4b5fd;
--purple: #a78bfa;
/* Status (softened) */
--success: #4ade80;

Some files were not shown because too many files have changed in this diff.