Problems Fixed:
1. **User Creation (Invite)**
- ❌ Missing password field (required by API)
- ❌ Hardcoded organizationId 'default-org-id'
- ❌ Wrong role format (lowercase instead of ADMIN/USER/MANAGER)
- ✅ Now uses currentUser.organizationId from auth context
- ✅ Added password field with validation (min 8 chars)
   - ✅ Fixed role enum to match backend (ADMIN, USER, MANAGER, VIEWER); see the payload sketch after this list
2. **Role Change (PATCH)**
- ❌ Used 'as any' masking type errors
- ❌ Lowercase role values
- ✅ Proper typing with uppercase roles
- ✅ Added success/error feedback
- ✅ Disabled state during mutation
3. **Toggle Active (PATCH)**
   - ✅ Already worked; added clearer success/error feedback
- ✅ Added disabled state during mutation
4. **Delete User (DELETE)**
   - ✅ Already worked; added clearer success/error feedback
- ✅ Added disabled state during mutation
5. **UI Improvements**
- Added success messages with auto-dismiss (3s)
- Added error messages with auto-dismiss (5s)
- Added loading states on all action buttons
- Fixed role badge colors to use uppercase keys
- Better form validation before API call
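A minimal sketch of the corrected invite payload, assuming a form state and an auth context exposing `currentUser`; `UserRole`, `InviteUserPayload`, and `buildInvitePayload` are illustrative names, not the exact project API:

```typescript
type UserRole = 'ADMIN' | 'MANAGER' | 'USER' | 'VIEWER';

interface InviteUserPayload {
  email: string;
  password: string;        // now required by the API, min 8 chars
  role: UserRole;          // uppercase enum matching the backend
  organizationId: string;  // taken from the authenticated user, not hardcoded
}

function buildInvitePayload(
  form: { email: string; password: string; role: UserRole },
  currentUser: { organizationId: string },
): InviteUserPayload {
  if (form.password.length < 8) {
    throw new Error('Password must be at least 8 characters');
  }
  return {
    email: form.email,
    password: form.password,
    role: form.role,
    organizationId: currentUser.organizationId,
  };
}
```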
🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude <noreply@anthropic.com>
Problem:
- CSV files uploaded to MinIO via admin panel
- But getAvailableCsvFiles() only listed local filesystem
- Result: rate search returned 0 results even though files exist in MinIO
Solution:
- Modified getAvailableCsvFiles() to check MinIO first
- Lists files from csv_rate_configs table with minioObjectKey
- Falls back to local filesystem if MinIO not configured
- Clearly logs which source is being used
This ensures rate search uses the uploaded CSV files from MinIO storage.
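A rough sketch of the new lookup order, assuming a repository over `csv_rate_configs` and a simple configuration flag; the repository shape and the `isMinioConfigured` parameter are assumptions, not the actual service code:

```typescript
import * as fs from 'fs';
import * as path from 'path';

interface CsvFileSource {
  filename: string;
  minioObjectKey?: string;
}

// Illustrative sketch only: the repository shape and isMinioConfigured flag
// are assumptions, not the project's exact API.
async function getAvailableCsvFiles(
  repo: { find(): Promise<CsvFileSource[]> },
  isMinioConfigured: boolean,
  localDir = path.join(process.cwd(), 'csv-storage', 'rates'),
): Promise<CsvFileSource[]> {
  if (isMinioConfigured) {
    const configs = await repo.find();
    const fromMinio = configs.filter((c) => c.minioObjectKey);
    console.log(`Listing ${fromMinio.length} CSV file(s) from MinIO (csv_rate_configs)`);
    return fromMinio;
  }
  console.log(`MinIO not configured, listing CSV files from ${localDir}`);
  return fs
    .readdirSync(localDir)
    .filter((f) => f.endsWith('.csv'))
    .map((filename) => ({ filename }));
}
```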
🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude <noreply@anthropic.com>
Changes:
- Moved all Traefik labels from root level to deploy.labels for Swarm compatibility
- Added deploy.restart_policy to backend, frontend, and MinIO services
- Removed restart: unless-stopped (incompatible with Swarm mode)
- Kept depends_on for startup ordering
This fixes Gateway Timeout errors by ensuring Traefik can properly discover
and route to services when running in Docker Swarm mode.
Services updated:
- xpeditis-backend: Deploy labels + restart policy
- xpeditis-frontend: Deploy labels + restart policy
- xpeditis-minio: Deploy labels + restart policy
🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude <noreply@anthropic.com>
- Fixed test user migration to use real Argon2id hash for Password123!
- Replaced random uuidv4() with fixed UUIDs in organization seeds
- Updated auth.service.ts to use DEFAULT_ORG_ID constant
- Added ON CONFLICT DO UPDATE to migration for existing users
- Ensures consistent UUIDs across environments (dev/preprod/prod)
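A minimal sketch of such a seed migration, assuming TypeORM and the `argon2` package; the UUIDs, table, and column names below are placeholders rather than the real migration values:

```typescript
import { MigrationInterface, QueryRunner } from 'typeorm';
import * as argon2 from 'argon2';

// Placeholder value; the real DEFAULT_ORG_ID constant is shared with auth.service.ts.
const DEFAULT_ORG_ID = '00000000-0000-0000-0000-000000000001';

export class SeedTestUsers1700000000000 implements MigrationInterface {
  public async up(queryRunner: QueryRunner): Promise<void> {
    // Real Argon2id hash computed at migration time instead of a fake literal.
    const passwordHash = await argon2.hash('Password123!', { type: argon2.argon2id });

    await queryRunner.query(
      `INSERT INTO users (id, email, password_hash, role, organization_id)
       VALUES ($1, $2, $3, $4, $5)
       ON CONFLICT (email) DO UPDATE
       SET password_hash = EXCLUDED.password_hash,
           role = EXCLUDED.role,
           organization_id = EXCLUDED.organization_id`,
      ['00000000-0000-0000-0000-000000000101', 'admin@xpeditis.com', passwordHash, 'ADMIN', DEFAULT_ORG_ID],
    );
  }

  public async down(queryRunner: QueryRunner): Promise<void> {
    await queryRunner.query(`DELETE FROM users WHERE email = 'admin@xpeditis.com'`);
  }
}
```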
Fixes:
- Registration 500 error (foreign key constraint violation)
- Login 401 error (invalid password hash)
- Organization ID mismatch between migrations and application code
Test users (Password: Password123!):
- admin@xpeditis.com (ADMIN)
- manager@xpeditis.com (MANAGER)
- user@xpeditis.com (USER)
🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude <noreply@anthropic.com>
CRITICAL FIX: Frontend was serving raw CSS with uncompiled @tailwind directives,
resulting in completely unstyled pages (plain text without any CSS).
Root cause:
- postcss.config.js and tailwind.config.js/ts were excluded in .dockerignore
- This prevented PostCSS/Tailwind from compiling CSS during Docker builds
- Local builds worked because config files were present
Changes:
1. apps/frontend/.dockerignore:
- Commented out postcss.config.js exclusion
- Commented out tailwind.config.js/ts exclusions
- Added explanatory comments about why these files are needed
2. apps/backend/Dockerfile:
- Copy src/ directory to production stage for CSV upload paths
- Create csv-storage/rates directory with proper permissions
- Fix EACCES errors when uploading CSV files
3. apps/backend/src/application/controllers/admin/csv-rates.controller.ts:
- Add getCsvUploadPath() helper function
- Support both local dev and Docker environments
   - Use absolute paths instead of relative paths (see the helper sketch after this list)
4. docker-compose.dev.yml:
- Change backend port mapping to 4001:4000 (avoid local dev conflicts)
- Change frontend port mapping to 3001:3000
- Update CORS_ORIGIN and NEXT_PUBLIC_API_URL accordingly
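For item 3, a hedged sketch of what such a helper can look like; the `CSV_STORAGE_PATH` variable name and directory layout are assumptions, not the controller's exact code:

```typescript
import * as fs from 'fs';
import * as path from 'path';

// Illustrative helper only; env var name and directory layout are assumptions.
function getCsvUploadPath(): string {
  // In Docker the app runs from /app, locally from the backend package root;
  // an absolute path avoids cwd-dependent relative paths in both cases.
  const baseDir = process.env.CSV_STORAGE_PATH
    ? process.env.CSV_STORAGE_PATH
    : path.join(process.cwd(), 'src', 'csv-storage', 'rates');

  if (!fs.existsSync(baseDir)) {
    fs.mkdirSync(baseDir, { recursive: true });
  }
  return baseDir;
}
```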
Impact:
- ✅ Fixes completely broken frontend CSS in Docker/production
- ✅ Applies to CI/CD builds (uses apps/frontend/.dockerignore)
- ✅ Applies to Portainer deployments (pulls from CI/CD images)
- ✅ Fixes CSV upload permission errors in backend
- ✅ Enables local Docker testing on Mac ARM64
Testing:
- Local Docker build now shows compiled Tailwind CSS (60KB+)
- Frontend displays properly styled pages at http://localhost:3001
- Backend CSV uploads work without permission errors
🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude <noreply@anthropic.com>
- Add docker-compose.local.yml with all services
- Use production images from Scaleway registry
- Configure local PostgreSQL, Redis, MinIO
- Add comprehensive testing guide in LOCAL_TESTING.md
- Include debugging commands and troubleshooting steps
This allows testing production Docker images locally before
deploying to Portainer.
- Remove SHA-based tags (preprod-<sha>)
- Remove semver and PR tags
- Keep only branch-based tag (preprod)
- Keep latest tag for default branch
- Add Portainer debugging documentation
This ensures Portainer always pulls the stable branch tag
instead of commit-specific tags.
- Add fs and path imports at top of file
- Remove inline require() statements that violated ESLint rules
- Fixes 6 @typescript-eslint/no-var-requires errors
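Illustrative before/after of the ESLint fix (the surrounding code is simplified):

```typescript
// Before (flagged by @typescript-eslint/no-var-requires):
// const fs = require('fs');
// const path = require('path');

// After: static imports at the top of the file.
import * as fs from 'fs';
import * as path from 'path';

const uploadDir = path.join(process.cwd(), 'csv-storage', 'rates');
if (!fs.existsSync(uploadDir)) {
  fs.mkdirSync(uploadDir, { recursive: true });
}
```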
- Change backend image from DockerHub to Scaleway registry
- Change frontend image from :latest to :preprod tag
- Fix S3 bucket name to match CSV rates system
- Add comprehensive deployment fix documentation
This fixes container startup issues where Portainer was trying
to pull images from DockerHub instead of Scaleway Container Registry.
- Export S3StorageAdapter directly from StorageModule
- Import StorageModule and ConfigModule in CsvRateModule
- Fix dependency injection for S3StorageAdapter in CSV controllers
- Upload CSV files to MinIO/S3 after validation
- Store MinIO object key in database metadata
- Support loading CSV from MinIO with fallback to local files
- Delete from both MinIO and local storage when removing files
- Add migration script to upload existing CSV files to MinIO
- Graceful degradation if MinIO is not configured
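A simplified sketch of the NestJS module wiring described above; the adapter body and real file layout are omitted:

```typescript
import { Injectable, Module } from '@nestjs/common';
import { ConfigModule } from '@nestjs/config';

// Sketch of the module wiring only; the real adapter lives in its own file
// and wraps the MinIO/S3 client.
@Injectable()
export class S3StorageAdapter {
  // upload/download/delete methods omitted
}

@Module({
  imports: [ConfigModule],
  providers: [S3StorageAdapter],
  exports: [S3StorageAdapter], // exported directly so other modules can inject it
})
export class StorageModule {}

// Importing StorageModule (plus ConfigModule) makes S3StorageAdapter
// injectable in the CSV controllers of CsvRateModule.
@Module({
  imports: [StorageModule, ConfigModule],
  controllers: [], // CSV rate controllers registered here in the real module
})
export class CsvRateModule {}
```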
Fixed CSV file upload to generate the filename from the company name. The previous implementation tried to read `req.body.companyName` in multer's filename callback, but the body is not yet parsed at that point, so files were named "unknown.csv".
## Solution
1. Use temporary filename during upload (timestamp + random)
2. After validation and parsing, rename file to proper company name format
3. Delete old file if it exists before renaming
4. Store final filename in database configuration
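A minimal sketch of this flow using multer's `diskStorage`; the upload directory and slug logic are illustrative, not the controller's exact implementation:

```typescript
import * as fs from 'fs';
import * as path from 'path';
import { diskStorage } from 'multer';

const uploadDir = path.join(process.cwd(), 'csv-storage', 'rates');

export const csvStorage = diskStorage({
  destination: uploadDir,
  // req.body is not parsed yet here, so use a temporary name instead of companyName.
  filename: (_req, _file, cb) => {
    cb(null, `tmp-${Date.now()}-${Math.round(Math.random() * 1e9)}.csv`);
  },
});

// After validation and parsing, rename the temp file to the company-based name.
export function finalizeCsvFilename(tempPath: string, companyName: string): string {
  const finalName = `${companyName.toLowerCase().replace(/\s+/g, '-')}.csv`;
  const finalPath = path.join(uploadDir, finalName);
  if (fs.existsSync(finalPath)) {
    fs.unlinkSync(finalPath); // delete the old file before renaming
  }
  fs.renameSync(tempPath, finalPath);
  return finalName; // stored in the database configuration
}
```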
## Changes
- Multer filename callback now generates temporary filename
- Added file renaming logic after successful validation
- Updated database records to use final filename instead of temp name
- Added logging for file operations
## Impact
- New CSV uploads will have correct filenames (e.g., "ssc-consolidation.csv")
- No more "unknown.csv" files
- Existing "unknown.csv" needs to be manually deleted via dashboard
🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude <noreply@anthropic.com>
Fixed CSV rate upload to use the company name provided in the upload form instead of reading it from the CSV file's companyName column. This prevents "unknown" or incorrect company names from being used.
## Changes
**Domain Layer**
- Updated `CsvRateLoaderPort` interface to accept optional `companyNameOverride` parameter
- Modified `CsvRateSearchService.loadAllRates()` to pass company name from config when loading rates
**Infrastructure Layer**
- Updated `CsvRateLoaderAdapter.loadRatesFromCsv()` to accept `companyNameOverride` parameter
- Modified `mapToCsvRate()` to use the override company name if provided, otherwise fall back to the CSV column value (see the sketch after this list)
- Added logging to show which company name is being used (from override or CSV)
**Application Layer**
- Updated CSV upload controller to pass `dto.companyName` to the loader
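A condensed sketch of the override flow; the interfaces below are simplified stand-ins for the real `CsvRateLoaderPort` and `CsvRate` types:

```typescript
interface CsvRow {
  companyName?: string;
  [column: string]: string | undefined;
}

interface CsvRate {
  companyName: string;
  // other rate fields omitted
}

interface CsvRateLoaderPort {
  loadRatesFromCsv(filename: string, companyNameOverride?: string): Promise<CsvRate[]>;
}

function mapToCsvRate(row: CsvRow, companyNameOverride?: string): CsvRate {
  // Prefer the name from the upload form; fall back to the CSV column value.
  const companyName = companyNameOverride ?? row.companyName ?? 'unknown';
  return { companyName };
}
```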
## Impact
- When uploading a CSV file through the admin interface, the company name from the form is now correctly used
- Existing CSV files with "unknown" in the companyName column will now show the correct company name from the database configuration
- Backward compatible: if no override is provided, the CSV column value is still used
🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude <noreply@anthropic.com>
Added comprehensive CSV rates management interface to the frontend dashboard with full CRUD operations.
## Backend Changes
- Added `GET /api/v1/admin/csv-rates/files` endpoint to list all uploaded CSV files with metadata
- Added `DELETE /api/v1/admin/csv-rates/files/:filename` endpoint to delete CSV files and their configurations
- Both endpoints provide frontend-compatible responses with file info (filename, size, rowCount, uploadedAt)
- File deletion includes both filesystem cleanup and database configuration removal
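A simplified controller sketch for these two endpoints; guards, DTOs, the global `/api/v1` prefix, and the service implementation are assumed and abbreviated:

```typescript
import { Controller, Delete, Get, Param } from '@nestjs/common';

export interface CsvFileInfo {
  filename: string;
  size: number;
  rowCount: number;
  uploadedAt: string;
}

// Service interface is a stand-in; the concrete provider and its injection
// token are omitted for brevity.
export interface CsvRateFilesService {
  listFiles(): Promise<CsvFileInfo[]>;
  deleteFile(filename: string): Promise<void>;
}

@Controller('admin/csv-rates')
export class CsvRatesFilesController {
  constructor(private readonly csvRates: CsvRateFilesService) {}

  @Get('files')
  listFiles(): Promise<CsvFileInfo[]> {
    return this.csvRates.listFiles();
  }

  @Delete('files/:filename')
  async deleteFile(@Param('filename') filename: string): Promise<{ deleted: string }> {
    // Removes both the stored file and its database configuration.
    await this.csvRates.deleteFile(filename);
    return { deleted: filename };
  }
}
```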
## Frontend Changes
- Added "CSV Rates" navigation item to dashboard sidebar (ADMIN only)
- Moved CSV rates page from `/app/admin/csv-rates` to `/app/dashboard/admin/csv-rates` for proper dashboard integration
- Updated CsvUpload component to include required `companyEmail` field
- Component now properly validates and sends all required fields (companyName, companyEmail, file)
- Enhanced form validation with email input type
## Features
- ✅ Upload CSV rate files with company name and email
- ✅ List all uploaded CSV files with metadata (filename, size, row count, upload date)
- ✅ Delete CSV files with confirmation dialog
- ✅ Real-time file validation (format, size limit 10MB)
- ✅ Auto-refresh after successful operations
- ✅ ADMIN role-based access control
- ✅ Integrated into dashboard navigation
🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude <noreply@anthropic.com>
Added two notification jobs that send Discord webhooks:
- notify-success: Sends a green embed when pipeline succeeds
- notify-failure: Sends a red embed when pipeline fails
Notifications include:
- Repository and branch information
- Commit SHA with clickable link
- Docker image names (backend & frontend)
- Link to workflow run for debugging
Requires DISCORD_WEBHOOK_URL secret to be configured in repository settings.
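The jobs themselves live in the CI workflow YAML; the sketch below only illustrates the Discord embed payload they post, with example field values:

```typescript
// Illustration of the webhook payload shape; repository, branch, and commit
// values here are examples, not the pipeline's actual interpolation.
async function notifyDiscord(success: boolean): Promise<void> {
  const webhookUrl = process.env.DISCORD_WEBHOOK_URL;
  if (!webhookUrl) throw new Error('DISCORD_WEBHOOK_URL is not configured');

  const embed = {
    title: success ? 'Pipeline succeeded' : 'Pipeline failed',
    color: success ? 0x2ecc71 : 0xe74c3c, // green / red
    fields: [
      { name: 'Repository', value: 'xpeditis', inline: true },
      { name: 'Branch', value: 'preprod', inline: true },
      { name: 'Commit', value: '[abc1234](https://example.com/commit/abc1234)' },
    ],
  };

  await fetch(webhookUrl, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ embeds: [embed] }),
  });
}
```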
🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude <noreply@anthropic.com>
The .dockerignore was excluding package-lock.json, causing npm ci to fail
with "The npm ci command can only install with an existing package-lock.json".
Commented out the package-lock.json line in .dockerignore to allow it to be
copied into the Docker build context.
🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude <noreply@anthropic.com>
GitHub Actions cache (type=gha) is not available in Gitea, causing timeout errors.
Replaced with registry-based cache which works with any container registry.
Changes:
- Backend: cache-from/to type=registry,ref=backend:buildcache
- Frontend: cache-from/to type=registry,ref=frontend:buildcache
- Removed debug step that's no longer needed
This allows Docker layer caching while maintaining Gitea compatibility.
Images are now being pushed successfully to rg.fr-par.scw.cloud/weworkstudio.
🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude <noreply@anthropic.com>